[
  {
    "path": ".github/CODEOWNERS",
    "content": "* @DomInvivo\n"
  },
  {
    "path": ".github/CODE_OF_CONDUCT.md",
    "content": "# Contributor Covenant Code of Conduct\n\n## Our Pledge\n\nWe as members, contributors, and leaders pledge to make participation in our\ncommunity a harassment-free experience for everyone, regardless of age, body\nsize, visible or invisible disability, ethnicity, sex characteristics, gender\nidentity and expression, level of experience, education, socio-economic status,\nnationality, personal appearance, race, religion, or sexual identity\nand orientation.\n\nWe pledge to act and interact in ways that contribute to an open, welcoming,\ndiverse, inclusive, and healthy community.\n\n## Our Standards\n\nExamples of behavior that contributes to a positive environment for our\ncommunity include:\n\n* Demonstrating empathy and kindness toward other people\n* Being respectful of differing opinions, viewpoints, and experiences\n* Giving and gracefully accepting constructive feedback\n* Accepting responsibility and apologizing to those affected by our mistakes,\n  and learning from the experience\n* Focusing on what is best not just for us as individuals, but for the\n  overall community\n\nExamples of unacceptable behavior include:\n\n* The use of sexualized language or imagery, and sexual attention or\n  advances of any kind\n* Trolling, insulting or derogatory comments, and personal or political attacks\n* Public or private harassment\n* Publishing others' private information, such as a physical or email\n  address, without their explicit permission\n* Other conduct which could reasonably be considered inappropriate in a\n  professional setting\n\n## Enforcement Responsibilities\n\nCommunity leaders are responsible for clarifying and enforcing our standards of\nacceptable behavior and will take appropriate and fair corrective action in\nresponse to any behavior that they deem inappropriate, threatening, offensive,\nor harmful.\n\nCommunity leaders have the right and responsibility to remove, edit, or reject\ncomments, commits, code, wiki edits, issues, 
and other contributions that are\nnot aligned to this Code of Conduct, and will communicate reasons for moderation\ndecisions when appropriate.\n\n## Scope\n\nThis Code of Conduct applies within all community spaces, and also applies when\nan individual is officially representing the community in public spaces.\nExamples of representing our community include using an official e-mail address,\nposting via an official social media account, or acting as an appointed\nrepresentative at an online or offline event.\n\n## Enforcement\n\nInstances of abusive, harassing, or otherwise unacceptable behavior may be\nreported to the community leaders responsible for enforcement at\n.\nAll complaints will be reviewed and investigated promptly and fairly.\n\nAll community leaders are obligated to respect the privacy and security of the\nreporter of any incident.\n\n## Enforcement Guidelines\n\nCommunity leaders will follow these Community Impact Guidelines in determining\nthe consequences for any action they deem in violation of this Code of Conduct:\n\n### 1. Correction\n\n**Community Impact**: Use of inappropriate language or other behavior deemed\nunprofessional or unwelcome in the community.\n\n**Consequence**: A private, written warning from community leaders, providing\nclarity around the nature of the violation and an explanation of why the\nbehavior was inappropriate. A public apology may be requested.\n\n### 2. Warning\n\n**Community Impact**: A violation through a single incident or series\nof actions.\n\n**Consequence**: A warning with consequences for continued behavior. No\ninteraction with the people involved, including unsolicited interaction with\nthose enforcing the Code of Conduct, for a specified period of time. This\nincludes avoiding interactions in community spaces as well as external channels\nlike social media. Violating these terms may lead to a temporary or\npermanent ban.\n\n### 3. 
Temporary Ban\n\n**Community Impact**: A serious violation of community standards, including\nsustained inappropriate behavior.\n\n**Consequence**: A temporary ban from any sort of interaction or public\ncommunication with the community for a specified period of time. No public or\nprivate interaction with the people involved, including unsolicited interaction\nwith those enforcing the Code of Conduct, is allowed during this period.\nViolating these terms may lead to a permanent ban.\n\n### 4. Permanent Ban\n\n**Community Impact**: Demonstrating a pattern of violation of community\nstandards, including sustained inappropriate behavior,  harassment of an\nindividual, or aggression toward or disparagement of classes of individuals.\n\n**Consequence**: A permanent ban from any sort of public interaction within\nthe community.\n\n## Attribution\n\nThis Code of Conduct is adapted from the [Contributor Covenant][homepage],\nversion 2.0, available at\nhttps://www.contributor-covenant.org/version/2/0/code_of_conduct.html.\n\nCommunity Impact Guidelines were inspired by [Mozilla's code of conduct\nenforcement ladder](https://github.com/mozilla/diversity).\n\n[homepage]: https://www.contributor-covenant.org\n\nFor answers to common questions about this code of conduct, see the FAQ at\nhttps://www.contributor-covenant.org/faq. Translations are available at\nhttps://www.contributor-covenant.org/translations.\n"
  },
  {
    "path": ".github/PULL_REQUEST_TEMPLATE.md",
    "content": "## Changelogs\n\n- _Enumerate the changes of this PR._\n\n---\n\n_Checklist:_\n\n- [ ] _Was this PR discussed in an issue? It is recommended to first discuss a new feature in a GitHub issue before opening a PR._\n- [ ] _Add tests to cover the fixed bug(s) or the newly introduced feature(s) (if appropriate)._\n- [ ] _Update the API documentation if a new function is added, or an existing one is deleted._\n- [ ] _Write concise and explanatory changelogs above._\n- [ ] _If possible, assign one of the following labels to the PR: `feature`, `fix` or `test` (or ask a maintainer to do it for you)._\n\n---\n\n_Discussion related to this PR_\n"
  },
  {
    "path": ".github/workflows/code-check.yml",
    "content": "name: code-check\n\non:\n  push:\n    branches: [\"main\"]\n    tags: [\"*\"]\n  pull_request:\n    branches:\n      - \"*\"\n      - \"!gh-pages\"\n\njobs:\n  python-format-black:\n    name: Python lint [black]\n    runs-on: ubuntu-latest\n    steps:\n      - name: Checkout the code\n        uses: actions/checkout@v3\n\n      - name: Set up Python\n        uses: actions/setup-python@v4\n        with:\n          python-version: \"3.10\"\n\n      - name: Install black\n        run: |\n          # Quote the specifier so the shell does not parse `>=` as a redirection\n          pip install \"black>=23\"\n\n      - name: Lint\n        run: black --check .\n"
  },
  {
    "path": ".github/workflows/doc.yml",
    "content": "name: doc\n\non:\n  push:\n    branches: [\"main\"]\n\n# Prevent doc actions on `main` from conflicting with each other.\nconcurrency:\n  group: doc-${{ github.ref }}\n  cancel-in-progress: true\n\njobs:\n  doc:\n    runs-on: \"ubuntu-latest\"\n    timeout-minutes: 30\n\n    defaults:\n      run:\n        shell: bash -l {0}\n\n    steps:\n      - name: Checkout the code\n        uses: actions/checkout@v3\n\n      - name: Setup mamba\n        uses: mamba-org/setup-micromamba@v1\n        with:\n          environment-file: env.yml\n          environment-name: graphium\n          cache-environment: true\n          cache-downloads: true\n\n      - name: Install library\n        run: |\n          python -m pip install --no-deps .\n          pip install typer-cli\n\n      - name: Configure git\n        run: |\n          git config --global user.name \"${GITHUB_ACTOR}\"\n          git config --global user.email \"${GITHUB_ACTOR}@users.noreply.github.com\"\n\n      - name: Deploy the doc\n        run: |\n          echo \"Auto-generating typer docs\"\n          typer graphium.cli.__main__ utils docs --name graphium --output docs/cli/graphium.md\n\n          echo \"Get the gh-pages branch\"\n          git fetch origin gh-pages\n\n          echo \"Build and deploy the doc on main\"\n          mike deploy --push main\n"
  },
  {
    "path": ".github/workflows/release.yml",
    "content": "name: release\n\non:\n  workflow_dispatch:\n    inputs:\n      release-version:\n        description: \"A valid Semver version string\"\n        required: true\n\npermissions:\n  contents: write\n  pull-requests: write\n\njobs:\n  release:\n    # Do not release if not triggered from the default branch\n    if: github.ref == format('refs/heads/{0}', github.event.repository.default_branch)\n\n    runs-on: ubuntu-latest\n    timeout-minutes: 30\n\n    defaults:\n      run:\n        shell: bash -l {0}\n\n    steps:\n      - name: Checkout the code\n        uses: actions/checkout@v3\n\n      - name: Setup mamba\n        uses: mamba-org/setup-micromamba@v1\n        with:\n          environment-file: env.yml\n          environment-name: graphium\n          cache-environment: true\n          cache-downloads: true\n          create-args: >-\n            pip\n            semver\n            python-build\n            setuptools_scm\n\n      - name: Check the version is valid semver\n        run: |\n          RELEASE_VERSION=\"${{ inputs.release-version }}\"\n\n          {\n            pysemver check $RELEASE_VERSION\n          } || {\n            echo \"The version '$RELEASE_VERSION' is not a valid Semver version string.\"\n            echo \"Please use a valid semver version string. 
More details at https://semver.org/\"\n            echo \"The release process is aborted.\"\n            exit 1\n          }\n\n      - name: Check the version is higher than the latest one\n        run: |\n          # Retrieve the git tags first\n          git fetch --prune --unshallow --tags &> /dev/null\n\n          RELEASE_VERSION=\"${{ inputs.release-version }}\"\n          LATEST_VERSION=$(git describe --abbrev=0 --tags)\n\n          IS_HIGHER_VERSION=$(pysemver compare $RELEASE_VERSION $LATEST_VERSION)\n\n          if [ \"$IS_HIGHER_VERSION\" != \"1\" ]; then\n            echo \"The version '$RELEASE_VERSION' is not higher than the latest version '$LATEST_VERSION'.\"\n            echo \"The release process is aborted.\"\n            exit 1\n          fi\n\n      - name: Build Changelog\n        id: github_release\n        uses: mikepenz/release-changelog-builder-action@v4\n        env:\n          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}\n        with:\n          toTag: \"main\"\n\n      - name: Configure git\n        run: |\n          git config --global user.name \"${GITHUB_ACTOR}\"\n          git config --global user.email \"${GITHUB_ACTOR}@users.noreply.github.com\"\n\n      - name: Create and push git tag\n        env:\n          GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}\n        run: |\n          # Tag the release\n          git tag -a \"${{ inputs.release-version }}\" -m \"Release version ${{ inputs.release-version }}\"\n\n          # Checkout the git tag\n          git checkout \"${{ inputs.release-version }}\"\n\n          # Push the modified changelogs\n          git push origin main\n\n          # Push the tags\n          git push origin \"${{ inputs.release-version }}\"\n\n      - name: Install library\n        run: python -m pip install --no-deps .\n\n      - name: Build the wheel and sdist\n        run: python -m build --no-isolation\n\n      - name: Publish package to PyPI\n        uses: pypa/gh-action-pypi-publish@release/v1\n        with:\n  
        password: ${{ secrets.PYPI_API_TOKEN }}\n          packages-dir: dist/\n\n      - name: Deploy the doc\n        run: |\n          echo \"Get the gh-pages branch\"\n          git fetch origin gh-pages\n\n          echo \"Build and deploy the doc on ${{ inputs.release-version }}\"\n          mike deploy --push stable\n          mike deploy --push ${{ inputs.release-version }}\n\n      - name: Create GitHub Release\n        uses: softprops/action-gh-release@de2c0eb89ae2a093876385947365aca7b0e5f844\n        with:\n          tag_name: ${{ inputs.release-version }}\n          body: ${{steps.github_release.outputs.changelog}}\n"
  },
  {
    "path": ".github/workflows/test.yml",
    "content": "name: test\n\non:\n  push:\n    branches: [\"main\"]\n    tags: [\"*\"]\n  pull_request:\n    branches:\n      - \"*\"\n      - \"!gh-pages\"\n  schedule:\n    - cron: \"0 4 * * *\"\n\njobs:\n  test:\n    strategy:\n      fail-fast: false\n      matrix:\n        python-version: [\"3.8\", \"3.9\", \"3.10\"]\n        pytorch-version: [\"2.0\"]\n\n    runs-on: \"ubuntu-latest\"\n    timeout-minutes: 30\n\n    defaults:\n      run:\n        shell: bash -l {0}\n\n    name: |\n        regular_env -\n        python=${{ matrix.python-version }} -\n        pytorch=${{ matrix.pytorch-version }}\n\n    steps:\n      - name: Checkout the code\n        uses: actions/checkout@v3\n\n      - name: Setup mamba\n        uses: mamba-org/setup-micromamba@v1\n        with:\n          environment-file: env.yml\n          environment-name: graphium\n          cache-environment: true\n          cache-downloads: true\n          create-args: >-\n            python=${{ matrix.python-version }}\n            pytorch=${{ matrix.pytorch-version }}\n\n      - name: Install library\n        run: python -m pip install --no-deps -e . # `-e` required for correct `coverage` run.\n\n      - name: Run tests\n        run: pytest -m 'not ipu'\n\n      - name: Test CLI\n        run: graphium --help\n\n      - name: Test building the doc\n        run: mkdocs build\n\n      - name: Codecov Upload\n        uses: codecov/codecov-action@v3\n        with:\n          files: ./coverage.xml\n          flags: unittests\n          name: codecov-umbrella\n          fail_ci_if_error: false\n          verbose: false\n          env_vars: ${{ matrix.python-version }},${{ matrix.pytorch-version }}\n"
  },
  {
    "path": ".github/workflows/test_ipu.yml",
    "content": "name: test-ipu\n\non:\n  push:\n    branches: [\"main\"]\n    tags: [\"*\"]\n  pull_request:\n    branches:\n      - \"*\"\n      - \"!gh-pages\"\n  schedule:\n    - cron: \"0 4 * * *\"\n\njobs:\n  test-ipu:\n    strategy:\n      fail-fast: false\n      matrix:\n        python-version: [\"3.8\"]\n        pytorch-version: [\"2.0\"]\n\n    runs-on: \"ubuntu-20.04\"\n    timeout-minutes: 30\n\n    defaults:\n      run:\n        shell: bash -l {0}\n\n    name: |\n        poptorch_env -\n        python=${{ matrix.python-version }} -\n        pytorch=${{ matrix.pytorch-version }}\n\n    steps:\n      - name: Checkout the code\n        uses: actions/checkout@v3\n\n      - name: Activate SDK + Install Requirements + Run Tests\n        run: |\n          python3 -m pip install --upgrade pip\n          wget -q -O 'poplar_sdk-ubuntu_20_04-3.3.0-208993bbb7.tar.gz' 'https://downloads.graphcore.ai/direct?package=poplar-poplar_sdk_ubuntu_20_04_3.3.0_208993bbb7-3.3.0&file=poplar_sdk-ubuntu_20_04-3.3.0-208993bbb7.tar.gz'\n          tar -xzf poplar_sdk-ubuntu_20_04-3.3.0-208993bbb7.tar.gz\n          python3 -m pip install poplar_sdk-ubuntu_20_04-3.3.0+1403-208993bbb7/poptorch-3.3.0+113432_960e9c294b_ubuntu_20_04-cp38-cp38-linux_x86_64.whl\n\n          # Enable the Poplar SDK (including Poplar and PopART)\n          source poplar_sdk-ubuntu_20_04-3.3.0+1403-208993bbb7/enable\n          python -c \"import poptorch\"\n\n          # Download the data files (~10 MB in total - small compared to the libraries)\n          wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Small-dataset/ZINC12k.csv.gz\n          wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Small-dataset/Tox21-7k-12-labels.csv.gz\n          wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Small-dataset/qm9.csv.gz\n\n          # Install the IPU-specific and Graphium requirements\n          pip install -r requirements_ipu.txt\n\n          # Install Graphium in dev mode and run the tests\n          python -m pip install --no-deps -e .\n          python3 -m pytest -m 'not skip_ipu'\n\n      - name: Codecov Upload\n        uses: codecov/codecov-action@v3\n        with:\n          files: ./coverage.xml\n          flags: unittests\n          name: codecov-umbrella\n          fail_ci_if_error: false\n          verbose: false\n          env_vars: ${{ matrix.python-version }},${{ matrix.pytorch-version }}\n"
  },
  {
    "path": ".gitignore",
    "content": "############ graphium Custom GitIgnore ##############\n\n# Workspace\n*.code-workspace\n.vscode/\n.idea/\n\n# Training logs and models\n*.out\n*.cache\n*.ckpt\n*.pickle\n*.datacache\n*.datacache.gz\nlightning_logs/\nlogs/\nmultirun/\nhparam-search-results/\nmodels_checkpoints/\noutputs/\nout/\nsaved_models/\nwandb/\ndatacache/\ntests/temp_cache*\npredictions/\ndraft/\nscripts-expts/\nsweeps/\nmup/\n\n# Data and predictions\ngraphium/data/ZINC_bench_gnn/\ngraphium/data/BindingDB/\ngraphium/data/mega-pubchem/\ngraphium/data/cache/\ngraphium/data/b3lyp/\ngraphium/data/PCQM4Mv2/\ngraphium/data/PCQM4M/\ngraphium/data/neurips2023/small-dataset/\ngraphium/data/neurips2023/large-dataset/\ngraphium/data/neurips2023/dummy-dataset/\ngraphium/data/make_data_splits/*.csv*\ngraphium/data/make_data_splits/*.pt*\ngraphium/data/make_data_splits/*.parquet*\n*.csv.gz\n*.pt\n\n# Others\nexpts_untracked/\ndebug/\nchange_commits.sh\ngraphium/features/test_new_pes.ipynb\n\n# IPU related ignores and profiler outputs\n*.a\n*.cbor\n*.capnp\n*.pop\n*.popart\n*.pop_cache\n*.popef\n*.pvti*\n\n############ END graphium Custom GitIgnore ##############\n\n\n# Byte-compiled / optimized / DLL files\n__pycache__/\n*.py[cod]\n*$py.class\n*.txt\n# C extensions\n*.so\n# Distribution / packaging\n.Python\nbuild/\ndevelop-eggs/\ndist/\ndownloads/\neggs/\n.eggs/\nlib/\nlib64/\nparts/\nsdist/\nvar/\nwheels/\npip-wheel-metadata/\nshare/python-wheels/\n*.egg-info/\n.installed.cfg\n*.egg\nMANIFEST\n\n# PyInstaller\n#  Usually these files are written by a python script from a template\n#  before PyInstaller builds the exe, so as to inject date/other infos into it.\n*.manifest\n*.spec\n\n# Installer logs\npip-log.txt\npip-delete-this-directory.txt\n\n# Unit test / coverage reports\nhtmlcov/\n.tox/\n.nox/\n.coverage\n.coverage.*\n.cache\nnosetests.xml\ncoverage.xml\n*.cover\n*.py,cover\n.hypothesis/\n.pytest_cache/\n\n# Translations\n*.mo\n*.pot\n\n# Django 
stuff:\n*.log\nlocal_settings.py\ndb.sqlite3\ndb.sqlite3-journal\n\n# Flask stuff:\ninstance/\n.webassets-cache\n\n# Scrapy stuff:\n.scrapy\n\n# Sphinx documentation\ndocs/_build/\n\n# PyBuilder\ntarget/\n\n# Jupyter Notebook\n.ipynb_checkpoints\n\n# IPython\nprofile_default/\nipython_config.py\n\n# pyenv\n.python-version\n\n# pipenv\n#   According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.\n#   However, in case of collaboration, if having platform-specific dependencies or dependencies\n#   having no cross-platform support, pipenv may install dependencies that don't work, or not\n#   install all needed dependencies.\n#Pipfile.lock\n\n# PEP 582; used by e.g. github.com/David-OConnor/pyflow\n__pypackages__/\n\n# Celery stuff\ncelerybeat-schedule\ncelerybeat.pid\n\n# SageMath parsed files\n*.sage.py\n\n# Environments\n.env\n.venv\nenv/\nvenv/\nENV/\nenv.bak/\nvenv.bak/\n\n# Spyder project settings\n.spyderproject\n.spyproject\n\n# Rope project settings\n.ropeproject\n\n# mkdocs documentation\n/site\n\n# mypy\n.mypy_cache/\n.dmypy.json\ndmypy.json\n\n# Pyre type checker\n.pyre/\n"
  },
  {
    "path": "LICENSE",
    "content": " Apache License\r\n                           Version 2.0, January 2004\r\n                        http://www.apache.org/licenses/\r\n\r\n   TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION\r\n\r\n   1. Definitions.\r\n\r\n      \"License\" shall mean the terms and conditions for use, reproduction,\r\n      and distribution as defined by Sections 1 through 9 of this document.\r\n\r\n      \"Licensor\" shall mean the copyright owner or entity authorized by\r\n      the copyright owner that is granting the License.\r\n\r\n      \"Legal Entity\" shall mean the union of the acting entity and all\r\n      other entities that control, are controlled by, or are under common\r\n      control with that entity. For the purposes of this definition,\r\n      \"control\" means (i) the power, direct or indirect, to cause the\r\n      direction or management of such entity, whether by contract or\r\n      otherwise, or (ii) ownership of fifty percent (50%) or more of the\r\n      outstanding shares, or (iii) beneficial ownership of such entity.\r\n\r\n      \"You\" (or \"Your\") shall mean an individual or Legal Entity\r\n      exercising permissions granted by this License.\r\n\r\n      \"Source\" form shall mean the preferred form for making modifications,\r\n      including but not limited to software source code, documentation\r\n      source, and configuration files.\r\n\r\n      \"Object\" form shall mean any form resulting from mechanical\r\n      transformation or translation of a Source form, including but\r\n      not limited to compiled object code, generated documentation,\r\n      and conversions to other media types.\r\n\r\n      \"Work\" shall mean the work of authorship, whether in Source or\r\n      Object form, made available under the License, as indicated by a\r\n      copyright notice that is included in or attached to the work\r\n      (an example is provided in the Appendix below).\r\n\r\n      \"Derivative Works\" shall mean any 
work, whether in Source or Object\r\n      form, that is based on (or derived from) the Work and for which the\r\n      editorial revisions, annotations, elaborations, or other modifications\r\n      represent, as a whole, an original work of authorship. For the purposes\r\n      of this License, Derivative Works shall not include works that remain\r\n      separable from, or merely link (or bind by name) to the interfaces of,\r\n      the Work and Derivative Works thereof.\r\n\r\n      \"Contribution\" shall mean any work of authorship, including\r\n      the original version of the Work and any modifications or additions\r\n      to that Work or Derivative Works thereof, that is intentionally\r\n      submitted to Licensor for inclusion in the Work by the copyright owner\r\n      or by an individual or Legal Entity authorized to submit on behalf of\r\n      the copyright owner. For the purposes of this definition, \"submitted\"\r\n      means any form of electronic, verbal, or written communication sent\r\n      to the Licensor or its representatives, including but not limited to\r\n      communication on electronic mailing lists, source code control systems,\r\n      and issue tracking systems that are managed by, or on behalf of, the\r\n      Licensor for the purpose of discussing and improving the Work, but\r\n      excluding communication that is conspicuously marked or otherwise\r\n      designated in writing by the copyright owner as \"Not a Contribution.\"\r\n\r\n      \"Contributor\" shall mean Licensor and any individual or Legal Entity\r\n      on behalf of whom a Contribution has been received by Licensor and\r\n      subsequently incorporated within the Work.\r\n\r\n   2. Grant of Copyright License. 
Subject to the terms and conditions of\r\n      this License, each Contributor hereby grants to You a perpetual,\r\n      worldwide, non-exclusive, no-charge, royalty-free, irrevocable\r\n      copyright license to reproduce, prepare Derivative Works of,\r\n      publicly display, publicly perform, sublicense, and distribute the\r\n      Work and such Derivative Works in Source or Object form.\r\n\r\n   3. Grant of Patent License. Subject to the terms and conditions of\r\n      this License, each Contributor hereby grants to You a perpetual,\r\n      worldwide, non-exclusive, no-charge, royalty-free, irrevocable\r\n      (except as stated in this section) patent license to make, have made,\r\n      use, offer to sell, sell, import, and otherwise transfer the Work,\r\n      where such license applies only to those patent claims licensable\r\n      by such Contributor that are necessarily infringed by their\r\n      Contribution(s) alone or by combination of their Contribution(s)\r\n      with the Work to which such Contribution(s) was submitted. If You\r\n      institute patent litigation against any entity (including a\r\n      cross-claim or counterclaim in a lawsuit) alleging that the Work\r\n      or a Contribution incorporated within the Work constitutes direct\r\n      or contributory patent infringement, then any patent licenses\r\n      granted to You under this License for that Work shall terminate\r\n      as of the date such litigation is filed.\r\n\r\n   4. Redistribution. 
You may reproduce and distribute copies of the\r\n      Work or Derivative Works thereof in any medium, with or without\r\n      modifications, and in Source or Object form, provided that You\r\n      meet the following conditions:\r\n\r\n      (a) You must give any other recipients of the Work or\r\n          Derivative Works a copy of this License; and\r\n\r\n      (b) You must cause any modified files to carry prominent notices\r\n          stating that You changed the files; and\r\n\r\n      (c) You must retain, in the Source form of any Derivative Works\r\n          that You distribute, all copyright, patent, trademark, and\r\n          attribution notices from the Source form of the Work,\r\n          excluding those notices that do not pertain to any part of\r\n          the Derivative Works; and\r\n\r\n      (d) If the Work includes a \"NOTICE\" text file as part of its\r\n          distribution, then any Derivative Works that You distribute must\r\n          include a readable copy of the attribution notices contained\r\n          within such NOTICE file, excluding those notices that do not\r\n          pertain to any part of the Derivative Works, in at least one\r\n          of the following places: within a NOTICE text file distributed\r\n          as part of the Derivative Works; within the Source form or\r\n          documentation, if provided along with the Derivative Works; or,\r\n          within a display generated by the Derivative Works, if and\r\n          wherever such third-party notices normally appear. The contents\r\n          of the NOTICE file are for informational purposes only and\r\n          do not modify the License. 
You may add Your own attribution\r\n          notices within Derivative Works that You distribute, alongside\r\n          or as an addendum to the NOTICE text from the Work, provided\r\n          that such additional attribution notices cannot be construed\r\n          as modifying the License.\r\n\r\n      You may add Your own copyright statement to Your modifications and\r\n      may provide additional or different license terms and conditions\r\n      for use, reproduction, or distribution of Your modifications, or\r\n      for any such Derivative Works as a whole, provided Your use,\r\n      reproduction, and distribution of the Work otherwise complies with\r\n      the conditions stated in this License.\r\n\r\n   5. Submission of Contributions. Unless You explicitly state otherwise,\r\n      any Contribution intentionally submitted for inclusion in the Work\r\n      by You to the Licensor shall be under the terms and conditions of\r\n      this License, without any additional terms or conditions.\r\n      Notwithstanding the above, nothing herein shall supersede or modify\r\n      the terms of any separate license agreement you may have executed\r\n      with Licensor regarding such Contributions.\r\n\r\n   6. Trademarks. This License does not grant permission to use the trade\r\n      names, trademarks, service marks, or product names of the Licensor,\r\n      except as required for reasonable and customary use in describing the\r\n      origin of the Work and reproducing the content of the NOTICE file.\r\n\r\n   7. Disclaimer of Warranty. Unless required by applicable law or\r\n      agreed to in writing, Licensor provides the Work (and each\r\n      Contributor provides its Contributions) on an \"AS IS\" BASIS,\r\n      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or\r\n      implied, including, without limitation, any warranties or conditions\r\n      of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A\r\n      PARTICULAR PURPOSE. 
You are solely responsible for determining the\r\n      appropriateness of using or redistributing the Work and assume any\r\n      risks associated with Your exercise of permissions under this License.\r\n\r\n   8. Limitation of Liability. In no event and under no legal theory,\r\n      whether in tort (including negligence), contract, or otherwise,\r\n      unless required by applicable law (such as deliberate and grossly\r\n      negligent acts) or agreed to in writing, shall any Contributor be\r\n      liable to You for damages, including any direct, indirect, special,\r\n      incidental, or consequential damages of any character arising as a\r\n      result of this License or out of the use or inability to use the\r\n      Work (including but not limited to damages for loss of goodwill,\r\n      work stoppage, computer failure or malfunction, or any and all\r\n      other commercial damages or losses), even if such Contributor\r\n      has been advised of the possibility of such damages.\r\n\r\n   9. Accepting Warranty or Additional Liability. While redistributing\r\n      the Work or Derivative Works thereof, You may choose to offer,\r\n      and charge a fee for, acceptance of support, warranty, indemnity,\r\n      or other liability obligations and/or rights consistent with this\r\n      License. 
However, in accepting such obligations, You may act only\r\n      on Your own behalf and on Your sole responsibility, not on behalf\r\n      of any other Contributor, and only if You agree to indemnify,\r\n      defend, and hold each Contributor harmless for any liability\r\n      incurred by, or claims asserted against, such Contributor by reason\r\n      of your accepting any such warranty or additional liability.\r\n\r\n   END OF TERMS AND CONDITIONS\r\n\r\n   APPENDIX: How to apply the Apache License to your work.\r\n\r\n      To apply the Apache License to your work, attach the following\r\n      boilerplate notice, with the fields enclosed by brackets \"[]\"\r\n      replaced with your own identifying information. (Don't include\r\n      the brackets!)  The text should be enclosed in the appropriate\r\n      comment syntax for the file format. We also recommend that a\r\n      file or class name and description of purpose be included on the\r\n      same \"printed page\" as the copyright notice for easier\r\n      identification within third-party archives.\r\n\r\n   Copyright 2023 Valence Labs\r\n   Copyright 2023 Recursion Pharmaceuticals\r\n   Copyright 2023 Graphcore Limited\r\n\r\n   Various academic groups have also contributed to this software under\r\n   the given license. These include, but are not limited to, the following:\r\n\r\n   1. University of Montreal\r\n   2. McGill University\r\n   3. Mila - Institut Quebecois d'intelligence artificielle\r\n   4. HEC Montreal\r\n   5. CIFAR AI Chair\r\n   6. New Jersey Institute of Technology\r\n   7. 
RWTH Aachen University\r\n\r\n   Licensed under the Apache License, Version 2.0 (the \"License\");\r\n   you may not use this file except in compliance with the License.\r\n   You may obtain a copy of the License at\r\n\r\n       http://www.apache.org/licenses/LICENSE-2.0\r\n\r\n   Unless required by applicable law or agreed to in writing, software\r\n   distributed under the License is distributed on an \"AS IS\" BASIS,\r\n   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.\r\n   See the License for the specific language governing permissions and\r\n   limitations under the License.\r\n"
  },
  {
    "path": "README.md",
    "content": "<div align=\"center\">\n    <img src=\"docs/images/banner-tight.png\" height=\"200px\">\n    <h3>Scaling molecular GNNs to infinity</h3>\n</div>\n\n---\n\n[![PyPI](https://img.shields.io/pypi/v/graphium)](https://pypi.org/project/graphium/)\n[![Conda](https://img.shields.io/conda/v/conda-forge/graphium?label=conda&color=success)](https://anaconda.org/conda-forge/graphium)\n[![PyPI - Downloads](https://img.shields.io/pypi/dm/graphium)](https://pypi.org/project/graphium/)\n[![Conda](https://img.shields.io/conda/dn/conda-forge/graphium)](https://anaconda.org/conda-forge/graphium)\n[![license](https://img.shields.io/badge/License-Apache%202.0-blue.svg)](https://github.com/datamol-io/graphium/blob/main/LICENSE)\n[![GitHub Repo stars](https://img.shields.io/github/stars/datamol-io/graphium)](https://github.com/datamol-io/graphium/stargazers)\n[![GitHub Repo forks](https://img.shields.io/github/forks/datamol-io/graphium)](https://github.com/datamol-io/graphium/network/members)\n[![test](https://github.com/datamol-io/graphium/actions/workflows/test.yml/badge.svg)](https://github.com/datamol-io/graphium/actions/workflows/test.yml)\n[![test-ipu](https://github.com/datamol-io/graphium/actions/workflows/test_ipu.yml/badge.svg)](https://github.com/datamol-io/graphium/actions/workflows/test_ipu.yml)\n[![release](https://github.com/datamol-io/graphium/actions/workflows/release.yml/badge.svg)](https://github.com/datamol-io/graphium/actions/workflows/release.yml)\n[![code-check](https://github.com/datamol-io/graphium/actions/workflows/code-check.yml/badge.svg)](https://github.com/datamol-io/graphium/actions/workflows/code-check.yml)\n[![doc](https://github.com/datamol-io/graphium/actions/workflows/doc.yml/badge.svg)](https://github.com/datamol-io/graphium/actions/workflows/doc.yml)\n[![codecov](https://codecov.io/gh/datamol-io/graphium/branch/main/graph/badge.svg?token=bHOkKY5Fze)](https://codecov.io/gh/datamol-io/graphium)\n[![hydra](https://img.shields.io/badge/Con
fig-Hydra_1.3-89b8cd)](https://hydra.cc/)\n\nA deep learning library focused on graph representation learning for real-world chemical tasks.\n\n- ✅ State-of-the-art GNN architectures.\n- 🐍 Extensible API: build your own GNN model and train it with ease.\n- ⚗️ Rich featurization: powerful and flexible built-in molecular featurization.\n- 🧠 Pretrained models: for fast and easy inference or transfer learning.\n- ⮔ Ready-to-use training loop based on [PyTorch Lightning](https://www.pytorchlightning.ai/).\n- 🔌 Have a new dataset? Graphium provides a simple plug-and-play interface. Change the path, the name of the columns to predict, the atomic featurization, and you’re ready to play!\n\n## Documentation\n\nVisit https://graphium-docs.datamol.io/.\n\n## Installation for developers\n\n### For CPU and GPU developers\n\nUse [`mamba`](https://github.com/mamba-org/mamba), a faster and better alternative to `conda`.\n\nIf you are using a GPU, we recommend enforcing the CUDA version that you need with `CONDA_OVERRIDE_CUDA=XX.X`.\n\n```bash\n# Install Graphium's dependencies in a new environment named `graphium`\nmamba env create -f env.yml -n graphium\n\n# To force the CUDA version to 11.2, or any other version you prefer, use the following command:\n# CONDA_OVERRIDE_CUDA=11.2 mamba env create -f env.yml -n graphium\n\n# Install Graphium in dev mode\nmamba activate graphium\npip install --no-deps -e .\n```\n\n### For IPU developers\n```bash\n# Install Graphcore's SDK and Graphium dependencies in a new environment called `.graphium_ipu`\n./install_ipu.sh .graphium_ipu\n```\n\nThe above step needs to be done once. 
After that, enable the SDK and the environment as follows:\n\n```bash\nsource enable_ipu.sh .graphium_ipu\n```\n\n## Training a model\n\nTo learn how to train a model, we invite you to look at the documentation, or the Jupyter notebooks available [here](https://github.com/datamol-io/graphium/tree/master/docs/tutorials/model_training).\n\nIf you are not familiar with [PyTorch](https://pytorch.org/docs) or [PyTorch-Lightning](https://pytorch-lightning.readthedocs.io/en/latest/), we highly recommend going through their tutorials first.\n\n## Running an experiment\nWe have set up Graphium with `hydra` for managing config files. To run an experiment, go to the `expts/` folder. For example, to benchmark a GCN on the ToyMix dataset run\n```bash\ngraphium-train architecture=toymix tasks=toymix training=toymix model=gcn\n```\nTo change parameters specific to this experiment, like switching from `fp16` to `fp32` precision, you can either override them directly in the CLI via\n```bash\ngraphium-train architecture=toymix tasks=toymix training=toymix model=gcn trainer.trainer.precision=32\n```\nor change them permanently in the dedicated experiment config under `expts/hydra-configs/toymix_gcn.yaml`.\nIntegrating `hydra` also allows you to quickly switch between accelerators. 
E.g., running\n```bash\ngraphium-train architecture=toymix tasks=toymix training=toymix model=gcn accelerator=gpu\n```\nautomatically selects the correct configs to run the experiment on GPU.\nFinally, you can also run a fine-tuning loop:\n```bash\ngraphium-train +finetuning=admet\n```\n\nTo use a config file you built from scratch, you can run\n```bash\ngraphium-train --config-path [PATH] --config-name [CONFIG]\n```\nThanks to the modular nature of `hydra`, you can reuse many of our config settings for your own experiments with Graphium.\n\n## Preparing the data in advance\nThe data preparation, including the featurization (e.g., converting molecules from SMILES to a PyG-compatible format), is embedded in the pipeline and will be performed when executing `graphium-train [...]`.\n\nHowever, when working with larger datasets, it is recommended to perform data preparation in advance using a machine with sufficient allocated memory (e.g., ~400GB in the case of `LargeMix`). Preparing data in advance is also beneficial when running lots of concurrent jobs with identical molecular featurization, so that resources aren't wasted and processes don't conflict reading/writing in the same directory.\n\nThe following commands first prepare the data and cache it, then use it to train a model.\n```bash\n# First prepare the data and cache it in `path_to_cached_data`\ngraphium data prepare ++datamodule.args.processed_graph_data_path=[path_to_cached_data]\n\n# Then train the model on the prepared data\ngraphium-train [...] datamodule.args.processed_graph_data_path=[path_to_cached_data]\n```\n\n**Note** that `datamodule.args.processed_graph_data_path` can also be specified in `expts/hydra-configs/`.\n\n**Note** that every time the configs of `datamodule.args.featurization` change, you will need to run a new data preparation, which will automatically be saved in a separate directory that uses a hash unique to the configs.\n\n## License\n\nUnder the Apache-2.0 license. 
See [LICENSE](LICENSE).\n\n## Diagrams\n\n- Diagram for data processing in Graphium.\n\n<img src=\"docs/images/datamodule.png\" alt=\"Data Processing Chart\" width=\"60%\" height=\"60%\">\n\n- Diagram for the multi-task network in Graphium.\n\n<img src=\"docs/images/full_graph_network.png\" alt=\"Full Graph Multi-task Network\" width=\"80%\" height=\"80%\">\n"
  },
  {
    "path": "cleanup_files.sh",
    "content": "#!/bin/bash\n# --------------------------------------------------------------------------------\n# Copyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n#\n# Use of this software is subject to the terms and conditions outlined in the LICENSE file.\n# Unauthorized modification, distribution, or use is prohibited. Provided 'as is' without\n# warranties of any kind.\n#\n# Valence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\n# Refer to the LICENSE file for the full terms and conditions.\n# --------------------------------------------------------------------------------\n\n# Delete saved files like wandb, checkpoints, profiling etc.\n# Usage: bash cleanup_files.sh\nrm -rf wandb/*\nrm -rf checkpoints/*\nrm -rf pro_vision/*\nrm -rf outputs/*\nrm -rf models_checkpoints/*\nrm -rf datacache/*\n"
  },
  {
    "path": "codecov.yml",
    "content": "coverage:\n  range: \"50...80\"\n  status:\n    project:\n      default:\n        threshold: 1%\n    patch: false\n\ncomment:\n  layout: \"header, diff, flags, components\" # show component info in the PR comment\n\ncomponent_management:\n  default_rules: # default rules that will be inherited by all components\n    statuses:\n      - type: project # in this case every component that doesn't have a status defined will have a project type one\n        target: auto\n        branches:\n          - \"!main\"\n  individual_components:\n    - component_id: ipu # this is an identifier that should not be changed\n      name: ipu # this is a display name, and can be changed freely\n      paths:\n        - graphium/ipu/**\n"
  },
  {
    "path": "docs/_assets/css/custom-graphium.css",
    "content": ":root {\n  --graphium-primary: #548235;\n  --graphium-secondary: #343a40;\n\n  /* Primary color shades */\n  --md-primary-fg-color: var(--graphium-primary);\n  --md-primary-fg-color--light: var(--graphium-primary);\n  --md-primary-fg-color--dark: var(--graphium-primary);\n  --md-primary-bg-color: var(--graphium-secondary);\n  --md-primary-bg-color--light: var(--graphium-secondary);\n  --md-text-link-color: var(--graphium-secondary);\n\n  /* Accent color shades */\n  --md-accent-fg-color: var(--graphium-secondary);\n  --md-accent-fg-color--transparent: var(--graphium-secondary);\n  --md-accent-bg-color: var(--graphium-secondary);\n  --md-accent-bg-color--light: var(--graphium-secondary);\n}\n\n:root > * {\n  /* Code block color shades */\n  --md-code-bg-color: hsla(0, 0%, 96%, 1);\n  --md-code-fg-color: hsla(200, 18%, 26%, 1);\n\n  /* Footer */\n  --md-footer-bg-color: var(--graphium-primary);\n  /* --md-footer-bg-color--dark: hsla(0, 0%, 0%, 0.32); */\n  --md-footer-fg-color: var(--graphium-secondary);\n  --md-footer-fg-color--light: var(--graphium-secondary);\n  --md-footer-fg-color--lighter: var(--graphium-secondary);\n}\n\n.md-header {\n  background-image: linear-gradient(to right, #548235, #8DB771);\n}\n\n.md-footer {\n  background-image: linear-gradient(to right, #548235, #8DB771);\n}\n\n.md-tabs {\n  background-image: linear-gradient(to right, #f4f6f9, #e2cec3);\n}\n\n.md-header__topic {\n  color: #fff;\n}\n\n.md-source__repository,\n.md-source__icon,\n.md-search__input,\n.md-search__input::placeholder,\n.md-search__input ~ .md-search__icon,\n.md-footer__inner.md-grid,\n.md-copyright__highlight,\n.md-copyright,\n.md-footer-meta.md-typeset a,\n.md-version {\n  color: #fff !important;\n}\n\n.md-search__form {\n  background-color: rgba(255, 255, 255, 0.2);\n}\n\n.md-search__input {\n  color: #222 !important;\n}\n\n.md-header__topic {\n  color: #fff;\n  font-size: 1.4em;\n}\n\n/* Increase the size of the logo */\n.md-header__button.md-logo 
img,\n.md-header__button.md-logo svg {\n  height: 2rem !important;\n}\n\n/* Reduce the margin around the logo */\n.md-header__button.md-logo {\n  margin: 0.4em;\n  padding: 0.4em;\n}\n\n/* Remove the `In` and `Out` block in rendered Jupyter notebooks */\n.md-container .jp-Cell-outputWrapper .jp-OutputPrompt.jp-OutputArea-prompt,\n.md-container .jp-Cell-inputWrapper .jp-InputPrompt.jp-InputArea-prompt {\n  display: none !important;\n}\n"
  },
  {
    "path": "docs/_assets/css/custom.css",
    "content": "/* Indentation. */\ndiv.doc-contents:not(.first) {\n  padding-left: 25px;\n  border-left: 4px solid #e6e6e6;\n  margin-bottom: 80px;\n}\n\n/* Don't capitalize names. */\nh5.doc-heading {\n  text-transform: none !important;\n}\n\n/* Don't use vertical space on hidden ToC entries. */\n.hidden-toc::before {\n  margin-top: 0 !important;\n  padding-top: 0 !important;\n}\n\n/* Don't show permalink of hidden ToC entries. */\n.hidden-toc a.headerlink {\n  display: none;\n}\n\n/* Avoid breaking parameters name, etc. in table cells. */\ntd code {\n  word-break: normal !important;\n}\n\n/* For pieces of Markdown rendered in table cells. */\ntd p {\n  margin-top: 0 !important;\n  margin-bottom: 0 !important;\n}\n"
  },
  {
    "path": "docs/_assets/js/google-analytics.js",
    "content": "var gtag_id = \"G-K76VNHBRM4\";\n\nvar script = document.createElement(\"script\");\nscript.src = \"https://www.googletagmanager.com/gtag/js?id=\" + gtag_id;\ndocument.head.appendChild(script);\n\nwindow.dataLayer = window.dataLayer || [];\nfunction gtag() {\n  dataLayer.push(arguments);\n}\ngtag(\"js\", new Date());\ngtag(\"config\", gtag_id);\n"
  },
  {
    "path": "docs/api/graphium.config.md",
    "content": "graphium.config\n====================\nhelper functions to load and convert config files\n\n::: graphium.config._loader\n::: graphium.config.config_convert\n::: graphium.config._load\n"
  },
  {
    "path": "docs/api/graphium.data.md",
    "content": "graphium.data\n====================\nmodule for loading datasets and collating batches\n\n=== \"Contents\"\n\n    * [Data Module](#data-module)\n    * [Collate Module](#collate-module)\n    * [Util Functions](#util-functions)\n\n## Data Module\n------------\n::: graphium.data.datamodule\n\n\n## Collate Module\n------------\n::: graphium.data.collate\n\n\n## Util Functions\n------------\n::: graphium.data.utils\n"
  },
  {
    "path": "docs/api/graphium.features.md",
    "content": "graphium.features\n====================\nFeature extraction and manipulation\n\n=== \"Contents\"\n\n    * [Featurizer](#featurizer)\n    * [Positional Encoding](#positional-encoding)\n    * [Properties](#properties)\n    * [Spectral PE](#spectral-pe)\n    * [Random Walk PE](#random-walk-pe)\n    * [NMP](#nmp)\n\n## Featurizer\n------------\n::: graphium.features.featurizer\n\n\n## Positional Encoding\n------------\n::: graphium.features.positional_encoding\n\n\n## Properties\n------------\n::: graphium.features.properties\n\n\n## Spectral PE\n------------\n::: graphium.features.spectral\n\n\n## Random Walk PE\n------------\n::: graphium.features.rw\n\n\n## NMP\n------------\n::: graphium.features.nmp\n"
  },
  {
    "path": "docs/api/graphium.finetuning.md",
    "content": "graphium.finetuning\n====================\nModule for finetuning models and doing linear probing (fingerprinting).\n\n::: graphium.finetuning.finetuning.GraphFinetuning\n\n::: graphium.finetuning.finetuning_architecture.FullGraphFinetuningNetwork\n\n::: graphium.finetuning.finetuning_architecture.PretrainedModel\n\n::: graphium.finetuning.finetuning_architecture.FinetuningHead\n\n::: graphium.finetuning.fingerprinting.Fingerprinter\n"
  },
  {
    "path": "docs/api/graphium.ipu.md",
    "content": "graphium.ipu\n====================\nCode for adapting to run on IPU\n\n=== \"Contents\"\n\n    * [IPU Dataloader](#ipu-dataloader)\n    * [IPU Losses](#ipu-losses)\n    * [IPU Metrics](#ipu-metrics)\n    * [IPU Simple Lightning](#ipu-simple-lightning)\n    * [IPU Utils](#ipu-utils)\n    * [IPU Wrapper](#ipu-wrapper)\n    * [To Dense Batch](#to-dense-batch)\n\n## IPU Dataloader\n------------\n::: graphium.ipu.ipu_dataloader\n\n\n## IPU Losses\n------------\n::: graphium.ipu.ipu_losses\n\n\n## IPU Metrics\n------------\n::: graphium.ipu.ipu_metrics\n\n\n## IPU Simple Lightning\n------------\n::: graphium.ipu.ipu_simple_lightning\n\n\n## IPU Utils\n------------\n::: graphium.ipu.ipu_utils\n\n\n## IPU Wrapper\n------------\n::: graphium.ipu.ipu_wrapper\n\n\n## To Dense Batch\n------------\n::: graphium.ipu.to_dense_batch\n\n"
  },
  {
    "path": "docs/api/graphium.nn/architectures.md",
    "content": "graphium.nn.architectures\n====================\n\nHigh level architectures in the library\n\n=== \"Contents\"\n\n    * [Global Architectures](#global-architectures)\n    * [PyG Architectures](#pyg-architectures)\n    * [Encoder Manager](#encoder-manager)\n\n\n## Global Architectures\n------------\n::: graphium.nn.architectures.global_architectures\n\n\n## PyG Architectures\n------------\n::: graphium.nn.architectures.pyg_architectures\n\n\n## Encoder Manager\n------------\n::: graphium.nn.architectures.encoder_manager\n"
  },
  {
    "path": "docs/api/graphium.nn/encoders.md",
    "content": "graphium.nn.encoders\n====================\n\nImplementations of positional encoders in the library\n\n=== \"Contents\"\n\n    * [Base Encoder](#base-encoder)\n    * [Gaussian Kernel Positional Encoder](#gaussian-kernel-positional-encoder)\n    * [Laplacian Positional Encoder](#laplacian-positional-encoder)\n    * [MLP Encoder](#mlp-encoder)\n    * [Signnet Positional Encoder](#signnet-positional-encoder)\n\n\n## Base Encoder\n------------\n::: graphium.nn.encoders.base_encoder\n\n\n## Gaussian Kernel Positional Encoder\n------------\n::: graphium.nn.encoders.gaussian_kernel_pos_encoder\n\n\n## Laplacian Positional Encoder\n------------\n::: graphium.nn.encoders.laplace_pos_encoder\n\n\n## MLP Encoder\n------------\n::: graphium.nn.encoders.mlp_encoder\n\n\n## Signnet Positional Encoder\n------------\n::: graphium.nn.encoders.signnet_pos_encoder\n"
  },
  {
    "path": "docs/api/graphium.nn/graphium.nn.md",
    "content": "graphium.nn\n====================\n\nBase structures for neural networks in the library (to be inherited by other modules)\n\n=== \"Contents\"\n\n    * [Base Graph Layer](#base-graph-layer)\n    * [Base Layers](#base-layers)\n    * [Residual Connections](#residual-connections)\n\n## Base Graph Layer\n------------\n::: graphium.nn.base_graph_layer\n\n\n## Base Layers\n------------\n::: graphium.nn.base_layers\n\n\n## Residual Connections\n------------\n::: graphium.nn.residual_connections\n\n"
  },
  {
    "path": "docs/api/graphium.nn/pyg_layers.md",
    "content": "graphium.nn.pyg_layers\n====================\nImplementations of GNN layers based on PyG in the library\n\n=== \"Contents\"\n\n    * [Gated GCN Layer](#gated-gcn-layer)\n    * [GPS Layer](#gps-layer)\n    * [GIN and GINE Layers](#gin-and-gine-layers)\n    * [MPNN Layer](#mpnn-layer)\n    * [PNA Layer](#pna-layer)\n    * [Pooling Layers](#pooling-layers)\n    * [Utils](#utils)\n\n## Gated GCN Layer\n------------\n::: graphium.nn.pyg_layers.gated_gcn_pyg\n\n\n## GPS Layer\n------------\n::: graphium.nn.pyg_layers.gps_pyg\n\n\n## GIN and GINE Layers\n------------\n::: graphium.nn.pyg_layers.gin_pyg\n\n\n## MPNN Layer\n------------\n::: graphium.nn.pyg_layers.mpnn_pyg\n\n\n## PNA Layer\n------------\n::: graphium.nn.pyg_layers.pna_pyg\n\n\n## Pooling Layers\n------------\n::: graphium.nn.pyg_layers.pooling_pyg\n\n\n## Utils\n------------\n::: graphium.nn.pyg_layers.utils\n"
  },
  {
    "path": "docs/api/graphium.trainer.md",
    "content": "graphium.trainer\n====================\n\nCode for training models\n\n=== \"Contents\"\n    * [Predictor](#predictor)\n    * [Metrics](#metrics)\n    * [Predictor Summaries](#predictor-summaries)\n    * [Predictor Options](#predictor-options)\n\n\n## Predictor\n------------\n::: graphium.trainer.predictor\n\n\n## Metrics\n------------\n::: graphium.trainer.metrics\n\n\n## Predictor Summaries\n------------\n::: graphium.trainer.predictor_summaries\n\n\n## Predictor Options\n------------\n::: graphium.trainer.predictor_options\n\n\n"
  },
  {
    "path": "docs/api/graphium.utils.md",
    "content": "graphium.utils\n====================\nmodule for utility functions\n\n=== \"Contents\"\n    * [Argument Checker](#argument-checker)\n    * [Decorators](#decorators)\n    * [File System](#file-system)\n    * [Hashing](#hashing)\n    * [Moving Average Tracker](#moving-average-tracker)\n    * [MUP](#mup)\n    * [Read File](#read-file)\n    * [Safe Run](#safe-run)\n    * [Spaces](#spaces)\n    * [Tensor](#tensor)\n\n\n## Argument Checker\n----------------\n::: graphium.utils.arg_checker\n\n\n## Decorators\n----------------\n::: graphium.utils.decorators\n\n\n## File System\n----------------\n::: graphium.utils.fs\n\n\n## Hashing\n----------------\n::: graphium.utils.hashing\n\n\n## Moving Average Tracker\n----------------\n::: graphium.utils.moving_average_tracker\n\n\n## MUP\n----------------\n::: graphium.utils.mup\n\n\n## Read File\n----------------\n::: graphium.utils.read_file\n\n## Safe Run\n----------------\n::: graphium.utils.safe_run\n\n\n## Spaces\n----------------\n::: graphium.utils.spaces\n\n\n## Tensor\n----------------\n::: graphium.utils.tensor\n"
  },
  {
    "path": "docs/baseline.md",
    "content": "# ToyMix Baseline - Test set metrics\n\nFrom the paper to be released soon. Below, you can see the baselines for the `ToyMix` dataset, a multitasking dataset comprising `QM9`, `Zinc12k`, and `Tox21`. The datasets and their splits are available on [this link](https://zenodo.org/record/7998401). The following baselines are all for models with ~150k parameters.\n\nOne can observe that the smaller datasets (`Zinc12k` and `Tox21`) benefit from adding another unrelated task (`QM9`), where the labels are computed from DFT simulations.\n\n**NEW baselines added 2023/09/18**: Multitask baselines have been added for GatedGCN and MPNN++ (sum aggregator) using 3 random seeds. They achieve the best performance by a significant margin on Zinc12k and Tox21, while sacrificing a little on QM9.\n\n| Dataset   | Model | MAE ↓     | Pearson ↑ | R² ↑     | MAE ↓   | Pearson ↑ | R² ↑   |\n|-----------|-------|-----------|-----------|-----------|---------|-----------|---------|\n|    | <th colspan=\"3\" style=\"text-align: center;\">Single-Task Model</th>  <th colspan=\"3\" style=\"text-align: center;\">Multi-Task Model</th>   |\n|\n| **QM9**   | GCN   | 0.102 ± 0.0003 | 0.958 ± 0.0007 | 0.920 ± 0.002 | 0.119 ± 0.01 | 0.955 ± 0.001 | 0.915 ± 0.001 |\n|           | GIN   | 0.0976 ± 0.0006 | **0.959 ± 0.0002** | **0.922 ± 0.0004** | 0.117 ± 0.01 | 0.950 ± 0.002 | 0.908 ± 0.003 |\n|           | GINE  | **0.0959 ± 0.0002** | 0.955 ± 0.002 | 0.918 ± 0.004 | 0.102 ± 0.01 | 0.956 ± 0.0009 | 0.918 ± 0.002 |\n|       |   GatedGCN    |       |       |       | 0.1212 ± 0.0009 | 0.9457 ± 0.0002 | 0.8964 ± 0.0006 |\n|       |   MPNN++ (sum)    |       |       |    | 0.1174 ± 0.0012 | 0.9460 ± 0.0005 | 0.8989 ± 0.0008 |\n| **Zinc12k** | GCN   | 0.348 ± 0.02 | 0.941 ± 0.002 | 0.863 ± 0.01 | 0.226 ± 0.004 | 0.973 ± 0.0005 | 0.940 ± 0.003 |\n|           | GIN   | 0.303 ± 0.007 | 0.950 ± 0.003 | 0.889 ± 0.003 | 0.189 ± 0.004 | 0.978 ± 0.006 | 0.953 ± 0.002 |\n|           | GINE  | 
0.266 ± 0.02 | 0.961 ± 0.003 | 0.915 ± 0.01 | 0.147 ± 0.009 | 0.987 ± 0.001 | 0.971 ± 0.003 |\n|       | GatedGCN       |       |       |       | 0.1282 ± 0.0045 | 0.9850 ± 0.0006 | 0.9639 ± 0.0024 |\n|       | MPNN++ (sum)   |       |       |       | **0.1002 ± 0.0025** | **0.9909 ± 0.0004** | **0.9777 ± 0.0014** |\n\n|           |       | BCE ↓     | AUROC ↑ | AP ↑     | BCE ↓   | AUROC ↑ | AP ↑   |\n|-----------|-------|-----------|-----------|-----------|---------|-----------|---------|\n|    | <th colspan=\"3\" style=\"text-align: center;\">Single-Task Model</th>  <th colspan=\"3\" style=\"text-align: center;\">Multi-Task Model</th>   |\n|\n| **Tox21**   | GCN   | 0.202 ± 0.005 | 0.773 ± 0.006 | 0.334 ± 0.03 | 0.176 ± 0.001 | 0.850 ± 0.006 | 0.446 ± 0.01 |\n|           | GIN   | 0.200 ± 0.002 | 0.789 ± 0.009 | 0.350 ± 0.01 | 0.176 ± 0.001 | 0.841 ± 0.005 | 0.454 ± 0.009 |\n|           | GINE  | 0.201 ± 0.007 | 0.783 ± 0.007 | 0.345 ± 0.02 | 0.177 ± 0.0008 | 0.836 ± 0.004 | 0.455 ± 0.008 |\n|       | GatedGCN       |       |       |       | 0.1733 ± 0.0015 | 0.8522 ± 0.0022 | **0.4620 ± 0.0118** |\n|       | MPNN++ (sum)   |       |       |       | **0.1725 ± 0.0012** | **0.8569 ± 0.0005** | 0.4598 ± 0.0044 |\n\n\n# LargeMix Baseline\n## LargeMix test set metrics\n\nFrom the paper to be released soon. Below, you can see the baselines for the `LargeMix` dataset, a multitasking dataset comprising `PCQM4M_N4`, `PCQM4M_G25`, `PCBA_1328`, `L1000_VCAP`, and `L1000_MCF7`. The datasets and their splits are available on [this link](https://zenodo.org/record/7998401). The following baselines are all for models with 4-6M parameters.\n\nOne can observe that the smaller datasets (`L1000_VCAP` and `L1000_MCF7`) benefit tremendously from the multitasking. 
Indeed, the lack of molecular samples means that it is very easy for a model to overfit.\n\nWhile `PCQM4M_G25` has no noticeable changes, the node predictions of `PCQM4M_N4` and assay predictions of `PCBA_1328` take a hit, but it is most likely due to underfitting since the training loss is also increased. It seems that 4-6M parameters are far from sufficient to capture all of the tasks simultaneously, which motivates the need for a larger model.\n\n| Dataset   | Model | MAE ↓     | Pearson ↑ | R² ↑     | MAE ↓   | Pearson ↑ | R² ↑   |\n|-----------|-------|-----------|-----------|-----------|---------|-----------|---------|\n|    | <th colspan=\"3\" style=\"text-align: center;\">Single-Task Model</th>  <th colspan=\"3\" style=\"text-align: center;\">Multi-Task Model</th>   |\n|\n| **Pcqm4m_g25** | GCN | 0.2362 ± 0.0003 | 0.8781 ± 0.0005 | 0.7803 ± 0.0006 | 0.2458 ± 0.0007 | 0.8701 ± 0.0002 | 0.8189 ± 0.0004 |\n|               | GIN | 0.2270 ± 0.0003 | 0.8854 ± 0.0004 | 0.7912 ± 0.0006 | 0.2352 ± 0.0006 | 0.8802 ± 0.0007 | 0.7827 ± 0.0005 |\n|               | GINE| **0.2223 ± 0.0007** | **0.8874 ± 0.0003** | **0.7949 ± 0.0001** | 0.2315 ± 0.0002 | 0.8823 ± 0.0002 | 0.7864 ± 0.0008 |\n| **Pcqm4m_n4** | GCN | 0.2080 ± 0.0003 | 0.5497 ± 0.0010 | 0.2942 ± 0.0007 | 0.2040 ± 0.0001 | 0.4796 ± 0.0006 | 0.2185 ± 0.0002 |\n|               | GIN | 0.1912 ± 0.0027 | **0.6138 ± 0.0088** | **0.3688 ± 0.0116** | 0.1966 ± 0.0003 | 0.5198 ± 0.0008 | 0.2602 ± 0.0012 |\n|               | GINE| **0.1910 ± 0.0001** | 0.6127 ± 0.0003 | 0.3666 ± 0.0008 | 0.1941 ± 0.0003 | 0.5303 ± 0.0023 | 0.2701 ± 0.0034 |\n\n\n|           |       | BCE ↓     | AUROC ↑ | AP ↑     | BCE ↓   | AUROC ↑ | AP ↑   |\n|-----------|-------|-----------|-----------|-----------|---------|-----------|---------|\n|    | <th colspan=\"3\" style=\"text-align: center;\">Single-Task Model</th>  <th colspan=\"3\" style=\"text-align: center;\">Multi-Task Model</th>   |\n| <hi> | <hi> | <hi> | <hi> | <hi> | <hi> | <hi> | 
<hi> |\n| **Pcba\\_1328**    | GCN      | **0.0316 ± 0.0000** | **0.7960 ± 0.0020** | **0.3368 ± 0.0027** | 0.0349 ± 0.0002 | 0.7661 ± 0.0031 | 0.2527 ± 0.0041 |\n|               | GIN      | 0.0324 ± 0.0000 | 0.7941 ± 0.0018 | 0.3328 ± 0.0019 | 0.0342 ± 0.0001 | 0.7747 ± 0.0025 | 0.2650 ± 0.0020 |\n|               | GINE      | 0.0320 ± 0.0001 | 0.7944 ± 0.0023 | 0.3337 ± 0.0027 | 0.0341 ± 0.0001 | 0.7737 ± 0.0007 | 0.2611 ± 0.0043 |\n| **L1000\\_vcap**   | GCN      | 0.1900 ± 0.0002 | 0.5788 ± 0.0034 | 0.3708 ± 0.0007 | 0.1872 ± 0.0020 | 0.6362 ± 0.0012 | 0.4022 ± 0.0008 |\n|               | GIN      | 0.1909 ± 0.0005 | 0.5734 ± 0.0029 | 0.3731 ± 0.0014 | 0.1870 ± 0.0010 | 0.6351 ± 0.0014 | 0.4062 ± 0.0001 |\n|               | GINE      | 0.1907 ± 0.0006 | 0.5708 ± 0.0079 | 0.3705 ± 0.0015 | **0.1862 ± 0.0007** | **0.6398 ± 0.0043** | **0.4068 ± 0.0023** |\n| **L1000\\_mcf7**   | GCN      | 0.1869 ± 0.0003 | 0.6123 ± 0.0051 | 0.3866 ± 0.0010 | 0.1863 ± 0.0011 | **0.6401 ± 0.0021** | 0.4194 ± 0.0004 |\n|               | GIN      | 0.1862 ± 0.0003 | 0.6202 ± 0.0091 | 0.3876 ± 0.0017 | 0.1874 ± 0.0013 | 0.6367 ± 0.0066 | **0.4198 ± 0.0036** |\n|               | GINE      | **0.1856 ± 0.0005** | 0.6166 ± 0.0017 | 0.3892 ± 0.0035 | 0.1873 ± 0.0009 | 0.6347 ± 0.0048 | 0.4177 ± 0.0024 |\n\n## LargeMix training set loss\n\nBelow is the loss on the training set. One can observe that the multi-task model always underfits the single-task model, except on the two `L1000` datasets.\n\nThis is not surprising, as the other datasets contain two orders of magnitude more datapoints and pose a significant challenge for the relatively small models used in this analysis. 
This favors the Single dataset setup (which uses a model of the same size) and we conjecture that larger models will bridge this gap moving forward.\n\n|            |       | CE or MSE loss in single-task $\\downarrow$ | CE or MSE loss in multi-task $\\downarrow$ |\n|------------|-------|-----------------------------------------|-----------------------------------------|\n|\n| **Pcqm4m\\_g25**    | GCN   | **0.2660 ± 0.0005** | 0.2767 ± 0.0015 |\n|             | GIN   | **0.2439 ± 0.0004** | 0.2595 ± 0.0016 |\n|             | GINE  | **0.2424 ± 0.0007** | 0.2568 ± 0.0012 |\n|\n| **Pcqm4m\\_n4**    | GCN   | **0.2515 ± 0.0002** | 0.2613 ± 0.0008 |\n|             | GIN   | **0.2317 ± 0.0003** | 0.2512 ± 0.0008 |\n|             | GINE  | **0.2272 ± 0.0001** | 0.2483 ± 0.0004 |\n|\n| **Pcba\\_1328**    | GCN   | **0.0284 ± 0.0010** | 0.0382 ± 0.0005 |\n|             | GIN   | **0.0249 ± 0.0017** | 0.0359 ± 0.0011 |\n|             | GINE  | **0.0258 ± 0.0017** | 0.0361 ± 0.0008 |\n|\n| **L1000\\_vcap**   | GCN   | 0.1906 ± 0.0036 | **0.1854 ± 0.0148** |\n|             | GIN   | 0.1854 ± 0.0030 | **0.1833 ± 0.0185** |\n|             | GINE  | **0.1860 ± 0.0025** | 0.1887 ± 0.0200 |\n|\n| **L1000\\_mcf7**   | GCN   | 0.1902 ± 0.0038 | **0.1829 ± 0.0095** |\n|             | GIN   | 0.1873 ± 0.0033 | **0.1701 ± 0.0142** |\n|             | GINE  | 0.1883 ± 0.0039 | **0.1771 ± 0.0010** |\n\n## NEW: LargeMix improved sweep - 2023/08/18\n\nUnsatisfied with the prior results, we ran a Bayesian search over a broader set of parameters, including only more expressive models, namely GINE, GatedGCN and MPNN++. We further increased the number of parameters to 10M due to evidence of underfitting. We evaluated only the multitask setting.\n\nWe observe a significant improvement over all tasks, with a very notable R² increase of +0.53 (0.27 -> 0.80) compared to the best node-level property prediction on PCQM4M_N4.\n\nThe results are reported below over 1 seed. 
We are currently running more seeds of the same models.\n\n| Dataset       | Model          | MAE ↓     | Pearson ↑ | R² ↑     |\n|---------------|----------------|--------|---------|--------|\n| **PCQM4M_G25**    | GINE           | 0.2250 | 0.8840  | 0.7911 |\n|               | GatedGCN       | 0.2457 | 0.8698  | 0.7688 |\n|               | MPNN++ (sum)   | 0.2269 | 0.8802  | 0.7855 |\n|\n| **PCQM4M_N4**     | GINE           | 0.2699 | 0.8475  | 0.7182 |\n|               | GatedGCN       | 0.3337 | 0.8102  | 0.6566 |\n|               | MPNN++ (sum)   | 0.2114 | 0.8942  | 0.8000 |\n\n| Dataset       | Model          | BCE ↓     | AUROC ↑ | AP ↑     |\n|---------------|----------------|--------|---------|--------|\n| **PCBA_1328**     | GINE           | 0.0334 | 0.7879  | 0.2808 |\n|               | GatedGCN       | 0.0351 | 0.7788  | 0.2611 |\n|               | MPNN++ (sum)   | 0.0344 | 0.7815  | 0.2666 |\n|\n| **L1000_VCAP**    | GINE           | 0.1907 | 0.6416  | 0.4042 |\n|               | GatedGCN       | 0.1866 | 0.6395  | 0.4092 |\n|               | MPNN++ (sum)   | 0.1867 | 0.6478  | 0.4131 |\n|\n| **L1000_MCF7**    | GINE           | 0.1931 | 0.6352  | 0.4235 |\n|               | GatedGCN       | 0.1859 | 0.6547  | 0.4224 |\n|               | MPNN++ (sum)   | 0.1870 | 0.6593  | 0.4254 |\n\n\n\n# UltraLarge Baseline\n\n## UltraLarge test set metrics\n\nFor `UltraLarge`, we provide results for the same GNN baselines as for\n`LargeMix`. Each model is trained for 50 epochs and results are averaged over 3 seeds. The remaining\nsetup is the same as for `ToyMix` (Section E.1), reporting metrics on the Single Dataset and Multi Dataset using the same performance metrics. We further use the same models (in terms of size) as used for `LargeMix`.\n\nFor now, we report only the results for a subset representing 5% of the total dataset due to computational constraints, but aim to provide the full results soon.\n\nResults discussion. 
`UltraLarge` results can be found in Table 6. Interestingly, on both graph- and node-level tasks we observe that there is no advantage of multi-tasking in terms of performance. We\nexpect that for this ultra-large dataset, significantly larger models are needed to successfully leverage the multi-task setup. This could be attributed to underfitting, as already demonstrated for `LargeMix`. Nonetheless, our baselines set the stage for large-scale pre-training on `UltraLarge`.\n\nThe results presented used approximately 500 GPU hours of compute, with\nmore compute used for development and hyperparameter search.\n\nWe further note that the graph-level task results are very strong. Regarding the node-level tasks, they are expected to underperform in the low-parameter regime, due to clear signs of underfitting, a very large number of labels to learn, and susceptibility to over-smoothing from traditional GNNs.\n\n\n| Dataset          | Model | MAE ↓             | Pearson ↑         | R² ↑          | MAE ↓             | Pearson ↑         | R² ↑              |\n|------------------|-------|-------------------|-------------------|-------------------|-------------------|-------------------|-------------------|\n|    | <th colspan=\"3\" style=\"text-align: center;\">Single-Task Model</th>  <th colspan=\"3\" style=\"text-align: center;\">Multi-Task Model</th>   |\n| <hi> | <hi> | <hi> | <hi> | <hi> | <hi> | <hi> | <hi> |\n| **Pm6_83m_g62**      | GCN   | .2606 ± .0011     | .9004 ± .0003     | .7997 ± .0009       | .2625 ± .0011     | .8896 ± .0001     | .7982 ± .0001     |\n|                  | GIN   | .2546 ± .0021     | .9051 ± .0019     | .8064 ± .0037       | .2562 ± .0000     | .8901 ± .0000     | .806 ± .0000      |\n|                  | GINE  | **.2538 ± .0006** | **.9059 ± .0010** | **.8082 ± .0015**   | .258 ± .0011      | .904 ± .0000      | .8048 ± .0001     |\n|\n| **Pm6_83m_n7**       | GCN   | .5803 ± .0001     | .3372 ± .0004     | .1191 ± .0002      | .5971 ± .0002  
   | .3164 ± .0001     | .1019 ± .0011     |\n|                  | GIN   | .573 ± .0002      | .3478 ± .0001     | **.1269 ± .0002**  | .5831 ± .0001     | .3315 ± .0005     | .1141 ± .0000     |\n|                  | GINE  | **.572 ± .0004**  | **.3487 ± .0002** | .1266 ± .0001      | .5839 ± .0004     | .3294 ± .0002     | .1104 ± .0000     |\n\n## UltraLarge training set loss\n\nIn the table below, we observe that the multi-task model only slightly underfits relative to the single-task model, indicating that parameters can be efficiently shared between the node-level and graph-level tasks. We further note that the training loss and the test MAE are almost equal for all tasks, suggesting further benefits as we scale both the model and the data.\n\n|                  |       | **MAE loss in single-task ↓** | **MAE loss in multi-task ↓** |\n|------------------|-------|---------------------------|--------------------------|\n|\n| Pm6_83m_g62      | GCN   | **.2679 ± .0020**        | .2713 ± .0017           |\n|                  | GIN   | **.2582 ± .0018**        | .2636 ± .0014           |\n|                  | GINE  | **.2567 ± .0036**        | .2603 ± .0021           |\n|\n| Pm6_83m_n7       | GCN   | **.5818 ± .0021**        | .5955 ± .0023           |\n|                  | GIN   | **.5707 ± .0019**        | .5851 ± .0038           |\n|                  | GINE  | **.5724 ± .0015**        | .5832 ± .0027           |\n\n"
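For reference, the three metrics reported in the tables above can be computed as in the following sketch (a plain-Python illustration of the standard definitions, not Graphium's actual implementation):

```python
import math

def mae(y_true, y_pred):
    # Mean absolute error
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def pearson(y_true, y_pred):
    # Pearson correlation coefficient between targets and predictions
    n = len(y_true)
    mt, mp = sum(y_true) / n, sum(y_pred) / n
    cov = sum((t - mt) * (p - mp) for t, p in zip(y_true, y_pred))
    st = math.sqrt(sum((t - mt) ** 2 for t in y_true))
    sp = math.sqrt(sum((p - mp) ** 2 for p in y_pred))
    return cov / (st * sp)

def r_squared(y_true, y_pred):
    # Coefficient of determination (R²)
    mt = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mt) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot
```

Note that lower is better for MAE, while higher is better for Pearson and R², matching the arrows in the table headers.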
  },
  {
    "path": "docs/cli/graphium-train.md",
    "content": "# `graphium-train`\n\nTo support advanced configuration, Graphium uses [`hydra`](https://hydra.cc/) to manage and write config files. A limitation of `hydra` is that it is designed to function as the main entrypoint for a CLI application and does not easily support subcommands. For that reason, we introduced the `graphium-train` command in addition to the [`graphium`](./graphium.md) command.\n\n!!! info \"Curious about the configs?\"\n    If you would like to learn more about the configs, please visit the docs [here](https://github.com/datamol-io/graphium/tree/main/expts/hydra-configs).\n\nThis page documents `graphium-train`.\n\n## Running an experiment\nWe have set up Graphium with `hydra` for managing config files. To run an experiment, go to the `expts/` folder. For example, to benchmark a GCN on the ToyMix dataset, run\n```bash\ngraphium-train architecture=toymix tasks=toymix training=toymix model=gcn\n```\nTo change parameters specific to this experiment, like switching from `fp16` to `fp32` precision, you can either override them directly in the CLI via\n```bash\ngraphium-train architecture=toymix tasks=toymix training=toymix model=gcn trainer.trainer.precision=32\n```\nor change them permanently in the dedicated experiment config under `expts/hydra-configs/toymix_gcn.yaml`.\nIntegrating `hydra` also allows you to quickly switch between accelerators. 
E.g., running\n```bash\ngraphium-train architecture=toymix tasks=toymix training=toymix model=gcn accelerator=gpu\n```\nautomatically selects the correct configs to run the experiment on GPU.\nFinally, you can also run a fine-tuning loop:\n```bash\ngraphium-train +finetuning=admet\n```\n\nTo use a config file you built from scratch, you can run\n```bash\ngraphium-train --config-path [PATH] --config-name [CONFIG]\n```\nThanks to the modular nature of `hydra`, you can reuse many of our config settings for your own experiments with Graphium.\n\n### Preparing the data in advance\nData preparation, including featurization (e.g., converting molecules from SMILES to a pyg-compatible format), is embedded in the pipeline and will be performed when executing `graphium-train [...]`.\n\nHowever, when working with larger datasets, it is recommended to perform data preparation in advance on a machine with sufficient allocated memory (e.g., ~400GB in the case of `LargeMix`). Preparing data in advance is also beneficial when running many concurrent jobs with identical molecular featurization, so that resources aren't wasted and processes don't conflict when reading/writing in the same directory.\n\nThe following commands first prepare the data and cache it, then use it to train a model.\n```bash\n# First prepare the data and cache it in `path_to_cached_data`\ngraphium data prepare ++datamodule.args.processed_graph_data_path=[path_to_cached_data]\n\n# Then train the model on the prepared data\ngraphium-train [...] datamodule.args.processed_graph_data_path=[path_to_cached_data]\n```\n\n??? note \"Config vs. Override\"\n    As with any configuration, note that `datamodule.args.processed_graph_data_path` can also be specified in the configs at `expts/hydra_configs/`.\n\n??? 
note \"Featurization\"\n    Every time the configs of `datamodule.args.featurization` change, you will need to run a new data preparation, which will automatically be saved in a separate directory that uses a hash unique to the configs."
  },
  {
    "path": "docs/cli/graphium.md",
    "content": "# `graphium`\n\n**Usage**:\n\n```console\n$ graphium [OPTIONS] COMMAND [ARGS]...\n```\n\n**Options**:\n\n* `--help`: Show this message and exit.\n\n**Commands**:\n\n* `data`: Graphium datasets.\n* `finetune`: Utility CLI for extra fine-tuning utilities.\n\n## `graphium data`\n\nGraphium datasets.\n\n**Usage**:\n\n```console\n$ graphium data [OPTIONS] COMMAND [ARGS]...\n```\n\n**Options**:\n\n* `--help`: Show this message and exit.\n\n**Commands**:\n\n* `download`: Download a Graphium dataset.\n* `list`: List available Graphium datasets.\n* `prepare`: Prepare a Graphium dataset.\n\n### `graphium data download`\n\nDownload a Graphium dataset.\n\n**Usage**:\n\n```console\n$ graphium data download [OPTIONS] NAME OUTPUT\n```\n\n**Arguments**:\n\n* `NAME`: [required]\n* `OUTPUT`: [required]\n\n**Options**:\n\n* `--progress / --no-progress`: [default: progress]\n* `--help`: Show this message and exit.\n\n### `graphium data list`\n\nList available Graphium datasets.\n\n**Usage**:\n\n```console\n$ graphium data list [OPTIONS]\n```\n\n**Options**:\n\n* `--help`: Show this message and exit.\n\n### `graphium data prepare`\n\nPrepare a Graphium dataset.\n\n**Usage**:\n\n```console\n$ graphium data prepare [OPTIONS] OVERRIDES...\n```\n\n**Arguments**:\n\n* `OVERRIDES...`: [required]\n\n**Options**:\n\n* `--help`: Show this message and exit.\n\n## `graphium finetune`\n\nUtility CLI for extra fine-tuning utilities.\n\n**Usage**:\n\n```console\n$ graphium finetune [OPTIONS] COMMAND [ARGS]...\n```\n\n**Options**:\n\n* `--help`: Show this message and exit.\n\n**Commands**:\n\n* `admet`: Utility CLI to easily fine-tune a model on...\n* `fingerprint`: Endpoint for getting fingerprints from a...\n\n### `graphium finetune admet`\n\nUtility CLI to easily fine-tune a model on (a subset of) the benchmarks in the TDC ADMET group.\n\nA major limitation is that we cannot use all features of the Hydra CLI, such as multiruns.\n\n**Usage**:\n\n```console\n$ graphium finetune admet 
[OPTIONS] OVERRIDES...\n```\n\n**Arguments**:\n\n* `OVERRIDES...`: [required]\n\n**Options**:\n\n* `--name TEXT`\n* `--inclusive-filter / --no-inclusive-filter`: [default: inclusive-filter]\n* `--help`: Show this message and exit.\n\n### `graphium finetune fingerprint`\n\nEndpoint for getting fingerprints from a pretrained model.\n\nThe pretrained model should be a `.ckpt` path or pre-specified, named model within Graphium.\nThe fingerprint layer specification should be of the format `module:layer`.\nIf specified as a list, the fingerprints from all the specified layers will be concatenated.\nSee the docs of the `graphium.finetuning.fingerprinting.Fingerprinter` class for more info.\n\n**Usage**:\n\n```console\n$ graphium finetune fingerprint [OPTIONS] FINGERPRINT_LAYER_SPEC... PRETRAINED_MODEL SAVE_DESTINATION\n```\n\n**Arguments**:\n\n* `FINGERPRINT_LAYER_SPEC...`: [required]\n* `PRETRAINED_MODEL`: [required]\n* `SAVE_DESTINATION`: [required]\n\n**Options**:\n\n* `--output-type TEXT`: Either numpy (.npy) or torch (.pt) output  [default: torch]\n* `-o, --override TEXT`: Hydra overrides\n* `--help`: Show this message and exit.\n"
  },
  {
    "path": "docs/cli/reference.md",
    "content": "# CLI Reference\n\nInstalling the Graphium library makes two CLI tools available:\n\n- [`graphium-train`](./graphium-train.md) is the hydra endpoint, specifically meant for training, fine-tuning and testing. Since it uses `@hydra.main`, it has access to all advanced hydra functionality, such as tab completion, multirun, working directory management, and logging management. \n- [`graphium`](./graphium.md) is the more general CLI endpoint, organized with various subcommands. \n\nIdeally, we would've integrated both in a single CLI endpoint, but the hydra CLI cannot be a subcommand of another CLI, nor does it easily support adding subcommands, which is why we provide two separate CLI tools with different purposes.\n\n!!! note \"Interactive, embedded CLI docs with `--help`\"\n    In addition to these pages, you can also use `graphium --help` and `graphium-train --help` to interactively navigate the documentation of these tools directly in the CLI."
  },
  {
    "path": "docs/contribute.md",
    "content": "# Contribute\n\nWe are happy to see that you want to contribute 🤗.\nFeel free to open an issue or pull request at any time. But first, follow this page to install Graphium in dev mode.\n\n## Installation for developers\n\n### For CPU and GPU developers\n\nUse [`mamba`](https://github.com/mamba-org/mamba), a preferred alternative to conda, to create your environment:\n\n```bash\n# Install Graphium's dependencies in a new environment named `graphium`\nmamba env create -f env.yml -n graphium\n\n# Install Graphium in dev mode\nmamba activate graphium\npip install --no-deps -e .\n```\n\n### For IPU developers\n\nDownload the SDK and use pypi to create your environment:\n\n```bash\n# Install Graphcore's SDK and Graphium dependencies in a new environment called `.graphium_ipu`\n./install_ipu.sh .graphium_ipu\n```\n\nThe above step needs to be done once. After that, enable the SDK and the environment as follows:\n\n```bash\nsource enable_ipu.sh .graphium_ipu\n```\n\n## Build the documentation\n\nYou can build and serve the documentation locally with:\n\n```bash\n# Build and serve the doc\nmkdocs serve\n```\n"
  },
  {
    "path": "docs/datasets.md",
    "content": "# Graphium Datasets\n\nGraphium datasets are hosted on Zenodo:\n\n- ***ToyMix*** and ***LargeMix*** datasets are hosted at [this link](https://doi.org/10.5281/zenodo.7998401)\n- ***UltraLarge*** dataset is hosted at [this link](https://doi.org/10.5281/zenodo.8370547)\n\nInstead of providing datasets as single entities, our aim is to provide dataset mixes containing a variety of datasets that are meant to be predicted simultaneously using multi-tasking.\n\nThey are visually described in this image, with detailed descriptions below.\n![Visual description of the ToyMix, LargeMix, UltraLarge datasets](dataset_abstract.png)\n\n## ToyMix (QM9 + Tox21 + Zinc12K)\n\nThe ***ToyMix*** dataset combines the ***QM9***, ***Tox21***, and ***Zinc12K*** datasets. These datasets are well-known in the literature and are used as toy datasets, or very simple datasets, in various contexts to enable fast iteration on models. By regrouping toy datasets from quantum ML, drug discovery, and GNN expressivity, we hope that the learned model will be representative of the model performance we can expect on the larger datasets.\n\n### Train/Validation/Test Splits\nfor all the datasets in ***ToyMix*** are generated randomly with a ratio of 0.8/0.1/0.1. Random splitting is used since it is the simplest and fits the idea of having a toy dataset well.\n\n### QM9\nis a well-known dataset in the field of 3D GNNs. It consists of 19 graph-level quantum properties associated with an energy-minimized 3D conformation of the molecules [1]. It is considered a simple dataset since all the molecules have at most 9 heavy atoms. We chose QM9 in our ***ToyMix*** since it is very similar to the larger proposed quantum datasets, PCQM4M\\_multitask and PM6\\_83M, but with smaller molecules.\n\n\n### Tox21\nis a well-known dataset for researchers in machine learning for drug discovery [2]. 
It consists of a multi-label classification task with 12 labels, with most labels missing and a strong imbalance towards the negative class. We chose ***Tox21*** in our ***ToyMix*** since it is very similar to the larger proposed bioassay dataset, ***PCBA\\_1328\\_1564k***, both in terms of sparsity and imbalance, and to the ***L1000*** datasets in terms of imbalance.\n\n### ZINC12k\nis a well-known dataset for researchers in GNN expressivity [3]. We include it in our ***ToyMix*** since GNN expressivity is very important for performance on large-scale data. Hence, we hope that the performance on this task will correlate well with the performance when scaling.\n\n## LargeMix (PCQM4M + PCBA1328 + L1000)\nIn this section, we present the ***LargeMix*** dataset, composed of four different datasets with tasks taken from quantum chemistry (***PCQM4M***), bio-assays (***PCBA***) and transcriptomics.\n\n### Train/validation/test/test\\_seen splits\nFor the ***PCQM4M\\_G25\\_N4***, we create a 0.92/0.04/0.04 split. Then, for all the other datasets in ***LargeMix***, we first create a \"test\\_seen\" split by taking the set of molecules from ***L1000*** and ***PCBA1328*** that are also present in the training set of ***PCQM4M\\_G25\\_N4***, such that we can evaluate whether having the quantum properties of a molecule helps generalize to biological properties. For the remaining parts, we split randomly with a ratio of 0.92/0.04/0.04.\n\n\n\n### L1000 VCAP and MCF7\nThe ***LINCS L1000*** is a database of high-throughput transcriptomics that screened more than 30,000 perturbations on a set of 978 landmark genes [4] from multiple cell lines. ***VCAP*** and ***MCF7*** are, respectively, prostate cancer and human breast cancer cell lines. In ***L1000***, most of the perturbagens are chemical, meaning that small drug-like molecules are added to the cell lines to observe how the gene expressions change. 
This allows us to generate biological signatures of the molecules, which are known to correlate with drug activity and side effects.\n\n\nTo process the data into our two datasets comprising the ***VCAP*** and ***MCF7*** cell lines, we used their \"level 5\" data, composed of the cleaned-up data converted to z-scores and filtered to keep only chemical perturbagens. However, we were left with multiple data points per molecule, since some variables could change (e.g., incubation time) and generate a new measure. Given our objective of generating a single signature per molecule, we decided to take the measurement with the strongest global activity, such that the variance over the 978 genes is maximal. Then, since these signatures are generally noisy, we binned them into five classes corresponding to z-scores based on the thresholds $\\{-4, -2, 2, 4\\}$.\n\nThe cell lines ***VCAP*** and ***MCF7*** were selected since they have a higher number of unique molecular perturbagens than other cell lines. They also have relatively lower data imbalance, with ~92% falling in the \"neutral class\" when the z-score is between -2 and 2.\n\n### PCBA1328\nThis dataset is very similar to the ***OGBG-PCBA*** dataset [5], but instead of being limited to 128 assays and 437k molecules, it comprises 1,328 assays and 1.56M molecules. This dataset is very interesting for pre-training molecular models since it contains information about a molecule's behavior in various settings relevant to biochemists, with evidence that it improves binding predictions. Analogously to the gene expression, we obtain a bio-assay expression of each molecule.\n\nTo gather the data, we looped over the PubChem index of bioassays [6] and collected every dataset containing more than 6,000 molecules annotated with either \"Active\" or \"Inactive\", and at least 10 of each. 
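The assay-selection criterion described above can be sketched as follows (an illustrative snippet; the function and constant names are ours, not part of Graphium):

```python
MIN_MOLECULES = 6000  # a bioassay must contain more than 6,000 annotated molecules
MIN_PER_CLASS = 10    # with at least 10 "Active" and 10 "Inactive" annotations

def keep_bioassay(n_active: int, n_inactive: int) -> bool:
    # Decide whether a PubChem bioassay passes the selection criterion
    enough_molecules = (n_active + n_inactive) > MIN_MOLECULES
    enough_per_class = n_active >= MIN_PER_CLASS and n_inactive >= MIN_PER_CLASS
    return enough_molecules and enough_per_class
```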
Then, we converted all the molecular IDs to canonical SMILES and used them to merge all of the bioassays into a single dataset.\n\n### PCQM4M\\_G25\\_N4\nThis dataset comes from the same data source as the ***OGBG-PCQM4M*** dataset, famously known for being part of the OGB large-scale challenge [7] and for being one of the only graph datasets where pure Transformers have proven successful. The data source is the PubChemQC project [8], which computed DFT properties on the energy-minimized conformation of 3.8M small molecules from PubChem.\n\nContrary to the OGB challenge, we aim to provide enough data for pre-training GNNs, so we do not limit ourselves to the HOMO-LUMO gap prediction [7]. Instead, we gather properties directly given by the DFT (e.g., energies) and compute other 3D descriptors from the conformation (e.g., inertia, the plane of best fit). We also gather node-level properties, namely the Mulliken and Lowdin charges at each atom. Furthermore, about half of the molecules have time-dependent DFT results to help inform about the molecule's excited state. Looking forward, we plan on adding edge-level tasks to enable the prediction of bond properties, such as their lengths and the gradient of the charges.\n\n\n## UltraLarge Dataset\n### PM6\\_83M\nThis dataset is similar to ***PCQM4M*** and comes from the same PubChemQC project. However, it uses the PM6 semi-empirical computation of the quantum properties, which is orders of magnitude faster than DFT computation at the expense of accuracy [8, 9].\n\nThis dataset covers 83M unique molecules, 62 graph-level tasks, and 7 node-level tasks. To our knowledge, this is the largest dataset available for training 2D-GNNs in terms of the number of unique molecules. The various tasks come from four different molecular states, namely \"S0\" for the ground state, \"T0\" for the lowest energy triplet excited state, \"cation\" for the positively charged state, and \"anion\" for the negatively charged state. 
In total, there are 221M PM6 computations.\n\n## References\n[1] https://www.nature.com/articles/sdata201422/\n\n[2] https://europepmc.org/article/MED/23603828\n\n[3] https://arxiv.org/abs/2003.00982v3\n\n[4] https://pubmed.ncbi.nlm.nih.gov/29195078/\n\n[5] https://arxiv.org/abs/2005.00687\n\n[6] https://pubmed.ncbi.nlm.nih.gov/26400175/\n\n[7] https://arxiv.org/abs/2103.09430\n\n[8] https://pubs.acs.org/doi/10.1021/acs.jcim.7b00083\n\n[9] https://arxiv.org/abs/1904.06046\n\n"
  },
  {
    "path": "docs/design.md",
    "content": "# Graphium Library Design\n\n---\n\n\nThe library is designed with 3 things in mind:\n\n- High modularity and configurability with *YAML* files\n- Inclusion of state-of-the-art GNNs, including positional encodings and graph Transformers\n- Massive multitasking across diverse and sparse datasets\n\nThe current page will walk you through the different aspects of the design that enable this.\n\n### Diagram for data processing in Graphium\n\nFirst, when working with molecules, there are tons of options regarding atomic and bond featurisation that can be extracted from the periodic table, from empirical results, or from simulated 3D structures.\n\nSecond, when working with graph Transformers, there are plenty of options regarding the positional and structural encodings (PSE) that are fundamental in driving the accuracy and the generalization of the models.\n\nWith this in mind, we propose a very versatile chemical and PSE encoding, alongside an encoder manager, that can be fully configured from the yaml files. The idea is to assign matching *input keys* to both the features and the encoders, then pool the outputs according to the *output keys*. It is better summarized in the image below.\n\n<img src=\"images/datamodule.png\" alt=\"Data Processing Chart\" width=\"100%\" height=\"100%\">\n\n\n\n### Diagram for multi-task network in Graphium\n\nAs mentioned, we want to be able to perform massive multi-tasking to enable us to work across a huge diversity of datasets. The idea is to use a combination of multiple task heads, where a different MLP is applied to each task's predictions. However, it is also designed such that each task can have as many labels as desired, thus enabling labels to be grouped together according to whether they should share weights/losses.\n\nThe design is better explained in the image below. 
Notice how the *keys* output by GraphDict are used differently across the different GNN layers.\n\n<img src=\"images/full_graph_network.png\" alt=\"Full Graph Multi-task Network\" width=\"100%\" height=\"100%\">\n\n## Structure of the code\n\nThe code is built to rapidly iterate on different architectures of neural networks (NN) and graph neural networks (GNN) with Pytorch. The main focus of this work is molecular tasks, and we use the package `rdkit` to transform molecular SMILES into graphs.\n\nBelow is a list of directories and their respective documentation:\n\n- cli\n- [config](https://github.com/datamol-io/graphium/blob/main/graphium/config/README.md)\n- [data](https://github.com/datamol-io/graphium/blob/main/graphium/data/README.md)\n- [features](https://github.com/datamol-io/graphium/tree/main/graphium/features/README.md)\n- finetuning\n- [ipu](https://github.com/datamol-io/graphium/tree/main/graphium/ipu/README.md)\n- [nn](https://github.com/datamol-io/graphium/tree/main/graphium/nn/README.md)\n- [trainer](https://github.com/datamol-io/graphium/tree/main/graphium/trainer/README.md)\n- [utils](https://github.com/datamol-io/graphium/tree/main/graphium/features/README.md)\n\n\n## Structure of the configs\n\nMaking the library very modular requires configuration files of more than 200 lines, which become intractable, especially when we only want minor changes between configurations.\n\nHence, we use [hydra](https://hydra.cc/docs/intro/) to enable splitting the configurations into smaller, composable configuration files.\n\nExamples of possibilities include:\n\n- Switching between accelerators (CPU, GPU and IPU)\n- Benchmarking different models on the same dataset\n- Fine-tuning a pre-trained model on a new dataset\n\n[In this document](https://github.com/datamol-io/graphium/tree/main/expts/hydra-configs#readme), we describe in detail how each of the above functionalities is achieved and how users can benefit from this design to achieve the 
most with Graphium with as little configuration as possible.\n\n"
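To make the composition concrete, a top-level experiment config might look as follows. This is a generic hydra defaults-list sketch mirroring the CLI overrides shown in the `graphium-train` docs; the exact file layout under `expts/hydra-configs` may differ:

```yaml
# Hypothetical top-level experiment config composed via hydra's defaults list.
# Each entry pulls in a smaller, reusable config file from its config group.
defaults:
  - architecture: toymix
  - tasks: toymix
  - training: toymix
  - model: gcn
  - accelerator: gpu
```

Swapping `model: gcn` for another model, or `accelerator: gpu` for another accelerator, then only requires changing a single line.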
  },
  {
    "path": "docs/index.md",
    "content": "# Overview\n\nA deep learning library focused on graph representation learning for real-world chemical tasks.\n\n- ✅ State-of-the-art GNN architectures.\n- 🐍 Extensible API: build your own GNN model and train it with ease.\n- ⚗️ Rich featurization: powerful and flexible built-in molecular featurization.\n- 🧠 Pretrained models: for fast and easy inference or transfer learning.\n- ⮔ Ready-to-use training loop based on [Pytorch Lightning](https://www.pytorchlightning.ai/).\n- 🔌 Have a new dataset? Graphium provides a simple plug-and-play interface. Change the path, the name of the columns to predict, the atomic featurization, and you’re ready to play!\n\n## Installation\n\n### For CPU or GPU\nUse [`mamba`](https://github.com/mamba-org/mamba):\n\n```bash\n# Install Graphium\nmamba install -c conda-forge graphium\n```\n\nor pip:\n\n```bash\npip install graphium\n```\n\n### For IPU\n```bash\n# Install Graphcore's SDK and Graphium dependencies in a new environment called `.graphium_ipu`\n./install_ipu.sh .graphium_ipu\n```\n\nThe above step needs to be done once. After that, enable the SDK and the environment as follows:\n\n```bash\nsource enable_ipu.sh .graphium_ipu\n```\n\nFinally, you will need to install Graphium with pip:\n```bash\npip install graphium\n```\n"
  },
  {
    "path": "docs/license.md",
    "content": "```\n{!LICENSE!}\n```\n"
  },
  {
    "path": "docs/pretrained_models.md",
    "content": "# Graphium pretrained models\n\nGraphium aims to provide a set of pretrained models that you can use for inference or transfer learning. The models will be made available once trained and validated.\n\n## Listing all available models\n\nTo know which models are available, you can run the following command:\n\n```python\nimport graphium\n\nprint(graphium.trainer.PredictorModule.list_pretrained_models())\n```\n\n## Dummy pre-trained model\n\nAt the moment, only `tests/dummy-pretrained-model.ckpt` is provided, which is mostly useful for development and debugging of the checkpointing and finetuning pipelines.\n\nYou can load a pretrained model using the Graphium API:\n\n```python\nimport graphium\n\npredictor = graphium.trainer.PredictorModule.load_pretrained_models(\"dummy-pretrained-model\")\n```\n"
  },
  {
    "path": "docs/tutorials/feature_processing/add_new_positional_encoding.ipynb",
    "content": "{\n \"cells\": [\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"# Add new positional encoding\\n\",\n    \"\\n\",\n    \"One of the main advantages of this library is the ability to easily incorporate novel positional encodings at the node, edge and graph level. The positional encodings are computed and fed into their respective encoders; the hidden embeddings from all pe encoders are then pooled (according to whether they are node, edge, or graph level) and fed into the GNN layers as features. This design allows any combination of positional encodings to be used by modifying the configuration file. For more details on the data processing part, please visit the [design page of the doc](https://graphium-docs.datamol.io/stable/design.html).\\n\",\n    \"\\n\",\n    \"Here is the workflow for computing and processing positional encodings in the library:\\n\",\n    \"1. Edit the related parts in the yaml configuration file\\n\",\n    \"\\n\",\n    \"2. Compute the raw positional encoding from the graph in [`graphium/features/positional_encoding.py`](https://graphium-docs.datamol.io/stable/api/graphium.features.html#graphium.features.positional_encoding) (from the [`graph positional encoder`](https://graphium-docs.datamol.io/stable/api/graphium.features.html#graphium.features.positional_encoding.graph_positional_encoder))\\n\",\n    \"\\n\",\n    \"3. Feed the raw positional encoding into the respective (specialized) encoders in [`graphium/nn/encoders`](https://graphium-docs.datamol.io/stable/api/graphium.nn/encoders.html). For example, a simple [`MLP positional encoder`](https://graphium-docs.datamol.io/stable/api/graphium.nn/encoders.html#graphium.nn.encoders.mlp_encoder) can be found. \\n\",\n    \"\\n\",\n    \"4. 
Output the hidden embeddings of the pe from the encoders to their respective output keys: `feat` (node feature), `edge_feat` (edge feature), `graph_feat` (graph feature) and potentially other keys if needed, such as `nodepair_feat` \\n\",\n    \"\\n\",\n    \"5. Pool the hidden embeddings with the same keys together: for example, all outputs with the `feat` key will be pooled together\\n\",\n    \"\\n\",\n    \"6. Construct the [`PyG Batch`](https://pytorch-geometric.readthedocs.io/en/latest/generated/torch_geometric.data.Batch.html#torch_geometric.data.Batch), a batch of graphs, each containing the output keys seen above, ready for use in the [GNN layers](https://graphium-docs.datamol.io/stable/api/graphium.nn/pyg_layers.html) \\n\",\n    \"\\n\",\n    \"Since this library is built using PyG, we recommend looking at their [Docs](https://pytorch-geometric.readthedocs.io/en/latest/) and [Tutorials](https://pytorch-geometric.readthedocs.io/en/latest/get_started/introduction.html) for more info. \\n\",\n    \"\\n\",\n    \"We start by editing the configuration file.\\n\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Table of content\\n\",\n    \"1. [edit the config file](#Edit-the-yaml-Configuration-File)\\n\",\n    \"2. [compute the pe from mol](#Compute-the-Positional-Encoding)\\n\",\n    \"3. [add existing encoder](#Add-Existing-Encoder)\\n\",\n    \"4. [add specialized encoder](#Add-Specialized-Encoder)\\n\",\n    \"5. [add the keys to spaces](#Add-the-Keys-to-Spaces)\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Edit the yaml Configuration File\\n\",\n    \"\\n\",\n    \"### Computing Raw PE\\n\",\n    \"We will use the degree of each node as a positional encoding in this tutorial. 
\\n\",\n    \"First, start with an existing yaml configuration file; you can find them in `expts/configs`.\\n\",\n    \"\\n\",\n    \"We first look at where in the yaml file the raw positional encodings are computed. `deg_pos` is added as an example below. You can add relevant arguments for computing the positional encoding here as well, such as `normalize` in the example.\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"```\\n\",\n    \"pos_encoding_as_features:\\n\",\n    \"    pos_types:\\n\",\n    \"      deg_pos: #example, degree centrality\\n\",\n    \"        pos_type: degree\\n\",\n    \"        normalize: False\\n\",\n    \"```\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Specifying Encoders for the PE\\n\",\n    \"Now we want to specify arguments for the encoders associated with the pe.\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"```\\n\",\n    \"pe_encoders:\\n\",\n    \"    out_dim: 64\\n\",\n    \"    pool: \\\"sum\\\" #choice of pooling across multiple pe encoders\\n\",\n    \"    last_norm: None #\\\"batch_norm\\\", \\\"layer_norm\\\"\\n\",\n    \"    encoders: \\n\",\n    \"      deg_pos: #same name from the previous cell\\n\",\n    \"        encoder_type: \\\"mlp\\\" #or you can specify your own specialized encoder\\n\",\n    \"        input_keys: [\\\"degree\\\"] #same as the pos_type configured before\\n\",\n    \"        output_keys: [\\\"feat\\\"] #node feature\\n\",\n    \"        hidden_dim: 64\\n\",\n    \"        num_layers: 1\\n\",\n    \"        dropout: 0.1\\n\",\n    \"        normalization: \\\"none\\\"   #\\\"batch_norm\\\" or \\\"layer_norm\\\"\\n\",\n    \"        first_normalization: \\\"layer_norm\\\"   #\\\"batch_norm\\\" or \\\"layer_norm\\\"\\n\",\n    \"```\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Compute the 
Positional Encoding\\n\",\n    \"Next, we want to compute the raw degree of each node from the molecule graph.\\n\",\n    \"\\n\",\n    \"### add function to compute the pe\\n\",\n    \"Go to [graphium/features](https://graphium-docs.datamol.io/stable/api/graphium.features.html) and add a new file `deg.py` containing the function that computes the pe. \"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 1,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"from typing import Union\\n\",\n    \"\\n\",\n    \"from scipy import sparse\\n\",\n    \"from scipy.sparse import spmatrix\\n\",\n    \"import numpy as np\\n\",\n    \"\\n\",\n    \"def compute_deg(adj: Union[np.ndarray, spmatrix], normalize: bool) -> np.ndarray:\\n\",\n    \"    \\\"\\\"\\\"\\n\",\n    \"    Compute the node degree positional encoding \\n\",\n    \"\\n\",\n    \"    Parameters:\\n\",\n    \"        adj: Adjacency matrix\\n\",\n    \"        normalize: whether to normalize the degrees across all nodes to [0,1]\\n\",\n    \"    Returns:\\n\",\n    \"        2D matrix with shape (1, num_nodes) specifying the degree of each node (column sums, equal to the row sums for a symmetric adjacency)\\n\",\n    \"    \\\"\\\"\\\"\\n\",\n    \"    \\n\",\n    \"    # first, convert adj to a scipy sparse matrix if it is not one already\\n\",\n    \"    if type(adj) is np.ndarray:\\n\",\n    \"        adj = sparse.csr_matrix(adj)\\n\",\n    \"    \\n\",\n    \"    #https://docs.scipy.org/doc/scipy/reference/generated/scipy.sparse.csr_matrix.sum.html\\n\",\n    \"    degs = adj.sum(axis=0) #sum over each column\\n\",\n    \"    \\n\",\n    \"    if normalize: #normalize the degree sequence to [0,1]\\n\",\n    \"        degs = degs / np.max(degs)\\n\",\n    \"    return degs\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Test with toy matrix\\n\",\n    \"\\n\",\n    \"Here we test whether our code computes the degree of each node correctly.\"\n   ]\n  },\n  {\n   
\"cell_type\": \"code\",\n   \"execution_count\": 2,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/plain\": [\n       \"matrix([[1., 1., 1., 1., 1.]])\"\n      ]\n     },\n     \"execution_count\": 2,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"adj = np.identity(5) #make an identity matrix\\n\",\n    \"normalize = True\\n\",\n    \"\\n\",\n    \"degs = compute_deg(adj, normalize=normalize)\\n\",\n    \"\\n\",\n    \"degs\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### add to positional_encoding.py\\n\",\n    \"\\n\",\n    \"To compute the new pe along with all existing pe, we need to add the function we wrote to [`graphium/feature/positional_encoding.py`](https://graphium-docs.datamol.io/stable/api/graphium.features.html#graphium.features.positional_encoding). Modify the [`graph_positional_encoder`](https://graphium-docs.datamol.io/stable/api/graphium.features.html#graphium.features.positional_encoding.graph_positional_encoder) function by adding `pos_type == \\\"degree\\\"` logic \"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Add Existing Encoder\\n\",\n    \"\\n\",\n    \"In order to pool over all the positional encodings, we need to add encoder to process the raw computed positional encoding and ensure the output dimension from all pe encoders are the same. When designing the encoder, you can either use an existing encoder or write a specialized encoder you made\\n\",\n    \"\\n\",\n    \"here we can simply specify [`MLPEncoder`](https://graphium-docs.datamol.io/stable/api/graphium.nn/encoders.html#graphium.nn.encoders.mlp_encoder.MLPEncoder) in the yaml file and the library will automatically feed the raw positional encoding to a mlp encoder based on the input arguments. 
Note that in this example, the encoder takes in the PE stored at the input key `degree` and then outputs to the output key `feat`.\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"```\\n\",\n    \"encoders: \\n\",\n    \"  deg_pos: \\n\",\n    \"    encoder_type: \\\"mlp\\\" \\n\",\n    \"    input_keys: [\\\"degree\\\"] \\n\",\n    \"    output_keys: [\\\"feat\\\"] # node feature\\n\",\n    \"    hidden_dim: 64\\n\",\n    \"    num_layers: 1\\n\",\n    \"    dropout: 0.1\\n\",\n    \"    normalization: \\\"none\\\"   #\\\"batch_norm\\\" or \\\"layer_norm\\\"\\n\",\n    \"    first_normalization: \\\"layer_norm\\\"   #\\\"batch_norm\\\" or \\\"layer_norm\\\"\\n\",\n    \"```\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Add a Specialized Encoder\\n\",\n    \"\\n\",\n    \"You can also add a specialized encoder, such as `laplacian_pe` for the Laplacian eigenvectors and eigenvalues. Here, we can add a new `deg_pos_encoder.py` in [`graphium/nn/encoders`](https://graphium-docs.datamol.io/stable/api/graphium.nn/encoders.html). 
As an example and template, please see the [`MLPEncoder`](https://graphium-docs.datamol.io/stable/api/graphium.nn/encoders.html#graphium.nn.encoders.mlp_encoder.MLPEncoder).\\n\",\n    \"\\n\",\n    \"Note that all new encoders must inherit from the [`BaseEncoder`](https://graphium-docs.datamol.io/stable/api/graphium.nn/encoders.html#graphium.nn.encoders.base_encoder.BaseEncoder) class and implement the following abstract methods:\\n\",\n    \"\\n\",\n    \"- [`forward`](https://graphium-docs.datamol.io/stable/api/graphium.nn/encoders.html#graphium.nn.encoders.base_encoder.BaseEncoder.forward): the forward function of the encoder, defining how the input is processed\\n\",\n    \"\\n\",\n    \"- [`parse_input_keys`](https://graphium-docs.datamol.io/stable/api/graphium.nn/encoders.html#graphium.nn.encoders.base_encoder.BaseEncoder.parse_input_keys): how to parse the input keys\\n\",\n    \"\\n\",\n    \"- [`parse_output_keys`](https://graphium-docs.datamol.io/stable/api/graphium.nn/encoders.html#graphium.nn.encoders.base_encoder.BaseEncoder.parse_output_keys): how to parse the output keys\\n\",\n    \"\\n\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Add the Keys to Spaces\\n\",\n    \"\\n\",\n    \"In order to find the correct encoders directly from the yaml file, we need to specify which key corresponds to which class. 
\\n\",\n    \"\\n\",\n    \"- add our new `deg_pos_encoder` to `graphium/utils/spaces.py` in the `PE_ENCODERS_DICT`\\n\",\n    \"- add our new `deg_pos_encoder` to [`graphium/nn/architectures/encoder_manager.py`](https://graphium-docs.datamol.io/stable/api/graphium.nn/architectures.html#graphium.nn.architectures.encoder_manager.EncoderManager) in the `PE_ENCODERS_DICT`\\n\",\n    \"- add the import of our encoder to  `graphium/nn/encoders/__init__.py`\\n\",\n    \"\\n\",\n    \"Now we can modify the yaml file to use our new encoder\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"```\\n\",\n    \"encoders: \\n\",\n    \"  deg_pos: \\n\",\n    \"    encoder_type: \\\"deg_pos_encoder\\\" \\n\",\n    \"    input_keys: [\\\"degree\\\"] \\n\",\n    \"    output_keys: [\\\"feat\\\"] # node feature\\n\",\n    \"    hidden_dim: 64\\n\",\n    \"    #any other keys that might be used for initialization\\n\",\n    \"```\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": []\n  }\n ],\n \"metadata\": {\n  \"interpreter\": {\n   \"hash\": \"f4a99d018a205fcbcc0480c84566beaebcb91b08d0414b39a842df533e2a1d25\"\n  },\n  \"kernelspec\": {\n   \"display_name\": \"Python 3 (ipykernel)\",\n   \"language\": \"python\",\n   \"name\": \"python3\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.8.10\"\n  }\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 2\n}\n"
  },
  {
    "path": "docs/tutorials/feature_processing/choosing_parallelization.ipynb",
    "content": "{\n \"cells\": [\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"0693c5c9\",\n   \"metadata\": {},\n   \"source\": [\n    \"# Choosing the right parallelization\\n\",\n    \"\\n\",\n    \"This tutorial is meant to help you benchmark different parallel processing methods for converting molecules into graphs. This will allow you to choose the one most suitable for your machine, since benchmark results vary between machines.\\n\",\n    \"\\n\",\n    \"In general, we find that using `joblib` with the `loky` backend and a batch size of `1000` works best. The logic is abstracted into `datamol.parallelized_with_batches`.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 3,\n   \"id\": \"b5df2ac6-2ded-4597-a445-f2b5fb106330\",\n   \"metadata\": {\n    \"tags\": []\n   },\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"The autoreload extension is already loaded. To reload it, use:\\n\",\n      \"  %reload_ext autoreload\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"%load_ext autoreload\\n\",\n    \"%autoreload 2\\n\",\n    \"\\n\",\n    \"import joblib\\n\",\n    \"\\n\",\n    \"import numpy as np\\n\",\n    \"import datamol as dm\\n\",\n    \"import pandas as pd\\n\",\n    \"\\n\",\n    \"# from pandarallel import pandarallel\\n\",\n    \"\\n\",\n    \"# pandarallel.initialize(progress_bar=True, nb_workers=joblib.cpu_count())\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"f81fe0f5-055f-436e-9ef4-e58a89fd50ec\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Setup\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 4,\n   \"id\": \"0f31e18d-bdd9-4d9b-8ba5-81e5887b857e\",\n   \"metadata\": {\n    \"tags\": []\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"# download from 
https://raw.githubusercontent.com/aspuru-guzik-group/chemical_vae/master/models/zinc_properties/250k_rndm_zinc_drugs_clean_3.csv\\n\",\n    \"# data = pd.read_csv(\\\"/home/hadim/250k_rndm_zinc_drugs_clean_3.csv\\\", usecols=[\\\"smiles\\\"])\\n\",\n    \"\\n\",\n    \"# download from https://storage.valencelabs.com/graphium/datasets/QM9/norm_qm9.csv\\n\",\n    \"data = pd.read_csv(\\\"https://storage.valencelabs.com/graphium/datasets/QM9/norm_qm9.csv\\\", usecols=[\\\"smiles\\\"])\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 7,\n   \"id\": \"a1197c31-7dbc-4fd7-a69a-5215e1a96b8e\",\n   \"metadata\": {\n    \"tags\": []\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"rows_number_list = [250_000]\\n\",\n    \"batch_size_list = [10, 100, 1_000, 10_000]\\n\",\n    \"\\n\",\n    \"\\n\",\n    \"def smiles_to_unique_mol_id(smiles):\\n\",\n    \"    try:\\n\",\n    \"        mol = dm.to_mol(mol=smiles)\\n\",\n    \"        mol_id = dm.unique_id(mol)\\n\",\n    \"    except Exception:\\n\",\n    \"        mol_id = \\\"\\\"\\n\",\n    \"    if mol_id is None:\\n\",\n    \"        mol_id = \\\"\\\"\\n\",\n    \"    return mol_id\\n\",\n    \"\\n\",\n    \"\\n\",\n    \"def smiles_to_unique_mol_id_batch(smiles_list):\\n\",\n    \"    mol_id_list = []\\n\",\n    \"    for smiles in smiles_list:\\n\",\n    \"        mol_id_list.append(smiles_to_unique_mol_id(smiles))\\n\",\n    \"    return mol_id_list\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"5d8eea8f-b983-46e7-bfb4-b8012b3ede1a\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Benchmarks\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 10,\n   \"id\": \"2f8ce5c3-4232-4279-8ea3-7a74832303be\",\n   \"metadata\": {\n    \"tags\": []\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"benchmark = []\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"691f8963-6cac-4d58-b8a0-8049f1185486\",\n   \"metadata\": {},\n   \"source\": [\n    \"### No 
batch\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 11,\n   \"id\": \"a246cdcf-b5ea-4c9e-9ccc-dd3c544587bb\",\n   \"metadata\": {\n    \"tags\": []\n   },\n   \"outputs\": [\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"cc396220c7144c8d8b195fb87694bbfe\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"  0%|          | 0/133885 [00:00<?, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    }\n   ],\n   \"source\": [\n    \"for n in rows_number_list:\\n\",\n    \"    df = data.iloc[:n]\\n\",\n    \"\\n\",\n    \"    with dm.utils.perf.watch_duration(log=False) as d:\\n\",\n    \"        out = dm.parallelized(\\n\",\n    \"            smiles_to_unique_mol_id,\\n\",\n    \"            df[\\\"smiles\\\"].values,\\n\",\n    \"            progress=True,\\n\",\n    \"            n_jobs=-1,\\n\",\n    \"            scheduler=\\\"processes\\\",\\n\",\n    \"        )\\n\",\n    \"\\n\",\n    \"    datum = {\\n\",\n    \"        \\\"batch\\\": False,\\n\",\n    \"        \\\"batch_size\\\": None,\\n\",\n    \"        \\\"scheduler\\\": \\\"loky_processes\\\",\\n\",\n    \"        \\\"duration_minutes\\\": d.duration_minutes,\\n\",\n    \"        \\\"duration_seconds\\\": d.duration,\\n\",\n    \"        \\\"n_rows\\\": len(df),\\n\",\n    \"    }\\n\",\n    \"    benchmark.append(datum)\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"41139854-892f-4481-8dd6-a3972bb6ece0\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Batch\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 6,\n   \"id\": \"845a72d5-0b3f-4a21-8d7d-8312671e9924\",\n   \"metadata\": {\n    \"tags\": []\n   },\n   \"outputs\": [\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"0187ede3dbd8429995511883319e246f\",\n       
\"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"  0%|          | 0/13388 [00:00<?, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"fdd5ac6375b444359f6e8cef7b755b9b\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"  0%|          | 0/1338 [00:00<?, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"c11c5894843a46848725d0c03fef9e02\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"  0%|          | 0/133 [00:00<?, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"bd34e1a806604192bbd6f95ba6435b44\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"  0%|          | 0/13 [00:00<?, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    }\n   ],\n   \"source\": [\n    \"for batch_size in batch_size_list:\\n\",\n    \"    for n in rows_number_list:\\n\",\n    \"        df = data.iloc[:n]\\n\",\n    \"\\n\",\n    \"        with dm.utils.perf.watch_duration(log=False) as d:\\n\",\n    \"            out = dm.parallelized_with_batches(\\n\",\n    \"                smiles_to_unique_mol_id_batch,\\n\",\n    \"                df[\\\"smiles\\\"].values,\\n\",\n    \"                batch_size=batch_size,\\n\",\n    \"                progress=True,\\n\",\n    \"                n_jobs=-1,\\n\",\n    \"                scheduler=\\\"processes\\\",\\n\",\n    \"            )\\n\",\n 
   \"        assert len(out) == len(df), f\\\"{len(out)} != {len(df)}\\\"\\n\",\n    \"\\n\",\n    \"        datum = {\\n\",\n    \"            \\\"batch\\\": True,\\n\",\n    \"            \\\"batch_size\\\": batch_size,\\n\",\n    \"            \\\"scheduler\\\": \\\"loky_processes\\\",\\n\",\n    \"            \\\"duration_minutes\\\": d.duration_minutes,\\n\",\n    \"            \\\"duration_seconds\\\": d.duration,\\n\",\n    \"            \\\"n_rows\\\": len(df),\\n\",\n    \"        }\\n\",\n    \"        benchmark.append(datum)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 7,\n   \"id\": \"ec7529d1-2ca3-42ec-a811-3dc010a03350\",\n   \"metadata\": {\n    \"tags\": []\n   },\n   \"outputs\": [\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"928c2236948d4a109c79a6bc79bd4649\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"VBox(children=(HBox(children=(IntProgress(value=0, description='0.00%', max=558), Label(value='0 / 558'))), HB…\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    }\n   ],\n   \"source\": [\n    \"for n in rows_number_list:\\n\",\n    \"    df = data.iloc[:n]\\n\",\n    \"\\n\",\n    \"    with dm.utils.perf.watch_duration(log=False) as d:\\n\",\n    \"        _ = df[\\\"smiles\\\"].parallel_apply(smiles_to_unique_mol_id)\\n\",\n    \"\\n\",\n    \"    datum = {\\n\",\n    \"        \\\"batch\\\": False,\\n\",\n    \"        \\\"batch_size\\\": None,\\n\",\n    \"        \\\"scheduler\\\": \\\"pandarallel\\\",\\n\",\n    \"        \\\"duration_minutes\\\": d.duration_minutes,\\n\",\n    \"        \\\"duration_seconds\\\": d.duration,\\n\",\n    \"        \\\"n_rows\\\": len(df),\\n\",\n    \"    }\\n\",\n    \"    benchmark.append(datum)\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"1ddf56ef-6349-4d43-9720-9e53699e9a8e\",\n   \"metadata\": {},\n   
\"source\": [\n    \"## Results\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 8,\n   \"id\": \"989ae5dd-9826-4adb-af0a-64d9e2c19e2c\",\n   \"metadata\": {\n    \"tags\": []\n   },\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>batch</th>\\n\",\n       \"      <th>batch_size</th>\\n\",\n       \"      <th>scheduler</th>\\n\",\n       \"      <th>duration_minutes</th>\\n\",\n       \"      <th>duration_seconds</th>\\n\",\n       \"      <th>n_rows</th>\\n\",\n       \"      <th>duration_seconds_per_mol</th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>3</th>\\n\",\n       \"      <td>True</td>\\n\",\n       \"      <td>1000.0</td>\\n\",\n       \"      <td>loky_processes</td>\\n\",\n       \"      <td>0.015375</td>\\n\",\n       \"      <td>0.922496</td>\\n\",\n       \"      <td>133885</td>\\n\",\n       \"      <td>0.000007</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>2</th>\\n\",\n       \"      <td>True</td>\\n\",\n       \"      <td>100.0</td>\\n\",\n       \"      <td>loky_processes</td>\\n\",\n       \"      <td>0.037034</td>\\n\",\n       \"      <td>2.222021</td>\\n\",\n       \"      <td>133885</td>\\n\",\n       \"      
<td>0.000017</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>4</th>\\n\",\n       \"      <td>True</td>\\n\",\n       \"      <td>10000.0</td>\\n\",\n       \"      <td>loky_processes</td>\\n\",\n       \"      <td>0.055083</td>\\n\",\n       \"      <td>3.304987</td>\\n\",\n       \"      <td>133885</td>\\n\",\n       \"      <td>0.000025</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>5</th>\\n\",\n       \"      <td>False</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>pandarallel</td>\\n\",\n       \"      <td>0.121338</td>\\n\",\n       \"      <td>7.280271</td>\\n\",\n       \"      <td>133885</td>\\n\",\n       \"      <td>0.000054</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>1</th>\\n\",\n       \"      <td>True</td>\\n\",\n       \"      <td>10.0</td>\\n\",\n       \"      <td>loky_processes</td>\\n\",\n       \"      <td>0.165935</td>\\n\",\n       \"      <td>9.956113</td>\\n\",\n       \"      <td>133885</td>\\n\",\n       \"      <td>0.000074</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>0</th>\\n\",\n       \"      <td>False</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>loky_processes</td>\\n\",\n       \"      <td>3.639053</td>\\n\",\n       \"      <td>218.343192</td>\\n\",\n       \"      <td>133885</td>\\n\",\n       \"      <td>0.001631</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"   batch  batch_size       scheduler  duration_minutes  duration_seconds  \\\\\\n\",\n       \"3   True      1000.0  loky_processes          0.015375          0.922496   \\n\",\n       \"2   True       100.0  loky_processes          0.037034          2.222021   \\n\",\n       \"4   True     10000.0  loky_processes          0.055083          3.304987   \\n\",\n       \"5  False         NaN 
    pandarallel          0.121338          7.280271   \\n\",\n       \"1   True        10.0  loky_processes          0.165935          9.956113   \\n\",\n       \"0  False         NaN  loky_processes          3.639053        218.343192   \\n\",\n       \"\\n\",\n       \"   n_rows  duration_seconds_per_mol  \\n\",\n       \"3  133885                  0.000007  \\n\",\n       \"2  133885                  0.000017  \\n\",\n       \"4  133885                  0.000025  \\n\",\n       \"5  133885                  0.000054  \\n\",\n       \"1  133885                  0.000074  \\n\",\n       \"0  133885                  0.001631  \"\n      ]\n     },\n     \"execution_count\": 8,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"b = pd.DataFrame(benchmark)\\n\",\n    \"b[\\\"duration_seconds_per_mol\\\"] = b[\\\"duration_seconds\\\"] / b[\\\"n_rows\\\"]\\n\",\n    \"\\n\",\n    \"b.sort_values(\\\"duration_seconds_per_mol\\\")\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"id\": \"04c0b10c-bf99-43bf-979a-9d1f0c1ff1fd\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": []\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"id\": \"39b3d77e\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": []\n  }\n ],\n \"metadata\": {\n  \"kernelspec\": {\n   \"display_name\": \"Python 3 (ipykernel)\",\n   \"language\": \"python\",\n   \"name\": \"python3\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.10.12\"\n  },\n  \"widgets\": {\n   \"application/vnd.jupyter.widget-state+json\": {\n    \"state\": {},\n    \"version_major\": 2,\n    \"version_minor\": 0\n   }\n  }\n },\n \"nbformat\": 4,\n 
\"nbformat_minor\": 5\n}\n"
  },
  {
    "path": "docs/tutorials/feature_processing/csv_to_parquet.ipynb",
    "content": "{\n \"cells\": [\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 1,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"import pandas as pd\\n\",\n    \"import graphium\\n\",\n    \"from os.path import dirname, abspath\\n\",\n    \"\\n\",\n    \"MAIN_DIR = dirname(dirname(abspath(graphium.__file__)))\\n\",\n    \"\\n\",\n    \"# TODO create function to read parquet, test from GCP storage (put it in path, should support gs and path, explore function from Pandas instead of parquet \\\"pq\\\")\\n\",\n    \"def _csv_to_parquet(csv_path, parquet_path):\\n\",\n    \"    df = pd.read_csv(csv_path)\\n\",\n    \"    df.to_parquet(parquet_path)\\n\",\n    \"\\n\",\n    \"_csv_to_parquet(MAIN_DIR + '/graphium/data/QM9/micro_qm9.csv', MAIN_DIR + '/graphium/data/QM9/micro_qm9.parquet')\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 2,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"    # TODO create function to specify if you read parquet or csv\\n\",\n    \"    # TODO replace all locations with a call to _read_csv and make sure to read all files if path ends with \\\"*\\\"\\n\",\n    \"    # def read_table:\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 2,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/nethome/andyh/graphium/docs/tutorials/feature_processing\\r\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"!pwd\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 3,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"import graphium\\n\",\n    \"import os\\n\",\n    \"from os.path import dirname, abspath\\n\",\n    \"MAIN_DIR = dirname(dirname(abspath(graphium.__file__)))\\n\",\n    \"os.chdir(MAIN_DIR) # No need for this file\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   
\"outputs\": [],\n   \"source\": []\n  }\n ],\n \"metadata\": {\n  \"kernelspec\": {\n   \"display_name\": \"Python 3 (ipykernel)\",\n   \"language\": \"python\",\n   \"name\": \"python3\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.8.10\"\n  },\n  \"vscode\": {\n   \"interpreter\": {\n    \"hash\": \"b813b16c721ca8d9e65e6e6945a38a6ae48015ae799c455bab639da90d3bb278\"\n   }\n  }\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 2\n}\n"
  },
  {
    "path": "docs/tutorials/feature_processing/timing_parallel.ipynb",
    "content": "{\n \"cells\": [\n  {\n   \"attachments\": {},\n   \"cell_type\": \"markdown\",\n   \"id\": \"0693c5c9\",\n   \"metadata\": {},\n   \"source\": [\n    \"# Timing parallel processing\\n\",\n    \"\\n\",\n    \"This tutorial is meant to help you benchmark different parallel processing methods for converting molecules into graphs. This will allow you to choose the one most suitable for your machine, since benchmark results vary between machines.\\n\",\n    \"\\n\",\n    \"In general, we find that using `joblib` with the `loky` backend and a batch size of `1000` works best. The logic is abstracted into `datamol.parallelized_with_batches`.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 1,\n   \"id\": \"b5df2ac6-2ded-4597-a445-f2b5fb106330\",\n   \"metadata\": {\n    \"tags\": []\n   },\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"INFO: Pandarallel will run on 240 workers.\\n\",\n      \"INFO: Pandarallel will use Memory file system to transfer data between the main process and workers.\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"%load_ext autoreload\\n\",\n    \"%autoreload 2\\n\",\n    \"\\n\",\n    \"import joblib\\n\",\n    \"\\n\",\n    \"import numpy as np\\n\",\n    \"import datamol as dm\\n\",\n    \"import pandas as pd\\n\",\n    \"\\n\",\n    \"from pandarallel import pandarallel\\n\",\n    \"\\n\",\n    \"pandarallel.initialize(progress_bar=True, nb_workers=joblib.cpu_count())\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"f81fe0f5-055f-436e-9ef4-e58a89fd50ec\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Setup\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 2,\n   \"id\": \"0f31e18d-bdd9-4d9b-8ba5-81e5887b857e\",\n   \"metadata\": {\n    \"tags\": []\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"# download from 
https://raw.githubusercontent.com/aspuru-guzik-group/chemical_vae/master/models/zinc_properties/250k_rndm_zinc_drugs_clean_3.csv\\n\",\n    \"# data = pd.read_csv(\\\"/home/hadim/250k_rndm_zinc_drugs_clean_3.csv\\\", usecols=[\\\"smiles\\\"])\\n\",\n    \"\\n\",\n    \"# download from https://storage.valencelabs.com/graphium/datasets/QM9/norm_qm9.csv\\n\",\n    \"data = pd.read_csv(\\\"https://storage.valencelabs.com/graphium/datasets/QM9/norm_qm9.csv\\\", usecols=[\\\"smiles\\\"])\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 3,\n   \"id\": \"a1197c31-7dbc-4fd7-a69a-5215e1a96b8e\",\n   \"metadata\": {\n    \"tags\": []\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"rows_number_list = [250_000]\\n\",\n    \"batch_size_list = [10, 100, 1_000, 10_000]\\n\",\n    \"\\n\",\n    \"\\n\",\n    \"def smiles_to_unique_mol_id(smiles):\\n\",\n    \"    try:\\n\",\n    \"        mol = dm.to_mol(mol=smiles)\\n\",\n    \"        mol_id = dm.unique_id(mol)\\n\",\n    \"    except Exception:\\n\",\n    \"        mol_id = \\\"\\\"\\n\",\n    \"    if mol_id is None:\\n\",\n    \"        mol_id = \\\"\\\"\\n\",\n    \"    return mol_id\\n\",\n    \"\\n\",\n    \"\\n\",\n    \"def smiles_to_unique_mol_id_batch(smiles_list):\\n\",\n    \"    mol_id_list = []\\n\",\n    \"    for smiles in smiles_list:\\n\",\n    \"        mol_id_list.append(smiles_to_unique_mol_id(smiles))\\n\",\n    \"    return mol_id_list\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"5d8eea8f-b983-46e7-bfb4-b8012b3ede1a\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Benchmarks\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 4,\n   \"id\": \"2f8ce5c3-4232-4279-8ea3-7a74832303be\",\n   \"metadata\": {\n    \"tags\": []\n   },\n   \"outputs\": [],\n   \"source\": [\n    \"benchmark = []\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"691f8963-6cac-4d58-b8a0-8049f1185486\",\n   \"metadata\": {},\n   \"source\": [\n    \"### No 
batch\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 5,\n   \"id\": \"a246cdcf-b5ea-4c9e-9ccc-dd3c544587bb\",\n   \"metadata\": {\n    \"tags\": []\n   },\n   \"outputs\": [\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"b2e528726a384b258d6ed3576bc0db1c\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"  0%|          | 0/133885 [00:00<?, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    }\n   ],\n   \"source\": [\n    \"for n in rows_number_list:\\n\",\n    \"    df = data.iloc[:n]\\n\",\n    \"\\n\",\n    \"    with dm.utils.perf.watch_duration(log=False) as d:\\n\",\n    \"        out = dm.parallelized(\\n\",\n    \"            smiles_to_unique_mol_id,\\n\",\n    \"            df[\\\"smiles\\\"].values,\\n\",\n    \"            progress=True,\\n\",\n    \"            n_jobs=-1,\\n\",\n    \"            scheduler=\\\"processes\\\",\\n\",\n    \"        )\\n\",\n    \"\\n\",\n    \"    datum = {\\n\",\n    \"        \\\"batch\\\": False,\\n\",\n    \"        \\\"batch_size\\\": None,\\n\",\n    \"        \\\"scheduler\\\": \\\"loky_processes\\\",\\n\",\n    \"        \\\"duration_minutes\\\": d.duration_minutes,\\n\",\n    \"        \\\"duration_seconds\\\": d.duration,\\n\",\n    \"        \\\"n_rows\\\": len(df),\\n\",\n    \"    }\\n\",\n    \"    benchmark.append(datum)\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"41139854-892f-4481-8dd6-a3972bb6ece0\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Batch\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 5,\n   \"id\": \"845a72d5-0b3f-4a21-8d7d-8312671e9924\",\n   \"metadata\": {\n    \"tags\": []\n   },\n   \"outputs\": [\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"85e79375f3e94d9abc435bdc8d28d2df\",\n       
\"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"  0%|          | 0/13388 [00:00<?, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"f5df65b3d2474b02aff1058e044f7dcf\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"  0%|          | 0/1338 [00:00<?, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"3a5170b0055e4971bd0ea5e7b6cdcbd8\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"  0%|          | 0/133 [00:00<?, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"fa091db65b454db4bc824439645290ad\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"  0%|          | 0/13 [00:00<?, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    }\n   ],\n   \"source\": [\n    \"for batch_size in batch_size_list:\\n\",\n    \"    for n in rows_number_list:\\n\",\n    \"        df = data.iloc[:n]\\n\",\n    \"\\n\",\n    \"        with dm.utils.perf.watch_duration(log=False) as d:\\n\",\n    \"            out = dm.parallelized_with_batches(\\n\",\n    \"                smiles_to_unique_mol_id_batch,\\n\",\n    \"                df[\\\"smiles\\\"].values,\\n\",\n    \"                batch_size=batch_size,\\n\",\n    \"                progress=True,\\n\",\n    \"                n_jobs=-1,\\n\",\n    \"                scheduler=\\\"processes\\\",\\n\",\n    \"            )\\n\",\n 
   \"        assert len(out) == len(df), f\\\"{len(out)} != {len(df)}\\\"\\n\",\n    \"\\n\",\n    \"        datum = {\\n\",\n    \"            \\\"batch\\\": True,\\n\",\n    \"            \\\"batch_size\\\": batch_size,\\n\",\n    \"            \\\"scheduler\\\": \\\"loky_processes\\\",\\n\",\n    \"            \\\"duration_minutes\\\": d.duration_minutes,\\n\",\n    \"            \\\"duration_seconds\\\": d.duration,\\n\",\n    \"            \\\"n_rows\\\": len(df),\\n\",\n    \"        }\\n\",\n    \"        benchmark.append(datum)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 7,\n   \"id\": \"ec7529d1-2ca3-42ec-a811-3dc010a03350\",\n   \"metadata\": {\n    \"tags\": []\n   },\n   \"outputs\": [\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"a9de4fa3630e4c9aa4d81cac4f10b4c8\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"VBox(children=(HBox(children=(IntProgress(value=0, description='0.00%', max=558), Label(value='0 / 558'))), HB…\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    }\n   ],\n   \"source\": [\n    \"for n in rows_number_list:\\n\",\n    \"    df = data.iloc[:n]\\n\",\n    \"\\n\",\n    \"    with dm.utils.perf.watch_duration(log=False) as d:\\n\",\n    \"        _ = df[\\\"smiles\\\"].parallel_apply(smiles_to_unique_mol_id)\\n\",\n    \"\\n\",\n    \"    datum = {\\n\",\n    \"        \\\"batch\\\": False,\\n\",\n    \"        \\\"batch_size\\\": None,\\n\",\n    \"        \\\"scheduler\\\": \\\"pandarallel\\\",\\n\",\n    \"        \\\"duration_minutes\\\": d.duration_minutes,\\n\",\n    \"        \\\"duration_seconds\\\": d.duration,\\n\",\n    \"        \\\"n_rows\\\": len(df),\\n\",\n    \"    }\\n\",\n    \"    benchmark.append(datum)\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"1ddf56ef-6349-4d43-9720-9e53699e9a8e\",\n   \"metadata\": {},\n   
\"source\": [\n    \"## Results\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 8,\n   \"id\": \"989ae5dd-9826-4adb-af0a-64d9e2c19e2c\",\n   \"metadata\": {\n    \"tags\": []\n   },\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>batch</th>\\n\",\n       \"      <th>batch_size</th>\\n\",\n       \"      <th>scheduler</th>\\n\",\n       \"      <th>duration_minutes</th>\\n\",\n       \"      <th>duration_seconds</th>\\n\",\n       \"      <th>n_rows</th>\\n\",\n       \"      <th>duration_seconds_per_mol</th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>3</th>\\n\",\n       \"      <td>True</td>\\n\",\n       \"      <td>1000.0</td>\\n\",\n       \"      <td>loky_processes</td>\\n\",\n       \"      <td>0.014199</td>\\n\",\n       \"      <td>0.851930</td>\\n\",\n       \"      <td>133885</td>\\n\",\n       \"      <td>0.000006</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>2</th>\\n\",\n       \"      <td>True</td>\\n\",\n       \"      <td>100.0</td>\\n\",\n       \"      <td>loky_processes</td>\\n\",\n       \"      <td>0.037132</td>\\n\",\n       \"      <td>2.227947</td>\\n\",\n       \"      <td>133885</td>\\n\",\n       \"      
<td>0.000017</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>4</th>\\n\",\n       \"      <td>True</td>\\n\",\n       \"      <td>10000.0</td>\\n\",\n       \"      <td>loky_processes</td>\\n\",\n       \"      <td>0.047438</td>\\n\",\n       \"      <td>2.846266</td>\\n\",\n       \"      <td>133885</td>\\n\",\n       \"      <td>0.000021</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>5</th>\\n\",\n       \"      <td>False</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>pandarallel</td>\\n\",\n       \"      <td>0.118230</td>\\n\",\n       \"      <td>7.093791</td>\\n\",\n       \"      <td>133885</td>\\n\",\n       \"      <td>0.000053</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>1</th>\\n\",\n       \"      <td>True</td>\\n\",\n       \"      <td>10.0</td>\\n\",\n       \"      <td>loky_processes</td>\\n\",\n       \"      <td>0.222177</td>\\n\",\n       \"      <td>13.330603</td>\\n\",\n       \"      <td>133885</td>\\n\",\n       \"      <td>0.000100</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>0</th>\\n\",\n       \"      <td>False</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>loky_processes</td>\\n\",\n       \"      <td>4.002346</td>\\n\",\n       \"      <td>240.140754</td>\\n\",\n       \"      <td>133885</td>\\n\",\n       \"      <td>0.001794</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"   batch  batch_size       scheduler  duration_minutes  duration_seconds   \\n\",\n       \"3   True      1000.0  loky_processes          0.014199          0.851930  \\\\\\n\",\n       \"2   True       100.0  loky_processes          0.037132          2.227947   \\n\",\n       \"4   True     10000.0  loky_processes          0.047438          2.846266   \\n\",\n       \"5  False         
NaN     pandarallel          0.118230          7.093791   \\n\",\n       \"1   True        10.0  loky_processes          0.222177         13.330603   \\n\",\n       \"0  False         NaN  loky_processes          4.002346        240.140754   \\n\",\n       \"\\n\",\n       \"   n_rows  duration_seconds_per_mol  \\n\",\n       \"3  133885                  0.000006  \\n\",\n       \"2  133885                  0.000017  \\n\",\n       \"4  133885                  0.000021  \\n\",\n       \"5  133885                  0.000053  \\n\",\n       \"1  133885                  0.000100  \\n\",\n       \"0  133885                  0.001794  \"\n      ]\n     },\n     \"execution_count\": 8,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"b = pd.DataFrame(benchmark)\\n\",\n    \"b[\\\"duration_seconds_per_mol\\\"] = b[\\\"duration_seconds\\\"] / b[\\\"n_rows\\\"]\\n\",\n    \"\\n\",\n    \"b.sort_values(\\\"duration_seconds_per_mol\\\")\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"id\": \"04c0b10c-bf99-43bf-979a-9d1f0c1ff1fd\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": []\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"id\": \"39b3d77e\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": []\n  }\n ],\n \"metadata\": {\n  \"kernelspec\": {\n   \"display_name\": \"graphium_ipu\",\n   \"language\": \"python\",\n   \"name\": \"python3\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.8.10\"\n  },\n  \"widgets\": {\n   \"application/vnd.jupyter.widget-state+json\": {\n    \"state\": {},\n    \"version_major\": 2,\n    \"version_minor\": 0\n   }\n  }\n },\n \"nbformat\": 4,\n 
\"nbformat_minor\": 5\n}\n"
  },
  {
    "path": "docs/tutorials/gnn/add_new_gnn_layers.ipynb",
    "content": "{\n \"cells\": [\n  {\n   \"attachments\": {},\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"# Creating GNN layers\\n\",\n    \"\\n\",\n    \"One of the primary advantages of this library is that GNN layers are independent of the model architecture, allowing more flexibility by easily swapping different layer types as hyper-parameters. However, this requires that the layers be implemented using the PyG library and inherit from the class [`BaseGraphStructure`](https://graphium-docs.datamol.io/stable/api/graphium.nn/graphium.nn.html#graphium.nn.base_graph_layer.BaseGraphStructure), which standardizes the inputs, outputs and properties of the layers. Thus, the architecture can be handled independently using the class [`FeedForwardPyg`](https://graphium-docs.datamol.io/stable/api/graphium.nn/architectures.html#graphium.nn.architectures.pyg_architectures.FeedForwardPyg) or other custom classes.\\n\",\n    \"\\n\",\n    \"We will start with a simple layer that does not use edges, then move to a more complex layer that uses edges.\\n\",\n    \"\\n\",\n    \"Since these examples are built on top of PyG, we recommend looking at their [Docs](https://pytorch-geometric.readthedocs.io/en/latest/) and [Tutorials](https://pytorch-geometric.readthedocs.io/en/latest/get_started/introduction.html) for more info.\"\n   ]\n  },\n  {\n   \"attachments\": {},\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Table of contents\\n\",\n    \"1. [Define test graph](#Define-synthetic-test-graph)\\n\",\n    \"2. [Create simple layer](#Creating-a-simple-layer)\\n\",\n    \"3. [Test simple layer](#Test-our-simple-layer)\\n\",\n    \"4. [Create complex layer](#Creating-a-complex-layer-with-edges)\\n\",\n    \"5. 
[Test complex layer](#Test-our-complex-layer) \"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 81,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"The autoreload extension is already loaded. To reload it, use:\\n\",\n      \"  %reload_ext autoreload\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"%load_ext autoreload\\n\",\n    \"%autoreload 2\\n\",\n    \"\\n\",\n    \"import torch\\n\",\n    \"from torch import Tensor\\n\",\n    \"from torch_geometric.data import Data, Batch\\n\",\n    \"from torch_geometric.nn.conv import MessagePassing\\n\",\n    \"from torch_scatter import scatter\\n\",\n    \"from torch_geometric.typing import (\\n\",\n    \"    OptPairTensor,\\n\",\n    \"    OptTensor\\n\",\n    \")\\n\",\n    \"\\n\",\n    \"from copy import deepcopy\\n\",\n    \"from typing import Callable, Union, Optional, List\\n\",\n    \"from graphium.nn.base_graph_layer import BaseGraphStructure\\n\",\n    \"from graphium.nn.base_layers import FCLayer\\n\",\n    \"from graphium.utils.decorators import classproperty\\n\",\n    \"\\n\",\n    \"_ = torch.manual_seed(42)\"\n   ]\n  },\n  {\n   \"attachments\": {},\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Define synthetic test graph\\n\",\n    \"\\n\",\n    \"We define below a small batched graph on which we can test the created layers. 
You can also use synthetic graphs to define unit tests \"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 82,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"DataBatch(edge_index=[2, 7], feat=[7, 5], edge_feat=[7, 13], batch=[7], ptr=[3])\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"in_dim = 5          # Input node-feature dimensions\\n\",\n    \"out_dim = 11        # Desired output node-feature dimensions\\n\",\n    \"in_dim_edges = 13   # Input edge-feature dimensions\\n\",\n    \"\\n\",\n    \"\\n\",\n    \"# Let's create 2 simple pyg graphs. \\n\",\n    \"# start by specifying the edges with edge index\\n\",\n    \"edge_idx1 = torch.tensor([[0, 1, 2],\\n\",\n    \"                          [1, 2, 3]])\\n\",\n    \"edge_idx2 = torch.tensor([[2, 0, 0, 1],\\n\",\n    \"                          [0, 1, 2, 0]])\\n\",\n    \"\\n\",\n    \"# specify the node features, convention with variable x\\n\",\n    \"x1 = torch.randn(edge_idx1.max() + 1, in_dim, dtype=torch.float32)\\n\",\n    \"x2 = torch.randn(edge_idx2.max() + 1, in_dim, dtype=torch.float32)\\n\",\n    \"\\n\",\n    \"# specify the edge features in e\\n\",\n    \"e1 = torch.randn(edge_idx1.shape[-1], in_dim_edges, dtype=torch.float32)\\n\",\n    \"e2 = torch.randn(edge_idx2.shape[-1], in_dim_edges, dtype=torch.float32)\\n\",\n    \"\\n\",\n    \"# make the pyg graph objects with our constructed features\\n\",\n    \"g1 = Data(feat=x1, edge_index=edge_idx1, edge_feat=e1)\\n\",\n    \"g2 = Data(feat=x2, edge_index=edge_idx2, edge_feat=e2)\\n\",\n    \"\\n\",\n    \"# put the two graphs into a Batch graph\\n\",\n    \"bg = Batch.from_data_list([g1, g2])\\n\",\n    \"\\n\",\n    \"# The batched graph will show as a single graph with 7 nodes\\n\",\n    \"print(bg)\"\n   ]\n  },\n  {\n   \"attachments\": {},\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"## 
Creating a simple layer\\n\",\n    \"\\n\",\n    \"Here, we will show how to create a GNN layer that simply does a mean aggregation on the neighbouring features.\\n\",\n    \"\\n\",\n    \"First, for the layer to be fully compatible with the flexible architecture provided by [`FeedForwardPyg`](https://graphium-docs.datamol.io/stable/api/graphium.nn/architectures.html#graphium.nn.architectures.pyg_architectures.FeedForwardPyg), it needs to inherit from the class [`BaseGraphStructure`](https://graphium-docs.datamol.io/stable/api/graphium.nn/graphium.nn.html#graphium.nn.base_graph_layer.BaseGraphStructure). This base layer has multiple virtual methods that must be implemented in any class that inherits from it.\\n\",\n    \"\\n\",\n    \"The virtual methods are listed below:\\n\",\n    \"\\n\",\n    \"- [`layer_supports_edges`](https://graphium-docs.datamol.io/stable/api/graphium.nn/graphium.nn.html#graphium.nn.base_graph_layer.BaseGraphStructure): We want to return `False` since our layer doesn't support edges\\n\",\n    \"- [`layer_inputs_edges`](https://graphium-docs.datamol.io/stable/api/graphium.nn/graphium.nn.html#graphium.nn.base_graph_layer.BaseGraphStructure.layer_inputs_edges): We want to return `False` since our layer doesn't input edges\\n\",\n    \"- [`layer_outputs_edges`](https://graphium-docs.datamol.io/stable/api/graphium.nn/graphium.nn.html#graphium.nn.base_graph_layer.BaseGraphStructure.layer_outputs_edges): We want to return `False` since our layer doesn't output edges\\n\",\n    \"- [`out_dim_factor`](https://graphium-docs.datamol.io/stable/api/graphium.nn/graphium.nn.html#graphium.nn.base_graph_layer.BaseGraphStructure.out_dim_factor): We want to return `1` since the output dimension does not depend on internal parameters.\\n\",\n    \"\\n\",\n    \"The example is given below\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 83,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# inherit from message passing 
class from pyg \\n\",\n    \"# inherit from BaseGraphStructure\\n\",\n    \"# this example is also based on: https://pytorch-geometric.readthedocs.io/en/latest/_modules/torch_geometric/nn/conv/simple_conv.html#SimpleConv.forward\\n\",\n    \"\\n\",\n    \"class SimpleMeanLayer(MessagePassing, BaseGraphStructure):\\n\",\n    \"    def __init__(self, \\n\",\n    \"                 in_dim: int, \\n\",\n    \"                 out_dim: int, \\n\",\n    \"                 activation: Union[Callable, str] = \\\"relu\\\", \\n\",\n    \"                 dropout: float = 0.0, \\n\",\n    \"                 normalization: Union[str, Callable] = \\\"none\\\",\\n\",\n    \"                 aggr: str = \\\"mean\\\",\\n\",\n    \"                ):\\n\",\n    \"        r\\\"\\\"\\\"\\n\",\n    \"        add documentation in the format shown here for this layer to be automatically added to the graphium api reference\\n\",\n    \"        the type information will also be automatically shown in the doc\\n\",\n    \"        \\n\",\n    \"        Parameters:\\n\",\n    \"\\n\",\n    \"            in_dim:\\n\",\n    \"                Input feature dimensions of the layer\\n\",\n    \"\\n\",\n    \"            out_dim:\\n\",\n    \"                Output feature dimensions of the layer\\n\",\n    \"\\n\",\n    \"            activation:\\n\",\n    \"                activation function to use in the layer\\n\",\n    \"\\n\",\n    \"            dropout:\\n\",\n    \"                The ratio of units to dropout. Must be between 0 and 1\\n\",\n    \"\\n\",\n    \"            normalization:\\n\",\n    \"                Normalization to use. 
Choices:\\n\",\n    \"\\n\",\n    \"                - \\\"none\\\" or `None`: No normalization\\n\",\n    \"                - \\\"batch_norm\\\": Batch normalization\\n\",\n    \"                - \\\"layer_norm\\\": Layer normalization\\n\",\n    \"                - `Callable`: Any callable function\\n\",\n    \"            aggr:\\n\",\n    \"                what aggregation to use (\\\"add\\\", \\\"mean\\\" or \\\"max\\\")\\n\",\n    \"        \\\"\\\"\\\"\\n\",\n    \"        # Initialize the parent class\\n\",\n    \"        MessagePassing.__init__(self, node_dim=0, aggr=aggr)\\n\",\n    \"        BaseGraphStructure.__init__(self,\\n\",\n    \"                                    in_dim=in_dim, \\n\",\n    \"                                    out_dim=out_dim, \\n\",\n    \"                                    activation=activation,\\n\",\n    \"                                    dropout=dropout, \\n\",\n    \"                                    normalization=normalization)\\n\",\n    \"        \\n\",\n    \"        self.aggr = aggr\\n\",\n    \"        self._initialize_activation_dropout_norm()\\n\",\n    \"        # Create the mlp layer \\n\",\n    \"        # https://graphium-docs.datamol.io/stable/api/graphium.nn/graphium.nn.html#graphium.nn.base_layers.MLP\\n\",\n    \"        self.transform = FCLayer(in_dim=in_dim, out_dim=out_dim)\\n\",\n    \"            \\n\",\n    \"    # define the forward function \\n\",\n    \"    def forward(self, \\n\",\n    \"                batch: Union[Data, Batch],\\n\",\n    \"               ) -> Union[Data, Batch]:\\n\",\n    \"        r\\\"\\\"\\\"\\n\",\n    \"        similarly add documentation to the functions in the class\\n\",\n    \"        \\n\",\n    \"        Parameters:\\n\",\n    \"            batch: pyg Batch graphs to pass through the layer\\n\",\n    \"        Returns:\\n\",\n    \"            batch: pyg Batch graphs\\n\",\n    \"        \\\"\\\"\\\"\\n\",\n    \"        \\n\",\n    \"        x = 
batch.feat\\n\",\n    \"        edge_index = batch.edge_index\\n\",\n    \"        \\n\",\n    \"        if isinstance(x, Tensor):\\n\",\n    \"            x: OptPairTensor = (x, x)\\n\",\n    \"\\n\",\n    \"        # propagate_type: (x: OptPairTensor, edge_weight: OptTensor)\\n\",\n    \"        out = self.propagate(edge_index, x=x)\\n\",\n    \"        out = self.transform(out)\\n\",\n    \"        out = self.apply_norm_activation_dropout(out, batch_idx=batch.batch)\\n\",\n    \"        batch.feat = out\\n\",\n    \"        return batch\\n\",\n    \"    \\n\",\n    \"    def message(self, x_j):\\n\",\n    \"\\n\",\n    \"        return x_j\\n\",\n    \"\\n\",\n    \"    # Finally, we define all the virtual properties according to how the class works with proper documentation of course\\n\",\n    \"    @classproperty\\n\",\n    \"    def layer_supports_edges(cls) -> bool:\\n\",\n    \"        r\\\"\\\"\\\"\\n\",\n    \"        Return a boolean specifying if the layer type supports edges or not.\\n\",\n    \"\\n\",\n    \"        Returns:\\n\",\n    \"\\n\",\n    \"            bool:\\n\",\n    \"                False for the current class\\n\",\n    \"        \\\"\\\"\\\"\\n\",\n    \"        return False\\n\",\n    \"\\n\",\n    \"    @property\\n\",\n    \"    def layer_inputs_edges(self) -> bool:\\n\",\n    \"        r\\\"\\\"\\\"\\n\",\n    \"        Returns:\\n\",\n    \"\\n\",\n    \"            bool:\\n\",\n    \"                Returns False\\n\",\n    \"        \\\"\\\"\\\"\\n\",\n    \"        return False\\n\",\n    \"            \\n\",\n    \"    @property\\n\",\n    \"    def layer_outputs_edges(self) -> bool:\\n\",\n    \"        r\\\"\\\"\\\"\\n\",\n    \"        Returns:\\n\",\n    \"\\n\",\n    \"            bool:\\n\",\n    \"                Always ``False`` for the current class\\n\",\n    \"        \\\"\\\"\\\"\\n\",\n    \"        return False\\n\",\n    \"\\n\",\n    \"    @property\\n\",\n    \"    def out_dim_factor(self) -> int:\\n\",\n    
\"        r\\\"\\\"\\\"\\n\",\n    \"        Get the factor by which the output dimension is multiplied for\\n\",\n    \"        the next layer.\\n\",\n    \"\\n\",\n    \"        For standard layers, this will return ``1``.\\n\",\n    \"        Returns:\\n\",\n    \"\\n\",\n    \"            int:\\n\",\n    \"                Always ``1`` for the current class\\n\",\n    \"        \\\"\\\"\\\"\\n\",\n    \"        return 1\"\n   ]\n  },\n  {\n   \"attachments\": {},\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Test our simple layer\\n\",\n    \"\\n\",\n    \"Now, we are ready to test the `SimpleMeanLayer` on our constructed graphs. Note that in this example, we **ignore** the edge features since they are not supported.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 84,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"torch.Size([7, 5])\\n\",\n      \"torch.Size([7, 11])\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"graph = deepcopy(bg)\\n\",\n    \"print(graph.feat.shape)\\n\",\n    \"\\n\",\n    \"layer = SimpleMeanLayer(\\n\",\n    \"            in_dim=in_dim, \\n\",\n    \"            out_dim=out_dim, \\n\",\n    \"            activation=\\\"relu\\\", \\n\",\n    \"            dropout=.3, \\n\",\n    \"            normalization=\\\"batch_norm\\\")\\n\",\n    \"\\n\",\n    \"graph = layer(graph)\\n\",\n    \"print(graph.feat.shape)\"\n   ]\n  },\n  {\n   \"attachments\": {},\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Creating a complex layer with edges\\n\",\n    \"\\n\",\n    \"Here, we will show how to create a GNN layer that does a mean aggregation on the neighbouring features, concatenated to the edge features with their neighbours. 
In that case, only the node features will change, and the network will not update the edge features.\\n\",\n    \"\\n\",\n    \"The virtual methods will have different outputs:\\n\",\n    \"\\n\",\n    \"- [`layer_supports_edges`](https://graphium-docs.datamol.io/stable/api/graphium.nn/graphium.nn.html#graphium.nn.base_graph_layer.BaseGraphStructure): We want to return `True` since our layer does support edges\\n\",\n    \"- [`layer_inputs_edges`](https://graphium-docs.datamol.io/stable/api/graphium.nn/graphium.nn.html#graphium.nn.base_graph_layer.BaseGraphStructure.layer_inputs_edges): We want to return `True` since our layer does input edges\\n\",\n    \"- [`layer_outputs_edges`](https://graphium-docs.datamol.io/stable/api/graphium.nn/graphium.nn.html#graphium.nn.base_graph_layer.BaseGraphStructure.layer_outputs_edges): We want to return `False` since our layer will not output new edges\\n\",\n    \"- [`out_dim_factor`](https://graphium-docs.datamol.io/stable/api/graphium.nn/graphium.nn.html#graphium.nn.base_graph_layer.BaseGraphStructure.out_dim_factor): We want to return `1` since the output dimension does not depend on internal parameters.\\n\",\n    \"\\n\",\n    \"The example is given below\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 85,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# inherit from message passing class from pyg \\n\",\n    \"# inherit from BaseGraphStructure\\n\",\n    \"# adapting example from graphium/nn/pyg_layers/png_pyg.py\\n\",\n    \"\\n\",\n    \"class ComplexMeanLayer(MessagePassing, BaseGraphStructure):\\n\",\n    \"    def __init__(self, \\n\",\n    \"                 in_dim: int, \\n\",\n    \"                 out_dim: int, \\n\",\n    \"                 in_dim_edges: int, \\n\",\n    \"                 activation: Union[Callable, str] = \\\"relu\\\", \\n\",\n    \"                 dropout: float = 0.0, \\n\",\n    \"                 normalization: Union[str, Callable] = 
\\\"none\\\",\\n\",\n    \"                 aggr: str = \\\"mean\\\",\\n\",\n    \"                ):\\n\",\n    \"        r\\\"\\\"\\\"\\n\",\n    \"        add documentation in the format shown here for this layer to be automatically added to the graphium api reference\\n\",\n    \"        the type information will also be automatically shown in the doc\\n\",\n    \"        \\n\",\n    \"        Parameters:\\n\",\n    \"\\n\",\n    \"            in_dim:\\n\",\n    \"                Input feature dimensions of the layer\\n\",\n    \"\\n\",\n    \"            out_dim:\\n\",\n    \"                Output feature dimensions of the layer\\n\",\n    \"            \\n\",\n    \"            in_dim_edges:\\n\",\n    \"                Input edge feature dimension of the layer\\n\",\n    \"\\n\",\n    \"            activation:\\n\",\n    \"                activation function to use in the layer\\n\",\n    \"\\n\",\n    \"            dropout:\\n\",\n    \"                The ratio of units to dropout. Must be between 0 and 1\\n\",\n    \"\\n\",\n    \"            normalization:\\n\",\n    \"                Normalization to use. 
Choices:\\n\",\n    \"\\n\",\n    \"                - \\\"none\\\" or `None`: No normalization\\n\",\n    \"                - \\\"batch_norm\\\": Batch normalization\\n\",\n    \"                - \\\"layer_norm\\\": Layer normalization\\n\",\n    \"                - `Callable`: Any callable function\\n\",\n    \"            aggr:\\n\",\n    \"                what aggregation to use (\\\"add\\\", \\\"mean\\\" or \\\"max\\\")\\n\",\n    \"        \\\"\\\"\\\"\\n\",\n    \"        # Initialize the parent class\\n\",\n    \"        MessagePassing.__init__(self, node_dim=0, aggr=aggr)\\n\",\n    \"        BaseGraphStructure.__init__(self,\\n\",\n    \"                                    in_dim=in_dim, \\n\",\n    \"                                    out_dim=out_dim, \\n\",\n    \"                                    activation=activation,\\n\",\n    \"                                    dropout=dropout, \\n\",\n    \"                                    normalization=normalization)\\n\",\n    \"        \\n\",\n    \"        self.aggr = aggr\\n\",\n    \"        self._initialize_activation_dropout_norm()\\n\",\n    \"        # Create the mlp layer \\n\",\n    \"        # https://graphium-docs.datamol.io/stable/api/graphium.nn/graphium.nn.html#graphium.nn.base_layers.MLP\\n\",\n    \"        self.transform = FCLayer(in_dim=(in_dim + in_dim_edges), out_dim=out_dim)\\n\",\n    \"            \\n\",\n    \"    # define the forward function \\n\",\n    \"    def forward(self, \\n\",\n    \"                batch: Union[Data, Batch],\\n\",\n    \"               ) -> Union[Data, Batch]:\\n\",\n    \"        r\\\"\\\"\\\"\\n\",\n    \"        similarly add documentation to the functions in the class\\n\",\n    \"        \\n\",\n    \"        Parameters:\\n\",\n    \"            batch: pyg Batch graphs to pass through the layer\\n\",\n    \"        Returns:\\n\",\n    \"            batch: pyg Batch graphs\\n\",\n    \"        \\\"\\\"\\\"        \\n\",\n    \"        x = 
batch.feat\\n\",\n    \"        edge_index = batch.edge_index\\n\",\n    \"        edge_feat = batch.edge_feat\\n\",\n    \"        \\n\",\n    \"        if isinstance(x, Tensor):\\n\",\n    \"            x: OptPairTensor = (x, x)\\n\",\n    \"\\n\",\n    \"        # propagate_type: (x: OptPairTensor, edge_feat: OptTensor)\\n\",\n    \"        out = self.propagate(edge_index, x=x, edge_feat=edge_feat, size=None)\\n\",\n    \"        out = self.transform(out)\\n\",\n    \"        out = self.apply_norm_activation_dropout(out, batch_idx=batch.batch)\\n\",\n    \"        batch.feat = out\\n\",\n    \"        return batch\\n\",\n    \"    \\n\",\n    \"    def message(self, x_j: Tensor, edge_feat: OptTensor) -> Tensor:\\n\",\n    \"        r\\\"\\\"\\\"\\n\",\n    \"        message function\\n\",\n    \"\\n\",\n    \"        Parameters:\\n\",\n    \"            x_j: neighbour node features\\n\",\n    \"            edge_feat: edge features\\n\",\n    \"        Returns:\\n\",\n    \"            feat: the message\\n\",\n    \"        \\\"\\\"\\\"\\n\",\n    \"        feat = torch.cat([x_j, edge_feat], dim=-1)\\n\",\n    \"        return feat\\n\",\n    \"    \\n\",\n    \"    def aggregate(\\n\",\n    \"        self,\\n\",\n    \"        inputs: Tensor,\\n\",\n    \"        index: Tensor,\\n\",\n    \"        edge_index: Tensor,\\n\",\n    \"        dim_size: Optional[int] = None,\\n\",\n    \"    ) -> Tensor:\\n\",\n    \"        r\\\"\\\"\\\"\\n\",\n    \"        aggregate function\\n\",\n    \"        Parameters:\\n\",\n    \"            inputs: input features\\n\",\n    \"            index: index of the nodes\\n\",\n    \"            edge_index: edge index\\n\",\n    \"            dim_size: dimension size\\n\",\n    \"        Returns:\\n\",\n    \"            out: aggregated features\\n\",\n    \"        \\\"\\\"\\\"\\n\",\n    \"        out = scatter(inputs, index, 0, None, dim_size, reduce=self.aggr)\\n\",\n    \"        
return out\\n\",\n    \"    \\n\",\n    \"\\n\",\n    \"    # Finally, we define all the virtual properties according to how the class works, with proper documentation of course\\n\",\n    \"    @classproperty\\n\",\n    \"    def layer_supports_edges(cls) -> bool:\\n\",\n    \"        r\\\"\\\"\\\"\\n\",\n    \"        Return a boolean specifying if the layer type supports edges or not.\\n\",\n    \"\\n\",\n    \"        Returns:\\n\",\n    \"\\n\",\n    \"            bool:\\n\",\n    \"                True for the current class\\n\",\n    \"        \\\"\\\"\\\"\\n\",\n    \"        return True\\n\",\n    \"\\n\",\n    \"    @property\\n\",\n    \"    def layer_inputs_edges(self) -> bool:\\n\",\n    \"        r\\\"\\\"\\\"\\n\",\n    \"        Returns:\\n\",\n    \"\\n\",\n    \"            bool:\\n\",\n    \"                Returns True\\n\",\n    \"        \\\"\\\"\\\"\\n\",\n    \"        return True\\n\",\n    \"            \\n\",\n    \"    @property\\n\",\n    \"    def layer_outputs_edges(self) -> bool:\\n\",\n    \"        r\\\"\\\"\\\"\\n\",\n    \"        Returns:\\n\",\n    \"\\n\",\n    \"            bool:\\n\",\n    \"                Always ``False`` for the current class, since the edge features are not updated\\n\",\n    \"        \\\"\\\"\\\"\\n\",\n    \"        return False\\n\",\n    \"\\n\",\n    \"    @property\\n\",\n    \"    def out_dim_factor(self) -> int:\\n\",\n    \"        r\\\"\\\"\\\"\\n\",\n    \"        Get the factor by which the output dimension is multiplied for\\n\",\n    \"        the next layer.\\n\",\n    \"\\n\",\n    \"        For standard layers, this will return ``1``.\\n\",\n    \"        Returns:\\n\",\n    \"\\n\",\n    \"            int:\\n\",\n    \"                Always ``1`` for the current class\\n\",\n    \"        \\\"\\\"\\\"\\n\",\n    \"        return 1\"\n   ]\n  },\n  {\n   \"attachments\": {},\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Test our complex layer\\n\",\n    \"\\n\",\n    \"Now, we are ready to 
test the `ComplexMeanLayer` on our constructed graphs. Note that in this example, we utilize the edge features and node features\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 86,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"torch.Size([7, 5])\\n\",\n      \"torch.Size([7, 13])\\n\",\n      \"torch.Size([7, 11])\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"graph = deepcopy(bg)\\n\",\n    \"print(graph.feat.shape)\\n\",\n    \"print(graph.edge_feat.shape)\\n\",\n    \"\\n\",\n    \"layer = ComplexMeanLayer(\\n\",\n    \"            in_dim=in_dim, \\n\",\n    \"            out_dim=out_dim, \\n\",\n    \"            in_dim_edges=in_dim_edges,\\n\",\n    \"            activation=\\\"relu\\\", \\n\",\n    \"            dropout=.3, \\n\",\n    \"            normalization=\\\"batch_norm\\\")\\n\",\n    \"\\n\",\n    \"graph = layer(graph)\\n\",\n    \"print(graph.feat.shape)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": []\n  }\n ],\n \"metadata\": {\n  \"interpreter\": {\n   \"hash\": \"f4a99d018a205fcbcc0480c84566beaebcb91b08d0414b39a842df533e2a1d25\"\n  },\n  \"kernelspec\": {\n   \"display_name\": \"Python 3 (ipykernel)\",\n   \"language\": \"python\",\n   \"name\": \"python3\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.10.12\"\n  }\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 2\n}\n"
  },
  {
    "path": "docs/tutorials/gnn/making_gnn_networks.ipynb",
"content": "{\n \"cells\": [\n  {\n   \"attachments\": {},\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"# Making GNN Networks\\n\",\n    \"\\n\",\n    \"In this example, you will learn how to easily build a full multi-task GNN using any kind of GNN layer. This tutorial uses the architecture defined by the class `FullGraphMultiTaskNetwork`.\\n\",\n    \"\\n\",\n    \"`FullGraphMultiTaskNetwork` is an architecture that takes as input node features and (optionally) edge features. It applies a pre-MLP on both sets of features, then passes them into a main GNN network, and finally applies graph output NNs to produce the final outputs on possibly several task levels (graph, node, or edge-level).\\n\",\n    \"\\n\",\n    \"The network is easy to build via a dictionary of parameters that allows customizing each part of the network.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 114,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"The autoreload extension is already loaded. To reload it, use:\\n\",\n      \"  %reload_ext autoreload\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"%load_ext autoreload\\n\",\n    \"%autoreload 2\\n\",\n    \"\\n\",\n    \"import torch\\n\",\n    \"from copy import deepcopy\\n\",\n    \"\\n\",\n    \"from torch_geometric.data import Data, Batch\\n\",\n    \"\\n\",\n    \"from graphium.nn.architectures import FullGraphMultiTaskNetwork\\n\",\n    \"\\n\",\n    \"_ = torch.manual_seed(42)\"\n   ]\n  },\n  {\n   \"attachments\": {},\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"We will first create some simple batched graphs that will be used across the examples. 
Here, `bg` is a batch containing 2 graphs with random node features.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 115,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"DataBatch(edge_index=[2, 7], feat=[7, 5], edge_feat=[7, 13], batch=[7], ptr=[3])\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"in_dim = 5          # Input node-feature dimensions\\n\",\n    \"in_dim_edges = 13   # Input edge-feature dimensions\\n\",\n    \"out_dim = 11        # Desired output node-feature dimensions\\n\",\n    \"\\n\",\n    \"\\n\",\n    \"# Let's create 2 simple pyg graphs. \\n\",\n    \"# start by specifying the edges with edge index\\n\",\n    \"edge_idx1 = torch.tensor([[0, 1, 2],\\n\",\n    \"                          [1, 2, 3]])\\n\",\n    \"edge_idx2 = torch.tensor([[2, 0, 0, 1],\\n\",\n    \"                          [0, 1, 2, 0]])\\n\",\n    \"\\n\",\n    \"# specify the node features, convention with variable x\\n\",\n    \"x1 = torch.randn(edge_idx1.max() + 1, in_dim, dtype=torch.float32)\\n\",\n    \"x2 = torch.randn(edge_idx2.max() + 1, in_dim, dtype=torch.float32)\\n\",\n    \"\\n\",\n    \"# specify the edge features in e\\n\",\n    \"e1 = torch.randn(edge_idx1.shape[-1], in_dim_edges, dtype=torch.float32)\\n\",\n    \"e2 = torch.randn(edge_idx2.shape[-1], in_dim_edges, dtype=torch.float32)\\n\",\n    \"\\n\",\n    \"# make the pyg graph objects with our constructed features\\n\",\n    \"g1 = Data(feat=x1, edge_index=edge_idx1, edge_feat=e1)\\n\",\n    \"g2 = Data(feat=x2, edge_index=edge_idx2, edge_feat=e2)\\n\",\n    \"\\n\",\n    \"# put the two graphs into a Batch graph\\n\",\n    \"bg = Batch.from_data_list([g1, g2])\\n\",\n    \"\\n\",\n    \"# The batched graph will show as a single graph with 7 nodes\\n\",\n    \"print(bg)\\n\"\n   ]\n  },\n  {\n   \"attachments\": {},\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": 
[\n    \"## Building a network\\n\",\n    \"\\n\",\n    \"To build the network, we must define the arguments to pass to the different steps:\\n\",\n    \"\\n\",\n    \"- `pre_nn_kwargs`: The parameters used by a feed-forward neural network on the input node-features, before passing to the convolutional layers. See class `FeedForwardNN` for details on the required parameters. Will be ignored if set to `None`.\\n\",\n    \"\\n\",\n    \"- `gnn_kwargs`: The parameters used by a feed-forward **graph** neural network on the features after it has passed through the pre-processing NN. See class `FeedForwardGraph` for details on the required parameters.\\n\",\n    \"\\n\",\n    \"- `task_heads_kwargs`: The parameters used to specify the different task heads for possibly multiple tasks, each with a specified `task_level` (graph, node or edge).\\n\",\n    \"\\n\",\n    \"- `graph_output_nn_kwargs`: The parameters used by the graph output NNs to process the features after the GNN layers. We need to specify a NN for each `task_level` occurring in the `task_heads_kwargs`.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 116,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"temp_dim_1 = 23\\n\",\n    \"temp_dim_2 = 17\\n\",\n    \"\\n\",\n    \"pre_nn_kwargs = {\\n\",\n    \"    \\\"in_dim\\\": in_dim,\\n\",\n    \"    \\\"out_dim\\\": temp_dim_1,\\n\",\n    \"    \\\"hidden_dims\\\": 4,\\n\",\n    \"    \\\"depth\\\": 2,\\n\",\n    \"    \\\"activation\\\": 'relu',\\n\",\n    \"    \\\"last_activation\\\": \\\"none\\\",\\n\",\n    \"    \\\"dropout\\\": 0.2\\n\",\n    \"}\\n\",\n    \"\\n\",\n    \"gnn_kwargs = {\\n\",\n    \"    \\\"in_dim\\\": temp_dim_1,\\n\",\n    \"    \\\"out_dim\\\": temp_dim_2,\\n\",\n    \"    \\\"hidden_dims\\\": 5,\\n\",\n    \"    \\\"depth\\\": 1,\\n\",\n    \"    \\\"activation\\\": 'gelu',\\n\",\n    \"    \\\"last_activation\\\": None,\\n\",\n    \"    \\\"dropout\\\": 0.1,\\n\",\n    
\\\"normalization\\\": 'layer_norm',\\n\",\n    \"    \\\"last_normalization\\\": 'layer_norm',\\n\",\n    \"    \\\"residual_type\\\": 'simple',\\n\",\n    \"    \\\"virtual_node\\\": None,\\n\",\n    \"    \\\"layer_type\\\": 'pyg:gcn',\\n\",\n    \"    \\\"layer_kwargs\\\": None\\n\",\n    \"}\\n\",\n    \"\\n\",\n    \"task_heads_kwargs = {\\n\",\n    \"    \\\"graph-task-1\\\": {\\n\",\n    \"        \\\"task_level\\\": 'graph',\\n\",\n    \"        \\\"out_dim\\\": 3,\\n\",\n    \"        \\\"hidden_dims\\\": 32,\\n\",\n    \"        \\\"depth\\\": 2,\\n\",\n    \"        \\\"activation\\\": 'relu',\\n\",\n    \"        \\\"last_activation\\\": None,\\n\",\n    \"        \\\"dropout\\\": 0.1,\\n\",\n    \"        \\\"normalization\\\": None,\\n\",\n    \"        \\\"last_normalization\\\": None,\\n\",\n    \"        \\\"residual_type\\\": \\\"none\\\"\\n\",\n    \"    },\\n\",\n    \"    \\\"graph-task-2\\\": {\\n\",\n    \"        \\\"task_level\\\": 'graph',\\n\",\n    \"        \\\"out_dim\\\": 4,\\n\",\n    \"        \\\"hidden_dims\\\": 32,\\n\",\n    \"        \\\"depth\\\": 2,\\n\",\n    \"        \\\"activation\\\": 'relu',\\n\",\n    \"        \\\"last_activation\\\": None,\\n\",\n    \"        \\\"dropout\\\": 0.1,\\n\",\n    \"        \\\"normalization\\\": None,\\n\",\n    \"        \\\"last_normalization\\\": None,\\n\",\n    \"        \\\"residual_type\\\": \\\"none\\\"\\n\",\n    \"    },\\n\",\n    \"    \\\"node-task-1\\\": {\\n\",\n    \"        \\\"task_level\\\": 'node',\\n\",\n    \"        \\\"out_dim\\\": 2,\\n\",\n    \"        \\\"hidden_dims\\\": 32,\\n\",\n    \"        \\\"depth\\\": 2,\\n\",\n    \"        \\\"activation\\\": 'relu',\\n\",\n    \"        \\\"last_activation\\\": None,\\n\",\n    \"        \\\"dropout\\\": 0.1,\\n\",\n    \"        \\\"normalization\\\": None,\\n\",\n    \"        \\\"last_normalization\\\": None,\\n\",\n    \"        \\\"residual_type\\\": \\\"none\\\"\\n\",\n    \"    }\\n\",\n    \"}\\n\",\n    
\"\\n\",\n    \"graph_output_nn_kwargs = {\\n\",\n    \"    \\\"graph\\\": {\\n\",\n    \"        \\\"pooling\\\": ['sum'],\\n\",\n    \"        \\\"out_dim\\\": temp_dim_2,\\n\",\n    \"        \\\"hidden_dims\\\": temp_dim_2,\\n\",\n    \"        \\\"depth\\\": 1,\\n\",\n    \"        \\\"activation\\\": 'relu',\\n\",\n    \"        \\\"last_activation\\\": None,\\n\",\n    \"        \\\"dropout\\\": 0.1,\\n\",\n    \"        \\\"normalization\\\": None,\\n\",\n    \"        \\\"last_normalization\\\": None,\\n\",\n    \"        \\\"residual_type\\\": \\\"none\\\"\\n\",\n    \"    },\\n\",\n    \"    \\\"node\\\": {\\n\",\n    \"        \\\"pooling\\\": None,\\n\",\n    \"        \\\"out_dim\\\": temp_dim_2,\\n\",\n    \"        \\\"hidden_dims\\\": temp_dim_2,\\n\",\n    \"        \\\"depth\\\": 1,\\n\",\n    \"        \\\"activation\\\": 'relu',\\n\",\n    \"        \\\"last_activation\\\": None,\\n\",\n    \"        \\\"dropout\\\": 0.1,\\n\",\n    \"        \\\"normalization\\\": None,\\n\",\n    \"        \\\"last_normalization\\\": None,\\n\",\n    \"        \\\"residual_type\\\": \\\"none\\\"\\n\",\n    \"    }\\n\",\n    \"}\\n\",\n    \"    \\n\",\n    \"\\n\",\n    \"gnn_net = FullGraphMultiTaskNetwork(\\n\",\n    \"    gnn_kwargs=gnn_kwargs,\\n\",\n    \"    pre_nn_kwargs=pre_nn_kwargs, \\n\",\n    \"    task_heads_kwargs=task_heads_kwargs,\\n\",\n    \"    graph_output_nn_kwargs = graph_output_nn_kwargs\\n\",\n    \")\"\n   ]\n  },\n  {\n   \"attachments\": {},\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Applying the network\\n\",\n    \"\\n\",\n    \"Once the network is defined, we only need to run the forward pass on the input graphs to get a prediction.\\n\",\n    \"\\n\",\n    \"The model outputs a dictionary of outputs, one for each task specified in `task_heads_kwargs`.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 117,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     
\"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"torch.Size([7, 5])\\n\",\n      \"\\n\",\n      \"\\n\",\n      \"FullGNN\\n\",\n      \"---------\\n\",\n      \"    pre-NN(depth=2, ResidualConnectionNone)\\n\",\n      \"        [FCLayer[5 -> 4 -> 23]\\n\",\n      \"    \\n\",\n      \"    GNN(depth=1, ResidualConnectionSimple(skip_steps=1))\\n\",\n      \"        GCNConvPyg[23 -> 17]\\n\",\n      \"        \\n\",\n      \"    \\n\",\n      \"        Task heads:\\n\",\n      \"        graph-task-1: NN-graph-task-1(depth=2, ResidualConnectionNone)\\n\",\n      \"            [FCLayer[17 -> 32 -> 3]\\n\",\n      \"        graph-task-2: NN-graph-task-2(depth=2, ResidualConnectionNone)\\n\",\n      \"            [FCLayer[17 -> 32 -> 4]\\n\",\n      \"        node-task-1: NN-node-task-1(depth=2, ResidualConnectionNone)\\n\",\n      \"            [FCLayer[17 -> 32 -> 2]\\n\",\n      \"\\n\",\n      \"\\n\",\n      \"graph-task-1 torch.Size([2, 3])\\n\",\n      \"graph-task-2 torch.Size([2, 4])\\n\",\n      \"node-task-1 torch.Size([7, 2])\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"graph = deepcopy(bg)\\n\",\n    \"print(graph.feat.shape)\\n\",\n    \"print(\\\"\\\\n\\\")\\n\",\n    \"\\n\",\n    \"print(gnn_net)\\n\",\n    \"print(\\\"\\\\n\\\")\\n\",\n    \"\\n\",\n    \"out = gnn_net(graph)\\n\",\n    \"for task in out.keys():\\n\",\n    \"    print(task, out[task].shape)\"\n   ]\n  },\n  {\n   \"attachments\": {},\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Building a network (using additional edge features)\\n\",\n    \"\\n\",\n    \"We can further use a GNN that uses additional edge features as inputs. For that, we only need to slightly modify the `gnn_kwargs` from above. Below, we define the new `gnn_edge_kwargs`, where we can set the `layer_type` to one of the GNN layers that support edge features (e.g., `pyg:gine` or `pyg:gps`) and add the additional parameter `in_dim_edges`. 
Optionally, we can configure a preprocessing NN for the edge features via a dictionary `pre_nn_edges_kwargs` analogous to `pre_nn_kwargs` above, or skip this step by setting it to `None`.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 118,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"temp_dim_edges = 8\\n\",\n    \"\\n\",\n    \"gnn_edge_kwargs = {\\n\",\n    \"    \\\"in_dim\\\": temp_dim_1,\\n\",\n    \"    \\\"in_dim_edges\\\": temp_dim_edges,\\n\",\n    \"    \\\"out_dim\\\": temp_dim_2,\\n\",\n    \"    \\\"hidden_dims\\\": 5,\\n\",\n    \"    \\\"depth\\\": 1,\\n\",\n    \"    \\\"activation\\\": 'gelu',\\n\",\n    \"    \\\"last_activation\\\": None,\\n\",\n    \"    \\\"dropout\\\": 0.1,\\n\",\n    \"    \\\"normalization\\\": 'layer_norm',\\n\",\n    \"    \\\"last_normalization\\\": 'layer_norm',\\n\",\n    \"    \\\"residual_type\\\": 'simple',\\n\",\n    \"    \\\"virtual_node\\\": None,\\n\",\n    \"    \\\"layer_type\\\": 'pyg:gine',\\n\",\n    \"    \\\"layer_kwargs\\\": None\\n\",\n    \"}\\n\",\n    \"\\n\",\n    \"pre_nn_edges_kwargs = {\\n\",\n    \"    \\\"in_dim\\\": in_dim_edges,\\n\",\n    \"    \\\"out_dim\\\": temp_dim_edges,\\n\",\n    \"    \\\"hidden_dims\\\": 4,\\n\",\n    \"    \\\"depth\\\": 2,\\n\",\n    \"    \\\"activation\\\": 'relu',\\n\",\n    \"    \\\"last_activation\\\": \\\"none\\\",\\n\",\n    \"    \\\"dropout\\\": 0.2\\n\",\n    \"}\\n\",\n    \"\\n\",\n    \"\\n\",\n    \"gnn_net_edges = FullGraphMultiTaskNetwork(\\n\",\n    \"    gnn_kwargs=gnn_edge_kwargs,\\n\",\n    \"    pre_nn_kwargs=pre_nn_kwargs,\\n\",\n    \"    pre_nn_edges_kwargs=pre_nn_edges_kwargs,\\n\",\n    \"    task_heads_kwargs=task_heads_kwargs,\\n\",\n    \"    graph_output_nn_kwargs = graph_output_nn_kwargs\\n\",\n    \")\"\n   ]\n  },\n  {\n   \"attachments\": {},\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Applying the network (using additional edge 
features)\\n\",\n    \"\\n\",\n    \"Again, we only need to run the forward pass on the input graphs to get a prediction.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 119,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"torch.Size([7, 5])\\n\",\n      \"torch.Size([7, 13])\\n\",\n      \"\\n\",\n      \"\\n\",\n      \"FullGNN\\n\",\n      \"---------\\n\",\n      \"    pre-NN(depth=2, ResidualConnectionNone)\\n\",\n      \"        [FCLayer[5 -> 4 -> 23]\\n\",\n      \"    \\n\",\n      \"    GNN(depth=1, ResidualConnectionSimple(skip_steps=1))\\n\",\n      \"        GCNConvPyg[23 -> 17]\\n\",\n      \"        \\n\",\n      \"    \\n\",\n      \"        Task heads:\\n\",\n      \"        graph-task-1: NN-graph-task-1(depth=2, ResidualConnectionNone)\\n\",\n      \"            [FCLayer[17 -> 32 -> 3]\\n\",\n      \"        graph-task-2: NN-graph-task-2(depth=2, ResidualConnectionNone)\\n\",\n      \"            [FCLayer[17 -> 32 -> 4]\\n\",\n      \"        node-task-1: NN-node-task-1(depth=2, ResidualConnectionNone)\\n\",\n      \"            [FCLayer[17 -> 32 -> 2]\\n\",\n      \"\\n\",\n      \"\\n\",\n      \"graph-task-1 torch.Size([2, 3])\\n\",\n      \"graph-task-2 torch.Size([2, 4])\\n\",\n      \"node-task-1 torch.Size([7, 2])\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"graph = deepcopy(bg)\\n\",\n    \"print(graph.feat.shape)\\n\",\n    \"print(graph.edge_feat.shape)\\n\",\n    \"print(\\\"\\\\n\\\")\\n\",\n    \"\\n\",\n    \"print(gnn_net_edges)\\n\",\n    \"print(\\\"\\\\n\\\")\\n\",\n    \"\\n\",\n    \"out = gnn_net_edges(graph)\\n\",\n    \"for task in out.keys():\\n\",\n    \"    print(task, out[task].shape)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": []\n  }\n ],\n \"metadata\": {\n  \"interpreter\": {\n   \"hash\": 
\"f4a99d018a205fcbcc0480c84566beaebcb91b08d0414b39a842df533e2a1d25\"\n  },\n  \"kernelspec\": {\n   \"display_name\": \"Python 3.8.8 64-bit ('goli': conda)\",\n   \"name\": \"python3\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.10.12\"\n  },\n  \"widgets\": {\n   \"application/vnd.jupyter.widget-state+json\": {\n    \"state\": {},\n    \"version_major\": 2,\n    \"version_minor\": 0\n   }\n  }\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 4\n}\n"
  },
  {
    "path": "docs/tutorials/gnn/using_gnn_layers.ipynb",
"content": "{\n \"cells\": [\n  {\n   \"attachments\": {},\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"# Using GNN layers\\n\",\n    \"\\n\",\n    \"The current library implements multiple state-of-the-art graph neural networks. In this tutorial, you will learn how to use the **GCN**, **GIN**, **GINE**, **GPS**, **Gated-GCN** and **PNA** layers in a simple `forward` context.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 53,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"The autoreload extension is already loaded. To reload it, use:\\n\",\n      \"  %reload_ext autoreload\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"%load_ext autoreload\\n\",\n    \"%autoreload 2\\n\",\n    \"\\n\",\n    \"import torch\\n\",\n    \"\\n\",\n    \"import torch_geometric as pyg\\n\",\n    \"from torch_geometric.data import Data, Batch\\n\",\n    \"\\n\",\n    \"from copy import deepcopy\\n\",\n    \"\\n\",\n    \"from graphium.nn.pyg_layers import (\\n\",\n    \"    GCNConvPyg,\\n\",\n    \"    GINConvPyg,\\n\",\n    \"    GatedGCNPyg,\\n\",\n    \"    GINEConvPyg,\\n\",\n    \"    GPSLayerPyg,\\n\",\n    \"    PNAMessagePassingPyg\\n\",\n    \")\\n\",\n    \"\\n\",\n    \"_ = torch.manual_seed(42)\"\n   ]\n  },\n  {\n   \"attachments\": {},\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"We will first create some simple batched graphs that will be used across the examples. 
Here, `bg` is a batch containing 2 graphs with random node features.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 54,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"DataBatch(edge_index=[2, 7], feat=[7, 5], edge_feat=[7, 13], batch=[7], ptr=[3])\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"in_dim = 5          # Input node-feature dimensions\\n\",\n    \"in_dim_edges = 13   # Input edge-feature dimensions\\n\",\n    \"out_dim = 11        # Desired output node-feature dimensions\\n\",\n    \"out_dim_edges = 15  # Desired output edge-feature dimensions\\n\",\n    \"\\n\",\n    \"\\n\",\n    \"# Let's create 2 simple pyg graphs. \\n\",\n    \"# start by specifying the edges with edge index\\n\",\n    \"edge_idx1 = torch.tensor([[0, 1, 2],\\n\",\n    \"                          [1, 2, 3]])\\n\",\n    \"edge_idx2 = torch.tensor([[2, 0, 0, 1],\\n\",\n    \"                          [0, 1, 2, 0]])\\n\",\n    \"\\n\",\n    \"# specify the node features, convention with variable x\\n\",\n    \"x1 = torch.randn(edge_idx1.max() + 1, in_dim, dtype=torch.float32)\\n\",\n    \"x2 = torch.randn(edge_idx2.max() + 1, in_dim, dtype=torch.float32)\\n\",\n    \"\\n\",\n    \"# specify the edge features in e\\n\",\n    \"e1 = torch.randn(edge_idx1.shape[-1], in_dim_edges, dtype=torch.float32)\\n\",\n    \"e2 = torch.randn(edge_idx2.shape[-1], in_dim_edges, dtype=torch.float32)\\n\",\n    \"\\n\",\n    \"# make the pyg graph objects with our constructed features\\n\",\n    \"g1 = Data(feat=x1, edge_index=edge_idx1, edge_feat=e1)\\n\",\n    \"g2 = Data(feat=x2, edge_index=edge_idx2, edge_feat=e2)\\n\",\n    \"\\n\",\n    \"# put the two graphs into a Batch graph\\n\",\n    \"bg = Batch.from_data_list([g1, g2])\\n\",\n    \"\\n\",\n    \"# The batched graph will show as a single graph with 7 nodes\\n\",\n    \"print(bg)\"\n   ]\n  },\n  {\n   \"attachments\": {},\n  
 \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"## GCN Layer\\n\",\n    \"\\n\",\n    \"To use the GCN layer from the *Kipf et al.* paper, the steps are very simple. We create the layer with the desired attributes, and apply it to the graph.\\n\",\n    \"\\n\",\n    \"<sub>Kipf, Thomas N., and Max Welling. \\\"Semi-supervised classification with graph convolutional networks.\\\" arXiv preprint arXiv:1609.02907 (2016).</sub>\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 55,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"torch.Size([7, 5])\\n\",\n      \"GCNConvPyg(5 -> 11, activation=relu)\\n\",\n      \"torch.Size([7, 11])\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# The GCN method doesn't support edge features, so we ignore them\\n\",\n    \"graph = deepcopy(bg)\\n\",\n    \"print(graph.feat.shape)\\n\",\n    \"\\n\",\n    \"# We create the layer\\n\",\n    \"layer = GCNConvPyg(\\n\",\n    \"            in_dim=in_dim, out_dim=out_dim, \\n\",\n    \"            activation=\\\"relu\\\", dropout=.3, normalization=\\\"batch_norm\\\")\\n\",\n    \"\\n\",\n    \"# We apply the forward pass on the node features\\n\",\n    \"graph = layer(graph)\\n\",\n    \"\\n\",\n    \"# 7 is the number of nodes, 5 the number of input features, and 11 the number of output features\\n\",\n    \"print(layer)\\n\",\n    \"print(graph.feat.shape)\"\n   ]\n  },\n  {\n   \"attachments\": {},\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"## GIN Layer\\n\",\n    \"\\n\",\n    \"To use the GIN layer from the *Xu et al.* paper, the steps are identical to GCN.\\n\",\n    \"\\n\",\n    \"<sub>Xu, Keyulu, et al. 
\\\"How powerful are graph neural networks?\\\" arXiv preprint arXiv:1810.00826 (2018).</sub>\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 56,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"torch.Size([7, 5])\\n\",\n      \"GINConvPyg(5 -> 11, activation=relu)\\n\",\n      \"torch.Size([7, 11])\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"graph = deepcopy(bg)\\n\",\n    \"print(graph.feat.shape)\\n\",\n    \"\\n\",\n    \"# We create the layer\\n\",\n    \"layer = GINConvPyg(\\n\",\n    \"            in_dim=in_dim, out_dim=out_dim, \\n\",\n    \"            activation=\\\"relu\\\", dropout=.3, normalization=\\\"batch_norm\\\")\\n\",\n    \"\\n\",\n    \"# We apply the forward pass on the node features\\n\",\n    \"graph = layer(graph)\\n\",\n    \"\\n\",\n    \"# 7 is the number of nodes, 5 the number of input features, and 11 the number of output features\\n\",\n    \"print(layer)\\n\",\n    \"print(graph.feat.shape)\"\n   ]\n  },\n  {\n   \"attachments\": {},\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"## GINE Layer\\n\",\n    \"\\n\",\n    \"To use the GINE layer from the *Hu et al.* paper, we also need to provide additional edge features as inputs.\\n\",\n    \"\\n\",\n    \"<sub>Hu, Weihua, et al. 
\\\"Strategies for Pre-training Graph Neural Networks.\\\" arXiv preprint arXiv:1905.12265 (2019).</sub>\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 57,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"torch.Size([7, 5])\\n\",\n      \"torch.Size([7, 13])\\n\",\n      \"GINEConvPyg(5 -> 11, activation=relu)\\n\",\n      \"torch.Size([7, 11])\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# The GINE method uses edge features, so we have to pass the input dimension\\n\",\n    \"graph = deepcopy(bg)\\n\",\n    \"print(graph.feat.shape)\\n\",\n    \"print(graph.edge_feat.shape)\\n\",\n    \"\\n\",\n    \"# We create the layer\\n\",\n    \"layer = GINEConvPyg(\\n\",\n    \"            in_dim=in_dim, out_dim=out_dim,\\n\",\n    \"            in_dim_edges=in_dim_edges,\\n\",\n    \"            activation=\\\"relu\\\", dropout=.3, normalization=\\\"batch_norm\\\")\\n\",\n    \"\\n\",\n    \"# We apply the forward pass on the node features\\n\",\n    \"graph = layer(graph)\\n\",\n    \"\\n\",\n    \"# 7 is the number of nodes, 5 the number of input features, and 11 the number of output features\\n\",\n    \"# 7 is the number of edges, 13 the number of input edge features\\n\",\n    \"print(layer)\\n\",\n    \"print(graph.feat.shape)\"\n   ]\n  },\n  {\n   \"attachments\": {},\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"## GPS Layer\\n\",\n    \"\\n\",\n    \"To use the GPS layer from the *Rampášek et al.* paper, we also need to provide additional edge features as inputs. It is a hybrid approach using both a GNN and a transformer in conjunction. Therefore, we further need to specify the GNN type and attention type used in the layer.\\n\",\n    \"\\n\",\n    \"<sub>Rampášek, Ladislav, et al. 
\\\"Recipe for a General, Powerful, Scalable Graph Transformer.\\\" arXiv preprint arXiv:2205.12454 (2022).</sub>\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 58,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"torch.Size([7, 5])\\n\",\n      \"torch.Size([7, 13])\\n\",\n      \"GPSLayerPyg(5 -> 11, activation=relu)\\n\",\n      \"torch.Size([7, 11])\\n\",\n      \"torch.Size([7, 13])\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"graph = deepcopy(bg)\\n\",\n    \"print(graph.feat.shape)\\n\",\n    \"print(graph.edge_feat.shape)\\n\",\n    \"\\n\",\n    \"# We create the layer\\n\",\n    \"layer = GPSLayerPyg(\\n\",\n    \"            in_dim=in_dim, out_dim=out_dim,\\n\",\n    \"            in_dim_edges=in_dim_edges,\\n\",\n    \"            mpnn_type=\\\"pyg:gine\\\", attn_type=\\\"full-attention\\\",\\n\",\n    \"            activation=\\\"relu\\\", dropout=.3, normalization=\\\"batch_norm\\\")\\n\",\n    \"\\n\",\n    \"# We apply the forward pass on the node features\\n\",\n    \"graph = layer(graph)\\n\",\n    \"\\n\",\n    \"# 7 is the number of nodes, 5 the number of input features, and 11 the number of output features\\n\",\n    \"# 7 is the number of edges, 13 the number of input edge features\\n\",\n    \"print(layer)\\n\",\n    \"print(graph.feat.shape)\\n\",\n    \"print(graph.edge_feat.shape)\"\n   ]\n  },\n  {\n   \"attachments\": {},\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Gated-GCN Layer\\n\",\n    \"\\n\",\n    \"To use the Gated-GCN layer from the *Bresson et al.* paper, the steps are different since the layer not only requires edge features as inputs, but also outputs new edge features. Therefore, we have to further specify the number of output edge features.\\n\",\n    \"\\n\",\n    \"<sub>Bresson, Xavier, and Thomas Laurent. 
\\\"Residual gated graph convnets.\\\" arXiv preprint arXiv:1711.07553 (2017).</sub>\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 59,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"torch.Size([7, 5])\\n\",\n      \"torch.Size([7, 13])\\n\",\n      \"GatedGCNPyg()\\n\",\n      \"torch.Size([7, 11])\\n\",\n      \"torch.Size([7, 15])\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"graph = deepcopy(bg)\\n\",\n    \"print(graph.feat.shape)\\n\",\n    \"print(graph.edge_feat.shape)\\n\",\n    \"\\n\",\n    \"# We create the layer\\n\",\n    \"layer = GatedGCNPyg(\\n\",\n    \"            in_dim=in_dim, out_dim=out_dim,\\n\",\n    \"            in_dim_edges=in_dim_edges, out_dim_edges=out_dim_edges,\\n\",\n    \"            activation=\\\"relu\\\", dropout=.3, normalization=\\\"batch_norm\\\")\\n\",\n    \"\\n\",\n    \"# We apply the forward pass on the node features\\n\",\n    \"graph = layer(graph)\\n\",\n    \"\\n\",\n    \"# 7 is the number of nodes, 5 the number of input features, and 11 the number of output features\\n\",\n    \"# 7 is the number of edges, 13 the number of input edge features, and 15 the number of output edge features\\n\",\n    \"print(layer)\\n\",\n    \"print(graph.feat.shape)\\n\",\n    \"print(graph.edge_feat.shape)\"\n   ]\n  },\n  {\n   \"attachments\": {},\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"## PNA\\n\",\n    \"\\n\",\n    \"PNA is a multi-aggregator method proposed by *Corso et al.*. It supports 2 types of aggregation: convolutional *PNA-conv* or message-passing *PNA-msgpass*. Here, we use the typically more powerful *PNA-msgpass*. It supports edges as inputs, but doesn't output edges. We also need to specify the aggregators and scalers specific to this layer.\\n\",\n    \"\\n\",\n    \"<sub>Corso, Gabriele, et al. 
\\\"Principal Neighbourhood Aggregation for Graph Nets.\\\"\\n\",\n    \"arXiv preprint arXiv:2004.05718 (2020).</sub>\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 60,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"torch.Size([7, 5])\\n\",\n      \"torch.Size([7, 13])\\n\",\n      \"PNAMessagePassingPyg()\\n\",\n      \"torch.Size([7, 11])\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"graph = deepcopy(bg)\\n\",\n    \"print(graph.feat.shape)\\n\",\n    \"print(graph.edge_feat.shape)\\n\",\n    \"\\n\",\n    \"# We create the layer, and need to specify the aggregators and scalers\\n\",\n    \"layer = PNAMessagePassingPyg(\\n\",\n    \"    in_dim=in_dim, out_dim=out_dim,\\n\",\n    \"    in_dim_edges=in_dim_edges,\\n\",\n    \"    aggregators=[\\\"mean\\\", \\\"max\\\", \\\"min\\\", \\\"std\\\"],\\n\",\n    \"    scalers=[\\\"identity\\\", \\\"amplification\\\", \\\"attenuation\\\"],\\n\",\n    \"    activation=\\\"relu\\\", dropout=.3, normalization=\\\"batch_norm\\\")\\n\",\n    \"\\n\",\n    \"graph = layer(graph)\\n\",\n    \"\\n\",\n    \"print(layer)\\n\",\n    \"print(graph.feat.shape)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": []\n  }\n ],\n \"metadata\": {\n  \"interpreter\": {\n   \"hash\": \"f4a99d018a205fcbcc0480c84566beaebcb91b08d0414b39a842df533e2a1d25\"\n  },\n  \"kernelspec\": {\n   \"display_name\": \"Python 3.8.8 64-bit ('goli': conda)\",\n   \"name\": \"python3\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.10.12\"\n  },\n  \"orig_nbformat\": 2\n },\n \"nbformat\": 4,\n 
\"nbformat_minor\": 2\n}\n"
  },
  {
    "path": "docs/tutorials/model_training/simple-molecular-model.ipynb",
"content": "{\n \"cells\": [\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"# Building and training a simple model from configurations\\n\",\n    \"\\n\",\n    \"This tutorial will walk you through how to use a configuration file to define all the parameters of a model and of the trainer. This tutorial focuses on training from SMILES data in a CSV format.\\n\",\n    \"\\n\",\n    \"The workflow for testing your code on the entire pipeline is as follows:\\n\",\n    \"\\n\",\n    \"1. Select a subset of the [available configs](https://github.com/datamol-io/graphium/tree/main/expts/hydra-configs) as a starting point.\\n\",\n    \"2. Create additional configs or modify the existing configs to suit your needs.\\n\",\n    \"3. Train or fine-tune a model with the `graphium-train` CLI.\\n\",\n    \"\\n\",\n    \"## Creating the YAML file\\n\",\n    \"\\n\",\n    \"The first step is to create a YAML file containing all the required configurations, with an example given at `graphium/expts/hydra-configs/main.yaml`. We will go through each part of the configurations. 
See also the README [here](https://github.com/datamol-io/graphium/tree/main/expts/hydra-configs).\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 19,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"import yaml\\n\",\n    \"import omegaconf\\n\",\n    \"\\n\",\n    \"from hydra import compose, initialize\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 20,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"def print_config_with_key(config, key):\\n\",\n    \"    new_config = {key: config[key]}\\n\",\n    \"    print(omegaconf.OmegaConf.to_yaml(new_config))\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 21,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"Yaml file loaded\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# First, let's read the yaml configuration file\\n\",\n    \"with initialize(version_base=None, config_path=\\\"../../../expts/hydra-configs\\\"):\\n\",\n    \"    yaml_config = compose(config_name=\\\"main\\\")\\n\",\n    \"\\n\",\n    \"print(\\\"Yaml file loaded\\\")\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Constants\\n\",\n    \"\\n\",\n    \"First, we define the constants such as the random seed and whether the model should raise or ignore an error.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 22,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"constants:\\n\",\n      \"  name: neurips2023_small_data_gcn\\n\",\n      \"  seed: 42\\n\",\n      \"  max_epochs: 100\\n\",\n      \"  data_dir: expts/data/neurips2023/small-dataset\\n\",\n      \"  raise_train_error: true\\n\",\n      \"\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"print_config_with_key(yaml_config, 
\\\"constants\\\")\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Datamodule\\n\",\n    \"\\n\",\n    \"Here, we define all the parameters required by the datamodule to run correctly, such as the dataset path, whether to cache, the columns for the training, the molecular featurization to use, the train/val/test splits and the batch size.\\n\",\n    \"\\n\",\n    \"For more details, see class [`MultitaskFromSmilesDataModule`](https://graphium-docs.datamol.io/stable/api/graphium.data.html#graphium.data.datamodule.MultitaskFromSmilesDataModule)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 23,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"datamodule:\\n\",\n      \"  module_type: MultitaskFromSmilesDataModule\\n\",\n      \"  args:\\n\",\n      \"    prepare_dict_or_graph: pyg:graph\\n\",\n      \"    featurization_n_jobs: 4\\n\",\n      \"    featurization_progress: true\\n\",\n      \"    featurization_backend: loky\\n\",\n      \"    processed_graph_data_path: ../datacache/neurips2023-small/\\n\",\n      \"    num_workers: 4\\n\",\n      \"    persistent_workers: false\\n\",\n      \"    featurization:\\n\",\n      \"      atom_property_list_onehot:\\n\",\n      \"      - atomic-number\\n\",\n      \"      - group\\n\",\n      \"      - period\\n\",\n      \"      - total-valence\\n\",\n      \"      atom_property_list_float:\\n\",\n      \"      - degree\\n\",\n      \"      - formal-charge\\n\",\n      \"      - radical-electron\\n\",\n      \"      - aromatic\\n\",\n      \"      - in-ring\\n\",\n      \"      edge_property_list:\\n\",\n      \"      - bond-type-onehot\\n\",\n      \"      - stereo\\n\",\n      \"      - in-ring\\n\",\n      \"      add_self_loop: false\\n\",\n      \"      explicit_H: false\\n\",\n      \"      use_bonds_weights: false\\n\",\n      \"      
pos_encoding_as_features:\\n\",\n      \"        pos_types:\\n\",\n      \"          lap_eigvec:\\n\",\n      \"            pos_level: node\\n\",\n      \"            pos_type: laplacian_eigvec\\n\",\n      \"            num_pos: 8\\n\",\n      \"            normalization: none\\n\",\n      \"            disconnected_comp: true\\n\",\n      \"          lap_eigval:\\n\",\n      \"            pos_level: node\\n\",\n      \"            pos_type: laplacian_eigval\\n\",\n      \"            num_pos: 8\\n\",\n      \"            normalization: none\\n\",\n      \"            disconnected_comp: true\\n\",\n      \"          rw_pos:\\n\",\n      \"            pos_level: node\\n\",\n      \"            pos_type: rw_return_probs\\n\",\n      \"            ksteps: 16\\n\",\n      \"    task_specific_args:\\n\",\n      \"      qm9:\\n\",\n      \"        df: null\\n\",\n      \"        df_path: ${constants.data_dir}/qm9.csv.gz\\n\",\n      \"        smiles_col: smiles\\n\",\n      \"        label_cols:\\n\",\n      \"        - A\\n\",\n      \"        - B\\n\",\n      \"        - C\\n\",\n      \"        - mu\\n\",\n      \"        - alpha\\n\",\n      \"        - homo\\n\",\n      \"        - lumo\\n\",\n      \"        - gap\\n\",\n      \"        - r2\\n\",\n      \"        - zpve\\n\",\n      \"        - u0\\n\",\n      \"        - u298\\n\",\n      \"        - h298\\n\",\n      \"        - g298\\n\",\n      \"        - cv\\n\",\n      \"        - u0_atom\\n\",\n      \"        - u298_atom\\n\",\n      \"        - h298_atom\\n\",\n      \"        - g298_atom\\n\",\n      \"        splits_path: ${constants.data_dir}/qm9_random_splits.pt\\n\",\n      \"        seed: ${constants.seed}\\n\",\n      \"        task_level: graph\\n\",\n      \"        label_normalization:\\n\",\n      \"          normalize_val_test: true\\n\",\n      \"          method: normal\\n\",\n      \"      tox21:\\n\",\n      \"        df: null\\n\",\n      \"        df_path: 
${constants.data_dir}/Tox21-7k-12-labels.csv.gz\\n\",\n      \"        smiles_col: smiles\\n\",\n      \"        label_cols:\\n\",\n      \"        - NR-AR\\n\",\n      \"        - NR-AR-LBD\\n\",\n      \"        - NR-AhR\\n\",\n      \"        - NR-Aromatase\\n\",\n      \"        - NR-ER\\n\",\n      \"        - NR-ER-LBD\\n\",\n      \"        - NR-PPAR-gamma\\n\",\n      \"        - SR-ARE\\n\",\n      \"        - SR-ATAD5\\n\",\n      \"        - SR-HSE\\n\",\n      \"        - SR-MMP\\n\",\n      \"        - SR-p53\\n\",\n      \"        splits_path: ${constants.data_dir}/Tox21_random_splits.pt\\n\",\n      \"        seed: ${constants.seed}\\n\",\n      \"        task_level: graph\\n\",\n      \"      zinc:\\n\",\n      \"        df: null\\n\",\n      \"        df_path: ${constants.data_dir}/ZINC12k.csv.gz\\n\",\n      \"        smiles_col: smiles\\n\",\n      \"        label_cols:\\n\",\n      \"        - SA\\n\",\n      \"        - logp\\n\",\n      \"        - score\\n\",\n      \"        splits_path: ${constants.data_dir}/ZINC12k_random_splits.pt\\n\",\n      \"        seed: ${constants.seed}\\n\",\n      \"        task_level: graph\\n\",\n      \"        label_normalization:\\n\",\n      \"          normalize_val_test: true\\n\",\n      \"          method: normal\\n\",\n      \"    batch_size_training: 200\\n\",\n      \"    batch_size_inference: 200\\n\",\n      \"\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"print_config_with_key(yaml_config, \\\"datamodule\\\")\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Architecture\\n\",\n    \"\\n\",\n    \"The architecture is based on [`FullGraphMultiTaskNetwork`](https://graphium-docs.datamol.io/stable/api/graphium.nn/architectures.html#graphium.nn.architectures.global_architectures.FullGraphMultiTaskNetwork).\\n\",\n    \"Here, we define all the layers for the model, including the layers for the pre-processing MLP (input layers `pre-nn` and 
`pre_nn_edges`), the positional encoder (`pe_encoders`), the post-processing MLP (output layers `graph_output_nn`), and the main GNN (graph neural network `gnn`).\\n\",\n    \"\\n\",\n    \"You can find details in the following:\\n\",\n    \"- info about the positional encoder in [`graphium.nn.encoders`](https://graphium-docs.datamol.io/stable/api/graphium.nn/encoders.html)\\n\",\n    \"- info about the gnn layers in [`graphium.nn.pyg_layers`](https://graphium-docs.datamol.io/stable/api/graphium.nn/pyg_layers.html)\\n\",\n    \"- info about the architecture [`FullGraphMultiTaskNetwork`](https://graphium-docs.datamol.io/stable/api/graphium.nn/architectures.html#graphium.nn.architectures.global_architectures.FullGraphMultiTaskNetwork)\\n\",\n    \"- info about the main class for the GNN layers, [`BaseGraphStructure`](https://graphium-docs.datamol.io/stable/api/graphium.nn/graphium.nn.html#graphium.nn.base_graph_layer.BaseGraphStructure)\\n\",\n    \"\\n\",\n    \"The parameters let you choose the feature size, depth, skip connections, pooling, and virtual node. 
It also supports different GNN layers such as [`GatedGCNPyg`](https://graphium-docs.datamol.io/stable/api/graphium.nn/pyg_layers.html#graphium.nn.pyg_layers.gated_gcn_pyg), [`GINConvPyg`](https://graphium-docs.datamol.io/stable/api/graphium.nn/pyg_layers.html#graphium.nn.pyg_layers.gin_pyg), [`GINEConvPyg`](https://graphium-docs.datamol.io/stable/api/graphium.nn/pyg_layers.html#graphium.nn.pyg_layers.gin_pyg.GINEConvPyg), [`GPSLayerPyg`](https://graphium-docs.datamol.io/stable/api/graphium.nn/pyg_layers.html#graphium.nn.pyg_layers.gps_pyg.GPSLayerPyg), and [`MPNNPlusPyg`](https://graphium-docs.datamol.io/stable/api/graphium.nn/pyg_layers.html#graphium.nn.pyg_layers.mpnn_pyg.MPNNPlusPyg).\\n\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 24,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"architecture:\\n\",\n      \"  model_type: FullGraphMultiTaskNetwork\\n\",\n      \"  mup_base_path: null\\n\",\n      \"  pre_nn:\\n\",\n      \"    out_dim: 64\\n\",\n      \"    hidden_dims: 256\\n\",\n      \"    depth: 2\\n\",\n      \"    activation: relu\\n\",\n      \"    last_activation: none\\n\",\n      \"    dropout: 0.18\\n\",\n      \"    normalization: layer_norm\\n\",\n      \"    last_normalization: ${architecture.pre_nn.normalization}\\n\",\n      \"    residual_type: none\\n\",\n      \"  pre_nn_edges: null\\n\",\n      \"  pe_encoders:\\n\",\n      \"    out_dim: 32\\n\",\n      \"    pool: sum\\n\",\n      \"    last_norm: None\\n\",\n      \"    encoders:\\n\",\n      \"      la_pos:\\n\",\n      \"        encoder_type: laplacian_pe\\n\",\n      \"        input_keys:\\n\",\n      \"        - laplacian_eigvec\\n\",\n      \"        - laplacian_eigval\\n\",\n      \"        output_keys:\\n\",\n      \"        - feat\\n\",\n      \"        hidden_dim: 64\\n\",\n      \"        out_dim: 32\\n\",\n      \"        model_type: DeepSet\\n\",\n      \"        
num_layers: 2\\n\",\n      \"        num_layers_post: 1\\n\",\n      \"        dropout: 0.1\\n\",\n      \"        first_normalization: none\\n\",\n      \"      rw_pos:\\n\",\n      \"        encoder_type: mlp\\n\",\n      \"        input_keys:\\n\",\n      \"        - rw_return_probs\\n\",\n      \"        output_keys:\\n\",\n      \"        - feat\\n\",\n      \"        hidden_dim: 64\\n\",\n      \"        out_dim: 32\\n\",\n      \"        num_layers: 2\\n\",\n      \"        dropout: 0.1\\n\",\n      \"        normalization: layer_norm\\n\",\n      \"        first_normalization: layer_norm\\n\",\n      \"  gnn:\\n\",\n      \"    in_dim: 64\\n\",\n      \"    out_dim: 96\\n\",\n      \"    hidden_dims: 96\\n\",\n      \"    depth: 4\\n\",\n      \"    activation: gelu\\n\",\n      \"    last_activation: none\\n\",\n      \"    dropout: 0.1\\n\",\n      \"    normalization: layer_norm\\n\",\n      \"    last_normalization: ${architecture.pre_nn.normalization}\\n\",\n      \"    residual_type: simple\\n\",\n      \"    virtual_node: none\\n\",\n      \"    layer_type: pyg:gcn\\n\",\n      \"    layer_kwargs: null\\n\",\n      \"  graph_output_nn:\\n\",\n      \"    graph:\\n\",\n      \"      pooling:\\n\",\n      \"      - sum\\n\",\n      \"      out_dim: 96\\n\",\n      \"      hidden_dims: 96\\n\",\n      \"      depth: 1\\n\",\n      \"      activation: relu\\n\",\n      \"      last_activation: none\\n\",\n      \"      dropout: ${architecture.pre_nn.dropout}\\n\",\n      \"      normalization: ${architecture.pre_nn.normalization}\\n\",\n      \"      last_normalization: none\\n\",\n      \"      residual_type: none\\n\",\n      \"  task_heads:\\n\",\n      \"    qm9:\\n\",\n      \"      task_level: graph\\n\",\n      \"      out_dim: 19\\n\",\n      \"      hidden_dims: 128\\n\",\n      \"      depth: 2\\n\",\n      \"      activation: relu\\n\",\n      \"      last_activation: none\\n\",\n      \"      dropout: ${architecture.pre_nn.dropout}\\n\",\n    
  \"      normalization: ${architecture.pre_nn.normalization}\\n\",\n      \"      last_normalization: none\\n\",\n      \"      residual_type: none\\n\",\n      \"    tox21:\\n\",\n      \"      task_level: graph\\n\",\n      \"      out_dim: 12\\n\",\n      \"      hidden_dims: 64\\n\",\n      \"      depth: 2\\n\",\n      \"      activation: relu\\n\",\n      \"      last_activation: none\\n\",\n      \"      dropout: ${architecture.pre_nn.dropout}\\n\",\n      \"      normalization: ${architecture.pre_nn.normalization}\\n\",\n      \"      last_normalization: none\\n\",\n      \"      residual_type: none\\n\",\n      \"    zinc:\\n\",\n      \"      task_level: graph\\n\",\n      \"      out_dim: 3\\n\",\n      \"      hidden_dims: 32\\n\",\n      \"      depth: 2\\n\",\n      \"      activation: relu\\n\",\n      \"      last_activation: none\\n\",\n      \"      dropout: ${architecture.pre_nn.dropout}\\n\",\n      \"      normalization: ${architecture.pre_nn.normalization}\\n\",\n      \"      last_normalization: none\\n\",\n      \"      residual_type: none\\n\",\n      \"\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"print_config_with_key(yaml_config, \\\"architecture\\\")\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Predictor\\n\",\n    \"\\n\",\n    \"In the predictor, we define the loss functions, the metrics to track on the progress bar, and all the parameters necessary for the optimizer.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 25,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"predictor:\\n\",\n      \"  metrics_on_progress_bar:\\n\",\n      \"    qm9:\\n\",\n      \"    - mae\\n\",\n      \"    tox21:\\n\",\n      \"    - auroc\\n\",\n      \"    zinc:\\n\",\n      \"    - mae\\n\",\n      \"  loss_fun:\\n\",\n      \"    qm9: mae_ipu\\n\",\n      \"    tox21: 
bce_logits_ipu\\n\",\n      \"    zinc: mae_ipu\\n\",\n      \"  random_seed: ${constants.seed}\\n\",\n      \"  optim_kwargs:\\n\",\n      \"    lr: 4.0e-05\\n\",\n      \"  torch_scheduler_kwargs:\\n\",\n      \"    module_type: WarmUpLinearLR\\n\",\n      \"    max_num_epochs: ${constants.max_epochs}\\n\",\n      \"    warmup_epochs: 10\\n\",\n      \"    verbose: false\\n\",\n      \"  scheduler_kwargs: null\\n\",\n      \"  target_nan_mask: null\\n\",\n      \"  multitask_handling: flatten\\n\",\n      \"  metrics_every_n_train_steps: 300\\n\",\n      \"\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"print_config_with_key(yaml_config, \\\"predictor\\\")\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Metrics\\n\",\n    \"\\n\",\n    \"All the metrics can be defined there. If we want to use a classification metric, we can also define a threshold.\\n\",\n    \"\\n\",\n    \"See class [`graphium.trainer.metrics.MetricWrapper`](https://graphium-docs.datamol.io/stable/api/graphium.trainer.html#graphium.trainer.metrics.MetricWrapper) for more details.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 26,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"metrics:\\n\",\n      \"  qm9:\\n\",\n      \"  - name: mae\\n\",\n      \"    metric: mae_ipu\\n\",\n      \"    target_nan_mask: null\\n\",\n      \"    multitask_handling: flatten\\n\",\n      \"    threshold_kwargs: null\\n\",\n      \"  - name: pearsonr\\n\",\n      \"    metric: pearsonr_ipu\\n\",\n      \"    threshold_kwargs: null\\n\",\n      \"    target_nan_mask: null\\n\",\n      \"    multitask_handling: mean-per-label\\n\",\n      \"  - name: r2_score\\n\",\n      \"    metric: r2_score_ipu\\n\",\n      \"    target_nan_mask: null\\n\",\n      \"    multitask_handling: mean-per-label\\n\",\n      \"    threshold_kwargs: null\\n\",\n    
  \"  tox21:\\n\",\n      \"  - name: auroc\\n\",\n      \"    metric: auroc_ipu\\n\",\n      \"    task: binary\\n\",\n      \"    multitask_handling: mean-per-label\\n\",\n      \"    threshold_kwargs: null\\n\",\n      \"  - name: avpr\\n\",\n      \"    metric: average_precision_ipu\\n\",\n      \"    task: binary\\n\",\n      \"    multitask_handling: mean-per-label\\n\",\n      \"    threshold_kwargs: null\\n\",\n      \"  - name: f1 > 0.5\\n\",\n      \"    metric: f1\\n\",\n      \"    multitask_handling: mean-per-label\\n\",\n      \"    target_to_int: true\\n\",\n      \"    num_classes: 2\\n\",\n      \"    average: micro\\n\",\n      \"    threshold_kwargs:\\n\",\n      \"      operator: greater\\n\",\n      \"      threshold: 0.5\\n\",\n      \"      th_on_preds: true\\n\",\n      \"      th_on_target: true\\n\",\n      \"  - name: precision > 0.5\\n\",\n      \"    metric: precision\\n\",\n      \"    multitask_handling: mean-per-label\\n\",\n      \"    average: micro\\n\",\n      \"    threshold_kwargs:\\n\",\n      \"      operator: greater\\n\",\n      \"      threshold: 0.5\\n\",\n      \"      th_on_preds: true\\n\",\n      \"      th_on_target: true\\n\",\n      \"  zinc:\\n\",\n      \"  - name: mae\\n\",\n      \"    metric: mae_ipu\\n\",\n      \"    target_nan_mask: null\\n\",\n      \"    multitask_handling: flatten\\n\",\n      \"    threshold_kwargs: null\\n\",\n      \"  - name: pearsonr\\n\",\n      \"    metric: pearsonr_ipu\\n\",\n      \"    threshold_kwargs: null\\n\",\n      \"    target_nan_mask: null\\n\",\n      \"    multitask_handling: mean-per-label\\n\",\n      \"  - name: r2_score\\n\",\n      \"    metric: r2_score_ipu\\n\",\n      \"    target_nan_mask: null\\n\",\n      \"    multitask_handling: mean-per-label\\n\",\n      \"    threshold_kwargs: null\\n\",\n      \"\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"print_config_with_key(yaml_config, \\\"metrics\\\")\"\n   ]\n  },\n  {\n   \"cell_type\": 
\"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Trainer\\n\",\n    \"\\n\",\n    \"Finally, the trainer section defines the number of epochs to train for, the checkpointing, and the patience.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 27,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"trainer:\\n\",\n      \"  seed: ${constants.seed}\\n\",\n      \"  model_checkpoint:\\n\",\n      \"    filename: ${constants.name}\\n\",\n      \"    save_last: true\\n\",\n      \"    dirpath: models_checkpoints/neurips2023-small-gcn/\\n\",\n      \"  trainer:\\n\",\n      \"    precision: 32\\n\",\n      \"    max_epochs: ${constants.max_epochs}\\n\",\n      \"    min_epochs: 1\\n\",\n      \"    check_val_every_n_epoch: 20\\n\",\n      \"    accumulate_grad_batches: 1\\n\",\n      \"\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"print_config_with_key(yaml_config, \\\"trainer\\\")\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Training the model\\n\",\n    \"\\n\",\n    \"Now that we have defined all the configuration files, we want to train the model. The steps are straightforward using the config loaders, and are given below.\\n\",\n    \"\\n\",\n    \"First, make sure the dataset file is downloaded. 
Using `config_gps_10M_pcqm4m.yaml` as an example, make sure the file specified by `df_path` in the config is available.\\n\",\n    \"In this case, we need to download `pcqm4mv2-20k.csv` into the specified directory `graphium/data/PCQM4M/pcqm4mv2-20k.csv`.\\n\",\n    \"\\n\",\n    \"After that, we can simply run a training through the CLI:\\n\",\n    \"```bash\\n\",\n    \"graphium-train\\n\",\n    \"```\"\n   ]\n  }\n ],\n \"metadata\": {\n  \"interpreter\": {\n   \"hash\": \"f4a99d018a205fcbcc0480c84566beaebcb91b08d0414b39a842df533e2a1d25\"\n  },\n  \"kernelspec\": {\n   \"display_name\": \"Python 3 (ipykernel)\",\n   \"language\": \"python\",\n   \"name\": \"python3\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.10.12\"\n  },\n  \"widgets\": {\n   \"application/vnd.jupyter.widget-state+json\": {\n    \"state\": {},\n    \"version_major\": 2,\n    \"version_minor\": 0\n   }\n  }\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 4\n}\n"
  },
  {
    "path": "enable_ipu.sh",
    "content": "#!/bin/bash\n\n# --------------------------------------------------------------------------------\n# Copyright (c) 2023 Graphcore Limited.\n# Use of this software is subject to the terms and conditions outlined in the LICENSE file.\n# Unauthorized modification, distribution, or use is prohibited. Provided 'as is' without\n# warranties of any kind.\n#\n# Graphcore Limited is not liable for any damages arising from its use.\n# Refer to the LICENSE file for the full terms and conditions.\n# --------------------------------------------------------------------------------\n\n# Default location for the virtual environment\ndefault_venv_name=\".graphium_ipu\"\n\n# Allow the user to specify the location of their virtual environment\n# If not specified, use the default location\nvenv_name=${1:-$default_venv_name}\n\n# Constants\nsdk_path=\"${venv_name}/poplar_sdk-ubuntu_20_04-3.3.0+1403-208993bbb7\"\n\n# Source the virtual environment\nsource ${venv_name}/bin/activate\nsource ${sdk_path}/enable"
  },
  {
    "path": "env.yml",
    "content": "channels:\n  - conda-forge\n  # - pyg # Add for Windows\n\ndependencies:\n  - python >=3.8\n  - pip\n  - typer\n  - loguru\n  - omegaconf >=2.0.0\n  - tqdm\n  - platformdirs\n\n  # scientific\n  - numpy\n  - scipy >=1.4\n  - pandas >=1.0\n  - scikit-learn\n  - fastparquet\n\n  # viz\n  - matplotlib >=3.0.1\n  - seaborn\n\n  # cloud IO\n  - fsspec >=2021.6\n  - s3fs >=2021.6\n  - gcsfs >=2021.6\n\n  # ML packages\n  - cuda-version # also works on CPU-only systems\n  - pytorch >=1.12\n  - lightning >=2.0\n  - torchmetrics >=0.7.0,<0.11\n  - ogb\n  - pytorch_geometric >=2.0 # Use `pyg` for Windows instead of `pytorch_geometric`\n  - wandb\n  - mup\n  - pytorch_sparse >=0.6\n  - pytorch_cluster >=1.5\n  - pytorch_scatter >=2.0\n\n  # chemistry\n  - rdkit\n  - datamol >=0.10\n\n  # Optional deps\n  - sympy\n  - tensorboard\n  - pydantic <2  # because of lightning. See https://github.com/Lightning-AI/lightning/issues/18026 and https://github.com/Lightning-AI/lightning/pull/18022\n\n  # Dev\n  - pytest >=6.0\n  - pytest-xdist\n  - pytest-cov\n  - pytest-forked\n  - nbconvert\n  - black >=23\n  - jupyterlab\n  - ipywidgets\n\n  # Doc\n  - mkdocs\n  - mkdocs-material\n  - mkdocs-material-extensions\n  - mkdocstrings\n  - mkdocstrings-python\n  - mkdocs-jupyter\n  - markdown-include\n  - mike >=1.0.0\n\n  - pip:\n      - lightning-graphcore # optional, for using IPUs only\n      - hydra-core>=1.3.2\n      - hydra-optuna-sweeper\n"
  },
  {
    "path": "expts/__init__.py",
    "content": ""
  },
  {
    "path": "expts/configs/config_gps_10M_pcqm4m.yaml",
    "content": "# Testing the mpnn only model with the PCQMv2 dataset on IPU.\nconstants:\n  name: &name pcqm4mv2_mpnn_4layer\n  seed: &seed 42\n  raise_train_error: true   # Whether the code should raise an error if it crashes during training\n\naccelerator:\n  type: ipu  # cpu or ipu or gpu\n  config_override:\n    datamodule:\n      args:\n        ipu_dataloader_training_opts:\n          mode: async\n          max_num_nodes_per_graph: 20 # train max nodes: 20, max_edges: 54\n          max_num_edges_per_graph: 60\n        ipu_dataloader_inference_opts:\n          mode: async\n          max_num_nodes_per_graph: 16 # valid max nodes: 51, max_edges: 118\n          max_num_edges_per_graph: 120\n        # Data handling-related\n        batch_size_training: 64\n        batch_size_inference: 16\n    predictor:\n      optim_kwargs:\n        loss_scaling: 1024\n    trainer:\n      trainer:\n        precision: 16\n        accumulate_grad_batches: 4\n\n  ipu_config:\n    - deviceIterations(20) # IPU would require large batches to be ready for the model.\n    - replicationFactor(16)\n    # - enableProfiling(\"graph_analyser\")       # The folder where the profile will be stored\n    # - enableExecutableCaching(\"pop_compiler_cache\")\n    - TensorLocations.numIOTiles(128)\n    - _Popart.set(\"defaultBufferingDepth\", 128)\n    - Precision.enableStochasticRounding(True)\n\n  ipu_inference_config: # Optional. 
If not provided, same as `ipu_config`\n    - deviceIterations(80) # IPU would require large batches to be ready for the model.\n    - replicationFactor(16)\n    # - enableProfiling(\"graph_analyser\")       # The folder where the profile will be stored\n    # - enableExecutableCaching(\"pop_compiler_cache\")\n    - TensorLocations.numIOTiles(128)\n    - _Popart.set(\"defaultBufferingDepth\", 128)\n    - Precision.enableStochasticRounding(True)\n\n# accelerator:\n#   type: cpu  # cpu or ipu or gpu\n#   config_override:\n#     datamodule:\n#       batch_size_training: 256\n#       batch_size_inference: 64\n#     trainer:\n#       trainer:\n#         precision: 32\n#         accumulate_grad_batches: 1\n\ndatamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  # module_type: \"FakeDataModule\"  # Option to use generated data\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      homolumo:\n        df: null\n        task_level: \"graph\"\n        df_path: ~/scratch/data/graphium/data/PCQM4M/pcqm4mv2-20k.csv\n        # wget https://storage.valencelabs.com/datasets-public-research/PCQM4M/cxsmiles/pcqm4mv2-20k.csv\n        # or set path as https://storage.valencelabs.com/datasets-public-research/PCQM4M/cxsmiles/pcqm4mv2-20k.csv directly\n        smiles_col: \"cxsmiles\"\n        label_cols: [\"homo_lumo_gap\"]\n        # sample_size: 30000 # use sample_size for test\n        # splits_path: graphium/data/PCQM4M/split_dict_v2.pt  # Download with `wget https://storage.valencelabs.com/datasets-public-research/PCQM4M/cxsmiles/split_dict_v2.pt`\n        split_val: 0.1\n        split_test: 0.1\n\n    # Featurization\n    prepare_dict_or_graph: pyg:graph\n    featurization_n_jobs: 30\n    featurization_progress: True\n    featurization_backend: \"loky\"\n    featurization:\n    # OGB: ['atomic_num', 'degree', 'possible_formal_charge', 'possible_numH' (total-valence),\n    # 
'possible_number_radical_e', 'possible_is_aromatic', 'possible_is_in_ring',\n    # 'num_chiral_centers (not included yet)']\n      mask_nan: 0\n      atom_property_list_onehot: [atomic-number, group, period, total-valence]\n      atom_property_list_float: [degree, formal-charge, radical-electron, aromatic, in-ring]\n      # OGB: ['possible_bond_type', 'possible_bond_stereo', 'possible_is_in_ring']\n      edge_property_list: [bond-type-onehot, stereo, in-ring]\n      conformer_property_list: [positions_3d]\n      add_self_loop: False\n      explicit_H: False # whether explicit H atoms are included\n      use_bonds_weights: False\n      pos_encoding_as_features: # encoder dropout 0.18\n        pos_types:\n          lap_eigvec:\n            pos_level: node\n            pos_type: laplacian_eigvec\n            num_pos: 8\n            normalization: \"none\" # normalization already applied to the eigenvectors\n            disconnected_comp: True # whether eigenvalues/eigenvectors of disconnected components are included\n          lap_eigval:\n            pos_level: node\n            pos_type: laplacian_eigval\n            num_pos: 8\n            normalization: \"none\" # normalization already applied to the eigenvectors\n            disconnected_comp: True # whether eigenvalues/eigenvectors of disconnected components are included\n          rw_pos: # use same name as pe_encoder\n            pos_level: node\n            pos_type: rw_return_probs\n            ksteps: 16\n\n    num_workers: 0 # -1 to use all\n    persistent_workers: False # whether to keep dataloader workers alive between epochs.\n    # Setting persistent_workers to false might make the start of each epoch very long.\n\n\narchitecture:\n  model_type: FullGraphMultiTaskNetwork\n  mup_base_path: null\n  pre_nn:   # Set as null to avoid a pre-nn network\n    out_dim: 32\n    hidden_dims: 64\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.1\n    normalization: &normalization layer_norm\n    
last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges:   # Set as null to avoid a pre-nn network\n    out_dim: 16\n    hidden_dims: 32\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: *dropout\n    normalization: *normalization\n    last_normalization: *normalization\n    residual_type: none\n\n  pe_encoders:\n    out_dim: 32\n    pool: \"sum\" # \"mean\", \"max\"\n    last_norm: None # \"batch_norm\", \"layer_norm\"\n    encoders: # la_pos | rw_pos\n      la_pos:  # Set as null to skip this encoder\n        encoder_type: \"laplacian_pe\"\n        input_keys: [\"laplacian_eigvec\", \"laplacian_eigval\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        model_type: 'DeepSet' # 'Transformer' or 'DeepSet'\n        num_layers: 2\n        num_layers_post: 1 # Num. layers to apply after pooling\n        dropout: 0.1\n        first_normalization: \"none\" # \"batch_norm\" or \"layer_norm\"\n      rw_pos:\n        encoder_type: \"mlp\"\n        input_keys: [\"rw_return_probs\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        num_layers: 2\n        dropout: 0.1\n        normalization: \"layer_norm\" # \"batch_norm\" or \"layer_norm\"\n        first_normalization: \"layer_norm\" # \"batch_norm\" or \"layer_norm\"\n\n\n  gnn:  # Set as null to avoid the GNN network\n    out_dim: 32\n    hidden_dims: 32\n    depth: 4\n    activation: gelu\n    last_activation: none\n    dropout: 0.0\n    normalization: \"layer_norm\"\n    last_normalization: *normalization\n    residual_type: simple\n    pooling: [sum]\n    virtual_node: 'none'\n    layer_type: 'pyg:gps' # pyg:gated-gcn, pyg:gine, pyg:gps\n    layer_kwargs:  # Parameters for the model itself. 
You could define dropout_attn: 0.1\n      node_residual: false\n      mpnn_type: 'pyg:mpnnplus'\n      mpnn_kwargs:\n        in_dim: 32\n        out_dim: 32\n        in_dim_edges: 16\n        out_dim_edges: 16\n      attn_type: \"full-attention\" # \"full-attention\", \"none\"\n      # biased_attention: false\n      attn_kwargs:\n        num_heads: *num_heads\n      biased_attention_key: nodepair_gaussian_bias_3d\n\n\n  post_nn: null\n\n  task_heads:\n    homolumo:\n      out_dim: 1\n      hidden_dims: 256\n      depth: 2                          # Not needed if we have hidden_dims\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n#Task-specific\npredictor:\n  metrics_on_progress_bar:\n    homolumo: [\"mae\", \"pearsonr\"]\n  loss_fun:\n    homolumo: mse_ipu\n  random_seed: *seed\n  optim_kwargs:\n    lr: 4.e-4 # warmup can be scheduled using torch_scheduler_kwargs\n    # weight_decay: 1.e-7\n  torch_scheduler_kwargs:\n    module_type: WarmUpLinearLR\n    max_num_epochs: &max_epochs 5\n    warmup_epochs: 10\n    verbose: False\n  scheduler_kwargs:\n  #  monitor: &monitor homolumo/mae/train\n  #  mode: min\n  #  frequency: 1\n  target_nan_mask: null # null: no mask, 0: 0 mask, ignore: ignore nan values from loss\n  flag_kwargs:\n    n_steps: 0 # 1\n    alpha: 0.0 # 0.01\n\n# Task-specific\nmetrics:\n  homolumo:\n    - name: mae\n      metric: mae_ipu\n      target_nan_mask: null\n      multitask_handling: flatten\n      threshold_kwargs: null\n    - name: pearsonr\n      metric: pearsonr_ipu\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n\ntrainer:\n  logger:\n    save_dir: logs/PCQMv2\n    name: *name\n    project: PCQMv2_mpnn\n  #early_stopping:\n  #  monitor: *monitor\n  #  min_delta: 0\n  #  patience: 10\n  #  mode: &mode min\n  model_checkpoint:\n    dirpath: 
models_checkpoints/PCQMv2/\n    filename: *name\n    #monitor: *monitor\n    #mode: *mode\n    save_top_k: 1\n    every_n_epochs: 100\n  trainer:\n    max_epochs: *max_epochs\n    min_epochs: 1\n    check_val_every_n_epoch: 20\n"
  },
  {
    "path": "expts/configs/config_gps_10M_pcqm4m_mod.yaml",
    "content": "# Testing the mpnn only model with the PCQMv2 dataset on IPU.\nconstants:\n  name: &name pcqm4mv2_mpnn_4layer\n  seed: &seed 42\n  raise_train_error: true   # Whether the code should raise an error if it crashes during training\n  accelerator:\n    type: gpu  # cpu or ipu or gpu\n\ndatamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  # module_type: \"FakeDataModule\"  # Option to use generated data\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      homolumo:\n        df: null\n        task_level: \"graph\"\n        df_path: ~/scratch/data/graphium/data/PCQM4M/pcqm4mv2-20k.csv\n        # wget https://storage.valencelabs.com/datasets-public-research/PCQM4M/cxsmiles/pcqm4mv2-20k.csv\n        # or set path as https://storage.valencelabs.com/datasets-public-research/PCQM4M/cxsmiles/pcqm4mv2-20k.csv directly\n        smiles_col: \"cxsmiles\"\n        label_cols: [\"homo_lumo_gap\"]\n        # sample_size: 30000 # use sample_size for test\n        # splits_path: graphium/data/PCQM4Mv2/split_dict.pt  # Download with `wget https://storage.valencelabs.com/datasets-public-research/PCQM4M/cxsmiles/split_dict.pt`\n        split_val: 0.1\n        split_test: 0.1\n\n    # Featurization\n    prepare_dict_or_graph: pyg:graph\n    featurization_n_jobs: 30\n    featurization_progress: True\n    featurization_backend: \"loky\"\n    featurization:\n    # OGB: ['atomic_num', 'degree', 'possible_formal_charge', 'possible_numH' (total-valence),\n    # 'possible_number_radical_e', 'possible_is_aromatic', 'possible_is_in_ring',\n    # 'num_chiral_centers (not included yet)']\n      mask_nan: 0\n      atom_property_list_onehot: [atomic-number, group, period, total-valence]\n      atom_property_list_float: [degree, formal-charge, radical-electron, aromatic, in-ring]\n      # OGB: ['possible_bond_type', 'possible_bond_stereo', 'possible_is_in_ring']\n      
edge_property_list: [bond-type-onehot, stereo, in-ring]\n      conformer_property_list: [positions_3d]\n      add_self_loop: False\n      explicit_H: False # if H is included\n      use_bonds_weights: False\n      pos_encoding_as_features: # encoder dropout 0.18\n        pos_types:\n          node_laplacian_eigvec:\n            pos_type: laplacian_eigvec\n            pos_level: node\n            num_pos: 8\n            normalization: \"none\"\n            disconnected_comp: True\n          node_laplacian_eigval:\n            pos_type: laplacian_eigval\n            pos_level: node\n            num_pos: 8\n            normalization: \"none\"\n            disconnected_comp: True\n          rw_return_probs:\n            pos_type: rw_return_probs\n            pos_level: node\n            ksteps: [4, 8]\n          nodepair_rw_transition_probs:\n            pos_type: rw_transition_probs\n            pos_level: edge\n            ksteps: [2, 4]\n          nodepair_rw_return_probs:\n            pos_type: rw_return_probs\n            pos_level: nodepair\n            ksteps: [4]\n          electrostatic:\n            pos_type: electrostatic\n            pos_level: node\n          edge_commute:\n            pos_type: commute\n            pos_level: edge\n          nodepair_graphormer:\n            pos_type: graphormer\n            pos_level: nodepair\n\n    # Data handling-related\n    batch_size_training: 64\n    batch_size_inference: 16\n    num_workers: 0 # -1 to use all\n    persistent_workers: False # if use persistent worker at the start of each epoch.\n    # Using persistent_workers false might make the start of each epoch very long.\n    featurization_backend: \"loky\"\n\n    # ipu_dataloader_training_opts:\n    #   mode: async\n    #   max_num_nodes_per_graph: 20 # train max nodes: 20, max_edges: 54\n    #   max_num_edges_per_graph: 60\n\n    # ipu_dataloader_inference_opts:\n    #   mode: async\n    #   max_num_nodes_per_graph: 20 # valid max nodes: 51, max_edges: 
118\n    #   max_num_edges_per_graph: 120\n    #   # test-dev max nodes: 50, max_edges: 116\n    #   # test-challenge max nodes: 51, max_edges: 106\n\narchitecture:\n  model_type: FullGraphMultiTaskNetwork\n  mup_base_path: null\n  pre_nn:   # Set as null to avoid a pre-nn network\n    out_dim: 32\n    hidden_dims: 64\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.1\n    normalization: &normalization layer_norm\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges:   # Set as null to avoid a pre-nn network\n    out_dim: 16\n    hidden_dims: 32\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: *dropout\n    normalization: *normalization\n    last_normalization: *normalization\n    residual_type: none\n\n  pe_encoders:\n    out_dim: &pe_out_dim 32\n    edge_out_dim: &edge_pe_out_dim 16\n    pool: \"sum\" #\"mean\" \"max\"\n    last_norm: None #\"batch_norm\", \"layer_norm\"\n    encoders:\n      emb_la_pos:\n        encoder_type: \"laplacian_pe\"\n        input_keys: [\"laplacian_eigvec\", \"laplacian_eigval\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 32\n        model_type: 'DeepSet' #'Transformer' or 'DeepSet'\n        num_layers: 2\n        num_layers_post: 1 # Num. 
layers to apply after pooling\n        dropout: 0.1\n        first_normalization: \"none\" #\"batch_norm\" or \"layer_norm\"\n      emb_rwse:\n        encoder_type: \"mlp\"\n        input_keys: [\"rw_return_probs\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 32\n        num_layers: 2\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n        first_normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n      emb_electrostatic:\n        encoder_type: \"mlp\"\n        input_keys: [\"electrostatic\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 32\n        num_layers: 1\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n        first_normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n      emb_edge_rwse:\n        encoder_type: \"mlp\"\n        input_keys: [\"edge_rw_transition_probs\"]\n        output_keys: [\"edge_feat\"]\n        hidden_dim: 32\n        num_layers: 1\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n      emb_edge_pes:\n        encoder_type: \"cat_mlp\"\n        input_keys: [\"edge_rw_transition_probs\", \"edge_commute\"]\n        output_keys: [\"edge_feat\"]\n        hidden_dim: 32\n        num_layers: 1\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n      gaussian_pos:\n        encoder_type: \"gaussian_kernel\"\n        input_keys: [\"positions_3d\"]\n        output_keys: [\"feat\", \"nodepair_gaussian_bias_3d\"]\n        num_heads: &num_heads 2\n        num_layers: 2\n        embed_dim: *pe_out_dim\n        use_input_keys_prefix: False\n\n  gnn:  # Set as null to avoid a post-nn network\n    out_dim: 32\n    hidden_dims: 32\n    depth: 4\n    activation: gelu\n    last_activation: none\n    dropout: 0.0\n    normalization: \"layer_norm\"\n    last_normalization: *normalization\n    residual_type: simple\n    pooling: 
[sum]\n    virtual_node: 'none'\n    layer_type: 'pyg:gps' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n    layer_kwargs:  # Parameters for the model itself. You could define dropout_attn: 0.1\n      node_residual: false\n      mpnn_type: 'pyg:mpnnplus'\n      mpnn_kwargs:\n        in_dim: 32\n        out_dim: 32\n        in_dim_edges: 16\n        out_dim_edges: 16\n      attn_type: \"full-attention\" # \"full-attention\", \"none\"\n      # biased_attention: false\n      attn_kwargs:\n        num_heads: *num_heads\n      biased_attention_key: nodepair_gaussian_bias_3d\n\n\n  post_nn: null\n\n  task_heads:\n    homolumo:\n      out_dim: 1\n      hidden_dims: 256\n      depth: 2                          # Not needed if we have hidden_dims\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n#Task-specific\npredictor:\n  metrics_on_progress_bar:\n    homolumo: [\"mae\", \"pearsonr\"]\n  loss_fun:\n    homolumo: mse_ipu\n  random_seed: *seed\n  optim_kwargs:\n    lr: 4.e-4 # warmup can be scheduled using torch_scheduler_kwargs\n    # weight_decay: 1.e-7\n  torch_scheduler_kwargs:\n    module_type: WarmUpLinearLR\n    max_num_epochs: &max_epochs 5\n    warmup_epochs: 10\n    verbose: False\n  scheduler_kwargs:\n  #  monitor: &monitor homolumo/mae/train\n  #  mode: min\n  #  frequency: 1\n  target_nan_mask: null # null: no mask, 0: 0 mask, ignore: ignore nan values from loss\n  flag_kwargs:\n    n_steps: 0 # 1\n    alpha: 0.0 # 0.01\n\n# Task-specific\nmetrics:\n  homolumo:\n    - name: mae\n      metric: mae_ipu\n      target_nan_mask: null\n      multitask_handling: flatten\n      threshold_kwargs: null\n    - name: pearsonr\n      metric: pearsonr_ipu\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n\ntrainer:\n  logger:\n    save_dir: logs/PCQMv2\n    name: *name\n    project: 
PCQMv2_mpnn\n  #early_stopping:\n  #  monitor: *monitor\n  #  min_delta: 0\n  #  patience: 10\n  #  mode: &mode min\n  model_checkpoint:\n    dirpath: models_checkpoints/PCQMv2/\n    filename: *name\n    #monitor: *monitor\n    #mode: *mode\n    save_top_k: 1\n    every_n_epochs: 100\n  trainer:\n    precision: 32\n    max_epochs: *max_epochs\n    min_epochs: 1\n    accumulate_grad_batches: 2\n    check_val_every_n_epoch: 20\n\n"
  },
  {
    "path": "expts/configs/config_mpnn_10M_b3lyp.yaml",
    "content": "# Testing the mpnn only model with the b3lyp dataset on IPU.\nconstants:\n  name: &name b3lyp_mpnn_4layer\n  seed: &seed 42\n  raise_train_error: true   # Whether the code should raise an error if it crashes during training\n\naccelerator:\n  type: ipu  # cpu or ipu or gpu\n  config_override:\n    datamodule:\n      args:\n        ipu_dataloader_training_opts:\n          mode: async\n          max_num_nodes_per_graph: 20 # train max nodes: 20, max_edges: 54\n          max_num_edges_per_graph: 60\n        ipu_dataloader_inference_opts:\n          mode: async\n          max_num_nodes_per_graph: 16 # valid max nodes: 51, max_edges: 118\n          max_num_edges_per_graph: 120\n        # Data handling-related\n        batch_size_training: 64\n        batch_size_inference: 16\n    predictor:\n      optim_kwargs:\n        loss_scaling: 1024\n    trainer:\n      trainer:\n        precision: 16\n        accumulate_grad_batches: 4\n\n  ipu_config:\n    - deviceIterations(20) # IPU would require large batches to be ready for the model.\n    - replicationFactor(16)\n    # - enableProfiling(\"graph_analyser\")       # The folder where the profile will be stored\n    # - enableExecutableCaching(\"pop_compiler_cache\")\n    - TensorLocations.numIOTiles(128)\n    - _Popart.set(\"defaultBufferingDepth\", 128)\n    - Precision.enableStochasticRounding(True)\n\n  ipu_inference_config: # Optional. 
If not provided, same as `ipu_config`\n    - deviceIterations(80) # IPU would require large batches to be ready for the model.\n    - replicationFactor(16)\n    # - enableProfiling(\"graph_analyser\")       # The folder where the profile will be stored\n    # - enableExecutableCaching(\"pop_compiler_cache\")\n    - TensorLocations.numIOTiles(128)\n    - _Popart.set(\"defaultBufferingDepth\", 128)\n    - Precision.enableStochasticRounding(True)\n\n\n# accelerator:\n#   type: cpu  # cpu or ipu or gpu\n#   config_override:\n#     datamodule:\n#       batch_size_training: 256\n#       batch_size_inference: 64\n#     trainer:\n#       trainer:\n#         precision: 32\n#         accumulate_grad_batches: 1\n\ndatamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  # module_type: \"FakeDataModule\"  # Option to use generated data\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      betagap:\n        df: null\n        task_level: \"graph\"\n        df_path: graphium/data/b3lyp/b3lyp_mini.parquet #graphium/data/b3lyp/b3lyp_mini.parquet\n        # wget https://storage.valencelabs.com/graphium/datasets/b3lyp/b3lyp_mini.parquet\n        # or set path as https://storage.valencelabs.com/graphium/datasets/b3lyp/b3lyp_mini.parquet directly\n        smiles_col: \"smiles\"\n        label_cols: [\"beta_gap\"]\n        # sample_size: 30000 # use sample_size for test\n        split_val: 0.1\n        split_test: 0.1\n      alphagap:\n        df: null\n        task_level: \"graph\"\n        df_path: graphium/data/b3lyp/b3lyp_mini.parquet #graphium/data/b3lyp/b3lyp_mini.parquet\n        # wget https://storage.valencelabs.com/graphium/datasets/b3lyp/b3lyp_mini.parquet\n        # or set path as https://storage.valencelabs.com/graphium/datasets/b3lyp/b3lyp_mini.parquet directly\n        smiles_col: \"smiles\"\n        label_cols: [\"alpha_gap\"]\n        # sample_size: 30000 # use 
sample_size for test\n        # splits_path: graphium/data/PCQM4M/split_dict_v2.pt  # Download with `wget https://storage.valencelabs.com/datasets-public-research/PCQM4M/cxsmiles/split_dict_v2.pt`\n        split_val: 0.1\n        split_test: 0.1\n\n    # Featurization\n    prepare_dict_or_graph: pyg:graph\n    featurization_n_jobs: 30\n    featurization_progress: True\n    featurization_backend: \"loky\"\n    processed_graph_data_path: \"../datacache/b3lyp/\"\n    dataloading_from: ram\n    featurization:\n    # OGB: ['atomic_num', 'degree', 'possible_formal_charge', 'possible_numH' (total-valence),\n    # 'possible_number_radical_e', 'possible_is_aromatic', 'possible_is_in_ring',\n    # 'num_chiral_centers (not included yet)']\n      atom_property_list_onehot: [atomic-number, group, period, total-valence]\n      atom_property_list_float: [degree, formal-charge, radical-electron, aromatic, in-ring]\n      # OGB: ['possible_bond_type', 'possible_bond_stereo', 'possible_is_in_ring']\n      edge_property_list: [bond-type-onehot, stereo, in-ring]\n      add_self_loop: False\n      explicit_H: False # whether explicit H atoms are included\n      use_bonds_weights: False\n      pos_encoding_as_features: # encoder dropout 0.18\n        pos_types:\n          lap_eigvec:\n            pos_level: node\n            pos_type: laplacian_eigvec\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # whether to include eigenvalues/eigenvectors of disconnected components\n          lap_eigval:\n            pos_level: node\n            pos_type: laplacian_eigval\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # whether to include eigenvalues/eigenvectors of disconnected components\n          rw_pos: # use same name as pe_encoder\n            pos_level: node\n            pos_type: rw_return_probs\n            ksteps: 16\n\n    
num_workers: 0 # -1 to use all\n    persistent_workers: False # if use persistent worker at the start of each epoch.\n    # Using persistent_workers false might make the start of each epoch very long.\n    featurization_backend: \"loky\"\n\n\narchitecture:\n  model_type: FullGraphMultiTaskNetwork\n  mup_base_path: null\n  pre_nn:   # Set as null to avoid a pre-nn network\n    out_dim: 32\n    hidden_dims: 64\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.1\n    normalization: &normalization layer_norm\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges:   # Set as null to avoid a pre-nn network\n    out_dim: 16\n    hidden_dims: 32\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: *dropout\n    normalization: *normalization\n    last_normalization: *normalization\n    residual_type: none\n\n  pe_encoders:\n    out_dim: 32\n    pool: \"sum\" #\"mean\" \"max\"\n    last_norm: None #\"batch_norm\", \"layer_norm\"\n    encoders: #la_pos |  rw_pos\n      la_pos:  # Set as null to avoid a pre-nn network\n        encoder_type: \"laplacian_pe\"\n        input_keys: [\"laplacian_eigvec\", \"laplacian_eigval\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        model_type: 'DeepSet' #'Transformer' or 'DeepSet'\n        num_layers: 2\n        num_layers_post: 1 # Num. 
layers to apply after pooling\n        dropout: 0.1\n        first_normalization: \"none\" #\"batch_norm\" or \"layer_norm\"\n      rw_pos:\n        encoder_type: \"mlp\"\n        input_keys: [\"rw_return_probs\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        num_layers: 2\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n        first_normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n\n\n  gnn:  # Set as null to avoid a post-nn network\n    out_dim: 32\n    hidden_dims: 32\n    depth: 4\n    activation: gelu\n    last_activation: none\n    dropout: 0.0\n    normalization: \"layer_norm\"\n    last_normalization: *normalization\n    residual_type: simple\n    virtual_node: 'none'\n    layer_type: 'pyg:gps' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n    layer_kwargs:  # Parameters for the model itself. You could define dropout_attn: 0.1\n      node_residual: false\n      mpnn_type: 'pyg:mpnnplus'\n      mpnn_kwargs:\n        in_dim: 32\n        out_dim: 32\n        in_dim_edges: 16\n        out_dim_edges: 16\n      attn_type: \"none\" # \"full-attention\", \"none\"\n      # biased_attention: false\n      attn_kwargs: null\n\n\n  graph_output_nn:\n    graph:\n      pooling: [sum]\n      out_dim: 256\n      hidden_dims: 256\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n  task_heads:\n    alphagap:\n      task_level: graph\n      out_dim: 1\n      hidden_dims: 256\n      depth: 2                          # Not needed if we have hidden_dims\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n    betagap:\n      task_level: graph\n      out_dim: 1\n      hidden_dims: 256\n      depth: 2 
                         # Not needed if we have hidden_dims\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n#Task-specific\npredictor:\n  metrics_on_progress_bar:\n    alphagap: [\"mae\", \"pearsonr\"]\n    betagap: [\"mae\", \"pearsonr\"]\n  loss_fun:\n    alphagap: mse_ipu\n    betagap: mse_ipu\n  random_seed: *seed\n  optim_kwargs:\n    lr: 4.e-4 # warmup can be scheduled using torch_scheduler_kwargs\n    # weight_decay: 1.e-7\n  torch_scheduler_kwargs:\n    module_type: WarmUpLinearLR\n    max_num_epochs: &max_epochs 100\n    warmup_epochs: 10\n    verbose: False\n  scheduler_kwargs:\n  #  monitor: &monitor homolumo/mae/train\n  #  mode: min\n  #  frequency: 1\n  target_nan_mask: null # null: no mask, 0: 0 mask, ignore: ignore nan values from loss\n  flag_kwargs:\n    n_steps: 0 # 1\n    alpha: 0.0 # 0.01\n\n# Task-specific\nmetrics:\n  alphagap: &alpha_metrics\n    - name: mae\n      metric: mae_ipu\n      target_nan_mask: null\n      multitask_handling: flatten\n      threshold_kwargs: null\n    - name: pearsonr\n      metric: pearsonr_ipu\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n  betagap: *alpha_metrics\n\ntrainer:\n  logger:\n    save_dir: logs/b3lyp\n    name: *name\n    project: PCQMv2_mpnn\n  #early_stopping:\n  #  monitor: *monitor\n  #  min_delta: 0\n  #  patience: 10\n  #  mode: &mode min\n  model_checkpoint:\n    dirpath: models_checkpoints/b3lyp/\n    filename: *name\n    #monitor: *monitor\n    #mode: *mode\n    save_top_k: 1\n    every_n_epochs: 100\n  trainer:\n    max_epochs: *max_epochs\n    min_epochs: 1\n    check_val_every_n_epoch: 20\n"
  },
  {
    "path": "expts/configs/config_mpnn_pcqm4m.yaml",
    "content": "# Testing the mpnn only model with the PCQMv2 dataset on IPU.\nconstants:\n  name: &name pcqm4mv2_mpnn_4layer\n  seed: &seed 42\n  raise_train_error: true   # Whether the code should raise an error if it crashes during training\n  accelerator:\n    type: cpu  # cpu or ipu or gpu\n\ndatamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  # module_type: \"FakeDataModule\"  # Option to use generated data\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      homolumo:\n        df: null\n        task_level: \"graph\"\n        df_path: graphium/data/PCQM4M/pcqm4mv2-20k.csv\n        # wget https://storage.valencelabs.com/datasets-public-research/PCQM4M/cxsmiles/pcqm4mv2-20k.csv\n        # or set path as https://storage.valencelabs.com/datasets-public-research/PCQM4M/cxsmiles/pcqm4mv2-20k.csv directly\n        smiles_col: \"cxsmiles\"\n        label_cols: [\"homo_lumo_gap\"]\n        # sample_size: 6000 # use sample_size for test\n        splits_path: graphium/data/PCQM4Mv2/split_dict_v2.pt  # Download with `wget https://storage.valencelabs.com/datasets-public-research/PCQM4M/cxsmiles/split_dict_v2.pt`\n        # graphium/data/PCQM4Mv2/split_dict.pt\n        # graphium/data/PCQM4Mv2/pcqm4m_split.csv\n        split_names: [\"train\", \"valid\", \"test-dev\"]\n\n    # Featurization\n    prepare_dict_or_graph: pyg:graph\n    featurization_n_jobs: 20\n    featurization_progress: True\n    featurization_backend: \"loky\"\n    processed_graph_data_path: \"graphium/data/PCQM4Mv2/\"\n    dataloading_from: ram\n    featurization:\n    # OGB: ['atomic_num', 'degree', 'possible_formal_charge', 'possible_numH' (total-valence),\n    # 'possible_number_radical_e', 'possible_is_aromatic', 'possible_is_in_ring',\n    # 'num_chiral_centers (not included yet)']\n      atom_property_list_onehot: [atomic-number, group, period, total-valence]\n      atom_property_list_float: 
[degree, formal-charge, radical-electron, aromatic, in-ring]\n      # OGB: ['possible_bond_type', 'possible_bond_stereo', 'possible_is_in_ring']\n      edge_property_list: [bond-type-onehot, stereo, in-ring]\n      add_self_loop: False\n      explicit_H: False # whether explicit H atoms are included\n      use_bonds_weights: False\n      pos_encoding_as_features: # encoder dropout 0.18\n        pos_types:\n          la_pos: &pos_enc\n            pos_type: laplacian_eigvec_eigval #laplacian_eigvec\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # whether to include eigenvalues/eigenvectors of disconnected components\n          rw_pos: # use same name as pe_encoder\n            pos_type: rwse\n            ksteps: 16\n      # pos_encoding_as_directions: *pos_enc # Only for DGN or directional pooling\n\n    # Data handling-related\n    batch_size_training: 64\n    batch_size_inference: 16\n    num_workers: 40 # -1 to use all\n    persistent_workers: False # whether to keep dataloader workers alive between epochs.\n    # Using persistent_workers false might make the start of each epoch very long.\n    featurization_backend: \"loky\"\n\n    # ipu_dataloader_training_opts:\n    #   mode: async\n    #   max_num_nodes_per_graph: 20 # train max nodes: 20, max_edges: 54\n    #   max_num_edges_per_graph: 60\n\n    # ipu_dataloader_inference_opts:\n    #   mode: async\n    #   max_num_nodes_per_graph: 20 # valid max nodes: 51, max_edges: 118\n    #   max_num_edges_per_graph: 120\n    #   # test-dev max nodes: 50, max_edges: 116\n    #   # test-challenge max nodes: 51, max_edges: 106\n\narchitecture:\n  model_type: FullGraphMultiTaskNetwork\n  mup_base_path: null\n  pre_nn:   # Set as null to avoid a pre-nn network\n    out_dim: 32\n    hidden_dims: 64\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.1\n    normalization: &normalization layer_norm\n    last_normalization: 
*normalization\n    residual_type: none\n\n  pre_nn_edges:   # Set as null to avoid a pre-nn network\n    out_dim: 16\n    hidden_dims: 32\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: *dropout\n    normalization: *normalization\n    last_normalization: *normalization\n    residual_type: none\n\n  pe_encoders:\n    out_dim: 32\n    pool: \"sum\" #\"mean\" \"max\"\n    last_norm: None #\"batch_norm\", \"layer_norm\"\n    encoders: #la_pos |  rw_pos\n      la_pos:  # Set as null to avoid a pre-nn network\n        encoder_type: \"laplacian_pe\"\n        input_keys: [\"laplacian_eigvec\", \"laplacian_eigval\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        model_type: 'DeepSet' #'Transformer' or 'DeepSet'\n        num_layers: 2\n        num_layers_post: 1 # Num. layers to apply after pooling\n        dropout: 0.1\n        first_normalization: \"none\" #\"batch_norm\" or \"layer_norm\"\n      rw_pos:\n        encoder_type: \"mlp\"\n        input_keys: [\"rw_return_probs\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        num_layers: 2\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n        first_normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n\n\n  gnn:  # Set as null to avoid a post-nn network\n    out_dim: 32\n    hidden_dims: 32\n    depth: 4\n    activation: gelu\n    last_activation: none\n    dropout: 0.0\n    normalization: \"layer_norm\"\n    last_normalization: *normalization\n    residual_type: simple\n    pooling: [sum]\n    virtual_node: 'none'\n    layer_type: 'pyg:gps' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n    layer_kwargs:  # Parameters for the model itself. 
You could define dropout_attn: 0.1\n      node_residual: false\n      mpnn_type: 'pyg:mpnnplus'\n      mpnn_kwargs:\n        in_dim: 32\n        out_dim: 32\n        in_dim_edges: 16\n        out_dim_edges: 16\n      attn_type: \"none\" # \"full-attention\", \"none\"\n      # biased_attention: false\n      attn_kwargs: null\n\n\n  post_nn: null\n\n  task_heads:\n    homolumo:\n      out_dim: 1\n      hidden_dims: 256\n      depth: 2                          # Not needed if we have hidden_dims\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n#Task-specific\npredictor:\n  metrics_on_progress_bar:\n    homolumo: [\"mae\", \"pearsonr\"]\n  loss_fun:\n    homolumo: mse_ipu\n  random_seed: *seed\n  optim_kwargs:\n    lr: 4.e-4 # warmup can be scheduled using torch_scheduler_kwargs\n    # weight_decay: 1.e-7\n  torch_scheduler_kwargs:\n    module_type: WarmUpLinearLR\n    max_num_epochs: &max_epochs 100\n    warmup_epochs: 10\n    verbose: False\n  scheduler_kwargs:\n  #  monitor: &monitor homolumo/mae/train\n  #  mode: min\n  #  frequency: 1\n  target_nan_mask: null # null: no mask, 0: 0 mask, ignore: ignore nan values from loss\n  flag_kwargs:\n    n_steps: 0 # 1\n    alpha: 0.0 # 0.01\n\n# Task-specific\nmetrics:\n  homolumo:\n    - name: mae\n      metric: mae_ipu\n      target_nan_mask: null\n      multitask_handling: flatten\n      threshold_kwargs: null\n    - name: pearsonr\n      metric: pearsonr_ipu\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n\ntrainer:\n  logger:\n    save_dir: logs/PCQMv2\n    name: *name\n    project: PCQMv2_mpnn\n  #early_stopping:\n  #  monitor: *monitor\n  #  min_delta: 0\n  #  patience: 10\n  #  mode: &mode min\n  model_checkpoint:\n    dirpath: models_checkpoints/PCQMv2/\n    filename: *name\n    #monitor: *monitor\n    #mode: *mode\n    save_top_k: 
1\n    every_n_epochs: 100\n  trainer:\n    precision: 32\n    max_epochs: *max_epochs\n    min_epochs: 1\n    accumulate_grad_batches: 2\n    check_val_every_n_epoch: 20\n"
  },
  {
    "path": "expts/data/micro_zinc_splits.csv",
    "content": "train,val,test\n957,392.0,654.0\n223,588.0,430.0\n916,538.0,317.0\n410,903.0,629.0\n287,923.0,927.0\n496,677.0,641.0\n456,144.0,518.0\n589,83.0,645.0\n151,385.0,649.0\n10,996.0,504.0\n7,506.0,716.0\n180,205.0,557.0\n769,161.0,931.0\n389,746.0,804.0\n304,99.0,30.0\n867,444.0,836.0\n177,814.0,469.0\n79,984.0,448.0\n937,292.0,863.0\n494,377.0,412.0\n771,986.0,61.0\n587,864.0,805.0\n552,966.0,326.0\n881,974.0,520.0\n808,106.0,263.0\n186,670.0,874.0\n925,238.0,920.0\n268,995.0,226.0\n108,387.0,749.0\n717,696.0,724.0\n37,871.0,583.0\n47,515.0,928.0\n258,596.0,117.0\n173,801.0,459.0\n979,301.0,633.0\n846,420.0,239.0\n578,688.0,421.0\n320,572.0,484.0\n69,451.0,399.0\n221,318.0,886.0\n96,87.0,133.0\n732,672.0,759.0\n303,944.0,774.0\n48,891.0,523.0\n455,311.0,103.0\n140,297.0,610.0\n734,776.0,743.0\n138,972.0,104.0\n699,548.0,574.0\n706,152.0,204.0\n713,418.0,822.0\n705,214.0,989.0\n230,254.0,620.0\n134,709.0,648.0\n551,164.0,66.0\n92,835.0,75.0\n890,395.0,929.0\n185,539.0,184.0\n691,445.0,712.0\n31,573.0,98.0\n195,894.0,997.0\n630,852.0,94.0\n447,893.0,959.0\n967,673.0,728.0\n280,819.0,689.0\n813,829.0,323.0\n788,711.0,658.0\n432,750.0,722.0\n876,153.0,305.0\n766,524.0,600.0\n764,800.0,23.0\n135,279.0,76.0\n12,700.0,73.0\n136,71.0,740.0\n406,942.0,482.0\n267,935.0,773.0\n234,854.0,208.0\n531,59.0,985.0\n274,124.0,812.0\n938,913.0,122.0\n512,591.0,398.0\n353,693.0,726.0\n127,790.0,380.0\n737,54.0,293.0\n401,283.0,429.0\n121,492.0,747.0\n563,702.0,906.0\n694,954.0,316.0\n176,360.0,842.0\n25,624.0,145.0\n187,898.0,302.0\n434,719.0,781.0\n828,628.0,65.0\n113,232.0,388.0\n199,537.0,314.0\n227,60.0,825.0\n411,206.0,536.0\n590,567.0,787.0\n581,878.0,495.0\n697,383.0,89.0\n147,683.0,331.0\n798,806.0,338.0\n918,49.0,656.0\n751,993.0,350.0\n415,85.0,485.0\n692,150.0,840.0\n120,64.0,760.0\n686,980.0,41.0\n466,356.0,261.0\n517,735.0,855.0\n101,486.0,439.0\n528,883.0,503.0\n845,18.0,38.0\n644,879.0,202.0\n698,478.0,519.0\n501,336.0,625.0\n86,40.0,105.0\n224,472.0,763.0\n4
22,522.0,493.0\n15,276.0,405.0\n767,397.0,950.0\n193,192.0,156.0\n437,889.0,351.0\n376,638.0,603.0\n910,549.0,461.0\n288,680.0,919.0\n171,381.0,765.0\n955,655.0,940.0\n932,490.0,352.0\n623,132.0,917.0\n848,17.0,142.0\n592,643.0,870.0\n155,24.0,508.0\n457,157.0,561.0\n216,229.0,174.0\n250,453.0,951.0\n483,857.0,550.0\n404,463.0,601.0\n282,626.0,904.0\n529,646.0,831.0\n513,477.0,668.0\n189,211.0,880.0\n441,307.0,824.0\n576,408.0,613.0\n2,428.0,657.0\n744,615.0,241.0\n111,378.0,402.0\n718,100.0,873.0\n391,499.0,16.0\n322,785.0,755.0\n566,675.0,584.0\n452,794.0,650.0\n860,129.0,669.0\n952,729.0,598.0\n921,290.0,468.0\n875,337.0,571.0\n42,602.0,660.0\n128,553.0,443.0\n172,431.0,219.0\n908,26.0,577.0\n21,994.0,489.0\n895,95.0,58.0\n414,479.0,373.0\n505,197.0,119.0\n809,278.0,652.0\n371,249.0,137.0\n899,442.0,868.0\n6,594.0,826.0\n960,269.0,666.0\n964,324.0,341.0\n84,887.0,252.0\n329,502.0,470.0\n344,662.0,858.0\n102,27.0,262.0\n933,123.0,953.0\n608,877.0,667.0\n681,862.0,359.0\n542,924.0,595.0\n977,945.0,419.0\n582,514.0,902.0\n568,256.0,1.0\n464,321.0,939.0\n355,343.0,833.0\n454,527.0,163.0\n115,400.0,67.0\n36,14.0,168.0\n190,295.0,851.0\n799,973.0,721.0\n642,965.0,74.0\n285,62.0,756.0\n328,110.0,245.0\n141,213.0,970.0\n237,555.0,786.0\n130,333.0,754.0\n81,704.0,362.0\n8,481.0,348.0\n210,792.0,827.0\n118,546.0,762.0\n963,196.0,375.0\n139,260.0,334.0\n992,,\n13,,\n417,,\n632,,\n427,,\n28,,\n560,,\n525,,\n509,,\n723,,\n264,,\n653,,\n181,,\n823,,\n884,,\n91,,\n471,,\n535,,\n782,,\n914,,\n257,,\n143,,\n179,,\n222,,\n255,,\n897,,\n20,,\n146,,\n349,,\n922,,\n565,,\n170,,\n200,,\n497,,\n365,,\n820,,\n607,,\n661,,\n242,,\n160,,\n235,,\n236,,\n68,,\n701,,\n43,,\n948,,\n90,,\n983,,\n240,,\n310,,\n46,,\n166,,\n57,,\n838,,\n346,,\n892,,\n720,,\n690,,\n271,,\n856,,\n97,,\n299,,\n423,,\n209,,\n384,,\n357,,\n810,,\n265,,\n943,,\n541,,\n165,,\n768,,\n659,,\n368,,\n225,,\n821,,\n604,,\n631,,\n22,,\n253,,\n714,,\n544,,\n159,,\n148,,\n956,,\n796,,\n476,,\n33,,\n684,,\n178,,\n9,,\n207,,\n54
5,,\n532,,\n635,,\n149,,\n296,,\n533,,\n715,,\n473,,\n850,,\n559,,\n647,,\n298,,\n990,,\n982,,\n640,,\n926,,\n498,,\n270,,\n738,,\n912,,\n736,,\n424,,\n272,,\n56,,\n651,,\n366,,\n770,,\n784,,\n637,,\n911,,\n599,,\n802,,\n885,,\n78,,\n586,,\n968,,\n830,,\n843,,\n866,,\n731,,\n999,,\n319,,\n50,,\n361,,\n175,,\n861,,\n987,,\n289,,\n450,,\n394,,\n832,,\n564,,\n687,,\n909,,\n363,,\n11,,\n45,,\n248,,\n543,,\n679,,\n34,,\n480,,\n158,,\n674,,\n547,,\n347,,\n791,,\n569,,\n300,,\n425,,\n981,,\n975,,\n386,,\n507,,\n847,,\n218,,\n627,,\n154,,\n436,,\n286,,\n930,,\n888,,\n35,,\n685,,\n72,,\n570,,\n109,,\n217,,\n562,,\n795,,\n962,,\n82,,\n775,,\n882,,\n580,,\n460,,\n554,,\n703,,\n407,,\n526,,\n511,,\n93,,\n621,,\n4,,\n752,,\n131,,\n315,,\n370,,\n978,,\n70,,\n593,,\n682,,\n947,,\n818,,\n339,,\n676,,\n971,,\n345,,\n797,,\n3,,\n616,,\n663,,\n390,,\n725,,\n462,,\n281,,\n664,,\n745,,\n0,,\n844,,\n617,,\n946,,\n294,,\n859,,\n748,,\n915,,\n998,,\n742,,\n530,,\n116,,\n309,,\n741,,\n988,,\n275,,\n606,,\n900,,\n374,,\n367,,\n107,,\n312,,\n639,,\n182,,\n739,,\n934,,\n39,,\n811,,\n5,,\n63,,\n487,,\n80,,\n753,,\n51,,\n678,,\n335,,\n841,,\n426,,\n284,,\n29,,\n251,,\n611,,\n585,,\n396,,\n491,,\n340,,\n815,,\n488,,\n793,,\n358,,\n77,,\n259,,\n44,,\n228,,\n243,,\n125,,\n761,,\n516,,\n958,,\n558,,\n665,,\n540,,\n244,,\n215,,\n277,,\n521,,\n403,,\n198,,\n730,,\n413,,\n332,,\n710,,\n458,,\n500,,\n579,,\n112,,\n865,,\n247,,\n52,,\n695,,\n758,,\n614,,\n382,,\n619,,\n372,,\n393,,\n126,,\n961,,\n449,,\n839,,\n634,,\n816,,\n88,,\n169,,\n907,,\n597,,\n707,,\n789,,\n474,,\n556,,\n618,,\n727,,\n778,,\n575,,\n32,,\n991,,\n636,,\n191,,\n114,,\n803,,\n233,,\n896,,\n246,,\n446,,\n379,,\n612,,\n733,,\n779,,\n905,,\n183,,\n475,,\n435,,\n409,,\n837,,\n949,,\n273,,\n212,,\n941,,\n433,,\n467,,\n780,,\n465,,\n510,,\n220,,\n438,,\n708,,\n817,,\n416,,\n325,,\n969,,\n188,,\n872,,\n55,,\n313,,\n291,,\n53,,\n194,,\n369,,\n849,,\n869,,\n853,,\n772,,\n201,,\n203,,\n777,,\n327,,\n807,,\n976,,\n231,,\n783,,\n167,,\n364,,\n306
,,\n342,,\n266,,\n609,,\n901,,\n162,,\n534,,\n440,,\n834,,\n671,,\n330,,\n308,,\n19,,\n936,,\n354,,\n757,,\n622,,\n605,,\n"
  },
  {
    "path": "expts/data/tiny_zinc_splits.csv",
    "content": "train,val,test\n3,61.0,65.0\n21,64.0,24.0\n26,89.0,90.0\n42,28.0,37.0\n23,95.0,83.0\n87,46.0,63.0\n68,30.0,58.0\n99,91.0,85.0\n77,48.0,70.0\n41,43.0,18.0\n0,7.0,81.0\n36,72.0,94.0\n29,38.0,13.0\n11,47.0,1.0\n79,16.0,33.0\n82,62.0,14.0\n27,8.0,97.0\n51,76.0,2.0\n25,78.0,59.0\n88,6.0,44.0\n96,,\n17,,\n20,,\n45,,\n67,,\n12,,\n54,,\n49,,\n32,,\n69,,\n60,,\n55,,\n57,,\n35,,\n53,,\n92,,\n4,,\n73,,\n75,,\n9,,\n71,,\n84,,\n80,,\n15,,\n39,,\n50,,\n86,,\n10,,\n5,,\n34,,\n22,,\n56,,\n66,,\n31,,\n74,,\n52,,\n19,,\n40,,\n98,,\n93,,\n"
  },
  {
    "path": "expts/dataset_benchmark.py",
    "content": "import os\nfrom os.path import dirname, abspath\nimport yaml\nfrom omegaconf import DictConfig\nfrom datetime import datetime\nfrom copy import deepcopy\n\nimport graphium\nfrom graphium.config._loader import load_datamodule, load_accelerator\nimport time\nfrom typing import Optional, List, Sequence\nimport wandb\nimport statistics\nimport tqdm\nimport torch\nimport numpy as np\n\n# Set up the working directory\nMAIN_DIR = dirname(dirname(abspath(graphium.__file__)))\nos.chdir(MAIN_DIR)\n# CONFIG_FILE = \"expts/neurips2023_configs/debug/config_large_gcn_debug.yaml\"\nCONFIG_FILE = \"expts/neurips2023_configs/config_large_gcn.yaml\"\n# CONFIG_FILE = \"expts/configs/config_pcqmv2_mpnn.yaml\"\n# CONFIG_FILE = \"expts/configs/config_ipu_qm9.yaml\"\n\n\ndef benchmark(fn, *args, message=\"\", log2wandb=False, **kwargs):\n    start = time.time()\n    value = fn(*args, **kwargs)\n    duration = time.time() - start\n    print(f\"{message} {duration:.3f} secs\")\n    if log2wandb:\n        wandb.log({message: duration})\n    return value\n\n\ndef benchmark_dataloader(dataloader, name, n_epochs=5, log2wandb=False):\n    print(f\"length of {name} dataloader: {len(dataloader)}\")\n    epoch_times = [0] * n_epochs\n    tputs = [0] * n_epochs\n    n_batches = [0] * n_epochs\n    n_graphs = [0] * n_epochs\n    n_nodes = [0] * n_epochs\n    n_edges = [0] * n_epochs\n    for i in range(n_epochs):\n        start = time.time()\n        for data in tqdm.tqdm(dataloader):\n            n_batches[i] += 1\n            n_graphs[i] += torch.sum(torch.max(data[\"features\"][\"batch\"], dim=-1).values).item()\n            n_nodes[i] += np.prod(data[\"features\"][\"batch\"].shape)\n            n_edges[i] += np.prod(data[\"features\"][\"edge_weight\"].shape)\n        epoch_times[i] = time.time() - start\n        tputs[i] = n_graphs[i] / epoch_times[i]\n\n    average_tput = statistics.mean(tputs)\n    print(f\"{name} dataloader average tput {average_tput}\")\n    print(f\"{name} 
dataloader tputs per epoch {tputs}\")\n    print(f\"{name} dataloader epoch times {epoch_times}\")\n    print(f\"{name} dataloader total batches per epoch {n_batches}\")\n    print(f\"{name} dataloader total graphs per epoch {n_graphs}\")\n    print(f\"{name} dataloader total nodes per epoch {n_nodes}\")\n    print(f\"{name} dataloader total edges per epoch {n_edges}\")\n\n    if log2wandb:\n        wandb.log({\"average tput\": average_tput})\n\n        for i in range(n_epochs):\n            d = {\n                \"epoch\": i,\n                \"tput per epoch\": tputs[i],\n                \"epoch times \": epoch_times[i],\n                \"batches per epoch\": n_batches[i],\n                \"graphs per epoch\": n_graphs[i],\n                \"nodes per epoch\": n_nodes[i],\n                \"edges per epoch\": n_edges[i],\n            }\n            print(d)\n            wandb.log(d)\n\n\ndef main(\n    cfg: DictConfig,\n    stages: Optional[Sequence[str]] = None,\n    run_name: str = \"dataset_benchmark\",\n    add_date_time: bool = True,\n    log2wandb: bool = False,\n) -> None:\n    if add_date_time:\n        run_name += \"_\" + datetime.now().strftime(\"%d.%m.%Y_%H.%M.%S\")\n\n    if log2wandb:\n        wandb.init(project=\"multitask-gnn\", name=run_name, config=cfg)\n\n    cfg = deepcopy(cfg)\n    cfg, accelerator_type = load_accelerator(cfg)\n    # Load and initialize the dataset\n    datamodule = benchmark(\n        load_datamodule, cfg, accelerator_type, message=\"Load duration\", log2wandb=log2wandb\n    )\n\n    benchmark(datamodule.prepare_data, message=\"Prepare duration\", log2wandb=log2wandb)\n\n    if False:  # stages is not None:\n        for stage in stages:\n            benchmark(datamodule.setup, stage, message=f\"Setup {stage} duration\", log2wandb=log2wandb)\n    else:\n        benchmark(datamodule.setup, message=f\"Setup duration\", log2wandb=log2wandb)\n\n    if stages is None or {\"train\", \"fit\"}.intersection(stages):\n        
dataloader = datamodule.train_dataloader()\n        benchmark_dataloader(\n            dataloader, name=\"train\", n_epochs=cfg[\"trainer\"][\"trainer\"][\"max_epochs\"], log2wandb=log2wandb\n        )\n    if stages is None or {\"val\", \"valid\", \"validation\"}.intersection(stages):\n        dataloader = datamodule.val_dataloader()\n        benchmark_dataloader(\n            dataloader,\n            name=\"validation\",\n            n_epochs=cfg[\"trainer\"][\"trainer\"][\"max_epochs\"],\n            log2wandb=log2wandb,\n        )\n    if stages is None or {\"test\", \"testing\"}.intersection(stages):\n        dataloader = datamodule.test_dataloader()\n        benchmark_dataloader(\n            dataloader, name=\"testing\", n_epochs=cfg[\"trainer\"][\"trainer\"][\"max_epochs\"], log2wandb=log2wandb\n        )\n\n\nif __name__ == \"__main__\":\n    with open(os.path.join(MAIN_DIR, CONFIG_FILE), \"r\") as f:\n        cfg = yaml.safe_load(f)\n    main(cfg, stages=[\"train\"], log2wandb=True)\n"
  },
  {
    "path": "expts/debug_yaml.py",
    "content": "# test_yaml.py\n\n\nimport yaml\n\nCONFIG_FILE = \"neurips2023_configs/config_small_mpnn.yaml\"\n\nimport re\nfrom collections import defaultdict\n\n# def get_anchors_and_aliases(filepath):\n#     anchors = defaultdict(list)\n#     current_level = {}\n\n#     with open(filepath, \"r\") as file:\n#         for line in file:\n#             indent = len(line) - len(line.lstrip(' '))\n#             key_match = re.search(r'(\\w+):', line)\n#             anchor_match = re.search(r'&(\\w+)', line)\n#             alias_match = re.search(r'\\*(\\w+)', line)\n#             if key_match:\n#                 key = key_match.group(1)\n#                 # Compute the full path of the current key.\n#                 full_path = '.'.join([current_level[i] for i in sorted(current_level.keys()) if i < indent] + [key])\n#                 current_level[indent] = key\n#                 # Remove any keys that are indented more than the current line.\n#                 keys_to_remove = [i for i in current_level if i > indent]\n#                 for i in keys_to_remove:\n#                     del current_level[i]\n#             else:\n#                 full_path = '.'.join([current_level[i] for i in sorted(current_level.keys())])\n#             if anchor_match:\n#                 anchor = anchor_match.group(1)\n#                 anchors[anchor].append(full_path)\n#             if alias_match:\n#                 alias = alias_match.group(1)\n#                 anchors[alias].append(full_path)\n#     return {k: v for k, v in anchors.items() if len(v) > 1}\n\n\ndef get_anchors_and_aliases(filepath):\n    anchors = defaultdict(list)\n    current_level = {}\n    anchor_to_path = {}\n\n    with open(filepath, \"r\") as file:\n        for line in file:\n            indent = len(line) - len(line.lstrip(\" \"))\n            key_match = re.search(r\"(\\w+):\", line)\n            anchor_match = re.search(r\"&(\\w+)\", line)\n            alias_match = re.search(r\"\\*(\\w+)\", line)\n   
         if key_match:\n                key = key_match.group(1)\n                # Compute the full path of the current key.\n                full_path = \".\".join(\n                    [current_level[i] for i in sorted(current_level.keys()) if i < indent] + [key]\n                )\n                current_level[indent] = key\n                # Remove any keys that are indented more than the current line.\n                keys_to_remove = [i for i in current_level if i > indent]\n                for i in keys_to_remove:\n                    del current_level[i]\n            else:\n                full_path = \".\".join([current_level[i] for i in sorted(current_level.keys())])\n            if anchor_match:\n                anchor = anchor_match.group(1)\n                anchor_to_path[anchor] = full_path\n            if alias_match:\n                alias = alias_match.group(1)\n                if alias in anchor_to_path:\n                    anchors[anchor_to_path[alias]].append(full_path)\n    return anchors\n\n\nrefs = get_anchors_and_aliases(CONFIG_FILE)\nimport ipdb\n\nipdb.set_trace()\npass\n"
  },
  {
    "path": "expts/hydra-configs/README.md",
    "content": "# Configuring Graphium with Hydra\nThis document provides users with a point of entry to composing configs in Graphium. As a flexible library with many features, configuration is an important part of Graphium. To make configurations as reusable as possible while providing maximum flexibility, we integrated Graphium with `hydra`. Our config structure is designed to make the following functionality as accessible as possible:\n\n- Switching between **accelerators** (CPU, GPU and IPU)\n- **Benchmarking** different models on the same dataset\n- **Fine-tuning** a pre-trained model on a new dataset\n\nIn what follows, we describe how each of the above functionality is achieved and how users can benefit from this design to achieve the most with Graphium with as little configuration as possible.\n\n## Accelerators\nWith Graphium supporting CPU, GPU and IPU hardware, easily switching between these accelerators is pre-configured. General, accelerator-specific configs are specified under `accelerator/`, whereas experiment-specific differences between the accelerators are specialized under `training/accelerator`.\n\n## Benchmarking\nBenchmarking multiple models on the same datasets and tasks requires us to easily switch between model configurations without redefining major parts of the architecture, task heads, featurization, metrics, predictor, etc. For example, when changing from a GCN to a GIN model, a simple switch of `architecture.gnn.layer_type: 'pyg:gin'` might suffice. Hence, we abstract the `model` configs under `model/` where such model configurations can be specified.\nIn addition, switching models may have implications on configs specific to your current experiment, such as the name of the run or the directory to which model checkpoints are written. To enable such overrides, we can utilize `hydra` [specializations](https://hydra.cc/docs/patterns/specializing_config/). 
For example, for our ToyMix dataset, we specify the layer type under `model/[model_name].yaml`, e.g., for the GCN layer,\n\n```yaml\n# @package _global_\n\narchitecture:\n  gnn:\n    layer_type: 'pyg:gcn'\n```\n\nand set experiment-related parameters in `training/model/toymix_[model_name].yaml` as a specialization, e.g., for the GIN layer,\n\n```yaml\n# @package _global_\n\nconstants:\n  name: neurips2023_small_data_gin\n  ...\n\ntrainer:\n  model_checkpoint:\n    dirpath: models_checkpoints/neurips2023-small-gin/${now:%Y-%m-%d_%H-%M-%S}/\n```\nWe can now utilize `hydra` to e.g., run a sweep over our models on the ToyMix dataset via\n\n```bash\ngraphium-train -m model=gcn,gin\n```\nwhere the ToyMix dataset is pre-configured in `main.yaml`. Read on to find out how to define new datasets and architectures for pre-training and fine-tuning.\n\n## Pre-training / Fine-tuning\nSay you trained a model with the following command:\n```bash\ngraphium-train --config-name \"main\"\n```\n\nFine-tuning this model on downstream tasks is then as simple as:\n```bash\ngraphium-train --config-name \"main\" +finetuning=...\n```\n\nFrom a configuration point-of-view, fine-tuning requires us to load a pre-trained model and override part of the training configuration to fine-tune it on downstream tasks. To allow a quick switch between pre-training and fine-tuning, by default, we configure models and the corresponding tasks in a separate manner. 
More specifically,\n\n- under `architecture/` we store architecture-related configurations, such as the definition of the GNN/Transformer layers or positional/structural encoders\n- under `tasks/` we store configurations specific to one task set, such as the multi-task dataset ToyMix\n  - under `tasks/task_heads` we specify the task-specific heads to add on top of the base architecture\n  - under `tasks/loss_metrics_datamodule` we specify the data-module to use and the task-specific loss functions and metrics\n- under `training/` we store configurations specific to training models, which could differ for each combination of `architecture` and `tasks`\n- under `finetuning/` we store configurations with overrides\n\nSince architecture and tasks are logically separated, it now becomes very easy to, e.g., use an existing architecture backbone on a new set of tasks or a new dataset altogether. Additionally, separating training allows us to specify different training parameters for, e.g., pre-training and fine-tuning of the same architecture and task set.\n\nWe will now detail how you can add new architectures, tasks and training configurations.\n\n### Adding an architecture\nThe architecture config specifies the neural network components (including encoders) under the config key `architecture`, as well as the featurization, i.e., the positional/structural information to be extracted from the data.\nTo add a new architecture, create a file `architecture/my_architecture.yaml` with the following information specified:\n```yaml\n# @package _global_\narchitecture:\n  model_type: FullGraphMultiTaskNetwork # for example\n  pre_nn:\n    ...\n\n  pre_nn_edges:\n    ...\n\n  pe_encoders:\n    encoders: # your encoders\n      ...\n\n  gnn: # your GNN definition\n    ...\n\n  graph_output_nn: # output NNs for different levels such as graph, node, etc.\n    graph:\n      ...\n    node:\n      ...\n    ...\n\ndatamodule:\n  module_type: 
\"MultitaskFromSmilesDataModule\"\n  args: # Make sure to not specify anything task-specific here\n    ...\n  featurization:\n    ...\n```\nYou can then select your new architecture during training, e.g., by running\n```bash\ngraphium-train architecture=my_architecture\n```\n\n### Adding tasks\nThe task set config consists of specifications for the task head neural nets under the config key `architecture.task_heads`; if required, any task-specific arguments to the datamodule you use, e.g., `datamodule.args.task_specfic_args` when using the `MultitaskFromSmilesDataModule` datamodule; the per-task metrics under the config key `metrics.[task]` where `[task]` matches the tasks specified under `architecture.task_heads`; the per-task configs of the `predictor` module, as well as the loss functions of the task set under the config key `predictor.loss_fun`.\nTo add a new task set, create a file `tasks/my_tasks.yaml` with the following information specified:\n```yaml\n# @package _global_\narchitecture:\n    task_heads:\n        task1:\n            ...\n        task2:\n            ...\n\ndatamodule: # optional, depends on your concrete datamodule class. Here: \"MultitaskFromSmilesDataModule\"\n    args:\n        task_specific_args:\n            task1:\n                ...\n            task2:\n                ...\n\nmetrics:\n    task1:\n        ...\n    task2:\n        ...\n\npredictor:\n  metrics_on_progress_bar:\n    task1:\n    task2:\n  loss_fun: ... 
# your loss functions for the multi-tasking\n```\nYou can then select your new task set during training, e.g., by running\n```bash\ngraphium-train tasks=my_tasks\n```\n\n### Adding training configs\nThe training configs consist of specifications for the `predictor` and `trainer` modules.\nTo add new training configs, create a file `training/my_training.yaml` with the following information specified:\n```yaml\n# @package _global_\npredictor:\n    optim_kwargs:\n        lr: 4.e-5\n    torch_scheduler_kwargs: # example\n        module_type: WarmUpLinearLR\n        max_num_epochs: &max_epochs 100\n        warmup_epochs: 10\n        verbose: False\n    scheduler_kwargs:\n        ...\n\ntrainer:\n  ...\n  trainer: # example\n    precision: 16\n    max_epochs: *max_epochs\n    min_epochs: 1\n    check_val_every_n_epoch: 20\n```\n"
  },
  {
    "path": "expts/hydra-configs/__init__.py",
    "content": ""
  },
  {
    "path": "expts/hydra-configs/accelerator/cpu.yaml",
    "content": "type: cpu"
  },
  {
    "path": "expts/hydra-configs/accelerator/gpu.yaml",
    "content": "type: gpu"
  },
  {
    "path": "expts/hydra-configs/accelerator/ipu.yaml",
    "content": "type: ipu\nipu_config:\n    - deviceIterations(60) # IPU would require large batches to be ready for the model.\n    # 60 for PCQM4mv2\n    # 30 for largemix\n    - replicationFactor(16)\n    # - enableProfiling(\"graph_analyser\")       # The folder where the profile will be stored\n    # - enableExecutableCaching(\"pop_compiler_cache\")\n    - TensorLocations.numIOTiles(128)\n    - _Popart.set(\"defaultBufferingDepth\", 96)\n    - Precision.enableStochasticRounding(True)\n\nipu_inference_config:\n    # set device iteration and replication factor to 1 during inference\n    # gradient accumulation was set to 1 in the code\n    - deviceIterations(1)\n    - replicationFactor(1)\n    - Precision.enableStochasticRounding(False)\n"
  },
  {
    "path": "expts/hydra-configs/accelerator/ipu_pipeline.yaml",
    "content": "type: ipu\nipu_config:\n    - deviceIterations(60) # IPU would require large batches to be ready for the model.\n    # 60 for PCQM4mv2\n    # 30 for largemix\n    - replicationFactor(4)\n    # - enableProfiling(\"graph_analyser\")       # The folder where the profile will be stored\n    # - enableExecutableCaching(\"pop_compiler_cache\")\n    - TensorLocations.numIOTiles(128)\n    - _Popart.set(\"defaultBufferingDepth\", 96)\n    - Precision.enableStochasticRounding(True)\n\nipu_inference_config:\n    # set device iteration and replication factor to 1 during inference\n    # gradient accumulation was set to 1 in the code\n    - deviceIterations(60)\n    - replicationFactor(1)\n    - Precision.enableStochasticRounding(False)\n\naccelerator_kwargs:\n    _accelerator: \"ipu\"\n    gnn_layers_per_ipu: [4, 4, 4, 4]"
  },
  {
    "path": "expts/hydra-configs/architecture/largemix.yaml",
    "content": "# @package _global_\n\narchitecture:\n  model_type: FullGraphMultiTaskNetwork\n  mup_base_path: null\n  pre_nn:   # Set as null to avoid a pre-nn network\n    out_dim: 64\n    hidden_dims: 256\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.1\n    normalization: ${constants.norm}\n    last_normalization: ${constants.norm}\n    residual_type: none\n\n  pre_nn_edges: null\n\n  pe_encoders:\n    out_dim: 32\n    pool: \"sum\" #\"mean\" \"max\"\n    last_norm: None #\"batch_norm\", \"layer_norm\"\n    encoders: #la_pos |  rw_pos\n      la_pos:  # Set as null to avoid a pre-nn network\n        encoder_type: \"laplacian_pe\"\n        input_keys: [\"laplacian_eigvec\", \"laplacian_eigval\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        model_type: 'DeepSet' #'Transformer' or 'DeepSet'\n        num_layers: 2\n        num_layers_post: 1 # Num. layers to apply after pooling\n        dropout: 0.1\n        first_normalization: \"none\" #\"batch_norm\" or \"layer_norm\"\n      rw_pos:\n        encoder_type: \"mlp\"\n        input_keys: [\"rw_return_probs\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        num_layers: 2\n        dropout: 0.1\n        normalization: ${constants.norm} #\"batch_norm\" or \"layer_norm\"\n        first_normalization: ${constants.norm} #\"batch_norm\" or \"layer_norm\"\n\n  gnn:  # Set as null to avoid a post-nn network\n    in_dim: 64 # or otherwise the correct value\n    out_dim: &gnn_dim 768\n    hidden_dims: *gnn_dim\n    depth: 4\n    activation: gelu\n    last_activation: none\n    dropout: 0.1\n    normalization: ${constants.norm}\n    last_normalization: ${constants.norm}\n    residual_type: simple\n    virtual_node: 'none'\n\n  graph_output_nn:\n    graph:\n      pooling: [sum]\n      out_dim: 1024\n      hidden_dims: *gnn_dim\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: 
*dropout\n      normalization: ${constants.norm}\n      last_normalization: \"none\"\n      residual_type: none\n    node:\n      pooling: [sum]\n      out_dim: 64\n      hidden_dims: *gnn_dim\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: ${constants.norm}\n      last_normalization: \"none\"\n      residual_type: none\n\ndatamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  args:\n    prepare_dict_or_graph: pyg:graph\n    featurization_n_jobs: 20\n    featurization_progress: True\n    featurization_backend: \"loky\"\n    processed_graph_data_path: ${constants.datacache_path}\n    dataloading_from: \"disk\"\n    num_workers: 20 # -1 to use all\n    persistent_workers: True\n    featurization:\n      atom_property_list_onehot: [atomic-number, group, period, total-valence]\n      atom_property_list_float: [degree, formal-charge, radical-electron, aromatic, in-ring]\n      edge_property_list: [bond-type-onehot, stereo, in-ring]\n      add_self_loop: False\n      explicit_H: False # if H is included\n      use_bonds_weights: False\n      pos_encoding_as_features: # encoder dropout 0.18\n        pos_types:\n          lap_eigvec:\n            pos_level: node\n            pos_type: laplacian_eigvec\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigen vectors\n            disconnected_comp: True # if eigen values/vector for disconnected graph are included\n          lap_eigval:\n            pos_level: node\n            pos_type: laplacian_eigval\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigen vectors\n            disconnected_comp: True # if eigen values/vector for disconnected graph are included\n          rw_pos: # use same name as pe_encoder\n            pos_level: node\n            pos_type: rw_return_probs\n            ksteps: 16"
  },
  {
    "path": "expts/hydra-configs/architecture/pcqm4m.yaml",
    "content": "# @package _global_\n\narchitecture:\n  model_type: FullGraphMultiTaskNetwork\n  mup_base_path: null\n  pre_nn:   # Set as null to avoid a pre-nn network\n    out_dim: 256\n    hidden_dims: 1024\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.18\n    normalization: &normalization layer_norm\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges:   # Set as null to avoid a pre-nn network\n    out_dim: 128\n    hidden_dims: 512\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: *dropout\n    normalization: *normalization\n    last_normalization: *normalization\n    residual_type: none\n\n  pe_encoders:\n    out_dim: 32\n    pool: \"sum\" #\"mean\" \"max\"\n    last_norm: None #\"batch_norm\", \"layer_norm\"\n    encoders: #la_pos |  rw_pos\n      la_pos:  # Set as null to avoid a pre-nn network\n        encoder_type: \"laplacian_pe\"\n        input_keys: [\"laplacian_eigvec\", \"laplacian_eigval\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        model_type: 'DeepSet' #'Transformer' or 'DeepSet'\n        num_layers: 2\n        num_layers_post: 1 # Num. 
layers to apply after pooling\n        dropout: 0.1\n        first_normalization: \"none\" #\"batch_norm\" or \"layer_norm\"\n      rw_pos:\n        encoder_type: \"mlp\"\n        input_keys: [\"rw_return_probs\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        num_layers: 2\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n        first_normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n\n\n\n  gnn:  # Set as null to avoid a post-nn network\n    out_dim: 256\n    hidden_dims: 256\n    depth: 4\n    activation: gelu\n    last_activation: none\n    dropout: 0.1\n    normalization: \"layer_norm\"\n    last_normalization: *normalization\n    residual_type: simple\n    virtual_node: 'none'\n\n  graph_output_nn:\n    graph:\n      pooling: [sum]\n      out_dim: 256\n      hidden_dims: 256\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\ndatamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  # module_type: \"FakeDataModule\"  # Option to use generated data\n  args: # Matches that in the test_multitask_datamodule.py case.\n    # Featurization\n    prepare_dict_or_graph: pyg:graph\n    featurization_n_jobs: 30\n    featurization_progress: True\n    featurization_backend: \"loky\"\n    processed_graph_data_path: ${constants.datacache_path}\n    num_workers: 40 # -1 to use all\n    persistent_workers: False # if use persistent worker at the start of each epoch.\n    # Using persistent_workers false might make the start of each epoch very long.\n    featurization:\n    # OGB: ['atomic_num', 'degree', 'possible_formal_charge', 'possible_numH' (total-valence),\n    # 'possible_number_radical_e', 'possible_is_aromatic', 'possible_is_in_ring',\n    # 'num_chiral_centers (not included yet)']\n      atom_property_list_onehot: 
[atomic-number, group, period, total-valence]\n      atom_property_list_float: [degree, formal-charge, radical-electron, aromatic, in-ring]\n      # OGB: ['possible_bond_type', 'possible_bond_stereo', 'possible_is_in_ring']\n      edge_property_list: [bond-type-onehot, stereo, in-ring]\n      add_self_loop: False\n      explicit_H: False # if H is included\n      use_bonds_weights: False\n      pos_encoding_as_features: # encoder dropout 0.18\n        pos_types:\n          lap_eigvec:\n            pos_level: node\n            pos_type: laplacian_eigvec\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigen vectors\n            disconnected_comp: True # if eigen values/vector for disconnected graph are included\n          lap_eigval:\n            pos_level: node\n            pos_type: laplacian_eigval\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigen vectors\n            disconnected_comp: True # if eigen values/vector for disconnected graph are included\n          rw_pos: # use same name as pe_encoder\n            pos_level: node\n            pos_type: rw_return_probs\n            ksteps: 16\n\n"
  },
  {
    "path": "expts/hydra-configs/architecture/toymix.yaml",
    "content": "# @package _global_\n\narchitecture:\n  model_type: FullGraphMultiTaskNetwork\n  mup_base_path: null\n  pre_nn:\n    out_dim: 64\n    hidden_dims: 256\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: 0.18\n    normalization: layer_norm\n    last_normalization: ${architecture.pre_nn.normalization}\n    residual_type: none\n\n  pre_nn_edges: null\n\n  pe_encoders:\n    out_dim: 32\n    pool: \"sum\" #\"mean\" \"max\"\n    last_norm: None #\"batch_norm\", \"layer_norm\"\n    encoders: #la_pos |  rw_pos\n      la_pos:  # Set as null to avoid a pre-nn network\n        encoder_type: \"laplacian_pe\"\n        input_keys: [\"laplacian_eigvec\", \"laplacian_eigval\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        model_type: 'DeepSet' #'Transformer' or 'DeepSet'\n        num_layers: 2\n        num_layers_post: 1 # Num. layers to apply after pooling\n        dropout: 0.1\n        first_normalization: \"none\" #\"batch_norm\" or \"layer_norm\"\n      rw_pos:\n        encoder_type: \"mlp\"\n        input_keys: [\"rw_return_probs\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        num_layers: 2\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n        first_normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n\n  gnn:  # Set as null to avoid a post-nn network\n    in_dim: 64 # or otherwise the correct value\n    out_dim: &gnn_dim 96\n    hidden_dims: *gnn_dim\n    depth: 4\n    activation: gelu\n    last_activation: none\n    dropout: 0.1\n    normalization: \"layer_norm\"\n    last_normalization: ${architecture.pre_nn.normalization}\n    residual_type: simple\n    virtual_node: 'none'\n    layer_type: 'pyg:gcn' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n    layer_kwargs: null # Parameters for the model itself. 
You could define dropout_attn: 0.1\n\n  graph_output_nn:\n    graph:\n      pooling: [sum]\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: ${architecture.pre_nn.dropout}\n      normalization: ${architecture.pre_nn.normalization}\n      last_normalization: \"none\"\n      residual_type: none\n\ndatamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  args:\n    prepare_dict_or_graph: pyg:graph\n    featurization_n_jobs: 30\n    featurization_progress: True\n    featurization_backend: \"loky\"\n    processed_graph_data_path:  ${constants.datacache_path}\n    dataloading_from: ram\n    num_workers: 30 # -1 to use all\n    persistent_workers: False\n    featurization:\n      atom_property_list_onehot: [atomic-number, group, period, total-valence]\n      atom_property_list_float: [degree, formal-charge, radical-electron, aromatic, in-ring]\n      edge_property_list: [bond-type-onehot, stereo, in-ring]\n      add_self_loop: False\n      explicit_H: False # if H is included\n      use_bonds_weights: False\n      pos_encoding_as_features:\n        pos_types:\n          lap_eigvec:\n            pos_level: node\n            pos_type: laplacian_eigvec\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigen vectors\n            disconnected_comp: True # if eigen values/vector for disconnected graph are included\n          lap_eigval:\n            pos_level: node\n            pos_type: laplacian_eigval\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigen vectors\n            disconnected_comp: True # if eigen values/vector for disconnected graph are included\n          rw_pos: # use same name as pe_encoder\n            pos_level: node\n            pos_type: rw_return_probs\n            ksteps: 16"
  },
  {
    "path": "expts/hydra-configs/experiment/toymix_mpnn.yaml",
    "content": "# @package _global_\n\nconstants:\n  name: neurips2023_small_data_mpnn\n  entity: \"multitask-gnn\"\n  seed: 42\n  max_epochs: 100\n  data_dir: expts/data/neurips2023/small-dataset\n  raise_train_error: true\n\ntrainer:\n  model_checkpoint:\n    dirpath: models_checkpoints/neurips2023-small-mpnn/${now:%Y-%m-%d_%H-%M-%S}/"
  },
  {
    "path": "expts/hydra-configs/finetuning/admet.yaml",
    "content": "# @package _global_\n\n# == Fine-tuning configs in Graphium ==\n# \n# A fine-tuning config is a appendum to a (pre-)training config.\n# Since many things (e.g. the architecture), will stay constant between (pre-)training and fine-tuning,\n# this config should be as minimal as possible to avoid unnecessary duplication. It only specifies\n# what to override with regards to the config used for (pre-)training.\n# \n# Given the following training command: \n# >>> graphium-train --cfg /path/to/train.yaml\n# \n# Fine-tuning now is as easy as: \n# >>> graphium-train --cfg /path/to/train.yaml +finetune=admet\n#\n# NOTE: This config can be used for each of the benchmarks in the TDC ADMET benchmark suite.\n#     The only thing that needs to be changed is the `constants.task` key.\n\n\n## == Overrides == \n\ndefaults:\n  # This file contains all metrics and loss function info for all ADMET tasks.\n  # This config is filtered at runtime based on the `constants.task` key.\n  - override /tasks/loss_metrics_datamodule: admet\n\nconstants:\n  \n  # For now, we assume a model is always fine-tuned on a single task at a time.\n  # You can override this value with any of the benchmark names in the TDC benchmark suite.\n  # See also https://tdcommons.ai/benchmark/admet_group/overview/\n  task: lipophilicity_astrazeneca\n\n  name: finetuning_${constants.task}_gcn\n  wandb:\n    name: ${constants.name}\n    project: ${constants.task}\n    entity: multitask-gnn\n    save_dir: logs/${constants.task}\n  seed: 42\n  max_epochs: 100\n  data_dir: expts/data/admet/${constants.task}\n  raise_train_error: true\n\npredictor:\n  optim_kwargs:\n    lr: 4.e-5\n\n# == Fine-tuning config == \n\nfinetuning:\n\n  # For now, we assume a model is always fine-tuned on a single task at a time.\n  # You can override this value with any of the benchmark names in the TDC benchmark suite.\n  # See also https://tdcommons.ai/benchmark/admet_group/overview/\n  task: ${constants.task}\n  level: 
graph\n\n  # Pretrained model\n  pretrained_model: dummy-pretrained-model\n  finetuning_module: task_heads # gnn  \n  sub_module_from_pretrained: zinc # optional\n  new_sub_module: ${constants.task} # optional\n  \n  # keep_modules_after_finetuning_module: # optional\n  #   graph_output_nn/graph: {}\n  #   task_heads/zinc:\n  #     new_sub_module: lipophilicity_astrazeneca\n  #     out_dim: 1\n\n\n  # Changes to finetuning_module                                                 \n  drop_depth: 1\n  new_out_dim: 8\n  added_depth: 2\n\n  # Training\n  unfreeze_pretrained_depth: 0\n  epoch_unfreeze_all: none\n\n  # Optional finetuning head appended to model after finetuning_module\n  finetuning_head:\n    task: ${constants.task}\n    previous_module: task_heads\n    incoming_level: graph\n    model_type: mlp\n    in_dim: 8\n    out_dim: 1\n    hidden_dims: 8\n    depth: 2\n    last_layer_is_readout: true\n"
  },
  {
    "path": "expts/hydra-configs/finetuning/admet_baseline.yaml",
    "content": "# @package _global_\n\ndefaults:\n  - override /tasks/loss_metrics_datamodule: admet\n\nconstants:\n  task: tbd\n  name: finetune_${constants.task}\n  wandb:\n    name: ${constants.name}\n    project: finetuning\n    entity: recursion\n  seed: 42\n  max_epochs: 100\n  data_dir: ../data/graphium/admet/${constants.task}\n  datacache_path: ../datacache/admet/${constants.task}\n  raise_train_error: true\n  metric: ${get_metric_name:${constants.task}}\n\ndatamodule:\n  args:\n    batch_size_training: 32\n    dataloading_from: ram\n    persistent_workers: true\n    num_workers: 4\n  \ntrainer:\n  model_checkpoint:\n    # save_top_k: 1\n    # monitor: graph_${constants.task}/${constants.metric}/val\n    # mode: ${get_metric_mode:${constants.task}}\n    # save_last: true\n    # filename: best\n    dirpath: model_checkpoints/finetuning/${constants.task}/${now:%Y-%m-%d_%H-%M-%S.%f}/\n    every_n_epochs: 200\n  trainer:\n    precision: 32\n    check_val_every_n_epoch: 1\n  # early_stopping:\n  #   monitor: graph_${constants.task}/${constants.metric}/val\n  #   mode: ${get_metric_mode:${constants.task}}\n  #   min_delta: 0.001\n  #   patience: 10\n  accumulate_grad_batches: none\n  # test_from_checkpoint: best.ckpt\n  # test_from_checkpoint: ${trainer.model_checkpoint.dirpath}/best.ckpt\n  \npredictor:\n  optim_kwargs:\n    lr: 0.000005\n\n\n# == Fine-tuning config == \n\nfinetuning:\n  task: ${constants.task}\n  level: graph\n  pretrained_model: tbd\n  finetuning_module: graph_output_nn  \n  sub_module_from_pretrained: graph\n  new_sub_module: graph\n\n  keep_modules_after_finetuning_module: # optional\n    task_heads-pcqm4m_g25:\n      new_sub_module: ${constants.task}\n      hidden_dims: 256\n      depth: 2\n      last_activation: ${get_last_activation:${constants.task}}\n      out_dim: 1\n\n  epoch_unfreeze_all: tbd"
  },
  {
    "path": "expts/hydra-configs/hparam_search/optuna.yaml",
    "content": "# @package _global_\n#\n# For running a hyper-parameter search, we use the Optuna plugin for hydra.\n# This makes optuna available as a sweeper in hydra and integrates easily with the rest of the codebase. \n# For more info, see https://hydra.cc/docs/plugins/optuna_sweeper/\n#\n# To run a hyper-param search, \n#   (1) Update this config, specifically the hyper-param search space;\n#   (2) Run `graphium-train +hparam_search=optuna` from the command line.\n\n\ndefaults:\n  - override /hydra/sweeper: optuna\n  # Optuna supports various sweepers (e.g. grid search, random search, TPE sampler)\n  - override /hydra/sweeper/sampler: tpe\n\nhyper_param_search: \n  # For the sweeper to work, the main process needs to return\n  # the objective value(s) (as a float) we are trying to optimize. \n\n  # Assuming this is a metric, the `objective` key specifies which metric. \n  # Optuna supports multi-parameter optimization as well. \n  # If configured correctly, you can specify multiple keys.\n  objective: loss/test\n\n  # Where to save results to\n  # NOTE (cwognum): Ideally, we would use the `hydra.sweep.dir` key, but they don't support remote paths.\n  # save_destination: gs://path/to/bucket\n  # overwrite_destination: false\n\nhydra:\n  # Run in multirun mode by default (i.e. actually use the sweeper)\n  mode: MULTIRUN\n\n  # Changes the working directory\n  sweep:\n    dir: hparam-search-results/${constants.name}\n    subdir: ${hydra.job.num}\n\n  # Sweeper config\n  sweeper:\n    sampler:\n      seed: ${constants.seed}\n    direction: minimize\n    study_name: ${constants.name}\n    storage: null\n    n_trials: 100\n    n_jobs: 1\n\n    # The hyper-parameter search space definition\n    # See https://hydra.cc/docs/plugins/optuna_sweeper/#search-space-configuration for the options\n    params:\n      predictor.optim_kwargs.lr: tag(log, interval(0.00001, 0.001))\n\n"
  },
  {
    "path": "expts/hydra-configs/main.yaml",
    "content": "defaults:\n\n  # Accelerators\n  - accelerator: cpu\n\n  # Pre-training/fine-tuning\n  - architecture: toymix\n  - tasks: toymix\n  - training: toymix\n\n  # Benchmarking\n  - model: gcn\n\n  # Specializations\n  - training/accelerator: ${training}_${accelerator}\n  - training/model: ${training}_${model}\n"
  },
  {
    "path": "expts/hydra-configs/model/gated_gcn.yaml",
    "content": "# @package _global_\n\narchitecture:\n  pre_nn_edges:   # Set as null to avoid a pre-nn network\n    out_dim: 32\n    hidden_dims: 128\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: ${architecture.pre_nn.dropout}\n    normalization: ${architecture.pre_nn.normalization}\n    last_normalization: ${architecture.pre_nn.normalization}\n    residual_type: none\n\n  gnn:\n    out_dim: ${constants.gnn_dim} # &gnn_dim 704\n    hidden_dims: ${architecture.gnn.out_dim} # *gnn_dim\n    hidden_dims_edges: ${constants.gnn_edge_dim}\n    layer_type: 'pyg:gated-gcn'\n\n  graph_output_nn:\n    graph:\n      hidden_dims: ${architecture.gnn.out_dim}\n    node:\n      hidden_dims: ${architecture.gnn.out_dim}\n"
  },
  {
    "path": "expts/hydra-configs/model/gcn.yaml",
    "content": "# @package _global_\n\narchitecture:\n  gnn:\n    layer_type: 'pyg:gcn'"
  },
  {
    "path": "expts/hydra-configs/model/gin.yaml",
    "content": "# @package _global_\n\narchitecture:\n  gnn:\n    layer_type: 'pyg:gin'"
  },
  {
    "path": "expts/hydra-configs/model/gine.yaml",
    "content": "# @package _global_\n\narchitecture:\n  pre_nn_edges:   # Set as null to avoid a pre-nn network\n    out_dim: ${constants.gnn_edge_dim}\n    hidden_dims: 128\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: ${architecture.pre_nn.dropout}\n    normalization: ${architecture.pre_nn.normalization}\n    last_normalization: ${architecture.pre_nn.normalization}\n    residual_type: none\n\n  gnn:\n    out_dim: ${constants.gnn_dim}\n    hidden_dims: ${architecture.gnn.out_dim}\n    layer_type: 'pyg:gine'\n\n  graph_output_nn:\n    graph:\n      hidden_dims: ${architecture.gnn.out_dim}\n    node:\n      hidden_dims: ${architecture.gnn.out_dim}\n"
  },
  {
    "path": "expts/hydra-configs/model/gpspp.yaml",
    "content": "# @package _global_\n\ndatamodule:\n  args:\n    batch_size_training: 32\n    featurization:\n      conformer_property_list: [positions_3d]\n\ntrainer:\n  trainer:\n    accumulate_grad_batches: 2\n\narchitecture:\n  pe_encoders:\n    encoders:\n      gaussian_pos:\n        encoder_type: \"gaussian_kernel\"\n        input_keys: [\"positions_3d\"]\n        output_keys: [\"feat\", \"nodepair_gaussian_bias_3d\"]\n        num_heads: 32\n        num_layers: 1 #2\n        embed_dim: 32\n        out_dim: 32 # need num of gaussian kernels 128\n        # but currently it checks pe_out_dim == pe_out_dim in encoder_manager.py, line 128\n        use_input_keys_prefix: False\n\n  gnn:\n    layer_type: 'pyg:gps'\n    layer_kwargs:  # Parameters for the model itself. You could define dropout_attn: 0.1\n      node_residual: false\n      mpnn_type: 'pyg:mpnnplus'\n      mpnn_kwargs:\n        in_dim: 256\n        out_dim: 256\n        in_dim_edges: 128\n        out_dim_edges: 128\n      attn_type: \"full-attention\" # \"full-attention\", \"none\"\n      precision: &precision 16-true\n      biased_attention_key: \"nodepair_gaussian_bias_3d\" # 3D_bias\n      attn_kwargs:\n        num_heads: 32\n      droppath_rate_attn: 0.0\n      droppath_rate_ffn: 0.0\n"
  },
  {
    "path": "expts/hydra-configs/model/mpnn.yaml",
    "content": "# @package _global_\n\narchitecture:\n  pre_nn_edges:   # Set as null to avoid a pre-nn network\n    out_dim: 32\n    hidden_dims: 128\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: ${architecture.pre_nn.dropout}\n    normalization: ${architecture.pre_nn.normalization}\n    last_normalization: ${architecture.pre_nn.normalization}\n    residual_type: none\n\n  gnn:\n    out_dim: ${constants.gnn_dim}\n    hidden_dims: ${architecture.gnn.out_dim}\n    hidden_dims_edges: ${constants.gnn_edge_dim}\n    layer_type: 'pyg:mpnnplus'\n\n  graph_output_nn:\n    graph:\n      hidden_dims: ${architecture.gnn.out_dim}\n    node:\n      hidden_dims: ${architecture.gnn.out_dim}\n"
  },
  {
    "path": "expts/hydra-configs/tasks/admet.yaml",
    "content": "# NOTE: We cannot have a single config, since for fine-tuning we will\n#  only want to override the loss_metrics_datamodule, whereas for training we will\n#  want to override both. \n\ndefaults:\n  - task_heads: admet\n  - loss_metrics_datamodule: admet"
  },
  {
    "path": "expts/hydra-configs/tasks/l1000_mcf7.yaml",
    "content": "# NOTE: We cannot have a single config, since for fine-tuning we will\n#  only want to override the loss_metrics_datamodule, whereas for training we will\n#  want to override both. \n\ndefaults:\n  - task_heads: l1000_mcf7\n  - loss_metrics_datamodule: l1000_mcf7"
  },
  {
    "path": "expts/hydra-configs/tasks/l1000_vcap.yaml",
    "content": "# NOTE: We cannot have a single config, since for fine-tuning we will\n#  only want to override the loss_metrics_datamodule, whereas for training we will\n#  want to override both. \n\ndefaults:\n  - task_heads: l1000_vcap\n  - loss_metrics_datamodule: l1000_vcap"
  },
  {
    "path": "expts/hydra-configs/tasks/largemix.yaml",
    "content": "# NOTE: We cannot have a single config, since for fine-tuning we will\n#  only want to override the loss_metrics_datamodule, whereas for training we will\n#  want to override both. \n\ndefaults:\n  - task_heads: largemix\n  - loss_metrics_datamodule: largemix"
  },
  {
    "path": "expts/hydra-configs/tasks/loss_metrics_datamodule/admet.yaml",
    "content": "# @package _global_\n\n#Task-specific\npredictor:\n  metrics_on_progress_bar:\n    # All below metrics are directly copied from the TDC website.\n    # For more information, see https://tdcommons.ai/benchmark/admet_group/overview/\n    caco2_wang: [\"mae\"]\n    hia_hou: [\"auroc\"]\n    pgp_broccatelli: [\"auroc\"]\n    bioavailability_ma: [\"auroc\"]\n    lipophilicity_astrazeneca: [\"mae\"]\n    solubility_aqsoldb: [\"mae\"]\n    bbb_martins: [\"auroc\"]\n    ppbr_az: [\"mae\"]\n    vdss_lombardo: [\"spearman\"]\n    cyp2d6_veith: [\"auprc\"]\n    cyp3a4_veith: [\"auprc\"]\n    cyp2c9_veith: [\"auprc\"]\n    cyp2d6_substrate_carbonmangels: [\"auprc\"]\n    cyp3a4_substrate_carbonmangels: [\"auprc\"]\n    cyp2c9_substrate_carbonmangels: [\"auprc\"]\n    half_life_obach: [\"spearman\"]\n    clearance_microsome_az: [\"spearman\"]\n    clearance_hepatocyte_az: [\"spearman\"]\n    herg: [\"auroc\"]\n    ames: [\"auroc\"]\n    dili: [\"auroc\"]\n    ld50_zhu: [\"auroc\"]\n  loss_fun:\n    caco2_wang: mae\n    hia_hou: bce\n    pgp_broccatelli: bce\n    bioavailability_ma: bce\n    lipophilicity_astrazeneca: mae\n    solubility_aqsoldb: mae\n    bbb_martins: bce\n    ppbr_az: mae\n    vdss_lombardo: mae\n    cyp2d6_veith: bce\n    cyp3a4_veith: bce\n    cyp2c9_veith: bce\n    cyp2d6_substrate_carbonmangels: bce\n    cyp3a4_substrate_carbonmangels: bce\n    cyp2c9_substrate_carbonmangels: bce\n    half_life_obach: mae\n    clearance_microsome_az: mae\n    clearance_hepatocyte_az: mae\n    herg: bce\n    ames: bce\n    dili: bce\n    ld50_zhu: mae\n  random_seed: ${constants.seed}\n  optim_kwargs:\n    lr: 4.e-5 # warmup can be scheduled using torch_scheduler_kwargs\n  torch_scheduler_kwargs:\n    module_type: WarmUpLinearLR\n    max_num_epochs: &max_epochs 10\n    warmup_epochs: 10\n    verbose: False\n  target_nan_mask: null # null: no mask, 0: 0 mask, ignore-flatten, ignore-mean-per-label\n  multitask_handling: flatten # flatten, mean-per-label\n\n# 
Task-specific\nmetrics:\n  caco2_wang: &regression_metrics\n    - name: mae\n      metric: mae\n      target_nan_mask: null\n      multitask_handling: flatten\n      threshold_kwargs: null\n    - name: spearman\n      metric: spearmanr\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n    - name: pearson\n      metric: pearsonr\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n    - name: r2_score\n      metric: r2_score\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n  hia_hou: &classification_metrics\n    - name: auroc\n      metric: auroc\n      task: binary\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n    - name: auprc\n      metric: averageprecision\n      task: binary\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n    - name: accuracy\n      metric: accuracy\n      multitask_handling: mean-per-label\n      target_to_int: True\n      average: micro\n      threshold_kwargs: &threshold_05\n        operator: greater\n        threshold: 0.5\n        th_on_preds: True\n        th_on_target: True\n    - name: mcc\n      metric: mcc\n      num_classes: 2\n      multitask_handling: mean-per-label\n      target_to_int: True\n      average: micro\n      threshold_kwargs: *threshold_05\n  pgp_broccatelli: *classification_metrics\n  bioavailability_ma: *classification_metrics\n  lipophilicity_astrazeneca: *regression_metrics\n  solubility_aqsoldb: *regression_metrics\n  bbb_martins: *classification_metrics\n  ppbr_az: *regression_metrics\n  vdss_lombardo: *regression_metrics\n  cyp2d6_veith: *classification_metrics\n  cyp3a4_veith: *classification_metrics\n  cyp2c9_veith: *classification_metrics\n  cyp2d6_substrate_carbonmangels: *classification_metrics\n  cyp3a4_substrate_carbonmangels: *classification_metrics\n  cyp2c9_substrate_carbonmangels: 
*classification_metrics\n  half_life_obach: *regression_metrics\n  clearance_microsome_az: *regression_metrics\n  clearance_hepatocyte_az: *regression_metrics\n  herg: *classification_metrics\n  ames: *classification_metrics\n  dili: *classification_metrics\n  ld50_zhu: *regression_metrics\n\ndatamodule:\n  module_type: \"ADMETBenchmarkDataModule\"\n  args:\n    # TDC specific\n    tdc_benchmark_names: null\n    tdc_train_val_seed: ${constants.seed}\n"
  },
  {
    "path": "expts/hydra-configs/tasks/loss_metrics_datamodule/l1000_mcf7.yaml",
    "content": "# @package _global_\n\npredictor:\n  metrics_on_progress_bar:\n    l1000_mcf7: []\n  metrics_on_training_set:\n    l1000_mcf7: []\n  loss_fun:\n    l1000_mcf7:\n      name: hybrid_ce_ipu\n      n_brackets: 3\n      alpha: 0.5\n\nmetrics:\n  l1000_mcf7:\n    - name: auroc\n      metric: auroc\n      num_classes: 3\n      task: multiclass\n      target_to_int: True\n      target_nan_mask: -1000\n      ignore_index: -1000\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n    - name: avpr\n      metric: averageprecision\n      num_classes: 3\n      task: multiclass\n      target_to_int: True\n      target_nan_mask: -1000\n      ignore_index: -1000\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n\ndatamodule:\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      l1000_mcf7:\n        df: null\n        df_path: ../data/graphium/large-dataset/LINCS_L1000_MCF7_0-2_th2.csv.gz\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/LINCS_L1000_MCF7_0-4.csv.gz\n        # or set path as the URL directly\n        smiles_col: \"SMILES\"\n        label_cols: geneID-*  # geneID-* means all columns starting with \"geneID-\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: graph\n        splits_path: ../data/graphium/large-dataset/l1000_mcf7_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/l1000_mcf7_random_splits.pt`\n        # split_names: [train, val, test_seen]\n        epoch_sampling_fraction: 1.0"
  },
  {
    "path": "expts/hydra-configs/tasks/loss_metrics_datamodule/l1000_vcap.yaml",
    "content": "# @package _global_\n\npredictor:\n  metrics_on_progress_bar:\n    l1000_vcap: []\n  metrics_on_training_set:\n    l1000_vcap: []\n  loss_fun:\n    l1000_vcap:\n      name: hybrid_ce_ipu\n      n_brackets: 3\n      alpha: 0.5\n\nmetrics:\n  l1000_vcap:\n    - name: auroc\n      metric: auroc\n      num_classes: 3\n      task: multiclass\n      target_to_int: True\n      target_nan_mask: -1000\n      ignore_index: -1000\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n    - name: avpr\n      metric: averageprecision\n      num_classes: 3\n      task: multiclass\n      target_to_int: True\n      target_nan_mask: -1000\n      ignore_index: -1000\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n\ndatamodule:\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      l1000_vcap:\n        df: null\n        df_path: ../data/graphium/large-dataset/LINCS_L1000_VCAP_0-2_th2.csv.gz\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/LINCS_L1000_VCAP_0-4.csv.gz\n        # or set path as the URL directly\n        smiles_col: \"SMILES\"\n        label_cols: geneID-*  # geneID-* means all columns starting with \"geneID-\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: graph\n        splits_path: ../data/graphium/large-dataset/l1000_vcap_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/l1000_vcap_random_splits.pt`\n        # split_names: [train, val, test_seen]\n        epoch_sampling_fraction: 1.0"
  },
  {
    "path": "expts/hydra-configs/tasks/loss_metrics_datamodule/largemix.yaml",
    "content": "# @package _global_\n\npredictor:\n  metrics_on_progress_bar:\n    l1000_vcap: []\n    l1000_mcf7: []\n    pcba_1328: []\n    pcqm4m_g25: []\n    pcqm4m_n4: []\n  metrics_on_training_set:\n    l1000_vcap: []\n    l1000_mcf7: []\n    pcba_1328: []\n    pcqm4m_g25: []\n    pcqm4m_n4: []\n  loss_fun:\n    l1000_vcap:\n      name: hybrid_ce_ipu\n      n_brackets: 3\n      alpha: 0.5\n    l1000_mcf7:\n      name: hybrid_ce_ipu\n      n_brackets: 3\n      alpha: ${predictor.loss_fun.l1000_vcap.alpha}\n    pcba_1328: bce_logits_ipu\n    pcqm4m_g25: mae_ipu\n    pcqm4m_n4: mae_ipu\n\nmetrics:\n  l1000_vcap: &classif_metrics\n    - name: auroc\n      metric: auroc\n      num_classes: 3\n      task: multiclass\n      target_to_int: True\n      target_nan_mask: -1000\n      ignore_index: -1000\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n    - name: avpr\n      metric: averageprecision\n      num_classes: 3\n      task: multiclass\n      target_to_int: True\n      target_nan_mask: -1000\n      ignore_index: -1000\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n  l1000_mcf7: *classif_metrics\n  pcba_1328:\n  # use auroc and averageprecision (non_ipu version) so tha nans are handled correctly\n    - name: auroc\n      metric: auroc\n      task: binary\n      multitask_handling: mean-per-label\n      target_nan_mask: ignore\n      threshold_kwargs: null\n    - name: avpr\n      metric: averageprecision\n      task: binary\n      multitask_handling: mean-per-label\n      target_nan_mask: ignore\n      threshold_kwargs: null\n  pcqm4m_g25: &pcqm_metrics\n    - name: mae\n      metric: mae_ipu\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n    - name: pearsonr\n      metric: pearsonr_ipu\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n    - name: r2\n      metric: r2_score_ipu\n      threshold_kwargs: 
null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n  pcqm4m_n4: *pcqm_metrics\n\ndatamodule:\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      l1000_vcap:\n        df: null\n        df_path: ../data/graphium/large-dataset/LINCS_L1000_VCAP_0-2_th2.csv.gz\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/LINCS_L1000_VCAP_0-4.csv.gz\n        # or set path as the URL directly\n        smiles_col: \"SMILES\"\n        label_cols: geneID-*  # geneID-* means all columns starting with \"geneID-\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: graph\n        splits_path: ../data/graphium/large-dataset/l1000_vcap_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/l1000_vcap_random_splits.pt`\n        # split_names: [train, val, test_seen]\n        epoch_sampling_fraction: 1.0\n\n      l1000_mcf7:\n        df: null\n        df_path: ../data/graphium/large-dataset/LINCS_L1000_MCF7_0-2_th2.csv.gz\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/LINCS_L1000_MCF7_0-4.csv.gz\n        # or set path as the URL directly\n        smiles_col: \"SMILES\"\n        label_cols: geneID-*  # geneID-* means all columns starting with \"geneID-\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: graph\n        splits_path: ../data/graphium/large-dataset/l1000_mcf7_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/l1000_mcf7_random_splits.pt`\n        # split_names: [train, val, test_seen]\n        epoch_sampling_fraction: 1.0\n\n      pcba_1328:\n        df: null\n        df_path: ../data/graphium/large-dataset/PCBA_1328_1564k.parquet\n        # wget 
https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/PCBA_1328_1564k.parquet\n        # or set path as the URL directly\n        smiles_col: \"SMILES\"\n        label_cols: assayID-*  # assayID-* means all columns starting with \"assayID-\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: graph\n        splits_path: ../data/graphium/large-dataset/pcba_1328_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/pcba_1328_random_splits.pt`\n        # split_names: [train, val, test_seen]\n        epoch_sampling_fraction: 1.0\n\n      pcqm4m_g25:\n        df: null\n        df_path: ../data/graphium/large-dataset/PCQM4M_G25_N4.parquet\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/PCQM4M_G25_N4.parquet\n        # or set path as the URL directly\n        smiles_col: \"ordered_smiles\"\n        label_cols: graph_*  # graph_* means all columns starting with \"graph_\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: graph\n        splits_path: ../data/graphium/large-dataset/pcqm4m_g25_n4_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/pcqm4m_g25_n4_random_splits.pt`\n        # split_names: [train, val, test_seen]\n        label_normalization:\n          normalize_val_test: True\n          method: \"normal\"\n        epoch_sampling_fraction: 1.0\n\n      pcqm4m_n4:\n        df: null\n        df_path: ../data/graphium/large-dataset/PCQM4M_G25_N4.parquet\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/PCQM4M_G25_N4.parquet\n        # or set path as the URL directly\n        smiles_col: \"ordered_smiles\"\n        label_cols: node_* # node_* means all columns starting with \"node_\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: node\n        
splits_path: ../data/graphium/large-dataset/pcqm4m_g25_n4_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/pcqm4m_g25_n4_random_splits.pt`\n        # split_names: [train, val, test_seen]\n        seed: 42\n        label_normalization:\n          normalize_val_test: True\n          method: \"normal\"\n        epoch_sampling_fraction: 1.0"
  },
  {
    "path": "expts/hydra-configs/tasks/loss_metrics_datamodule/pcba_1328.yaml",
    "content": "# @package _global_\n\npredictor:\n  metrics_on_progress_bar:\n    pcba_1328: []\n  metrics_on_training_set:\n    pcba_1328: []\n  loss_fun:\n    pcba_1328: bce_logits_ipu\n\nmetrics:\n  pcba_1328:\n  # use auroc and averageprecision (non_ipu version) so tha nans are handled correctly\n    - name: auroc\n      metric: auroc\n      task: binary\n      multitask_handling: mean-per-label\n      target_nan_mask: ignore\n      threshold_kwargs: null\n    - name: avpr\n      metric: averageprecision\n      task: binary\n      multitask_handling: mean-per-label\n      target_nan_mask: ignore\n      threshold_kwargs: null\n\ndatamodule:\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      pcba_1328:\n        df: null\n        df_path: ../data/graphium/large-dataset/PCBA_1328_1564k.parquet\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/PCBA_1328_1564k.parquet\n        # or set path as the URL directly\n        smiles_col: \"SMILES\"\n        label_cols: assayID-*  # assayID-* means all columns starting with \"assayID-\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: graph\n        splits_path: ../data/graphium/large-dataset/pcba_1328_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/pcba_1328_random_splits.pt`\n        # split_names: [train, val, test_seen]\n        epoch_sampling_fraction: 1.0"
  },
  {
    "path": "expts/hydra-configs/tasks/loss_metrics_datamodule/pcqm4m.yaml",
    "content": "# @package _global_\n\n#Task-specific\npredictor:\n  metrics_on_progress_bar:\n    homolumo: []\n  metrics_on_training_set:\n    homolumo: [\"pearsonr\"]  \n  loss_fun:\n    homolumo: mae_ipu\n\n# Task-specific\nmetrics:\n  homolumo:\n    - name: mae\n      metric: mae_ipu\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n    - name: pearsonr\n      metric: pearsonr_ipu\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n\ndatamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  # module_type: \"FakeDataModule\"  # Option to use generated data\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      homolumo:\n        df: null\n        task_level: \"graph\"\n        df_path: graphium/data/PCQM4M/pcqm4mv2.csv\n        # wget https://storage.valencelabs.com/datasets-public-research/PCQM4M/cxsmiles/pcqm4mv2.csv\n        # or set path as https://storage.valencelabs.com/datasets-public-research/PCQM4M/cxsmiles/pcqm4mv2.csv directly\n        smiles_col: \"cxsmiles\"\n        label_cols: [\"homo_lumo_gap\"]\n        # sample_size: 8000 # use sample_size for test\n        splits_path: graphium/data/PCQM4M/split_dict_v2.pt  # Download with `wget https://storage.valencelabs.com/datasets-public-research/PCQM4M/cxsmiles/split_dict_v2.pt`\n        split_names: [\"train\", \"valid\", \"test-dev\"]\n        # graphium/data/PCQM4Mv2/split_dict.pt\n        # graphium/data/PCQM4Mv2/pcqm4m_split.csv\n        # split_val: 0.1\n        # split_test: 0.1\n        seed: ${constants.seed}\n        label_normalization:\n          method: \"normal\"\n"
  },
  {
    "path": "expts/hydra-configs/tasks/loss_metrics_datamodule/pcqm4m_g25.yaml",
    "content": "# @package _global_\n\npredictor:\n  metrics_on_progress_bar:\n    pcqm4m_g25: []\n  metrics_on_training_set:\n    pcqm4m_g25: []\n  loss_fun:\n    pcqm4m_g25: mae_ipu\n\nmetrics:\n  pcqm4m_g25:\n    - name: mae\n      metric: mae_ipu\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n    - name: pearsonr\n      metric: pearsonr_ipu\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n    - name: r2\n      metric: r2_score_ipu\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n\ndatamodule:\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      pcqm4m_g25:\n        df: null\n        df_path: ../data/graphium/large-dataset/PCQM4M_G25_N4.parquet\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/PCQM4M_G25_N4.parquet\n        # or set path as the URL directly\n        smiles_col: \"ordered_smiles\"\n        label_cols: graph_*  # graph_* means all columns starting with \"graph_\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: graph\n        splits_path: ../data/graphium/large-dataset/pcqm4m_g25_n4_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/pcqm4m_g25_n4_random_splits.pt`\n        # split_names: [train, val, test_seen]\n        label_normalization:\n          normalize_val_test: True\n          method: \"normal\"\n        epoch_sampling_fraction: 1.0"
  },
  {
    "path": "expts/hydra-configs/tasks/loss_metrics_datamodule/pcqm4m_n4.yaml",
    "content": "# @package _global_\n\npredictor:\n  metrics_on_progress_bar:\n    pcqm4m_n4: []\n  metrics_on_training_set:\n    pcqm4m_n4: []\n  loss_fun:\n    pcqm4m_n4: mae_ipu\n\nmetrics:\n  pcqm4m_n4:\n    - name: mae\n      metric: mae_ipu\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n    - name: pearsonr\n      metric: pearsonr_ipu\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n    - name: r2\n      metric: r2_score_ipu\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n\ndatamodule:\n      pcqm4m_n4:\n        df: null\n        df_path: ../data/graphium/large-dataset/PCQM4M_G25_N4.parquet\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/PCQM4M_G25_N4.parquet\n        # or set path as the URL directly\n        smiles_col: \"ordered_smiles\"\n        label_cols: node_* # node_* means all columns starting with \"node_\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: node\n        splits_path: ../data/graphium/large-dataset/pcqm4m_g25_n4_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/pcqm4m_g25_n4_random_splits.pt`\n        # split_names: [train, val, test_seen]\n        seed: 42\n        label_normalization:\n          normalize_val_test: True\n          method: \"normal\"\n        epoch_sampling_fraction: 1.0"
  },
  {
    "path": "expts/hydra-configs/tasks/loss_metrics_datamodule/toymix.yaml",
    "content": "# @package _global_\n\npredictor:\n  metrics_on_progress_bar:\n    qm9: [\"mae\"]\n    tox21: [\"auroc\"]\n    zinc: [\"mae\"]\n  loss_fun:\n    qm9: mae_ipu\n    tox21: bce_logits_ipu\n    zinc: mae_ipu\n\nmetrics:\n  qm9: &qm9_metrics\n    - name: mae\n      metric: mae_ipu\n      target_nan_mask: null\n      multitask_handling: flatten\n      threshold_kwargs: null\n    - name: pearsonr\n      metric: pearsonr_ipu\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n    - name: r2_score\n      metric: r2_score_ipu\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n  tox21:\n    - name: auroc\n      metric: auroc_ipu\n      task: binary\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n    - name: avpr\n      metric: average_precision_ipu\n      task: binary\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n    - name: f1 > 0.5\n      metric: f1\n      multitask_handling: mean-per-label\n      target_to_int: True\n      num_classes: 2\n      average: micro\n      threshold_kwargs: &threshold_05\n        operator: greater\n        threshold: 0.5\n        th_on_preds: True\n        th_on_target: True\n    - name: precision > 0.5\n      metric: precision\n      multitask_handling: mean-per-label\n      average: micro\n      threshold_kwargs: *threshold_05\n  zinc: *qm9_metrics\n\ndatamodule:\n  args:\n    task_specific_args:\n      qm9:\n        df: null\n        df_path: ${constants.data_dir}/qm9.csv.gz\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Small-dataset/qm9.csv.gz\n        # or set path as the URL directly\n        smiles_col: \"smiles\"\n        label_cols: [\"A\", \"B\", \"C\", \"mu\", \"alpha\", \"homo\", \"lumo\", \"gap\", \"r2\", \"zpve\", \"u0\", \"u298\", \"h298\", \"g298\", \"cv\", \"u0_atom\", \"u298_atom\", \"h298_atom\", \"g298_atom\"]\n        # 
sample_size: 2000 # use sample_size for test\n        splits_path: ${constants.data_dir}/qm9_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Small-dataset/qm9_random_splits.pt`\n        seed: ${constants.seed} #*seed\n        task_level: graph\n        label_normalization:\n          normalize_val_test: True\n          method: \"normal\"\n\n      tox21:\n        df: null\n        df_path: ${constants.data_dir}/Tox21-7k-12-labels.csv.gz\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Small-dataset/Tox21-7k-12-labels.csv.gz\n        # or set path as the URL directly\n        smiles_col: \"smiles\"\n        label_cols: [\"NR-AR\", \"NR-AR-LBD\", \"NR-AhR\", \"NR-Aromatase\", \"NR-ER\", \"NR-ER-LBD\", \"NR-PPAR-gamma\", \"SR-ARE\", \"SR-ATAD5\", \"SR-HSE\", \"SR-MMP\", \"SR-p53\"]\n        # sample_size: 2000 # use sample_size for test\n        splits_path: ${constants.data_dir}/Tox21_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Small-dataset/Tox21_random_splits.pt`\n        seed: ${constants.seed}\n        task_level: graph\n\n      zinc:\n        df: null\n        df_path: ${constants.data_dir}/ZINC12k.csv.gz\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Small-dataset/ZINC12k.csv.gz\n        # or set path as the URL directly\n        smiles_col: \"smiles\"\n        label_cols: [\"SA\", \"logp\", \"score\"]\n        # sample_size: 2000 # use sample_size for test\n        splits_path: ${constants.data_dir}/ZINC12k_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Small-dataset/ZINC12k_random_splits.pt`\n        seed: ${constants.seed}\n        task_level: graph\n        label_normalization:\n          normalize_val_test: True\n          method: \"normal\""
  },
  {
    "path": "expts/hydra-configs/tasks/pcba_1328.yaml",
    "content": "# NOTE: We cannot have a single config, since for fine-tuning we will\n#  only want to override the loss_metrics_datamodule, whereas for training we will\n#  want to override both. \n\ndefaults:\n  - task_heads: pcba_1328\n  - loss_metrics_datamodule: pcba_1328"
  },
  {
    "path": "expts/hydra-configs/tasks/pcqm4m.yaml",
    "content": "# NOTE: We cannot have a single config, since for fine-tuning we will\n#  only want to override the loss_metrics_datamodule, whereas for training we will\n#  want to override both. \n\ndefaults:\n  - task_heads: pcqm4m\n  - loss_metrics_datamodule: pcqm4m"
  },
  {
    "path": "expts/hydra-configs/tasks/pcqm4m_g25.yaml",
    "content": "# NOTE: We cannot have a single config, since for fine-tuning we will\n#  only want to override the loss_metrics_datamodule, whereas for training we will\n#  want to override both. \n\ndefaults:\n  - task_heads: pcqm4m_g25\n  - loss_metrics_datamodule: pcqm4m_g25"
  },
  {
    "path": "expts/hydra-configs/tasks/pcqm4m_n4.yaml",
    "content": "# NOTE: We cannot have a single config, since for fine-tuning we will\n#  only want to override the loss_metrics_datamodule, whereas for training we will\n#  want to override both. \n\ndefaults:\n  - task_heads: pcqm4m_n4\n  - loss_metrics_datamodule: pcqm4m_n4"
  },
  {
    "path": "expts/hydra-configs/tasks/task_heads/admet.yaml",
    "content": "# @package _global_\n\narchitecture:\n  task_heads:\n    caco2_wang: &regression_head\n      task_level: graph\n      out_dim: 1\n      hidden_dims: 64\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: &dropout 0.5\n      normalization: &normalization \"layer_norm\"\n      last_normalization: \"none\"\n      residual_type: none\n    hia_hou: &classification_head\n      task_level: graph\n      out_dim: 1\n      hidden_dims: 64\n      depth: 2\n      activation: relu\n      last_activation: sigmoid\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n    pgp_broccatelli: *classification_head\n    bioavailability_ma: *classification_head\n    lipophilicity_astrazeneca: *regression_head\n    solubility_aqsoldb: *regression_head\n    bbb_martins: *classification_head\n    ppbr_az: *regression_head\n    vdss_lombardo: *regression_head\n    cyp2d6_veith: *classification_head\n    cyp3a4_veith: *classification_head\n    cyp2c9_veith: *classification_head\n    cyp2d6_substrate_carbonmangels: *classification_head\n    cyp3a4_substrate_carbonmangels: *classification_head\n    cyp2c9_substrate_carbonmangels: *classification_head\n    half_life_obach: *regression_head\n    clearance_microsome_az: *regression_head\n    clearance_hepatocyte_az: *regression_head\n    herg: *classification_head\n    ames: *classification_head\n    dili: *classification_head\n    ld50_zhu: *regression_head\n    "
  },
  {
    "path": "expts/hydra-configs/tasks/task_heads/l1000_mcf7.yaml",
    "content": "# @package _global_\n\narchitecture:\n  task_heads:\n    l1000_mcf7:\n      task_level: graph\n      out_dim: 2934\n      hidden_dims: 128\n      depth: 2\n      activation: none\n      last_activation: none\n      dropout: ${architecture.pre_nn.dropout}\n      normalization: ${architecture.pre_nn.normalization}\n      last_normalization: \"none\"\n      residual_type: none"
  },
  {
    "path": "expts/hydra-configs/tasks/task_heads/l1000_vcap.yaml",
    "content": "# @package _global_\n\narchitecture:\n  task_heads:\n    l1000_vcap:\n      task_level: graph\n      out_dim: 2934\n      hidden_dims: 128\n      depth: 2\n      activation: none\n      last_activation: none\n      dropout: ${architecture.pre_nn.dropout}\n      normalization: ${architecture.pre_nn.normalization}\n      last_normalization: \"none\"\n      residual_type: none"
  },
  {
    "path": "expts/hydra-configs/tasks/task_heads/largemix.yaml",
    "content": "# @package _global_\n\narchitecture:\n  task_heads:\n    l1000_vcap:\n      task_level: graph\n      out_dim: 2934\n      hidden_dims: 256\n      depth: 2\n      activation: none\n      last_activation: none\n      dropout: ${architecture.pre_nn.dropout}\n      normalization: ${architecture.pre_nn.normalization}\n      last_normalization: \"none\"\n      residual_type: none\n    l1000_mcf7:\n      task_level: graph\n      out_dim: 2934\n      hidden_dims: 256\n      depth: 2\n      activation: none\n      last_activation: none\n      dropout: ${architecture.pre_nn.dropout}\n      normalization: ${architecture.pre_nn.normalization}\n      last_normalization: \"none\"\n      residual_type: none\n    pcba_1328:\n      task_level: graph\n      out_dim: 1328\n      hidden_dims: 128\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: ${architecture.pre_nn.dropout}\n      normalization: ${architecture.pre_nn.normalization}\n      last_normalization: \"none\"\n      residual_type: none\n    pcqm4m_g25:\n      task_level: graph\n      out_dim: 25\n      hidden_dims: 64\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: ${architecture.pre_nn.dropout}\n      normalization: ${architecture.pre_nn.normalization}\n      last_normalization: \"none\"\n      residual_type: none\n    pcqm4m_n4:\n      task_level: node\n      out_dim: 4\n      hidden_dims: 64\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: ${architecture.pre_nn.dropout}\n      normalization: ${architecture.pre_nn.normalization}\n      last_normalization: \"none\"\n      residual_type: none"
  },
  {
    "path": "expts/hydra-configs/tasks/task_heads/pcba_1328.yaml",
    "content": "# @package _global_\n\narchitecture:\n  task_heads:\n    pcba_1328:\n      task_level: graph\n      out_dim: 1328\n      hidden_dims: 64\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: ${architecture.pre_nn.dropout}\n      normalization: ${architecture.pre_nn.normalization}\n      last_normalization: \"none\"\n      residual_type: none"
  },
  {
    "path": "expts/hydra-configs/tasks/task_heads/pcqm4m.yaml",
    "content": "# @package _global_\n\narchitecture:\n  task_heads:\n    homolumo:\n      task_level: graph\n      out_dim: 1\n      hidden_dims: 256\n      depth: 2                          # Not needed if we have hidden_dims\n      activation: relu\n      last_activation: none\n      dropout: 0.18\n      normalization: layer_norm\n      last_normalization: \"none\"\n      residual_type: none\n"
  },
  {
    "path": "expts/hydra-configs/tasks/task_heads/pcqm4m_g25.yaml",
    "content": "# @package _global_\n\narchitecture:\n  task_heads:\n    pcqm4m_g25:\n      task_level: graph\n      out_dim: 25\n      hidden_dims: 32\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: ${architecture.pre_nn.dropout}\n      normalization: ${architecture.pre_nn.normalization}\n      last_normalization: \"none\"\n      residual_type: none"
  },
  {
    "path": "expts/hydra-configs/tasks/task_heads/pcqm4m_n4.yaml",
    "content": "# @package _global_\n\narchitecture:\n  task_heads:\n    pcqm4m_n4:\n      task_level: node\n      out_dim: 4\n      hidden_dims: 32\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: ${architecture.pre_nn.dropout}\n      normalization: ${architecture.pre_nn.normalization}\n      last_normalization: \"none\"\n      residual_type: none"
  },
  {
    "path": "expts/hydra-configs/tasks/task_heads/toymix.yaml",
    "content": "# @package _global_\n\narchitecture:\n  task_heads:\n    qm9:\n      task_level: graph\n      out_dim: 19\n      hidden_dims: 128\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: ${architecture.pre_nn.dropout}\n      normalization: ${architecture.pre_nn.normalization}\n      last_normalization: \"none\"\n      residual_type: none\n    tox21:\n      task_level: graph\n      out_dim: 12\n      hidden_dims: 64\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: ${architecture.pre_nn.dropout}\n      normalization: ${architecture.pre_nn.normalization}\n      last_normalization: \"none\"\n      residual_type: none\n    zinc:\n      task_level: graph\n      out_dim: 3\n      hidden_dims: 32\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: ${architecture.pre_nn.dropout}\n      normalization: ${architecture.pre_nn.normalization}\n      last_normalization: \"none\"\n      residual_type: none\n"
  },
  {
    "path": "expts/hydra-configs/tasks/toymix.yaml",
    "content": "# NOTE: We cannot have a single config, since for fine-tuning we will\n#  only want to override the loss_metrics_datamodule, whereas for training we will\n#  want to override both. \n\ndefaults:\n  - task_heads: toymix\n  - loss_metrics_datamodule: toymix"
  },
  {
    "path": "expts/hydra-configs/training/accelerator/largemix_cpu.yaml",
    "content": "# @package _global_\n\ndatamodule:\n  args:\n    batch_size_training: 200\n    batch_size_inference: 200\n    featurization_n_jobs: 20\n    num_workers: 20\n\npredictor:\n  metrics_every_n_train_steps: 1000\n  torch_scheduler_kwargs:\n    max_num_epochs: ${constants.max_epochs}\n\ntrainer:\n  trainer:\n    precision: 32\n    accumulate_grad_batches: 2\n    max_epochs: ${constants.max_epochs}"
  },
  {
    "path": "expts/hydra-configs/training/accelerator/largemix_gpu.yaml",
    "content": "# @package _global_\n\naccelerator:\n  float32_matmul_precision: medium\n\ndatamodule:\n  args:\n    batch_size_training: 2048\n    batch_size_inference: 2048\n    featurization_n_jobs: 6\n    num_workers: 6\n\npredictor:\n  metrics_every_n_train_steps: 1000\n  torch_scheduler_kwargs:\n    max_num_epochs: ${constants.max_epochs}\n\ntrainer:\n  trainer:\n    precision: 16-mixed\n    max_epochs: ${constants.max_epochs}"
  },
  {
    "path": "expts/hydra-configs/training/accelerator/largemix_ipu.yaml",
    "content": "# @package _global_\n\ndatamodule:\n    args:\n      ipu_dataloader_training_opts:\n        mode: async\n        max_num_nodes_per_graph: 30 # train max nodes: 20, max_edges: 54\n        max_num_edges_per_graph: 100\n      ipu_dataloader_inference_opts:\n        mode: async\n        max_num_nodes_per_graph: 35 # valid max nodes: 51, max_edges: 118\n        max_num_edges_per_graph: 100\n      # Data handling-related\n      batch_size_training: 30\n      batch_size_inference: 30\n\npredictor:\n  optim_kwargs:\n    loss_scaling: 1024\n\ntrainer:\n  trainer:\n      precision: 16-true\n      accumulate_grad_batches: 2"
  },
  {
    "path": "expts/hydra-configs/training/accelerator/pcqm4m_ipu.yaml",
    "content": "# @package _global_\n\ndatamodule:\n  args:\n    ipu_dataloader_training_opts:\n      mode: async\n      max_num_nodes_per_graph: 16 # train max nodes: 20, max_edges: 54\n      max_num_edges_per_graph: 60\n    ipu_dataloader_inference_opts:\n      mode: async\n      max_num_nodes_per_graph: 30 # valid max nodes: 51, max_edges: 118\n      max_num_edges_per_graph: 120\n    # Data handling-related\n    batch_size_inference: 16\n\npredictor:\n  metrics_every_n_train_steps: 100\n  optim_kwargs:\n    loss_scaling: 1024\n\ntrainer:\n  trainer:\n    precision: 16-true\n"
  },
  {
    "path": "expts/hydra-configs/training/accelerator/toymix_cpu.yaml",
    "content": "# @package _global_\n\ndatamodule:\n  args:\n    batch_size_training: 200\n    batch_size_inference: 200\n    featurization_n_jobs: 4\n    num_workers: 4\n\npredictor:\n  optim_kwargs: {}\n  metrics_every_n_train_steps: 300\n  torch_scheduler_kwargs:\n    max_num_epochs: ${constants.max_epochs}\n\ntrainer:\n  trainer:\n    precision: 32\n    accumulate_grad_batches: 1\n    max_epochs: ${constants.max_epochs}"
  },
  {
    "path": "expts/hydra-configs/training/accelerator/toymix_gpu.yaml",
    "content": "# @package _global_\n\naccelerator:\n  float32_matmul_precision: medium\n\ndatamodule:\n  args:\n    batch_size_training: 200\n    batch_size_inference: 200\n    featurization_n_jobs: 4\n    num_workers: 4\n\npredictor:\n  optim_kwargs: {}\n  metrics_every_n_train_steps: 300\n  torch_scheduler_kwargs:\n    max_num_epochs: ${constants.max_epochs}\n\ntrainer:\n  trainer:\n    accumulate_grad_batches: 1\n    max_epochs: ${constants.max_epochs}"
  },
  {
    "path": "expts/hydra-configs/training/accelerator/toymix_ipu.yaml",
    "content": "# @package _global_\n\ndatamodule:\n  args:\n    ipu_dataloader_training_opts:\n      mode: async\n      max_num_nodes_per_graph: 44 # train max nodes: 20, max_edges: 54\n      max_num_edges_per_graph: 80\n    ipu_dataloader_inference_opts:\n      mode: async\n      max_num_nodes_per_graph: 44 # valid max nodes: 51, max_edges: 118\n      max_num_edges_per_graph: 80\n    # Data handling-related\n    batch_size_training: 50\n    batch_size_inference: 50\n\npredictor:\n  optim_kwargs:\n    loss_scaling: 1024\n\ntrainer:\n  trainer:\n    accumulate_grad_batches: 4"
  },
  {
    "path": "expts/hydra-configs/training/largemix.yaml",
    "content": "# @package _global_\n\npredictor:\n  random_seed: ${constants.seed}\n  optim_kwargs:\n    lr: 1.e-4 # warmup can be scheduled using torch_scheduler_kwargs\n    # weight_decay: 1.e-7\n  torch_scheduler_kwargs:\n    module_type: WarmUpLinearLR\n    max_num_epochs: &max_epochs 100\n    warmup_epochs: 5\n    verbose: False\n  scheduler_kwargs:\n  target_nan_mask: null # null: no mask, 0: 0 mask, ignore-flatten, ignore-mean-per-label\n  multitask_handling: flatten # flatten, mean-per-label\n\ntrainer:\n  seed: ${constants.seed}\n  logger:\n    save_dir: logs/neurips2023-large/\n    name: ${constants.name}\n    project: ${constants.name}\n  model_checkpoint:\n    dirpath: model_checkpoints/large-dataset/${now:%Y-%m-%d_%H-%M-%S}/\n    filename: ${constants.name}\n    save_last: True         # saving last model\n    # save_top_k: 1           # and best model\n    # monitor: loss/val       # wrt validation loss\n  trainer:\n    precision: 16-mixed\n    max_epochs: ${predictor.torch_scheduler_kwargs.max_num_epochs}\n    min_epochs: 1\n    check_val_every_n_epoch: 10"
  },
  {
    "path": "expts/hydra-configs/training/model/largemix_gated_gcn.yaml",
    "content": "# @package _global_\n\nconstants:\n  name: large_data_gated_gcn\n  wandb:\n    name: ${constants.name}\n    project: neurips2023-expts\n    entity: multitask-gnn\n  entity: multitask-gnn\n  seed: 42\n  max_epochs: 200\n  data_dir: ../data/graphium/large-dataset/\n  raise_train_error: true\n  datacache_path: ../datacache/large-dataset/\n  gnn_dim: 512\n  gnn_edge_dim: 128\n  norm: \"layer_norm\"\n\ntrainer:\n  model_checkpoint:\n    dirpath: model_checkpoints/large-dataset/gated_gcn/${now:%Y-%m-%d_%H-%M-%S}/"
  },
  {
    "path": "expts/hydra-configs/training/model/largemix_gcn.yaml",
    "content": "# @package _global_\n\nconstants:\n  name: large_data_gcn\n  wandb:\n    name: ${constants.name}\n    project: neurips2023-expts\n    entity: multitask-gnn\n  entity: multitask-gnn\n  seed: 42\n  max_epochs: 200\n  data_dir: ../data/graphium/large-dataset/\n  raise_train_error: true\n  datacache_path: ../datacache/large-dataset/\n  norm: \"layer_norm\"\n\ntrainer:\n  model_checkpoint:\n    dirpath: model_checkpoints/large-dataset/gcn/${now:%Y-%m-%d_%H-%M-%S}/"
  },
  {
    "path": "expts/hydra-configs/training/model/largemix_gin.yaml",
    "content": "# @package _global_\n\nconstants:\n  name: large_data_gin\n  wandb:\n    name: ${constants.name}\n    project: neurips2023-expts\n    entity: multitask-gnn\n  entity: multitask-gnn\n  seed: 42\n  max_epochs: 200\n  data_dir: ../data/graphium/large-dataset/\n  raise_train_error: true\n  datacache_path: ../datacache/large-dataset/\n  norm: \"layer_norm\"\n\ntrainer:\n  model_checkpoint:\n    dirpath: model_checkpoints/large-dataset/gin/${now:%Y-%m-%d_%H-%M-%S}/"
  },
  {
    "path": "expts/hydra-configs/training/model/largemix_gine.yaml",
    "content": "# @package _global_\n\nconstants:\n  name: large_data_gine\n  wandb:\n    name: ${constants.name}\n    project: neurips2023-expts\n    entity: multitask-gnn\n  entity: multitask-gnn\n  seed: 42\n  max_epochs: 200\n  data_dir: ../data/graphium/large-dataset/\n  raise_train_error: true\n  datacache_path: ../datacache/large-dataset/\n  gnn_dim: 512\n  gnn_edge_dim: 32\n  norm: \"layer_norm\"\n\ntrainer:\n  model_checkpoint:\n    dirpath: model_checkpoints/large-dataset/gine/${now:%Y-%m-%d_%H-%M-%S}/"
  },
  {
    "path": "expts/hydra-configs/training/model/largemix_mpnn.yaml",
    "content": "# @package _global_\n\nconstants:\n  name: large_data_mpnn\n  wandb:\n    name: ${constants.name}\n    project: neurips2023-expts\n    entity: multitask-gnn\n  entity: multitask-gnn\n  seed: 42\n  max_epochs: 200\n  data_dir: ../data/graphium/large-dataset/\n  raise_train_error: true\n  datacache_path: ../datacache/large-dataset/\n  gnn_dim: 512\n  gnn_edge_dim: 256\n  norm: \"layer_norm\"\n\ntrainer:\n  model_checkpoint:\n    dirpath: model_checkpoints/large-dataset/mpnn/${now:%Y-%m-%d_%H-%M-%S}/\n"
  },
  {
    "path": "expts/hydra-configs/training/model/pcqm4m_gpspp.yaml",
    "content": "# @package _global_\n\n# GPS++ model with the PCQMv2 dataset.\nconstants:\n  name: pcqm4mv2_gpspp_4layer\n  seed: 42\n  max_epochs: 100\n  raise_train_error: true   # Whether the code should raise an error if it crashes during training\n  datacache_path: \"/localdata/PCQM4Mv2/\"\n\ntrainer:\n  model_checkpoint:\n    dirpath: models_checkpoints/PCMQ4Mv2/gpspp/${now:%Y-%m-%d_%H-%M-%S}/\n"
  },
  {
    "path": "expts/hydra-configs/training/model/pcqm4m_mpnn.yaml",
    "content": "# @package _global_\n\n# MPNN model with the PCQMv2 dataset.\nconstants:\n  name: pcqm4mv2_mpnn_4layer\n  seed: 42\n  max_epochs: 100\n  raise_train_error: true   # Whether the code should raise an error if it crashes during training\n  datacache_path: \"/localdata/PCQM4Mv2/\"\n\ntrainer:\n  model_checkpoint:\n    dirpath: models_checkpoints/PCMQ4Mv2/mpnn/${now:%Y-%m-%d_%H-%M-%S}/\n"
  },
  {
    "path": "expts/hydra-configs/training/model/toymix_gcn.yaml",
    "content": "# @package _global_\n\nconstants:\n  name: neurips2023_small_data_gcn\n  seed: 42\n  max_epochs: 100\n  data_dir: expts/data/neurips2023/small-dataset\n  raise_train_error: true\n  datacache_path: ../datacache/neurips2023-small/\n\ntrainer:\n  model_checkpoint:\n    dirpath: models_checkpoints/small-dataset/gcn/${now:%Y-%m-%d_%H-%M-%S}/"
  },
  {
    "path": "expts/hydra-configs/training/model/toymix_gin.yaml",
    "content": "# @package _global_\n\nconstants:\n  name: neurips2023_small_data_gin\n  seed: 42\n  max_epochs: 100\n  data_dir: expts/data/neurips2023/small-dataset\n  raise_train_error: true\n  datacache_path: ../datacache/neurips2023-small/\n\ntrainer:\n  model_checkpoint:\n    dirpath: models_checkpoints/neurips2023-small-gin/${now:%Y-%m-%d_%H-%M-%S}/"
  },
  {
    "path": "expts/hydra-configs/training/pcqm4m.yaml",
    "content": "# @package _global_\n\npredictor:\n  random_seed: ${constants.seed}\n  optim_kwargs:\n    lr: 4.e-4 # warmup can be scheduled using torch_scheduler_kwargs\n    # weight_decay: 1.e-7\n  torch_scheduler_kwargs:\n    module_type: WarmUpLinearLR\n    max_num_epochs: ${constants.max_epochs}\n    warmup_epochs: 10\n    verbose: False\n  scheduler_kwargs:\n  #  monitor: &monitor homolumo/mae/train\n  #  mode: min\n  #  frequency: 1\n  target_nan_mask: null # null: no mask, 0: 0 mask, ignore: ignore nan values from loss\n  flag_kwargs:\n    n_steps: 0 # 1\n    alpha: 0.0 # 0.01\n\n\ntrainer:\n  seed: ${constants.seed}\n  #early_stopping:\n  #  monitor: *monitor\n  #  min_delta: 0\n  #  patience: 10\n  #  mode: &mode min\n  model_checkpoint:\n    dirpath: models_checkpoints/PCMQ4Mv2/${now:%Y-%m-%d_%H-%M-%S}/\n    filename: ${constants.name}\n    #monitor: *monitor\n    #mode: *mode\n    save_top_k: 1\n    every_n_epochs: 100\n  trainer:\n    max_epochs: ${constants.max_epochs}\n    min_epochs: 1\n    check_val_every_n_epoch: 20\n"
  },
  {
    "path": "expts/hydra-configs/training/toymix.yaml",
    "content": "# @package _global_\n\npredictor:\n  random_seed: ${constants.seed}\n  optim_kwargs:\n    lr: 4.e-5 # warmup can be scheduled using torch_scheduler_kwargs\n    # weight_decay: 1.e-7\n  torch_scheduler_kwargs:\n    module_type: WarmUpLinearLR\n    max_num_epochs: ${constants.max_epochs}\n    warmup_epochs: 10\n    verbose: False\n  scheduler_kwargs: null\n  target_nan_mask: null\n  multitask_handling: flatten # flatten, mean-per-label\n\ntrainer:\n  seed: ${constants.seed}\n  model_checkpoint:\n    filename: ${constants.name}\n    save_last: True\n  trainer:\n    precision: 16\n    max_epochs: ${constants.max_epochs}\n    min_epochs: 1\n    check_val_every_n_epoch: 20"
  },
  {
    "path": "expts/main_run_get_fingerprints.py",
    "content": "# General imports\nimport os\nfrom os.path import dirname, abspath\nimport yaml\nimport numpy as np\nimport pandas as pd\nimport torch\nimport fsspec\nfrom lightning.pytorch.utilities.model_summary import ModelSummary\n\n# Current project imports\nimport graphium\nfrom graphium.config._loader import load_datamodule, load_trainer\nfrom graphium.utils.fs import mkdir\nfrom graphium.trainer.predictor import PredictorModule\n\n\n# Set up the working directory\nMAIN_DIR = dirname(dirname(abspath(graphium.__file__)))\nos.chdir(MAIN_DIR)\n\n\n# MODEL_FILE = \"models_checkpoints/micro_ZINC/model.ckpt\"\n# CONFIG_FILE = \"expts/config_micro_ZINC.yaml\"\n\n\ndef main() -> None:\n    LIST_CONCAT_LAST_LAYERS = [1, 0, [1, 2], [0, 1, 2]]\n    DATA_NAME_ALL = [\"molbace\"]  # , \"mollipo\", \"moltox21\", \"molHIV\"]\n    MODEL_PATH = \"https://storage.valencelabs.com/graphium/pretrained-models\"\n    MODEL_NAME = \"graphium-zinc-micro-dummy-test\"\n    MODEL_FILE = f\"{MODEL_PATH}/{MODEL_NAME}/model.ckpt\"\n    MODEL_CONFIG = f\"{MODEL_PATH}/{MODEL_NAME}/configs.yaml\"\n\n    predictor = PredictorModule.load_from_checkpoint(MODEL_FILE)\n\n    print(predictor.model)\n    print(ModelSummary(predictor, max_depth=4))\n\n    for data_name in DATA_NAME_ALL:\n        DATA_CONFIG = f\"{MAIN_DIR}/expts/config_{data_name}_pretrained.yaml\"\n\n        with fsspec.open(DATA_CONFIG, \"r\") as f:\n            data_cfg = yaml.safe_load(f)\n        with fsspec.open(os.path.join(MODEL_CONFIG), \"r\") as f:\n            model_cfg = yaml.safe_load(f)\n\n        # Load and initialize the dataset\n        data_cfg[\"datamodule\"][\"args\"][\"featurization\"] = model_cfg[\"datamodule\"][\"args\"][\"featurization\"]\n        datamodule = load_datamodule(data_cfg)\n        print(\"\\ndatamodule:\\n\", datamodule, \"\\n\")\n\n        for concat_last_layers in LIST_CONCAT_LAST_LAYERS:\n            export_dir = f\"predictions/fingerprints-model-{MODEL_NAME}\"\n            export_df_path = 
f\"{export_dir}/{data_name}-concatlayers-{concat_last_layers}.csv.gz\"\n\n            predictor.model.concat_last_layers = concat_last_layers\n            trainer = load_trainer(data_cfg)\n\n            # Run the model prediction\n            preds = trainer.predict(model=predictor, datamodule=datamodule)\n            if isinstance(preds[0], torch.Tensor):\n                preds = [p.detach().cpu().numpy() for p in preds]\n            preds = np.concatenate(preds, axis=0)\n\n            # Generate output dataframe\n            df = {\"SMILES\": datamodule.dataset.smiles}\n\n            target = datamodule.dataset.labels\n            for ii in range(target.shape[1]):\n                df[f\"Target-{ii}\"] = target[:, ii]\n\n            for ii in range(preds.shape[1]):\n                df[f\"Preds-{ii}\"] = preds[:, ii]\n            df = pd.DataFrame(df)\n            mkdir(export_dir)\n            df.to_csv(export_df_path)\n            print(df)\n            print(f\"file saved to:`{export_df_path}`\")\n\n\nif __name__ == \"__main__\":\n    main()\n"
  },
  {
    "path": "expts/main_run_multitask.py",
    "content": "import hydra\nfrom omegaconf import DictConfig\n\n\n@hydra.main(version_base=None, config_path=\"hydra-configs\", config_name=\"main\")\ndef main(cfg: DictConfig) -> None:\n    raise DeprecationWarning(\n        \"This script is deprecated. Use `python graphium/cli/train_finetune.py` (or `graphium-train`) instead!\"\n    )\n\n\nif __name__ == \"__main__\":\n    main()\n"
  },
  {
    "path": "expts/main_run_predict.py",
    "content": "# General imports\nimport os\nfrom os.path import dirname, abspath\nimport yaml\nfrom copy import deepcopy\nfrom omegaconf import DictConfig\nimport numpy as np\nimport pandas as pd\nimport torch\nfrom lightning.pytorch.utilities.model_summary import ModelSummary\n\n# Current project imports\nimport graphium\nfrom graphium.config._loader import load_datamodule, load_trainer\nfrom graphium.utils.fs import mkdir\nfrom graphium.trainer.predictor import PredictorModule\n\n\n# Set up the working directory\nMAIN_DIR = dirname(dirname(abspath(graphium.__file__)))\nos.chdir(MAIN_DIR)\n\nDATA_NAME = \"molhiv\"\nMODEL_FILE = \"models_checkpoints/ogb-molpcba/model-v2.ckpt\"\nCONFIG_FILE = f\"expts/config_{DATA_NAME}_pretrained.yaml\"\n\n# MODEL_FILE = \"models_checkpoints/micro_ZINC/model.ckpt\"\n# CONFIG_FILE = \"expts/config_micro_ZINC.yaml\"\n\n\ndef main(cfg: DictConfig) -> None:\n    cfg = deepcopy(cfg)\n\n    # Load and initialize the dataset\n    datamodule = load_datamodule(cfg)\n    print(\"\\ndatamodule:\\n\", datamodule, \"\\n\")\n\n    export_df_path = f\"predictions/preds-{DATA_NAME}.csv.gz\"\n\n    predictor = PredictorModule.load_from_checkpoint(MODEL_FILE)\n\n    print(predictor.model)\n    print(ModelSummary(predictor, max_depth=4))\n\n    trainer = load_trainer(cfg)\n\n    # Run the model prediction\n    preds = trainer.predict(model=predictor, datamodule=datamodule)\n    if isinstance(preds[0], torch.Tensor):\n        preds = [p.detach().cpu().numpy() for p in preds]\n    preds = np.concatenate(preds, axis=0)\n\n    # Generate output dataframe\n    df = {\"SMILES\": datamodule.dataset.smiles}\n\n    target = datamodule.dataset.labels\n    for ii in range(target.shape[1]):\n        df[f\"Target-{ii}\"] = target[:, ii]\n\n    for ii in range(preds.shape[1]):\n        df[f\"Preds-{ii}\"] = preds[:, ii]\n    df = pd.DataFrame(df)\n    mkdir(\"predictions\")\n    df.to_csv(export_df_path)\n    print(df)\n    print(f\"file saved 
to:`{export_df_path}`\")\n\n\nif __name__ == \"__main__\":\n    with open(os.path.join(MAIN_DIR, CONFIG_FILE), \"r\") as f:\n        cfg = yaml.safe_load(f)\n    main(cfg)\n"
  },
  {
    "path": "expts/main_run_test.py",
    "content": "# General imports\nimport os\nfrom os.path import dirname, abspath\nimport yaml\nfrom copy import deepcopy\nfrom omegaconf import DictConfig\nfrom lightning.pytorch.utilities.model_summary import ModelSummary\n\n# Current project imports\nimport graphium\nfrom graphium.config._loader import load_datamodule, load_metrics, load_trainer\n\nfrom graphium.trainer.predictor import PredictorModule\n\n\n# Set up the working directory\nMAIN_DIR = dirname(dirname(abspath(graphium.__file__)))\nos.chdir(MAIN_DIR)\n\nMODEL_FILE = \"models_checkpoints/ogb-molpcba/model-v2.ckpt\"\n\nCONFIG_FILE = \"expts/config_molPCBA.yaml\"\n\n\ndef main(cfg: DictConfig) -> None:\n    cfg = deepcopy(cfg)\n\n    # Load and initialize the dataset\n    datamodule = load_datamodule(cfg)\n    print(\"\\ndatamodule:\\n\", datamodule, \"\\n\")\n\n    metrics = load_metrics(cfg)\n    print(metrics)\n\n    predictor = PredictorModule.load_from_checkpoint(MODEL_FILE)\n    predictor.metrics = metrics\n\n    print(predictor.model)\n    print(ModelSummary(predictor, max_depth=4))\n\n    trainer = load_trainer(cfg)\n\n    # Run the model testing\n    trainer.test(model=predictor, datamodule=datamodule, ckpt_path=MODEL_FILE)\n\n\nif __name__ == \"__main__\":\n    with open(os.path.join(MAIN_DIR, CONFIG_FILE), \"r\") as f:\n        cfg = yaml.safe_load(f)\n    main(cfg)\n"
  },
  {
    "path": "expts/neurips2023_configs/base_config/large.yaml",
    "content": "# @package _global_\n\nconstants:\n  seed: 42\n  raise_train_error: true   # Whether the code should raise an error if it crashes during training\n  entity: multitask-gnn\n  datacache_path: \"/localdata/neurips2023-large/\"\n\naccelerator:\n  type: ipu  # cpu or ipu or gpu\n  config_override:\n    datamodule:\n      args:\n        ipu_dataloader_training_opts:\n          mode: async\n          max_num_nodes_per_graph: 30 # train max nodes: 20, max_edges: 54\n          max_num_edges_per_graph: 100\n        ipu_dataloader_inference_opts:\n          mode: async\n          max_num_nodes_per_graph: 35 # valid max nodes: 51, max_edges: 118\n          max_num_edges_per_graph: 100\n        # Data handling-related\n        batch_size_training: 30\n        batch_size_inference: 30\n    predictor:\n      metrics_every_n_train_steps: 1000\n      optim_kwargs:\n        loss_scaling: 1024\n    trainer:\n      trainer:\n        precision: 16-true\n        accumulate_grad_batches: 2\n\n  ipu_config:\n    - deviceIterations(30) # IPU would require large batches to be ready for the model.\n    - replicationFactor(16)\n    # - enableProfiling(\"graph_analyser\")       # The folder where the profile will be stored\n    # - enableExecutableCaching(\"pop_compiler_cache\")\n    - TensorLocations.numIOTiles(128)\n    - _Popart.set(\"defaultBufferingDepth\", 96)\n    - Precision.enableStochasticRounding(True)\n    # - Precision.enableFloatingPointExceptions(True)\n\n  ipu_inference_config:\n  # set device iteration and replication factor to 1 during inference\n  # gradient accumulation was set to 1 in the code\n    - deviceIterations(1)\n    - replicationFactor(1)\n    - Precision.enableStochasticRounding(False)\n\n# accelerator:\n#   type: cpu  # cpu or ipu or gpu\n#   config_override:\n#     datamodule:\n#       args:\n#         batch_size_training: 64\n#         batch_size_inference: 256\n#     trainer:\n#       trainer:\n#         precision: 32\n#         
accumulate_grad_batches: 1\n\ndatamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  # module_type: \"FakeDataModule\"  # Option to use generated data\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      l1000_vcap:\n        df: null\n        df_path: graphium/data/neurips2023/large-dataset/LINCS_L1000_VCAP_0-2_th2.csv.gz\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/LINCS_L1000_VCAP_0-4.csv.gz\n        # or set path as the URL directly\n        smiles_col: \"SMILES\"\n        label_cols: geneID-*  # geneID-* means all columns starting with \"geneID-\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: graph\n        splits_path: graphium/data/neurips2023/large-dataset/l1000_vcap_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/l1000_vcap_random_splits.pt`\n        epoch_sampling_fraction: 1.0\n\n      l1000_mcf7:\n        df: null\n        df_path: graphium/data/neurips2023/large-dataset/LINCS_L1000_MCF7_0-2_th2.csv.gz\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/LINCS_L1000_MCF7_0-4.csv.gz\n        # or set path as the URL directly\n        smiles_col: \"SMILES\"\n        label_cols: geneID-*  # geneID-* means all columns starting with \"geneID-\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: graph\n        splits_path: graphium/data/neurips2023/large-dataset/l1000_mcf7_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/l1000_mcf7_random_splits.pt`\n        epoch_sampling_fraction: 1.0\n\n      pcba_1328:\n        df: null\n        df_path: graphium/data/neurips2023/large-dataset/PCBA_1328_1564k.parquet\n        # wget 
https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/PCBA_1328_1564k.parquet\n        # or set path as the URL directly\n        smiles_col: \"SMILES\"\n        label_cols: assayID-*  # assayID-* means all columns starting with \"assayID-\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: graph\n        splits_path: graphium/data/neurips2023/large-dataset/pcba_1328_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/pcba_1328_random_splits.pt`\n        epoch_sampling_fraction: 1.0\n\n      pcqm4m_g25:\n        df: null\n        df_path: graphium/data/neurips2023/large-dataset/PCQM4M_G25_N4.parquet\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/PCQM4M_G25_N4.parquet\n        # or set path as the URL directly\n        smiles_col: \"ordered_smiles\"\n        label_cols: graph_*  # graph_* means all columns starting with \"graph_\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: graph\n        splits_path: graphium/data/neurips2023/large-dataset/pcqm4m_g25_n4_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/pcqm4m_g25_n4_random_splits.pt`\n        label_normalization:\n          normalize_val_test: True\n          method: \"normal\"\n        epoch_sampling_fraction: 1.0\n\n      pcqm4m_n4:\n        df: null\n        df_path: graphium/data/neurips2023/large-dataset/PCQM4M_G25_N4.parquet\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/PCQM4M_G25_N4.parquet\n        # or set path as the URL directly\n        smiles_col: \"ordered_smiles\"\n        label_cols: node_* # node_* means all columns starting with \"node_\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: node\n        splits_path: 
graphium/data/neurips2023/large-dataset/pcqm4m_g25_n4_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/pcqm4m_g25_n4_random_splits.pt`\n        seed: 42\n        label_normalization:\n          normalize_val_test: True\n          method: \"normal\"\n        epoch_sampling_fraction: 1.0\n\n    # Featurization\n    prepare_dict_or_graph: pyg:graph\n    featurization_n_jobs: 30\n    featurization_progress: True\n    featurization_backend: \"loky\"\n    dataloading_from: disk\n    processed_graph_data_path: ${constants.datacache_path}\n    featurization:\n    # OGB: ['atomic_num', 'degree', 'possible_formal_charge', 'possible_numH' (total-valence),\n    # 'possible_number_radical_e', 'possible_is_aromatic', 'possible_is_in_ring',\n    # 'num_chiral_centers (not included yet)']\n      atom_property_list_onehot: [atomic-number, group, period, total-valence]\n      atom_property_list_float: [degree, formal-charge, radical-electron, aromatic, in-ring]\n      # OGB: ['possible_bond_type', 'possible_bond_stereo', 'possible_is_in_ring']\n      edge_property_list: [bond-type-onehot, stereo, in-ring]\n      add_self_loop: False\n      explicit_H: False # whether to include explicit hydrogens\n      use_bonds_weights: False\n      pos_encoding_as_features: # encoder dropout 0.18\n        pos_types:\n          lap_eigvec:\n            pos_level: node\n            pos_type: laplacian_eigvec\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # whether to include eigenvalues/eigenvectors of disconnected graphs\n          lap_eigval:\n            pos_level: node\n            pos_type: laplacian_eigval\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # whether to include eigenvalues/eigenvectors of disconnected graphs\n          rw_pos: # use same name as 
pe_encoder\n            pos_level: node\n            pos_type: rw_return_probs\n            ksteps: 16\n\n    num_workers: 32 # -1 to use all\n    persistent_workers: True # whether to keep dataloader workers alive between epochs.\n    # Setting persistent_workers to false might make the start of each epoch very long.\n\n\narchitecture:\n  model_type: FullGraphMultiTaskNetwork\n  mup_base_path: null\n  pre_nn:   # Set as null to avoid a pre-nn network\n    out_dim: 64\n    hidden_dims: 256\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.1\n    normalization: &normalization layer_norm\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges: null\n\n  pe_encoders:\n    out_dim: 32\n    pool: \"sum\" #\"mean\" \"max\"\n    last_norm: None #\"batch_norm\", \"layer_norm\"\n    encoders: #la_pos |  rw_pos\n      la_pos:  # Set as null to avoid a pre-nn network\n        encoder_type: \"laplacian_pe\"\n        input_keys: [\"laplacian_eigvec\", \"laplacian_eigval\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        model_type: 'DeepSet' #'Transformer' or 'DeepSet'\n        num_layers: 2\n        num_layers_post: 1 # Num. 
layers to apply after pooling\n        dropout: 0.1\n        first_normalization: \"none\" #\"batch_norm\" or \"layer_norm\"\n      rw_pos:\n        encoder_type: \"mlp\"\n        input_keys: [\"rw_return_probs\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        num_layers: 2\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n        first_normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n\n\n\n  gnn:  # Set as null to avoid a post-nn network\n    in_dim: 64 # or otherwise the correct value\n    out_dim: &gnn_dim 768\n    hidden_dims: *gnn_dim\n    depth: 4\n    activation: gelu\n    last_activation: none\n    dropout: 0.1\n    normalization: \"layer_norm\"\n    last_normalization: *normalization\n    residual_type: simple\n    virtual_node: 'none'\n\n\n\n  graph_output_nn:\n    graph:\n      pooling: [sum]\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n    node:\n      pooling: [sum]\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n  task_heads:\n    l1000_vcap:\n      task_level: graph\n      out_dim: 2934\n      hidden_dims: 128\n      depth: 2\n      activation: none\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n    l1000_mcf7:\n      task_level: graph\n      out_dim: 2934\n      hidden_dims: 128\n      depth: 2\n      activation: none\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n 
     residual_type: none\n    pcba_1328:\n      task_level: graph\n      out_dim: 1328\n      hidden_dims: 64\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n    pcqm4m_g25:\n      task_level: graph\n      out_dim: 25\n      hidden_dims: 32\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n    pcqm4m_n4:\n      task_level: node\n      out_dim: 4\n      hidden_dims: 32\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n#Task-specific\npredictor:\n  metrics_on_progress_bar:\n    l1000_vcap: []\n    l1000_mcf7: []\n    pcba_1328: []\n    pcqm4m_g25: []\n    pcqm4m_n4: []\n  metrics_on_training_set:\n    l1000_vcap: []\n    l1000_mcf7: []\n    pcba_1328: []\n    pcqm4m_g25: []\n    pcqm4m_n4: []\n  loss_fun:\n    l1000_vcap:\n      name: hybrid_ce_ipu\n      n_brackets: 3\n      alpha: 0.5\n    l1000_mcf7:\n      name: hybrid_ce_ipu\n      n_brackets: 3\n      alpha: ${predictor.loss_fun.l1000_vcap.alpha}\n    pcba_1328: bce_logits_ipu\n    pcqm4m_g25: mae_ipu\n    pcqm4m_n4: mae_ipu\n  random_seed: ${constants.seed}\n  optim_kwargs:\n    lr: 1.e-4 # warmup can be scheduled using torch_scheduler_kwargs\n    # weight_decay: 1.e-7\n  torch_scheduler_kwargs:\n    module_type: WarmUpLinearLR\n    max_num_epochs: &max_epochs 100\n    warmup_epochs: 10\n    verbose: False\n  scheduler_kwargs:\n  #  monitor: &monitor qm9/mae/train\n  #  mode: min\n  #  frequency: 1\n  target_nan_mask: null # null: no mask, 0: 0 mask, ignore-flatten, ignore-mean-per-label\n  multitask_handling: flatten # flatten, mean-per-label\n\n# Task-specific\nmetrics:\n  
l1000_vcap: &classif_metrics\n    - name: auroc\n      metric: auroc\n      num_classes: 3\n      task: multiclass\n      target_to_int: True\n      target_nan_mask: -1000\n      ignore_index: -1000\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n    - name: avpr\n      metric: averageprecision\n      num_classes: 3\n      task: multiclass\n      target_to_int: True\n      target_nan_mask: -1000\n      ignore_index: -1000\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n  l1000_mcf7: *classif_metrics\n  pcba_1328:\n  # use auroc and averageprecision (non-IPU versions) so that NaNs are handled correctly\n    - name: auroc\n      metric: auroc\n      task: binary\n      multitask_handling: mean-per-label\n      target_nan_mask: ignore\n      threshold_kwargs: null\n    - name: avpr\n      metric: averageprecision\n      task: binary\n      multitask_handling: mean-per-label\n      target_nan_mask: ignore\n      threshold_kwargs: null\n  pcqm4m_g25: &pcqm_metrics\n    - name: mae\n      metric: mae_ipu\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n    - name: pearsonr\n      metric: pearsonr_ipu\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n    - name: r2\n      metric: r2_score_ipu\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n  pcqm4m_n4: *pcqm_metrics\n\ntrainer:\n  seed: ${constants.seed}\n  logger:\n    save_dir: logs/neurips2023-large/\n    name: ${constants.name}\n    project: ${constants.name}\n  model_checkpoint:\n    dirpath: models_checkpoints/${constants.name}/\n    filename: ${constants.name}\n    # monitor: *monitor\n    # mode: *mode\n    # save_top_k: 1\n    save_last: True\n  trainer:\n    max_epochs: ${predictor.torch_scheduler_kwargs.max_num_epochs}\n    min_epochs: 1\n    check_val_every_n_epoch: 20\n"
  },
  {
    "path": "expts/neurips2023_configs/base_config/large_pcba.yaml",
    "content": "# @package _global_\n\nconstants:\n  seed: 42\n  raise_train_error: true   # Whether the code should raise an error if it crashes during training\n  entity: multitask-gnn\n  datacache_path: \"/localdata/neurips2023-large/\"\n\naccelerator:\n  type: ipu  # cpu or ipu or gpu\n  config_override:\n    datamodule:\n      args:\n        ipu_dataloader_training_opts:\n          mode: async\n          max_num_nodes_per_graph: 30 # train max nodes: 20, max_edges: 54\n          max_num_edges_per_graph: 100\n        ipu_dataloader_inference_opts:\n          mode: async\n          max_num_nodes_per_graph: 35 # valid max nodes: 51, max_edges: 118\n          max_num_edges_per_graph: 100\n        # Data handling-related\n        batch_size_training: 30\n        batch_size_inference: 30\n    predictor:\n      metrics_every_n_train_steps: 1000\n      optim_kwargs:\n        loss_scaling: 1024\n    trainer:\n      trainer:\n        precision: 16-true\n        accumulate_grad_batches: 2\n\n  ipu_config:\n    - deviceIterations(30) # IPU would require large batches to be ready for the model.\n    - replicationFactor(16)\n    # - enableProfiling(\"graph_analyser\")       # The folder where the profile will be stored\n    # - enableExecutableCaching(\"pop_compiler_cache\")\n    - TensorLocations.numIOTiles(128)\n    - _Popart.set(\"defaultBufferingDepth\", 96)\n    - Precision.enableStochasticRounding(True)\n    # - Precision.enableFloatingPointExceptions(True)\n\n  ipu_inference_config:\n  # set device iteration and replication factor to 1 during inference\n  # gradient accumulation was set to 1 in the code\n    - deviceIterations(1)\n    - replicationFactor(1)\n    - Precision.enableStochasticRounding(False)\n\n# accelerator:\n#   type: cpu  # cpu or ipu or gpu\n#   config_override:\n#     datamodule:\n#       args:\n#         batch_size_training: 64\n#         batch_size_inference: 256\n#     trainer:\n#       trainer:\n#         precision: 32\n#         
accumulate_grad_batches: 1\n\ndatamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  # module_type: \"FakeDataModule\"  # Option to use generated data\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n\n      #   df_path: graphium/data/neurips2023/large-dataset/LINCS_L1000_VCAP_0-2_th2.csv.gz\n      #   # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/LINCS_L1000_VCAP_0-4.csv.gz\n      #   # or set path as the URL directly\n      #   smiles_col: \"SMILES\"\n      #   label_cols: geneID-*  # geneID-* means all columns starting with \"geneID-\"\n      #   # sample_size: 2000 # use sample_size for test\n      #   task_level: graph\n      #   splits_path: graphium/data/neurips2023/large-dataset/l1000_vcap_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/l1000_vcap_random_splits.pt`\n      #   epoch_sampling_fraction: 1.0\n\n      # l1000_mcf7:\n      #   df: null\n      #   df_path: graphium/data/neurips2023/large-dataset/LINCS_L1000_MCF7_0-2_th2.csv.gz\n      #   # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/LINCS_L1000_MCF7_0-4.csv.gz\n      #   # or set path as the URL directly\n      #   smiles_col: \"SMILES\"\n      #   label_cols: geneID-*  # geneID-* means all columns starting with \"geneID-\"\n      #   # sample_size: 2000 # use sample_size for test\n      #   task_level: graph\n      #   splits_path: graphium/data/neurips2023/large-dataset/l1000_mcf7_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/l1000_mcf7_random_splits.pt`\n      #   epoch_sampling_fraction: 1.0\n\n      pcba_1328:\n        df: null\n        df_path: graphium/data/neurips2023/large-dataset/PCBA_1328_1564k.parquet\n        # wget 
https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/PCBA_1328_1564k.parquet\n        # or set path as the URL directly\n        smiles_col: \"SMILES\"\n        label_cols: assayID-*  # assayID-* means all columns starting with \"assayID-\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: graph\n        splits_path: graphium/data/neurips2023/large-dataset/pcba_1328_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/pcba_1328_random_splits.pt`\n        epoch_sampling_fraction: 1.0\n\n      # pcqm4m_g25:\n      #   df: null\n      #   df_path: graphium/data/neurips2023/large-dataset/PCQM4M_G25_N4.parquet\n      #   # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/PCQM4M_G25_N4.parquet\n      #   # or set path as the URL directly\n      #   smiles_col: \"ordered_smiles\"\n      #   label_cols: graph_*  # graph_* means all columns starting with \"graph_\"\n      #   # sample_size: 2000 # use sample_size for test\n      #   task_level: graph\n      #   splits_path: graphium/data/neurips2023/large-dataset/pcqm4m_g25_n4_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/pcqm4m_g25_n4_random_splits.pt`\n      #   label_normalization:\n      #     normalize_val_test: True\n      #     method: \"normal\"\n      #   epoch_sampling_fraction: 1.0\n\n        #      pcqm4m_n4:\n        #df: null\n        #df_path: graphium/data/neurips2023/large-dataset/PCQM4M_G25_N4.parquet\n        ## wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/PCQM4M_G25_N4.parquet\n        ## or set path as the URL directly\n        #smiles_col: \"ordered_smiles\"\n        #label_cols: node_* # node_* means all columns starting with \"node_\"\n        ## sample_size: 2000 # use sample_size for test\n        #task_level: node\n        #splits_path: 
graphium/data/neurips2023/large-dataset/pcqm4m_g25_n4_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/pcqm4m_g25_n4_random_splits.pt`\n        #seed: 42\n        #label_normalization:\n        #  normalize_val_test: True\n        #  method: \"normal\"\n        #epoch_sampling_fraction: 1.0\n\n    # Featurization\n    prepare_dict_or_graph: pyg:graph\n    featurization_n_jobs: 30\n    featurization_progress: True\n    featurization_backend: \"loky\"\n    dataloading_from: disk\n    processed_graph_data_path: ${constants.datacache_path}\n    featurization:\n    # OGB: ['atomic_num', 'degree', 'possible_formal_charge', 'possible_numH' (total-valence),\n    # 'possible_number_radical_e', 'possible_is_aromatic', 'possible_is_in_ring',\n    # 'num_chiral_centers (not included yet)']\n      atom_property_list_onehot: [atomic-number, group, period, total-valence]\n      atom_property_list_float: [degree, formal-charge, radical-electron, aromatic, in-ring]\n      # OGB: ['possible_bond_type', 'possible_bond_stereo', 'possible_is_in_ring']\n      edge_property_list: [bond-type-onehot, stereo, in-ring]\n      add_self_loop: False\n      explicit_H: False # whether to include explicit hydrogens\n      use_bonds_weights: False\n      pos_encoding_as_features: # encoder dropout 0.18\n        pos_types:\n          lap_eigvec:\n            pos_level: node\n            pos_type: laplacian_eigvec\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # whether to include eigenvalues/eigenvectors of disconnected graphs\n          lap_eigval:\n            pos_level: node\n            pos_type: laplacian_eigval\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # whether to include eigenvalues/eigenvectors of disconnected graphs\n          rw_pos: # use same name 
as pe_encoder\n            pos_level: node\n            pos_type: rw_return_probs\n            ksteps: 16\n\n    num_workers: 32 # -1 to use all\n    persistent_workers: True # whether to keep dataloader workers alive between epochs.\n    # Setting persistent_workers to false might make the start of each epoch very long.\n\n\narchitecture:\n  model_type: FullGraphMultiTaskNetwork\n  mup_base_path: null\n  pre_nn:   # Set as null to avoid a pre-nn network\n    out_dim: 64\n    hidden_dims: 256\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.1\n    normalization: &normalization layer_norm\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges: null\n\n  pe_encoders:\n    out_dim: 32\n    pool: \"sum\" #\"mean\" \"max\"\n    last_norm: None #\"batch_norm\", \"layer_norm\"\n    encoders: #la_pos |  rw_pos\n      la_pos:  # Set as null to avoid a pre-nn network\n        encoder_type: \"laplacian_pe\"\n        input_keys: [\"laplacian_eigvec\", \"laplacian_eigval\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        model_type: 'DeepSet' #'Transformer' or 'DeepSet'\n        num_layers: 2\n        num_layers_post: 1 # Num. 
layers to apply after pooling\n        dropout: 0.1\n        first_normalization: \"none\" #\"batch_norm\" or \"layer_norm\"\n      rw_pos:\n        encoder_type: \"mlp\"\n        input_keys: [\"rw_return_probs\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        num_layers: 2\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n        first_normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n\n\n\n  gnn:  # Set as null to avoid a post-nn network\n    in_dim: 64 # or otherwise the correct value\n    out_dim: &gnn_dim 768\n    hidden_dims: *gnn_dim\n    depth: 4\n    activation: gelu\n    last_activation: none\n    dropout: 0.1\n    normalization: \"layer_norm\"\n    last_normalization: *normalization\n    residual_type: simple\n    virtual_node: 'none'\n\n\n\n  graph_output_nn:\n    graph:\n      pooling: [sum]\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n    node:\n      pooling: [sum]\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n  task_heads:\n    # l1000_vcap:\n    #   task_level: graph\n    #   out_dim: 2934\n    #   hidden_dims: 128\n    #   depth: 2\n    #   activation: none\n    #   last_activation: none\n    #   dropout: *dropout\n    #   normalization: *normalization\n    #   last_normalization: \"none\"\n    #   residual_type: none\n    # l1000_mcf7:\n    #   task_level: graph\n    #   out_dim: 2934\n    #   hidden_dims: 128\n    #   depth: 2\n    #   activation: none\n    #   last_activation: none\n    #   dropout: *dropout\n    #   normalization: 
*normalization\n    #   last_normalization: \"none\"\n    #   residual_type: none\n    pcba_1328:\n      task_level: graph\n      out_dim: 1328\n      hidden_dims: 64\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n    # pcqm4m_g25:\n    #   task_level: graph\n    #   out_dim: 25\n    #   hidden_dims: 32\n    #   depth: 2\n    #   activation: relu\n    #   last_activation: none\n    #   dropout: *dropout\n    #   normalization: *normalization\n    #   last_normalization: \"none\"\n    #   residual_type: none\n    #pcqm4m_n4:\n    #  task_level: node\n    #  out_dim: 4\n    #  hidden_dims: 32\n    #  depth: 2\n    #  activation: relu\n    #  last_activation: none\n    #  dropout: *dropout\n    #  normalization: *normalization\n    #  last_normalization: \"none\"\n    #  residual_type: none\n\n#Task-specific\npredictor:\n  metrics_on_progress_bar:\n    # l1000_vcap: []\n    # l1000_mcf7: []\n    pcba_1328: []\n    # pcqm4m_g25: []\n    #pcqm4m_n4: []\n  metrics_on_training_set:\n    # l1000_vcap: []\n    # l1000_mcf7: []\n    pcba_1328: []\n    # pcqm4m_g25: []\n    #pcqm4m_n4: []\n  loss_fun:\n    # l1000_vcap:\n    #   name: hybrid_ce_ipu\n    #   n_brackets: 3\n    #   alpha: 0.5\n    # l1000_mcf7:\n    #   name: hybrid_ce_ipu\n    #   n_brackets: 3\n    #   alpha: ${predictor.loss_fun.l1000_vcap.alpha}\n    pcba_1328: bce_logits_ipu\n    # pcqm4m_g25: mae_ipu\n    #pcqm4m_n4: mae_ipu\n  random_seed: ${constants.seed}\n  optim_kwargs:\n    lr: 1.e-4 # warmup can be scheduled using torch_scheduler_kwargs\n    # weight_decay: 1.e-7\n  torch_scheduler_kwargs:\n    module_type: WarmUpLinearLR\n    max_num_epochs: &max_epochs 100\n    warmup_epochs: 10\n    verbose: False\n  scheduler_kwargs:\n  #  monitor: &monitor qm9/mae/train\n  #  mode: min\n  #  frequency: 1\n  target_nan_mask: null # null: no mask, 0: 0 mask, 
ignore-flatten, ignore-mean-per-label\n  multitask_handling: flatten # flatten, mean-per-label\n\n# Task-specific\nmetrics:\n  # l1000_vcap: &classif_metrics\n  #   - name: auroc\n  #     metric: auroc\n  #     num_classes: 3\n  #     task: multiclass\n  #     target_to_int: True\n  #     target_nan_mask: -1000\n  #     ignore_index: -1000\n  #     multitask_handling: mean-per-label\n  #     threshold_kwargs: null\n  #   - name: avpr\n  #     metric: averageprecision\n  #     num_classes: 3\n  #     task: multiclass\n  #     target_to_int: True\n  #     target_nan_mask: -1000\n  #     ignore_index: -1000\n  #     multitask_handling: mean-per-label\n  #     threshold_kwargs: null\n  # l1000_mcf7: *classif_metrics\n  pcba_1328:\n  # use auroc and averageprecision (non-IPU versions) so that NaNs are handled correctly\n    - name: auroc\n      metric: auroc\n      task: binary\n      multitask_handling: mean-per-label\n      target_nan_mask: ignore\n      threshold_kwargs: null\n    - name: avpr\n      metric: averageprecision\n      task: binary\n      multitask_handling: mean-per-label\n      target_nan_mask: ignore\n      threshold_kwargs: null\n      # pcqm4m_n4: &pcqm_metrics\n      #- name: mae\n      #metric: mae_ipu\n      #target_nan_mask: null\n      #multitask_handling: mean-per-label\n      #threshold_kwargs: null\n      #- name: pearsonr\n      #metric: pearsonr_ipu\n      #threshold_kwargs: null\n      #target_nan_mask: null\n      #multitask_handling: mean-per-label\n      #- name: r2\n      #metric: r2_score_ipu\n      #threshold_kwargs: null\n      #target_nan_mask: null\n      #multitask_handling: mean-per-label\n  # pcqm4m_n4: *pcqm_metrics\n\ntrainer:\n  seed: ${constants.seed}\n  logger:\n    save_dir: logs/neurips2023-large/\n    name: ${constants.name}\n    project: ${constants.name}\n  model_checkpoint:\n    dirpath: models_checkpoints/${constants.name}/\n    filename: ${constants.name}\n    # monitor: *monitor\n    # mode: *mode\n    # 
save_top_k: 1\n    save_last: True\n  trainer:\n    max_epochs: ${predictor.torch_scheduler_kwargs.max_num_epochs} \n    min_epochs: 1\n    check_val_every_n_epoch: 20\n"
  },
  {
    "path": "expts/neurips2023_configs/base_config/large_pcqm_g25.yaml",
    "content": "# @package _global_\n\nconstants:\n  seed: 42\n  raise_train_error: true   # Whether the code should raise an error if it crashes during training\n  entity: multitask-gnn\n  datacache_path: \"/localdata/neurips2023-large/\"\n\naccelerator:\n  type: ipu  # cpu or ipu or gpu\n  config_override:\n    datamodule:\n      args:\n        ipu_dataloader_training_opts:\n          mode: async\n          max_num_nodes_per_graph: 30 # train max nodes: 20, max_edges: 54\n          max_num_edges_per_graph: 100\n        ipu_dataloader_inference_opts:\n          mode: async\n          max_num_nodes_per_graph: 35 # valid max nodes: 51, max_edges: 118\n          max_num_edges_per_graph: 100\n        # Data handling-related\n        batch_size_training: 30\n        batch_size_inference: 30\n    predictor:\n      metrics_every_n_train_steps: 1000\n      optim_kwargs:\n        loss_scaling: 1024\n    trainer:\n      trainer:\n        precision: 16-true\n        accumulate_grad_batches: 2\n\n  ipu_config:\n    - deviceIterations(30) # IPU would require large batches to be ready for the model.\n    - replicationFactor(16)\n    # - enableProfiling(\"graph_analyser\")       # The folder where the profile will be stored\n    # - enableExecutableCaching(\"pop_compiler_cache\")\n    - TensorLocations.numIOTiles(128)\n    - _Popart.set(\"defaultBufferingDepth\", 96)\n    - Precision.enableStochasticRounding(True)\n    # - Precision.enableFloatingPointExceptions(True)\n\n  ipu_inference_config:\n  # set device iteration and replication factor to 1 during inference\n  # gradient accumulation was set to 1 in the code\n    - deviceIterations(1)\n    - replicationFactor(1)\n    - Precision.enableStochasticRounding(False)\n\n# accelerator:\n#   type: cpu  # cpu or ipu or gpu\n#   config_override:\n#     datamodule:\n#       args:\n#         batch_size_training: 64\n#         batch_size_inference: 256\n#     trainer:\n#       trainer:\n#         precision: 32\n#         
accumulate_grad_batches: 1\n\ndatamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  # module_type: \"FakeDataModule\"  # Option to use generated data\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n\n      #   df_path: graphium/data/neurips2023/large-dataset/LINCS_L1000_VCAP_0-2_th2.csv.gz\n      #   # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/LINCS_L1000_VCAP_0-4.csv.gz\n      #   # or set path as the URL directly\n      #   smiles_col: \"SMILES\"\n      #   label_cols: geneID-*  # geneID-* means all columns starting with \"geneID-\"\n      #   # sample_size: 2000 # use sample_size for test\n      #   task_level: graph\n      #   splits_path: graphium/data/neurips2023/large-dataset/l1000_vcap_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/l1000_vcap_random_splits.pt`\n      #   epoch_sampling_fraction: 1.0\n\n      # l1000_mcf7:\n      #   df: null\n      #   df_path: graphium/data/neurips2023/large-dataset/LINCS_L1000_MCF7_0-2_th2.csv.gz\n      #   # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/LINCS_L1000_MCF7_0-4.csv.gz\n      #   # or set path as the URL directly\n      #   smiles_col: \"SMILES\"\n      #   label_cols: geneID-*  # geneID-* means all columns starting with \"geneID-\"\n      #   # sample_size: 2000 # use sample_size for test\n      #   task_level: graph\n      #   splits_path: graphium/data/neurips2023/large-dataset/l1000_mcf7_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/l1000_mcf7_random_splits.pt`\n      #   epoch_sampling_fraction: 1.0\n\n      # pcba_1328:\n      #   df: null\n      #   df_path: graphium/data/neurips2023/large-dataset/PCBA_1328_1564k.parquet\n      #   # wget 
https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/PCBA_1328_1564k.parquet\n      #   # or set path as the URL directly\n      #   smiles_col: \"SMILES\"\n      #   label_cols: assayID-*  # assayID-* means all columns starting with \"assayID-\"\n      #   # sample_size: 2000 # use sample_size for test\n      #   task_level: graph\n      #   splits_path: graphium/data/neurips2023/large-dataset/pcba_1328_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/pcba_1328_random_splits.pt`\n      #   epoch_sampling_fraction: 1.0\n\n      pcqm4m_g25:\n        df: null\n        df_path: graphium/data/neurips2023/large-dataset/PCQM4M_G25_N4.parquet\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/PCQM4M_G25_N4.parquet\n        # or set path as the URL directly\n        smiles_col: \"ordered_smiles\"\n        label_cols: graph_*  # graph_* means all columns starting with \"graph_\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: graph\n        splits_path: graphium/data/neurips2023/large-dataset/pcqm4m_g25_n4_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/pcqm4m_g25_n4_random_splits.pt`\n        label_normalization:\n          normalize_val_test: True\n          method: \"normal\"\n        epoch_sampling_fraction: 1.0\n\n      # pcqm4m_n4:\n      #   df: null\n      #   df_path: graphium/data/neurips2023/large-dataset/PCQM4M_G25_N4.parquet\n      #   # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/PCQM4M_G25_N4.parquet\n      #   # or set path as the URL directly\n      #   smiles_col: \"ordered_smiles\"\n      #   label_cols: node_* # node_* means all columns starting with \"node_\"\n      #   # sample_size: 2000 # use sample_size for test\n      #   task_level: node\n      #   splits_path: 
graphium/data/neurips2023/large-dataset/pcqm4m_g25_n4_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/pcqm4m_g25_n4_random_splits.pt`\n      #   seed: 42\n      #   label_normalization:\n      #     normalize_val_test: True\n      #     method: \"normal\"\n      #   epoch_sampling_fraction: 1.0\n\n    # Featurization\n    prepare_dict_or_graph: pyg:graph\n    featurization_n_jobs: 30\n    featurization_progress: True\n    featurization_backend: \"loky\"\n    dataloading_from: disk\n    processed_graph_data_path: ${constants.datacache_path}\n    featurization:\n    # OGB: ['atomic_num', 'degree', 'possible_formal_charge', 'possible_numH' (total-valence),\n    # 'possible_number_radical_e', 'possible_is_aromatic', 'possible_is_in_ring',\n    # 'num_chiral_centers (not included yet)']\n      atom_property_list_onehot: [atomic-number, group, period, total-valence]\n      atom_property_list_float: [degree, formal-charge, radical-electron, aromatic, in-ring]\n      # OGB: ['possible_bond_type', 'possible_bond_stereo', 'possible_is_in_ring']\n      edge_property_list: [bond-type-onehot, stereo, in-ring]\n      add_self_loop: False\n      explicit_H: False # if H is included\n      use_bonds_weights: False\n      pos_encoding_as_features: # encoder dropout 0.18\n        pos_types:\n          lap_eigvec:\n            pos_level: node\n            pos_type: laplacian_eigvec\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # if eigenvalues/vectors for disconnected graphs are included\n          lap_eigval:\n            pos_level: node\n            pos_type: laplacian_eigval\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # if eigenvalues/vectors for disconnected graphs are included\n          rw_pos: # use same 
name as pe_encoder\n            pos_level: node\n            pos_type: rw_return_probs\n            ksteps: 16\n\n    num_workers: 32 # -1 to use all\n    persistent_workers: True # if use persistent worker at the start of each epoch.\n    # Using persistent_workers false might make the start of each epoch very long.\n\n\narchitecture:\n  model_type: FullGraphMultiTaskNetwork\n  mup_base_path: null\n  pre_nn:   # Set as null to avoid a pre-nn network\n    out_dim: 64\n    hidden_dims: 256\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.1\n    normalization: &normalization layer_norm\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges: null\n\n  pe_encoders:\n    out_dim: 32\n    pool: \"sum\" #\"mean\" \"max\"\n    last_norm: None #\"batch_norm\", \"layer_norm\"\n    encoders: #la_pos |  rw_pos\n      la_pos:  # Set as null to avoid a pre-nn network\n        encoder_type: \"laplacian_pe\"\n        input_keys: [\"laplacian_eigvec\", \"laplacian_eigval\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        model_type: 'DeepSet' #'Transformer' or 'DeepSet'\n        num_layers: 2\n        num_layers_post: 1 # Num. 
layers to apply after pooling\n        dropout: 0.1\n        first_normalization: \"none\" #\"batch_norm\" or \"layer_norm\"\n      rw_pos:\n        encoder_type: \"mlp\"\n        input_keys: [\"rw_return_probs\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        num_layers: 2\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n        first_normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n\n\n\n  gnn:  # Set as null to avoid a post-nn network\n    in_dim: 64 # or otherwise the correct value\n    out_dim: &gnn_dim 768\n    hidden_dims: *gnn_dim\n    depth: 4\n    activation: gelu\n    last_activation: none\n    dropout: 0.1\n    normalization: \"layer_norm\"\n    last_normalization: *normalization\n    residual_type: simple\n    virtual_node: 'none'\n\n\n\n  graph_output_nn:\n    graph:\n      pooling: [sum]\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n    node:\n      pooling: [sum]\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n  task_heads:\n    # l1000_vcap:\n    #   task_level: graph\n    #   out_dim: 2934\n    #   hidden_dims: 128\n    #   depth: 2\n    #   activation: none\n    #   last_activation: none\n    #   dropout: *dropout\n    #   normalization: *normalization\n    #   last_normalization: \"none\"\n    #   residual_type: none\n    # l1000_mcf7:\n    #   task_level: graph\n    #   out_dim: 2934\n    #   hidden_dims: 128\n    #   depth: 2\n    #   activation: none\n    #   last_activation: none\n    #   dropout: *dropout\n    #   normalization: 
*normalization\n    #   last_normalization: \"none\"\n    #   residual_type: none\n    # pcba_1328:\n    #   task_level: graph\n    #   out_dim: 1328\n    #   hidden_dims: 64\n    #   depth: 2\n    #   activation: relu\n    #   last_activation: none\n    #   dropout: *dropout\n    #   normalization: *normalization\n    #   last_normalization: \"none\"\n    #   residual_type: none\n    pcqm4m_g25:\n      task_level: graph\n      out_dim: 25\n      hidden_dims: 32\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n    # pcqm4m_n4:\n    #   task_level: node\n    #   out_dim: 4\n    #   hidden_dims: 32\n    #   depth: 2\n    #   activation: relu\n    #   last_activation: none\n    #   dropout: *dropout\n    #   normalization: *normalization\n    #   last_normalization: \"none\"\n    #   residual_type: none\n\n#Task-specific\npredictor:\n  metrics_on_progress_bar:\n    # l1000_vcap: []\n    # l1000_mcf7: []\n    # pcba_1328: []\n    pcqm4m_g25: []\n    # pcqm4m_n4: []\n  metrics_on_training_set:\n    # l1000_vcap: []\n    # l1000_mcf7: []\n    # pcba_1328: []\n    pcqm4m_g25: []\n    # pcqm4m_n4: []\n  loss_fun:\n    # l1000_vcap:\n    #   name: hybrid_ce_ipu\n    #   n_brackets: 3\n    #   alpha: 0.5\n    # l1000_mcf7:\n    #   name: hybrid_ce_ipu\n    #   n_brackets: 3\n    #   alpha: ${predictor.loss_fun.l1000_vcap.alpha}\n    # pcba_1328: bce_logits_ipu\n    pcqm4m_g25: mae_ipu\n    # pcqm4m_n4: mae_ipu\n  random_seed: ${constants.seed}\n  optim_kwargs:\n    lr: 1.e-4 # warmup can be scheduled using torch_scheduler_kwargs\n    # weight_decay: 1.e-7\n  torch_scheduler_kwargs:\n    module_type: WarmUpLinearLR\n    max_num_epochs: &max_epochs 100\n    warmup_epochs: 10\n    verbose: False\n  scheduler_kwargs:\n  #  monitor: &monitor qm9/mae/train\n  #  mode: min\n  #  frequency: 1\n  target_nan_mask: null # null: no mask, 0: 0 
mask, ignore-flatten, ignore-mean-per-label\n  multitask_handling: flatten # flatten, mean-per-label\n\n# Task-specific\nmetrics:\n  # l1000_vcap: &classif_metrics\n  #   - name: auroc\n  #     metric: auroc\n  #     num_classes: 3\n  #     task: multiclass\n  #     target_to_int: True\n  #     target_nan_mask: -1000\n  #     ignore_index: -1000\n  #     multitask_handling: mean-per-label\n  #     threshold_kwargs: null\n  #   - name: avpr\n  #     metric: averageprecision\n  #     num_classes: 3\n  #     task: multiclass\n  #     target_to_int: True\n  #     target_nan_mask: -1000\n  #     ignore_index: -1000\n  #     multitask_handling: mean-per-label\n  #     threshold_kwargs: null\n  # l1000_mcf7: *classif_metrics\n  # pcba_1328:\n  # # use auroc and averageprecision (non-IPU version) so that NaNs are handled correctly\n  #   - name: auroc\n  #     metric: auroc\n  #     task: binary\n  #     multitask_handling: mean-per-label\n  #     target_nan_mask: ignore\n  #     threshold_kwargs: null\n  #   - name: avpr\n  #     metric: averageprecision\n  #     task: binary\n  #     multitask_handling: mean-per-label\n  #     target_nan_mask: ignore\n  #     threshold_kwargs: null\n  pcqm4m_g25: &pcqm_metrics\n    - name: mae\n      metric: mae_ipu\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n    - name: pearsonr\n      metric: pearsonr_ipu\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n    - name: r2\n      metric: r2_score_ipu\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n  # pcqm4m_n4: *pcqm_metrics\n\ntrainer:\n  seed: ${constants.seed}\n  logger:\n    save_dir: logs/neurips2023-large/\n    name: ${constants.name}\n    project: ${constants.name}\n  model_checkpoint:\n    dirpath: models_checkpoints/${constants.name}/\n    filename: ${constants.name}\n    # monitor: *monitor\n    # mode: *mode\n    # 
save_top_k: 1\n    save_last: True\n  trainer:\n    max_epochs: ${predictor.torch_scheduler_kwargs.max_num_epochs} \n    min_epochs: 1\n    check_val_every_n_epoch: 20\n"
  },
  {
    "path": "expts/neurips2023_configs/base_config/large_pcqm_n4.yaml",
    "content": "# @package _global_\n\nconstants:\n  seed: 42\n  raise_train_error: true   # Whether the code should raise an error if it crashes during training\n  entity: multitask-gnn\n  datacache_path: \"/localdata/neurips2023-large/\"\n\naccelerator:\n  type: ipu  # cpu or ipu or gpu\n  config_override:\n    datamodule:\n      args:\n        ipu_dataloader_training_opts:\n          mode: async\n          max_num_nodes_per_graph: 30 # train max nodes: 20, max_edges: 54\n          max_num_edges_per_graph: 100\n        ipu_dataloader_inference_opts:\n          mode: async\n          max_num_nodes_per_graph: 35 # valid max nodes: 51, max_edges: 118\n          max_num_edges_per_graph: 100\n        # Data handling-related\n        batch_size_training: 30\n        batch_size_inference: 30\n    predictor:\n      metrics_every_n_train_steps: 1000\n      optim_kwargs:\n        loss_scaling: 1024\n    trainer:\n      trainer:\n        precision: 16-true\n        accumulate_grad_batches: 2\n\n  ipu_config:\n    - deviceIterations(30) # IPU would require large batches to be ready for the model.\n    - replicationFactor(16)\n    # - enableProfiling(\"graph_analyser\")       # The folder where the profile will be stored\n    # - enableExecutableCaching(\"pop_compiler_cache\")\n    - TensorLocations.numIOTiles(128)\n    - _Popart.set(\"defaultBufferingDepth\", 96)\n    - Precision.enableStochasticRounding(True)\n    # - Precision.enableFloatingPointExceptions(True)\n\n  ipu_inference_config:\n  # set device iteration and replication factor to 1 during inference\n  # gradient accumulation was set to 1 in the code\n    - deviceIterations(1)\n    - replicationFactor(1)\n    - Precision.enableStochasticRounding(False)\n\n# accelerator:\n#   type: cpu  # cpu or ipu or gpu\n#   config_override:\n#     datamodule:\n#       args:\n#         batch_size_training: 64\n#         batch_size_inference: 256\n#     trainer:\n#       trainer:\n#         precision: 32\n#         
accumulate_grad_batches: 1\n\ndatamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  # module_type: \"FakeDataModule\"  # Option to use generated data\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n\n      #   df_path: graphium/data/neurips2023/large-dataset/LINCS_L1000_VCAP_0-2_th2.csv.gz\n      #   # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/LINCS_L1000_VCAP_0-4.csv.gz\n      #   # or set path as the URL directly\n      #   smiles_col: \"SMILES\"\n      #   label_cols: geneID-*  # geneID-* means all columns starting with \"geneID-\"\n      #   # sample_size: 2000 # use sample_size for test\n      #   task_level: graph\n      #   splits_path: graphium/data/neurips2023/large-dataset/l1000_vcap_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/l1000_vcap_random_splits.pt`\n      #   epoch_sampling_fraction: 1.0\n\n      # l1000_mcf7:\n      #   df: null\n      #   df_path: graphium/data/neurips2023/large-dataset/LINCS_L1000_MCF7_0-2_th2.csv.gz\n      #   # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/LINCS_L1000_MCF7_0-4.csv.gz\n      #   # or set path as the URL directly\n      #   smiles_col: \"SMILES\"\n      #   label_cols: geneID-*  # geneID-* means all columns starting with \"geneID-\"\n      #   # sample_size: 2000 # use sample_size for test\n      #   task_level: graph\n      #   splits_path: graphium/data/neurips2023/large-dataset/l1000_mcf7_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/l1000_mcf7_random_splits.pt`\n      #   epoch_sampling_fraction: 1.0\n\n      # pcba_1328:\n      #   df: null\n      #   df_path: graphium/data/neurips2023/large-dataset/PCBA_1328_1564k.parquet\n      #   # wget 
https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/PCBA_1328_1564k.parquet\n      #   # or set path as the URL directly\n      #   smiles_col: \"SMILES\"\n      #   label_cols: assayID-*  # assayID-* means all columns starting with \"assayID-\"\n      #   # sample_size: 2000 # use sample_size for test\n      #   task_level: graph\n      #   splits_path: graphium/data/neurips2023/large-dataset/pcba_1328_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/pcba_1328_random_splits.pt`\n      #   epoch_sampling_fraction: 1.0\n\n      # pcqm4m_g25:\n      #   df: null\n      #   df_path: graphium/data/neurips2023/large-dataset/PCQM4M_G25_N4.parquet\n      #   # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/PCQM4M_G25_N4.parquet\n      #   # or set path as the URL directly\n      #   smiles_col: \"ordered_smiles\"\n      #   label_cols: graph_*  # graph_* means all columns starting with \"graph_\"\n      #   # sample_size: 2000 # use sample_size for test\n      #   task_level: graph\n      #   splits_path: graphium/data/neurips2023/large-dataset/pcqm4m_g25_n4_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/pcqm4m_g25_n4_random_splits.pt`\n      #   label_normalization:\n      #     normalize_val_test: True\n      #     method: \"normal\"\n      #   epoch_sampling_fraction: 1.0\n\n      pcqm4m_n4:\n        df: null\n        df_path: graphium/data/neurips2023/large-dataset/PCQM4M_G25_N4.parquet\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/PCQM4M_G25_N4.parquet\n        # or set path as the URL directly\n        smiles_col: \"ordered_smiles\"\n        label_cols: node_* # node_* means all columns starting with \"node_\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: node\n        splits_path: 
graphium/data/neurips2023/large-dataset/pcqm4m_g25_n4_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/pcqm4m_g25_n4_random_splits.pt`\n        seed: 42\n        label_normalization:\n          normalize_val_test: True\n          method: \"normal\"\n        epoch_sampling_fraction: 1.0\n\n    # Featurization\n    prepare_dict_or_graph: pyg:graph\n    featurization_n_jobs: 30\n    featurization_progress: True\n    featurization_backend: \"loky\"\n    dataloading_from: disk\n    processed_graph_data_path: ${constants.datacache_path}\n    featurization:\n    # OGB: ['atomic_num', 'degree', 'possible_formal_charge', 'possible_numH' (total-valence),\n    # 'possible_number_radical_e', 'possible_is_aromatic', 'possible_is_in_ring',\n    # 'num_chiral_centers (not included yet)']\n      atom_property_list_onehot: [atomic-number, group, period, total-valence]\n      atom_property_list_float: [degree, formal-charge, radical-electron, aromatic, in-ring]\n      # OGB: ['possible_bond_type', 'possible_bond_stereo', 'possible_is_in_ring']\n      edge_property_list: [bond-type-onehot, stereo, in-ring]\n      add_self_loop: False\n      explicit_H: False # if H is included\n      use_bonds_weights: False\n      pos_encoding_as_features: # encoder dropout 0.18\n        pos_types:\n          lap_eigvec:\n            pos_level: node\n            pos_type: laplacian_eigvec\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # if eigenvalues/vectors for disconnected graphs are included\n          lap_eigval:\n            pos_level: node\n            pos_type: laplacian_eigval\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # if eigenvalues/vectors for disconnected graphs are included\n          rw_pos: # use same name as 
pe_encoder\n            pos_level: node\n            pos_type: rw_return_probs\n            ksteps: 16\n\n    num_workers: 32 # -1 to use all\n    persistent_workers: True # if use persistent worker at the start of each epoch.\n    # Using persistent_workers false might make the start of each epoch very long.\n\n\narchitecture:\n  model_type: FullGraphMultiTaskNetwork\n  mup_base_path: null\n  pre_nn:   # Set as null to avoid a pre-nn network\n    out_dim: 64\n    hidden_dims: 256\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.1\n    normalization: &normalization layer_norm\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges: null\n\n  pe_encoders:\n    out_dim: 32\n    pool: \"sum\" #\"mean\" \"max\"\n    last_norm: None #\"batch_norm\", \"layer_norm\"\n    encoders: #la_pos |  rw_pos\n      la_pos:  # Set as null to avoid a pre-nn network\n        encoder_type: \"laplacian_pe\"\n        input_keys: [\"laplacian_eigvec\", \"laplacian_eigval\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        model_type: 'DeepSet' #'Transformer' or 'DeepSet'\n        num_layers: 2\n        num_layers_post: 1 # Num. 
layers to apply after pooling\n        dropout: 0.1\n        first_normalization: \"none\" #\"batch_norm\" or \"layer_norm\"\n      rw_pos:\n        encoder_type: \"mlp\"\n        input_keys: [\"rw_return_probs\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        num_layers: 2\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n        first_normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n\n\n\n  gnn:  # Set as null to avoid a post-nn network\n    in_dim: 64 # or otherwise the correct value\n    out_dim: &gnn_dim 768\n    hidden_dims: *gnn_dim\n    depth: 4\n    activation: gelu\n    last_activation: none\n    dropout: 0.1\n    normalization: \"layer_norm\"\n    last_normalization: *normalization\n    residual_type: simple\n    virtual_node: 'none'\n\n\n\n  graph_output_nn:\n    graph:\n      pooling: [sum]\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n    node:\n      pooling: [sum]\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n  task_heads:\n    # l1000_vcap:\n    #   task_level: graph\n    #   out_dim: 2934\n    #   hidden_dims: 128\n    #   depth: 2\n    #   activation: none\n    #   last_activation: none\n    #   dropout: *dropout\n    #   normalization: *normalization\n    #   last_normalization: \"none\"\n    #   residual_type: none\n    # l1000_mcf7:\n    #   task_level: graph\n    #   out_dim: 2934\n    #   hidden_dims: 128\n    #   depth: 2\n    #   activation: none\n    #   last_activation: none\n    #   dropout: *dropout\n    #   normalization: 
*normalization\n    #   last_normalization: \"none\"\n    #   residual_type: none\n    # pcba_1328:\n    #   task_level: graph\n    #   out_dim: 1328\n    #   hidden_dims: 64\n    #   depth: 2\n    #   activation: relu\n    #   last_activation: none\n    #   dropout: *dropout\n    #   normalization: *normalization\n    #   last_normalization: \"none\"\n    #   residual_type: none\n    # pcqm4m_g25:\n    #   task_level: graph\n    #   out_dim: 25\n    #   hidden_dims: 32\n    #   depth: 2\n    #   activation: relu\n    #   last_activation: none\n    #   dropout: *dropout\n    #   normalization: *normalization\n    #   last_normalization: \"none\"\n    #   residual_type: none\n    pcqm4m_n4:\n      task_level: node\n      out_dim: 4\n      hidden_dims: 32\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n#Task-specific\npredictor:\n  metrics_on_progress_bar:\n    # l1000_vcap: []\n    # l1000_mcf7: []\n    # pcba_1328: []\n    # pcqm4m_g25: []\n    pcqm4m_n4: []\n  metrics_on_training_set:\n    # l1000_vcap: []\n    # l1000_mcf7: []\n    # pcba_1328: []\n    # pcqm4m_g25: []\n    pcqm4m_n4: []\n  loss_fun:\n    # l1000_vcap:\n    #   name: hybrid_ce_ipu\n    #   n_brackets: 3\n    #   alpha: 0.5\n    # l1000_mcf7:\n    #   name: hybrid_ce_ipu\n    #   n_brackets: 3\n    #   alpha: ${predictor.loss_fun.l1000_vcap.alpha}\n    # pcba_1328: bce_logits_ipu\n    # pcqm4m_g25: mae_ipu\n    pcqm4m_n4: mae_ipu\n  random_seed: ${constants.seed}\n  optim_kwargs:\n    lr: 1.e-4 # warmup can be scheduled using torch_scheduler_kwargs\n    # weight_decay: 1.e-7\n  torch_scheduler_kwargs:\n    module_type: WarmUpLinearLR\n    max_num_epochs: &max_epochs 100\n    warmup_epochs: 10\n    verbose: False\n  scheduler_kwargs:\n  #  monitor: &monitor qm9/mae/train\n  #  mode: min\n  #  frequency: 1\n  target_nan_mask: null # null: no mask, 0: 0 
mask, ignore-flatten, ignore-mean-per-label\n  multitask_handling: flatten # flatten, mean-per-label\n\n# Task-specific\nmetrics:\n  # l1000_vcap: &classif_metrics\n  #   - name: auroc\n  #     metric: auroc\n  #     num_classes: 3\n  #     task: multiclass\n  #     target_to_int: True\n  #     target_nan_mask: -1000\n  #     ignore_index: -1000\n  #     multitask_handling: mean-per-label\n  #     threshold_kwargs: null\n  #   - name: avpr\n  #     metric: averageprecision\n  #     num_classes: 3\n  #     task: multiclass\n  #     target_to_int: True\n  #     target_nan_mask: -1000\n  #     ignore_index: -1000\n  #     multitask_handling: mean-per-label\n  #     threshold_kwargs: null\n  # l1000_mcf7: *classif_metrics\n  # pcba_1328:\n  # # use auroc and averageprecision (non-IPU version) so that NaNs are handled correctly\n  #   - name: auroc\n  #     metric: auroc\n  #     task: binary\n  #     multitask_handling: mean-per-label\n  #     target_nan_mask: ignore\n  #     threshold_kwargs: null\n  #   - name: avpr\n  #     metric: averageprecision\n  #     task: binary\n  #     multitask_handling: mean-per-label\n  #     target_nan_mask: ignore\n  #     threshold_kwargs: null\n  pcqm4m_n4: &pcqm_metrics\n    - name: mae\n      metric: mae_ipu\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n    - name: pearsonr\n      metric: pearsonr_ipu\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n    - name: r2\n      metric: r2_score_ipu\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n  # pcqm4m_n4: *pcqm_metrics\n\ntrainer:\n  seed: ${constants.seed}\n  logger:\n    save_dir: logs/neurips2023-large/\n    name: ${constants.name}\n    project: ${constants.name}\n  model_checkpoint:\n    dirpath: models_checkpoints/${constants.name}/\n    filename: ${constants.name}\n    # monitor: *monitor\n    # mode: *mode\n    # 
save_top_k: 1\n    save_last: True\n  trainer:\n    max_epochs: ${predictor.torch_scheduler_kwargs.max_num_epochs} \n    min_epochs: 1\n    check_val_every_n_epoch: 20\n"
  },
  {
    "path": "expts/neurips2023_configs/base_config/small.yaml",
    "content": "# @package _global_\n\nconstants:\n  seed: &seed 42\n  raise_train_error: true   # Whether the code should raise an error if it crashes during training\n  entity: multitask-gnn\n\naccelerator:\n  type: ipu  # cpu or ipu or gpu\n  config_override:\n    datamodule:\n      args:\n        ipu_dataloader_training_opts:\n          mode: async\n          max_num_nodes_per_graph: 44 # train max nodes: 20, max_edges: 54\n          max_num_edges_per_graph: 80\n        ipu_dataloader_inference_opts:\n          mode: async\n          max_num_nodes_per_graph: 44 # valid max nodes: 51, max_edges: 118\n          max_num_edges_per_graph: 80\n        # Data handling-related\n        batch_size_training: 50\n        batch_size_inference: 50\n    predictor:\n      optim_kwargs:\n        loss_scaling: 1024\n    trainer:\n      trainer:\n        precision: 16\n        accumulate_grad_batches: 4\n\n  ipu_config:\n    - deviceIterations(5) # IPU would require large batches to be ready for the model.\n    - replicationFactor(16)\n    # - enableProfiling(\"graph_analyser\")       # The folder where the profile will be stored\n    # - enableExecutableCaching(\"pop_compiler_cache\")\n    - TensorLocations.numIOTiles(128)\n    - _Popart.set(\"defaultBufferingDepth\", 128)\n    - Precision.enableStochasticRounding(True)\n\n# accelerator:\n#   type: cpu  # cpu or ipu or gpu\n#   config_override:\n#     datamodule:\n#       batch_size_training: 64\n#       batch_size_inference: 256\n#     trainer:\n#       trainer:\n#         precision: 32\n#         accumulate_grad_batches: 1\n\ndatamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  # module_type: \"FakeDataModule\"  # Option to use generated data\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      qm9:\n        df: null\n        df_path: data/neurips2023/small-dataset/qm9.csv.gz\n        # wget 
https://storage.valencelabs.com/graphium/datasets/neurips_2023/Small-dataset/qm9.csv.gz\n        # or set path as the URL directly\n        smiles_col: \"smiles\"\n        label_cols: [\"A\", \"B\", \"C\", \"mu\", \"alpha\", \"homo\", \"lumo\", \"gap\", \"r2\", \"zpve\", \"u0\", \"u298\", \"h298\", \"g298\", \"cv\", \"u0_atom\", \"u298_atom\", \"h298_atom\", \"g298_atom\"]\n        # sample_size: 2000 # use sample_size for test\n        splits_path: data/neurips2023/small-dataset/qm9_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Small-dataset/qm9_random_splits.pt`\n        seed: *seed\n        task_level: graph\n        label_normalization:\n          normalize_val_test: True\n          method: \"normal\"\n\n      tox21:\n        df: null\n        df_path: data/neurips2023/small-dataset/Tox21-7k-12-labels.csv.gz\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Small-dataset/Tox21-7k-12-labels.csv.gz\n        # or set path as the URL directly\n        smiles_col: \"smiles\"\n        label_cols: [\"NR-AR\", \"NR-AR-LBD\", \"NR-AhR\", \"NR-Aromatase\", \"NR-ER\", \"NR-ER-LBD\", \"NR-PPAR-gamma\", \"SR-ARE\", \"SR-ATAD5\", \"SR-HSE\", \"SR-MMP\", \"SR-p53\"]\n        # sample_size: 2000 # use sample_size for test\n        splits_path: data/neurips2023/small-dataset/Tox21_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Small-dataset/Tox21_random_splits.pt`\n        seed: *seed\n        task_level: graph\n\n      zinc:\n        df: null\n        df_path: data/neurips2023/small-dataset/ZINC12k.csv.gz\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Small-dataset/ZINC12k.csv.gz\n        # or set path as the URL directly\n        smiles_col: \"smiles\"\n        label_cols: [\"SA\", \"logp\", \"score\"]\n        # sample_size: 2000 # use sample_size for test\n        splits_path: 
data/neurips2023/small-dataset/ZINC12k_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Small-dataset/ZINC12k_random_splits.pt`\n        seed: *seed\n        task_level: graph\n        label_normalization:\n          normalize_val_test: True\n          method: \"normal\"\n\n    # Featurization\n    prepare_dict_or_graph: pyg:graph\n    featurization_n_jobs: 30\n    featurization_progress: True\n    featurization_backend: \"loky\"\n    processed_graph_data_path: \"../datacache/neurips2023-small/\"\n    featurization:\n    # OGB: ['atomic_num', 'degree', 'possible_formal_charge', 'possible_numH' (total-valence),\n    # 'possible_number_radical_e', 'possible_is_aromatic', 'possible_is_in_ring',\n    # 'num_chiral_centers (not included yet)']\n      atom_property_list_onehot: [atomic-number, group, period, total-valence]\n      atom_property_list_float: [degree, formal-charge, radical-electron, aromatic, in-ring]\n      # OGB: ['possible_bond_type', 'possible_bond_stereo', 'possible_is_in_ring']\n      edge_property_list: [bond-type-onehot, stereo, in-ring]\n      add_self_loop: False\n      explicit_H: False # whether to include explicit hydrogens\n      use_bonds_weights: False\n      pos_encoding_as_features: # encoder dropout 0.18\n        pos_types:\n          lap_eigvec:\n            pos_level: node\n            pos_type: laplacian_eigvec\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # whether to include eigenvalues/eigenvectors for disconnected graphs\n          lap_eigval:\n            pos_level: node\n            pos_type: laplacian_eigval\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # whether to include eigenvalues/eigenvectors for disconnected graphs\n          rw_pos: # use same name as pe_encoder\n            pos_level: node\n            pos_type: rw_return_probs\n            ksteps: 16\n\n    num_workers: 30 # -1 to use all\n    persistent_workers: False # whether to use persistent dataloader workers across epochs.\n    # Setting persistent_workers to False might make the start of each epoch very long.\n\n\narchitecture:\n  model_type: FullGraphMultiTaskNetwork\n  mup_base_path: null\n  pre_nn:   # Set as null to avoid a pre-nn network\n    out_dim: 64\n    hidden_dims: 256\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.18\n    normalization: &normalization layer_norm\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges: null   # Set as null to avoid a pre-nn network\n\n  pe_encoders:\n    out_dim: 32\n    pool: \"sum\" #\"mean\" \"max\"\n    last_norm: None #\"batch_norm\", \"layer_norm\"\n    encoders: #la_pos |  rw_pos\n      la_pos:  # Set as null to avoid a pre-nn network\n        encoder_type: \"laplacian_pe\"\n        input_keys: [\"laplacian_eigvec\", \"laplacian_eigval\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        model_type: 'DeepSet' #'Transformer' or 'DeepSet'\n        num_layers: 2\n        num_layers_post: 1 # Num. 
layers to apply after pooling\n        dropout: 0.1\n        first_normalization: \"none\" #\"batch_norm\" or \"layer_norm\"\n      rw_pos:\n        encoder_type: \"mlp\"\n        input_keys: [\"rw_return_probs\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        num_layers: 2\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n        first_normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n\n\n\n  gnn:  # Set as null to avoid a post-nn network\n    in_dim: 64 # or otherwise the correct value\n    out_dim: &gnn_dim 96\n    hidden_dims: *gnn_dim\n    depth: 4\n    activation: gelu\n    last_activation: none\n    dropout: 0.1\n    normalization: \"layer_norm\"\n    last_normalization: *normalization\n    residual_type: simple\n    virtual_node: 'none'\n    layer_kwargs: null # Parameters for the model itself. You could define dropout_attn: 0.1\n\n\n  graph_output_nn:\n    graph:\n      pooling: [sum]\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n  task_heads:\n    qm9:\n      task_level: graph\n      out_dim: 19\n      hidden_dims: 128\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n    tox21:\n      task_level: graph\n      out_dim: 12\n      hidden_dims: 64\n      depth: 2\n      activation: relu\n      last_activation: sigmoid\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n    zinc:\n      task_level: graph\n      out_dim: 3\n      hidden_dims: 32\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      
normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n#Task-specific\npredictor:\n  metrics_on_progress_bar:\n    qm9: [\"mae\"]\n    tox21: [\"auroc\"]\n    zinc: [\"mae\"]\n  loss_fun:\n    qm9: mae_ipu\n    tox21: bce_ipu\n    zinc: mae_ipu\n  random_seed: *seed\n  optim_kwargs:\n    lr: 4.e-5 # warmup can be scheduled using torch_scheduler_kwargs\n    # weight_decay: 1.e-7\n  torch_scheduler_kwargs:\n    module_type: WarmUpLinearLR\n    max_num_epochs: &max_epochs 100\n    warmup_epochs: 10\n    verbose: False\n  scheduler_kwargs:\n  #  monitor: &monitor qm9/mae/train\n  #  mode: min\n  #  frequency: 1\n  target_nan_mask: null # null: no mask, 0: 0 mask, ignore-flatten, ignore-mean-per-label\n  multitask_handling: flatten # flatten, mean-per-label\n\n# Task-specific\nmetrics:\n  qm9: &qm9_metrics\n    - name: mae\n      metric: mae_ipu\n      target_nan_mask: null\n      multitask_handling: flatten\n      threshold_kwargs: null\n    - name: pearsonr\n      metric: pearsonr_ipu\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n    - name: r2_score\n      metric: r2_score_ipu\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n  tox21:\n    - name: auroc\n      metric: auroc_ipu\n      task: binary\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n    - name: avpr\n      metric: average_precision_ipu\n      task: binary\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n    - name: f1 > 0.5\n      metric: f1\n      multitask_handling: mean-per-label\n      target_to_int: True\n      num_classes: 2\n      average: micro\n      threshold_kwargs: &threshold_05\n        operator: greater\n        threshold: 0.5\n        th_on_preds: True\n        th_on_target: True\n    - name: precision > 0.5\n      metric: precision\n      multitask_handling: mean-per-label\n      average: 
micro\n      threshold_kwargs: *threshold_05\n  zinc: *qm9_metrics\n\ntrainer:\n  seed: *seed\n  logger:\n    save_dir: logs/neurips2023-small/\n    name: ${constants.name}\n    project: ${constants.name}\n  #early_stopping:\n  #  monitor: *monitor\n  #  min_delta: 0\n  #  patience: 10\n  #  mode: &mode min\n  model_checkpoint:\n    dirpath: models_checkpoints/${constants.name}/\n    filename: ${constants.name}\n    # monitor: *monitor\n    # mode: *mode\n    # save_top_k: 1\n    save_last: True\n  trainer:\n    max_epochs: *max_epochs\n    min_epochs: 1\n    check_val_every_n_epoch: 20\n"
  },
  {
    "path": "expts/neurips2023_configs/baseline/config_small_gcn_baseline.yaml",
    "content": "# Testing the gcn model with the PCQMv2 dataset on IPU.\nconstants:\n  name: &name neurips2023_small_data_gcn\n  seed: &seed 3000\n  raise_train_error: true   # Whether the code should raise an error if it crashes during training\n\naccelerator:\n  type: ipu  # cpu or ipu or gpu\n  config_override:\n    datamodule:\n      args:\n        ipu_dataloader_training_opts:\n          mode: async\n          max_num_nodes_per_graph: 44 # train max nodes: 20, max_edges: 54\n          max_num_edges_per_graph: 80\n        ipu_dataloader_inference_opts:\n          mode: async\n          max_num_nodes_per_graph: 100 # valid max nodes: 51, max_edges: 118\n          max_num_edges_per_graph: 200\n        # Data handling-related\n        batch_size_training: 5\n        batch_size_inference: 2\n    predictor:\n      optim_kwargs:\n        loss_scaling: 1024\n    trainer:\n      trainer:\n        precision: 16\n        accumulate_grad_batches: 4\n\n  ipu_config:\n    - deviceIterations(5) # IPU would require large batches to be ready for the model.\n    - replicationFactor(16)\n    # - enableProfiling(\"graph_analyser\")       # The folder where the profile will be stored\n    # - enableExecutableCaching(\"pop_compiler_cache\")\n    - TensorLocations.numIOTiles(128)\n    - _Popart.set(\"defaultBufferingDepth\", 128)\n    - Precision.enableStochasticRounding(True)\n\n# accelerator:\n#   type: cpu  # cpu or ipu or gpu\n#   config_override:\n#     datamodule:\n#       batch_size_training: 64\n#       batch_size_inference: 256\n#     trainer:\n#       trainer:\n#         precision: 32\n#         accumulate_grad_batches: 1\n\ndatamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  # module_type: \"FakeDataModule\"  # Option to use generated data\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      qm9:\n        df: null\n        df_path: 
data/neurips2023/small-dataset/qm9.csv.gz\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Small-dataset/qm9.csv.gz\n        # or set path as the URL directly\n        smiles_col: \"smiles\"\n        label_cols: [\"A\", \"B\", \"C\", \"mu\", \"alpha\", \"homo\", \"lumo\", \"gap\", \"r2\", \"zpve\", \"u0\", \"u298\", \"h298\", \"g298\", \"cv\", \"u0_atom\", \"u298_atom\", \"h298_atom\", \"g298_atom\"]\n        # sample_size: 2000 # use sample_size for test\n        splits_path: data/neurips2023/small-dataset/qm9_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Small-dataset/qm9_random_splits.pt`\n        seed: *seed\n        task_level: graph\n        label_normalization:\n          normalize_val_test: True\n          method: \"normal\"\n\n      tox21:\n        df: null\n        df_path: data/neurips2023/small-dataset/Tox21-7k-12-labels.csv.gz\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Small-dataset/Tox21-7k-12-labels.csv.gz\n        # or set path as the URL directly\n        smiles_col: \"smiles\"\n        label_cols: [\"NR-AR\", \"NR-AR-LBD\", \"NR-AhR\", \"NR-Aromatase\", \"NR-ER\", \"NR-ER-LBD\", \"NR-PPAR-gamma\", \"SR-ARE\", \"SR-ATAD5\", \"SR-HSE\", \"SR-MMP\", \"SR-p53\"]\n        # sample_size: 2000 # use sample_size for test\n        splits_path: data/neurips2023/small-dataset/Tox21_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Small-dataset/Tox21_random_splits.pt`\n        seed: *seed\n        task_level: graph\n\n      zinc:\n        df: null\n        df_path: data/neurips2023/small-dataset/ZINC12k.csv.gz\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Small-dataset/ZINC12k.csv.gz\n        # or set path as the URL directly\n        smiles_col: \"smiles\"\n        label_cols: [\"SA\", \"logp\", \"score\"]\n        # sample_size: 2000 # use sample_size 
for test\n        splits_path: data/neurips2023/small-dataset/ZINC12k_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Small-dataset/ZINC12k_random_splits.pt`\n        seed: *seed\n        task_level: graph\n        label_normalization:\n          normalize_val_test: True\n          method: \"normal\"\n\n    # Featurization\n    prepare_dict_or_graph: pyg:graph\n    featurization_n_jobs: 30\n    featurization_progress: True\n    featurization_backend: \"loky\"\n    processed_graph_data_path: \"../datacache/neurips2023-small/\"\n    featurization:\n    # OGB: ['atomic_num', 'degree', 'possible_formal_charge', 'possible_numH' (total-valence),\n    # 'possible_number_radical_e', 'possible_is_aromatic', 'possible_is_in_ring',\n    # 'num_chiral_centers (not included yet)']\n      atom_property_list_onehot: [atomic-number, group, period, total-valence]\n      atom_property_list_float: [degree, formal-charge, radical-electron, aromatic, in-ring]\n      # OGB: ['possible_bond_type', 'possible_bond_stereo', 'possible_is_in_ring']\n      edge_property_list: [bond-type-onehot, stereo, in-ring]\n      add_self_loop: False\n      explicit_H: False # whether to include explicit hydrogens\n      use_bonds_weights: False\n      pos_encoding_as_features: # encoder dropout 0.18\n        pos_types:\n          lap_eigvec:\n            pos_level: node\n            pos_type: laplacian_eigvec\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # whether to include eigenvalues/eigenvectors for disconnected graphs\n          lap_eigval:\n            pos_level: node\n            pos_type: laplacian_eigval\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # whether to include eigenvalues/eigenvectors for disconnected graphs\n          rw_pos: # use same name as pe_encoder\n            pos_level: node\n            pos_type: rw_return_probs\n            ksteps: 16\n\n    num_workers: 30 # -1 to use all\n    persistent_workers: False # whether to use persistent dataloader workers across epochs.\n    # Setting persistent_workers to False might make the start of each epoch very long.\n\n\narchitecture:\n  model_type: FullGraphMultiTaskNetwork\n  mup_base_path: null\n  pre_nn:   # Set as null to avoid a pre-nn network\n    out_dim: 64\n    hidden_dims: 256\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.1\n    normalization: &normalization layer_norm\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges: null   # Set as null to avoid a pre-nn network\n\n  pe_encoders:\n    out_dim: 32\n    pool: \"sum\" #\"mean\" \"max\"\n    last_norm: None #\"batch_norm\", \"layer_norm\"\n    encoders: #la_pos |  rw_pos\n      la_pos:  # Set as null to avoid a pre-nn network\n        encoder_type: \"laplacian_pe\"\n        input_keys: [\"laplacian_eigvec\", \"laplacian_eigval\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        model_type: 'DeepSet' #'Transformer' or 'DeepSet'\n        num_layers: 2\n        num_layers_post: 1 # Num. 
layers to apply after pooling\n        dropout: 0.1\n        first_normalization: \"none\" #\"batch_norm\" or \"layer_norm\"\n      rw_pos:\n        encoder_type: \"mlp\"\n        input_keys: [\"rw_return_probs\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        num_layers: 2\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n        first_normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n\n\n\n  gnn:  # Set as null to avoid a post-nn network\n    in_dim: 64 # or otherwise the correct value\n    out_dim: &gnn_dim 128\n    hidden_dims: *gnn_dim\n    depth: 4\n    activation: gelu\n    last_activation: none\n    dropout: 0.1\n    normalization: \"layer_norm\"\n    last_normalization: *normalization\n    residual_type: simple\n    virtual_node: 'none'\n    layer_type: 'pyg:gcn' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n    layer_kwargs: null # Parameters for the model itself. You could define dropout_attn: 0.1\n\n\n  graph_output_nn:\n    graph:\n      pooling: [sum]\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n  task_heads:\n    qm9:\n      task_level: graph\n      out_dim: 19\n      hidden_dims: 256\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n    tox21:\n      task_level: graph\n      out_dim: 12\n      hidden_dims: 64\n      depth: 2\n      activation: relu\n      last_activation: sigmoid\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n    zinc:\n      task_level: graph\n      out_dim: 3\n      hidden_dims: 32\n      depth: 2\n  
    activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n#Task-specific\npredictor:\n  metrics_on_progress_bar:\n    qm9: [\"mae\"]\n    tox21: [\"auroc\"]\n    zinc: [\"mae\"]\n  loss_fun:\n    qm9: mae_ipu\n    tox21: bce_ipu\n    zinc: mae_ipu\n  random_seed: *seed\n  optim_kwargs:\n    lr: 1.e-4 # warmup can be scheduled using torch_scheduler_kwargs\n    # weight_decay: 1.e-7\n  torch_scheduler_kwargs:\n    module_type: WarmUpLinearLR\n    max_num_epochs: &max_epochs 300\n    warmup_epochs: 10\n    verbose: False\n  scheduler_kwargs:\n  #  monitor: &monitor qm9/mae/train\n  #  mode: min\n  #  frequency: 1\n  target_nan_mask: null # null: no mask, 0: 0 mask, ignore-flatten, ignore-mean-per-label\n  multitask_handling: flatten # flatten, mean-per-label\n\n# Task-specific\nmetrics:\n  qm9: &qm9_metrics\n    - name: mae\n      metric: mae_ipu\n      target_nan_mask: null\n      multitask_handling: flatten\n      threshold_kwargs: null\n    - name: pearsonr\n      metric: pearsonr_ipu\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n    - name: r2_score\n      metric: r2_score_ipu\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n  tox21:\n    - name: auroc\n      metric: auroc_ipu\n      task: binary\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n    - name: avpr\n      metric: average_precision_ipu\n      task: binary\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n    - name: f1 > 0.5\n      metric: f1\n      multitask_handling: flatten\n      target_to_int: True\n      num_classes: 2\n      average: micro\n      threshold_kwargs: &threshold_05\n        operator: greater\n        threshold: 0.5\n        th_on_preds: True\n        th_on_target: True\n    - name: precision > 0.5\n      metric: 
precision\n      multitask_handling: flatten\n      average: micro\n      threshold_kwargs: *threshold_05\n  zinc: *qm9_metrics\n\ntrainer:\n  seed: *seed\n  logger:\n    save_dir: logs/neurips2023-small/\n    name: *name\n    project: *name\n  #early_stopping:\n  #  monitor: *monitor\n  #  min_delta: 0\n  #  patience: 10\n  #  mode: &mode min\n  model_checkpoint:\n    dirpath: models_checkpoints/neurips2023-small-gcn/\n    filename: *name\n    #monitor: *monitor\n    #mode: *mode\n    save_top_k: 1\n    every_n_epochs: 100\n  trainer:\n    max_epochs: *max_epochs\n    min_epochs: 1\n    check_val_every_n_epoch: 20\n"
  },
  {
    "path": "expts/neurips2023_configs/baseline/config_small_gin_baseline.yaml",
    "content": "# Testing the gin model with the PCQMv2 dataset on IPU.\nconstants:\n  name: &name neurips2023_small_data_gin\n  config_override: \"expts/neurips2023_configs/baseline/config_small_gcn_baseline.yaml\"\n  seed: &seed 1000\n\narchitecture:\n  gnn:  # Set as null to avoid a post-nn network\n    in_dim: 64 # or otherwise the correct value\n    out_dim: &gnn_dim 96\n    hidden_dims: *gnn_dim\n    layer_type: 'pyg:gin' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n\ntrainer:\n  seed: *seed\n  logger:\n    name: *name\n    project: *name\n  model_checkpoint:\n    dirpath: models_checkpoints/neurips2023-small-gin/\n    filename: *name\n"
  },
  {
    "path": "expts/neurips2023_configs/baseline/config_small_gine_baseline.yaml",
    "content": "# Testing the gine model with the PCQMv2 dataset on IPU.\nconstants:\n  name: &name neurips2023_small_data_gine\n  config_override: \"expts/neurips2023_configs/baseline/config_small_gcn_baseline.yaml\"\n  seed: &seed 1000\n\narchitecture:\n  pre_nn_edges:   # Set as null to avoid a pre-nn network\n    out_dim: 32\n    hidden_dims: 128\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: 0.1\n    normalization: &normalization layer_norm\n    last_normalization: *normalization\n    residual_type: none\n\n  gnn:  # Set as null to avoid a post-nn network\n    out_dim: &gnn_dim 96\n    hidden_dims: *gnn_dim\n    layer_type: 'pyg:gine' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n\ntrainer:\n  seed: *seed\n  logger:\n    name: *name\n    project: *name\n  model_checkpoint:\n    dirpath: models_checkpoints/neurips2023-small-gine/\n    filename: *name\n"
  },
  {
    "path": "expts/neurips2023_configs/config_classifigression_l1000.yaml",
    "content": "# Testing the mpnn only model with the PCQMv2 dataset on IPU.\nconstants:\n  name: &name neurips2023_small_data_mpnn\n  seed: &seed 42\n  raise_train_error: true   # Whether the code should raise an error if it crashes during training\n\n#accelerator:\n#  type: ipu  # cpu or ipu or gpu\n#  config_override:\n#    datamodule:\n#      args:\n#        ipu_dataloader_training_opts:\n#          mode: async\n#          max_num_nodes_per_graph: 24 # train max nodes: 20, max_edges: 54\n#          max_num_edges_per_graph: 60\n#        ipu_dataloader_inference_opts:\n#          mode: async\n#          max_num_nodes_per_graph: 24 # valid max nodes: 51, max_edges: 118\n#          max_num_edges_per_graph: 60\n#        # Data handling-related\n#        batch_size_training: 50\n#        batch_size_inference: 50\n##    predictor:\n##      optim_kwargs:\n##        loss_scaling: 1024\n#    trainer:\n#      trainer:\n#        precision: 16\n#        accumulate_grad_batches: 4\n#\n#  ipu_config:\n#    - deviceIterations(20) # IPU would require large batches to be ready for the model.\n#    - replicationFactor(16)\n#    # - enableProfiling(\"graph_analyser\")       # The folder where the profile will be stored\n#    # - enableExecutableCaching(\"pop_compiler_cache\")\n#    - TensorLocations.numIOTiles(128)\n#    - _Popart.set(\"defaultBufferingDepth\", 128)\n#    - Precision.enableStochasticRounding(True)\n\naccelerator:\n type: gpu  # cpu or ipu or gpu\n config_override:\n   datamodule:\n     batch_size_training: 64\n     batch_size_inference: 256\n   trainer:\n     trainer:\n       precision: 32\n       accumulate_grad_batches: 1\n\ndatamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  # module_type: \"FakeDataModule\"  # Option to use generated data\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      l1000_vcap:\n        df: null\n        df_path: 
graphium/data/neurips2023/large-dataset/LINCS_L1000_VCAP_0-4.csv.gz\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/LINCS_L1000_VCAP_0-4.csv.gz\n        # or set path as the URL directly\n        smiles_col: \"SMILES\"\n        label_cols: geneID-*  # geneID-* means all columns starting with \"geneID-\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: graph\n        splits_path: graphium/data/neurips2023/small-dataset/l1000_vcap_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/l1000_vcap_random_splits.pt`\n\n      l1000_mcf7:\n        df: null\n        df_path: graphium/data/neurips2023/large-dataset/LINCS_L1000_MCF7_0-4.csv.gz\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/LINCS_L1000_MCF7_0-4.csv.gz\n        # or set path as the URL directly\n        smiles_col: \"SMILES\"\n        label_cols: geneID-*  # geneID-* means all columns starting with \"geneID-\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: graph\n        splits_path: graphium/data/neurips2023/small-dataset/l1000_mcf7_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/l1000_mcf7_random_splits.pt`\n\n    # Featurization\n    prepare_dict_or_graph: pyg:graph\n    featurization_n_jobs: 1\n    featurization_progress: True\n    featurization_backend: \"loky\"\n    processed_graph_data_path: \"../datacache/neurips2023-small/\"\n    featurization:\n    # OGB: ['atomic_num', 'degree', 'possible_formal_charge', 'possible_numH' (total-valence),\n    # 'possible_number_radical_e', 'possible_is_aromatic', 'possible_is_in_ring',\n    # 'num_chiral_centers (not included yet)']\n      atom_property_list_onehot: [atomic-number, group, period, total-valence]\n      atom_property_list_float: [degree, formal-charge, radical-electron, 
aromatic, in-ring]\n      # OGB: ['possible_bond_type', 'possible_bond_stereo', 'possible_is_in_ring']\n      edge_property_list: [bond-type-onehot, stereo, in-ring]\n      add_self_loop: False\n      explicit_H: False # whether to include explicit hydrogens\n      use_bonds_weights: False\n      pos_encoding_as_features: # encoder dropout 0.18\n        pos_types:\n          lap_eigvec:\n            pos_level: node\n            pos_type: laplacian_eigvec\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # whether to include eigenvalues/eigenvectors for disconnected graphs\n          lap_eigval:\n            pos_level: node\n            pos_type: laplacian_eigval\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # whether to include eigenvalues/eigenvectors for disconnected graphs\n          rw_pos: # use same name as pe_encoder\n            pos_level: node\n            pos_type: rw_return_probs\n            ksteps: 16\n\n    num_workers: 5 # -1 to use all\n    persistent_workers: False # whether to use persistent dataloader workers across epochs.\n    # Setting persistent_workers to False might make the start of each epoch very long.\n\n\narchitecture:\n  model_type: FullGraphMultiTaskNetwork\n  mup_base_path: null\n  pre_nn:   # Set as null to avoid a pre-nn network\n    out_dim: 64\n    hidden_dims: 256\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.18\n    normalization: &normalization layer_norm\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges:   # Set as null to avoid a pre-nn network\n    out_dim: 32\n    hidden_dims: 128\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: *dropout\n    normalization: *normalization\n    last_normalization: *normalization\n    residual_type: none\n\n  pe_encoders:\n    out_dim: 32\n    pool: \"sum\" #\"mean\" \"max\"\n    last_norm: None #\"batch_norm\", \"layer_norm\"\n    encoders: #la_pos |  rw_pos\n      la_pos:  # Set as null to avoid a pre-nn network\n        encoder_type: \"laplacian_pe\"\n        input_keys: [\"laplacian_eigvec\", \"laplacian_eigval\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        model_type: 'DeepSet' #'Transformer' or 'DeepSet'\n        num_layers: 2\n        num_layers_post: 1 # Num. layers to apply after pooling\n        dropout: 0.1\n        first_normalization: \"none\" #\"batch_norm\" or \"layer_norm\"\n      rw_pos:\n        encoder_type: \"mlp\"\n        input_keys: [\"rw_return_probs\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        num_layers: 2\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n        first_normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n\n\n\n  gnn:  # Set as null to avoid a post-nn network\n    out_dim: &gnn_dim 64\n    hidden_dims: *gnn_dim\n    depth: 4\n    activation: gelu\n    last_activation: none\n    dropout: 0.1\n    normalization: \"layer_norm\"\n    last_normalization: *normalization\n    residual_type: simple\n    virtual_node: 'none'\n    layer_type: 'pyg:gps' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n    layer_kwargs:  # Parameters for the model itself. 
You could define dropout_attn: 0.1\n      node_residual: false\n      mpnn_type: 'pyg:mpnnplus'\n      mpnn_kwargs:\n        in_dim: *gnn_dim\n        out_dim: *gnn_dim\n        in_dim_edges: 32\n        out_dim_edges: 32\n      attn_type: \"none\" # \"full-attention\", \"none\"\n      attn_kwargs: null\n\n\n  graph_output_nn:\n    graph:\n      pooling: [sum]\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n  task_heads:\n    l1000_vcap:\n      task_level: graph\n      out_dim: 4890  # n_columns (978) x n_brackets (5)\n      hidden_dims: 128\n      depth: 2\n      activation: none\n      last_activation: sigmoid\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n    l1000_mcf7:\n      task_level: graph\n      out_dim: 4890  # n_columns (978) x n_brackets (5)\n      hidden_dims: 128\n      depth: 2\n      activation: none\n      last_activation: sigmoid\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n#Task-specific\npredictor:\n  metrics_on_progress_bar:\n    l1000_vcap: []\n    l1000_mcf7: []\n  loss_fun:\n    l1000_vcap:\n      name: hybrid_ce\n      n_brackets: 5\n    l1000_mcf7:\n      name: hybrid_ce\n      n_brackets: 5\n  random_seed: *seed\n  optim_kwargs:\n    lr: 4.e-3 # warmup can be scheduled using torch_scheduler_kwargs\n    # weight_decay: 1.e-7\n  torch_scheduler_kwargs:\n    module_type: WarmUpLinearLR\n    max_num_epochs: &max_epochs 100\n    warmup_epochs: 0\n    verbose: False\n  scheduler_kwargs:\n  #  monitor: &monitor qm9/mae/train\n  #  mode: min\n  #  frequency: 1\n  target_nan_mask: ignore\n  multitask_handling: flatten\n\n# Task-specific\nmetrics:\n  l1000_vcap: &classif_metrics\n    - 
name: auc\n      metric: auroc\n      target_nan_mask: ignore\n      multitask_handling: flatten\n      squeeze_targets: True\n      target_to_int: True\n      task: multiclass\n      num_classes: 5\n    - name: ap\n      metric: averageprecision\n      target_nan_mask: ignore\n      multitask_handling: flatten\n      squeeze_targets: True\n      target_to_int: True\n      task: multiclass\n      num_classes: 5\n    - name: accuracy\n      metric: accuracy\n      target_nan_mask: ignore\n      multitask_handling: flatten\n      squeeze_targets: True\n      target_to_int: True\n      task: multiclass\n      num_classes: 5\n      top_k: 1\n  l1000_mcf7: *classif_metrics\n\ntrainer:\n  seed: *seed\n  logger:\n    save_dir: logs/neurips2023-small/\n    name: *name\n    project: *name\n  #early_stopping:\n  #  monitor: *monitor\n  #  min_delta: 0\n  #  patience: 10\n  #  mode: &mode min\n  model_checkpoint:\n    dirpath: models_checkpoints/neurips2023-small/\n    filename: *name\n    #monitor: *monitor\n    #mode: *mode\n    save_top_k: 1\n    every_n_epochs: 100\n  trainer:\n    max_epochs: *max_epochs\n    min_epochs: 1\n    check_val_every_n_epoch: 1\n"
  },
  {
    "path": "expts/neurips2023_configs/config_large_gcn.yaml",
    "content": "# Running the gcn model with the largemix dataset on IPU.\n\ndefaults:\n  - base_config: large\n  - _self_\n\nconstants:\n  name: neurips2023_large_data_gcn\n  wandb:\n    name: ${constants.name}\n    project: neurips2023_large_graphcore\n    entity: multitask-gnn\n\narchitecture:\n  gnn:  # Set as null to avoid a post-nn network\n    layer_type: 'pyg:gcn' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n"
  },
  {
    "path": "expts/neurips2023_configs/config_large_gcn_g25.yaml",
    "content": "# Running the gcn model with the largemix dataset on IPU.\n\ndefaults:\n  # - base_config: large\n  - base_config: large_pcqm_g25\n  # - base_config: large_pcqm_n4\n  - _self_\n\nconstants:\n  name: neurips2023_large_data_gcn\n  wandb:\n    name: ${constants.name}\n    project: neurips2023_large_graphcore\n    entity: multitask-gnn\n\narchitecture:\n  gnn:  # Set as null to avoid a post-nn network\n    layer_type: 'pyg:gcn' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n"
  },
  {
    "path": "expts/neurips2023_configs/config_large_gcn_gpu.yaml",
    "content": "# Testing GCN on LargeMix with FP16/32 on GPU\n\ndefaults:\n  - base_config: large\n  - _self_\n\nconstants:\n  name: neurips2023_large_data_gcn_gpu\n\narchitecture:\n  gnn:  # Set as null to avoid a post-nn network\n    layer_type: 'pyg:gcn' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n\naccelerator:\n  type: gpu\n  float32_matmul_precision: medium\n  config_override:\n    datamodule:\n      args:\n        batch_size_training: 80\n        batch_size_inference: 80\n    predictor:\n      metrics_every_n_train_steps: 1000\n    trainer:\n      trainer:\n        precision: 32 # 16-mixed\n        accumulate_grad_batches: 1\n\ndatamodule:\n  args:\n    task_specific_args:\n      l1000_vcap:\n        df_path: expts/data/neurips2023/large-dataset/LINCS_L1000_VCAP_0-4.csv.gz # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/LINCS_L1000_VCAP_0-4.csv.gz\n        splits_path: expts/data/neurips2023/large-dataset/l1000_vcap_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/l1000_vcap_random_splits.pt`\n\n      l1000_mcf7:\n        df_path: expts/data/neurips2023/large-dataset/LINCS_L1000_MCF7_0-4.csv.gz # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/LINCS_L1000_MCF7_0-4.csv.gz\n        splits_path: expts/data/neurips2023/large-dataset/l1000_mcf7_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/l1000_mcf7_random_splits.pt`\n\n      pcba_1328:\n        df_path: expts/data/neurips2023/large-dataset/PCBA_1328_1564k.parquet # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/PCBA_1328_1564k.parquet\n        splits_path: expts/data/neurips2023/large-dataset/pcba_1328_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/pcba_1328_random_splits.pt`\n\n      
pcqm4m_g25:\n        df_path: expts/data/neurips2023/large-dataset/PCQM4M_G25_N4.parquet # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/PCQM4M_G25_N4.parquet\n        splits_path: expts/data/neurips2023/large-dataset/pcqm4m_g25_n4_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/pcqm4m_g25_n4_random_splits.pt`\n\n      pcqm4m_n4:\n        df_path: expts/data/neurips2023/large-dataset/PCQM4M_G25_N4.parquet # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/PCQM4M_G25_N4.parquet\n        splits_path: expts/data/neurips2023/large-dataset/pcqm4m_g25_n4_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/pcqm4m_g25_n4_random_splits.pt`\n\n    featurization_n_jobs: 4 # 30\n    processed_graph_data_path: \"../datacache/neurips2023-small/\"\n    num_workers: 4 # 30\n\narchitecture:\n  task_heads:\n    pcba_1328:\n      last_activation: null\n\npredictor:\n  loss_fun:\n    pcba_1328: bce_logits_ipu\n\n  torch_scheduler_kwargs:\n    max_num_epochs: &max_epochs 20\n\ntrainer:\n  trainer:\n    max_epochs: *max_epochs\n    check_val_every_n_epoch: 1\n"
  },
  {
    "path": "expts/neurips2023_configs/config_large_gcn_n4.yaml",
    "content": "# Running the gcn model with the largemix dataset on IPU.\n\ndefaults:\n  # - base_config: large\n  # - base_config: large_pcqm_g25\n  - base_config: large_pcqm_n4\n  - _self_\n\nconstants:\n  name: neurips2023_large_data_gcn\n  wandb:\n    name: ${constants.name}\n    project: neurips2023_large_graphcore\n    entity: multitask-gnn\n\narchitecture:\n  gnn:  # Set as null to avoid a post-nn network\n    layer_type: 'pyg:gcn' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n"
  },
  {
    "path": "expts/neurips2023_configs/config_large_gcn_pcba.yaml",
    "content": "# Running the gcn model with the largemix dataset on IPU.\n\ndefaults:\n  # - base_config: large\n  # - base_config: large_pcqm_g25\n  # - base_config: large_pcqm_n4\n  - base_config: large_pcba\n  - _self_\n\nconstants:\n  name: neurips2023_large_data_gcn\n  wandb:\n    name: ${constants.name}\n    project: neurips2023_large_graphcore\n    entity: multitask-gnn\n\narchitecture:\n  gnn:  # Set as null to avoid a post-nn network\n    layer_type: 'pyg:gcn' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n"
  },
  {
    "path": "expts/neurips2023_configs/config_large_gin.yaml",
    "content": "# Running the gin model with the largemix dataset on IPU.\ndefaults:\n  - base_config: large\n  - _self_\n\nconstants:\n  name: neurips2023_large_data_gin\n  wandb:\n    name: ${constants.name}\n    project: neurips2023_large_graphcore\n    entity: multitask-gnn\n\narchitecture:\n  gnn:  # Set as null to avoid a post-nn network\n    layer_type: 'pyg:gin'"
  },
  {
    "path": "expts/neurips2023_configs/config_large_gin_g25.yaml",
    "content": "# Running the gin model with the largemix dataset on IPU.\ndefaults:\n  # - base_config: large\n  - base_config: large_pcqm_g25\n  # - base_config: large_pcqm_n4\n  - _self_\n\nconstants:\n  name: neurips2023_large_data_gin\n  wandb:\n    name: ${constants.name}\n    project: neurips2023_large_graphcore\n    entity: multitask-gnn\n\narchitecture:\n  gnn:  # Set as null to avoid a post-nn network\n    layer_type: 'pyg:gin'\n"
  },
  {
    "path": "expts/neurips2023_configs/config_large_gin_n4.yaml",
    "content": "# Running the gin model with the largemix dataset on IPU.\ndefaults:\n  # - base_config: large\n  # - base_config: large_pcqm_g25\n  - base_config: large_pcqm_n4\n  - _self_\n\nconstants:\n  name: neurips2023_large_data_gin\n  wandb:\n    name: ${constants.name}\n    project: neurips2023_large_graphcore\n    entity: multitask-gnn\n\narchitecture:\n  gnn:  # Set as null to avoid a post-nn network\n    layer_type: 'pyg:gin'\n"
  },
  {
    "path": "expts/neurips2023_configs/config_large_gin_pcba.yaml",
    "content": "# Running the gin model with the largemix dataset on IPU.\ndefaults:\n  # - base_config: large\n  # - base_config: large_pcqm_g25\n  # - base_config: large_pcqm_n4\n  - base_config: large_pcba\n  - _self_\n\nconstants:\n  name: neurips2023_large_data_gin\n  wandb:\n    name: ${constants.name}\n    project: neurips2023_large_graphcore\n    entity: multitask-gnn\n\narchitecture:\n  gnn:  # Set as null to avoid a post-nn network\n    layer_type: 'pyg:gin'\n"
  },
  {
    "path": "expts/neurips2023_configs/config_large_gine.yaml",
    "content": "# Running the gine model with the largemix dataset on IPU.\n\ndefaults:\n  - base_config: large\n  # - base_config: large_pcqm_g25\n  # - base_config: large_pcqm_n4\n  - _self_\n\nconstants:\n  name: neurips2023_large_data_gine\n  wandb:\n    name: ${constants.name}\n    project: neurips2023_large_graphcore\n    entity: multitask-gnn\n\narchitecture:\n  pre_nn_edges:   # Set as null to avoid a pre-nn network\n    out_dim: 32\n    hidden_dims: 128\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: ${architecture.pre_nn.dropout}\n    normalization: ${architecture.pre_nn.normalization}\n    last_normalization: ${architecture.pre_nn.normalization}\n    residual_type: none\n\n  gnn:\n    out_dim: &gnn_dim 704\n    hidden_dims: *gnn_dim\n    layer_type: 'pyg:gine'\n\n  graph_output_nn:\n    graph:\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n    node:\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n"
  },
  {
    "path": "expts/neurips2023_configs/config_large_gine_g25.yaml",
    "content": "# Running the gine model with the largemix dataset on IPU.\n\ndefaults:\n  # - base_config: large\n  - base_config: large_pcqm_g25\n  # - base_config: large_pcqm_n4\n  - _self_\n\nconstants:\n  name: neurips2023_large_data_gine\n  wandb:\n    name: ${constants.name}\n    project: neurips2023_large_graphcore\n    entity: multitask-gnn\n\narchitecture:\n  pre_nn_edges:   # Set as null to avoid a pre-nn network\n    out_dim: 32\n    hidden_dims: 128\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: ${architecture.pre_nn.dropout}\n    normalization: ${architecture.pre_nn.normalization}\n    last_normalization: ${architecture.pre_nn.normalization}\n    residual_type: none\n\n  gnn:\n    out_dim: &gnn_dim 704\n    hidden_dims: *gnn_dim\n    layer_type: 'pyg:gine'\n\n  graph_output_nn:\n    graph:\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n    node:\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n"
  },
  {
    "path": "expts/neurips2023_configs/config_large_gine_n4.yaml",
    "content": "# Running the gine model with the largemix dataset on IPU.\n\ndefaults:\n  # - base_config: large\n  # - base_config: large_pcqm_g25\n  - base_config: large_pcqm_n4\n  - _self_\n\nconstants:\n  name: neurips2023_large_data_gine\n  wandb:\n    name: ${constants.name}\n    project: neurips2023_large_graphcore\n    entity: multitask-gnn\n\narchitecture:\n  pre_nn_edges:   # Set as null to avoid a pre-nn network\n    out_dim: 32\n    hidden_dims: 128\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: ${architecture.pre_nn.dropout}\n    normalization: ${architecture.pre_nn.normalization}\n    last_normalization: ${architecture.pre_nn.normalization}\n    residual_type: none\n\n  gnn:\n    out_dim: &gnn_dim 704\n    hidden_dims: *gnn_dim\n    layer_type: 'pyg:gine'\n\n  graph_output_nn:\n    graph:\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n    node:\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n"
  },
  {
    "path": "expts/neurips2023_configs/config_large_gine_pcba.yaml",
    "content": "# Running the gine model with the largemix dataset on IPU.\n\ndefaults:\n  # - base_config: large\n  # - base_config: large_pcqm_g25\n  # - base_config: large_pcqm_n4\n  - base_config: large_pcba\n  - _self_\n\nconstants:\n  name: neurips2023_large_data_gine\n  wandb:\n    name: ${constants.name}\n    project: neurips2023_large_graphcore\n    entity: multitask-gnn\n\narchitecture:\n  pre_nn_edges:   # Set as null to avoid a pre-nn network\n    out_dim: 32\n    hidden_dims: 128\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: ${architecture.pre_nn.dropout}\n    normalization: ${architecture.pre_nn.normalization}\n    last_normalization: ${architecture.pre_nn.normalization}\n    residual_type: none\n\n  gnn:\n    out_dim: &gnn_dim 704\n    hidden_dims: *gnn_dim\n    layer_type: 'pyg:gine'\n\n  graph_output_nn:\n    graph:\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n    node:\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n"
  },
  {
    "path": "expts/neurips2023_configs/config_large_mpnn.yaml",
    "content": "# Running the mpnn model with the largemix dataset on IPU.\n\ndefaults:\n - base_config: large\n\nconstants:\n  name: neurips2023_large_data_mpnn\n\narchitecture:\n\n  pre_nn:\n    out_dim: 160\n    hidden_dims: 256\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.1\n    normalization: &normalization layer_norm\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges:\n    out_dim: 64\n    hidden_dims: 128\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: 0.18\n    normalization: ${architecture.pre_nn.normalization}\n    last_normalization: ${architecture.pre_nn.normalization}\n    residual_type: none\n\n  gnn:  # Set as null to avoid a post-nn network\n    in_dim: 160 # should be consistent with pre_nn.out_dim\n    out_dim: 256\n    hidden_dims: &gnn_dim 160 # should consistent with pre_nn.out_dim when multi-layer mpnn is used (ffn layer)\n    depth: 4\n    activation: gelu\n    last_activation: none\n    dropout: 0.1\n    normalization: \"layer_norm\"\n    last_normalization: *normalization\n    residual_type: simple\n    virtual_node: 'none'\n    layer_type: 'pyg:gps'\n    layer_kwargs:\n      node_residual: false\n      mpnn_type: 'pyg:mpnnplus'\n      mpnn_kwargs:\n        in_dim: 160 # should consistent with pre_nn.out_dim when multi-layer mpnn is used (node_model layer)\n        out_dim: 160 # should consistent with pre_nn.out_dim when multi-layer mpnn is used (node_model layer)\n        in_dim_edges: 64 # should consistent with pre_nn_edges.out_dim when multi-layer mpnn is used (edge_model layer)\n        out_dim_edges: 64 # should consistent with pre_nn_edges.out_dim when multi-layer mpnn is used (edge_model layer)\n      attn_type: \"none\" # \"full-attention\", \"none\"\n      # biased_attention: false\n      attn_kwargs: null\n"
  },
  {
    "path": "expts/neurips2023_configs/config_luis_jama.yaml",
    "content": "# Testing the mpnn only model with the PCQMv2 dataset on IPU.\nconstants:\n  name: &name neurips2023_small_data_mpnn\n  seed: &seed 42\n  raise_train_error: true   # Whether the code should raise an error if it crashes during training\n\n# accelerator:\n#   type: ipu  # cpu or ipu or gpu\n#   config_override:\n#     datamodule:\n#       args:\n#         ipu_dataloader_training_opts:\n#           mode: async\n#           max_num_nodes_per_graph: 24 # train max nodes: 20, max_edges: 54\n#           max_num_edges_per_graph: 60\n#         ipu_dataloader_inference_opts:\n#           mode: async\n#           max_num_nodes_per_graph: 24 # valid max nodes: 51, max_edges: 118\n#           max_num_edges_per_graph: 60\n#         # Data handling-related\n#         batch_size_training: 50\n#         batch_size_inference: 50\n#     predictor:\n#       optim_kwargs:\n#         loss_scaling: 1024\n#     trainer:\n#       trainer:\n#         precision: 16\n#         accumulate_grad_batches: 4\n\n  # ipu_config:\n  #   - deviceIterations(20) # IPU would require large batches to be ready for the model.\n  #   - replicationFactor(16)\n  #   # - enableProfiling(\"graph_analyser\")       # The folder where the profile will be stored\n  #   # - enableExecutableCaching(\"pop_compiler_cache\")\n  #   - TensorLocations.numIOTiles(128)\n  #   - _Popart.set(\"defaultBufferingDepth\", 128)\n  #   - Precision.enableStochasticRounding(True)\n\naccelerator:\n  type: cpu  # cpu or ipu or gpu\n  config_override:\n    datamodule:\n      batch_size_training: 64\n      batch_size_inference: 256\n    trainer:\n      trainer:\n        precision: 32\n        accumulate_grad_batches: 1\n\ndatamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  # module_type: \"FakeDataModule\"  # Option to use generated data\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n\n      pcqm20k_g13:\n        
df: null\n        df_path: graphium/data/neurips2023/dummy-dataset/PCQM20k_G13_N4.parquet\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Dummy-dataset/PCQM20k_G13_N4.parquet\n        # or set path as the URL directly\n        smiles_col: \"ordered_smiles\"\n        label_cols: graph_*  # graph_* means all columns starting with \"graph_\"\n        sample_size: 2000 # use sample_size for test\n        split_val: 0.1\n        split_test: 0.1\n        task_level: graph\n        label_normalization:\n          method: \"normal\"\n\n      pcqm20k_n4:\n        df: null\n        df_path: graphium/data/neurips2023/dummy-dataset/PCQM20k_G13_N4.parquet\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Dummy-dataset/PCQM20k_G13_N4.parquet\n        # or set path as the URL directly\n        smiles_col: \"ordered_smiles\"\n        label_cols: node_* # node_* means all columns starting with \"node_\"\n        sample_size: 2000 # use sample_size for test\n        split_val: 0.1\n        split_test: 0.1\n        seed: *seed\n        task_level: node\n        label_normalization:\n          method: \"normal\"\n\n    # Featurization\n    prepare_dict_or_graph: pyg:graph\n    featurization_n_jobs: 30\n    featurization_progress: True\n    featurization_backend: \"loky\"\n    processed_graph_data_path: \"../datacache/neurips2023-small/\"\n    featurization:\n    # OGB: ['atomic_num', 'degree', 'possible_formal_charge', 'possible_numH' (total-valence),\n    # 'possible_number_radical_e', 'possible_is_aromatic', 'possible_is_in_ring',\n    # 'num_chiral_centers (not included yet)']\n      atom_property_list_onehot: [atomic-number, group, period, total-valence]\n      atom_property_list_float: [degree, formal-charge, radical-electron, aromatic, in-ring]\n      # OGB: ['possible_bond_type', 'possible_bond_stereo', 'possible_is_in_ring']\n      edge_property_list: [bond-type-onehot, stereo, in-ring]\n      add_self_loop: False\n   
   explicit_H: False # whether explicit hydrogens are included\n      use_bonds_weights: False\n      pos_encoding_as_features: # encoder dropout 0.18\n        pos_types:\n          lap_eigvec:\n            pos_level: node\n            pos_type: laplacian_eigvec\n            num_pos: 8\n            normalization: \"none\" # normalization already applied to the eigenvectors\n            disconnected_comp: True # whether eigenvalues/eigenvectors for disconnected graphs are included\n          lap_eigval:\n            pos_level: node\n            pos_type: laplacian_eigval\n            num_pos: 8\n            normalization: \"none\" # normalization already applied to the eigenvectors\n            disconnected_comp: True # whether eigenvalues/eigenvectors for disconnected graphs are included\n          rw_pos: # use same name as pe_encoder\n            pos_level: node\n            pos_type: rw_return_probs\n            ksteps: 16\n\n    num_workers: 4 # -1 to use all\n    persistent_workers: False # whether to keep dataloader workers alive between epochs.\n    # Setting persistent_workers to False can make the start of each epoch very slow.\n\n\narchitecture:\n  model_type: FullGraphMultiTaskNetwork\n  mup_base_path: null\n  pre_nn:   # Set as null to avoid a pre-nn network\n    out_dim: 64\n    hidden_dims: 256\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.18\n    normalization: &normalization layer_norm\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges:   # Set as null to avoid a pre-nn network\n    out_dim: 32\n    hidden_dims: 128\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: *dropout\n    normalization: *normalization\n    last_normalization: *normalization\n    residual_type: none\n\n  pe_encoders:\n    out_dim: 32\n    pool: \"sum\" #\"mean\" \"max\"\n    last_norm: None #\"batch_norm\", \"layer_norm\"\n    encoders: #la_pos |  rw_pos\n      la_pos:  # Set as null to avoid 
a pre-nn network\n        encoder_type: \"laplacian_pe\"\n        input_keys: [\"laplacian_eigvec\", \"laplacian_eigval\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        model_type: 'DeepSet' #'Transformer' or 'DeepSet'\n        num_layers: 2\n        num_layers_post: 1 # Num. layers to apply after pooling\n        dropout: 0.1\n        first_normalization: \"none\" #\"batch_norm\" or \"layer_norm\"\n      rw_pos:\n        encoder_type: \"mlp\"\n        input_keys: [\"rw_return_probs\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        num_layers: 2\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n        first_normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n\n\n\n  gnn:  # Set as null to avoid a post-nn network\n    out_dim: &gnn_dim 64\n    hidden_dims: *gnn_dim\n    depth: 4\n    activation: gelu\n    last_activation: none\n    dropout: 0.1\n    normalization: \"layer_norm\"\n    last_normalization: *normalization\n    residual_type: simple\n    virtual_node: 'none'\n    layer_type: 'pyg:gps' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n    layer_kwargs:  # Parameters for the model itself. 
You could define dropout_attn: 0.1\n      node_residual: false\n      mpnn_type: 'pyg:mpnnplus'\n      mpnn_kwargs:\n        in_dim: *gnn_dim\n        out_dim: *gnn_dim\n        in_dim_edges: 32\n        out_dim_edges: 32\n      attn_type: \"none\" # \"full-attention\", \"none\"\n      attn_kwargs: null\n\n\n  graph_output_nn:\n    graph:\n      pooling: [sum]\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n    node:\n      pooling: [sum]\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n  task_heads:\n    pcqm20k_g13:\n      task_level: graph\n      out_dim: 13\n      hidden_dims: 32\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n    pcqm20k_n4:\n      task_level: node\n      out_dim: 4\n      hidden_dims: 32\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n#Task-specific\npredictor:\n  metrics_on_progress_bar:\n    pcqm20k_g13: []\n    pcqm20k_n4: []\n  loss_fun:\n    pcqm20k_g13: mae_ipu\n    pcqm20k_n4: mae_ipu\n  random_seed: *seed\n  optim_kwargs:\n    lr: 4.e-5 # warmup can be scheduled using torch_scheduler_kwargs\n    #weight_decay: 1.e-7\n  torch_scheduler_kwargs:\n    module_type: WarmUpLinearLR\n    max_num_epochs: &max_epochs 100\n    warmup_epochs: 10\n    verbose: False\n  scheduler_kwargs:\n  #  monitor: &monitor qm9/mae/train\n  #  mode: min\n  #  frequency: 1\n  
target_nan_mask: ignore # null: no mask, 0: 0 mask, ignore-flatten, ignore-mean-per-label\n  multitask_handling: flatten # flatten, mean-per-label\n\n# Task-specific\nmetrics:\n  pcqm20k_g13: &pcqm_metrics\n    - name: mae\n      metric: mae_ipu\n      target_nan_mask: null\n      multitask_handling: flatten\n      threshold_kwargs: null\n    - name: pearsonr\n      metric: pearsonr_ipu\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n  pcqm20k_n4: *pcqm_metrics\n\ntrainer:\n  seed: *seed\n  logger:\n    save_dir: logs/neurips2023-small/\n    name: *name\n    project: *name\n  #early_stopping:\n  #  monitor: *monitor\n  #  min_delta: 0\n  #  patience: 10\n  #  mode: &mode min\n  model_checkpoint:\n    dirpath: models_checkpoints/neurips2023-small/\n    filename: *name\n    #monitor: *monitor\n    #mode: *mode\n    save_top_k: 1\n    every_n_epochs: 100\n  trainer:\n    max_epochs: *max_epochs\n    min_epochs: 1\n    check_val_every_n_epoch: 20\n"
  },
  {
    "path": "expts/neurips2023_configs/config_small_gated_gcn.yaml",
    "content": "# Testing the gated_gcn model with the PCQMv2 dataset on IPU.\n\ndefaults:\n  - base_config: small\n  - _self_\n\nconstants:\n  name: neurips2023_small_data_gated_gcn\n\narchitecture:\n  pre_nn_edges:   # Set as null to avoid a pre-nn network\n    out_dim: 32\n    hidden_dims: 128\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: ${architecture.pre_nn.dropout}\n    normalization: ${architecture.pre_nn.normalization}\n    last_normalization: ${architecture.pre_nn.normalization}\n    residual_type: none\n\n  gnn:  # Set as null to avoid a post-nn network\n    layer_type: 'pyg:gated-gcn' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n"
  },
  {
    "path": "expts/neurips2023_configs/config_small_gcn.yaml",
    "content": "# Testing the gcn model with the toymix dataset on IPU.\n\ndefaults:\n  - base_config: small\n  - _self_\n\nconstants:\n  name: neurips2023_small_data_gcn\n\narchitecture:\n  gnn:  # Set as null to avoid a post-nn network\n    layer_type: 'pyg:gcn' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n"
  },
  {
    "path": "expts/neurips2023_configs/config_small_gcn_gpu.yaml",
    "content": "# Testing GCN on ToyMix with FP16/32 on GPU\n\ndefaults:\n  - base_config: small\n  - _self_\n\nconstants:\n  name: neurips2023_small_data_gcn_gpu\n\narchitecture:\n  gnn:  # Set as null to avoid a post-nn network\n    layer_type: 'pyg:gcn' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n\naccelerator:\n  type: gpu  # cpu or ipu or gpu\n  float32_matmul_precision: medium\n  config_override:\n    datamodule:\n      args:\n        # Data handling-related\n        batch_size_training: 200\n        batch_size_inference: 200\n    predictor:\n      metrics_every_n_train_steps: 300\n    trainer:\n      trainer:\n        precision: 32 # 16-mixed\n        accumulate_grad_batches: 1\n\ndatamodule:\n  args:\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      qm9:\n        df_path: expts/data/neurips2023/small-dataset/qm9.csv.gz\n        splits_path: expts/data/neurips2023/small-dataset/qm9_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Small-dataset/qm9_random_splits.pt`\n\n      tox21:\n        df_path: expts/data/neurips2023/small-dataset/Tox21-7k-12-labels.csv.gz\n        splits_path: expts/data/neurips2023/small-dataset/Tox21_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Small-dataset/Tox21_random_splits.pt`\n\n      zinc:\n        df_path: expts/data/neurips2023/small-dataset/ZINC12k.csv.gz\n        splits_path: expts/data/neurips2023/small-dataset/ZINC12k_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Small-dataset/ZINC12k_random_splits.pt`\n    featurization_n_jobs: 4 # 30\n    processed_graph_data_path: \"../datacache/neurips2023-small/\"\n    num_workers: 4 # 30\n\narchitecture:\n  task_heads:\n    tox21:\n      last_activation: none\n\npredictor:\n  loss_fun:\n    tox21: bce_logits_ipu\n\n  torch_scheduler_kwargs:\n    max_num_epochs: 
&max_epochs 300\n\ntrainer:\n  trainer:\n    max_epochs: *max_epochs\n\n"
  },
  {
    "path": "expts/neurips2023_configs/config_small_gin.yaml",
    "content": "# Testing the gin model with the PCQMv2 dataset on IPU.\n\ndefaults:\n  - base_config: small\n  - _self_\n\nconstants:\n  name: neurips2023_small_data_gin\n\narchitecture:\n  gnn:  # Set as null to avoid a post-nn network\n    layer_type: 'pyg:gin'\n"
  },
  {
    "path": "expts/neurips2023_configs/config_small_gine.yaml",
    "content": "# Testing the gine model with the PCQMv2 dataset on IPU.\n\ndefaults:\n  - base_config: small\n  - _self_\n\nconstants:\n  name: neurips2023_small_data_gine\n\narchitecture:\n  pre_nn_edges:   # Set as null to avoid a pre-nn network\n    out_dim: 32\n    hidden_dims: 128\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: ${architecture.pre_nn.dropout}\n    normalization: ${architecture.pre_nn.normalization}\n    last_normalization: ${architecture.pre_nn.normalization}\n    residual_type: none\n\n  gnn:  # Set as null to avoid a post-nn network\n    layer_type: 'pyg:gine' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps"
  },
  {
    "path": "expts/neurips2023_configs/config_small_mpnn.yaml",
    "content": "# Testing the mpnn only model with the PCQMv2 dataset on IPU.\n\ndefaults:\n - base_config: small\n\nconstants:\n  name: neurips2023_small_data_mpnn\n\narchitecture:\n  pre_nn_edges:   # Set as null to avoid a pre-nn network\n    out_dim: 32\n    hidden_dims: 128\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: ${architecture.pre_nn.dropout}\n    normalization: ${architecture.pre_nn.normalization}\n    last_normalization: ${architecture.pre_nn.normalization}\n    residual_type: none\n\n  gnn:  # Set as null to avoid a post-nn network\n    out_dim: &gnn_dim 64\n    hidden_dims: *gnn_dim\n    layer_type: 'pyg:gps' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n    layer_kwargs:  # Parameters for the model itself. You could define dropout_attn: 0.1\n      mpnn_type: 'pyg:mpnnplus'\n      out_dim_edges: 32\n"
  },
  {
    "path": "expts/neurips2023_configs/single_task_gcn/config_large_gcn_mcf7.yaml",
    "content": "# Testing the gcn model\nconstants:\n  name: &name neurips2023_large_data_gcn_mcf7\n  seed: &seed 42\n  raise_train_error: true   # Whether the code should raise an error if it crashes during training\n\naccelerator:\n  type: ipu  # cpu or ipu or gpu\n  config_override:\n    datamodule:\n      args:\n        ipu_dataloader_training_opts:\n          mode: async\n          max_num_nodes_per_graph: 60 # train max nodes: 20, max_edges: 54\n          max_num_edges_per_graph: 100\n        ipu_dataloader_inference_opts:\n          mode: async\n          max_num_nodes_per_graph: 60 # valid max nodes: 51, max_edges: 118\n          max_num_edges_per_graph: 100\n        # Data handling-related\n        batch_size_training: 10\n        batch_size_inference: 2\n    predictor:\n      optim_kwargs:\n        loss_scaling: 1024\n    trainer:\n      trainer:\n        precision: 32\n        accumulate_grad_batches: 8\n\n  ipu_config:\n    - deviceIterations(1) # IPU would require large batches to be ready for the model.\n    - replicationFactor(16)\n    # - enableProfiling(\"graph_analyser\")       # The folder where the profile will be stored\n    # - enableExecutableCaching(\"pop_compiler_cache\")\n    - TensorLocations.numIOTiles(128)\n    - _Popart.set(\"defaultBufferingDepth\", 128)\n    - Precision.enableStochasticRounding(True)\n\n# accelerator:\n#   type: cpu  # cpu or ipu or gpu\n#   config_override:\n#     datamodule:\n#       batch_size_training: 64\n#       batch_size_inference: 256\n#     trainer:\n#       trainer:\n#         precision: 32\n#         accumulate_grad_batches: 1\n\ndatamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  # module_type: \"FakeDataModule\"  # Option to use generated data\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      l1000_mcf7:\n        df: null\n        df_path: 
graphium/data/neurips2023/large-dataset/LINCS_L1000_MCF7_0-4.csv.gz\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/LINCS_L1000_MCF7_0-4.csv.gz\n        # or set path as the URL directly\n        smiles_col: \"SMILES\"\n        label_cols: geneID-*  # geneID-* means all columns starting with \"geneID-\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: graph\n        splits_path: graphium/data/neurips2023/large-dataset/l1000_mcf7_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/l1000_mcf7_random_splits.pt`\n\n    # Featurization\n    prepare_dict_or_graph: pyg:graph\n    featurization_n_jobs: 30\n    featurization_progress: True\n    featurization_backend: \"loky\"\n    processed_graph_data_path: \"../datacache/neurips2023-large/mcf7/\"\n    featurization:\n    # OGB: ['atomic_num', 'degree', 'possible_formal_charge', 'possible_numH' (total-valence),\n    # 'possible_number_radical_e', 'possible_is_aromatic', 'possible_is_in_ring',\n    # 'num_chiral_centers (not included yet)']\n      atom_property_list_onehot: [atomic-number, group, period, total-valence]\n      atom_property_list_float: [degree, formal-charge, radical-electron, aromatic, in-ring]\n      # OGB: ['possible_bond_type', 'possible_bond_stereo', 'possible_is_in_ring']\n      edge_property_list: [bond-type-onehot, stereo, in-ring]\n      add_self_loop: False\n      explicit_H: False # whether explicit hydrogens are included\n      use_bonds_weights: False\n      pos_encoding_as_features: # encoder dropout 0.18\n        pos_types:\n          lap_eigvec:\n            pos_level: node\n            pos_type: laplacian_eigvec\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # whether eigenvalues/eigenvectors of disconnected graphs are included\n          lap_eigval:\n            pos_level: node\n            pos_type: laplacian_eigval\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # whether eigenvalues/eigenvectors of disconnected graphs are included\n          rw_pos: # use same name as pe_encoder\n            pos_level: node\n            pos_type: rw_return_probs\n            ksteps: 16\n\n    num_workers: 30 # -1 to use all\n    persistent_workers: False # whether to keep dataloader workers alive between epochs.\n    # Setting persistent_workers to False can make the start of each epoch very slow.\n\n\narchitecture:\n  model_type: FullGraphMultiTaskNetwork\n  mup_base_path: null\n  pre_nn:   # Set as null to avoid a pre-nn network\n    out_dim: 64\n    hidden_dims: 256\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.18\n    normalization: &normalization layer_norm\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges: null\n\n  pe_encoders:\n    out_dim: 32\n    pool: \"sum\" #\"mean\" \"max\"\n    last_norm: None #\"batch_norm\", \"layer_norm\"\n    encoders: #la_pos |  rw_pos\n      la_pos:  # Set as null to avoid a pre-nn network\n        encoder_type: \"laplacian_pe\"\n        input_keys: [\"laplacian_eigvec\", \"laplacian_eigval\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        model_type: 'DeepSet' #'Transformer' or 'DeepSet'\n        num_layers: 2\n        num_layers_post: 1 # Num. 
layers to apply after pooling\n        dropout: 0.1\n        first_normalization: \"none\" #\"batch_norm\" or \"layer_norm\"\n      rw_pos:\n        encoder_type: \"mlp\"\n        input_keys: [\"rw_return_probs\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        num_layers: 2\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n        first_normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n\n\n\n  gnn:  # Set as null to avoid a post-nn network\n    in_dim: 64 # or otherwise the correct value\n    out_dim: &gnn_dim 768\n    hidden_dims: *gnn_dim\n    depth: 4\n    activation: gelu\n    last_activation: none\n    dropout: 0.1\n    normalization: \"layer_norm\"\n    last_normalization: *normalization\n    residual_type: simple\n    virtual_node: 'none'\n    layer_type: 'pyg:gcn' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n\n\n  graph_output_nn:\n    graph:\n      pooling: [sum]\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n\n  task_heads:\n    l1000_mcf7:\n      task_level: graph\n      out_dim: 4890\n      hidden_dims: 128\n      depth: 2\n      activation: none\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n\n#Task-specific\npredictor:\n  metrics_on_progress_bar:\n    l1000_mcf7: []\n  metrics_on_training_set:\n    l1000_mcf7: []\n  loss_fun:\n    l1000_mcf7:\n      name: hybrid_ce_ipu\n      n_brackets: 5\n  random_seed: *seed\n  optim_kwargs:\n    lr: 1.e-4 # warmup can be scheduled using torch_scheduler_kwargs\n    # weight_decay: 1.e-7\n  torch_scheduler_kwargs:\n    module_type: WarmUpLinearLR\n    max_num_epochs: &max_epochs 20\n    
warmup_epochs: 10\n    verbose: False\n  scheduler_kwargs:\n  #  monitor: &monitor qm9/mae/train\n  #  mode: min\n  #  frequency: 1\n  target_nan_mask: null # null: no mask, 0: 0 mask, ignore-flatten, ignore-mean-per-label\n  multitask_handling: flatten # flatten, mean-per-label\n\n# Task-specific\nmetrics:\n  l1000_mcf7: &classif_metrics\n    - name: auroc\n      metric: auroc_ipu\n      num_classes: 5\n      task: multiclass\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n    - name: avpr\n      metric: average_precision_ipu\n      num_classes: 5\n      task: multiclass\n      target_to_int: True\n      target_nan_mask: -1000\n      ignore_index: -1000\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n\ntrainer:\n  seed: *seed\n  logger:\n    save_dir: logs/neurips2023-large/\n    name: *name\n    project: *name\n  #early_stopping:\n  #  monitor: *monitor\n  #  min_delta: 0\n  #  patience: 10\n  #  mode: &mode min\n  model_checkpoint:\n    dirpath: models_checkpoints/neurips2023-large-gcn/mcf7/\n    filename: *name\n    # monitor: *monitor\n    # mode: *mode\n    # save_top_k: 1\n    save_last: True\n  trainer:\n    max_epochs: *max_epochs\n    min_epochs: 1\n    check_val_every_n_epoch: 20\n"
  },
  {
    "path": "expts/neurips2023_configs/single_task_gcn/config_large_gcn_pcba.yaml",
    "content": "# Testing the gcn model\nconstants:\n  name: &name neurips2023_large_data_gcn_pcba\n  seed: &seed 42\n  raise_train_error: true   # Whether the code should raise an error if it crashes during training\n\naccelerator:\n  type: ipu  # cpu or ipu or gpu\n  config_override:\n    datamodule:\n      args:\n        ipu_dataloader_training_opts:\n          mode: async\n          max_num_nodes_per_graph: 60 # train max nodes: 20, max_edges: 54\n          max_num_edges_per_graph: 100\n        ipu_dataloader_inference_opts:\n          mode: async\n          max_num_nodes_per_graph: 200 # valid max nodes: 51, max_edges: 118\n          max_num_edges_per_graph: 400\n        # Data handling-related\n        batch_size_training: 10\n        batch_size_inference: 2\n    predictor:\n      optim_kwargs:\n        loss_scaling: 1024\n    trainer:\n      trainer:\n        precision: 32\n        accumulate_grad_batches: 8\n\n  ipu_config:\n    - deviceIterations(10) # IPU would require large batches to be ready for the model.\n    - replicationFactor(16)\n    # - enableProfiling(\"graph_analyser\")       # The folder where the profile will be stored\n    # - enableExecutableCaching(\"pop_compiler_cache\")\n    - TensorLocations.numIOTiles(128)\n    - _Popart.set(\"defaultBufferingDepth\", 128)\n    - Precision.enableStochasticRounding(True)\n\n# accelerator:\n#   type: cpu  # cpu or ipu or gpu\n#   config_override:\n#     datamodule:\n#       batch_size_training: 64\n#       batch_size_inference: 256\n#     trainer:\n#       trainer:\n#         precision: 32\n#         accumulate_grad_batches: 1\n\ndatamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  # module_type: \"FakeDataModule\"  # Option to use generated data\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      pcba_1328:\n        df: null\n        df_path: 
graphium/data/neurips2023/large-dataset/PCBA_1328_1564k.parquet\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/PCBA_1328_1564k.parquet\n        # or set path as the URL directly\n        smiles_col: \"SMILES\"\n        label_cols: assayID-*  # assayID-* means all columns starting with \"assayID-\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: graph\n        splits_path: graphium/data/neurips2023/large-dataset/pcba_1328_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/pcba_1328_random_splits.pt`\n\n    # Featurization\n    prepare_dict_or_graph: pyg:graph\n    featurization_n_jobs: 30\n    featurization_progress: True\n    featurization_backend: \"loky\"\n    processed_graph_data_path: \"../datacache/neurips2023-large/pcba/\"\n    featurization:\n    # OGB: ['atomic_num', 'degree', 'possible_formal_charge', 'possible_numH' (total-valence),\n    # 'possible_number_radical_e', 'possible_is_aromatic', 'possible_is_in_ring',\n    # 'num_chiral_centers (not included yet)']\n      atom_property_list_onehot: [atomic-number, group, period, total-valence]\n      atom_property_list_float: [degree, formal-charge, radical-electron, aromatic, in-ring]\n      # OGB: ['possible_bond_type', 'possible_bond_stereo', 'possible_is_in_ring']\n      edge_property_list: [bond-type-onehot, stereo, in-ring]\n      add_self_loop: False\n      explicit_H: False # whether explicit hydrogens are included\n      use_bonds_weights: False\n      pos_encoding_as_features: # encoder dropout 0.18\n        pos_types:\n          lap_eigvec:\n            pos_level: node\n            pos_type: laplacian_eigvec\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # whether eigenvalues/eigenvectors of disconnected graphs are included\n          lap_eigval:\n            pos_level: node\n            pos_type: laplacian_eigval\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # whether eigenvalues/eigenvectors of disconnected graphs are included\n          rw_pos: # use same name as pe_encoder\n            pos_level: node\n            pos_type: rw_return_probs\n            ksteps: 16\n\n    num_workers: 30 # -1 to use all\n    persistent_workers: False # whether to keep dataloader workers alive between epochs.\n    # Setting persistent_workers to False can make the start of each epoch very slow.\n\n\narchitecture:\n  model_type: FullGraphMultiTaskNetwork\n  mup_base_path: null\n  pre_nn:   # Set as null to avoid a pre-nn network\n    out_dim: 64\n    hidden_dims: 256\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.18\n    normalization: &normalization layer_norm\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges: null\n\n  pe_encoders:\n    out_dim: 32\n    pool: \"sum\" #\"mean\" \"max\"\n    last_norm: None #\"batch_norm\", \"layer_norm\"\n    encoders: #la_pos |  rw_pos\n      la_pos:  # Set as null to avoid a pre-nn network\n        encoder_type: \"laplacian_pe\"\n        input_keys: [\"laplacian_eigvec\", \"laplacian_eigval\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        model_type: 'DeepSet' #'Transformer' or 'DeepSet'\n        num_layers: 2\n        num_layers_post: 1 # Num. 
layers to apply after pooling\n        dropout: 0.1\n        first_normalization: \"none\" #\"batch_norm\" or \"layer_norm\"\n      rw_pos:\n        encoder_type: \"mlp\"\n        input_keys: [\"rw_return_probs\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        num_layers: 2\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n        first_normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n\n\n\n  gnn:  # Set as null to avoid a post-nn network\n    in_dim: 64 # or otherwise the correct value\n    out_dim: &gnn_dim 768\n    hidden_dims: *gnn_dim\n    depth: 4\n    activation: gelu\n    last_activation: none\n    dropout: 0.1\n    normalization: \"layer_norm\"\n    last_normalization: *normalization\n    residual_type: simple\n    virtual_node: 'none'\n    layer_type: 'pyg:gcn' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n\n\n  graph_output_nn:\n    graph:\n      pooling: [sum]\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n  task_heads:\n    pcba_1328:\n      task_level: graph\n      out_dim: 1328\n      hidden_dims: 64\n      depth: 2\n      activation: relu\n      last_activation: sigmoid\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n#Task-specific\npredictor:\n  metrics_on_progress_bar:\n    pcba_1328: []\n  metrics_on_training_set:\n    pcba_1328: []\n  loss_fun:\n    pcba_1328: bce_ipu\n  random_seed: *seed\n  optim_kwargs:\n    lr: 1.e-4 # warmup can be scheduled using torch_scheduler_kwargs\n    # weight_decay: 1.e-7\n  torch_scheduler_kwargs:\n    module_type: WarmUpLinearLR\n    max_num_epochs: &max_epochs 20\n    warmup_epochs: 10\n    verbose: False\n  
scheduler_kwargs:\n  #  monitor: &monitor qm9/mae/train\n  #  mode: min\n  #  frequency: 1\n  target_nan_mask: null # null: no mask, 0: 0 mask, ignore-flatten, ignore-mean-per-label\n  multitask_handling: flatten # flatten, mean-per-label\n\n# Task-specific\nmetrics:\n  pcba_1328:\n    - name: auroc\n      metric: auroc_ipu\n      task: binary\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n    - name: avpr\n      metric: average_precision_ipu\n      task: binary\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n\ntrainer:\n  seed: *seed\n  logger:\n    save_dir: logs/neurips2023-large/\n    name: *name\n    project: *name\n  #early_stopping:\n  #  monitor: *monitor\n  #  min_delta: 0\n  #  patience: 10\n  #  mode: &mode min\n  model_checkpoint:\n    dirpath: models_checkpoints/neurips2023-large-gcn/pcba/\n    filename: *name\n    # monitor: *monitor\n    # mode: *mode\n    # save_top_k: 1\n    save_last: True\n  trainer:\n    max_epochs: *max_epochs\n    min_epochs: 1\n    check_val_every_n_epoch: 20\n"
  },
  {
    "path": "expts/neurips2023_configs/single_task_gcn/config_large_gcn_vcap.yaml",
    "content": "# Testing the gcn model\nconstants:\n  name: &name neurips2023_large_data_gcn_vcap\n  seed: &seed 42\n  raise_train_error: true   # Whether the code should raise an error if it crashes during training\n\naccelerator:\n  type: ipu  # cpu or ipu or gpu\n  config_override:\n    datamodule:\n      args:\n        ipu_dataloader_training_opts:\n          mode: async\n          max_num_nodes_per_graph: 60 # train max nodes: 20, max_edges: 54\n          max_num_edges_per_graph: 100\n        ipu_dataloader_inference_opts:\n          mode: async\n          max_num_nodes_per_graph: 60 # valid max nodes: 51, max_edges: 118\n          max_num_edges_per_graph: 150\n        # Data handling-related\n        batch_size_training: 10\n        batch_size_inference: 2\n    predictor:\n      optim_kwargs:\n        loss_scaling: 1024\n    trainer:\n      trainer:\n        precision: 32\n        accumulate_grad_batches: 8\n\n  ipu_config:\n    - deviceIterations(1) # IPU would require large batches to be ready for the model.\n    - replicationFactor(16)\n    # - enableProfiling(\"graph_analyser\")       # The folder where the profile will be stored\n    # - enableExecutableCaching(\"pop_compiler_cache\")\n    - TensorLocations.numIOTiles(128)\n    - _Popart.set(\"defaultBufferingDepth\", 128)\n    - Precision.enableStochasticRounding(True)\n\n# accelerator:\n#   type: cpu  # cpu or ipu or gpu\n#   config_override:\n#     datamodule:\n#       batch_size_training: 64\n#       batch_size_inference: 256\n#     trainer:\n#       trainer:\n#         precision: 32\n#         accumulate_grad_batches: 1\n\ndatamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  # module_type: \"FakeDataModule\"  # Option to use generated data\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      l1000_vcap:\n        df: null\n        df_path: 
graphium/data/neurips2023/large-dataset/LINCS_L1000_VCAP_0-4.csv.gz\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/LINCS_L1000_VCAP_0-4.csv.gz\n        # or set path as the URL directly\n        smiles_col: \"SMILES\"\n        label_cols: geneID-*  # geneID-* means all columns starting with \"geneID-\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: graph\n        splits_path: graphium/data/neurips2023/large-dataset/l1000_vcap_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/l1000_vcap_random_splits.pt`\n\n    # Featurization\n    prepare_dict_or_graph: pyg:graph\n    featurization_n_jobs: 30\n    featurization_progress: True\n    featurization_backend: \"loky\"\n    processed_graph_data_path: \"../datacache/neurips2023-large/vcap/\"\n    featurization:\n    # OGB: ['atomic_num', 'degree', 'possible_formal_charge', 'possible_numH' (total-valence),\n    # 'possible_number_radical_e', 'possible_is_aromatic', 'possible_is_in_ring',\n    # 'num_chiral_centers (not included yet)']\n      atom_property_list_onehot: [atomic-number, group, period, total-valence]\n      atom_property_list_float: [degree, formal-charge, radical-electron, aromatic, in-ring]\n      # OGB: ['possible_bond_type', 'possible_bond_stereo', 'possible_is_in_ring']\n      edge_property_list: [bond-type-onehot, stereo, in-ring]\n      add_self_loop: False\n      explicit_H: False # whether explicit hydrogens are included\n      use_bonds_weights: False\n      pos_encoding_as_features: # encoder dropout 0.18\n        pos_types:\n          lap_eigvec:\n            pos_level: node\n            pos_type: laplacian_eigvec\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # whether eigenvalues/eigenvectors of disconnected graphs are included\n          lap_eigval:\n            pos_level: node\n            pos_type: laplacian_eigval\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # whether eigenvalues/eigenvectors of disconnected graphs are included\n          rw_pos: # use same name as pe_encoder\n            pos_level: node\n            pos_type: rw_return_probs\n            ksteps: 16\n\n    num_workers: 30 # -1 to use all\n    persistent_workers: False # whether to keep dataloader workers alive between epochs.\n    # Setting persistent_workers to False can make the start of each epoch very slow.\n\n\narchitecture:\n  model_type: FullGraphMultiTaskNetwork\n  mup_base_path: null\n  pre_nn:   # Set as null to avoid a pre-nn network\n    out_dim: 64\n    hidden_dims: 256\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.18\n    normalization: &normalization layer_norm\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges: null\n\n  pe_encoders:\n    out_dim: 32\n    pool: \"sum\" #\"mean\" \"max\"\n    last_norm: None #\"batch_norm\", \"layer_norm\"\n    encoders: #la_pos |  rw_pos\n      la_pos:  # Set as null to avoid a pre-nn network\n        encoder_type: \"laplacian_pe\"\n        input_keys: [\"laplacian_eigvec\", \"laplacian_eigval\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        model_type: 'DeepSet' #'Transformer' or 'DeepSet'\n        num_layers: 2\n        num_layers_post: 1 # Num. 
layers to apply after pooling\n        dropout: 0.1\n        first_normalization: \"none\" #\"batch_norm\" or \"layer_norm\"\n      rw_pos:\n        encoder_type: \"mlp\"\n        input_keys: [\"rw_return_probs\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        num_layers: 2\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n        first_normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n\n\n\n  gnn:  # Set as null to avoid a post-nn network\n    in_dim: 64 # or otherwise the correct value\n    out_dim: &gnn_dim 768\n    hidden_dims: *gnn_dim\n    depth: 4\n    activation: gelu\n    last_activation: none\n    dropout: 0.1\n    normalization: \"layer_norm\"\n    last_normalization: *normalization\n    residual_type: simple\n    virtual_node: 'none'\n    layer_type: 'pyg:gcn' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n\n\n  graph_output_nn:\n    graph:\n      pooling: [sum]\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n  task_heads:\n    l1000_vcap:\n      task_level: graph\n      out_dim: 4890\n      hidden_dims: 128\n      depth: 2\n      activation: none\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n#Task-specific\npredictor:\n  metrics_on_progress_bar:\n    l1000_vcap: []\n  metrics_on_training_set:\n    l1000_vcap: []\n  loss_fun:\n    l1000_vcap:\n      name: hybrid_ce_ipu\n      n_brackets: 5\n  random_seed: *seed\n  optim_kwargs:\n    lr: 1.e-4 # warmup can be scheduled using torch_scheduler_kwargs\n    # weight_decay: 1.e-7\n  torch_scheduler_kwargs:\n    module_type: WarmUpLinearLR\n    max_num_epochs: &max_epochs 20\n    warmup_epochs: 
10\n    verbose: False\n  scheduler_kwargs:\n  #  monitor: &monitor qm9/mae/train\n  #  mode: min\n  #  frequency: 1\n  target_nan_mask: null # null: no mask, 0: 0 mask, ignore-flatten, ignore-mean-per-label\n  multitask_handling: flatten # flatten, mean-per-label\n\n# Task-specific\nmetrics:\n  l1000_vcap: &classif_metrics\n    - name: auroc\n      metric: auroc_ipu\n      num_classes: 5\n      task: multiclass\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n    - name: avpr\n      metric: average_precision_ipu\n      num_classes: 5\n      task: multiclass\n      target_to_int: True\n      target_nan_mask: -1000\n      ignore_index: -1000\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n\ntrainer:\n  seed: *seed\n  logger:\n    save_dir: logs/neurips2023-large/\n    name: *name\n    project: *name\n  #early_stopping:\n  #  monitor: *monitor\n  #  min_delta: 0\n  #  patience: 10\n  #  mode: &mode min\n  model_checkpoint:\n    dirpath: models_checkpoints/neurips2023-large-gcn/vcap/\n    filename: *name\n    # monitor: *monitor\n    # mode: *mode\n    # save_top_k: 1\n    save_last: True\n  trainer:\n    max_epochs: *max_epochs\n    min_epochs: 1\n    check_val_every_n_epoch: 20\n"
  },
  {
    "path": "expts/neurips2023_configs/single_task_gin/config_large_gin_g25.yaml",
    "content": "# Testing the gin model\nconstants:\n  name: &name neurips2023_large_data_gin_g25\n  seed: &seed 42\n  raise_train_error: true   # Whether the code should raise an error if it crashes during training\n\naccelerator:\n  type: ipu  # cpu or ipu or gpu\n  config_override:\n    datamodule:\n      args:\n        ipu_dataloader_training_opts:\n          mode: async\n          max_num_nodes_per_graph: 30 # train max nodes: 20, max_edges: 54\n          max_num_edges_per_graph: 100\n        ipu_dataloader_inference_opts:\n          mode: async\n          max_num_nodes_per_graph: 30 # valid max nodes: 51, max_edges: 118\n          max_num_edges_per_graph: 100\n        # Data handling-related\n        batch_size_training: 10\n        batch_size_inference: 10\n    predictor:\n      optim_kwargs:\n        loss_scaling: 1024\n    trainer:\n      trainer:\n        precision: 32\n        accumulate_grad_batches: 8\n\n  ipu_config:\n    - deviceIterations(10) # IPU would require large batches to be ready for the model.\n    - replicationFactor(16)\n    # - enableProfiling(\"graph_analyser\")       # The folder where the profile will be stored\n    # - enableExecutableCaching(\"pop_compiler_cache\")\n    - TensorLocations.numIOTiles(128)\n    - _Popart.set(\"defaultBufferingDepth\", 128)\n    - Precision.enableStochasticRounding(True)\n\n# accelerator:\n#   type: cpu  # cpu or ipu or gpu\n#   config_override:\n#     datamodule:\n#       batch_size_training: 64\n#       batch_size_inference: 256\n#     trainer:\n#       trainer:\n#         precision: 32\n#         accumulate_grad_batches: 1\n\ndatamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  # module_type: \"FakeDataModule\"  # Option to use generated data\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      pcqm4m_g25:\n        df: null\n        df_path: 
graphium/data/neurips2023/large-dataset/PCQM4M_G25_N4.parquet\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/PCQM4M_G25_N4.parquet\n        # or set path as the URL directly\n        smiles_col: \"ordered_smiles\"\n        label_cols: graph_*  # graph_* means all columns starting with \"graph_\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: graph\n        splits_path: graphium/data/neurips2023/large-dataset/pcqm4m_g25_n4_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/pcqm4m_g25_n4_random_splits.pt`\n        label_normalization:\n          normalize_val_test: True\n          method: \"normal\"\n\n    # Featurization\n    prepare_dict_or_graph: pyg:graph\n    featurization_n_jobs: 30\n    featurization_progress: True\n    featurization_backend: \"loky\"\n    processed_graph_data_path: \"../datacache/neurips2023-large/g25/\"\n    featurization:\n    # OGB: ['atomic_num', 'degree', 'possible_formal_charge', 'possible_numH' (total-valence),\n    # 'possible_number_radical_e', 'possible_is_aromatic', 'possible_is_in_ring',\n    # 'num_chiral_centers (not included yet)']\n      atom_property_list_onehot: [atomic-number, group, period, total-valence]\n      atom_property_list_float: [degree, formal-charge, radical-electron, aromatic, in-ring]\n      # OGB: ['possible_bond_type', 'possible_bond_stereo', 'possible_is_in_ring']\n      edge_property_list: [bond-type-onehot, stereo, in-ring]\n      add_self_loop: False\n      explicit_H: False # whether explicit hydrogens are included\n      use_bonds_weights: False\n      pos_encoding_as_features: # encoder dropout 0.18\n        pos_types:\n          lap_eigvec:\n            pos_level: node\n            pos_type: laplacian_eigvec\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # whether eigenvalues/eigenvectors of disconnected graphs are included\n          lap_eigval:\n            pos_level: node\n            pos_type: laplacian_eigval\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # whether eigenvalues/eigenvectors of disconnected graphs are included\n          rw_pos: # use same name as pe_encoder\n            pos_level: node\n            pos_type: rw_return_probs\n            ksteps: 16\n\n    num_workers: 30 # -1 to use all\n    persistent_workers: False # whether to keep dataloader workers alive between epochs.\n    # Setting persistent_workers to False can make the start of each epoch very slow.\n\n\narchitecture:\n  model_type: FullGraphMultiTaskNetwork\n  mup_base_path: null\n  pre_nn:   # Set as null to avoid a pre-nn network\n    out_dim: 64\n    hidden_dims: 256\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.18\n    normalization: &normalization layer_norm\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges: null\n\n  pe_encoders:\n    out_dim: 32\n    pool: \"sum\" #\"mean\" \"max\"\n    last_norm: None #\"batch_norm\", \"layer_norm\"\n    encoders: #la_pos |  rw_pos\n      la_pos:  # Set as null to avoid a pre-nn network\n        encoder_type: \"laplacian_pe\"\n        input_keys: [\"laplacian_eigvec\", \"laplacian_eigval\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        model_type: 'DeepSet' #'Transformer' or 'DeepSet'\n        num_layers: 2\n        num_layers_post: 1 # Num. 
layers to apply after pooling\n        dropout: 0.1\n        first_normalization: \"none\" #\"batch_norm\" or \"layer_norm\"\n      rw_pos:\n        encoder_type: \"mlp\"\n        input_keys: [\"rw_return_probs\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        num_layers: 2\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n        first_normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n\n\n\n  gnn:  # Set as null to avoid a post-nn network\n    in_dim: 64 # or otherwise the correct value\n    out_dim: &gnn_dim 704\n    hidden_dims: *gnn_dim\n    depth: 4\n    activation: gelu\n    last_activation: none\n    dropout: 0.1\n    normalization: \"layer_norm\"\n    last_normalization: *normalization\n    residual_type: simple\n    virtual_node: 'none'\n    layer_type: 'pyg:gin' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n\n\n  graph_output_nn:\n    graph:\n      pooling: [sum]\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n  task_heads:\n    pcqm4m_g25:\n      task_level: graph\n      out_dim: 25\n      hidden_dims: 32\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n#Task-specific\npredictor:\n  metrics_on_progress_bar:\n    pcqm4m_g25: []\n  metrics_on_training_set:\n    pcqm4m_g25: []\n  loss_fun:\n    pcqm4m_g25: mae_ipu\n  random_seed: *seed\n  optim_kwargs:\n    lr: 1.e-4 # warmup can be scheduled using torch_scheduler_kwargs\n    # weight_decay: 1.e-7\n  torch_scheduler_kwargs:\n    module_type: WarmUpLinearLR\n    max_num_epochs: &max_epochs 20\n    warmup_epochs: 10\n    verbose: False\n  
scheduler_kwargs:\n  #  monitor: &monitor qm9/mae/train\n  #  mode: min\n  #  frequency: 1\n  target_nan_mask: null # null: no mask, 0: 0 mask, ignore-flatten, ignore-mean-per-label\n  multitask_handling: flatten # flatten, mean-per-label\n\n# Task-specific\nmetrics:\n  pcqm4m_g25: &pcqm_metrics\n    - name: mae\n      metric: mae_ipu\n      target_nan_mask: null\n      multitask_handling: flatten\n      threshold_kwargs: null\n    - name: pearsonr\n      metric: pearsonr_ipu\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n    - name: r2\n      metric: r2_score_ipu\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n\ntrainer:\n  seed: *seed\n  logger:\n    save_dir: logs/neurips2023-large/\n    name: *name\n    project: *name\n  #early_stopping:\n  #  monitor: *monitor\n  #  min_delta: 0\n  #  patience: 10\n  #  mode: &mode min\n  model_checkpoint:\n    dirpath: models_checkpoints/neurips2023-large-gin/g25/\n    filename: *name\n    # monitor: *monitor\n    # mode: *mode\n    # save_top_k: 1\n    save_last: True\n  trainer:\n    max_epochs: *max_epochs\n    min_epochs: 1\n    check_val_every_n_epoch: 20\n"
  },
  {
    "path": "expts/neurips2023_configs/single_task_gin/config_large_gin_mcf7.yaml",
    "content": "# Testing the gine model with the LINCS L1000 MCF7 dataset on IPU.\nconstants:\n  name: &name neurips2023_large_data_gine_mcf7\n  seed: &seed 42\n  raise_train_error: true   # Whether the code should raise an error if it crashes during training\n\naccelerator:\n  type: ipu  # cpu or ipu or gpu\n  config_override:\n    datamodule:\n      args:\n        ipu_dataloader_training_opts:\n          mode: async\n          max_num_nodes_per_graph: 60 # train max nodes: 20, max_edges: 54\n          max_num_edges_per_graph: 100\n        ipu_dataloader_inference_opts:\n          mode: async\n          max_num_nodes_per_graph: 60 # valid max nodes: 51, max_edges: 118\n          max_num_edges_per_graph: 100\n        # Data handling-related\n        batch_size_training: 10\n        batch_size_inference: 2\n    predictor:\n      optim_kwargs:\n        loss_scaling: 1024\n    trainer:\n      trainer:\n        precision: 32\n        accumulate_grad_batches: 8\n\n  ipu_config:\n    - deviceIterations(1) # IPU would require large batches to be ready for the model.\n    - replicationFactor(16)\n    # - enableProfiling(\"graph_analyser\")       # The folder where the profile will be stored\n    # - enableExecutableCaching(\"pop_compiler_cache\")\n    - TensorLocations.numIOTiles(128)\n    - _Popart.set(\"defaultBufferingDepth\", 128)\n    - Precision.enableStochasticRounding(True)\n\n# accelerator:\n#   type: cpu  # cpu or ipu or gpu\n#   config_override:\n#     datamodule:\n#       batch_size_training: 64\n#       batch_size_inference: 256\n#     trainer:\n#       trainer:\n#         precision: 32\n#         accumulate_grad_batches: 1\n\ndatamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  # module_type: \"FakeDataModule\"  # Option to use generated data\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      l1000_mcf7:\n        df: null\n        df_path: 
graphium/data/neurips2023/large-dataset/LINCS_L1000_MCF7_0-4.csv.gz\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/LINCS_L1000_MCF7_0-4.csv.gz\n        # or set path as the URL directly\n        smiles_col: \"SMILES\"\n        label_cols: geneID-*  # geneID-* means all columns starting with \"geneID-\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: graph\n        splits_path: graphium/data/neurips2023/large-dataset/l1000_mcf7_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/l1000_mcf7_random_splits.pt`\n\n    # Featurization\n    prepare_dict_or_graph: pyg:graph\n    featurization_n_jobs: 30\n    featurization_progress: True\n    featurization_backend: \"loky\"\n    processed_graph_data_path: \"../datacache/neurips2023-large/mcf7/\"\n    featurization:\n    # OGB: ['atomic_num', 'degree', 'possible_formal_charge', 'possible_numH' (total-valence),\n    # 'possible_number_radical_e', 'possible_is_aromatic', 'possible_is_in_ring',\n    # 'num_chiral_centers (not included yet)']\n      atom_property_list_onehot: [atomic-number, group, period, total-valence]\n      atom_property_list_float: [degree, formal-charge, radical-electron, aromatic, in-ring]\n      # OGB: ['possible_bond_type', 'possible_bond_stereo', 'possible_is_in_ring']\n      edge_property_list: [bond-type-onehot, stereo, in-ring]\n      add_self_loop: False\n      explicit_H: False # whether explicit hydrogens are included\n      use_bonds_weights: False\n      pos_encoding_as_features: # encoder dropout 0.18\n        pos_types:\n          lap_eigvec:\n            pos_level: node\n            pos_type: laplacian_eigvec\n            num_pos: 8\n            normalization: \"none\" # normalization already applied to the eigenvectors\n            disconnected_comp: True # whether eigenvalues/eigenvectors for disconnected graphs are included\n          lap_eigval:\n            pos_level: node\n         
   pos_type: laplacian_eigval\n            num_pos: 8\n            normalization: \"none\" # normalization already applied to the eigenvectors\n            disconnected_comp: True # whether eigenvalues/eigenvectors for disconnected graphs are included\n          rw_pos: # use same name as pe_encoder\n            pos_level: node\n            pos_type: rw_return_probs\n            ksteps: 16\n\n    num_workers: 30 # -1 to use all\n    persistent_workers: False # whether to keep dataloader workers alive between epochs.\n    # Setting persistent_workers to false can make the start of each epoch very slow.\n\n\narchitecture:\n  model_type: FullGraphMultiTaskNetwork\n  mup_base_path: null\n  pre_nn:   # Set as null to avoid a pre-nn network\n    out_dim: 64\n    hidden_dims: 256\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.18\n    normalization: &normalization layer_norm\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges: null\n\n  pe_encoders:\n    out_dim: 32\n    pool: \"sum\" #\"mean\" \"max\"\n    last_norm: None #\"batch_norm\", \"layer_norm\"\n    encoders: #la_pos |  rw_pos\n      la_pos:  # Set as null to avoid a pre-nn network\n        encoder_type: \"laplacian_pe\"\n        input_keys: [\"laplacian_eigvec\", \"laplacian_eigval\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        model_type: 'DeepSet' #'Transformer' or 'DeepSet'\n        num_layers: 2\n        num_layers_post: 1 # Num. 
layers to apply after pooling\n        dropout: 0.1\n        first_normalization: \"none\" #\"batch_norm\" or \"layer_norm\"\n      rw_pos:\n        encoder_type: \"mlp\"\n        input_keys: [\"rw_return_probs\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        num_layers: 2\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n        first_normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n\n\n\n  gnn:  # Set as null to avoid a post-nn network\n    in_dim: 64 # or otherwise the correct value\n    out_dim: &gnn_dim 704\n    hidden_dims: *gnn_dim\n    depth: 4\n    activation: gelu\n    last_activation: none\n    dropout: 0.1\n    normalization: \"layer_norm\"\n    last_normalization: *normalization\n    residual_type: simple\n    virtual_node: 'none'\n    layer_type: 'pyg:gin' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n\n\n  graph_output_nn:\n    graph:\n      pooling: [sum]\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n\n  task_heads:\n    l1000_mcf7:\n      task_level: graph\n      out_dim: 4890\n      hidden_dims: 128\n      depth: 2\n      activation: none\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n\n#Task-specific\npredictor:\n  metrics_on_progress_bar:\n    l1000_mcf7: []\n  metrics_on_training_set:\n    l1000_mcf7: []\n  loss_fun:\n    l1000_mcf7:\n      name: hybrid_ce_ipu\n      n_brackets: 5\n  random_seed: *seed\n  optim_kwargs:\n    lr: 1.e-4 # warmup can be scheduled using torch_scheduler_kwargs\n    # weight_decay: 1.e-7\n  torch_scheduler_kwargs:\n    module_type: WarmUpLinearLR\n    max_num_epochs: &max_epochs 20\n    
warmup_epochs: 10\n    verbose: False\n  scheduler_kwargs:\n  #  monitor: &monitor qm9/mae/train\n  #  mode: min\n  #  frequency: 1\n  target_nan_mask: null # null: no mask, 0: 0 mask, ignore-flatten, ignore-mean-per-label\n  multitask_handling: flatten # flatten, mean-per-label\n\n# Task-specific\nmetrics:\n  l1000_mcf7: &classif_metrics\n    - name: auroc\n      metric: auroc_ipu\n      num_classes: 5\n      task: multiclass\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n    - name: avpr\n      metric: average_precision_ipu\n      num_classes: 5\n      task: multiclass\n      target_to_int: True\n      target_nan_mask: -1000\n      ignore_index: -1000\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n\ntrainer:\n  seed: *seed\n  logger:\n    save_dir: logs/neurips2023-large/\n    name: *name\n    project: *name\n  #early_stopping:\n  #  monitor: *monitor\n  #  min_delta: 0\n  #  patience: 10\n  #  mode: &mode min\n  model_checkpoint:\n    dirpath: models_checkpoints/neurips2023-large-gine/mcf7/\n    filename: *name\n    # monitor: *monitor\n    # mode: *mode\n    # save_top_k: 1\n    save_last: True\n  trainer:\n    max_epochs: *max_epochs\n    min_epochs: 1\n    check_val_every_n_epoch: 20\n"
  },
  {
    "path": "expts/neurips2023_configs/single_task_gin/config_large_gin_n4.yaml",
    "content": "# Testing the gin model\nconstants:\n  name: &name neurips2023_large_data_gin_n4\n  seed: &seed 42\n  raise_train_error: true   # Whether the code should raise an error if it crashes during training\n\naccelerator:\n  type: ipu  # cpu or ipu or gpu\n  config_override:\n    datamodule:\n      args:\n        ipu_dataloader_training_opts:\n          mode: async\n          max_num_nodes_per_graph: 30 # train max nodes: 20, max_edges: 54\n          max_num_edges_per_graph: 100\n        ipu_dataloader_inference_opts:\n          mode: async\n          max_num_nodes_per_graph: 30 # valid max nodes: 51, max_edges: 118\n          max_num_edges_per_graph: 100\n        # Data handling-related\n        batch_size_training: 10\n        batch_size_inference: 10\n    predictor:\n      optim_kwargs:\n        loss_scaling: 1024\n    trainer:\n      trainer:\n        precision: 32\n        accumulate_grad_batches: 8\n\n  ipu_config:\n    - deviceIterations(10) # IPU would require large batches to be ready for the model.\n    - replicationFactor(16)\n    # - enableProfiling(\"graph_analyser\")       # The folder where the profile will be stored\n    # - enableExecutableCaching(\"pop_compiler_cache\")\n    - TensorLocations.numIOTiles(128)\n    - _Popart.set(\"defaultBufferingDepth\", 128)\n    - Precision.enableStochasticRounding(True)\n\n# accelerator:\n#   type: cpu  # cpu or ipu or gpu\n#   config_override:\n#     datamodule:\n#       batch_size_training: 64\n#       batch_size_inference: 256\n#     trainer:\n#       trainer:\n#         precision: 32\n#         accumulate_grad_batches: 1\n\ndatamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  # module_type: \"FakeDataModule\"  # Option to use generated data\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      pcqm4m_n4:\n        df: null\n        df_path: 
graphium/data/neurips2023/large-dataset/PCQM4M_G25_N4.parquet\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/PCQM4M_G25_N4.parquet\n        # or set path as the URL directly\n        smiles_col: \"ordered_smiles\"\n        label_cols: node_* # node_* means all columns starting with \"node_\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: node\n        splits_path: graphium/data/neurips2023/large-dataset/pcqm4m_g25_n4_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/pcqm4m_g25_n4_random_splits.pt`\n        seed: *seed\n        label_normalization:\n          normalize_val_test: True\n          method: \"normal\"\n\n    # Featurization\n    prepare_dict_or_graph: pyg:graph\n    featurization_n_jobs: 30\n    featurization_progress: True\n    featurization_backend: \"loky\"\n    processed_graph_data_path: \"../datacache/neurips2023-large/n4/\"\n    featurization:\n    # OGB: ['atomic_num', 'degree', 'possible_formal_charge', 'possible_numH' (total-valence),\n    # 'possible_number_radical_e', 'possible_is_aromatic', 'possible_is_in_ring',\n    # 'num_chiral_centers (not included yet)']\n      atom_property_list_onehot: [atomic-number, group, period, total-valence]\n      atom_property_list_float: [degree, formal-charge, radical-electron, aromatic, in-ring]\n      # OGB: ['possible_bond_type', 'possible_bond_stereo', 'possible_is_in_ring']\n      edge_property_list: [bond-type-onehot, stereo, in-ring]\n      add_self_loop: False\n      explicit_H: False # whether explicit hydrogens are included\n      use_bonds_weights: False\n      pos_encoding_as_features: # encoder dropout 0.18\n        pos_types:\n          lap_eigvec:\n            pos_level: node\n            pos_type: laplacian_eigvec\n            num_pos: 8\n            normalization: \"none\" # normalization already applied to the eigenvectors\n            disconnected_comp: True # whether eigenvalues/eigenvectors for disconnected graphs are included
\n          lap_eigval:\n            pos_level: node\n            pos_type: laplacian_eigval\n            num_pos: 8\n            normalization: \"none\" # normalization already applied to the eigenvectors\n            disconnected_comp: True # whether eigenvalues/eigenvectors for disconnected graphs are included\n          rw_pos: # use same name as pe_encoder\n            pos_level: node\n            pos_type: rw_return_probs\n            ksteps: 16\n\n    num_workers: 30 # -1 to use all\n    persistent_workers: False # whether to keep dataloader workers alive between epochs.\n    # Setting persistent_workers to false can make the start of each epoch very slow.\n\n\narchitecture:\n  model_type: FullGraphMultiTaskNetwork\n  mup_base_path: null\n  pre_nn:   # Set as null to avoid a pre-nn network\n    out_dim: 64\n    hidden_dims: 256\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.18\n    normalization: &normalization layer_norm\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges: null\n\n  pe_encoders:\n    out_dim: 32\n    pool: \"sum\" #\"mean\" \"max\"\n    last_norm: None #\"batch_norm\", \"layer_norm\"\n    encoders: #la_pos |  rw_pos\n      la_pos:  # Set as null to avoid a pre-nn network\n        encoder_type: \"laplacian_pe\"\n        input_keys: [\"laplacian_eigvec\", \"laplacian_eigval\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        model_type: 'DeepSet' #'Transformer' or 'DeepSet'\n        num_layers: 2\n        num_layers_post: 1 # Num. 
layers to apply after pooling\n        dropout: 0.1\n        first_normalization: \"none\" #\"batch_norm\" or \"layer_norm\"\n      rw_pos:\n        encoder_type: \"mlp\"\n        input_keys: [\"rw_return_probs\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        num_layers: 2\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n        first_normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n\n\n\n  gnn:  # Set as null to avoid a post-nn network\n    in_dim: 64 # or otherwise the correct value\n    out_dim: &gnn_dim 704\n    hidden_dims: *gnn_dim\n    depth: 4\n    activation: gelu\n    last_activation: none\n    dropout: 0.1\n    normalization: \"layer_norm\"\n    last_normalization: *normalization\n    residual_type: simple\n    virtual_node: 'none'\n    layer_type: 'pyg:gin' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n\n\n  graph_output_nn:\n    node:\n      pooling: [sum]\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n  task_heads:\n    pcqm4m_n4:\n      task_level: node\n      out_dim: 4\n      hidden_dims: 32\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n#Task-specific\npredictor:\n  metrics_on_progress_bar:\n    pcqm4m_n4: []\n  metrics_on_training_set:\n    pcqm4m_n4: []\n  loss_fun:\n    pcqm4m_n4: mae_ipu\n  random_seed: *seed\n  optim_kwargs:\n    lr: 1.e-4 # warmup can be scheduled using torch_scheduler_kwargs\n    # weight_decay: 1.e-7\n  torch_scheduler_kwargs:\n    module_type: WarmUpLinearLR\n    max_num_epochs: &max_epochs 20\n    warmup_epochs: 10\n    verbose: False\n  scheduler_kwargs:\n  #  
monitor: &monitor qm9/mae/train\n  #  mode: min\n  #  frequency: 1\n  target_nan_mask: null # null: no mask, 0: 0 mask, ignore-flatten, ignore-mean-per-label\n  multitask_handling: flatten # flatten, mean-per-label\n\n# Task-specific\nmetrics:\n  pcqm4m_n4: &pcqm_metrics\n    - name: mae\n      metric: mae_ipu\n      target_nan_mask: null\n      multitask_handling: flatten\n      threshold_kwargs: null\n    - name: pearsonr\n      metric: pearsonr_ipu\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n    - name: r2\n      metric: r2_score_ipu\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n\ntrainer:\n  seed: *seed\n  logger:\n    save_dir: logs/neurips2023-large/\n    name: *name\n    project: *name\n  #early_stopping:\n  #  monitor: *monitor\n  #  min_delta: 0\n  #  patience: 10\n  #  mode: &mode min\n  model_checkpoint:\n    dirpath: models_checkpoints/neurips2023-large-gin/n4/\n    filename: *name\n    # monitor: *monitor\n    # mode: *mode\n    # save_top_k: 1\n    save_last: True\n  trainer:\n    max_epochs: *max_epochs\n    min_epochs: 1\n    check_val_every_n_epoch: 20\n"
  },
  {
    "path": "expts/neurips2023_configs/single_task_gin/config_large_gin_pcba.yaml",
    "content": "# Testing the gin model\nconstants:\n  name: &name neurips2023_large_data_gin_pcba\n  seed: &seed 42\n  raise_train_error: true   # Whether the code should raise an error if it crashes during training\n\naccelerator:\n  type: ipu  # cpu or ipu or gpu\n  config_override:\n    datamodule:\n      args:\n        ipu_dataloader_training_opts:\n          mode: async\n          max_num_nodes_per_graph: 60 # train max nodes: 20, max_edges: 54\n          max_num_edges_per_graph: 100\n        ipu_dataloader_inference_opts:\n          mode: async\n          max_num_nodes_per_graph: 200 # valid max nodes: 51, max_edges: 118\n          max_num_edges_per_graph: 400\n        # Data handling-related\n        batch_size_training: 10\n        batch_size_inference: 2\n    predictor:\n      optim_kwargs:\n        loss_scaling: 1024\n    trainer:\n      trainer:\n        precision: 32\n        accumulate_grad_batches: 8\n\n  ipu_config:\n    - deviceIterations(10) # IPU would require large batches to be ready for the model.\n    - replicationFactor(16)\n    # - enableProfiling(\"graph_analyser\")       # The folder where the profile will be stored\n    # - enableExecutableCaching(\"pop_compiler_cache\")\n    - TensorLocations.numIOTiles(128)\n    - _Popart.set(\"defaultBufferingDepth\", 128)\n    - Precision.enableStochasticRounding(True)\n\n# accelerator:\n#   type: cpu  # cpu or ipu or gpu\n#   config_override:\n#     datamodule:\n#       batch_size_training: 64\n#       batch_size_inference: 256\n#     trainer:\n#       trainer:\n#         precision: 32\n#         accumulate_grad_batches: 1\n\ndatamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  # module_type: \"FakeDataModule\"  # Option to use generated data\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      pcba_1328:\n        df: null\n        df_path: 
graphium/data/neurips2023/large-dataset/PCBA_1328_1564k.parquet\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/PCBA_1328_1564k.parquet\n        # or set path as the URL directly\n        smiles_col: \"SMILES\"\n        label_cols: assayID-*  # assayID-* means all columns starting with \"assayID-\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: graph\n        splits_path: graphium/data/neurips2023/large-dataset/pcba_1328_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/pcba_1328_random_splits.pt`\n\n    # Featurization\n    prepare_dict_or_graph: pyg:graph\n    featurization_n_jobs: 30\n    featurization_progress: True\n    featurization_backend: \"loky\"\n    processed_graph_data_path: \"../datacache/neurips2023-large/pcba/\"\n    featurization:\n    # OGB: ['atomic_num', 'degree', 'possible_formal_charge', 'possible_numH' (total-valence),\n    # 'possible_number_radical_e', 'possible_is_aromatic', 'possible_is_in_ring',\n    # 'num_chiral_centers (not included yet)']\n      atom_property_list_onehot: [atomic-number, group, period, total-valence]\n      atom_property_list_float: [degree, formal-charge, radical-electron, aromatic, in-ring]\n      # OGB: ['possible_bond_type', 'possible_bond_stereo', 'possible_is_in_ring']\n      edge_property_list: [bond-type-onehot, stereo, in-ring]\n      add_self_loop: False\n      explicit_H: False # whether explicit hydrogens are included\n      use_bonds_weights: False\n      pos_encoding_as_features: # encoder dropout 0.18\n        pos_types:\n          lap_eigvec:\n            pos_level: node\n            pos_type: laplacian_eigvec\n            num_pos: 8\n            normalization: \"none\" # normalization already applied to the eigenvectors\n            disconnected_comp: True # whether eigenvalues/eigenvectors for disconnected graphs are included\n          lap_eigval:\n            pos_level: node\n
            pos_type: laplacian_eigval\n            num_pos: 8\n            normalization: \"none\" # normalization already applied to the eigenvectors\n            disconnected_comp: True # whether eigenvalues/eigenvectors for disconnected graphs are included\n          rw_pos: # use same name as pe_encoder\n            pos_level: node\n            pos_type: rw_return_probs\n            ksteps: 16\n\n    num_workers: 30 # -1 to use all\n    persistent_workers: False # whether to keep dataloader workers alive between epochs.\n    # Setting persistent_workers to false can make the start of each epoch very slow.\n\n\narchitecture:\n  model_type: FullGraphMultiTaskNetwork\n  mup_base_path: null\n  pre_nn:   # Set as null to avoid a pre-nn network\n    out_dim: 64\n    hidden_dims: 256\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.18\n    normalization: &normalization layer_norm\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges: null\n\n  pe_encoders:\n    out_dim: 32\n    pool: \"sum\" #\"mean\" \"max\"\n    last_norm: None #\"batch_norm\", \"layer_norm\"\n    encoders: #la_pos |  rw_pos\n      la_pos:  # Set as null to avoid a pre-nn network\n        encoder_type: \"laplacian_pe\"\n        input_keys: [\"laplacian_eigvec\", \"laplacian_eigval\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        model_type: 'DeepSet' #'Transformer' or 'DeepSet'\n        num_layers: 2\n        num_layers_post: 1 # Num. 
layers to apply after pooling\n        dropout: 0.1\n        first_normalization: \"none\" #\"batch_norm\" or \"layer_norm\"\n      rw_pos:\n        encoder_type: \"mlp\"\n        input_keys: [\"rw_return_probs\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        num_layers: 2\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n        first_normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n\n\n\n  gnn:  # Set as null to avoid a post-nn network\n    in_dim: 64 # or otherwise the correct value\n    out_dim: &gnn_dim 704\n    hidden_dims: *gnn_dim\n    depth: 4\n    activation: gelu\n    last_activation: none\n    dropout: 0.1\n    normalization: \"layer_norm\"\n    last_normalization: *normalization\n    residual_type: simple\n    virtual_node: 'none'\n    layer_type: 'pyg:gin' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n\n\n  graph_output_nn:\n    graph:\n      pooling: [sum]\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n  task_heads:\n    pcba_1328:\n      task_level: graph\n      out_dim: 1328\n      hidden_dims: 64\n      depth: 2\n      activation: relu\n      last_activation: sigmoid\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n#Task-specific\npredictor:\n  metrics_on_progress_bar:\n    pcba_1328: []\n  metrics_on_training_set:\n    pcba_1328: []\n  loss_fun:\n    pcba_1328: bce_ipu\n  random_seed: *seed\n  optim_kwargs:\n    lr: 1.e-4 # warmup can be scheduled using torch_scheduler_kwargs\n    # weight_decay: 1.e-7\n  torch_scheduler_kwargs:\n    module_type: WarmUpLinearLR\n    max_num_epochs: &max_epochs 20\n    warmup_epochs: 10\n    verbose: False\n  
scheduler_kwargs:\n  #  monitor: &monitor qm9/mae/train\n  #  mode: min\n  #  frequency: 1\n  target_nan_mask: null # null: no mask, 0: 0 mask, ignore-flatten, ignore-mean-per-label\n  multitask_handling: flatten # flatten, mean-per-label\n\n# Task-specific\nmetrics:\n  pcba_1328:\n    - name: auroc\n      metric: auroc_ipu\n      task: binary\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n    - name: avpr\n      metric: average_precision_ipu\n      task: binary\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n\ntrainer:\n  seed: *seed\n  logger:\n    save_dir: logs/neurips2023-large/\n    name: *name\n    project: *name\n  #early_stopping:\n  #  monitor: *monitor\n  #  min_delta: 0\n  #  patience: 10\n  #  mode: &mode min\n  model_checkpoint:\n    dirpath: models_checkpoints/neurips2023-large-gin/pcba/\n    filename: *name\n    # monitor: *monitor\n    # mode: *mode\n    # save_top_k: 1\n    save_last: True\n  trainer:\n    max_epochs: *max_epochs\n    min_epochs: 1\n    check_val_every_n_epoch: 20\n"
  },
  {
    "path": "expts/neurips2023_configs/single_task_gin/config_large_gin_pcq.yaml",
    "content": "# Testing the gin model\nconstants:\n  name: &name neurips2023_large_data_gin_pcq\n  seed: &seed 42\n  raise_train_error: true   # Whether the code should raise an error if it crashes during training\n\naccelerator:\n  type: ipu  # cpu or ipu or gpu\n  config_override:\n    datamodule:\n      args:\n        ipu_dataloader_training_opts:\n          mode: async\n          max_num_nodes_per_graph: 30 # train max nodes: 20, max_edges: 54\n          max_num_edges_per_graph: 100\n        ipu_dataloader_inference_opts:\n          mode: async\n          max_num_nodes_per_graph: 30 # valid max nodes: 51, max_edges: 118\n          max_num_edges_per_graph: 100\n        # Data handling-related\n        batch_size_training: 10\n        batch_size_inference: 10\n    predictor:\n      optim_kwargs:\n        loss_scaling: 1024\n    trainer:\n      trainer:\n        precision: 32\n        accumulate_grad_batches: 8\n\n  ipu_config:\n    - deviceIterations(10) # IPU would require large batches to be ready for the model.\n    - replicationFactor(16)\n    # - enableProfiling(\"graph_analyser\")       # The folder where the profile will be stored\n    # - enableExecutableCaching(\"pop_compiler_cache\")\n    - TensorLocations.numIOTiles(128)\n    - _Popart.set(\"defaultBufferingDepth\", 128)\n    - Precision.enableStochasticRounding(True)\n\n# accelerator:\n#   type: cpu  # cpu or ipu or gpu\n#   config_override:\n#     datamodule:\n#       batch_size_training: 64\n#       batch_size_inference: 256\n#     trainer:\n#       trainer:\n#         precision: 32\n#         accumulate_grad_batches: 1\n\ndatamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  # module_type: \"FakeDataModule\"  # Option to use generated data\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      pcqm4m_g25:\n        df: null\n        df_path: 
graphium/data/neurips2023/large-dataset/PCQM4M_G25_N4.parquet\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/PCQM4M_G25_N4.parquet\n        # or set path as the URL directly\n        smiles_col: \"ordered_smiles\"\n        label_cols: graph_*  # graph_* means all columns starting with \"graph_\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: graph\n        splits_path: graphium/data/neurips2023/large-dataset/pcqm4m_g25_n4_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/pcqm4m_g25_n4_random_splits.pt`\n        label_normalization:\n          normalize_val_test: True\n          method: \"normal\"\n\n      pcqm4m_n4:\n        df: null\n        df_path: graphium/data/neurips2023/large-dataset/PCQM4M_G25_N4.parquet\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/PCQM4M_G25_N4.parquet\n        # or set path as the URL directly\n        smiles_col: \"ordered_smiles\"\n        label_cols: node_* # node_* means all columns starting with \"node_\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: node\n        splits_path: graphium/data/neurips2023/large-dataset/pcqm4m_g25_n4_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/pcqm4m_g25_n4_random_splits.pt`\n        seed: *seed\n        label_normalization:\n          normalize_val_test: True\n          method: \"normal\"\n\n    # Featurization\n    prepare_dict_or_graph: pyg:graph\n    featurization_n_jobs: 30\n    featurization_progress: True\n    featurization_backend: \"loky\"\n    processed_graph_data_path: \"../datacache/neurips2023-large/pcq/\"\n    featurization:\n    # OGB: ['atomic_num', 'degree', 'possible_formal_charge', 'possible_numH' (total-valence),\n    # 'possible_number_radical_e', 'possible_is_aromatic', 
'possible_is_in_ring',\n    # 'num_chiral_centers (not included yet)']\n      atom_property_list_onehot: [atomic-number, group, period, total-valence]\n      atom_property_list_float: [degree, formal-charge, radical-electron, aromatic, in-ring]\n      # OGB: ['possible_bond_type', 'possible_bond_stereo', 'possible_is_in_ring']\n      edge_property_list: [bond-type-onehot, stereo, in-ring]\n      add_self_loop: False\n      explicit_H: False # whether explicit hydrogens are included\n      use_bonds_weights: False\n      pos_encoding_as_features: # encoder dropout 0.18\n        pos_types:\n          lap_eigvec:\n            pos_level: node\n            pos_type: laplacian_eigvec\n            num_pos: 8\n            normalization: \"none\" # normalization already applied to the eigenvectors\n            disconnected_comp: True # whether eigenvalues/eigenvectors for disconnected graphs are included\n          lap_eigval:\n            pos_level: node\n            pos_type: laplacian_eigval\n            num_pos: 8\n            normalization: \"none\" # normalization already applied to the eigenvectors\n            disconnected_comp: True # whether eigenvalues/eigenvectors for disconnected graphs are included\n          rw_pos: # use same name as pe_encoder\n            pos_level: node\n            pos_type: rw_return_probs\n            ksteps: 16\n\n    num_workers: 30 # -1 to use all\n    persistent_workers: False # whether to keep dataloader workers alive between epochs.\n    # Setting persistent_workers to false can make the start of each epoch very slow.\n\n\narchitecture:\n  model_type: FullGraphMultiTaskNetwork\n  mup_base_path: null\n  pre_nn:   # Set as null to avoid a pre-nn network\n    out_dim: 64\n    hidden_dims: 256\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.18\n    normalization: &normalization layer_norm\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges: null\n\n  pe_encoders:\n    out_dim: 
32\n    pool: \"sum\" #\"mean\" \"max\"\n    last_norm: None #\"batch_norm\", \"layer_norm\"\n    encoders: #la_pos |  rw_pos\n      la_pos:  # Set as null to avoid a pre-nn network\n        encoder_type: \"laplacian_pe\"\n        input_keys: [\"laplacian_eigvec\", \"laplacian_eigval\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        model_type: 'DeepSet' #'Transformer' or 'DeepSet'\n        num_layers: 2\n        num_layers_post: 1 # Num. layers to apply after pooling\n        dropout: 0.1\n        first_normalization: \"none\" #\"batch_norm\" or \"layer_norm\"\n      rw_pos:\n        encoder_type: \"mlp\"\n        input_keys: [\"rw_return_probs\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        num_layers: 2\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n        first_normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n\n\n\n  gnn:  # Set as null to avoid a post-nn network\n    in_dim: 64 # or otherwise the correct value\n    out_dim: &gnn_dim 704\n    hidden_dims: *gnn_dim\n    depth: 4\n    activation: gelu\n    last_activation: none\n    dropout: 0.1\n    normalization: \"layer_norm\"\n    last_normalization: *normalization\n    residual_type: simple\n    virtual_node: 'none'\n    layer_type: 'pyg:gin' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n\n\n  graph_output_nn:\n    graph:\n      pooling: [sum]\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n    node:\n      pooling: [sum]\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      
residual_type: none\n\n  task_heads:\n    pcqm4m_g25:\n      task_level: graph\n      out_dim: 25\n      hidden_dims: 32\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n    pcqm4m_n4:\n      task_level: node\n      out_dim: 4\n      hidden_dims: 32\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n#Task-specific\npredictor:\n  metrics_on_progress_bar:\n    pcqm4m_g25: []\n    pcqm4m_n4: []\n  metrics_on_training_set:\n    pcqm4m_g25: []\n    pcqm4m_n4: []\n  loss_fun:\n    pcqm4m_g25: mae_ipu\n    pcqm4m_n4: mae_ipu\n  random_seed: *seed\n  optim_kwargs:\n    lr: 1.e-4 # warmup can be scheduled using torch_scheduler_kwargs\n    # weight_decay: 1.e-7\n  torch_scheduler_kwargs:\n    module_type: WarmUpLinearLR\n    max_num_epochs: &max_epochs 20\n    warmup_epochs: 10\n    verbose: False\n  scheduler_kwargs:\n  #  monitor: &monitor qm9/mae/train\n  #  mode: min\n  #  frequency: 1\n  target_nan_mask: null # null: no mask, 0: 0 mask, ignore-flatten, ignore-mean-per-label\n  multitask_handling: flatten # flatten, mean-per-label\n\n# Task-specific\nmetrics:\n  pcqm4m_g25: &pcqm_metrics\n    - name: mae\n      metric: mae_ipu\n      target_nan_mask: null\n      multitask_handling: flatten\n      threshold_kwargs: null\n    - name: pearsonr\n      metric: pearsonr_ipu\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n    - name: r2\n      metric: r2_score_ipu\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n  pcqm4m_n4: *pcqm_metrics\n\ntrainer:\n  seed: *seed\n  logger:\n    save_dir: logs/neurips2023-large/\n    name: *name\n    project: *name\n  #early_stopping:\n  #  monitor: 
*monitor\n  #  min_delta: 0\n  #  patience: 10\n  #  mode: &mode min\n  model_checkpoint:\n    dirpath: models_checkpoints/neurips2023-large-gin/pcq/\n    filename: *name\n    # monitor: *monitor\n    # mode: *mode\n    # save_top_k: 1\n    save_last: True\n  trainer:\n    max_epochs: *max_epochs\n    min_epochs: 1\n    check_val_every_n_epoch: 20\n"
  },
  {
    "path": "expts/neurips2023_configs/single_task_gin/config_large_gin_vcap.yaml",
"content": "# Testing the gin model with the LINCS L1000 VCAP dataset on IPU.\nconstants:\n  name: &name neurips2023_large_data_gine_vcap\n  seed: &seed 42\n  raise_train_error: true   # Whether the code should raise an error if it crashes during training\n\naccelerator:\n  type: ipu  # cpu or ipu or gpu\n  config_override:\n    datamodule:\n      args:\n        ipu_dataloader_training_opts:\n          mode: async\n          max_num_nodes_per_graph: 60 # train max nodes: 20, max_edges: 54\n          max_num_edges_per_graph: 100\n        ipu_dataloader_inference_opts:\n          mode: async\n          max_num_nodes_per_graph: 60 # valid max nodes: 51, max_edges: 118\n          max_num_edges_per_graph: 150\n        # Data handling-related\n        batch_size_training: 10\n        batch_size_inference: 2\n    predictor:\n      optim_kwargs:\n        loss_scaling: 1024\n    trainer:\n      trainer:\n        precision: 32\n        accumulate_grad_batches: 8\n\n  ipu_config:\n    - deviceIterations(1) # IPU would require large batches to be ready for the model.\n    - replicationFactor(16)\n    # - enableProfiling(\"graph_analyser\")       # The folder where the profile will be stored\n    # - enableExecutableCaching(\"pop_compiler_cache\")\n    - TensorLocations.numIOTiles(128)\n    - _Popart.set(\"defaultBufferingDepth\", 128)\n    - Precision.enableStochasticRounding(True)\n\n# accelerator:\n#   type: cpu  # cpu or ipu or gpu\n#   config_override:\n#     datamodule:\n#       batch_size_training: 64\n#       batch_size_inference: 256\n#     trainer:\n#       trainer:\n#         precision: 32\n#         accumulate_grad_batches: 1\n\ndatamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  # module_type: \"FakeDataModule\"  # Option to use generated data\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      l1000_vcap:\n        df: null\n        df_path: 
graphium/data/neurips2023/large-dataset/LINCS_L1000_VCAP_0-4.csv.gz\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/LINCS_L1000_VCAP_0-4.csv.gz\n        # or set path as the URL directly\n        smiles_col: \"SMILES\"\n        label_cols: geneID-*  # geneID-* means all columns starting with \"geneID-\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: graph\n        splits_path: graphium/data/neurips2023/large-dataset/l1000_vcap_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/l1000_vcap_random_splits.pt`\n\n    # Featurization\n    prepare_dict_or_graph: pyg:graph\n    featurization_n_jobs: 30\n    featurization_progress: True\n    featurization_backend: \"loky\"\n    processed_graph_data_path: \"../datacache/neurips2023-large/vcap/\"\n    featurization:\n    # OGB: ['atomic_num', 'degree', 'possible_formal_charge', 'possible_numH' (total-valence),\n    # 'possible_number_radical_e', 'possible_is_aromatic', 'possible_is_in_ring',\n    # 'num_chiral_centers (not included yet)']\n      atom_property_list_onehot: [atomic-number, group, period, total-valence]\n      atom_property_list_float: [degree, formal-charge, radical-electron, aromatic, in-ring]\n      # OGB: ['possible_bond_type', 'possible_bond_stereo', 'possible_is_in_ring']\n      edge_property_list: [bond-type-onehot, stereo, in-ring]\n      add_self_loop: False\n      explicit_H: False # whether explicit hydrogens are included\n      use_bonds_weights: False\n      pos_encoding_as_features: # encoder dropout 0.18\n        pos_types:\n          lap_eigvec:\n            pos_level: node\n            pos_type: laplacian_eigvec\n            num_pos: 8\n            normalization: \"none\" # normalization already applied to the eigenvectors\n            disconnected_comp: True # whether to include eigenvalues/eigenvectors of disconnected components\n          lap_eigval:\n
            pos_level: node\n            pos_type: laplacian_eigval\n            num_pos: 8\n            normalization: \"none\" # normalization already applied to the eigenvectors\n            disconnected_comp: True # whether to include eigenvalues/eigenvectors of disconnected components\n          rw_pos: # use same name as pe_encoder\n            pos_level: node\n            pos_type: rw_return_probs\n            ksteps: 16\n\n    num_workers: 30 # -1 to use all\n    persistent_workers: False # whether to keep dataloader workers alive between epochs.\n    # Setting persistent_workers to False can make the start of each epoch very slow.\n\n\narchitecture:\n  model_type: FullGraphMultiTaskNetwork\n  mup_base_path: null\n  pre_nn:   # Set as null to avoid a pre-nn network\n    out_dim: 64\n    hidden_dims: 256\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.18\n    normalization: &normalization layer_norm\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges: null\n\n  pe_encoders:\n    out_dim: 32\n    pool: \"sum\" #\"mean\" \"max\"\n    last_norm: None #\"batch_norm\", \"layer_norm\"\n    encoders: #la_pos |  rw_pos\n      la_pos:  # Set as null to avoid a pre-nn network\n        encoder_type: \"laplacian_pe\"\n        input_keys: [\"laplacian_eigvec\", \"laplacian_eigval\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        model_type: 'DeepSet' #'Transformer' or 'DeepSet'\n        num_layers: 2\n        num_layers_post: 1 # Num. 
layers to apply after pooling\n        dropout: 0.1\n        first_normalization: \"none\" #\"batch_norm\" or \"layer_norm\"\n      rw_pos:\n        encoder_type: \"mlp\"\n        input_keys: [\"rw_return_probs\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        num_layers: 2\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n        first_normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n\n\n\n  gnn:  # Set as null to avoid a post-nn network\n    in_dim: 64 # or otherwise the correct value\n    out_dim: &gnn_dim 704\n    hidden_dims: *gnn_dim\n    depth: 4\n    activation: gelu\n    last_activation: none\n    dropout: 0.1\n    normalization: \"layer_norm\"\n    last_normalization: *normalization\n    residual_type: simple\n    virtual_node: 'none'\n    layer_type: 'pyg:gin' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n\n\n  graph_output_nn:\n    graph:\n      pooling: [sum]\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n  task_heads:\n    l1000_vcap:\n      task_level: graph\n      out_dim: 4890\n      hidden_dims: 128\n      depth: 2\n      activation: none\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n#Task-specific\npredictor:\n  metrics_on_progress_bar:\n    l1000_vcap: []\n  metrics_on_training_set:\n    l1000_vcap: []\n  loss_fun:\n    l1000_vcap:\n      name: hybrid_ce_ipu\n      n_brackets: 5\n  random_seed: *seed\n  optim_kwargs:\n    lr: 1.e-4 # warmup can be scheduled using torch_scheduler_kwargs\n    # weight_decay: 1.e-7\n  torch_scheduler_kwargs:\n    module_type: WarmUpLinearLR\n    max_num_epochs: &max_epochs 20\n    warmup_epochs: 
10\n    verbose: False\n  scheduler_kwargs:\n  #  monitor: &monitor qm9/mae/train\n  #  mode: min\n  #  frequency: 1\n  target_nan_mask: null # null: no mask, 0: 0 mask, ignore-flatten, ignore-mean-per-label\n  multitask_handling: flatten # flatten, mean-per-label\n\n# Task-specific\nmetrics:\n  l1000_vcap: &classif_metrics\n    - name: auroc\n      metric: auroc_ipu\n      num_classes: 5\n      task: multiclass\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n    - name: avpr\n      metric: average_precision_ipu\n      num_classes: 5\n      task: multiclass\n      target_to_int: True\n      target_nan_mask: -1000\n      ignore_index: -1000\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n\ntrainer:\n  seed: *seed\n  logger:\n    save_dir: logs/neurips2023-large/\n    name: *name\n    project: *name\n  #early_stopping:\n  #  monitor: *monitor\n  #  min_delta: 0\n  #  patience: 10\n  #  mode: &mode min\n  model_checkpoint:\n    dirpath: models_checkpoints/neurips2023-large-gine/vcap/\n    filename: *name\n    # monitor: *monitor\n    # mode: *mode\n    # save_top_k: 1\n    save_last: True\n  trainer:\n    max_epochs: *max_epochs\n    min_epochs: 1\n    check_val_every_n_epoch: 20\n"
  },
  {
    "path": "expts/neurips2023_configs/single_task_gine/config_large_gine_g25.yaml",
    "content": "# Testing the gine model with the PCQMv2 dataset on IPU.\nconstants:\n  name: &name neurips2023_large_data_gine_g25\n  seed: &seed 42\n  raise_train_error: true   # Whether the code should raise an error if it crashes during training\n\naccelerator:\n  type: ipu  # cpu or ipu or gpu\n  config_override:\n    datamodule:\n      args:\n        ipu_dataloader_training_opts:\n          mode: async\n          max_num_nodes_per_graph: 30 # train max nodes: 20, max_edges: 54\n          max_num_edges_per_graph: 100\n        ipu_dataloader_inference_opts:\n          mode: async\n          max_num_nodes_per_graph: 30 # valid max nodes: 51, max_edges: 118\n          max_num_edges_per_graph: 100\n        # Data handling-related\n        batch_size_training: 10\n        batch_size_inference: 10\n    predictor:\n      optim_kwargs:\n        loss_scaling: 1024\n    trainer:\n      trainer:\n        precision: 32\n        accumulate_grad_batches: 8\n\n  ipu_config:\n    - deviceIterations(10) # IPU would require large batches to be ready for the model.\n    - replicationFactor(16)\n    # - enableProfiling(\"graph_analyser\")       # The folder where the profile will be stored\n    # - enableExecutableCaching(\"pop_compiler_cache\")\n    - TensorLocations.numIOTiles(128)\n    - _Popart.set(\"defaultBufferingDepth\", 128)\n    - Precision.enableStochasticRounding(True)\n\n# accelerator:\n#   type: cpu  # cpu or ipu or gpu\n#   config_override:\n#     datamodule:\n#       batch_size_training: 64\n#       batch_size_inference: 256\n#     trainer:\n#       trainer:\n#         precision: 32\n#         accumulate_grad_batches: 1\n\ndatamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  # module_type: \"FakeDataModule\"  # Option to use generated data\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      pcqm4m_g25:\n        df: null\n        df_path: 
graphium/data/neurips2023/large-dataset/PCQM4M_G25_N4.parquet\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/PCQM4M_G25_N4.parquet\n        # or set path as the URL directly\n        smiles_col: \"ordered_smiles\"\n        label_cols: graph_*  # graph_* means all columns starting with \"graph_\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: graph\n        splits_path: graphium/data/neurips2023/large-dataset/pcqm4m_g25_n4_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/pcqm4m_g25_n4_random_splits.pt`\n        label_normalization:\n          normalize_val_test: True\n          method: \"normal\"\n\n    # Featurization\n    prepare_dict_or_graph: pyg:graph\n    featurization_n_jobs: 30\n    featurization_progress: True\n    featurization_backend: \"loky\"\n    processed_graph_data_path: \"../datacache/neurips2023-large/g25/\"\n    featurization:\n    # OGB: ['atomic_num', 'degree', 'possible_formal_charge', 'possible_numH' (total-valence),\n    # 'possible_number_radical_e', 'possible_is_aromatic', 'possible_is_in_ring',\n    # 'num_chiral_centers (not included yet)']\n      atom_property_list_onehot: [atomic-number, group, period, total-valence]\n      atom_property_list_float: [degree, formal-charge, radical-electron, aromatic, in-ring]\n      # OGB: ['possible_bond_type', 'possible_bond_stereo', 'possible_is_in_ring']\n      edge_property_list: [bond-type-onehot, stereo, in-ring]\n      add_self_loop: False\n      explicit_H: False # whether explicit hydrogens are included\n      use_bonds_weights: False\n      pos_encoding_as_features: # encoder dropout 0.18\n        pos_types:\n          lap_eigvec:\n            pos_level: node\n            pos_type: laplacian_eigvec\n            num_pos: 8\n            normalization: \"none\" # normalization already applied to the eigenvectors\n            disconnected_comp: True # whether to include eigenvalues/eigenvectors of disconnected components\n
          lap_eigval:\n            pos_level: node\n            pos_type: laplacian_eigval\n            num_pos: 8\n            normalization: \"none\" # normalization already applied to the eigenvectors\n            disconnected_comp: True # whether to include eigenvalues/eigenvectors of disconnected components\n          rw_pos: # use same name as pe_encoder\n            pos_level: node\n            pos_type: rw_return_probs\n            ksteps: 16\n\n    num_workers: 30 # -1 to use all\n    persistent_workers: False # whether to keep dataloader workers alive between epochs.\n    # Setting persistent_workers to False can make the start of each epoch very slow.\n\n\narchitecture:\n  model_type: FullGraphMultiTaskNetwork\n  mup_base_path: null\n  pre_nn:   # Set as null to avoid a pre-nn network\n    out_dim: 64\n    hidden_dims: 256\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.18\n    normalization: &normalization layer_norm\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges:   # Set as null to avoid a pre-nn network\n    out_dim: 32\n    hidden_dims: 128\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: *dropout\n    normalization: *normalization\n    last_normalization: *normalization\n    residual_type: none\n\n  pe_encoders:\n    out_dim: 32\n    pool: \"sum\" #\"mean\" \"max\"\n    last_norm: None #\"batch_norm\", \"layer_norm\"\n    encoders: #la_pos |  rw_pos\n      la_pos:  # Set as null to avoid a pre-nn network\n        encoder_type: \"laplacian_pe\"\n        input_keys: [\"laplacian_eigvec\", \"laplacian_eigval\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        model_type: 'DeepSet' #'Transformer' or 'DeepSet'\n        num_layers: 2\n        num_layers_post: 1 # Num. 
layers to apply after pooling\n        dropout: 0.1\n        first_normalization: \"none\" #\"batch_norm\" or \"layer_norm\"\n      rw_pos:\n        encoder_type: \"mlp\"\n        input_keys: [\"rw_return_probs\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        num_layers: 2\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n        first_normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n\n\n\n  gnn:  # Set as null to avoid a post-nn network\n    out_dim: &gnn_dim 704\n    hidden_dims: *gnn_dim\n    depth: 4\n    activation: gelu\n    last_activation: none\n    dropout: 0.1\n    normalization: \"layer_norm\"\n    last_normalization: *normalization\n    residual_type: simple\n    virtual_node: 'none'\n    layer_type: 'pyg:gine' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n\n\n  graph_output_nn:\n    graph:\n      pooling: [sum]\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n  task_heads:\n    pcqm4m_g25:\n      task_level: graph\n      out_dim: 25\n      hidden_dims: 32\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n#Task-specific\npredictor:\n  metrics_on_progress_bar:\n    pcqm4m_g25: []\n  metrics_on_training_set:\n    pcqm4m_g25: []\n  loss_fun:\n    pcqm4m_g25: mae_ipu\n  random_seed: *seed\n  optim_kwargs:\n    lr: 1.e-4 # warmup can be scheduled using torch_scheduler_kwargs\n    # weight_decay: 1.e-7\n  torch_scheduler_kwargs:\n    module_type: WarmUpLinearLR\n    max_num_epochs: &max_epochs 20\n    warmup_epochs: 10\n    verbose: False\n  scheduler_kwargs:\n  #  monitor: &monitor qm9/mae/train\n  #  
mode: min\n  #  frequency: 1\n  target_nan_mask: null # null: no mask, 0: 0 mask, ignore-flatten, ignore-mean-per-label\n  multitask_handling: flatten # flatten, mean-per-label\n\n# Task-specific\nmetrics:\n  pcqm4m_g25: &pcqm_metrics\n    - name: mae\n      metric: mae_ipu\n      target_nan_mask: null\n      multitask_handling: flatten\n      threshold_kwargs: null\n    - name: pearsonr\n      metric: pearsonr_ipu\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n    - name: r2\n      metric: r2_score_ipu\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n\ntrainer:\n  seed: *seed\n  logger:\n    save_dir: logs/neurips2023-large/\n    name: *name\n    project: *name\n  #early_stopping:\n  #  monitor: *monitor\n  #  min_delta: 0\n  #  patience: 10\n  #  mode: &mode min\n  model_checkpoint:\n    dirpath: models_checkpoints/neurips2023-large-gine/g25/\n    filename: *name\n    # monitor: *monitor\n    # mode: *mode\n    # save_top_k: 1\n    save_last: True\n  trainer:\n    max_epochs: *max_epochs\n    min_epochs: 1\n    check_val_every_n_epoch: 20\n"
  },
  {
    "path": "expts/neurips2023_configs/single_task_gine/config_large_gine_mcf7.yaml",
"content": "# Testing the gine model with the LINCS L1000 MCF7 dataset on IPU.\nconstants:\n  name: &name neurips2023_large_data_gine_mcf7\n  seed: &seed 42\n  raise_train_error: true   # Whether the code should raise an error if it crashes during training\n\naccelerator:\n  type: ipu  # cpu or ipu or gpu\n  config_override:\n    datamodule:\n      args:\n        ipu_dataloader_training_opts:\n          mode: async\n          max_num_nodes_per_graph: 60 # train max nodes: 20, max_edges: 54\n          max_num_edges_per_graph: 100\n        ipu_dataloader_inference_opts:\n          mode: async\n          max_num_nodes_per_graph: 60 # valid max nodes: 51, max_edges: 118\n          max_num_edges_per_graph: 100\n        # Data handling-related\n        batch_size_training: 10\n        batch_size_inference: 2\n    predictor:\n      optim_kwargs:\n        loss_scaling: 1024\n    trainer:\n      trainer:\n        precision: 32\n        accumulate_grad_batches: 8\n\n  ipu_config:\n    - deviceIterations(1) # IPU would require large batches to be ready for the model.\n    - replicationFactor(16)\n    # - enableProfiling(\"graph_analyser\")       # The folder where the profile will be stored\n    # - enableExecutableCaching(\"pop_compiler_cache\")\n    - TensorLocations.numIOTiles(128)\n    - _Popart.set(\"defaultBufferingDepth\", 128)\n    - Precision.enableStochasticRounding(True)\n\n# accelerator:\n#   type: cpu  # cpu or ipu or gpu\n#   config_override:\n#     datamodule:\n#       batch_size_training: 64\n#       batch_size_inference: 256\n#     trainer:\n#       trainer:\n#         precision: 32\n#         accumulate_grad_batches: 1\n\ndatamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  # module_type: \"FakeDataModule\"  # Option to use generated data\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      l1000_mcf7:\n        df: null\n        df_path: 
graphium/data/neurips2023/large-dataset/LINCS_L1000_MCF7_0-4.csv.gz\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/LINCS_L1000_MCF7_0-4.csv.gz\n        # or set path as the URL directly\n        smiles_col: \"SMILES\"\n        label_cols: geneID-*  # geneID-* means all columns starting with \"geneID-\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: graph\n        splits_path: graphium/data/neurips2023/large-dataset/l1000_mcf7_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/l1000_mcf7_random_splits.pt`\n\n    # Featurization\n    prepare_dict_or_graph: pyg:graph\n    featurization_n_jobs: 30\n    featurization_progress: True\n    featurization_backend: \"loky\"\n    processed_graph_data_path: \"../datacache/neurips2023-large/mcf7/\"\n    featurization:\n    # OGB: ['atomic_num', 'degree', 'possible_formal_charge', 'possible_numH' (total-valence),\n    # 'possible_number_radical_e', 'possible_is_aromatic', 'possible_is_in_ring',\n    # 'num_chiral_centers (not included yet)']\n      atom_property_list_onehot: [atomic-number, group, period, total-valence]\n      atom_property_list_float: [degree, formal-charge, radical-electron, aromatic, in-ring]\n      # OGB: ['possible_bond_type', 'possible_bond_stereo', 'possible_is_in_ring']\n      edge_property_list: [bond-type-onehot, stereo, in-ring]\n      add_self_loop: False\n      explicit_H: False # whether explicit hydrogens are included\n      use_bonds_weights: False\n      pos_encoding_as_features: # encoder dropout 0.18\n        pos_types:\n          lap_eigvec:\n            pos_level: node\n            pos_type: laplacian_eigvec\n            num_pos: 8\n            normalization: \"none\" # normalization already applied to the eigenvectors\n            disconnected_comp: True # whether to include eigenvalues/eigenvectors of disconnected components\n          lap_eigval:\n
            pos_level: node\n            pos_type: laplacian_eigval\n            num_pos: 8\n            normalization: \"none\" # normalization already applied to the eigenvectors\n            disconnected_comp: True # whether to include eigenvalues/eigenvectors of disconnected components\n          rw_pos: # use same name as pe_encoder\n            pos_level: node\n            pos_type: rw_return_probs\n            ksteps: 16\n\n    num_workers: 30 # -1 to use all\n    persistent_workers: False # whether to keep dataloader workers alive between epochs.\n    # Setting persistent_workers to False can make the start of each epoch very slow.\n\n\narchitecture:\n  model_type: FullGraphMultiTaskNetwork\n  mup_base_path: null\n  pre_nn:   # Set as null to avoid a pre-nn network\n    out_dim: 64\n    hidden_dims: 256\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.18\n    normalization: &normalization layer_norm\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges:   # Set as null to avoid a pre-nn network\n    out_dim: 32\n    hidden_dims: 128\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: *dropout\n    normalization: *normalization\n    last_normalization: *normalization\n    residual_type: none\n\n  pe_encoders:\n    out_dim: 32\n    pool: \"sum\" #\"mean\" \"max\"\n    last_norm: None #\"batch_norm\", \"layer_norm\"\n    encoders: #la_pos |  rw_pos\n      la_pos:  # Set as null to avoid a pre-nn network\n        encoder_type: \"laplacian_pe\"\n        input_keys: [\"laplacian_eigvec\", \"laplacian_eigval\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        model_type: 'DeepSet' #'Transformer' or 'DeepSet'\n        num_layers: 2\n        num_layers_post: 1 # Num. 
layers to apply after pooling\n        dropout: 0.1\n        first_normalization: \"none\" #\"batch_norm\" or \"layer_norm\"\n      rw_pos:\n        encoder_type: \"mlp\"\n        input_keys: [\"rw_return_probs\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        num_layers: 2\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n        first_normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n\n\n\n  gnn:  # Set as null to avoid a post-nn network\n    out_dim: &gnn_dim 704\n    hidden_dims: *gnn_dim\n    depth: 4\n    activation: gelu\n    last_activation: none\n    dropout: 0.1\n    normalization: \"layer_norm\"\n    last_normalization: *normalization\n    residual_type: simple\n    virtual_node: 'none'\n    layer_type: 'pyg:gine' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n\n\n  graph_output_nn:\n    graph:\n      pooling: [sum]\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n\n  task_heads:\n    l1000_mcf7:\n      task_level: graph\n      out_dim: 4890\n      hidden_dims: 128\n      depth: 2\n      activation: none\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n\n#Task-specific\npredictor:\n  metrics_on_progress_bar:\n    l1000_mcf7: []\n  metrics_on_training_set:\n    l1000_mcf7: []\n  loss_fun:\n    l1000_mcf7:\n      name: hybrid_ce_ipu\n      n_brackets: 5\n  random_seed: *seed\n  optim_kwargs:\n    lr: 1.e-4 # warmup can be scheduled using torch_scheduler_kwargs\n    # weight_decay: 1.e-7\n  torch_scheduler_kwargs:\n    module_type: WarmUpLinearLR\n    max_num_epochs: &max_epochs 20\n    warmup_epochs: 10\n    verbose: False\n  
scheduler_kwargs:\n  #  monitor: &monitor qm9/mae/train\n  #  mode: min\n  #  frequency: 1\n  target_nan_mask: null # null: no mask, 0: 0 mask, ignore-flatten, ignore-mean-per-label\n  multitask_handling: flatten # flatten, mean-per-label\n\n# Task-specific\nmetrics:\n  l1000_mcf7: &classif_metrics\n    - name: auroc\n      metric: auroc_ipu\n      num_classes: 5\n      task: multiclass\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n    - name: avpr\n      metric: average_precision_ipu\n      num_classes: 5\n      task: multiclass\n      target_to_int: True\n      target_nan_mask: -1000\n      ignore_index: -1000\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n\ntrainer:\n  seed: *seed\n  logger:\n    save_dir: logs/neurips2023-large/\n    name: *name\n    project: *name\n  #early_stopping:\n  #  monitor: *monitor\n  #  min_delta: 0\n  #  patience: 10\n  #  mode: &mode min\n  model_checkpoint:\n    dirpath: models_checkpoints/neurips2023-large-gine/mcf7/\n    filename: *name\n    # monitor: *monitor\n    # mode: *mode\n    # save_top_k: 1\n    save_last: True\n  trainer:\n    max_epochs: *max_epochs\n    min_epochs: 1\n    check_val_every_n_epoch: 20\n"
  },
  {
    "path": "expts/neurips2023_configs/single_task_gine/config_large_gine_n4.yaml",
    "content": "# Testing the gine model with the PCQMv2 dataset on IPU.\nconstants:\n  name: &name neurips2023_large_data_gine_n4\n  seed: &seed 42\n  raise_train_error: true   # Whether the code should raise an error if it crashes during training\n\naccelerator:\n  type: ipu  # cpu or ipu or gpu\n  config_override:\n    datamodule:\n      args:\n        ipu_dataloader_training_opts:\n          mode: async\n          max_num_nodes_per_graph: 30 # train max nodes: 20, max_edges: 54\n          max_num_edges_per_graph: 100\n        ipu_dataloader_inference_opts:\n          mode: async\n          max_num_nodes_per_graph: 30 # valid max nodes: 51, max_edges: 118\n          max_num_edges_per_graph: 100\n        # Data handling-related\n        batch_size_training: 10\n        batch_size_inference: 10\n    predictor:\n      optim_kwargs:\n        loss_scaling: 1024\n    trainer:\n      trainer:\n        precision: 32\n        accumulate_grad_batches: 8\n\n  ipu_config:\n    - deviceIterations(10) # IPU would require large batches to be ready for the model.\n    - replicationFactor(16)\n    # - enableProfiling(\"graph_analyser\")       # The folder where the profile will be stored\n    # - enableExecutableCaching(\"pop_compiler_cache\")\n    - TensorLocations.numIOTiles(128)\n    - _Popart.set(\"defaultBufferingDepth\", 128)\n    - Precision.enableStochasticRounding(True)\n\n# accelerator:\n#   type: cpu  # cpu or ipu or gpu\n#   config_override:\n#     datamodule:\n#       batch_size_training: 64\n#       batch_size_inference: 256\n#     trainer:\n#       trainer:\n#         precision: 32\n#         accumulate_grad_batches: 1\n\ndatamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  # module_type: \"FakeDataModule\"  # Option to use generated data\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      pcqm4m_n4:\n        df: null\n        df_path: 
graphium/data/neurips2023/large-dataset/PCQM4M_G25_N4.parquet\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/PCQM4M_G25_N4.parquet\n        # or set path as the URL directly\n        smiles_col: \"ordered_smiles\"\n        label_cols: node_* # node_* means all columns starting with \"node_\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: node\n        splits_path: graphium/data/neurips2023/large-dataset/pcqm4m_g25_n4_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/pcqm4m_g25_n4_random_splits.pt`\n        seed: *seed\n        label_normalization:\n          normalize_val_test: True\n          method: \"normal\"\n\n    # Featurization\n    prepare_dict_or_graph: pyg:graph\n    featurization_n_jobs: 30\n    featurization_progress: True\n    featurization_backend: \"loky\"\n    processed_graph_data_path: \"../datacache/neurips2023-large/n4/\"\n    featurization:\n    # OGB: ['atomic_num', 'degree', 'possible_formal_charge', 'possible_numH' (total-valence),\n    # 'possible_number_radical_e', 'possible_is_aromatic', 'possible_is_in_ring',\n    # 'num_chiral_centers (not included yet)']\n      atom_property_list_onehot: [atomic-number, group, period, total-valence]\n      atom_property_list_float: [degree, formal-charge, radical-electron, aromatic, in-ring]\n      # OGB: ['possible_bond_type', 'possible_bond_stereo', 'possible_is_in_ring']\n      edge_property_list: [bond-type-onehot, stereo, in-ring]\n      add_self_loop: False\n      explicit_H: False # whether explicit hydrogens are included\n      use_bonds_weights: False\n      pos_encoding_as_features: # encoder dropout 0.18\n        pos_types:\n          lap_eigvec:\n            pos_level: node\n            pos_type: laplacian_eigvec\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # whether to include eigenvalues/eigenvectors of disconnected components\n          lap_eigval:\n            pos_level: node\n            pos_type: laplacian_eigval\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # whether to include eigenvalues/eigenvectors of disconnected components\n          rw_pos: # use same name as pe_encoder\n            pos_level: node\n            pos_type: rw_return_probs\n            ksteps: 16\n\n    num_workers: 30 # -1 to use all\n    persistent_workers: False # whether to keep dataloader workers alive between epochs.\n    # Setting persistent_workers to False can make the start of each epoch very slow.\n\n\narchitecture:\n  model_type: FullGraphMultiTaskNetwork\n  mup_base_path: null\n  pre_nn:   # Set as null to avoid a pre-nn network\n    out_dim: 64\n    hidden_dims: 256\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.18\n    normalization: &normalization layer_norm\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges:   # Set as null to avoid a pre-nn network\n    out_dim: 32\n    hidden_dims: 128\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: *dropout\n    normalization: *normalization\n    last_normalization: *normalization\n    residual_type: none\n\n  pe_encoders:\n    out_dim: 32\n    pool: \"sum\" #\"mean\" \"max\"\n    last_norm: None #\"batch_norm\", \"layer_norm\"\n    encoders: #la_pos |  rw_pos\n      la_pos:  # Set as null to avoid a pre-nn network\n        encoder_type: \"laplacian_pe\"\n        input_keys: [\"laplacian_eigvec\", \"laplacian_eigval\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        model_type: 'DeepSet' #'Transformer' or 'DeepSet'\n        num_layers: 2\n        num_layers_post: 1 # Num. 
layers to apply after pooling\n        dropout: 0.1\n        first_normalization: \"none\" #\"batch_norm\" or \"layer_norm\"\n      rw_pos:\n        encoder_type: \"mlp\"\n        input_keys: [\"rw_return_probs\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        num_layers: 2\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n        first_normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n\n\n\n  gnn:  # Set as null to avoid a post-nn network\n    out_dim: &gnn_dim 704\n    hidden_dims: *gnn_dim\n    depth: 4\n    activation: gelu\n    last_activation: none\n    dropout: 0.1\n    normalization: \"layer_norm\"\n    last_normalization: *normalization\n    residual_type: simple\n    virtual_node: 'none'\n    layer_type: 'pyg:gine' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n\n\n  graph_output_nn:\n    node:\n      pooling: [sum]\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n  task_heads:\n    pcqm4m_n4:\n      task_level: node\n      out_dim: 4\n      hidden_dims: 32\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n#Task-specific\npredictor:\n  metrics_on_progress_bar:\n    pcqm4m_n4: []\n  metrics_on_training_set:\n    pcqm4m_n4: []\n  loss_fun:\n    pcqm4m_n4: mae_ipu\n  random_seed: *seed\n  optim_kwargs:\n    lr: 1.e-4 # warmup can be scheduled using torch_scheduler_kwargs\n    # weight_decay: 1.e-7\n  torch_scheduler_kwargs:\n    module_type: WarmUpLinearLR\n    max_num_epochs: &max_epochs 20\n    warmup_epochs: 10\n    verbose: False\n  scheduler_kwargs:\n  #  monitor: &monitor qm9/mae/train\n  #  mode: 
min\n  #  frequency: 1\n  target_nan_mask: null # null: no mask, 0: 0 mask, ignore-flatten, ignore-mean-per-label\n  multitask_handling: flatten # flatten, mean-per-label\n\n# Task-specific\nmetrics:\n  pcqm4m_n4: &pcqm_metrics\n    - name: mae\n      metric: mae_ipu\n      target_nan_mask: null\n      multitask_handling: flatten\n      threshold_kwargs: null\n    - name: pearsonr\n      metric: pearsonr_ipu\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n    - name: r2\n      metric: r2_score_ipu\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n\ntrainer:\n  seed: *seed\n  logger:\n    save_dir: logs/neurips2023-large/\n    name: *name\n    project: *name\n  #early_stopping:\n  #  monitor: *monitor\n  #  min_delta: 0\n  #  patience: 10\n  #  mode: &mode min\n  model_checkpoint:\n    dirpath: models_checkpoints/neurips2023-large-gine/n4/\n    filename: *name\n    # monitor: *monitor\n    # mode: *mode\n    # save_top_k: 1\n    save_last: True\n  trainer:\n    max_epochs: *max_epochs\n    min_epochs: 1\n    check_val_every_n_epoch: 20\n"
  },
  {
    "path": "expts/neurips2023_configs/single_task_gine/config_large_gine_pcba.yaml",
    "content": "# Testing the gine model with the PCQMv2 dataset on IPU.\nconstants:\n  name: &name neurips2023_large_data_gine_pcba\n  seed: &seed 42\n  raise_train_error: true   # Whether the code should raise an error if it crashes during training\n\naccelerator:\n  type: ipu  # cpu or ipu or gpu\n  config_override:\n    datamodule:\n      args:\n        ipu_dataloader_training_opts:\n          mode: async\n          max_num_nodes_per_graph: 60 # train max nodes: 20, max_edges: 54\n          max_num_edges_per_graph: 100\n        ipu_dataloader_inference_opts:\n          mode: async\n          max_num_nodes_per_graph: 200 # valid max nodes: 51, max_edges: 118\n          max_num_edges_per_graph: 400\n        # Data handling-related\n        batch_size_training: 10\n        batch_size_inference: 2\n    predictor:\n      optim_kwargs:\n        loss_scaling: 1024\n    trainer:\n      trainer:\n        precision: 32\n        accumulate_grad_batches: 8\n\n  ipu_config:\n    - deviceIterations(10) # IPU would require large batches to be ready for the model.\n    - replicationFactor(16)\n    # - enableProfiling(\"graph_analyser\")       # The folder where the profile will be stored\n    # - enableExecutableCaching(\"pop_compiler_cache\")\n    - TensorLocations.numIOTiles(128)\n    - _Popart.set(\"defaultBufferingDepth\", 128)\n    - Precision.enableStochasticRounding(True)\n\n# accelerator:\n#   type: cpu  # cpu or ipu or gpu\n#   config_override:\n#     datamodule:\n#       batch_size_training: 64\n#       batch_size_inference: 256\n#     trainer:\n#       trainer:\n#         precision: 32\n#         accumulate_grad_batches: 1\n\ndatamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  # module_type: \"FakeDataModule\"  # Option to use generated data\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      pcba_1328:\n        df: null\n        df_path: 
graphium/data/neurips2023/large-dataset/PCBA_1328_1564k.parquet\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/PCBA_1328_1564k.parquet\n        # or set path as the URL directly\n        smiles_col: \"SMILES\"\n        label_cols: assayID-*  # assayID-* means all columns starting with \"assayID-\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: graph\n        splits_path: graphium/data/neurips2023/large-dataset/pcba_1328_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/pcba_1328_random_splits.pt`\n\n    # Featurization\n    prepare_dict_or_graph: pyg:graph\n    featurization_n_jobs: 30\n    featurization_progress: True\n    featurization_backend: \"loky\"\n    processed_graph_data_path: \"../datacache/neurips2023-large/pcba/\"\n    featurization:\n    # OGB: ['atomic_num', 'degree', 'possible_formal_charge', 'possible_numH' (total-valence),\n    # 'possible_number_radical_e', 'possible_is_aromatic', 'possible_is_in_ring',\n    # 'num_chiral_centers (not included yet)']\n      atom_property_list_onehot: [atomic-number, group, period, total-valence]\n      atom_property_list_float: [degree, formal-charge, radical-electron, aromatic, in-ring]\n      # OGB: ['possible_bond_type', 'possible_bond_stereo', 'possible_is_in_ring']\n      edge_property_list: [bond-type-onehot, stereo, in-ring]\n      add_self_loop: False\n      explicit_H: False # whether explicit hydrogens are included\n      use_bonds_weights: False\n      pos_encoding_as_features: # encoder dropout 0.18\n        pos_types:\n          lap_eigvec:\n            pos_level: node\n            pos_type: laplacian_eigvec\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # whether to include eigenvalues/eigenvectors of disconnected components\n          lap_eigval:\n            pos_level: node\n            pos_type: laplacian_eigval\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # whether to include eigenvalues/eigenvectors of disconnected components\n          rw_pos: # use same name as pe_encoder\n            pos_level: node\n            pos_type: rw_return_probs\n            ksteps: 16\n\n    num_workers: 30 # -1 to use all\n    persistent_workers: False # whether to keep dataloader workers alive between epochs.\n    # Setting persistent_workers to False can make the start of each epoch very slow.\n\n\narchitecture:\n  model_type: FullGraphMultiTaskNetwork\n  mup_base_path: null\n  pre_nn:   # Set as null to avoid a pre-nn network\n    out_dim: 64\n    hidden_dims: 256\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.18\n    normalization: &normalization layer_norm\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges:   # Set as null to avoid a pre-nn network\n    out_dim: 32\n    hidden_dims: 128\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: *dropout\n    normalization: *normalization\n    last_normalization: *normalization\n    residual_type: none\n\n  pe_encoders:\n    out_dim: 32\n    pool: \"sum\" #\"mean\" \"max\"\n    last_norm: None #\"batch_norm\", \"layer_norm\"\n    encoders: #la_pos |  rw_pos\n      la_pos:  # Set as null to avoid a pre-nn network\n        encoder_type: \"laplacian_pe\"\n        input_keys: [\"laplacian_eigvec\", \"laplacian_eigval\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        model_type: 'DeepSet' #'Transformer' or 'DeepSet'\n        num_layers: 2\n        num_layers_post: 1 # Num. 
layers to apply after pooling\n        dropout: 0.1\n        first_normalization: \"none\" #\"batch_norm\" or \"layer_norm\"\n      rw_pos:\n        encoder_type: \"mlp\"\n        input_keys: [\"rw_return_probs\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        num_layers: 2\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n        first_normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n\n\n\n  gnn:  # Set as null to avoid a post-nn network\n    out_dim: &gnn_dim 704\n    hidden_dims: *gnn_dim\n    depth: 4\n    activation: gelu\n    last_activation: none\n    dropout: 0.1\n    normalization: \"layer_norm\"\n    last_normalization: *normalization\n    residual_type: simple\n    virtual_node: 'none'\n    layer_type: 'pyg:gine' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n\n\n  graph_output_nn:\n    graph:\n      pooling: [sum]\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n  task_heads:\n    pcba_1328:\n      task_level: graph\n      out_dim: 1328\n      hidden_dims: 64\n      depth: 2\n      activation: relu\n      last_activation: sigmoid\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n#Task-specific\npredictor:\n  metrics_on_progress_bar:\n    pcba_1328: []\n  metrics_on_training_set:\n    pcba_1328: []\n  loss_fun:\n    pcba_1328: bce_ipu\n  random_seed: *seed\n  optim_kwargs:\n    lr: 1.e-4 # warmup can be scheduled using torch_scheduler_kwargs\n    # weight_decay: 1.e-7\n  torch_scheduler_kwargs:\n    module_type: WarmUpLinearLR\n    max_num_epochs: &max_epochs 20\n    warmup_epochs: 10\n    verbose: False\n  scheduler_kwargs:\n  #  monitor: &monitor qm9/mae/train\n  #  
mode: min\n  #  frequency: 1\n  target_nan_mask: null # null: no mask, 0: 0 mask, ignore-flatten, ignore-mean-per-label\n  multitask_handling: flatten # flatten, mean-per-label\n\n# Task-specific\nmetrics:\n  pcba_1328:\n    - name: auroc\n      metric: auroc_ipu\n      task: binary\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n    - name: avpr\n      metric: average_precision_ipu\n      task: binary\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n\ntrainer:\n  seed: *seed\n  logger:\n    save_dir: logs/neurips2023-large/\n    name: *name\n    project: *name\n  #early_stopping:\n  #  monitor: *monitor\n  #  min_delta: 0\n  #  patience: 10\n  #  mode: &mode min\n  model_checkpoint:\n    dirpath: models_checkpoints/neurips2023-large-gine/pcba/\n    filename: *name\n    # monitor: *monitor\n    # mode: *mode\n    # save_top_k: 1\n    save_last: True\n  trainer:\n    max_epochs: *max_epochs\n    min_epochs: 1\n    check_val_every_n_epoch: 20\n"
  },
  {
    "path": "expts/neurips2023_configs/single_task_gine/config_large_gine_pcq.yaml",
    "content": "# Testing the gine model with the PCQMv2 dataset on IPU.\nconstants:\n  name: &name neurips2023_large_data_gine_pcq\n  seed: &seed 42\n  raise_train_error: true   # Whether the code should raise an error if it crashes during training\n\naccelerator:\n  type: ipu  # cpu or ipu or gpu\n  config_override:\n    datamodule:\n      args:\n        ipu_dataloader_training_opts:\n          mode: async\n          max_num_nodes_per_graph: 30 # train max nodes: 20, max_edges: 54\n          max_num_edges_per_graph: 100\n        ipu_dataloader_inference_opts:\n          mode: async\n          max_num_nodes_per_graph: 30 # valid max nodes: 51, max_edges: 118\n          max_num_edges_per_graph: 100\n        # Data handling-related\n        batch_size_training: 10\n        batch_size_inference: 10\n    predictor:\n      optim_kwargs:\n        loss_scaling: 1024\n    trainer:\n      trainer:\n        precision: 32\n        accumulate_grad_batches: 8\n\n  ipu_config:\n    - deviceIterations(10) # IPU would require large batches to be ready for the model.\n    - replicationFactor(16)\n    # - enableProfiling(\"graph_analyser\")       # The folder where the profile will be stored\n    # - enableExecutableCaching(\"pop_compiler_cache\")\n    - TensorLocations.numIOTiles(128)\n    - _Popart.set(\"defaultBufferingDepth\", 128)\n    - Precision.enableStochasticRounding(True)\n\n# accelerator:\n#   type: cpu  # cpu or ipu or gpu\n#   config_override:\n#     datamodule:\n#       batch_size_training: 64\n#       batch_size_inference: 256\n#     trainer:\n#       trainer:\n#         precision: 32\n#         accumulate_grad_batches: 1\n\ndatamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  # module_type: \"FakeDataModule\"  # Option to use generated data\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      pcqm4m_g25:\n        df: null\n        df_path: 
graphium/data/neurips2023/large-dataset/PCQM4M_G25_N4.parquet\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/PCQM4M_G25_N4.parquet\n        # or set path as the URL directly\n        smiles_col: \"ordered_smiles\"\n        label_cols: graph_*  # graph_* means all columns starting with \"graph_\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: graph\n        splits_path: graphium/data/neurips2023/large-dataset/pcqm4m_g25_n4_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/pcqm4m_g25_n4_random_splits.pt`\n        label_normalization:\n          normalize_val_test: True\n          method: \"normal\"\n\n      pcqm4m_n4:\n        df: null\n        df_path: graphium/data/neurips2023/large-dataset/PCQM4M_G25_N4.parquet\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/PCQM4M_G25_N4.parquet\n        # or set path as the URL directly\n        smiles_col: \"ordered_smiles\"\n        label_cols: node_* # node_* means all columns starting with \"node_\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: node\n        splits_path: graphium/data/neurips2023/large-dataset/pcqm4m_g25_n4_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/pcqm4m_g25_n4_random_splits.pt`\n        seed: *seed\n        label_normalization:\n          normalize_val_test: True\n          method: \"normal\"\n\n    # Featurization\n    prepare_dict_or_graph: pyg:graph\n    featurization_n_jobs: 30\n    featurization_progress: True\n    featurization_backend: \"loky\"\n    processed_graph_data_path: \"../datacache/neurips2023-large/pcq/\"\n    featurization:\n    # OGB: ['atomic_num', 'degree', 'possible_formal_charge', 'possible_numH' (total-valence),\n    # 'possible_number_radical_e', 'possible_is_aromatic', 
'possible_is_in_ring',\n    # 'num_chiral_centers (not included yet)']\n      atom_property_list_onehot: [atomic-number, group, period, total-valence]\n      atom_property_list_float: [degree, formal-charge, radical-electron, aromatic, in-ring]\n      # OGB: ['possible_bond_type', 'possible_bond_stereo', 'possible_is_in_ring']\n      edge_property_list: [bond-type-onehot, stereo, in-ring]\n      add_self_loop: False\n      explicit_H: False # whether explicit hydrogens are included\n      use_bonds_weights: False\n      pos_encoding_as_features: # encoder dropout 0.18\n        pos_types:\n          lap_eigvec:\n            pos_level: node\n            pos_type: laplacian_eigvec\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # whether to include eigenvalues/eigenvectors of disconnected components\n          lap_eigval:\n            pos_level: node\n            pos_type: laplacian_eigval\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # whether to include eigenvalues/eigenvectors of disconnected components\n          rw_pos: # use same name as pe_encoder\n            pos_level: node\n            pos_type: rw_return_probs\n            ksteps: 16\n\n    num_workers: 30 # -1 to use all\n    persistent_workers: False # whether to keep dataloader workers alive between epochs.\n    # Setting persistent_workers to False can make the start of each epoch very slow.\n\n\narchitecture:\n  model_type: FullGraphMultiTaskNetwork\n  mup_base_path: null\n  pre_nn:   # Set as null to avoid a pre-nn network\n    out_dim: 64\n    hidden_dims: 256\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.18\n    normalization: &normalization layer_norm\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges:   # Set as null to avoid a pre-nn 
network\n    out_dim: 32\n    hidden_dims: 128\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: *dropout\n    normalization: *normalization\n    last_normalization: *normalization\n    residual_type: none\n\n  pe_encoders:\n    out_dim: 32\n    pool: \"sum\" #\"mean\" \"max\"\n    last_norm: None #\"batch_norm\", \"layer_norm\"\n    encoders: #la_pos |  rw_pos\n      la_pos:  # Set as null to avoid a pre-nn network\n        encoder_type: \"laplacian_pe\"\n        input_keys: [\"laplacian_eigvec\", \"laplacian_eigval\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        model_type: 'DeepSet' #'Transformer' or 'DeepSet'\n        num_layers: 2\n        num_layers_post: 1 # Num. layers to apply after pooling\n        dropout: 0.1\n        first_normalization: \"none\" #\"batch_norm\" or \"layer_norm\"\n      rw_pos:\n        encoder_type: \"mlp\"\n        input_keys: [\"rw_return_probs\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        num_layers: 2\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n        first_normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n\n\n\n  gnn:  # Set as null to avoid a post-nn network\n    out_dim: &gnn_dim 704\n    hidden_dims: *gnn_dim\n    depth: 4\n    activation: gelu\n    last_activation: none\n    dropout: 0.1\n    normalization: \"layer_norm\"\n    last_normalization: *normalization\n    residual_type: simple\n    virtual_node: 'none'\n    layer_type: 'pyg:gine' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n\n\n  graph_output_nn:\n    graph:\n      pooling: [sum]\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n    node:\n      pooling: [sum]\n      out_dim: 
*gnn_dim\n      hidden_dims: *gnn_dim\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n  task_heads:\n    pcqm4m_g25:\n      task_level: graph\n      out_dim: 25\n      hidden_dims: 32\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n    pcqm4m_n4:\n      task_level: node\n      out_dim: 4\n      hidden_dims: 32\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n#Task-specific\npredictor:\n  metrics_on_progress_bar:\n    pcqm4m_g25: []\n    pcqm4m_n4: []\n  metrics_on_training_set:\n    pcqm4m_g25: []\n    pcqm4m_n4: []\n  loss_fun:\n    pcqm4m_g25: mae_ipu\n    pcqm4m_n4: mae_ipu\n  random_seed: *seed\n  optim_kwargs:\n    lr: 1.e-4 # warmup can be scheduled using torch_scheduler_kwargs\n    # weight_decay: 1.e-7\n  torch_scheduler_kwargs:\n    module_type: WarmUpLinearLR\n    max_num_epochs: &max_epochs 20\n    warmup_epochs: 10\n    verbose: False\n  scheduler_kwargs:\n  #  monitor: &monitor qm9/mae/train\n  #  mode: min\n  #  frequency: 1\n  target_nan_mask: null # null: no mask, 0: 0 mask, ignore-flatten, ignore-mean-per-label\n  multitask_handling: flatten # flatten, mean-per-label\n\n# Task-specific\nmetrics:\n  pcqm4m_g25: &pcqm_metrics\n    - name: mae\n      metric: mae_ipu\n      target_nan_mask: null\n      multitask_handling: flatten\n      threshold_kwargs: null\n    - name: pearsonr\n      metric: pearsonr_ipu\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n    - name: r2\n      metric: r2_score_ipu\n      threshold_kwargs: null\n      target_nan_mask: null\n      
multitask_handling: mean-per-label\n  pcqm4m_n4: *pcqm_metrics\n\ntrainer:\n  seed: *seed\n  logger:\n    save_dir: logs/neurips2023-large/\n    name: *name\n    project: *name\n  #early_stopping:\n  #  monitor: *monitor\n  #  min_delta: 0\n  #  patience: 10\n  #  mode: &mode min\n  model_checkpoint:\n    dirpath: models_checkpoints/neurips2023-large-gine/pcq/\n    filename: *name\n    # monitor: *monitor\n    # mode: *mode\n    # save_top_k: 1\n    save_last: True\n  trainer:\n    max_epochs: *max_epochs\n    min_epochs: 1\n    check_val_every_n_epoch: 20\n"
  },
  {
    "path": "expts/neurips2023_configs/single_task_gine/config_large_gine_vcap.yaml",
    "content": "# Testing the gine model with the PCQMv2 dataset on IPU.\nconstants:\n  name: &name neurips2023_large_data_gine_vcap\n  seed: &seed 42\n  raise_train_error: true   # Whether the code should raise an error if it crashes during training\n\naccelerator:\n  type: ipu  # cpu or ipu or gpu\n  config_override:\n    datamodule:\n      args:\n        ipu_dataloader_training_opts:\n          mode: async\n          max_num_nodes_per_graph: 60 # train max nodes: 20, max_edges: 54\n          max_num_edges_per_graph: 100\n        ipu_dataloader_inference_opts:\n          mode: async\n          max_num_nodes_per_graph: 60 # valid max nodes: 51, max_edges: 118\n          max_num_edges_per_graph: 150\n        # Data handling-related\n        batch_size_training: 10\n        batch_size_inference: 2\n    predictor:\n      optim_kwargs:\n        loss_scaling: 1024\n    trainer:\n      trainer:\n        precision: 32\n        accumulate_grad_batches: 8\n\n  ipu_config:\n    - deviceIterations(1) # IPU would require large batches to be ready for the model.\n    - replicationFactor(16)\n    # - enableProfiling(\"graph_analyser\")       # The folder where the profile will be stored\n    # - enableExecutableCaching(\"pop_compiler_cache\")\n    - TensorLocations.numIOTiles(128)\n    - _Popart.set(\"defaultBufferingDepth\", 128)\n    - Precision.enableStochasticRounding(True)\n\n# accelerator:\n#   type: cpu  # cpu or ipu or gpu\n#   config_override:\n#     datamodule:\n#       batch_size_training: 64\n#       batch_size_inference: 256\n#     trainer:\n#       trainer:\n#         precision: 32\n#         accumulate_grad_batches: 1\n\ndatamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  # module_type: \"FakeDataModule\"  # Option to use generated data\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      l1000_vcap:\n        df: null\n        df_path: 
graphium/data/neurips2023/large-dataset/LINCS_L1000_VCAP_0-4.csv.gz\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/LINCS_L1000_VCAP_0-4.csv.gz\n        # or set path as the URL directly\n        smiles_col: \"SMILES\"\n        label_cols: geneID-*  # geneID-* means all columns starting with \"geneID-\"\n        # sample_size: 2000 # use sample_size for test\n        task_level: graph\n        splits_path: graphium/data/neurips2023/large-dataset/l1000_vcap_random_splits.pt  # Download with `wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Large-dataset/l1000_vcap_random_splits.pt`\n\n    # Featurization\n    prepare_dict_or_graph: pyg:graph\n    featurization_n_jobs: 30\n    featurization_progress: True\n    featurization_backend: \"loky\"\n    processed_graph_data_path: \"../datacache/neurips2023-large/vcap/\"\n    featurization:\n    # OGB: ['atomic_num', 'degree', 'possible_formal_charge', 'possible_numH' (total-valence),\n    # 'possible_number_radical_e', 'possible_is_aromatic', 'possible_is_in_ring',\n    # 'num_chiral_centers (not included yet)']\n      atom_property_list_onehot: [atomic-number, group, period, total-valence]\n      atom_property_list_float: [degree, formal-charge, radical-electron, aromatic, in-ring]\n      # OGB: ['possible_bond_type', 'possible_bond_stereo', 'possible_is_in_ring']\n      edge_property_list: [bond-type-onehot, stereo, in-ring]\n      add_self_loop: False\n      explicit_H: False # whether explicit hydrogens are included\n      use_bonds_weights: False\n      pos_encoding_as_features: # encoder dropout 0.18\n        pos_types:\n          lap_eigvec:\n            pos_level: node\n            pos_type: laplacian_eigvec\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # whether to include eigenvalues/eigenvectors of disconnected components\n          lap_eigval:\n            pos_level: node\n            pos_type: laplacian_eigval\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # whether to include eigenvalues/eigenvectors of disconnected components\n          rw_pos: # use same name as pe_encoder\n            pos_level: node\n            pos_type: rw_return_probs\n            ksteps: 16\n\n    num_workers: 30 # -1 to use all\n    persistent_workers: False # whether to keep dataloader workers alive between epochs.\n    # Setting persistent_workers to False can make the start of each epoch very slow.\n\n\narchitecture:\n  model_type: FullGraphMultiTaskNetwork\n  mup_base_path: null\n  pre_nn:   # Set as null to avoid a pre-nn network\n    out_dim: 64\n    hidden_dims: 256\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.18\n    normalization: &normalization layer_norm\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges:   # Set as null to avoid a pre-nn network\n    out_dim: 32\n    hidden_dims: 128\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: *dropout\n    normalization: *normalization\n    last_normalization: *normalization\n    residual_type: none\n\n  pe_encoders:\n    out_dim: 32\n    pool: \"sum\" #\"mean\" \"max\"\n    last_norm: None #\"batch_norm\", \"layer_norm\"\n    encoders: #la_pos |  rw_pos\n      la_pos:  # Set as null to avoid a pre-nn network\n        encoder_type: \"laplacian_pe\"\n        input_keys: [\"laplacian_eigvec\", \"laplacian_eigval\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        model_type: 'DeepSet' #'Transformer' or 'DeepSet'\n        num_layers: 2\n        num_layers_post: 1 # Num. 
layers to apply after pooling\n        dropout: 0.1\n        first_normalization: \"none\" #\"batch_norm\" or \"layer_norm\"\n      rw_pos:\n        encoder_type: \"mlp\"\n        input_keys: [\"rw_return_probs\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        num_layers: 2\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n        first_normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n\n\n\n  gnn:  # Set as null to avoid a post-nn network\n    out_dim: &gnn_dim 704\n    hidden_dims: *gnn_dim\n    depth: 4\n    activation: gelu\n    last_activation: none\n    dropout: 0.1\n    normalization: \"layer_norm\"\n    last_normalization: *normalization\n    residual_type: simple\n    virtual_node: 'none'\n    layer_type: 'pyg:gine' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n\n\n  graph_output_nn:\n    graph:\n      pooling: [sum]\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n  task_heads:\n    l1000_vcap:\n      task_level: graph\n      out_dim: 4890\n      hidden_dims: 128\n      depth: 2\n      activation: none\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n#Task-specific\npredictor:\n  metrics_on_progress_bar:\n    l1000_vcap: []\n  metrics_on_training_set:\n    l1000_vcap: []\n  loss_fun:\n    l1000_vcap:\n      name: hybrid_ce_ipu\n      n_brackets: 5\n  random_seed: *seed\n  optim_kwargs:\n    lr: 1.e-4 # warmup can be scheduled using torch_scheduler_kwargs\n    # weight_decay: 1.e-7\n  torch_scheduler_kwargs:\n    module_type: WarmUpLinearLR\n    max_num_epochs: &max_epochs 20\n    warmup_epochs: 10\n    verbose: False\n  scheduler_kwargs:\n  # 
 monitor: &monitor qm9/mae/train\n  #  mode: min\n  #  frequency: 1\n  target_nan_mask: null # null: no mask, 0: 0 mask, ignore-flatten, ignore-mean-per-label\n  multitask_handling: flatten # flatten, mean-per-label\n\n# Task-specific\nmetrics:\n  l1000_vcap: &classif_metrics\n    - name: auroc\n      metric: auroc_ipu\n      num_classes: 5\n      task: multiclass\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n    - name: avpr\n      metric: average_precision_ipu\n      num_classes: 5\n      task: multiclass\n      target_to_int: True\n      target_nan_mask: -1000\n      ignore_index: -1000\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n\ntrainer:\n  seed: *seed\n  logger:\n    save_dir: logs/neurips2023-large/\n    name: *name\n    project: *name\n  #early_stopping:\n  #  monitor: *monitor\n  #  min_delta: 0\n  #  patience: 10\n  #  mode: &mode min\n  model_checkpoint:\n    dirpath: models_checkpoints/neurips2023-large-gine/vcap/\n    filename: *name\n    # monitor: *monitor\n    # mode: *mode\n    # save_top_k: 1\n    save_last: True\n  trainer:\n    max_epochs: *max_epochs\n    min_epochs: 1\n    check_val_every_n_epoch: 20\n"
  },
  {
    "path": "expts/run_validation_test.py",
    "content": "# General imports\nimport argparse\nimport os\nfrom os.path import dirname, abspath\nimport yaml\nfrom copy import deepcopy\nfrom omegaconf import DictConfig, OmegaConf\nimport timeit\nfrom loguru import logger\nfrom datetime import datetime\nfrom pytorch_lightning.utilities.model_summary import ModelSummary\n\n# Current project imports\nimport graphium\nfrom graphium.config._loader import (\n    load_datamodule,\n    load_metrics,\n    load_architecture,\n    load_predictor,\n    load_trainer,\n    save_params_to_wandb,\n    load_accelerator,\n)\nfrom graphium.utils.safe_run import SafeRun\n\nimport hydra\n\n# WandB\nimport wandb\n\n# Set up the working directory\nMAIN_DIR = dirname(dirname(abspath(graphium.__file__)))\nos.chdir(MAIN_DIR)\n\n\n@hydra.main(version_base=None, config_path=\"hydra-configs\", config_name=\"main\")\ndef main(cfg: DictConfig) -> None:\n    cfg = OmegaConf.to_container(cfg, resolve=True)\n\n    run_name: str = \"main\"\n    add_date_time: bool = True\n\n    st = timeit.default_timer()\n\n    date_time_suffix = \"\"\n    if add_date_time:\n        date_time_suffix = datetime.now().strftime(\"%d.%m.%Y_%H.%M.%S\")\n\n    wandb.init(entity=cfg[\"constants\"][\"entity\"], project=cfg[\"constants\"][\"name\"], config=cfg)\n\n    # Initialize the accelerator\n    cfg, accelerator_type = load_accelerator(cfg)\n\n    # Load and initialize the dataset\n    datamodule = load_datamodule(cfg, accelerator_type)\n\n    # Initialize the network\n    model_class, model_kwargs = load_architecture(\n        cfg,\n        in_dims=datamodule.in_dims,\n    )\n\n    datamodule.prepare_data()\n\n    metrics = load_metrics(cfg)\n    logger.info(metrics)\n\n    predictor = load_predictor(\n        cfg, model_class, model_kwargs, metrics, accelerator_type, datamodule.task_norms\n    )\n    logger.info(predictor.model)\n    logger.info(ModelSummary(predictor, max_depth=4))\n\n    trainer = load_trainer(cfg, run_name, accelerator_type, 
date_time_suffix)\n    save_params_to_wandb(trainer.logger, cfg, predictor, datamodule)\n\n    # Determine the max num nodes and edges in training and validation\n    predictor.set_max_nodes_edges_per_graph(datamodule, stages=[\"val\"])\n\n
    # Run the model validation\n    with SafeRun(name=\"VALIDATING\", raise_error=cfg[\"constants\"][\"raise_train_error\"], verbose=True):\n        trainer.validate(\n            model=predictor,\n            ckpt_path=f'{cfg[\"trainer\"][\"model_checkpoint\"][\"dirpath\"]}{cfg[\"trainer\"][\"seed\"]}/{cfg[\"trainer\"][\"model_checkpoint\"][\"filename\"]}.ckpt',\n            datamodule=datamodule,\n        )\n\n
    # Run the model testing\n    with SafeRun(name=\"TESTING\", raise_error=cfg[\"constants\"][\"raise_train_error\"], verbose=True):\n        trainer.test(\n            model=predictor,\n            ckpt_path=f'{cfg[\"trainer\"][\"model_checkpoint\"][\"dirpath\"]}{cfg[\"trainer\"][\"seed\"]}/{cfg[\"trainer\"][\"model_checkpoint\"][\"filename\"]}.ckpt',\n            datamodule=datamodule,\n        )\n\n
    logger.info(\"--------------------------------------------\")\n    logger.info(f\"Total computation time: {timeit.default_timer() - st:.2f} seconds\")\n    logger.info(\"--------------------------------------------\")\n    wandb.finish()\n\n    return trainer.callback_metrics\n\n\nif __name__ == \"__main__\":\n    main()\n"
  },
  {
    "path": "graphium/__init__.py",
    "content": "from ._version import __version__\n\nfrom .config import load_config\n\nfrom . import utils\nfrom . import features\nfrom . import data\nfrom . import nn\nfrom . import trainer\n"
  },
  {
    "path": "graphium/_version.py",
    "content": "from importlib.metadata import version\nfrom importlib.metadata import PackageNotFoundError\n\ntry:\n    __version__ = version(\"graphium\")\nexcept PackageNotFoundError:\n    # package is not installed\n    __version__ = \"dev\"\n"
  },
  {
    "path": "graphium/cli/__init__.py",
    "content": "from .data import data_app\nfrom .parameters import param_app\nfrom .finetune_utils import finetune_app\nfrom .main import app\n"
  },
  {
    "path": "graphium/cli/__main__.py",
    "content": "from graphium.cli.main import app\n\nif __name__ == \"__main__\":\n    app()\n"
  },
  {
    "path": "graphium/cli/data.py",
    "content": "import timeit\nfrom typing import List\nfrom omegaconf import OmegaConf\nimport typer\nimport graphium\n\nfrom loguru import logger\nfrom hydra import initialize, compose\n\nfrom .main import app\nfrom graphium.config._loader import load_datamodule\n\n\ndata_app = typer.Typer(help=\"Graphium datasets.\")\napp.add_typer(data_app, name=\"data\")\n\n
@data_app.command(name=\"download\", help=\"Download a Graphium dataset.\")\ndef download(name: str, output: str, progress: bool = True):\n    args = {}\n    args[\"name\"] = name\n    args[\"output_path\"] = output\n    args[\"extract_zip\"] = True\n    args[\"progress\"] = progress\n\n    logger.info(f\"Downloading dataset '{name}' into {output}.\")\n\n    fpath = graphium.data.utils.download_graphium_dataset(**args)\n\n    logger.info(f\"Dataset available at {fpath}.\")\n\n\n@data_app.command(name=\"list\", help=\"List the available Graphium datasets.\")\ndef list_datasets():\n    logger.info(\"Graphium datasets:\")\n    logger.info(graphium.data.utils.list_graphium_datasets())\n\n\n@data_app.command(name=\"prepare\", help=\"Prepare a Graphium dataset.\")\ndef prepare_data(overrides: List[str]) -> None:\n    with initialize(version_base=None, config_path=\"../../expts/hydra-configs\"):\n        cfg = compose(\n            config_name=\"main\",\n            overrides=overrides,\n        )\n    cfg = OmegaConf.to_container(cfg, resolve=True)\n    st = timeit.default_timer()\n\n
    # Check that `processed_graph_data_path` is provided\n    path = cfg[\"datamodule\"][\"args\"].get(\"processed_graph_data_path\", None)\n    if path is None:\n        raise ValueError(\n            \"Please provide `datamodule.args.processed_graph_data_path` to specify the caching dir.\"\n        )\n    logger.info(f\"The caching dir is set to '{path}'\")\n\n    # Data-module\n    datamodule = load_datamodule(cfg, \"cpu\")\n    datamodule.prepare_data()\n\n    logger.info(f\"Data preparation took {timeit.default_timer() - st:.2f} 
seconds.\")\n"
  },
  {
    "path": "graphium/cli/finetune_utils.py",
    "content": "from typing import List, Literal, Optional\n\nimport fsspec\nimport numpy as np\nimport torch\nimport tqdm\nimport typer\nimport yaml\nfrom datamol.utils import fs\nfrom hydra import compose, initialize\nfrom hydra.core.hydra_config import HydraConfig\nfrom loguru import logger\nfrom omegaconf import OmegaConf\n\nfrom graphium.config._loader import load_accelerator, load_datamodule\nfrom graphium.finetuning.fingerprinting import Fingerprinter\nfrom graphium.utils import fs\nfrom graphium.trainer.predictor import PredictorModule\n\nfrom .main import app\nfrom .train_finetune_test import run_training_finetuning_testing\n\nfinetune_app = typer.Typer(help=\"Utility CLI for extra fine-tuning utilities.\")\napp.add_typer(finetune_app, name=\"finetune\")\n\n\n@finetune_app.command(name=\"admet\")\ndef benchmark_tdc_admet_cli(\n    overrides: List[str],\n    name: Optional[List[str]] = None,\n    inclusive_filter: bool = True,\n):\n    \"\"\"\n    Utility CLI to easily fine-tune a model on (a subset of) the benchmarks in the TDC ADMET group.\n\n    A major limitation is that we cannot use all features of the Hydra CLI, such as multiruns.\n    \"\"\"\n    try:\n        from tdc.utils import retrieve_benchmark_names\n    except ImportError:\n        raise ImportError(\"TDC needs to be installed to use this CLI. 
Run `pip install PyTDC`.\")\n\n    # Get the benchmarks to run this for\n    if name is None or len(name) == 0:\n        name = retrieve_benchmark_names(\"admet_group\")\n\n    if not inclusive_filter:\n        name = [n for n in retrieve_benchmark_names(\"admet_group\") if n not in name]\n\n    logger.info(f\"Running fine-tuning for the following benchmarks: {name}\")\n    results = {}\n\n
    # Use the Compose API to construct the config\n    for n in name:\n        # Build the per-benchmark overrides without mutating the shared list across iterations\n        run_overrides = overrides + [\"+finetuning=admet\", f\"constants.task={n}\"]\n\n        with initialize(version_base=None, config_path=\"../../expts/hydra-configs\"):\n            cfg = compose(\n                config_name=\"main\",\n                overrides=run_overrides,\n            )\n\n        # Run the training loop\n        ret = run_training_finetuning_testing(cfg)\n        ret = {k: v.item() for k, v in ret.items()}\n        results[n] = ret\n\n
    # Save to the results_dir by default or to the Hydra output_dir if needed.\n    # This distinction is needed, because Hydra's output_dir cannot be remote.\n    save_dir = cfg[\"constants\"].get(\"results_dir\", HydraConfig.get()[\"runtime\"][\"output_dir\"])\n    fs.mkdir(save_dir, exist_ok=True)\n    path = fs.join(save_dir, \"results.yaml\")\n    logger.info(f\"Saving results to {path}\")\n\n    with fsspec.open(path, \"w\") as f:\n        yaml.dump(results, f)\n\n\n@finetune_app.command(name=\"fingerprint\")\ndef get_fingerprints_from_model(\n    fingerprint_layer_spec: List[str],\n    pretrained_model: str,\n    save_destination: str,\n    output_type: str = typer.Option(\"torch\", help=\"Either numpy (.npy) or torch (.pt) output\"),\n    overrides: Optional[List[str]] = typer.Option(None, \"--override\", \"-o\", help=\"Hydra overrides\"),\n):\n    \"\"\"Endpoint for getting fingerprints from a pretrained model.\n\n    The pretrained model should be a `.ckpt` path or pre-specified, named model within Graphium.\n    The fingerprint layer specification should be of the format 
`module:layer`.\n    If specified as a list, the fingerprints from all the specified layers will be concatenated.\n    See the docs of the `graphium.finetuning.fingerprinting.Fingerprinter` class for more info.\n    \"\"\"\n\n    if overrides is None:\n        overrides = []\n\n    with initialize(version_base=None, config_path=\"../../expts/hydra-configs\"):\n        cfg = compose(config_name=\"main\", overrides=overrides)\n        cfg = OmegaConf.to_container(cfg, resolve=True)\n\n
    ## == Instantiate all required objects from their respective configs ==\n\n    # Accelerator\n    cfg, accelerator_type = load_accelerator(cfg)\n\n    # Data-module\n    datamodule = load_datamodule(cfg, accelerator_type)\n    datamodule.prepare_data()\n\n    # The predict_dataloader() returns either predict or test, so we need to run both.\n    datamodule.setup(\"test\")\n    datamodule.setup(\"predict\")\n\n    # Model\n    predictor = PredictorModule.load_pretrained_model(\n        pretrained_model,\n        device=accelerator_type,\n    )\n\n
    ## == Fingerprinter\n    with Fingerprinter(model=predictor, fingerprint_spec=fingerprint_layer_spec, out_type=output_type) as fp:\n        fps = fp.get_fingerprints_for_dataset(datamodule.predict_dataloader())\n\n    fs.mkdir(save_destination, exist_ok=True)\n\n    if output_type == \"numpy\":\n        path = fs.join(save_destination, \"fingerprints.npy\")\n        logger.info(f\"Saving fingerprints to {path}\")\n        with fsspec.open(path, \"wb\") as f:\n            np.save(f, fps)  # write through the fsspec handle so remote destinations work\n\n    else:\n        path = fs.join(save_destination, \"fingerprints.pt\")\n        logger.info(f\"Saving fingerprints to {path}\")\n        torch.save(fps, path)\n\n\n
def get_tdc_task_specific(task: str, output: Literal[\"name\", \"mode\", \"last_activation\"]):\n    if output == \"last_activation\":\n        config_arch_path = \"expts/hydra-configs/tasks/task_heads/admet.yaml\"\n        with open(config_arch_path, \"r\") as yaml_file:\n            config_tdc_arch = yaml.load(yaml_file, Loader=yaml.FullLoader)\n\n        return config_tdc_arch[\"architecture\"][\"task_heads\"][task][\"last_activation\"]\n\n
    else:\n        config_metrics_path = \"expts/hydra-configs/tasks/loss_metrics_datamodule/admet.yaml\"\n        with open(config_metrics_path, \"r\") as yaml_file:\n            config_tdc_task_metric = yaml.load(yaml_file, Loader=yaml.FullLoader)\n\n        metric = config_tdc_task_metric[\"predictor\"][\"metrics_on_progress_bar\"][task][0]\n\n        metric_mode_map = {\n            \"mae\": \"min\",\n            \"auroc\": \"max\",\n            \"auprc\": \"max\",\n            \"spearman\": \"max\",\n        }\n\n        if output == \"name\":\n            return metric\n        elif output == \"mode\":\n            return metric_mode_map[metric]\n\n\n
OmegaConf.register_new_resolver(\"get_metric_name\", lambda x: get_tdc_task_specific(x, output=\"name\"))\nOmegaConf.register_new_resolver(\"get_metric_mode\", lambda x: get_tdc_task_specific(x, output=\"mode\"))\nOmegaConf.register_new_resolver(\n    \"get_last_activation\", lambda x: get_tdc_task_specific(x, output=\"last_activation\")\n)\nOmegaConf.register_new_resolver(\"eval\", lambda x: eval(x, {\"np\": np}))\n"
  },
  {
    "path": "graphium/cli/fingerprints.py",
    "content": "from .main import app\n\n\n@app.command(name=\"fp\")\ndef get_fingerprints_from_model(): ...\n"
  },
  {
    "path": "graphium/cli/main.py",
    "content": "import typer\n\n\napp = typer.Typer(add_completion=False)\n\n\nif __name__ == \"__main__\":\n    app()\n"
  },
  {
    "path": "graphium/cli/parameters.py",
    "content": "import timeit\nfrom typing import List\nfrom omegaconf import DictConfig, OmegaConf\nimport typer\n\nimport numpy as np\n\nfrom loguru import logger\nfrom hydra import initialize, compose\n\nfrom .main import app\nfrom graphium.config._loader import (\n    load_accelerator,\n    load_architecture,\n    load_datamodule,\n)\n\nfrom graphium.trainer.predictor_options import ModelOptions\n\n\nparam_app = typer.Typer(help=\"Parameter counts.\")\napp.add_typer(param_app, name=\"params\")\n\n\n@param_app.command(name=\"infer\", help=\"Infer parameter count.\")\ndef infer_parameter_count(overrides: List[str] = []) -> int:\n    with initialize(version_base=None, config_path=\"../../expts/hydra-configs\"):\n        cfg = compose(\n            config_name=\"main\",\n            overrides=overrides,\n        )\n\n    cfg = OmegaConf.to_container(cfg, resolve=True)\n\n    ## Accelerator\n    cfg, accelerator_type = load_accelerator(cfg)\n\n    ## Datamodule\n    datamodule = load_datamodule(cfg, accelerator_type)\n\n    ## Architecture\n    model_class, model_kwargs = load_architecture(cfg, in_dims=datamodule.in_dims)\n    model_options = ModelOptions(\n        model_class=model_class,\n        model_kwargs=model_kwargs,\n    )\n    model = model_options.model_class(**model_options.model_kwargs)\n\n    # Count parameters\n    num_params = sum(p.numel() for p in model.parameters() if p.requires_grad)\n\n    logger.info(f\"Number of parameters: {num_params}.\")\n\n    return num_params\n"
  },
  {
    "path": "graphium/cli/train_finetune_test.py",
    "content": "from typing import List, Literal, Union\nimport os\nimport time\nimport timeit\nfrom datetime import datetime\n\nimport fsspec\nimport hydra\nimport numpy as np\nimport torch\nimport wandb\nimport yaml\nfrom datamol.utils import fs\nfrom hydra.core.hydra_config import HydraConfig\nfrom hydra.types import RunMode\nfrom lightning.pytorch.utilities.model_summary import ModelSummary\nfrom loguru import logger\nfrom omegaconf import DictConfig, OmegaConf\n\nfrom graphium.config._loader import (\n    load_accelerator,\n    load_architecture,\n    load_datamodule,\n    load_metrics,\n    load_predictor,\n    load_trainer,\n    save_params_to_wandb,\n    get_checkpoint_path,\n)\nfrom graphium.finetuning import (\n    FINETUNING_CONFIG_KEY,\n    GraphFinetuning,\n    modify_cfg_for_finetuning,\n)\nfrom graphium.hyper_param_search import (\n    HYPER_PARAM_SEARCH_CONFIG_KEY,\n    extract_main_metric_for_hparam_search,\n)\nfrom graphium.trainer.predictor import PredictorModule\nfrom graphium.utils.safe_run import SafeRun\n\nimport graphium.cli.finetune_utils\n\nTESTING_ONLY_CONFIG_KEY = \"testing_only\"\n\n\n@hydra.main(version_base=None, config_path=\"../../expts/hydra-configs\", config_name=\"main\")\ndef cli(cfg: DictConfig) -> None:\n    \"\"\"\n    The main CLI endpoint for training, fine-tuning and evaluating Graphium models.\n    \"\"\"\n    return run_training_finetuning_testing(cfg)\n\n\ndef get_replication_factor(cfg):\n    try:\n        ipu_config = cfg.get(\"accelerator\", {}).get(\"ipu_config\", [])\n        for item in ipu_config:\n            if \"replicationFactor\" in item:\n                # Extract the number between parentheses\n                start = item.find(\"(\") + 1\n                end = item.find(\")\")\n                if start != 0 and end != -1:\n                    return int(item[start:end])\n    except Exception as e:\n        print(f\"An error occurred: {e}\")\n\n    # Return default value if replicationFactor is not found 
or an error occurred\n    return 1\n\n\ndef get_gradient_accumulation_factor(cfg):\n    \"\"\"\n    WARNING: This MUST be called after accelerator overrides have been applied\n    (i.e. after `load_accelerator` has been called)\n    \"\"\"\n    try:\n        # Navigate through the nested dictionaries and get the gradient accumulation factor\n        grad_accumulation_factor = cfg.get(\"trainer\", {}).get(\"trainer\", {}).get(\"accumulate_grad_batches\", 1)\n\n        # Ensure that the extracted value is an integer\n        return int(grad_accumulation_factor)\n    except Exception as e:\n        print(f\"An error occurred: {e}\")\n\n    # Return default value if an error occurred\n    return 1\n\n\ndef get_training_batch_size(cfg):\n    \"\"\"\n    WARNING: This MUST be called after accelerator overrides have been applied\n    (i.e. after `load_accelerator` has been called)\n    \"\"\"\n    try:\n        # Navigate through the nested dictionaries and get the training batch size\n        batch_size_training = cfg.get(\"datamodule\", {}).get(\"args\", {}).get(\"batch_size_training\", 1)\n\n        # Ensure that the extracted value is an integer\n        return int(batch_size_training)\n    except Exception as e:\n        print(f\"An error occurred: {e}\")\n\n    # Return default value if an error occurred\n    return 1\n\n\ndef get_training_device_iterations(cfg):\n    try:\n        ipu_config = cfg.get(\"accelerator\", {}).get(\"ipu_config\", [])\n        for item in ipu_config:\n            if \"deviceIterations\" in item:\n                # Extract the number between parentheses\n                start = item.find(\"(\") + 1\n                end = item.find(\")\")\n                if start != 0 and end != -1:\n                    return int(item[start:end])\n    except Exception as e:\n        print(f\"An error occurred: {e}\")\n\n    # Return default value if deviceIterations is not found or an error occurred\n    return 1\n\n\ndef 
run_training_finetuning_testing(cfg: DictConfig) -> None:\n    \"\"\"\n    The main (pre-)training and fine-tuning loop.\n    \"\"\"\n\n    unresolved_cfg = OmegaConf.to_container(cfg, resolve=False)\n    cfg = OmegaConf.to_container(cfg, resolve=True)\n\n    # Get the current date and time\n    now = datetime.now()\n    # Format the datetime as a string\n    filename_datetime_suffix = now.strftime(\"%Y%m%d_%H%M%S\")\n    # Append the datetime string to the existing filename in the cfg dictionary\n    cfg[\"trainer\"][\"model_checkpoint\"][\"filename\"] += f\"_{filename_datetime_suffix}\"\n    cfg[\"trainer\"][\"model_checkpoint\"][\"dirpath\"] = (\n        cfg[\"trainer\"][\"model_checkpoint\"][\"dirpath\"][:-1] + f\"_{filename_datetime_suffix}\"\n    )\n\n    dst_dir = cfg[\"constants\"].get(\"results_dir\")\n    hydra_cfg = HydraConfig.get()\n    output_dir = hydra_cfg[\"runtime\"][\"output_dir\"]\n\n    if dst_dir is not None and fs.exists(dst_dir) and len(fs.get_mapper(dst_dir).fs.ls(dst_dir)) > 0:\n        logger.warning(\n            \"The destination directory is not empty. 
\"\n            \"If files already exist, this would lead to a crash at the end of training.\"\n        )\n        # We pause here briefly, to make sure the notification is seen as there's lots of logs afterwards\n        time.sleep(5)\n    # Modify the config for finetuning\n    if FINETUNING_CONFIG_KEY in cfg:\n        cfg = modify_cfg_for_finetuning(cfg)\n\n    st = timeit.default_timer()\n\n    # Initialize wandb only on first rank\n    if os.environ.get(\"RANK\", \"0\") == \"0\":\n        # Disable wandb if the user is not logged in.\n        wandb_cfg = cfg[\"constants\"].get(\"wandb\")\n        if wandb_cfg is not None and wandb.login() is False:\n            logger.info(\n                \"Not logged in to wandb - disabling wandb logging.\\n\"\n                + \"To enable wandb, run `wandb login` from the command line.\"\n            )\n            wandb.init(mode=\"disabled\")\n        elif wandb_cfg is not None:\n            wandb.init(config=cfg, **wandb_cfg)\n    else:\n        wandb_cfg = None\n\n    ## == Instantiate all required objects from their respective configs ==\n    # Accelerator\n    cfg, accelerator_type = load_accelerator(cfg)\n\n    ## Data-module\n    datamodule = load_datamodule(cfg, accelerator_type)\n    datamodule.prepare_data()\n\n    testing_only = cfg.get(TESTING_ONLY_CONFIG_KEY, False)\n\n    if testing_only:\n        # Load pre-trained model\n        predictor = PredictorModule.load_pretrained_model(\n            name_or_path=get_checkpoint_path(cfg), device=accelerator_type\n        )\n\n    else:\n        ## Architecture\n        model_class, model_kwargs = load_architecture(cfg, in_dims=datamodule.in_dims)\n\n        ## Metrics\n        metrics = load_metrics(cfg)\n\n        # Note: these MUST be called after `cfg, accelerator = load_accelerator(cfg)`\n        replicas = get_replication_factor(cfg)\n        gradient_acc = get_gradient_accumulation_factor(cfg)\n        micro_bs = get_training_batch_size(cfg)\n        
device_iterations = get_training_device_iterations(cfg)\n\n        global_bs = replicas * gradient_acc * micro_bs * device_iterations\n\n        ## Predictor\n        predictor = load_predictor(\n            config=cfg,\n            model_class=model_class,\n            model_kwargs=model_kwargs,\n            metrics=metrics,\n            task_levels=datamodule.get_task_levels(),\n            accelerator_type=accelerator_type,\n            featurization=datamodule.featurization,\n            task_norms=datamodule.task_norms,\n            replicas=replicas,\n            gradient_acc=gradient_acc,\n            global_bs=global_bs,\n        )\n\n    logger.info(predictor.model)\n    logger.info(ModelSummary(predictor, max_depth=4))\n\n    ## Trainer\n    date_time_suffix = datetime.now().strftime(\"%d.%m.%Y_%H.%M.%S\")\n    trainer = load_trainer(cfg, accelerator_type, date_time_suffix)\n\n    if not testing_only:\n        # Add the fine-tuning callback to trainer\n        if FINETUNING_CONFIG_KEY in cfg:\n            finetuning_training_kwargs = cfg[\"finetuning\"][\"training_kwargs\"]\n            trainer.callbacks.append(GraphFinetuning(**finetuning_training_kwargs))\n\n        if wandb_cfg is not None:\n            save_params_to_wandb(trainer.logger, cfg, predictor, datamodule, unresolved_config=unresolved_cfg)\n\n        # Determine the max num nodes and edges in training and validation\n        logger.info(\"Computing the maximum number of nodes and edges per graph\")\n        predictor.set_max_nodes_edges_per_graph(datamodule, stages=[\"train\", \"val\"])\n\n        # When resuming training from a checkpoint, we need to provide the path to the checkpoint in the config\n        resume_ckpt_path = cfg[\"trainer\"].get(\"resume_from_checkpoint\", None)\n\n        # Run the model training\n        with SafeRun(name=\"TRAINING\", raise_error=cfg[\"constants\"][\"raise_train_error\"], verbose=True):\n            trainer.fit(model=predictor, datamodule=datamodule, 
ckpt_path=resume_ckpt_path)\n\n        # Save validation metrics - Base utility in case someone doesn't use a logger.\n        results = trainer.callback_metrics\n        results = {k: v.item() if torch.is_tensor(v) else v for k, v in results.items()}\n        with fsspec.open(fs.join(output_dir, \"val_results.yaml\"), \"w\") as f:\n            yaml.dump(results, f)\n\n
    # Determine the max num nodes and edges in testing\n    predictor.set_max_nodes_edges_per_graph(datamodule, stages=[\"test\"])\n\n    # When checkpoints are logged during training, we can, e.g., use the best or last checkpoint for testing\n    test_ckpt_path = None\n    test_ckpt_name = cfg[\"trainer\"].get(\"test_from_checkpoint\", None)\n    test_ckpt_dir = cfg[\"trainer\"][\"model_checkpoint\"].get(\"dirpath\", None)\n    if test_ckpt_name is not None and test_ckpt_dir is not None:\n        test_ckpt_path = os.path.join(test_ckpt_dir, test_ckpt_name)\n\n
    # Run the model testing\n    with SafeRun(name=\"TESTING\", raise_error=cfg[\"constants\"][\"raise_train_error\"], verbose=True):\n        trainer.test(model=predictor, datamodule=datamodule, ckpt_path=test_ckpt_path)\n\n    logger.info(\"-\" * 50)\n    logger.info(f\"Total compute time: {timeit.default_timer() - st:.2f} seconds\")\n    logger.info(\"-\" * 50)\n\n
    save_checkpoint_to_wandb = cfg[\"trainer\"].get(\"save_checkpoint_to_wandb\")\n    if save_checkpoint_to_wandb is True:\n        # Save initial model state - and upload checkpoint to wandb\n        if cfg[\"trainer\"][\"model_checkpoint\"][\"save_last\"] is True:\n            checkpoint_path = f\"{cfg['trainer']['model_checkpoint']['dirpath']}/{cfg['trainer']['model_checkpoint']['filename']}-v1.ckpt\"\n            # Log the initial model checkpoint to wandb\n            wandb.save(checkpoint_path)\n        wandb.finish()\n\n
    # Save test metrics - Base utility in case someone doesn't use a logger.\n    results = trainer.callback_metrics\n    results = {k: v.item() if torch.is_tensor(v) else v for k, v in results.items()}\n    with fsspec.open(fs.join(output_dir, \"test_results.yaml\"), \"w\") as f:\n        yaml.dump(results, f)\n\n
    # When part of a hyper-parameter search, we are very specific about how we save our results\n    # NOTE (cwognum): We also check whether we are in multi-run mode, as the sweeper is otherwise not active.\n    if HYPER_PARAM_SEARCH_CONFIG_KEY in cfg and hydra_cfg.mode == RunMode.MULTIRUN:\n        results = extract_main_metric_for_hparam_search(results, cfg[HYPER_PARAM_SEARCH_CONFIG_KEY])\n\n
    # Copy the current working directory to remote\n    # By default, processes should just write results to Hydra's output directory.\n    # However, this currently does not support remote storage, which is why we copy the results here if needed.\n    # For more info, see also: https://github.com/facebookresearch/hydra/issues/993\n\n    if dst_dir is not None:\n        src_dir = hydra_cfg[\"runtime\"][\"output_dir\"]\n        dst_dir = fs.join(dst_dir, fs.get_basename(src_dir))\n        fs.mkdir(dst_dir, exist_ok=True)\n        fs.copy_dir(src_dir, dst_dir)\n\n    return results\n\n\nif __name__ == \"__main__\":\n    cli()\n"
  },
  {
    "path": "graphium/config/README.md",
    "content": "<div align=\"center\">\n    <img src=\"../../docs/images/logo-title.png\" height=\"80px\">\n    <h3>The Graph Of Life Library.</h3>\n</div>\n\n\n## What is in this folder?\n\n- `_load.py` and `config_convert.py`: load configurations from a `yaml` config file\n- `_loader.py`: retrieves the relevant arguments from the config and loads the different modules with the correct arguments"
  },
  {
    "path": "graphium/config/__init__.py",
    "content": "from ._load import load_config\n\nfrom ._loader import load_architecture\nfrom ._loader import load_datamodule\nfrom ._loader import load_metrics\nfrom ._loader import load_predictor\nfrom ._loader import load_trainer\nfrom ._loader import save_params_to_wandb\nfrom ._loader import load_accelerator\n"
  },
  {
    "path": "graphium/config/_load.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals.\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nimport importlib.resources\n\nimport omegaconf\n\n\ndef load_config(name: str):\n    \"\"\"Load a default config file by its name.\n\n    Args:\n        name: name of the config to load.\n    \"\"\"\n\n    with importlib.resources.open_text(\"graphium.config\", f\"{name}.yaml\") as f:\n        config = omegaconf.OmegaConf.load(f)\n\n    return config\n"
  },
  {
    "path": "graphium/config/_loader.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\n# Misc\nimport os\nfrom copy import deepcopy\nfrom typing import Any, Callable, Dict, Mapping, Optional, Tuple, Type, Union\n\nimport joblib\nimport mup\nimport omegaconf\n\n# Torch\nimport torch\nimport yaml\n\n# Lightning\nfrom lightning import Trainer\nfrom lightning.pytorch.callbacks import EarlyStopping, ModelCheckpoint, LearningRateMonitor\nfrom lightning.pytorch.loggers import Logger, WandbLogger\nfrom loguru import logger\n\nfrom graphium.data.datamodule import BaseDataModule, MultitaskFromSmilesDataModule\nfrom graphium.finetuning.finetuning_architecture import FullGraphFinetuningNetwork\nfrom graphium.ipu.ipu_dataloader import IPUDataloaderOptions\nfrom graphium.ipu.ipu_utils import import_poptorch, load_ipu_options\nfrom graphium.nn.architectures import FullGraphMultiTaskNetwork\nfrom graphium.nn.utils import MupMixin\nfrom graphium.trainer.metrics import MetricWrapper\nfrom graphium.trainer.predictor import PredictorModule\nfrom graphium.utils.command_line_utils import get_anchors_and_aliases, update_config\n\n# Graphium\nfrom graphium.utils.mup import set_base_shapes\nfrom graphium.utils.spaces import DATAMODULE_DICT, GRAPHIUM_PRETRAINED_MODELS_DICT\nfrom graphium.utils import fs\n\n\ndef get_accelerator(\n    config_acc: Union[omegaconf.DictConfig, Dict[str, Any]],\n) -> str:\n    \"\"\"\n    Get 
the accelerator from the config file, and ensure that it is\n    consistent.\n    \"\"\"\n\n    # Get the accelerator type\n    accelerator_type = config_acc[\"type\"]\n\n    # Get the GPU info\n    if (accelerator_type == \"gpu\") and (not torch.cuda.is_available()):\n        raise ValueError(\"GPUs selected, but GPUs are not available on this device\")\n\n    # Get the IPU info\n    if accelerator_type == \"ipu\":\n        poptorch = import_poptorch()\n        if poptorch is None:\n            raise ValueError(\"IPUs selected, but PopTorch is not available\")\n        if not poptorch.ipuHardwareIsAvailable():\n            raise ValueError(\n                \"IPUs selected, but no IPU is available/visible on this device. \"\n                \"If you do have IPUs, please check that the IPUOF_VIPU_API_PARTITION_ID and \"\n                \"IPUOF_VIPU_API_HOST environment variables are set.\"\n            )\n\n    # Fall back to CPU if no accelerator type was specified\n    if accelerator_type is None:\n        accelerator_type = \"cpu\"\n    return accelerator_type\n\n\ndef _get_ipu_opts(config: Union[omegaconf.DictConfig, Dict[str, Any]]) -> Tuple[Optional[str], Optional[str]]:\n    r\"\"\"\n    Get the paths of the IPU-specific config files from the main YAML config\n    \"\"\"\n\n    accelerator_options = config[\"accelerator\"]\n    accelerator_type = accelerator_options[\"type\"]\n\n    if accelerator_type != \"ipu\":\n        return None, None\n    ipu_opts = accelerator_options[\"ipu_config\"]\n    ipu_inference_opts = accelerator_options.get(\"ipu_inference_config\", None)\n\n    return ipu_opts, ipu_inference_opts\n\n\ndef load_datamodule(\n    config: Union[omegaconf.DictConfig, Dict[str, Any]], accelerator_type: str\n) -> BaseDataModule:\n    \"\"\"\n    Load the datamodule from the specified configurations at the key\n    `datamodule: args`.\n    If the accelerator is IPU, load the IPU options as well.\n\n    Parameters:\n        config: The config file, with key `datamodule: args`\n        
accelerator_type: The accelerator type, e.g. \"cpu\", \"gpu\", \"ipu\"\n    Returns:\n        datamodule: The datamodule used to process and load the data\n    \"\"\"\n\n    cfg_data = config[\"datamodule\"][\"args\"]\n\n    # Instantiate the datamodule\n    module_class = DATAMODULE_DICT[config[\"datamodule\"][\"module_type\"]]\n\n    if accelerator_type != \"ipu\":\n        datamodule = module_class(\n            **config[\"datamodule\"][\"args\"],\n        )\n        return datamodule\n\n    # IPU specific adjustments\n    else:\n        ipu_opts, ipu_inference_opts = _get_ipu_opts(config)\n\n        # Default empty values for the IPU configurations\n        ipu_training_opts = None\n\n        ipu_dataloader_training_opts = cfg_data.pop(\"ipu_dataloader_training_opts\", {})\n        ipu_dataloader_inference_opts = cfg_data.pop(\"ipu_dataloader_inference_opts\", {})\n        ipu_training_opts, ipu_inference_opts = load_ipu_options(\n            ipu_opts=ipu_opts,\n            seed=config[\"constants\"][\"seed\"],\n            model_name=config[\"constants\"][\"name\"],\n            gradient_accumulation=config[\"trainer\"][\"trainer\"].get(\"accumulate_grad_batches\", None),\n            ipu_inference_opts=ipu_inference_opts,\n            precision=config[\"trainer\"][\"trainer\"].get(\"precision\"),\n        )\n\n        # Define the Dataloader options for the IPU on the training sets\n        bz_train = cfg_data[\"batch_size_training\"]\n        ipu_dataloader_training_opts = IPUDataloaderOptions(\n            batch_size=bz_train, **ipu_dataloader_training_opts\n        )\n        ipu_dataloader_training_opts.set_kwargs()\n\n        # Define the Dataloader options for the IPU on the inference sets\n        bz_test = cfg_data[\"batch_size_inference\"]\n        ipu_dataloader_inference_opts = IPUDataloaderOptions(\n            batch_size=bz_test, **ipu_dataloader_inference_opts\n        )\n        ipu_dataloader_inference_opts.set_kwargs()\n\n        datamodule = 
module_class(\n            ipu_training_opts=ipu_training_opts,\n            ipu_inference_opts=ipu_inference_opts,\n            ipu_dataloader_training_opts=ipu_dataloader_training_opts,\n            ipu_dataloader_inference_opts=ipu_dataloader_inference_opts,\n            **config[\"datamodule\"][\"args\"],\n        )\n\n        return datamodule\n\n\ndef load_metrics(config: Union[omegaconf.DictConfig, Dict[str, Any]]) -> Dict[str, MetricWrapper]:\n    \"\"\"\n    Load the metrics to be tracked.\n    Parameters:\n        config: The config file, with key `metrics`\n    Returns:\n        metrics: A dictionary of all the metrics\n    \"\"\"\n\n    task_metrics = {}\n    cfg_metrics = config.get(\"metrics\", None)\n    if cfg_metrics is None:\n        return task_metrics\n    cfg_metrics = {key: deepcopy(value) for key, value in config[\"metrics\"].items()}\n    # Wrap every metric in the class `MetricWrapper` to standardize them\n    for task in cfg_metrics:\n        task_metrics[task] = {}\n        if cfg_metrics[task] is None:\n            cfg_metrics[task] = []\n        for this_metric in cfg_metrics[task]:\n            name = this_metric.pop(\"name\")\n            task_metrics[task][name] = MetricWrapper(**this_metric)\n    return task_metrics\n\n\ndef load_architecture(\n    config: Union[omegaconf.DictConfig, Dict[str, Any]],\n    in_dims: Dict[str, int],\n) -> Tuple[Type[torch.nn.Module], Dict[str, Any]]:\n    \"\"\"\n    Load the architecture used for training.\n    Parameters:\n        config: The config file, with key `architecture`\n        in_dims: Dictionary of the input dimensions for the various node and edge features\n    Returns:\n        model_class: The class of the network used for training\n        model_kwargs: The keyword arguments used to instantiate the network\n    \"\"\"\n\n    if isinstance(config, dict) and \"finetuning\" not in config:\n        config = omegaconf.OmegaConf.create(config)\n    cfg_arch = config[\"architecture\"]\n\n    # Select the architecture\n    model_type = cfg_arch[\"model_type\"].lower()\n    if 
model_type == \"fullgraphmultitasknetwork\":\n        model_class = FullGraphMultiTaskNetwork\n    elif model_type == \"fullgraphfinetuningnetwork\":\n        model_class = FullGraphFinetuningNetwork\n    else:\n        raise ValueError(f\"Unsupported model_type=`{model_type}`\")\n\n    # Prepare the various kwargs\n    pe_encoders_kwargs = (\n        dict(cfg_arch[\"pe_encoders\"]) if cfg_arch.get(\"pe_encoders\", None) is not None else None\n    )\n\n    pre_nn_kwargs = dict(cfg_arch[\"pre_nn\"]) if cfg_arch[\"pre_nn\"] is not None else None\n    pre_nn_edges_kwargs = dict(cfg_arch[\"pre_nn_edges\"]) if cfg_arch[\"pre_nn_edges\"] is not None else None\n    gnn_kwargs = dict(cfg_arch[\"gnn\"])\n    graph_output_nn_kwargs = (\n        dict(cfg_arch[\"graph_output_nn\"]) if cfg_arch[\"graph_output_nn\"] is not None else None\n    )\n    task_heads_kwargs = (\n        cfg_arch[\"task_heads\"] if cfg_arch[\"task_heads\"] is not None else None\n    )  # This is of type ListConfig containing TaskHeadParams\n\n    # Initialize the input dimension for the positional encoders\n    if pe_encoders_kwargs is not None:\n        pe_encoders_kwargs = dict(pe_encoders_kwargs)\n        for encoder in pe_encoders_kwargs[\"encoders\"]:\n            pe_encoders_kwargs[\"encoders\"][encoder] = dict(pe_encoders_kwargs[\"encoders\"][encoder])\n        pe_encoders_kwargs.setdefault(\n            \"in_dims\", in_dims\n        )  # set the input dimensions of all pe with info from the data-module\n    pe_out_dim = 0 if pe_encoders_kwargs is None else pe_encoders_kwargs.get(\"out_dim\", None)\n    edge_pe_out_dim = 0 if pe_encoders_kwargs is None else pe_encoders_kwargs.get(\"edge_out_dim\", None)\n\n    # Set the default `node` input dimension for the pre-processing neural net and graph neural net\n    in_dim = in_dims[\"feat\"]\n    if pe_out_dim is not None:\n        in_dim += pe_out_dim\n    if pre_nn_kwargs is not None:\n        pre_nn_kwargs = dict(pre_nn_kwargs)\n        
pre_nn_kwargs.setdefault(\"in_dim\", in_dim)\n    else:\n        gnn_kwargs.setdefault(\"in_dim\", in_dim)\n\n    # Set the default `edge` input dimension for the pre-processing neural net and graph neural net\n    edge_in_dim = in_dims[\"edge_feat\"]\n    if edge_pe_out_dim is not None:\n        edge_in_dim += edge_pe_out_dim\n    if pre_nn_edges_kwargs is not None:\n        pre_nn_edges_kwargs = dict(pre_nn_edges_kwargs)\n        pre_nn_edges_kwargs.setdefault(\"in_dim\", edge_in_dim)\n    else:\n        gnn_kwargs.setdefault(\"in_dim\", edge_in_dim)\n\n    # Set the parameters for the full network\n    if \"finetuning\" not in config:\n        task_heads_kwargs = omegaconf.OmegaConf.to_object(task_heads_kwargs)\n\n    # Set all the input arguments for the model\n    model_kwargs = dict(\n        gnn_kwargs=gnn_kwargs,\n        pre_nn_kwargs=pre_nn_kwargs,\n        pre_nn_edges_kwargs=pre_nn_edges_kwargs,\n        pe_encoders_kwargs=pe_encoders_kwargs,\n        graph_output_nn_kwargs=graph_output_nn_kwargs,\n        task_heads_kwargs=task_heads_kwargs,\n    )\n    # Get accelerator_kwargs if they exist\n    accelerator_kwargs = config[\"accelerator\"].get(\"accelerator_kwargs\", None)\n    if accelerator_kwargs is not None:\n        model_kwargs[\"accelerator_kwargs\"] = accelerator_kwargs\n\n    if model_class is FullGraphFinetuningNetwork:\n        finetuning_head_kwargs = config[\"finetuning\"].pop(\"finetuning_head\", None)\n        pretrained_overwriting_kwargs = config[\"finetuning\"].pop(\"overwriting_kwargs\")\n        pretrained_model = pretrained_overwriting_kwargs.pop(\"pretrained_model\")\n\n        model_kwargs = {\n            \"pretrained_model_kwargs\": deepcopy(model_kwargs),\n            \"pretrained_overwriting_kwargs\": pretrained_overwriting_kwargs,\n            \"pretrained_model\": pretrained_model,\n            \"finetuning_head_kwargs\": finetuning_head_kwargs,\n        }\n\n    return model_class, model_kwargs\n\n\ndef load_predictor(\n  
  config: Union[omegaconf.DictConfig, Dict[str, Any]],\n    model_class: Type[torch.nn.Module],\n    model_kwargs: Dict[str, Any],\n    metrics: Dict[str, MetricWrapper],\n    task_levels: Dict[str, str],\n    accelerator_type: str,\n    featurization: Dict[str, str] = None,\n    task_norms: Optional[Dict[Callable, Any]] = None,\n    replicas: int = 1,\n    gradient_acc: int = 1,\n    global_bs: int = 1,\n) -> PredictorModule:\n    \"\"\"\n    Define the predictor module, which handles the training logic from `lightning.LightningModule`\n    Parameters:\n        model_class: The torch Module containing the main forward function\n        accelerator_type: The accelerator type, e.g. \"cpu\", \"gpu\", \"ipu\"\n    Returns:\n        predictor: The predictor module\n    \"\"\"\n\n    if accelerator_type == \"ipu\":\n        from graphium.ipu.ipu_wrapper import PredictorModuleIPU\n\n        predictor_class = PredictorModuleIPU\n    else:\n        predictor_class = PredictorModule\n\n    cfg_pred = dict(deepcopy(config[\"predictor\"]))\n    predictor = predictor_class(\n        model_class=model_class,\n        model_kwargs=model_kwargs,\n        metrics=metrics,\n        task_levels=task_levels,\n        featurization=featurization,\n        task_norms=task_norms,\n        replicas=replicas,\n        gradient_acc=gradient_acc,\n        global_bs=global_bs,\n        **cfg_pred,\n    )\n\n    mup_scale_factor = config[\"architecture\"].pop(\"mup_scale_factor\", None)\n\n    if mup_scale_factor is not None and mup_scale_factor != 1:\n        unscaled_model = predictor.model\n        scaled_model_kwargs = unscaled_model.scale_kwargs(scale_factor=mup_scale_factor)\n        del predictor\n        predictor = predictor_class(\n            model_class=model_class,\n            model_kwargs=scaled_model_kwargs,\n            metrics=metrics,\n            task_levels=task_levels,\n            featurization=featurization,\n            task_norms=task_norms,\n            
replicas=replicas,\n            gradient_acc=gradient_acc,\n            global_bs=global_bs,\n            **cfg_pred,\n        )\n\n    # mup base shapes\n    mup_base_path = config[\"architecture\"].pop(\"mup_base_path\", None)\n    predictor = load_mup(mup_base_path, predictor)\n\n    return predictor\n\n\ndef load_mup(mup_base_path: str, predictor: PredictorModule) -> PredictorModule:\n    \"\"\"\n    Load the base shapes for the mup, based either on a `.ckpt` or `.yaml` file.\n    If `.yaml`, it should be generated by `mup.save_base_shapes`\n    \"\"\"\n    model = predictor.model\n\n    if not isinstance(model, MupMixin):\n        raise TypeError(\"load_mup can only be applied to models that use the MupMixin\")\n\n    if mup_base_path is None:\n        base = model.__class__(**model.make_mup_base_kwargs(divide_factor=2))\n    elif mup_base_path.endswith(\".ckpt\"):\n        base = predictor.__class__.load_from_checkpoint(mup_base_path, map_location=\"cpu\")\n    elif mup_base_path.endswith(\".yaml\"):\n        base = mup_base_path\n    else:\n        raise ValueError(f\"Unrecognized file type {mup_base_path}\")\n    predictor.model = set_base_shapes(predictor.model, base, rescale_params=False)\n    return predictor\n\n\ndef load_trainer(\n    config: Union[omegaconf.DictConfig, Dict[str, Any]],\n    accelerator_type: str,\n    date_time_suffix: str = \"\",\n) -> Trainer:\n    \"\"\"\n    Defining the pytorch-lightning Trainer module.\n    Parameters:\n        config: The config file, with key `trainer`\n        accelerator_type: The accelerator type, e.g. \"cpu\", \"gpu\", \"ipu\"\n        date_time_suffix: The date and time of the current run. 
To be used for logging.\n    Returns:\n        trainer: the trainer module\n    \"\"\"\n    cfg_trainer = deepcopy(config[\"trainer\"])\n\n    # Define the IPU plugin if required\n    strategy = cfg_trainer[\"trainer\"].pop(\"strategy\", \"auto\")\n    if accelerator_type == \"ipu\":\n        ipu_opts, ipu_inference_opts = _get_ipu_opts(config)\n\n        training_opts, inference_opts = load_ipu_options(\n            ipu_opts=ipu_opts,\n            ipu_inference_opts=ipu_inference_opts,\n            seed=config[\"constants\"][\"seed\"],\n            model_name=config[\"constants\"][\"name\"],\n            gradient_accumulation=config[\"trainer\"][\"trainer\"].get(\"accumulate_grad_batches\", None),\n            precision=config[\"trainer\"][\"trainer\"].get(\"precision\"),\n        )\n\n        if strategy != \"auto\":\n            raise ValueError(\"IPUs selected, but strategy is not set to 'auto'\")\n\n        from lightning_graphcore import IPUStrategy\n\n        strategy = IPUStrategy(training_opts=training_opts, inference_opts=inference_opts)\n\n    # Get devices\n    devices = cfg_trainer[\"trainer\"].pop(\"devices\", 1)\n    if accelerator_type == \"ipu\":\n        devices = 1  # number of IPUs used is defined in the ipu options files\n\n    # Remove the gradient accumulation from IPUs, since it's handled by the device\n    if accelerator_type == \"ipu\":\n        cfg_trainer[\"trainer\"].pop(\"accumulate_grad_batches\", None)\n\n    # Define the early stopping parameters\n    trainer_kwargs = {}\n    callbacks = []\n    if \"early_stopping\" in cfg_trainer.keys():\n        callbacks.append(EarlyStopping(**cfg_trainer[\"early_stopping\"]))\n\n    # Define the model checkpoint parameters\n    if \"model_checkpoint\" in cfg_trainer.keys():\n        callbacks.append(ModelCheckpoint(**cfg_trainer[\"model_checkpoint\"]))\n\n    if \"learning_rate_monitor\" in cfg_trainer.keys():\n        
callbacks.append(LearningRateMonitor(**cfg_trainer[\"learning_rate_monitor\"]))\n    else:\n        callbacks.append(LearningRateMonitor())\n\n    # Define the logger parameters\n    wandb_cfg = config[\"constants\"].get(\"wandb\")\n    if wandb_cfg is not None:\n        name = wandb_cfg.pop(\"name\", \"main\")\n        if len(date_time_suffix) > 0:\n            name += f\"_{date_time_suffix}\"\n        trainer_kwargs[\"logger\"] = WandbLogger(name=name, log_model=True, **wandb_cfg)\n\n    trainer_kwargs[\"callbacks\"] = callbacks\n    trainer = Trainer(\n        detect_anomaly=True,\n        strategy=strategy,\n        accelerator=accelerator_type,\n        devices=devices,\n        **cfg_trainer[\"trainer\"],\n        **trainer_kwargs,\n    )\n    return trainer\n\n\ndef save_params_to_wandb(\n    logger: Logger,\n    config: Union[omegaconf.DictConfig, Dict[str, Any]],\n    predictor: PredictorModule,\n    datamodule: MultitaskFromSmilesDataModule,\n    unresolved_config: Optional[Union[omegaconf.DictConfig, Dict[str, Any]]] = None,\n):\n    \"\"\"\n    Save a few artifacts to Weights & Biases (WandB)\n    Parameters:\n        logger: The object used to log the training. 
Usually WandbLogger\n        config: The config file, with key `trainer`\n        predictor: The predictor used to handle the train/val/test steps logic\n        datamodule: The datamodule used to load the data into training\n        unresolved_config: The unresolved config file\n    \"\"\"\n\n    # Get the wandb runner and directory\n    wandb_run = logger.experiment\n\n    if wandb_run is None:\n        wandb_dir = \"\"\n    else:\n        wandb_dir = wandb_run.dir\n\n    # Save the mup base model to WandB as a yaml file\n    mup.save_base_shapes(predictor.model, os.path.join(wandb_dir, \"mup_base_params.yaml\"))\n\n    # Save the full configs as a YAML file\n    with open(os.path.join(wandb_dir, \"full_configs.yaml\"), \"w\") as file:\n        yaml.dump(config, file)\n\n    if unresolved_config is not None:\n        with open(os.path.join(wandb_dir, \"unresolved_config.yaml\"), \"w\") as file:\n            yaml.dump(unresolved_config, file)\n\n    # Save the featurizer into wandb\n    featurizer_path = os.path.join(wandb_dir, \"featurizer.pickle\")\n    joblib.dump(datamodule.smiles_transformer, featurizer_path)\n\n    # Save the featurizer and configs into wandb\n    if wandb_run is not None:\n        wandb_run.save(os.path.join(wandb_dir, \"*.yaml\"), wandb_dir)\n        wandb_run.save(os.path.join(wandb_dir, \"*.pickle\"), wandb_dir)\n\n\ndef load_accelerator(config: Union[omegaconf.DictConfig, Dict[str, Any]]) -> Tuple[Dict[str, Any], str]:\n    config = deepcopy(config)\n    config_acc = config.get(\"accelerator\", {})\n\n    # Merge the accelerator config with the main config\n    config_override = config_acc.get(\"config_override\", {})\n    merge_dicts(config, config_override)\n    accelerator_type = get_accelerator(config_acc)\n\n    if accelerator_type == \"gpu\":\n        precision = config_acc.get(\"float32_matmul_precision\", None)\n        if precision is not None:\n            torch.set_float32_matmul_precision(precision)\n\n    return config, 
accelerator_type\n\n\ndef load_config_override(\n    config: Union[omegaconf.DictConfig, Dict[str, Any]], main_dir: Optional[Union[str, os.PathLike]] = None\n) -> Dict[str, Any]:\n    config = deepcopy(config)\n    config_override_path = config[\"constants\"].get(\"config_override\", None)\n    if config_override_path is not None:\n        if main_dir is not None:\n            config_override_path = os.path.join(main_dir, config_override_path)\n        with open(config_override_path, \"r\") as f:\n            cfg_override = yaml.safe_load(f)\n        config = merge_dicts(cfg_override, config, on_exist=\"overwrite\")\n    return config\n\n\ndef load_yaml_config(\n    config_path: Union[str, os.PathLike],\n    main_dir: Optional[Union[str, os.PathLike]] = None,\n    unknown_args=None,\n) -> Dict[str, Any]:\n    \"\"\"\n    Load a YAML config file and return it as a dictionary.\n    Also returns the anchors `&` and aliases `*` of the YAML file.\n    Then, update the config with the unknown arguments.\n    Finally, update the config with the config override file specified in `constants.config_override`.\n\n    Parameters:\n        config_path: The path to the YAML config file\n        main_dir: The main directory of the project. 
If specified, the config override file will be loaded from this directory\n        unknown_args: The unknown arguments to update the config with, taken from `argparse.parse_known_args`\n\n    Returns:\n        config: The config dictionary\n\n    \"\"\"\n    if main_dir is not None:\n        config_path = os.path.join(main_dir, config_path)\n\n    with open(config_path, \"r\") as f:\n        config = yaml.safe_load(f)\n        refs = get_anchors_and_aliases(config_path)\n        if unknown_args is not None:\n            config = update_config(config, unknown_args, refs)\n    config = load_config_override(config, main_dir)  # This goes here to avoid overriding the hparam search\n\n    return config\n\n\ndef merge_dicts(\n    dict_a: Dict[str, Any], dict_b: Dict[str, Any], previous_dict_path: str = \"\", on_exist: str = \"raise\"\n) -> Dict[str, Any]:\n    \"\"\"\n    Recursively merges dict_b into dict_a. If a key is missing from dict_a,\n    it is added from dict_b. If a key exists in both, the behavior depends on\n    `on_exist`: raise an error, overwrite the value, or ignore it.\n    `dict_a` is modified in-place and also returned.\n\n    Parameters:\n        dict_a: The dictionary to merge into. Modified in-place.\n        dict_b: The dictionary to merge from.\n        previous_dict_path: The key path of the parent dictionary,\n        used to track the recursive calls.\n        on_exist: What to do if a key already exists in dict_a. 
Options are \"raise\", \"overwrite\", \"ignore\".\n\n    Raises:\n        ValueError: If a key path already exists in dict_a and `on_exist=\"raise\"`.\n\n    \"\"\"\n    assert on_exist in [\n        \"raise\",\n        \"overwrite\",\n        \"ignore\",\n    ], f\"on_exist must be one of ['raise', 'overwrite', 'ignore'], got {on_exist}\"\n\n    for key, value_b in dict_b.items():\n        if key not in dict_a:\n            dict_a[key] = value_b\n        else:\n            value_a = dict_a[key]\n            # Build the key path for this entry without mutating `previous_dict_path`,\n            # so the paths of sibling keys don't accumulate into each other\n            if previous_dict_path == \"\":\n                current_dict_path = key\n            else:\n                current_dict_path = f\"{previous_dict_path}/{key}\"\n            if isinstance(value_a, dict) and isinstance(value_b, dict):\n                merge_dicts(value_a, value_b, previous_dict_path=current_dict_path, on_exist=on_exist)\n            else:\n                if value_a != value_b:\n                    if on_exist == \"raise\":\n                        raise ValueError(f\"Dict path already exists: {current_dict_path}\")\n                    elif on_exist == \"overwrite\":\n                        dict_a[key] = value_b\n                    elif on_exist == \"ignore\":\n                        pass\n    return dict_a\n\n\ndef get_checkpoint_path(config: Union[omegaconf.DictConfig, Dict[str, Any]]) -> str:\n    \"\"\"\n    Get the checkpoint path from a config file.\n    If the path is a valid name or a valid path, return it.\n    Otherwise, assume it refers to a file in the checkpointing dir.\n    \"\"\"\n\n    cfg_trainer = config[\"trainer\"]\n\n    path = config.get(\"ckpt_name_for_testing\", \"last.ckpt\")\n    if path in GRAPHIUM_PRETRAINED_MODELS_DICT or fs.exists(path):\n        return path\n\n    if \"model_checkpoint\" in cfg_trainer.keys():\n        dirpath = cfg_trainer[\"model_checkpoint\"][\"dirpath\"]\n        path = fs.join(dirpath, path)\n\n    if not fs.exists(path):\n        raise ValueError(f\"Checkpoint path `{path}` does not exist\")\n\n    return 
path\n"
  },
  {
    "path": "graphium/config/config_convert.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nimport omegaconf\n\n\ndef recursive_config_reformating(configs):\n    r\"\"\"\n    For a given configuration file, convert all `DictConfig` to `dict`,\n    all `ListConfig` to `list`, and all `byte` to `str`.\n\n    This helps avoid errors when dumping a yaml file.\n    \"\"\"\n\n    if isinstance(configs, omegaconf.DictConfig):\n        configs = dict(configs)\n    elif isinstance(configs, omegaconf.ListConfig):\n        configs = list(configs)\n\n    if isinstance(configs, dict):\n        for k, v in configs.items():\n            if isinstance(v, bytes):\n                configs[k] = str(v)\n            else:\n                configs[k] = recursive_config_reformating(v)\n    elif isinstance(configs, list):\n        for k, v in enumerate(configs):\n            if isinstance(v, bytes):\n                configs[k] = str(v)\n            else:\n                configs[k] = recursive_config_reformating(v)\n\n    return configs\n"
  },
  {
    "path": "graphium/config/dummy_finetuning_from_gnn.yaml",
    "content": "# Here, we are finetuning a FullGraphMultitaskNetwork\n# trained on ToyMix. We finetune from the gnn on the \n# TDC dataset lipophilicity_astraceneca\n\n# Here are the changes to the architecture:\n#   Change gnn:\n#     depth:        4 -> 4 - 2 + 3 = 5\n#\n#   Keep modules after gnn and apply modifications\n#     graph_output_nn/graph:\n#       pooling: sum -> mean\n#       depth: 1 -> 2\n#     task_heads/zinc:\n#       new_sub_module: lipophilicity_astrazeneca\n#       out_dim: 3 -> 1\n#\n# Finetuning training:\n#   unfreeze one additional layer of pretrained gnn\n#   after 1 epochs, unfreeze all layers\n\n\n###################################################\n########### How to combine information  ###########\n###################################################\n\n\n###########################\n### FINETUNING-SPECIFIC ###\n###########################\n\nfinetuning:\n  # New task\n  task: lipophilicity_astrazeneca\n  level: graph\n\n  # Pretrained model\n  pretrained_model: dummy-pretrained-model\n  finetuning_module: gnn \n\n  # Changes to finetuning_module                                                  \n  drop_depth: 2\n  added_depth: 3\n\n  keep_modules_after_finetuning_module: # optional\n    graph_output_nn-graph:\n      pooling: [mean]\n      depth: 2\n    task_heads-zinc:\n      new_sub_module: lipophilicity_astrazeneca\n      out_dim: 1\n\n  # Finetuning training\n  unfreeze_pretrained_depth: 1\n  epoch_unfreeze_all: 1\n\nconstants:\n  seed: 42\n  max_epochs: 2\n\naccelerator:\n  float32_matmul_precision: medium\n  type: cpu\n\npredictor:\n  random_seed: ${constants.seed}\n  optim_kwargs:\n    lr: 4.e-5\n  scheduler_kwargs: null\n  target_nan_mask: null\n  multitask_handling: flatten # flatten, mean-per-label\n  \n  torch_scheduler_kwargs:\n    module_type: WarmUpLinearLR\n    max_num_epochs: 2\n    warmup_epochs: 1\n    verbose: False\n  \n  metrics_on_progress_bar:\n    lipophilicity_astrazeneca: [\"mae\"]\n  loss_fun:\n    
lipophilicity_astrazeneca: mae\n\nmetrics:\n  lipophilicity_astrazeneca:\n    - name: mae\n      metric: mae\n      target_nan_mask: null\n      multitask_handling: flatten\n      threshold_kwargs: null\n    - name: spearman\n      metric: spearmanr\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n    - name: pearson\n      metric: pearsonr\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n    - name: r2_score\n      metric: r2_score\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n\ntrainer:\n  seed: ${constants.seed}\n  trainer:\n    precision: 32\n    max_epochs: 2\n    min_epochs: 1\n    check_val_every_n_epoch: 1\n    accumulate_grad_batches: 1\n  \n##################\n### DATAMODULE ###\n##################\n\ndatamodule:\n\n### FROM FINETUNING ###\n\n  module_type: \"ADMETBenchmarkDataModule\"\n  args:\n    # TDC specific\n    tdc_benchmark_names: [lipophilicity_astrazeneca]\n    tdc_train_val_seed: ${constants.seed}\n    \n    batch_size_training: 200\n    batch_size_inference: 200\n    featurization_n_jobs: 0\n    num_workers: 0\n\n    prepare_dict_or_graph: pyg:graph\n    featurization_progress: True\n    featurization_backend: \"loky\"\n    persistent_workers: False"
  },
  {
    "path": "graphium/config/dummy_finetuning_from_task_head.yaml",
    "content": "# Here, we are finetuning a FullGraphMultitaskNetwork\n# trained on ToyMix. We finetune from the zinc task-head\n# (graph-level) on the TDC dataset lipophilicity_astraceneca\n\n# Here are the changes to the architecture:\n#   Change zinc task-head:\n#     depth:        2 -> 2 - 1 + 2 = 3\n#     out_dim:      3 -> 8\n#\n#   Add finetuning head\n#     model_type:   FeedForwardNN\n#     out_dim:      1\n#     hidden_dims:  8\n#     depth:        2\n#\n# Finetuning training:\n#   after 1 epochs, unfreeze all layers\n\n\n###################################################\n########### How to combine information  ###########\n###################################################\n\n\n###########################\n### FINETUNING-SPECIFIC ###\n###########################\n\nfinetuning:\n  # New task\n  task: lipophilicity_astrazeneca\n  level: graph\n\n  # Pretrained model\n  pretrained_model: dummy-pretrained-model\n  finetuning_module: task_heads  \n  sub_module_from_pretrained: zinc # optional\n  new_sub_module: lipophilicity_astrazeneca # optional\n  # keep_modules_after_finetuning_module: # optional\n\n  # Changes to finetuning_module                                                  \n  drop_depth: 1\n  new_out_dim: 8\n  added_depth: 2\n\n  # Optional finetuning head appended to model after finetuning_module\n  finetuning_head: # none\n    task: lipophilicity_astrazeneca\n    previous_module: task_heads\n    incoming_level: graph\n    model_type: mlp\n    in_dim: 8\n    out_dim: 1\n    hidden_dims: 8\n    depth: 2\n    last_layer_is_readout: true\n\n  # Finetuning training\n  unfreeze_pretrained_depth: 0\n  epoch_unfreeze_all: 1\n\nconstants:\n  seed: 42\n  max_epochs: 2\n\naccelerator:\n  float32_matmul_precision: medium\n  type: cpu\n\npredictor:\n  random_seed: ${constants.seed}\n  optim_kwargs:\n    lr: 4.e-5\n  scheduler_kwargs: null\n  target_nan_mask: null\n  multitask_handling: flatten # flatten, mean-per-label\n  \n  torch_scheduler_kwargs:\n    
module_type: WarmUpLinearLR\n    max_num_epochs: 2\n    warmup_epochs: 1\n    verbose: False\n  \n  metrics_on_progress_bar:\n    lipophilicity_astrazeneca: [\"mae\"]\n  loss_fun:\n    lipophilicity_astrazeneca: mae\n\nmetrics:\n  lipophilicity_astrazeneca:\n    - name: mae\n      metric: mae\n      target_nan_mask: null\n      multitask_handling: flatten\n      threshold_kwargs: null\n    - name: spearman\n      metric: spearmanr\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n    - name: pearson\n      metric: pearsonr\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n    - name: r2_score\n      metric: r2_score\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n\ntrainer:\n  seed: ${constants.seed}\n  trainer:\n    precision: 32\n    max_epochs: 2\n    min_epochs: 1\n    check_val_every_n_epoch: 1\n    accumulate_grad_batches: 1\n  \n##################\n### DATAMODULE ###\n##################\n\ndatamodule:\n\n### FROM FINETUNING ###\n\n  module_type: \"ADMETBenchmarkDataModule\"\n  args:\n    # TDC specific\n    tdc_benchmark_names: [lipophilicity_astrazeneca]\n    tdc_train_val_seed: ${constants.seed}\n    \n    batch_size_training: 200\n    batch_size_inference: 200\n    featurization_n_jobs: 0\n    num_workers: 0\n\n    prepare_dict_or_graph: pyg:graph\n    featurization_progress: True\n    featurization_backend: \"loky\"\n    persistent_workers: False\n\n\n\n\n"
  },
  {
    "path": "graphium/config/fake_and_missing_multilevel_multitask_pyg.yaml",
    "content": "datamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      score:\n        df: null\n        task_level: \"edge\"\n        df_path: \"./tests/fake_and_missing_multilevel_data.parquet\"\n        smiles_col: \"ordered_smiles\"\n        label_cols: [\"edge_label_list\", \"edge_label_np\"]\n        split_val: 0.0\n        split_test: 0.0\n        seed: 19\n        splits_path: null\n        sample_size: null\n        idx_col: null\n        weights_col: null\n        weights_type: null\n      logp:\n        df: null\n        task_level: \"node\"\n        df_path: \"./tests/fake_and_missing_multilevel_data.parquet\"\n        smiles_col: \"ordered_smiles\"\n        label_cols: [\"node_label_list\", \"node_label_np\"]\n        split_val: 0.0\n        split_test: 0.0\n        seed: 19\n        splits_path: null\n        sample_size: null\n        idx_col: null\n        weights_col: null\n        weights_type: null\n      SA:\n        df: null\n        task_level: \"graph\"\n        df_path: \"./tests/fake_and_missing_multilevel_data.parquet\"\n        smiles_col: \"ordered_smiles\"\n        label_cols: [\"graph_label\"]\n        split_val: 0.0\n        split_test: 0.0\n        seed: 19\n        splits_path: null                 # This may not always be provided\n        sample_size: null                 # This may not always be provided\n        idx_col: null                     # This may not always be provided\n        weights_col: null                 # This may not always be provided\n\n    # Featurization\n    featurization_n_jobs: 16\n    featurization_progress: True\n    featurization:\n      atom_property_list_onehot: [\"atomic-number\", \"degree\"]\n      atom_property_list_float: []\n      edge_property_list: [\"in-ring\", \"bond-type-onehot\"]\n      add_self_loop: False\n      explicit_H: False\n 
     use_bonds_weights: False\n\n    # Data handling-related\n    batch_size_training: 16\n    batch_size_inference: 16\n\narchitecture:     # The parameters for the full graph network are taken from `config_micro_ZINC.yaml`\n  model_type: FullGraphMultiTaskNetwork\n  pre_nn:         # Set as null to avoid a pre-nn network\n    out_dim: 32\n    hidden_dims: 32\n    depth: 1\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.1\n    normalization: &normalization \"batch_norm\"\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges:   # Set as null to avoid a pre-nn network\n    out_dim: 16\n    hidden_dims: 16\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: *dropout\n    normalization: *normalization\n    last_normalization: *normalization\n    residual_type: none\n\n  gnn:            # Set as null to avoid a post-nn network\n    out_dim: 32\n    hidden_dims: 32\n    depth: 4\n    activation: relu\n    last_activation: none\n    dropout: *dropout\n    normalization: *normalization\n    last_normalization: *normalization\n    residual_type: random\n    virtual_node: 'sum'\n    layer_type: 'pyg:pna-msgpass'\n    layer_kwargs:\n      # num_heads: 3\n      aggregators: [mean, max]\n      scalers: [identity, amplification, attenuation]\n\n  graph_output_nn:\n    node:\n      out_dim: 16\n      hidden_dims: 32\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n    graph:\n      pooling: [sum, max]\n      out_dim: 1\n      hidden_dims: 32\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n    edge:\n      out_dim: 16\n      hidden_dims: 32\n      depth: 2\n      activation: relu\n      last_activation: none\n     
 dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n    nodepair:\n      out_dim: 16\n      hidden_dims: 32\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n  task_heads:     # Set as null to avoid task heads. Recall that the arguments for the TaskHeads are a List of TaskHeadParams\n    task_1:\n      task_level: \"node\"\n      out_dim: 5\n      hidden_dims: [5, 6, 7]\n      #depth: none                          # Not needed if we have hidden_dims\n      activation: relu\n      last_activation: none\n      dropout: 0.2\n      normalization: none\n      residual_type: none\n    task_2:\n      task_level: \"edge\"\n      out_dim: 3\n      hidden_dims: [8, 9, 10]\n      activation: relu\n      last_activation: none\n      dropout: 0.2\n      normalization: none\n      residual_type: none\n    task_3:\n      task_level: \"graph\"\n      out_dim: 4\n      hidden_dims: [2, 2, 2]\n      activation: relu\n      last_activation: none\n      dropout: 0.2\n      normalization: none\n      residual_type: none\n    task_4:\n      task_level: \"nodepair\"\n      out_dim: 4\n      hidden_dims: [2, 2, 2]\n      activation: relu\n      last_activation: none\n      dropout: 0.2\n      normalization: none\n      residual_type: none"
  },
  {
    "path": "graphium/config/fake_multilevel_multitask_pyg.yaml",
    "content": "datamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      score:\n        df: null\n        task_level: \"edge\"\n        df_path: \"./tests/converted_fake_multilevel_data.parquet\"\n        smiles_col: \"ordered_smiles\"\n        label_cols: [\"edge_label_list\", \"edge_label_np\"]\n        split_val: 0.0\n        split_test: 0.0\n        seed: 19\n        splits_path: null\n        sample_size: null\n        idx_col: null\n        weights_col: null\n        weights_type: null\n      logp:\n        df: null\n        task_level: \"node\"\n        df_path: \"./tests/converted_fake_multilevel_data.parquet\"\n        smiles_col: \"ordered_smiles\"\n        label_cols: [\"node_label_list\", \"node_label_np\"]\n        split_val: 0.0\n        split_test: 0.0\n        seed: 19\n        splits_path: null\n        sample_size: null\n        idx_col: null\n        weights_col: null\n        weights_type: null\n      SA:\n        df: null\n        task_level: \"graph\"\n        df_path: \"./tests/converted_fake_multilevel_data.parquet\"\n        smiles_col: \"ordered_smiles\"\n        label_cols: [\"graph_label\"]\n        split_val: 0.0\n        split_test: 0.0\n        seed: 19\n        splits_path: null                 # This may not always be provided\n        sample_size: null                 # This may not always be provided\n        idx_col: null                     # This may not always be provided\n        weights_col: null                 # This may not always be provided\n\n    # Featurization\n    featurization_n_jobs: 16\n    featurization_progress: True\n    featurization:\n      atom_property_list_onehot: [\"atomic-number\", \"degree\"]\n      atom_property_list_float: []\n      edge_property_list: [\"in-ring\", \"bond-type-onehot\"]\n      add_self_loop: False\n      explicit_H: False\n      
use_bonds_weights: False\n\n    # Data handling-related\n    batch_size_training: 16\n    batch_size_inference: 16\n\narchitecture:     # The parameters for the full graph network are taken from `config_micro_ZINC.yaml`\n  model_type: FullGraphMultiTaskNetwork\n  pre_nn:         # Set as null to avoid a pre-nn network\n    out_dim: 32\n    hidden_dims: 32\n    depth: 1\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.1\n    normalization: &normalization \"batch_norm\"\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges:   # Set as null to avoid a pre-nn network\n    out_dim: 16\n    hidden_dims: 16\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: *dropout\n    normalization: *normalization\n    last_normalization: *normalization\n    residual_type: none\n\n  gnn:            # Set as null to avoid a post-nn network\n    out_dim: 32\n    hidden_dims: 32\n    depth: 4\n    activation: relu\n    last_activation: none\n    dropout: *dropout\n    normalization: *normalization\n    last_normalization: *normalization\n    residual_type: random\n    virtual_node: 'sum'\n    layer_type: 'pyg:pna-msgpass'\n    layer_kwargs:\n      # num_heads: 3\n      aggregators: [mean, max]\n      scalers: [identity, amplification, attenuation]\n\n  graph_output_nn:\n    node:\n      out_dim: 16\n      hidden_dims: 32\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n    graph:\n      pooling: [sum, max]\n      out_dim: 1\n      hidden_dims: 32\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n    edge:\n      out_dim: 16\n      hidden_dims: 32\n      depth: 2\n      activation: relu\n      last_activation: none\n      
dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n    nodepair:\n      out_dim: 16\n      hidden_dims: 32\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n  task_heads:     # Set as null to avoid task heads. Recall that the arguments for the TaskHeads are a List of TaskHeadParams\n    task_1:\n      task_level: \"node\"\n      out_dim: 5\n      hidden_dims: [5, 6, 7]\n      #depth: none                          # Not needed if we have hidden_dims\n      activation: relu\n      last_activation: none\n      dropout: 0.2\n      normalization: none\n      residual_type: none\n    task_2:\n      task_level: \"edge\"\n      out_dim: 3\n      hidden_dims: [8, 9, 10]\n      activation: relu\n      last_activation: none\n      dropout: 0.2\n      normalization: none\n      residual_type: none\n    task_3:\n      task_level: \"graph\"\n      out_dim: 4\n      hidden_dims: [2, 2, 2]\n      activation: relu\n      last_activation: none\n      dropout: 0.2\n      normalization: none\n      residual_type: none\n    task_4:\n      task_level: \"nodepair\"\n      out_dim: 4\n      hidden_dims: [2, 2, 2]\n      activation: relu\n      last_activation: none\n      dropout: 0.2\n      normalization: none\n      residual_type: none"
  },
  {
    "path": "graphium/config/zinc_default_multitask_pyg.yaml",
    "content": "datamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      SA:\n        df: null\n        task_level: \"graph\"\n        df_path: \"graphium/data/multitask/tiny_ZINC_SA.csv\"\n        smiles_col: \"SMILES\"\n        label_cols: [\"SA\"]\n        split_val: 0.2\n        split_test: 0.2\n        seed: 19\n        splits_path: null                 # This may not always be provided\n        sample_size: null                 # This may not always be provided\n        idx_col: null                     # This may not always be provided\n        weights_col: null                 # This may not always be provided\n      logp:\n        df: null\n        task_level: \"graph\"\n        df_path: \"graphium/data/multitask/tiny_ZINC_logp.csv\"\n        smiles_col: \"SMILES\"\n        label_cols: [\"logp\"]\n        split_val: 0.2\n        split_test: 0.2\n        seed: 19\n        splits_path: null\n        sample_size: null\n        idx_col: null\n        weights_col: null\n        weights_type: null\n      score:\n        df: null\n        task_level: \"graph\"\n        df_path: \"graphium/data/multitask/tiny_ZINC_score.csv\"\n        smiles_col: \"SMILES\"\n        label_cols: [\"score\"]\n        split_val: 0.2\n        split_test: 0.2\n        seed: 19\n        splits_path: null\n        sample_size: null\n        idx_col: null\n        weights_col: null\n        weights_type: null\n\n    # Featurization\n    featurization_n_jobs: 16\n    featurization_progress: True\n    featurization:\n      atom_property_list_onehot: [\"atomic-number\", \"degree\"]\n      atom_property_list_float: []\n      edge_property_list: [\"in-ring\", \"bond-type-onehot\"]\n      add_self_loop: False\n      explicit_H: False\n      use_bonds_weights: False\n\n    # Data handling-related\n    batch_size_training: 16\n    
batch_size_inference: 16\n\narchitecture:     # The parameters for the full graph network are taken from `config_micro_ZINC.yaml`\n  model_type: FullGraphMultiTaskNetwork\n  pre_nn:         # Set as null to avoid a pre-nn network\n    out_dim: 32\n    hidden_dims: 32\n    depth: 1\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.1\n    normalization: &normalization \"batch_norm\"\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges:   # Set as null to avoid a pre-nn network\n    out_dim: 16\n    hidden_dims: 16\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: *dropout\n    normalization: *normalization\n    last_normalization: *normalization\n    residual_type: none\n\n  gnn:            # Set as null to avoid a post-nn network\n    out_dim: 32\n    hidden_dims: 32\n    depth: 4\n    activation: relu\n    last_activation: none\n    dropout: *dropout\n    normalization: *normalization\n    last_normalization: *normalization\n    residual_type: random\n    virtual_node: 'sum'\n    layer_type: 'pyg:pna-msgpass'\n    layer_kwargs:\n      # num_heads: 3\n      aggregators: [mean, max]\n      scalers: [identity, amplification, attenuation]\n\n  graph_output_nn:\n    node:\n      out_dim: 16\n      hidden_dims: 32\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n    graph:\n      pooling: [sum, max]\n      out_dim: 1\n      hidden_dims: 32\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n    edge:\n      out_dim: 16\n      hidden_dims: 32\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n  
    residual_type: none\n    nodepair:\n      out_dim: 16\n      hidden_dims: 32\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n  task_heads:     # Set as null to avoid task heads. Recall that the arguments for the TaskHeads are a List of TaskHeadParams\n    task_1:\n      task_level: \"node\"\n      out_dim: 5\n      hidden_dims: [5, 6, 7]\n      #depth: none                          # Not needed if we have hidden_dims\n      activation: relu\n      last_activation: none\n      dropout: 0.2\n      normalization: none\n      residual_type: none\n    task_2:\n      task_level: \"edge\"\n      out_dim: 3\n      hidden_dims: [8, 9, 10]\n      activation: relu\n      last_activation: none\n      dropout: 0.2\n      normalization: none\n      residual_type: none\n    task_3:\n      task_level: \"graph\"\n      out_dim: 4\n      hidden_dims: [2, 2, 2]\n      activation: relu\n      last_activation: none\n      dropout: 0.2\n      normalization: none\n      residual_type: none\n    task_4:\n      task_level: \"nodepair\"\n      out_dim: 4\n      hidden_dims: [2, 2, 2]\n      activation: relu\n      last_activation: none\n      dropout: 0.2\n      normalization: none\n      residual_type: none\naccelerator:\n  type: cpu"
  },
  {
    "path": "graphium/data/L1000/datasets_goli-L1000_parse_gctx_to_csv.ipynb",
    "content": "{\n \"cells\": [\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 1,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# Imports\\n\",\n    \"from os import makedirs\\n\",\n    \"import pandas as pd\\n\",\n    \"import numpy as np\\n\",\n    \"from cmapPy.pandasGEXpress.parse import parse\\n\",\n    \"from rdkit.Chem.AllChem import MolFromSmiles, MolToSmiles\\n\",\n    \"from IPython.display import display\\n\",\n    \"from matplotlib import pyplot as plt\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 2,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# Constants\\n\",\n    \"COMP_FILE = \\\"metadata/compoundinfo_beta.txt\\\"\\n\",\n    \"GENE_FILE = \\\"metadata/geneinfo_beta.txt\\\"\\n\",\n    \"INFO_FILE = \\\"data/instinfo_beta.txt\\\"\\n\",\n    \"LEVEL5_FILE = \\\"data/level5_beta_trt_cp_n720216x12328.gctx\\\"\\n\",\n    \"THRESHOLD = 4\\n\",\n    \"THRESHOLD_lst = [-6, -3, 3, 6]\\n\",\n    \"OUT_DIR_LST = []\\n\",\n    \"OUT_DIR = f\\\"out/th_{THRESHOLD}/\\\"\\n\",\n    \"OUT_DIR_LST.append(f\\\"out/less_th_{THRESHOLD_lst[0]}/\\\")\\n\",\n    \"for i in range(len(THRESHOLD_lst)-1):\\n\",\n    \"    OUT_DIR_LST.append(f\\\"out/th_{THRESHOLD_lst[i]}_{THRESHOLD_lst[i+1]}/\\\")\\n\",\n    \"OUT_DIR_LST.append(f\\\"out/bigger_th_{THRESHOLD_lst[3]}/\\\")\\n\",\n    \"# U2OS: Bone cancer line with most data (20k mols).\\n\",\n    \"# HA1E: Non-cancer kidney line with the most data (5k mols).\\n\",\n    \"CHOSEN_CELL_LINES = [\\\"U2OS\\\", \\\"HA1E\\\", \\\"VCAP\\\", \\\"A549\\\", \\\"MCF7\\\", \\\"PC3\\\", \\\"A375\\\"]\\n\",\n    \"MIN_ACTIVE_PER_COL = 15\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 3,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# Function to convert a SMILES to a canonical SMILES, or return None in case of error\\n\",\n    \"def canonical_smiles(smiles):\\n\",\n    \"    out = None\\n\",\n    \"    
try:\\n\",\n    \"        out = MolToSmiles(MolFromSmiles(smiles))\\n\",\n    \"    except Exception:\\n\",\n    \"        out = None\\n\",\n    \"\\n\",\n    \"    return out\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# Read the compounds data. This lets us relate the perturbation identifier \\\"pert_id\\\" to molecular SMILES\\n\",\n    \"comp_df = pd.read_csv(COMP_FILE, sep=\\\"\\\\t\\\", index_col=\\\"pert_id\\\", \\n\",\n    \"                    usecols=[\\\"pert_id\\\", \\\"canonical_smiles\\\", \\\"inchi_key\\\", \\\"compound_aliases\\\"])\\n\",\n    \"comp_df = comp_df.rename(columns={\\\"canonical_smiles\\\": \\\"SMILES\\\"})\\n\",\n    \"print(comp_df.shape)\\n\",\n    \"comp_df\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 26,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"(978, 2)\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>gene_id</th>\\n\",\n       \"      <th>feature_space</th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>2154</th>\\n\",\n       \"      <td>16</td>\\n\",\n       
\"      <td>landmark</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>2155</th>\\n\",\n       \"      <td>23</td>\\n\",\n       \"      <td>landmark</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>2156</th>\\n\",\n       \"      <td>25</td>\\n\",\n       \"      <td>landmark</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>2157</th>\\n\",\n       \"      <td>30</td>\\n\",\n       \"      <td>landmark</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>2158</th>\\n\",\n       \"      <td>39</td>\\n\",\n       \"      <td>landmark</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>3127</th>\\n\",\n       \"      <td>200081</td>\\n\",\n       \"      <td>landmark</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>3128</th>\\n\",\n       \"      <td>200734</td>\\n\",\n       \"      <td>landmark</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>3129</th>\\n\",\n       \"      <td>256364</td>\\n\",\n       \"      <td>landmark</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>3130</th>\\n\",\n       \"      <td>375346</td>\\n\",\n       \"      <td>landmark</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>3131</th>\\n\",\n       \"      <td>388650</td>\\n\",\n       \"      <td>landmark</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"<p>978 rows × 2 columns</p>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"      gene_id feature_space\\n\",\n       \"2154       16      landmark\\n\",\n       \"2155       23      landmark\\n\",\n       \"2156       25      
landmark\\n\",\n       \"2157       30      landmark\\n\",\n       \"2158       39      landmark\\n\",\n       \"...       ...           ...\\n\",\n       \"3127   200081      landmark\\n\",\n       \"3128   200734      landmark\\n\",\n       \"3129   256364      landmark\\n\",\n       \"3130   375346      landmark\\n\",\n       \"3131   388650      landmark\\n\",\n       \"\\n\",\n       \"[978 rows x 2 columns]\"\n      ]\n     },\n     \"execution_count\": 26,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"# Read the genes data. This lets us find which genes are measured, and which are inferred linearly\\n\",\n    \"gene_df = pd.read_csv(GENE_FILE, sep=\\\"\\\\t\\\", usecols=[\\\"gene_id\\\", \\\"feature_space\\\"])\\n\",\n    \"realgene_df = gene_df[gene_df[\\\"feature_space\\\"] == \\\"landmark\\\"]\\n\",\n    \"realgene_id = realgene_df[\\\"gene_id\\\"].values\\n\",\n    \"print(realgene_df.shape)\\n\",\n    \"realgene_df\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 27,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>pert_mfc_id</th>\\n\",\n       \"      <th>sample_id</th>\\n\",\n       \"      <th>cell_iname</th>\\n\",\n       \"      <th>cmap_name</th>\\n\",\n       \"    
</tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>0</th>\\n\",\n       \"      <td>ERG_11</td>\\n\",\n       \"      <td>ERG013_VCAP_72H_X3_B11:O14</td>\\n\",\n       \"      <td>VCAP</td>\\n\",\n       \"      <td>ERG</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>1</th>\\n\",\n       \"      <td>TRCN0000072237</td>\\n\",\n       \"      <td>TAK004_U2OS_96H_X2_B10_DUO52HI53LO:D10</td>\\n\",\n       \"      <td>U2OS</td>\\n\",\n       \"      <td>LACZ</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>2</th>\\n\",\n       \"      <td>SOD3</td>\\n\",\n       \"      <td>CYT001_HEPG2_2H_X2_B12:N12</td>\\n\",\n       \"      <td>HEPG2</td>\\n\",\n       \"      <td>SOD3</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>3</th>\\n\",\n       \"      <td>ENTRY00543</td>\\n\",\n       \"      <td>HSF038_HEK293T_48H_X2_B12:M01</td>\\n\",\n       \"      <td>HEK293T</td>\\n\",\n       \"      <td>PDGFRA</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>4</th>\\n\",\n       \"      <td>BRD-K79781870</td>\\n\",\n       \"      <td>DOS043_A375_24H_X1_F3B5_DUO52HI53LO:D17</td>\\n\",\n       \"      <td>A375</td>\\n\",\n       \"      <td>BRD-K79781870</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>3026455</th>\\n\",\n       \"      <td>BRD-K07955840</td>\\n\",\n       \"      <td>AICHI002_BJAB_4H_X1.A2_B40:N04</td>\\n\",\n       \"      <td>BJAB</td>\\n\",\n       \"      <td>PF-05212384</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>3026456</th>\\n\",\n       \"      <td>BRD-K64606589</td>\\n\",\n    
   \"      <td>HDAC001_MCF7_24H_X1_F1B3_DUO52HI53LO:H10</td>\\n\",\n       \"      <td>MCF7</td>\\n\",\n       \"      <td>apicidin</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>3026457</th>\\n\",\n       \"      <td>BRD-K61894884</td>\\n\",\n       \"      <td>HDAC001_PC3_24H_X1_F1B3_DUO52HI53LO:N04</td>\\n\",\n       \"      <td>PC3</td>\\n\",\n       \"      <td>BRD-K61894884</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>3026458</th>\\n\",\n       \"      <td>BRD-K57545991</td>\\n\",\n       \"      <td>RAD001_MCF7_24H_X3_F1B5_DUO52HI53LO:L07</td>\\n\",\n       \"      <td>MCF7</td>\\n\",\n       \"      <td>enalapril</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>3026459</th>\\n\",\n       \"      <td>BRD-K96042922</td>\\n\",\n       \"      <td>RAD001_A549_6H_X3_F1B5_DUO52HI53LO:F18</td>\\n\",\n       \"      <td>A549</td>\\n\",\n       \"      <td>etanidazole</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"<p>3026460 rows × 4 columns</p>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"            pert_mfc_id                                 sample_id cell_iname  \\\\\\n\",\n       \"0                ERG_11                ERG013_VCAP_72H_X3_B11:O14       VCAP   \\n\",\n       \"1        TRCN0000072237    TAK004_U2OS_96H_X2_B10_DUO52HI53LO:D10       U2OS   \\n\",\n       \"2                  SOD3                CYT001_HEPG2_2H_X2_B12:N12      HEPG2   \\n\",\n       \"3            ENTRY00543             HSF038_HEK293T_48H_X2_B12:M01    HEK293T   \\n\",\n       \"4         BRD-K79781870   DOS043_A375_24H_X1_F3B5_DUO52HI53LO:D17       A375   \\n\",\n       \"...                 ...                                       ...        ...   
\\n\",\n       \"3026455   BRD-K07955840            AICHI002_BJAB_4H_X1.A2_B40:N04       BJAB   \\n\",\n       \"3026456   BRD-K64606589  HDAC001_MCF7_24H_X1_F1B3_DUO52HI53LO:H10       MCF7   \\n\",\n       \"3026457   BRD-K61894884   HDAC001_PC3_24H_X1_F1B3_DUO52HI53LO:N04        PC3   \\n\",\n       \"3026458   BRD-K57545991   RAD001_MCF7_24H_X3_F1B5_DUO52HI53LO:L07       MCF7   \\n\",\n       \"3026459   BRD-K96042922    RAD001_A549_6H_X3_F1B5_DUO52HI53LO:F18       A549   \\n\",\n       \"\\n\",\n       \"             cmap_name  \\n\",\n       \"0                  ERG  \\n\",\n       \"1                 LACZ  \\n\",\n       \"2                 SOD3  \\n\",\n       \"3               PDGFRA  \\n\",\n       \"4        BRD-K79781870  \\n\",\n       \"...                ...  \\n\",\n       \"3026455    PF-05212384  \\n\",\n       \"3026456       apicidin  \\n\",\n       \"3026457  BRD-K61894884  \\n\",\n       \"3026458      enalapril  \\n\",\n       \"3026459    etanidazole  \\n\",\n       \"\\n\",\n       \"[3026460 rows x 4 columns]\"\n      ]\n     },\n     \"execution_count\": 27,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"\\n\",\n    \"info_df = pd.read_csv(INFO_FILE, sep=\\\"\\\\t\\\", usecols=[\\\"pert_mfc_id\\\", \\\"sample_id\\\", \\\"cmap_name\\\", \\\"cell_iname\\\"])\\n\",\n    \"info_df\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 28,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"12328\\n\",\n      \"978\\n\",\n      \"978\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# Read the first column of the full Level-5 data file, and find the row indexes (ridx) corresponding to real genes\\n\",\n    \"data_gene_id = parse(LEVEL5_FILE, cidx=[0]).row_metadata_df.index.values\\n\",\n    \"data_gene_id = data_gene_id.astype(int)\\n\",\n    \"keep_gene_ridx = [ii for ii, id in 
enumerate(data_gene_id) if id in realgene_id]\\n\",\n    \"\\n\",\n    \"print(len(data_gene_id))\\n\",\n    \"print(len(realgene_id))\\n\",\n    \"print(len(keep_gene_ridx))\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 29,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th>rid</th>\\n\",\n       \"      <th>10007</th>\\n\",\n       \"      <th>1001</th>\\n\",\n       \"      <th>10013</th>\\n\",\n       \"      <th>10038</th>\\n\",\n       \"      <th>10046</th>\\n\",\n       \"      <th>10049</th>\\n\",\n       \"      <th>10051</th>\\n\",\n       \"      <th>10057</th>\\n\",\n       \"      <th>10058</th>\\n\",\n       \"      <th>10059</th>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <th>9918</th>\\n\",\n       \"      <th>9924</th>\\n\",\n       \"      <th>9926</th>\\n\",\n       \"      <th>9928</th>\\n\",\n       \"      <th>993</th>\\n\",\n       \"      <th>994</th>\\n\",\n       \"      <th>9943</th>\\n\",\n       \"      <th>9961</th>\\n\",\n       \"      <th>998</th>\\n\",\n       \"      <th>9988</th>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>cid</th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       
\"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"      <th></th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>ABY001_A375_XH:BRD-A61304759:0.625:24</th>\\n\",\n       \"      <td>-1.166500</td>\\n\",\n       \"      <td>-0.606900</td>\\n\",\n       \"      <td>-0.733650</td>\\n\",\n       \"      <td>-1.481400</td>\\n\",\n       \"      <td>1.281200</td>\\n\",\n       \"      <td>4.095450</td>\\n\",\n       \"      <td>-0.712400</td>\\n\",\n       \"      <td>0.995350</td>\\n\",\n       \"      <td>-0.031650</td>\\n\",\n       \"      <td>-0.990250</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-4.591100</td>\\n\",\n       \"      <td>-2.772350</td>\\n\",\n       \"      <td>-1.522700</td>\\n\",\n       \"      <td>0.975500</td>\\n\",\n       \"      <td>-0.533750</td>\\n\",\n       \"      <td>-1.987450</td>\\n\",\n       \"      <td>-0.883050</td>\\n\",\n       \"      <td>-1.605100</td>\\n\",\n       \"      <td>0.005250</td>\\n\",\n       \"      <td>0.979050</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>ABY001_A375_XH:BRD-A61304759:0.625:3</th>\\n\",\n       \"      <td>0.794862</td>\\n\",\n       \"      <td>-0.358541</td>\\n\",\n       \"      <td>0.122322</td>\\n\",\n       \"      <td>-0.550787</td>\\n\",\n       \"      <td>-0.178181</td>\\n\",\n       \"      <td>1.566406</td>\\n\",\n       \"      <td>0.058614</td>\\n\",\n       \"      <td>0.308965</td>\\n\",\n       \"      
<td>0.369855</td>\\n\",\n       \"      <td>-0.948085</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.737773</td>\\n\",\n       \"      <td>-1.188786</td>\\n\",\n       \"      <td>0.710014</td>\\n\",\n       \"      <td>0.212338</td>\\n\",\n       \"      <td>0.813161</td>\\n\",\n       \"      <td>-1.252179</td>\\n\",\n       \"      <td>-2.421624</td>\\n\",\n       \"      <td>-0.335863</td>\\n\",\n       \"      <td>0.308946</td>\\n\",\n       \"      <td>-0.352101</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>ABY001_A375_XH:BRD-A61304759:10:24</th>\\n\",\n       \"      <td>2.599445</td>\\n\",\n       \"      <td>1.755998</td>\\n\",\n       \"      <td>-0.776326</td>\\n\",\n       \"      <td>-4.121394</td>\\n\",\n       \"      <td>2.539309</td>\\n\",\n       \"      <td>0.533612</td>\\n\",\n       \"      <td>-5.299499</td>\\n\",\n       \"      <td>1.496123</td>\\n\",\n       \"      <td>0.462508</td>\\n\",\n       \"      <td>2.645836</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.199841</td>\\n\",\n       \"      <td>1.496310</td>\\n\",\n       \"      <td>-1.691253</td>\\n\",\n       \"      <td>-1.129814</td>\\n\",\n       \"      <td>-3.005869</td>\\n\",\n       \"      <td>-3.355338</td>\\n\",\n       \"      <td>-0.400337</td>\\n\",\n       \"      <td>0.068793</td>\\n\",\n       \"      <td>-0.495560</td>\\n\",\n       \"      <td>0.044498</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>ABY001_A375_XH:BRD-A61304759:10:3</th>\\n\",\n       \"      <td>0.230140</td>\\n\",\n       \"      <td>1.530381</td>\\n\",\n       \"      <td>-0.823664</td>\\n\",\n       \"      <td>0.111742</td>\\n\",\n       \"      <td>0.497451</td>\\n\",\n       \"      <td>-1.489498</td>\\n\",\n       \"      <td>0.113403</td>\\n\",\n       \"      <td>0.309370</td>\\n\",\n       \"      <td>0.087925</td>\\n\",\n       \"      <td>-1.126528</td>\\n\",\n       \"   
   <td>...</td>\\n\",\n       \"      <td>0.054724</td>\\n\",\n       \"      <td>0.081189</td>\\n\",\n       \"      <td>1.334616</td>\\n\",\n       \"      <td>0.923640</td>\\n\",\n       \"      <td>0.600097</td>\\n\",\n       \"      <td>-2.425969</td>\\n\",\n       \"      <td>-2.049004</td>\\n\",\n       \"      <td>-0.486649</td>\\n\",\n       \"      <td>0.594023</td>\\n\",\n       \"      <td>1.365092</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>ABY001_A375_XH:BRD-A61304759:2.5:24</th>\\n\",\n       \"      <td>2.498556</td>\\n\",\n       \"      <td>3.288291</td>\\n\",\n       \"      <td>-0.831289</td>\\n\",\n       \"      <td>-3.811227</td>\\n\",\n       \"      <td>-0.816384</td>\\n\",\n       \"      <td>4.508691</td>\\n\",\n       \"      <td>-3.575771</td>\\n\",\n       \"      <td>1.432798</td>\\n\",\n       \"      <td>0.086126</td>\\n\",\n       \"      <td>1.699897</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-1.342500</td>\\n\",\n       \"      <td>1.914565</td>\\n\",\n       \"      <td>-0.571770</td>\\n\",\n       \"      <td>-0.849444</td>\\n\",\n       \"      <td>-3.464440</td>\\n\",\n       \"      <td>-2.774929</td>\\n\",\n       \"      <td>-0.593055</td>\\n\",\n       \"      <td>-0.425393</td>\\n\",\n       \"      <td>0.606134</td>\\n\",\n       \"      <td>-1.007184</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      
<td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>TSAI002_NPC-8_XH:CI-994:10</th>\\n\",\n       \"      <td>1.345983</td>\\n\",\n       \"      <td>1.670408</td>\\n\",\n       \"      <td>-0.476440</td>\\n\",\n       \"      <td>-2.529742</td>\\n\",\n       \"      <td>-0.085121</td>\\n\",\n       \"      <td>-1.593576</td>\\n\",\n       \"      <td>-0.682030</td>\\n\",\n       \"      <td>-0.727800</td>\\n\",\n       \"      <td>-0.848668</td>\\n\",\n       \"      <td>-5.717160</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>1.135530</td>\\n\",\n       \"      <td>-2.435960</td>\\n\",\n       \"      <td>-1.364442</td>\\n\",\n       \"      <td>-0.471916</td>\\n\",\n       \"      <td>-3.696775</td>\\n\",\n       \"      <td>-0.480867</td>\\n\",\n       \"      <td>0.580864</td>\\n\",\n       \"      <td>-1.001272</td>\\n\",\n       \"      <td>0.734962</td>\\n\",\n       \"      <td>-2.252348</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>TSAI002_NPC-8_XH:COMPE:2</th>\\n\",\n       \"      <td>-1.086894</td>\\n\",\n       \"      <td>-0.097282</td>\\n\",\n       \"      <td>-0.066396</td>\\n\",\n       \"      <td>-1.676546</td>\\n\",\n       \"      <td>0.870530</td>\\n\",\n       \"      <td>-0.634537</td>\\n\",\n       \"      <td>-0.058797</td>\\n\",\n       \"      <td>3.930091</td>\\n\",\n       \"      <td>-0.814762</td>\\n\",\n       \"      <td>0.419928</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.081087</td>\\n\",\n       \"      <td>0.830733</td>\\n\",\n       \"      <td>0.829341</td>\\n\",\n       \"      <td>0.053908</td>\\n\",\n       \"      <td>-0.668944</td>\\n\",\n       \"      <td>-1.414609</td>\\n\",\n       \"      
<td>0.347334</td>\\n\",\n       \"      <td>1.881268</td>\\n\",\n       \"      <td>0.202310</td>\\n\",\n       \"      <td>0.466976</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>TSAI002_NPC-8_XH:DAC-3:5</th>\\n\",\n       \"      <td>-0.172855</td>\\n\",\n       \"      <td>-0.120174</td>\\n\",\n       \"      <td>-0.810217</td>\\n\",\n       \"      <td>0.721430</td>\\n\",\n       \"      <td>0.245437</td>\\n\",\n       \"      <td>0.178710</td>\\n\",\n       \"      <td>1.026641</td>\\n\",\n       \"      <td>-0.151166</td>\\n\",\n       \"      <td>-0.043268</td>\\n\",\n       \"      <td>0.508430</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.055938</td>\\n\",\n       \"      <td>-0.204528</td>\\n\",\n       \"      <td>-0.076671</td>\\n\",\n       \"      <td>-0.097271</td>\\n\",\n       \"      <td>0.548531</td>\\n\",\n       \"      <td>-0.037314</td>\\n\",\n       \"      <td>0.554130</td>\\n\",\n       \"      <td>-0.466869</td>\\n\",\n       \"      <td>0.236111</td>\\n\",\n       \"      <td>-2.094904</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>TSAI002_NPC-8_XH:SAHA:2.5</th>\\n\",\n       \"      <td>-2.144240</td>\\n\",\n       \"      <td>1.323213</td>\\n\",\n       \"      <td>-0.087796</td>\\n\",\n       \"      <td>-0.895325</td>\\n\",\n       \"      <td>-1.104476</td>\\n\",\n       \"      <td>-0.261422</td>\\n\",\n       \"      <td>-0.444195</td>\\n\",\n       \"      <td>-1.646474</td>\\n\",\n       \"      <td>-0.167281</td>\\n\",\n       \"      <td>0.542381</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>2.291012</td>\\n\",\n       \"      <td>-0.796454</td>\\n\",\n       \"      <td>-2.154324</td>\\n\",\n       \"      <td>-0.060246</td>\\n\",\n       \"      <td>0.025645</td>\\n\",\n       \"      <td>-0.894975</td>\\n\",\n       \"      <td>-1.356140</td>\\n\",\n       \"      <td>0.823688</td>\\n\",\n       \"      
<td>-0.078019</td>\\n\",\n       \"      <td>-2.017400</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>TSAI002_NPC-8_XH:SRT3657:5</th>\\n\",\n       \"      <td>0.643173</td>\\n\",\n       \"      <td>0.010850</td>\\n\",\n       \"      <td>-0.523062</td>\\n\",\n       \"      <td>0.286562</td>\\n\",\n       \"      <td>-0.135734</td>\\n\",\n       \"      <td>-0.970101</td>\\n\",\n       \"      <td>0.343622</td>\\n\",\n       \"      <td>-0.639882</td>\\n\",\n       \"      <td>0.499465</td>\\n\",\n       \"      <td>-3.587306</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-3.262180</td>\\n\",\n       \"      <td>0.404180</td>\\n\",\n       \"      <td>0.424725</td>\\n\",\n       \"      <td>-0.224264</td>\\n\",\n       \"      <td>0.261778</td>\\n\",\n       \"      <td>0.282515</td>\\n\",\n       \"      <td>-0.107758</td>\\n\",\n       \"      <td>-0.762146</td>\\n\",\n       \"      <td>0.115392</td>\\n\",\n       \"      <td>-0.723844</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"<p>720216 rows × 978 columns</p>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"rid                                       10007      1001     10013     10038  \\\\\\n\",\n       \"cid                                                                             \\n\",\n       \"ABY001_A375_XH:BRD-A61304759:0.625:24 -1.166500 -0.606900 -0.733650 -1.481400   \\n\",\n       \"ABY001_A375_XH:BRD-A61304759:0.625:3   0.794862 -0.358541  0.122322 -0.550787   \\n\",\n       \"ABY001_A375_XH:BRD-A61304759:10:24     2.599445  1.755998 -0.776326 -4.121394   \\n\",\n       \"ABY001_A375_XH:BRD-A61304759:10:3      0.230140  1.530381 -0.823664  0.111742   \\n\",\n       \"ABY001_A375_XH:BRD-A61304759:2.5:24    2.498556  3.288291 -0.831289 -3.811227   \\n\",\n       \"...                                         ...       ...       ...       ...   
\\n\",\n       \"TSAI002_NPC-8_XH:CI-994:10             1.345983  1.670408 -0.476440 -2.529742   \\n\",\n       \"TSAI002_NPC-8_XH:COMPE:2              -1.086894 -0.097282 -0.066396 -1.676546   \\n\",\n       \"TSAI002_NPC-8_XH:DAC-3:5              -0.172855 -0.120174 -0.810217  0.721430   \\n\",\n       \"TSAI002_NPC-8_XH:SAHA:2.5             -2.144240  1.323213 -0.087796 -0.895325   \\n\",\n       \"TSAI002_NPC-8_XH:SRT3657:5             0.643173  0.010850 -0.523062  0.286562   \\n\",\n       \"\\n\",\n       \"rid                                       10046     10049     10051     10057  \\\\\\n\",\n       \"cid                                                                             \\n\",\n       \"ABY001_A375_XH:BRD-A61304759:0.625:24  1.281200  4.095450 -0.712400  0.995350   \\n\",\n       \"ABY001_A375_XH:BRD-A61304759:0.625:3  -0.178181  1.566406  0.058614  0.308965   \\n\",\n       \"ABY001_A375_XH:BRD-A61304759:10:24     2.539309  0.533612 -5.299499  1.496123   \\n\",\n       \"ABY001_A375_XH:BRD-A61304759:10:3      0.497451 -1.489498  0.113403  0.309370   \\n\",\n       \"ABY001_A375_XH:BRD-A61304759:2.5:24   -0.816384  4.508691 -3.575771  1.432798   \\n\",\n       \"...                                         ...       ...       ...       ...   \\n\",\n       \"TSAI002_NPC-8_XH:CI-994:10            -0.085121 -1.593576 -0.682030 -0.727800   \\n\",\n       \"TSAI002_NPC-8_XH:COMPE:2               0.870530 -0.634537 -0.058797  3.930091   \\n\",\n       \"TSAI002_NPC-8_XH:DAC-3:5               0.245437  0.178710  1.026641 -0.151166   \\n\",\n       \"TSAI002_NPC-8_XH:SAHA:2.5             -1.104476 -0.261422 -0.444195 -1.646474   \\n\",\n       \"TSAI002_NPC-8_XH:SRT3657:5            -0.135734 -0.970101  0.343622 -0.639882   \\n\",\n       \"\\n\",\n       \"rid                                       10058     10059  ...      9918  \\\\\\n\",\n       \"cid                                                        ...             
\\n\",\n       \"ABY001_A375_XH:BRD-A61304759:0.625:24 -0.031650 -0.990250  ... -4.591100   \\n\",\n       \"ABY001_A375_XH:BRD-A61304759:0.625:3   0.369855 -0.948085  ... -0.737773   \\n\",\n       \"ABY001_A375_XH:BRD-A61304759:10:24     0.462508  2.645836  ... -0.199841   \\n\",\n       \"ABY001_A375_XH:BRD-A61304759:10:3      0.087925 -1.126528  ...  0.054724   \\n\",\n       \"ABY001_A375_XH:BRD-A61304759:2.5:24    0.086126  1.699897  ... -1.342500   \\n\",\n       \"...                                         ...       ...  ...       ...   \\n\",\n       \"TSAI002_NPC-8_XH:CI-994:10            -0.848668 -5.717160  ...  1.135530   \\n\",\n       \"TSAI002_NPC-8_XH:COMPE:2              -0.814762  0.419928  ...  0.081087   \\n\",\n       \"TSAI002_NPC-8_XH:DAC-3:5              -0.043268  0.508430  ... -0.055938   \\n\",\n       \"TSAI002_NPC-8_XH:SAHA:2.5             -0.167281  0.542381  ...  2.291012   \\n\",\n       \"TSAI002_NPC-8_XH:SRT3657:5             0.499465 -3.587306  ... -3.262180   \\n\",\n       \"\\n\",\n       \"rid                                        9924      9926      9928       993  \\\\\\n\",\n       \"cid                                                                             \\n\",\n       \"ABY001_A375_XH:BRD-A61304759:0.625:24 -2.772350 -1.522700  0.975500 -0.533750   \\n\",\n       \"ABY001_A375_XH:BRD-A61304759:0.625:3  -1.188786  0.710014  0.212338  0.813161   \\n\",\n       \"ABY001_A375_XH:BRD-A61304759:10:24     1.496310 -1.691253 -1.129814 -3.005869   \\n\",\n       \"ABY001_A375_XH:BRD-A61304759:10:3      0.081189  1.334616  0.923640  0.600097   \\n\",\n       \"ABY001_A375_XH:BRD-A61304759:2.5:24    1.914565 -0.571770 -0.849444 -3.464440   \\n\",\n       \"...                                         ...       ...       ...       ...   
\\n\",\n       \"TSAI002_NPC-8_XH:CI-994:10            -2.435960 -1.364442 -0.471916 -3.696775   \\n\",\n       \"TSAI002_NPC-8_XH:COMPE:2               0.830733  0.829341  0.053908 -0.668944   \\n\",\n       \"TSAI002_NPC-8_XH:DAC-3:5              -0.204528 -0.076671 -0.097271  0.548531   \\n\",\n       \"TSAI002_NPC-8_XH:SAHA:2.5             -0.796454 -2.154324 -0.060246  0.025645   \\n\",\n       \"TSAI002_NPC-8_XH:SRT3657:5             0.404180  0.424725 -0.224264  0.261778   \\n\",\n       \"\\n\",\n       \"rid                                         994      9943      9961       998  \\\\\\n\",\n       \"cid                                                                             \\n\",\n       \"ABY001_A375_XH:BRD-A61304759:0.625:24 -1.987450 -0.883050 -1.605100  0.005250   \\n\",\n       \"ABY001_A375_XH:BRD-A61304759:0.625:3  -1.252179 -2.421624 -0.335863  0.308946   \\n\",\n       \"ABY001_A375_XH:BRD-A61304759:10:24    -3.355338 -0.400337  0.068793 -0.495560   \\n\",\n       \"ABY001_A375_XH:BRD-A61304759:10:3     -2.425969 -2.049004 -0.486649  0.594023   \\n\",\n       \"ABY001_A375_XH:BRD-A61304759:2.5:24   -2.774929 -0.593055 -0.425393  0.606134   \\n\",\n       \"...                                         ...       ...       ...       ...   
\\n\",\n       \"TSAI002_NPC-8_XH:CI-994:10            -0.480867  0.580864 -1.001272  0.734962   \\n\",\n       \"TSAI002_NPC-8_XH:COMPE:2              -1.414609  0.347334  1.881268  0.202310   \\n\",\n       \"TSAI002_NPC-8_XH:DAC-3:5              -0.037314  0.554130 -0.466869  0.236111   \\n\",\n       \"TSAI002_NPC-8_XH:SAHA:2.5             -0.894975 -1.356140  0.823688 -0.078019   \\n\",\n       \"TSAI002_NPC-8_XH:SRT3657:5             0.282515 -0.107758 -0.762146  0.115392   \\n\",\n       \"\\n\",\n       \"rid                                        9988  \\n\",\n       \"cid                                              \\n\",\n       \"ABY001_A375_XH:BRD-A61304759:0.625:24  0.979050  \\n\",\n       \"ABY001_A375_XH:BRD-A61304759:0.625:3  -0.352101  \\n\",\n       \"ABY001_A375_XH:BRD-A61304759:10:24     0.044498  \\n\",\n       \"ABY001_A375_XH:BRD-A61304759:10:3      1.365092  \\n\",\n       \"ABY001_A375_XH:BRD-A61304759:2.5:24   -1.007184  \\n\",\n       \"...                                         ...  
\\n\",\n       \"TSAI002_NPC-8_XH:CI-994:10            -2.252348  \\n\",\n       \"TSAI002_NPC-8_XH:COMPE:2               0.466976  \\n\",\n       \"TSAI002_NPC-8_XH:DAC-3:5              -2.094904  \\n\",\n       \"TSAI002_NPC-8_XH:SAHA:2.5             -2.017400  \\n\",\n       \"TSAI002_NPC-8_XH:SRT3657:5            -0.723844  \\n\",\n       \"\\n\",\n       \"[720216 rows x 978 columns]\"\n      ]\n     },\n     \"execution_count\": 29,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"# Read the full Level-5 data file by only taking rows with real genes\\n\",\n    \"parsed = parse(LEVEL5_FILE, ridx=keep_gene_ridx)\\n\",\n    \"parsed_df = parsed.data_df.transpose()\\n\",\n    \"parsed_df\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 30,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"720216\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>10007</th>\\n\",\n       \"      <th>1001</th>\\n\",\n       \"      <th>10013</th>\\n\",\n       \"      <th>10038</th>\\n\",\n       \"      <th>10046</th>\\n\",\n       \"      <th>10049</th>\\n\",\n       \"      <th>10051</th>\\n\",\n       \"      <th>10057</th>\\n\",\n       \"      
<th>10058</th>\\n\",\n       \"      <th>10059</th>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <th>9943</th>\\n\",\n       \"      <th>9961</th>\\n\",\n       \"      <th>998</th>\\n\",\n       \"      <th>9988</th>\\n\",\n       \"      <th>full_id</th>\\n\",\n       \"      <th>pert_id</th>\\n\",\n       \"      <th>cell_iname</th>\\n\",\n       \"      <th>SMILES</th>\\n\",\n       \"      <th>inchi_key</th>\\n\",\n       \"      <th>compound_aliases</th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>0</th>\\n\",\n       \"      <td>-1.166500</td>\\n\",\n       \"      <td>-0.606900</td>\\n\",\n       \"      <td>-0.733650</td>\\n\",\n       \"      <td>-1.481400</td>\\n\",\n       \"      <td>1.281200</td>\\n\",\n       \"      <td>4.095450</td>\\n\",\n       \"      <td>-0.712400</td>\\n\",\n       \"      <td>0.995350</td>\\n\",\n       \"      <td>-0.031650</td>\\n\",\n       \"      <td>-0.990250</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.883050</td>\\n\",\n       \"      <td>-1.605100</td>\\n\",\n       \"      <td>0.005250</td>\\n\",\n       \"      <td>0.979050</td>\\n\",\n       \"      <td>ABY001_A375_XH:BRD-A61304759:0.625:24</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>A375</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>1</th>\\n\",\n       \"      <td>0.794862</td>\\n\",\n       \"      <td>-0.358541</td>\\n\",\n       \"      <td>0.122322</td>\\n\",\n       \"      <td>-0.550787</td>\\n\",\n       \"      <td>-0.178181</td>\\n\",\n       \"      <td>1.566406</td>\\n\",\n       \"      <td>0.058614</td>\\n\",\n       \"      <td>0.308965</td>\\n\",\n       \"      <td>0.369855</td>\\n\",\n   
    \"      <td>-0.948085</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-2.421624</td>\\n\",\n       \"      <td>-0.335863</td>\\n\",\n       \"      <td>0.308946</td>\\n\",\n       \"      <td>-0.352101</td>\\n\",\n       \"      <td>ABY001_A375_XH:BRD-A61304759:0.625:3</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>A375</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>2</th>\\n\",\n       \"      <td>2.599445</td>\\n\",\n       \"      <td>1.755998</td>\\n\",\n       \"      <td>-0.776326</td>\\n\",\n       \"      <td>-4.121394</td>\\n\",\n       \"      <td>2.539309</td>\\n\",\n       \"      <td>0.533612</td>\\n\",\n       \"      <td>-5.299499</td>\\n\",\n       \"      <td>1.496123</td>\\n\",\n       \"      <td>0.462508</td>\\n\",\n       \"      <td>2.645836</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.400337</td>\\n\",\n       \"      <td>0.068793</td>\\n\",\n       \"      <td>-0.495560</td>\\n\",\n       \"      <td>0.044498</td>\\n\",\n       \"      <td>ABY001_A375_XH:BRD-A61304759:10:24</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>A375</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>3</th>\\n\",\n       \"      <td>0.230140</td>\\n\",\n       \"      <td>1.530381</td>\\n\",\n       \"      <td>-0.823664</td>\\n\",\n       \"      <td>0.111742</td>\\n\",\n       \"      <td>0.497451</td>\\n\",\n       \"      <td>-1.489498</td>\\n\",\n       \"      <td>0.113403</td>\\n\",\n       \"      <td>0.309370</td>\\n\",\n       \"      
<td>0.087925</td>\\n\",\n       \"      <td>-1.126528</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-2.049004</td>\\n\",\n       \"      <td>-0.486649</td>\\n\",\n       \"      <td>0.594023</td>\\n\",\n       \"      <td>1.365092</td>\\n\",\n       \"      <td>ABY001_A375_XH:BRD-A61304759:10:3</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>A375</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>4</th>\\n\",\n       \"      <td>2.498556</td>\\n\",\n       \"      <td>3.288291</td>\\n\",\n       \"      <td>-0.831289</td>\\n\",\n       \"      <td>-3.811227</td>\\n\",\n       \"      <td>-0.816384</td>\\n\",\n       \"      <td>4.508691</td>\\n\",\n       \"      <td>-3.575771</td>\\n\",\n       \"      <td>1.432798</td>\\n\",\n       \"      <td>0.086126</td>\\n\",\n       \"      <td>1.699897</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.593055</td>\\n\",\n       \"      <td>-0.425393</td>\\n\",\n       \"      <td>0.606134</td>\\n\",\n       \"      <td>-1.007184</td>\\n\",\n       \"      <td>ABY001_A375_XH:BRD-A61304759:2.5:24</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>A375</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n      
 \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>429384</th>\\n\",\n       \"      <td>0.605400</td>\\n\",\n       \"      <td>-0.179350</td>\\n\",\n       \"      <td>-0.672100</td>\\n\",\n       \"      <td>-0.628750</td>\\n\",\n       \"      <td>0.111400</td>\\n\",\n       \"      <td>0.544600</td>\\n\",\n       \"      <td>0.591700</td>\\n\",\n       \"      <td>-1.256000</td>\\n\",\n       \"      <td>0.485650</td>\\n\",\n       \"      <td>-4.867000</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.829850</td>\\n\",\n       \"      <td>-0.131600</td>\\n\",\n       \"      <td>-0.814150</td>\\n\",\n       \"      <td>-0.741750</td>\\n\",\n       \"      <td>RAD001_PC3_6H:BRD-K95986273-001-01-9:0.1235</td>\\n\",\n       \"      <td>BRD-K95986273</td>\\n\",\n       \"      <td>PC3</td>\\n\",\n       \"      <td>Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O</td>\\n\",\n       \"      <td>RBCPBVNYTIFDIR-RIYZIHGNSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>429385</th>\\n\",\n       \"      <td>0.160400</td>\\n\",\n       \"      <td>0.277550</td>\\n\",\n       \"      <td>-0.141750</td>\\n\",\n       \"      <td>1.081850</td>\\n\",\n       \"      <td>-0.488050</td>\\n\",\n       \"      <td>0.783200</td>\\n\",\n       \"      <td>-0.577050</td>\\n\",\n       \"      <td>-0.049500</td>\\n\",\n       \"      <td>-0.270950</td>\\n\",\n       \"      <td>0.489950</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-1.115850</td>\\n\",\n    
   \"      <td>0.279050</td>\\n\",\n       \"      <td>-0.244850</td>\\n\",\n       \"      <td>-0.436050</td>\\n\",\n       \"      <td>RAD001_PC3_6H:BRD-K95986273-001-01-9:0.3704</td>\\n\",\n       \"      <td>BRD-K95986273</td>\\n\",\n       \"      <td>PC3</td>\\n\",\n       \"      <td>Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O</td>\\n\",\n       \"      <td>RBCPBVNYTIFDIR-RIYZIHGNSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>429386</th>\\n\",\n       \"      <td>0.400950</td>\\n\",\n       \"      <td>0.139500</td>\\n\",\n       \"      <td>1.385550</td>\\n\",\n       \"      <td>-3.053850</td>\\n\",\n       \"      <td>5.312200</td>\\n\",\n       \"      <td>0.763750</td>\\n\",\n       \"      <td>-1.333150</td>\\n\",\n       \"      <td>0.471400</td>\\n\",\n       \"      <td>-0.385800</td>\\n\",\n       \"      <td>0.401000</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.331250</td>\\n\",\n       \"      <td>0.954800</td>\\n\",\n       \"      <td>0.373200</td>\\n\",\n       \"      <td>-0.479750</td>\\n\",\n       \"      <td>RAD001_PC3_6H:BRD-K95986273-001-01-9:1.1111</td>\\n\",\n       \"      <td>BRD-K95986273</td>\\n\",\n       \"      <td>PC3</td>\\n\",\n       \"      <td>Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O</td>\\n\",\n       \"      <td>RBCPBVNYTIFDIR-RIYZIHGNSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>429387</th>\\n\",\n       \"      <td>0.721800</td>\\n\",\n       \"      <td>0.454000</td>\\n\",\n       \"      <td>0.146150</td>\\n\",\n       \"      <td>-1.052350</td>\\n\",\n       \"      <td>0.394500</td>\\n\",\n       \"      <td>0.732550</td>\\n\",\n       \"      <td>-1.404100</td>\\n\",\n       \"      <td>0.240850</td>\\n\",\n       \"      <td>-0.753750</td>\\n\",\n       \"      <td>1.155500</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      
<td>-0.236550</td>\\n\",\n       \"      <td>0.410850</td>\\n\",\n       \"      <td>0.219400</td>\\n\",\n       \"      <td>-0.355300</td>\\n\",\n       \"      <td>RAD001_PC3_6H:BRD-K95986273-001-01-9:10</td>\\n\",\n       \"      <td>BRD-K95986273</td>\\n\",\n       \"      <td>PC3</td>\\n\",\n       \"      <td>Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O</td>\\n\",\n       \"      <td>RBCPBVNYTIFDIR-RIYZIHGNSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>429388</th>\\n\",\n       \"      <td>1.308400</td>\\n\",\n       \"      <td>-0.412800</td>\\n\",\n       \"      <td>-0.355400</td>\\n\",\n       \"      <td>0.805100</td>\\n\",\n       \"      <td>-0.674500</td>\\n\",\n       \"      <td>-0.141400</td>\\n\",\n       \"      <td>0.934500</td>\\n\",\n       \"      <td>-1.567500</td>\\n\",\n       \"      <td>-1.117700</td>\\n\",\n       \"      <td>-0.781900</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.064000</td>\\n\",\n       \"      <td>0.556300</td>\\n\",\n       \"      <td>-0.815800</td>\\n\",\n       \"      <td>-1.526200</td>\\n\",\n       \"      <td>RAD001_PC3_6H:BRD-K95986273-001-01-9:3.3333</td>\\n\",\n       \"      <td>BRD-K95986273</td>\\n\",\n       \"      <td>PC3</td>\\n\",\n       \"      <td>Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O</td>\\n\",\n       \"      <td>RBCPBVNYTIFDIR-RIYZIHGNSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"<p>414261 rows × 984 columns</p>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"           10007      1001     10013     10038     10046     10049     10051  \\\\\\n\",\n       \"0      -1.166500 -0.606900 -0.733650 -1.481400  1.281200  4.095450 -0.712400   \\n\",\n       \"1       0.794862 -0.358541  0.122322 -0.550787 -0.178181  1.566406  0.058614   \\n\",\n       \"2       2.599445  1.755998 -0.776326 
-4.121394  2.539309  0.533612 -5.299499   \\n\",\n       \"3       0.230140  1.530381 -0.823664  0.111742  0.497451 -1.489498  0.113403   \\n\",\n       \"4       2.498556  3.288291 -0.831289 -3.811227 -0.816384  4.508691 -3.575771   \\n\",\n       \"...          ...       ...       ...       ...       ...       ...       ...   \\n\",\n       \"429384  0.605400 -0.179350 -0.672100 -0.628750  0.111400  0.544600  0.591700   \\n\",\n       \"429385  0.160400  0.277550 -0.141750  1.081850 -0.488050  0.783200 -0.577050   \\n\",\n       \"429386  0.400950  0.139500  1.385550 -3.053850  5.312200  0.763750 -1.333150   \\n\",\n       \"429387  0.721800  0.454000  0.146150 -1.052350  0.394500  0.732550 -1.404100   \\n\",\n       \"429388  1.308400 -0.412800 -0.355400  0.805100 -0.674500 -0.141400  0.934500   \\n\",\n       \"\\n\",\n       \"           10057     10058     10059  ...      9943      9961       998  \\\\\\n\",\n       \"0       0.995350 -0.031650 -0.990250  ... -0.883050 -1.605100  0.005250   \\n\",\n       \"1       0.308965  0.369855 -0.948085  ... -2.421624 -0.335863  0.308946   \\n\",\n       \"2       1.496123  0.462508  2.645836  ... -0.400337  0.068793 -0.495560   \\n\",\n       \"3       0.309370  0.087925 -1.126528  ... -2.049004 -0.486649  0.594023   \\n\",\n       \"4       1.432798  0.086126  1.699897  ... -0.593055 -0.425393  0.606134   \\n\",\n       \"...          ...       ...       ...  ...       ...       ...       ...   \\n\",\n       \"429384 -1.256000  0.485650 -4.867000  ... -0.829850 -0.131600 -0.814150   \\n\",\n       \"429385 -0.049500 -0.270950  0.489950  ... -1.115850  0.279050 -0.244850   \\n\",\n       \"429386  0.471400 -0.385800  0.401000  ... -0.331250  0.954800  0.373200   \\n\",\n       \"429387  0.240850 -0.753750  1.155500  ... -0.236550  0.410850  0.219400   \\n\",\n       \"429388 -1.567500 -1.117700 -0.781900  ... 
-0.064000  0.556300 -0.815800   \\n\",\n       \"\\n\",\n       \"            9988                                      full_id        pert_id  \\\\\\n\",\n       \"0       0.979050        ABY001_A375_XH:BRD-A61304759:0.625:24  BRD-A61304759   \\n\",\n       \"1      -0.352101         ABY001_A375_XH:BRD-A61304759:0.625:3  BRD-A61304759   \\n\",\n       \"2       0.044498           ABY001_A375_XH:BRD-A61304759:10:24  BRD-A61304759   \\n\",\n       \"3       1.365092            ABY001_A375_XH:BRD-A61304759:10:3  BRD-A61304759   \\n\",\n       \"4      -1.007184          ABY001_A375_XH:BRD-A61304759:2.5:24  BRD-A61304759   \\n\",\n       \"...          ...                                          ...            ...   \\n\",\n       \"429384 -0.741750  RAD001_PC3_6H:BRD-K95986273-001-01-9:0.1235  BRD-K95986273   \\n\",\n       \"429385 -0.436050  RAD001_PC3_6H:BRD-K95986273-001-01-9:0.3704  BRD-K95986273   \\n\",\n       \"429386 -0.479750  RAD001_PC3_6H:BRD-K95986273-001-01-9:1.1111  BRD-K95986273   \\n\",\n       \"429387 -0.355300      RAD001_PC3_6H:BRD-K95986273-001-01-9:10  BRD-K95986273   \\n\",\n       \"429388 -1.526200  RAD001_PC3_6H:BRD-K95986273-001-01-9:3.3333  BRD-K95986273   \\n\",\n       \"\\n\",\n       \"        cell_iname                                             SMILES  \\\\\\n\",\n       \"0             A375  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"1             A375  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"2             A375  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"3             A375  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"4             A375  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"...            ...                                                ...   
\\n\",\n       \"429384         PC3             Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O   \\n\",\n       \"429385         PC3             Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O   \\n\",\n       \"429386         PC3             Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O   \\n\",\n       \"429387         PC3             Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O   \\n\",\n       \"429388         PC3             Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O   \\n\",\n       \"\\n\",\n       \"                          inchi_key  compound_aliases  \\n\",\n       \"0       AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"1       AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"2       AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"3       AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"4       AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"...                             ...               ...  \\n\",\n       \"429384  RBCPBVNYTIFDIR-RIYZIHGNSA-N               NaN  \\n\",\n       \"429385  RBCPBVNYTIFDIR-RIYZIHGNSA-N               NaN  \\n\",\n       \"429386  RBCPBVNYTIFDIR-RIYZIHGNSA-N               NaN  \\n\",\n       \"429387  RBCPBVNYTIFDIR-RIYZIHGNSA-N               NaN  \\n\",\n       \"429388  RBCPBVNYTIFDIR-RIYZIHGNSA-N               NaN  \\n\",\n       \"\\n\",\n       \"[414261 rows x 984 columns]\"\n      ]\n     },\n     \"execution_count\": 30,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"# Add the \\\"pert_id\\\" and \\\"cell_iname\\\" to the dataframe\\n\",\n    \"pert_id = [\\\"-\\\".join(id.split(\\\":\\\")[1].split(\\\"-\\\")[:2]) for id in parsed_df.index]\\n\",\n    \"cell_iname = [id.split(\\\"_\\\")[1] for id in parsed_df.index]\\n\",\n    \"data_cols = list(parsed_df.columns)\\n\",\n    \"parsed_df2 = parsed_df.copy(deep=True)\\n\",\n    \"print(len(parsed_df2.index))\\n\",\n    \"parsed_df2[\\\"full_id\\\"] = 
parsed_df2.index\\n\",\n    \"parsed_df2[\\\"pert_id\\\"] = pert_id\\n\",\n    \"parsed_df2[\\\"cell_iname\\\"] = cell_iname\\n\",\n    \"\\n\",\n    \"# Remove all rows that are not small molecules (don't contain \\\"BRD\\\").\\n\",\n    \"# Merge with `comp_df` to get the compound information associated to each `pert_id`\\n\",\n    \"# Remove all rows that don't have a valid SMILES\\n\",\n    \"parsed_df2 = parsed_df2[parsed_df2[\\\"pert_id\\\"].str.contains(\\\"BRD\\\")]\\n\",\n    \"parsed_df2 = pd.merge(parsed_df2, comp_df, on=\\\"pert_id\\\")\\n\",\n    \"is_good_smiles = np.array([isinstance(s, str) for s in parsed_df2[\\\"SMILES\\\"]])\\n\",\n    \"parsed_df2 = parsed_df2[is_good_smiles]\\n\",\n    \"parsed_df2\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 31,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"top 20 cell lines, number of unique molecules:\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"text/plain\": [\n       \"U2OS        16058\\n\",\n       \"VCAP        15220\\n\",\n       \"A549        12285\\n\",\n       \"MCF7        11622\\n\",\n       \"PC3         11521\\n\",\n       \"A375        10694\\n\",\n       \"HT29        10078\\n\",\n       \"HA1E         5514\\n\",\n       \"HCC515       5351\\n\",\n       \"HEPG2        4585\\n\",\n       \"NPC          3481\\n\",\n       \"NEU          2508\\n\",\n       \"ASC          2443\\n\",\n       \"SKB          2434\\n\",\n       \"PHH          1851\\n\",\n       \"FIBRNPC       611\\n\",\n       \"U937          399\\n\",\n       \"NCIH2073      368\\n\",\n       \"NCIH596       368\\n\",\n       \"NCIH508       367\\n\",\n       \"Name: cell_iname, dtype: int64\"\n      ]\n     },\n     \"execution_count\": 31,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"# Find the 20 cell lines with the most unique molecules data 
points, and print them\\n\",\n    \"cols_merge_by = [\\\"pert_id\\\", \\\"cell_iname\\\"]\\n\",\n    \"cols_other = list(set(parsed_df2.columns) - set(cols_merge_by))\\n\",\n    \"agg_dict = {col: \\\"mean\\\" if col in data_cols else \\\"first\\\" for col in cols_other}\\n\",\n    \"grouped_by_cell_mol = parsed_df2.groupby(by=cols_merge_by, axis=0, as_index=False).agg(agg_dict)\\n\",\n    \"print(\\\"top 20 cell lines, number of unique molecules:\\\")\\n\",\n    \"top20_lines = grouped_by_cell_mol[\\\"cell_iname\\\"].value_counts()[:20]\\n\",\n    \"top20_lines\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 32,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"U2OS\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>10007</th>\\n\",\n       \"      <th>1001</th>\\n\",\n       \"      <th>10013</th>\\n\",\n       \"      <th>10038</th>\\n\",\n       \"      <th>10046</th>\\n\",\n       \"      <th>10049</th>\\n\",\n       \"      <th>10051</th>\\n\",\n       \"      <th>10057</th>\\n\",\n       \"      <th>10058</th>\\n\",\n       \"      <th>10059</th>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <th>9943</th>\\n\",\n       \"      <th>9961</th>\\n\",\n       \"      
<th>998</th>\\n\",\n       \"      <th>9988</th>\\n\",\n       \"      <th>full_id</th>\\n\",\n       \"      <th>pert_id</th>\\n\",\n       \"      <th>cell_iname</th>\\n\",\n       \"      <th>SMILES</th>\\n\",\n       \"      <th>inchi_key</th>\\n\",\n       \"      <th>compound_aliases</th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>354</th>\\n\",\n       \"      <td>0.266000</td>\\n\",\n       \"      <td>-0.479600</td>\\n\",\n       \"      <td>-0.198300</td>\\n\",\n       \"      <td>-0.431300</td>\\n\",\n       \"      <td>-1.229350</td>\\n\",\n       \"      <td>3.355450</td>\\n\",\n       \"      <td>-1.548500</td>\\n\",\n       \"      <td>0.493650</td>\\n\",\n       \"      <td>-0.611050</td>\\n\",\n       \"      <td>0.528150</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.097050</td>\\n\",\n       \"      <td>-2.536850</td>\\n\",\n       \"      <td>-1.160750</td>\\n\",\n       \"      <td>0.172700</td>\\n\",\n       \"      <td>PAC001_U2OS_6H:BRD-A61304759-001-01-0:20</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>U2OS</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>355</th>\\n\",\n       \"      <td>0.215200</td>\\n\",\n       \"      <td>0.210250</td>\\n\",\n       \"      <td>1.565950</td>\\n\",\n       \"      <td>-0.410100</td>\\n\",\n       \"      <td>-0.532450</td>\\n\",\n       \"      <td>3.426700</td>\\n\",\n       \"      <td>-0.762800</td>\\n\",\n       \"      <td>1.400200</td>\\n\",\n       \"      <td>0.557150</td>\\n\",\n       \"      <td>-0.067150</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>1.420500</td>\\n\",\n       \"      <td>-1.600900</td>\\n\",\n       \"      
<td>-0.944150</td>\\n\",\n       \"      <td>0.896000</td>\\n\",\n       \"      <td>PAC002_U2OS_6H:BRD-A61304759-001-01-0:20</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>U2OS</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>356</th>\\n\",\n       \"      <td>-0.030250</td>\\n\",\n       \"      <td>0.215500</td>\\n\",\n       \"      <td>1.020750</td>\\n\",\n       \"      <td>0.212600</td>\\n\",\n       \"      <td>0.240450</td>\\n\",\n       \"      <td>2.905800</td>\\n\",\n       \"      <td>-1.067600</td>\\n\",\n       \"      <td>0.687250</td>\\n\",\n       \"      <td>0.232700</td>\\n\",\n       \"      <td>0.448650</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.074000</td>\\n\",\n       \"      <td>-1.339700</td>\\n\",\n       \"      <td>-0.656750</td>\\n\",\n       \"      <td>-0.375600</td>\\n\",\n       \"      <td>PAC003_U2OS_6H:BRD-A61304759-001-01-0:20</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>U2OS</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>357</th>\\n\",\n       \"      <td>0.131900</td>\\n\",\n       \"      <td>1.166850</td>\\n\",\n       \"      <td>1.117750</td>\\n\",\n       \"      <td>-0.466700</td>\\n\",\n       \"      <td>-0.843450</td>\\n\",\n       \"      <td>4.599200</td>\\n\",\n       \"      <td>-2.009750</td>\\n\",\n       \"      <td>-0.575650</td>\\n\",\n       \"      <td>0.672100</td>\\n\",\n       \"      <td>0.367900</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.687200</td>\\n\",\n       \"      
<td>-1.443900</td>\\n\",\n       \"      <td>-1.615750</td>\\n\",\n       \"      <td>-0.186750</td>\\n\",\n       \"      <td>PAC004_U2OS_6H:BRD-A61304759-001-01-0:20</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>U2OS</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>358</th>\\n\",\n       \"      <td>-0.034000</td>\\n\",\n       \"      <td>0.874000</td>\\n\",\n       \"      <td>0.655750</td>\\n\",\n       \"      <td>-0.269150</td>\\n\",\n       \"      <td>0.966950</td>\\n\",\n       \"      <td>3.022300</td>\\n\",\n       \"      <td>-0.939900</td>\\n\",\n       \"      <td>0.985750</td>\\n\",\n       \"      <td>0.717850</td>\\n\",\n       \"      <td>-0.352600</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-1.305050</td>\\n\",\n       \"      <td>-1.605700</td>\\n\",\n       \"      <td>-2.667600</td>\\n\",\n       \"      <td>-0.660750</td>\\n\",\n       \"      <td>PAC005_U2OS_6H:BRD-A61304759-001-01-0:20</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>U2OS</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n      
 \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>423912</th>\\n\",\n       \"      <td>-0.517706</td>\\n\",\n       \"      <td>-1.212317</td>\\n\",\n       \"      <td>-0.322461</td>\\n\",\n       \"      <td>0.124004</td>\\n\",\n       \"      <td>-0.497949</td>\\n\",\n       \"      <td>0.409649</td>\\n\",\n       \"      <td>1.260086</td>\\n\",\n       \"      <td>0.022732</td>\\n\",\n       \"      <td>-0.279040</td>\\n\",\n       \"      <td>0.357222</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>1.349653</td>\\n\",\n       \"      <td>1.112821</td>\\n\",\n       \"      <td>-0.600327</td>\\n\",\n       \"      <td>-0.213830</td>\\n\",\n       \"      <td>PAC068_U2OS_6H:BRD-K77801455-001-05-3:10</td>\\n\",\n       \"      <td>BRD-K77801455</td>\\n\",\n       \"      <td>U2OS</td>\\n\",\n       \"      <td>CCC(=O)NCC(=O)OCC(=O)c1ccccc1</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>Propionylamino-acetic acid 2-oxo-2-phenyl-ethy...</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>423922</th>\\n\",\n       \"      <td>0.404553</td>\\n\",\n       \"      <td>-0.282865</td>\\n\",\n       \"      <td>0.209292</td>\\n\",\n       \"      <td>0.007066</td>\\n\",\n       \"      <td>0.490206</td>\\n\",\n       \"      <td>-0.355228</td>\\n\",\n       \"      <td>-0.748379</td>\\n\",\n       \"      <td>0.106611</td>\\n\",\n       \"      <td>-0.473985</td>\\n\",\n       \"      <td>-0.632465</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.196296</td>\\n\",\n       \"      <td>-0.104630</td>\\n\",\n       \"      <td>0.070716</td>\\n\",\n       \"      <td>-0.157745</td>\\n\",\n       \"     
 <td>PAC068_U2OS_6H:BRD-K81266242-001-05-1:10</td>\\n\",\n       \"      <td>BRD-K81266242</td>\\n\",\n       \"      <td>U2OS</td>\\n\",\n       \"      <td>O=C(CCc1ccccc1)N1CCN(CC1)c1ccccn1</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>1-(3-phenylpropanoyl)-4-(2-pyridinyl)piperazine</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>423925</th>\\n\",\n       \"      <td>-0.451088</td>\\n\",\n       \"      <td>0.057352</td>\\n\",\n       \"      <td>-0.274881</td>\\n\",\n       \"      <td>-0.319326</td>\\n\",\n       \"      <td>-0.444744</td>\\n\",\n       \"      <td>0.394264</td>\\n\",\n       \"      <td>0.334506</td>\\n\",\n       \"      <td>-0.001001</td>\\n\",\n       \"      <td>-0.517647</td>\\n\",\n       \"      <td>-0.022595</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.168200</td>\\n\",\n       \"      <td>1.282395</td>\\n\",\n       \"      <td>0.302800</td>\\n\",\n       \"      <td>-0.405775</td>\\n\",\n       \"      <td>PAC068_U2OS_6H:BRD-K83028309-001-05-3:10</td>\\n\",\n       \"      <td>BRD-K83028309</td>\\n\",\n       \"      <td>U2OS</td>\\n\",\n       \"      <td>Nc1cc(nc2cc(nn12)-c1ccccc1)-c1ccc(Br)cc1</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>5-(4-Bromo-phenyl)-2-phenyl-pyrazolo[1,5-a]pyr...</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>423939</th>\\n\",\n       \"      <td>-1.002630</td>\\n\",\n       \"      <td>0.584975</td>\\n\",\n       \"      <td>0.709276</td>\\n\",\n       \"      <td>0.607589</td>\\n\",\n       \"      <td>0.158020</td>\\n\",\n       \"      <td>-1.167863</td>\\n\",\n       \"      <td>0.213422</td>\\n\",\n       \"      <td>-0.155531</td>\\n\",\n       \"      <td>0.696174</td>\\n\",\n       \"      <td>-0.907303</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.025072</td>\\n\",\n       \"      <td>0.210780</td>\\n\",\n       \"      
<td>-0.037277</td>\\n\",\n       \"      <td>-0.869588</td>\\n\",\n       \"      <td>PAC068_U2OS_6H:BRD-K87879912-001-05-4:10</td>\\n\",\n       \"      <td>BRD-K87879912</td>\\n\",\n       \"      <td>U2OS</td>\\n\",\n       \"      <td>O=S(=O)(CCOc1ccccc1)c1nc2ccccc2[nH]1</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>423955</th>\\n\",\n       \"      <td>-0.570759</td>\\n\",\n       \"      <td>-0.515178</td>\\n\",\n       \"      <td>0.889132</td>\\n\",\n       \"      <td>-2.230757</td>\\n\",\n       \"      <td>2.850153</td>\\n\",\n       \"      <td>-1.396367</td>\\n\",\n       \"      <td>-0.140172</td>\\n\",\n       \"      <td>-0.561217</td>\\n\",\n       \"      <td>0.466810</td>\\n\",\n       \"      <td>-1.153605</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.611325</td>\\n\",\n       \"      <td>1.261738</td>\\n\",\n       \"      <td>0.581113</td>\\n\",\n       \"      <td>0.099712</td>\\n\",\n       \"      <td>PAC068_U2OS_6H:BRD-K94948151-001-06-6:10</td>\\n\",\n       \"      <td>BRD-K94948151</td>\\n\",\n       \"      <td>U2OS</td>\\n\",\n       \"      <td>COc1cc(C[C@H](C)[C@H](C)Cc2ccc3OCOc3c2)ccc1O</td>\\n\",\n       \"      <td>QDDILOVMGWUNGD-UONOGXRCSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"<p>22631 rows × 984 columns</p>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"           10007      1001     10013     10038     10046     10049     10051  \\\\\\n\",\n       \"354     0.266000 -0.479600 -0.198300 -0.431300 -1.229350  3.355450 -1.548500   \\n\",\n       \"355     0.215200  0.210250  1.565950 -0.410100 -0.532450  3.426700 -0.762800   \\n\",\n       \"356    -0.030250  0.215500  1.020750  0.212600  0.240450  2.905800 -1.067600   \\n\",\n       \"357     0.131900  1.166850  1.117750 -0.466700 
-0.843450  4.599200 -2.009750   \\n\",\n       \"358    -0.034000  0.874000  0.655750 -0.269150  0.966950  3.022300 -0.939900   \\n\",\n       \"...          ...       ...       ...       ...       ...       ...       ...   \\n\",\n       \"423912 -0.517706 -1.212317 -0.322461  0.124004 -0.497949  0.409649  1.260086   \\n\",\n       \"423922  0.404553 -0.282865  0.209292  0.007066  0.490206 -0.355228 -0.748379   \\n\",\n       \"423925 -0.451088  0.057352 -0.274881 -0.319326 -0.444744  0.394264  0.334506   \\n\",\n       \"423939 -1.002630  0.584975  0.709276  0.607589  0.158020 -1.167863  0.213422   \\n\",\n       \"423955 -0.570759 -0.515178  0.889132 -2.230757  2.850153 -1.396367 -0.140172   \\n\",\n       \"\\n\",\n       \"           10057     10058     10059  ...      9943      9961       998  \\\\\\n\",\n       \"354     0.493650 -0.611050  0.528150  ...  0.097050 -2.536850 -1.160750   \\n\",\n       \"355     1.400200  0.557150 -0.067150  ...  1.420500 -1.600900 -0.944150   \\n\",\n       \"356     0.687250  0.232700  0.448650  ... -0.074000 -1.339700 -0.656750   \\n\",\n       \"357    -0.575650  0.672100  0.367900  ...  0.687200 -1.443900 -1.615750   \\n\",\n       \"358     0.985750  0.717850 -0.352600  ... -1.305050 -1.605700 -2.667600   \\n\",\n       \"...          ...       ...       ...  ...       ...       ...       ...   \\n\",\n       \"423912  0.022732 -0.279040  0.357222  ...  1.349653  1.112821 -0.600327   \\n\",\n       \"423922  0.106611 -0.473985 -0.632465  ...  0.196296 -0.104630  0.070716   \\n\",\n       \"423925 -0.001001 -0.517647 -0.022595  ...  0.168200  1.282395  0.302800   \\n\",\n       \"423939 -0.155531  0.696174 -0.907303  ...  0.025072  0.210780 -0.037277   \\n\",\n       \"423955 -0.561217  0.466810 -1.153605  ...  
0.611325  1.261738  0.581113   \\n\",\n       \"\\n\",\n       \"            9988                                   full_id        pert_id  \\\\\\n\",\n       \"354     0.172700  PAC001_U2OS_6H:BRD-A61304759-001-01-0:20  BRD-A61304759   \\n\",\n       \"355     0.896000  PAC002_U2OS_6H:BRD-A61304759-001-01-0:20  BRD-A61304759   \\n\",\n       \"356    -0.375600  PAC003_U2OS_6H:BRD-A61304759-001-01-0:20  BRD-A61304759   \\n\",\n       \"357    -0.186750  PAC004_U2OS_6H:BRD-A61304759-001-01-0:20  BRD-A61304759   \\n\",\n       \"358    -0.660750  PAC005_U2OS_6H:BRD-A61304759-001-01-0:20  BRD-A61304759   \\n\",\n       \"...          ...                                       ...            ...   \\n\",\n       \"423912 -0.213830  PAC068_U2OS_6H:BRD-K77801455-001-05-3:10  BRD-K77801455   \\n\",\n       \"423922 -0.157745  PAC068_U2OS_6H:BRD-K81266242-001-05-1:10  BRD-K81266242   \\n\",\n       \"423925 -0.405775  PAC068_U2OS_6H:BRD-K83028309-001-05-3:10  BRD-K83028309   \\n\",\n       \"423939 -0.869588  PAC068_U2OS_6H:BRD-K87879912-001-05-4:10  BRD-K87879912   \\n\",\n       \"423955  0.099712  PAC068_U2OS_6H:BRD-K94948151-001-06-6:10  BRD-K94948151   \\n\",\n       \"\\n\",\n       \"        cell_iname                                             SMILES  \\\\\\n\",\n       \"354           U2OS  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"355           U2OS  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"356           U2OS  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"357           U2OS  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"358           U2OS  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"...            ...                                                ...   
\\n\",\n       \"423912        U2OS                      CCC(=O)NCC(=O)OCC(=O)c1ccccc1   \\n\",\n       \"423922        U2OS                  O=C(CCc1ccccc1)N1CCN(CC1)c1ccccn1   \\n\",\n       \"423925        U2OS           Nc1cc(nc2cc(nn12)-c1ccccc1)-c1ccc(Br)cc1   \\n\",\n       \"423939        U2OS               O=S(=O)(CCOc1ccccc1)c1nc2ccccc2[nH]1   \\n\",\n       \"423955        U2OS       COc1cc(C[C@H](C)[C@H](C)Cc2ccc3OCOc3c2)ccc1O   \\n\",\n       \"\\n\",\n       \"                          inchi_key  \\\\\\n\",\n       \"354     AYUNIORJHRXIBJ-ZGQRYRSUSA-N   \\n\",\n       \"355     AYUNIORJHRXIBJ-ZGQRYRSUSA-N   \\n\",\n       \"356     AYUNIORJHRXIBJ-ZGQRYRSUSA-N   \\n\",\n       \"357     AYUNIORJHRXIBJ-ZGQRYRSUSA-N   \\n\",\n       \"358     AYUNIORJHRXIBJ-ZGQRYRSUSA-N   \\n\",\n       \"...                             ...   \\n\",\n       \"423912                          NaN   \\n\",\n       \"423922                          NaN   \\n\",\n       \"423925                          NaN   \\n\",\n       \"423939                          NaN   \\n\",\n       \"423955  QDDILOVMGWUNGD-UONOGXRCSA-N   \\n\",\n       \"\\n\",\n       \"                                         compound_aliases  \\n\",\n       \"354                                                   NaN  \\n\",\n       \"355                                                   NaN  \\n\",\n       \"356                                                   NaN  \\n\",\n       \"357                                                   NaN  \\n\",\n       \"358                                                   NaN  \\n\",\n       \"...                                                   ...  \\n\",\n       \"423912  Propionylamino-acetic acid 2-oxo-2-phenyl-ethy...  \\n\",\n       \"423922    1-(3-phenylpropanoyl)-4-(2-pyridinyl)piperazine  \\n\",\n       \"423925  5-(4-Bromo-phenyl)-2-phenyl-pyrazolo[1,5-a]pyr...  
\\n\",\n       \"423939                                                NaN  \\n\",\n       \"423955                                                NaN  \\n\",\n       \"\\n\",\n       \"[22631 rows x 984 columns]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"HA1E\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>10007</th>\\n\",\n       \"      <th>1001</th>\\n\",\n       \"      <th>10013</th>\\n\",\n       \"      <th>10038</th>\\n\",\n       \"      <th>10046</th>\\n\",\n       \"      <th>10049</th>\\n\",\n       \"      <th>10051</th>\\n\",\n       \"      <th>10057</th>\\n\",\n       \"      <th>10058</th>\\n\",\n       \"      <th>10059</th>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <th>9943</th>\\n\",\n       \"      <th>9961</th>\\n\",\n       \"      <th>998</th>\\n\",\n       \"      <th>9988</th>\\n\",\n       \"      <th>full_id</th>\\n\",\n       \"      <th>pert_id</th>\\n\",\n       \"      <th>cell_iname</th>\\n\",\n       \"      <th>SMILES</th>\\n\",\n       \"      <th>inchi_key</th>\\n\",\n       \"      <th>compound_aliases</th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       
\"    <tr>\\n\",\n       \"      <th>106</th>\\n\",\n       \"      <td>-3.013548</td>\\n\",\n       \"      <td>0.849998</td>\\n\",\n       \"      <td>-0.471952</td>\\n\",\n       \"      <td>-4.127614</td>\\n\",\n       \"      <td>1.278855</td>\\n\",\n       \"      <td>3.154837</td>\\n\",\n       \"      <td>1.450641</td>\\n\",\n       \"      <td>1.296630</td>\\n\",\n       \"      <td>-0.429045</td>\\n\",\n       \"      <td>0.139617</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-2.103442</td>\\n\",\n       \"      <td>-4.727867</td>\\n\",\n       \"      <td>-1.920775</td>\\n\",\n       \"      <td>-0.160905</td>\\n\",\n       \"      <td>DOSVAL002_HA1E_24H:BRD-A61304759:10</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>HA1E</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>107</th>\\n\",\n       \"      <td>-0.923984</td>\\n\",\n       \"      <td>0.169660</td>\\n\",\n       \"      <td>0.154486</td>\\n\",\n       \"      <td>-1.153042</td>\\n\",\n       \"      <td>-0.351350</td>\\n\",\n       \"      <td>2.678426</td>\\n\",\n       \"      <td>2.441816</td>\\n\",\n       \"      <td>1.726164</td>\\n\",\n       \"      <td>-0.662480</td>\\n\",\n       \"      <td>0.971808</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-1.590330</td>\\n\",\n       \"      <td>-4.548966</td>\\n\",\n       \"      <td>-1.674109</td>\\n\",\n       \"      <td>-2.167018</td>\\n\",\n       \"      <td>DOSVAL002_HA1E_24H:BRD-A61304759:20</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>HA1E</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       
\"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>108</th>\\n\",\n       \"      <td>-2.871151</td>\\n\",\n       \"      <td>1.047984</td>\\n\",\n       \"      <td>-1.013709</td>\\n\",\n       \"      <td>-2.247387</td>\\n\",\n       \"      <td>-0.014152</td>\\n\",\n       \"      <td>2.040751</td>\\n\",\n       \"      <td>2.078447</td>\\n\",\n       \"      <td>0.631643</td>\\n\",\n       \"      <td>-1.353293</td>\\n\",\n       \"      <td>1.209673</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-2.080195</td>\\n\",\n       \"      <td>-3.855181</td>\\n\",\n       \"      <td>-2.286743</td>\\n\",\n       \"      <td>-1.489498</td>\\n\",\n       \"      <td>DOSVAL002_HA1E_24H:BRD-A61304759:5</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>HA1E</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>130</th>\\n\",\n       \"      <td>-1.602455</td>\\n\",\n       \"      <td>0.846125</td>\\n\",\n       \"      <td>-0.184822</td>\\n\",\n       \"      <td>-3.970871</td>\\n\",\n       \"      <td>6.530649</td>\\n\",\n       \"      <td>0.050334</td>\\n\",\n       \"      <td>2.001070</td>\\n\",\n       \"      <td>7.324931</td>\\n\",\n       \"      <td>-0.659349</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-2.898046</td>\\n\",\n       \"      <td>-6.817307</td>\\n\",\n       \"      <td>-3.343323</td>\\n\",\n       \"      <td>-1.189793</td>\\n\",\n       \"      <td>DOSVAL003_HA1E_24H:BRD-A61304759:10</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>HA1E</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      
<td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>131</th>\\n\",\n       \"      <td>1.147850</td>\\n\",\n       \"      <td>0.468300</td>\\n\",\n       \"      <td>3.880000</td>\\n\",\n       \"      <td>0.008400</td>\\n\",\n       \"      <td>-0.811200</td>\\n\",\n       \"      <td>3.177950</td>\\n\",\n       \"      <td>2.405700</td>\\n\",\n       \"      <td>2.352800</td>\\n\",\n       \"      <td>-1.297900</td>\\n\",\n       \"      <td>0.873900</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-2.477500</td>\\n\",\n       \"      <td>-7.021399</td>\\n\",\n       \"      <td>-1.506050</td>\\n\",\n       \"      <td>-2.879350</td>\\n\",\n       \"      <td>DOSVAL003_HA1E_24H:BRD-A61304759:20</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>HA1E</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>428533</th>\\n\",\n       \"      <td>-0.260942</td>\\n\",\n       \"      
<td>-1.251015</td>\\n\",\n       \"      <td>0.363250</td>\\n\",\n       \"      <td>-1.018164</td>\\n\",\n       \"      <td>0.953256</td>\\n\",\n       \"      <td>-0.653346</td>\\n\",\n       \"      <td>0.526826</td>\\n\",\n       \"      <td>-0.318230</td>\\n\",\n       \"      <td>0.755271</td>\\n\",\n       \"      <td>-0.339214</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.248439</td>\\n\",\n       \"      <td>0.307065</td>\\n\",\n       \"      <td>-0.885660</td>\\n\",\n       \"      <td>-0.235281</td>\\n\",\n       \"      <td>PCLB003_HA1E_24H:BRD-K83493571-001-02-7:0.12</td>\\n\",\n       \"      <td>BRD-K83493571</td>\\n\",\n       \"      <td>HA1E</td>\\n\",\n       \"      <td>C[C@@H]1CC(=O)NN=C1c1ccc(I)cc1</td>\\n\",\n       \"      <td>XWOMTTOIGDQNSS-SSDOTTSWSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>428534</th>\\n\",\n       \"      <td>-0.947571</td>\\n\",\n       \"      <td>1.256580</td>\\n\",\n       \"      <td>-0.693020</td>\\n\",\n       \"      <td>-0.355582</td>\\n\",\n       \"      <td>0.079166</td>\\n\",\n       \"      <td>0.242886</td>\\n\",\n       \"      <td>-0.026739</td>\\n\",\n       \"      <td>-0.611576</td>\\n\",\n       \"      <td>-0.457813</td>\\n\",\n       \"      <td>-0.756950</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.364928</td>\\n\",\n       \"      <td>0.388835</td>\\n\",\n       \"      <td>-0.503934</td>\\n\",\n       \"      <td>-1.223861</td>\\n\",\n       \"      <td>PCLB003_HA1E_24H:BRD-K83493571-001-02-7:0.37</td>\\n\",\n       \"      <td>BRD-K83493571</td>\\n\",\n       \"      <td>HA1E</td>\\n\",\n       \"      <td>C[C@@H]1CC(=O)NN=C1c1ccc(I)cc1</td>\\n\",\n       \"      <td>XWOMTTOIGDQNSS-SSDOTTSWSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>428535</th>\\n\",\n       \"      <td>-0.525437</td>\\n\",\n   
    \"      <td>-0.199946</td>\\n\",\n       \"      <td>-0.405681</td>\\n\",\n       \"      <td>-0.694295</td>\\n\",\n       \"      <td>0.153188</td>\\n\",\n       \"      <td>-1.539271</td>\\n\",\n       \"      <td>0.260968</td>\\n\",\n       \"      <td>0.276569</td>\\n\",\n       \"      <td>0.027623</td>\\n\",\n       \"      <td>0.204523</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.416565</td>\\n\",\n       \"      <td>0.177532</td>\\n\",\n       \"      <td>0.545663</td>\\n\",\n       \"      <td>0.016072</td>\\n\",\n       \"      <td>PCLB003_HA1E_24H:BRD-K83493571-001-02-7:1.11</td>\\n\",\n       \"      <td>BRD-K83493571</td>\\n\",\n       \"      <td>HA1E</td>\\n\",\n       \"      <td>C[C@@H]1CC(=O)NN=C1c1ccc(I)cc1</td>\\n\",\n       \"      <td>XWOMTTOIGDQNSS-SSDOTTSWSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>428536</th>\\n\",\n       \"      <td>1.317981</td>\\n\",\n       \"      <td>0.104952</td>\\n\",\n       \"      <td>1.105384</td>\\n\",\n       \"      <td>0.492846</td>\\n\",\n       \"      <td>0.249595</td>\\n\",\n       \"      <td>-0.191166</td>\\n\",\n       \"      <td>0.863211</td>\\n\",\n       \"      <td>-0.026311</td>\\n\",\n       \"      <td>-1.277453</td>\\n\",\n       \"      <td>-0.335337</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.916682</td>\\n\",\n       \"      <td>-0.907757</td>\\n\",\n       \"      <td>0.268371</td>\\n\",\n       \"      <td>0.144862</td>\\n\",\n       \"      <td>PCLB003_HA1E_24H:BRD-K83493571-001-02-7:10</td>\\n\",\n       \"      <td>BRD-K83493571</td>\\n\",\n       \"      <td>HA1E</td>\\n\",\n       \"      <td>C[C@@H]1CC(=O)NN=C1c1ccc(I)cc1</td>\\n\",\n       \"      <td>XWOMTTOIGDQNSS-SSDOTTSWSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>428537</th>\\n\",\n       \"      <td>-0.033225</td>\\n\",\n 
      \"      <td>1.987736</td>\\n\",\n       \"      <td>-0.425942</td>\\n\",\n       \"      <td>-0.621020</td>\\n\",\n       \"      <td>-0.723554</td>\\n\",\n       \"      <td>-0.182915</td>\\n\",\n       \"      <td>-1.136104</td>\\n\",\n       \"      <td>-0.709499</td>\\n\",\n       \"      <td>-0.070721</td>\\n\",\n       \"      <td>1.339338</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.672382</td>\\n\",\n       \"      <td>0.386939</td>\\n\",\n       \"      <td>-0.116340</td>\\n\",\n       \"      <td>-0.755391</td>\\n\",\n       \"      <td>PCLB003_HA1E_24H:BRD-K83493571-001-02-7:3.33</td>\\n\",\n       \"      <td>BRD-K83493571</td>\\n\",\n       \"      <td>HA1E</td>\\n\",\n       \"      <td>C[C@@H]1CC(=O)NN=C1c1ccc(I)cc1</td>\\n\",\n       \"      <td>XWOMTTOIGDQNSS-SSDOTTSWSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"<p>18560 rows × 984 columns</p>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"           10007      1001     10013     10038     10046     10049     10051  \\\\\\n\",\n       \"106    -3.013548  0.849998 -0.471952 -4.127614  1.278855  3.154837  1.450641   \\n\",\n       \"107    -0.923984  0.169660  0.154486 -1.153042 -0.351350  2.678426  2.441816   \\n\",\n       \"108    -2.871151  1.047984 -1.013709 -2.247387 -0.014152  2.040751  2.078447   \\n\",\n       \"130    -1.602455  0.846125 -0.184822 -3.970871  6.530649  0.050334  2.001070   \\n\",\n       \"131     1.147850  0.468300  3.880000  0.008400 -0.811200  3.177950  2.405700   \\n\",\n       \"...          ...       ...       ...       ...       ...       ...       ...   
\\n\",\n       \"428533 -0.260942 -1.251015  0.363250 -1.018164  0.953256 -0.653346  0.526826   \\n\",\n       \"428534 -0.947571  1.256580 -0.693020 -0.355582  0.079166  0.242886 -0.026739   \\n\",\n       \"428535 -0.525437 -0.199946 -0.405681 -0.694295  0.153188 -1.539271  0.260968   \\n\",\n       \"428536  1.317981  0.104952  1.105384  0.492846  0.249595 -0.191166  0.863211   \\n\",\n       \"428537 -0.033225  1.987736 -0.425942 -0.621020 -0.723554 -0.182915 -1.136104   \\n\",\n       \"\\n\",\n       \"           10057     10058      10059  ...      9943      9961       998  \\\\\\n\",\n       \"106     1.296630 -0.429045   0.139617  ... -2.103442 -4.727867 -1.920775   \\n\",\n       \"107     1.726164 -0.662480   0.971808  ... -1.590330 -4.548966 -1.674109   \\n\",\n       \"108     0.631643 -1.353293   1.209673  ... -2.080195 -3.855181 -2.286743   \\n\",\n       \"130     7.324931 -0.659349 -10.000000  ... -2.898046 -6.817307 -3.343323   \\n\",\n       \"131     2.352800 -1.297900   0.873900  ... -2.477500 -7.021399 -1.506050   \\n\",\n       \"...          ...       ...        ...  ...       ...       ...       ...   \\n\",\n       \"428533 -0.318230  0.755271  -0.339214  ...  0.248439  0.307065 -0.885660   \\n\",\n       \"428534 -0.611576 -0.457813  -0.756950  ... -0.364928  0.388835 -0.503934   \\n\",\n       \"428535  0.276569  0.027623   0.204523  ...  0.416565  0.177532  0.545663   \\n\",\n       \"428536 -0.026311 -1.277453  -0.335337  ...  0.916682 -0.907757  0.268371   \\n\",\n       \"428537 -0.709499 -0.070721   1.339338  ... 
-0.672382  0.386939 -0.116340   \\n\",\n       \"\\n\",\n       \"            9988                                       full_id        pert_id  \\\\\\n\",\n       \"106    -0.160905           DOSVAL002_HA1E_24H:BRD-A61304759:10  BRD-A61304759   \\n\",\n       \"107    -2.167018           DOSVAL002_HA1E_24H:BRD-A61304759:20  BRD-A61304759   \\n\",\n       \"108    -1.489498            DOSVAL002_HA1E_24H:BRD-A61304759:5  BRD-A61304759   \\n\",\n       \"130    -1.189793           DOSVAL003_HA1E_24H:BRD-A61304759:10  BRD-A61304759   \\n\",\n       \"131    -2.879350           DOSVAL003_HA1E_24H:BRD-A61304759:20  BRD-A61304759   \\n\",\n       \"...          ...                                           ...            ...   \\n\",\n       \"428533 -0.235281  PCLB003_HA1E_24H:BRD-K83493571-001-02-7:0.12  BRD-K83493571   \\n\",\n       \"428534 -1.223861  PCLB003_HA1E_24H:BRD-K83493571-001-02-7:0.37  BRD-K83493571   \\n\",\n       \"428535  0.016072  PCLB003_HA1E_24H:BRD-K83493571-001-02-7:1.11  BRD-K83493571   \\n\",\n       \"428536  0.144862    PCLB003_HA1E_24H:BRD-K83493571-001-02-7:10  BRD-K83493571   \\n\",\n       \"428537 -0.755391  PCLB003_HA1E_24H:BRD-K83493571-001-02-7:3.33  BRD-K83493571   \\n\",\n       \"\\n\",\n       \"        cell_iname                                             SMILES  \\\\\\n\",\n       \"106           HA1E  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"107           HA1E  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"108           HA1E  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"130           HA1E  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"131           HA1E  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"...            ...                                                ...   
\\n\",\n       \"428533        HA1E                     C[C@@H]1CC(=O)NN=C1c1ccc(I)cc1   \\n\",\n       \"428534        HA1E                     C[C@@H]1CC(=O)NN=C1c1ccc(I)cc1   \\n\",\n       \"428535        HA1E                     C[C@@H]1CC(=O)NN=C1c1ccc(I)cc1   \\n\",\n       \"428536        HA1E                     C[C@@H]1CC(=O)NN=C1c1ccc(I)cc1   \\n\",\n       \"428537        HA1E                     C[C@@H]1CC(=O)NN=C1c1ccc(I)cc1   \\n\",\n       \"\\n\",\n       \"                          inchi_key  compound_aliases  \\n\",\n       \"106     AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"107     AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"108     AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"130     AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"131     AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"...                             ...               ...  \\n\",\n       \"428533  XWOMTTOIGDQNSS-SSDOTTSWSA-N               NaN  \\n\",\n       \"428534  XWOMTTOIGDQNSS-SSDOTTSWSA-N               NaN  \\n\",\n       \"428535  XWOMTTOIGDQNSS-SSDOTTSWSA-N               NaN  \\n\",\n       \"428536  XWOMTTOIGDQNSS-SSDOTTSWSA-N               NaN  \\n\",\n       \"428537  XWOMTTOIGDQNSS-SSDOTTSWSA-N               NaN  \\n\",\n       \"\\n\",\n       \"[18560 rows x 984 columns]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"VCAP\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    
.dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>10007</th>\\n\",\n       \"      <th>1001</th>\\n\",\n       \"      <th>10013</th>\\n\",\n       \"      <th>10038</th>\\n\",\n       \"      <th>10046</th>\\n\",\n       \"      <th>10049</th>\\n\",\n       \"      <th>10051</th>\\n\",\n       \"      <th>10057</th>\\n\",\n       \"      <th>10058</th>\\n\",\n       \"      <th>10059</th>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <th>9943</th>\\n\",\n       \"      <th>9961</th>\\n\",\n       \"      <th>998</th>\\n\",\n       \"      <th>9988</th>\\n\",\n       \"      <th>full_id</th>\\n\",\n       \"      <th>pert_id</th>\\n\",\n       \"      <th>cell_iname</th>\\n\",\n       \"      <th>SMILES</th>\\n\",\n       \"      <th>inchi_key</th>\\n\",\n       \"      <th>compound_aliases</th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>220</th>\\n\",\n       \"      <td>0.342880</td>\\n\",\n       \"      <td>0.229699</td>\\n\",\n       \"      <td>0.662968</td>\\n\",\n       \"      <td>0.112721</td>\\n\",\n       \"      <td>0.031991</td>\\n\",\n       \"      <td>0.223551</td>\\n\",\n       \"      <td>-0.312346</td>\\n\",\n       \"      <td>0.603852</td>\\n\",\n       \"      <td>0.173999</td>\\n\",\n       \"      <td>-0.088876</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.354263</td>\\n\",\n       \"      <td>-0.067259</td>\\n\",\n       \"      <td>0.125386</td>\\n\",\n       \"      <td>-0.309967</td>\\n\",\n       \"      <td>ERG005_VCAP_24H:BRD-A61304759-001-01-0:1</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>VCAP</td>\\n\",\n       \"     
 <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>221</th>\\n\",\n       \"      <td>-1.243966</td>\\n\",\n       \"      <td>0.884699</td>\\n\",\n       \"      <td>0.361250</td>\\n\",\n       \"      <td>-0.877446</td>\\n\",\n       \"      <td>0.137078</td>\\n\",\n       \"      <td>1.132378</td>\\n\",\n       \"      <td>-0.255957</td>\\n\",\n       \"      <td>1.240245</td>\\n\",\n       \"      <td>-1.849554</td>\\n\",\n       \"      <td>0.469157</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.528902</td>\\n\",\n       \"      <td>0.165169</td>\\n\",\n       \"      <td>-2.121547</td>\\n\",\n       \"      <td>-0.109978</td>\\n\",\n       \"      <td>ERG005_VCAP_24H:BRD-A61304759-001-01-0:20</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>VCAP</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>222</th>\\n\",\n       \"      <td>0.054080</td>\\n\",\n       \"      <td>0.722968</td>\\n\",\n       \"      <td>1.164083</td>\\n\",\n       \"      <td>-0.604267</td>\\n\",\n       \"      <td>0.272366</td>\\n\",\n       \"      <td>0.123806</td>\\n\",\n       \"      <td>-0.787444</td>\\n\",\n       \"      <td>-0.049253</td>\\n\",\n       \"      <td>-0.339428</td>\\n\",\n       \"      <td>-0.474537</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.271633</td>\\n\",\n       \"      <td>0.196106</td>\\n\",\n       \"      <td>0.786506</td>\\n\",\n       \"      <td>-0.593048</td>\\n\",\n       \"      <td>ERG005_VCAP_24H:BRD-A61304759-001-01-0:5</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      
<td>VCAP</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>223</th>\\n\",\n       \"      <td>1.446668</td>\\n\",\n       \"      <td>-0.646172</td>\\n\",\n       \"      <td>0.734950</td>\\n\",\n       \"      <td>-0.025302</td>\\n\",\n       \"      <td>-0.581938</td>\\n\",\n       \"      <td>0.221347</td>\\n\",\n       \"      <td>-0.812867</td>\\n\",\n       \"      <td>0.183442</td>\\n\",\n       \"      <td>0.291759</td>\\n\",\n       \"      <td>0.328828</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.038288</td>\\n\",\n       \"      <td>-0.052247</td>\\n\",\n       \"      <td>1.144971</td>\\n\",\n       \"      <td>0.232041</td>\\n\",\n       \"      <td>ERG005_VCAP_48H:BRD-A61304759-001-01-0:1</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>VCAP</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>224</th>\\n\",\n       \"      <td>-1.647826</td>\\n\",\n       \"      <td>0.715185</td>\\n\",\n       \"      <td>-1.479616</td>\\n\",\n       \"      <td>-2.191869</td>\\n\",\n       \"      <td>0.867779</td>\\n\",\n       \"      <td>1.617837</td>\\n\",\n       \"      <td>-3.098122</td>\\n\",\n       \"      <td>0.713327</td>\\n\",\n       \"      <td>-5.324870</td>\\n\",\n       \"      <td>-0.913241</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-1.650342</td>\\n\",\n       \"      <td>0.283260</td>\\n\",\n       \"      <td>-0.322341</td>\\n\",\n       \"      <td>-0.007010</td>\\n\",\n       \"      <td>ERG005_VCAP_48H:BRD-A61304759-001-01-0:20</td>\\n\",\n       \"      
<td>BRD-A61304759</td>\\n\",\n       \"      <td>VCAP</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>392266</th>\\n\",\n       \"      <td>-0.192500</td>\\n\",\n       \"      <td>0.508600</td>\\n\",\n       \"      <td>2.958000</td>\\n\",\n       \"      <td>-1.932200</td>\\n\",\n       \"      <td>0.625200</td>\\n\",\n       \"      <td>4.463600</td>\\n\",\n       \"      <td>-2.164600</td>\\n\",\n       \"      <td>2.193400</td>\\n\",\n       \"      <td>-8.490000</td>\\n\",\n       \"      <td>1.876900</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>2.317700</td>\\n\",\n       \"      <td>0.059300</td>\\n\",\n       \"      <td>0.172300</td>\\n\",\n       \"      <td>0.411000</td>\\n\",\n       \"      <td>ERG021_VCAP_24H:BRD-K54606188:0.37</td>\\n\",\n       \"      <td>BRD-K54606188</td>\\n\",\n       \"      <td>VCAP</td>\\n\",\n       \"      <td>Cc1sc-2c(c1C)C(=N[C@@H](CC(=O)OC(C)(C)C)c1nnc(...</td>\\n\",\n       \"      
<td>DNVXATUJJDPFDM-KRWDZBQOSA-N</td>\\n\",\n       \"      <td>JQ1-(+)</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>392267</th>\\n\",\n       \"      <td>0.062300</td>\\n\",\n       \"      <td>2.650200</td>\\n\",\n       \"      <td>4.473000</td>\\n\",\n       \"      <td>-2.151800</td>\\n\",\n       \"      <td>1.434600</td>\\n\",\n       \"      <td>4.882200</td>\\n\",\n       \"      <td>-2.347200</td>\\n\",\n       \"      <td>0.762400</td>\\n\",\n       \"      <td>-7.368100</td>\\n\",\n       \"      <td>2.178000</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>2.935600</td>\\n\",\n       \"      <td>-1.046000</td>\\n\",\n       \"      <td>0.059300</td>\\n\",\n       \"      <td>1.767200</td>\\n\",\n       \"      <td>ERG021_VCAP_24H:BRD-K54606188:1.11</td>\\n\",\n       \"      <td>BRD-K54606188</td>\\n\",\n       \"      <td>VCAP</td>\\n\",\n       \"      <td>Cc1sc-2c(c1C)C(=N[C@@H](CC(=O)OC(C)(C)C)c1nnc(...</td>\\n\",\n       \"      <td>DNVXATUJJDPFDM-KRWDZBQOSA-N</td>\\n\",\n       \"      <td>JQ1-(+)</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>392268</th>\\n\",\n       \"      <td>-2.960600</td>\\n\",\n       \"      <td>5.703700</td>\\n\",\n       \"      <td>-1.779000</td>\\n\",\n       \"      <td>-2.988500</td>\\n\",\n       \"      <td>0.603800</td>\\n\",\n       \"      <td>3.217200</td>\\n\",\n       \"      <td>-6.156700</td>\\n\",\n       \"      <td>1.033800</td>\\n\",\n       \"      <td>0.648600</td>\\n\",\n       \"      <td>-1.277400</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>1.282700</td>\\n\",\n       \"      <td>-0.674500</td>\\n\",\n       \"      <td>-2.119500</td>\\n\",\n       \"      <td>1.252500</td>\\n\",\n       \"      <td>ERG021_VCAP_24H:BRD-K54606188:10</td>\\n\",\n       \"      <td>BRD-K54606188</td>\\n\",\n       \"      <td>VCAP</td>\\n\",\n       \"      
<td>Cc1sc-2c(c1C)C(=N[C@@H](CC(=O)OC(C)(C)C)c1nnc(...</td>\\n\",\n       \"      <td>DNVXATUJJDPFDM-KRWDZBQOSA-N</td>\\n\",\n       \"      <td>JQ1-(+)</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>392269</th>\\n\",\n       \"      <td>1.223100</td>\\n\",\n       \"      <td>2.595200</td>\\n\",\n       \"      <td>4.695200</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>10.000000</td>\\n\",\n       \"      <td>4.339800</td>\\n\",\n       \"      <td>-2.816400</td>\\n\",\n       \"      <td>-0.030300</td>\\n\",\n       \"      <td>-7.763000</td>\\n\",\n       \"      <td>2.217900</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>4.241800</td>\\n\",\n       \"      <td>2.065300</td>\\n\",\n       \"      <td>-0.366800</td>\\n\",\n       \"      <td>1.030100</td>\\n\",\n       \"      <td>ERG021_VCAP_24H:BRD-K54606188:3.33</td>\\n\",\n       \"      <td>BRD-K54606188</td>\\n\",\n       \"      <td>VCAP</td>\\n\",\n       \"      <td>Cc1sc-2c(c1C)C(=N[C@@H](CC(=O)OC(C)(C)C)c1nnc(...</td>\\n\",\n       \"      <td>DNVXATUJJDPFDM-KRWDZBQOSA-N</td>\\n\",\n       \"      <td>JQ1-(+)</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>392325</th>\\n\",\n       \"      <td>-0.402300</td>\\n\",\n       \"      <td>-0.492900</td>\\n\",\n       \"      <td>-0.258800</td>\\n\",\n       \"      <td>-0.648900</td>\\n\",\n       \"      <td>-0.217500</td>\\n\",\n       \"      <td>-0.741000</td>\\n\",\n       \"      <td>0.212600</td>\\n\",\n       \"      <td>-1.074600</td>\\n\",\n       \"      <td>-0.018100</td>\\n\",\n       \"      <td>-0.457000</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.859500</td>\\n\",\n       \"      <td>1.115800</td>\\n\",\n       \"      <td>0.142100</td>\\n\",\n       \"      <td>0.999000</td>\\n\",\n       \"      <td>ERG020_VCAP_24H:BRD-K55877261:10.1</td>\\n\",\n       \"      <td>BRD-K55877261</td>\\n\",\n       \"      
<td>VCAP</td>\\n\",\n       \"      <td>C[C@@H](CO)N1C[C@H](C)[C@@H](CN(C)Cc2ccccn2)Oc...</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"<p>44452 rows × 984 columns</p>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"           10007      1001     10013      10038      10046     10049  \\\\\\n\",\n       \"220     0.342880  0.229699  0.662968   0.112721   0.031991  0.223551   \\n\",\n       \"221    -1.243966  0.884699  0.361250  -0.877446   0.137078  1.132378   \\n\",\n       \"222     0.054080  0.722968  1.164083  -0.604267   0.272366  0.123806   \\n\",\n       \"223     1.446668 -0.646172  0.734950  -0.025302  -0.581938  0.221347   \\n\",\n       \"224    -1.647826  0.715185 -1.479616  -2.191869   0.867779  1.617837   \\n\",\n       \"...          ...       ...       ...        ...        ...       ...   \\n\",\n       \"392266 -0.192500  0.508600  2.958000  -1.932200   0.625200  4.463600   \\n\",\n       \"392267  0.062300  2.650200  4.473000  -2.151800   1.434600  4.882200   \\n\",\n       \"392268 -2.960600  5.703700 -1.779000  -2.988500   0.603800  3.217200   \\n\",\n       \"392269  1.223100  2.595200  4.695200 -10.000000  10.000000  4.339800   \\n\",\n       \"392325 -0.402300 -0.492900 -0.258800  -0.648900  -0.217500 -0.741000   \\n\",\n       \"\\n\",\n       \"           10051     10057     10058     10059  ...      9943      9961  \\\\\\n\",\n       \"220    -0.312346  0.603852  0.173999 -0.088876  ... -0.354263 -0.067259   \\n\",\n       \"221    -0.255957  1.240245 -1.849554  0.469157  ... -0.528902  0.165169   \\n\",\n       \"222    -0.787444 -0.049253 -0.339428 -0.474537  ... -0.271633  0.196106   \\n\",\n       \"223    -0.812867  0.183442  0.291759  0.328828  ...  0.038288 -0.052247   \\n\",\n       \"224    -3.098122  0.713327 -5.324870 -0.913241  ... -1.650342  0.283260   \\n\",\n       \"...          
...       ...       ...       ...  ...       ...       ...   \\n\",\n       \"392266 -2.164600  2.193400 -8.490000  1.876900  ...  2.317700  0.059300   \\n\",\n       \"392267 -2.347200  0.762400 -7.368100  2.178000  ...  2.935600 -1.046000   \\n\",\n       \"392268 -6.156700  1.033800  0.648600 -1.277400  ...  1.282700 -0.674500   \\n\",\n       \"392269 -2.816400 -0.030300 -7.763000  2.217900  ...  4.241800  2.065300   \\n\",\n       \"392325  0.212600 -1.074600 -0.018100 -0.457000  ...  0.859500  1.115800   \\n\",\n       \"\\n\",\n       \"             998      9988                                    full_id  \\\\\\n\",\n       \"220     0.125386 -0.309967   ERG005_VCAP_24H:BRD-A61304759-001-01-0:1   \\n\",\n       \"221    -2.121547 -0.109978  ERG005_VCAP_24H:BRD-A61304759-001-01-0:20   \\n\",\n       \"222     0.786506 -0.593048   ERG005_VCAP_24H:BRD-A61304759-001-01-0:5   \\n\",\n       \"223     1.144971  0.232041   ERG005_VCAP_48H:BRD-A61304759-001-01-0:1   \\n\",\n       \"224    -0.322341 -0.007010  ERG005_VCAP_48H:BRD-A61304759-001-01-0:20   \\n\",\n       \"...          ...       ...                                        ...   \\n\",\n       \"392266  0.172300  0.411000         ERG021_VCAP_24H:BRD-K54606188:0.37   \\n\",\n       \"392267  0.059300  1.767200         ERG021_VCAP_24H:BRD-K54606188:1.11   \\n\",\n       \"392268 -2.119500  1.252500           ERG021_VCAP_24H:BRD-K54606188:10   \\n\",\n       \"392269 -0.366800  1.030100         ERG021_VCAP_24H:BRD-K54606188:3.33   \\n\",\n       \"392325  0.142100  0.999000         ERG020_VCAP_24H:BRD-K55877261:10.1   \\n\",\n       \"\\n\",\n       \"              pert_id  cell_iname  \\\\\\n\",\n       \"220     BRD-A61304759        VCAP   \\n\",\n       \"221     BRD-A61304759        VCAP   \\n\",\n       \"222     BRD-A61304759        VCAP   \\n\",\n       \"223     BRD-A61304759        VCAP   \\n\",\n       \"224     BRD-A61304759        VCAP   \\n\",\n       \"...               ...         ...   
\\n\",\n       \"392266  BRD-K54606188        VCAP   \\n\",\n       \"392267  BRD-K54606188        VCAP   \\n\",\n       \"392268  BRD-K54606188        VCAP   \\n\",\n       \"392269  BRD-K54606188        VCAP   \\n\",\n       \"392325  BRD-K55877261        VCAP   \\n\",\n       \"\\n\",\n       \"                                                   SMILES  \\\\\\n\",\n       \"220     COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"221     COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"222     COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"223     COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"224     COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"...                                                   ...   \\n\",\n       \"392266  Cc1sc-2c(c1C)C(=N[C@@H](CC(=O)OC(C)(C)C)c1nnc(...   \\n\",\n       \"392267  Cc1sc-2c(c1C)C(=N[C@@H](CC(=O)OC(C)(C)C)c1nnc(...   \\n\",\n       \"392268  Cc1sc-2c(c1C)C(=N[C@@H](CC(=O)OC(C)(C)C)c1nnc(...   \\n\",\n       \"392269  Cc1sc-2c(c1C)C(=N[C@@H](CC(=O)OC(C)(C)C)c1nnc(...   \\n\",\n       \"392325  C[C@@H](CO)N1C[C@H](C)[C@@H](CN(C)Cc2ccccn2)Oc...   \\n\",\n       \"\\n\",\n       \"                          inchi_key  compound_aliases  \\n\",\n       \"220     AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"221     AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"222     AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"223     AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"224     AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"...                             ...               ...  
\\n\",\n       \"392266  DNVXATUJJDPFDM-KRWDZBQOSA-N           JQ1-(+)  \\n\",\n       \"392267  DNVXATUJJDPFDM-KRWDZBQOSA-N           JQ1-(+)  \\n\",\n       \"392268  DNVXATUJJDPFDM-KRWDZBQOSA-N           JQ1-(+)  \\n\",\n       \"392269  DNVXATUJJDPFDM-KRWDZBQOSA-N           JQ1-(+)  \\n\",\n       \"392325                          NaN               NaN  \\n\",\n       \"\\n\",\n       \"[44452 rows x 984 columns]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"A549\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>10007</th>\\n\",\n       \"      <th>1001</th>\\n\",\n       \"      <th>10013</th>\\n\",\n       \"      <th>10038</th>\\n\",\n       \"      <th>10046</th>\\n\",\n       \"      <th>10049</th>\\n\",\n       \"      <th>10051</th>\\n\",\n       \"      <th>10057</th>\\n\",\n       \"      <th>10058</th>\\n\",\n       \"      <th>10059</th>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <th>9943</th>\\n\",\n       \"      <th>9961</th>\\n\",\n       \"      <th>998</th>\\n\",\n       \"      <th>9988</th>\\n\",\n       \"      <th>full_id</th>\\n\",\n       \"      <th>pert_id</th>\\n\",\n       \"      <th>cell_iname</th>\\n\",\n       
\"      <th>SMILES</th>\\n\",\n       \"      <th>inchi_key</th>\\n\",\n       \"      <th>compound_aliases</th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>6</th>\\n\",\n       \"      <td>0.267136</td>\\n\",\n       \"      <td>-0.456491</td>\\n\",\n       \"      <td>2.165093</td>\\n\",\n       \"      <td>0.229504</td>\\n\",\n       \"      <td>-0.124224</td>\\n\",\n       \"      <td>1.576591</td>\\n\",\n       \"      <td>-1.828272</td>\\n\",\n       \"      <td>-0.225401</td>\\n\",\n       \"      <td>1.707536</td>\\n\",\n       \"      <td>-0.433627</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.181523</td>\\n\",\n       \"      <td>-1.382612</td>\\n\",\n       \"      <td>-1.131088</td>\\n\",\n       \"      <td>1.328648</td>\\n\",\n       \"      <td>ABY001_A549_XH:BRD-A61304759:0.625:24</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>A549</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>7</th>\\n\",\n       \"      <td>-0.324586</td>\\n\",\n       \"      <td>0.342067</td>\\n\",\n       \"      <td>0.473313</td>\\n\",\n       \"      <td>-1.177638</td>\\n\",\n       \"      <td>0.367503</td>\\n\",\n       \"      <td>0.820698</td>\\n\",\n       \"      <td>0.339237</td>\\n\",\n       \"      <td>0.549441</td>\\n\",\n       \"      <td>0.062816</td>\\n\",\n       \"      <td>-1.037903</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.805294</td>\\n\",\n       \"      <td>-0.009581</td>\\n\",\n       \"      <td>-0.667608</td>\\n\",\n       \"      <td>-1.633135</td>\\n\",\n       \"      <td>ABY001_A549_XH:BRD-A61304759:0.625:3</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      
<td>A549</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>8</th>\\n\",\n       \"      <td>-0.117392</td>\\n\",\n       \"      <td>0.200503</td>\\n\",\n       \"      <td>3.915682</td>\\n\",\n       \"      <td>0.172298</td>\\n\",\n       \"      <td>-0.620917</td>\\n\",\n       \"      <td>1.896570</td>\\n\",\n       \"      <td>-0.351663</td>\\n\",\n       \"      <td>2.388200</td>\\n\",\n       \"      <td>2.241555</td>\\n\",\n       \"      <td>0.882148</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.274837</td>\\n\",\n       \"      <td>-2.573916</td>\\n\",\n       \"      <td>-1.041967</td>\\n\",\n       \"      <td>0.582418</td>\\n\",\n       \"      <td>ABY001_A549_XH:BRD-A61304759:10:24</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>A549</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>9</th>\\n\",\n       \"      <td>-0.735243</td>\\n\",\n       \"      <td>0.217042</td>\\n\",\n       \"      <td>-0.535154</td>\\n\",\n       \"      <td>-0.395466</td>\\n\",\n       \"      <td>4.530125</td>\\n\",\n       \"      <td>0.816603</td>\\n\",\n       \"      <td>0.998695</td>\\n\",\n       \"      <td>0.983210</td>\\n\",\n       \"      <td>0.226104</td>\\n\",\n       \"      <td>0.306891</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-1.253998</td>\\n\",\n       \"      <td>-0.720851</td>\\n\",\n       \"      <td>-0.501165</td>\\n\",\n       \"      <td>-1.326902</td>\\n\",\n       \"      <td>ABY001_A549_XH:BRD-A61304759:10:3</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n 
      \"      <td>A549</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>10</th>\\n\",\n       \"      <td>0.703145</td>\\n\",\n       \"      <td>-0.459059</td>\\n\",\n       \"      <td>2.882389</td>\\n\",\n       \"      <td>-0.590942</td>\\n\",\n       \"      <td>-0.809936</td>\\n\",\n       \"      <td>1.652521</td>\\n\",\n       \"      <td>-1.464148</td>\\n\",\n       \"      <td>1.901619</td>\\n\",\n       \"      <td>1.833337</td>\\n\",\n       \"      <td>-0.166399</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.108275</td>\\n\",\n       \"      <td>-3.197631</td>\\n\",\n       \"      <td>0.172597</td>\\n\",\n       \"      <td>0.360270</td>\\n\",\n       \"      <td>ABY001_A549_XH:BRD-A61304759:2.5:24</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>A549</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      
<td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>429339</th>\\n\",\n       \"      <td>-0.160872</td>\\n\",\n       \"      <td>-0.734598</td>\\n\",\n       \"      <td>1.037349</td>\\n\",\n       \"      <td>0.409271</td>\\n\",\n       \"      <td>0.470073</td>\\n\",\n       \"      <td>-1.236599</td>\\n\",\n       \"      <td>0.196407</td>\\n\",\n       \"      <td>-0.355948</td>\\n\",\n       \"      <td>-0.174792</td>\\n\",\n       \"      <td>-0.057376</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.337520</td>\\n\",\n       \"      <td>0.789028</td>\\n\",\n       \"      <td>0.747753</td>\\n\",\n       \"      <td>-0.793304</td>\\n\",\n       \"      <td>RAD001_A549_6H:BRD-K95986273-001-01-9:0.1235</td>\\n\",\n       \"      <td>BRD-K95986273</td>\\n\",\n       \"      <td>A549</td>\\n\",\n       \"      <td>Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O</td>\\n\",\n       \"      <td>RBCPBVNYTIFDIR-RIYZIHGNSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>429340</th>\\n\",\n       \"      <td>-0.732872</td>\\n\",\n       \"      <td>1.154789</td>\\n\",\n       \"      <td>-0.203751</td>\\n\",\n       \"      <td>-3.644891</td>\\n\",\n       \"      <td>-0.587741</td>\\n\",\n       \"      <td>-0.480639</td>\\n\",\n       \"      <td>0.555011</td>\\n\",\n       \"      <td>-0.190167</td>\\n\",\n       \"      <td>0.048464</td>\\n\",\n       \"      <td>0.048362</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.742586</td>\\n\",\n       \"      <td>0.149100</td>\\n\",\n       \"      <td>0.208475</td>\\n\",\n       \"      <td>1.110267</td>\\n\",\n       \"      <td>RAD001_A549_6H:BRD-K95986273-001-01-9:0.3704</td>\\n\",\n       \"      <td>BRD-K95986273</td>\\n\",\n       \"      <td>A549</td>\\n\",\n       \"      <td>Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O</td>\\n\",\n       \"      
<td>RBCPBVNYTIFDIR-RIYZIHGNSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>429341</th>\\n\",\n       \"      <td>0.777660</td>\\n\",\n       \"      <td>-1.287675</td>\\n\",\n       \"      <td>1.083690</td>\\n\",\n       \"      <td>0.470522</td>\\n\",\n       \"      <td>-0.752226</td>\\n\",\n       \"      <td>0.692116</td>\\n\",\n       \"      <td>0.607769</td>\\n\",\n       \"      <td>-0.312699</td>\\n\",\n       \"      <td>0.636528</td>\\n\",\n       \"      <td>-0.097325</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.280083</td>\\n\",\n       \"      <td>0.451212</td>\\n\",\n       \"      <td>0.341323</td>\\n\",\n       \"      <td>-1.047601</td>\\n\",\n       \"      <td>RAD001_A549_6H:BRD-K95986273-001-01-9:1.1111</td>\\n\",\n       \"      <td>BRD-K95986273</td>\\n\",\n       \"      <td>A549</td>\\n\",\n       \"      <td>Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O</td>\\n\",\n       \"      <td>RBCPBVNYTIFDIR-RIYZIHGNSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>429342</th>\\n\",\n       \"      <td>-1.804538</td>\\n\",\n       \"      <td>1.066838</td>\\n\",\n       \"      <td>-1.099731</td>\\n\",\n       \"      <td>-4.657009</td>\\n\",\n       \"      <td>0.519173</td>\\n\",\n       \"      <td>1.021468</td>\\n\",\n       \"      <td>-1.191034</td>\\n\",\n       \"      <td>-0.361392</td>\\n\",\n       \"      <td>-0.004369</td>\\n\",\n       \"      <td>0.618981</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-1.331334</td>\\n\",\n       \"      <td>-0.257574</td>\\n\",\n       \"      <td>-0.680384</td>\\n\",\n       \"      <td>0.707249</td>\\n\",\n       \"      <td>RAD001_A549_6H:BRD-K95986273-001-01-9:10</td>\\n\",\n       \"      <td>BRD-K95986273</td>\\n\",\n       \"      <td>A549</td>\\n\",\n       \"      
<td>Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O</td>\\n\",\n       \"      <td>RBCPBVNYTIFDIR-RIYZIHGNSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>429343</th>\\n\",\n       \"      <td>-0.038218</td>\\n\",\n       \"      <td>-0.778984</td>\\n\",\n       \"      <td>0.328021</td>\\n\",\n       \"      <td>0.310619</td>\\n\",\n       \"      <td>-0.354523</td>\\n\",\n       \"      <td>0.761366</td>\\n\",\n       \"      <td>0.896427</td>\\n\",\n       \"      <td>-0.306524</td>\\n\",\n       \"      <td>0.416208</td>\\n\",\n       \"      <td>-0.348911</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.107414</td>\\n\",\n       \"      <td>0.642712</td>\\n\",\n       \"      <td>0.483057</td>\\n\",\n       \"      <td>1.250445</td>\\n\",\n       \"      <td>RAD001_A549_6H:BRD-K95986273-001-01-9:3.3333</td>\\n\",\n       \"      <td>BRD-K95986273</td>\\n\",\n       \"      <td>A549</td>\\n\",\n       \"      <td>Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O</td>\\n\",\n       \"      <td>RBCPBVNYTIFDIR-RIYZIHGNSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"<p>38070 rows × 984 columns</p>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"           10007      1001     10013     10038     10046     10049     10051  \\\\\\n\",\n       \"6       0.267136 -0.456491  2.165093  0.229504 -0.124224  1.576591 -1.828272   \\n\",\n       \"7      -0.324586  0.342067  0.473313 -1.177638  0.367503  0.820698  0.339237   \\n\",\n       \"8      -0.117392  0.200503  3.915682  0.172298 -0.620917  1.896570 -0.351663   \\n\",\n       \"9      -0.735243  0.217042 -0.535154 -0.395466  4.530125  0.816603  0.998695   \\n\",\n       \"10      0.703145 -0.459059  2.882389 -0.590942 -0.809936  1.652521 -1.464148   \\n\",\n       \"...          ...       ...       ...       ...       ...       ...      
 ...   \\n\",\n       \"429339 -0.160872 -0.734598  1.037349  0.409271  0.470073 -1.236599  0.196407   \\n\",\n       \"429340 -0.732872  1.154789 -0.203751 -3.644891 -0.587741 -0.480639  0.555011   \\n\",\n       \"429341  0.777660 -1.287675  1.083690  0.470522 -0.752226  0.692116  0.607769   \\n\",\n       \"429342 -1.804538  1.066838 -1.099731 -4.657009  0.519173  1.021468 -1.191034   \\n\",\n       \"429343 -0.038218 -0.778984  0.328021  0.310619 -0.354523  0.761366  0.896427   \\n\",\n       \"\\n\",\n       \"           10057     10058     10059  ...      9943      9961       998  \\\\\\n\",\n       \"6      -0.225401  1.707536 -0.433627  ... -0.181523 -1.382612 -1.131088   \\n\",\n       \"7       0.549441  0.062816 -1.037903  ... -0.805294 -0.009581 -0.667608   \\n\",\n       \"8       2.388200  2.241555  0.882148  ... -0.274837 -2.573916 -1.041967   \\n\",\n       \"9       0.983210  0.226104  0.306891  ... -1.253998 -0.720851 -0.501165   \\n\",\n       \"10      1.901619  1.833337 -0.166399  ... -0.108275 -3.197631  0.172597   \\n\",\n       \"...          ...       ...       ...  ...       ...       ...       ...   \\n\",\n       \"429339 -0.355948 -0.174792 -0.057376  ...  0.337520  0.789028  0.747753   \\n\",\n       \"429340 -0.190167  0.048464  0.048362  ... -0.742586  0.149100  0.208475   \\n\",\n       \"429341 -0.312699  0.636528 -0.097325  ...  0.280083  0.451212  0.341323   \\n\",\n       \"429342 -0.361392 -0.004369  0.618981  ... -1.331334 -0.257574 -0.680384   \\n\",\n       \"429343 -0.306524  0.416208 -0.348911  ... 
-0.107414  0.642712  0.483057   \\n\",\n       \"\\n\",\n       \"            9988                                       full_id        pert_id  \\\\\\n\",\n       \"6       1.328648         ABY001_A549_XH:BRD-A61304759:0.625:24  BRD-A61304759   \\n\",\n       \"7      -1.633135          ABY001_A549_XH:BRD-A61304759:0.625:3  BRD-A61304759   \\n\",\n       \"8       0.582418            ABY001_A549_XH:BRD-A61304759:10:24  BRD-A61304759   \\n\",\n       \"9      -1.326902             ABY001_A549_XH:BRD-A61304759:10:3  BRD-A61304759   \\n\",\n       \"10      0.360270           ABY001_A549_XH:BRD-A61304759:2.5:24  BRD-A61304759   \\n\",\n       \"...          ...                                           ...            ...   \\n\",\n       \"429339 -0.793304  RAD001_A549_6H:BRD-K95986273-001-01-9:0.1235  BRD-K95986273   \\n\",\n       \"429340  1.110267  RAD001_A549_6H:BRD-K95986273-001-01-9:0.3704  BRD-K95986273   \\n\",\n       \"429341 -1.047601  RAD001_A549_6H:BRD-K95986273-001-01-9:1.1111  BRD-K95986273   \\n\",\n       \"429342  0.707249      RAD001_A549_6H:BRD-K95986273-001-01-9:10  BRD-K95986273   \\n\",\n       \"429343  1.250445  RAD001_A549_6H:BRD-K95986273-001-01-9:3.3333  BRD-K95986273   \\n\",\n       \"\\n\",\n       \"        cell_iname                                             SMILES  \\\\\\n\",\n       \"6             A549  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"7             A549  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"8             A549  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"9             A549  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"10            A549  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"...            ...                                                ...   
\\n\",\n       \"429339        A549             Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O   \\n\",\n       \"429340        A549             Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O   \\n\",\n       \"429341        A549             Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O   \\n\",\n       \"429342        A549             Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O   \\n\",\n       \"429343        A549             Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O   \\n\",\n       \"\\n\",\n       \"                          inchi_key  compound_aliases  \\n\",\n       \"6       AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"7       AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"8       AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"9       AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"10      AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"...                             ...               ...  \\n\",\n       \"429339  RBCPBVNYTIFDIR-RIYZIHGNSA-N               NaN  \\n\",\n       \"429340  RBCPBVNYTIFDIR-RIYZIHGNSA-N               NaN  \\n\",\n       \"429341  RBCPBVNYTIFDIR-RIYZIHGNSA-N               NaN  \\n\",\n       \"429342  RBCPBVNYTIFDIR-RIYZIHGNSA-N               NaN  \\n\",\n       \"429343  RBCPBVNYTIFDIR-RIYZIHGNSA-N               NaN  \\n\",\n       \"\\n\",\n       \"[38070 rows x 984 columns]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"MCF7\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    
.dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>10007</th>\\n\",\n       \"      <th>1001</th>\\n\",\n       \"      <th>10013</th>\\n\",\n       \"      <th>10038</th>\\n\",\n       \"      <th>10046</th>\\n\",\n       \"      <th>10049</th>\\n\",\n       \"      <th>10051</th>\\n\",\n       \"      <th>10057</th>\\n\",\n       \"      <th>10058</th>\\n\",\n       \"      <th>10059</th>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <th>9943</th>\\n\",\n       \"      <th>9961</th>\\n\",\n       \"      <th>998</th>\\n\",\n       \"      <th>9988</th>\\n\",\n       \"      <th>full_id</th>\\n\",\n       \"      <th>pert_id</th>\\n\",\n       \"      <th>cell_iname</th>\\n\",\n       \"      <th>SMILES</th>\\n\",\n       \"      <th>inchi_key</th>\\n\",\n       \"      <th>compound_aliases</th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>118</th>\\n\",\n       \"      <td>-0.151476</td>\\n\",\n       \"      <td>0.621351</td>\\n\",\n       \"      <td>2.347261</td>\\n\",\n       \"      <td>-3.394480</td>\\n\",\n       \"      <td>0.705109</td>\\n\",\n       \"      <td>3.007318</td>\\n\",\n       \"      <td>-1.392812</td>\\n\",\n       \"      <td>1.340819</td>\\n\",\n       \"      <td>-0.493219</td>\\n\",\n       \"      <td>0.226146</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>1.995036</td>\\n\",\n       \"      <td>2.507559</td>\\n\",\n       \"      <td>-0.663364</td>\\n\",\n       \"      <td>0.389883</td>\\n\",\n       \"      <td>DOSVAL002_MCF7_24H:BRD-A61304759:10</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>MCF7</td>\\n\",\n       \"      
<td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>119</th>\\n\",\n       \"      <td>1.237476</td>\\n\",\n       \"      <td>3.581586</td>\\n\",\n       \"      <td>1.872769</td>\\n\",\n       \"      <td>-2.050381</td>\\n\",\n       \"      <td>3.755759</td>\\n\",\n       \"      <td>1.854455</td>\\n\",\n       \"      <td>-1.684269</td>\\n\",\n       \"      <td>1.683576</td>\\n\",\n       \"      <td>-1.984033</td>\\n\",\n       \"      <td>-0.013219</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>1.744408</td>\\n\",\n       \"      <td>1.580303</td>\\n\",\n       \"      <td>-1.773016</td>\\n\",\n       \"      <td>-0.429320</td>\\n\",\n       \"      <td>DOSVAL002_MCF7_24H:BRD-A61304759:20</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>MCF7</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>120</th>\\n\",\n       \"      <td>0.109900</td>\\n\",\n       \"      <td>-2.607200</td>\\n\",\n       \"      <td>-0.529500</td>\\n\",\n       \"      <td>-3.435650</td>\\n\",\n       \"      <td>0.337200</td>\\n\",\n       \"      <td>2.949850</td>\\n\",\n       \"      <td>-1.325150</td>\\n\",\n       \"      <td>1.797250</td>\\n\",\n       \"      <td>-1.341850</td>\\n\",\n       \"      <td>-5.448550</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.018150</td>\\n\",\n       \"      <td>0.543850</td>\\n\",\n       \"      <td>-0.603250</td>\\n\",\n       \"      <td>-0.402750</td>\\n\",\n       \"      <td>DOSVAL002_MCF7_24H:BRD-A61304759:5</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      
<td>MCF7</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>142</th>\\n\",\n       \"      <td>1.547964</td>\\n\",\n       \"      <td>3.207467</td>\\n\",\n       \"      <td>4.032832</td>\\n\",\n       \"      <td>-5.399337</td>\\n\",\n       \"      <td>3.068183</td>\\n\",\n       \"      <td>7.430446</td>\\n\",\n       \"      <td>-1.348630</td>\\n\",\n       \"      <td>2.029970</td>\\n\",\n       \"      <td>-2.210826</td>\\n\",\n       \"      <td>-3.993357</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.389575</td>\\n\",\n       \"      <td>1.363373</td>\\n\",\n       \"      <td>-1.417060</td>\\n\",\n       \"      <td>-2.060568</td>\\n\",\n       \"      <td>DOSVAL003_MCF7_24H:BRD-A61304759:10</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>MCF7</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>143</th>\\n\",\n       \"      <td>2.426345</td>\\n\",\n       \"      <td>3.543426</td>\\n\",\n       \"      <td>2.950618</td>\\n\",\n       \"      <td>-3.654894</td>\\n\",\n       \"      <td>2.748431</td>\\n\",\n       \"      <td>7.618900</td>\\n\",\n       \"      <td>-1.326657</td>\\n\",\n       \"      <td>2.542947</td>\\n\",\n       \"      <td>-1.990267</td>\\n\",\n       \"      <td>-0.078416</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.912753</td>\\n\",\n       \"      <td>1.904016</td>\\n\",\n       \"      <td>-0.254895</td>\\n\",\n       \"      <td>-0.477555</td>\\n\",\n       \"      <td>DOSVAL003_MCF7_24H:BRD-A61304759:20</td>\\n\",\n       \"      
<td>BRD-A61304759</td>\\n\",\n       \"      <td>MCF7</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>429366</th>\\n\",\n       \"      <td>0.336000</td>\\n\",\n       \"      <td>-0.053700</td>\\n\",\n       \"      <td>1.909550</td>\\n\",\n       \"      <td>0.475500</td>\\n\",\n       \"      <td>0.467850</td>\\n\",\n       \"      <td>-0.579900</td>\\n\",\n       \"      <td>-1.290500</td>\\n\",\n       \"      <td>1.730400</td>\\n\",\n       \"      <td>-0.820750</td>\\n\",\n       \"      <td>-0.195550</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.468450</td>\\n\",\n       \"      <td>0.488400</td>\\n\",\n       \"      <td>-0.301500</td>\\n\",\n       \"      <td>0.283500</td>\\n\",\n       \"      <td>RAD001_MCF7_6H:BRD-K95986273-001-01-9:0.1235</td>\\n\",\n       \"      <td>BRD-K95986273</td>\\n\",\n       \"      <td>MCF7</td>\\n\",\n       \"      <td>Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O</td>\\n\",\n       \"      
<td>RBCPBVNYTIFDIR-RIYZIHGNSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>429367</th>\\n\",\n       \"      <td>-1.852700</td>\\n\",\n       \"      <td>0.868050</td>\\n\",\n       \"      <td>-1.025050</td>\\n\",\n       \"      <td>-1.688300</td>\\n\",\n       \"      <td>0.241750</td>\\n\",\n       \"      <td>1.051900</td>\\n\",\n       \"      <td>-0.838650</td>\\n\",\n       \"      <td>0.863800</td>\\n\",\n       \"      <td>-0.495850</td>\\n\",\n       \"      <td>0.691850</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-3.095350</td>\\n\",\n       \"      <td>0.715700</td>\\n\",\n       \"      <td>0.848050</td>\\n\",\n       \"      <td>2.662850</td>\\n\",\n       \"      <td>RAD001_MCF7_6H:BRD-K95986273-001-01-9:0.3704</td>\\n\",\n       \"      <td>BRD-K95986273</td>\\n\",\n       \"      <td>MCF7</td>\\n\",\n       \"      <td>Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O</td>\\n\",\n       \"      <td>RBCPBVNYTIFDIR-RIYZIHGNSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>429368</th>\\n\",\n       \"      <td>-0.207700</td>\\n\",\n       \"      <td>-1.787300</td>\\n\",\n       \"      <td>0.709050</td>\\n\",\n       \"      <td>0.779100</td>\\n\",\n       \"      <td>1.664750</td>\\n\",\n       \"      <td>-0.175200</td>\\n\",\n       \"      <td>-1.194150</td>\\n\",\n       \"      <td>1.242800</td>\\n\",\n       \"      <td>1.852100</td>\\n\",\n       \"      <td>-0.948950</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>1.501750</td>\\n\",\n       \"      <td>-1.127150</td>\\n\",\n       \"      <td>2.060150</td>\\n\",\n       \"      <td>0.049900</td>\\n\",\n       \"      <td>RAD001_MCF7_6H:BRD-K95986273-001-01-9:1.1111</td>\\n\",\n       \"      <td>BRD-K95986273</td>\\n\",\n       \"      <td>MCF7</td>\\n\",\n       \"      
<td>Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O</td>\\n\",\n       \"      <td>RBCPBVNYTIFDIR-RIYZIHGNSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>429369</th>\\n\",\n       \"      <td>-0.002700</td>\\n\",\n       \"      <td>-0.621450</td>\\n\",\n       \"      <td>-0.344900</td>\\n\",\n       \"      <td>-0.138900</td>\\n\",\n       \"      <td>-0.752350</td>\\n\",\n       \"      <td>1.163100</td>\\n\",\n       \"      <td>-1.199750</td>\\n\",\n       \"      <td>1.737650</td>\\n\",\n       \"      <td>1.823350</td>\\n\",\n       \"      <td>-5.000000</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-4.205700</td>\\n\",\n       \"      <td>-0.281300</td>\\n\",\n       \"      <td>-0.412050</td>\\n\",\n       \"      <td>1.203800</td>\\n\",\n       \"      <td>RAD001_MCF7_6H:BRD-K95986273-001-01-9:10</td>\\n\",\n       \"      <td>BRD-K95986273</td>\\n\",\n       \"      <td>MCF7</td>\\n\",\n       \"      <td>Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O</td>\\n\",\n       \"      <td>RBCPBVNYTIFDIR-RIYZIHGNSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>429370</th>\\n\",\n       \"      <td>0.460200</td>\\n\",\n       \"      <td>-0.915150</td>\\n\",\n       \"      <td>-0.344350</td>\\n\",\n       \"      <td>-0.426900</td>\\n\",\n       \"      <td>0.637800</td>\\n\",\n       \"      <td>0.241450</td>\\n\",\n       \"      <td>-0.230800</td>\\n\",\n       \"      <td>2.149500</td>\\n\",\n       \"      <td>0.006550</td>\\n\",\n       \"      <td>0.044450</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-2.572100</td>\\n\",\n       \"      <td>0.398950</td>\\n\",\n       \"      <td>-0.646150</td>\\n\",\n       \"      <td>2.635350</td>\\n\",\n       \"      <td>RAD001_MCF7_6H:BRD-K95986273-001-01-9:3.3333</td>\\n\",\n       \"      <td>BRD-K95986273</td>\\n\",\n       \"      
<td>MCF7</td>\\n\",\n       \"      <td>Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O</td>\\n\",\n       \"      <td>RBCPBVNYTIFDIR-RIYZIHGNSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"<p>49146 rows × 984 columns</p>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"           10007      1001     10013     10038     10046     10049     10051  \\\\\\n\",\n       \"118    -0.151476  0.621351  2.347261 -3.394480  0.705109  3.007318 -1.392812   \\n\",\n       \"119     1.237476  3.581586  1.872769 -2.050381  3.755759  1.854455 -1.684269   \\n\",\n       \"120     0.109900 -2.607200 -0.529500 -3.435650  0.337200  2.949850 -1.325150   \\n\",\n       \"142     1.547964  3.207467  4.032832 -5.399337  3.068183  7.430446 -1.348630   \\n\",\n       \"143     2.426345  3.543426  2.950618 -3.654894  2.748431  7.618900 -1.326657   \\n\",\n       \"...          ...       ...       ...       ...       ...       ...       ...   \\n\",\n       \"429366  0.336000 -0.053700  1.909550  0.475500  0.467850 -0.579900 -1.290500   \\n\",\n       \"429367 -1.852700  0.868050 -1.025050 -1.688300  0.241750  1.051900 -0.838650   \\n\",\n       \"429368 -0.207700 -1.787300  0.709050  0.779100  1.664750 -0.175200 -1.194150   \\n\",\n       \"429369 -0.002700 -0.621450 -0.344900 -0.138900 -0.752350  1.163100 -1.199750   \\n\",\n       \"429370  0.460200 -0.915150 -0.344350 -0.426900  0.637800  0.241450 -0.230800   \\n\",\n       \"\\n\",\n       \"           10057     10058     10059  ...      9943      9961       998  \\\\\\n\",\n       \"118     1.340819 -0.493219  0.226146  ...  1.995036  2.507559 -0.663364   \\n\",\n       \"119     1.683576 -1.984033 -0.013219  ...  1.744408  1.580303 -1.773016   \\n\",\n       \"120     1.797250 -1.341850 -5.448550  ...  0.018150  0.543850 -0.603250   \\n\",\n       \"142     2.029970 -2.210826 -3.993357  ...  
0.389575  1.363373 -1.417060   \\n\",\n       \"143     2.542947 -1.990267 -0.078416  ... -0.912753  1.904016 -0.254895   \\n\",\n       \"...          ...       ...       ...  ...       ...       ...       ...   \\n\",\n       \"429366  1.730400 -0.820750 -0.195550  ...  0.468450  0.488400 -0.301500   \\n\",\n       \"429367  0.863800 -0.495850  0.691850  ... -3.095350  0.715700  0.848050   \\n\",\n       \"429368  1.242800  1.852100 -0.948950  ...  1.501750 -1.127150  2.060150   \\n\",\n       \"429369  1.737650  1.823350 -5.000000  ... -4.205700 -0.281300 -0.412050   \\n\",\n       \"429370  2.149500  0.006550  0.044450  ... -2.572100  0.398950 -0.646150   \\n\",\n       \"\\n\",\n       \"            9988                                       full_id        pert_id  \\\\\\n\",\n       \"118     0.389883           DOSVAL002_MCF7_24H:BRD-A61304759:10  BRD-A61304759   \\n\",\n       \"119    -0.429320           DOSVAL002_MCF7_24H:BRD-A61304759:20  BRD-A61304759   \\n\",\n       \"120    -0.402750            DOSVAL002_MCF7_24H:BRD-A61304759:5  BRD-A61304759   \\n\",\n       \"142    -2.060568           DOSVAL003_MCF7_24H:BRD-A61304759:10  BRD-A61304759   \\n\",\n       \"143    -0.477555           DOSVAL003_MCF7_24H:BRD-A61304759:20  BRD-A61304759   \\n\",\n       \"...          ...                                           ...            ...   
\\n\",\n       \"429366  0.283500  RAD001_MCF7_6H:BRD-K95986273-001-01-9:0.1235  BRD-K95986273   \\n\",\n       \"429367  2.662850  RAD001_MCF7_6H:BRD-K95986273-001-01-9:0.3704  BRD-K95986273   \\n\",\n       \"429368  0.049900  RAD001_MCF7_6H:BRD-K95986273-001-01-9:1.1111  BRD-K95986273   \\n\",\n       \"429369  1.203800      RAD001_MCF7_6H:BRD-K95986273-001-01-9:10  BRD-K95986273   \\n\",\n       \"429370  2.635350  RAD001_MCF7_6H:BRD-K95986273-001-01-9:3.3333  BRD-K95986273   \\n\",\n       \"\\n\",\n       \"        cell_iname                                             SMILES  \\\\\\n\",\n       \"118           MCF7  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"119           MCF7  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"120           MCF7  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"142           MCF7  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"143           MCF7  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"...            ...                                                ...   
\\n\",\n       \"429366        MCF7             Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O   \\n\",\n       \"429367        MCF7             Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O   \\n\",\n       \"429368        MCF7             Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O   \\n\",\n       \"429369        MCF7             Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O   \\n\",\n       \"429370        MCF7             Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O   \\n\",\n       \"\\n\",\n       \"                          inchi_key  compound_aliases  \\n\",\n       \"118     AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"119     AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"120     AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"142     AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"143     AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"...                             ...               ...  \\n\",\n       \"429366  RBCPBVNYTIFDIR-RIYZIHGNSA-N               NaN  \\n\",\n       \"429367  RBCPBVNYTIFDIR-RIYZIHGNSA-N               NaN  \\n\",\n       \"429368  RBCPBVNYTIFDIR-RIYZIHGNSA-N               NaN  \\n\",\n       \"429369  RBCPBVNYTIFDIR-RIYZIHGNSA-N               NaN  \\n\",\n       \"429370  RBCPBVNYTIFDIR-RIYZIHGNSA-N               NaN  \\n\",\n       \"\\n\",\n       \"[49146 rows x 984 columns]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"PC3\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    
.dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>10007</th>\\n\",\n       \"      <th>1001</th>\\n\",\n       \"      <th>10013</th>\\n\",\n       \"      <th>10038</th>\\n\",\n       \"      <th>10046</th>\\n\",\n       \"      <th>10049</th>\\n\",\n       \"      <th>10051</th>\\n\",\n       \"      <th>10057</th>\\n\",\n       \"      <th>10058</th>\\n\",\n       \"      <th>10059</th>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <th>9943</th>\\n\",\n       \"      <th>9961</th>\\n\",\n       \"      <th>998</th>\\n\",\n       \"      <th>9988</th>\\n\",\n       \"      <th>full_id</th>\\n\",\n       \"      <th>pert_id</th>\\n\",\n       \"      <th>cell_iname</th>\\n\",\n       \"      <th>SMILES</th>\\n\",\n       \"      <th>inchi_key</th>\\n\",\n       \"      <th>compound_aliases</th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>54</th>\\n\",\n       \"      <td>1.132950</td>\\n\",\n       \"      <td>0.123550</td>\\n\",\n       \"      <td>-3.740800</td>\\n\",\n       \"      <td>-0.016550</td>\\n\",\n       \"      <td>10.000000</td>\\n\",\n       \"      <td>-3.262950</td>\\n\",\n       \"      <td>-1.161100</td>\\n\",\n       \"      <td>2.449150</td>\\n\",\n       \"      <td>-0.093950</td>\\n\",\n       \"      <td>1.196950</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.914450</td>\\n\",\n       \"      <td>-0.257750</td>\\n\",\n       \"      <td>0.881500</td>\\n\",\n       \"      <td>-1.070300</td>\\n\",\n       \"      <td>ABY001_PC3_XH:BRD-A61304759:0.625:24</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>PC3</td>\\n\",\n       \"      
<td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>55</th>\\n\",\n       \"      <td>0.647987</td>\\n\",\n       \"      <td>-0.216487</td>\\n\",\n       \"      <td>0.400504</td>\\n\",\n       \"      <td>-0.625137</td>\\n\",\n       \"      <td>0.584906</td>\\n\",\n       \"      <td>-0.928926</td>\\n\",\n       \"      <td>-0.301941</td>\\n\",\n       \"      <td>-0.812206</td>\\n\",\n       \"      <td>0.153869</td>\\n\",\n       \"      <td>-0.414961</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-2.728474</td>\\n\",\n       \"      <td>0.341705</td>\\n\",\n       \"      <td>1.250524</td>\\n\",\n       \"      <td>0.415484</td>\\n\",\n       \"      <td>ABY001_PC3_XH:BRD-A61304759:0.625:3</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>PC3</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>56</th>\\n\",\n       \"      <td>1.528193</td>\\n\",\n       \"      <td>-1.074302</td>\\n\",\n       \"      <td>-1.323786</td>\\n\",\n       \"      <td>-0.371677</td>\\n\",\n       \"      <td>2.599559</td>\\n\",\n       \"      <td>1.951470</td>\\n\",\n       \"      <td>-1.349102</td>\\n\",\n       \"      <td>1.918611</td>\\n\",\n       \"      <td>-0.246618</td>\\n\",\n       \"      <td>1.007504</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.715374</td>\\n\",\n       \"      <td>0.830848</td>\\n\",\n       \"      <td>-0.373370</td>\\n\",\n       \"      <td>0.462187</td>\\n\",\n       \"      <td>ABY001_PC3_XH:BRD-A61304759:10:24</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      
<td>PC3</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>57</th>\\n\",\n       \"      <td>-0.121456</td>\\n\",\n       \"      <td>-0.782534</td>\\n\",\n       \"      <td>-1.036697</td>\\n\",\n       \"      <td>-3.055483</td>\\n\",\n       \"      <td>0.268898</td>\\n\",\n       \"      <td>1.331479</td>\\n\",\n       \"      <td>-0.922628</td>\\n\",\n       \"      <td>-0.598186</td>\\n\",\n       \"      <td>-0.808855</td>\\n\",\n       \"      <td>-2.345565</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.017446</td>\\n\",\n       \"      <td>2.098738</td>\\n\",\n       \"      <td>-1.825348</td>\\n\",\n       \"      <td>0.044387</td>\\n\",\n       \"      <td>ABY001_PC3_XH:BRD-A61304759:10:3</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>PC3</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>58</th>\\n\",\n       \"      <td>1.209288</td>\\n\",\n       \"      <td>0.214489</td>\\n\",\n       \"      <td>-2.202977</td>\\n\",\n       \"      <td>-0.599925</td>\\n\",\n       \"      <td>2.654822</td>\\n\",\n       \"      <td>2.009625</td>\\n\",\n       \"      <td>-0.924628</td>\\n\",\n       \"      <td>1.924302</td>\\n\",\n       \"      <td>-0.218531</td>\\n\",\n       \"      <td>0.153803</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-1.776036</td>\\n\",\n       \"      <td>-0.103730</td>\\n\",\n       \"      <td>-0.251273</td>\\n\",\n       \"      <td>-1.010999</td>\\n\",\n       \"      <td>ABY001_PC3_XH:BRD-A61304759:2.5:24</td>\\n\",\n       \"      
<td>BRD-A61304759</td>\\n\",\n       \"      <td>PC3</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>429384</th>\\n\",\n       \"      <td>0.605400</td>\\n\",\n       \"      <td>-0.179350</td>\\n\",\n       \"      <td>-0.672100</td>\\n\",\n       \"      <td>-0.628750</td>\\n\",\n       \"      <td>0.111400</td>\\n\",\n       \"      <td>0.544600</td>\\n\",\n       \"      <td>0.591700</td>\\n\",\n       \"      <td>-1.256000</td>\\n\",\n       \"      <td>0.485650</td>\\n\",\n       \"      <td>-4.867000</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.829850</td>\\n\",\n       \"      <td>-0.131600</td>\\n\",\n       \"      <td>-0.814150</td>\\n\",\n       \"      <td>-0.741750</td>\\n\",\n       \"      <td>RAD001_PC3_6H:BRD-K95986273-001-01-9:0.1235</td>\\n\",\n       \"      <td>BRD-K95986273</td>\\n\",\n       \"      <td>PC3</td>\\n\",\n       \"      <td>Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O</td>\\n\",\n       \"      
<td>RBCPBVNYTIFDIR-RIYZIHGNSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>429385</th>\\n\",\n       \"      <td>0.160400</td>\\n\",\n       \"      <td>0.277550</td>\\n\",\n       \"      <td>-0.141750</td>\\n\",\n       \"      <td>1.081850</td>\\n\",\n       \"      <td>-0.488050</td>\\n\",\n       \"      <td>0.783200</td>\\n\",\n       \"      <td>-0.577050</td>\\n\",\n       \"      <td>-0.049500</td>\\n\",\n       \"      <td>-0.270950</td>\\n\",\n       \"      <td>0.489950</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-1.115850</td>\\n\",\n       \"      <td>0.279050</td>\\n\",\n       \"      <td>-0.244850</td>\\n\",\n       \"      <td>-0.436050</td>\\n\",\n       \"      <td>RAD001_PC3_6H:BRD-K95986273-001-01-9:0.3704</td>\\n\",\n       \"      <td>BRD-K95986273</td>\\n\",\n       \"      <td>PC3</td>\\n\",\n       \"      <td>Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O</td>\\n\",\n       \"      <td>RBCPBVNYTIFDIR-RIYZIHGNSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>429386</th>\\n\",\n       \"      <td>0.400950</td>\\n\",\n       \"      <td>0.139500</td>\\n\",\n       \"      <td>1.385550</td>\\n\",\n       \"      <td>-3.053850</td>\\n\",\n       \"      <td>5.312200</td>\\n\",\n       \"      <td>0.763750</td>\\n\",\n       \"      <td>-1.333150</td>\\n\",\n       \"      <td>0.471400</td>\\n\",\n       \"      <td>-0.385800</td>\\n\",\n       \"      <td>0.401000</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.331250</td>\\n\",\n       \"      <td>0.954800</td>\\n\",\n       \"      <td>0.373200</td>\\n\",\n       \"      <td>-0.479750</td>\\n\",\n       \"      <td>RAD001_PC3_6H:BRD-K95986273-001-01-9:1.1111</td>\\n\",\n       \"      <td>BRD-K95986273</td>\\n\",\n       \"      <td>PC3</td>\\n\",\n       \"      
<td>Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O</td>\\n\",\n       \"      <td>RBCPBVNYTIFDIR-RIYZIHGNSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>429387</th>\\n\",\n       \"      <td>0.721800</td>\\n\",\n       \"      <td>0.454000</td>\\n\",\n       \"      <td>0.146150</td>\\n\",\n       \"      <td>-1.052350</td>\\n\",\n       \"      <td>0.394500</td>\\n\",\n       \"      <td>0.732550</td>\\n\",\n       \"      <td>-1.404100</td>\\n\",\n       \"      <td>0.240850</td>\\n\",\n       \"      <td>-0.753750</td>\\n\",\n       \"      <td>1.155500</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.236550</td>\\n\",\n       \"      <td>0.410850</td>\\n\",\n       \"      <td>0.219400</td>\\n\",\n       \"      <td>-0.355300</td>\\n\",\n       \"      <td>RAD001_PC3_6H:BRD-K95986273-001-01-9:10</td>\\n\",\n       \"      <td>BRD-K95986273</td>\\n\",\n       \"      <td>PC3</td>\\n\",\n       \"      <td>Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O</td>\\n\",\n       \"      <td>RBCPBVNYTIFDIR-RIYZIHGNSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>429388</th>\\n\",\n       \"      <td>1.308400</td>\\n\",\n       \"      <td>-0.412800</td>\\n\",\n       \"      <td>-0.355400</td>\\n\",\n       \"      <td>0.805100</td>\\n\",\n       \"      <td>-0.674500</td>\\n\",\n       \"      <td>-0.141400</td>\\n\",\n       \"      <td>0.934500</td>\\n\",\n       \"      <td>-1.567500</td>\\n\",\n       \"      <td>-1.117700</td>\\n\",\n       \"      <td>-0.781900</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.064000</td>\\n\",\n       \"      <td>0.556300</td>\\n\",\n       \"      <td>-0.815800</td>\\n\",\n       \"      <td>-1.526200</td>\\n\",\n       \"      <td>RAD001_PC3_6H:BRD-K95986273-001-01-9:3.3333</td>\\n\",\n       \"      <td>BRD-K95986273</td>\\n\",\n       \"      
<td>PC3</td>\\n\",\n       \"      <td>Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O</td>\\n\",\n       \"      <td>RBCPBVNYTIFDIR-RIYZIHGNSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"<p>36845 rows × 984 columns</p>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"           10007      1001     10013     10038      10046     10049     10051  \\\\\\n\",\n       \"54      1.132950  0.123550 -3.740800 -0.016550  10.000000 -3.262950 -1.161100   \\n\",\n       \"55      0.647987 -0.216487  0.400504 -0.625137   0.584906 -0.928926 -0.301941   \\n\",\n       \"56      1.528193 -1.074302 -1.323786 -0.371677   2.599559  1.951470 -1.349102   \\n\",\n       \"57     -0.121456 -0.782534 -1.036697 -3.055483   0.268898  1.331479 -0.922628   \\n\",\n       \"58      1.209288  0.214489 -2.202977 -0.599925   2.654822  2.009625 -0.924628   \\n\",\n       \"...          ...       ...       ...       ...        ...       ...       ...   \\n\",\n       \"429384  0.605400 -0.179350 -0.672100 -0.628750   0.111400  0.544600  0.591700   \\n\",\n       \"429385  0.160400  0.277550 -0.141750  1.081850  -0.488050  0.783200 -0.577050   \\n\",\n       \"429386  0.400950  0.139500  1.385550 -3.053850   5.312200  0.763750 -1.333150   \\n\",\n       \"429387  0.721800  0.454000  0.146150 -1.052350   0.394500  0.732550 -1.404100   \\n\",\n       \"429388  1.308400 -0.412800 -0.355400  0.805100  -0.674500 -0.141400  0.934500   \\n\",\n       \"\\n\",\n       \"           10057     10058     10059  ...      9943      9961       998  \\\\\\n\",\n       \"54      2.449150 -0.093950  1.196950  ... -0.914450 -0.257750  0.881500   \\n\",\n       \"55     -0.812206  0.153869 -0.414961  ... -2.728474  0.341705  1.250524   \\n\",\n       \"56      1.918611 -0.246618  1.007504  ... -0.715374  0.830848 -0.373370   \\n\",\n       \"57     -0.598186 -0.808855 -2.345565  ... 
-0.017446  2.098738 -1.825348   \\n\",\n       \"58      1.924302 -0.218531  0.153803  ... -1.776036 -0.103730 -0.251273   \\n\",\n       \"...          ...       ...       ...  ...       ...       ...       ...   \\n\",\n       \"429384 -1.256000  0.485650 -4.867000  ... -0.829850 -0.131600 -0.814150   \\n\",\n       \"429385 -0.049500 -0.270950  0.489950  ... -1.115850  0.279050 -0.244850   \\n\",\n       \"429386  0.471400 -0.385800  0.401000  ... -0.331250  0.954800  0.373200   \\n\",\n       \"429387  0.240850 -0.753750  1.155500  ... -0.236550  0.410850  0.219400   \\n\",\n       \"429388 -1.567500 -1.117700 -0.781900  ... -0.064000  0.556300 -0.815800   \\n\",\n       \"\\n\",\n       \"            9988                                      full_id        pert_id  \\\\\\n\",\n       \"54     -1.070300         ABY001_PC3_XH:BRD-A61304759:0.625:24  BRD-A61304759   \\n\",\n       \"55      0.415484          ABY001_PC3_XH:BRD-A61304759:0.625:3  BRD-A61304759   \\n\",\n       \"56      0.462187            ABY001_PC3_XH:BRD-A61304759:10:24  BRD-A61304759   \\n\",\n       \"57      0.044387             ABY001_PC3_XH:BRD-A61304759:10:3  BRD-A61304759   \\n\",\n       \"58     -1.010999           ABY001_PC3_XH:BRD-A61304759:2.5:24  BRD-A61304759   \\n\",\n       \"...          ...                                          ...            ...   
\\n\",\n       \"429384 -0.741750  RAD001_PC3_6H:BRD-K95986273-001-01-9:0.1235  BRD-K95986273   \\n\",\n       \"429385 -0.436050  RAD001_PC3_6H:BRD-K95986273-001-01-9:0.3704  BRD-K95986273   \\n\",\n       \"429386 -0.479750  RAD001_PC3_6H:BRD-K95986273-001-01-9:1.1111  BRD-K95986273   \\n\",\n       \"429387 -0.355300      RAD001_PC3_6H:BRD-K95986273-001-01-9:10  BRD-K95986273   \\n\",\n       \"429388 -1.526200  RAD001_PC3_6H:BRD-K95986273-001-01-9:3.3333  BRD-K95986273   \\n\",\n       \"\\n\",\n       \"        cell_iname                                             SMILES  \\\\\\n\",\n       \"54             PC3  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"55             PC3  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"56             PC3  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"57             PC3  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"58             PC3  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"...            ...                                                ...   
\\n\",\n       \"429384         PC3             Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O   \\n\",\n       \"429385         PC3             Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O   \\n\",\n       \"429386         PC3             Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O   \\n\",\n       \"429387         PC3             Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O   \\n\",\n       \"429388         PC3             Cc1nc2ccccc2n1N=Cc1ccc(s1)[N+]([O-])=O   \\n\",\n       \"\\n\",\n       \"                          inchi_key  compound_aliases  \\n\",\n       \"54      AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"55      AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"56      AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"57      AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"58      AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"...                             ...               ...  \\n\",\n       \"429384  RBCPBVNYTIFDIR-RIYZIHGNSA-N               NaN  \\n\",\n       \"429385  RBCPBVNYTIFDIR-RIYZIHGNSA-N               NaN  \\n\",\n       \"429386  RBCPBVNYTIFDIR-RIYZIHGNSA-N               NaN  \\n\",\n       \"429387  RBCPBVNYTIFDIR-RIYZIHGNSA-N               NaN  \\n\",\n       \"429388  RBCPBVNYTIFDIR-RIYZIHGNSA-N               NaN  \\n\",\n       \"\\n\",\n       \"[36845 rows x 984 columns]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"A375\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    
.dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>10007</th>\\n\",\n       \"      <th>1001</th>\\n\",\n       \"      <th>10013</th>\\n\",\n       \"      <th>10038</th>\\n\",\n       \"      <th>10046</th>\\n\",\n       \"      <th>10049</th>\\n\",\n       \"      <th>10051</th>\\n\",\n       \"      <th>10057</th>\\n\",\n       \"      <th>10058</th>\\n\",\n       \"      <th>10059</th>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <th>9943</th>\\n\",\n       \"      <th>9961</th>\\n\",\n       \"      <th>998</th>\\n\",\n       \"      <th>9988</th>\\n\",\n       \"      <th>full_id</th>\\n\",\n       \"      <th>pert_id</th>\\n\",\n       \"      <th>cell_iname</th>\\n\",\n       \"      <th>SMILES</th>\\n\",\n       \"      <th>inchi_key</th>\\n\",\n       \"      <th>compound_aliases</th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>0</th>\\n\",\n       \"      <td>-1.166500</td>\\n\",\n       \"      <td>-0.606900</td>\\n\",\n       \"      <td>-0.733650</td>\\n\",\n       \"      <td>-1.481400</td>\\n\",\n       \"      <td>1.281200</td>\\n\",\n       \"      <td>4.095450</td>\\n\",\n       \"      <td>-0.712400</td>\\n\",\n       \"      <td>0.995350</td>\\n\",\n       \"      <td>-0.031650</td>\\n\",\n       \"      <td>-0.990250</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.883050</td>\\n\",\n       \"      <td>-1.605100</td>\\n\",\n       \"      <td>0.005250</td>\\n\",\n       \"      <td>0.979050</td>\\n\",\n       \"      <td>ABY001_A375_XH:BRD-A61304759:0.625:24</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>A375</td>\\n\",\n       \"      
<td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>1</th>\\n\",\n       \"      <td>0.794862</td>\\n\",\n       \"      <td>-0.358541</td>\\n\",\n       \"      <td>0.122322</td>\\n\",\n       \"      <td>-0.550787</td>\\n\",\n       \"      <td>-0.178181</td>\\n\",\n       \"      <td>1.566406</td>\\n\",\n       \"      <td>0.058614</td>\\n\",\n       \"      <td>0.308965</td>\\n\",\n       \"      <td>0.369855</td>\\n\",\n       \"      <td>-0.948085</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-2.421624</td>\\n\",\n       \"      <td>-0.335863</td>\\n\",\n       \"      <td>0.308946</td>\\n\",\n       \"      <td>-0.352101</td>\\n\",\n       \"      <td>ABY001_A375_XH:BRD-A61304759:0.625:3</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>A375</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>2</th>\\n\",\n       \"      <td>2.599445</td>\\n\",\n       \"      <td>1.755998</td>\\n\",\n       \"      <td>-0.776326</td>\\n\",\n       \"      <td>-4.121394</td>\\n\",\n       \"      <td>2.539309</td>\\n\",\n       \"      <td>0.533612</td>\\n\",\n       \"      <td>-5.299499</td>\\n\",\n       \"      <td>1.496123</td>\\n\",\n       \"      <td>0.462508</td>\\n\",\n       \"      <td>2.645836</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.400337</td>\\n\",\n       \"      <td>0.068793</td>\\n\",\n       \"      <td>-0.495560</td>\\n\",\n       \"      <td>0.044498</td>\\n\",\n       \"      <td>ABY001_A375_XH:BRD-A61304759:10:24</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      
<td>A375</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>3</th>\\n\",\n       \"      <td>0.230140</td>\\n\",\n       \"      <td>1.530381</td>\\n\",\n       \"      <td>-0.823664</td>\\n\",\n       \"      <td>0.111742</td>\\n\",\n       \"      <td>0.497451</td>\\n\",\n       \"      <td>-1.489498</td>\\n\",\n       \"      <td>0.113403</td>\\n\",\n       \"      <td>0.309370</td>\\n\",\n       \"      <td>0.087925</td>\\n\",\n       \"      <td>-1.126528</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-2.049004</td>\\n\",\n       \"      <td>-0.486649</td>\\n\",\n       \"      <td>0.594023</td>\\n\",\n       \"      <td>1.365092</td>\\n\",\n       \"      <td>ABY001_A375_XH:BRD-A61304759:10:3</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>A375</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>4</th>\\n\",\n       \"      <td>2.498556</td>\\n\",\n       \"      <td>3.288291</td>\\n\",\n       \"      <td>-0.831289</td>\\n\",\n       \"      <td>-3.811227</td>\\n\",\n       \"      <td>-0.816384</td>\\n\",\n       \"      <td>4.508691</td>\\n\",\n       \"      <td>-3.575771</td>\\n\",\n       \"      <td>1.432798</td>\\n\",\n       \"      <td>0.086126</td>\\n\",\n       \"      <td>1.699897</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.593055</td>\\n\",\n       \"      <td>-0.425393</td>\\n\",\n       \"      <td>0.606134</td>\\n\",\n       \"      <td>-1.007184</td>\\n\",\n       \"      <td>ABY001_A375_XH:BRD-A61304759:2.5:24</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n 
      \"      <td>A375</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>428617</th>\\n\",\n       \"      <td>-0.580379</td>\\n\",\n       \"      <td>1.485145</td>\\n\",\n       \"      <td>-1.204291</td>\\n\",\n       \"      <td>-1.012525</td>\\n\",\n       \"      <td>-0.450992</td>\\n\",\n       \"      <td>0.134765</td>\\n\",\n       \"      <td>-0.635437</td>\\n\",\n       \"      <td>-0.413369</td>\\n\",\n       \"      <td>0.629057</td>\\n\",\n       \"      <td>-0.939117</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.301474</td>\\n\",\n       \"      <td>-0.844485</td>\\n\",\n       \"      <td>-1.120032</td>\\n\",\n       \"      <td>-0.260242</td>\\n\",\n       \"      <td>PRISM001_A375_6H:BRD-K47804338-001-01-6:5</td>\\n\",\n       \"      <td>BRD-K47804338</td>\\n\",\n       \"      <td>A375</td>\\n\",\n       \"      <td>COc1ccc(NC(=O)Nc2ccc3O[C@@H](CN(C)Cc4ccc5OCOc5...</td>\\n\",\n       \"      
<td>NGRSXEMLDRQNMH-JAOJVIEDSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>428622</th>\\n\",\n       \"      <td>-0.728779</td>\\n\",\n       \"      <td>0.216934</td>\\n\",\n       \"      <td>-0.230840</td>\\n\",\n       \"      <td>0.678832</td>\\n\",\n       \"      <td>-0.169310</td>\\n\",\n       \"      <td>-0.659888</td>\\n\",\n       \"      <td>0.288666</td>\\n\",\n       \"      <td>0.798826</td>\\n\",\n       \"      <td>-0.144811</td>\\n\",\n       \"      <td>-0.720356</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.088713</td>\\n\",\n       \"      <td>-0.253468</td>\\n\",\n       \"      <td>0.584100</td>\\n\",\n       \"      <td>0.151686</td>\\n\",\n       \"      <td>PRISM001_A375_24H:BRD-K65000194-001-01-1:5</td>\\n\",\n       \"      <td>BRD-K65000194</td>\\n\",\n       \"      <td>A375</td>\\n\",\n       \"      <td>COc1ccc(NC(=O)Nc2ccc3O[C@H](CN(C)Cc4ccc5OCOc5c...</td>\\n\",\n       \"      <td>NGRSXEMLDRQNMH-FNFPNEPISA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>428623</th>\\n\",\n       \"      <td>-0.542132</td>\\n\",\n       \"      <td>0.293748</td>\\n\",\n       \"      <td>0.141321</td>\\n\",\n       \"      <td>-3.669676</td>\\n\",\n       \"      <td>0.769465</td>\\n\",\n       \"      <td>0.132505</td>\\n\",\n       \"      <td>-0.666395</td>\\n\",\n       \"      <td>0.152104</td>\\n\",\n       \"      <td>0.360985</td>\\n\",\n       \"      <td>0.135885</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.485555</td>\\n\",\n       \"      <td>0.613135</td>\\n\",\n       \"      <td>-0.104802</td>\\n\",\n       \"      <td>0.074249</td>\\n\",\n       \"      <td>PRISM001_A375_6H:BRD-K65000194-001-01-1:5</td>\\n\",\n       \"      <td>BRD-K65000194</td>\\n\",\n       \"      <td>A375</td>\\n\",\n       \"      
<td>COc1ccc(NC(=O)Nc2ccc3O[C@H](CN(C)Cc4ccc5OCOc5c...</td>\\n\",\n       \"      <td>NGRSXEMLDRQNMH-FNFPNEPISA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>428628</th>\\n\",\n       \"      <td>-0.861263</td>\\n\",\n       \"      <td>-0.403781</td>\\n\",\n       \"      <td>0.113876</td>\\n\",\n       \"      <td>0.717960</td>\\n\",\n       \"      <td>0.250757</td>\\n\",\n       \"      <td>-0.132691</td>\\n\",\n       \"      <td>-0.141309</td>\\n\",\n       \"      <td>0.013201</td>\\n\",\n       \"      <td>-0.019395</td>\\n\",\n       \"      <td>1.028209</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.392342</td>\\n\",\n       \"      <td>0.000498</td>\\n\",\n       \"      <td>0.227225</td>\\n\",\n       \"      <td>-0.320918</td>\\n\",\n       \"      <td>PRISM001_A375_24H:BRD-K72490684-001-01-2:5</td>\\n\",\n       \"      <td>BRD-K72490684</td>\\n\",\n       \"      <td>A375</td>\\n\",\n       \"      <td>COc1ccc(NC(=O)Nc2ccc3O[C@H](CN(C)Cc4ccc5OCOc5c...</td>\\n\",\n       \"      <td>NGRSXEMLDRQNMH-PHKGMDLESA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>428629</th>\\n\",\n       \"      <td>-0.249332</td>\\n\",\n       \"      <td>-0.532578</td>\\n\",\n       \"      <td>-0.230950</td>\\n\",\n       \"      <td>0.146238</td>\\n\",\n       \"      <td>-0.073509</td>\\n\",\n       \"      <td>0.132697</td>\\n\",\n       \"      <td>-1.529438</td>\\n\",\n       \"      <td>0.161819</td>\\n\",\n       \"      <td>-0.173739</td>\\n\",\n       \"      <td>-1.379441</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.051463</td>\\n\",\n       \"      <td>0.044101</td>\\n\",\n       \"      <td>0.742501</td>\\n\",\n       \"      <td>-1.953608</td>\\n\",\n       \"      <td>PRISM001_A375_6H:BRD-K72490684-001-01-2:5</td>\\n\",\n       \"      <td>BRD-K72490684</td>\\n\",\n       
\"      <td>A375</td>\\n\",\n       \"      <td>COc1ccc(NC(=O)Nc2ccc3O[C@H](CN(C)Cc4ccc5OCOc5c...</td>\\n\",\n       \"      <td>NGRSXEMLDRQNMH-PHKGMDLESA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"<p>23054 rows × 984 columns</p>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"           10007      1001     10013     10038     10046     10049     10051  \\\\\\n\",\n       \"0      -1.166500 -0.606900 -0.733650 -1.481400  1.281200  4.095450 -0.712400   \\n\",\n       \"1       0.794862 -0.358541  0.122322 -0.550787 -0.178181  1.566406  0.058614   \\n\",\n       \"2       2.599445  1.755998 -0.776326 -4.121394  2.539309  0.533612 -5.299499   \\n\",\n       \"3       0.230140  1.530381 -0.823664  0.111742  0.497451 -1.489498  0.113403   \\n\",\n       \"4       2.498556  3.288291 -0.831289 -3.811227 -0.816384  4.508691 -3.575771   \\n\",\n       \"...          ...       ...       ...       ...       ...       ...       ...   \\n\",\n       \"428617 -0.580379  1.485145 -1.204291 -1.012525 -0.450992  0.134765 -0.635437   \\n\",\n       \"428622 -0.728779  0.216934 -0.230840  0.678832 -0.169310 -0.659888  0.288666   \\n\",\n       \"428623 -0.542132  0.293748  0.141321 -3.669676  0.769465  0.132505 -0.666395   \\n\",\n       \"428628 -0.861263 -0.403781  0.113876  0.717960  0.250757 -0.132691 -0.141309   \\n\",\n       \"428629 -0.249332 -0.532578 -0.230950  0.146238 -0.073509  0.132697 -1.529438   \\n\",\n       \"\\n\",\n       \"           10057     10058     10059  ...      9943      9961       998  \\\\\\n\",\n       \"0       0.995350 -0.031650 -0.990250  ... -0.883050 -1.605100  0.005250   \\n\",\n       \"1       0.308965  0.369855 -0.948085  ... -2.421624 -0.335863  0.308946   \\n\",\n       \"2       1.496123  0.462508  2.645836  ... -0.400337  0.068793 -0.495560   \\n\",\n       \"3       0.309370  0.087925 -1.126528  ... 
-2.049004 -0.486649  0.594023   \\n\",\n       \"4       1.432798  0.086126  1.699897  ... -0.593055 -0.425393  0.606134   \\n\",\n       \"...          ...       ...       ...  ...       ...       ...       ...   \\n\",\n       \"428617 -0.413369  0.629057 -0.939117  ...  0.301474 -0.844485 -1.120032   \\n\",\n       \"428622  0.798826 -0.144811 -0.720356  ... -0.088713 -0.253468  0.584100   \\n\",\n       \"428623  0.152104  0.360985  0.135885  ...  0.485555  0.613135 -0.104802   \\n\",\n       \"428628  0.013201 -0.019395  1.028209  ... -0.392342  0.000498  0.227225   \\n\",\n       \"428629  0.161819 -0.173739 -1.379441  ...  0.051463  0.044101  0.742501   \\n\",\n       \"\\n\",\n       \"            9988                                     full_id        pert_id  \\\\\\n\",\n       \"0       0.979050       ABY001_A375_XH:BRD-A61304759:0.625:24  BRD-A61304759   \\n\",\n       \"1      -0.352101        ABY001_A375_XH:BRD-A61304759:0.625:3  BRD-A61304759   \\n\",\n       \"2       0.044498          ABY001_A375_XH:BRD-A61304759:10:24  BRD-A61304759   \\n\",\n       \"3       1.365092           ABY001_A375_XH:BRD-A61304759:10:3  BRD-A61304759   \\n\",\n       \"4      -1.007184         ABY001_A375_XH:BRD-A61304759:2.5:24  BRD-A61304759   \\n\",\n       \"...          ...                                         ...            ...   
\\n\",\n       \"428617 -0.260242   PRISM001_A375_6H:BRD-K47804338-001-01-6:5  BRD-K47804338   \\n\",\n       \"428622  0.151686  PRISM001_A375_24H:BRD-K65000194-001-01-1:5  BRD-K65000194   \\n\",\n       \"428623  0.074249   PRISM001_A375_6H:BRD-K65000194-001-01-1:5  BRD-K65000194   \\n\",\n       \"428628 -0.320918  PRISM001_A375_24H:BRD-K72490684-001-01-2:5  BRD-K72490684   \\n\",\n       \"428629 -1.953608   PRISM001_A375_6H:BRD-K72490684-001-01-2:5  BRD-K72490684   \\n\",\n       \"\\n\",\n       \"        cell_iname                                             SMILES  \\\\\\n\",\n       \"0             A375  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"1             A375  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"2             A375  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"3             A375  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"4             A375  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"...            ...                                                ...   \\n\",\n       \"428617        A375  COc1ccc(NC(=O)Nc2ccc3O[C@@H](CN(C)Cc4ccc5OCOc5...   \\n\",\n       \"428622        A375  COc1ccc(NC(=O)Nc2ccc3O[C@H](CN(C)Cc4ccc5OCOc5c...   \\n\",\n       \"428623        A375  COc1ccc(NC(=O)Nc2ccc3O[C@H](CN(C)Cc4ccc5OCOc5c...   \\n\",\n       \"428628        A375  COc1ccc(NC(=O)Nc2ccc3O[C@H](CN(C)Cc4ccc5OCOc5c...   \\n\",\n       \"428629        A375  COc1ccc(NC(=O)Nc2ccc3O[C@H](CN(C)Cc4ccc5OCOc5c...   
\\n\",\n       \"\\n\",\n       \"                          inchi_key  compound_aliases  \\n\",\n       \"0       AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"1       AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"2       AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"3       AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"4       AYUNIORJHRXIBJ-ZGQRYRSUSA-N               NaN  \\n\",\n       \"...                             ...               ...  \\n\",\n       \"428617  NGRSXEMLDRQNMH-JAOJVIEDSA-N               NaN  \\n\",\n       \"428622  NGRSXEMLDRQNMH-FNFPNEPISA-N               NaN  \\n\",\n       \"428623  NGRSXEMLDRQNMH-FNFPNEPISA-N               NaN  \\n\",\n       \"428628  NGRSXEMLDRQNMH-PHKGMDLESA-N               NaN  \\n\",\n       \"428629  NGRSXEMLDRQNMH-PHKGMDLESA-N               NaN  \\n\",\n       \"\\n\",\n       \"[23054 rows x 984 columns]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    }\n   ],\n   \"source\": [\n    \"# Split into different dataframes per cell line in `CHOSEN_CELL_LINES`\\n\",\n    \"df_per_cell_line = {}\\n\",\n    \"for line in CHOSEN_CELL_LINES:\\n\",\n    \"    df_per_cell_line[line] = parsed_df2[parsed_df2[\\\"cell_iname\\\"] == line]\\n\",\n    \"    print(line)\\n\",\n    \"    display(df_per_cell_line[line])\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 16,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       
\"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>10007</th>\\n\",\n       \"      <th>1001</th>\\n\",\n       \"      <th>10013</th>\\n\",\n       \"      <th>10038</th>\\n\",\n       \"      <th>10046</th>\\n\",\n       \"      <th>10049</th>\\n\",\n       \"      <th>10051</th>\\n\",\n       \"      <th>10057</th>\\n\",\n       \"      <th>10058</th>\\n\",\n       \"      <th>10059</th>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <th>9943</th>\\n\",\n       \"      <th>9961</th>\\n\",\n       \"      <th>998</th>\\n\",\n       \"      <th>9988</th>\\n\",\n       \"      <th>full_id</th>\\n\",\n       \"      <th>pert_id</th>\\n\",\n       \"      <th>cell_iname</th>\\n\",\n       \"      <th>SMILES</th>\\n\",\n       \"      <th>inchi_key</th>\\n\",\n       \"      <th>compound_aliases</th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>354</th>\\n\",\n       \"      <td>0.266000</td>\\n\",\n       \"      <td>-0.479600</td>\\n\",\n       \"      <td>-0.198300</td>\\n\",\n       \"      <td>-0.431300</td>\\n\",\n       \"      <td>-1.229350</td>\\n\",\n       \"      <td>3.355450</td>\\n\",\n       \"      <td>-1.548500</td>\\n\",\n       \"      <td>0.493650</td>\\n\",\n       \"      <td>-0.611050</td>\\n\",\n       \"      <td>0.528150</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.097050</td>\\n\",\n       \"      <td>-2.536850</td>\\n\",\n       \"      <td>-1.160750</td>\\n\",\n       \"      <td>0.172700</td>\\n\",\n       \"      <td>PAC001_U2OS_6H:BRD-A61304759-001-01-0:20</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>U2OS</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      
<td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>355</th>\\n\",\n       \"      <td>0.215200</td>\\n\",\n       \"      <td>0.210250</td>\\n\",\n       \"      <td>1.565950</td>\\n\",\n       \"      <td>-0.410100</td>\\n\",\n       \"      <td>-0.532450</td>\\n\",\n       \"      <td>3.426700</td>\\n\",\n       \"      <td>-0.762800</td>\\n\",\n       \"      <td>1.400200</td>\\n\",\n       \"      <td>0.557150</td>\\n\",\n       \"      <td>-0.067150</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>1.420500</td>\\n\",\n       \"      <td>-1.600900</td>\\n\",\n       \"      <td>-0.944150</td>\\n\",\n       \"      <td>0.896000</td>\\n\",\n       \"      <td>PAC002_U2OS_6H:BRD-A61304759-001-01-0:20</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>U2OS</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>356</th>\\n\",\n       \"      <td>-0.030250</td>\\n\",\n       \"      <td>0.215500</td>\\n\",\n       \"      <td>1.020750</td>\\n\",\n       \"      <td>0.212600</td>\\n\",\n       \"      <td>0.240450</td>\\n\",\n       \"      <td>2.905800</td>\\n\",\n       \"      <td>-1.067600</td>\\n\",\n       \"      <td>0.687250</td>\\n\",\n       \"      <td>0.232700</td>\\n\",\n       \"      <td>0.448650</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.074000</td>\\n\",\n       \"      <td>-1.339700</td>\\n\",\n       \"      <td>-0.656750</td>\\n\",\n       \"      <td>-0.375600</td>\\n\",\n       \"      <td>PAC003_U2OS_6H:BRD-A61304759-001-01-0:20</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>U2OS</td>\\n\",\n       \"      
<td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>357</th>\\n\",\n       \"      <td>0.131900</td>\\n\",\n       \"      <td>1.166850</td>\\n\",\n       \"      <td>1.117750</td>\\n\",\n       \"      <td>-0.466700</td>\\n\",\n       \"      <td>-0.843450</td>\\n\",\n       \"      <td>4.599200</td>\\n\",\n       \"      <td>-2.009750</td>\\n\",\n       \"      <td>-0.575650</td>\\n\",\n       \"      <td>0.672100</td>\\n\",\n       \"      <td>0.367900</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.687200</td>\\n\",\n       \"      <td>-1.443900</td>\\n\",\n       \"      <td>-1.615750</td>\\n\",\n       \"      <td>-0.186750</td>\\n\",\n       \"      <td>PAC004_U2OS_6H:BRD-A61304759-001-01-0:20</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>U2OS</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>358</th>\\n\",\n       \"      <td>-0.034000</td>\\n\",\n       \"      <td>0.874000</td>\\n\",\n       \"      <td>0.655750</td>\\n\",\n       \"      <td>-0.269150</td>\\n\",\n       \"      <td>0.966950</td>\\n\",\n       \"      <td>3.022300</td>\\n\",\n       \"      <td>-0.939900</td>\\n\",\n       \"      <td>0.985750</td>\\n\",\n       \"      <td>0.717850</td>\\n\",\n       \"      <td>-0.352600</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-1.305050</td>\\n\",\n       \"      <td>-1.605700</td>\\n\",\n       \"      <td>-2.667600</td>\\n\",\n       \"      <td>-0.660750</td>\\n\",\n       \"      <td>PAC005_U2OS_6H:BRD-A61304759-001-01-0:20</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      
<td>U2OS</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>423912</th>\\n\",\n       \"      <td>-0.517706</td>\\n\",\n       \"      <td>-1.212317</td>\\n\",\n       \"      <td>-0.322461</td>\\n\",\n       \"      <td>0.124004</td>\\n\",\n       \"      <td>-0.497949</td>\\n\",\n       \"      <td>0.409649</td>\\n\",\n       \"      <td>1.260086</td>\\n\",\n       \"      <td>0.022732</td>\\n\",\n       \"      <td>-0.279040</td>\\n\",\n       \"      <td>0.357222</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>1.349653</td>\\n\",\n       \"      <td>1.112821</td>\\n\",\n       \"      <td>-0.600327</td>\\n\",\n       \"      <td>-0.213830</td>\\n\",\n       \"      <td>PAC068_U2OS_6H:BRD-K77801455-001-05-3:10</td>\\n\",\n       \"      <td>BRD-K77801455</td>\\n\",\n       \"      <td>U2OS</td>\\n\",\n       \"      <td>CCC(=O)NCC(=O)OCC(=O)c1ccccc1</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>Propionylamino-acetic 
acid 2-oxo-2-phenyl-ethy...</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>423922</th>\\n\",\n       \"      <td>0.404553</td>\\n\",\n       \"      <td>-0.282865</td>\\n\",\n       \"      <td>0.209292</td>\\n\",\n       \"      <td>0.007066</td>\\n\",\n       \"      <td>0.490206</td>\\n\",\n       \"      <td>-0.355228</td>\\n\",\n       \"      <td>-0.748379</td>\\n\",\n       \"      <td>0.106611</td>\\n\",\n       \"      <td>-0.473985</td>\\n\",\n       \"      <td>-0.632465</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.196296</td>\\n\",\n       \"      <td>-0.104630</td>\\n\",\n       \"      <td>0.070716</td>\\n\",\n       \"      <td>-0.157745</td>\\n\",\n       \"      <td>PAC068_U2OS_6H:BRD-K81266242-001-05-1:10</td>\\n\",\n       \"      <td>BRD-K81266242</td>\\n\",\n       \"      <td>U2OS</td>\\n\",\n       \"      <td>O=C(CCc1ccccc1)N1CCN(CC1)c1ccccn1</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>1-(3-phenylpropanoyl)-4-(2-pyridinyl)piperazine</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>423925</th>\\n\",\n       \"      <td>-0.451088</td>\\n\",\n       \"      <td>0.057352</td>\\n\",\n       \"      <td>-0.274881</td>\\n\",\n       \"      <td>-0.319326</td>\\n\",\n       \"      <td>-0.444744</td>\\n\",\n       \"      <td>0.394264</td>\\n\",\n       \"      <td>0.334506</td>\\n\",\n       \"      <td>-0.001001</td>\\n\",\n       \"      <td>-0.517647</td>\\n\",\n       \"      <td>-0.022595</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.168200</td>\\n\",\n       \"      <td>1.282395</td>\\n\",\n       \"      <td>0.302800</td>\\n\",\n       \"      <td>-0.405775</td>\\n\",\n       \"      <td>PAC068_U2OS_6H:BRD-K83028309-001-05-3:10</td>\\n\",\n       \"      <td>BRD-K83028309</td>\\n\",\n       \"      <td>U2OS</td>\\n\",\n       \"      <td>Nc1cc(nc2cc(nn12)-c1ccccc1)-c1ccc(Br)cc1</td>\\n\",\n       \"      
<td>NaN</td>\\n\",\n       \"      <td>5-(4-Bromo-phenyl)-2-phenyl-pyrazolo[1,5-a]pyr...</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>423939</th>\\n\",\n       \"      <td>-1.002630</td>\\n\",\n       \"      <td>0.584975</td>\\n\",\n       \"      <td>0.709276</td>\\n\",\n       \"      <td>0.607589</td>\\n\",\n       \"      <td>0.158020</td>\\n\",\n       \"      <td>-1.167863</td>\\n\",\n       \"      <td>0.213422</td>\\n\",\n       \"      <td>-0.155531</td>\\n\",\n       \"      <td>0.696174</td>\\n\",\n       \"      <td>-0.907303</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.025072</td>\\n\",\n       \"      <td>0.210780</td>\\n\",\n       \"      <td>-0.037277</td>\\n\",\n       \"      <td>-0.869588</td>\\n\",\n       \"      <td>PAC068_U2OS_6H:BRD-K87879912-001-05-4:10</td>\\n\",\n       \"      <td>BRD-K87879912</td>\\n\",\n       \"      <td>U2OS</td>\\n\",\n       \"      <td>O=S(=O)(CCOc1ccccc1)c1nc2ccccc2[nH]1</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>423955</th>\\n\",\n       \"      <td>-0.570759</td>\\n\",\n       \"      <td>-0.515178</td>\\n\",\n       \"      <td>0.889132</td>\\n\",\n       \"      <td>-2.230757</td>\\n\",\n       \"      <td>2.850153</td>\\n\",\n       \"      <td>-1.396367</td>\\n\",\n       \"      <td>-0.140172</td>\\n\",\n       \"      <td>-0.561217</td>\\n\",\n       \"      <td>0.466810</td>\\n\",\n       \"      <td>-1.153605</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.611325</td>\\n\",\n       \"      <td>1.261738</td>\\n\",\n       \"      <td>0.581113</td>\\n\",\n       \"      <td>0.099712</td>\\n\",\n       \"      <td>PAC068_U2OS_6H:BRD-K94948151-001-06-6:10</td>\\n\",\n       \"      <td>BRD-K94948151</td>\\n\",\n       \"      <td>U2OS</td>\\n\",\n       \"      
<td>COc1cc(C[C@H](C)[C@H](C)Cc2ccc3OCOc3c2)ccc1O</td>\\n\",\n       \"      <td>QDDILOVMGWUNGD-UONOGXRCSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"<p>22631 rows × 984 columns</p>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"           10007      1001     10013     10038     10046     10049     10051  \\\\\\n\",\n       \"354     0.266000 -0.479600 -0.198300 -0.431300 -1.229350  3.355450 -1.548500   \\n\",\n       \"355     0.215200  0.210250  1.565950 -0.410100 -0.532450  3.426700 -0.762800   \\n\",\n       \"356    -0.030250  0.215500  1.020750  0.212600  0.240450  2.905800 -1.067600   \\n\",\n       \"357     0.131900  1.166850  1.117750 -0.466700 -0.843450  4.599200 -2.009750   \\n\",\n       \"358    -0.034000  0.874000  0.655750 -0.269150  0.966950  3.022300 -0.939900   \\n\",\n       \"...          ...       ...       ...       ...       ...       ...       ...   \\n\",\n       \"423912 -0.517706 -1.212317 -0.322461  0.124004 -0.497949  0.409649  1.260086   \\n\",\n       \"423922  0.404553 -0.282865  0.209292  0.007066  0.490206 -0.355228 -0.748379   \\n\",\n       \"423925 -0.451088  0.057352 -0.274881 -0.319326 -0.444744  0.394264  0.334506   \\n\",\n       \"423939 -1.002630  0.584975  0.709276  0.607589  0.158020 -1.167863  0.213422   \\n\",\n       \"423955 -0.570759 -0.515178  0.889132 -2.230757  2.850153 -1.396367 -0.140172   \\n\",\n       \"\\n\",\n       \"           10057     10058     10059  ...      9943      9961       998  \\\\\\n\",\n       \"354     0.493650 -0.611050  0.528150  ...  0.097050 -2.536850 -1.160750   \\n\",\n       \"355     1.400200  0.557150 -0.067150  ...  1.420500 -1.600900 -0.944150   \\n\",\n       \"356     0.687250  0.232700  0.448650  ... -0.074000 -1.339700 -0.656750   \\n\",\n       \"357    -0.575650  0.672100  0.367900  ...  
0.687200 -1.443900 -1.615750   \\n\",\n       \"358     0.985750  0.717850 -0.352600  ... -1.305050 -1.605700 -2.667600   \\n\",\n       \"...          ...       ...       ...  ...       ...       ...       ...   \\n\",\n       \"423912  0.022732 -0.279040  0.357222  ...  1.349653  1.112821 -0.600327   \\n\",\n       \"423922  0.106611 -0.473985 -0.632465  ...  0.196296 -0.104630  0.070716   \\n\",\n       \"423925 -0.001001 -0.517647 -0.022595  ...  0.168200  1.282395  0.302800   \\n\",\n       \"423939 -0.155531  0.696174 -0.907303  ...  0.025072  0.210780 -0.037277   \\n\",\n       \"423955 -0.561217  0.466810 -1.153605  ...  0.611325  1.261738  0.581113   \\n\",\n       \"\\n\",\n       \"            9988                                   full_id        pert_id  \\\\\\n\",\n       \"354     0.172700  PAC001_U2OS_6H:BRD-A61304759-001-01-0:20  BRD-A61304759   \\n\",\n       \"355     0.896000  PAC002_U2OS_6H:BRD-A61304759-001-01-0:20  BRD-A61304759   \\n\",\n       \"356    -0.375600  PAC003_U2OS_6H:BRD-A61304759-001-01-0:20  BRD-A61304759   \\n\",\n       \"357    -0.186750  PAC004_U2OS_6H:BRD-A61304759-001-01-0:20  BRD-A61304759   \\n\",\n       \"358    -0.660750  PAC005_U2OS_6H:BRD-A61304759-001-01-0:20  BRD-A61304759   \\n\",\n       \"...          ...                                       ...            ...   \\n\",\n       \"423912 -0.213830  PAC068_U2OS_6H:BRD-K77801455-001-05-3:10  BRD-K77801455   \\n\",\n       \"423922 -0.157745  PAC068_U2OS_6H:BRD-K81266242-001-05-1:10  BRD-K81266242   \\n\",\n       \"423925 -0.405775  PAC068_U2OS_6H:BRD-K83028309-001-05-3:10  BRD-K83028309   \\n\",\n       \"423939 -0.869588  PAC068_U2OS_6H:BRD-K87879912-001-05-4:10  BRD-K87879912   \\n\",\n       \"423955  0.099712  PAC068_U2OS_6H:BRD-K94948151-001-06-6:10  BRD-K94948151   \\n\",\n       \"\\n\",\n       \"        cell_iname                                             SMILES  \\\\\\n\",\n       \"354           U2OS  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC... 
  \\n\",\n       \"355           U2OS  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"356           U2OS  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"357           U2OS  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"358           U2OS  COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"...            ...                                                ...   \\n\",\n       \"423912        U2OS                      CCC(=O)NCC(=O)OCC(=O)c1ccccc1   \\n\",\n       \"423922        U2OS                  O=C(CCc1ccccc1)N1CCN(CC1)c1ccccn1   \\n\",\n       \"423925        U2OS           Nc1cc(nc2cc(nn12)-c1ccccc1)-c1ccc(Br)cc1   \\n\",\n       \"423939        U2OS               O=S(=O)(CCOc1ccccc1)c1nc2ccccc2[nH]1   \\n\",\n       \"423955        U2OS       COc1cc(C[C@H](C)[C@H](C)Cc2ccc3OCOc3c2)ccc1O   \\n\",\n       \"\\n\",\n       \"                          inchi_key  \\\\\\n\",\n       \"354     AYUNIORJHRXIBJ-ZGQRYRSUSA-N   \\n\",\n       \"355     AYUNIORJHRXIBJ-ZGQRYRSUSA-N   \\n\",\n       \"356     AYUNIORJHRXIBJ-ZGQRYRSUSA-N   \\n\",\n       \"357     AYUNIORJHRXIBJ-ZGQRYRSUSA-N   \\n\",\n       \"358     AYUNIORJHRXIBJ-ZGQRYRSUSA-N   \\n\",\n       \"...                             ...   
\\n\",\n       \"423912                          NaN   \\n\",\n       \"423922                          NaN   \\n\",\n       \"423925                          NaN   \\n\",\n       \"423939                          NaN   \\n\",\n       \"423955  QDDILOVMGWUNGD-UONOGXRCSA-N   \\n\",\n       \"\\n\",\n       \"                                         compound_aliases  \\n\",\n       \"354                                                   NaN  \\n\",\n       \"355                                                   NaN  \\n\",\n       \"356                                                   NaN  \\n\",\n       \"357                                                   NaN  \\n\",\n       \"358                                                   NaN  \\n\",\n       \"...                                                   ...  \\n\",\n       \"423912  Propionylamino-acetic acid 2-oxo-2-phenyl-ethy...  \\n\",\n       \"423922    1-(3-phenylpropanoyl)-4-(2-pyridinyl)piperazine  \\n\",\n       \"423925  5-(4-Bromo-phenyl)-2-phenyl-pyrazolo[1,5-a]pyr...  
\\n\",\n       \"423939                                                NaN  \\n\",\n       \"423955                                                NaN  \\n\",\n       \"\\n\",\n       \"[22631 rows x 984 columns]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    }\n   ],\n   \"source\": [\n    \"display(df_per_cell_line[list(df_per_cell_line.keys())[0]])\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 33,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/tmp/ipykernel_69015/2571295951.py:7: SettingWithCopyWarning: \\n\",\n      \"A value is trying to be set on a copy of a slice from a DataFrame.\\n\",\n      \"Try using .loc[row_indexer,col_indexer] = value instead\\n\",\n      \"\\n\",\n      \"See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy\\n\",\n      \"  this_df[\\\"std\\\"] = std\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>full_id</th>\\n\",\n       \"      <th>pert_id</th>\\n\",\n       \"      <th>cell_iname</th>\\n\",\n       \"      <th>SMILES</th>\\n\",\n       \"      <th>inchi_key</th>\\n\",\n       \"      <th>compound_aliases</th>\\n\",\n 
      \"      <th>10007</th>\\n\",\n       \"      <th>1001</th>\\n\",\n       \"      <th>10013</th>\\n\",\n       \"      <th>10038</th>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <th>9918</th>\\n\",\n       \"      <th>9924</th>\\n\",\n       \"      <th>9926</th>\\n\",\n       \"      <th>9928</th>\\n\",\n       \"      <th>993</th>\\n\",\n       \"      <th>994</th>\\n\",\n       \"      <th>9943</th>\\n\",\n       \"      <th>9961</th>\\n\",\n       \"      <th>998</th>\\n\",\n       \"      <th>9988</th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>415582</th>\\n\",\n       \"      <td>PAC017_U2OS_6H:BRD-K35635568-001-01-9:9.97358</td>\\n\",\n       \"      <td>BRD-K35635568</td>\\n\",\n       \"      <td>U2OS</td>\\n\",\n       \"      <td>C[C@@H](CO)N1C[C@@H](C)[C@@H](CN(C)C)OCCCC[C@@...</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>0.869478</td>\\n\",\n       \"      <td>0.042114</td>\\n\",\n       \"      <td>0.090729</td>\\n\",\n       \"      <td>-0.256483</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.760171</td>\\n\",\n       \"      <td>0.753874</td>\\n\",\n       \"      <td>0.999389</td>\\n\",\n       \"      <td>0.147320</td>\\n\",\n       \"      <td>0.228526</td>\\n\",\n       \"      <td>0.435754</td>\\n\",\n       \"      <td>0.699075</td>\\n\",\n       \"      <td>0.110136</td>\\n\",\n       \"      <td>0.163373</td>\\n\",\n       \"      <td>-0.796202</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>414494</th>\\n\",\n       \"      <td>PAC011_U2OS_6H:BRD-K14695950-001-01-4:10.0184</td>\\n\",\n       \"      <td>BRD-K14695950</td>\\n\",\n       \"      <td>U2OS</td>\\n\",\n       \"      <td>COc1ccc(cc1)S(=O)(=O)Nc1cccc2c1O[C@H](CN(C)CC1...</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      
<td>0.031156</td>\\n\",\n       \"      <td>-0.426591</td>\\n\",\n       \"      <td>-0.535521</td>\\n\",\n       \"      <td>0.041022</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-1.087797</td>\\n\",\n       \"      <td>0.426523</td>\\n\",\n       \"      <td>0.014110</td>\\n\",\n       \"      <td>-0.123189</td>\\n\",\n       \"      <td>-0.190503</td>\\n\",\n       \"      <td>0.666766</td>\\n\",\n       \"      <td>0.426695</td>\\n\",\n       \"      <td>0.068460</td>\\n\",\n       \"      <td>0.917429</td>\\n\",\n       \"      <td>-0.304055</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>335645</th>\\n\",\n       \"      <td>PAC046_U2OS_6H:BRD-K10821756-001-01-6:9.96131</td>\\n\",\n       \"      <td>BRD-K10821756</td>\\n\",\n       \"      <td>U2OS</td>\\n\",\n       \"      <td>CN(C)C(=O)C[C@H]1C[C@@H]2[C@@H](Oc3ccc(NC(=O)c...</td>\\n\",\n       \"      <td>IUZUJTQXZTXMFN-ZWVMGRIOSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-2.114700</td>\\n\",\n       \"      <td>-0.179000</td>\\n\",\n       \"      <td>0.040933</td>\\n\",\n       \"      <td>0.004367</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.719833</td>\\n\",\n       \"      <td>-0.137200</td>\\n\",\n       \"      <td>-0.274533</td>\\n\",\n       \"      <td>-0.529267</td>\\n\",\n       \"      <td>0.290033</td>\\n\",\n       \"      <td>0.658333</td>\\n\",\n       \"      <td>-1.030267</td>\\n\",\n       \"      <td>0.196100</td>\\n\",\n       \"      <td>0.126067</td>\\n\",\n       \"      <td>-0.493633</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>415401</th>\\n\",\n       \"      <td>PAC017_U2OS_6H:BRD-K23867157-001-01-9:10.0446</td>\\n\",\n       \"      <td>BRD-K23867157</td>\\n\",\n       \"      <td>U2OS</td>\\n\",\n       \"      <td>C[C@@H](CO)N1C[C@H](C)[C@@H](CN(C)Cc2ccc3OCOc3...</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      
<td>NaN</td>\\n\",\n       \"      <td>0.032800</td>\\n\",\n       \"      <td>0.103533</td>\\n\",\n       \"      <td>0.159733</td>\\n\",\n       \"      <td>0.178633</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.552733</td>\\n\",\n       \"      <td>-0.751167</td>\\n\",\n       \"      <td>-0.236200</td>\\n\",\n       \"      <td>-0.288467</td>\\n\",\n       \"      <td>1.186633</td>\\n\",\n       \"      <td>-0.096867</td>\\n\",\n       \"      <td>0.008300</td>\\n\",\n       \"      <td>-0.226933</td>\\n\",\n       \"      <td>0.121933</td>\\n\",\n       \"      <td>0.289800</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>418195</th>\\n\",\n       \"      <td>PAC030_U2OS_6H:BRD-K87193132-001-01-3:10.0847</td>\\n\",\n       \"      <td>BRD-K87193132</td>\\n\",\n       \"      <td>U2OS</td>\\n\",\n       \"      <td>CO[C@@H]1CN(C)C(=O)c2cc(NC(=O)NC3CCCCC3)ccc2OC...</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-0.628900</td>\\n\",\n       \"      <td>-0.142933</td>\\n\",\n       \"      <td>-0.005667</td>\\n\",\n       \"      <td>-0.565000</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.460967</td>\\n\",\n       \"      <td>0.200767</td>\\n\",\n       \"      <td>-0.412833</td>\\n\",\n       \"      <td>0.257400</td>\\n\",\n       \"      <td>0.135633</td>\\n\",\n       \"      <td>0.140200</td>\\n\",\n       \"      <td>-0.176033</td>\\n\",\n       \"      <td>0.058967</td>\\n\",\n       \"      <td>0.127333</td>\\n\",\n       \"      <td>-0.348967</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"    
  <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>28269</th>\\n\",\n       \"      <td>PAC011_U2OS_6H:BRD-A84481105-003-18-0:20</td>\\n\",\n       \"      <td>BRD-A84481105</td>\\n\",\n       \"      <td>U2OS</td>\\n\",\n       \"      <td>CSc1ccc2Sc3ccccc3N(CCC3CCCCN3C)c2c1</td>\\n\",\n       \"      <td>KLBQZWRITKRQQV-UHFFFAOYSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-2.989600</td>\\n\",\n       \"      <td>10.000000</td>\\n\",\n       \"      <td>7.966500</td>\\n\",\n       \"      <td>10.000000</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>3.323200</td>\\n\",\n       \"      <td>10.000000</td>\\n\",\n       \"      <td>-1.078400</td>\\n\",\n       \"      <td>-9.840500</td>\\n\",\n       \"      <td>-4.797400</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>-4.752700</td>\\n\",\n       \"      <td>-8.170600</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>-1.186500</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>323070</th>\\n\",\n       \"      <td>PAC022_U2OS_6H:BRD-K87004592-001-02-3:10.0222</td>\\n\",\n       \"      <td>BRD-K87004592</td>\\n\",\n       \"      <td>U2OS</td>\\n\",\n       \"      <td>C[C@@H](CO)N1C[C@H](C)[C@H](CN(C)S(=O)(=O)c2cc...</td>\\n\",\n       \"      <td>YDFSFENEUSKTIZ-WFXMLNOXSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>10.000000</td>\\n\",\n       \"      <td>-1.131200</td>\\n\",\n       \"      <td>10.000000</td>\\n\",\n  
     \"      <td>-10.000000</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>-4.055600</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>-7.773200</td>\\n\",\n       \"      <td>1.938100</td>\\n\",\n       \"      <td>1.993000</td>\\n\",\n       \"      <td>4.441500</td>\\n\",\n       \"      <td>5.403000</td>\\n\",\n       \"      <td>1.289000</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>322698</th>\\n\",\n       \"      <td>PAC022_U2OS_6H:BRD-K50510764-001-01-6:10.0958</td>\\n\",\n       \"      <td>BRD-K50510764</td>\\n\",\n       \"      <td>U2OS</td>\\n\",\n       \"      <td>C[C@@H](CO)N1C[C@H](C)[C@H](CN(C)Cc2cccc(c2)C(...</td>\\n\",\n       \"      <td>KWAFQNIHTBJTFB-SPEDKVCISA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>10.000000</td>\\n\",\n       \"      <td>0.648300</td>\\n\",\n       \"      <td>10.000000</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>-8.683200</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>-8.787700</td>\\n\",\n       \"      <td>-2.034900</td>\\n\",\n       \"      <td>6.682700</td>\\n\",\n       \"      <td>1.310300</td>\\n\",\n       \"      <td>6.207100</td>\\n\",\n       \"      <td>0.432300</td>\\n\",\n       \"      <td>3.209100</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>323054</th>\\n\",\n       \"      <td>PAC022_U2OS_6H:BRD-K84992272-001-02-9:9.9872</td>\\n\",\n       \"      <td>BRD-K84992272</td>\\n\",\n       \"      <td>U2OS</td>\\n\",\n       \"      <td>C[C@@H](CO)N1C[C@@H](C)[C@H](CN(C)S(=O)(=O)c2c...</td>\\n\",\n       \"      <td>BKQPNZMATHQHFX-VHSZZVNMSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>10.000000</td>\\n\",\n       \"      
<td>1.527500</td>\\n\",\n       \"      <td>10.000000</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>-7.884600</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>-7.369500</td>\\n\",\n       \"      <td>-4.354500</td>\\n\",\n       \"      <td>2.639300</td>\\n\",\n       \"      <td>5.096700</td>\\n\",\n       \"      <td>4.949600</td>\\n\",\n       \"      <td>-3.694200</td>\\n\",\n       \"      <td>-3.723500</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>322740</th>\\n\",\n       \"      <td>PAC022_U2OS_6H:BRD-K56987347-001-01-6:10.0716</td>\\n\",\n       \"      <td>BRD-K56987347</td>\\n\",\n       \"      <td>U2OS</td>\\n\",\n       \"      <td>C[C@@H](CO)N1C[C@H](C)[C@H](CN(C)Cc2ccc(C)cc2)...</td>\\n\",\n       \"      <td>WGPDLCVCGOLGPE-JTAQYXEDSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>10.000000</td>\\n\",\n       \"      <td>4.219400</td>\\n\",\n       \"      <td>10.000000</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>-6.938900</td>\\n\",\n       \"      <td>-9.631900</td>\\n\",\n       \"      <td>-2.627000</td>\\n\",\n       \"      <td>-4.594600</td>\\n\",\n       \"      <td>1.311600</td>\\n\",\n       \"      <td>3.464000</td>\\n\",\n       \"      <td>5.099600</td>\\n\",\n       \"      <td>-8.922200</td>\\n\",\n       \"      <td>-0.006800</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"<p>16058 rows × 984 columns</p>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"                                              full_id        pert_id  \\\\\\n\",\n       \"415582  PAC017_U2OS_6H:BRD-K35635568-001-01-9:9.97358  BRD-K35635568   \\n\",\n       \"414494  
PAC011_U2OS_6H:BRD-K14695950-001-01-4:10.0184  BRD-K14695950   \\n\",\n       \"335645  PAC046_U2OS_6H:BRD-K10821756-001-01-6:9.96131  BRD-K10821756   \\n\",\n       \"415401  PAC017_U2OS_6H:BRD-K23867157-001-01-9:10.0446  BRD-K23867157   \\n\",\n       \"418195  PAC030_U2OS_6H:BRD-K87193132-001-01-3:10.0847  BRD-K87193132   \\n\",\n       \"...                                               ...            ...   \\n\",\n       \"28269        PAC011_U2OS_6H:BRD-A84481105-003-18-0:20  BRD-A84481105   \\n\",\n       \"323070  PAC022_U2OS_6H:BRD-K87004592-001-02-3:10.0222  BRD-K87004592   \\n\",\n       \"322698  PAC022_U2OS_6H:BRD-K50510764-001-01-6:10.0958  BRD-K50510764   \\n\",\n       \"323054   PAC022_U2OS_6H:BRD-K84992272-001-02-9:9.9872  BRD-K84992272   \\n\",\n       \"322740  PAC022_U2OS_6H:BRD-K56987347-001-01-6:10.0716  BRD-K56987347   \\n\",\n       \"\\n\",\n       \"       cell_iname                                             SMILES  \\\\\\n\",\n       \"415582       U2OS  C[C@@H](CO)N1C[C@@H](C)[C@@H](CN(C)C)OCCCC[C@@...   \\n\",\n       \"414494       U2OS  COc1ccc(cc1)S(=O)(=O)Nc1cccc2c1O[C@H](CN(C)CC1...   \\n\",\n       \"335645       U2OS  CN(C)C(=O)C[C@H]1C[C@@H]2[C@@H](Oc3ccc(NC(=O)c...   \\n\",\n       \"415401       U2OS  C[C@@H](CO)N1C[C@H](C)[C@@H](CN(C)Cc2ccc3OCOc3...   \\n\",\n       \"418195       U2OS  CO[C@@H]1CN(C)C(=O)c2cc(NC(=O)NC3CCCCC3)ccc2OC...   \\n\",\n       \"...           ...                                                ...   \\n\",\n       \"28269        U2OS                CSc1ccc2Sc3ccccc3N(CCC3CCCCN3C)c2c1   \\n\",\n       \"323070       U2OS  C[C@@H](CO)N1C[C@H](C)[C@H](CN(C)S(=O)(=O)c2cc...   \\n\",\n       \"322698       U2OS  C[C@@H](CO)N1C[C@H](C)[C@H](CN(C)Cc2cccc(c2)C(...   \\n\",\n       \"323054       U2OS  C[C@@H](CO)N1C[C@@H](C)[C@H](CN(C)S(=O)(=O)c2c...   \\n\",\n       \"322740       U2OS  C[C@@H](CO)N1C[C@H](C)[C@H](CN(C)Cc2ccc(C)cc2)...   
\\n\",\n       \"\\n\",\n       \"                          inchi_key compound_aliases      10007       1001  \\\\\\n\",\n       \"415582                          NaN              NaN   0.869478   0.042114   \\n\",\n       \"414494                          NaN              NaN   0.031156  -0.426591   \\n\",\n       \"335645  IUZUJTQXZTXMFN-ZWVMGRIOSA-N              NaN  -2.114700  -0.179000   \\n\",\n       \"415401                          NaN              NaN   0.032800   0.103533   \\n\",\n       \"418195                          NaN              NaN  -0.628900  -0.142933   \\n\",\n       \"...                             ...              ...        ...        ...   \\n\",\n       \"28269   KLBQZWRITKRQQV-UHFFFAOYSA-N              NaN  -2.989600  10.000000   \\n\",\n       \"323070  YDFSFENEUSKTIZ-WFXMLNOXSA-N              NaN  10.000000  -1.131200   \\n\",\n       \"322698  KWAFQNIHTBJTFB-SPEDKVCISA-N              NaN  10.000000   0.648300   \\n\",\n       \"323054  BKQPNZMATHQHFX-VHSZZVNMSA-N              NaN  10.000000   1.527500   \\n\",\n       \"322740  WGPDLCVCGOLGPE-JTAQYXEDSA-N              NaN  10.000000   4.219400   \\n\",\n       \"\\n\",\n       \"            10013      10038  ...       9918       9924       9926       9928  \\\\\\n\",\n       \"415582   0.090729  -0.256483  ...   0.760171   0.753874   0.999389   0.147320   \\n\",\n       \"414494  -0.535521   0.041022  ...  -1.087797   0.426523   0.014110  -0.123189   \\n\",\n       \"335645   0.040933   0.004367  ...   0.719833  -0.137200  -0.274533  -0.529267   \\n\",\n       \"415401   0.159733   0.178633  ...  -0.552733  -0.751167  -0.236200  -0.288467   \\n\",\n       \"418195  -0.005667  -0.565000  ...   0.460967   0.200767  -0.412833   0.257400   \\n\",\n       \"...           ...        ...  ...        ...        ...        ...        ...   \\n\",\n       \"28269    7.966500  10.000000  ...   3.323200  10.000000  -1.078400  -9.840500   \\n\",\n       \"323070  10.000000 -10.000000  ... 
-10.000000  -4.055600 -10.000000 -10.000000   \\n\",\n       \"322698  10.000000 -10.000000  ... -10.000000  -8.683200 -10.000000  -8.787700   \\n\",\n       \"323054  10.000000 -10.000000  ... -10.000000  -7.884600 -10.000000  -7.369500   \\n\",\n       \"322740  10.000000 -10.000000  ... -10.000000  -6.938900  -9.631900  -2.627000   \\n\",\n       \"\\n\",\n       \"             993        994      9943      9961        998      9988  \\n\",\n       \"415582  0.228526   0.435754  0.699075  0.110136   0.163373 -0.796202  \\n\",\n       \"414494 -0.190503   0.666766  0.426695  0.068460   0.917429 -0.304055  \\n\",\n       \"335645  0.290033   0.658333 -1.030267  0.196100   0.126067 -0.493633  \\n\",\n       \"415401  1.186633  -0.096867  0.008300 -0.226933   0.121933  0.289800  \\n\",\n       \"418195  0.135633   0.140200 -0.176033  0.058967   0.127333 -0.348967  \\n\",\n       \"...          ...        ...       ...       ...        ...       ...  \\n\",\n       \"28269  -4.797400 -10.000000 -4.752700 -8.170600 -10.000000 -1.186500  \\n\",\n       \"323070 -7.773200   1.938100  1.993000  4.441500   5.403000  1.289000  \\n\",\n       \"322698 -2.034900   6.682700  1.310300  6.207100   0.432300  3.209100  \\n\",\n       \"323054 -4.354500   2.639300  5.096700  4.949600  -3.694200 -3.723500  \\n\",\n       \"322740 -4.594600   1.311600  3.464000  5.099600  -8.922200 -0.006800  \\n\",\n       \"\\n\",\n       \"[16058 rows x 984 columns]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/tmp/ipykernel_69015/2571295951.py:7: SettingWithCopyWarning: \\n\",\n      \"A value is trying to be set on a copy of a slice from a DataFrame.\\n\",\n      \"Try using .loc[row_indexer,col_indexer] = value instead\\n\",\n      \"\\n\",\n      \"See the caveats in the documentation: 
https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy\\n\",\n      \"  this_df[\\\"std\\\"] = std\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>full_id</th>\\n\",\n       \"      <th>pert_id</th>\\n\",\n       \"      <th>cell_iname</th>\\n\",\n       \"      <th>SMILES</th>\\n\",\n       \"      <th>inchi_key</th>\\n\",\n       \"      <th>compound_aliases</th>\\n\",\n       \"      <th>10007</th>\\n\",\n       \"      <th>1001</th>\\n\",\n       \"      <th>10013</th>\\n\",\n       \"      <th>10038</th>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <th>9918</th>\\n\",\n       \"      <th>9924</th>\\n\",\n       \"      <th>9926</th>\\n\",\n       \"      <th>9928</th>\\n\",\n       \"      <th>993</th>\\n\",\n       \"      <th>994</th>\\n\",\n       \"      <th>9943</th>\\n\",\n       \"      <th>9961</th>\\n\",\n       \"      <th>998</th>\\n\",\n       \"      <th>9988</th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>174213</th>\\n\",\n       \"      <td>CPC006_HA1E_6H:BRD-K56301217-001-01-7:11.1</td>\\n\",\n       \"      <td>BRD-K56301217</td>\\n\",\n       \"      <td>HA1E</td>\\n\",\n       \"      
<td>CN(C)CC[C@H](CSc1ccccc1)Nc1ccc(cc1[N+]([O-])=O...</td>\\n\",\n       \"      <td>HPLNQCPCUACXLM-PGUFJCEWSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>0.231973</td>\\n\",\n       \"      <td>0.018931</td>\\n\",\n       \"      <td>0.079527</td>\\n\",\n       \"      <td>0.046269</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.035346</td>\\n\",\n       \"      <td>0.290603</td>\\n\",\n       \"      <td>-0.186647</td>\\n\",\n       \"      <td>0.199493</td>\\n\",\n       \"      <td>-0.451440</td>\\n\",\n       \"      <td>0.304840</td>\\n\",\n       \"      <td>0.394064</td>\\n\",\n       \"      <td>0.089000</td>\\n\",\n       \"      <td>-0.363613</td>\\n\",\n       \"      <td>-0.475398</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>156163</th>\\n\",\n       \"      <td>CPC006_HA1E_6H:BRD-A81936046-323-01-6:10</td>\\n\",\n       \"      <td>BRD-A81936046</td>\\n\",\n       \"      <td>HA1E</td>\\n\",\n       \"      <td>CO[C@H]1C2OP(O)(=O)OC[C@H]2O[C@H]1n1c(Sc2ccc(C...</td>\\n\",\n       \"      <td>BCGHHRAUZWOTNH-ZIWBQIBKSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-0.631552</td>\\n\",\n       \"      <td>2.103255</td>\\n\",\n       \"      <td>-0.396988</td>\\n\",\n       \"      <td>0.586870</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.082199</td>\\n\",\n       \"      <td>-0.525986</td>\\n\",\n       \"      <td>0.117207</td>\\n\",\n       \"      <td>-0.017121</td>\\n\",\n       \"      <td>0.819982</td>\\n\",\n       \"      <td>0.131787</td>\\n\",\n       \"      <td>0.291434</td>\\n\",\n       \"      <td>-0.823752</td>\\n\",\n       \"      <td>0.418069</td>\\n\",\n       \"      <td>-1.155668</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>156475</th>\\n\",\n       \"      <td>CPC006_HA1E_6H:BRD-A94451536-001-04-4:44.4</td>\\n\",\n       \"      <td>BRD-A94451536</td>\\n\",\n       \"    
  <td>HA1E</td>\\n\",\n       \"      <td>CCCCCCCCCCCCCCC(F)C(=O)O</td>\\n\",\n       \"      <td>JGRIJJOLCNCSNX-UHFFFAOYSA-N</td>\\n\",\n       \"      <td>2-fluoropalmitic-acid</td>\\n\",\n       \"      <td>-0.312471</td>\\n\",\n       \"      <td>-0.254659</td>\\n\",\n       \"      <td>-0.079145</td>\\n\",\n       \"      <td>-0.042123</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.397542</td>\\n\",\n       \"      <td>-0.100607</td>\\n\",\n       \"      <td>0.077998</td>\\n\",\n       \"      <td>0.686762</td>\\n\",\n       \"      <td>-0.093666</td>\\n\",\n       \"      <td>0.265468</td>\\n\",\n       \"      <td>0.266711</td>\\n\",\n       \"      <td>0.274151</td>\\n\",\n       \"      <td>0.306019</td>\\n\",\n       \"      <td>0.270725</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>168383</th>\\n\",\n       \"      <td>CPC006_HA1E_6H:BRD-K35723520-001-01-0:10</td>\\n\",\n       \"      <td>BRD-K35723520</td>\\n\",\n       \"      <td>HA1E</td>\\n\",\n       \"      <td>C[As](C)SC[C@H](NC(=O)CC[C@H](N)C(O)=O)C(=O)NC...</td>\\n\",\n       \"      <td>JGDXFQORBMPJGR-YUMQZZPRSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-0.406390</td>\\n\",\n       \"      <td>0.300561</td>\\n\",\n       \"      <td>0.241660</td>\\n\",\n       \"      <td>-0.092437</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.569998</td>\\n\",\n       \"      <td>0.195336</td>\\n\",\n       \"      <td>0.179857</td>\\n\",\n       \"      <td>0.301028</td>\\n\",\n       \"      <td>-0.214072</td>\\n\",\n       \"      <td>0.311946</td>\\n\",\n       \"      <td>0.326444</td>\\n\",\n       \"      <td>0.317308</td>\\n\",\n       \"      <td>1.005617</td>\\n\",\n       \"      <td>0.829641</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>176004</th>\\n\",\n       \"      <td>CPC006_HA1E_6H:BRD-K61053657-001-01-4:80</td>\\n\",\n       \"      
<td>BRD-K61053657</td>\\n\",\n       \"      <td>HA1E</td>\\n\",\n       \"      <td>CN[C@@H](C)C(=O)N[C@@H](C(C)C)C(=O)N1CCC[C@H]1...</td>\\n\",\n       \"      <td>PUQVAXDZGUENDK-YSSFQJQWSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>0.533862</td>\\n\",\n       \"      <td>0.780318</td>\\n\",\n       \"      <td>-0.336582</td>\\n\",\n       \"      <td>0.811964</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.128239</td>\\n\",\n       \"      <td>-0.846647</td>\\n\",\n       \"      <td>0.508807</td>\\n\",\n       \"      <td>-0.112610</td>\\n\",\n       \"      <td>1.296148</td>\\n\",\n       \"      <td>0.478039</td>\\n\",\n       \"      <td>-0.046367</td>\\n\",\n       \"      <td>-0.197475</td>\\n\",\n       \"      <td>0.455606</td>\\n\",\n       \"      <td>-0.481770</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>46153</th>\\n\",\n       \"      <td>DOSVAL002_HA1E_24H:BRD-K88378636:20</td>\\n\",\n       \"      <td>BRD-K88378636</td>\\n\",\n       \"      <td>HA1E</td>\\n\",\n       \"      <td>C[C@@H]([C@H]1CC[C@H]2[C@@H]3C[C@H]4O[C@]45[C@...</td>\\n\",\n       \"      
<td>DBRXOUCRJQVYJQ-CKNDUULBSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>2.667600</td>\\n\",\n       \"      <td>-0.674500</td>\\n\",\n       \"      <td>1.740400</td>\\n\",\n       \"      <td>-8.846600</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>5.539100</td>\\n\",\n       \"      <td>-0.372200</td>\\n\",\n       \"      <td>1.302400</td>\\n\",\n       \"      <td>-0.775300</td>\\n\",\n       \"      <td>-9.430100</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>-0.153800</td>\\n\",\n       \"      <td>-5.618400</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>0.174400</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>366343</th>\\n\",\n       \"      <td>DOSVAL005_HA1E_24H:BRD-K70836798:5</td>\\n\",\n       \"      <td>BRD-K70836798</td>\\n\",\n       \"      <td>HA1E</td>\\n\",\n       \"      <td>COc1ccc(CN(C)C[C@@H]2OCCCC[C@H](C)Oc3ccc(NC(=O...</td>\\n\",\n       \"      <td>UVSCGJDGVAXROK-PNCWTNKOSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>1.471700</td>\\n\",\n       \"      <td>5.967600</td>\\n\",\n       \"      <td>-0.820200</td>\\n\",\n       \"      <td>9.857500</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.998600</td>\\n\",\n       \"      <td>1.783300</td>\\n\",\n       \"      <td>-5.215200</td>\\n\",\n       \"      <td>-2.273300</td>\\n\",\n       \"      <td>-4.165900</td>\\n\",\n       \"      <td>-0.574100</td>\\n\",\n       \"      <td>7.438000</td>\\n\",\n       \"      <td>-4.380200</td>\\n\",\n       \"      <td>1.201400</td>\\n\",\n       \"      <td>1.573700</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>368232</th>\\n\",\n       \"      <td>DOSVAL005_HA1E_24H:BRD-K41335306:20</td>\\n\",\n       \"      <td>BRD-K41335306</td>\\n\",\n       \"      <td>HA1E</td>\\n\",\n       \"      
<td>OC[C@H]1[C@@H]([C@H]2CN(Cc3cccnc3)CCCCN12)c1cc...</td>\\n\",\n       \"      <td>ORVIWTYNJHNTDZ-OBGOALODSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-4.555200</td>\\n\",\n       \"      <td>0.606200</td>\\n\",\n       \"      <td>-6.571000</td>\\n\",\n       \"      <td>-5.795200</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>4.605800</td>\\n\",\n       \"      <td>-1.555300</td>\\n\",\n       \"      <td>0.635500</td>\\n\",\n       \"      <td>-2.601500</td>\\n\",\n       \"      <td>4.450500</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>-0.595400</td>\\n\",\n       \"      <td>-8.991100</td>\\n\",\n       \"      <td>-2.266700</td>\\n\",\n       \"      <td>0.924000</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>157680</th>\\n\",\n       \"      <td>PCLB002_HA1E_24H:BRD-K01877528:0.04</td>\\n\",\n       \"      <td>BRD-K01877528</td>\\n\",\n       \"      <td>HA1E</td>\\n\",\n       \"      <td>Cc1onc(C(=O)N2CCN(CC2)C(c2ccc(Cl)cc2)c2ccc(Cl)...</td>\\n\",\n       \"      <td>VIBHJPDPEVVDTB-UHFFFAOYSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-4.342000</td>\\n\",\n       \"      <td>6.058400</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>-9.228200</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>6.787200</td>\\n\",\n       \"      <td>0.273300</td>\\n\",\n       \"      <td>-3.649200</td>\\n\",\n       \"      <td>-4.557400</td>\\n\",\n       \"      <td>-3.390900</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>-0.501700</td>\\n\",\n       \"      <td>-5.219100</td>\\n\",\n       \"      <td>-3.623900</td>\\n\",\n       \"      <td>-8.071300</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>193868</th>\\n\",\n       \"      <td>PCLB003_HA1E_24H:BRD-K56411643-001-02-6:0.12</td>\\n\",\n       \"      <td>BRD-K56411643</td>\\n\",\n     
  \"      <td>HA1E</td>\\n\",\n       \"      <td>COC(=O)[C@@H]1Cc2c([nH]c3ccccc23)[C@H](N1C(=O)...</td>\\n\",\n       \"      <td>TXJZRSRTYPUYRW-GHTZIAJQSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-8.165000</td>\\n\",\n       \"      <td>0.368500</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>-8.175600</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-3.700700</td>\\n\",\n       \"      <td>0.262700</td>\\n\",\n       \"      <td>1.202600</td>\\n\",\n       \"      <td>-3.865000</td>\\n\",\n       \"      <td>-1.445300</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>-0.591100</td>\\n\",\n       \"      <td>-9.500800</td>\\n\",\n       \"      <td>-6.109300</td>\\n\",\n       \"      <td>-2.565800</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"<p>5514 rows × 984 columns</p>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"                                             full_id        pert_id  \\\\\\n\",\n       \"174213    CPC006_HA1E_6H:BRD-K56301217-001-01-7:11.1  BRD-K56301217   \\n\",\n       \"156163      CPC006_HA1E_6H:BRD-A81936046-323-01-6:10  BRD-A81936046   \\n\",\n       \"156475    CPC006_HA1E_6H:BRD-A94451536-001-04-4:44.4  BRD-A94451536   \\n\",\n       \"168383      CPC006_HA1E_6H:BRD-K35723520-001-01-0:10  BRD-K35723520   \\n\",\n       \"176004      CPC006_HA1E_6H:BRD-K61053657-001-01-4:80  BRD-K61053657   \\n\",\n       \"...                                              ...            ...   
\\n\",\n       \"46153            DOSVAL002_HA1E_24H:BRD-K88378636:20  BRD-K88378636   \\n\",\n       \"366343            DOSVAL005_HA1E_24H:BRD-K70836798:5  BRD-K70836798   \\n\",\n       \"368232           DOSVAL005_HA1E_24H:BRD-K41335306:20  BRD-K41335306   \\n\",\n       \"157680           PCLB002_HA1E_24H:BRD-K01877528:0.04  BRD-K01877528   \\n\",\n       \"193868  PCLB003_HA1E_24H:BRD-K56411643-001-02-6:0.12  BRD-K56411643   \\n\",\n       \"\\n\",\n       \"       cell_iname                                             SMILES  \\\\\\n\",\n       \"174213       HA1E  CN(C)CC[C@H](CSc1ccccc1)Nc1ccc(cc1[N+]([O-])=O...   \\n\",\n       \"156163       HA1E  CO[C@H]1C2OP(O)(=O)OC[C@H]2O[C@H]1n1c(Sc2ccc(C...   \\n\",\n       \"156475       HA1E                           CCCCCCCCCCCCCCC(F)C(=O)O   \\n\",\n       \"168383       HA1E  C[As](C)SC[C@H](NC(=O)CC[C@H](N)C(O)=O)C(=O)NC...   \\n\",\n       \"176004       HA1E  CN[C@@H](C)C(=O)N[C@@H](C(C)C)C(=O)N1CCC[C@H]1...   \\n\",\n       \"...           ...                                                ...   \\n\",\n       \"46153        HA1E  C[C@@H]([C@H]1CC[C@H]2[C@@H]3C[C@H]4O[C@]45[C@...   \\n\",\n       \"366343       HA1E  COc1ccc(CN(C)C[C@@H]2OCCCC[C@H](C)Oc3ccc(NC(=O...   \\n\",\n       \"368232       HA1E  OC[C@H]1[C@@H]([C@H]2CN(Cc3cccnc3)CCCCN12)c1cc...   \\n\",\n       \"157680       HA1E  Cc1onc(C(=O)N2CCN(CC2)C(c2ccc(Cl)cc2)c2ccc(Cl)...   \\n\",\n       \"193868       HA1E  COC(=O)[C@@H]1Cc2c([nH]c3ccccc23)[C@H](N1C(=O)...   
\\n\",\n       \"\\n\",\n       \"                          inchi_key       compound_aliases     10007  \\\\\\n\",\n       \"174213  HPLNQCPCUACXLM-PGUFJCEWSA-N                    NaN  0.231973   \\n\",\n       \"156163  BCGHHRAUZWOTNH-ZIWBQIBKSA-N                    NaN -0.631552   \\n\",\n       \"156475  JGRIJJOLCNCSNX-UHFFFAOYSA-N  2-fluoropalmitic-acid -0.312471   \\n\",\n       \"168383  JGDXFQORBMPJGR-YUMQZZPRSA-N                    NaN -0.406390   \\n\",\n       \"176004  PUQVAXDZGUENDK-YSSFQJQWSA-N                    NaN  0.533862   \\n\",\n       \"...                             ...                    ...       ...   \\n\",\n       \"46153   DBRXOUCRJQVYJQ-CKNDUULBSA-N                    NaN  2.667600   \\n\",\n       \"366343  UVSCGJDGVAXROK-PNCWTNKOSA-N                    NaN  1.471700   \\n\",\n       \"368232  ORVIWTYNJHNTDZ-OBGOALODSA-N                    NaN -4.555200   \\n\",\n       \"157680  VIBHJPDPEVVDTB-UHFFFAOYSA-N                    NaN -4.342000   \\n\",\n       \"193868  TXJZRSRTYPUYRW-GHTZIAJQSA-N                    NaN -8.165000   \\n\",\n       \"\\n\",\n       \"            1001      10013     10038  ...      9918      9924      9926  \\\\\\n\",\n       \"174213  0.018931   0.079527  0.046269  ... -0.035346  0.290603 -0.186647   \\n\",\n       \"156163  2.103255  -0.396988  0.586870  ... -0.082199 -0.525986  0.117207   \\n\",\n       \"156475 -0.254659  -0.079145 -0.042123  ...  0.397542 -0.100607  0.077998   \\n\",\n       \"168383  0.300561   0.241660 -0.092437  ...  0.569998  0.195336  0.179857   \\n\",\n       \"176004  0.780318  -0.336582  0.811964  ...  0.128239 -0.846647  0.508807   \\n\",\n       \"...          ...        ...       ...  ...       ...       ...       ...   \\n\",\n       \"46153  -0.674500   1.740400 -8.846600  ...  5.539100 -0.372200  1.302400   \\n\",\n       \"366343  5.967600  -0.820200  9.857500  ...  0.998600  1.783300 -5.215200   \\n\",\n       \"368232  0.606200  -6.571000 -5.795200  ...  
4.605800 -1.555300  0.635500   \\n\",\n       \"157680  6.058400 -10.000000 -9.228200  ...  6.787200  0.273300 -3.649200   \\n\",\n       \"193868  0.368500 -10.000000 -8.175600  ... -3.700700  0.262700  1.202600   \\n\",\n       \"\\n\",\n       \"            9928       993        994      9943      9961        998      9988  \\n\",\n       \"174213  0.199493 -0.451440   0.304840  0.394064  0.089000  -0.363613 -0.475398  \\n\",\n       \"156163 -0.017121  0.819982   0.131787  0.291434 -0.823752   0.418069 -1.155668  \\n\",\n       \"156475  0.686762 -0.093666   0.265468  0.266711  0.274151   0.306019  0.270725  \\n\",\n       \"168383  0.301028 -0.214072   0.311946  0.326444  0.317308   1.005617  0.829641  \\n\",\n       \"176004 -0.112610  1.296148   0.478039 -0.046367 -0.197475   0.455606 -0.481770  \\n\",\n       \"...          ...       ...        ...       ...       ...        ...       ...  \\n\",\n       \"46153  -0.775300 -9.430100 -10.000000 -0.153800 -5.618400 -10.000000  0.174400  \\n\",\n       \"366343 -2.273300 -4.165900  -0.574100  7.438000 -4.380200   1.201400  1.573700  \\n\",\n       \"368232 -2.601500  4.450500 -10.000000 -0.595400 -8.991100  -2.266700  0.924000  \\n\",\n       \"157680 -4.557400 -3.390900 -10.000000 -0.501700 -5.219100  -3.623900 -8.071300  \\n\",\n       \"193868 -3.865000 -1.445300 -10.000000 -0.591100 -9.500800  -6.109300 -2.565800  \\n\",\n       \"\\n\",\n       \"[5514 rows x 984 columns]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/tmp/ipykernel_69015/2571295951.py:7: SettingWithCopyWarning: \\n\",\n      \"A value is trying to be set on a copy of a slice from a DataFrame.\\n\",\n      \"Try using .loc[row_indexer,col_indexer] = value instead\\n\",\n      \"\\n\",\n      \"See the caveats in the documentation: 
https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy\\n\",\n      \"  this_df[\\\"std\\\"] = std\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>full_id</th>\\n\",\n       \"      <th>pert_id</th>\\n\",\n       \"      <th>cell_iname</th>\\n\",\n       \"      <th>SMILES</th>\\n\",\n       \"      <th>inchi_key</th>\\n\",\n       \"      <th>compound_aliases</th>\\n\",\n       \"      <th>10007</th>\\n\",\n       \"      <th>1001</th>\\n\",\n       \"      <th>10013</th>\\n\",\n       \"      <th>10038</th>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <th>9918</th>\\n\",\n       \"      <th>9924</th>\\n\",\n       \"      <th>9926</th>\\n\",\n       \"      <th>9928</th>\\n\",\n       \"      <th>993</th>\\n\",\n       \"      <th>994</th>\\n\",\n       \"      <th>9943</th>\\n\",\n       \"      <th>9961</th>\\n\",\n       \"      <th>998</th>\\n\",\n       \"      <th>9988</th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>131565</th>\\n\",\n       \"      <td>CPC004_VCAP_6H:BRD-K63945320-001-03-2:10</td>\\n\",\n       \"      <td>BRD-K63945320</td>\\n\",\n       \"      <td>VCAP</td>\\n\",\n       \"      
<td>CC(C)CC(=O)O[C@@H]1[C@H](OC(=O)C)c2c(OC1(C)C)c...</td>\\n\",\n       \"      <td>ALKTVPFKDYZFGA-WOJBJXKFSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>0.619350</td>\\n\",\n       \"      <td>0.979950</td>\\n\",\n       \"      <td>0.077025</td>\\n\",\n       \"      <td>-1.826175</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.288375</td>\\n\",\n       \"      <td>0.812075</td>\\n\",\n       \"      <td>0.113600</td>\\n\",\n       \"      <td>0.558975</td>\\n\",\n       \"      <td>0.185975</td>\\n\",\n       \"      <td>-0.133500</td>\\n\",\n       \"      <td>0.196950</td>\\n\",\n       \"      <td>-0.177750</td>\\n\",\n       \"      <td>-0.245525</td>\\n\",\n       \"      <td>-0.280200</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>129809</th>\\n\",\n       \"      <td>CPC004_VCAP_6H:BRD-K56558538-003-02-8:10</td>\\n\",\n       \"      <td>BRD-K56558538</td>\\n\",\n       \"      <td>VCAP</td>\\n\",\n       \"      <td>Nc1c(Br)cc(Br)cc1CN[C@H]1CC[C@H](O)CC1</td>\\n\",\n       \"      <td>JBDGDEWWOUBZPM-XYPYZODXSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>0.514618</td>\\n\",\n       \"      <td>-0.725917</td>\\n\",\n       \"      <td>-0.241905</td>\\n\",\n       \"      <td>0.703576</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.402551</td>\\n\",\n       \"      <td>-0.221116</td>\\n\",\n       \"      <td>0.570629</td>\\n\",\n       \"      <td>0.353851</td>\\n\",\n       \"      <td>0.059618</td>\\n\",\n       \"      <td>0.576806</td>\\n\",\n       \"      <td>0.471493</td>\\n\",\n       \"      <td>-1.184916</td>\\n\",\n       \"      <td>0.086062</td>\\n\",\n       \"      <td>-0.790209</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>266862</th>\\n\",\n       \"      <td>CPC014_VCAP_24H:BRD-K02421873-001-05-4:10</td>\\n\",\n       \"      <td>BRD-K02421873</td>\\n\",\n       \"      
<td>VCAP</td>\\n\",\n       \"      <td>[O-][N+](=O)c1ccc(o1)C(=O)OCc2nnc(o2)c3ccccc3</td>\\n\",\n       \"      <td>NENLILZNTFFZKS-UHFFFAOYSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-0.015498</td>\\n\",\n       \"      <td>-0.172991</td>\\n\",\n       \"      <td>-0.443552</td>\\n\",\n       \"      <td>0.130475</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.387143</td>\\n\",\n       \"      <td>0.004123</td>\\n\",\n       \"      <td>-0.131220</td>\\n\",\n       \"      <td>-0.249845</td>\\n\",\n       \"      <td>0.237359</td>\\n\",\n       \"      <td>0.621236</td>\\n\",\n       \"      <td>-0.026491</td>\\n\",\n       \"      <td>-0.479426</td>\\n\",\n       \"      <td>0.021192</td>\\n\",\n       \"      <td>0.214971</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>110592</th>\\n\",\n       \"      <td>CPC004_VCAP_6H:BRD-A09467419-003-14-1:10</td>\\n\",\n       \"      <td>BRD-A09467419</td>\\n\",\n       \"      <td>VCAP</td>\\n\",\n       \"      <td>CCN(CCCCOC(=O)c1ccc(OC)c(OC)c1)C(C)Cc1ccc(OC)cc1</td>\\n\",\n       \"      <td>VYVKHNNGDFVQGA-UHFFFAOYSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-0.177417</td>\\n\",\n       \"      <td>-0.011983</td>\\n\",\n       \"      <td>0.654752</td>\\n\",\n       \"      <td>0.772483</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.127329</td>\\n\",\n       \"      <td>0.302186</td>\\n\",\n       \"      <td>-0.593025</td>\\n\",\n       \"      <td>0.564000</td>\\n\",\n       \"      <td>0.552848</td>\\n\",\n       \"      <td>0.274139</td>\\n\",\n       \"      <td>-0.134025</td>\\n\",\n       \"      <td>1.009820</td>\\n\",\n       \"      <td>0.408449</td>\\n\",\n       \"      <td>-0.199366</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>109517</th>\\n\",\n       \"      <td>CPC004_VCAP_6H:BRD-A00546892-001-01-8:10</td>\\n\",\n       \"      
<td>BRD-A00546892</td>\\n\",\n       \"      <td>VCAP</td>\\n\",\n       \"      <td>OC(CCN1CCCCC1)(C1CC2CC1C=C2)c1ccccc1</td>\\n\",\n       \"      <td>YSXKPIUOCJLQIE-UHFFFAOYSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>0.461757</td>\\n\",\n       \"      <td>-0.145747</td>\\n\",\n       \"      <td>-0.114601</td>\\n\",\n       \"      <td>-0.122461</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.040699</td>\\n\",\n       \"      <td>-0.248790</td>\\n\",\n       \"      <td>0.101551</td>\\n\",\n       \"      <td>0.365520</td>\\n\",\n       \"      <td>-0.082306</td>\\n\",\n       \"      <td>0.097964</td>\\n\",\n       \"      <td>-0.103175</td>\\n\",\n       \"      <td>-0.061970</td>\\n\",\n       \"      <td>-0.018410</td>\\n\",\n       \"      <td>0.351744</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>203688</th>\\n\",\n       \"      <td>ERG020_VCAP_24H:BRD-K10846167:10</td>\\n\",\n       \"      <td>BRD-K10846167</td>\\n\",\n       \"      <td>VCAP</td>\\n\",\n       \"      <td>C(Nc1nc(nc2n(cnc12)-c1ccsc1)N1CCOCC1)c1nc2cc3c...</td>\\n\",\n       \"      
<td>YKFITDVTDUOHEH-UHFFFAOYSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>1.972600</td>\\n\",\n       \"      <td>2.243100</td>\\n\",\n       \"      <td>-1.518100</td>\\n\",\n       \"      <td>-1.928400</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>4.962300</td>\\n\",\n       \"      <td>-1.819800</td>\\n\",\n       \"      <td>-0.883500</td>\\n\",\n       \"      <td>-0.083300</td>\\n\",\n       \"      <td>-6.487100</td>\\n\",\n       \"      <td>-7.421100</td>\\n\",\n       \"      <td>-4.943900</td>\\n\",\n       \"      <td>7.903800</td>\\n\",\n       \"      <td>-4.877300</td>\\n\",\n       \"      <td>-2.733000</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>1887</th>\\n\",\n       \"      <td>ERG005_VCAP_48H:BRD-K21680192-300-08-6:1</td>\\n\",\n       \"      <td>BRD-K21680192</td>\\n\",\n       \"      <td>VCAP</td>\\n\",\n       \"      <td>OCCNCCNc1ccc(NCCNCCO)c2C(=O)c3c(O)ccc(O)c3C(=O...</td>\\n\",\n       \"      <td>KKZJGLLVHKMTCM-UHFFFAOYSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-5.841500</td>\\n\",\n       \"      <td>4.240600</td>\\n\",\n       \"      <td>8.539700</td>\\n\",\n       \"      <td>-1.812000</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>3.692800</td>\\n\",\n       \"      <td>0.762700</td>\\n\",\n       \"      <td>-0.480900</td>\\n\",\n       \"      <td>-1.721300</td>\\n\",\n       \"      <td>-2.220000</td>\\n\",\n       \"      <td>-5.575100</td>\\n\",\n       \"      <td>2.721000</td>\\n\",\n       \"      <td>7.177500</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>-2.780800</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>103356</th>\\n\",\n       \"      <td>ERG021_VCAP_24H:BRD-K36198571:10</td>\\n\",\n       \"      <td>BRD-K36198571</td>\\n\",\n       \"      <td>VCAP</td>\\n\",\n       \"      
<td>Cc1cc(C)c(N(Cc2ccccc2)S(=O)(=O)c2ccc(OCCNC(=O)...</td>\\n\",\n       \"      <td>FARMEEAGJWMFSZ-UHFFFAOYSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>1.282900</td>\\n\",\n       \"      <td>-1.389100</td>\\n\",\n       \"      <td>-1.650400</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-1.099700</td>\\n\",\n       \"      <td>-0.818300</td>\\n\",\n       \"      <td>1.601800</td>\\n\",\n       \"      <td>-2.465300</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>-9.370700</td>\\n\",\n       \"      <td>4.146900</td>\\n\",\n       \"      <td>-0.295000</td>\\n\",\n       \"      <td>1.578900</td>\\n\",\n       \"      <td>5.121400</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>16381</th>\\n\",\n       \"      <td>DOS013_VCAP_24H:BRD-A19037878:10</td>\\n\",\n       \"      <td>BRD-A19037878</td>\\n\",\n       \"      <td>VCAP</td>\\n\",\n       \"      <td>CC(C=C(C)C=CC(=O)NO)C(=O)c1ccc(cc1)N(C)C</td>\\n\",\n       \"      <td>RTKIYFITIVXBLE-WKWSCTOISA-N</td>\\n\",\n       \"      <td>trichostatin-a</td>\\n\",\n       \"      <td>-0.828800</td>\\n\",\n       \"      <td>3.716100</td>\\n\",\n       \"      <td>4.576000</td>\\n\",\n       \"      <td>0.650200</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.255600</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>-7.012300</td>\\n\",\n       \"      <td>-6.611700</td>\\n\",\n       \"      <td>-0.888200</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>-0.078000</td>\\n\",\n       \"      <td>5.409200</td>\\n\",\n       \"      <td>-1.338600</td>\\n\",\n       \"      <td>-2.224700</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>9336</th>\\n\",\n       \"      <td>DOS017_VCAP_24H:BRD-K81418486:10</td>\\n\",\n       \"      <td>BRD-K81418486</td>\\n\",\n       \"      
<td>VCAP</td>\\n\",\n       \"      <td>ONC(=O)CCCCCCC(=O)Nc1ccccc1</td>\\n\",\n       \"      <td>WAEXFXRVDQXREF-UHFFFAOYSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>2.229600</td>\\n\",\n       \"      <td>1.509200</td>\\n\",\n       \"      <td>4.542500</td>\\n\",\n       \"      <td>-9.906800</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-6.066700</td>\\n\",\n       \"      <td>-1.847500</td>\\n\",\n       \"      <td>2.198000</td>\\n\",\n       \"      <td>10.000000</td>\\n\",\n       \"      <td>-2.601400</td>\\n\",\n       \"      <td>-1.224900</td>\\n\",\n       \"      <td>1.405700</td>\\n\",\n       \"      <td>-1.487300</td>\\n\",\n       \"      <td>-3.186200</td>\\n\",\n       \"      <td>1.455800</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"<p>15220 rows × 984 columns</p>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"                                          full_id        pert_id cell_iname  \\\\\\n\",\n       \"131565   CPC004_VCAP_6H:BRD-K63945320-001-03-2:10  BRD-K63945320       VCAP   \\n\",\n       \"129809   CPC004_VCAP_6H:BRD-K56558538-003-02-8:10  BRD-K56558538       VCAP   \\n\",\n       \"266862  CPC014_VCAP_24H:BRD-K02421873-001-05-4:10  BRD-K02421873       VCAP   \\n\",\n       \"110592   CPC004_VCAP_6H:BRD-A09467419-003-14-1:10  BRD-A09467419       VCAP   \\n\",\n       \"109517   CPC004_VCAP_6H:BRD-A00546892-001-01-8:10  BRD-A00546892       VCAP   \\n\",\n       \"...                                           ...            ...        ...   
\\n\",\n       \"203688           ERG020_VCAP_24H:BRD-K10846167:10  BRD-K10846167       VCAP   \\n\",\n       \"1887     ERG005_VCAP_48H:BRD-K21680192-300-08-6:1  BRD-K21680192       VCAP   \\n\",\n       \"103356           ERG021_VCAP_24H:BRD-K36198571:10  BRD-K36198571       VCAP   \\n\",\n       \"16381            DOS013_VCAP_24H:BRD-A19037878:10  BRD-A19037878       VCAP   \\n\",\n       \"9336             DOS017_VCAP_24H:BRD-K81418486:10  BRD-K81418486       VCAP   \\n\",\n       \"\\n\",\n       \"                                                   SMILES  \\\\\\n\",\n       \"131565  CC(C)CC(=O)O[C@@H]1[C@H](OC(=O)C)c2c(OC1(C)C)c...   \\n\",\n       \"129809             Nc1c(Br)cc(Br)cc1CN[C@H]1CC[C@H](O)CC1   \\n\",\n       \"266862      [O-][N+](=O)c1ccc(o1)C(=O)OCc2nnc(o2)c3ccccc3   \\n\",\n       \"110592   CCN(CCCCOC(=O)c1ccc(OC)c(OC)c1)C(C)Cc1ccc(OC)cc1   \\n\",\n       \"109517               OC(CCN1CCCCC1)(C1CC2CC1C=C2)c1ccccc1   \\n\",\n       \"...                                                   ...   \\n\",\n       \"203688  C(Nc1nc(nc2n(cnc12)-c1ccsc1)N1CCOCC1)c1nc2cc3c...   \\n\",\n       \"1887    OCCNCCNc1ccc(NCCNCCO)c2C(=O)c3c(O)ccc(O)c3C(=O...   \\n\",\n       \"103356  Cc1cc(C)c(N(Cc2ccccc2)S(=O)(=O)c2ccc(OCCNC(=O)...   
\\n\",\n       \"16381            CC(C=C(C)C=CC(=O)NO)C(=O)c1ccc(cc1)N(C)C   \\n\",\n       \"9336                          ONC(=O)CCCCCCC(=O)Nc1ccccc1   \\n\",\n       \"\\n\",\n       \"                          inchi_key compound_aliases      10007      1001  \\\\\\n\",\n       \"131565  ALKTVPFKDYZFGA-WOJBJXKFSA-N              NaN   0.619350  0.979950   \\n\",\n       \"129809  JBDGDEWWOUBZPM-XYPYZODXSA-N              NaN   0.514618 -0.725917   \\n\",\n       \"266862  NENLILZNTFFZKS-UHFFFAOYSA-N              NaN  -0.015498 -0.172991   \\n\",\n       \"110592  VYVKHNNGDFVQGA-UHFFFAOYSA-N              NaN  -0.177417 -0.011983   \\n\",\n       \"109517  YSXKPIUOCJLQIE-UHFFFAOYSA-N              NaN   0.461757 -0.145747   \\n\",\n       \"...                             ...              ...        ...       ...   \\n\",\n       \"203688  YKFITDVTDUOHEH-UHFFFAOYSA-N              NaN   1.972600  2.243100   \\n\",\n       \"1887    KKZJGLLVHKMTCM-UHFFFAOYSA-N              NaN  -5.841500  4.240600   \\n\",\n       \"103356  FARMEEAGJWMFSZ-UHFFFAOYSA-N              NaN -10.000000  1.282900   \\n\",\n       \"16381   RTKIYFITIVXBLE-WKWSCTOISA-N   trichostatin-a  -0.828800  3.716100   \\n\",\n       \"9336    WAEXFXRVDQXREF-UHFFFAOYSA-N              NaN   2.229600  1.509200   \\n\",\n       \"\\n\",\n       \"           10013     10038  ...      9918       9924      9926       9928  \\\\\\n\",\n       \"131565  0.077025 -1.826175  ... -0.288375   0.812075  0.113600   0.558975   \\n\",\n       \"129809 -0.241905  0.703576  ... -0.402551  -0.221116  0.570629   0.353851   \\n\",\n       \"266862 -0.443552  0.130475  ...  0.387143   0.004123 -0.131220  -0.249845   \\n\",\n       \"110592  0.654752  0.772483  ...  0.127329   0.302186 -0.593025   0.564000   \\n\",\n       \"109517 -0.114601 -0.122461  ... -0.040699  -0.248790  0.101551   0.365520   \\n\",\n       \"...          ...       ...  ...       ...        ...       ...        ...   
\\n\",\n       \"203688 -1.518100 -1.928400  ...  4.962300  -1.819800 -0.883500  -0.083300   \\n\",\n       \"1887    8.539700 -1.812000  ...  3.692800   0.762700 -0.480900  -1.721300   \\n\",\n       \"103356 -1.389100 -1.650400  ... -1.099700  -0.818300  1.601800  -2.465300   \\n\",\n       \"16381   4.576000  0.650200  ...  0.255600 -10.000000 -7.012300  -6.611700   \\n\",\n       \"9336    4.542500 -9.906800  ... -6.066700  -1.847500  2.198000  10.000000   \\n\",\n       \"\\n\",\n       \"              993        994      9943      9961        998      9988  \\n\",\n       \"131565   0.185975  -0.133500  0.196950 -0.177750  -0.245525 -0.280200  \\n\",\n       \"129809   0.059618   0.576806  0.471493 -1.184916   0.086062 -0.790209  \\n\",\n       \"266862   0.237359   0.621236 -0.026491 -0.479426   0.021192  0.214971  \\n\",\n       \"110592   0.552848   0.274139 -0.134025  1.009820   0.408449 -0.199366  \\n\",\n       \"109517  -0.082306   0.097964 -0.103175 -0.061970  -0.018410  0.351744  \\n\",\n       \"...           ...        ...       ...       ...        ...       ...  
\\n\",\n       \"203688  -6.487100  -7.421100 -4.943900  7.903800  -4.877300 -2.733000  \\n\",\n       \"1887    -2.220000  -5.575100  2.721000  7.177500 -10.000000 -2.780800  \\n\",\n       \"103356 -10.000000  -9.370700  4.146900 -0.295000   1.578900  5.121400  \\n\",\n       \"16381   -0.888200 -10.000000 -0.078000  5.409200  -1.338600 -2.224700  \\n\",\n       \"9336    -2.601400  -1.224900  1.405700 -1.487300  -3.186200  1.455800  \\n\",\n       \"\\n\",\n       \"[15220 rows x 984 columns]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/tmp/ipykernel_69015/2571295951.py:7: SettingWithCopyWarning: \\n\",\n      \"A value is trying to be set on a copy of a slice from a DataFrame.\\n\",\n      \"Try using .loc[row_indexer,col_indexer] = value instead\\n\",\n      \"\\n\",\n      \"See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy\\n\",\n      \"  this_df[\\\"std\\\"] = std\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>full_id</th>\\n\",\n       \"      <th>pert_id</th>\\n\",\n       \"      <th>cell_iname</th>\\n\",\n       \"      <th>SMILES</th>\\n\",\n       \"  
    <th>inchi_key</th>\\n\",\n       \"      <th>compound_aliases</th>\\n\",\n       \"      <th>10007</th>\\n\",\n       \"      <th>1001</th>\\n\",\n       \"      <th>10013</th>\\n\",\n       \"      <th>10038</th>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <th>9918</th>\\n\",\n       \"      <th>9924</th>\\n\",\n       \"      <th>9926</th>\\n\",\n       \"      <th>9928</th>\\n\",\n       \"      <th>993</th>\\n\",\n       \"      <th>994</th>\\n\",\n       \"      <th>9943</th>\\n\",\n       \"      <th>9961</th>\\n\",\n       \"      <th>998</th>\\n\",\n       \"      <th>9988</th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>300792</th>\\n\",\n       \"      <td>DPK.CP001_A549_24H:BRD-K44993696:10</td>\\n\",\n       \"      <td>BRD-K44993696</td>\\n\",\n       \"      <td>A549</td>\\n\",\n       \"      <td>CC(C)NC[C@H](O)COc1ccc(CC(N)=O)cc1</td>\\n\",\n       \"      <td>METKIMKYRPQLGS-LBPRGKRZSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>0.056206</td>\\n\",\n       \"      <td>0.348380</td>\\n\",\n       \"      <td>0.149685</td>\\n\",\n       \"      <td>0.386817</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.411859</td>\\n\",\n       \"      <td>-0.060205</td>\\n\",\n       \"      <td>0.082864</td>\\n\",\n       \"      <td>0.490584</td>\\n\",\n       \"      <td>0.643067</td>\\n\",\n       \"      <td>0.342237</td>\\n\",\n       \"      <td>-0.397448</td>\\n\",\n       \"      <td>-0.040522</td>\\n\",\n       \"      <td>-0.942005</td>\\n\",\n       \"      <td>-0.207695</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>386682</th>\\n\",\n       \"      <td>DPK.CP001_A549_24H:BRD-K26634012:10</td>\\n\",\n       \"      <td>BRD-K26634012</td>\\n\",\n       \"      <td>A549</td>\\n\",\n       \"      <td>CC[C@@H]1OC(=O)C[C@@H](O)[C@H](C)[C@@H](O[C@@H...</td>\\n\",\n       \"      
<td>JTSDBFGMPLKDCD-NXYWBGGVSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-0.023966</td>\\n\",\n       \"      <td>-0.067202</td>\\n\",\n       \"      <td>0.478092</td>\\n\",\n       \"      <td>-0.146135</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.173642</td>\\n\",\n       \"      <td>-0.905919</td>\\n\",\n       \"      <td>-0.384436</td>\\n\",\n       \"      <td>0.606815</td>\\n\",\n       \"      <td>0.367887</td>\\n\",\n       \"      <td>-0.279217</td>\\n\",\n       \"      <td>-0.773536</td>\\n\",\n       \"      <td>-0.498392</td>\\n\",\n       \"      <td>0.337296</td>\\n\",\n       \"      <td>0.362324</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>426065</th>\\n\",\n       \"      <td>PCLB001_A549_24H:BRD-K00640357-001-07-2:10</td>\\n\",\n       \"      <td>BRD-K00640357</td>\\n\",\n       \"      <td>A549</td>\\n\",\n       \"      <td>CCCN(CCC)c1ncnc2n(ncc12)c3ccc(OC)cc3</td>\\n\",\n       \"      <td>HIHZFTNEEQAKJI-UHFFFAOYSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-0.088505</td>\\n\",\n       \"      <td>-0.150477</td>\\n\",\n       \"      <td>0.636680</td>\\n\",\n       \"      <td>0.069768</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.789002</td>\\n\",\n       \"      <td>-1.147770</td>\\n\",\n       \"      <td>0.454659</td>\\n\",\n       \"      <td>-0.096799</td>\\n\",\n       \"      <td>-0.084420</td>\\n\",\n       \"      <td>-0.263718</td>\\n\",\n       \"      <td>0.264277</td>\\n\",\n       \"      <td>1.185595</td>\\n\",\n       \"      <td>1.438651</td>\\n\",\n       \"      <td>-0.285030</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>299275</th>\\n\",\n       \"      <td>DPK.CP001_A549_24H:BRD-K93923542:10</td>\\n\",\n       \"      <td>BRD-K93923542</td>\\n\",\n       \"      <td>A549</td>\\n\",\n       \"      
<td>C[C@]12CC[C@@H]3[C@@H](CC(=C)C4=CC(=O)C=C[C@]3...</td>\\n\",\n       \"      <td>BFYIZQONLCFLEV-AFJOWOCMSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-0.133483</td>\\n\",\n       \"      <td>0.714459</td>\\n\",\n       \"      <td>-0.859223</td>\\n\",\n       \"      <td>0.937306</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.209324</td>\\n\",\n       \"      <td>-0.369444</td>\\n\",\n       \"      <td>0.314782</td>\\n\",\n       \"      <td>-2.220048</td>\\n\",\n       \"      <td>0.095951</td>\\n\",\n       \"      <td>-0.035446</td>\\n\",\n       \"      <td>-0.211462</td>\\n\",\n       \"      <td>-0.722183</td>\\n\",\n       \"      <td>0.712876</td>\\n\",\n       \"      <td>0.127065</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>152828</th>\\n\",\n       \"      <td>CPC006_A549_6H:BRD-A46335897-003-05-4:6.23</td>\\n\",\n       \"      <td>BRD-A46335897</td>\\n\",\n       \"      <td>A549</td>\\n\",\n       \"      <td>CC(CN1c2ccccc2Sc2ccccc12)N(C)C</td>\\n\",\n       \"      <td>PWWVAXIEGOYWEE-UHFFFAOYSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-0.511799</td>\\n\",\n       \"      <td>2.672387</td>\\n\",\n       \"      <td>-0.038576</td>\\n\",\n       \"      <td>0.198272</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.444251</td>\\n\",\n       \"      <td>0.191055</td>\\n\",\n       \"      <td>0.177904</td>\\n\",\n       \"      <td>0.316457</td>\\n\",\n       \"      <td>0.111580</td>\\n\",\n       \"      <td>0.620553</td>\\n\",\n       \"      <td>-0.170694</td>\\n\",\n       \"      <td>-0.852343</td>\\n\",\n       \"      <td>-1.931709</td>\\n\",\n       \"      <td>-0.265093</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \" 
     <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>306160</th>\\n\",\n       \"      <td>PCLB003_A549_24H:BRD-K07572174-001-17-0:100</td>\\n\",\n       \"      <td>BRD-K07572174</td>\\n\",\n       \"      <td>A549</td>\\n\",\n       \"      <td>COc1cc(ccc1O)C=CC(=O)CC(=O)C=Cc1ccc(O)c(OC)c1</td>\\n\",\n       \"      <td>VFLDPWHFBUODDF-FCXRPNKRSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>0.518550</td>\\n\",\n       \"      <td>2.412450</td>\\n\",\n       \"      <td>0.552650</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>4.255900</td>\\n\",\n       \"      <td>-1.943600</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>-6.944650</td>\\n\",\n       \"      <td>-3.563300</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>-1.640100</td>\\n\",\n       \"      <td>-1.324000</td>\\n\",\n       \"      <td>-7.236450</td>\\n\",\n       \"      <td>1.282200</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>287348</th>\\n\",\n       \"      <td>PCLB003_A549_24H:BRD-K17953061-001-10-1:10</td>\\n\",\n       \"      <td>BRD-K17953061</td>\\n\",\n       \"      <td>A549</td>\\n\",\n       \"      <td>CN[C@@H]1C[C@H]2O[C@@](C)([C@@H]1OC)n3c4ccccc4...</td>\\n\",\n       \"      <td>HKSZLNNOFSGOKW-FYTWVXJKSA-N</td>\\n\",\n       
\"      <td>NaN</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>0.559000</td>\\n\",\n       \"      <td>0.701200</td>\\n\",\n       \"      <td>-2.570400</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>4.225500</td>\\n\",\n       \"      <td>1.128100</td>\\n\",\n       \"      <td>-8.663700</td>\\n\",\n       \"      <td>10.000000</td>\\n\",\n       \"      <td>-1.965000</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>-0.730800</td>\\n\",\n       \"      <td>-2.855300</td>\\n\",\n       \"      <td>-7.733100</td>\\n\",\n       \"      <td>-2.122000</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>277718</th>\\n\",\n       \"      <td>PCLB003_A549_24H:BRD-K78431006-001-05-2:10</td>\\n\",\n       \"      <td>BRD-K78431006</td>\\n\",\n       \"      <td>A549</td>\\n\",\n       \"      <td>C[C@@H](Oc1cc(cnc1N)-c1cnn(c1)C1CCNCC1)c1c(Cl)...</td>\\n\",\n       \"      <td>KTEIFNKAUNYNJU-GFCCVEGCSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-0.914500</td>\\n\",\n       \"      <td>-0.623200</td>\\n\",\n       \"      <td>0.191100</td>\\n\",\n       \"      <td>0.130400</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>7.544400</td>\\n\",\n       \"      <td>-3.168100</td>\\n\",\n       \"      <td>-5.945800</td>\\n\",\n       \"      <td>-8.145600</td>\\n\",\n       \"      <td>-2.736100</td>\\n\",\n       \"      <td>-5.421300</td>\\n\",\n       \"      <td>-1.792900</td>\\n\",\n       \"      <td>-0.607800</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>4.759100</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>95654</th>\\n\",\n       \"      <td>CPC015_A549_24H:BRD-A11605036-001-03-2:10</td>\\n\",\n       \"      <td>BRD-A11605036</td>\\n\",\n       \"      <td>A549</td>\\n\",\n       \"      <td>COc1c(O[C@@H]2O[C@H](CO)[C@@H](O)[C@H](O)[C@H]...</td>\\n\",\n       \"      
<td>LEQAKWQJCITZNK-MSQQGMGVSA-N</td>\\n\",\n       \"      <td>thiocolchicoside</td>\\n\",\n       \"      <td>-0.392000</td>\\n\",\n       \"      <td>-0.420800</td>\\n\",\n       \"      <td>7.179400</td>\\n\",\n       \"      <td>-4.210800</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-4.484500</td>\\n\",\n       \"      <td>1.589500</td>\\n\",\n       \"      <td>1.302100</td>\\n\",\n       \"      <td>-0.324600</td>\\n\",\n       \"      <td>-5.143300</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>4.393500</td>\\n\",\n       \"      <td>-9.374500</td>\\n\",\n       \"      <td>-6.165000</td>\\n\",\n       \"      <td>-2.423500</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>365599</th>\\n\",\n       \"      <td>DOSBIO001_A549_24H:BRD-K49669601:9.903</td>\\n\",\n       \"      <td>BRD-K49669601</td>\\n\",\n       \"      <td>A549</td>\\n\",\n       \"      <td>C[C@H](CO)N1C[C@@H](C)[C@@H](CN(C)CC2CC2)OCCCC...</td>\\n\",\n       \"      <td>DFAHTHVTMMOLJD-SQVYLTPJSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>8.349200</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>2.367400</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>3.459500</td>\\n\",\n       \"      <td>-1.307500</td>\\n\",\n       \"      <td>9.090500</td>\\n\",\n       \"      <td>1.389800</td>\\n\",\n       \"      <td>4.510900</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>5.345800</td>\\n\",\n       \"      <td>0.884300</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>8.247100</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"<p>12285 rows × 984 columns</p>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"                                            full_id        pert_id cell_iname  \\\\\\n\",\n   
    \"300792          DPK.CP001_A549_24H:BRD-K44993696:10  BRD-K44993696       A549   \\n\",\n       \"386682          DPK.CP001_A549_24H:BRD-K26634012:10  BRD-K26634012       A549   \\n\",\n       \"426065   PCLB001_A549_24H:BRD-K00640357-001-07-2:10  BRD-K00640357       A549   \\n\",\n       \"299275          DPK.CP001_A549_24H:BRD-K93923542:10  BRD-K93923542       A549   \\n\",\n       \"152828   CPC006_A549_6H:BRD-A46335897-003-05-4:6.23  BRD-A46335897       A549   \\n\",\n       \"...                                             ...            ...        ...   \\n\",\n       \"306160  PCLB003_A549_24H:BRD-K07572174-001-17-0:100  BRD-K07572174       A549   \\n\",\n       \"287348   PCLB003_A549_24H:BRD-K17953061-001-10-1:10  BRD-K17953061       A549   \\n\",\n       \"277718   PCLB003_A549_24H:BRD-K78431006-001-05-2:10  BRD-K78431006       A549   \\n\",\n       \"95654     CPC015_A549_24H:BRD-A11605036-001-03-2:10  BRD-A11605036       A549   \\n\",\n       \"365599       DOSBIO001_A549_24H:BRD-K49669601:9.903  BRD-K49669601       A549   \\n\",\n       \"\\n\",\n       \"                                                   SMILES  \\\\\\n\",\n       \"300792                 CC(C)NC[C@H](O)COc1ccc(CC(N)=O)cc1   \\n\",\n       \"386682  CC[C@@H]1OC(=O)C[C@@H](O)[C@H](C)[C@@H](O[C@@H...   \\n\",\n       \"426065               CCCN(CCC)c1ncnc2n(ncc12)c3ccc(OC)cc3   \\n\",\n       \"299275  C[C@]12CC[C@@H]3[C@@H](CC(=C)C4=CC(=O)C=C[C@]3...   \\n\",\n       \"152828                     CC(CN1c2ccccc2Sc2ccccc12)N(C)C   \\n\",\n       \"...                                                   ...   \\n\",\n       \"306160      COc1cc(ccc1O)C=CC(=O)CC(=O)C=Cc1ccc(O)c(OC)c1   \\n\",\n       \"287348  CN[C@@H]1C[C@H]2O[C@@](C)([C@@H]1OC)n3c4ccccc4...   \\n\",\n       \"277718  C[C@@H](Oc1cc(cnc1N)-c1cnn(c1)C1CCNCC1)c1c(Cl)...   \\n\",\n       \"95654   COc1c(O[C@@H]2O[C@H](CO)[C@@H](O)[C@H](O)[C@H]...   \\n\",\n       \"365599  C[C@H](CO)N1C[C@@H](C)[C@@H](CN(C)CC2CC2)OCCCC...   
\\n\",\n       \"\\n\",\n       \"                          inchi_key  compound_aliases      10007      1001  \\\\\\n\",\n       \"300792  METKIMKYRPQLGS-LBPRGKRZSA-N               NaN   0.056206  0.348380   \\n\",\n       \"386682  JTSDBFGMPLKDCD-NXYWBGGVSA-N               NaN  -0.023966 -0.067202   \\n\",\n       \"426065  HIHZFTNEEQAKJI-UHFFFAOYSA-N               NaN  -0.088505 -0.150477   \\n\",\n       \"299275  BFYIZQONLCFLEV-AFJOWOCMSA-N               NaN  -0.133483  0.714459   \\n\",\n       \"152828  PWWVAXIEGOYWEE-UHFFFAOYSA-N               NaN  -0.511799  2.672387   \\n\",\n       \"...                             ...               ...        ...       ...   \\n\",\n       \"306160  VFLDPWHFBUODDF-FCXRPNKRSA-N               NaN -10.000000  0.518550   \\n\",\n       \"287348  HKSZLNNOFSGOKW-FYTWVXJKSA-N               NaN -10.000000  0.559000   \\n\",\n       \"277718  KTEIFNKAUNYNJU-GFCCVEGCSA-N               NaN  -0.914500 -0.623200   \\n\",\n       \"95654   LEQAKWQJCITZNK-MSQQGMGVSA-N  thiocolchicoside  -0.392000 -0.420800   \\n\",\n       \"365599  DFAHTHVTMMOLJD-SQVYLTPJSA-N               NaN -10.000000  8.349200   \\n\",\n       \"\\n\",\n       \"            10013     10038  ...      9918      9924       9926       9928  \\\\\\n\",\n       \"300792   0.149685  0.386817  ...  0.411859 -0.060205   0.082864   0.490584   \\n\",\n       \"386682   0.478092 -0.146135  ... -0.173642 -0.905919  -0.384436   0.606815   \\n\",\n       \"426065   0.636680  0.069768  ... -0.789002 -1.147770   0.454659  -0.096799   \\n\",\n       \"299275  -0.859223  0.937306  ...  0.209324 -0.369444   0.314782  -2.220048   \\n\",\n       \"152828  -0.038576  0.198272  ... -0.444251  0.191055   0.177904   0.316457   \\n\",\n       \"...           ...       ...  ...       ...       ...        ...        ...   \\n\",\n       \"306160   2.412450  0.552650  ...  4.255900 -1.943600 -10.000000  -6.944650   \\n\",\n       \"287348   0.701200 -2.570400  ...  
4.225500  1.128100  -8.663700  10.000000   \\n\",\n       \"277718   0.191100  0.130400  ...  7.544400 -3.168100  -5.945800  -8.145600   \\n\",\n       \"95654    7.179400 -4.210800  ... -4.484500  1.589500   1.302100  -0.324600   \\n\",\n       \"365599 -10.000000  2.367400  ...  3.459500 -1.307500   9.090500   1.389800   \\n\",\n       \"\\n\",\n       \"             993        994      9943      9961        998      9988  \\n\",\n       \"300792  0.643067   0.342237 -0.397448 -0.040522  -0.942005 -0.207695  \\n\",\n       \"386682  0.367887  -0.279217 -0.773536 -0.498392   0.337296  0.362324  \\n\",\n       \"426065 -0.084420  -0.263718  0.264277  1.185595   1.438651 -0.285030  \\n\",\n       \"299275  0.095951  -0.035446 -0.211462 -0.722183   0.712876  0.127065  \\n\",\n       \"152828  0.111580   0.620553 -0.170694 -0.852343  -1.931709 -0.265093  \\n\",\n       \"...          ...        ...       ...       ...        ...       ...  \\n\",\n       \"306160 -3.563300 -10.000000 -1.640100 -1.324000  -7.236450  1.282200  \\n\",\n       \"287348 -1.965000 -10.000000 -0.730800 -2.855300  -7.733100 -2.122000  \\n\",\n       \"277718 -2.736100  -5.421300 -1.792900 -0.607800 -10.000000  4.759100  \\n\",\n       \"95654  -5.143300 -10.000000  4.393500 -9.374500  -6.165000 -2.423500  \\n\",\n       \"365599  4.510900 -10.000000  5.345800  0.884300 -10.000000  8.247100  \\n\",\n       \"\\n\",\n       \"[12285 rows x 984 columns]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/tmp/ipykernel_69015/2571295951.py:7: SettingWithCopyWarning: \\n\",\n      \"A value is trying to be set on a copy of a slice from a DataFrame.\\n\",\n      \"Try using .loc[row_indexer,col_indexer] = value instead\\n\",\n      \"\\n\",\n      \"See the caveats in the documentation: 
https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy\\n\",\n      \"  this_df[\\\"std\\\"] = std\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>full_id</th>\\n\",\n       \"      <th>pert_id</th>\\n\",\n       \"      <th>cell_iname</th>\\n\",\n       \"      <th>SMILES</th>\\n\",\n       \"      <th>inchi_key</th>\\n\",\n       \"      <th>compound_aliases</th>\\n\",\n       \"      <th>10007</th>\\n\",\n       \"      <th>1001</th>\\n\",\n       \"      <th>10013</th>\\n\",\n       \"      <th>10038</th>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <th>9918</th>\\n\",\n       \"      <th>9924</th>\\n\",\n       \"      <th>9926</th>\\n\",\n       \"      <th>9928</th>\\n\",\n       \"      <th>993</th>\\n\",\n       \"      <th>994</th>\\n\",\n       \"      <th>9943</th>\\n\",\n       \"      <th>9961</th>\\n\",\n       \"      <th>998</th>\\n\",\n       \"      <th>9988</th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>302663</th>\\n\",\n       \"      <td>CRCGN004_MCF7_6H:BRD-K91960538-001-06-8:10</td>\\n\",\n       \"      <td>BRD-K91960538</td>\\n\",\n       \"      <td>MCF7</td>\\n\",\n       \"      <td>CN(C)[N+][O-]</td>\\n\",\n       \"      
<td>UMFJAHHVKNCGLG-UHFFFAOYSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-0.1941</td>\\n\",\n       \"      <td>0.0048</td>\\n\",\n       \"      <td>0.0307</td>\\n\",\n       \"      <td>-0.2970</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.0767</td>\\n\",\n       \"      <td>-0.0044</td>\\n\",\n       \"      <td>-0.2711</td>\\n\",\n       \"      <td>0.1766</td>\\n\",\n       \"      <td>0.2891</td>\\n\",\n       \"      <td>-0.1575</td>\\n\",\n       \"      <td>-0.8057</td>\\n\",\n       \"      <td>0.2617</td>\\n\",\n       \"      <td>0.2762</td>\\n\",\n       \"      <td>-0.0126</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>301346</th>\\n\",\n       \"      <td>CRCGN004_MCF7_6H:BRD-A66229260-001-01-6:10</td>\\n\",\n       \"      <td>BRD-A66229260</td>\\n\",\n       \"      <td>MCF7</td>\\n\",\n       \"      <td>CC(CCCC(C)(C)O)C1CCC2C1(C)CCCC2=C/C=C1/CC(O)CC...</td>\\n\",\n       \"      <td>GMRQFYUYWCNGIN-KWJBEKLWSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>0.0306</td>\\n\",\n       \"      <td>0.1756</td>\\n\",\n       \"      <td>-0.0611</td>\\n\",\n       \"      <td>0.4677</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.6214</td>\\n\",\n       \"      <td>-1.7848</td>\\n\",\n       \"      <td>0.2496</td>\\n\",\n       \"      <td>0.3155</td>\\n\",\n       \"      <td>0.6782</td>\\n\",\n       \"      <td>0.0818</td>\\n\",\n       \"      <td>0.7288</td>\\n\",\n       \"      <td>-0.1041</td>\\n\",\n       \"      <td>-0.1184</td>\\n\",\n       \"      <td>0.3639</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>301292</th>\\n\",\n       \"      <td>CRCGN004_MCF7_6H:BRD-A43286952-001-01-7:19.9</td>\\n\",\n       \"      <td>BRD-A43286952</td>\\n\",\n       \"      <td>MCF7</td>\\n\",\n       \"      <td>CC12CCC3C(CCc4cc(O)ccc34)C1CCC2(O)C#C</td>\\n\",\n       \"      
<td>BFPYWIDHMRZLRN-UHFFFAOYSA-N</td>\\n\",\n       \"      <td>ethinyl-estradiol</td>\\n\",\n       \"      <td>-0.4324</td>\\n\",\n       \"      <td>0.1748</td>\\n\",\n       \"      <td>0.5012</td>\\n\",\n       \"      <td>-0.2536</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.3402</td>\\n\",\n       \"      <td>-1.1109</td>\\n\",\n       \"      <td>-0.0592</td>\\n\",\n       \"      <td>-0.0780</td>\\n\",\n       \"      <td>-0.0019</td>\\n\",\n       \"      <td>-0.9054</td>\\n\",\n       \"      <td>-0.0371</td>\\n\",\n       \"      <td>-0.3098</td>\\n\",\n       \"      <td>-0.1568</td>\\n\",\n       \"      <td>0.1244</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>301482</th>\\n\",\n       \"      <td>CRCGN004_MCF7_6H:BRD-A88939772-001-06-4:20.1</td>\\n\",\n       \"      <td>BRD-A88939772</td>\\n\",\n       \"      <td>MCF7</td>\\n\",\n       \"      <td>CC12CCC3C(CCc4cc(O)ccc34)C2CC(O)C1O</td>\\n\",\n       \"      <td>PROQIPRRNZUXQM-UHFFFAOYSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>0.1939</td>\\n\",\n       \"      <td>-1.0207</td>\\n\",\n       \"      <td>0.1532</td>\\n\",\n       \"      <td>0.3879</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.5235</td>\\n\",\n       \"      <td>0.5886</td>\\n\",\n       \"      <td>-0.2801</td>\\n\",\n       \"      <td>-0.3497</td>\\n\",\n       \"      <td>0.7149</td>\\n\",\n       \"      <td>-0.0927</td>\\n\",\n       \"      <td>0.1414</td>\\n\",\n       \"      <td>-0.3235</td>\\n\",\n       \"      <td>-0.1391</td>\\n\",\n       \"      <td>-0.0635</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>301575</th>\\n\",\n       \"      <td>CRCGN004_MCF7_6H:BRD-K09784055-001-05-8:10</td>\\n\",\n       \"      <td>BRD-K09784055</td>\\n\",\n       \"      <td>MCF7</td>\\n\",\n       \"      <td>c1ccc2c(c1)c3cccc4ccc5cccc2c5c43</td>\\n\",\n       \"      
<td>TXVHTIQJNYSSKO-UHFFFAOYSA-N</td>\\n\",\n       \"      <td>benzo(e)pyrene</td>\\n\",\n       \"      <td>0.1609</td>\\n\",\n       \"      <td>-0.3943</td>\\n\",\n       \"      <td>0.0122</td>\\n\",\n       \"      <td>-0.2167</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.1850</td>\\n\",\n       \"      <td>0.3569</td>\\n\",\n       \"      <td>-0.5812</td>\\n\",\n       \"      <td>-0.9063</td>\\n\",\n       \"      <td>0.0104</td>\\n\",\n       \"      <td>0.5815</td>\\n\",\n       \"      <td>0.0259</td>\\n\",\n       \"      <td>-0.1110</td>\\n\",\n       \"      <td>-0.5486</td>\\n\",\n       \"      <td>0.2293</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>409944</th>\\n\",\n       \"      <td>MUC.CP005_MCF7_6H:BRD-K62122103-001-01-6:0.3707</td>\\n\",\n       \"      <td>BRD-K62122103</td>\\n\",\n       \"      <td>MCF7</td>\\n\",\n       \"      <td>O[C@H]1COC[C@@H]2O[C@H](CC(=O)NCc3ccccc3)CC[C@...</td>\\n\",\n       \"      <td>PETYMTNRVWKWCT-QPXUXIHVSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-6.9677</td>\\n\",\n       \"      <td>-4.7041</td>\\n\",\n       \"    
  <td>10.0000</td>\\n\",\n       \"      <td>-4.5273</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-4.6201</td>\\n\",\n       \"      <td>1.1209</td>\\n\",\n       \"      <td>-7.5554</td>\\n\",\n       \"      <td>-4.4353</td>\\n\",\n       \"      <td>-4.1802</td>\\n\",\n       \"      <td>-10.0000</td>\\n\",\n       \"      <td>2.7441</td>\\n\",\n       \"      <td>10.0000</td>\\n\",\n       \"      <td>-0.3233</td>\\n\",\n       \"      <td>4.4998</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>408650</th>\\n\",\n       \"      <td>MUC.CP001_MCF7_6H:BRD-K25464116-001-01-3:10.0546</td>\\n\",\n       \"      <td>BRD-K25464116</td>\\n\",\n       \"      <td>MCF7</td>\\n\",\n       \"      <td>COc1cccc(CN2C[C@]3(CN(Cc4ccccn4)C3)c3c([nH]c4c...</td>\\n\",\n       \"      <td>MIXUGIDMCUQDBF-SANMLTNESA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>0.1379</td>\\n\",\n       \"      <td>-5.0869</td>\\n\",\n       \"      <td>10.0000</td>\\n\",\n       \"      <td>-10.0000</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.2269</td>\\n\",\n       \"      <td>0.7651</td>\\n\",\n       \"      <td>-10.0000</td>\\n\",\n       \"      <td>2.1298</td>\\n\",\n       \"      <td>1.6069</td>\\n\",\n       \"      <td>-7.8071</td>\\n\",\n       \"      <td>-1.3344</td>\\n\",\n       \"      <td>8.2091</td>\\n\",\n       \"      <td>-7.2477</td>\\n\",\n       \"      <td>-2.3272</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>164645</th>\\n\",\n       \"      <td>MUC.CP003_MCF7_24H:BRD-K16406336-311-01-2:10</td>\\n\",\n       \"      <td>BRD-K16406336</td>\\n\",\n       \"      <td>MCF7</td>\\n\",\n       \"      <td>CN(C)c1ccc2nc3ccc(cc3[s+]c2c1)N(C)C</td>\\n\",\n       \"      <td>RBTBFTRPCNLSDE-UHFFFAOYSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>1.1937</td>\\n\",\n       \"      <td>2.2605</td>\\n\",\n       \"      
<td>10.0000</td>\\n\",\n       \"      <td>0.6851</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>2.1453</td>\\n\",\n       \"      <td>-7.7583</td>\\n\",\n       \"      <td>1.0289</td>\\n\",\n       \"      <td>0.3450</td>\\n\",\n       \"      <td>3.3349</td>\\n\",\n       \"      <td>-4.9317</td>\\n\",\n       \"      <td>3.1063</td>\\n\",\n       \"      <td>5.5335</td>\\n\",\n       \"      <td>-6.5199</td>\\n\",\n       \"      <td>0.5741</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>234573</th>\\n\",\n       \"      <td>HDAC002_MCF7_24H:BRD-K02130563-001-05-6:1.25</td>\\n\",\n       \"      <td>BRD-K02130563</td>\\n\",\n       \"      <td>MCF7</td>\\n\",\n       \"      <td>Cc1[nH]c2ccccc2c1CCNCc1ccc(C=CC(=O)NO)cc1</td>\\n\",\n       \"      <td>FPOHNWQLNRZRFC-ZHACJKMWSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-0.8106</td>\\n\",\n       \"      <td>-2.7816</td>\\n\",\n       \"      <td>8.3877</td>\\n\",\n       \"      <td>-5.4464</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.4781</td>\\n\",\n       \"      <td>-1.8474</td>\\n\",\n       \"      <td>10.0000</td>\\n\",\n       \"      <td>3.8867</td>\\n\",\n       \"      <td>6.9389</td>\\n\",\n       \"      <td>-10.0000</td>\\n\",\n       \"      <td>3.6115</td>\\n\",\n       \"      <td>1.2103</td>\\n\",\n       \"      <td>-7.0777</td>\\n\",\n       \"      <td>1.1838</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>409816</th>\\n\",\n       \"      <td>MUC.CP005_MCF7_6H:BRD-K19696793-001-01-8:9.9971</td>\\n\",\n       \"      <td>BRD-K19696793</td>\\n\",\n       \"      <td>MCF7</td>\\n\",\n       \"      <td>O[C@@H]1COC[C@H]2O[C@@H](CC(=O)N3CCOCC3)CC[C@@...</td>\\n\",\n       \"      <td>RQXRWTHAFLVOKQ-ACZWYYKOSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-10.0000</td>\\n\",\n       \"      <td>3.6340</td>\\n\",\n       \"      
<td>10.0000</td>\\n\",\n       \"      <td>-10.0000</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-2.9725</td>\\n\",\n       \"      <td>7.2787</td>\\n\",\n       \"      <td>2.5405</td>\\n\",\n       \"      <td>-5.7262</td>\\n\",\n       \"      <td>-1.6385</td>\\n\",\n       \"      <td>-6.7065</td>\\n\",\n       \"      <td>3.3411</td>\\n\",\n       \"      <td>9.9241</td>\\n\",\n       \"      <td>-6.0190</td>\\n\",\n       \"      <td>10.0000</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"<p>11622 rows × 984 columns</p>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"                                                 full_id        pert_id  \\\\\\n\",\n       \"302663        CRCGN004_MCF7_6H:BRD-K91960538-001-06-8:10  BRD-K91960538   \\n\",\n       \"301346        CRCGN004_MCF7_6H:BRD-A66229260-001-01-6:10  BRD-A66229260   \\n\",\n       \"301292      CRCGN004_MCF7_6H:BRD-A43286952-001-01-7:19.9  BRD-A43286952   \\n\",\n       \"301482      CRCGN004_MCF7_6H:BRD-A88939772-001-06-4:20.1  BRD-A88939772   \\n\",\n       \"301575        CRCGN004_MCF7_6H:BRD-K09784055-001-05-8:10  BRD-K09784055   \\n\",\n       \"...                                                  ...            ...   
\\n\",\n       \"409944   MUC.CP005_MCF7_6H:BRD-K62122103-001-01-6:0.3707  BRD-K62122103   \\n\",\n       \"408650  MUC.CP001_MCF7_6H:BRD-K25464116-001-01-3:10.0546  BRD-K25464116   \\n\",\n       \"164645      MUC.CP003_MCF7_24H:BRD-K16406336-311-01-2:10  BRD-K16406336   \\n\",\n       \"234573      HDAC002_MCF7_24H:BRD-K02130563-001-05-6:1.25  BRD-K02130563   \\n\",\n       \"409816   MUC.CP005_MCF7_6H:BRD-K19696793-001-01-8:9.9971  BRD-K19696793   \\n\",\n       \"\\n\",\n       \"       cell_iname                                             SMILES  \\\\\\n\",\n       \"302663       MCF7                                      CN(C)[N+][O-]   \\n\",\n       \"301346       MCF7  CC(CCCC(C)(C)O)C1CCC2C1(C)CCCC2=C/C=C1/CC(O)CC...   \\n\",\n       \"301292       MCF7              CC12CCC3C(CCc4cc(O)ccc34)C1CCC2(O)C#C   \\n\",\n       \"301482       MCF7                CC12CCC3C(CCc4cc(O)ccc34)C2CC(O)C1O   \\n\",\n       \"301575       MCF7                   c1ccc2c(c1)c3cccc4ccc5cccc2c5c43   \\n\",\n       \"...           ...                                                ...   \\n\",\n       \"409944       MCF7  O[C@H]1COC[C@@H]2O[C@H](CC(=O)NCc3ccccc3)CC[C@...   \\n\",\n       \"408650       MCF7  COc1cccc(CN2C[C@]3(CN(Cc4ccccn4)C3)c3c([nH]c4c...   \\n\",\n       \"164645       MCF7                CN(C)c1ccc2nc3ccc(cc3[s+]c2c1)N(C)C   \\n\",\n       \"234573       MCF7          Cc1[nH]c2ccccc2c1CCNCc1ccc(C=CC(=O)NO)cc1   \\n\",\n       \"409816       MCF7  O[C@@H]1COC[C@H]2O[C@@H](CC(=O)N3CCOCC3)CC[C@@...   
\\n\",\n       \"\\n\",\n       \"                          inchi_key   compound_aliases    10007    1001  \\\\\\n\",\n       \"302663  UMFJAHHVKNCGLG-UHFFFAOYSA-N                NaN  -0.1941  0.0048   \\n\",\n       \"301346  GMRQFYUYWCNGIN-KWJBEKLWSA-N                NaN   0.0306  0.1756   \\n\",\n       \"301292  BFPYWIDHMRZLRN-UHFFFAOYSA-N  ethinyl-estradiol  -0.4324  0.1748   \\n\",\n       \"301482  PROQIPRRNZUXQM-UHFFFAOYSA-N                NaN   0.1939 -1.0207   \\n\",\n       \"301575  TXVHTIQJNYSSKO-UHFFFAOYSA-N     benzo(e)pyrene   0.1609 -0.3943   \\n\",\n       \"...                             ...                ...      ...     ...   \\n\",\n       \"409944  PETYMTNRVWKWCT-QPXUXIHVSA-N                NaN  -6.9677 -4.7041   \\n\",\n       \"408650  MIXUGIDMCUQDBF-SANMLTNESA-N                NaN   0.1379 -5.0869   \\n\",\n       \"164645  RBTBFTRPCNLSDE-UHFFFAOYSA-N                NaN   1.1937  2.2605   \\n\",\n       \"234573  FPOHNWQLNRZRFC-ZHACJKMWSA-N                NaN  -0.8106 -2.7816   \\n\",\n       \"409816  RQXRWTHAFLVOKQ-ACZWYYKOSA-N                NaN -10.0000  3.6340   \\n\",\n       \"\\n\",\n       \"          10013    10038  ...    9918    9924     9926    9928     993  \\\\\\n\",\n       \"302663   0.0307  -0.2970  ... -0.0767 -0.0044  -0.2711  0.1766  0.2891   \\n\",\n       \"301346  -0.0611   0.4677  ... -0.6214 -1.7848   0.2496  0.3155  0.6782   \\n\",\n       \"301292   0.5012  -0.2536  ... -0.3402 -1.1109  -0.0592 -0.0780 -0.0019   \\n\",\n       \"301482   0.1532   0.3879  ... -0.5235  0.5886  -0.2801 -0.3497  0.7149   \\n\",\n       \"301575   0.0122  -0.2167  ... -0.1850  0.3569  -0.5812 -0.9063  0.0104   \\n\",\n       \"...         ...      ...  ...     ...     ...      ...     ...     ...   \\n\",\n       \"409944  10.0000  -4.5273  ... -4.6201  1.1209  -7.5554 -4.4353 -4.1802   \\n\",\n       \"408650  10.0000 -10.0000  ... -0.2269  0.7651 -10.0000  2.1298  1.6069   \\n\",\n       \"164645  10.0000   0.6851  ...  
2.1453 -7.7583   1.0289  0.3450  3.3349   \\n\",\n       \"234573   8.3877  -5.4464  ...  0.4781 -1.8474  10.0000  3.8867  6.9389   \\n\",\n       \"409816  10.0000 -10.0000  ... -2.9725  7.2787   2.5405 -5.7262 -1.6385   \\n\",\n       \"\\n\",\n       \"            994    9943     9961     998     9988  \\n\",\n       \"302663  -0.1575 -0.8057   0.2617  0.2762  -0.0126  \\n\",\n       \"301346   0.0818  0.7288  -0.1041 -0.1184   0.3639  \\n\",\n       \"301292  -0.9054 -0.0371  -0.3098 -0.1568   0.1244  \\n\",\n       \"301482  -0.0927  0.1414  -0.3235 -0.1391  -0.0635  \\n\",\n       \"301575   0.5815  0.0259  -0.1110 -0.5486   0.2293  \\n\",\n       \"...         ...     ...      ...     ...      ...  \\n\",\n       \"409944 -10.0000  2.7441  10.0000 -0.3233   4.4998  \\n\",\n       \"408650  -7.8071 -1.3344   8.2091 -7.2477  -2.3272  \\n\",\n       \"164645  -4.9317  3.1063   5.5335 -6.5199   0.5741  \\n\",\n       \"234573 -10.0000  3.6115   1.2103 -7.0777   1.1838  \\n\",\n       \"409816  -6.7065  3.3411   9.9241 -6.0190  10.0000  \\n\",\n       \"\\n\",\n       \"[11622 rows x 984 columns]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/tmp/ipykernel_69015/2571295951.py:7: SettingWithCopyWarning: \\n\",\n      \"A value is trying to be set on a copy of a slice from a DataFrame.\\n\",\n      \"Try using .loc[row_indexer,col_indexer] = value instead\\n\",\n      \"\\n\",\n      \"See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy\\n\",\n      \"  this_df[\\\"std\\\"] = std\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       
\"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>full_id</th>\\n\",\n       \"      <th>pert_id</th>\\n\",\n       \"      <th>cell_iname</th>\\n\",\n       \"      <th>SMILES</th>\\n\",\n       \"      <th>inchi_key</th>\\n\",\n       \"      <th>compound_aliases</th>\\n\",\n       \"      <th>10007</th>\\n\",\n       \"      <th>1001</th>\\n\",\n       \"      <th>10013</th>\\n\",\n       \"      <th>10038</th>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <th>9918</th>\\n\",\n       \"      <th>9924</th>\\n\",\n       \"      <th>9926</th>\\n\",\n       \"      <th>9928</th>\\n\",\n       \"      <th>993</th>\\n\",\n       \"      <th>994</th>\\n\",\n       \"      <th>9943</th>\\n\",\n       \"      <th>9961</th>\\n\",\n       \"      <th>998</th>\\n\",\n       \"      <th>9988</th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>386749</th>\\n\",\n       \"      <td>DPK.CP003_PC3_24H:BRD-K50071428:10</td>\\n\",\n       \"      <td>BRD-K50071428</td>\\n\",\n       \"      <td>PC3</td>\\n\",\n       \"      <td>C[C@]12CC[C@H]3[C@@H](CC=C4C[C@@H](O)CC[C@]34C...</td>\\n\",\n       \"      <td>GZOSMCIZMLWJML-VJLLXTKPSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>0.323546</td>\\n\",\n       \"      <td>-1.490595</td>\\n\",\n       \"      <td>-0.444563</td>\\n\",\n       \"      <td>-0.607655</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.696308</td>\\n\",\n       \"      <td>-1.218408</td>\\n\",\n       \"      
<td>0.557992</td>\\n\",\n       \"      <td>0.239305</td>\\n\",\n       \"      <td>0.439346</td>\\n\",\n       \"      <td>0.442279</td>\\n\",\n       \"      <td>0.218131</td>\\n\",\n       \"      <td>-0.531223</td>\\n\",\n       \"      <td>-0.108359</td>\\n\",\n       \"      <td>0.069679</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>201781</th>\\n\",\n       \"      <td>CPC008_PC3_24H:BRD-K49791723-001-01-6:10</td>\\n\",\n       \"      <td>BRD-K49791723</td>\\n\",\n       \"      <td>PC3</td>\\n\",\n       \"      <td>NC(=O)C1CCN(CC1)c1nc(cs1)-c1ccc(Oc2ccccc2)cc1</td>\\n\",\n       \"      <td>YKWYKGWGJNJYKQ-UHFFFAOYSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>0.030490</td>\\n\",\n       \"      <td>-0.425787</td>\\n\",\n       \"      <td>0.023675</td>\\n\",\n       \"      <td>0.285044</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.737361</td>\\n\",\n       \"      <td>-0.485443</td>\\n\",\n       \"      <td>0.043927</td>\\n\",\n       \"      <td>0.591405</td>\\n\",\n       \"      <td>-0.393642</td>\\n\",\n       \"      <td>0.896101</td>\\n\",\n       \"      <td>-0.710377</td>\\n\",\n       \"      <td>0.201203</td>\\n\",\n       \"      <td>-2.066092</td>\\n\",\n       \"      <td>0.762952</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>386748</th>\\n\",\n       \"      <td>DPK.CP003_PC3_24H:BRD-K28824103:10</td>\\n\",\n       \"      <td>BRD-K28824103</td>\\n\",\n       \"      <td>PC3</td>\\n\",\n       \"      <td>COC(=O)C1=CO[C@@H](O)[C@H]2[C@@H]1CC=C2CO</td>\\n\",\n       \"      <td>AZKVWQKMDGGDSV-BCMRRPTOSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>0.341272</td>\\n\",\n       \"      <td>-0.805053</td>\\n\",\n       \"      <td>-0.251595</td>\\n\",\n       \"      <td>0.282141</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.558577</td>\\n\",\n       \"      <td>0.073408</td>\\n\",\n 
      \"      <td>-0.050935</td>\\n\",\n       \"      <td>0.256479</td>\\n\",\n       \"      <td>0.106139</td>\\n\",\n       \"      <td>-0.051141</td>\\n\",\n       \"      <td>-0.471109</td>\\n\",\n       \"      <td>-0.049407</td>\\n\",\n       \"      <td>0.952701</td>\\n\",\n       \"      <td>0.050138</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>386752</th>\\n\",\n       \"      <td>DPK.CP003_PC3_24H:BRD-K64034691:10</td>\\n\",\n       \"      <td>BRD-K64034691</td>\\n\",\n       \"      <td>PC3</td>\\n\",\n       \"      <td>CC#CC(=O)N1CCC[C@H]1c1nc(-c2ccc(cc2)C(=O)Nc2cc...</td>\\n\",\n       \"      <td>WDENQIQQYWYTPO-IBGZPJMESA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>0.021362</td>\\n\",\n       \"      <td>-0.234064</td>\\n\",\n       \"      <td>-0.003436</td>\\n\",\n       \"      <td>-0.088608</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>1.536669</td>\\n\",\n       \"      <td>-0.048115</td>\\n\",\n       \"      <td>-0.126921</td>\\n\",\n       \"      <td>-0.101185</td>\\n\",\n       \"      <td>-0.279225</td>\\n\",\n       \"      <td>0.012632</td>\\n\",\n       \"      <td>-0.219388</td>\\n\",\n       \"      <td>0.754346</td>\\n\",\n       \"      <td>-0.501091</td>\\n\",\n       \"      <td>0.014386</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>208034</th>\\n\",\n       \"      <td>CPC009_PC3_24H:BRD-K85237725-001-01-3:10</td>\\n\",\n       \"      <td>BRD-K85237725</td>\\n\",\n       \"      <td>PC3</td>\\n\",\n       \"      <td>NC(=O)C1CCN(CC1)c1ncc(Br)s1</td>\\n\",\n       \"      <td>MIMIVBJPBQWWPP-UHFFFAOYSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-0.083892</td>\\n\",\n       \"      <td>-0.260815</td>\\n\",\n       \"      <td>-0.158541</td>\\n\",\n       \"      <td>0.025183</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>1.155743</td>\\n\",\n       \"      
<td>0.804736</td>\\n\",\n       \"      <td>0.242454</td>\\n\",\n       \"      <td>-0.171066</td>\\n\",\n       \"      <td>0.218899</td>\\n\",\n       \"      <td>-2.360980</td>\\n\",\n       \"      <td>-0.802210</td>\\n\",\n       \"      <td>-0.554818</td>\\n\",\n       \"      <td>0.002064</td>\\n\",\n       \"      <td>0.385117</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>428040</th>\\n\",\n       \"      <td>PCLB003_PC3_24H:BRD-K42573370-001-01-1:10</td>\\n\",\n       \"      <td>BRD-K42573370</td>\\n\",\n       \"      <td>PC3</td>\\n\",\n       \"      <td>CC1(C)[C@@H]2C[C@]34CCCN3C(=O)[C@]2(NC4=O)C=C2...</td>\\n\",\n       \"      <td>YSHQRTLYKODQCX-PUUVEUEGSA-N</td>\\n\",\n       \"      <td>avrainvillamide-analog-2</td>\\n\",\n       \"      <td>0.528700</td>\\n\",\n       \"      <td>4.174200</td>\\n\",\n       \"      <td>-1.298400</td>\\n\",\n       \"      <td>-6.898200</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>2.227500</td>\\n\",\n       \"      <td>0.251100</td>\\n\",\n       \"      <td>-8.700700</td>\\n\",\n       \"      <td>6.789800</td>\\n\",\n       \"      
<td>-1.726200</td>\\n\",\n       \"      <td>1.471900</td>\\n\",\n       \"      <td>-3.751800</td>\\n\",\n       \"      <td>-0.724300</td>\\n\",\n       \"      <td>-0.113400</td>\\n\",\n       \"      <td>2.462100</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>286889</th>\\n\",\n       \"      <td>PCLB003_PC3_24H:BRD-K04466929-001-05-1:10</td>\\n\",\n       \"      <td>BRD-K04466929</td>\\n\",\n       \"      <td>PC3</td>\\n\",\n       \"      <td>CC(=O)Nc1ccc(cc1)C(=O)Nc2cc(ccc2N)c3cccs3</td>\\n\",\n       \"      <td>ABZSPJVXTTUFAA-UHFFFAOYSA-N</td>\\n\",\n       \"      <td>MERCK-60</td>\\n\",\n       \"      <td>-2.438300</td>\\n\",\n       \"      <td>-1.226700</td>\\n\",\n       \"      <td>-6.216600</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-1.911700</td>\\n\",\n       \"      <td>4.312700</td>\\n\",\n       \"      <td>2.333200</td>\\n\",\n       \"      <td>2.156200</td>\\n\",\n       \"      <td>3.754500</td>\\n\",\n       \"      <td>-2.712000</td>\\n\",\n       \"      <td>5.745800</td>\\n\",\n       \"      <td>5.767900</td>\\n\",\n       \"      <td>1.687200</td>\\n\",\n       \"      <td>8.743100</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>384572</th>\\n\",\n       \"      <td>PCLB002_PC3_24H:BRD-K98645985:10</td>\\n\",\n       \"      <td>BRD-K98645985</td>\\n\",\n       \"      <td>PC3</td>\\n\",\n       \"      <td>CO[C@H]1CN(C)C(=O)c2cc(NC(=O)NC(C)C)ccc2OC[C@H...</td>\\n\",\n       \"      <td>DZGZFXRKNSZUSK-MKGJDZFWSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-1.920700</td>\\n\",\n       \"      <td>10.000000</td>\\n\",\n       \"      <td>0.114900</td>\\n\",\n       \"      <td>-6.938500</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-3.226400</td>\\n\",\n       \"      <td>3.250600</td>\\n\",\n       \"      <td>-3.184800</td>\\n\",\n       \"      
<td>7.009000</td>\\n\",\n       \"      <td>0.868200</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>6.461500</td>\\n\",\n       \"      <td>3.070500</td>\\n\",\n       \"      <td>-1.547200</td>\\n\",\n       \"      <td>6.416900</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>234649</th>\\n\",\n       \"      <td>HDAC002_PC3_6H:BRD-K02130563-001-05-6:1.25</td>\\n\",\n       \"      <td>BRD-K02130563</td>\\n\",\n       \"      <td>PC3</td>\\n\",\n       \"      <td>Cc1[nH]c2ccccc2c1CCNCc1ccc(C=CC(=O)NO)cc1</td>\\n\",\n       \"      <td>FPOHNWQLNRZRFC-ZHACJKMWSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-5.626050</td>\\n\",\n       \"      <td>-3.415650</td>\\n\",\n       \"      <td>8.913350</td>\\n\",\n       \"      <td>-4.190050</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>4.526700</td>\\n\",\n       \"      <td>1.134650</td>\\n\",\n       \"      <td>2.826550</td>\\n\",\n       \"      <td>0.276850</td>\\n\",\n       \"      <td>2.031500</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>1.640250</td>\\n\",\n       \"      <td>1.542300</td>\\n\",\n       \"      <td>-7.604000</td>\\n\",\n       \"      <td>0.276700</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>391115</th>\\n\",\n       \"      <td>ERG014_PC3_6H:BRD-K20624574-001-01-2:10.1199</td>\\n\",\n       \"      <td>BRD-K20624574</td>\\n\",\n       \"      <td>PC3</td>\\n\",\n       \"      <td>C[C@H](CO)N1C[C@@H](C)[C@H](CN(C)C(=O)Nc2ccc3O...</td>\\n\",\n       \"      <td>BZCYEFWYEYUQFU-LRKPSKMWSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-1.258200</td>\\n\",\n       \"      <td>1.814400</td>\\n\",\n       \"      <td>-2.333400</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>2.061500</td>\\n\",\n       \"      <td>6.157000</td>\\n\",\n       \"      
<td>-4.861100</td>\\n\",\n       \"      <td>-5.385500</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>5.805700</td>\\n\",\n       \"      <td>4.844300</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>10.000000</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"<p>11521 rows × 984 columns</p>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"                                             full_id        pert_id  \\\\\\n\",\n       \"386749            DPK.CP003_PC3_24H:BRD-K50071428:10  BRD-K50071428   \\n\",\n       \"201781      CPC008_PC3_24H:BRD-K49791723-001-01-6:10  BRD-K49791723   \\n\",\n       \"386748            DPK.CP003_PC3_24H:BRD-K28824103:10  BRD-K28824103   \\n\",\n       \"386752            DPK.CP003_PC3_24H:BRD-K64034691:10  BRD-K64034691   \\n\",\n       \"208034      CPC009_PC3_24H:BRD-K85237725-001-01-3:10  BRD-K85237725   \\n\",\n       \"...                                              ...            ...   \\n\",\n       \"428040     PCLB003_PC3_24H:BRD-K42573370-001-01-1:10  BRD-K42573370   \\n\",\n       \"286889     PCLB003_PC3_24H:BRD-K04466929-001-05-1:10  BRD-K04466929   \\n\",\n       \"384572              PCLB002_PC3_24H:BRD-K98645985:10  BRD-K98645985   \\n\",\n       \"234649    HDAC002_PC3_6H:BRD-K02130563-001-05-6:1.25  BRD-K02130563   \\n\",\n       \"391115  ERG014_PC3_6H:BRD-K20624574-001-01-2:10.1199  BRD-K20624574   \\n\",\n       \"\\n\",\n       \"       cell_iname                                             SMILES  \\\\\\n\",\n       \"386749        PC3  C[C@]12CC[C@H]3[C@@H](CC=C4C[C@@H](O)CC[C@]34C...   \\n\",\n       \"201781        PC3      NC(=O)C1CCN(CC1)c1nc(cs1)-c1ccc(Oc2ccccc2)cc1   \\n\",\n       \"386748        PC3          COC(=O)C1=CO[C@@H](O)[C@H]2[C@@H]1CC=C2CO   \\n\",\n       \"386752        PC3  CC#CC(=O)N1CCC[C@H]1c1nc(-c2ccc(cc2)C(=O)Nc2cc...   
\\n\",\n       \"208034        PC3                        NC(=O)C1CCN(CC1)c1ncc(Br)s1   \\n\",\n       \"...           ...                                                ...   \\n\",\n       \"428040        PC3  CC1(C)[C@@H]2C[C@]34CCCN3C(=O)[C@]2(NC4=O)C=C2...   \\n\",\n       \"286889        PC3          CC(=O)Nc1ccc(cc1)C(=O)Nc2cc(ccc2N)c3cccs3   \\n\",\n       \"384572        PC3  CO[C@H]1CN(C)C(=O)c2cc(NC(=O)NC(C)C)ccc2OC[C@H...   \\n\",\n       \"234649        PC3          Cc1[nH]c2ccccc2c1CCNCc1ccc(C=CC(=O)NO)cc1   \\n\",\n       \"391115        PC3  C[C@H](CO)N1C[C@@H](C)[C@H](CN(C)C(=O)Nc2ccc3O...   \\n\",\n       \"\\n\",\n       \"                          inchi_key          compound_aliases     10007  \\\\\\n\",\n       \"386749  GZOSMCIZMLWJML-VJLLXTKPSA-N                       NaN  0.323546   \\n\",\n       \"201781  YKWYKGWGJNJYKQ-UHFFFAOYSA-N                       NaN  0.030490   \\n\",\n       \"386748  AZKVWQKMDGGDSV-BCMRRPTOSA-N                       NaN  0.341272   \\n\",\n       \"386752  WDENQIQQYWYTPO-IBGZPJMESA-N                       NaN  0.021362   \\n\",\n       \"208034  MIMIVBJPBQWWPP-UHFFFAOYSA-N                       NaN -0.083892   \\n\",\n       \"...                             ...                       ...       ...   \\n\",\n       \"428040  YSHQRTLYKODQCX-PUUVEUEGSA-N  avrainvillamide-analog-2  0.528700   \\n\",\n       \"286889  ABZSPJVXTTUFAA-UHFFFAOYSA-N                  MERCK-60 -2.438300   \\n\",\n       \"384572  DZGZFXRKNSZUSK-MKGJDZFWSA-N                       NaN -1.920700   \\n\",\n       \"234649  FPOHNWQLNRZRFC-ZHACJKMWSA-N                       NaN -5.626050   \\n\",\n       \"391115  BZCYEFWYEYUQFU-LRKPSKMWSA-N                       NaN -1.258200   \\n\",\n       \"\\n\",\n       \"             1001     10013      10038  ...      9918      9924      9926  \\\\\\n\",\n       \"386749  -1.490595 -0.444563  -0.607655  ...  0.696308 -1.218408  0.557992   \\n\",\n       \"201781  -0.425787  0.023675   0.285044  ... 
-0.737361 -0.485443  0.043927   \\n\",\n       \"386748  -0.805053 -0.251595   0.282141  ...  0.558577  0.073408 -0.050935   \\n\",\n       \"386752  -0.234064 -0.003436  -0.088608  ...  1.536669 -0.048115 -0.126921   \\n\",\n       \"208034  -0.260815 -0.158541   0.025183  ...  1.155743  0.804736  0.242454   \\n\",\n       \"...           ...       ...        ...  ...       ...       ...       ...   \\n\",\n       \"428040   4.174200 -1.298400  -6.898200  ...  2.227500  0.251100 -8.700700   \\n\",\n       \"286889  -1.226700 -6.216600 -10.000000  ... -1.911700  4.312700  2.333200   \\n\",\n       \"384572  10.000000  0.114900  -6.938500  ... -3.226400  3.250600 -3.184800   \\n\",\n       \"234649  -3.415650  8.913350  -4.190050  ...  4.526700  1.134650  2.826550   \\n\",\n       \"391115   1.814400 -2.333400 -10.000000  ...  2.061500  6.157000 -4.861100   \\n\",\n       \"\\n\",\n       \"            9928        993        994      9943      9961        998  \\\\\\n\",\n       \"386749  0.239305   0.439346   0.442279  0.218131 -0.531223  -0.108359   \\n\",\n       \"201781  0.591405  -0.393642   0.896101 -0.710377  0.201203  -2.066092   \\n\",\n       \"386748  0.256479   0.106139  -0.051141 -0.471109 -0.049407   0.952701   \\n\",\n       \"386752 -0.101185  -0.279225   0.012632 -0.219388  0.754346  -0.501091   \\n\",\n       \"208034 -0.171066   0.218899  -2.360980 -0.802210 -0.554818   0.002064   \\n\",\n       \"...          ...        ...        ...       ...       ...        ...   
\\n\",\n       \"428040  6.789800  -1.726200   1.471900 -3.751800 -0.724300  -0.113400   \\n\",\n       \"286889  2.156200   3.754500  -2.712000  5.745800  5.767900   1.687200   \\n\",\n       \"384572  7.009000   0.868200 -10.000000  6.461500  3.070500  -1.547200   \\n\",\n       \"234649  0.276850   2.031500 -10.000000  1.640250  1.542300  -7.604000   \\n\",\n       \"391115 -5.385500 -10.000000 -10.000000  5.805700  4.844300 -10.000000   \\n\",\n       \"\\n\",\n       \"             9988  \\n\",\n       \"386749   0.069679  \\n\",\n       \"201781   0.762952  \\n\",\n       \"386748   0.050138  \\n\",\n       \"386752   0.014386  \\n\",\n       \"208034   0.385117  \\n\",\n       \"...           ...  \\n\",\n       \"428040   2.462100  \\n\",\n       \"286889   8.743100  \\n\",\n       \"384572   6.416900  \\n\",\n       \"234649   0.276700  \\n\",\n       \"391115  10.000000  \\n\",\n       \"\\n\",\n       \"[11521 rows x 984 columns]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/tmp/ipykernel_69015/2571295951.py:7: SettingWithCopyWarning: \\n\",\n      \"A value is trying to be set on a copy of a slice from a DataFrame.\\n\",\n      \"Try using .loc[row_indexer,col_indexer] = value instead\\n\",\n      \"\\n\",\n      \"See the caveats in the documentation: https://pandas.pydata.org/pandas-docs/stable/user_guide/indexing.html#returning-a-view-versus-a-copy\\n\",\n      \"  this_df[\\\"std\\\"] = std\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe 
thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>full_id</th>\\n\",\n       \"      <th>pert_id</th>\\n\",\n       \"      <th>cell_iname</th>\\n\",\n       \"      <th>SMILES</th>\\n\",\n       \"      <th>inchi_key</th>\\n\",\n       \"      <th>compound_aliases</th>\\n\",\n       \"      <th>10007</th>\\n\",\n       \"      <th>1001</th>\\n\",\n       \"      <th>10013</th>\\n\",\n       \"      <th>10038</th>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <th>9918</th>\\n\",\n       \"      <th>9924</th>\\n\",\n       \"      <th>9926</th>\\n\",\n       \"      <th>9928</th>\\n\",\n       \"      <th>993</th>\\n\",\n       \"      <th>994</th>\\n\",\n       \"      <th>9943</th>\\n\",\n       \"      <th>9961</th>\\n\",\n       \"      <th>998</th>\\n\",\n       \"      <th>9988</th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>227534</th>\\n\",\n       \"      <td>CPC011_A375_6H:BRD-K43236057-001-04-8:10</td>\\n\",\n       \"      <td>BRD-K43236057</td>\\n\",\n       \"      <td>A375</td>\\n\",\n       \"      <td>OC[C@H]1O[C@@H](Oc2cc(O)cc(C=Cc3ccc(O)cc3)c2)[...</td>\\n\",\n       \"      <td>HSTZMXCBWJGKHG-CUYWLFDKSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>0.240231</td>\\n\",\n       \"      <td>-0.427503</td>\\n\",\n       \"      <td>-0.371866</td>\\n\",\n       \"      <td>0.160758</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.049103</td>\\n\",\n       \"      <td>0.112653</td>\\n\",\n       \"      <td>0.787318</td>\\n\",\n       \"      <td>0.361519</td>\\n\",\n       \"      <td>0.595066</td>\\n\",\n       \"      <td>-0.050393</td>\\n\",\n       \"      
<td>-0.000183</td>\\n\",\n       \"      <td>-0.562766</td>\\n\",\n       \"      <td>0.539329</td>\\n\",\n       \"      <td>0.513262</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>69737</th>\\n\",\n       \"      <td>CPC018_A375_6H:BRD-A24122750-001-01-1:10</td>\\n\",\n       \"      <td>BRD-A24122750</td>\\n\",\n       \"      <td>A375</td>\\n\",\n       \"      <td>NCC(CS(O)(=O)=O)c1ccc(Cl)cc1</td>\\n\",\n       \"      <td>JYLNVJYYQQXNEK-UHFFFAOYSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-0.231360</td>\\n\",\n       \"      <td>-0.205703</td>\\n\",\n       \"      <td>0.299739</td>\\n\",\n       \"      <td>-0.266904</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.487991</td>\\n\",\n       \"      <td>0.630590</td>\\n\",\n       \"      <td>0.253384</td>\\n\",\n       \"      <td>0.054045</td>\\n\",\n       \"      <td>-0.105749</td>\\n\",\n       \"      <td>0.294856</td>\\n\",\n       \"      <td>0.243753</td>\\n\",\n       \"      <td>-0.093112</td>\\n\",\n       \"      <td>-0.415093</td>\\n\",\n       \"      <td>1.505666</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>227941</th>\\n\",\n       \"      <td>CPC011_A375_6H:BRD-K50938287-036-09-6:10</td>\\n\",\n       \"      <td>BRD-K50938287</td>\\n\",\n       \"      <td>A375</td>\\n\",\n       \"      <td>CNS(=O)(=O)Cc1ccc2[nH]cc(CCN(C)C)c2c1</td>\\n\",\n       \"      <td>KQKPFRSPSRPDEB-UHFFFAOYSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>0.149539</td>\\n\",\n       \"      <td>-0.115549</td>\\n\",\n       \"      <td>-0.474415</td>\\n\",\n       \"      <td>-0.391042</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.002793</td>\\n\",\n       \"      <td>-0.124959</td>\\n\",\n       \"      <td>0.359368</td>\\n\",\n       \"      <td>0.224918</td>\\n\",\n       \"      <td>-0.616804</td>\\n\",\n       \"      <td>-0.203747</td>\\n\",\n       \"  
    <td>-0.859153</td>\\n\",\n       \"      <td>-0.795572</td>\\n\",\n       \"      <td>0.456950</td>\\n\",\n       \"      <td>0.490084</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>76230</th>\\n\",\n       \"      <td>CPC018_A375_6H:BRD-K18316707-001-01-9:10</td>\\n\",\n       \"      <td>BRD-K18316707</td>\\n\",\n       \"      <td>A375</td>\\n\",\n       \"      <td>COc1cc(C)cc(OC)c1[C@@H]1C=C(C)CC[C@H]1C(C)=C</td>\\n\",\n       \"      <td>ICHJMVMWPKLUKT-JKSUJKDBSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-0.200260</td>\\n\",\n       \"      <td>-0.302280</td>\\n\",\n       \"      <td>-0.443100</td>\\n\",\n       \"      <td>0.155580</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.002480</td>\\n\",\n       \"      <td>-0.421380</td>\\n\",\n       \"      <td>-0.397380</td>\\n\",\n       \"      <td>0.329560</td>\\n\",\n       \"      <td>-0.203320</td>\\n\",\n       \"      <td>0.705760</td>\\n\",\n       \"      <td>-0.940700</td>\\n\",\n       \"      <td>0.317900</td>\\n\",\n       \"      <td>-1.341780</td>\\n\",\n       \"      <td>-1.011780</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>225999</th>\\n\",\n       \"      <td>CPC011_A375_6H:BRD-K29113274-001-20-0:10</td>\\n\",\n       \"      <td>BRD-K29113274</td>\\n\",\n       \"      <td>A375</td>\\n\",\n       \"      <td>CC(=O)N1CCN(CC1)c1ccc(OC[C@H]2CO[C@@](Cn3ccnc3...</td>\\n\",\n       \"      <td>XMAYWYJOQHXEEK-OZXSUGGESA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>0.545711</td>\\n\",\n       \"      <td>-0.585337</td>\\n\",\n       \"      <td>-0.268187</td>\\n\",\n       \"      <td>0.319893</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-0.356957</td>\\n\",\n       \"      <td>0.408205</td>\\n\",\n       \"      <td>0.002745</td>\\n\",\n       \"      <td>0.253226</td>\\n\",\n       \"      <td>0.371863</td>\\n\",\n       \"      
<td>-0.176473</td>\\n\",\n       \"      <td>-0.691677</td>\\n\",\n       \"      <td>-0.492211</td>\\n\",\n       \"      <td>0.317271</td>\\n\",\n       \"      <td>0.119787</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>287335</th>\\n\",\n       \"      <td>PCLB003_A375_24H:BRD-K17953061-001-10-1:10</td>\\n\",\n       \"      <td>BRD-K17953061</td>\\n\",\n       \"      <td>A375</td>\\n\",\n       \"      <td>CN[C@@H]1C[C@H]2O[C@@](C)([C@@H]1OC)n3c4ccccc4...</td>\\n\",\n       \"      <td>HKSZLNNOFSGOKW-FYTWVXJKSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-9.032300</td>\\n\",\n       \"      <td>-1.144700</td>\\n\",\n       \"      <td>4.462500</td>\\n\",\n       \"      <td>0.609000</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-1.185900</td>\\n\",\n       \"      <td>-1.904500</td>\\n\",\n       \"      <td>-9.802100</td>\\n\",\n       \"      <td>-1.854500</td>\\n\",\n       \"      <td>-6.617700</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>-6.913400</td>\\n\",\n       \"      <td>-2.248000</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n    
   \"      <td>-0.050400</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>150</th>\\n\",\n       \"      <td>DOSVAL004_A375_24H:BRD-A61304759:5</td>\\n\",\n       \"      <td>BRD-A61304759</td>\\n\",\n       \"      <td>A375</td>\\n\",\n       \"      <td>COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...</td>\\n\",\n       \"      <td>AYUNIORJHRXIBJ-ZGQRYRSUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-0.695700</td>\\n\",\n       \"      <td>0.729100</td>\\n\",\n       \"      <td>7.177500</td>\\n\",\n       \"      <td>0.371700</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0.674500</td>\\n\",\n       \"      <td>-8.714800</td>\\n\",\n       \"      <td>-3.937900</td>\\n\",\n       \"      <td>-4.532800</td>\\n\",\n       \"      <td>1.763600</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>4.421200</td>\\n\",\n       \"      <td>3.784900</td>\\n\",\n       \"      <td>-2.483400</td>\\n\",\n       \"      <td>1.431000</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>377565</th>\\n\",\n       \"      <td>DOSBIO001_A375_24H:BRD-K08412560:10.0896</td>\\n\",\n       \"      <td>BRD-K08412560</td>\\n\",\n       \"      <td>A375</td>\\n\",\n       \"      <td>COc1ccc2c3c([C@@H](CO)N(C[C@]33CN(CC4CC4)C3)C(...</td>\\n\",\n       \"      <td>XKSIWSGTCPACLT-OAQYLSRUSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-3.142500</td>\\n\",\n       \"      <td>10.000000</td>\\n\",\n       \"      <td>5.293700</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>4.972200</td>\\n\",\n       \"      <td>-5.377800</td>\\n\",\n       \"      <td>-6.791900</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>2.125200</td>\\n\",\n       \"      <td>-6.089400</td>\\n\",\n       \"      <td>-1.801600</td>\\n\",\n       
\"      <td>-10.000000</td>\\n\",\n       \"      <td>-1.673100</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>277706</th>\\n\",\n       \"      <td>PCLB003_A375_24H:BRD-K78431006-001-05-2:10</td>\\n\",\n       \"      <td>BRD-K78431006</td>\\n\",\n       \"      <td>A375</td>\\n\",\n       \"      <td>C[C@@H](Oc1cc(cnc1N)-c1cnn(c1)C1CCNCC1)c1c(Cl)...</td>\\n\",\n       \"      <td>KTEIFNKAUNYNJU-GFCCVEGCSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-6.482900</td>\\n\",\n       \"      <td>1.906800</td>\\n\",\n       \"      <td>7.057600</td>\\n\",\n       \"      <td>3.195200</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>2.695300</td>\\n\",\n       \"      <td>-5.500400</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>-5.374700</td>\\n\",\n       \"      <td>-5.679000</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>1.481800</td>\\n\",\n       \"      <td>-1.277900</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>-3.348200</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>365598</th>\\n\",\n       \"      <td>DOSBIO001_A375_24H:BRD-K49669601:9.903</td>\\n\",\n       \"      <td>BRD-K49669601</td>\\n\",\n       \"      <td>A375</td>\\n\",\n       \"      <td>C[C@H](CO)N1C[C@@H](C)[C@@H](CN(C)CC2CC2)OCCCC...</td>\\n\",\n       \"      <td>DFAHTHVTMMOLJD-SQVYLTPJSA-N</td>\\n\",\n       \"      <td>NaN</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>5.210000</td>\\n\",\n       \"      <td>-2.923100</td>\\n\",\n       \"      <td>-4.222100</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>2.580100</td>\\n\",\n       \"      <td>3.684500</td>\\n\",\n       \"      <td>5.692300</td>\\n\",\n       \"      <td>-1.143100</td>\\n\",\n       \"      <td>-7.186700</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      
<td>3.964800</td>\\n\",\n       \"      <td>1.002200</td>\\n\",\n       \"      <td>-10.000000</td>\\n\",\n       \"      <td>3.584000</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"<p>10694 rows × 984 columns</p>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"                                           full_id        pert_id cell_iname  \\\\\\n\",\n       \"227534    CPC011_A375_6H:BRD-K43236057-001-04-8:10  BRD-K43236057       A375   \\n\",\n       \"69737     CPC018_A375_6H:BRD-A24122750-001-01-1:10  BRD-A24122750       A375   \\n\",\n       \"227941    CPC011_A375_6H:BRD-K50938287-036-09-6:10  BRD-K50938287       A375   \\n\",\n       \"76230     CPC018_A375_6H:BRD-K18316707-001-01-9:10  BRD-K18316707       A375   \\n\",\n       \"225999    CPC011_A375_6H:BRD-K29113274-001-20-0:10  BRD-K29113274       A375   \\n\",\n       \"...                                            ...            ...        ...   \\n\",\n       \"287335  PCLB003_A375_24H:BRD-K17953061-001-10-1:10  BRD-K17953061       A375   \\n\",\n       \"150             DOSVAL004_A375_24H:BRD-A61304759:5  BRD-A61304759       A375   \\n\",\n       \"377565    DOSBIO001_A375_24H:BRD-K08412560:10.0896  BRD-K08412560       A375   \\n\",\n       \"277706  PCLB003_A375_24H:BRD-K78431006-001-05-2:10  BRD-K78431006       A375   \\n\",\n       \"365598      DOSBIO001_A375_24H:BRD-K49669601:9.903  BRD-K49669601       A375   \\n\",\n       \"\\n\",\n       \"                                                   SMILES  \\\\\\n\",\n       \"227534  OC[C@H]1O[C@@H](Oc2cc(O)cc(C=Cc3ccc(O)cc3)c2)[...   \\n\",\n       \"69737                        NCC(CS(O)(=O)=O)c1ccc(Cl)cc1   \\n\",\n       \"227941              CNS(=O)(=O)Cc1ccc2[nH]cc(CCN(C)C)c2c1   \\n\",\n       \"76230        COc1cc(C)cc(OC)c1[C@@H]1C=C(C)CC[C@H]1C(C)=C   \\n\",\n       \"225999  CC(=O)N1CCN(CC1)c1ccc(OC[C@H]2CO[C@@](Cn3ccnc3...   \\n\",\n       \"...                        
                           ...   \\n\",\n       \"287335  CN[C@@H]1C[C@H]2O[C@@](C)([C@@H]1OC)n3c4ccccc4...   \\n\",\n       \"150     COC1CC(C)CC2=C(NCC=C)C(=O)C=C(NC(=O)C(C)=CC=CC...   \\n\",\n       \"377565  COc1ccc2c3c([C@@H](CO)N(C[C@]33CN(CC4CC4)C3)C(...   \\n\",\n       \"277706  C[C@@H](Oc1cc(cnc1N)-c1cnn(c1)C1CCNCC1)c1c(Cl)...   \\n\",\n       \"365598  C[C@H](CO)N1C[C@@H](C)[C@@H](CN(C)CC2CC2)OCCCC...   \\n\",\n       \"\\n\",\n       \"                          inchi_key compound_aliases      10007       1001  \\\\\\n\",\n       \"227534  HSTZMXCBWJGKHG-CUYWLFDKSA-N              NaN   0.240231  -0.427503   \\n\",\n       \"69737   JYLNVJYYQQXNEK-UHFFFAOYSA-N              NaN  -0.231360  -0.205703   \\n\",\n       \"227941  KQKPFRSPSRPDEB-UHFFFAOYSA-N              NaN   0.149539  -0.115549   \\n\",\n       \"76230   ICHJMVMWPKLUKT-JKSUJKDBSA-N              NaN  -0.200260  -0.302280   \\n\",\n       \"225999  XMAYWYJOQHXEEK-OZXSUGGESA-N              NaN   0.545711  -0.585337   \\n\",\n       \"...                             ...              ...        ...        ...   \\n\",\n       \"287335  HKSZLNNOFSGOKW-FYTWVXJKSA-N              NaN  -9.032300  -1.144700   \\n\",\n       \"150     AYUNIORJHRXIBJ-ZGQRYRSUSA-N              NaN  -0.695700   0.729100   \\n\",\n       \"377565  XKSIWSGTCPACLT-OAQYLSRUSA-N              NaN  -3.142500  10.000000   \\n\",\n       \"277706  KTEIFNKAUNYNJU-GFCCVEGCSA-N              NaN  -6.482900   1.906800   \\n\",\n       \"365598  DFAHTHVTMMOLJD-SQVYLTPJSA-N              NaN -10.000000   5.210000   \\n\",\n       \"\\n\",\n       \"           10013      10038  ...       9918      9924       9926      9928  \\\\\\n\",\n       \"227534 -0.371866   0.160758  ...   0.049103  0.112653   0.787318  0.361519   \\n\",\n       \"69737   0.299739  -0.266904  ...   0.487991  0.630590   0.253384  0.054045   \\n\",\n       \"227941 -0.474415  -0.391042  ...   
0.002793 -0.124959   0.359368  0.224918   \\n\",\n       \"76230  -0.443100   0.155580  ...  -0.002480 -0.421380  -0.397380  0.329560   \\n\",\n       \"225999 -0.268187   0.319893  ...  -0.356957  0.408205   0.002745  0.253226   \\n\",\n       \"...          ...        ...  ...        ...       ...        ...       ...   \\n\",\n       \"287335  4.462500   0.609000  ...  -1.185900 -1.904500  -9.802100 -1.854500   \\n\",\n       \"150     7.177500   0.371700  ...   0.674500 -8.714800  -3.937900 -4.532800   \\n\",\n       \"377565  5.293700 -10.000000  ... -10.000000  4.972200  -5.377800 -6.791900   \\n\",\n       \"277706  7.057600   3.195200  ...   2.695300 -5.500400 -10.000000 -5.374700   \\n\",\n       \"365598 -2.923100  -4.222100  ...   2.580100  3.684500   5.692300 -1.143100   \\n\",\n       \"\\n\",\n       \"              993        994      9943      9961        998      9988  \\n\",\n       \"227534   0.595066  -0.050393 -0.000183 -0.562766   0.539329  0.513262  \\n\",\n       \"69737   -0.105749   0.294856  0.243753 -0.093112  -0.415093  1.505666  \\n\",\n       \"227941  -0.616804  -0.203747 -0.859153 -0.795572   0.456950  0.490084  \\n\",\n       \"76230   -0.203320   0.705760 -0.940700  0.317900  -1.341780 -1.011780  \\n\",\n       \"225999   0.371863  -0.176473 -0.691677 -0.492211   0.317271  0.119787  \\n\",\n       \"...           ...        ...       ...       ...        ...       ...  
\\n\",\n       \"287335  -6.617700 -10.000000 -6.913400 -2.248000 -10.000000 -0.050400  \\n\",\n       \"150      1.763600 -10.000000  4.421200  3.784900  -2.483400  1.431000  \\n\",\n       \"377565 -10.000000   2.125200 -6.089400 -1.801600 -10.000000 -1.673100  \\n\",\n       \"277706  -5.679000 -10.000000  1.481800 -1.277900 -10.000000 -3.348200  \\n\",\n       \"365598  -7.186700 -10.000000  3.964800  1.002200 -10.000000  3.584000  \\n\",\n       \"\\n\",\n       \"[10694 rows x 984 columns]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    }\n   ],\n   \"source\": [\n    \"# Compute the standard deviation of each row. \\n\",\n    \"# In case of duplicate molecules, keep the row with the highest STD\\n\",\n    \"# Re-order columns to keep the data at the end\\n\",\n    \"df_per_cell_line_cleaned = {}\\n\",\n    \"for line, this_df in df_per_cell_line.items():\\n\",\n    \"    std = np.std(this_df[data_cols], axis=1)\\n\",\n    \"    this_df[\\\"std\\\"] = std\\n\",\n    \"    this_df = this_df.sort_values(by=\\\"std\\\")\\n\",\n    \"    this_df = this_df.drop_duplicates(subset=\\\"pert_id\\\", keep=\\\"last\\\")\\n\",\n    \"    this_df = this_df.drop(columns=[\\\"std\\\"])\\n\",\n    \"    data_cols_idx = [idx for idx, col in enumerate(this_df.columns) if (col in data_cols)]\\n\",\n    \"    not_data_cols_idx = [idx for idx, col in enumerate(this_df.columns) if not (col in data_cols)]\\n\",\n    \"    new_cols_order = not_data_cols_idx + data_cols_idx\\n\",\n    \"    this_df = this_df[this_df.columns[new_cols_order]]\\n\",\n    \"    df_per_cell_line_cleaned[line] = this_df\\n\",\n    \"    display(this_df)\\n\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 34,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/plain\": [\n       \"<matplotlib.collections.LineCollection at 0x2ba74af8deb0>\"\n      ]\n     },\n     \"execution_count\": 34,\n     \"metadata\": {},\n    
 \"output_type\": \"execute_result\"\n    },\n    {\n     \"data\": {\n      \"image/png\": \"iVBORw0KGgoAAAANSUhEUgAAAicAAAGdCAYAAADJ6dNTAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/bCgiHAAAACXBIWXMAAA9hAAAPYQGoP6dpAABw2ElEQVR4nO3dd3gU5fo38O/2NAhgKAmEBBBBQCkRaSIgAkZEmoINiAKCBCSichBUyhFQUMT3GECUYgXkUMSK8QcIHJRmQARESkJNCARIz9Z5/1hm2NndbLZO2bk/18XFZHd29tl7Z2fvfeZ57lExDMOAEEIIIUQi1GI3gBBCCCHEESUnhBBCCJEUSk4IIYQQIimUnBBCCCFEUig5IYQQQoikUHJCCCGEEEmh5IQQQgghkkLJCSGEEEIkRSt2A3xls9lw6dIl1KhRAyqVSuzmEEIIIcQLDMOgpKQECQkJUKs9943ILjm5dOkSEhMTxW4GIYQQQvxw/vx5NGrUyOM6sktOatSoAcD+4mrWrClyawghhBDijeLiYiQmJnLf457ILjlhT+XUrFmTkhNCCCFEZrwZkkEDYgkhhBAiKZScEEIIIURSKDkhhBBCiKTIbswJIYQQEgiGYWCxWGC1WsVuSljRaDTQarVBKfNByQkhhBDFMJlMyMvLQ3l5udhNCUtRUVGIj4+HXq8PaDuUnBBCCFEEm82GnJwcaDQaJCQkQK/XUzHPIGEYBiaTCVeuXEFOTg6aN29ebaE1Tyg5IYQQoggmkwk2mw2JiYmIiooSuzlhJzIyEjqdDmfPnoXJZEJERITf26IBsYQQQhQlkF/0xLNgxVaUdygnJwe9evVCq1atcNddd6GsrEyMZhBCCCFEgkQ5rZOWloa33noL3bt3x7Vr12AwGMRoBiGEEEIkSPDk5OjRo9DpdOjevTsAoE6dOkI3gRBCCOE5cqFI0Oe7q1GsoM8nNz6f1tm5cycGDBiAhIQEqFQqbN682WWdJUuWoEmTJoiIiEBKSgp27drF3Xfy5EnExMTg0UcfRYcOHTBv3ryAXgAhhBAS7nr27ImMjAyX2zdv3szNONq4cSP69OmDunXrombNmujSpQu2bt3q8phr164hIyMDycnJ0Ov1iI+Px7PPPotz587x1isoKMC4cePQuHFjGAwGNGjQAP369cNvv/0WktfoyOfkpKysDG3btsWHH37o9v5169YhIyMDM2bMQHZ2Nrp3747U1FTuRZvNZuzatQuZmZn47bffkJWVhaysrCqfz2g0ori4mPePEEIIIXw7d+5Enz598MMPP+DgwYPo1asXBgwYgOzsbG6da9euoXPnzvjll1+wZMkSnDp1CuvWrcPp06fRsWNHnDlzhlt36NChOHz4MD799FP8888/2LJlC3r27Ilr166F/LX4fFonNTUVqampVd6/aNEijB49GmPGjAEALF68GFu3bsXSpUsxf/58NGrUCB07dkRiYiIA4OGHH8ahQ4fQp08ft9ubP38+Zs+e7WsziUSVlZUhJiYGAFBaWoro6GiRWyQuigcfxYOP4sFH8fBs8eLFvL/nzZuHb775Bt9++y3at28PAJgxYwYuXbqEU6dOoUGDBgCAxo0bY+vWrWjevDnS09Px448/4saNG9i9ezd27NiBHj16AACSkpJw7733CvJagjrmxGQy4eDBg5g2bRrv9r59+2LPnj0AgI4dO+Ly5cu4fv06YmNjsXPnTowbN67Kbb722muYMmUK93dxcTGX2BD50el0eOWVV7hlpaN48FE8+CgefBQP39hsNpSUlHBjO202G9auXYunn36aS0xYkZGRmDBhAl5//XVcu3YNNWvWRExMDDZv3ozOnTsLPnElqMnJ1atXYbVaUb9+fd7t9evXR35+vv0JtVrMmzcP999/Px
iGQd++ffHII49UuU2DwUCzecKIXq/HwoULxW6GZFA8+CgefBQPPoqHb9577z2UlZVh2LBhAIArV67gxo0buPPOO92uf+edd4JhGJw6dQr33nsvVq9ejbFjx2LZsmXo0KEDevTogSeeeAJ33313yNsekjonzuWAGYbh3ZaamoojR47gr7/+wqJFi0LRBEIIIUSx1qxZg1mzZmHdunWoV6+eV49hGAbAre/woUOH4tKlS9iyZQv69euHHTt2oEOHDli9enWoms0JanISFxcHjUbD9ZKwCgoKXHpTfJWZmYlWrVqhY8eOAW2HiIthGJjNZpjNZu6DoGQUDz6KBx/Fg0/J8ahZsyaKilynO9+4cQM1a9bk3bZu3TqMHj0aX3/9NR588EHu9rp166JWrVo4duyY2+f4+++/oVKp0KxZM+62iIgI9OnTB2+++Sb27NmDtLQ0zJw5M0ivqmpBTU70ej1SUlJcZt9kZWWha9euAW07PT0dx44dw/79+wPaDhFXeXk59Ho99Ho9XRUUFA9njvHYdzJP7OaIjvYPPiXHo2XLljhw4IDL7fv370eLFi24v9esWYO0tDR89dVX6N+/P29dtVqNYcOG4auvvnLpRKioqMCSJUvQr18/j/XHWrVqJUhVd5+Tk9LSUhw6dAiHDh0CYC9Ff+jQIW6q8JQpU/DJJ59g5cqVOH78OF566SWcO3cO48ePD2rDCSGEEKWYMGECTp8+jfT0dBw+fBj//PMPMjMzsWLFCrz66qsA7InJyJEj8d5776Fz587Iz89Hfn4+r8dl7ty5aNCgAfr06YMff/wR58+fx86dO9GvXz+YzWZkZmYCAAoLC/HAAw/giy++wJ9//omcnBysX78eCxYswMCBA0P/ghkfbd++nQHg8m/UqFHcOpmZmUxSUhKj1+uZDh06ML/++quvT1OloqIiBgBTVFQUtG0S4dhsNub69evM9evXGZvNJnZzREfx4Dt87jqz+69cZvdfuczhc9fFbo7oaP/gCzQeFRUVzLFjx5iKiooQtC70Dhw4wPTr14+pV68eU7NmTeaee+5h1qxZw93fo0ePar+fGYZhrly5wkyaNIlJTExktFotU79+fWbUqFHM2bNnuXUqKyuZadOmMR06dGBiY2OZqKgopkWLFszrr7/OlJeXV9lGTzH25ftbxTDyOHGXmZmJzMxMWK1W/PPPPygqKnI5z0YIkSdvSodTuW8SqMrKSuTk5HAVzEnweYpxcXExYmNjvfr+FuXCf/5IT09Heno69+IIIfLm67VM2PUpSSEk/MkmOSHhwWQycddTmj59OvR6vcgtEhfFg89sMuHjD98DAIyd+DJ0Co8H7R98FA/lkM1pHZYv3UJEeqj8NJ8S4+Gpx6S8vAydWzQEAKzdvx2tGrSvcl0l9KAocf/wJNB40Gmd0FPcaR3HMSdEvrRaLSZMmMAtKx3Fg+98+WmkPjEUAKDRanCm+G/uvqY1W4rVLNHQ/sFH8VAO6jkhhAjCU4+JYxJSHXdJihJ6UUjgqOck9BTXc0IICS++JCSEEGUJybV1CCEkVM4U/02JDSFhjpITIqiysjLodDrodDpBSiBLnRLj4Sm5qCyvwNC23TC0bTdUlld4vc0jF4p8nposB0rcPzyheCiHbE7r0IDY8GGxWMRugqSEezx8TRqsFvtn3JBzFmhd9SBYpQyWDff9w1cUD2WQTXJCRdjCQ2RkJC5cuMAtKx3Fg6/GxTx8vmoJAPuFRM0it0dstH/whTQel7KDu73qJFQ9Td6dtLQ03LhxA5s3b+bdvmPHDvTq1QvXr19HrVq1uNtbtGiBnJwc5OTkoGHDhrzHbNy4ER999BEOHjyIwsJCZGdno127drx1kpOTcfbsWZd2zJ8/H9OmTfOp7f6QTXJCwoNarXb5oChZuMbDXW+Jp3EihtO5AICK69cRDcBUqkFJaTFi1bncOsZmyV4/b7jM3gnX/cNfFA/v7N69G5WVlXj88cexevVqzJgxg3
d/WVkZunXrhscffxxjx46tcjtz5sxxub9GjRohabMzSk4IIaJgExIAqCgsBGBPSgAg2hTBux0A1B6SEzbxCefTO4R4a8WKFXjqqafQo0cPpKenY/r06VCpVNz9I0aMAADk5uZ63E6NGjXQoEGDUDa1SpScEEGZTCZ88MEHAIDJkycrvvy0kuPhmHiwSYm+XItvd/0MABjQvS93OwDo9x0EAKjvTRGwleJS8v7hDsWjeiUlJVi/fj327t2Lli1boqysjDv1IydUhI0Iispx84VrPBxP6zifzmF7TIrO3uBuY3tKKo1GPD59HABg/byPEGEwcOuU6SsBALFJtQC4P83D9pyEy2mdcN0//BXS8vUyGHPyxRdfuLTbarWisrKSG3Py8ccfY8mSJcjOtr+ejIwMXL16FV988YXLNnNzc9GkSZMqx5zk5eVBp9Pxbv/uu+/Qs2fPKtupuCJsNFsnPGi1WowaNYpbVjolx4NNSBxpNGo8cE83btnd+myPi6fTPOFCyfuHO0qPR69evbB06VLebXv37sUzzzzD/b1ixQre38888wzuv/9+3Lhxgzdg1huvvvoq0tLSeLcJNeZHNu8uzdYJDwaDAatXrxa7GZKhxHiwPSbRcE1OdFodXnqy6gF6wK1TQO6Kj3O9NBfCowdFifuHJ0qPR3R0NG6//XbebezsJQA4duwY9u7di/379+Nf//oXd7vVasWaNWvwwgsv+PR8cXFxLs8nFNkkJ4QQ6WNP57ibmcOeztG46THxh+OAWm9m8hAS7lasWIH7778fmZmZvNs///xzrFixwufkREyUnBBCBMGejtEhsHET7OkdxzErEc0C2iQhsmc2m/H5559jzpw5aNOmDe++MWPGYMGCBTh8+DDatm2La9eu4dy5c7h06RIA4MSJEwCABg0a8GbnlJSUID8/n7etqKgoQcZ7Uvl6IqiysjLUqlULtWrVovLTUFY8TKUa3uwbdyqNRjwx4wU8MeMFVBqNArVMupS0f3iD4lG1nTt3orCwEIMHD3a5r3nz5rjrrruwYsUKAMCWLVvQvn179O/fHwDwxBNPoH379li2bBnvcW+++Sbi4+N5/6ZOnRr6FwPqOSEiKCoKv2ugBCLc48Gefqn0cv2ySu+vqeP8HOzpHfa00l3o5PO2pCbc9w9fhSwePs6eEVpVY2169uwJdtKtpwkjf/75J7eclpbmMtDVWXU1UEKNkhMiqMjISPzzzz/cstIpKR7uZuc40+t0+Gja29xyoNtznNIsx8GxSto/vEHxUA5KToig1Go1mjdvLnYzJCNc4uFpIKyn2TnO1Go1Eur6XpGSfY5wG3sSLvtHsFA8lEM2Y04yMzPRqlUrdOzYUeymEEIkynA6lzeLhxAiT7LpOaE6J+HBbDZj+fLlAIDnn3/epfqg0oRrPBwTBF+mDlusFmz9bQcAoF+XntBqvDtEeXOKR47Cdf/wF8VDOah8PREUlePmC5d4OJ/WcUxOLCe9HQrruXy9N6z17LFEh0Yu9w1sJb/BseGyfwRLSMvXk6BQXPl6Eh40Gg0ee+wxblnpwjUevOvmeDHWhKVWq9Dt7nu4ZV9V3rgKAIiAa3IiR+G6f/iL4qEc1HNCCPFbVQNhK7cd4paFPOXCXhww4oF2LveF20UBie+o5yT0qOeEECI5vtY0CTY2EdI61T0hhMgLJSeEkKAL1wGqhBBhyGYqMQkP5eXlaNiwIRo2bIjy8nKxmyM6igdfpcmIUbMzMGp2BipN/pevLzp7gzfuBbCfenJXh0XKaP/go3goB/WcEEExDMNdbEpmw51CItzi4UvBNbcY4FrxDW5Z6cJt/whUKONxtPBoULdXnda3tfZ63QEDBqCiogK//PKLy32//fYbunbtioMHD6JDhw7YsGED/vOf/yA7OxtWqxVNmzbFY489hokTJ6JOnTrc4yoqKpCQkACVSoWLFy+6VNxNTk7G2bNnAdir8TZt2hSTJk3CuHHj/HzFvpFNzwkVYQ
zYIMZLqNbLL7/M7Nixgzlz5gzz+++/M4888ghTo0aNsHkPGYZhrFYr07hxY+Zf//qXy31yfP9KSkq47zwA3HHz7NmzDMN4d0wcMWIEb1bd//73P0aj0TBvv/02c/z4cebtt99mtFot8/vvvwfcXkUlJ6NGjWIAuPzbvn07t87Zs2eZ/v37M5GRkUydOnWYiRMnMpWVldz9OTk5Lo+pqKhgJk6cyNSpU4eJjIxkHnnkEebcuXMCvjLPnnzySaZr165V3g+AWbVqFcMwDFNeXs707duXqVu3LqPT6ZjGjRszo0aNktTrcXTw4EGmU6dOTGxsLBMREcG0aNGCmTlzJlNWVsZbz/E1Mox96tzMmTOZBg0aMAaDgbn//vuZI0eOCNz66q1atcrtPuv8u0Ju72FmZiaTlJTE6PV6pkOHDrxptqNGjWJ69OjBW3/Hjh1M+/btGb1ezyQnJzNLly4VuMXeq+r9ctz/nF/jO++8wzRr1oyJiIhgateuzdx3333M999/L3zjvTR8+HAmPj6e0el0TEJCAjNkyBDm6NGj3P1yfw8ZhmG2bt3KAGBOnDjhcp8c3z92urPzv1GjRjEM490xsUePHtz6rPXr1zMtWrRgdDod07Jly6AlZCqGuTkqiRBCCCFEAmgqMSGEEEIkhZITQgghhEgKJSeEEEIIkRRKTgghhBAiKZScEEIIIURSKDkhhBBCiKRQckIIIYQQSaHkhBBCCCGSQskJIYQQQiSFkhNCCCGESAolJ4QQQgiRFEpOCCGEECIp/x8gy2OS1PPHSwAAAABJRU5ErkJggg==\",\n      \"text/plain\": [\n       \"<Figure size 640x480 with 1 Axes>\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    }\n   ],\n   \"source\": [\n    \"# Plot the distribution of activity values for each cell line\\n\",\n    \"# We can observe that the distribution is Gaussian between -2.5 and 2.5 (a parabola in log-space).\\n\",\n    \"# This indicates that the data is probably noise under this threshold.\\n\",\n    \"for line, this_df in df_per_cell_line_cleaned.items():\\n\",\n    \"    plt.hist(this_df[data_cols].values.flatten(), bins=np.arange(-10, 10, 0.1), alpha=0.2, log=True)\\n\",\n    \"plt.legend(df_per_cell_line_cleaned.keys())\\n\",\n    \"# plt.vlines([-THRESHOLD, 0, THRESHOLD], ymin=0, ymax=1e6, colors=\\\"black\\\", linestyles=\\\":\\\")\\n\",\n    \"plt.vlines([THRESHOLD_lst[0],THRESHOLD_lst[1], 0,THRESHOLD_lst[2], THRESHOLD_lst[3]], ymin=0, ymax=1e6, colors=\\\"black\\\", linestyles=\\\":\\\")\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# Save each chosen cell-line into 2 CSV files. 
One with z-score, one with thresholded z-score\n",\n    "# single THRESHOLD\n",\n    "# Create the dirs\n",\n    "zscore_dir = f\"{OUT_DIR}zscore-highestactivity/\"\n",\n    "makedirs(zscore_dir, exist_ok=True)\n",\n    "class_dir = f\"{OUT_DIR}classification-highestactivity/\"\n",\n    "makedirs(class_dir, exist_ok=True)\n",\n    "class_dir2 = f\"{OUT_DIR}classification-filtered/\"\n",\n    "makedirs(class_dir2, exist_ok=True)\n",\n    "\n",\n    "\n",\n    "data_cols2 = [\"geneID-\" + str(col) for col in data_cols]\n",\n    "data_cols_rename = {data_cols[ii]: data_cols2[ii] for ii in range(len(data_cols))}\n",\n    "for line, this_df in df_per_cell_line_cleaned.items():\n",\n    "    this_df = this_df.rename(columns=data_cols_rename)\n",\n    "    this_df.to_csv(f\"{zscore_dir}{line}.csv\", index=False)\n",\n    "    this_df_classification = this_df.copy(deep=True)\n",\n    "    this_df_classification[data_cols2] = (np.abs(this_df_classification[data_cols2]) > THRESHOLD).astype(int)\n",\n    "    display(this_df_classification)\n",\n    "    this_df_classification.to_csv(f\"{class_dir}{line}.csv\", index=False)\n",\n    "    remove_cols = [col for col in data_cols2 if np.sum(this_df_classification[col]) < MIN_ACTIVE_PER_COL]\n",\n    "    this_df_class_filtered = this_df_classification.copy(deep=True).drop(columns=remove_cols)\n",\n    "    this_df_class_filtered.to_csv(f\"{class_dir2}{line}.csv\", index=False)\n",\n    "    # Index with a list: pandas does not accept a set of column labels\n",\n    "    vals = this_df_class_filtered[list(set(data_cols2) - set(remove_cols))].values\n",\n    "    print(\"{}:\\tnum mols: {}; \\tnum cols removed = {}; \\tdensity of positives = {:.2%}\\n\".format(\n",\n    "        line, vals.shape[0], len(remove_cols), np.sum(vals) / np.size(vals)\n",\n    "    ))"\n   ]\n  },\n  {\n   "cell_type": "code",\n   "execution_count": 36,\n   "metadata": {},\n   "outputs": [],\n   "source": [\n    
\"def apply_labels(x):\\n\",\n    \"  if x <= -6:\\n\",\n    \"    return 0\\n\",\n    \"  elif x<= -3 and x>-6:\\n\",\n    \"   return 1\\n\",\n    \"  elif x<= 3 and x>-3:\\n\",\n    \"   return 2\\n\",\n    \"  elif x<= 6 and x>3:\\n\",\n    \"   return 3\\n\",\n    \"  elif x>6:\\n\",\n    \"   return 4\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 41,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"removed columns []\\n\",\n      \"len of removed cols 0\\n\",\n      \"U2OS:\\tnum mols: 16058; \\tnum cols removed = 0; \\tnum of 0's = 6629; \\tnum of 1's = 44426; \\tnum of 2's = 15615576; \\tnum of 3's = 32878; \\tnum of 4's = 5215\\n\",\n      \"\\n\",\n      \"removed columns ['geneID-9637']\\n\",\n      \"len of removed cols 1\\n\",\n      \"HA1E:\\tnum mols: 5514; \\tnum cols removed = 1; \\tnum of 0's = 18531; \\tnum of 1's = 111277; \\tnum of 2's = 5146513; \\tnum of 3's = 97224; \\tnum of 4's = 13633\\n\",\n      \"\\n\",\n      \"removed columns []\\n\",\n      \"len of removed cols 0\\n\",\n      \"VCAP:\\tnum mols: 15220; \\tnum cols removed = 0; \\tnum of 0's = 34339; \\tnum of 1's = 280665; \\tnum of 2's = 14323559; \\tnum of 3's = 222288; \\tnum of 4's = 24309\\n\",\n      \"\\n\",\n      \"removed columns []\\n\",\n      \"len of removed cols 0\\n\",\n      \"A549:\\tnum mols: 12285; \\tnum cols removed = 0; \\tnum of 0's = 30445; \\tnum of 1's = 213422; \\tnum of 2's = 11584661; \\tnum of 3's = 165520; \\tnum of 4's = 20682\\n\",\n      \"\\n\",\n      \"removed columns []\\n\",\n      \"len of removed cols 0\\n\",\n      \"MCF7:\\tnum mols: 11622; \\tnum cols removed = 0; \\tnum of 0's = 31900; \\tnum of 1's = 213495; \\tnum of 2's = 10928761; \\tnum of 3's = 169375; \\tnum of 4's = 22785\\n\",\n      \"\\n\",\n      \"removed columns []\\n\",\n      \"len of removed cols 0\\n\",\n      \"PC3:\\tnum mols: 11521; \\tnum cols removed 
= 0; \\tnum of 0's = 26839; \\tnum of 1's = 197543; \\tnum of 2's = 10864161; \\tnum of 3's = 159508; \\tnum of 4's = 19487\\n\",\n      \"\\n\",\n      \"removed columns []\\n\",\n      \"len of removed cols 0\\n\",\n      \"A375:\\tnum mols: 10694; \\tnum cols removed = 0; \\tnum of 0's = 24850; \\tnum of 1's = 175129; \\tnum of 2's = 10089199; \\tnum of 3's = 150185; \\tnum of 4's = 19369\\n\",\n      \"\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# Save each chosen cell-line into 2 CSV files. One with z-score, one with thresholded z-score\\n\",\n    \"\\n\",\n    \"OUT_DIR = f\\\"out/th_multi_label/\\\"\\n\",\n    \"\\n\",\n    \"# Create the dirs\\n\",\n    \"zscore_dir = f\\\"{OUT_DIR}zscore-highestactivity/\\\"\\n\",\n    \"makedirs(zscore_dir, exist_ok=True)\\n\",\n    \"class_dir = f\\\"{OUT_DIR}classification-highestactivity/\\\"\\n\",\n    \"makedirs(class_dir, exist_ok=True)\\n\",\n    \"class_dir2 = f\\\"{OUT_DIR}classification-filtered/\\\"\\n\",\n    \"makedirs(class_dir2, exist_ok=True)\\n\",\n    \"\\n\",\n    \"modified_final_df = {}\\n\",\n    \"data_cols2 = [\\\"geneID-\\\" + str(col) for col in data_cols]\\n\",\n    \"data_cols_rename = {data_cols[ii]: data_cols2[ii] for ii in range(len(data_cols))}\\n\",\n    \"for line, this_df in df_per_cell_line_cleaned.items():\\n\",\n    \"    this_df = this_df.rename(columns=data_cols_rename)\\n\",\n    \"    this_df.to_csv(f\\\"{zscore_dir}{line}.csv\\\", index=False)\\n\",\n    \"    this_df_classification = this_df.copy(deep=True)\\n\",\n    \"    this_df_classification[data_cols2] = this_df_classification[data_cols2].applymap(apply_labels)\\n\",\n    \"    this_df_classification.to_csv(f\\\"{class_dir}{line}.csv\\\", index=False)\\n\",\n    \"    # display(this_df_classification)\\n\",\n    \"    modified_final_df[line] = this_df_classification\\n\",\n    \"    remove_cols = [col for col in data_cols2 if np.sum(this_df_classification[col] != 2) < MIN_ACTIVE_PER_COL]\\n\",\n    \"    
print(\\\"removed columns\\\", remove_cols)\\n\",\n    \"    print(\\\"len of removed cols\\\", len(remove_cols))\\n\",\n    \"    this_df_class_filtered = this_df_classification.copy(deep=True).drop(columns=remove_cols)\\n\",\n    \"    this_df_class_filtered.to_csv(f\\\"{class_dir2}{line}.csv\\\", index=False)\\n\",\n    \"    vals = this_df_class_filtered[set(data_cols2) - set(remove_cols)].values\\n\",\n    \"    print(\\\"{}:\\\\tnum mols: {}; \\\\tnum cols removed = {}; \\\\tnum of 0's = {}; \\\\tnum of 1's = {}; \\\\tnum of 2's = {}; \\\\tnum of 3's = {}; \\\\tnum of 4's = {}\\\\n\\\".format(\\n\",\n    \"    line, vals.shape[0], len(remove_cols),np.unique(vals,return_counts=True)[1][0],np.unique(vals,return_counts=True)[1][1],np.unique(vals,return_counts=True)[1][2],np.unique(vals,return_counts=True)[1][3],np.unique(vals,return_counts=True)[1][4])\\n\",\n    \"    )\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 42,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"U2OS\\n\",\n      \"shape of datafarme with duplicates: (16058, 984)\\n\",\n      \"shape of datafarme sfter removing duplicates: (16057, 984)\\n\",\n      \"-------------------------------\\n\",\n      \"HA1E\\n\",\n      \"shape of datafarme with duplicates: (5514, 984)\\n\",\n      \"shape of datafarme sfter removing duplicates: (5488, 984)\\n\",\n      \"-------------------------------\\n\",\n      \"VCAP\\n\",\n      \"shape of datafarme with duplicates: (15220, 984)\\n\",\n      \"shape of datafarme sfter removing duplicates: (15193, 984)\\n\",\n      \"-------------------------------\\n\",\n      \"A549\\n\",\n      \"shape of datafarme with duplicates: (12285, 984)\\n\",\n      \"shape of datafarme sfter removing duplicates: (12256, 984)\\n\",\n      \"-------------------------------\\n\",\n      \"MCF7\\n\",\n      \"shape of datafarme with duplicates: (11622, 984)\\n\",\n    
  \"shape of datafarme sfter removing duplicates: (11585, 984)\\n\",\n      \"-------------------------------\\n\",\n      \"PC3\\n\",\n      \"shape of datafarme with duplicates: (11521, 984)\\n\",\n      \"shape of datafarme sfter removing duplicates: (11485, 984)\\n\",\n      \"-------------------------------\\n\",\n      \"A375\\n\",\n      \"shape of datafarme with duplicates: (10694, 984)\\n\",\n      \"shape of datafarme sfter removing duplicates: (10680, 984)\\n\",\n      \"-------------------------------\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"for line, this_df in modified_final_df.items():\\n\",\n    \"    print(line)\\n\",\n    \"    print(\\\"shape of datafarme with duplicates:\\\" , this_df.shape)\\n\",\n    \"    modified_final_df[line] = this_df.drop_duplicates(subset='SMILES', keep=\\\"first\\\")\\n\",\n    \"    print(\\\"shape of datafarme sfter removing duplicates:\\\" , modified_final_df[line].shape)\\n\",\n    \"    print(\\\"-------------------------------\\\")\\n\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 44,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"1450\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"import functools as ft\\n\",\n    \"dfs = [this_df for line, this_df in modified_final_df.items()]\\n\",\n    \"df_final = ft.reduce(lambda left, right: pd.merge(left, right, on='SMILES'), dfs)\\n\",\n    \"common_smiles = df_final['SMILES']\\n\",\n    \"print(len(common_smiles))\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 45,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        
vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>0</th>\\n\",\n       \"      <th>1</th>\\n\",\n       \"      <th>2</th>\\n\",\n       \"      <th>3</th>\\n\",\n       \"      <th>4</th>\\n\",\n       \"      <th>5</th>\\n\",\n       \"      <th>6</th>\\n\",\n       \"      <th>7</th>\\n\",\n       \"      <th>8</th>\\n\",\n       \"      <th>9</th>\\n\",\n       \"      <th>...</th>\\n\",\n       \"      <th>1418090</th>\\n\",\n       \"      <th>1418091</th>\\n\",\n       \"      <th>1418092</th>\\n\",\n       \"      <th>1418093</th>\\n\",\n       \"      <th>1418094</th>\\n\",\n       \"      <th>1418095</th>\\n\",\n       \"      <th>1418096</th>\\n\",\n       \"      <th>1418097</th>\\n\",\n       \"      <th>1418098</th>\\n\",\n       \"      <th>1418099</th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>0</th>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>3</td>\\n\",\n       \"      <td>4</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"    </tr>\\n\",\n       
\"    <tr>\\n\",\n       \"      <th>1</th>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>3</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>2</th>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>4</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>3</th>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      
<td>1</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>3</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>4</th>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>3</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>4</td>\\n\",\n       \"      <td>3</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>4</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>5</th>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>3</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>6</th>\\n\",\n       \"      <td>2</td>\\n\",\n       \" 
     <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>...</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>1</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>0</td>\\n\",\n       \"      <td>3</td>\\n\",\n       \"      <td>3</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"      <td>2</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"<p>7 rows × 1418100 columns</p>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"   0        1        2        3        4        5        6        7        \\\\\\n\",\n       \"0        2        2        2        2        2        2        2        2   \\n\",\n       \"1        2        2        2        2        2        2        2        2   \\n\",\n       \"2        2        2        2        2        2        2        2        2   \\n\",\n       \"3        2        2        2        2        2        2        2        2   \\n\",\n       \"4        2        2        2        2        2        2        2        2   \\n\",\n       \"5        2        2        2        2        2        2        2        2   \\n\",\n       \"6        2        2        2        2        2        2        2        2   \\n\",\n       \"\\n\",\n       \"   8        9        ...  1418090  1418091  1418092  1418093  1418094  \\\\\\n\",\n       \"0        2        2  ...        3        4        2        0        1   \\n\",\n       \"1        2        2  ...        3        2        2        2        0   \\n\",\n       \"2        2        2  ...        0        2        2        4        2   \\n\",\n       \"3        2        2  ...        
1        2        2        2        1   \\n\",\n       \"4        2        2  ...        2        3        2        0        2   \\n\",\n       \"5        2        2  ...        2        2        2        1        0   \\n\",\n       \"6        2        2  ...        2        0        1        1        2   \\n\",\n       \"\\n\",\n       \"   1418095  1418096  1418097  1418098  1418099  \\n\",\n       \"0        0        1        0        0        2  \\n\",\n       \"1        0        2        1        0        2  \\n\",\n       \"2        2        2        2        1        2  \\n\",\n       \"3        0        3        0        0        2  \\n\",\n       \"4        1        4        3        2        4  \\n\",\n       \"5        0        2        2        2        3  \\n\",\n       \"6        0        3        3        2        2  \\n\",\n       \"\\n\",\n       \"[7 rows x 1418100 columns]\"\n      ]\n     },\n     \"execution_count\": 45,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"lst_numbers = []\\n\",\n    \"for check_df in dfs: \\n\",\n    \"    modified_1  = check_df.loc[check_df['SMILES'].isin(common_smiles)]\\n\",\n    \"    numbers = modified_1[data_cols2]\\n\",\n    \"    values_gene_id = np.reshape(numbers.values, (1,numbers.shape[0]*numbers.shape[1]))\\n\",\n    \"    lst_numbers.append(values_gene_id)\\n\",\n    \"df_corr = pd.DataFrame(np.concatenate(lst_numbers))\\n\",\n    \"df_corr\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 46,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    
.dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>U2OS</th>\\n\",\n       \"      <th>HA1E</th>\\n\",\n       \"      <th>VCAP</th>\\n\",\n       \"      <th>A549</th>\\n\",\n       \"      <th>MCF7</th>\\n\",\n       \"      <th>PC3</th>\\n\",\n       \"      <th>A375</th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>U2OS</th>\\n\",\n       \"      <td>1.000000</td>\\n\",\n       \"      <td>0.060399</td>\\n\",\n       \"      <td>0.061331</td>\\n\",\n       \"      <td>0.042867</td>\\n\",\n       \"      <td>0.060538</td>\\n\",\n       \"      <td>0.062246</td>\\n\",\n       \"      <td>0.061650</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>HA1E</th>\\n\",\n       \"      <td>0.060399</td>\\n\",\n       \"      <td>1.000000</td>\\n\",\n       \"      <td>0.079837</td>\\n\",\n       \"      <td>0.083084</td>\\n\",\n       \"      <td>0.082264</td>\\n\",\n       \"      <td>0.097515</td>\\n\",\n       \"      <td>0.102486</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>VCAP</th>\\n\",\n       \"      <td>0.061331</td>\\n\",\n       \"      <td>0.079837</td>\\n\",\n       \"      <td>1.000000</td>\\n\",\n       \"      <td>0.090002</td>\\n\",\n       \"      <td>0.090035</td>\\n\",\n       \"      <td>0.092994</td>\\n\",\n       \"      <td>0.099552</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>A549</th>\\n\",\n       \"      <td>0.042867</td>\\n\",\n       \"      <td>0.083084</td>\\n\",\n       \"      <td>0.090002</td>\\n\",\n       \"      <td>1.000000</td>\\n\",\n       \"      <td>0.079212</td>\\n\",\n       
\"      <td>0.100671</td>\\n\",\n       \"      <td>0.104961</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>MCF7</th>\\n\",\n       \"      <td>0.060538</td>\\n\",\n       \"      <td>0.082264</td>\\n\",\n       \"      <td>0.090035</td>\\n\",\n       \"      <td>0.079212</td>\\n\",\n       \"      <td>1.000000</td>\\n\",\n       \"      <td>0.090312</td>\\n\",\n       \"      <td>0.086981</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>PC3</th>\\n\",\n       \"      <td>0.062246</td>\\n\",\n       \"      <td>0.097515</td>\\n\",\n       \"      <td>0.092994</td>\\n\",\n       \"      <td>0.100671</td>\\n\",\n       \"      <td>0.090312</td>\\n\",\n       \"      <td>1.000000</td>\\n\",\n       \"      <td>0.115805</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>A375</th>\\n\",\n       \"      <td>0.061650</td>\\n\",\n       \"      <td>0.102486</td>\\n\",\n       \"      <td>0.099552</td>\\n\",\n       \"      <td>0.104961</td>\\n\",\n       \"      <td>0.086981</td>\\n\",\n       \"      <td>0.115805</td>\\n\",\n       \"      <td>1.000000</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"          U2OS      HA1E      VCAP      A549      MCF7       PC3      A375\\n\",\n       \"U2OS  1.000000  0.060399  0.061331  0.042867  0.060538  0.062246  0.061650\\n\",\n       \"HA1E  0.060399  1.000000  0.079837  0.083084  0.082264  0.097515  0.102486\\n\",\n       \"VCAP  0.061331  0.079837  1.000000  0.090002  0.090035  0.092994  0.099552\\n\",\n       \"A549  0.042867  0.083084  0.090002  1.000000  0.079212  0.100671  0.104961\\n\",\n       \"MCF7  0.060538  0.082264  0.090035  0.079212  1.000000  0.090312  0.086981\\n\",\n       \"PC3   0.062246  0.097515  0.092994  0.100671  0.090312  1.000000  0.115805\\n\",\n       \"A375  0.061650  0.102486  0.099552  0.104961  
0.086981  0.115805  1.000000\"\n      ]\n     },\n     \"execution_count\": 46,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"df_transpose = df_corr.transpose()\\n\",\n    \"df_transpose.columns = df_transpose.columns.astype(str)\\n\",\n    \"df_transpose = df_transpose.rename(columns={'0': 'U2OS', '1': 'HA1E','2': 'VCAP', '3': 'A549','4': 'MCF7', '5': 'PC3','6': 'A375' })\\n\",\n    \"df_transpose.columns\\n\",\n    \"corr_values = df_transpose.corr()\\n\",\n    \"corr_values\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 47,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/plain\": [\n       \"<Axes: >\"\n      ]\n     },\n     \"execution_count\": 47,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    },\n    {\n     \"data\": {\n      \"image/png\": \"iVBORw0KGgoAAAANSUhEUgAAAgMAAAGiCAYAAAB6c8WBAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMSwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy/bCgiHAAAACXBIWXMAAA9hAAAPYQGoP6dpAADj20lEQVR4nOzddVRUzRvA8S8IYisNFravraCgYiF2YmF3Y/xsxW7sbhS7xW5ERBBUWsAuMGjBRkH39weyuOyiu774GsznnHuOzH12Zp69d9e5c2PVJBKJBEEQBEEQsiz1X90BQRAEQRB+LTEYEARBEIQsTgwGBEEQBCGLE4MBQRAEQcjixGBAEARBELI4MRgQBEEQhCxODAYEQRAEIYsTgwFBEARByOLEYEAQBEEQsjgxGBAEQRCELE4MBgRBEAThN3H58mVat25NwYIFUVNT4+jRo999jbu7O2ZmZuTIkYMSJUqwYcMGldsVgwFBEARB+E28ffuWKlWqsGbNGqXiHz16RIsWLahbty4BAQFMnjyZkSNH4uzsrFK7auKHigRBEATh96OmpsaRI0ewsbHJMGbixIkcP36cW7duScuGDBlCUFAQ3t7eSrclZgYEQRAE4Sf68OEDr169klk+fPiQKXV7e3vTpEkTmbKmTZvi6+tLUlKS0vVoZEpvBEEQBOEvkhT7MNPqclizg1mzZsmUzZgxg5kzZ/7ruiMjIzE0NJQpMzQ0JDk5mdjYWIyNjZWq57caDGTmm/+70dQrQY4cRX91N36qxMTwLJGjRvZCv7obP03yx2dZYhvmzGnyq7vxU71/H5YlcvypPn/KtKrs7e0ZM2aMTJmWllam1a+mpibzd+rZ//Tl3/JbDQYEQRAE4W+jpaWVqf/5f83IyIjIyEiZsujoaDQ0NNDV1VW6HjEYEARBEIT0JJ9/dQ+UUqtWLU6cOCFTdv78eapXr46mpqbS9YgLCAVBEAQhvc+fM29RwZs3bwgMDCQwMBBIuXUwMDCQ8PBwIOWUQ69evaTxQ4YMISwsjDFjxnDr1i2cnJzYsmUL48aNU6ldMTMgCIIgCOlIftHMgK+vL1ZWVtK/U6816N27N9u2bS
MiIkI6MAAoXrw4p0+fZvTo0axdu5aCBQuyatUqOnTooFK7v9VzBsQFhH82cQHhn09cQPh3EBcQ/nsfn4dmWl3ZC1bItLp+FjEzIAiCIAjpqTi9/6cTgwFBEARBSO8PuYAws4gLCAVBEAQhixMzA4IgCIKQXiY+dOhPoNLMwLVr1zhz5oxM2Y4dOyhevDgGBgYMGjQo0563LAiCIAi/jORz5i1/AJUGAzNnzuTGjRvSv4ODg+nfvz+NGjVi0qRJnDhxAgcHh0zvpCAIgiAIP49Kg4HAwECsra2lf+/btw8LCwscHR0ZM2YMq1at4sCBA5neSUEQBEH4T/2ihw79KipdMxAfHy/z60ju7u40a9ZM+neNGjV48uRJ5vVOEARBEH6BX/XQoV9FpZkBQ0NDHj16BMDHjx/x9/enVq1a0vWvX79W6VnIgiAIgiD8eirNDDRr1oxJkyaxcOFCjh49Sq5cuahbt650/Y0bNyhZsmSmd1IQBEEQ/lN/yPR+ZlFpMDB37lzat29P/fr1yZMnD9u2bSN79uzS9U5OTjRp0iTTOykIgiAI/6ksdppApcGAvr4+Hh4evHz5kjx58pAtWzaZ9QcPHiRPnjyZ2kFBEARB+M9lsecM/NBDh/Lnz09CQgL3799HTU2NkiVLUqBAAXR0dDK7f4IgCIIg/GQqP4748ePHtGzZEj09PSwsLDA3N0dPT49WrVrx+PHjn9BFQRAEQfiPiYcOZezJkyfUrFmTGzduMGfOHJydnTl06BCzZ88mKCiIWrVq8fTp00zrnG9gMMMmzMCqTXcqWjbH9bLXd1/jE3AD234jMLVqQ7NOfdl/5JRcjIubJ226D6Jag9a06T6IC+5X5GL2HT5J0459MLVqg22/EfgFhsisl0gkrN2yC6s23TGzakuf4RO4//DHflJz0KCe3L7tSULCXby8TmFpaf7N+Lp1LfDyOkVCwl1u3fJkwIAecjH58+djxYo5PHrkS0LCXQIDXWna1EomRpV216xxIDExnOHD+/8xOdapY46zsxMPH/qQmBhO69by17NMnTqaoKCLxMXdJiIimNOn91CjRtUfynHI4N7cu+PNm1cPuHb1DHW+k2O9ujW5dvUMb1494O5tLwYN7JlhrK1tG5I/PsP50BaZ8okThuPtdYr4uDs8fxqE86EtlCkjexGvgYEeWzYvJ/yxH68S7nPqxC5KlSqucn6/637q6LiUxMRwmcXd/ajK+aW2deuWJ/Hxd7hy5SSWljW+GV+njgVXrpwkPv4ON296MGBAd4U5Ll8+h4cPfYiPv0NAgGyO48bZ4el5nOjoUMLC/DhwYBOlS5eQrtfQ0GDu3En4+JwjNvYWDx9eZ/PmZRgbG/wxOVpamnPo0BYePrzO+/dhCj+LAGXLluLgwc1ERgYTHR2Ku/sRihQp+EN5/mtZ7DkDKg0GZsyYQdmyZbl37x729vbY2NjQrl07Jk+ezN27dylTpgwzZszItM69f59I2VIlmDzGTqn4p88jsRs3HdPKFTi4dQ0DenbGYcUGXNw8pTGBIbcYN8OB1k2tcd6+jtZNrRk3zYEbobelMWcuuLNg5UYG9urCwa1rMK1cgSHjphERGS2Ncdp9kB37DjN5jB37tqxET0ebgaMm8/btO5Vy7NixNUuWzGDhwjVYWLTgypXrHDu2PcMPQLFiRTh6dDtXrlzHwqIFixatYdmymdjYNJfGaGpqcurUbkxMCtOt2xAqV7bCzm4Sz59H/lC7rVs3oUaNqjx7Fim37nfOMVeuXAQH32T06GkZ9u3evYeMHj2d6tWb0LBhB8LCnnDy5C709FQ75dWpUxuWLZ2Jw4JVVDdviqfndU6e2PXNHE8c34mn53WqmzdlwcLVrFg+m3btWsjFFi1aiEULpuPhcVVuXb26NVm/fjuWdVvTrEVXNLJpcObUHnLlyimNOXzIiRLFi9K+Qz+qmzclLPwZ587sk4n5nt99Pz13zg0TEzPpYmPTW+nc0tpqxeLF01
m4cA01a7bEy+s6R49mnKOJSRGOHt2Gl9d1atZsyaJFa1m6VFGOuzAxKUz37kOpUqUhw4ZNlMmxbl0LNmzYQf36NrRq1YNs2TQ4eXKndPvkypWTqlUrsmDBKmrVakmXLoMpXbo4Bw9ukevT75pj7ty5CA6+xejR0zPsW/HiRXF1PcTduw9o2rQL5ubNcHBYTWKieMT9f0FNIpFIlA0uWLAgBw4coE6dOgrXX758mS5duvD8+fMf6kxS7MMM11W0bM5Kh2lY16udYcyydVtw87zGiT2bpGWzFq3m7v2H7N60HICx0xx4++4dG5bOkcYMHjOVfHnzsHjWJAC6DhxFuTIlmT5+hDSmdbdBNKxbi9FD+yKRSLBq252etjb072ELpDx3oX7rbowe2g9bG/kvdE29EuTIUVSu/PLlYwQGhjBy5BRpWWCgKydOnGfatIVy8XPn2tOqVSOqVk17EuTq1fOpVKkcDRq0A2DAgB6MGTOYypWtSE5OVvheKdtuwYKGXL58nNate3L06FZWr3ZizRrFX0KJieG/VY7p+9ap0wBOnDj/zbi8efMQE3OT5s274uYmP2OUmBiORvZCcuVenifwDwhh+Ah7aVnwjUscP36WKVMXyMU7zJ9Mq1ZNqFS5gbRs7ZoFVKlcnjr12kjL1NXVcXN1Ztv2/dSpY0GBAvno0DHj2Rk9PR0inwdj1bA9Hp7XKF26BLdCPahc1YqbN+9K64x4dgP7yfNw2rpX5vXJH5/9VttQmXYdHZeSP38+bG0HZvi+fC0xMZycOU0UtHWUgIAQ/ve/qdKygABXTpw4x/TpixTkOImWLRtTrVpajqtWzaNy5fJf5did0aMHU6VKQ6X2U0jZhk+eBNCoUSeuXLmuMMbMrDKenicoU6YWT57If9++fx/22+b4/n0YtrYD5T6LO3asJikpmf79R3+3jtR6fqYPIS6ZVpdWxcaZVtfPotLMQFxcHMWKFctwfYkSJYiLi/u3ffphQSG3qW1uKlNmaWFK6O17JH3ZSYNCb1G7RroYczMCg28BkJSUxM079+TqqW1uSlDITSBlBiI2Ll4mJnv27FSvWonA4JtK91dTUxNT00pcuHBZpvzCBQ9q1jRT+JqaNU25cMFDpszFxR0zs8poaKRcD9qqVSOuXfNj5cq5hIX54efnwoQJw1BXV1epXTU1NZycVrB8+UZu3bqrdF6/Q44/2tf+/buRkPCSGzdU3Y6VcbngLtfnWjWrK3xNTQszXFxk48+7XJLJEWDa1NHExMaxdds+pfqSP38+AF7EJwCgpZVy6+/XR1efP3/m48eP353mT/W776cA9erVJDzcn+DgS6xbtxB9fV2lcvs6x2rVKuHqKttnV9fLGeZoYWGKq2v6vl3G1LSSNMeWLRtz7Zo/K1bM4fFjX3x9zzN+/Lf303z58gIQ/2UbZhTz+fNnEhJeKZMe8HvlmJ6amhrNmjXk3r1HHD++g7AwPy5fPprh6YT/hDhNkLGCBQsSGhqa4fqQkBCMjY2/W8+HDx949eqVzJIZv3YY+yIeXe0CMmW6Otokf/ok/dDExsWjq5M+pgCxL14AEJ/wik+fPqOroy0bo12A2Lh4aTspZdoK6olXur96ejpoaGgQHR0rUx4dHYOhob7C1xga6hMdHZMuPhZNTU3p1Hbx4kVp164F2bKpY2PThwULVvG//w1i0qQRKrU7bpwdycmfWLvWSemcfpccVdG8uTWxsbd4+fIeI0YMoGXL7sTF/cB2jEqfYyyGRorP6xoaGci/J1GyOdauVZ2+fboyeMh4pfuyZPEMPD2vERp6B4Dbt+/z+PET5s21p0CB/GhqajJh/DCMjQ0xzqBvGeb3m+6n585dok+f/9GsWRcmTpyLmVllzp7dJ/MMlO/nqK2wraio2G/mGKVgm8vmWIR27ZqTLVs22rXrw8KFq/nf/wYyceLwDPuycOE0rly5Lp3JSU9LS4s5cyaxf/8xXr9+80fmmJ6BgR558+Zh3LihuL
i407p1T44fP8e+fRupU8dC6XqEH6fSrYVt27Zl/PjxmJqaoq8vu/NER0czceJEbGxsvluPg4MDs2bNkimbMWMGU4b3UqU7Cqmpqcn8nXoW5OtiRTHpy+RiUCJGIl+mjPRnatTU1OTK0rejqB+pr1FXVycmJg47u0l8/vyZgIBgjI0NGT16CPPnr1Sq3WrVKjFsWF9q1Wqpcj6K+/xrclSGu7sX5ubN0NPToV+/ruzevY66ddsSE6PaLJfqOaaPTyvPkyc327etZsjQ8UoPTFatnEeliuWob9VOWpacnIxt54Fs2rSU2OibJCcn4+rqwZkzrkpm9a3+/vr9FODQoRPSf9+8eRd//xvcvetF8+YNOXbs7E/OUT5eUY7DhqXmGIKxsSGjRg3GwWGVXH3Ll8+hUqV/sLbuqLA9DQ0Ndu5cjbq6usxUvyp+dY6KqKun1HnypAurV6echrxx4yYWFmYMHNgdT89ryiWXiSQS8ZyBDM2YMYPTp09TsmRJevTowT///APAzZs32bNnD0ZGRkyfnvEFIqns7e0ZM2aMTJmWlha8fqZKd+To6WjLHZm/iE9AI1s26fSpnq629Ag/Leal9Chfu0A+smVTJzbuhXzMlxkFvS+zBrEvXqD/1YVmL+IT5GYmviU29gXJyclyo3J9fT250XuqqCj5ozF9fV2SkpKk/2lERkaTlJTM56+mp27fvo+xsQGamppKtWtpaY6BgR737nlL12toaLBw4VRGjOhH2bKWv3WOSUlJSvUP4N279zx8GMbDh2Fcvx5ASIg7ffp0YfHitarlaCTf5+ioGIWviYqMls/RQE+aY4UKZSlevChHj2yTrk+ddk18F0b5ivV4+NXdKyuWz6F1qyZYWbfn2bMImXr9A4KpXqMJ+fLlJXv2lO3v5XkCX78bKON33k8ViYyMJjz8mUp3TMTGxitsy8BA95s5GinY5j+6ny5bNotWrRrRqJGtwot1NTQ02L17LSYmRWjevKtKswK/S47f6ltSUhK3bt2TKb9z5z61a3/7boef5g+5JTCzqHSaQFtbm2vXrtG9e3f27dvHqFGjGDVqFAcOHKBbt254e3sr9eAhLS0t8uXLJ7NoaWn9cBKpqlT8B28ff5kyr+v+VPinNJpfzm9VqVBOPsbHn6qVygEp59XKly2Nt0+ATIy3jz9VKpYHoHBBI/R0tWVikpKS8A0Mpmql8kr3NykpCX//YKyt68qUW1vX5epVP4WvuXrVXy6+UaN6+PndkF684+3tS8mSJjKzFKVLl+D58yiSkpKUanfPHmeqV2+CuXkz6fLsWSTLlm2kVauMb4H7XXL8N9TU1KTn2pWRkuMNGlnXk+uz91Vfha+5es2PRo1k4xs3qi/N8fbt+1Sp1hCzGk2ky4mT57l0yQuzGk1kLhpbuWIu7Wya07ipLY8fZ/yroa9evSY29gWlShXHzKwKJ06cUyG/33M/VURHpwCFCxsT+dXdP8rkGBAQTMOGsm01bJhxW9eu+cvFW1vXxd8/+Ds5FiciQnY/Xb58Nm3bNqNZs66Ehclvw9SBQMmSxWnZsjsvXiQondvvkuP3+ubnd4MyZUrIlJcuXZzw8H93kCgoR+WrrbS1tVm/fj1xcXFERkYSGRlJXFwcGzZsQFdXtYt2vufdu/fcvvuA23cfAPDseRS37z6Q3uK3fP1W7Ocskcbb2rQkIjKaRas28eBxOIdPnuPwyfP06dpBGtPDti1ePv5s2XWAh2FP2LLrAFd9AuhpayON6dW5Hc4nznH45DkePA5n4cqNRETF0PnLbV9qamr0tLXBccd+Lrhf4d7Dx0yZt4wcWlq0bNxApRxXrdpM375d6N3blrJlS7Fo0XSKFCmIo+MuAObMmciWLcul8Zs376Jo0UIsXDiNsmVL0bu3LX36dGbFirQ7KDZt2omOjjZLl86kVKniNGvWkAkThrFx43al233xIoGbN+/KLMnJSURFxXDvXsZ3ffxOOebOnYvKlctTuXLKAK
1YsSJUrlxeehtVrlw5mT17Aubm1ShatBBVq1Zk/fqFFCpkhLOz/PMpvmX5Skf69+tKn96d+eefUixdPJOiRQqxcdNOAObNncRWp7Sp742bdmJStDBLFs3gn39K0ad3Z/r17cLS5RuAlOtqQkPvyCwJCa94/eYNoaF3pF+yq1fNp3u39vTsNZzXr99gaKiPoaE+OXLkkLbVoUMr6terRfHiRWnduglnT+/l2PGzuKS7MO9bftf9NHfuXDg4TMHCwhQTk8LUq1cTZ2cnYmPjVT5FkNJWZ3r1Sm1rGkWKFGTz5t0AzJ49gc2bl0njHR13y+TYq5d8jo6Ou+RyHD9+GBs27JDGrFgxly5dbOjdeyRv3rz9ahumHCBly5aNPXvWY2pamb59/0e2bNmkMar+SuyvyvF7n0WA5cs30rFjK/r27UKJEiYMGdKbFi0asWlTWj3/qSx2ASGS38jHmAcyi+e5I5IyZcrILeP/Zyf5GPNAMv5/dpJunTvIvObK+SOStq2aSypUKC+xql9XsnPTKrl6Tx7YJmnaqKGkfPnykqaNrSWnDu6Qi9mxcaWkQb06kgoVyktsWreQeLkclVn/Ifq+ZPmCWZLatSwkFStWkHSzbS8JveYqV0/qIpFIJFpaRRQuI0ZMljx+HC5JTEyU+PndkFhbd5Cu27HjgMTd3UsmvlGjjhJ//xuSxMREyaNHYZJhw+zl6qxXr63k6lU/yfv37yUPHjyWTJu2QJIzp4nS7SpaHj8Ol4wdOzPD9b9bjo0bd1K4n+3YcUCipVVEki9fKcmRI6clT59GSBITEyXPnkVKjh8/J6ldu+U3c8ymWVDhMmy4veTRo5Qcff2CJA2s2knXbdu+X3Lp0hWZeKuG7SV+X3J8+DBMMtRuYoZ1p9Zx9NgZmbKM9O03Shrzv1FTJeHhzyQfPnyQPH78RDJ33nJJjlwmCtv43bbh99rNn7+U5Pz5S5KoqBjJhw8fJGFhTyQ7dhyQlCxp/s1tmCNHUYXLyJFT0rXVUbouNcev4xs16iTx9w/+kmO4ZPhwe7k669e3kVy79nWOCyW5chWTrs/IgAFjJDlyFJWUKVM7w5jGjW0V5vG75di4sW2Gn8Wv6xk0aJzk3r2Hknfv3ksCA0MkHTv2zzCPn+2975FMW/4EKj1nACAoKIgTJ06go6ODra0tenp60nWvXr1i1KhRODn92NXn33rOwJ8uo+cM/E0yes7A3ySj5wz8LTJ6zsDfJKPnDPxNMnrOwN/kZz9nINHHOdPqylGjw/eDfjGVThOcP38ec3Nz9u3bx8KFCylXrhxubm7S9e/fv2f79u3fqEEQBEEQhN+NSoOBmTNnMm7cOEJCQnj8+DETJkygTZs2nD2r2rk5QRAEQfitZbEfKlLp1sLQ0FB27ky5IEpNTY3x48dTuHBhOnbsyN69ezE3V+6JZoIgCILwW/tTLvzLJCoNBrS0tEhISJAp69q1K+rq6nTp0oWlS5dmZt8EQRAEQfgPqDQYqFq1Km5ubpiZyT7HunPnznz+/JnevVX/pTBBEARB+O38IdP7mUWlwcDQoUO5fFnxvcldu3YFYNOmTQrXC4IgCMIfI4udJlD51sKfSdxa+GcTtxb++cSthX8HcWvhv5d4ZXem1ZXDsnum1fWzqHQ3gbq6OtmyZZNbtLW1qVmzJocPH/5Z/RQEQRCE/04WewKhSqcJDh8+rPBX+RISErh+/To9evRg+/btdOrUKdM6KAiCIAj/NfGrhd/wrZ8n7t27N+XLl2fJkiViMCAIgiAIfxCVf6joW5o0acLdu3czs0pBEARB+O+J0wQ/7v379zK/liYIgiAIfyRxa+GPc3R0pFq1aplZpSAIgiD89/6QI/rMotJgYMyYMQrLX758ia+vLw8ePMDDwyNTOiYIgiAIwn9DpcFAQECAwvJ8+fLRrFkz7OzsMDH5u+9tFQRBELIAcZogY1//XLEgCIIg/LWy2GmCTL2bQBAEQRCEP0+mXkAoCIIgCH
Ply0dwcDCBgYHSZciQIZQtW5bAwEAsLCyUblulEwxFixbF0dFR+reRkRE7d+6UiVFTU2PkyJHfrOfDhw9yV1IqmjLR09NBQ0OD6GjZ6xKio2MwNFR8i6OhoT7R0THp4mPR1NRET0+HyMhoihcvSoMGtdm37yg2Nn0oVaoYK1bMRUNDg/nzVyrd7rlzl3B2PkV4+FOKFSvKjBljOXt2H7VqteTjR9kvaEV0dbXR0NAgJl07MdFxGBjoKXyNgYE+MdFx6eJT8tPV1SYqKobDh06hq6fDGZd9qKmpoampyRbH3axYtjHDvsxzmIy3lw+3bsqOtps2s2LzthXkypWTyMho2rXpzYu4+O/mplSOhhnkaKj33RzdXD0ZOrwv3l4+PHoYTr0GtWjWwlo6Qlam3dT3WL6tOAoXVXwa6ls5pt9fYqJjMTTIYD810JfrW3S6HL9malaZChXKMtxukrTszZu3XLvqz/iJw7hz+z7R0bF07NSa6jWq8uD+Y4XtamllZ+bsCRw8cJzXr98onaOqYl/Eo6tdQKZMV0eb5E+fSEh4hb6eDrFx8ejqpI8pQOyLFwDEJ7zi06fP6OrIToXrahcg9ss+GPsi/kuZtlw9z786lfA9v/N+mqpJswZsclpOrlw5iYqMoaNNX168+PefxeiY2G/mGB0jv1/LfN84p3zfnD6/V+b7ZuUyxaeiWrZqRP78+di767C0zNBQT1p3+raKFCmkdI6Z6Vc9OXDMmDH07NmT6tWrU6tWLTZt2kR4eDhDhgwBUq4/ePbsGTt27EBdXZ2KFSvKvN7AwIAcOXLIlX+PSoOBx48fq1R5RhwcHJg1a5ZM2beugkx/JkNNTe2b1yukX5V6riX1Nerq6sTExGFnN4nPnz8TEBCMsbEho0cPYf78lUq3e+jQCem/b968i7//De7e9aJ584YcO3Y2w/59v78g+cZvXSnq19fllnUtGDvejnGjZ+LnG0jxEiYsWDSNyMholixcK1ff4mUzqVCxLM0bd5Fb53H5KvVqt0FXV5tefTqzdccqGll1IDbmhdL5gXw+amryeauS4+QJc1m+eh7evmeRSCQ8fhTO3t2H6dq9vcrtqrp/qdLnf7Mdv9arty2hoXfw97shUz544FjWrF/AnfveJCcnExQYysEDx6lStYJcHRoaGjhtW4W6uhpjR2f8ecssis57ppR/OyZ9mVwMSsRIFD8x9Xt+5/3U8/I1rOq0RUdXm569bdm8bQVNG3YiNlbFz2L6PvO979PvfN/UMWfM+KGMHzMTX58gSpQ0wWHhVKIiY1iySP77pkevTlxwuUykgsFaZn0W/2SdO3cmLi6O2bNnExERQcWKFTl9+jQmJiYAREREfPeZAz/iX11A+KOUv7LyBcnJyXKzAPr6enJHYamiouRnDfT1dUlKSiLuy9FEZGQ09+494vPntLHf7dv3MTY2QFNT84faTa03PPwZpUoVzzDma3Fx8SQnJ8uNyvX0deWOOFJFR8cojE9KSuLFiwQApkwbxYG9R9m5/QA3Q+9y6kTK+cDRY4fIfUEuXDKd5i2sad2ih8wFlKnevXvPo4dh+PoEMnKYPcnJn+jZy1ap/GRyTHeEnJKj4vcyOkr+SCV9jnFx8fTqZkdRoypUrWBFTbNmvH3zlvCwp0q3m7ot5dvSyfD9/1aO6fcXPX3djPfT6BgMMthPU3NMlTNnDtp3aMWO7Qfk6nn0KJyWzbphbFCR8mXr0LBBezQ1NQl7/FQmTkNDg207V2NSrDBt2/T+qbMCAHo62tKj9lQv4hPQyJaN/PnzpcToakuP8NNiXkqP8rUL5CNbNnVi417Ix3yZUdD7MmuQOpvwdVvpZya+5XfeT1OlfBbD8fMJYtTwKXz69InuvRRfX/LNHBXsdxl+30TJz26lz3HytFEc2HeMndsPcuvml++bWUsZNXaw3PdN4SIFqW9Vm53p9uWoqNTPooLPUEzG37k/0698AqGdnR2PHz/mw4cP+Pn5Ua9ePe
m6bdu2cenSpQxfO3PmTAIDA1VuU6XBwMWLFylfvrzCZwu8fPmSChUqcPnyZQWvlPX/9u47nurvjwP46+ISioxsaUnR0B4atCgpLQ2KdtHeUikNLS0lDaNBSmlvmhpCCBEpNGxaQjfO7w+5fNxL9/alfuU8v4/P4/HtfN73fM77no97z+d8xhX0ykoOh4OnT6PRvz/zVpz+/Xvj8eNwvnU/fvyUJ37AgD4ID3/GvVjw0aMwNG/OfICStnYzvH+fAQ6H80vbBQB5+YbQ0FDlO+Llh8PhIDIiBkb9ejHKDfv1wpPHT/m+JjQkAoaV4vv174WIpzHc/CQlJRkDHQAoLikGi8Vi5LzVxRFDhw3CMFMr7ofTz7BYLIhXuGXoZzgcDqIiY2HYj3m+y9DIAE9CIvi+JvRJBAyNDBhlRv0MEBkRw3PBZ1HRN6SnZUBMTAxDhxvj6uUggbebkvwGGemZjG2x2Wz0NOiK0BD+739VOZb2I2+bq+vHyvGl/RjNk+OIkaaQkBDHSb9zVbbh69cCZGRkoWFDGfTr3xtXLgdy15UNBJo3b4LhZpOQV2mwURvat2mFR6HM3B8+eQq9Vtpg/zjX3F6vNW9M6FPoty290p7NZkNXRxuPQpn7yaPQp2jfRhcAoKGmAkUFOUYMh8NBWGQ09NvqCtze/+f9tEosFuP2vZ/hcDiIiojlabNhPwM8qWJ/D30SAUOe/boXI0e+nzfFJTyfNwBgaTUKWVk5PNcvpSS/QXp6JmNbbDYbBgZd8eTxT96HWkJqcPkbCDUY2LVrF6ZPnw4ZGRmedbKyspg5cyZ27txZY40DgD17DmPy5HGwtraAjk4LbN26Bpqaajh0qPTqzPXrl8PDo3ybhw8fR+PG6tiyZTV0dFrA2toCNjZjsWtX+fmrgwePQV5eDi4ua9GiRVOYmPTDsmV2OHDgiMDblZaWgrOzA7p16wgtLQ306dMdZ854Ijs7T6hTBG57PTHRegwsJ45GS53m2LjZARoaqvDy8AUArFm7BPsPbuPGe3qcgKamGjY4r0RLneawnDgaVpPGYO+e8nvPr129hcnTLDFytCkaa2nA0MgAK1ctxNUrQdw/2u0718Fi7HBMn7IIXz7nQ0lJEUpKiqhXr3RQJiUlidWOi9G5iz40NdXQrr0edu/dBDV1FcbtQILYv9cLVpPGYILVKGi3bI4NzvZQ11CFt+cJAMAqx8XYd2ArN97b0w8ammpYv8ke2i2bY4LVKFhOGo19ezy4MR07t4Op2SBoNdFE9x6dcSrAAyIsEbjuPiTwdgHA3e0IFiyehSFDB6JVa23sdd+MgoICnPG/JFSO+/Z6YpK1Bax+9OOmzQ7Q0FCD549+dFy7BO4Hy+8q8PTwhaamOjb+6EeriaMxcdIYuFboxzITrcfg8qWbfL/E+/fvjf4D+kBLSwNGRga4eMUHLxNf4fix0ouzREVFcfT4XnTo0BbTpyyEqIgIt6/ZbLbA+X39WoD4hCTEJyQBAN69z0B8QhL3Fr+d+71gv748PwtzU6SlZ2LrnoNISk5FwKXrCLh0AzbjR3FjrCyG42HoU3gcP4VXKW/gcfwUHodGYKKFOTdm0tgROHPxOgIuXUdSciq27D6AtIwsjB1RepcAi8XCRAtzHDp6EoF3HyDxVTIcNu5APQkJmA40FDg/4P93P5WSkoTDmkXo1KU9NDTV0K69Lna5boSamvB/izyfN84roa6hCi+P0m2tXrsYbhVy9PI4AQ1NNWxwtq/weTMae3eX53j96i1MmToBI0dV/LxZgGsVPm+A0r6aYDUKfr5nGc+rKePudgSLFs+CqdlAtG6tjX3uW/C1oABn/C/yxFI1T6hrBqKiorBlC+/tfGUGDRqE7dt5b6P6L06fvgh5+YZYuXI+VFSUEBubAHNza6SmvgMAqKgoMe79T05+A3Nza2zdugazZk1CWloGFi1ai3Pnyv9o3r5Nw9ChVti6dQ3Cwq7j/fsM7Nvnie3b9wu83eLiYr
Rp0wqWlqPQsKEM0tMzcffuI0ycaIcvX/IFzu/smSuQl5fDshVzoKyihLjnCRg7ahrevHkPAFBWaQSNCvmlpryFxahp2LTZAdNmWCE9LQMrlq7HxfPXuTHbt+wDIQQOqxdBVU0ZOdm5uHb1FtZXuE1p6vTSX5e8fM2X0R7bmctwwicAxcXF0NZphnGWI6CgII/c3DxEhEdjyKBxiI8T7paecwFXICffEEuW20FZRQnxzxMwfvR0vK2YY4X7nFNT3mL86OnY4LwSU6ZbIj0tAyuXbcClCze4MfUkJLBy9QJoNdFEfv5XBN64C9sZS/GpwhXyP9suALjuOgRJyXrYtsMRsg1l8TQsCqPNpwjVhwAQcOYy5OUbYtmKuVBRaYS454kYM2pqhX5UgoZmeY4pKW8xZtRUOG92wPQZVkhPy8TypU64UKEfAaB5iybo2bMLzM0mgR8Z2QZwXLsEauoqyMv7iAvnr2H9OhfuUZu6ugpMhw4EADx4zHzoj+ngCQi+HyJQfjHxiZgydzn331tdSwfXwwcPwMZVi5Gdk4u0jPIZMQ01Fbhtd8LWPQdxIuAilBQVYL9gFgYalc9qdWiri23rVsD14FG4HjoGTXVVbHOyRzu9VtyYwQP64uOnz3D38kVWTi60mzXB/u1OUFMpfyjLFMsxKCz6hg0u+/Dp8xe009XBwV0bIS0tJVBuZf5f99Pi4mJot2yGcRNGQF5BDnm5eYh4Gg0zkwl4Ef9SqBzP/tjW0h/binuegLGMHJV4Pm/GjpqOjZtXYur0ss+bDbh4ocLnzVY3EEKwcvVCxudN5VtbDY0MoNlYHT7HToOfPTsPQrKeBLbtWIuGDWURHhaF0cMnC/23WFOEuQvgXyDUcwbq1auHmJgYtGjB/6lQL1++RNu2bVFQUPBLjeH3nIF/RVXPGfiXVPWcgX9JVc8Z+FcI8pyBv11Vzxn4l1T1nIF/SW0/Z2CzVs09Z2BFimDPGfiThDpNoK6ujujo6CrXP3v2DKqq/J9kRVEURVHU/yehBgNDhgzBmjVrUFhYyLOuoKAAjo6OGDp0aI01jqIoiqL+hLp2AaFQ1wysWrUKAQEBaNmyJebMmQMdHR2wWCzExcVh3759KC4uhoODw88roiiKoqj/YyV/zdd4zRBqMKCsrIwHDx7A1tYW9vb2FR4iwoKxsTHc3Nx4fm2JoiiKoqj/b0L/3mGTJk1w5coV5OXl4eXLlyCEQFtbG3KVHgdKURRFUX+rP/U44j9FqMHAlClTBIrz9PT8pcZQFEVR1P+DunWSQMjBgLe3N7S0tNChQ4c697xoiqIoqu6gMwPVmDVrFvz8/PDq1StMmTIFVlZWkJeXr622URRFURT1Gwh1a6GbmxvS0tKwfPlyXLx4EZqamrCwsMD169fpTAFFURT1zyhh1dzyNxD6VwslJCQwfvx43Lx5E8+fP4eenh5sbW2hpaWFL19q95fQKIqiKOp3KAGpseVv8J9+wrjsV6kIITy/WkVRFEVR1N9B6MFAUVERTpw4gYEDB0JHRwfR0dHYu3cvUlNTUb9+/dpoI0VRFEX9VvQJhNWwtbWFn58fGjdujMmTJ8PPzw8KCgq11TaKoiiK+iPq2ly3UIMBd3d3NG7cGE2bNsXdu3dx9+5dvnEBAQE10jiKoiiKomqfUIOBSZMmgcX6Sy6NpCiKoqhf9Ldc+FdThH7oEEVRFEX96+rWUOA/3k1AURRFUdTfT+gfKqIoiqKofx29gJCiKIqi6jh6zQBFURRF1XF1ayhArxmgKIqiqDrv/2pmoLAw9U83oVblfXn5p5tQ67I/JfzpJtS6j1+S/nQTahVbsdmfbkKtqwv7ae7nxD/dhL8avWbgD5KU1PrTTag1BQUpUJZt9aebUasyPsajkazOn25Grcr6+AIqDVv/6WbUmvQPcVCUafmnm1Grsj8lgJP96k83o1axFZuhsXzbP92MWpWaG12r9ZM6dqKAniagKIqiqDru/2pmgKIoiqL+H9
DTBBRFURRVx9W1WwvpaQKKoiiKquPozABFURRFVVK35gXoYICiKIqieNDTBBRFURRF1Sl0ZoCiKIqiKqF3E1AURVFUHVfXHjpEBwMURVEUVUldmxmg1wxQFEVRVB1HZwYoiqIoqhJ6moCiKIqi6jh6moCiKIqiqDqFzgxQFEVRVCUlpG6dJhB6ZuDw4cOwtraGl5cXAODkyZNo3bo1mjVrBkdHxxpvIEVRFEX9bqQGl7+BUDMDu3btwqpVq2BsbAwHBwe8f/8eO3fuxMKFC1FSUgIXFxeoq6tjxowZtdVeiqIoiqJqmFCDgQMHDuDgwYOYMGECIiIi0LVrV7i7u2Pq1KkAAA0NDezbt48OBiiKoqi/Gv1tgmqkpKSgV69eAIAOHTpAVFQU3bt3567v3bs3kpKSaraFFEVRFPWbkRr8728g1GBASkoK+fn53H83atQI9evXZ8R8//69ZlpWwYwZExEXF4y8vBd48OASDAy6VBvfq1c3PHhwCXl5L/D8+X1Mm2bJEyMrK4OdO9fj1atQ5OW9QEREEIyNjbjrlyyxRXDwBWRmxiIlJRynTh2EtnYz7noxMTFs2LACoaHXkZ0dh1evnuDw4R1QVVUSOj+baeMR+iwQKRlRuHH3DLr16FRtfA+DLrhx9wxSMqLwJOomJk0Zy1gvJiaGRctsERJ5AykZUbgVfA5G/XsxYkKfBSHjYzzP4rx9NTdmt5szz/orgX5C5wcAk6dNQNizILzJeIbAu2fQ/Sc59jTogsC7Z/Am4xlCowJhPWUcT46Ll9nhSeRNvMl4htvB59Gvf29GjM3U8bjz4AJevQnHqzfhuHLTD/0H9OHZ1tIVcxAdfx+p6VE4d+kodFq1+KUcbaaOx5Oom0hOj8T1O6er7Ucl5UZwO7QNwaFX8D43Fk7O9nzjTIcNxL3HF5GSEYV7jy9i8NABQm83/UMc38V27hSh8ps8bQLCnwXhbWY0gu4GoHuPztXG9zTogqC7AXibGY2wqCDY8OnDJcvtEBoViLeZ0bjz4AL6DejNU8/PtrvMfi4ehV1DSlokXqaE4sx5b3Ts3E6o3AAgLDIadsscYTTMEm0MBiPo3sOfviY04hkspsxFR6NhMBkzGSfPXuaJuXk7GMMsZ6CDoRmGWc5A4N0HPDF+AZdgPNoGHY2GwWLKXIRHxjDWE0Kwz+M4jIZZopPRcNjMWYaXr1KEzhEAJk4Zi+CIq0h4H4bLt06ia/eOVcYqKStiz8EtuB1yAcnZUXDctIwnpmWr5nA/sgMPIq8hNTcaU2dZ8cQsXD4bqbnRjCUs7jYjRkpaEk5bViIkJhAJ70IR9Pg8rCZb/FKOlPCEGgy0atUKz5494/77zZs30NLS4v47Pj4eTZo0qbHGAcDo0UOxbdsabNmyF927m+Lhwyc4d+4INDXV+MZraWni3DlvPHz4BN27m2Lr1n1wcVkLc/PB3Bg2m43Ll49DS0sDlpaz0b59P9jZLcf79+ncmN69u8Hd/Sj69jXH0KFWEBUVw6VLxyAlJQkAkJKShL5+G2zevAc9ephi3LiZ0NZuCn9/D6HyGz5yMNY722PXdncM6D0CIQ/DcOL0QahrqPKNb6ylDl//Awh5GIYBvUdgt8sBbNziANNhg7gxK1bPx6TJY7Fy6Qb06WaKI15+8PLZizbtWnNjTIxGo412L+4yZvhkAMDFc9cZ2wu6eY8RN2HMTKHyAwDzkYOxwdkeu7bvR7/e5nj8MBx+pw9Vk6MGfP0P4vHDcPTrbY7dLu7YtMUBQyvkaL96Aawnj8XKpevRq9sQHPHyg7fPXrStkOP7d+nYsHY7BhiOwgDDUQi+9xhHT+xjfNnPXTAds+0mY8VSJwwyGo3MzGycPucF6frSQuU4fMRgODmvwK7tBzCwz0iEPAqHr/+BKnOUkGAjJycXu10OIDYmnm9Mpy76OOC5A/4nL6B/L3P4n7yAg1470KFT+RedINtt27I3Y1lgtxIlJSW4dOGGwPmZjxyCjZ
tXYud2dxj1MsejR2HwO1N9H544fQiPHoXBqJc5drm4Y9PWVYw+XLl6Aawnj4P90vUw6DoERzxP4IjPPkYfCrLdpJevsXyJE/r0MIOp8Xi8SX2H02e9oKAgJ3B+AFBQUAidFs2wcpGtQPFv36fDdskadGynB3+vvZg2cSycd7nj5u1gbkxkTByWODrDzLg/zhxxg5lxfyxZ7YxnseV9fjXwLjbvPoDpk8bB32svOrbTw6wlq5GWnsmN8fTxx1G/AKxcZAs/j91QlJfD9AUrkZ//VagczUYYw3HTcuzdcQhDDMfgyeNwHDm1H2rqKnzjxcXFkZuTi707DuF5zAu+MfUk6yE1+S02O+1CZnpWldt+EZeITq0MucugXiMZ6x03LoNhfwPMn7kC/boPh8f+Y3DaYo+Bg42qqLF2ldTg8jdgESL4/RMPHjyAtLQ09PX1+a53c3NDSUkJ5syZ80uNkZTU4im7d+8cIiJiMH/+Km5ZREQQLl68jjVrtvLEb9iwAqamA9GhQ39u2Z49G9GunS4MDUcAAKZNs8TChTPRvn0/gWcyFBXl8eZNBAYMGIMHD57wjenUqR2Cgy+iZcseePPmPWNdQUEKlGVb8bzmatBJPIt6juWL1nHL7j+5jGuXg7Bx3Q6e+FXrFsN4cD/07mrKLdu6cy302rSC6cDSI6+o+HvYtd0dXod9uTHePnuRn/8VdjN4R/YAsN7ZHgNNDNG9gzG3bLebM2RlG8DGUrD+zPgYj0ayOjzl14JO4VnUcyxbtJZb9uDJFVy9HIgNfHJcvW4JTAb3g0HXIdyybTvXQa+NDob8yDE6/j52bt8Pzwo5HvHZh/z8r7CdsbTKNiYkh2Dd6m3wOXYaABDz4j4O7D8K112HAADi4mw8T3wIp7XbcdTrJM/rsz6+gErD1jzlVwL9EB0Vh+WLy/vxXsglXLschE1OO6tsDwAEXDqCmOh4rLF3ZpQf8NyBBg2kGQMw39MH8fHDJ8yetuSXt+vl44r69aUxZjjvzED6hzgoyrTkKb9+yx/PImOxdNFabtnD0Ku4cikQG9a58MSvWbcEJkP6o2eX8kH49p3roNe2FQYPKJ3JinlxHzu2u8PzkA835qivG/Lz8zF7+tJf2i4A1G8gjeR3ERhhZo37dx/xrM/+lABO9iu+ry3TxmAwdjuvRv8+PauM2eHmgdvBIbjoe5Bbtm6rKxJevoLPwdL3fvFqZ+R//Qp3l/XcmJmLVkGmQX1sW7cCADB++gK0btkca5bO5caYTZiBfr17YOHsySCEwGi4JSZamGOqVemR8rdv39DXbAIWzp4CC/Pyv5MybMVmaCzflqf8/E0fxETFwWHJBm5Z0OPzuHH5Fras313te3Lygieex8Rj3Urez90yDyKvwdP9ODzcjzPKFy6fjUFD+mFw3zFVvvbmgwBcPHsde7Yf4JZdvnUStwLvw2XTXp741Nzoatv7X43RGl5jdfmnnK+xumqLUDMDBgYGVQ4EAMDW1vaXBwL8sNlsdOjQFkFB9xnlQUH30L07/ynYbt06IijoHqMsMPAeOnZsCzGx0uslTU0HIiTkKXbtWo/k5DCEhd3A0qV2EBGp+u2QkWkAAMjL+1BtTElJCT58+CRIemCz2Winr4c7t5jThndvPUDnrh34vqZzF33crRR/OygY7TvocfMTlxBHUVERI6awsAhdq3jP2Gw2Ro0dhhPHA3jW9ezVFbEvH+Bh+DW47HGCoqK8QLlVrLu9vh7u3ApmlN+59QBdqsixSxd9nvfkdtB96HdoUyFHNoqKvjFiCgsL0a2KKU8RERGYjxoCKSkphD6JAABoNdGAsooSo23fvnHw8EEoulbRtqpybKevhzu3K/Xj7Qfo0k3weirr1KU97txmTlXfuVVe569sV7GRAgYM6gvfY2cEbkdZH96u3Ce3gtG1iu106doBtyv1+a2g4Ep9KI6iwsr7aSG6/dhPf2W7bDYb1jZj8fHDJ8RG859xqSlRMfHo2ZW5vx
l064jY+ERwfhxkRMXGoWeXSjFdOyEyOg4AwOFw8PxFIk89Pbt2RFTMcwClMxDZOXmMGHFxcXTWb4vI6OcCt5fNFkPb9rq4V2mfun/7ITp11Re4nl/VtFljhMYGITjiKvYe3orGWhqM9aGPIzDQxBDKP0619ujVBU2ba+FeEO9pld+BXjNQjby8PLi6uuLTJ94vu48fP1a5rrKioiJ8+vSJsVT+8gIARUU5iImJITMzm1GekZENZeVGfOtWVm6EjAxmfGZmNthsNveLrGlTTYwYMRiioqIYMcIGW7a4Yv786Vi+vOqBzJYtq/HgwRM8f57Ad72EhATWr1+BkyfP4/PnL9XmX0ZeoTS/rMwcRnlWVg6UlBX5vkZJuRGysirFZ+aAzWZD/se06J2gYMy0s0HTZlpgsVjoY9QTxkP6QVmF/3s2eGh/yMo2gJ/PWUb5rcB7sJ2+FKPMbLDWYQv0O7TFmYveEBdnC5Rf9TlmQ6mKPlRSVkRWFrMPy3Ism/q9HRSMWXY2aPYjx75GPWEypD+UVZjXbLTWbYnkd0/xLisa23esg42lHRJelF7kqqRUuv1Mvm3j//7zz7Hhjxx529xISfB6KlNSVuRTZza3zl/Z7tjx5vjyJR9XLt4UuB0K3D7k3U7V+6kib5//+Dus2Iez50xGs+b8+1CY7Q4yMUTy+wi8y4rGLLvJGG0+Gbm5eQLn+Cuyc/OgINeQUaYgL4fvxcXcA4LsnDwoyFeOaYjs3FwAQN6HTyguLoGCPPOUhoJcQ2Tn5HG3U1omx6cewXMs+1vM5vP50UhJQeB6fkVEeDQW2jrAavQsrFiwDo2UFBFw7RgayslyYxxXOCPxRRJCY4OQlPEUR/3dsWrpBoSGRNRq26hSQg0G9u7di3v37kFGRoZnnaysLO7fvw9XV9ef1uPs7AxZWVnG4uzsXGV85TMZLBaLp+xn8RXLRUREkJWVAzu7FYiIiIG//0Vs3boX06dP5Fvfzp3r0bZtK1hbz+W7XkxMDMeOuUJERIRxOkNgPO3lzYEZzhtfsXzV8o14nZSCB2FX8DY7Gs7bVsPPJwDFxcV865swcTRu3byPjArnKAHgfMBVBN64i/i4RNy4dhvjR89AsxZNMMDYUMgEa74PHZZvxKukFDwMu4r32THYvG0N3xxfJr6GUW9zmAwYC2/PE3B134KWOs0rb4xP24RKj181P81RsDp//r4Js91xViMR4H+JZ1ZFoLaA335aTfxP+nDlsg14lZSCR2HXkJYTiy3b1+AEnz4UZLvB90Jg1Gs4Bg8ci6DAezjsvUvoWaxfUZZTmbLcKhbzi6lcxhMDAWIIb5kg+O8vQlcjlDuBwbh6MRAv4hIRfPcxbMbZAQBGjy+fip880xIdOrfDlPFzYGo0DhtWb8eGbavQq2/3qqqtVXXtmgGhBgNnzpzBrFmzqlw/c+ZMnD59+qf12Nvb4+PHj4zF3p73aurs7Dx8//6dZxZASUmBZ7agTEZGFlQqHQE3aqQADoeDnB8j7fT0TCQmvkZJSXk3xce/hKqqEths5lHvjh3rMHToABgbj8e7d+moTExMDD4++6ClpYmhQy0FnhUAgNyc0vwaVTrKUVRU4DmqKpOZkQWlSkd9ij/yy8v9AADIycmDjeUcNFXtgE5t+sGg82Dk539Faspbnvo0NNXQx7AHfI76/7S9mRlZePvmPZo15722oyplOVY+kivNkX8fZmZkc4/aufGN5MHhcJBbIUdrSztoqeqjQxsj9Ohsgi98cuRwOHj9KhVRETHYsG4HYmPiMWP2pNLtZJZe7CRM2/jn+IF/jo3keY7ChJGZwTt7othIgVunsNvt1qMTtFs2g8/Rn/+NVpRT1oc8ffKTPuRplwJPH06aYIvGKu2hr2eE7p1MkP8ln9uHwmz369cCvH6VivDQKCyY44Di4mJYTqr6/HRNUJSX4zkyz837ADFRUcjKlh4wKSrIcY/wy2M+co/y5RrKQFRUBNk5ubwxP2YUFH
/MGpTNJlTcVuWZiepwP28qzQL81/30VxR8LcCLuEQ0bdYYACBRTwLLVs3H+lXbEHj9LuKfJ+DI4RO4eO4aZsyx/q1tK0MIqbHlbyDUYCApKQna2tpVrtfW1hboOQMSEhKQkZFhLBISEjxxHA4HERHR6NePebtRv3698fhxON+6Q0Ke8sT3798bT59Gcy8WfPQoDM1/TE2Wt70p0tIywOFwuGU7dzph+HATmJiMR0rKG55tlQ0EmjdvClNTS+6HnKA4HA6eRcairxHzIqU+Rj0R9oT/1FhYaCT6VIo37GeAqIhYnoshi4q+IT0tE2JiYhg6bBCuX7nFU984y5HIzsrBzet3f9peObmGUFNXRUY1VwxXxuFwEBUZi75GBozyvkY9uefuKwsNjeR5Twz79UJkREy1OZoNG4RrV4KqbQ+LxYKEuDgAICX5LTLSMxltY7PZ6GnQBU+qaFtVOT6LjEVfQ2ab+xr2/E9TnOGhUTx1GhqV1ynsdidMHIWoiJgqrwqvSlkfGvar3BYDPKkiv9AnETCs1OdG/Qyq6cOM0v10uDGuXg765e1ysViQkBAXJL1f1r5NKzwKfcooe/jkKfRaaYP947qI9nqteWNCn0K/belFqGw2G7o62ngUysznUehTtG+jCwDQUFOBooIcI4bD4SAsMhr6bXUFbi+H8x3RUc/R27AHo7y3YQ+EP4kUuJ6aIC7ORouWzZD545Qumy0GcXE2z+8BlBSXVHstF1VzhHqXRUVF8f79+yrXv3//vsY7bs+ew5g8eSwmTbKAjk4LbN26Gpqaajh8uPQKZCenZTh8uPyK9EOHfNC4sTq2bFkNHZ0WmDTJAjY2Y7Fr18EKMcchLy8HF5e1aNGiKUxM+mHpUju4ux/lxuzatQHjxpnD2noevnzJh7JyIygrN0K9ehLc98LXdz86dmyHyZPnQ1RUlBtTeXahOu77vGE5aTTGW42EdstmcNq0AhoaqjjiWXo/v4PjIri6b+bGH/X0g6amGtZtXAHtls0w3mokJkwcBTdXT25Mx07tMMRsILSaaKBbj07wCzgEERER7N19mLFtFouFcZYjcOrEOZ6pWSlpKThuWIbOXfSh2VgdPXt1xbGT+5Gbk4crlwIFzq80Ry9YTRqNCVajoN2yGdZvsoeGhiq8f+S4ynER9rpv4cYf8fSDhqYanH7kOMFqFCz55Gj6I8fuPTrhZMBhsERE4FohR4c1C9G9RydoNlZHa92WWLl6AQx6dcVp/4vcmAP7j2LBopkYMnQAWrXWhut+ZxQUFOKM/yWhcjyw7wgmTBrF7cd1m1ZAXUOVe0fCyjULGf0IAHptW0GvbStIS0tBQUEOem1bMU5hHHI/ir79emLO/Glood0Uc+ZPQ2/DHji4/6jA2y1Tv4E0zIYbCz0rUGb/Xi9YTRrzow+bY4OzPdQ1VOHteQIAsMpxMfYdKL/K3PtHH67fZA/tls1L+3DSaOzbU37rbcfO7WBqNghaTTTRvUdnnArwgAhLBK67Dwm8XSkpSTisWYROXdpDQ1MN7drrYpfrRqipqeD82atC5fj1awHiE5IQn1B6QPPufQbiE5K4t/jt3O8F+/XbufEW5qZIS8/E1j0HkZScioBL1xFw6QZsxo/ixlhZDMfD0KfwOH4Kr1LewOP4KTwOjcBEC3NuzKSxI3Dm4nUEXLqOpORUbNl9AGkZWRg7ovQuARaLhYkW5jh09CQC7z5A4qtkOGzcgXoSEjAdaChUjofdjmLcxFGwsDRHi5ZNsWbjMqipq+K41ykAwPLV87HTbSPjNbptdKDbRgfS0lKQV5CHbhsdaOuUP3OFzRbjxoiz2VBWVYJuGx1oNdXkxjg4LUa3np2h2Vgd+p3awt17B+o3kMbpE6VX2X/5nI9HwaFwWLcI3Q1K40aPH45RY81w7VL1A/zaUgJSY8vfQKjHEXfo0AHnzp1jPHWworNnz6JDh1+/epqf06cvQV5eDitXzoOKihJiYxNgbm6D1NR3AA
AVFSXGMwdSUt7A3NwGW7euwcyZE5GWlonFi9fi3LnyD4a3b9NgZjYRW7euRmjoNbx/n4F9+7zg4rKfGzNzZun1AzdvnmK0Z/r0xTh+/DTU1VVhZlZ6z/STJ9cYMYMGjcX9+48Fyu98wFXIyTfEomV2UFZphPi4REwYMxNvf9yaqKTcCOoa5fmlprzDhDEz4eS8ApOnT0BGeiYclm/E5Qr3jEvUk8CKVfOh1UQT+flfEXTjLuxmLMenj58Z2+5j1BOajdXhe4z3LoKS4mK01m0Ji3HDISPbABnpWXhw/wlmTF6I/C/5PPHVORdwFXLycli8zBbKKkqIj0vA+DEzuDkqKzeCRoX7xlNT3mLCmBlY72yPKdMtkZ6eiZXLNzLui69XTwL2qxZwcwy8cRe2M5YxcmykpIh9B7ZCWUUJnz59xvPYFxg7ahruVria2nXXIdSrJ4GtLo6QbSiLp2FRGDNiitA5nj9b1o+2UFIu7UdLi1nlOao04rknP+h++QWb7Tu0wSgLM7xJfYcu7UofLBT2JBKzpizG8lXzscxhLpJfv8HMKYsREf5M4O2WMR85BGCxcPYM70NxBHEu4Ark5BtiyXK70j58noDxo6cz8qvch+NHT8cG55WlfZiWgZXLNjD7UEICK1dX7sOljD782XaLi4uh3bIZxk0YAXkFOeTl5iHiaTTMTCbgRfxLoXKMiU/ElLnLuf/e6lp6ADF88ABsXLUY2Tm5SMsov65GQ00FbtudsHXPQZwIuAglRQXYL5iFgUblD/jq0FYX29atgOvBo3A9dAya6qrY5mSPdnrltxkPHtAXHz99hruXL7JycqHdrAn2b3eCmooyN2aK5RgUFn3DBpd9+PT5C9rp6uDgro2QlpYSKseLZ6+joVxDzF86C0rKjZAQ9xLWY23x7m0agNLPG7VK++m1e+UDyHYd9DBijCnepL6Dgb4JAEBZRYkRM2vuZMyaOxmPgkMxdljp7auqasrYe2gL5BTkkJudi6fhz2A+yJK7XQCYM20plq9ZgD0HNqOhnCzevknD1o2u3IHK7/a3nOuvKUI9Z+DMmTMYN24cdu7cidmzZ0NUVBRA6R+km5sbFi9eDF9fX4wePfqXGsPvOQP/iqqeM/Avqeo5A/+Sqp4z8K+o6jkD/xJBnjPwt6vqOQP/ktp+zoBZ46E1VtfFVOFmGv8EoWYGRo0ahWXLlmHevHlwcHBAs2bNwGKxkJSUhC9fvmDp0qW/PBCgKIqiqP8Xf8vzAWqKUIMBANi4cSPMzc3h4+ODxMREEELQp08fTJgwAV27dq2NNlIURVHUb/W3nOuvKUINBr5+/YqlS5fi3Llz4HA46N+/P1xdXaGo+OsPVqEoiqIo6s8S6tJ/R0dHeHt7w9TUFOPHj0dgYCBmz55dW22jKIqiqD+irj1nQKiZgYCAAHh4eGDcuNIfi7G0tISBgQGKi4u5FxNSFEVR1N+urt1NINTMwJs3b9C7d/kDfbp27QoxMbFqnz1AURRFUX8b+kNF1SguLoa4OPOpXmJiYgL/DDBFURRFUf9/hDpNQAiBjY0N49HBhYWFmDVrFqSlpbllAQG8D7GhKIqiqL8FvZugGtbWvD8YYWVlVWONoSiKoqj/B3/LhX81RajBgJeXV221g6IoiqIoAG5ubti2bRvS0tKgp6eHXbt2Ma7XqyggIAD79+9HZGQkioqKoKenh7Vr18LY2FiobdKfg6IoiqKoSv7UDxWdPHkSCxYsgIODAyIiItC7d28MHjwYqampfOPv3buHgQMH4sqVKwgPD4eRkRHMzMwQESHcL6YK9dsEtY3+NsHfjf42wd+P/jbBv4H+NsF/Z6gxoMbquvNW8F967datGzp27Ij9+8t/OK9169YwNzeHs7OzQHXo6elh7NixWLNmjcDbpTMDFEVRFFWLioqK8OnTJ8ZSVFTEE/ft2zeEh4dj0KBBjPJBgwbh4cOHPPH8lJSU4PPnz5CXlxeqjXQwQFEURVGVlBBSY4
uzszNkZWUZC7+j/OzsbBQXF0NZWZlRrqysjPT0dIHa7eLigvz8fFhYWAiVr9A/VERRFEVR/7qaPH9ub2+PRYsWMcoq3qJfGYvFYraFEJ4yfk6cOIG1a9fi/PnzUFJSEqqNdDBAURRFUbVIQkKi2i//MoqKihAVFeWZBcjMzOSZLajs5MmTmDp1Kvz9/TFggPDXO9DTBBRFURRVyZ+4m0BcXBydOnXCzZs3GeU3b95Ez549q3zdiRMnYGNjA19fX5iamv5SvnRmgKIoiqIq+VNPIFy0aBEmTpyIzp07o0ePHjh48CBSU1Mxa9YsAKWnHN69e4ejR48CKB0ITJo0Cbt370b37t25swqSkpKQlZUVeLt0MEBRFEVRlfypu+7Hjh2LnJwcODk5IS0tDW3atMGVK1egpVV6631aWhrjmQMHDhzA9+/fYWdnBzs7O265tbU1vL29Bd4uHQxQFEVR1P8RW1tb2Nra8l1X+Qv+zp07NbLN/6uHDlEURVHU/4Ouan1rrK4n7+/WWF215f9qZuBffwLhv/zkOqD06XV1IUfVhrp/uhm1Ju3Dc8g30P7TzahVuZ8T68TT+erCUxZrE6ljv1pI7yagKIqiqDru/2pmgKIoiqL+H9S1M+h0MEBRFEVRlfypWwv/FHqagKIoiqLqODozQFEURVGV0NMEFEVRFFXH0dMEFEVRFEXVKXRmgKIoiqIqqWvPGaCDAYqiKIqqpIReM0BRFEVRdVtdmxmg1wxQFEVRVB0n8GAgISGBcatFcHAwzM3NoaenhwEDBuD8+fO10kCKoiiK+t1KCKmx5W8g8GCgdevWyMrKAlD6k4l9+/ZFSUkJLC0t0bBhQ4wcORLXr1+vtYZSFEVR1O9CavC/v4HA1wxUnBXYsGEDZs2ahX379nHL7O3tsWnTJhgbG9dsCymKoiiKqlW/dM3A8+fPMWnSJEbZxIkTERsbWyONoiiKoqg/qa6dJhDqboLPnz+jXr16kJSUhISEBGOduLg4CgoKarRxFEVRFPUn/C3T+zVFqMFAy5YtAZSeMggPD4e+vj53XWxsLNTV1Wu0cRRFURRF1T6BBwO3b99m/FtVVZXx7+TkZEyfPr1mWkVRFEVRf9DfMr1fUwQeDPTt27fa9fPnz//PjaEoiqKo/wd17TQBfegQRVEURdVxAg8GGjRogKlTp+Lhw4e12R6+ZsyYiLi4YOTlvcCDB5dgYNCl2vhevbrhwYNLyMt7gefP72PaNEueGFlZGezcuR6vXoUiL+8FIiKCYGxsxF1vYNAVp0974NWrJygoSIGZ2SC+29LRaQF//8NIT49GZmYs7t49C01NNaHys5k6Hk+ibiI5PRLX75xGtx6dqoxVUm4Et0PbEBx6Be9zY+HkbM83znTYQNx7fBEpGVG49/giBg8dwFg/d+F0XLt1Ci/fhCEmMRhePq5o3qIJI2a32yakf4hjLJdv+gmV26/kCAA9DLrg+p3TSE6PREjkDUyaPJaxXkxMDIuW2eJxxHUkp0ciKPgsjPr3YsRI15eCk7M9wqKD8DotAhev+0K/QxuebWm3bIYjJ/YhIeUJXr4Jw+WbflDXUOWJ+xnrqeMQEnUDr9MjcP2O/0/6URH7Dm3F/dDLeJcbAyfnFXzjTIcNxN3HF5GcEYm7jy9i8ND+jPWLV9gh7cNzxhL14h5PPYtX2CEi7g5epT3FmUveaNmqhdD5TZk2ARHRt/A+Kwa37p1F956dq43vadAVt+6dxfusGDx9dgs2U8bzxMyytUHI0+t4lxmN6Lh72Oi8EhIS4tz1kTG3kfs5kWfZ6uLIqGe5/VzEJgTjXWY0Llw5jla/kB8ATJwyFsERV5HwPgyXb51E1+4dq4xVUlbEnoNbcDvkApKzo+C4aRlPTMtWzeF+ZAceRF5Dam40ps6y4olZuHw2UnOjGUtYHPO0rJS0JJy2rERITCAS3oUi6PF5WE22ECq3sMho2C1zhNEwS7QxGIygez//LA+NeAaLKXPR0W
gYTMZMxsmzl3libt4OxjDLGehgaIZhljMQePcBT4xfwCUYj7ZBR6NhsJgyF+GRMYz1hBDs8zgOo2GW6GQ0HDZzluHlqxSh8qtphJTU2PI3EHgwkJ+fj5CQEPTq1QutW7eGi4sLMjMza7NtAIDRo4di27Y12LJlL7p3N8XDh09w7tyRKr9wtbQ0ce6cNx4+fILu3U2xdes+uLishbn5YG4Mm83G5cvHoaWlAUvL2Wjfvh/s7Jbj/ft0boy0tBSio+OwcOGaKtvWtGljBAWdRkJCEoyNx6FrVxM4O7uisLBI4PyGjxgMJ+cV2LX9AAb2GYmQR+Hw9T9Q5ZeRhAQbOTm52O1yALEx8XxjOnXRxwHPHfA/eQH9e5nD/+QFHPTagQ6d2nFjehh0gddhX5gOHAeLEVMhJiqGk2c9ICUlyajr1s17aNuyN3exHDNT4Nx+NcfGWurwOeWOkEfhGNhnJHa7HMSGLSthOmwgN2bFqvmYaGMBh2Ub0afbUBz1PAnP465o0641N2bHng3oa9gTc2Yuh1HP4bh7+wFOnfOEiqoSN0ariSbOX/PBy4TXGGlmjX69zLFz234UCdGHADBshAmcnO2xe/sBDOozCiGPwuFTTY7iEuLIzcnDHpcDiI15wTemU5f2cPd0wemTFzCg1wicPnkBByr1IwDEP09Eu5Z9uEu/nsMZ6+3mT8VMW2s4LNuAwf0skJmRjZNnD0O6vpTA+Y0YOQSbtjhgx/b9MOw1HI8fhuHUmcPV9KEGTp45hMcPw2DYazh2uuzH5m2rYDas/Dkkoy2GYc26JdjqvBfdO5tgnt1KmI8agjVrl3Bj+huOQqvmPbjLCDNrAMD5s1e5MfMWzoDtnClYvsQJA/qORGZGFs5c8Eb9+tIC5wcAZiOM4bhpOfbuOIQhhmPw5HE4jpzaDzV1Fb7x4uLiyM3Jxd4dh/C8ij6sJ1kPqclvsdlpFzLTs6rc9ou4RHRqZchdBvUayVjvuHEZDPsbYP7MFejXfTg89h+D0xZ7DBxsVEWNvAoKCqHTohlWLrIVKP7t+3TYLlmDju304O+1F9MmjoXzLnfcvB3MjYmMicMSR2eYGffHmSNuMDPujyWrnfEstvyz6WrgXWzefQDTJ42Dv9dedGynh1lLViMtvfz7w9PHH0f9ArBykS38PHZDUV4O0xesRH7+V4Hzq2klIDW2/A1YhAh2lYSIiAjS09ORlpaGw4cPw9fXF1++fMHQoUMxbdo0mJiYgMVi/afGSEpq8ZTdu3cOERExmD9/FbcsIiIIFy9ex5o1W3niN2xYAVPTgejQofwIas+ejWjXTheGhiMAANOmWWLhwplo374fvn///tN2FRSkwMJiOi5evMEoP3rUFRzOd0ydulCgOlQatuYpvxLoh+ioOCxfvK4855BLuHY5CJucdlZbZ8ClI4iJjscae2dG+QHPHWjQQBoTKnxx+54+iI8fPmH2tCWVqwEAKCjIITbpIcyHTMTjh2EASmcGZGQbYLLl3J/mBwDpH+JqJMdVaxdj0GAj9Ok2lFu2ZYcj9Nq0wtBBpUeXkXF3sdvlALwO+3JjvHxckf/lK+bMXI569STw8m0YbCbMQeCNu9yYwPsBuHntLrZs3A0AcPdwAef7d8yduVzgHFUb6vKUXw70Q3TUc6xY7FQhx4u4dvnWT/vxzCVvxEbHY439Zka5u6cLGjSozxiA+Z4+gA8fPsF22lIApUf8Jqb9MbA388ujosj4uzi0/yj27fYAAIiLs/Es8T42Ou7AMe9TjNi0D88h30Cbp46bt04jKioWSxaWH5E/DruGy5duYv1aF554R6elGDykP7p3NuGWuexyQpu2rWDcv/SIdsv2NWip05z7BQ8A6zetQMdO7WBqPIFvLps2O2CQiRE665fPdD1PfAB3tyPYs/Pgj/zE8SLpEdau2YYjXrwzWbmfE9FYvi1P+fmbPoiJioPDkg3csqDH53Hj8i1sWb+bb3vKnLzgiecx8Vi3kvczqc
yDyGvwdD8OD/fjjPKFy2dj0JB+GNx3TJWvvfkgABfPXsee7Qe4ZZdvncStwPtw2bSXJz41Nxqc7FdV1tfGYDB2O69G/z49q4zZ4eaB28EhuOh7kFu2bqsrEl6+gs/B0n168Wpn5H/9CneX9dyYmYtWQaZBfWxbVzrbNX76ArRu2RxrlpZ/jphNmIF+vXtg4ezJIITAaLglJlqYY6pV6b7x7ds39DWbgIWzp8DCfAjf9rEVm1XZ9prAbx/5Vam50TVWV20R+pqB9u3bw9XVFWlpafD29sbHjx8xdOhQNG7cGGvWVH0U/SvYbDY6dGiLoKD7jPKgoHvo3p3/FGy3bh0RFMScJg0MvIeOHdtCTKz0eklT04EICXmKXbvWIzk5DGFhN7B0qR1ERAR/O1gsFkxM+iEx8TUuXDiKlJRw3Lt3rsrTCVXl105fD3duM6fV7t5+gC7dOghcT2WdurTHndvMKcA7t6qvs4FMAwDAh7yPjPKevboiJjEYD8KuYvtuJygqygvVll/JsVNXfdytFH/n1gO076DH7UNxCXEUFjGP3gsLirhT86JiohATE+OZpSmNKZ36ZbFYGDCoL169TMaJM4cQkxiMK4F+MDFlTsULlqMuT5vv3n6Izt30haqros5d+L8Pld+3Zs0aIyLuDkKibmC/x3Y01tLgrmuspQFllUa4W2F/+PaNg0cPwgRuG5vNRvsOerh9K5hRfjsoGF278Z9G79K1A24HMeNvBd2Hfoc23D4MeRQOff026PhjpkOriSYGDjLEjet3qmzHmHHD4HP8NLdMq4kmVFSUGNv69u0bHjx4gq7dBf8bYrPF0La9Lu5V+ru5f/shOnXVF7ieX9W0WWOExgYhOOIq9h7eyuhDAAh9HIGBJoZQ/jGr1aNXFzRtroV7QbxT8jUlKiYePbsy+9egW0fExieC8+MgKio2Dj27VIrp2gmR0XEAAA6Hg+cvEnnq6dm1I6JingMonYHIzsljxIiLi6OzfltERj+v8bwo/gT+9qt81C8uLo7x48cjMDAQSUlJsLGxgbe3t0B1FRUV4dOnT4ylqIh3WlZRUQ5iYmLIzMxmlGdkZENZuRHfupWVGyEjgxmfmZkNNpvN/SJr2lQTI0YMhqioKEaMsMGWLa6YP386li+fI1D7AUBJSRENGtTHkiWzcfPmXZiZTcSFC9fh53cAvXp1E6gOeYWGEBMTQ1al/LIyc9BISVHgtvC0TVmRT53Z1da5btNyPH4Yhvi4RG7ZrZv3YTd9GUYPm4y1q7ZAv2MbnL7gDXFxtsBt+ZUclZQUkZWZw9N+NpsNeQU5AMCdoGDMsrVB02ZaYLFY6GPYE8ZD+kHpx36R/+UrQkMisGjZbCirNIKIiAhGWZihY+d23BjFRgqo30AacxdMw+2gYIwdOQ1XLgXC89ge9PjJdSn8c6zc5v/Wj42U+b0PzDojwp5h3mx7jB81HUvmOUJJWREXb/hCTk4WQOm+UPo65vufnZkNJQHbpqAgx7cPM7OyufVXpqSsiMws3n2QzWZD4UcfBpy5jE0bduHKjRPIyH2OiOhbuH/vMXbvOMivSpgOHQBZWRmcOB7ALVOuIr+szGwoK/H/jOBH/keO2Vn83m8Fgev5FRHh0Vho6wCr0bOwYsE6NFJSRMC1Y2j4ow8BwHGFMxJfJCE0NghJGU9x1N8dq5ZuQGhIRK21Kzs3DwpyDRllCvJy+F5cjA8fPpXG5ORBQb5yTENk5+YCAPI+fEJxcQkU5OWYMXINkZ2Tx91OaZkcn3ryaiododW10wS/9NsElTVp0gTr16+Hk5NTlTEVOTs7Y926dYwyR0fHKqJ5t81isaptD7/4iuUiIiLIysqBnd0KlJSUICIiBqqqyliwYCacnfcIlIOISGmdly7dhKtr6fTrs2fP0a1bJ0yfbong4BCB6iltF/PfP8tPsDoFf8+ct62Grp4OhpkwL7SseF42Pi4RURGxCIsOxABjQ1y5eFPI9jD//V/7cPWKTd
i+xwnBoZdBCEHy6zc46XMWYy1HcF8zZ+Zy7Nq3EVHx9/D9+3dERz1HgP8ltGtfOs1f1ofXrtzCQbcjAIDY6Hh06dYBkyaPxaMHoULmyKfNtdyPtwLLZ83ikYiw0Eg8jrgOiwnmOLDviMD1/FJb8N/60KBXVyxaOhtLF61FWGgUmjXXgvOWVchIz8L2rfsqVwerSWMQePMe0tN5r1WqifxK62H+u7QeoasRyp3A8lmNF3GJCA+Nwv3wKxg9fjgOux0FAEyeaYkOndthyvg5ePsmDd16dsKGbauQmZGN4LuPa61tlQ8Cy97TisX8YiqX8cRAgBjCW/Y7/dfP4L+NwIMBR0dH1K9fv9oYQTvO3t4eixYtYpRJSEhgyxYvRll2dh6+f//OMwugpKTAM1tQJiMjCyoqzPhGjRTA4XCQ82Mkmp6eCQ7nO0pKyq/yjI9/CVVVJbDZbHA4nJ/mkJ2dBw6Hg7gKR9IA8OLFS/TsKdhRZW7OB3z//p3n6EqxkTzPEYowMjOyuUe/5XUq8K1z41YHDBpshBGmE5H2PuMn9Wbh7Zs0NGvGe21HVX4lx8xM3iNOxR99mJf7AQCQk5OHyZZzISEhDjn5hkhPy8SqtYvxJuUd9zUpyW8wwnQSpKQkUb9BfWRmZOGA5w6k/ojJzfkADoeDhBdJjG0lvnhV7VXkwuSY9R/6MSuD3/tQ/b5R8LUAcc8T0PRHH2X+mCVTUm7E/X8AUGikIHDbcnLyfuTH+3dVeeaiTGYG75F5WR/m/ujDlasX4JTfeRw74g8AiHueACkpSezcswEu29wYH8Yammroa9QTkyztGHVmVMgvI6P8Aj3FRgo8MxPVyf2RY+VZgP/6t/grCr4W4EVcIpo2awwAkKgngWWr5mPGxPm4dbN08Bf/PAG6bXUwY451rQ0GFOXleI7Mc/M+QExUFLKyMqUxCnLcI/zymI/co3y5hjIQFRVBdk4ub8yPGQXFH7MG2bm5aFThNGRu3geemQmq9gh8msDR0RFSUoJffVwdCQkJyMjIMJbKv3UAlJ5vioiIRr9+vRnl/fr1xuPH4XzrDgl5yhPfv39vPH0azb1Y8NGjMDRvrsUYvGhrN0VaWoZAA4GytoWHP0PLlsyLWLS1myI19V0Vr+Kt41lkLPoaMi/i6WvY8z9N/4WHRvHUaWjEW+emraswZOhAjB42mfsFWR05uYZQU1dhfOj+zK/kGP4kkk/7DRAVEctzwWdR0Tekp2VCTEwMpsMG4tqVIJ76vn4tQGZGFmRlZWDY34Abw+FwEPk0Bs21mzLim7Vogrdv3guZ43P0qdTmPoY9ERYSKXA9lYWFRvLU2dfIoNp9Q1ycDe2Wzbh9lJryFhnpWehj2IMbw2az0cOgs8Bt43A4iIqIhaGRAaPcsJ8BnoQ85fua0CcRMOzHjDfq1wuRETHcPpSUlGQMyAGguLgELBaL58DC0moUsrJycOPaHUZ5SvIbpKdnMrbFZrNhYNAVTx4L/jfE4ZTOHPWu8D4BQG/DHgh/EilwPTVBXJyNFi2bcQdvbLYYxMXZPE/EKykuEeo6J2G1b9MKj0KZ/fvwyVPotdIG+8d1H+31WvPGhD6FftvWP9rOhq6ONh6FMvviUehTtG9TOkOnoaYCRQU5RgyHw0FYZDT02/JerPu70B8qElJGRgaKiorQuHHjmmgPjz17DsPDYyeePn2GkJCnmDp1PDQ11XD4sA8AwMlpGdTUVDBtWulMw6FDPpg1yxpbtqyGp+cJdOvWETY2Y2FtPY9b56FDxzF7tg1cXNbCzc0bLVo0xdKldnBz8+bGSEtLoXnzJtx/N2miiXbtdJGX9wFvfnxR7Nx5AMeO7UVwcAju3n2EQYMMMWTIABgbM++Jr86BfUfgemAzoiJjEPYkElY2FlDXUMVRr5MAgJVrFkJVTRlzZ5Xfh67XthW3jQoKctBr2wqcb+VHuIfcj+LclWOYM38arl0JgsmQ/uht2APDTMrvcd68fQ1GjDGFzY
Q5+PIln3se+vOnzygsLIKUtBSWrrDDpQs3kZmRCc3G6rBfvRC5OXm4ckm4UwTC5njUyw9Tpk/A2o3L4XPEH5276mP8xJGMOyE6dGoHVTVlxDyLg6qaMpasKL0AdN8eD26MYT8DsFgsJL18jSZNtbBm/RIkJb6Gn89ZboybqycOeLrg8YMwPLgfgn4DemGQiSFGDi2/wl2wHL3hemALoiJjEf4kElY2Y3hyVFFTwrxZ5c+FYPajPE8/HnY/hrNXjsJu/lRcv3ILxkP6obdhdww3mcitY836pbh57Tbevk2DoqICFiydiQYN6sP/xHluzKH9RzFv8Qy8fpWCV0kpmLdoBgq+FiLg9CWB83Pb64n9h7YhMiIGoU8iYG0zFuoaqvDyOAEAWL12MVRVlWE7s/Reey+PE5g2wwobnO1x1PsUunTtAKtJozF9cvmM4PWrt2A7Zwqio54jLCwKzZppYeWqBbh2JYgxSGCxWJhgNQp+vmdRXFzM0zZ3tyNYtHgWXiUl49XLZCxcMhtfCwpwxv+iwPkBwGG3o9i53xnPImPxNDQKE6zHQE1dFce9Su+4WL56PlRUlbDQ1oH7Gt02OgBK+1BeQR66bXTA4XCQ+KL0Sn42WwzaOs0BAOJsNpRVlaDbRgf5+V+R8voNAMDBaTECr93F+7dpUGgkj3mLZ6B+A2mc/tGHXz7n41FwKBzWLUJhQSHevUlDN4POGDXWDE6rtgmc39evBUh9Wz7Iffc+A/EJSZCVaQBVFSXs3O+FzOwcOK8u/TuzMDfFiTMXsXXPQYwaZoKomDgEXLqBbWvL77yxshgOG7ul8Dh+Cka9e+D2/Ud4HBqBo/u3c2MmjR0B+/XboddKG+3btMbp81eRlpGFsSNK7xJgsViYaGGOQ0dPorGGGrQ01XHo6EnUk5CA6UBDgfOraXXtCYQgAvr06ROxtLQkjRs3JpMmTSJFRUXE1taWsFgsIiIiQvr06UM+fvwoaHV81avXmO8yb54DSU5OJYWFhSQ8/Bnp3380d93Ro6fI3bsPGfEDBowhT59Gk8LCQvL6dSqZM8eep86+fc1JSEg4KSgoIElJyWT16i1ESqoJd/3AgRZ823j06ClGPTNmLCGJia/I168FJDIyhowePZVvDoQQoizbiu+yfNE6kprylhQWFpHIiBgyfLAVd52fTwB5cD+EEc9PaspbRszUifNIwoskUlRURF7EvySTreb+tA5CCJk3ewVRlm1FtJTbk1uB90lWZjYpKioib1LfET+fANJB17DKPGoyR/MhE0lUZCwpLCwiKclvyNIFjjzrX8QlkoKCQpKdnUtOnThH2un0ZsRMt15AXr9KIYWFRSQ9LZN4HDhOWmh25mnbAruVJOllMvn6tYBEP3tOJo23rTZHFdnWfJeKOUZFxBDzwVbcdWU5Voyvqh8rxkydOJ8kvkgiRUXfSEL8SzLFai5j/dnTl0na+wxSVPSNvH+XTi6dv076dB3K07ZtzntJelomKSgoJA+Dn5C+3c345kAIIXL1W/BdFi9YQ1KS35DCwiIS8TSaDDEez13nc/wMuX/vMSPe1HgCiYyIIYWFRST5dSpZOG81Y72irA5x3rCL+96/SX1HDh04RrTUOzDiRg6zIYQQ0ll/QJVt27xxN0lLyyAFBYUk+H4I6dllcJWxhBCiKdeG77Jy8XpuHz6LiCWjhlhz153yOUce3n/CiK+qD8vW92g3iG9MxXrOn7lC0n/0Ydq7dHL5wg3Sr/swxnY66vQlJ33OkrR36aTgawFJfPGKrHPYWmUehBDyLSuJsQRfP0tatmzJsyydb0u+ZSWRpfNtyYSxoxiveXDjLBk+dDDR09MlRn17k2MH9/DUe+mUNzEe0I/o6uoS44H9yWX/ozwxRw/sJoZ9ehE9PV1ibjaEPLx5jrG+KPMl2bl5HenZoxtp00aPTLAYSWJDgnjqqbjUtqo+A35l+RsI/JyBuXPnIjAwELa2tggICICsrCySkpLg7u6Okp
IS2NraYtiwYdi4ceMvD0z4PWfgX1HVcwb+JVU9Z+BfUtVzBv4VVT1n4F9S1XMG/iU/e87Av6C2nzOgLNuqxurK+Mj/AXH/TwQ+TXD+/HkcOXIERkZGGDVqFDQ0NHD+/HkYGJSeq9uyZQsWLVr0nwYDFEVRFPX/4G+5JbCmCHz1SWZmJlq0KH3et5qaGiQlJaGjo8Ndr6enhzdv3tR8CymKoiiKqlUCDwYUFBSQlVV+Ffnw4cPRsGFD7r+/fPnC944AiqIoivrbEEJqbPkbCDwYaNeuHUJDyx/C4uvrCyWl8h98CQ0NRevW//b5YoqiKKpuoLcWVsHHx6fae1qVlZUxevToGmkURVEURf1Jf8sRfU0ReDAgL8//B2o+fvwIHx8fHD58GFFRUVi48Oe/4EdRFEVR1P+PX3581a1bt2BlZQVVVVW4urpiyJAhCAsLq8m2URRFUdQfQX+oqBpv376Ft7c3PD09kZ+fDwsLC3A4HJw5cwa6uv/uvdcURVFU3VLXThMIPDMwZMgQ6Orq4vnz53B1dcX79+/h6upam22jKIqiKOo3EHhm4MaNG5g3bx5mz54Nbe1/+wllFEVRVN32t9wFUFMEnhm4f/8+Pn/+jM6dO6Nbt27Yu3cv47kDFEVRFPWvIDX4399A4MFAjx49cOjQIaSlpWHmzJnw8/ODuro6SkpKcPPmTXz+/Lk220lRFEVRVC0R+m4CKSkpTJkyBcHBwYiOjsbixYuxefNmKCkpYdiwYbXRRoqiKIr6reraQ4d++dZCANDR0cHWrVvx9u1bnDhxoqbaRFEURVF/FH0c8S8QFRWFubk5Lly4UBPVURRFURT1Gwn1nAGKoiiKqgv+lgv/agodDFAURVFUJX/L9H5NoYMBiqIoiqqkrg0GauSaAYqiKIqi/l50ZoCiKIqiKqlb8wIASB1UWFhIHB0dSWFh4Z9uSq3513P81/MjhOb4L/jX8yOkbuRYF7AIqWMnRgB8+vQJsrKy+PjxI2RkZP50c2rFv57jv54fQHP8F/zr+QF1I8e6gF4zQFEURVF1HB0MUBRFUVQdRwcDFEVRFFXH1cnBgISEBBwdHSEhIfGnm1Jr/vUc//X8AJrjv+Bfzw+oGznWBXXyAkKKoiiKosrVyZkBiqIoiqLK0cEARVEURdVxdDBAURRFUXUcHQxQFEVRVB1HBwMURVEUVcf9dYMBQ0NDLFiwgKf83LlzYLFYAICAgAAMHDgQjRo1goyMDHr06IHr16/zvCY3NxcLFixAkyZNIC4uDlVVVUyePBmpqamMuMzMTMycORONGzeGhIQEVFRUYGxsjEePHtVKjjY2NjA3N+cpv3PnDlgsFj58+MAo19HRgbi4ON69e8fzmoCAABgbG0NRUREsFguRkZE8MU2aNAGLxeJZNm/eXEMZMZmZmWHAgAF81z169AgsFgtPnz4FAJw5cwaGhoaQlZVF/fr10a5dOzg5OSE3N5fxuoKCAsjJyUFeXh4FBQU89VbMUUpKCm3atMGBAwdqPjk+Hj58CFFRUZiYmPCs4/e+u7u7863n5cuXaNCgARo2bMizbt++fWjdujUkJSWho6ODo0eP1nQaPGxsbMBisTBr1iyedba2tmCxWLCxseGWpaenY+7cuWjWrBkkJCSgqakJMzMzBAUFcWP47YsaGhoAgOTkZL7vF4vFgr+/f63nW6YsbxaLBTabjWbNmmHJkiXIz8/nxvxsvw0ODoaBgQEUFBQgKSmJVq1aYefOnb8tB36q2k9zcnJgYmICNTU1br/NmTMHnz594sasXbuWb79IS0tzY8o+vyov8fHxvy1Hqmp/3WBAEPfu3cPAgQNx5coVhIeHw8jICGZmZoiIiODG5Obmonv37ggMDISbmxtevnyJkydPIikpCV26dMGrV6+4saNGjUJUVBSOHDmChIQEXLhwAYaGhjxfSH9CcHAwCgsLMWbMGHh7e/Osz8/Ph4GBwU+/2J2cnJCWlsZY5s6dWyttnjp1Km
<base64-encoded PNG data truncated>\",\n      \"text/plain\": [\n       \"<Figure size 640x480 with 2 Axes>\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    }\n   ],\n   \"source\": [\n    \"import seaborn as sns\\n\",\n    \"sns.heatmap(corr_values, annot=True, linewidth=.5, fmt=\\\".4f\\\")\"\n   ]\n  }\n ],\n \"metadata\": {\n  \"interpreter\": {\n   \"hash\": \"71b811b81a5db06f6ef6ca65d1fa78aed36849f5dc3738ddbafefdb8a0f00ae2\"\n  },\n  \"kernelspec\": {\n   \"display_name\": \"Python 3.9.7 64-bit ('L1000': conda)\",\n   \"name\": \"python3\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.8.16\"\n  },\n  \"orig_nbformat\": 4\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 2\n}\n"
  },
  {
    "path": "graphium/data/QM9/micro_qm9.csv",
    "content": "mol_id,smiles,A,B,C,mu,alpha,homo,lumo,gap,r2,zpve,u0,u298,h298,g298,cv,u0_atom,u298_atom,h298_atom,g298_atom\r\ngdb_1,C,157.7118,157.70997,157.70699,0,13.21,-0.3877,0.1171,0.5048,35.3641,0.044749,-40.47893,-40.476062,-40.475117,-40.498597,6.469,-395.999594594,-398.643290011,-401.014646522,-372.471772148\r\ngdb_2,N,293.60975,293.54111,191.39397,1.6256,9.46,-0.257,0.0829,0.3399,26.1563,0.034358,-56.525887,-56.523026,-56.522082,-56.544961,6.316,-276.861363363,-278.62027109,-280.399259105,-259.338802047\r\ngdb_3,O,799.58812,437.90386,282.94545,1.8511,6.31,-0.2928,0.0687,0.3615,19.0002,0.021375,-76.404702,-76.401867,-76.400922,-76.422349,6.002,-213.087623693,-213.97429391,-215.159658411,-201.407171167\r\ngdb_4,C#C,0,35.6100361,35.6100361,0,16.28,-0.2845,0.0506,0.3351,59.5248,0.026841,-77.308427,-77.305527,-77.304583,-77.327429,8.574,-385.501996533,-387.237686427,-389.016046933,-365.800723969\r\ngdb_5,C#N,0,44.593883,44.593883,2.8937,12.99,-0.3604,0.0191,0.3796,48.7476,0.016601,-93.411888,-93.40937,-93.408425,-93.431246,6.278,-301.820533838,-302.906751917,-304.091488909,-288.720028445\r\ngdb_6,C=O,285.48839,38.9823,34.29892,2.1089,14.18,-0.267,-0.0406,0.2263,59.9891,0.026603,-114.483613,-114.480746,-114.479802,-114.505268,6.413,-358.756935444,-360.512705626,-362.291066132,-340.464420585\r\ngdb_59880,CC(C)NC(=O)C1CC1,3.86906,0.78331,0.75776,3.2728,85.74,-0.2412,0.0361,0.2772,1652.3265,0.193066,-404.384586,-404.374176,-404.373232,-404.423116,37.645,-2074.038354237,-2087.058538478,-2099.506434511,-1928.596828253\r\ngdb_27544,NC1=CC(=CO1)C1CN1,5.28086,0.93524,0.83626,2.643,76.28,-0.1898,0.033,0.2228,1354.0962,0.137619,-418.000163,-417.991915,-417.990971,-418.033721,30.896,-1684.890530414,-1694.823997884,-1704.306913892,-1569.448327193\r\ngdb_28971,Cc1c(cncn1)C#C,2.96276,1.54033,1.0198,2.2352,83.5,-0.2504,-0.0518,0.1986,1102.8906,0.113001,-379.702572,-379.694552,-379.693608,-379.735453,29.445,-1635.167972272,-1643.468033815,-1651.764957813,-1532.927930902\r\ng
db_91391,CC1CC2CC(C1)C2=O,2.74817,1.31124,1.30921,2.8552,81.78,-0.2302,-0.0057,0.2245,1129.9283,0.184568,-387.115605,-387.107588,-387.106644,-387.147873,32.117,-2054.1971471661,-2067.8310352091,-2079.685935237,-1911.8197476291\r\ngdb_132952,FC1=CNC=C(F)C1=N,2.39575,1.72903,1.00425,6.3056,67.36,-0.2002,-0.017,0.1831,1040.618,0.089505,-502.014824,-502.007532,-502.006588,-502.046592,27.8,-1364.63754721,-1371.6160747991,-1378.727006787,-1274.732450253\r\ngdb_9122,CCC#CC(=O)CC,5.19635,0.78533,0.69386,3.5498,79.98,-0.2517,-0.0422,0.2095,1650.9158,0.150747,-347.807845,-347.797844,-347.796899,-347.846483,33.437,-1765.265648188,-1774.9882726341,-1785.064184647,-1649.383561158\r\ngdb_93752,CC1CCC(C)=C1C=O,1.95103,1.61352,0.99057,3.6944,88.48,-0.2411,-0.0492,0.1919,1262.5264,0.181787,-387.136409,-387.12688,-387.125936,-387.170663,35.18,-2067.2518444021,-2079.9369388371,-2091.791838865,-1926.1206777391\r\ngdb_60471,CC(C)CCCCC#N,5.50394,0.4672,0.44521,4.2926,91.43,-0.3098,0.0382,0.348,2458.7796,0.215743,-368.449021,-368.437715,-368.436771,-368.486806,39.954,-2250.8051295011,-2265.0407986751,-2278.674686718,-2091.7811712121\r\ngdb_28657,c1c(cnc(n1)N)C#N,5.91709,1.04575,0.88869,5.2599,77.88,-0.2463,-0.0534,0.1929,1194.9277,0.092071,-411.873714,-411.86609,-411.865146,-411.906262,27.382,-1445.352774862,-1452.122969463,-1459.233901451,-1355.658520929\r\ngdb_97460,CN=COCCC(C)O,6.31598,0.48417,0.46188,2.0845,85.06,-0.2441,0.0288,0.2729,2407.6505,0.190768,-441.48138,-441.469981,-441.469037,-441.519737,39.23,-1998.10160762,-2010.500557951,-2022.948453984,-1852.295498907\r\ngdb_76059,OC1C(C#C)C11CCO1,2.72903,1.29031,1.12389,1.4157,75.66,-0.2243,0.0344,0.2586,1140.7191,0.133254,-421.74749,-421.738435,-421.737491,-421.781514,33.177,-1687.2110586961,-1696.638753912,-1706.12166992,-1572.4214648351\r\ngdb_36314,C#CCC1C=CC2CC12,4.12238,1.05184,0.91711,0.8533,86.1,-0.2297,0.017,0.2467,1289.7226,0.159658,-348.716644,-348.708509,-348.707564,-348.749431,32.115,-1940.864001694,-1952.6473656961,-1963
.315646205,-1812.4392647721\r\ngdb_35821,N#CC1NCC2CNC12,2.72508,1.56563,1.18711,5.8541,75.03,-0.236,0.0283,0.2643,1059.9401,0.151322,-398.10976,-398.102111,-398.101167,-398.142343,29.347,-1741.392695792,-1752.590593897,-1762.66650591,-1618.601107163\r\ngdb_17912,OC1C2C=CC1C=C2,3.37648,2.33914,2.21279,1.4585,67.83,-0.2322,-0.0085,0.2237,718.4626,0.133054,-346.59144,-346.585586,-346.584642,-346.620841,25.253,-1629.812182957,-1640.359981738,-1649.25052925,-1521.504757066\r\ngdb_96963,CN=C1OC2CC1(C)C2,2.69728,1.53734,1.16856,2.3155,82.24,-0.2415,0.0371,0.2786,1133.2806,0.171174,-403.147326,-403.1387,-403.137756,-403.180802,32.725,-1925.498188811,-1937.86074362,-1949.122647643,-1790.256194113\r\ngdb_4587,c1c(ncc(n1)O)N,5.94363,1.58792,1.25456,2.0576,65.37,-0.2064,-0.0392,0.1672,891.1098,0.098125,-394.841434,-394.834926,-394.833982,-394.87181,25.337,-1340.873153871,-1348.343021007,-1355.454580504,-1252.309671156\r\ngdb_49948,O=CC1NC1(C#C)C#C,2.27876,1.27817,0.9844,2.6863,77.28,-0.2503,-0.0452,0.2051,1200.0487,0.096821,-399.449692,-399.440678,-399.439734,-399.483632,32.737,-1488.7544348471,-1495.5421997001,-1503.246127693,-1393.916489641\r\ngdb_23626,FC1=NC(=N)C=CO1,4.0159,1.99111,1.33113,2.8942,56.13,-0.252,-0.0388,0.2132,809.2827,0.073504,-438.713803,-438.707889,-438.706945,-438.744113,21.959,-1176.494661285,-1182.560163279,-1188.485730766,-1101.400031746\r\ngdb_68573,CC12CC(N=CN1)C2=O,2.40891,1.93423,1.51173,4.3138,72.94,-0.2144,-0.0183,0.1961,936.9824,0.136958,-418.014329,-418.00664,-418.005696,-418.046069,30.086,-1693.779822908,-1704.0640679091,-1713.546983917,-1577.196808325\r\ngdb_7033,N=COC(=O)CC=O,6.45782,0.91543,0.839,2.2773,59.12,-0.2684,-0.0444,0.224,1277.4864,0.0926,-435.771957,-435.763578,-435.762633,-435.806664,27.308,-1321.474968154,-1327.770765951,-1334.881697939,-1235.733393412\r\ngdb_87234,CC1CC1(O)CCC#N,4.54421,0.70983,0.68737,4.2845,80.8,-0.2705,0.035,0.3056,1674.7028,0.168914,-403.155621,-403.145469,-403.144524,-403.191085,36.741,-1930.7033759661,-194
2.108352041,-1953.369628555,-1796.70886916\r\ngdb_30225,NC1CN2C1=NC=C2O,3.66556,1.29956,1.01069,1.6998,72.51,-0.1958,0.0384,0.2342,1136.336,0.126495,-434.050629,-434.042595,-434.041651,-434.083474,29.842,-1567.954228264,-1577.132802407,-1586.02272241,-1458.441985093\r\ngdb_84384,OC1C=CC(O)C11CO1,2.13085,1.83475,1.19047,2.9887,69.12,-0.2449,-0.01,0.2349,1045.746,0.135413,-458.936463,-458.928249,-458.927305,-458.969303,31.671,-1669.1174641901,-1679.0722669661,-1688.555182974,-1553.328876001\r\ngdb_124627,CC(C)Nc1cnn[nH]1,4.32712,0.90594,0.82857,5.8615,78.77,-0.2092,0.0187,0.2279,1427.4604,0.15996,-415.385864,-415.376778,-415.375834,-415.420085,33.633,-1765.7036494701,-1776.8877423771,-1787.556650395,-1636.946332778\r\ngdb_124050,C1C=Cc2c(non2)O1,3.77018,1.68205,1.17394,5.4758,67.38,-0.2487,-0.0589,0.1899,947.414,0.091276,-452.706242,-452.699887,-452.698943,-452.737529,24.052,-1364.46184469,-1372.028348212,-1379.1392802,-1274.080468402\r\ngdb_85744,CC1CC(C#N)N2CC12,3.28109,1.2216,1.04518,4.7083,78.93,-0.2522,0.0246,0.2768,1189.1483,0.161035,-382.071548,-382.063375,-382.062431,-382.104262,31.455,-1866.018493228,-1877.7767568701,-1888.445664888,-1736.9317343111\r\ngdb_50505,O=CN1CCC(C1)C#C,5.20789,0.86912,0.76917,3.557,80.33,-0.2427,0.0243,0.267,1429.4829,0.14806,-401.952663,-401.943882,-401.942938,-401.986881,32.006,-1803.688024258,-1814.176209684,-1824.252121697,-1682.28260301\r\ngdb_124867,CC1(CC1)n2cncn2,3.60968,1.23821,1.18517,3.2935,76.45,-0.2593,0.0097,0.2689,1117.1541,0.14947,-398.143638,-398.135692,-398.134748,-398.176633,29.943,-1762.651445694,-1773.6629736261,-1783.738885639,-1640.118390773\r\ngdb_37557,C1OC23C4NC2C4C3=C1,4.24136,1.6428,1.37872,2.5821,75.81,-0.2272,0.0256,0.2529,921.2834,0.125888,-400.610406,-400.604169,-400.603225,-400.64081,25.869,-1589.261296359,-1599.568759193,-1608.458679196,-1478.8335175571\r\ngdb_64599,OC1(C=O)C=CC2CC12,3.24899,1.34721,1.25372,2.6028,73.75,-0.2393,-0.044,0.1953,1048.0888,0.135686,-421.826306,-421.818512,-421.817568,-421
.85861,30.427,-1736.66880804,-1746.887792105,-1756.370708113,-1620.799898699\r\ngdb_116453,CCC1OCCCC1C,2.42684,1.26955,0.9793,1.1311,88.73,-0.2412,0.0809,0.3221,1359.0582,0.230074,-389.516508,-389.506829,-389.505885,-389.550576,37.521,-2305.082147965,-2321.227327026,-2335.454211074,-2137.092341084\r\ngdb_4805,c1c(coc1CO)N,4.98871,1.31203,1.06225,2.5467,64.71,-0.1939,0.0213,0.2152,1062.6667,0.119542,-399.811999,-399.804188,-399.803244,-399.844429,28.678,-1482.934916381,-1491.364872287,-1499.662423794,-1382.649565546\r\ngdb_100497,OCC(O)C#CCC=O,5.12315,0.50874,0.48786,2.4952,75.84,-0.2589,-0.0376,0.2213,2157.6553,0.132362,-458.930171,-458.919755,-458.918811,-458.968233,35.288,-1665.1691775621,-1673.74220552,-1683.225121528,-1552.657441371\r\ngdb_83944,OC1C2OCOC1C2=O,2.47908,1.79279,1.55841,0.9948,60.41,-0.2443,-0.0555,0.1888,897.3895,0.112558,-494.868701,-494.86121,-494.860266,-494.900925,28.157,-1490.2629664831,-1498.893725269,-1507.190649267,-1387.20277085\r\ngdb_69742,OC12CC1C1C=CCC21,3.26787,1.52986,1.31998,1.1816,78.96,-0.2375,0.025,0.2626,1013.5605,0.16068,-385.86457,-385.857172,-385.856228,-385.895971,30.45,-1897.0130452651,-1909.258255891,-1919.927163909,-1767.462557197\r\ngdb_9960,CCC1(C)C(C)C1O,2.44288,1.61764,1.43649,1.2446,78.27,-0.2373,0.0613,0.2986,1068.1393,0.197945,-350.19774,-350.187676,-350.186731,-350.23196,37.091,-2009.2430299151,-2022.4803322701,-2034.928228303,-1863.8466845791\r\ngdb_64972,CC1(CO1)C(=O)CC#N,3.54125,1.01628,0.85808,5.2593,69.97,-0.2732,-0.0488,0.2244,1315.391,0.121179,-437.894827,-437.885524,-437.884579,-437.929892,32.632,-1631.0621808851,-1639.445073616,-1648.33436611,-1523.30319786\r\ngdb_18512,O=COC1CN2CC12,6.35348,1.15016,1.08826,2.4332,62.79,-0.2573,0.0024,0.2597,1063.8294,0.118788,-399.766061,-399.758911,-399.757966,-399.798443,25.619,-1454.108407939,-1462.953147294,-1471.250071292,-1353.7929366721\r\ngdb_98343,CN(CC#C)C(=O)OC,3.53632,0.91934,0.81704,2.1775,76.06,-0.2452,0.0369,0.282,1441.4073,0.144704,-439.082994,-439.07236
6,-439.071422,-439.120183,35.878,-1748.796046974,-1758.124595768,-1768.200507781,-1628.998931293\r\ngdb_105186,CC1NC1(C[NH3+])C([O-])=O,2.10872,1.49768,1.01996,4.9108,72.46,-0.2522,0.0132,0.2655,1211.3936,0.159973,-456.360243,-456.350973,-456.350029,-456.394509,34.244,-1773.8254984571,-1784.8941297081,-1795.563037726,-1645.200586164\r\ngdb_66434,CC12C3C1C1(C)N3C21C,2.59558,1.69727,1.33457,1.905,82.98,-0.2009,0.0787,0.2796,1067.9824,0.168686,-365.914228,-365.905372,-365.904428,-365.946976,34.198,-1915.902948692,-1928.1218039401,-1939.383707963,-1780.4601511141\r\ngdb_51810,O=COC1OCC2NC12,3.48271,1.22675,1.1175,6.4004,65.36,-0.2544,0.0183,0.2726,1100.7958,0.125129,-475.014866,-475.007193,-475.006249,-475.047729,28.095,-1569.7118809731,-1579.116985865,-1588.006905868,-1460.3150994581\r\ngdb_116796,CC[NH+]=CNCC([O-])=O,6.65798,0.51721,0.49846,7.7485,80.5,-0.2308,0.0101,0.2409,2184.647,0.158114,-456.38812,-456.377916,-456.376972,-456.42564,35.134,-1791.3185668501,-1801.8011046951,-1812.4700127131,-1664.735568843\r\ngdb_70705,CC12OC1CCOC2=O,2.31163,1.92791,1.24821,4.6713,68.12,-0.2648,-0.0112,0.2536,1016.3032,0.135978,-458.987513,-458.979515,-458.97857,-459.02007,30.13,-1701.1517986401,-1711.24214336,-1720.724431859,-1585.185625404\r\ngdb_98751,OCC(=O)C1CN1C=O,3.8091,0.94758,0.82744,3.835,68.78,-0.2628,-0.0556,0.2073,1350.9923,0.121923,-475.021043,-475.012007,-475.011063,-475.055791,31.564,-1573.5880040661,-1582.137814191,-1591.027734194,-1465.374077016\r\ngdb_35951,N#CC1CCCOC=N1,3.38355,1.18371,0.91899,5.4587,74.5,-0.2686,-0.0063,0.2623,1222.3081,0.137757,-418.023603,-418.01554,-418.014596,-418.05691,29.714,-1699.599341374,-1709.648898009,-1719.131814017,-1583.9996333941\r\ngdb_129544,NC(C#C)C1=CN=NO1,3.32604,1.14385,1.04545,2.6072,69.9,-0.2653,-0.0468,0.2185,1149.0404,0.099948,-432.79241,-432.784112,-432.783168,-432.82605,29.909,-1406.262101707,-1413.497907986,-1421.201835979,-1310.6196899631\r\ngdb_44914,O=C1OC2CC1N=CO2,2.97466,1.86065,1.63773,4.0415,60.29,-0.2703,-0.0
176,0.2527,838.1414,0.104331,-473.874618,-473.86855,-473.867606,-473.90507,24.1,-1482.0476186551,-1490.6827700041,-1498.386697997,-1384.5088747131\r\ngdb_52088,N=COCC12CN(C1)C2,5.03678,0.80214,0.77672,3.8018,79.04,-0.2347,0.0167,0.2515,1502.8487,0.15981,-419.115977,-419.107385,-419.106441,-419.151872,30.897,-1757.222237826,-1768.716947688,-1779.385855706,-1629.875561366\r\ngdb_92380,CC1CC2COC12C#C,2.2107,1.624,1.28187,1.7906,82.1,-0.2452,0.0202,0.2654,1093.0658,0.158054,-385.826116,-385.817508,-385.816564,-385.859172,33.187,-1872.8828141791,-1884.368738915,-1895.037646933,-1744.3708535061\r\ngdb_37290,C1C2C1C1C3OCC3C21,3.10127,1.75307,1.63983,1.7171,76.99,-0.2264,0.0629,0.2893,914.5177,0.161808,-385.791391,-385.784906,-385.783962,-385.821888,27.716,-1851.092564154,-1863.910690497,-1874.579598515,-1720.9748079501\r\ngdb_87491,CC1NC11C2NC1C2O,2.92458,1.41799,1.15136,1.2002,76.45,-0.2401,0.064,0.3041,1118.5337,0.160062,-419.100239,-419.092204,-419.09126,-419.132532,31.922,-1747.346501184,-1759.190733559,-1769.859641577,-1617.7395373061\r\ngdb_56620,CC(C#C)N(C=O)C=O,2.4677,1.11576,1.0037,5.7118,73.47,-0.2532,-0.0389,0.2143,1251.169,0.120625,-437.883121,-437.873407,-437.872463,-437.918255,33.842,-1623.7165605311,-1631.841547063,-1640.731467066,-1516.000875627\r\ngdb_87093,OC1CC1(O)C#CC=O,4.52216,0.68054,0.65594,3.3792,78.34,-0.258,-0.064,0.194,1661.8311,0.108867,-457.710228,-457.700752,-457.699808,-457.745299,33.369,-1527.495585489,-1534.8813664191,-1543.1782904171,-1426.477931651\r\ngdb_27428,Cc1cccnc(=O)n1,2.58576,1.71652,1.18889,5.9042,75.19,-0.2499,-0.089,0.1609,1032.0877,0.11327,-416.850159,-416.84253,-416.841586,-416.882319,28.35,-1591.104290292,-1599.649080345,-1607.946004343,-1488.155791261\r\ngdb_87830,CC1NC11CC2OC12C,2.81683,1.34965,1.15561,1.3598,79.51,-0.2379,0.0691,0.307,1165.5208,0.169336,-403.091974,-403.083043,-403.082099,-403.125249,34.178,-1890.764310643,-1902.935475207,-1914.19737923,-1755.396186636\r\ngdb_103777,CCC1(O)CC11COC1,3.22305,1.06963,0.96954
,3.0704,79.07,-0.2383,0.0621,0.3004,1319.5562,0.180582,-424.186739,-424.177105,-424.176161,-424.221219,35.86,-1962.1585196091,-1974.77709809,-1986.631998118,-1820.913146308\r\ngdb_92101,OC1CN2CC2CC=C1,2.33483,1.76985,1.28894,2.5675,80.28,-0.2129,0.0068,0.2197,1050.8649,0.172661,-403.112969,-403.104798,-403.103854,-403.145285,32.308,-1903.938862098,-1916.5869335021,-1927.848837525,-1767.96895696\r\ngdb_111119,COC1C2CC1C1CC21,3.45832,1.29916,1.20127,0.9664,81.88,-0.2402,0.0946,0.3348,1137.2726,0.184552,-387.033886,-387.026206,-387.025262,-387.065939,31.198,-2002.917739195,-2016.7630977711,-2028.617997799,-1860.405425223\r\ngdb_89847,OC1CC2(CC2)CC1O,2.84485,1.29228,1.13438,2.5285,77.54,-0.247,0.068,0.3151,1170.1392,0.183464,-424.222449,-424.213851,-424.212907,-424.25586,34.302,-1984.5668659991,-1997.835543804,-2009.690443832,-1842.6506855771\r\ngdb_76184,CC1(C)C(O)C1(O)C=O,2.0282,1.51212,1.2333,2.5574,73.98,-0.2397,-0.0239,0.2158,1128.3666,0.154998,-460.156729,-460.14653,-460.145585,-460.191082,37.661,-1806.9937416701,-1817.480044569,-1828.148325078,-1678.783612826\r\ngdb_72461,CC12CCC(C1)N=CO2,2.58943,1.88762,1.50055,2.2904,77.6,-0.242,0.0279,0.2699,970.0442,0.174005,-403.18798,-403.180571,-403.179627,-403.21932,30.464,-1951.008939697,-1964.1351729591,-1975.397076982,-1814.4265857751\r\ngdb_90456,CC1CC2C(O1)C2(C)O,3.52888,1.13161,1.11211,0.5111,78.12,-0.222,0.0742,0.2962,1220.2489,0.181294,-424.218499,-424.209313,-424.208369,-424.252066,35.511,-1982.0882054491,-1994.9879079621,-2006.84280799,-1840.2699164311\r\ngdb_28131,c1c2c(nco2)oc1N,5.4084,1.33103,1.06941,3.1558,68.6,-0.186,0.0002,0.1862,1025.1975,0.091788,-452.766163,-452.759413,-452.758469,-452.797251,25.847,-1402.0628114791,-1409.381448946,-1416.492380934,-1311.5565609\r\ngdb_90543,CC1CC2C(CC12)C=O,4.14069,0.94156,0.87902,3.2146,83.55,-0.2413,-0.0189,0.2224,1386.7093,0.182075,-387.076405,-387.06753,-387.066586,-387.111152,33.246,-2029.5987943661,-2042.6942796871,-2054.549179715,-1888.7769896401\r\ngdb_718,C1C2C
3OC1C23,6.29327,5.52309,4.39133,1.7566,48.29,-0.2298,0.084,0.3137,384.9781,0.097855,-269.154939,-269.150758,-269.149814,-269.182103,16.78,-1163.942598758,-1171.984754102,-1178.504572612,-1085.856633816\r\ngdb_72886,CC12COC1C1(CC1)C2,2.92312,1.42696,1.17256,1.7644,82.93,-0.236,0.0676,0.3036,1145.1422,0.182213,-387.046174,-387.037725,-387.036781,-387.07921,33.137,-2010.6285697871,-2023.9913739421,-2035.84627397,-1868.733097162\r\ngdb_101748,COC(C)CC1COC1,3.18411,0.80205,0.73405,2.3327,82.35,-0.2397,0.0754,0.3151,1657.6355,0.203934,-425.396699,-425.386167,-425.385223,-425.433793,36.97,-2093.5676893351,-2107.3998702221,-2120.44076226,-1940.591662788\r\ngdb_53762,CC(=O)C#CCNC=O,5.09979,0.53924,0.51473,2.4071,78.91,-0.2611,-0.0563,0.2048,2073.9322,0.121413,-437.897247,-437.887001,-437.886057,-437.935498,33.63,-1632.5807526651,-1640.3719044091,-1649.261824412,-1526.821013314\r\ngdb_29091,CC1=COC(=O)C(=N)N1,3.12679,1.44161,0.99281,5.0347,73.09,-0.2159,-0.0464,0.1695,1130.6695,0.113512,-453.999459,-453.991486,-453.990542,-454.032059,29.813,-1548.1155312291,-1556.4438306771,-1564.7407546751,-1445.1871124861\r\ngdb_98443,CC1(COC1)C(=O)NC,2.85992,1.0714,1.05305,3.7545,75.85,-0.2486,0.0255,0.2741,1287.5079,0.169572,-440.295315,-440.285133,-440.284189,-440.331832,35.395,-1881.6867654491,-1893.072288745,-1904.334192768,-1748.097001948\r\ngdb_120959,CCCC1CC2CC2O1,4.38344,0.82525,0.77002,1.3113,86.03,-0.2332,0.0878,0.3211,1601.7427,0.206231,-388.289071,-388.279897,-388.278953,-388.323612,34.909,-2162.706003446,-2177.3909690641,-2190.431861102,-2008.3839700941\r\ngdb_105366,CCC1(CC1)CC1OC1,2.55308,1.06067,0.87413,1.6887,85.74,-0.2555,0.0822,0.3377,1445.7978,0.204646,-388.259314,-388.249503,-388.248558,-388.294672,36.751,-2144.0332181331,-2158.3184605181,-2171.358725047,-1990.2238596341\r\ngdb_121013,C[NH2+]C[C-]1NC=CC1=O,3.494,0.73047,0.62378,4.1711,84.98,-0.2123,-0.0622,0.1501,1808.2107,0.156791,-419.222005,-419.211007,-419.210063,-419.260304,37.125,-1823.7557620781,-1833.7406852861
,-1844.409593304,-1697.9176172541\r\ngdb_90931,CC1CC2C3CC1C23O,2.86912,1.39676,1.3712,1.3566,80.87,-0.2355,0.0744,0.31,1084.103,0.183874,-387.071901,-387.063874,-387.06293,-387.103802,32.876,-2026.7724938301,-2040.4001067831,-2052.255006811,-1884.16479849\r\ngdb_82254,OC1C2OC2C2OCC12,2.59184,1.94934,1.44445,2.2297,65.62,-0.2489,0.0541,0.3029,919.7488,0.137673,-458.919553,-458.912541,-458.911597,-458.951061,27.78,-1658.5062870001,-1669.215355594,-1678.698271602,-1541.881856823\r\ngdb_81510,CN1C2CC1C2(C)C#C,3.12636,1.41499,1.10788,0.8091,87.19,-0.2063,0.0401,0.2463,1144.6526,0.168515,-365.885794,-365.876606,-365.875662,-365.91974,35.418,-1898.060357786,-1910.070880046,-1921.332784069,-1763.36931599\r\ngdb_81236,CC1C2CC11CCOC21,2.79156,1.55591,1.41988,1.5071,82.41,-0.246,0.09,0.3359,1047.4307,0.184402,-387.023081,-387.015381,-387.014437,-387.055078,31.162,-1996.1375044501,-2009.9703128461,-2021.825212874,-1853.5900499741\r\ngdb_95019,CC1COC2C3OC3C12,2.52399,1.91729,1.44653,2.1973,72.96,-0.2328,0.0684,0.3012,966.2672,0.160613,-422.987636,-422.980222,-422.979278,-423.019329,29.541,-1837.5622150961,-1849.7967580691,-1860.465666087,-1707.9389359841\r\ngdb_84879,CC1C=CCOC1(C)C,2.0541,1.75388,1.32064,1.068,86.59,-0.2393,0.0257,0.265,1126.394,0.20535,-388.308166,-388.298883,-388.297939,-388.341111,37.161,-2174.6882878011,-2189.3048549381,-2202.345746976,-2019.3647500851\r\ngdb_109626,CNC1C(N)C1(O)C#N,2.68693,1.12824,1.00083,6.4844,74.52,-0.2263,0.0168,0.2431,1261.4888,0.146942,-435.234405,-435.224693,-435.223749,-435.268799,35.485,-1682.932702334,-1692.835421863,-1702.911333876,-1561.021508832\r\ngdb_47980,O=CC#CC12CN1CC2,5.35087,0.72039,0.67868,3.8421,86.72,-0.2504,-0.0648,0.1856,1618.9931,0.123467,-400.677866,-400.669433,-400.668489,-400.712071,30.015,-1631.5930534991,-1640.5225065691,-1649.412426572,-1523.550436406\r\ngdb_81121,OC1C2OC1(C=C2)C#C,2.67463,1.7507,1.2915,1.2871,72.43,-0.2547,-0.0105,0.2442,970.9793,0.111392,-420.537809,-420.530428,-420.529484,-420.569321,29.77
2,-1555.976963981,-1564.678003775,-1572.974927773,-1452.982029284\r\ngdb_94218,CC1CCC1(O)CC#N,3.15153,1.07548,0.9756,3.2816,79.38,-0.2743,0.0237,0.298,1289.8473,0.169636,-403.1622,-403.152641,-403.151697,-403.1966,35.683,-1934.8317576771,-1946.6088465891,-1957.870750612,-1800.169581295\r\ngdb_101776,CCC(C)CCC(C)=O,3.9448,0.60074,0.54967,2.5068,90.4,-0.2412,-0.0067,0.2344,2119.3586,0.225281,-389.529679,-389.517745,-389.516801,-389.568761,41.78,-2313.3470690041,-2328.0772152701,-2342.304099318,-2148.5035922491\r\ngdb_68507,CC12CC(C3NC13)C2=O,2.55076,1.85017,1.44232,2.2498,75.03,-0.2347,0.0019,0.2366,961.858,0.148838,-401.917098,-401.909673,-401.908729,-401.948667,29.854,-1781.370666673,-1792.7097543031,-1802.785666316,-1658.3029740841\r\ngdb_83218,OC1C2CNC(C#N)C12,2.7507,1.48204,1.27075,5.4001,71.8,-0.2447,0.0185,0.2632,1030.1395,0.138505,-417.997573,-417.989996,-417.989052,-418.029679,29.508,-1683.265282104,-1693.619808113,-1703.102724121,-1566.911935815\r\ngdb_42455,C#CCC(=O)NCC#C,2.32129,1.00906,0.74106,2.9544,77.71,-0.2499,0.0205,0.2703,1480.8307,0.12098,-400.709857,-400.699945,-400.699,-400.746351,34.932,-1651.6676939181,-1659.6690611771,-1668.558353671,-1545.0614449261\r\ngdb_85874,CC1(C)CC(O)C1C#N,2.13602,1.52896,1.09888,3.7876,78.88,-0.2751,0.0233,0.2984,1178.002,0.169453,-403.162578,-403.152995,-403.152051,-403.196451,36.271,-1935.0689560791,-1946.830984775,-1958.092888798,-1800.0760824541\r\ngdb_28241,C1COc2c1oc(n2)N,4.82202,1.28689,1.03264,2.613,68.7,-0.182,0.0275,0.2095,1096.82,0.115438,-453.961206,-453.953965,-453.953021,-453.992908,27.489,-1524.111429452,-1532.899065488,-1541.195989486,-1420.619507627\r\ngdb_67191,OC12C3NC1C(=O)OC23,3.08887,1.56824,1.51142,2.9789,60.9,-0.2426,-0.008,0.2345,899.6364,0.102335,-473.795617,-473.788999,-473.788055,-473.826512,26.273,-1432.473780146,-1440.763801545,-1448.467729538,-1335.213022691\r\ngdb_6978,CC1OC=NC1(C)C,2.55483,2.25087,1.65871,1.6604,70.97,-0.2443,0.0266,0.2709,890.3706,0.164994,-365.093193,-365.084859,-365.
083914,-365.1257,31.565,-1795.377295062,-1807.033902246,-1817.702810264,-1668.654362548\r\ngdb_15212,OC1C(C#N)C1C#N,2.56328,2.03967,1.5641,5.4209,59.52,-0.2934,-0.0175,0.2759,828.939,0.083031,-377.521828,-377.514091,-377.513147,-377.554365,26.679,-1289.2643036751,-1295.075037015,-1301.593600507,-1209.050455714\r\ngdb_72392,OC12CC=CC3OC1C23,2.7342,1.9699,1.41647,0.9739,71.41,-0.218,0.0015,0.2195,924.3724,0.136899,-421.781635,-421.77465,-421.773706,-421.812582,28.849,-1708.6373535011,-1719.3639923471,-1728.846908355,-1591.9169144471\r\ngdb_91438,CN1CC2NC(C=O)C12,2.49045,1.45906,1.04702,3.6211,76.63,-0.2128,-0.0206,0.1921,1172.4079,0.158622,-419.151059,-419.142421,-419.141477,-419.184987,31.935,-1779.236508564,-1790.702353012,-1801.37126103,-1650.655521901\r\ngdb_128441,COC(=O)c1cnno1,4.8069,1.07064,0.88048,3.0572,63.23,-0.2946,-0.0819,0.2127,1222.1443,0.08827,-489.882335,-489.874383,-489.873439,-489.91586,27.064,-1338.2859342641,-1344.849678404,-1351.960610392,-1249.0528994461\r\ngdb_42384,C#CC(=N)NC(=O)C#C,3.3615,0.97151,0.75369,5.3915,77.35,-0.2605,-0.0621,0.1983,1391.8667,0.086447,-415.556772,-415.54762,-415.546676,-415.591713,32.157,-1407.343927223,-1413.155915581,-1420.266847569,-1319.511492493\r\ngdb_29108,c1c([nH]c(=O)c(=O)o1)N,3.10496,1.46463,0.998,7.2195,64.82,-0.222,-0.0544,0.1675,1080.7766,0.090065,-489.937352,-489.929675,-489.928731,-489.969667,28.656,-1372.809596917,-1379.545906032,-1386.6568380201,-1282.817276209\r\ngdb_86476,CN1CC(C1)(C#C)C#C,2.32285,1.33978,0.98018,1.0177,85.72,-0.2212,0.0286,0.2498,1254.2772,0.144482,-364.709427,-364.699987,-364.699043,-364.743387,35.45,-1787.731097897,-1797.8063824011,-1807.882294414,-1666.419802999\r\ngdb_74800,CC12CC1N=C(N)C2=O,2.72969,1.67488,1.17929,0.5565,75.18,-0.2252,-0.0542,0.171,1050.6937,0.136488,-418.038732,-418.03039,-418.029446,-418.071196,32.033,-1709.092925035,-1718.9674066591,-1728.4503226671,-1592.964226968\r\ngdb_41281,C1CC23OC4CC12NC34,2.87909,2.08282,1.84969,1.3957,74.4,-0.2216,0.0682,0.2898,841.6
817,0.150244,-401.853919,-401.847727,-401.846782,-401.884002,27.022,-1741.7252755621,-1753.8380817891,-1763.913366293,-1617.725104599\r\ngdb_117530,[NH3+]CCC(C#C)C([O-])=O,2.23583,1.272,0.85491,6.8342,74.53,-0.2457,0.0278,0.2734,1327.0266,0.146681,-439.081376,-439.071856,-439.070911,-439.116189,34.791,-1747.7807374121,-1757.8045661781,-1767.8798506821,-1626.4926603471\r\ngdb_13483,CC(C)C1CNC1=O,3.14685,1.66351,1.15705,3.4605,72.23,-0.2445,0.0362,0.2807,1054.8408,0.164996,-365.095793,-365.087037,-365.086093,-365.129074,31.96,-1797.0088184621,-1808.4006168481,-1819.070152375,-1670.7715779141\r\ngdb_33566,C#CC1(CCCC1)C#C,2.2347,1.48048,1.16905,0.8784,86.4,-0.2553,0.0333,0.2886,1151.8472,0.158165,-348.69946,-348.690273,-348.689329,-348.733409,34.968,-1930.080887038,-1941.2041115721,-1951.87301959,-1802.3853155741\r\ngdb_46763,N=C1OCC2N=COC12,2.90668,1.86118,1.30394,2.0269,65.83,-0.266,-0.0029,0.2631,932.1802,0.115926,-453.971804,-453.965062,-453.964118,-454.003641,25.339,-1530.761769834,-1539.862532861,-1548.159456859,-1427.354561724\r\ngdb_13043,O=C1OCC1CC#N,4.14125,1.44434,1.12716,5.8748,57.96,-0.2906,-0.0116,0.2791,978.8583,0.095833,-398.633109,-398.625842,-398.624898,-398.665563,25.068,-1371.0224512851,-1378.016666599,-1385.128226096,-1284.123122438\r\ngdb_116251,CNC1CCOC1C#N,2.12101,1.46346,1.01889,4.27,75.29,-0.2342,0.0189,0.2531,1221.5578,0.160179,-419.208715,-419.1995,-419.198556,-419.243484,32.819,-1815.416167468,-1826.519939223,-1837.188847241,-1687.362915874\r\ngdb_25595,N=COCC1=COC=N1,6.51377,0.76885,0.69065,1.8061,69.27,-0.2496,-0.0016,0.2479,1526.1934,0.113196,-453.963252,-453.955015,-453.954071,-453.998433,27.955,-1525.3953128661,-1533.5579499381,-1541.854873936,-1424.086494852\r\ngdb_107616,CC1CCC2OC12CO,2.51282,1.51097,1.11385,1.9152,77.37,-0.2552,0.072,0.3272,1159.2223,0.182906,-424.228041,-424.219124,-424.21818,-424.26155,34.298,-1988.0758963271,-2001.1443987611,-2012.9992987891,-1846.2212117871\r\ngdb_87798,CC1C2OC3(CC3O)C12,2.79268,1.4857,1.24005,2.
6466,75.52,-0.2342,0.0584,0.2926,1077.4926,0.157846,-422.956598,-422.948175,-422.947231,-422.98926,33.011,-1818.0855907541,-1829.686977146,-1840.355885164,-1689.070367863\r\ngdb_113545,CCC1CC11OCC1C,2.95123,1.00679,0.93234,1.6603,86.63,-0.2212,0.0823,0.3036,1436.4479,0.204285,-388.266622,-388.256681,-388.255737,-388.301834,36.916,-2148.619053905,-2162.8227201201,-2175.863612158,-1994.718079092\r\ngdb_46240,O=C1NCC11NC1C#N,2.84119,1.36198,1.07842,7.8846,68.45,-0.2623,-0.0083,0.254,1088.9978,0.10124,-432.856946,-432.848885,-432.847941,-432.890137,28.787,-1446.759022531,-1454.143548443,-1461.847476436,-1350.834859246\r\ngdb_127831,CN=C(c1ccon1)N,4.50524,1.0482,0.85558,0.8083,75.45,-0.2348,-0.0309,0.2039,1300.7825,0.124817,-434.068236,-434.059831,-434.058887,-434.101539,30.785,-1579.002779227,-1587.948547531,-1596.838467534,-1469.777935178\r\ngdb_73083,CC12CCC1C=CCC2,2.34164,1.67431,1.30743,0.1844,90.26,-0.2325,0.0295,0.2621,1111.6301,0.208018,-351.171062,-351.162788,-351.161844,-351.203278,33.933,-2225.3301466281,-2240.5804978551,-2253.621389893,-2069.8051785231\r\ngdb_74399,CC1=CC2(C)CC1(C)C2,2.66764,1.47677,1.1346,0.2964,91.77,-0.2095,0.026,0.2355,1206.3621,0.204639,-351.146112,-351.136817,-351.135873,-351.179133,37.028,-2209.6737970781,-2224.2834616161,-2237.324353654,-2054.653973718\r\ngdb_42297,C1CN(C1=O)C(=O)C#N,2.69513,1.4915,0.972,5.7582,66.84,-0.2908,-0.0859,0.2049,1112.3815,0.088889,-452.793086,-452.784992,-452.784048,-452.826628,27.841,-1418.957236286,-1425.432501657,-1432.543433645,-1329.9908927931\r\ngdb_85816,CC1CC(C)(C#C)C1O,1.98765,1.47812,1.33012,1.3738,84.0,-0.2483,0.0408,0.2892,1153.4108,0.179894,-387.054751,-387.044731,-387.043787,-387.088854,38.324,-2016.0107144801,-2028.3877019961,-2040.242602024,-1874.7847939581\r\ngdb_56201,NC(=O)CCC1CCO1,4.6309,0.69028,0.62892,2.0107,76.28,-0.2432,0.0356,0.2788,1762.4288,0.17001,-440.305445,-440.295318,-440.294374,-440.342987,34.849,-1888.0434316191,-1899.46346791,-1910.725371933,-1755.096864843\r\ngdb_66203,NC
1(COC=NC1)C#C,2.33865,1.54789,1.40594,2.2064,73.88,-0.2499,0.0226,0.2725,1021.277,0.136596,-417.97225,-417.963754,-417.962809,-418.005031,32.285,-1667.374871697,-1677.152716935,-1686.635005434,-1551.445093983\r\ngdb_86735,CC1CN(C=N1)C(N)=O,3.88827,1.07387,0.88251,1.9061,75.94,-0.2348,-0.0042,0.2306,1307.6331,0.149177,-435.326501,-435.317672,-435.316728,-435.360603,32.375,-1740.723771198,-1751.1805811741,-1761.256493187,-1618.629345068\r\ngdb_54943,[O-]C(=O)C1CN(C1)C=[NH2+],2.58889,1.81036,1.32198,2.5968,67.77,-0.2441,0.0154,0.2595,973.3285,0.138653,-455.146891,-455.13953,-455.138586,-455.178575,29.052,-1640.287818203,-1650.7772586471,-1660.260174655,-1523.4136394441\r\ngdb_99898,CCC(N)(C#N)C1CO1,2.27071,1.30713,1.1572,4.5561,75.29,-0.2636,0.0245,0.2881,1191.5753,0.15769,-419.188712,-419.178796,-419.177852,-419.223442,35.889,-1802.864104941,-1813.527992887,-1824.196900905,-1674.786380496\r\ngdb_31383,[NH3+]CC1=NC(=O)N=C[N-]1,2.97351,1.37256,0.95251,6.7438,69.35,-0.2565,-0.0527,0.2037,1164.0116,0.115909,-450.192382,-450.184655,-450.183711,-450.225151,28.203,-1508.301340197,-1516.7833793501,-1525.0803033481,-1405.118780309\r\ngdb_52247,O=CCOC1CC2OC12,5.48108,0.73798,0.70838,3.6262,69.46,-0.2566,-0.0297,0.2269,1554.2054,0.132968,-458.907936,-458.899232,-458.898288,-458.943044,30.466,-1651.2165149471,-1660.863838313,-1670.346754321,-1536.85111717\r\ngdb_109214,CCC1=CCC2(C)NC12,3.15083,1.09189,0.98346,1.5359,88.77,-0.2204,0.0234,0.2439,1352.364,0.194482,-367.211042,-367.201888,-367.200943,-367.244832,35.286,-2101.8137851041,-2115.6227481581,-2128.070016682,-1953.653890132\r\ngdb_66559,CC12C3C1C1C3CC21O,2.33453,2.19739,1.59422,1.067,78.04,-0.2234,0.0758,0.2991,926.3729,0.159402,-385.821583,-385.814126,-385.813182,-385.8528,30.951,-1870.038315882,-1882.2465034771,-1892.915411495,-1740.3723661581\r\ngdb_88749,OC1CC1N1CC1C=O,3.41255,0.98091,0.84662,2.2716,75.44,-0.2383,-0.0376,0.2007,1352.3986,0.145676,-439.033069,-439.02428,-439.023335,-439.067222,32.643,-1717.4676601491,-17
27.950197994,-1738.025482498,-1595.7654271441\r\ngdb_96063,OC1COCC1OC=N,3.08832,1.15558,0.93667,2.4014,69.97,-0.2566,0.0138,0.2704,1249.211,0.147746,-476.21147,-476.202661,-476.201717,-476.246199,31.701,-1692.7400404951,-1703.209400651,-1713.285312664,-1571.143229002\r\ngdb_122477,OCCCC1CC1C#C,3.46134,0.65992,0.58807,0.8342,86.05,-0.2409,0.0501,0.291,1886.4952,0.18024,-387.040812,-387.030396,-387.029452,-387.077316,37.619,-2007.2638665291,-2019.392360481,-2031.247260509,-1867.5445951161\r\ngdb_122167,CCOCC1(C)CC1C,3.02394,0.79118,0.71408,0.9143,91.24,-0.2478,0.0851,0.3329,1762.6765,0.226313,-389.481994,-389.47073,-389.469786,-389.518801,40.806,-2283.424302339,-2298.574879635,-2312.801763683,-2117.153242609\r\ngdb_123739,C1CC(C1)c2n[nH]nn2,5.03041,1.00359,0.96389,1.8368,74.03,-0.2776,-0.0238,0.2538,1246.6733,0.139162,-414.16811,-414.160698,-414.159754,-414.201107,27.38,-1629.4036745981,-1639.8611120831,-1649.3440280911,-1513.249248662\r\ngdb_80010,CN1C2C=CC1C1CC21,2.46196,2.09861,1.65477,0.7398,81.55,-0.1948,0.02,0.2149,926.7939,0.173053,-365.964068,-365.956726,-365.955782,-365.995942,30.33,-1947.177997252,-1960.3469011261,-1971.608805149,-1811.186756808\r\ngdb_108585,CCC12OCOC1C2C,2.41331,1.44998,1.1887,1.2187,78.07,-0.2402,0.0839,0.3241,1170.1292,0.181655,-424.213209,-424.203913,-424.202969,-424.24736,34.366,-1978.7686828391,-1991.599359362,-2003.45425939,-1837.316859077\r\ngdb_87203,CC1(CC2CO2)CC1O,4.15636,0.84821,0.79602,2.6978,79.39,-0.2538,0.0786,0.3324,1489.3232,0.180329,-424.185336,-424.17556,-424.174615,-424.220226,36.321,-1961.2781244821,-1973.8075966851,-1985.6618692041,-1820.2900298711\r\ngdb_78378,CC1C2C3C(CN3C)N12,2.82701,1.4822,1.32937,1.6647,82.88,-0.2057,0.0622,0.268,1088.8157,0.183016,-383.171024,-383.162707,-383.161763,-383.203446,32.843,-1928.0979585981,-1941.5429664321,-1953.39786646,-1785.4570052811\r\ngdb_97080,CN=C1COC(O1)C#C,3.06567,1.17917,0.99182,1.3208,73.29,-0.2569,0.0136,0.2705,1220.4227,0.122605,-437.850119,-437.841265,-437.840321,-437.
884591,31.306,-1603.0075085131,-1611.672152785,-1620.5620727881,-1494.876412651\r\ngdb_38881,C1C2OC3CCC2OC13,2.76097,2.13274,1.93573,0.237,72.27,-0.2351,0.0904,0.3255,838.0454,0.164401,-423.037173,-423.031304,-423.03036,-423.066972,26.169,-1868.6471284291,-1881.851172807,-1892.520080825,-1737.8353472711\r\ngdb_40468,C1CC1C12CN3C1CC23,3.66996,1.29734,1.18349,1.9996,83.13,-0.2269,0.0876,0.3145,1123.1942,0.172102,-365.898341,-365.890895,-365.889951,-365.930535,30.001,-1905.9337132091,-1919.0373561471,-1930.29926017,-1770.1432756451\r\ngdb_111507,COC1C2CN2CC1C,1.99449,1.84918,1.12333,1.5052,81.58,-0.2233,0.0747,0.298,1169.5312,0.194876,-404.319918,-404.310963,-404.310019,-404.353412,34.266,-2033.458602225,-2047.3918120611,-2059.839708094,-1884.856940917\r\ngdb_6783,COCCNC(=O)N,5.30488,0.86758,0.76937,3.8505,66.79,-0.2407,0.0618,0.3025,1460.3443,0.153342,-418.28797,-418.278458,-418.277514,-418.323415,32.954,-1632.321591448,-1642.3491852681,-1652.42572479,-1513.563630671\r\ngdb_103521,CC1(CO)C=CCC1=O,2.38149,1.44265,1.17915,1.8507,76.63,-0.2393,-0.0185,0.2208,1130.0999,0.157732,-423.053809,-423.044499,-423.043555,-423.087917,34.402,-1879.0863681531,-1890.131154062,-1900.80006208,-1750.978523276\r\ngdb_129325,N=C1ON=NC=NC1=N,2.6343,1.92963,1.28977,3.2518,67.97,-0.2839,-0.1222,0.1617,912.616,0.076276,-464.902388,-464.895059,-464.894115,-464.934419,26.888,-1178.065943821,-1184.131445815,-1190.649381798,-1094.16861803\r\ngdb_44215,N=C1OC2C1C=CC2=O,3.51327,1.48735,1.19221,2.5334,69.39,-0.2538,-0.0736,0.1802,989.9163,0.102469,-436.706551,-436.699765,-436.698821,-436.738036,25.899,-1513.259916315,-1521.445143711,-1529.149071704,-1416.6254128421\r\ngdb_16497,OC1C2OC3C2C13O,3.02628,2.62399,1.83904,1.1718,55.48,-0.2254,0.0683,0.2937,725.9597,0.10686,-419.595051,-419.588613,-419.587668,-419.625259,25.658,-1359.0690323441,-1367.472005363,-1375.175933356,-1264.1269206441\r\ngdb_91997,OC1CN2CC2C1C#C,2.78172,1.52824,1.24732,1.9053,76.79,-0.2313,0.0362,0.2675,1044.9105,0.148193,-401.8910
69,-401.883051,-401.882106,-401.923286,31.806,-1765.037234912,-1776.004209705,-1786.079494209,-1642.376168155\r\ngdb_32838,CCCc1cc(c[nH]1)O,5.10604,0.73462,0.68061,1.9209,83.53,-0.1794,0.0488,0.2281,1693.5636,0.170826,-403.188897,-403.179163,-403.178219,-403.224351,35.688,-1951.5843654501,-1963.2516402871,-1974.51354431,-1817.5835835541\r\ngdb_38026,C1C2C=C3CCN=C3N12,3.84066,1.59825,1.23692,3.4369,81.49,-0.2216,-0.0313,0.1903,1000.4027,0.139404,-380.863879,-380.85725,-380.856306,-380.894858,26.7,-1736.0469466211,-1746.9969786711,-1756.479894679,-1619.2424213611\r\ngdb_31638,CCc1cc(n(c1)C)N,3.35872,1.01445,0.83167,1.7089,88.4,-0.1785,0.0558,0.2343,1461.6701,0.182927,-383.309074,-383.298997,-383.298053,-383.344486,36.644,-2014.725576048,-2027.0661680421,-2038.92106807,-1873.9608746411\r\ngdb_12789,OCC1CC2CC12O,2.99838,1.87258,1.44582,2.465,67.55,-0.2347,0.065,0.2997,920.7484,0.153198,-384.898905,-384.890802,-384.889858,-384.931207,30.824,-1685.7307649651,-1696.643773984,-1706.720313506,-1565.616757239\r\ngdb_11153,C1C=C(NC1=O)C#N,5.91358,1.3676,1.11837,2.1411,63.86,-0.2466,-0.0616,0.185,960.1125,0.084672,-377.580878,-377.574071,-377.573127,-377.612074,24.759,-1326.318710125,-1332.713026835,-1339.231590327,-1245.2633725951\r\ngdb_101698,CCC(O)CC1(C)CO1,3.24991,0.85577,0.76435,2.7822,82.45,-0.2585,0.0643,0.3228,1607.1595,0.202848,-425.415128,-425.404395,-425.40345,-425.451211,39.494,-2105.1320526961,-2118.8381042741,-2131.878368803,-1951.52161455\r\ngdb_62866,CC1(O)C2CC1C1NC21,2.81165,1.86088,1.462,3.3382,77.66,-0.2333,0.0729,0.3061,963.3754,0.173086,-403.095296,-403.088072,-403.087128,-403.126382,30.93,-1892.8488955411,-1906.0912179681,-1917.353121991,-1756.107154333\r\ngdb_121987,CCOCC(C)NC=O,3.33982,0.71139,0.65851,4.2515,81.67,-0.2446,0.0373,0.2819,1797.8051,0.192071,-441.513864,-441.502819,-441.501874,-441.551787,38.173,-2018.4856099761,-2031.1066984931,-2043.553967017,-1872.4071623571\r\ngdb_87255,CC1NC1(C)CNC=O,4.00882,0.79024,0.76133,2.7597,81.29,-0.2473,0.0275,
0.2748,1575.7425,0.181362,-420.415933,-420.405567,-420.404623,-420.452272,37.049,-1945.104707516,-1957.2633219001,-1969.118221928,-1804.66568328\r\ngdb_91604,CC1OC2CC1C(=N)O2,2.32011,2.01412,1.49705,2.2895,71.72,-0.2616,0.0239,0.2855,948.8163,0.15004,-439.115421,-439.108196,-439.107251,-439.146912,28.972,-1769.1442813171,-1780.6082432381,-1790.683527742,-1645.771619354\r\ngdb_83598,CC1C2CCC2(O)C1=O,2.26988,1.70126,1.16968,2.3376,76.13,-0.2417,-0.0215,0.2202,1104.9565,0.158089,-423.022951,-423.014119,-423.013175,-423.056419,33.349,-1859.722695431,-1871.0674306421,-1881.73633866,-1731.2132447941\r\ngdb_106007,CCC12C3OC1C1C3N21,3.30787,1.64171,1.41196,0.5484,72.95,-0.2434,0.062,0.3053,972.9996,0.148205,-401.857452,-401.85023,-401.849286,-401.889384,28.474,-1743.9422648591,-1755.408736816,-1765.484648829,-1621.102358037\r\ngdb_96330,CC1CCCOC(C)O1,2.00066,1.60292,0.9704,1.6536,82.49,-0.2474,0.0827,0.3301,1299.7626,0.205998,-425.443253,-425.43382,-425.432876,-425.477207,36.11,-2122.7807433211,-2137.3025565991,-2150.3434486371,-1967.8343385141\r\ngdb_113826,CCC1OC1C1NC1C,4.0976,0.78918,0.72174,1.7287,84.4,-0.2478,0.0712,0.3191,1671.6896,0.192905,-404.308424,-404.298364,-404.297419,-404.344571,36.502,-2026.2460137791,-2039.48582617,-2051.933094694,-1879.309133848\r\ngdb_90809,CN1CC2C3C4C(C14)N23,3.1653,1.85625,1.52149,1.7884,76.99,-0.1942,0.0666,0.2609,926.7709,0.161757,-381.977922,-381.970921,-381.969977,-382.009366,28.594,-1807.267335594,-1819.761039784,-1830.429947802,-1677.383640247\r\ngdb_120344,CCOC1C=CC2OC12,4.04576,0.99329,0.88342,2.6382,77.18,-0.2465,-0.0081,0.2385,1366.9617,0.158433,-422.995331,-422.986756,-422.985812,-423.029396,31.702,-1842.3908968511,-1853.896901875,-1864.565809893,-1714.2560690871\r\ngdb_119594,COCC12CC1CC=C2,4.03183,0.99717,0.92052,0.9657,84.88,-0.2235,0.0233,0.2469,1367.3407,0.182761,-387.076749,-387.067963,-387.067019,-387.110474,33.322,-2029.8146574621,-2042.965991084,-2054.820891112,-1888.351538538\r\ngdb_34927,C#CC1C2CC3(CC3)N12,2.74094,
1.59089,1.26477,1.4176,81.88,-0.2283,0.0326,0.2609,1043.7836,0.146742,-364.712571,-364.704581,-364.703637,-364.74478,31.801,-1789.7039861931,-1800.6891587471,-1810.76507076,-1667.2939230361\r\ngdb_92108,CC1CC2OC3(C)C2C13,2.66015,1.67074,1.39852,1.5677,81.17,-0.2177,0.0874,0.3051,1047.1192,0.183218,-387.073211,-387.065141,-387.064197,-387.105267,32.559,-2027.5945306201,-2041.195160686,-2053.050060714,-1885.0840991751\r\ngdb_2476,O=C1CN2CC1C2,5.29239,2.73393,2.46416,2.0133,56.61,-0.2197,-0.0129,0.2069,589.838,0.115987,-324.55226,-324.547035,-324.546091,-324.581099,20.824,-1360.470259941,-1369.634401377,-1377.339584388,-1267.314666364\r\ngdb_94385,OC1COC11CC2NC12,2.91787,1.50781,1.26991,2.247,71.27,-0.2353,0.0622,0.2975,1035.0094,0.147962,-439.009563,-439.001748,-439.000803,-439.042031,30.466,-1702.7174335951,-1713.811165206,-1723.88644971,-1579.957847925\r\ngdb_125921,CC1CC2=CNN=C2C1,4.4891,1.18943,0.97304,1.9257,81.7,-0.2269,0.024,0.2509,1216.5897,0.162941,-382.107679,-382.100079,-382.099135,-382.139635,30.136,-1888.6910209071,-1900.808847206,-1911.477755224,-1759.1286101681\r\ngdb_71158,CC12CN3C(C13)C21CC1,2.56163,1.8451,1.4483,1.8827,82.2,-0.2283,0.0783,0.3066,995.5509,0.171248,-365.931129,-365.923422,-365.922477,-365.962729,31.542,-1926.508478301,-1939.4483413901,-1950.709617904,-1790.345300391\r\ngdb_46407,N=C1OCC1OCC#N,3.94438,0.88013,0.76912,4.2299,67.92,-0.2798,-0.0119,0.2679,1406.2052,0.112025,-453.929009,-453.920522,-453.919577,-453.96346,29.443,-1503.9075221791,-1511.9132820011,-1520.20957849,-1402.140622595\r\ngdb_87772,CC1CC11OC2(C)CC12,3.45376,1.14049,1.12447,1.4327,84.09,-0.2208,0.0824,0.3031,1239.7898,0.180636,-387.043469,-387.034461,-387.033517,-387.076714,35.067,-2008.9311579421,-2021.9431845661,-2033.798084594,-1867.166834698\r\ngdb_124162,CC#Cc1nc([nH]n1)O,6.51888,0.76434,0.68706,1.2915,76.75,-0.2251,-0.0059,0.2191,1569.2136,0.10157,-432.87594,-432.867401,-432.866456,-432.91191,29.726,-1458.677928477,-1465.762505087,-1473.4658055711,-1364.497612703
\r\ngdb_42701,O=C1C2C3OC4C3C1C24,4.32062,1.64066,1.51919,2.2349,67.36,-0.2426,-0.0117,0.2309,853.7079,0.1151,-420.58019,-420.574588,-420.573644,-420.610105,23.663,-1582.5714229101,-1592.3888012151,-1600.685725213,-1478.57435634\r\ngdb_40412,C1OC1C12CC=CC1O2,3.68236,1.29603,1.14525,0.5516,72.26,-0.2487,-0.0011,0.2475,1087.7616,0.135948,-421.787025,-421.779609,-421.778665,-421.819188,28.775,-1712.0196270111,-1722.475809478,-1731.958725486,-1596.062238901\r\ngdb_10662,CC(=O)C1(CC1)C#C,2.72587,2.08683,1.31119,2.011,72.07,-0.245,-0.0166,0.2285,936.6215,0.12759,-346.584283,-346.575811,-346.574867,-346.617216,31.265,-1625.3211010441,-1634.226081263,-1643.116628775,-1519.230036941\r\ngdb_42005,C1CC2CCOCC2=C1,3.25779,1.39677,1.05197,1.1224,84.23,-0.2265,0.0254,0.2519,1174.9404,0.185893,-387.108254,-387.10047,-387.099526,-387.140786,30.649,-2049.584328507,-2063.364426147,-2075.219326175,-1907.3725913461\r\ngdb_1090,CC(C)(C)OC=O,4.34643,1.65721,1.64286,4.3411,61.5,-0.267,0.0121,0.2791,858.7256,0.145148,-346.890723,-346.882225,-346.881281,-346.923014,30.464,-1584.444537275,-1594.2204999861,-1603.704671012,-1473.4507453551\r\ngdb_97048,CN=C1COC(=O)CO1,4.26296,1.07223,0.91367,2.814,69.0,-0.264,-0.0098,0.2542,1248.9266,0.124064,-475.054686,-475.04622,-475.045276,-475.088696,29.854,-1594.6992893531,-1603.606779608,-1612.496699611,-1486.022260661\r\ngdb_71215,OC12CC3C(NC13)C2=O,2.45989,2.07169,1.65084,1.4171,67.71,-0.229,-0.0253,0.2037,867.4926,0.126137,-437.846388,-437.839656,-437.838712,-437.8772,27.481,-1600.666272434,-1610.662490804,-1619.552410807,-1490.2384936321\r\ngdb_133301,c1c(c(c(nn1)N)N)F,2.40245,1.83038,1.04217,4.3572,69.04,-0.2255,-0.024,0.2015,1039.2949,0.102655,-474.174006,-474.166529,-474.165585,-474.205667,29.075,-1400.210404911,-1407.9607685701,-1415.664696563,-1303.105269688\r\ngdb_131942,Cn1cc(nc1)N(=O)=O,4.59563,1.10513,0.89595,7.8182,69.42,-0.2608,-0.0673,0.1934,1216.0729,0.10172,-469.959213,-469.95141,-469.950466,-469.992645,27.189,-1374.256632671,-1381.80242
8396,-1389.506356389,-1278.227675383\r\ngdb_98549,COC(=O)C12CC1CN2,3.59188,1.1745,0.98,1.535,74.3,-0.2284,0.0093,0.2377,1235.5768,0.147569,-439.070409,-439.061724,-439.06078,-439.104424,31.541,-1740.8988462091,-1751.4466449901,-1761.522557003,-1619.110016962\r\ngdb_34115,C#CC12CC3NC1C3O2,3.654,1.47053,1.39221,1.6362,75.47,-0.2322,0.0165,0.2487,959.509,0.125387,-400.655073,-400.648371,-400.647427,-400.68596,27.788,-1617.2902408621,-1627.3059120111,-1636.195832014,-1507.1655489071\r\ngdb_94516,CC1COC1C(=O)C#N,2.67116,1.29362,1.06688,3.8632,69.91,-0.2732,-0.0915,0.1817,1147.6797,0.122504,-437.879204,-437.870175,-437.86923,-437.914334,31.105,-1621.2586077781,-1629.8134379751,-1638.702730469,-1513.540412838\r\ngdb_19376,OCC12NC1C1NC21,4.30382,1.64987,1.4714,2.3863,64.35,-0.2061,0.0599,0.266,879.656,0.130632,-379.795519,-379.788437,-379.787493,-379.826843,27.509,-1460.322629566,-1470.098592277,-1478.989139789,-1352.605689644\r\ngdb_37138,C1CC23NC2(C1)C1OC31,3.14695,1.68681,1.66685,0.9786,73.16,-0.2408,0.0707,0.3115,907.1929,0.150391,-401.877935,-401.871306,-401.870362,-401.908586,27.727,-1756.795531706,-1768.6341165001,-1778.710028513,-1633.1517858551\r\ngdb_130017,c12c(nc([nH]1)N)n[nH]n2,5.48234,1.35057,1.08543,4.0868,67.38,-0.2131,-0.0061,0.2069,1008.3362,0.093905,-445.117542,-445.110728,-445.109784,-445.148426,26.287,-1300.806076712,-1308.083298585,-1315.194230573,-1209.451433965\r\ngdb_11037,CC12CCC(O)C1O2,4.04958,1.70646,1.40699,2.0328,66.73,-0.2478,0.0797,0.3275,924.9852,0.153984,-384.942793,-384.935129,-384.934185,-384.974401,29.699,-1713.2708799571,-1724.4593654271,-1734.535904949,-1592.7213809851\r\ngdb_53850,CC(=O)C(=O)CCC#C,4.311,0.73238,0.63322,0.7242,75.61,-0.2442,-0.0876,0.1566,1700.2989,0.130869,-421.828637,-421.818201,-421.817257,-421.865915,35.548,-1738.1315315191,-1746.6926368061,-1756.175552814,-1625.383851944\r\ngdb_131559,OC1COC2=C1ON=N2,3.16048,1.65476,1.18063,3.4545,60.36,-0.2459,-0.0591,0.1868,960.389,0.089632,-489.825699,-489.818739,-489.817794,-4
89.857373,25.988,-1302.7463345401,-1309.932567608,-1317.042872087,-1212.351780563\r\ngdb_91460,CC1OC2CC1(C)C=C2,2.3732,1.94858,1.41376,1.4199,81.91,-0.2345,0.0114,0.2459,1015.4523,0.183597,-387.099057,-387.091087,-387.090143,-387.130731,32.939,-2043.8131282341,-2057.4765092,-2069.331409228,-1901.062988351\r\ngdb_17820,CC1NC1C1CCO1,3.89102,1.57542,1.35685,2.2627,71.71,-0.2395,0.071,0.3105,984.6983,0.166187,-365.016028,-365.007978,-365.007034,-365.048952,29.575,-1746.955563077,-1758.790382817,-1769.459918344,-1620.494301816\r\ngdb_76943,CC1C(C=O)C1(C)C=O,1.97112,1.23809,1.06263,4.0556,79.05,-0.2534,-0.0497,0.2037,1264.098,0.155596,-423.041174,-423.031063,-423.030118,-423.076416,35.642,-1871.1577919381,-1881.6999431381,-1892.3682236471,-1743.761542267\r\ngdb_85385,CC1CC(=O)NC(=O)C1,1.91757,1.67787,0.92753,3.2152,73.81,-0.2579,-0.024,0.234,1231.8504,0.14896,-439.182465,-439.174142,-439.173198,-439.215577,31.41,-1811.214994713,-1821.9899517521,-1832.0658637651,-1688.859524839\r\ngdb_7201,COCC(C)CCO,5.35362,0.74642,0.68463,0.4062,76.07,-0.2515,0.0797,0.3312,1673.383,0.197843,-387.327874,-387.31726,-387.316315,-387.363972,36.709,-1954.2274333581,-1967.118978254,-1979.566874287,-1809.7535262521\r\ngdb_65906,C1CC(C1)(C(=O)C#N)O,2.9374,1.30361,1.11198,3.2162,70.04,-0.2772,-0.0922,0.1849,1112.8132,0.122304,-437.889885,-437.880926,-437.879982,-437.924156,32.213,-1627.9610314071,-1636.5597872341,-1645.449707237,-1519.703806236\r\ngdb_65092,CC1(CC1)C12CCC1N2,2.95676,1.37174,1.13179,1.3276,85.97,-0.2375,0.0821,0.3196,1181.5161,0.194836,-367.163334,-367.154683,-367.153739,-367.196668,34.353,-2071.8765857321,-2086.0011858131,-2098.449081846,-1923.4305466561\r\ngdb_10227,CCCC(CO)CO,2.27516,1.15617,0.81931,2.0082,74.97,-0.2533,0.0722,0.3254,1447.8911,0.198345,-387.33615,-387.325576,-387.324632,-387.372143,37.225,-1959.420697842,-1972.3373430981,-1984.78586664,-1814.8809022911\r\ngdb_84607,CN1C=NC2CC12C#N,2.54061,1.7268,1.22416,3.4224,75.4,-0.2293,-0.0008,0.2285,1017.4138,0.125353,-396
.933271,-396.925218,-396.924274,-396.966178,29.562,-1630.986879805,-1640.154158786,-1649.044078789,-1521.769565864\r\ngdb_1968,OCC1CC(=N)O1,6.9034,1.67937,1.47296,1.2437,55.23,-0.2659,0.021,0.2869,807.6631,0.113694,-361.707841,-361.701076,-361.700132,-361.738859,24.5,-1321.422884907,-1329.6200349741,-1337.325217985,-1229.378609769\r\ngdb_78432,CC1C2C1C1N=COC21,3.44015,1.39969,1.36922,1.7856,74.23,-0.2455,0.0206,0.266,1024.8344,0.149739,-401.931879,-401.924763,-401.923819,-401.963245,28.389,-1790.645877202,-1802.178865113,-1812.254777126,-1667.450800286\r\ngdb_117886,CCC#CC(O)COC,1.90833,0.82498,0.61854,2.1239,85.0,-0.2481,0.0378,0.2859,1883.6664,0.178975,-424.184512,-424.172925,-424.17198,-424.223635,39.149,-1960.7610570661,-1972.1541104701,-1984.0083829891,-1822.4292080521\r\ngdb_74558,CC1=CC2C(C#C)N2C1,2.4684,1.58327,1.25842,1.5859,83.35,-0.2142,0.0094,0.2236,1073.6716,0.146941,-364.763299,-364.754996,-364.754051,-364.795936,32.196,-1821.5362627451,-1832.3250249821,-1842.400309486,-1699.3947734401\r\ngdb_123774,c1cnn2c1C3(C2)CC3,4.32444,1.32557,1.1121,3.1689,80.63,-0.2267,0.032,0.2587,1090.7426,0.138686,-380.836777,-380.829669,-380.828725,-380.868615,27.603,-1719.040197703,-1729.6896529421,-1739.17256895,-1602.774702674\r\ngdb_37404,N1C2C3OC3C3OC2C13,2.91248,2.25242,1.81382,0.4409,65.1,-0.2352,0.0494,0.2846,791.5874,0.127921,-437.803628,-437.798032,-437.797088,-437.833278,24.328,-1573.833987594,-1584.543056188,-1593.432976191,-1462.677043334\r\ngdb_69241,CC12CC1C(=O)NC=N2,2.88954,1.57379,1.20803,1.6027,74.81,-0.2385,-0.014,0.2245,1042.1272,0.137836,-418.055797,-418.048175,-418.047231,-418.087639,29.938,-1719.8013661201,-1730.1276542241,-1739.6105702321,-1603.2823574551\r\ngdb_88851,CC1OC1C1OCC=C1,3.88113,1.03431,0.90384,1.128,78.25,-0.2365,0.0122,0.2487,1327.6772,0.15843,-423.006801,-422.998105,-422.997161,-423.041211,31.757,-1849.5884250811,-1861.018501516,-1871.687409534,-1721.670087922\r\ngdb_86747,CC1OC(C=C1)C1CN1,3.46049,1.16472,1.0251,2.9083,80.28,-0.234,0.0
096,0.2436,1239.6756,0.171656,-403.12664,-403.118108,-403.117164,-403.160243,32.298,-1912.5175376371,-1924.939078292,-1936.200982315,-1777.355236582\r\ngdb_30264,CC1CN2C=CN=C2C1,4.5131,1.23959,1.00903,3.9221,81.41,-0.2099,0.0405,0.2504,1186.3622,0.163113,-382.128428,-382.120916,-382.119972,-382.160406,29.414,-1901.711205148,-1913.8842522391,-1924.553160257,-1772.1625996071\r\ngdb_83933,OC1C2OCCC12C=O,2.57766,1.47463,1.19592,2.1004,69.48,-0.2482,-0.024,0.2242,1062.4151,0.135951,-458.953821,-458.945891,-458.944947,-458.986349,30.085,-1680.0097654121,-1690.1427807441,-1699.625696752,-1564.0253944151\r\ngdb_127701,CN1N=NC(=N1)C1CN1,5.85653,0.99445,0.91021,3.0997,73.27,-0.2436,-0.0263,0.2173,1275.5515,0.126537,-430.210278,-430.202343,-430.201399,-430.243668,28.025,-1507.260302766,-1516.500372791,-1525.390292794,-1397.729861834\r\ngdb_24771,c1cc2n(c1)C=CC2=O,3.5411,1.70985,1.15308,4.1178,78.68,-0.2256,-0.08,0.1456,973.4501,0.103844,-399.608662,-399.602283,-399.601338,-399.639435,25.249,-1588.509540577,-1596.9507916451,-1604.654092129,-1491.684274368\r\ngdb_120642,CCOC1CC1(O)CC,3.65572,0.81646,0.77334,1.6391,83.4,-0.2233,0.0736,0.2969,1634.5358,0.202739,-425.408729,-425.39796,-425.397016,-425.444838,39.483,-2101.1166226051,-2114.8000838591,-2127.840975897,-1947.5224996931\r\ngdb_9412,CC(CO)CCCO,5.83791,0.68028,0.6383,2.6318,75.34,-0.2622,0.0769,0.3391,1754.1957,0.197979,-387.33526,-387.324546,-387.323601,-387.371631,37.297,-1958.8622148321,-1971.691008828,-1984.138904861,-1814.559617683\r\ngdb_6343,CCC(=O)NC(=O)C,5.2759,1.05747,0.89554,5.704,67.2,-0.2415,-0.0178,0.2237,1290.6015,0.13953,-401.068565,-401.058976,-401.058032,-401.104357,32.227,-1643.589770561,-1652.681120953,-1662.1646644701,-1532.043143212\r\ngdb_8000,CC1CC(C)C(C)=C1,3.08943,1.58953,1.15887,0.3106,85.04,-0.2244,0.0337,0.2581,1133.7956,0.199929,-313.102562,-313.093549,-313.092605,-313.135703,34.209,-2086.193831076,-2100.0912728991,-2112.539796441,-1940.376427201\r\ngdb_127191,CN1C(=O)NN=NC1=N,2.41209,2.0466,1
.11488,0.7709,67.81,-0.2645,-0.0806,0.1839,998.1545,0.1026,-466.196418,-466.188639,-466.187695,-466.229216,28.045,-1362.229795177,-1369.790023609,-1377.4939516021,-1265.442807017\r\ngdb_91853,OC1CN2CC2(C1)C#N,3.63718,1.20799,1.11564,5.4917,72.97,-0.2505,0.0187,0.2692,1111.022,0.137368,-417.999564,-417.991737,-417.990793,-418.0317,30.377,-1684.5146525231,-1694.7123012821,-1704.19521729,-1568.1801315041\r\ngdb_72369,NC12CC=CC1OC2=O,2.57457,1.71657,1.43661,4.4173,68.6,-0.2532,-0.0114,0.2418,954.6833,0.125714,-437.912957,-437.905389,-437.904445,-437.944972,28.982,-1642.4389190551,-1651.9105399011,-1660.800459904,-1532.76603358\r\ngdb_128256,CNc1[nH]nc(n1)C#C,6.65418,0.90458,0.80361,4.5151,78.44,-0.2162,-0.0041,0.212,1373.0182,0.113341,-412.977021,-412.968431,-412.967486,-413.010384,31.08,-1509.836227211,-1517.777353606,-1526.073650095,-1407.282431341\r\ngdb_114876,OCC1CC2OCC2O1,2.76204,1.40102,1.28796,2.5087,70.01,-0.2517,0.0724,0.3242,1067.2894,0.160794,-460.143917,-460.136038,-460.135094,-460.176559,30.376,-1798.9540963621,-1810.896220141,-1821.565128159,-1669.6702996191\r\ngdb_124598,CC(C)n1cnoc1=O,2.70753,1.33113,1.10954,5.5181,69.25,-0.2517,-0.0063,0.2455,1148.2767,0.135635,-455.174204,-455.165609,-455.164665,-455.208301,31.111,-1657.42697152,-1667.142065858,-1676.624981866,-1542.066971978\r\ngdb_77073,CC1(O)CC(C#C)C1O,2.62405,1.28401,1.19429,2.4546,77.1,-0.251,0.0366,0.2875,1138.4605,0.156249,-422.983597,-422.973964,-422.97302,-423.017261,37.07,-1835.027706245,-1845.8698067471,-1856.538714765,-1706.641247372\r\ngdb_120587,CCOC1CC1(N)C#N,3.23412,0.88678,0.86389,4.4516,78.21,-0.2433,0.0264,0.2697,1447.8235,0.157853,-419.190966,-419.181081,-419.180137,-419.226131,35.77,-1804.2785102271,-1814.9618509521,-1825.63075897,-1676.4737521971\r\ngdb_95953,CC1CCOC11CCC1,2.57361,1.46199,1.24626,1.3731,84.62,-0.2339,0.0739,0.3078,1154.4086,0.206856,-388.296304,-388.287438,-388.286494,-388.329905,34.439,-2167.2447760431,-2182.1230144331,-2195.163906471,-2012.332884231\r\ngdb_9561
,CC1(CC1C#C)C#N,4.26867,1.31883,1.24274,3.7984,73.56,-0.2646,0.0058,0.2705,1002.1336,0.117116,-325.500341,-325.492093,-325.491149,-325.532714,30.292,-1560.718421985,-1568.8754114761,-1577.172962983,-1460.9093504811\r\ngdb_21164,CNc1cnoc1N,3.34941,1.83514,1.25366,3.7021,64.42,-0.2082,0.0144,0.2226,944.1733,0.119915,-395.960078,-395.95233,-395.951386,-395.992439,28.228,-1414.980711753,-1423.449573217,-1431.747124724,-1314.291872631\r\ngdb_52609,CC#CC(C)(C)C(C)C,2.3451,0.96903,0.84315,0.2562,97.88,-0.2397,0.0663,0.3061,1587.7204,0.225086,-352.348375,-352.336237,-352.335293,-352.386253,43.909,-2336.253030031,-2350.8557919701,-2365.082676018,-2170.9100561121\r\ngdb_112655,CC1CC(CO)OC1=O,3.18662,1.07611,0.85777,3.5601,71.78,-0.2688,0.0086,0.2775,1351.1564,0.159354,-460.216239,-460.207263,-460.206319,-460.250309,33.091,-1844.3368022601,-1855.5905486661,-1866.2594566841,-1715.9490883691\r\ngdb_55903,NC(=O)CC1CC1C=O,3.78152,0.7564,0.6857,3.7748,74.59,-0.2476,-0.0188,0.2288,1606.8474,0.145847,-439.118212,-439.108485,-439.107541,-439.154378,34.301,-1770.8956589361,-1780.789593339,-1790.865505352,-1650.4566015481\r\ngdb_121851,CCCOC(=NCC)C,2.74763,0.70888,0.58729,0.7254,90.7,-0.2398,0.0361,0.2759,2018.4741,0.214313,-405.570312,-405.558486,-405.557542,-405.609662,40.647,-2190.2404708571,-2204.1492078421,-2217.783095885,-2031.942540481\r\ngdb_32864,CCOc1[nH]cc(n1)O,4.53148,0.89822,0.75672,2.3016,71.83,-0.1816,0.0582,0.2398,1471.8992,0.136079,-455.185177,-455.176183,-455.175239,-455.219447,32.682,-1664.3126277771,-1673.777346024,-1683.260262032,-1549.061187292\r\ngdb_56462,CC(C#C)C(=O)NC=N,2.37856,1.14152,0.80625,1.4214,79.32,-0.2572,-0.0219,0.2353,1395.7763,0.134349,-418.01585,-418.006132,-418.005188,-418.051375,34.628,-1694.7342640971,-1703.745293337,-1713.228209345,-1580.5263710791\r\ngdb_85106,CC1OC(=N)C1(O)C#N,2.05275,1.6217,1.24995,3.6102,67.56,-0.2918,-0.0246,0.2672,1058.2414,0.110862,-453.948189,-453.939278,-453.938333,-453.981885,31.928,-1515.9431447991,-1523.6828408051,-
1531.9791372941,-1413.70247592\r\ngdb_92775,CC1CC=C(C=O)C1=O,3.17136,1.18849,0.90756,3.9954,76.99,-0.2424,-0.0807,0.1616,1260.3735,0.135058,-421.880759,-421.872106,-421.871162,-421.914654,31.118,-1770.8385556171,-1780.5185094511,-1790.0014254591,-1655.9680130951\r\ngdb_66859,OC12C3C4CC1CC2N34,2.55992,2.06748,1.82909,2.2542,73.36,-0.2132,0.066,0.2792,861.5408,0.1499,-401.888598,-401.882003,-401.881059,-401.919023,28.081,-1763.4866601731,-1775.3465802731,-1785.422492286,-1639.701097288\r\ngdb_65794,CC1(CC=CC1=O)C#N,2.25517,1.58611,1.35894,5.6808,75.08,-0.261,-0.0684,0.1926,1025.2523,0.124215,-400.801124,-400.792813,-400.791869,-400.834402,30.578,-1708.9385578211,-1717.9445669891,-1726.834486992,-1600.3142398851\r\ngdb_38557,C1C2OC3C4C=CC12C34,2.83919,2.2699,1.67036,1.1326,75.1,-0.192,0.0054,0.1973,841.222,0.138271,-384.626182,-384.620069,-384.619124,-384.656185,26.517,-1747.765049687,-1759.03950389,-1768.521792389,-1630.708265809\r\ngdb_129615,C(C#N)c1c(onn1)N,3.37966,1.32156,0.98264,1.8146,65.05,-0.241,-0.0307,0.2102,1096.2994,0.089126,-448.922789,-448.914833,-448.913888,-448.955758,28.78,-1339.471926274,-1346.033160378,-1353.143464857,-1249.785829958\r\ngdb_22190,CC#CC(=NO)CCO,2.0071,0.88747,0.64237,1.5261,84.4,-0.2275,-0.0245,0.2031,1730.7406,0.14413,-439.015193,-439.004194,-439.00325,-439.053851,37.091,-1706.2503092651,-1715.3460522201,-1725.421964233,-1587.3750043051\r\ngdb_20324,C#CCc1ccon1,6.04094,1.29705,1.13409,2.5771,64.22,-0.2669,-0.0137,0.2532,993.0118,0.095161,-361.421908,-361.414905,-361.413961,-361.45398,25.404,-1375.167775739,-1382.328280938,-1389.439840435,-1288.2847621261\r\ngdb_37370,C1CC2OC(O1)C1OC21,2.52458,2.03767,1.95167,1.7644,64.91,-0.2427,0.0651,0.3078,835.023,0.139051,-458.932886,-458.926577,-458.925633,-458.963312,26.137,-1666.8728644971,-1678.023071918,-1687.505987926,-1549.569469582\r\ngdb_63130,CC1(O)C2COC12C=O,2.32562,1.59522,1.19836,5.3886,71.08,-0.2382,-0.0265,0.2117,1072.1828,0.133026,-458.920044,-458.911241,-458.910296,-458.953192,3
2.781,-1658.8143939191,-1668.399593894,-1677.881882393,-1543.219078502\r\ngdb_28784,c1cn2c(cnc2[nH]1)O,3.40293,1.74797,1.15865,3.5014,69.18,-0.1873,0.0257,0.213,965.3922,0.103531,-432.901252,-432.894077,-432.893133,-432.932685,27.424,-1474.561436285,-1482.501935171,-1490.205863164,-1377.534112178\r\ngdb_48580,O=CC12CC(C1)C(=O)N2,4.01683,1.22256,1.11601,1.3744,69.64,-0.2532,-0.05,0.2031,1076.3041,0.125096,-437.899191,-437.891787,-437.890843,-437.931354,28.44,-1633.8006301611,-1643.3751624831,-1652.265082486,-1524.220616018\r\ngdb_38871,C1C2CC3OC2CCC13,2.65378,2.07292,1.88508,1.3935,79.12,-0.2339,0.0853,0.3192,883.2627,0.188923,-387.109697,-387.103695,-387.10275,-387.139537,27.391,-2050.489823994,-2065.3881426721,-2077.242415191,-1906.5888326051\r\ngdb_66566,CC12C3C4CC1(C)C4N23,2.3111,2.21316,1.56603,1.8317,81.03,-0.2215,0.0833,0.3048,954.2432,0.17136,-365.943653,-365.935973,-365.935029,-365.975147,31.275,-1934.367401017,-1947.3242068491,-1958.586110872,-1798.1377071531\r\ngdb_98904,OCC(=O)C1CCCO1,3.90759,0.95065,0.88298,3.3213,71.96,-0.2478,-0.0271,0.2206,1363.4311,0.159109,-460.178682,-460.169695,-460.168751,-460.213808,32.293,-1820.7694467471,-1832.016290554,-1842.685198572,-1693.04438236\r\ngdb_123385,C#CCOc1ccno1,7.57614,0.8124,0.7371,2.1413,71.59,-0.2344,0.0027,0.2371,1435.5655,0.099573,-436.628232,-436.620144,-436.6192,-436.662089,28.762,-1464.1140389441,-1471.482249622,-1479.186177615,-1368.967986819\r\ngdb_63645,CC1(O)CC1C1CCC1,3.32871,0.91816,0.85861,1.3247,86.98,-0.2443,0.0758,0.3201,1482.1157,0.204668,-388.268324,-388.258624,-388.25768,-388.303036,37.21,-2149.6870742231,-2164.041970107,-2177.082862145,-1995.47234491\r\ngdb_11421,CCC1OCC(=N)O1,3.77541,1.55797,1.22897,2.1648,65.64,-0.2667,0.0221,0.2888,1008.9037,0.142384,-401.021643,-401.013723,-401.012779,-401.054538,28.523,-1614.1457932631,-1624.2844561761,-1633.7679996931,-1500.781272341\r\ngdb_11528,CN(CC1CC1)C=O,4.22656,1.11624,1.00799,3.8946,73.4,-0.2393,0.0332,0.2725,1211.949,0.165174,-365.072982,-365
.064127,-365.063183,-365.107398,31.33,-1782.694710663,-1794.024385658,-1804.693921185,-1657.16969283\r\ngdb_13215,CCC(O)C1COC1,4.11329,1.17611,0.99641,2.5498,70.85,-0.2439,0.0628,0.3066,1214.8596,0.17607,-386.117752,-386.108628,-386.107684,-386.152176,32.506,-1822.7166071741,-1834.766034992,-1846.028566524,-1690.563211774\r\ngdb_42974,O=C1N2CC2C=C1C#N,3.47763,1.42337,1.08532,6.3287,71.94,-0.2776,-0.0922,0.1853,1035.7459,0.090553,-415.626511,-415.619546,-415.618602,-415.658157,25.982,-1451.105777374,-1458.290127915,-1465.401059903,-1361.205700489\r\ngdb_90276,CC1CC2=C(COC2)C1,4.09116,1.09801,0.90716,1.9764,84.9,-0.2184,0.0301,0.2485,1325.7142,0.184355,-387.10992,-387.101566,-387.100622,-387.143016,31.917,-2050.6297585011,-2064.0521760111,-2075.907076039,-1908.7719364161\r\ngdb_107340,CCC12CCC=CC1N2,3.24398,1.25322,1.04765,1.5821,87.13,-0.219,0.0089,0.2279,1240.235,0.196319,-367.213021,-367.204421,-367.203477,-367.246045,33.966,-2103.0556254151,-2117.2122284551,-2129.660124488,-1954.4150585491\r\ngdb_91661,CC1OC2CC1C21CN1,2.56216,1.75483,1.38172,1.6889,78.99,-0.2314,0.0823,0.3137,1020.9936,0.172546,-403.09771,-403.090074,-403.08913,-403.129334,31.154,-1894.3637022671,-1907.3474909861,-1918.609395009,-1757.959560901\r\ngdb_25269,c1cc2n(c1)C(=O)CN2,3.18395,1.78428,1.15406,2.9872,74.46,-0.1938,-0.0225,0.1713,987.2482,0.11604,-416.871622,-416.864886,-416.863942,-416.902746,26.556,-1604.572515959,-1613.6776715491,-1621.974595547,-1500.973917604\r\ngdb_63231,CC1(C)C=CCC2OC12,2.47529,1.59729,1.32377,1.7102,83.56,-0.2425,0.0258,0.2683,1088.9332,0.182611,-387.09661,-387.088038,-387.087094,-387.129247,34.001,-2042.2776137111,-2055.5632342591,-2067.418134287,-1900.1317649951\r\ngdb_130347,NC1COC2=C1ON=C2,3.07541,1.63161,1.16016,2.0963,67.3,-0.2228,-0.0191,0.2038,1008.197,0.11513,-453.905942,-453.898759,-453.897814,-453.937965,27.096,-1489.432772076,-1498.256803634,-1506.553100123,-1386.14228064\r\ngdb_119792,CCCC1=NC2CC2O1,5.27465,0.84754,0.78331,0.9594,80.0,-0.2371,0.0252,0.262
3,1527.27,0.171224,-403.159026,-403.150354,-403.14941,-403.193246,32.367,-1932.840044111,-1945.1737335061,-1956.435637529,-1798.064916109\r\ngdb_102921,COC(CCC#N)C=O,2.59207,0.81587,0.64794,4.6792,74.46,-0.2605,-0.0477,0.2127,1662.9557,0.144435,-439.096766,-439.08643,-439.085485,-439.133678,34.681,-1757.4381009221,-1766.949882344,-1777.025166848,-1637.467165248\r\ngdb_25174,c1cc(c(=O)[nH]c1)C#N,3.06448,1.51629,1.01438,7.7947,76.6,-0.2405,-0.0792,0.1613,1065.7602,0.092264,-415.695366,-415.688366,-415.687422,-415.727029,26.269,-1494.312909569,-1501.475297295,-1508.586229283,-1404.4235003371\r\ngdb_119153,COCC1(CO1)C(C)C,1.95517,1.24415,0.98036,1.4572,82.21,-0.2572,0.0683,0.3255,1382.2471,0.202624,-425.399299,-425.388409,-425.387465,-425.436085,39.006,-2095.199212735,-2108.8067454001,-2121.8476374381,-1942.029913416\r\ngdb_128617,COC1=NC(=N)C=NO1,3.03446,1.44254,0.98381,2.8201,68.94,-0.2601,-0.06,0.2001,1125.1224,0.101298,-469.9863,-469.978517,-469.977573,-470.019095,27.833,-1391.2539689541,-1398.8123148591,-1406.516242852,-1294.825288433\r\ngdb_21084,Cn1c(=O)nccn1,3.51631,2.50597,1.47663,3.828,63.68,-0.2535,-0.0835,0.17,786.906,0.097162,-394.80978,-394.803167,-394.802223,-394.840775,23.937,-1321.009983985,-1328.413962676,-1335.525522173,-1232.834929341\r\ngdb_37567,C1C2N3C1C21CCCC31,3.64263,1.60881,1.37954,1.9642,82.65,-0.2172,0.0795,0.2966,995.6721,0.173648,-365.905591,-365.898748,-365.897804,-365.936824,28.576,-1910.483153459,-1923.9651843241,-1935.227088347,-1774.089679746\r\ngdb_75324,CC1=CCC2CC1C2=O,2.41803,1.61842,1.35989,2.8222,81.36,-0.2201,-0.0047,0.2154,1053.2346,0.160503,-385.913804,-385.90604,-385.905096,-385.9458,30.783,-1927.9078233711,-1939.9233657031,-1950.592273721,-1798.7307031581\r\ngdb_25555,c1coc(=NCC=O)o1,7.01452,0.75113,0.68581,5.4799,66.42,-0.2303,-0.0186,0.2116,1519.3398,0.09897,-473.832417,-473.824194,-473.82325,-473.867833,28.123,-1455.5661113461,-1462.8489808,-1470.552908793,-1361.1423220801\r\ngdb_106500,COC12CC(C1)OC2C,2.39024,1.65667,1.2
7086,1.3985,77.8,-0.2379,0.0773,0.3153,1091.5044,0.182279,-424.198301,-424.189613,-424.188669,-424.231423,33.659,-1969.4137786671,-1982.6259806621,-1994.48088069,-1827.316248144\r\ngdb_186,N=C1OC=CO1,9.19863,4.12641,2.84857,3.6351,40.77,-0.2262,0.0118,0.238,424.2581,0.062567,-321.236907,-321.232405,-321.231461,-321.264644,16.38,-930.440852277,-935.614663982,-940.355494477,-872.206762059\r\ngdb_87741,CC1CC11CC(O1)C#N,4.71917,0.9118,0.87654,4.4837,77.53,-0.2504,0.0018,0.2522,1361.1307,0.146467,-401.92797,-401.919182,-401.918238,-401.962017,32.376,-1788.1929445211,-1798.6767373841,-1808.752649397,-1666.6802192341\r\ngdb_47519,O=C1CCCC2CC1C2,2.41683,1.71335,1.24829,3.14,81.36,-0.2303,-0.0074,0.2229,1075.0313,0.185673,-387.114826,-387.107188,-387.106244,-387.146794,31.057,-2053.708317655,-2067.5800316091,-2079.434931637,-1911.1426654181\r\ngdb_80808,CN1C2CC1(C#C)C2O,2.19849,1.48199,1.43384,1.8534,80.61,-0.2047,0.0281,0.2328,1061.1619,0.14539,-401.826516,-401.817647,-401.816703,-401.859639,34.102,-1724.5296464351,-1734.9626110691,-1745.038523082,-1602.4371028321\r\ngdb_105376,OCC1(CC2CC2)CO1,3.06461,0.99301,0.83905,1.8517,78.38,-0.2573,0.0844,0.3417,1430.2863,0.181585,-424.188157,-424.178764,-424.17782,-424.223137,35.154,-1963.0483273711,-1975.818135521,-1987.673035549,-1822.11670857\r\ngdb_94931,OC1CCN2C(C=O)C12,2.10409,1.80385,1.55123,2.1131,71.94,-0.2368,-0.0261,0.2107,973.3369,0.148036,-439.063341,-439.055142,-439.054198,-439.096175,30.863,-1736.463612597,-1747.3163807521,-1757.392292765,-1613.9336952211\r\ngdb_107307,CCC12CC(C)=CC1O2,3.35989,1.12988,0.94758,1.9164,84.23,-0.2315,0.0094,0.2409,1332.8969,0.18143,-387.098258,-387.089144,-387.0882,-387.131995,34.779,-2043.3117485431,-2056.2572592131,-2068.112159241,-1901.8561597271\r\ngdb_41687,C1OCC23CC=CC2C13,3.85277,1.44788,1.2389,1.4861,79.66,-0.2296,0.0184,0.248,1038.4854,0.162346,-385.892775,-385.885906,-385.884961,-385.923711,28.333,-1914.71193661,-1927.2890994971,-1937.957380006,-1784.8696568571\r\ngdb_96217,CN1CC
CN=C1C=O,2.76707,1.41434,0.97953,3.4022,81.15,-0.2191,-0.0538,0.1653,1204.7009,0.160412,-419.229037,-419.220216,-419.219272,-419.263082,31.956,-1828.168405366,-1839.519415667,-1850.188323685,-1699.660837256\r\ngdb_131134,OC(C=O)C1=NOC=N1,3.95345,1.09985,0.97586,3.1047,59.64,-0.2836,-0.0566,0.227,1149.7746,0.088517,-489.877455,-489.869809,-489.868865,-489.91075,26.776,-1335.2236903441,-1341.979452238,-1349.090384226,-1245.8463284561\r\ngdb_84002,CC12OCCOC1C2O,2.17976,1.77635,1.4617,1.9348,71.06,-0.2162,0.0662,0.2824,1012.761,0.159031,-460.128702,-460.120349,-460.119405,-460.16111,32.597,-1789.4065469271,-1801.05123144,-1811.720139458,-1659.9759130781\r\ngdb_46420,O=C1NCC1CCC#C,3.79683,0.82321,0.71974,3.7593,78.97,-0.2502,0.0287,0.2788,1542.4276,0.146663,-401.939396,-401.930084,-401.92914,-401.974341,33.557,-1795.362862355,-1805.5178405021,-1815.593752515,-1674.41364015\r\ngdb_67926,OC12CC(=O)C1C2C#C,1.97453,1.78188,1.14964,3.4953,72.38,-0.2468,-0.0194,0.2273,1064.4794,0.109875,-420.572971,-420.564667,-420.563723,-420.605755,31.667,-1578.0414354391,-1586.163284426,-1594.460208424,-1475.84469219\r\ngdb_40776,C1CC1C1CCC2CC12,3.51797,1.10874,0.94952,0.1223,88.95,-0.2551,0.0862,0.3413,1332.2027,0.208384,-351.146631,-351.13843,-351.137486,-351.179566,33.135,-2209.9994742491,-2225.2956336331,-2238.336525671,-2054.9256851151\r\ngdb_42337,C1CN(C1)C(=O)C2CC2,4.00093,0.99124,0.86121,3.7238,82.79,-0.2318,0.0287,0.2605,1395.4956,0.171754,-403.1549,-403.145881,-403.144937,-403.190247,32.263,-1930.2509419771,-1942.366885749,-1953.628789772,-1796.1830166181\r\ngdb_15128,O=CC#CCC1CO1,9.42512,0.71009,0.6806,4.3063,69.68,-0.2663,-0.0615,0.2048,1553.7685,0.104774,-382.489301,-382.480897,-382.479953,-382.524333,27.656,-1429.3858083571,-1436.555726191,-1444.260281693,-1337.726823745\r\ngdb_5432,Cc1c(cc[nH]1)C#N,3.49111,1.92407,1.25011,6.2934,71.16,-0.2251,-0.002,0.2231,911.5292,0.109129,-341.652648,-341.645372,-341.644428,-341.684327,26.517,-1507.688263904,-1515.5660118901,-1523.270567392
,-1413.8210751211\r\ngdb_102560,CC1CN1C(CO)CO,2.41446,1.16332,0.97395,3.5407,79.04,-0.239,0.0739,0.3129,1331.4631,0.193572,-441.445237,-441.435317,-441.434372,-441.480268,36.993,-1975.4215498331,-1988.7485859751,-2001.195854499,-1827.5283461861\r\ngdb_63463,CC1(C)NC1(C)C1CN1,2.64641,1.30668,1.03031,1.5992,85.74,-0.2389,0.0703,0.3092,1269.9872,0.20437,-384.425538,-384.415242,-384.414298,-384.460353,38.723,-2087.4651643101,-2101.445437321,-2114.486329359,-1932.954878258\r\ngdb_123801,c1nc(no1)NC2CC2,5.51734,0.96326,0.85472,1.6966,70.98,-0.2312,-0.0095,0.2217,1311.8284,0.125957,-434.048382,-434.040735,-434.039791,-434.081478,28.238,-1566.544215541,-1575.965635667,-1584.85555567,-1457.189477129\r\ngdb_93010,CC1CC=CC2(CC2)C1,2.72551,1.30954,0.99107,0.2503,92.83,-0.2203,0.0311,0.2514,1294.4331,0.207317,-351.171271,-351.162641,-351.161697,-351.204077,34.895,-2225.461296009,-2240.4882540321,-2253.52914607,-2070.306558214\r\ngdb_46251,N=C1OCC11CC2OC12,3.16636,1.47842,1.23782,3.8843,69.64,-0.2575,0.0169,0.2743,1014.267,0.124455,-437.831547,-437.824232,-437.823287,-437.863604,27.946,-1591.353411365,-1600.983791988,-1609.873084482,-1481.706881268\r\ngdb_67343,CC12C3NC1C3OC2=N,2.89873,1.94417,1.3501,2.5968,73.46,-0.2516,0.0143,0.2659,940.9406,0.137824,-417.958114,-417.951114,-417.95017,-417.989231,28.274,-1658.5044044731,-1669.2210031751,-1678.703919183,-1541.5304517831\r\ngdb_105449,OCC1(CC=CC1)C=O,2.69375,1.31657,1.15296,2.8777,76.46,-0.2526,-0.0326,0.2201,1137.2792,0.159119,-423.045958,-423.037372,-423.036428,-423.079254,32.553,-1874.1597949941,-1885.658897419,-1896.327805437,-1745.5424128091\r\ngdb_73070,OC12CNC1C2OC=N,3.43439,1.12038,0.99268,3.1969,70.85,-0.2277,0.0316,0.2593,1210.2496,0.135322,-455.072336,-455.063709,-455.062764,-455.105975,31.737,-1593.5038847081,-1603.198898758,-1612.681187257,-1477.856486044\r\ngdb_105208,C[NH2+]C1(CC1O)C([O-])=O,2.19273,1.52139,1.08878,4.9102,68.23,-0.24,0.0171,0.2572,1141.244,0.146522,-476.228321,-476.218932,-476.217988,-476.262406,34
.563,-1703.3141946541,-1713.41959959,-1723.4955116031,-1581.313267365\r\ngdb_85898,CC1(C)CC(O)C2OC12,2.66439,1.36531,1.22107,1.8953,76.79,-0.2514,0.0758,0.3272,1142.1475,0.181786,-424.226119,-424.217128,-424.216184,-424.259277,35.255,-1986.8698240291,-1999.8918907971,-2011.746790825,-1844.7948838301\r\ngdb_31862,COc1ncc(cn1)O,5.13342,1.06681,0.88818,3.081,72.2,-0.2257,-0.0401,0.1855,1241.7307,0.113641,-453.994954,-453.987111,-453.986167,-454.027417,28.923,-1545.2886031841,-1553.6984788021,-1561.9954028001,-1442.2742157081\r\ngdb_44262,O=C1NC2C1CCC2=O,3.15825,1.44777,1.18754,1.4547,68.69,-0.2433,-0.0369,0.2064,1027.2839,0.125782,-437.936618,-437.929174,-437.92823,-437.968947,28.249,-1657.2864095041,-1666.835841466,-1675.725761469,-1547.810561855\r\ngdb_12678,CC1NCC(=O)C1O,2.62741,2.2579,1.51605,2.1652,63.44,-0.2324,-0.0325,0.1999,871.7213,0.142528,-401.014395,-401.006584,-401.00564,-401.046317,29.544,-1609.597608031,-1619.804669425,-1629.288212942,-1495.622520852\r\ngdb_56281,[O-]C(=O)C[NH2+]CC1CO1,5.1746,0.65402,0.6382,4.4489,69.18,-0.2522,0.0177,0.2698,1722.8779,0.147376,-476.221042,-476.211955,-476.21101,-476.256626,32.06,-1698.7465566431,-1709.041469297,-1719.116753801,-1577.686265345\r\ngdb_118816,CC1(CCO)NC1C=O,3.42085,0.86517,0.79809,4.8715,77.75,-0.239,-0.0205,0.2184,1481.9272,0.168543,-440.266799,-440.25667,-440.255726,-440.30226,36.454,-1863.792718805,-1875.211500078,-1886.473404101,-1729.5403058\r\ngdb_133500,CC(C(C)=O)C(F)(F)F,2.28565,1.3416,1.11949,2.625,58.03,-0.2602,-0.0298,0.2305,1122.3178,0.117797,-569.424434,-569.414778,-569.413834,-569.460055,33.672,-1548.6256960461,-1556.786450591,-1565.676370594,-1441.2733388531\r\ngdb_126023,CC1COC2=C1C=NO2,3.11904,1.58447,1.12308,4.5992,70.12,-0.2205,0.0056,0.226,1051.6925,0.12599,-437.871471,-437.864082,-437.863138,-437.904433,27.547,-1616.4060806811,-1625.990025638,-1634.879945641,-1507.327446229\r\ngdb_15582,CC1COCC1C#N,3.17334,1.82429,1.27822,3.0149,67.62,-0.2738,0.0279,0.3017,947.9999,0.143527,-363.884577,
-363.876758,-363.875814,-363.918343,28.271,-1664.811497432,-1675.014166263,-1684.49770978,-1552.249560521\r\ngdb_78646,CC12C3C1C(C3O)C2=O,2.96144,1.5052,1.159,3.5702,72.58,-0.2365,0.0051,0.2416,1064.6868,0.135018,-421.795893,-421.787854,-421.78691,-421.828215,30.942,-1717.584376823,-1727.6496211831,-1737.132537191,-1601.7267626441\r\ngdb_2691,CN=COCCO,9.32709,1.0204,0.96464,2.6277,62.32,-0.2554,0.0191,0.2745,1201.8264,0.13495,-362.899954,-362.891472,-362.890528,-362.933845,28.43,-1441.63290151,-1450.529724112,-1459.420899133,-1338.020497957\r\ngdb_41134,C1OC23CN4C2C=CC134,2.99485,2.24825,1.70398,0.5173,73.02,-0.2173,-0.0263,0.191,818.5226,0.124785,-400.568113,-400.56188,-400.560936,-400.598193,26.561,-1562.722058222,-1573.032031092,-1581.921951095,-1452.090966504\r\ngdb_28308,c1c(nc2n1CC=C2)N,5.10433,1.28789,1.03649,3.1059,81.04,-0.1739,-0.019,0.1548,1104.8099,0.127864,-396.975461,-396.968301,-396.967357,-397.006799,28.246,-1657.461484515,-1667.189129033,-1676.079049036,-1547.259608953\r\ngdb_25896,CC(=O)c1c[nH]cc1O,2.72853,1.52929,0.98602,5.0706,74.9,-0.2006,-0.0327,0.168,1140.5772,0.124905,-437.953775,-437.945538,-437.944594,-437.986863,30.866,-1668.0525814171,-1677.1043987421,-1685.994318745,-1559.053013099\r\ngdb_14201,OC12CC3C1CN3C2,3.65732,2.3013,2.10902,1.1294,66.78,-0.2163,0.0678,0.2842,731.0648,0.144454,-363.797131,-363.791006,-363.790061,-363.826881,25.696,-1609.938345418,-1621.2040144951,-1630.686930503,-1494.856332363\r\ngdb_94856,OC1CCC2(CCO2)C1,3.55194,1.09542,1.02035,2.3523,78.09,-0.2377,0.0702,0.3078,1259.1483,0.183145,-424.221913,-424.213226,-424.212282,-424.25624,33.192,-1984.230521175,-1997.4433506791,-2009.298250707,-1842.889138997\r\ngdb_7100,CCOC(=O)CC#N,8.05611,0.8924,0.8156,5.6532,62.99,-0.2933,-0.0066,0.2868,1347.2193,0.116846,-399.850285,-399.841382,-399.840438,-399.885499,29.572,-1506.959725955,-1514.704442033,-1523.00199354,-1408.421360176\r\ngdb_133386,c1(c(nc(=O)on1)F)N,3.21654,1.4883,1.01884,6.1398,57.13,-0.263,-0.0977,0.1653,1021.7071
,0.064842,-529.968647,-529.961395,-529.960451,-530.000742,26.663,-1147.283489826,-1152.5087572691,-1158.4336972471,-1070.303195742\r\ngdb_104602,[NH3+]CC1(CC([O-])=O)CO1,2.79672,1.18071,0.94837,10.4299,72.1,-0.1935,0.0052,0.1988,1245.302,0.144964,-476.155268,-476.14642,-476.145475,-476.189205,33.045,-1657.4727796771,-1667.9176669821,-1677.9929514861,-1535.3789810561\r\ngdb_122710,CCCOC=NCC=O,8.29565,0.42909,0.42489,4.358,82.93,-0.2404,-0.02,0.2203,2563.7213,0.167025,-440.290576,-440.279635,-440.278691,-440.329281,36.261,-1878.713000298,-1889.622244263,-1900.884148286,-1746.496226489\r\ngdb_53457,CC#CCC#CC(C)O,3.83157,0.53311,0.48191,1.6096,89.48,-0.2452,0.0367,0.2819,2293.0433,0.15531,-385.835842,-385.824371,-385.823427,-385.875799,38.462,-1878.9859667131,-1888.6753331821,-1899.3442412,-1754.804445649\r\ngdb_49387,O=CC1C2C1C1OCC21,3.96602,1.1069,1.05873,3.9892,73.13,-0.2505,-0.0233,0.2272,1160.2324,0.136164,-421.767632,-421.760386,-421.759442,-421.799576,28.065,-1699.8503449741,-1710.4132039711,-1719.896119979,-1583.7555323931\r\ngdb_85428,CC1OC(=N)OC1(C)C,2.2855,1.57525,1.24233,4.4904,76.1,-0.2475,0.0615,0.309,1120.1593,0.169892,-440.328242,-440.319117,-440.318173,-440.361633,35.038,-1902.348754292,-1914.397554601,-1925.659458624,-1766.797397657\r\ngdb_41362,C1OC2C3CNC12CO3,2.84352,1.89634,1.70763,1.9098,70.1,-0.2275,0.0503,0.2777,877.307,0.151297,-439.007968,-439.001586,-439.000642,-439.038359,26.962,-1701.7165567401,-1713.709508748,-1723.785420761,-1577.6536348771\r\ngdb_110807,CCC1C2OC(C#N)C12,2.39377,1.38125,1.05747,4.2332,75.92,-0.2608,-0.003,0.2578,1188.8789,0.147212,-401.913839,-401.905276,-401.904332,-401.947505,31.601,-1779.3256148421,-1789.9505972301,-1800.026509243,-1657.573808626\r\ngdb_40548,C1CN1C1=NCC2OC12,3.91922,1.29644,1.0814,1.0463,74.88,-0.2378,0.0003,0.2381,1116.5519,0.137329,-417.963238,-417.955896,-417.954952,-417.995401,28.086,-1661.719760589,-1672.221751213,-1681.704667221,-1545.402182313\r\ngdb_58334,CC(O)C(O)CCC=O,3.50923,0.66824,0.58794,
3.2513,76.88,-0.2462,-0.0173,0.2289,1866.8029,0.178394,-461.370595,-461.359498,-461.358554,-461.407821,39.087,-1940.8539615501,-1952.553866855,-1964.408766883,-1801.0757042911\r\ngdb_38913,C1OC2CC3OC3C2=C1,3.61107,1.55417,1.22418,1.801,73.96,-0.2333,-0.0009,0.2324,1003.2174,0.137947,-421.803891,-421.797165,-421.796221,-421.835082,26.856,-1722.6031938051,-1733.492357482,-1742.97527349,-1606.035866947\r\ngdb_119073,CCC1(CCO)CCO1,2.05203,1.16344,0.8854,1.3992,81.2,-0.2411,0.0755,0.3165,1433.2875,0.204286,-425.409447,-425.39896,-425.398015,-425.445643,38.012,-2101.5671740671,-2115.4275928591,-2128.467857388,-1948.0276444381\r\ngdb_36707,C#CCOC1CC=CC1,4.46893,0.79985,0.76487,1.2897,83.43,-0.2407,0.0275,0.2682,1530.2012,0.157869,-385.851584,-385.842575,-385.84163,-385.886527,33.153,-1888.8642133911,-1900.098507018,-1910.766787527,-1761.536362201\r\ngdb_60652,NC(=O)C(O)C1CCC1,2.91222,1.11614,0.86414,4.002,75.74,-0.2551,0.0196,0.2747,1348.5211,0.170657,-440.310289,-440.300732,-440.299788,-440.345296,35.311,-1891.083085215,-1902.860801636,-1914.122705659,-1756.545783124\r\ngdb_101843,CCC(O)CCN1CC1,4.85659,0.59847,0.55919,1.0223,87.27,-0.2303,0.0719,0.3021,2061.1326,0.21564,-405.520144,-405.50933,-405.508386,-405.557021,39.173,-2158.7595993451,-2173.303375438,-2186.937263481,-1998.9098392121\r\ngdb_130719,c1cc(=O)c(=O)[nH]nc1,2.64183,1.76719,1.05888,4.6958,73.69,-0.2577,-0.101,0.1567,1025.7612,0.091133,-452.764115,-452.756976,-452.756032,-452.79659,26.132,-1400.777673047,-1407.852209513,-1414.963141501,-1311.141777451\r\ngdb_77502,CC1C2(C)CC1(O)C2O,1.89932,1.70819,1.27489,1.6368,78.98,-0.245,0.0713,0.3163,1131.8349,0.180417,-424.184017,-424.174256,-424.173312,-424.217663,37.439,-1960.4504401111,-1972.9893249491,-1984.8442249771,-1818.6817243041\r\ngdb_100327,CC(CC#N)(CO)CO,2.59682,1.0947,0.94785,2.8096,74.85,-0.2719,0.0333,0.3052,1300.8245,0.169199,-440.295213,-440.284803,-440.283859,-440.330511,37.848,-1881.6227595311,-1892.865210775,-1904.127114798,-1747.268062559\r\ngdb_32
96,OC1CN2CCC12,4.65397,2.66529,2.15405,1.5127,59.65,-0.2097,0.0599,0.2697,657.5896,0.138048,-325.720699,-325.714314,-325.71337,-325.751034,24.264,-1465.8246284781,-1476.037964962,-1484.929139983,-1360.236826593\r\ngdb_126647,CCCNc1cnn[nH]1,6.4188,0.68067,0.62323,5.9521,79.21,-0.2123,0.0171,0.2294,1772.9501,0.160614,-415.382968,-415.373832,-415.372888,-415.417851,32.81,-1763.886383406,-1775.039100863,-1785.708008881,-1635.544477672\r\ngdb_81493,CC1(C#N)C2OC1C2O,2.27388,1.96312,1.20194,3.1941,69.56,-0.2648,0.0211,0.286,1007.2373,0.122515,-437.818334,-437.810074,-437.80913,-437.850769,31.357,-1583.062134948,-1592.099519566,-1600.989439569,-1473.652803253\r\ngdb_97552,CCC#CC(C)(O)CC,2.96644,0.71267,0.65965,1.7794,92.0,-0.2466,0.0456,0.2922,1880.7384,0.202,-388.277454,-388.265674,-388.26473,-388.315469,42.046,-2155.416231393,-2168.4659085571,-2181.506800595,-2003.2741643071\r\ngdb_128035,CN=COc1ccn[nH]1,4.68255,1.00185,0.82952,3.6855,76.44,-0.2226,-0.0074,0.2152,1347.1907,0.125664,-434.079505,-434.071447,-434.070503,-434.112659,29.325,-1586.074178148,-1595.237692075,-1604.127612078,-1476.755835258\r\ngdb_38652,C1C2OC=NC3CN1C23,2.81796,2.17957,1.53719,1.0573,70.6,-0.2229,0.0162,0.2392,868.1634,0.139952,-417.968188,-417.961929,-417.960985,-417.998864,25.586,-1664.825930139,-1676.0075130101,-1685.490429018,-1547.5752459801\r\ngdb_25525,N=C1NC(NC=O)=CO1,3.73595,1.17708,0.89507,5.8363,69.89,-0.1905,-0.0199,0.1705,1197.2047,0.101882,-470.046504,-470.038638,-470.037694,-470.079489,29.044,-1429.03252079,-1436.538783448,-1444.242711441,-1332.723066979\r\ngdb_72505,CC12COC(C=C1)C2=O,2.30979,1.91051,1.69157,2.5708,72.38,-0.229,-0.0145,0.2145,916.4157,0.136328,-421.837559,-421.830001,-421.829057,-421.869247,29.537,-1743.7301668171,-1754.097243006,-1763.580159014,-1627.474711932\r\ngdb_21640,c1cc(nnc1N)N,5.77314,1.5875,1.24866,3.9084,69.73,-0.2001,-0.0342,0.1659,916.1572,0.10982,-374.931066,-374.924183,-374.923239,-374.961676,26.968,-1384.847102064,-1392.970206069,-1400.674761571,-12
89.692892322\r\ngdb_115396,OCC1CCC(O)CO1,3.12808,1.07554,0.94628,1.4936,74.3,-0.2554,0.0694,0.3248,1306.5446,0.183797,-461.368752,-461.359889,-461.358945,-461.402184,34.42,-1939.697462463,-1952.7992228741,-1964.6541229021,-1797.538436058\r\ngdb_55628,CC(=N)OC(C)(C)C#C,2.70036,1.12637,0.9682,1.0753,82.89,-0.2575,0.0274,0.2849,1324.2292,0.166446,-403.129146,-403.118451,-403.117507,-403.1645,39.292,-1914.090075191,-1925.154313879,-1936.416217902,-1780.026542395\r\ngdb_19057,OC1CC2CN3C2C13,3.51597,2.30324,2.08072,3.074,66.16,-0.2245,0.0659,0.2904,738.7105,0.144922,-363.815794,-363.809552,-363.808608,-363.845774,25.608,-1621.649545885,-1632.8417964091,-1642.325339926,-1506.7118599001\r\ngdb_25443,c1cn(ccc1=O)C=O,5.12815,1.06375,0.881,3.4112,77.28,-0.2408,-0.0676,0.1732,1206.2041,0.102872,-436.762391,-436.755294,-436.754349,-436.794182,26.688,-1548.300018875,-1556.2900909721,-1563.993391456,-1451.857533156\r\ngdb_20667,Cc1c[nH]nc1C#N,3.44641,1.99392,1.27307,6.2987,66.27,-0.2629,-0.0278,0.2351,885.6444,0.097532,-357.686524,-357.679372,-357.678427,-357.718595,25.242,-1380.341587444,-1387.407966293,-1394.518898281,-1293.097756156\r\ngdb_69422,CC12CN1C(CC#N)C2,4.38504,0.93514,0.87044,5.4384,80.13,-0.2428,0.0332,0.276,1367.3801,0.159137,-382.04658,-382.037803,-382.036858,-382.080348,32.833,-1850.350848516,-1861.7300967221,-1872.398377231,-1721.9254840851\r\ngdb_14422,CC1C2C3OC1C23C,4.13032,1.82181,1.64377,1.8782,71.23,-0.2251,0.0831,0.3083,874.3057,0.153527,-347.744848,-347.737515,-347.736571,-347.775966,28.591,-1725.734463715,-1737.131282173,-1747.207821695,-1605.133509005\r\ngdb_116383,CCC1CCCOC1=O,2.75665,1.27699,0.93287,4.1762,79.05,-0.2566,0.0101,0.2668,1302.9366,0.183575,-424.280603,-424.271603,-424.270658,-424.314772,33.642,-2021.0590243851,-2034.0754435721,-2045.9297160911,-1879.6184957851\r\ngdb_91958,OC1CC2CC2C11CC1,2.37956,1.68639,1.38038,1.4576,81.73,-0.2494,0.0749,0.3243,1052.4765,0.184121,-387.075919,-387.067927,-387.066983,-387.107847,32.912,-2029.293824992,-204
2.9434007601,-2054.798300788,-1886.7030723951\r\ngdb_62677,CC1(C)C2C3CN3C12C,2.36947,1.70871,1.29166,1.5194,86.41,-0.2121,0.0695,0.2816,1104.0923,0.193079,-367.122779,-367.113735,-367.112791,-367.155637,36.09,-2046.427958237,-2060.3059472811,-2072.753843314,-1897.6832248771\r\ngdb_79156,CC1OC2C3C1C2N3C,3.90347,1.31167,1.16106,1.9219,78.93,-0.214,0.0878,0.3019,1129.4524,0.171311,-403.077925,-403.070148,-403.069204,-403.109795,31.22,-1881.9484367021,-1894.8437466521,-1906.105650675,-1745.6986625501\r\ngdb_77410,CC12C(O)C1(O)C1CN21,2.56092,1.66619,1.35612,2.527,72.08,-0.2305,0.0506,0.2811,1007.3121,0.146385,-439.009288,-439.000811,-438.999867,-439.041654,33.12,-1702.5448686201,-1713.223189273,-1723.299101286,-1579.721277032\r\ngdb_74277,CC1=CC(OC1=O)(C)C,2.70285,1.32896,1.11164,4.4887,78.98,-0.2623,-0.0326,0.2296,1192.5985,0.157386,-423.101073,-423.091771,-423.090827,-423.134806,34.631,-1908.7449535291,-1919.7947595101,-1930.463667528,-1780.4017927771\r\ngdb_95293,CC1OCC2CCC12O,2.0181,1.81015,1.43388,1.1492,77.37,-0.2455,0.0669,0.3124,1055.827,0.182723,-424.225723,-424.217031,-424.216087,-424.25877,33.974,-1986.6213304651,-1999.8310224241,-2011.685922452,-1844.4767367671\r\ngdb_118906,COCC1(O)CC=CC1,3.04339,1.05293,0.98402,2.0696,80.02,-0.236,0.0287,0.2647,1330.6333,0.181746,-424.227559,-424.21816,-424.217216,-424.262151,35.143,-1987.773436989,-2000.5394800851,-2012.394380113,-1846.5983446961\r\ngdb_1328,CC1(C)CC1C=O,4.56847,1.69259,1.58646,3.3742,66.52,-0.243,-0.0155,0.2275,857.5795,0.146416,-309.727166,-309.719292,-309.718348,-309.75884,28.845,-1618.486900525,-1628.655056361,-1638.139227387,-1507.3619592241\r\ngdb_98999,COC(=N)CC(=O)CO,4.4525,0.73296,0.66969,2.6966,70.84,-0.266,-0.0406,0.2254,1666.2226,0.145087,-476.239971,-476.229929,-476.228985,-476.276614,34.766,-1710.6246745041,-1720.320316063,-1730.396228076,-1590.228915237\r\ngdb_19054,CN1CC2OC3C2C13,4.36449,2.0913,1.78814,2.2269,67.11,-0.1987,0.0819,0.2807,784.016,0.143613,-363.799574,-363.793216,-363.792272,-
363.829824,25.573,-1611.471349905,-1622.590809385,-1632.074352902,-1496.70309135\r\ngdb_78043,CC1C2C(C1=O)C2(C)C,2.75055,1.24274,1.10417,3.3034,84.19,-0.2329,-0.0077,0.2252,1237.0394,0.180616,-387.09452,-387.085066,-387.084121,-387.128147,36.048,-2040.966119901,-2053.6982775111,-2065.55255003,-1899.4415050951\r\ngdb_82264,NC1C2OC2C=C1C#N,3.21969,1.40491,1.08228,3.2101,72.79,-0.2544,-0.0649,0.1895,1065.4361,0.114172,-416.799128,-416.791531,-416.790586,-416.8313,28.752,-1559.081878513,-1567.646748854,-1575.943045343,-1456.14090959\r\ngdb_122725,CCCOCC#CC#C,8.89508,0.4329,0.4171,1.5339,96.31,-0.2435,-0.022,0.2215,2564.7658,0.155443,-385.827119,-385.816353,-385.815409,-385.864322,37.082,-1873.512205706,-1883.64396602,-1894.312874038,-1747.602524856\r\ngdb_54013,CC(=N)N(C=O)C(=N)N,2.18202,1.58329,0.97878,2.0238,73.5,-0.2473,-0.0305,0.2169,1186.4647,0.136245,-451.360697,-451.351654,-451.35071,-451.394929,33.474,-1613.577897618,-1623.011240415,-1632.494156423,-1497.942421625\r\ngdb_82733,OC1C2CN3C2C3C1=O,2.79589,1.90285,1.42476,2.4305,67.87,-0.2469,-0.0386,0.2083,907.2266,0.125792,-437.855577,-437.848727,-437.847783,-437.886728,27.383,-1606.432452635,-1616.3546249431,-1625.244544946,-1496.217399384\r\ngdb_128130,CNc1c(cnn1C)N,1.93073,1.87367,1.02154,2.907,80.22,-0.1826,0.0317,0.2144,1188.2025,0.160544,-415.38901,-415.379269,-415.378325,-415.424066,34.69,-1767.677792784,-1778.4508672961,-1789.119775314,-1639.444446107\r\ngdb_90710,CC1(C)OC2C(O)CC12,2.89107,1.34538,1.32001,2.6127,77.57,-0.2435,0.0653,0.3088,1108.5645,0.181638,-424.204214,-424.195432,-424.194488,-424.23741,34.734,-1973.124239384,-1986.277455533,-1998.132355561,-1831.073144527\r\ngdb_99048,CC(OC(=O)CN)C#N,3.15032,0.87716,0.78426,3.5953,70.22,-0.2576,-0.0069,0.2508,1492.1461,0.134049,-455.183065,-455.172813,-455.171869,-455.220702,34.607,-1662.9873287691,-1671.662640694,-1681.145556702,-1549.848711087\r\ngdb_74940,CC1=CC2OC3CC1C23,3.14747,1.68746,1.36641,2.0376,79.56,-0.227,0.0083,0.2353,995.4377,0.161159,-385.
872101,-385.865018,-385.864073,-385.903302,29.004,-1901.738815544,-1914.1816915051,-1924.849972014,-1772.062825676\r\ngdb_32160,c1cc(cc(c1)N)CO,2.74604,1.23899,0.87321,2.278,83.99,-0.2008,0.0032,0.204,1299.8497,0.149702,-402.012151,-402.003726,-402.002782,-402.045329,32.568,-1841.01727965,-1851.72885828,-1861.804770293,-1718.959249042\r\ngdb_128608,COc1conc1C#N,2.28993,1.594,0.9455,6.2177,67.44,-0.2613,-0.0611,0.2002,1145.0875,0.089249,-452.72697,-452.719195,-452.71825,-452.759828,27.473,-1377.468851242,-1384.1442919841,-1391.2545964631,-1288.073291593\r\ngdb_33702,N#CC12C=CC3C1CN23,3.2234,1.8327,1.3499,4.446,73.8,-0.2418,-0.0343,0.2076,913.1794,0.114373,-379.595365,-379.588909,-379.587964,-379.626014,26.256,-1567.894614909,-1577.1761005281,-1585.472397017,-1464.253973451\r\ngdb_8878,CCC(=O)C(C)OC,3.25629,1.29227,0.98249,2.587,72.29,-0.2398,-0.0101,0.2297,1245.5803,0.173166,-386.157013,-386.146754,-386.145809,-386.193134,35.281,-1847.3532380231,-1858.690443126,-1869.952347149,-1716.264725396\r\ngdb_26258,CC(C)c1nccn1C,2.50547,1.38047,1.05981,3.5992,85.04,-0.2097,0.0364,0.246,1238.399,0.182914,-383.320765,-383.310999,-383.310055,-383.356407,35.173,-2022.061783767,-2034.59753106,-2046.452431088,-1881.44140943\r\ngdb_64034,CC1(O)CC=CC1C#N,2.28649,1.58785,1.1875,4.5377,76.61,-0.2647,-0.0009,0.2638,1088.469,0.146914,-401.976351,-401.967569,-401.966625,-402.009613,33.399,-1818.5524574501,-1829.0400153671,-1839.11592738,-1696.547137598\r\ngdb_33150,CCCn1ccc(c1)N,4.54615,0.78694,0.71528,1.7649,87.58,-0.1708,0.053,0.2238,1653.6188,0.183466,-383.30426,-383.294519,-383.293575,-383.339611,35.958,-2011.704747722,-2024.25618274,-2036.111082768,-1870.901768266\r\ngdb_30251,CC1CN2C=CC(C)=C12,2.53877,1.55992,1.03248,1.8632,88.33,-0.1913,0.0558,0.2471,1204.4537,0.171689,-366.02585,-366.016981,-366.016037,-366.059971,32.842,-1985.94675829,-1998.157455921,-2009.419359944,-1851.365530569\r\ngdb_100337,CCC(O)(CO)CC#N,2.51737,1.12102,0.95284,3.8912,75.3,-0.2828,0.0321,0.3149,1314.5933,0.1
67926,-440.298912,-440.288076,-440.287132,-440.334881,38.739,-1883.943915322,-1894.919047732,-1906.180951755,-1750.010276889\r\ngdb_71071,CC12OC3=NCCC1C23,3.63538,1.59982,1.31727,3.744,76.58,-0.2266,-0.022,0.2046,995.1219,0.148118,-401.875047,-401.867755,-401.866811,-401.906251,29.642,-1754.983285714,-1766.405832041,-1776.481744054,-1631.68655234\r\ngdb_44942,N=C1OC2CC1CCO2,2.64562,1.77402,1.59618,2.0015,71.25,-0.2643,0.0223,0.2866,928.2038,0.151902,-439.118739,-439.112007,-439.111063,-439.14967,27.639,-1771.226356179,-1782.999680037,-1793.07559205,-1647.502289176\r\ngdb_73367,CC12CC3CC1C3CO2,2.46927,2.09055,1.579,1.4558,80.71,-0.2335,0.0914,0.3249,951.4309,0.185561,-387.092834,-387.085792,-387.084848,-387.123667,30.134,-2039.908139727,-2054.1538490451,-2066.008749073,-1896.6302647751\r\ngdb_124817,CC(O)CC1=CN=NN1,4.50742,0.94751,0.81357,6.3444,74.12,-0.243,0.0118,0.2548,1389.084,0.148658,-435.273541,-435.264911,-435.263967,-435.307249,31.865,-1707.4908945581,-1718.072578825,-1728.148490838,-1585.149229882\r\ngdb_3758,Cn1c(ccn1)O,4.05076,3.42009,1.87653,2.7379,55.69,-0.2145,0.0402,0.2546,645.1519,0.103009,-340.65257,-340.645874,-340.64493,-340.6836,23.555,-1274.811516387,-1282.163411831,-1289.275598837,-1189.408168996\r\ngdb_19634,C#CC12C3C4C3N1C24,6.25633,1.75385,1.68388,2.0248,68.46,-0.2198,0.0137,0.2335,768.8712,0.094697,-324.161603,-324.155586,-324.154642,-324.191456,24.077,-1348.4998982571,-1356.2797548391,-1363.391314336,-1260.4804658451\r\ngdb_48859,O=CC12CC3=NCC1N23,4.33545,1.39308,1.20086,2.1859,72.38,-0.248,-0.0598,0.1883,993.6232,0.112506,-416.710365,-416.703429,-416.702485,-416.741766,26.707,-1503.3822971461,-1512.361950936,-1520.658874934,-1399.9575187841\r\ngdb_91777,CC12CC1CC(O)C2=O,2.38265,1.70564,1.24547,1.8957,75.33,-0.2328,-0.021,0.2118,1064.5003,0.159347,-423.048923,-423.040314,-423.03937,-423.081909,33.004,-1876.0203591791,-1887.5050288971,-1898.1739369151,-1747.2084492041\r\ngdb_2510,COC(C#N)C#N,3.40959,2.35867,1.45933,4.3677,51.87,-0.3184,-0.0
403,0.2781,775.2676,0.076999,-339.426816,-339.419476,-339.418532,-339.458581,24.111,-1133.491469515,-1138.662143675,-1144.588338671,-1061.920303011\r\ngdb_106986,CCC12CC1C(O)C2=O,2.9453,1.26738,1.01645,1.9747,76.39,-0.2279,-0.0152,0.2127,1218.9429,0.157757,-423.018878,-423.009607,-423.008663,-423.053151,34.069,-1857.166851274,-1868.2361100341,-1878.9050180521,-1729.1625453821\r\ngdb_15614,CN=C1C(=O)OCO1,3.63381,1.87513,1.25651,4.318,58.99,-0.2732,-0.0544,0.2188,907.5533,0.095297,-435.771982,-435.764557,-435.763612,-435.804598,25.024,-1321.490655879,-1328.385097262,-1335.49602925,-1234.436959818\r\ngdb_48239,C(=O)C(=O)N(C=O)C=O,2.00122,1.66864,0.98418,2.3144,58.95,-0.2772,-0.1232,0.154,1106.3001,0.074069,-509.808232,-509.799511,-509.798566,-509.84264,28.812,-1304.056573332,-1309.249210307,-1315.766518781,-1222.115193094\r\ngdb_131927,c1c(oc(=O)o1)N(=O)=O,4.61928,1.17844,0.93891,2.8556,54.87,-0.2909,-0.1158,0.1751,1087.0381,0.053253,-545.643999,-545.637264,-545.63632,-545.675969,23.622,-1064.6656548861,-1069.326791738,-1074.658735711,-994.3093458061\r\ngdb_70618,CC12NC1CCC21CN1,2.33667,1.84524,1.29265,2.3438,81.91,-0.2318,0.0647,0.2964,1061.1098,0.184695,-383.237876,-383.229721,-383.228777,-383.270124,32.932,-1970.0481902661,-1983.594854558,-1995.449754586,-1827.298050383\r\ngdb_123170,COCCCCN1CC1,7.93001,0.44155,0.43303,0.8831,88.53,-0.2313,0.0893,0.3206,2566.374,0.215874,-405.5059,-405.49504,-405.494096,-405.543396,37.933,-2149.8213611491,-2164.3362718281,-2177.970159871,-1990.360029087\r\ngdb_69253,CC12CC1C(=O)CCC2,2.47501,1.50354,1.1384,3.435,82.9,-0.2292,-0.0047,0.2244,1157.4817,0.18389,-387.124084,-387.115703,-387.114759,-387.156765,33.241,-2059.517795977,-2072.9232707441,-2084.778170772,-1917.3995576571\r\ngdb_102444,CCN(CC)C(=O)C#N,1.96468,1.37536,0.91754,5.2424,78.82,-0.2647,-0.0468,0.2179,1321.9602,0.157884,-419.249085,-419.238801,-419.237857,-419.285164,35.751,-1840.748705798,-1851.1816704321,-1861.85057845,-1713.517490994\r\ngdb_77837,CN1C2C(C#C)C1C2=O,3.4
597,1.27185,1.05208,1.5296,77.48,-0.2491,-0.0055,0.2436,1130.4096,0.122103,-400.647018,-400.638756,-400.637812,-400.679668,31.329,-1612.235655867,-1621.2724129761,-1630.162332979,-1503.217262279\r\ngdb_120158,CC12CN(C1)C2CCO,3.03394,0.96274,0.84817,2.903,83.17,-0.2241,0.0743,0.2985,1451.1399,0.193898,-404.265336,-404.255866,-404.254922,-404.300334,35.433,-1999.207905987,-2012.8179486881,-2025.265844721,-1851.5500182151\r\ngdb_35223,N#CC1CC11COC=N1,3.4594,1.25539,1.0717,3.161,71.57,-0.2541,-0.0082,0.2459,1096.138,0.114048,-416.821776,-416.814291,-416.813347,-416.854256,27.663,-1573.293702345,-1581.9288536941,-1590.225777692,-1470.5460061941\r\ngdb_13943,OC1CCC11COC1,3.28331,1.81884,1.46864,3.1866,67.42,-0.2382,0.0654,0.3035,909.2894,0.153734,-384.900087,-384.892305,-384.891361,-384.932403,28.993,-1686.472480603,-1697.5869200111,-1707.663459533,-1566.3672580031\r\ngdb_91411,CC1OC2CC(C2)C1=O,2.53219,1.78597,1.30273,2.3336,75.0,-0.2383,-0.016,0.2222,1023.5023,0.160084,-423.036401,-423.02856,-423.027616,-423.068866,30.747,-1868.1626914811,-1880.1292881111,-1890.798196129,-1739.023849317\r\ngdb_89738,CC1OC2(CC2)C1(C)C,2.21781,1.48792,1.24235,1.5954,85.62,-0.2219,0.0799,0.3017,1187.6331,0.203039,-388.273124,-388.263139,-388.262194,-388.307762,38.379,-2152.699117423,-2166.8751732421,-2179.915437771,-1998.437952444\r\ngdb_106150,COC12C3CC1CC23C,2.35227,1.6318,1.26116,0.9253,83.19,-0.2242,0.0926,0.3168,1119.9803,0.182032,-387.031856,-387.023095,-387.022151,-387.065037,33.663,-2001.6438959251,-2014.8109172721,-2026.6658173,-1859.839412105\r\ngdb_70588,OC12CC1COC1CC21,2.51508,1.82537,1.32103,1.8155,74.2,-0.2341,0.0783,0.3125,1002.8631,0.160215,-422.991838,-422.984296,-422.983352,-423.023245,31.23,-1840.199007914,-1852.353229735,-1863.0221377531,-1710.396261228\r\ngdb_69328,CC12COC(C)(C)C1N2,2.86563,1.34854,1.27369,1.3096,81.41,-0.2419,0.0814,0.3233,1150.112,0.193433,-404.349017,-404.339741,-404.338797,-404.382128,36.289,-2051.7184866161,-2065.450266063,-2077.898162096,-1902.876
4893611\r\ngdb_102673,CCC(CC)OCC#C,1.78873,1.08086,0.72459,1.2235,88.83,-0.2473,0.0334,0.2807,1641.7397,0.201608,-388.253255,-388.241731,-388.240786,-388.291344,40.701,-2140.2311411021,-2153.4414605701,-2166.481725099,-1988.135509682\r\ngdb_77445,CC12OC(=O)C1(O)C2O,2.44538,1.56174,1.24391,2.9407,62.41,-0.2593,-0.0053,0.2539,1025.1532,0.109792,-494.885607,-494.876921,-494.875977,-494.918603,31.979,-1500.8716336371,-1508.752519168,-1517.049443166,-1398.295874952\r\ngdb_17112,COC1CC2CC1O2,4.79304,1.54225,1.46395,2.548,67.55,-0.2387,0.0893,0.328,928.4875,0.154746,-384.903425,-384.89632,-384.895376,-384.934791,27.32,-1688.5671056451,-1700.106368646,-1710.182908168,-1567.8657494951\r\ngdb_56243,C(CNC(=O)N)C(=O)N,4.50515,0.68755,0.62302,2.8876,69.6,-0.2423,0.0333,0.2756,1730.7117,0.147468,-472.480847,-472.470463,-472.469519,-472.517722,36.47,-1700.901422549,-1710.381828521,-1720.457740534,-1580.291055204\r\ngdb_21959,C(CO)c1nnon1,7.06973,1.0637,0.98647,2.3369,55.71,-0.2871,-0.0734,0.2136,1108.8661,0.094864,-431.857319,-431.84997,-431.849026,-431.890141,25.142,-1214.1652815731,-1221.106786131,-1228.218345628,-1126.8806622\r\ngdb_80200,CC1C2OC(=N)C12C#N,2.22559,1.61664,1.20821,4.809,72.85,-0.2869,-0.0127,0.2743,1056.8814,0.112182,-416.791942,-416.783887,-416.782943,-416.824636,29.751,-1554.572598839,-1562.850070058,-1571.146994056,-1451.959189614\r\ngdb_84462,CC1C=CC2(NC12)C#N,4.08647,1.19203,1.0449,3.6721,79.35,-0.2563,-0.0175,0.2388,1154.8932,0.137228,-380.868839,-380.860968,-380.860024,-380.901064,30.344,-1739.159391261,-1749.3300571331,-1758.812973141,-1623.1367422151\r\ngdb_121687,C(COC(=O)CC#N)O,2.06251,1.17056,0.78185,2.874,66.17,-0.2812,-0.0153,0.2659,1399.7785,0.122284,-475.060667,-475.050929,-475.049985,-475.097757,32.727,-1598.4524206821,-1606.561719489,-1615.451639492,-1491.70811971\r\ngdb_91015,CC12CN1C1CC(O)C21,2.70536,1.57142,1.344,2.3914,78.05,-0.2146,0.061,0.2756,1051.267,0.171009,-403.075813,-403.067622,-403.066678,-403.107981,32.853,-1880.623137694,-1893.2
58658918,-1904.520562941,-1744.560361224\r\ngdb_36124,N#CCC12CC=CC1N2,3.99515,1.13492,1.00218,4.4761,76.52,-0.2445,-0.0071,0.2374,1176.5723,0.13764,-380.871823,-380.864093,-380.863149,-380.9045,29.543,-1741.031878117,-1751.2910227581,-1760.773938766,-1625.292863139\r\ngdb_80980,CC1C2CC1(C)C=CC2,2.37021,1.74613,1.29932,0.2006,90.03,-0.2278,0.0253,0.2531,1106.2613,0.207149,-351.164302,-351.155894,-351.15495,-351.196294,34.819,-2221.0881857881,-2236.2544508091,-2249.295342847,-2065.4226556671\r\ngdb_2765,CC1CCC1C=O,3.72207,2.2633,1.60962,2.9436,64.62,-0.2434,-0.0203,0.223,802.9428,0.147698,-309.720334,-309.712874,-309.711929,-309.752146,26.966,-1614.199759037,-1624.627703599,-1634.111247116,-1503.161413978\r\ngdb_75666,CC1C(=O)N2CC12C#N,2.85207,1.3418,1.11999,3.5653,71.15,-0.2722,-0.0427,0.2294,1091.6708,0.111871,-416.800743,-416.79259,-416.791646,-416.833584,29.804,-1560.0953055481,-1568.311280885,-1576.608204883,-1457.574140146\r\ngdb_48434,O=CC1(CCC#C)CO1,2.8579,0.95898,0.78695,2.4315,75.45,-0.2618,-0.0442,0.2176,1432.6115,0.132299,-421.779692,-421.770235,-421.769291,-421.814774,33.716,-1707.4181035141,-1716.5935401121,-1726.07645612,-1593.292414175\r\ngdb_58158,CC(C#CC=O)N(C)C,3.23854,0.72955,0.66357,3.9434,88.04,-0.2192,-0.0587,0.1605,1747.8872,0.167275,-403.108642,-403.09766,-403.096716,-403.145862,37.74,-1901.223630655,-1912.1077742601,-1923.369678283,-1768.3310296531\r\ngdb_14608,N=COCCCC#C,12.96436,0.6125,0.5913,3.345,73.94,-0.2633,0.0195,0.2827,1789.1568,0.139747,-363.824694,-363.815188,-363.814244,-363.860893,31.893,-1627.234375985,-1636.378437133,-1645.86198065,-1516.199168471\r\ngdb_78176,CC1NC(=O)C2C1N2C,2.10771,1.7742,1.23301,3.9447,76.15,-0.2323,0.0194,0.2517,1088.0921,0.159909,-419.222836,-419.214249,-419.213305,-419.255886,32.628,-1824.277222057,-1835.775069464,-1846.443977482,-1695.145282492\r\ngdb_76948,CC1C(C=O)C11CCO1,2.29029,1.26947,1.01026,3.743,78.29,-0.2331,-0.0197,0.2135,1251.6318,0.157295,-423.002566,-422.993393,-422.992449,-423.037024,33.15
2,-1846.9309244661,-1858.0616791081,-1868.7305871261,-1719.042707739\r\ngdb_50758,C1CCN(C=CC1)C=O,3.1064,1.18979,0.91046,3.8738,85.27,-0.2188,-0.0011,0.2177,1292.5142,0.173639,-403.190588,-403.182309,-403.181365,-403.223811,31.621,-1952.645483169,-1965.225783601,-1976.487687624,-1817.244728694\r\ngdb_85969,[NH3+]C1CC(O)(C1)C([O-])=O,3.34626,1.19638,1.02645,5.7341,68.02,-0.2528,0.0152,0.2679,1175.3242,0.148016,-476.244887,-476.236346,-476.235402,-476.278216,33.03,-1713.7095087481,-1724.3470413161,-1734.4229533291,-1591.234184655\r\ngdb_9945,OC1C(O)C1(O)C#N,3.5696,1.55505,1.47566,3.5559,55.3,-0.2757,-0.0069,0.2688,870.1806,0.094123,-435.712226,-435.704575,-435.703631,-435.743902,28.399,-1283.9932280751,-1290.745852424,-1297.857411921,-1196.349673554\r\ngdb_21671,C#Cc1cc(no1)N,7.68219,1.30267,1.11524,3.2172,68.3,-0.234,-0.0385,0.1955,982.305,0.083986,-377.483613,-377.476681,-377.475737,-377.514525,26.087,-1265.28404724,-1271.599925325,-1278.118488817,-1184.050497154\r\ngdb_114177,OCC1CC2(CN12)C=O,4.55125,0.93563,0.86626,2.1785,73.91,-0.2527,-0.0386,0.2141,1330.7106,0.146727,-439.040573,-439.031865,-439.030921,-439.074308,32.109,-1722.1764876851,-1732.7098537591,-1742.785765772,-1600.2119559181\r\ngdb_115475,CCC1=CC(CO)CC1,3.3649,0.84437,0.71992,1.2297,89.57,-0.2235,0.0335,0.257,1654.3459,0.20573,-388.309382,-388.299339,-388.298395,-388.344858,36.962,-2175.4513387451,-2189.5909990421,-2202.63189108,-2021.7160263081\r\ngdb_90087,CC1C2CC11CN(C)C21,3.3856,1.2678,1.17773,0.6588,89.31,-0.205,0.0762,0.2812,1190.0443,0.194197,-367.076302,-367.067995,-367.06705,-367.10855,33.606,-2017.2632224441,-2031.6036856211,-2044.050954145,-1868.1357085941\r\ngdb_35762,N#CC1COC2CC1C2,3.33309,1.34997,1.17368,3.4134,75.09,-0.2585,0.027,0.2855,1072.3307,0.150541,-401.94769,-401.940429,-401.939485,-401.979408,28.798,-1800.567422001,-1812.009421107,-1822.08533312,-1677.593228253\r\ngdb_71310,CC12CC3C1C(C#N)C23,3.27494,1.47444,1.17251,4.0077,80.83,-0.2968,0.0279,0.3247,1062.5455,0.148177,-364.76
084,-364.753218,-364.752274,-364.792713,30.01,-1819.993218114,-1831.2093139801,-1841.285225993,-1697.3723119331\r\ngdb_34964,N#CC1C2CC3OC1C23,3.22046,1.5601,1.49,5.1464,71.49,-0.259,0.0199,0.2789,928.0864,0.126364,-400.710609,-400.70398,-400.703036,-400.741737,26.294,-1652.139580686,-1662.2010599921,-1671.090979995,-1542.1661184\r\ngdb_112239,CCC1CC(=O)C1C=O,1.95821,1.41857,0.9104,2.6434,77.33,-0.2465,-0.037,0.2095,1303.7804,0.156276,-423.046657,-423.037109,-423.036165,-423.082067,34.008,-1874.5984237851,-1885.4938625521,-1896.16277057,-1747.3075956261\r\ngdb_93964,OC1COC(C1)C1CC1,3.99942,1.02941,0.97928,1.173,77.67,-0.2486,0.0727,0.3213,1291.46,0.183429,-424.214738,-424.206017,-424.205073,-424.248627,33.352,-1979.7281441001,-1992.919638298,-2004.7745383261,-1838.11191298\r\ngdb_95829,CC1CCCC(O)CO1,2.64172,1.20747,0.93399,0.7135,81.63,-0.2395,0.0758,0.3153,1336.6642,0.206586,-425.435298,-425.42597,-425.425026,-425.468932,36.443,-2117.7889092261,-2132.3766109491,-2145.417502987,-1962.6417015391\r\ngdb_119454,OCCC12CC1C(=O)O2,4.25713,0.86877,0.79205,3.0379,69.26,-0.2702,0.0002,0.2704,1418.4425,0.134842,-458.956035,-458.94743,-458.946486,-458.990083,31.439,-1681.3990703381,-1691.1085170951,-1700.591433103,-1566.3685130211\r\ngdb_2559,C#CC1(CO1)C#C,3.76122,2.59832,1.70992,1.686,59.38,-0.2651,-0.0069,0.2583,704.2482,0.075031,-306.012858,-306.005974,-306.00503,-306.043396,25.208,-1171.280061495,-1176.738134777,-1182.664329773,-1099.555155286\r\ngdb_76133,CC1(C)C([NH3+])C1C([O-])=O,2.21035,1.35849,1.12614,6.4186,76.11,-0.2443,0.0266,0.2709,1197.0741,0.170633,-440.316364,-440.306887,-440.305943,-440.350148,35.899,-1894.8952023901,-1906.7231195311,-1917.985023554,-1759.590456792\r\ngdb_117860,CC(CCO)C#CC#C,3.76435,0.57748,0.51383,2.3691,95.21,-0.24,-0.0083,0.2316,2080.5655,0.155918,-385.834694,-385.823906,-385.822962,-385.871306,38.167,-1878.2655863811,-1888.3835414971,-1899.052449515,-1751.985047712\r\ngdb_126977,CCNCc1nnon1,5.39935,0.70472,0.68198,3.3206,71.78,-0.243,-0.06
9,0.174,1663.6008,0.135307,-451.260685,-451.251865,-451.250921,-451.295887,30.565,-1550.81946751,-1560.392744814,-1569.875660822,-1435.792675247\r\ngdb_121712,COCCC(=O)OC=N,7.22948,0.56087,0.52559,1.7632,73.24,-0.2627,-0.0247,0.238,2043.0061,0.144721,-476.242988,-476.232671,-476.231727,-476.280066,34.174,-1712.5178691571,-1722.0409457411,-1732.116857754,-1592.395076305\r\ngdb_102829,OCC(CC=O)N1CC1,1.98039,1.38075,0.923,0.4002,76.04,-0.245,-0.0254,0.2196,1297.5737,0.169082,-440.255948,-440.246313,-440.245369,-440.291004,34.77,-1856.9836186461,-1868.712389365,-1879.974293388,-1722.477064496\r\ngdb_2804,COC1CCC1O,3.96982,2.2858,1.66922,2.4748,60.55,-0.2471,0.078,0.3251,788.2389,0.147951,-346.829695,-346.822193,-346.821249,-346.861198,27.509,-1546.1489180231,-1556.549879698,-1566.034050724,-1434.660649011\r\ngdb_44405,O=C1CN2C3CC12C=C3,3.56223,1.52895,1.50978,3.0707,74.78,-0.2279,-0.0339,0.194,922.3432,0.124487,-400.641833,-400.635248,-400.634304,-400.672564,27.338,-1608.982021702,-1619.0711114041,-1627.961031407,-1498.759438343\r\ngdb_62111,CC(CNC=O)C1CO1,3.96026,0.77771,0.71561,3.0949,76.71,-0.2526,0.03,0.2826,1590.6916,0.169529,-440.28999,-440.279941,-440.278996,-440.326525,35.17,-1878.345280024,-1889.814262017,-1901.075538531,-1744.766811685\r\ngdb_14729,CCC#CCOCC,6.73307,0.60037,0.57982,0.7711,82.7,-0.2411,0.0436,0.2847,2008.5746,0.174242,-348.974493,-348.964116,-348.963171,-349.011578,34.888,-1869.496148106,-1880.759934656,-1892.021838679,-1739.268577827\r\ngdb_123462,C(C1CO1)C1=NNC=C1,5.12501,0.93583,0.82771,1.2258,74.48,-0.2416,0.0249,0.2665,1352.5431,0.137911,-418.005267,-417.997418,-417.996474,-418.038837,28.952,-1688.09333635,-1698.277179911,-1707.760095919,-1572.658663237\r\ngdb_104972,CCC1(CO1)C(=O)C=O,2.53592,1.38717,1.03543,1.5547,69.7,-0.2534,-0.095,0.1583,1173.1375,0.13231,-458.957477,-458.947979,-458.947035,-458.99243,33.262,-1682.3039383161,-1691.453019536,-1700.9359355441,-1567.841276644\r\ngdb_75004,CC1=CC2OCC3C2C13,3.04917,1.62349,1.45984,1.8466,78
.78,-0.2221,0.0127,0.2347,987.4649,0.161837,-385.890038,-385.882967,-385.882022,-385.92124,29.0,-1912.9944444771,-1925.4448505461,-1936.113131055,-1783.3190821181\r\ngdb_73297,CC12OCC3(O)C1OC23,2.93002,1.87567,1.32951,0.1807,67.32,-0.2448,0.058,0.3028,950.9102,0.135202,-458.897903,-458.890532,-458.889588,-458.929114,29.997,-1644.9207171501,-1655.404510013,-1664.887426021,-1528.1099168001\r\ngdb_34839,N#CC1C2CC1C2C#N,3.56801,1.17504,0.98647,3.8973,75.02,-0.3245,-0.0023,0.3223,1151.4457,0.11433,-379.646367,-379.639062,-379.638118,-379.678383,27.598,-1599.898828927,-1608.6475594051,-1616.944483403,-1497.115992272\r\ngdb_5834,CC(C)(C)C(O)C=O,2.98128,1.5376,1.3406,2.2296,71.34,-0.2704,-0.0383,0.2321,1028.8909,0.17337,-386.159671,-386.150013,-386.149069,-386.193412,35.788,-1849.0211569451,-1860.7354949571,-1871.998026489,-1716.4391728981\r\ngdb_114438,COC1OC2C3OC1C23,3.9699,1.40996,1.25964,2.3268,67.29,-0.2213,0.0621,0.2834,1023.2634,0.136032,-458.89538,-458.888083,-458.887138,-458.927361,28.059,-1643.3375119431,-1653.867740472,-1663.350028971,-1527.0098935231\r\ngdb_17561,CC12OC1CC21CO1,3.27967,2.14981,1.63614,0.3672,64.55,-0.2489,0.08,0.3289,824.8832,0.128789,-383.680637,-383.673328,-383.672383,-383.711937,27.781,-1549.108250467,-1558.742396144,-1567.632316147,-1441.736440495\r\ngdb_96385,OC1COCOC11CC1,2.45875,1.67296,1.31377,1.9442,70.63,-0.2482,0.0838,0.332,1033.3944,0.160096,-460.14517,-460.137129,-460.136185,-460.177409,31.719,-1799.7403651391,-1811.58083246,-1822.249740478,-1670.203682269\r\ngdb_130077,c1c2n(nc1N)C=CC2,5.09622,1.28785,1.03603,1.9794,80.6,-0.1854,-0.0056,0.1798,1106.0112,0.127693,-396.958016,-396.950939,-396.949995,-396.98922,28.256,-1646.51459001,-1656.294317775,-1665.184237778,-1536.228628242\r\ngdb_16250,OC1CCCC11CC1,2.77765,2.19281,1.44235,1.4957,74.3,-0.2513,0.0717,0.323,929.7545,0.178341,-349.00454,-348.996557,-348.995613,-349.037436,30.781,-1888.3509110291,-1901.1169541251,-1912.379485657,-1755.494705549\r\ngdb_63681,CC1(CN1CCC=O)C,4.2832,0.6
6006,0.63564,2.8841,85.25,-0.2318,-0.0209,0.2109,1826.0079,0.190617,-404.334918,-404.324268,-404.323324,-404.371645,37.934,-2042.8712372251,-2055.7408193061,-2068.188715339,-1896.298312514\r\ngdb_67216,CC12C3CC1C(C)(C)N23,2.56673,1.74018,1.33104,1.5518,85.73,-0.2283,0.0904,0.3187,1077.7564,0.193436,-367.164675,-367.155825,-367.154881,-367.197266,35.275,-2072.718075301,-2086.7178010911,-2099.165697124,-1923.8057970381\r\ngdb_76442,CC1C(C)C1(C)OC=O,2.7529,0.99228,0.9226,4.3508,80.24,-0.2641,0.0091,0.2732,1400.2433,0.179299,-424.244881,-424.234452,-424.233508,-424.280013,37.682,-1998.6431478871,-2010.7628567131,-2022.617756741,-1857.8069104541\r\ngdb_133003,CN=C1N=CNC=C1F,2.524,1.59064,0.98162,5.7171,76.8,-0.204,-0.0162,0.1879,1132.4967,0.113005,-458.114362,-458.106257,-458.105313,-458.147574,29.42,-1511.387429459,-1519.6328977191,-1527.929821717,-1408.878186728\r\ngdb_22488,CCC(=NO)CNC=O,3.03857,0.80292,0.67093,3.8364,76.76,-0.247,-0.0018,0.2452,1658.143,0.157305,-456.334972,-456.324579,-456.323634,-456.371812,36.198,-1757.9677185181,-1768.3316571621,-1778.9999376711,-1630.958014391\r\ngdb_53983,CC(C)(C(=O)N)OC=O,2.7518,1.11422,0.9598,0.5804,69.43,-0.257,-0.0038,0.2531,1263.7865,0.144308,-476.286963,-476.276702,-476.275758,-476.322417,36.639,-1740.1125774321,-1749.67079452,-1759.7467065331,-1618.970709964\r\ngdb_41478,C1OC23CCCC2CC13,3.01526,1.62961,1.26174,1.5282,81.16,-0.2286,0.0687,0.2973,1060.8258,0.185114,-387.074681,-387.067239,-387.066295,-387.10671,30.204,-2028.51696885,-2042.5116745681,-2054.366574596,-1885.9895946621\r\ngdb_19611,CC1C2C3C(C2O)N13,3.81993,1.9792,1.66083,1.201,66.69,-0.2315,0.0683,0.2998,826.862,0.142636,-363.787203,-363.780227,-363.779283,-363.817966,27.602,-1603.708436066,-1614.4400949841,-1623.923638501,-1489.2620896281\r\ngdb_15510,O=CC1CCNC1=O,3.05456,1.94683,1.30051,3.9067,61.92,-0.2484,-0.0269,0.2215,907.7267,0.119944,-399.856998,-399.849588,-399.848644,-399.889402,26.342,-1511.172193872,-1519.853780887,-1528.151332394,-1410.8705278031\
r\ngdb_119425,COCC12CC1(C)CC2,2.84894,1.12309,0.953,1.1546,87.65,-0.2338,0.0837,0.3175,1372.6071,0.204589,-388.259151,-388.249295,-388.248351,-388.293864,36.749,-2143.930934166,-2158.187938646,-2171.228830684,-1989.7168323621\r\ngdb_104846,CCC1(CC)CCC1C,1.77419,1.34706,1.13501,0.0834,95.23,-0.2716,0.0764,0.3479,1364.3558,0.251213,-353.561691,-353.550843,-353.549898,-353.596902,41.552,-2469.768119961,-2486.9574739981,-2502.369722547,-2289.3806177671\r\ngdb_83461,CC1C2COC12CC=O,2.88037,1.11978,0.97489,3.1424,76.33,-0.2374,-0.023,0.2144,1274.9972,0.157122,-422.993539,-422.984424,-422.98348,-423.028426,32.956,-1841.2664007231,-1852.433550887,-1863.102458905,-1713.6473853571\r\ngdb_50691,O=CC1OCOC1C=O,2.20088,1.84723,1.21393,2.6446,61.15,-0.2537,-0.0554,0.1983,1000.2238,0.110513,-494.895937,-494.887552,-494.886607,-494.930375,28.682,-1507.3538016071,-1515.423567347,-1523.719863836,-1405.6829109001\r\ngdb_101339,CCC1CN1C(C)CO,2.8274,0.97362,0.81498,2.8708,86.71,-0.2348,0.0791,0.314,1521.6528,0.215935,-405.520392,-405.509782,-405.508838,-405.556072,39.411,-2158.9152215771,-2173.587009506,-2187.220897549,-1998.314333171\r\ngdb_45010,N=C1CC2OC2C(=O)O1,2.48465,1.86549,1.23777,3.1233,63.73,-0.2803,-0.0377,0.2427,965.7501,0.101685,-473.848554,-473.84152,-473.840576,-473.88035,26.549,-1465.692224079,-1473.721201734,-1481.425129727,-1368.996852233\r\ngdb_83901,CC1N2CCCC12C#N,2.54598,1.4396,1.20925,5.1915,80.06,-0.2471,0.0217,0.2688,1114.0752,0.160704,-382.072659,-382.064138,-382.063194,-382.105775,31.977,-1866.715655727,-1878.2555462371,-1888.924454255,-1737.881155428\r\ngdb_108660,C1C(=C(C(=O)N1)CO)N,2.2372,1.58832,0.96582,5.2037,73.38,-0.2142,-0.0039,0.2104,1174.2802,0.137557,-455.195674,-455.186923,-455.185979,-455.228928,32.837,-1670.89958975,-1680.516792684,-1689.999708692,-1555.010600121\r\ngdb_64909,OC1(CC1(O)C#N)C#N,2.09382,1.71781,1.11901,6.0902,64.31,-0.3044,-0.0227,0.2817,1043.8556,0.08622,-452.727649,-452.718704,-452.71776,-452.761095,32.045,-1377.8949298531,-1383.836
185065,-1390.947117053,-1288.868345496\r\ngdb_113053,C1C(CN1CC#N)CO,5.91356,0.68419,0.64731,3.5318,77.23,-0.2396,0.021,0.2606,1682.3873,0.158927,-419.182441,-419.173039,-419.172095,-419.217669,33.442,-1798.928996002,-1809.915423574,-1820.584331592,-1671.1637710391\r\ngdb_30370,CN=C(c1ncco1)N,4.45464,1.09828,0.88708,1.4544,77.39,-0.2292,-0.0321,0.1971,1271.4332,0.125296,-434.102769,-434.094433,-434.093489,-434.135989,30.466,-1600.672547524,-1609.661613949,-1618.551533952,-1491.395620228\r\ngdb_25861,NC(=[NH2+])C1=COC(=N)[N-]1,2.88253,1.14918,0.82644,5.833,83.93,-0.1971,-0.0267,0.1704,1307.3151,0.111478,-450.155649,-450.146358,-450.145414,-450.189649,34.165,-1485.2510521,-1492.7516671771,-1501.048591175,-1382.840955791\r\ngdb_129265,N=C1ON=C(CC#N)O1,6.47749,0.92197,0.81111,2.5542,62.11,-0.2722,-0.0315,0.2407,1281.7642,0.07717,-468.798519,-468.790976,-468.790032,-468.831585,26.49,-1273.762321339,-1279.6941639161,-1286.212099899,-1190.874657529\r\ngdb_100612,OCC(O)C(=O)OC=N,3.2452,0.88182,0.72861,2.6768,65.35,-0.2685,-0.0418,0.2266,1489.8866,0.122052,-512.178884,-512.169255,-512.168311,-512.214433,33.257,-1535.9587993721,-1544.135869151,-1553.025789154,-1427.9914833591\r\ngdb_123661,C1C2NC3=CN=NN3C12,4.12856,1.67208,1.30857,5.7964,70.15,-0.2169,0.0127,0.2296,919.77,0.116634,-412.956428,-412.95027,-412.949326,-412.986868,24.976,-1496.913934374,-1506.381162657,-1514.678086655,-1392.525929697\r\ngdb_43835,O=C1NC2(CC2)C2CC12,2.98464,1.51132,1.16683,3.8707,77.06,-0.228,0.0301,0.2581,1073.3101,0.14914,-401.966882,-401.959328,-401.958384,-401.998842,30.311,-1812.610574729,-1823.868713698,-1833.944625711,-1689.788238159\r\ngdb_132391,c1(nc(nc(n1)F)O)N,2.03547,1.98635,1.00531,2.0604,59.34,-0.2635,-0.0147,0.2488,1042.7446,0.079613,-510.185998,-510.178817,-510.177873,-510.217825,26.527,-1271.40225999,-1277.560633316,-1284.078569299,-1187.516229361\r\ngdb_68901,CC12CC1(C)NC(=O)C2,2.64319,1.46787,1.14098,3.8677,79.46,-0.2326,0.0302,0.2628,1141.6267,0.170924,-403.196454,-403.187679,-
403.186735,-403.229302,34.365,-1956.3264509631,-1968.5955069311,-1979.857410954,-1820.6903806131\r\ngdb_128639,Cc1cnc(nn1)OC,4.99669,1.09172,0.90611,1.1315,76.06,-0.2321,-0.0656,0.1665,1266.6102,0.124229,-434.08563,-434.077248,-434.076304,-434.119177,29.74,-1589.917670773,-1598.877871784,-1607.767791787,-1480.84593892\r\ngdb_38051,C1C2C=CC11CCOC21,3.47861,1.71922,1.47453,1.425,78.65,-0.225,0.0104,0.2354,938.911,0.162217,-385.84904,-385.842362,-385.841418,-385.879988,28.061,-1887.2678304951,-1899.9648476011,-1910.633755619,-1757.4330808501\r\ngdb_96859,CN=C1OC(C)C1C#N,2.4167,1.30565,0.95069,3.949,76.43,-0.2686,0.0031,0.2717,1259.9782,0.134667,-418.019499,-418.010154,-418.00921,-418.054302,32.546,-1697.024044438,-1706.269134535,-1715.752050543,-1582.363089922\r\ngdb_69145,CC12NC1C(=O)C1NC21,2.67549,1.67291,1.29944,4.2287,72.6,-0.2193,-0.0307,0.1885,1008.7181,0.136728,-417.967234,-417.959679,-417.958735,-417.999016,29.799,-1664.2272865531,-1674.59561776,-1684.078533768,-1547.670627348\r\ngdb_91725,CC1OC2OC1CC2=O,2.70317,1.59453,1.49209,3.3046,66.99,-0.2355,-0.0232,0.2123,966.8032,0.136878,-458.986333,-458.979032,-458.978088,-459.018076,28.663,-1700.4113380201,-1710.9390565131,-1720.4219725211,-1583.934372458\r\ngdb_129579,NC(N)=[NH+]C1=NC=N[N-]1,5.14715,1.0837,0.89584,10.4541,72.91,-0.1942,-0.0101,0.1841,1214.5793,0.114125,-446.328309,-446.320323,-446.319379,-446.360967,29.682,-1432.721646201,-1441.040533014,-1449.337457012,-1329.109242648\r\ngdb_51807,O=COC1CCC2OC12,4.37384,0.95565,0.85852,5.1296,69.85,-0.2725,0.0062,0.2786,1321.1423,0.136034,-458.975947,-458.967997,-458.967052,-459.009718,29.01,-1693.8940295461,-1704.0144946981,-1713.4967831971,-1578.6896522361\r\ngdb_18447,CC1CC2C1N2C=O,6.61924,1.19442,1.09881,3.7696,71.4,-0.2399,0.0043,0.2442,1089.2292,0.141654,-363.839281,-363.83163,-363.830686,-363.871296,28.294,-1636.3878497681,-1646.6959401111,-1656.179483628,-1522.727144598\r\ngdb_46182,C#CC1(COC1=O)C#C,1.90187,1.67203,1.29387,3.5086,70.28,-0.2677,-0.0111,0.25
66,1034.6301,0.086278,-419.366455,-419.358002,-419.357058,-419.399709,30.629,-1448.793406709,-1455.044651367,-1462.155583355,-1360.262554462\r\ngdb_113611,OCC1OC1C#CC#N,3.60774,0.7117,0.62644,3.6764,78.01,-0.2834,-0.0637,0.2197,1667.9972,0.09946,-436.635997,-436.627116,-436.626172,-436.670859,30.741,-1468.986646329,-1475.85724237,-1483.561170363,-1374.471240749\r\ngdb_109185,CN=C1CN=C(N1)CO,2.85293,1.12986,0.83712,2.4636,77.0,-0.224,-0.0072,0.2168,1362.0806,0.147621,-435.28445,-435.275111,-435.274167,-435.319248,33.217,-1714.336390239,-1724.473170625,-1734.549082638,-1592.678710373\r\ngdb_4276,NC1=NC(=N)C=CO1,3.97374,2.02386,1.34174,4.7878,64.89,-0.2248,-0.0157,0.2091,845.3079,0.098156,-394.821934,-394.815333,-394.814388,-394.852574,25.263,-1328.636728371,-1336.04823717,-1343.159169158,-1240.238908032\r\ngdb_51999,C(C=O)C(=O)COC=O,4.48583,0.68313,0.66225,3.8242,65.23,-0.2698,-0.0587,0.2112,1633.8448,0.108214,-494.934918,-494.925371,-494.924426,-494.971573,31.435,-1531.8147299361,-1539.155330218,-1547.451626707,-1431.535026682\r\ngdb_84994,CC1CC(C)(O)C(=O)O1,2.38699,1.5475,1.25117,3.1112,71.16,-0.258,0.0063,0.2643,1095.3078,0.158257,-460.221432,-460.212377,-460.211433,-460.254806,34.366,-1847.5954564971,-1858.799629692,-1869.4685377101,-1718.770996342\r\ngdb_9906,CC1(CC1)OCCO,4.10977,1.0985,1.03827,2.2461,71.89,-0.2553,0.0822,0.3374,1226.8586,0.175242,-386.116345,-386.107165,-386.106221,-386.150672,33.736,-1821.8337020111,-1833.8479893251,-1845.110520857,-1689.619438238\r\ngdb_100777,CC(CO)C(O)C1CO1,3.74572,0.86699,0.81164,1.4493,74.68,-0.2615,0.0759,0.3374,1473.0036,0.180528,-461.331585,-461.321377,-461.320433,-461.366878,37.236,-1916.3748354601,-1928.6325962661,-1940.487496294,-1775.383603304\r\ngdb_62179,CC(CCCC#N)C#N,4.93903,0.5673,0.52442,0.8528,81.08,-0.3235,0.0209,0.3444,1992.8509,0.158421,-382.1123,-382.102091,-382.101147,-382.148717,35.298,-1891.5907399961,-1902.0713953141,-1912.740303332,-1764.8276469061\r\ngdb_62702,CC1(C)C2C3CCC1N23,2.7672,1.65708,1.46342
,1.672,83.88,-0.2273,0.0793,0.3066,1034.0305,0.196132,-367.192986,-367.184975,-367.184031,-367.224722,32.976,-2090.4834826001,-2105.0096884411,-2117.457584474,-1941.0346841421\r\ngdb_5871,CC(C)(CCC#N)O,4.24712,0.97398,0.95431,5.4386,72.36,-0.2806,0.0405,0.3211,1286.3167,0.162942,-365.09112,-365.081461,-365.080517,-365.125473,35.017,-1794.076468905,-1804.901626664,-1815.571162191,-1668.511918005\r\ngdb_116192,CCC1COCC1(C)O,2.19545,1.35264,1.09264,1.7847,81.02,-0.2403,0.0673,0.3076,1264.563,0.204702,-425.435817,-425.42575,-425.424806,-425.470553,37.904,-2118.1145863971,-2132.238558969,-2145.279451007,-1963.658893628\r\ngdb_18762,C1CC23COC2CC13,3.8261,2.18818,1.69078,1.86,71.05,-0.2369,0.0698,0.3066,811.8747,0.15547,-347.75026,-347.74377,-347.742826,-347.780725,26.092,-1729.1305424231,-1741.056350968,-1751.13289049,-1608.1198243361\r\ngdb_17331,CC12CN1CC1NC21,3.4759,2.17976,1.82676,2.241,70.78,-0.2183,0.0739,0.2922,808.92,0.155899,-343.939463,-343.932698,-343.931754,-343.969864,27.536,-1686.982017911,-1698.7346339721,-1708.811173494,-1565.570949082\r\ngdb_82891,CC1C2OC3CC2C13C,2.43148,1.9578,1.44957,1.626,81.54,-0.2349,0.0873,0.3223,1005.8075,0.183325,-387.05978,-387.051887,-387.050943,-387.091571,32.116,-2019.166457241,-2032.8781564001,-2044.733056428,-1876.489735911\r\ngdb_116602,CCN=C1COC(C)O1,3.72669,0.90395,0.80134,1.6521,79.12,-0.2512,0.0244,0.2756,1511.487,0.16981,-440.306933,-440.297349,-440.296404,-440.342707,34.108,-1888.9771650111,-1900.737938689,-1911.999215203,-1754.921162323\r\ngdb_15917,C#CC12CNC1CN2,3.5973,1.89891,1.52994,1.3358,71.13,-0.225,0.0273,0.2523,859.9504,0.131141,-342.689102,-342.681834,-342.68089,-342.720553,28.038,-1530.220857076,-1539.880730622,-1548.771278134,-1422.839634469\r\ngdb_125360,Cc1c[nH]c2c1nn[nH]2,3.3158,1.65844,1.1131,5.6909,73.56,-0.2087,0.0035,0.2122,1026.1007,0.115075,-412.993742,-412.986172,-412.985228,-413.025626,28.397,-1520.3288052,-1528.9099907751,-1537.206914773,-1416.846923519\r\ngdb_72184,CC12CN3CC4C(C13)N24,3.07964,
2.04115,1.6263,0.3743,77.53,-0.206,0.0675,0.2735,891.3158,0.161816,-381.999132,-381.992471,-381.991527,-382.029619,28.288,-1820.576801484,-1833.2838587341,-1843.952766752,-1690.0925800241\r\ngdb_83442,CC1C2CCC12CC#C,2.72486,1.16084,1.02126,0.6609,88.11,-0.2457,0.053,0.2987,1286.1803,0.181347,-349.896955,-349.887654,-349.88671,-349.931104,35.628,-2053.668157079,-2066.4969510751,-2078.351851103,-1912.7271256431\r\ngdb_80310,CC1C2C=C(CN12)C#N,3.56528,1.14865,1.10393,3.8559,81.46,-0.2368,-0.052,0.1848,1149.7881,0.137252,-380.870822,-380.862999,-380.862055,-380.90314,29.903,-1740.403741608,-1750.6045279121,-1760.08744392,-1624.4394508991\r\ngdb_126336,CCC1=CNN=NC1=N,2.90958,1.41485,1.02403,4.7508,80.72,-0.2203,-0.0517,0.1686,1156.9177,0.137438,-414.172517,-414.164243,-414.163298,-414.205711,30.851,-1632.169106761,-1642.085631488,-1651.567919987,-1516.1383000981\r\ngdb_94313,CC1(C)OC11CCC1O,2.37393,1.48022,1.12029,1.9512,79.41,-0.25,0.0577,0.3077,1185.0344,0.179745,-424.206867,-424.197095,-424.196151,-424.240935,36.997,-1974.7890207611,-1987.321003,-1999.175903028,-1833.285113752\r\ngdb_12598,NC1=NCC(=O)C1O,3.22629,2.1499,1.35185,1.4852,59.34,-0.2412,-0.0365,0.2047,853.9659,0.107565,-415.887119,-415.8797,-415.878756,-415.918721,27.663,-1381.469221117,-1389.255980298,-1396.9605358,-1287.041666797\r\ngdb_58928,CC(O)C12OC3C1CC23,2.85272,1.57775,1.22972,1.5657,74.71,-0.2331,0.058,0.2912,1060.0285,0.157979,-422.931517,-422.923286,-422.922342,-422.964371,32.141,-1802.347037525,-1814.0689056451,-1824.737813663,-1673.4522963621\r\ngdb_20738,Cc1c(c[nH]n1)CO,3.32584,1.75881,1.16714,1.8482,66.78,-0.2258,0.0348,0.2606,991.057,0.131289,-379.941396,-379.933202,-379.932258,-379.974793,29.013,-1551.861759959,-1560.939932662,-1569.830480174,-1445.445646194\r\ngdb_128887,C(#N)c1nnc(o1)C#N,5.93558,1.00074,0.85636,0.6476,66.31,-0.3282,-0.1235,0.2047,1176.7364,0.043175,-446.533049,-446.52626,-446.525315,-446.564833,23.311,-1095.591808442,-1099.331134573,-1104.069455032,-1031.904037505\r\ngdb_
30997,C(c1c([nH]c(=O)[nH]1)O)O,2.38602,1.3438,0.89911,3.8404,65.22,-0.1954,0.027,0.2224,1219.8982,0.112556,-491.127119,-491.118217,-491.117273,-491.160975,32.216,-1491.5474774061,-1499.292193484,-1507.5891174821,-1389.1511862951\r\ngdb_68613,CC12CC(CC1)C2C#C,2.32982,1.62021,1.32449,0.6226,88.17,-0.2512,0.0462,0.2974,1089.8049,0.183246,-349.925299,-349.91694,-349.915996,-349.957603,33.96,-2071.454272175,-2084.8741796491,-2096.729079677,-1929.3554866341\r\ngdb_119167,COCC1(CC1)C1CO1,3.49795,0.94924,0.83308,2.6181,79.5,-0.2555,0.0829,0.3383,1447.3548,0.180828,-424.174999,-424.165358,-424.164414,-424.210048,34.988,-1954.7915639491,-1967.4057498671,-1979.260649895,-1813.903243269\r\ngdb_24090,C#CCNc1cocn1,3.71709,1.04068,0.92902,1.2715,71.6,-0.2054,0.01,0.2154,1244.1769,0.113025,-416.788126,-416.779994,-416.77905,-416.821842,29.738,-1552.178024495,-1560.407177521,-1568.704101519,-1450.205929468\r\ngdb_81688,CC1CC2CC(C1)C2O,2.44232,1.41087,1.32642,1.625,84.07,-0.2473,0.0589,0.3062,1142.9451,0.208311,-388.293726,-388.28522,-388.284276,-388.326297,34.433,-2165.6270578411,-2180.7311994711,-2193.772091509,-2010.0688317591\r\ngdb_40796,C1CC1C1CCC=CC1,3.57574,1.02973,0.85783,0.2235,90.97,-0.2343,0.0335,0.2678,1417.7903,0.207933,-351.164548,-351.155983,-351.155039,-351.197896,33.885,-2221.2425530021,-2236.3102991101,-2249.351191148,-2066.4279250851\r\ngdb_29973,Cn1cc(nc1)NC=O,3.80867,1.10175,0.85919,2.7704,75.08,-0.2049,0.0111,0.216,1289.4273,0.12612,-434.133931,-434.125614,-434.12467,-434.16752,29.714,-1620.226982982,-1629.227972078,-1638.117892081,-1511.181606507\r\ngdb_24313,C1C2CCC3=CC=C1N23,2.71571,2.36393,1.4634,2.1936,82.71,-0.1841,0.0002,0.1843,900.4025,0.151793,-364.783093,-364.776757,-364.775813,-364.813424,26.757,-1833.9571758911,-1845.9802483311,-1856.056160344,-1710.3686508321\r\ngdb_108454,OCC12CCN1CC2=O,2.33734,1.68471,1.22211,2.4135,71.72,-0.2394,-0.0391,0.2003,1058.8961,0.147201,-439.049685,-439.041298,-439.040353,-439.083017,31.388,-1727.8943496931,-1738.629146
156,-1748.70443066,-1605.676931799\r\ngdb_52671,CC#CC(C)C1(C)CC1,2.6437,1.00649,0.81876,0.3105,94.26,-0.2409,0.0603,0.3012,1535.5778,0.203046,-351.129244,-351.118182,-351.117237,-351.166923,39.934,-2199.0889752661,-2212.5898314011,-2225.63009593,-2046.9920888281\r\ngdb_31119,Cc1c(c(co1)O)CO,2.33087,1.54683,0.96733,2.6094,71.94,-0.1996,0.0189,0.2185,1184.2351,0.135745,-458.989975,-458.981219,-458.980275,-459.023431,32.531,-1702.6967257981,-1712.311418696,-1721.7943347041,-1587.294683153\r\ngdb_111304,CC1(C)C2OC2C1CO,2.3122,1.33865,1.14033,1.7983,77.49,-0.2529,0.0614,0.3143,1201.8226,0.180402,-424.185517,-424.175937,-424.174993,-424.219523,36.309,-1961.3917036111,-1974.044167578,-1985.899067606,-1819.848891044\r\ngdb_117046,CCC(=O)C(=N)NCC,3.99944,0.76466,0.653,0.6287,82.92,-0.2244,-0.0529,0.1716,1775.2064,0.180571,-420.443096,-420.432103,-420.431159,-420.480545,38.264,-1962.149734483,-1973.914900724,-1985.769800752,-1822.407245237\r\ngdb_101869,CCC(C)COCC#N,5.2388,0.56301,0.53371,4.5619,84.76,-0.2813,0.011,0.2923,2087.1706,0.191593,-404.35777,-404.346677,-404.345733,-404.395325,38.341,-2057.2110728931,-2069.8026684871,-2082.25056452,-1911.1577256341\r\ngdb_83606,CN1C2COC2(C)C1=O,2.37994,1.69348,1.25263,4.225,73.7,-0.2343,0.0178,0.2521,1066.385,0.146623,-439.085258,-439.076424,-439.07548,-439.118711,32.016,-1750.2167273501,-1760.67102729,-1770.746939303,-1628.075238045\r\ngdb_49745,O=CC1N=C2OC3C1C23,3.70238,1.45255,1.34595,3.4171,67.16,-0.2306,-0.0772,0.1534,938.6701,0.098529,-436.54388,-436.536908,-436.535964,-436.575767,26.573,-1411.182399776,-1419.250910498,-1426.954838491,-1314.800154921\r\ngdb_66502,CC12C3C1C1=NC3C2O1,2.64808,1.96038,1.64288,3.3684,72.56,-0.2381,-0.0177,0.2203,881.5749,0.123125,-400.649498,-400.642054,-400.64111,-400.680898,29.286,-1613.791878187,-1623.341937658,-1632.231857661,-1503.9890983491\r\ngdb_597,C1CN1CCO,8.50794,2.06188,2.01331,2.7069,54.01,-0.2447,0.0883,0.333,666.2034,0.131706,-287.651399,-287.644917,-287.643972,-287.681656,23.25,-132
6.1863057261,-1335.449593584,-1343.7477726,-1229.676676544\r\ngdb_40087,C1OC11CC2CCC=C12,3.89231,1.24668,1.05176,1.9931,83.38,-0.2329,-0.0002,0.2327,1171.2187,0.160139,-385.848393,-385.840855,-385.83991,-385.880394,30.007,-1886.861832172,-1899.019191538,-1909.687472047,-1757.6878495041\r\ngdb_85725,CC1CC(C#C)C1C=O,2.16151,1.3815,0.95056,2.9278,82.21,-0.2436,-0.0238,0.2198,1267.2252,0.15672,-385.856231,-385.846714,-385.845769,-385.891054,34.9,-1891.780247714,-1902.6957667691,-1913.364047278,-1764.3770954441\r\ngdb_23586,ON=C1C2OC3CC2C13,4.49876,1.26725,1.20586,1.8193,73.02,-0.2408,0.0038,0.2445,1038.7948,0.125447,-437.78348,-437.776716,-437.775771,-437.814613,27.364,-1561.190936262,-1571.1670743441,-1580.056366838,-1450.964587849\r\ngdb_86726,CC1OC(C=O)C=C1C,2.57727,1.33516,1.1133,3.2908,79.1,-0.2284,-0.0263,0.2021,1181.7816,0.156792,-423.047468,-423.038128,-423.037184,-423.081861,33.782,-1875.107333584,-1886.1332942231,-1896.802202241,-1747.1783287721\r\ngdb_112455,OCC1OC(=O)CC1O,2.3994,1.36094,0.98248,4.3041,64.62,-0.2675,0.0083,0.2758,1190.3123,0.13553,-496.136757,-496.128067,-496.127123,-496.170462,31.968,-1658.1278990731,-1667.783380056,-1677.266296064,-1542.626082497\r\ngdb_74129,CC1=CC(=O)[CH-]C(=[NH2+])N1,1.89002,1.81194,0.93208,7.1271,80.68,-0.2037,-0.002,0.2017,1217.4899,0.137546,-418.077371,-418.068875,-418.067931,-418.110176,32.503,-1733.3392452861,-1743.117090524,-1752.600006532,-1617.424527788\r\ngdb_7478,CC1(CC=O)CCO1,4.09504,1.29805,1.21068,3.5282,68.13,-0.2425,-0.0212,0.2213,1053.7256,0.151418,-384.936607,-384.927917,-384.926973,-384.970699,30.71,-1709.389109283,-1719.933770519,-1730.010310041,-1590.3983426671\r\ngdb_66016,OC1(COC1=O)C1CO1,2.65921,1.3675,1.28947,3.7139,60.87,-0.2752,-0.0159,0.2593,1034.858,0.110857,-494.883883,-494.875485,-494.874541,-494.917428,29.843,-1499.7898081211,-1507.851416244,-1516.148340242,-1397.5585518771\r\ngdb_49465,O=CC1C2OC(=O)OC12,3.21412,1.29141,1.16402,4.459,58.88,-0.268,-0.046,0.2219,1034.2819,0.088688,-493.72592,
-493.71879,-493.717846,-493.75831,25.718,-1401.0092238681,-1408.089407915,-1415.200339903,-1311.424156501\r\ngdb_36643,N#CCCC1C2CC1O2,5.74764,0.71151,0.69905,2.3334,76.76,-0.2494,0.0304,0.2798,1584.3286,0.147769,-401.888877,-401.880664,-401.87972,-401.922831,30.377,-1763.661735184,-1774.5063457221,-1784.582257735,-1642.09065156\r\ngdb_102084,CC1(CC1)C(CO)C=O,2.0293,1.25649,1.00395,3.8526,78.93,-0.2385,-0.0179,0.2206,1306.039,0.179599,-424.215752,-424.205389,-424.204444,-424.251591,37.415,-1980.3644382261,-1992.5255626461,-2004.3798351651,-1839.9718496561\r\ngdb_13584,CC1CN1C(=N)C=O,4.22926,1.39443,1.1608,3.3842,68.65,-0.247,-0.0728,0.1742,1029.6828,0.129371,-379.903512,-379.89533,-379.894386,-379.936362,29.179,-1528.089209003,-1537.174911814,-1546.065459326,-1421.329847815\r\ngdb_36074,C#CCC12C3CC1CN23,3.69224,1.23646,1.14543,1.7682,80.35,-0.236,0.051,0.287,1124.0869,0.147697,-364.710329,-364.702507,-364.701562,-364.743025,30.517,-1788.2971110151,-1799.3877050811,-1809.462989585,-1666.1926447411\r\ngdb_48013,N=C1OCC1C#CC=O,4.13139,0.73141,0.65911,2.8122,75.63,-0.2733,-0.0702,0.2031,1609.6813,0.099534,-436.651273,-436.64262,-436.641675,-436.686708,29.173,-1478.572473813,-1485.586141906,-1493.28944239,-1384.4166308901\r\ngdb_2176,CC1COCC1C,3.40886,2.71166,1.73957,1.5735,66.26,-0.2442,0.0755,0.3197,785.7238,0.172695,-310.929561,-310.922046,-310.921102,-310.961572,27.987,-1745.148964666,-1757.3195017211,-1767.989664757,-1620.864532126\r\ngdb_97279,CN=CN(C=O)C(=N)N,2.65199,1.13437,0.80297,1.4078,77.11,-0.2424,-0.0294,0.2129,1392.5859,0.135412,-451.361055,-451.351531,-451.350587,-451.396149,33.866,-1613.80254584,-1622.934056808,-1632.416972816,-1498.707982605\r\ngdb_84686,CC1C=CC2OCC12C,2.17796,1.785,1.37847,2.0194,82.26,-0.2394,0.0112,0.2507,1064.0387,0.182993,-387.088498,-387.080012,-387.079068,-387.121173,33.291,-2037.1872607031,-2050.5268470251,-2062.381747053,-1895.0652573291\r\ngdb_58487,CC(O)C1(C)CC1C#N,2.20434,1.32301,1.08479,4.152,80.18,-0.2729,0.0332,0.3062,1225
.0653,0.168779,-403.158395,-403.148261,-403.147317,-403.19313,37.276,-1932.4440859321,-1943.860357169,-1955.122261192,-1797.992125065\r\ngdb_1972,CC1CC(CO)C1,6.72748,1.36017,1.25615,1.2181,68.05,-0.2607,0.0784,0.3391,1002.5383,0.171293,-310.90878,-310.900748,-310.899803,-310.941019,29.493,-1732.108700137,-1743.9548150391,-1754.624350566,-1607.967339649\r\ngdb_107267,CCC12OC1C1CC2C1,2.88636,1.55781,1.29108,1.8457,80.7,-0.2448,0.0837,0.3285,1076.8503,0.184084,-387.052597,-387.044897,-387.043953,-387.084651,31.485,-2014.659060094,-2028.49186849,-2040.346768518,-1872.147373631\r\ngdb_29990,OC1C=CC2=C1C=CN2,3.26322,1.58212,1.1141,3.273,78.81,-0.1829,-0.0151,0.1677,1040.5257,0.127073,-400.777203,-400.77002,-400.769076,-400.808612,28.7,-1693.9279150321,-1703.6417543521,-1712.531674355,-1584.130782775\r\ngdb_26760,Cc1cc([nH]c1C)C#N,3.53214,1.0831,0.83755,5.1305,87.29,-0.2172,-0.0178,0.1994,1341.6623,0.136336,-380.943925,-380.934714,-380.93377,-380.978746,32.604,-1786.276532035,-1795.6063358471,-1805.089251855,-1671.8828963531\r\ngdb_10300,CCC(C#C)C(C)C,2.40266,1.58732,1.02234,0.5918,84.23,-0.2582,0.0556,0.3138,1227.0783,0.197438,-313.050924,-313.04067,-313.039725,-313.0857,37.81,-2053.790521334,-2066.909224488,-2079.357120521,-1908.999094674\r\ngdb_104117,CCC1(C)CNC(=O)C1,3.43953,1.01392,0.97363,3.9048,81.93,-0.2354,0.0386,0.274,1337.8531,0.194678,-404.40952,-404.400029,-404.399085,-404.443554,36.256,-2089.684663643,-2103.2815286551,-2115.729424688,-1941.4218571951\r\ngdb_21655,c1(c([nH]nc1N)N)N,3.35279,2.03548,1.28084,2.5887,64.3,-0.1885,0.0523,0.2408,903.2074,0.12175,-392.168554,-392.160408,-392.159464,-392.200341,30.523,-1384.926168198,-1393.144653571,-1401.442205078,-1283.516948744\r\ngdb_40393,C1OC1C12CC1N1CC21,4.40345,1.17597,1.10427,2.2551,75.31,-0.2222,0.0598,0.282,1131.3273,0.147757,-401.828371,-401.820937,-401.819993,-401.860482,29.378,-1725.6936756301,-1737.027115679,-1747.103027692,-1602.966092919\r\ngdb_121876,CCCCC(O)C(N)=O,3.73708,0.64552,0.58542,3.9466,79.99
,-0.258,0.02,0.278,1949.9061,0.192209,-441.532066,-441.520928,-441.519984,-441.569325,39.555,-2029.9075287941,-2042.4702589741,-2054.9181550071,-1883.412415199\r\ngdb_43860,O=C1CC2(C1)NC2C#N,3.68147,1.02986,0.89767,4.3493,71.28,-0.2608,-0.0432,0.2176,1246.2082,0.111407,-416.794321,-416.786284,-416.785339,-416.827708,29.707,-1556.0654427501,-1564.354209131,-1572.65050562,-1453.886897262\r\ngdb_20116,O=C(C1CN1)N1CC1,5.09833,1.3528,1.19916,2.6236,67.83,-0.2539,-0.0085,0.2454,1015.4845,0.131428,-379.890274,-379.882688,-379.881744,-379.922805,27.133,-1519.782244861,-1529.241943036,-1538.132490548,-1412.822708302\r\ngdb_115671,CC(=O)C1OCC1CO,2.34183,1.24417,0.94748,2.5915,71.36,-0.2396,-0.0214,0.2182,1279.8198,0.156567,-460.151231,-460.141284,-460.14034,-460.187505,34.584,-1803.5436971881,-1814.188132355,-1824.857040373,-1676.539013133\r\ngdb_90881,CC1OC2C3CC(O3)C12,3.28414,1.65316,1.4787,2.6479,73.81,-0.2423,0.0804,0.3227,960.8653,0.161146,-422.984302,-422.977331,-422.976387,-423.015361,28.692,-1835.4701000901,-1847.9826295501,-1858.651537568,-1705.4489802721\r\ngdb_57526,CC1(CC1)CC(C)(C)O,2.64216,1.06941,0.99239,1.4237,88.85,-0.254,0.0619,0.3159,1396.2126,0.2259,-389.490322,-389.479399,-389.478455,-389.52558,42.257,-2288.650197291,-2304.014755156,-2318.241639204,-2121.40712612\r\ngdb_59297,CC(C)C1C=CCC1O,2.59106,1.28959,0.97627,1.3455,86.93,-0.2443,0.0181,0.2624,1307.5206,0.20546,-388.307536,-388.297928,-388.296984,-388.341607,37.247,-2174.2929571311,-2188.7055838431,-2201.746475881,-2019.6759945491\r\ngdb_22381,ON=C1CC2(O)C(O)C12,2.84,1.20651,1.01897,1.3743,69.91,-0.2303,0.0128,0.2431,1175.1911,0.121594,-474.934179,-474.925238,-474.924294,-474.967627,33.41,-1519.08006229,-1527.68948577,-1536.579405773,-1410.05037354\r\ngdb_132764,Cc1c(ccnc1F)F,2.34843,1.77346,1.01676,1.862,66.98,-0.2595,-0.0249,0.2345,1054.2895,0.100309,-486.009306,-486.001716,-486.000772,-486.042213,27.723,-1509.779123892,-1517.4598340521,-1525.163762045,-1414.2113857191\r\ngdb_84158,CC1=CC(O)C(=O)CC1
,2.63048,1.35728,0.97707,3.3408,78.26,-0.2443,-0.0236,0.2207,1230.2017,0.159022,-423.066927,-423.058179,-423.057235,-423.100309,33.225,-1887.3180312151,-1898.715477182,-1909.3843852,-1758.754614804\r\ngdb_61476,CC(CC(O)C#C)C#C,2.64276,0.88477,0.81983,1.6724,83.83,-0.2585,0.0308,0.2893,1503.7197,0.154941,-385.820574,-385.809925,-385.808981,-385.856443,38.817,-1869.4051593011,-1879.6103381681,-1890.279246186,-1742.6583814451\r\ngdb_68366,CC12CC(C1)NC(=O)N2,2.85609,1.61248,1.2376,4.2771,75.23,-0.2329,0.0557,0.2886,1047.5804,0.161528,-419.248984,-419.241151,-419.240207,-419.280695,31.932,-1840.6853273891,-1852.6563165821,-1863.3252246,-1710.713153273\r\ngdb_36826,C(C#CC1CC1)C1CO1,4.76473,0.61274,0.57624,1.4065,87.08,-0.2306,0.0504,0.281,1950.1512,0.158204,-385.824686,-385.815367,-385.814423,-385.860986,33.226,-1871.985476309,-1883.0252421461,-1893.694150164,-1745.5091548321\r\ngdb_104783,CCC1(CC2NC12)OC,2.32367,1.43012,1.15428,1.6451,81.91,-0.2273,0.0809,0.3082,1199.2494,0.193198,-404.292947,-404.283482,-404.282538,-404.326864,36.012,-2016.5340569861,-2030.1472372321,-2042.595133265,-1868.1978319851\r\ngdb_74176,[NH3+]C1=CC(=O)O[C-]1C=O,2.42407,1.42469,0.90174,13.0488,79.95,-0.1773,-0.0389,0.1384,1188.4085,0.100611,-473.805098,-473.797031,-473.796087,-473.837935,29.277,-1438.4231929751,-1445.8039538331,-1453.5078818261,-1342.3810579981\r\ngdb_51884,C1COCCN1CC=O,4.12424,0.878,0.76813,1.9105,76.86,-0.2323,-0.0287,0.2036,1463.9785,0.17127,-440.280258,-440.271591,-440.270647,-440.314262,32.072,-1872.238362436,-1884.5745618671,-1895.83646589,-1737.071668818\r\ngdb_4273,c1coc(cc1=O)N,4.04688,1.87483,1.28248,5.3859,64.13,-0.2225,-0.0207,0.2017,871.4302,0.097246,-398.649944,-398.643321,-398.642377,-398.680553,25.451,-1381.5865653,-1388.98489641,-1396.096455907,-1293.529482348\r\ngdb_107784,CC1C2C3CC1(CO)C23,2.50235,1.58881,1.26881,1.5709,81.78,-0.2412,0.0663,0.3075,1090.4696,0.182655,-387.036697,-387.028127,-387.027183,-387.069574,33.674,-2004.6816669941,-2017.9685425601,-2029.
823442588,-1862.6864204381\r\ngdb_19422,C1OC11C=CC2CC12,4.18043,2.0034,1.62785,1.9209,70.18,-0.2375,0.0028,0.2402,810.4823,0.131718,-346.577786,-346.571504,-346.57056,-346.607913,25.512,-1621.244175071,-1631.5234,-1640.413947512,-1513.392320714\r\ngdb_108395,OCC12CCC1C2C#C,2.31574,1.36626,1.08219,1.2551,81.96,-0.2459,0.0442,0.2901,1191.7866,0.15823,-385.822961,-385.813987,-385.813043,-385.856632,34.19,-1870.9030232841,-1882.159279726,-1892.828187744,-1742.7769806461\r\ngdb_52485,CC#CC#CC1CN1C,5.5406,0.55505,0.52549,1.5798,104.58,-0.2165,-0.0121,0.2043,2116.6001,0.14532,-364.740203,-364.730025,-364.729081,-364.777702,35.071,-1807.0433148811,-1816.6554977431,-1826.731409756,-1687.9527743341\r\ngdb_10885,CC1N2CC1(O)C2C,2.68708,1.95883,1.67367,1.5344,72.23,-0.237,0.0737,0.3106,913.8118,0.164745,-364.989659,-364.981314,-364.98037,-365.021902,31.775,-1730.408778256,-1742.058482841,-1752.728018368,-1603.520183366\r\ngdb_110022,CCC1C(C)COC1=O,2.02552,1.58369,1.00252,4.2972,77.99,-0.2579,0.0139,0.2718,1249.362,0.182853,-424.28512,-424.27578,-424.274836,-424.319403,34.539,-2023.8934825381,-2036.6965486651,-2048.551448693,-1882.5244899641\r\ngdb_119182,OCCC1(CO1)C1CO1,3.44158,0.91269,0.79236,2.3886,71.15,-0.2571,0.0749,0.3321,1447.7014,0.157092,-460.103891,-460.094368,-460.093424,-460.139322,33.944,-1773.8374211281,-1784.747920111,-1795.416828129,-1646.3037469861\r\ngdb_44044,O=C1OC23CN4C2CC134,3.50766,1.7505,1.5207,3.2157,66.57,-0.2474,-0.0385,0.2089,859.4169,0.100657,-436.521648,-436.515327,-436.514383,-436.552192,25.335,-1397.2316196881,-1405.708638769,-1413.412566762,-1300.006630246\r\ngdb_41877,C1OCC2OC=NCC12,3.41132,1.49089,1.09091,0.4863,71.39,-0.2544,0.0165,0.2709,1070.6267,0.151284,-439.091438,-439.084281,-439.083337,-439.123263,27.627,-1754.0947329701,-1765.601365503,-1775.677277516,-1630.931659013\r\ngdb_97414,CN=COC1CN=CO1,5.53547,0.78305,0.73574,2.9492,73.92,-0.2611,0.0096,0.2707,1548.5883,0.135631,-455.153268,-455.144595,-455.143651,-455.187911,30.262,-1644.28944
30961,-1653.955591732,-1663.43850774,-1529.272063468\r\ngdb_18825,O=C1OC2C3NC1C23,4.47884,2.32636,1.9375,5.1997,56.01,-0.2388,-0.0009,0.2379,681.6807,0.098015,-398.570261,-398.564911,-398.563967,-398.599601,21.594,-1331.5847656531,-1339.78191572,-1346.893475217,-1242.73137378\r\ngdb_72362,CC12CC=CC1CC2=O,2.33052,1.67375,1.46416,2.5763,80.02,-0.2353,-0.0173,0.218,1017.7858,0.159385,-385.918128,-385.910064,-385.90912,-385.950638,31.483,-1930.6211722871,-1942.4484619191,-1953.117369937,-1801.7665917001\r\ngdb_31186,c1coc(c1C=O)CO,1.94702,1.7251,0.93863,3.5457,70.71,-0.242,-0.0471,0.1949,1171.0867,0.112513,-457.796143,-457.787964,-457.78702,-457.829548,29.337,-1581.408021224,-1589.607681327,-1597.904605325,-1479.344937392\r\ngdb_84671,CC1C=CC2OC3C2C13,2.62332,1.94147,1.53871,1.705,78.54,-0.2199,0.0054,0.2253,943.4914,0.161251,-385.857243,-385.850126,-385.849182,-385.888431,29.385,-1892.415286822,-1904.8368274771,-1915.505735495,-1762.7311393371\r\ngdb_6528,CC1(C)CC1(O)CO,2.731,1.66756,1.42455,1.9339,71.4,-0.2354,0.0535,0.289,1002.7642,0.174727,-386.127471,-386.118083,-386.117139,-386.160731,35.563,-1828.8153671451,-1840.6991325871,-1851.961664119,-1695.9315512691\r\ngdb_7839,CC(O)CC1CC1O,3.6167,1.13378,1.03,1.8851,71.67,-0.2492,0.0749,0.3241,1228.8145,0.175181,-386.122218,-386.11284,-386.111896,-386.156541,34.668,-1825.519062368,-1837.4091029,-1848.671634432,-1693.302288559\r\ngdb_118425,CCCC(CC)C(C)=O,1.96459,0.94101,0.738,2.5596,90.12,-0.2373,-0.0071,0.2301,1717.195,0.225816,-389.528883,-389.517048,-389.516104,-389.566973,41.694,-2312.8475718401,-2327.6398414971,-2341.866725545,-2147.3816061571\r\ngdb_106858,COC12CC1(NC2)C#N,2.16073,1.62037,1.12047,3.6598,75.46,-0.2237,0.0056,0.2293,1114.5958,0.135809,-417.9549,-417.946341,-417.945397,-417.988462,31.529,-1656.4875905471,-1666.225902718,-1675.708818726,-1541.0478973621\r\ngdb_12284,CCC1OC(C)C1C,3.15018,1.44187,1.17712,1.598,78.28,-0.2374,0.0826,0.32,1149.2025,0.198947,-350.204527,-350.195035,-350.194091,-350.239065,34.
729,-2013.501933498,-2027.0981710011,-2039.546694543,-1868.305136024\r\ngdb_114663,CCC1OC2CC1C2O,3.44093,1.12436,1.04441,0.7913,78.29,-0.2293,0.0692,0.2985,1256.0479,0.183144,-424.202064,-424.193553,-424.192609,-424.235018,33.504,-1971.7750950341,-1985.098366122,-1996.95326615,-1829.5721429991\r\ngdb_69077,CC12CC1(OC=N2)C=O,2.6939,1.71441,1.27771,1.5973,70.56,-0.2556,-0.0388,0.2169,986.5235,0.123361,-437.899208,-437.89135,-437.890406,-437.931331,29.781,-1633.8112978141,-1643.10094105,-1651.990861053,-1524.206183311\r\ngdb_5693,c1c[nH]c2c1CCN2,4.68054,2.03936,1.45986,1.5286,71.09,-0.17,0.0552,0.2252,835.0856,0.134989,-342.808753,-342.802358,-342.801414,-342.839049,25.394,-1605.302936435,-1615.5106253381,-1624.40117285,-1497.196940933\r\ngdb_105719,COC1(CCC=C1)C#C,2.36543,1.44912,1.18882,0.9472,83.72,-0.2481,0.0026,0.2507,1146.2929,0.157607,-385.853443,-385.84429,-385.843346,-385.887296,34.371,-1890.0307526221,-1901.1746849531,-1911.843592971,-1762.0189166221\r\ngdb_22236,CCCC(=NO)C(C)C,2.03289,0.9818,0.75779,0.473,88.74,-0.2386,0.0205,0.2591,1656.6481,0.21449,-405.525466,-405.514037,-405.513093,-405.562302,41.581,-2162.099202243,-2176.257060301,-2189.8909483441,-2002.223714241\r\ngdb_5933,CC#CC(C)(C)CO,3.65689,1.02646,0.98119,1.4674,80.07,-0.242,0.0582,0.3002,1319.8572,0.17333,-348.985338,-348.974576,-348.973632,-349.022382,37.287,-1876.3014832111,-1887.3236787961,-1898.586210328,-1746.0481850631\r\ngdb_113588,OCC1CC11CCCC1,3.86768,0.89931,0.82383,1.4393,86.16,-0.2567,0.0766,0.3333,1487.3764,0.206864,-388.288179,-388.278843,-388.277899,-388.323152,35.441,-2162.1462654181,-2176.7295745781,-2189.770466616,-2008.095315954\r\ngdb_31867,c1c(cnc(n1)CO)N,4.78815,0.99996,0.88621,2.598,77.06,-0.2247,-0.031,0.1937,1291.7954,0.125722,-434.107777,-434.099312,-434.098368,-434.141909,30.835,-1603.815112596,-1612.72323036,-1621.613150363,-1495.110473508\r\ngdb_34328,N#CC12COC1=NCC2,2.35558,1.87391,1.34341,3.8772,69.94,-0.2729,-0.0156,0.2573,959.7327,0.11444,-416.790965,-416.783803,
-416.782859,-416.822631,27.199,-1553.9595225461,-1562.797359302,-1571.0942833,-1450.701034069\r\ngdb_83473,OC1C2NCC12OC=N,3.34254,1.10077,0.96113,2.0817,70.71,-0.2394,0.0179,0.2572,1237.9008,0.135796,-455.074302,-455.065919,-455.064975,-455.107813,31.195,-1594.737567402,-1604.585693648,-1614.068609656,-1479.009847586\r\ngdb_43847,C1CC12CC(=O)C(=O)O2,3.22803,1.35717,1.04181,5.2672,67.15,-0.2561,-0.084,0.1722,1108.2837,0.111611,-457.798038,-457.79035,-457.789406,-457.83208,28.62,-1582.5971507791,-1591.104917801,-1599.401841799,-1480.9337901801\r\ngdb_118129,CCOC(C)C1OC1C,2.56161,1.01382,0.84766,1.8387,82.79,-0.2506,0.0793,0.3299,1493.3344,0.202532,-425.403762,-425.392824,-425.39188,-425.440598,38.952,-2097.999785402,-2111.5771976351,-2124.618089673,-1944.8618615331\r\ngdb_41030,C1CC23C4CC2(CN13)O4,3.39434,1.78611,1.51766,1.0627,76.46,-0.2144,0.0325,0.2469,912.9253,0.147721,-401.756372,-401.74956,-401.748616,-401.787069,28.62,-1680.5136551391,-1692.2374057861,-1702.313317799,-1556.8987747021\r\ngdb_129933,c1c(c(n[nH]1)N2CC2)N,2.93014,1.47361,1.04586,1.508,76.28,-0.1863,0.0325,0.2188,1127.2012,0.138343,-414.167307,-414.159032,-414.158088,-414.20098,30.987,-1628.899784871,-1638.815682089,-1648.298598097,-1513.169555019\r\ngdb_128597,COc1cnn[nH]c1=O,2.75458,1.56913,1.0061,2.8211,69.12,-0.2468,-0.0635,0.1833,1096.5884,0.102024,-469.99756,-469.990008,-469.989064,-470.029849,27.608,-1398.319720294,-1406.023020778,-1413.726948771,-1301.5735202191\r\ngdb_23295,ON=C1C2CC(O)C1N2,2.675,1.38918,1.19398,1.7049,72.05,-0.2395,0.005,0.2445,1080.7282,0.137141,-455.057909,-455.050049,-455.049105,-455.090283,30.696,-1584.450812365,-1594.6271258181,-1604.110041826,-1468.009614816\r\ngdb_99889,CCC(C)(C#C)C1CC1,2.17594,1.27977,1.0134,0.5503,91.21,-0.2553,0.061,0.3163,1331.0143,0.203073,-351.11274,-351.102163,-351.101219,-351.148075,40.097,-2188.73256673,-2202.5377647301,-2215.578656768,-2035.1647991961\r\ngdb_113781,CC1(CO1)C1OC1CO,3.08751,0.99094,0.96359,1.1621,71.01,-0.2675,0.0639,0.3314,
1313.2921,0.15646,-460.11065,-460.101084,-460.10014,-460.14617,34.589,-1778.0787544591,-1788.9622705551,-1799.6311785731,-1650.6009286181\r\ngdb_42754,O=C1C2CC1(CC#C)O2,2.87986,1.35593,1.13018,2.8799,71.36,-0.2554,-0.012,0.2433,1091.3383,0.109758,-420.53097,-420.522781,-420.521837,-420.564275,30.345,-1551.6854299301,-1559.879442452,-1568.17636645,-1449.81561887\r\ngdb_18792,CC1C2OC3CN1C23,3.48287,2.43653,2.16109,0.4991,66.69,-0.2103,0.0692,0.2794,729.5967,0.143782,-363.788425,-363.782182,-363.781237,-363.818438,25.28,-1604.4752520641,-1615.666875079,-1625.149791087,-1489.558273876\r\ngdb_107999,CCC12CC3C1C1C2N31,3.50417,1.46462,1.3443,1.9585,80.83,-0.2224,0.0731,0.2954,1043.2926,0.172156,-365.930356,-365.922968,-365.922023,-365.962123,30.017,-1926.0234138441,-1939.1634523041,-1950.424728818,-1789.9650299371\r\ngdb_52119,O=CNCC12CCC1O2,4.1425,0.91211,0.83298,2.3983,73.71,-0.2517,0.0277,0.2795,1388.2084,0.147396,-439.073105,-439.064459,-439.063514,-439.108028,31.334,-1742.590610473,-1753.1628821051,-1763.238166609,-1621.371559398\r\ngdb_31115,CC1=C(C[NH3+])C([O-])=CO1,2.342,1.54211,0.96604,3.8633,76.03,-0.1888,0.0284,0.2172,1205.5501,0.148553,-439.113465,-439.104657,-439.103712,-439.146919,33.012,-1767.9168737131,-1778.3874888871,-1788.462773391,-1645.7760119171\r\ngdb_114426,OCC1OC2C1OC2=O,3.30022,1.17106,1.15517,3.3794,60.83,-0.2689,-0.0233,0.2456,1096.3878,0.112451,-494.880854,-494.873205,-494.872261,-494.913629,28.092,-1497.8890833601,-1506.420695724,-1514.717619722,-1395.1746451861\r\ngdb_103909,COC1(C)CC1OC=N,2.54191,1.11582,0.96164,4.2255,78.36,-0.2486,0.0311,0.2797,1336.9511,0.167746,-440.251824,-440.241622,-440.240677,-440.287381,36.322,-1854.3957715301,-1865.7687446461,-1877.03002116,-1720.2035993891\r\ngdb_104350,OCC1(O)COC=NC1,3.27588,1.26409,1.12268,4.1378,68.05,-0.2438,0.0306,0.2744,1116.0564,0.148094,-476.222753,-476.214124,-476.21318,-476.255816,32.505,-1699.8202245421,-1710.4025363181,-1720.4784483311,-1577.177983055\r\ngdb_54432,[NH3+]C12CC1(NC2)C([O
-])=O,2.49958,1.62545,1.2198,5.9626,69.54,-0.2263,0.0106,0.2368,1038.2185,0.137306,-455.127312,-455.11926,-455.118316,-455.159778,31.291,-1628.0018194921,-1638.0576512171,-1647.5405672251,-1511.6183527711\r\ngdb_73602,CC12OCCC1C2C#N,2.33319,1.63404,1.32884,4.4687,76.25,-0.2462,0.0271,0.2733,1044.8252,0.148364,-401.961415,-401.953141,-401.952197,-401.994156,31.219,-1809.1799830261,-1819.9863155151,-1830.062227528,-1686.847730985\r\ngdb_4568,Cc1cnc(nc1)C,5.7343,1.52051,1.21995,2.0337,74.17,-0.2398,-0.0329,0.2069,991.6866,0.131169,-342.855663,-342.847735,-342.846791,-342.891086,27.435,-1634.739383625,-1643.985101231,-1652.875648743,-1529.850626766\r\ngdb_93221,CC1CC(O)C(=O)CN1,2.68693,1.30242,0.9487,2.2831,75.69,-0.2258,-0.0292,0.1966,1255.3348,0.171771,-440.310645,-440.301856,-440.300912,-440.343898,33.91,-1891.3064784191,-1903.566121752,-1914.828025775,-1755.6685255421\r\ngdb_84795,CC1C=CCN1C(C)=O,2.55273,1.49003,1.02597,3.5755,81.4,-0.23,0.0109,0.2409,1198.4277,0.170632,-403.198891,-403.18972,-403.188776,-403.232772,34.012,-1957.855690396,-1969.8762528001,-1981.138156823,-1822.8678368431\r\ngdb_73064,CC12OCC1C2OC=O,2.87799,1.23646,1.13506,4.8614,69.46,-0.2522,0.0102,0.2625,1142.1382,0.134176,-458.93879,-458.930301,-458.929357,-458.972209,30.929,-1670.5776776331,-1680.359915434,-1689.842831442,-1555.1524171551\r\ngdb_12665,CC1C(O)COC1=O,2.69988,2.30696,1.48628,5.0341,60.39,-0.2631,0.0156,0.2788,861.3009,0.130539,-420.922772,-420.915047,-420.914103,-420.954762,28.398,-1564.3742894191,-1573.746763843,-1582.637311355,-1457.179436985\r\ngdb_74923,CC1=CC2OC3C1C23C,2.83851,1.52464,1.33994,1.8101,80.65,-0.2194,0.0086,0.228,1063.2995,0.1589,-385.858531,-385.850548,-385.849604,-385.8906,31.627,-1893.2235184141,-1905.1016362751,-1915.770544293,-1764.0922063581\r\ngdb_50581,O=CC1CN2CC2CN1,3.30497,1.23733,1.06884,1.6482,76.77,-0.2143,-0.0262,0.1881,1159.9269,0.160891,-419.176266,-419.168207,-419.167263,-419.209271,30.78,-1795.054127927,-1806.8833000861,-1817.552208104,-1665.8939
50457\r\ngdb_124295,CC(=O)c1c(non1)O,2.79954,1.51904,0.99072,0.1632,61.83,-0.2717,-0.0809,0.1908,1096.081,0.087615,-489.858761,-489.850749,-489.849804,-489.892166,28.289,-1323.4930370981,-1330.019130698,-1337.1294351771,-1234.1847012\r\ngdb_693,OC1CC2OC12,6.64811,3.82504,3.12955,1.9201,45.31,-0.2521,0.0773,0.3294,456.2711,0.096669,-306.320451,-306.315325,-306.31438,-306.348781,19.349,-1131.127015603,-1138.575547433,-1145.094738434,-1053.516702483\r\ngdb_18595,CC1CC2OCCC12,3.30385,1.97159,1.61583,1.5016,73.19,-0.2403,0.0732,0.3135,890.9403,0.179191,-349.005706,-348.998409,-348.997465,-349.037181,28.81,-1889.082586523,-1902.279100793,-1913.541632325,-1755.334690754\r\ngdb_84983,OC1CC(=O)C(=C1)C#N,2.93723,1.23438,0.90458,6.1838,72.0,-0.2686,-0.0863,0.1823,1201.2576,0.100634,-436.72697,-436.718844,-436.7179,-436.760241,29.268,-1526.0730225861,-1533.417387922,-1541.121315915,-1430.559250187\r\ngdb_59150,CC(O)C1C2C3NC3C12,3.08477,1.25314,1.14665,2.7418,77.96,-0.2085,0.0691,0.2775,1159.7495,0.170277,-403.04637,-403.03782,-403.036876,-403.079603,33.694,-1862.1473902071,-1874.5576357,-1885.819539723,-1726.7529108221\r\ngdb_21314,COCc1ncno1,7.30185,1.16907,1.05473,2.6375,59.86,-0.2751,-0.0308,0.2443,1085.6971,0.107287,-415.84114,-415.833706,-415.832762,-415.874209,24.985,-1352.616984806,-1360.3943313521,-1368.098886854,-1259.109986189\r\ngdb_96582,CC1CCCCCOC1,2.09862,1.40886,0.97728,1.2059,88.51,-0.2396,0.0794,0.3189,1339.5036,0.23122,-389.494215,-389.484911,-389.483966,-389.527856,36.7,-2291.0930898281,-2307.4735847641,-2321.699841303,-2122.8353366041\r\ngdb_125140,Cc1cnoc1C2CC2,2.9043,1.33791,0.99329,3.2758,80.69,-0.2295,-0.0029,0.2266,1213.1166,0.148016,-401.942419,-401.934007,-401.933062,-401.976494,30.737,-1797.259822062,-1807.979558309,-1818.054842813,-1675.764667027\r\ngdb_58721,CN(C)C1(COC1)C=O,2.24587,1.41289,1.23696,1.0284,75.17,-0.2096,-0.0316,0.1779,1141.6431,0.16864,-440.245807,-440.23615,-440.235206,-440.280064,34.969,-1850.6200498771,-1862.335015398,-1873.59691
9421,-1715.612116036\r\ngdb_125741,CC1=NOC(=O)OC1=N,3.23072,1.49064,1.02646,3.3493,64.77,-0.3013,-0.0856,0.2157,1068.7976,0.087918,-489.894904,-489.88716,-489.886216,-489.927709,27.847,-1346.1730948851,-1352.867360897,-1359.978292885,-1256.488253587\r\ngdb_89951,OC1CC2(OCC12)C#N,3.43126,1.31331,1.12553,3.1904,69.07,-0.2791,-0.0064,0.2727,1071.036,0.124241,-437.853668,-437.845702,-437.844758,-437.887066,29.777,-1605.2345379541,-1614.4564102181,-1623.346330221,-1496.429497426\r\ngdb_4540,Cc1cc(n(c1)C)N,3.87323,1.63812,1.17311,1.6297,76.74,-0.1777,0.0578,0.2355,1040.3809,0.154315,-344.021154,-344.012357,-344.011412,-344.054271,31.865,-1738.24385563,-1748.7213734031,-1758.797285416,-1618.5371012451\r\ngdb_12447,C1CNC(=O)CC=C1,3.27097,1.96047,1.39077,3.7897,70.07,-0.2332,0.0136,0.2468,903.7043,0.14475,-363.906958,-363.89973,-363.898786,-363.938511,27.696,-1678.855776361,-1689.429303011,-1698.9128465279,-1564.905162033\r\ngdb_47153,C#CCC1=CCNC1=O,3.32179,1.12758,0.85059,3.6925,76.87,-0.2465,-0.0228,0.2237,1290.3089,0.123786,-400.76739,-400.75876,-400.757816,-400.801452,31.437,-1687.770169215,-1696.5760030121,-1705.465923015,-1579.637818335\r\ngdb_9657,OCC1CC1(O)C#C,3.50895,1.40166,1.30699,2.7097,67.79,-0.2464,0.0249,0.2712,997.2798,0.12808,-383.679838,-383.671231,-383.670287,-383.712704,32.072,-1548.6068707761,-1557.4265097711,-1566.317057283,-1442.217739898\r\ngdb_10595,CC1CC1(C=O)C=O,2.9714,1.76259,1.32317,0.7415,67.5,-0.2539,-0.0451,0.2089,946.1145,0.127904,-383.756382,-383.748054,-383.74711,-383.788923,29.864,-1596.638919672,-1605.6336336781,-1614.52418119,-1490.0458483691\r\ngdb_131306,C1CC1n2nc(nn2)O,5.89286,1.01522,0.93016,1.7941,68.65,-0.2493,-0.0238,0.2255,1225.2913,0.114145,-450.096028,-450.088558,-450.087613,-450.129754,27.543,-1447.838338011,-1456.481646977,-1464.777943466,-1345.256304236\r\ngdb_35058,C#CC1C2OCCOC12,2.51138,1.51995,1.48532,1.2055,74.19,-0.2225,0.0444,0.2669,997.5226,0.135976,-421.757118,-421.749275,-421.748331,-421.789451,30.338,-1693.25271534
81,-1703.440951472,-1712.9238674801,-1577.402003768\r\ngdb_87323,CN1CC1(CC#N)C=O,1.86697,1.68478,1.05643,5.2352,73.75,-0.2565,-0.0403,0.2162,1156.0436,0.133746,-417.99736,-417.988059,-417.987115,-418.031826,33.051,-1683.1316226871,-1692.4043231801,-1701.887239188,-1568.2591976381\r\ngdb_1831,N=C1NC(=O)CO1,6.67684,2.30384,1.73163,1.4121,49.76,-0.2671,-0.0122,0.2549,655.7217,0.080471,-376.631067,-376.625417,-376.624473,-376.660653,20.584,-1124.984957511,-1131.215494372,-1137.141689368,-1051.790425224\r\ngdb_4478,Cc1c(nc([nH]1)O)O,3.61312,1.81145,1.2157,1.6176,60.53,-0.178,0.0565,0.2345,931.379,0.107576,-415.90902,-415.901174,-415.90023,-415.940818,28.797,-1395.212295726,-1402.731108564,-1410.4356640661,-1300.90773317\r\ngdb_70204,CC12CCC(=O)CC1O2,3.32745,1.27811,1.08485,2.5935,74.96,-0.2391,-0.0147,0.2244,1155.1687,0.159032,-423.055282,-423.046967,-423.046023,-423.088321,32.04,-1880.01068891,-1891.6798462741,-1902.348754292,-1751.2320369121\r\ngdb_128961,C(C#N)Oc1nnon1,8.01753,0.80475,0.73471,2.1195,57.97,-0.3131,-0.0879,0.2252,1369.2235,0.064968,-484.794439,-484.787058,-484.786114,-484.827631,24.951,-1122.597913275,-1127.742232057,-1133.667172035,-1046.166689566\r\ngdb_11454,CCC1COC(C)=N1,4.04288,1.37469,1.09966,1.0475,73.23,-0.2392,0.0301,0.2693,1129.8143,0.165574,-365.098687,-365.090101,-365.089157,-365.132258,30.745,-1798.824829508,-1810.3233044241,-1820.992839951,-1672.76956657\r\ngdb_31688,CNc1cn(cn1)C=O,5.22204,0.92551,0.79187,3.5737,77.31,-0.1998,-0.0423,0.1575,1384.3945,0.125297,-434.115944,-434.107515,-434.106571,-434.149275,30.316,-1608.939978599,-1617.870686687,-1626.76060669,-1499.732704802\r\ngdb_61420,CC(NC(N)=[NH2+])C([O-])=O,3.02836,0.98188,0.81787,7.2725,71.9,-0.2368,0.0229,0.2598,1397.7658,0.147484,-472.459295,-472.44955,-472.448606,-472.494267,35.834,-1687.377348581,-1697.258732804,-1707.334644817,-1565.572831609\r\ngdb_11467,CCC1CCC(C)C1,4.76364,1.08397,0.94546,0.1158,85.29,-0.2965,0.0813,0.3778,1342.9761,0.224568,-314.302021,-314.292789,-314.2918
45,-314.336171,34.744,-2211.0135287931,-2226.5506516331,-2240.185167185,-2052.458319727\r\ngdb_56356,C#CCCN=C(C#N)N,5.96905,0.65299,0.62195,4.508,81.19,-0.2611,-0.0456,0.2156,1713.7503,0.122431,-396.903722,-396.893909,-396.892965,-396.93937,34.733,-1612.444616364,-1620.507479505,-1629.397399508,-1504.947304592\r\ngdb_114312,CC1C2OC(CO)C12O,2.29259,1.31466,1.12208,2.7939,71.37,-0.2367,0.0598,0.2964,1185.0184,0.15748,-460.104068,-460.094804,-460.09386,-460.137911,34.791,-1773.9484902211,-1785.021514035,-1795.690422053,-1645.418331787\r\ngdb_78318,CC1C2C1C1(C)OCC21,2.98889,1.38564,1.21215,1.6767,82.09,-0.2401,0.0743,0.3144,1146.4977,0.182112,-387.041046,-387.032666,-387.031722,-387.073355,33.44,-2007.4107036351,-2020.8168059111,-2032.671705939,-1865.0590319671\r\ngdb_82682,CN1C2CC3C2C13C#N,2.89133,1.63109,1.26772,4.5602,77.7,-0.2315,0.0083,0.2399,1010.8221,0.136725,-380.812246,-380.804712,-380.803768,-380.844094,29.374,-1703.6467744241,-1714.0289108291,-1723.511826837,-1587.387554485\r\ngdb_94457,CC1CCC11CCCC1,2.55138,1.34868,1.18403,0.0688,91.3,-0.2737,0.074,0.3477,1230.393,0.231025,-352.369295,-352.360236,-352.359292,-352.403118,35.832,-2349.3805183111,-2365.9153804611,-2380.142264509,-2181.4929953971\r\ngdb_127674,Cn1c(ncn1)NC=N,3.54619,1.12818,0.86058,1.801,76.68,-0.2232,-0.0013,0.222,1284.3414,0.126489,-430.278182,-430.269513,-430.268569,-430.312887,29.873,-1549.870673902,-1558.650152321,-1567.540072324,-1441.165407305\r\ngdb_117028,COCC(=O)C(=O)C#N,4.48149,0.81535,0.69585,4.5926,65.9,-0.2839,-0.1359,0.148,1524.1336,0.096857,-473.82284,-473.813146,-473.812202,-473.858849,31.655,-1449.5564576531,-1455.916261368,-1463.620189361,-1355.504781224\r\ngdb_110605,CCC1C2C3CC3(C)C12,2.95433,1.21108,1.07486,0.1742,89.87,-0.2518,0.0954,0.3472,1277.8543,0.205888,-351.120274,-351.111228,-351.110284,-351.153817,35.496,-2193.460219536,-2208.2261338151,-2221.267025853,-2038.7679558741\r\ngdb_126319,CCc1cn(nn1)C=O,5.0343,0.91815,0.78404,5.4308,74.72,-0.2691,-0.0671,0.202,1409.4024,
0.123984,-434.083582,-434.07518,-434.074236,-434.117524,29.663,-1588.632532341,-1597.580183172,-1606.470103175,-1479.808666543\r\ngdb_129814,NC1=CC(=N)C(O)=NN1,3.22637,1.39802,0.97732,5.0204,75.68,-0.1939,-0.016,0.1779,1129.1617,0.115387,-450.140635,-450.132776,-450.131832,-450.172659,30.738,-1475.829631974,-1484.228839939,-1492.525763937,-1372.179577881\r\ngdb_38250,C1C2CC11OCCC1C2,3.09773,1.71445,1.40246,1.436,81.19,-0.2371,0.0851,0.3221,997.6806,0.186738,-387.082235,-387.075375,-387.074431,-387.11328,29.175,-2033.2571718361,-2047.6170877921,-2059.47198782,-1890.112328792\r\ngdb_114542,OCC1CN2CC(C2)O1,3.61186,1.20964,1.08296,1.6484,74.29,-0.2169,0.0787,0.2956,1167.0872,0.173263,-440.236537,-440.228731,-440.227787,-440.269027,31.002,-1844.803041447,-1857.6795261271,-1868.94143015,-1708.686299203\r\ngdb_20567,CC(O)C1=CNN=C1,4.52633,1.49652,1.1682,1.429,65.83,-0.2374,0.0204,0.2579,993.631,0.132124,-379.93995,-379.932309,-379.931365,-379.97237,28.274,-1550.9543819451,-1560.379567125,-1569.270114637,-1443.925191887\r\ngdb_71812,CC12OC3C4CN4C1C23,3.46216,1.65847,1.58506,1.228,74.17,-0.2035,0.0626,0.2661,924.7402,0.148829,-401.87351,-401.8667,-401.865756,-401.904269,28.385,-1754.0188043811,-1765.743810046,-1775.819722059,-1630.442829502\r\ngdb_30816,CCc1c(cc([nH]1)C)N,2.3991,1.23698,0.87018,1.4584,89.02,-0.1677,0.0492,0.217,1404.9252,0.182439,-383.316818,-383.306337,-383.305393,-383.352944,37.573,-2019.585005744,-2031.672084102,-2043.52698413,-1879.268345763\r\ngdb_120398,CCCC1OC(=N)C1N,2.98243,0.90244,0.77906,1.6372,80.89,-0.2463,0.0222,0.2685,1555.1441,0.18255,-420.409147,-420.399168,-420.398224,-420.444732,36.101,-1940.8464314421,-1953.247891809,-1965.1027918371,-1799.93426542\r\ngdb_126845,CCn1c(c(nn1)O)O,3.1125,1.30448,0.99436,3.0725,67.87,-0.2072,0.0121,0.2194,1173.0957,0.123829,-471.200371,-471.191352,-471.190408,-471.234478,32.2,-1525.242828179,-1533.802678448,-1542.692598451,-1416.266477694\r\ngdb_74311,CC1=NC(C=O)C(=N)N1,2.67877,1.40413,1.06783,4.7326,73.47,-0.
233,-0.0276,0.2054,1138.8855,0.123518,-434.103074,-434.094293,-434.093349,-434.137212,31.195,-1600.863937769,-1609.573762689,-1618.463682692,-1492.163063735\r\ngdb_42535,O=C1C2C3C(C#N)C2C13,5.84229,1.06787,1.00904,2.8317,70.79,-0.2626,-0.0242,0.2383,1115.0566,0.101686,-399.501747,-399.494972,-399.494028,-399.533225,25.814,-1521.4194158421,-1529.6121733461,-1537.316101339,-1425.0365434781\r\ngdb_63427,CC1(CC(C1)CC#N)C,3.76724,0.79592,0.78202,4.1309,86.83,-0.2957,0.0359,0.3317,1572.3841,0.193243,-367.233925,-367.224132,-367.223187,-367.268805,36.838,-2116.1730735511,-2129.5810583541,-2142.028326878,-1968.6971633891\r\ngdb_73683,CC12CCOC1COC2,2.28353,1.78383,1.36944,1.672,76.48,-0.2408,0.0773,0.3182,1043.1857,0.18483,-424.24103,-424.233064,-424.23212,-424.273358,31.539,-1996.2266107281,-2009.8918742211,-2021.7467742491,-1853.630838059\r\ngdb_19911,CCN1C2C3OC3C12,4.63943,1.60725,1.45472,1.5436,68.08,-0.2165,0.0467,0.2632,917.2156,0.14086,-363.752087,-363.744829,-363.743884,-363.783841,27.903,-1581.672830022,-1592.227531402,-1601.71044741,-1467.848345003\r\ngdb_33746,C#CC12CC(C1)C21CC1,2.35969,1.65471,1.2622,0.5533,87.65,-0.2561,0.0412,0.2973,1077.3612,0.158899,-348.653476,-348.64553,-348.644586,-348.685583,32.25,-1901.2255131821,-1913.1274763851,-1923.796384403,-1772.3740701401\r\ngdb_97035,CN=C1COC(=N)N1C,3.14954,1.3063,0.9396,2.444,79.8,-0.2276,0.0095,0.237,1236.5846,0.14774,-435.300991,-435.291556,-435.290612,-435.336512,32.617,-1724.7160166081,-1734.79255613,-1744.868468143,-1603.512025749\r\ngdb_119576,COCC12CC1CC2C,3.23429,0.97669,0.87097,1.1972,87.01,-0.2399,0.0888,0.3286,1470.8737,0.204762,-388.254763,-388.244949,-388.244005,-388.28971,36.445,-2141.1774246741,-2155.4607845321,-2168.50167657,-1987.1101599761\r\ngdb_48125,N=C1OCC1C(=O)C=O,2.63926,1.31702,1.06013,2.0001,64.29,-0.2638,-0.1065,0.1572,1117.2508,0.099248,-473.828231,-473.819906,-473.818962,-473.862481,28.686,-1452.9393586721,-1460.158222208,-1467.862150201,-1357.783893912\r\ngdb_73689,OC12CCCC1CCC2,2.4
6674,1.60878,1.24915,1.1826,84.13,-0.2554,0.0776,0.3329,1123.0337,0.20868,-388.327857,-388.319636,-388.318691,-388.360619,33.732,-2187.04456752,-2202.3275492151,-2215.367813744,-2031.606195657\r\ngdb_30745,Cc1c(c[nH]c1OC)N,2.9199,1.27071,0.94394,1.1715,80.11,-0.1697,0.0596,0.2293,1265.8452,0.158559,-419.226982,-419.216865,-419.215921,-419.262014,36.265,-1826.8788743711,-1837.4166330081,-1848.085541026,-1698.9906576441\r\ngdb_38120,C1C2C=CC3COC12C3,2.4365,2.27497,1.6022,1.3024,79.25,-0.2234,0.009,0.2324,901.796,0.162604,-385.859819,-385.853271,-385.852326,-385.890197,28.343,-1894.0317500061,-1906.8103432821,-1917.478623791,-1763.8393202311\r\ngdb_74915,CC1=CC2OC3C(C1)C23,3.47022,1.50015,1.26354,1.8133,79.91,-0.2165,0.0101,0.2265,1048.7827,0.160706,-385.863699,-385.856369,-385.855425,-385.895206,29.753,-1896.466484926,-1908.754366164,-1919.423274182,-1766.982512812\r\ngdb_95995,[NH3+]C1COCC1C([O-])=O,2.45141,1.47923,0.9933,5.5389,67.13,-0.2552,0.02,0.2752,1159.883,0.149924,-476.253513,-476.245302,-476.244358,-476.287157,30.771,-1719.1224013821,-1729.96701192,-1740.042923933,-1596.8447426241\r\ngdb_12018,CC1(CN1C2CC2)C,3.75256,1.33024,1.2731,1.1335,80.27,-0.2221,0.0869,0.309,1103.7353,0.187227,-329.089652,-329.080797,-329.079852,-329.122761,33.739,-1929.488518542,-1942.5959265341,-1954.450826562,-1790.0284083461\r\ngdb_98186,CCC(=O)C(C)(O)CC,2.76921,0.96224,0.9225,2.4581,82.31,-0.2372,-0.0074,0.2298,1473.8478,0.201154,-425.459173,-425.447558,-425.446614,-425.496386,41.784,-2132.7706866011,-2145.9232752411,-2158.964167279,-1979.869333625\r\ngdb_37965,C1C2C3COCCC1C23,2.53987,1.87149,1.36716,1.3158,80.87,-0.2355,0.0784,0.3139,1011.7748,0.186485,-387.060154,-387.053105,-387.052161,-387.091188,29.771,-2019.4011456071,-2033.6424623621,-2045.49736239,-1876.249399964\r\ngdb_21855,OC1=CC(=N)C=NN1,3.99082,1.90402,1.28902,3.3219,66.03,-0.2047,-0.0306,0.1741,866.0969,0.097863,-394.788323,-394.781689,-394.780745,-394.818969,25.432,-1307.545523372,-1314.936324374,-1322.047883871,-12
19.151468087\r\ngdb_107463,CC1(O)CN2CC12CO,2.59931,1.38781,1.19875,3.6611,74.48,-0.2334,0.0603,0.2937,1132.731,0.169706,-440.233789,-440.224619,-440.223675,-440.267096,35.515,-1843.078646715,-1855.0992091191,-1866.361113142,-1707.474579324\r\ngdb_1281,CC[NH2+]CC([O-])=O,6.15033,1.33409,1.23569,5.3151,57.48,-0.248,0.0213,0.2693,983.9296,0.136774,-362.945907,-362.938068,-362.937124,-362.978657,27.354,-1470.468822587,-1479.769133476,-1488.660308497,-1366.140431265\r\ngdb_82501,CN1C2CC3(C)OC2C13,3.37018,1.44969,1.40235,0.9239,78.46,-0.2118,0.0786,0.2904,1040.958,0.171329,-403.090987,-403.083286,-403.082342,-403.122566,31.423,-1890.14495926,-1903.087959894,-1914.349863917,-1753.712579989\r\ngdb_79150,CC1C2C3CC(O)C2C13,4.20928,1.18509,1.10274,1.3004,82.66,-0.2574,0.0805,0.3379,1185.3304,0.184119,-387.054923,-387.04704,-387.046096,-387.086952,31.957,-2016.118646028,-2029.8366202771,-2041.691520305,-1873.5912718401\r\ngdb_70928,CC12CC3(C1)COCC23,3.36074,1.44181,1.23773,1.814,83.01,-0.2399,0.0921,0.332,1090.4344,0.184288,-387.026341,-387.018601,-387.017657,-387.058248,31.431,-1998.1831837901,-2011.9908918261,-2023.845791854,-1855.5792535041\r\ngdb_93623,CC1C(O)CCC1C#C,1.88986,1.60968,1.33233,1.8067,83.13,-0.2469,0.0509,0.2979,1130.9788,0.181825,-387.070674,-387.061081,-387.060137,-387.104994,36.462,-2026.002540287,-2038.647474146,-2050.502374174,-1884.912789218\r\ngdb_43650,O=C1NC(CC#N)C=C1,3.65084,1.04645,0.84864,4.8153,72.33,-0.2586,-0.0471,0.2114,1274.6824,0.113754,-416.868692,-416.860703,-416.859758,-416.902031,29.085,-1602.7339145891,-1611.0528014021,-1619.349097891,-1500.525248669\r\ngdb_130149,C(C=O)c1[nH]nc(n1)N,4.68972,0.87535,0.80391,2.4745,69.21,-0.218,-0.0351,0.1829,1378.9688,0.11371,-450.164655,-450.156202,-450.155258,-450.199491,29.947,-1490.902398154,-1498.928865773,-1507.225789771,-1389.016899369\r\ngdb_54438,CC(=O)C12CC1C(=O)N2,3.46955,1.22935,1.01526,2.1124,70.87,-0.253,-0.038,0.215,1150.233,0.122944,-437.901085,-437.892772,-437.891828,-437.934412,30.743,-1
634.9891322071,-1643.9932588481,-1652.883178851,-1526.1395385401\r\ngdb_80869,CC1OC2CC1(O)C2O,2.19051,1.73582,1.48139,2.4037,70.16,-0.2452,0.0601,0.3053,1005.4438,0.158921,-460.140004,-460.131654,-460.13071,-460.172194,33.336,-1796.4986536451,-1808.1452206851,-1818.814128703,-1666.9312228341\r\ngdb_59463,CC(O)C1OC1C(C)O,2.96842,0.94842,0.91831,2.0823,75.33,-0.262,0.0675,0.3296,1425.3946,0.179594,-461.34073,-461.330355,-461.329411,-461.376369,38.226,-1922.1134052651,-1934.266372068,-1946.121272096,-1781.339291223\r\ngdb_53304,CC#CC1OC1CC#N,3.26643,0.77787,0.66885,5.5249,79.7,-0.2643,0.0078,0.272,1647.8048,0.121859,-400.716657,-400.706989,-400.706044,-400.753263,32.67,-1655.9347551181,-1664.0892345731,-1672.978527067,-1549.398787134\r\ngdb_18823,O=C1OC2C3CC1C23,4.37724,2.28664,1.90134,4.3867,59.57,-0.2628,0.008,0.2708,704.5152,0.109737,-382.545167,-382.539703,-382.538759,-382.574615,22.136,-1464.442226151,-1473.457020445,-1481.161575947,-1369.279231283\r\ngdb_33018,COCc1cncnc1,5.05452,0.92154,0.78868,3.5786,77.25,-0.2466,-0.0364,0.2102,1415.8532,0.137482,-418.043686,-418.035383,-418.034439,-418.078382,29.116,-1712.201604621,-1722.1005590961,-1731.583475104,-1597.473506642\r\ngdb_16160,CCC12CC1OC=N2,4.4879,1.6632,1.42933,1.3111,67.56,-0.2428,0.0171,0.2599,911.9846,0.142992,-363.864647,-363.857445,-363.856501,-363.896078,27.52,-1652.305243062,-1662.895084946,-1672.378628463,-1538.278072636\r\ngdb_45200,O=C1NC2CC3CC12C3,3.43225,1.55368,1.31547,3.5471,76.3,-0.2431,0.0136,0.2567,988.6704,0.151052,-401.921325,-401.91469,-401.913746,-401.95209,27.802,-1784.023147216,-1795.8579669561,-1805.933878969,-1660.450937391\r\ngdb_46146,O=C1CCC(CO1)C#N,4.57942,0.99681,0.84837,1.7271,69.71,-0.279,-0.0093,0.2697,1273.9211,0.125728,-437.942464,-437.934451,-437.933506,-437.975605,29.167,-1660.954827118,-1670.1472064591,-1679.036498953,-1551.9885167771\r\ngdb_9691,COC1CC1(C)OC,3.92056,1.25435,1.20242,0.7472,72.23,-0.2346,0.0853,0.3199,1125.7567,0.173751,-386.103651,-386.094011,-386.093066,
-386.137713,34.526,-1813.8681027651,-1825.5937359391,-1836.855639962,-1681.487549107\r\ngdb_63099,CC1(O)C2COC(=N)C12,2.76079,1.41781,1.25918,2.1319,73.61,-0.2495,0.0238,0.2732,1067.3986,0.147824,-439.091629,-439.083375,-439.082431,-439.124051,32.109,-1754.2145871891,-1765.0328423491,-1775.108754362,-1631.4261361051\r\ngdb_17119,N=C1CC(O1)C1CO1,4.71945,1.47619,1.39581,3.418,62.48,-0.2645,0.0269,0.2914,914.4486,0.118982,-399.771385,-399.76434,-399.763396,-399.803227,25.85,-1457.449265855,-1466.359893655,-1474.657445162,-1356.794939728\r\ngdb_24557,N1C=CC=CN=CN=C1,2.19181,2.05929,1.21554,3.9081,84.07,-0.2047,-0.0121,0.1925,1008.881,0.127025,-396.931743,-396.924205,-396.923261,-396.963717,28.734,-1630.028046053,-1639.518492169,-1648.408412172,-1520.225266215\r\ngdb_47100,O=C1COC2CCOC12,2.96306,1.63291,1.24587,2.9656,68.09,-0.2364,-0.0361,0.2004,1004.3637,0.137187,-458.981931,-458.974483,-458.973539,-459.014521,27.882,-1697.6490434021,-1708.0845180721,-1717.56743408,-1581.703577963\r\ngdb_46916,O=C1CNC2CC2CO1,3.30426,1.3986,1.0859,4.6281,72.32,-0.2389,0.018,0.2569,1102.6159,0.150093,-439.092301,-439.084724,-439.08378,-439.12453,29.689,-1754.6362732371,-1765.8793519901,-1775.955264003,-1631.726712916\r\ngdb_93940,NC(=O)N1CCC(O)C1,2.65814,1.47133,1.19901,3.7903,71.43,-0.2379,0.0407,0.2786,1089.9322,0.161689,-456.390179,-456.381559,-456.380615,-456.423773,32.907,-1792.610607881,-1804.087119982,-1814.756028,-1663.56400954\r\ngdb_24266,C1C2OC2C2=C1C=CO2,3.97975,1.58899,1.23468,1.9975,70.58,-0.2155,0.005,0.2205,959.9253,0.115218,-420.628809,-420.622621,-420.621676,-420.659296,25.114,-1613.0802829811,-1622.529941012,-1630.826237501,-1509.442151559\r\ngdb_42143,C1CCOCCOCC1,1.81837,1.76255,1.0201,1.4557,80.49,-0.2372,0.0776,0.3148,1234.2606,0.208279,-425.398486,-425.389778,-425.388834,-425.431451,34.369,-2094.6890479181,-2109.665805221,-2122.706697259,-1939.12203671\r\ngdb_127423,Cc1c(nn(n1)C)CO,2.92006,1.22167,0.91655,1.5294,76.33,-0.2418,0.0007,0.2424,1287.027,0.147725,-435.278
455,-435.26885,-435.267905,-435.31413,33.05,-1710.574473784,-1720.544336776,-1730.61962128,-1589.467119311\r\ngdb_96751,CN=C(C=O)OCC#C,3.99814,0.80265,0.674,1.3781,76.62,-0.2478,-0.0772,0.1706,1590.9628,0.119753,-437.845183,-437.835118,-437.834173,-437.881383,34.188,-1599.9101240891,-1607.814854962,-1616.704147456,-1492.863363779\r\ngdb_13059,O=CNC1COC1=O,4.38663,1.30255,1.11739,3.321,56.33,-0.2738,-0.0106,0.2633,1009.0202,0.095756,-435.777782,-435.770439,-435.769495,-435.810577,25.208,-1325.130208079,-1332.0761052,-1339.187664697,-1238.188836129\r\ngdb_53717,C#CC#CC#CC(=O)N,11.40399,0.46393,0.44579,3.7775,107.3,-0.2478,-0.0798,0.1681,2245.2658,0.075317,-398.304997,-398.295577,-398.294633,-398.340225,32.151,-1398.299640006,-1403.055530717,-1409.5734667,-1317.640888164\r\ngdb_87439,CC1OC11C2C3OC2C13,4.70663,1.18428,1.10763,2.0592,72.02,-0.2346,0.081,0.3156,1117.0208,0.134513,-421.704494,-421.697192,-421.696248,-421.736189,28.854,-1660.2306817321,-1670.758400225,-1680.2413162331,-1543.9796194101\r\ngdb_104994,CC(C#N)C1(CO)CO1,2.94923,1.1403,0.97975,2.7636,71.38,-0.2787,0.0259,0.3046,1237.5426,0.14607,-439.074209,-439.064773,-439.063829,-439.108765,34.279,-1743.283380409,-1753.3599199311,-1763.435831944,-1621.834033531\r\ngdb_99822,CCC(O)(C#N)C(N)=O,2.08782,1.32988,1.20378,4.4696,69.14,-0.2652,-0.0072,0.258,1157.9839,0.133379,-455.190767,-455.180495,-455.179551,-455.226249,36.541,-1667.8204030871,-1676.483164832,-1685.96608084,-1553.32950351\r\ngdb_101388,CCC(O)C1CN=CO1,3.9986,0.96626,0.92476,2.0562,75.02,-0.252,0.0199,0.2719,1353.6309,0.171158,-440.303784,-440.294578,-440.293634,-440.338296,33.654,-1887.0011391701,-1898.99911125,-1910.261015273,-1752.1532201241\r\ngdb_61694,CC(OC=N)(C#N)C#N,2.21722,1.27371,1.03555,4.2089,69.5,-0.3009,-0.0456,0.2552,1181.8294,0.098098,-432.864814,-432.855421,-432.854476,-432.899479,32.226,-1451.6962633431,-1458.244947267,-1465.948247751,-1356.697048324\r\ngdb_44508,O=C1CC2C3OC2C3O1,4.42561,1.41569,1.22312,2.6048,64.22,-0.2597,-0.0029,0
.2568,969.9124,0.113964,-457.73307,-457.726737,-457.725793,-457.763979,25.321,-1541.8291460671,-1551.1871877841,-1559.4841117821,-1438.199799771\r\ngdb_28507,Cc1c(nc(cn1)O)O,3.15962,1.42772,0.98936,2.3979,73.24,-0.2228,-0.0338,0.189,1127.1918,0.113502,-454.022857,-454.015019,-454.014074,-454.055241,29.695,-1562.797986811,-1571.210999974,-1579.507296463,-1459.734026124\r\ngdb_95357,CC1CCC2OCC=C12,2.81658,1.51276,1.08249,1.7201,84.24,-0.2261,0.0179,0.244,1164.8032,0.184475,-387.107491,-387.099312,-387.098367,-387.140513,31.769,-2049.10553914,-2062.6377707251,-2074.492043244,-1907.2012813891\r\ngdb_83517,CC1C2COC1C1CC21,2.7831,1.71368,1.54489,1.4708,80.21,-0.2376,0.0941,0.3317,986.3393,0.185994,-387.082944,-387.075772,-387.074827,-387.113977,30.369,-2033.702075717,-2047.866208865,-2059.720481384,-1890.549702565\r\ngdb_39919,C1NC11C=CC2CC1O2,3.25747,1.61317,1.40927,1.0502,76.68,-0.2364,-0.0005,0.2359,968.8015,0.150191,-401.900378,-401.893563,-401.892619,-401.931316,28.325,-1770.878716193,-1782.600584313,-1792.676496326,-1647.415065425\r\ngdb_19622,C1CC11C2C3OC1C23,4.71702,2.00746,1.84113,1.7384,67.65,-0.2282,0.0791,0.3073,765.1082,0.131082,-346.508518,-346.502494,-346.501549,-346.538344,24.875,-1577.777881659,-1588.2190039101,-1597.108923913,-1469.7371470931\r\ngdb_71414,CC12CC3C(C#N)C1N23,3.43108,1.3001,1.23156,2.5524,75.71,-0.263,0.0164,0.2795,1069.9923,0.137089,-380.834575,-380.826991,-380.826047,-380.866596,29.24,-1717.658422885,-1728.0091838401,-1737.492099848,-1601.5077620031\r\ngdb_38046,C1C=CC23OC(C=C2)C13,3.36718,1.76397,1.67918,2.1769,76.45,-0.2319,-0.001,0.2309,873.71,0.137848,-384.606134,-384.599962,-384.599018,-384.63616,26.913,-1735.184749255,-1746.4221804271,-1755.905096435,-1618.1423980841\r\ngdb_15882,CC1NC(=O)C1CO,2.8162,1.86948,1.24408,4.5353,64.89,-0.2553,0.026,0.2813,976.4901,0.142212,-401.020645,-401.012345,-401.011401,-401.053476,30.252,-1613.519539281,-1623.419748774,-1632.903292291,-1500.114857783\r\ngdb_92143,CC12C3C(O)CC1CN23,2.60984,1.56745,1
.36044,2.4287,77.86,-0.2227,0.0707,0.2934,1052.2461,0.172034,-403.113868,-403.105786,-403.104842,-403.146062,32.103,-1904.5029926891,-1917.2069123941,-1928.468816417,-1768.456531453\r\ngdb_7356,CC(C)C(=O)COC,4.31582,0.96534,0.9375,3.4249,72.97,-0.2402,-0.0094,0.2308,1368.3204,0.17342,-386.149151,-386.138963,-386.138019,-386.185521,34.936,-1842.4197622651,-1853.8015205071,-1865.064052039,-1711.4874993791\r\ngdb_62678,CC1(C)C2C3NC3C12C,2.35542,1.69653,1.27572,1.7465,86.72,-0.1985,0.0748,0.2732,1111.5169,0.192741,-367.125029,-367.115852,-367.114908,-367.158007,36.444,-2047.839853487,-2061.6343838341,-2074.082279867,-1899.1704212071\r\ngdb_85361,CC1CC(=O)C=CN1C,2.88595,1.33319,1.06539,5.5982,85.05,-0.2041,-0.0225,0.1816,1188.3868,0.171583,-403.189932,-403.180935,-403.179991,-403.223595,33.753,-1952.2338372651,-1964.363586235,-1975.625490258,-1817.10918675\r\ngdb_55953,CC(=O)NC1CC=CC1,3.77066,0.88671,0.88217,3.4136,82.08,-0.2396,0.0182,0.2578,1431.4122,0.171102,-403.200355,-403.190994,-403.19005,-403.236812,33.66,-1958.774363572,-1970.675699266,-1981.937603289,-1825.402973203\r\ngdb_2488,CC12CC3C(O1)C23,4.99234,3.05434,2.62386,1.6347,59.74,-0.2267,0.0851,0.3119,589.1686,0.12522,-308.455402,-308.449654,-308.448709,-308.484455,23.104,-1448.295164563,-1458.0202990451,-1466.317850552,-1348.8964839451\r\ngdb_76665,CC1C(O)C1NC(C)=O,2.74463,0.995,0.91511,3.8438,77.52,-0.2285,0.0387,0.2671,1395.8983,0.168679,-440.303083,-440.292567,-440.291623,-440.339701,37.126,-1886.5612553611,-1897.737190651,-1908.999094674,-1753.034870269\r\ngdb_72145,CC12OC3CC1CC2O3,2.53126,2.0415,1.80925,1.7722,72.86,-0.2508,0.0857,0.3365,888.874,0.161911,-423.038167,-423.031442,-423.030498,-423.068774,28.459,-1869.2708723751,-1881.9377690491,-1892.606677067,-1738.966118489\r\ngdb_79748,CC1C2N3CC3CC12C,2.3247,1.71306,1.33304,1.6176,85.0,-0.2199,0.0824,0.3023,1091.1176,0.195481,-367.172964,-367.164562,-367.163618,-367.205227,33.934,-2077.919497402,-2092.200347224,-2104.648243257,-1928.801396187\r\ngdb_77931
,CC1(C)C2C(O)C1C2O,2.40264,1.61678,1.11451,1.5556,78.5,-0.252,0.0669,0.3189,1138.3657,0.180802,-424.151018,-424.141399,-424.140455,-424.185008,36.918,-1939.7432706201,-1952.371261736,-1964.226161764,-1798.190417909\r\ngdb_87835,CC1OC11CC2OC12C,2.90364,1.35112,1.14871,0.2801,75.84,-0.2455,0.0823,0.3278,1151.2216,0.156549,-422.977697,-422.968824,-422.96788,-423.010997,33.497,-1831.325403145,-1842.6444104871,-1853.313318505,-1702.7105309961\r\ngdb_38644,C1C2OC3C=CC2OC13,2.88069,2.27663,1.97492,0.1885,70.16,-0.2267,0.0018,0.2285,789.3823,0.140009,-421.811109,-421.805548,-421.804604,-421.840596,24.988,-1727.1325537671,-1738.7527654291,-1748.235681437,-1609.4959515731\r\ngdb_101282,CCN(C)C1CC1C#N,2.15552,1.30696,0.93114,4.3932,84.77,-0.2141,0.0298,0.2439,1331.5932,0.181678,-383.25253,-383.242491,-383.241546,-383.288097,36.442,-1979.243707152,-1991.608144488,-2003.462417007,-1838.57626964\r\ngdb_71055,CC12CN=C3OC1(C)C23,2.37767,2.15368,1.32353,4.1062,77.54,-0.2115,-0.0464,0.1651,983.8056,0.144815,-401.822943,-401.814655,-401.813711,-401.855093,32.3,-1722.287556778,-1733.0851041411,-1743.161016154,-1599.584446918\r\ngdb_22862,ON=C1C=CC(=O)C1O,3.33419,1.26324,0.94216,2.8187,72.13,-0.2524,-0.0861,0.1664,1150.2044,0.100138,-473.802402,-473.794352,-473.793408,-473.83533,29.656,-1436.731428711,-1444.122857222,-1451.826785215,-1340.746397053\r\ngdb_17713,CC1C2C3N1C2C3=O,6.84325,1.51176,1.45571,3.6514,66.16,-0.2493,-0.0148,0.2345,870.3511,0.118683,-362.575774,-362.569251,-362.568307,-362.606275,25.16,-1471.377455619,-1480.6162706261,-1488.913822133,-1370.137663595\r\ngdb_99890,CCC(C)(C#C)C1CO1,1.85979,1.49489,1.1367,1.7224,83.98,-0.2555,0.0529,0.3084,1230.7517,0.179184,-387.036056,-387.025716,-387.024772,-387.07121,38.279,-2004.279433725,-2016.4556183611,-2028.310518389,-1863.7130251621\r\ngdb_16324,OCC12CC1NC2=O,3.73176,1.68547,1.33959,3.0817,61.53,-0.2458,0.0071,0.2529,905.483,0.119058,-399.781764,-399.77431,-399.773366,-399.813856,27.492,-1463.9621817661,-1472.616158385,-1480.9
13709892,-1363.4647328891\r\ngdb_84368,CC1C=CC(C)(C)C1O,2.26091,1.5266,1.2223,1.2538,86.8,-0.238,0.0251,0.2632,1174.6382,0.204455,-388.308208,-388.298186,-388.297242,-388.342183,38.757,-2174.714643179,-2188.8674811651,-2201.908373203,-2020.037439733\r\ngdb_79207,CC1C2C3CC1(C#C)N23,2.56317,1.65583,1.30122,1.9021,82.23,-0.2455,0.0174,0.2629,1037.6775,0.14731,-364.724666,-364.716714,-364.715769,-364.756792,31.456,-1797.2937075481,-1808.3027254441,-1818.378009948,-1674.8315611441\r\ngdb_11166,N=C1COC(O1)C#N,4.1811,1.6419,1.43838,2.4046,55.72,-0.2964,-0.0107,0.2858,845.2736,0.084543,-414.668982,-414.66224,-414.661296,-414.700493,23.844,-1244.9289102981,-1251.363387584,-1257.881951076,-1163.815214431\r\ngdb_120268,CCOC1C2NC2C1O,3.22627,1.09954,0.97613,2.3607,75.67,-0.2278,0.0711,0.2989,1300.3364,0.170488,-440.227135,-440.218129,-440.217185,-440.261302,33.697,-1838.903201829,-1851.026675709,-1862.288579732,-1703.8387921781\r\ngdb_65998,NC1(CNC1=O)C1CO1,2.80365,1.3182,1.17535,3.2728,69.1,-0.2391,0.0215,0.2606,1098.4168,0.136102,-455.12734,-455.11874,-455.117796,-455.160799,31.884,-1628.019389744,-1637.731346537,-1647.214262545,-1512.25903946\r\ngdb_91029,CC12C3CC3C1CC2O,2.09995,1.83141,1.54028,1.202,81.45,-0.2274,0.0709,0.2983,1023.2981,0.182531,-387.035835,-387.027542,-387.026598,-387.067892,33.799,-2004.1407542361,-2017.6014497951,-2029.456349823,-1861.6309503\r\ngdb_2526,COC(C=O)C=O,3.90626,2.07715,1.45287,2.2647,52.12,-0.2513,-0.0542,0.197,801.6864,0.09774,-381.585428,-381.577564,-381.576619,-381.617877,25.801,-1256.878564185,-1263.497529117,-1270.609088614,-1172.4698185591\r\ngdb_86398,CC1CC(C)C2(CO2)C1,2.61048,1.30117,0.97809,1.8167,85.29,-0.2547,0.0745,0.3293,1307.7633,0.205551,-388.298262,-388.288971,-388.288027,-388.332339,36.023,-2168.4734386651,-2183.0849857301,-2196.125877768,-2013.8602411371\r\ngdb_6277,CC(C)CC(O)C#C,3.75263,1.02212,0.95991,1.2402,77.59,-0.2633,0.0326,0.2959,1313.3192,0.173607,-348.974538,-348.964435,-348.963491,-349.009739,36.586,-1869.5243860
11,-1880.960110027,-1892.222641559,-1738.1145887761\r\ngdb_87766,CC12CC3(CC3O)C1O2,3.82907,1.06313,1.05913,2.596,76.84,-0.2396,0.0744,0.314,1234.8892,0.157112,-422.968563,-422.959832,-422.958888,-423.001659,33.705,-1825.5937359391,-1837.0018495591,-1847.670757577,-1696.8508519541\r\ngdb_27222,c1c2c(c(o1)N)[nH]cn2,3.56887,1.62187,1.12183,3.1491,71.05,-0.1905,-0.0065,0.184,996.4943,0.104227,-432.890482,-432.883466,-432.882522,-432.921705,27.161,-1467.803164355,-1475.843437172,-1483.547365165,-1370.644063358\r\ngdb_129308,N=C1ON=NC(=C1)C#C,3.826,1.23008,0.93082,2.5848,78.37,-0.2601,-0.1095,0.1506,1143.8279,0.075532,-431.585971,-431.578357,-431.577413,-431.618356,28.023,-1277.06239117,-1282.950308117,-1289.4682441,-1194.003417403\r\ngdb_18092,C1N=CN2C=NCC12,4.44071,2.21216,1.64749,0.9912,65.55,-0.232,0.0053,0.2373,761.7245,0.123166,-358.860259,-358.854291,-358.853347,-358.890556,22.701,-1489.019243645,-1498.605698638,-1506.903250145,-1387.291249619\r\ngdb_79584,CN1C2C3CC3(C)C12C,2.10585,1.66467,1.51916,1.2163,86.13,-0.2293,0.0864,0.3158,1080.8187,0.193114,-367.159577,-367.1504,-367.149455,-367.192555,35.883,-2069.519034419,-2083.313564766,-2095.76083329,-1920.849602139\r\ngdb_128757,COC=Nc1ncon1,6.90037,0.76877,0.70926,3.2396,69.67,-0.2565,-0.035,0.2215,1518.0566,0.100841,-469.994146,-469.985984,-469.98504,-470.028392,27.452,-1396.177404568,-1403.497924562,-1411.2018525551,-1300.659239606\r\ngdb_93038,OC1CC=CC2CC2C1,2.7318,1.43292,1.0951,1.4466,83.08,-0.2328,0.0092,0.242,1159.6685,0.184601,-387.084185,-387.076026,-387.075082,-387.116497,33.1,-2034.480814386,-2048.0255961511,-2059.880496179,-1892.1310252451\r\ngdb_15895,COC1C(C)CC1=O,2.65142,1.92943,1.30675,2.1754,68.81,-0.2257,-0.0244,0.2013,989.0128,0.151583,-384.937529,-384.928784,-384.927839,-384.97102,31.219,-1709.967672581,-1720.477820822,-1730.553732835,-1590.599773056\r\ngdb_50723,N=C1OCCCN1C=O,2.82664,1.48008,1.04633,6.2459,72.31,-0.2452,-0.0058,0.2394,1120.975,0.137569,-455.176564,-455.168514,-455.16757,-455.20
9801,29.596,-1658.90789276,-1668.964979503,-1678.447895511,-1543.0082354781\r\ngdb_75803,CN1C(=O)CC12CCC2,3.10629,1.2484,1.05111,3.8139,80.82,-0.2365,0.0404,0.2768,1207.8997,0.171004,-403.168091,-403.159237,-403.158293,-403.201867,32.97,-1938.528413196,-1950.747895953,-1962.009799976,-1803.474671198\r\ngdb_28076,c1c(nc([nH]1)N)NC=O,5.60358,0.95932,0.82035,3.693,71.57,-0.1873,0.0163,0.2036,1312.2113,0.115066,-450.194011,-450.185837,-450.184893,-450.226872,30.29,-1509.323552358,-1517.525094988,-1525.822018986,-1406.198723298\r\ngdb_109754,COC1C(O)C1C(C)=O,2.63069,1.19731,1.11313,3.0569,74.03,-0.2311,-0.0226,0.2085,1226.2215,0.156178,-460.151632,-460.141862,-460.140918,-460.186553,35.125,-1803.7953282971,-1814.550832557,-1825.219740575,-1675.941624565\r\ngdb_30467,CN=c1[nH]c(co1)C#C,6.61888,0.89779,0.7947,2.5037,82.09,-0.1934,-0.022,0.1714,1384.1769,0.110949,-416.792073,-416.783037,-416.782093,-416.82659,31.962,-1554.654802518,-1562.316687408,-1570.613611406,-1453.1853422\r\ngdb_41525,C1CC2=CCOCC2=C1,3.45486,1.43187,1.05479,1.4767,87.02,-0.2067,-0.0096,0.1971,1140.6685,0.162039,-385.915593,-385.908125,-385.907181,-385.947731,29.32,-1929.0304369721,-1941.2317219681,-1951.900629986,-1799.942423037\r\ngdb_127757,Cn1c2ccoc2nn1,3.66996,1.66247,1.15253,5.0063,70.19,-0.2318,-0.025,0.2069,984.8348,0.103569,-432.859906,-432.853123,-432.852179,-432.891287,25.438,-1448.616449171,-1456.802931585,-1464.506859578,-1351.556494596\r\ngdb_36076,C#CCC12C3CN1CC23,4.00198,1.14627,1.14015,1.2184,80.36,-0.2267,0.0471,0.2738,1145.3192,0.147814,-364.699477,-364.691679,-364.690735,-364.732099,30.465,-1781.4873833471,-1792.5930376291,-1802.668949642,-1659.3364814071\r\ngdb_36215,N#CCC1=CC2CN2C1,4.77837,0.96368,0.88768,4.4597,78.35,-0.2314,-0.0097,0.2217,1295.4575,0.137353,-380.863942,-380.856175,-380.85523,-380.896913,29.289,-1736.086479688,-1746.3224064961,-1755.804694995,-1620.531952356\r\ngdb_33552,C#CC1(CCC=C1)C#C,2.17242,1.57207,1.18559,0.9757,85.61,-0.2496,0.008,0.2576,1112.62,0.134169,-3
47.490137,-347.481413,-347.480469,-347.523354,33.48,-1799.071440545,-1808.708096258,-1818.191012266,-1684.287494265\r\ngdb_47287,N=C1CCC(=N)OCO1,3.36139,1.30342,1.00656,2.0737,71.8,-0.2723,0.0,0.2723,1146.3311,0.138055,-455.163349,-455.155462,-455.154518,-455.196449,29.578,-1650.615361325,-1660.774732035,-1670.257648043,-1534.62973531\r\ngdb_77362,OC1C(CNC1=O)C=O,3.01277,1.25252,0.94089,2.0426,65.89,-0.2574,-0.0354,0.222,1180.1187,0.124336,-475.071273,-475.062787,-475.061843,-475.105061,30.559,-1605.1077811361,-1614.0027212111,-1622.8926412141,-1496.291445446\r\ngdb_62520,CC1(O)C(C=O)C1C#N,1.69843,1.57433,1.03504,4.2298,71.72,-0.2631,-0.0437,0.2194,1184.2736,0.121147,-437.892959,-437.883582,-437.882638,-437.927226,33.613,-1629.8899940731,-1638.2264511381,-1647.116371141,-1521.6302588661\r\ngdb_93741,CC1=C(C)CC(O)CC1,2.56879,1.22307,0.86793,1.4187,89.39,-0.2216,0.0331,0.2547,1393.97,0.205445,-388.322038,-388.312115,-388.311171,-388.356415,37.69,-2183.3930926491,-2197.6080540261,-2210.648946064,-2028.9681478211\r\ngdb_112107,COC1C=CCOC1=O,2.957,1.40983,1.00539,3.839,71.15,-0.2451,-0.0075,0.2376,1162.7057,0.135245,-458.983019,-458.974497,-458.973553,-459.016915,30.744,-1698.3317731941,-1708.093303198,-1717.576219206,-1583.205834509\r\ngdb_55551,C1CC1C(=N)NC(=O)N,3.95494,0.90867,0.77341,4.0768,75.5,-0.2471,0.0021,0.2492,1454.0609,0.148692,-435.311553,-435.3024,-435.301456,-435.34663,33.571,-1731.343766666,-1741.597263726,-1751.673175739,-1609.861161811\r\ngdb_112506,CC1C(CO)CC1C#N,2.11468,1.24347,1.10667,3.1384,78.76,-0.2714,0.0295,0.3009,1236.4905,0.170504,-403.151635,-403.14202,-403.141075,-403.186248,35.15,-1928.2021250921,-1939.9440735,-1951.205350014,-1793.673608127\r\ngdb_128796,COCc1conc1O,3.42338,1.00982,0.78751,2.8043,67.57,-0.2525,0.0012,0.2536,1392.2299,0.123583,-475.001981,-474.993318,-474.992374,-475.036503,30.484,-1561.6264275081,-1570.41029849,-1579.300218493,-1453.270683424\r\ngdb_32885,[NH3+]CCC1=CC([O-])=CO1,4.27027,0.99422,0.85836,11.4378,79.0,-0.1505
,0.0037,0.1542,1335.4045,0.14635,-439.011991,-439.003532,-439.002588,-439.045607,32.352,-1704.2410254471,-1714.9306412621,-1725.0065532751,-1582.2018201091\r\ngdb_77977,CC1C2C(O)C=CC12C,2.4227,1.51086,1.26588,1.8422,83.29,-0.228,0.0067,0.2347,1121.9938,0.182263,-387.088023,-387.078989,-387.078045,-387.120933,35.429,-2036.8891939281,-2049.884905318,-2061.739805346,-1894.914655169\r\ngdb_9963,CCC1(O)C(C)C1C,2.41424,1.57915,1.42058,1.3573,78.24,-0.2406,0.0732,0.3138,1082.743,0.197746,-350.200098,-350.190005,-350.189061,-350.234295,37.29,-2010.7226961371,-2023.941800731,-2036.390324273,-1865.311918094\r\ngdb_119316,CCOC1(COC1)C=O,2.43806,1.16752,0.93824,2.9144,72.69,-0.2499,-0.0404,0.2095,1321.5474,0.156112,-460.135598,-460.1259,-460.124955,-460.171119,33.866,-1793.7338489911,-1804.534533899,-1815.202814408,-1666.256650659\r\ngdb_2439,C1C2CC1N=CO2,4.41983,3.68248,2.90187,1.9921,56.02,-0.2483,0.0166,0.2649,537.4677,0.116975,-324.568597,-324.563663,-324.562719,-324.596956,20.087,-1370.7218744741,-1380.0686210291,-1387.77380404,-1277.265076577\r\ngdb_35487,N#CC1CC2CC3C2N13,3.14741,1.57916,1.49848,4.3515,74.4,-0.2588,0.0271,0.2859,938.6695,0.138967,-380.846554,-380.839788,-380.838844,-380.877717,27.121,-1725.1753531961,-1736.0394165131,-1745.522332521,-1608.4862895921\r\ngdb_59933,CC(O)CC(C#C)C#C,2.24782,0.99123,0.81133,1.6069,83.24,-0.2616,0.0331,0.2947,1479.0038,0.15473,-385.816063,-385.805377,-385.804433,-385.852171,38.939,-1866.574466202,-1876.756427236,-1887.425335254,-1739.977662997\r\ngdb_43623,N=C1CC(O1)C1CCO1,3.51833,1.16104,1.11222,3.5189,73.03,-0.2532,0.0315,0.2847,1163.9818,0.148309,-439.065474,-439.057412,-439.056468,-439.09905,29.718,-1737.802089294,-1748.740826182,-1758.816738195,-1615.737783596\r\ngdb_120337,OCCN1C2CCCC12,3.41129,0.99587,0.96844,3.1996,82.41,-0.2304,0.0755,0.3059,1348.3461,0.196247,-404.327598,-404.318878,-404.317934,-404.361847,33.369,-2038.277871345,-2052.358545796,-2064.806441829,-1890.149979332\r\ngdb_42470,O=C(CCC1CO1)C#C,4.59429,0.6824
4,0.61846,1.8463,76.53,-0.2665,-0.0582,0.2084,1734.9907,0.132587,-421.780964,-421.771435,-421.770491,-421.81697,33.235,-1708.2162949621,-1717.346550912,-1726.82946692,-1594.6704239391\r\ngdb_8887,CCC(O)C(=O)OC,3.72209,1.24472,1.00073,2.6819,65.36,-0.2658,0.0037,0.2694,1199.0025,0.151515,-422.114698,-422.10512,-422.104176,-422.149509,33.056,-1684.466961839,-1694.453767574,-1704.530307096,-1565.671350522\r\ngdb_113986,COC1CC1OCC#N,3.28213,0.94177,0.86143,4.0564,73.22,-0.2485,0.0085,0.257,1389.1324,0.145336,-439.045139,-439.035569,-439.034625,-439.080711,33.494,-1725.0416937791,-1735.0341470951,-1745.110059108,-1604.229896045\r\ngdb_4609,NC1=COC(=N)N=C1,5.75098,1.62237,1.26843,5.3368,66.31,-0.2094,-0.0649,0.1444,889.6185,0.097233,-394.794273,-394.787421,-394.786476,-394.825588,25.446,-1311.279201922,-1318.533205962,-1325.64413795,-1223.304950158\r\ngdb_8276,OC1CC(CC#N)C1,5.91545,0.98256,0.90514,4.8073,68.96,-0.2694,0.033,0.3024,1239.1982,0.142159,-363.866922,-363.858764,-363.85782,-363.900043,29.731,-1653.732826037,-1663.7227693171,-1673.206312834,-1540.7661458211\r\ngdb_128555,COC1=CC(=N)C=NN1,3.85187,1.17617,0.9063,5.5337,79.78,-0.1974,-0.028,0.1694,1237.8115,0.125565,-434.058982,-434.05065,-434.049706,-434.093013,30.348,-1573.1958109411,-1582.1873874021,-1591.077307405,-1464.427793444\r\ngdb_20598,Cc1c(onn1)CO,3.38404,1.88804,1.26151,3.0132,60.04,-0.2608,-0.0371,0.2237,909.7723,0.106256,-415.837391,-415.829677,-415.828733,-415.869954,27.113,-1350.2644535651,-1357.866097591,-1365.5706530931,-1256.439935394\r\ngdb_19675,N#CC1C2C3CC1N23,4.86664,1.81077,1.68156,2.3287,63.96,-0.2689,0.0138,0.2826,776.3639,0.109652,-341.536178,-341.530216,-341.529272,-341.566309,23.021,-1434.602290674,-1443.304585486,-1451.009140988,-1339.763717959\r\ngdb_47871,N=C1OCCOCC1=O,2.32737,1.76758,1.16728,3.9109,67.49,-0.2586,-0.064,0.1946,1040.4515,0.124475,-475.014963,-475.006834,-475.00589,-475.048667,29.306,-1569.7727493461,-1578.8917101341,-1587.7816301371,-1460.9037029\r\ngdb_108950,CCC1=N
C2C(C)C2O1,2.92911,1.30956,1.16195,1.2056,79.24,-0.2324,0.022,0.2545,1174.6777,0.171101,-403.160453,-403.151572,-403.150628,-403.194569,32.932,-1933.7354994541,-1945.938039468,-1957.199943491,-1798.895110516\r\ngdb_6784,CC(=O)OCCOC,5.12176,0.88062,0.79647,2.5875,66.44,-0.2601,0.0197,0.2798,1453.8538,0.150715,-422.102771,-422.092958,-422.092014,-422.139335,32.308,-1676.9826619961,-1686.822003116,-1696.898542638,-1559.287073956\r\ngdb_64851,CC1(CC(O1)C#C)C#C,2.78405,1.21721,0.97825,1.7958,80.84,-0.2596,0.0078,0.2675,1236.6345,0.132141,-384.602332,-384.593025,-384.592081,-384.636718,34.828,-1732.798960037,-1742.0691504941,-1751.552066502,-1618.492548106\r\ngdb_98411,CCC(=O)C1(C)NC1C,2.19546,1.22671,1.02265,2.3862,83.06,-0.2327,-0.0113,0.2214,1339.0651,0.191258,-404.356813,-404.346094,-404.345149,-404.392842,38.757,-2056.6105467801,-2069.43683074,-2081.884099264,-1909.599620787\r\ngdb_3903,c1c(=O)[nH]ncn1,5.97394,2.93196,1.96671,1.6216,50.96,-0.2549,-0.088,0.1669,579.5407,0.069709,-355.517745,-355.512683,-355.511739,-355.546612,18.485,-1041.946064032,-1047.657023441,-1052.990222432,-974.933750413\r\ngdb_58607,CC(C)C1(CC1)C(=O)N,1.85753,1.48756,1.2893,3.2814,81.9,-0.239,0.0284,0.2674,1188.7961,0.192718,-404.379394,-404.369,-404.368056,-404.415102,38.877,-2070.7803275091,-2083.8105518941,-2096.258447927,-1923.5679711271\r\ngdb_68688,CC12CCCC(O1)C=C2,2.59719,1.80272,1.50553,1.4315,81.82,-0.233,0.0143,0.2473,999.3873,0.184817,-387.114686,-387.107113,-387.106169,-387.145949,31.84,-2053.6204663951,-2067.5329684341,-2079.387868462,-1910.612420313\r\ngdb_25939,CC(=O)N1C=COC1=N,2.90349,1.57424,1.02734,5.3702,70.49,-0.2167,-0.0206,0.1961,1091.3339,0.112165,-453.98799,-453.979948,-453.979004,-454.021054,29.626,-1540.9186305081,-1549.203631835,-1557.500555833,-1438.281375941\r\ngdb_122402,COCCC1CC(=N)O1,5.16849,0.64225,0.60502,2.7076,78.69,-0.259,0.0284,0.2874,1857.215,0.169767,-440.276349,-440.266699,-440.265755,-440.312154,34.056,-1869.7854297551,-1881.5047878391,-1892.766691862,
-1735.7488798461\r\ngdb_76086,CC1C(C#C)C1N1CC1,2.05358,1.52606,1.17819,0.3534,86.22,-0.2108,0.0439,0.2546,1165.882,0.169223,-365.925347,-365.915884,-365.91494,-365.959702,35.478,-1922.880221263,-1934.7181785481,-1945.980082571,-1788.4458306481\r\ngdb_132439,Cc1c(=O)[nH]c(cn1)F,3.20087,1.36808,0.96409,2.0525,69.52,-0.2298,-0.0547,0.1752,1117.8907,0.10078,-478.0347,-478.027061,-478.026117,-478.067016,28.422,-1473.669745996,-1481.3190807061,-1489.0230086991,-1377.3358193341\r\ngdb_100610,OCC(O)C(=O)NC=N,2.36093,1.13638,0.79117,1.4994,69.85,-0.2633,-0.0258,0.2375,1386.4915,0.135554,-492.312105,-492.302673,-492.301729,-492.346746,33.833,-1607.2852373661,-1616.474479162,-1625.95739517,-1492.010579048\r\ngdb_4120,CC1OC(C)C1=NO,2.73275,1.81142,1.26962,1.0573,68.25,-0.2454,-0.0015,0.2438,994.3105,0.139744,-400.944839,-400.936142,-400.935198,-400.977986,31.401,-1565.9505920271,-1575.601680447,-1585.085223964,-1452.744203373\r\ngdb_10924,CC1C2OC2(C)C1C,3.15058,1.89194,1.47428,1.7728,74.6,-0.2424,0.0883,0.3307,955.8703,0.175284,-348.984158,-348.975483,-348.974539,-349.016636,32.802,-1875.561022591,-1887.892829459,-1899.155360991,-1742.442518349\r\ngdb_9590,CC1(CN1C=O)C#N,4.54817,1.31554,1.2191,2.7906,64.58,-0.2774,-0.0269,0.2506,983.2634,0.105677,-378.725892,-378.717891,-378.716947,-378.758404,28.041,-1416.973680337,-1424.3958567891,-1432.100412291,-1323.373182879\r\ngdb_92002,CC1CC2NC2C1C=O,2.57745,1.41406,1.13639,3.5809,78.82,-0.2329,-0.0167,0.2162,1141.6374,0.172045,-403.145469,-403.137078,-403.136133,-403.178696,32.241,-1924.332904598,-1936.842924022,-1948.104200536,-1788.934660159\r\ngdb_24163,c1c[nH]cc1OC2CC2,5.19272,0.96524,0.85419,2.4153,78.67,-0.1887,0.0513,0.24,1349.1776,0.1486,-401.95243,-401.944336,-401.943392,-401.985682,31.128,-1803.5418146611,-1814.4610987701,-1824.537010783,-1681.530219719\r\ngdb_3941,OC1=NNC(=N)N1,7.36164,2.24151,1.72172,2.0759,49.13,-0.1974,0.0457,0.2431,657.7641,0.080647,-372.765028,-372.759054,-372.75811,-372.794502,22.518,-1048.171580821,-1
054.198177257,-1060.124372253,-974.54657736\r\ngdb_68445,CC12CC(C1)CC=CC2,2.64764,1.6312,1.22944,0.1616,90.44,-0.2343,0.0374,0.2717,1122.7037,0.208046,-351.168158,-351.160041,-351.159097,-351.200015,33.764,-2223.507860492,-2238.8567306321,-2251.89762267,-2067.7576166561\r\ngdb_83547,OC1C2CCC1C21CN1,3.08305,1.47273,1.29105,1.2296,78.01,-0.2397,0.0604,0.3001,1051.4013,0.173242,-403.105223,-403.097485,-403.096541,-403.136965,31.482,-1899.0781773841,-1911.997960185,-1923.259864208,-1762.74808208\r\ngdb_38282,C1C2CC1C1=C2COC1,3.7808,1.43042,1.25157,1.9869,80.84,-0.2078,0.0208,0.2286,1047.6624,0.162521,-385.846083,-385.839188,-385.838243,-385.877587,27.639,-1885.4122863821,-1897.9731340351,-1908.641414544,-1755.9264317411\r\ngdb_122164,CCCCC1(O)CC1O,3.08994,0.72839,0.66154,1.9814,83.02,-0.2434,0.0744,0.3178,1812.698,0.203004,-425.411393,-425.400513,-425.399569,-425.447651,40.107,-2102.788306581,-2116.4021143361,-2129.443006374,-1949.2876825101\r\ngdb_46761,N=C1OCC2C=CCN12,2.89994,1.82756,1.2279,3.7182,74.18,-0.2257,0.0048,0.2305,991.2392,0.13954,-418.03277,-418.025729,-418.024784,-418.064746,27.258,-1705.3517163771,-1716.04258721,-1725.524875709,-1588.916793918\r\ngdb_126596,CCCCn1cccn1,5.19998,0.69374,0.6677,2.2428,85.8,-0.2355,0.0267,0.2622,1768.645,0.184217,-383.301761,-383.292556,-383.291612,-383.337185,33.297,-2010.136602731,-2023.0243825731,-2034.879282601,-1869.379431432\r\ngdb_40283,C1CCCC2(CC1)CC2,2.38198,1.44339,1.10837,0.1552,91.54,-0.2588,0.0745,0.3334,1243.0294,0.232172,-352.364384,-352.355618,-352.354674,-352.397495,35.468,-2346.298821612,-2363.0175438991,-2377.244427947,-2177.96451229\r\ngdb_99072,COC(=O)CC(O)C#C,4.36618,0.76141,0.74855,2.6693,71.94,-0.2677,0.0087,0.2764,1554.8768,0.132262,-458.95325,-458.942937,-458.941993,-458.989775,35.542,-1679.6514577731,-1688.289119158,-1697.7720351661,-1566.1752402491\r\ngdb_12860,OC12CCC1CCC2,2.76795,2.22998,1.75119,1.2559,73.34,-0.249,0.0794,0.3284,864.3545,0.178726,-349.013499,-349.006052,-349.005108,-349.044903,2
9.913,-1893.9727641601,-1907.0751520801,-1918.337683612,-1760.180315252\r\ngdb_75516,CC1=CCCC=CC=C1,2.44151,1.44107,1.04581,0.2814,95.9,-0.2085,-0.0257,0.1828,1234.5676,0.184146,-349.972985,-349.96443,-349.963486,-350.005839,33.526,-2101.377666349,-2114.6745820591,-2126.529482087,-1959.624010758\r\ngdb_129526,c1c(non1)NC(=O)N,5.23405,1.00601,0.84459,5.4872,62.0,-0.2603,-0.0391,0.2212,1241.9231,0.090239,-486.062359,-486.054642,-486.053698,-486.095337,27.951,-1290.377504641,-1297.088085887,-1304.199017875,-1200.441032234\r\ngdb_128879,C1Cn2c1c(nn2)C#N,3.74627,1.35433,1.00754,7.4447,72.67,-0.2726,-0.0282,0.2445,1084.201,0.091712,-411.762384,-411.755551,-411.754607,-411.794059,24.898,-1375.492197892,-1382.7587521121,-1389.8696841,-1285.250128602\r\ngdb_95560,OC1CCC=CC=CC1,2.33914,1.45794,1.05505,1.1116,86.84,-0.2192,-0.0144,0.2047,1201.4513,0.184418,-387.097228,-387.08874,-387.087796,-387.129873,33.755,-2042.665414273,-2056.0037455771,-2067.858645605,-1900.524585629\r\ngdb_119386,COCC12CC(O1)C2O,3.05949,1.08601,0.95832,1.4954,72.73,-0.2517,0.0638,0.3155,1303.4628,0.157525,-460.077924,-460.069036,-460.068092,-460.112265,32.848,-1757.5428949251,-1768.8518621231,-1779.520770141,-1629.3252359731\r\ngdb_128721,Cc1c(onc1OC)N,3.15189,1.273,0.91808,2.5,72.97,-0.2078,0.0295,0.2373,1247.72,0.135114,-455.142051,-455.132608,-455.131663,-455.176626,33.347,-1637.250674643,-1646.433641349,-1655.915929848,-1522.190624403\r\ngdb_122082,CCCOC(CO)C#C,2.74949,0.78319,0.67319,2.3933,82.38,-0.2539,0.0344,0.2883,1751.6838,0.178907,-424.176915,-424.165879,-424.164935,-424.214319,38.798,-1955.9938711931,-1967.7326820561,-1979.587582084,-1816.583334208\r\ngdb_117913,COCC(O)C(=O)C#C,1.92948,1.35637,0.85764,3.0622,72.69,-0.2632,-0.0712,0.192,1333.3036,0.131664,-458.920501,-458.91028,-458.909336,-458.95688,35.233,-1659.1011655321,-1667.796557745,-1677.279473753,-1545.533331694\r\ngdb_12585,C1C(=O)C(=O)C(=N1)N,3.55158,2.09466,1.32886,3.1857,58.01,-0.2318,-0.1135,0.1182,836.1417,0.084029,-414.695268,
-414.688138,-414.687194,-414.726669,25.763,-1261.423611872,-1267.614615666,-1274.133179158,-1180.240890015\r\ngdb_20470,C1OC1C1=CNN=N1,6.49406,1.42415,1.26167,3.2596,59.72,-0.252,-0.0068,0.2452,909.31,0.097365,-394.751119,-394.74474,-394.743795,-394.782329,23.032,-1284.1996785361,-1291.750494333,-1298.861426321,-1196.1595383271\r\ngdb_42660,O=C1C2C3CC1CCN23,2.51056,1.89664,1.72636,2.9439,74.27,-0.2279,-0.0165,0.2114,899.9854,0.151521,-401.964178,-401.957482,-401.956538,-401.995223,27.2,-1810.9137903931,-1822.7103320841,-1832.786244097,-1687.5172830881\r\ngdb_29305,OC1=COC(=N1)C1CN1,5.6363,0.99193,0.89775,1.0502,69.57,-0.2157,-0.0069,0.2088,1248.161,0.11425,-453.947187,-453.939639,-453.938695,-453.97972,28.213,-1515.314380781,-1523.909371554,-1532.206295552,-1412.343918935\r\ngdb_50469,O=CC1CCC(=O)OC1,4.39677,0.94577,0.81885,2.5024,69.96,-0.268,-0.045,0.223,1334.31,0.135867,-459.012496,-459.004046,-459.003102,-459.046769,30.088,-1716.8288559871,-1726.635566639,-1736.118482647,-1601.939488195\r\ngdb_125849,CC1C2C1C1=CC=NN21,4.72146,1.25574,1.15195,3.1341,79.33,-0.2309,0.0039,0.2348,1095.3191,0.138238,-380.825024,-380.817997,-380.817053,-380.856374,27.73,-1711.665084426,-1722.3653678941,-1731.848283902,-1595.0933650051\r\ngdb_79374,CC1C2N3CC2(C#N)C13,3.21034,1.3917,1.10546,2.5971,78.0,-0.2598,0.0115,0.2713,1106.0634,0.1365,-380.788938,-380.781352,-380.780408,-380.820954,29.296,-1689.020794652,-1699.3703005891,-1708.853216597,-1572.866996225\r\ngdb_28711,Cc1cnc([nH]1)OC=O,6.64518,0.84294,0.7544,4.6624,71.34,-0.2203,-0.0124,0.2078,1435.6516,0.112619,-454.010356,-454.00189,-454.000946,-454.044415,29.572,-1554.953496802,-1562.972434313,-1571.269358311,-1452.94061369\r\ngdb_63245,CC1(O)C=CCNC1=O,2.31258,1.73864,1.27137,3.9712,73.61,-0.2456,0.0073,0.253,1042.8713,0.148114,-439.13345,-439.124916,-439.123972,-439.16647,32.813,-1780.4576410781,-1791.100193718,-1801.176105731,-1658.0444403761\r\ngdb_117629,CC(=O)C(C)(O)CCO,2.76897,1.05818,0.96793,3.4784,75.12,-0.2328,-0.0061,0.2
266,1333.9944,0.178759,-461.384233,-461.373413,-461.372469,-461.420367,39.566,-1949.4119292921,-1961.28565459,-1973.140554618,-1808.948432205\r\ngdb_120408,COCC1CC(=O)C1O,2.81752,0.97327,0.87294,1.5586,72.81,-0.2514,-0.0292,0.2222,1402.7759,0.156578,-460.149771,-460.139937,-460.138993,-460.18523,34.64,-1802.6275340481,-1813.342877732,-1824.0117857501,-1675.111430158\r\ngdb_57770,CC(O)C#CC(C)C=O,3.02148,0.68603,0.59042,1.4587,82.65,-0.2496,-0.0316,0.218,1877.4408,0.155033,-423.007318,-422.996254,-422.995309,-423.045789,37.779,-1849.9128472341,-1859.8569823571,-1870.525262866,-1724.5428241241\r\ngdb_27610,C#Cc1cccc(n1)O,3.39839,1.28108,0.93036,1.0834,82.62,-0.2315,-0.0445,0.187,1170.5474,0.102175,-399.589396,-399.582098,-399.581154,-399.621101,28.446,-1576.419952183,-1584.28452248,-1591.988450473,-1480.179524362\r\ngdb_5862,CC(C)(C#CC#N)O,4.7493,0.91106,0.905,5.9719,75.35,-0.2818,-0.0441,0.2377,1300.797,0.116011,-362.651824,-362.642743,-362.641798,-362.685403,32.229,-1519.099515069,-1526.733162054,-1535.030086052,-1419.791195747\r\ngdb_30465,CN=C1OC=C(O)C=C1,4.78599,1.04975,0.86554,3.128,80.14,-0.1964,-0.0439,0.1525,1287.0365,0.123808,-437.906195,-437.897822,-437.896878,-437.939297,31.055,-1638.1957031971,-1647.162179298,-1656.052099301,-1529.204920005\r\ngdb_40442,C1NC1C12COCC1N2,3.53813,1.2306,1.1179,1.5223,74.98,-0.2382,0.0719,0.3101,1145.9764,0.16196,-419.152934,-419.145205,-419.144261,-419.185645,29.855,-1780.4130879391,-1792.449338068,-1803.118246086,-1651.068422823\r\ngdb_41101,C1NC23C4CC2OC13C4,3.00723,1.93233,1.81985,2.3593,74.48,-0.2127,0.0547,0.2673,858.3043,0.149648,-401.832586,-401.826102,-401.825158,-401.862864,27.741,-1728.338626065,-1740.2681996641,-1750.344111677,-1604.460819357\r\ngdb_11791,CCC12CC(CC1)O2,3.93799,1.68516,1.55257,1.4991,73.95,-0.2396,0.0844,0.3241,933.9141,0.178365,-348.997746,-348.990244,-348.9893,-349.029398,29.609,-1884.0876148831,-1897.1554898081,-1908.41802134,-1750.4507882071\r\ngdb_100434,CC(=NCCC#C)OC,3.99215,0.66128,0.57543,1
.2,85.15,-0.249,0.0249,0.2739,1893.6258,0.166984,-403.12325,-403.112206,-403.111262,-403.161485,38.005,-1910.390282127,-1921.2355201741,-1932.497424197,-1778.13460276\r\ngdb_65548,OC1(CN2CC1C2)C#C,2.99043,1.52952,1.31556,1.421,77.17,-0.2221,0.0299,0.2521,1015.1231,0.14802,-401.869993,-401.862119,-401.861174,-401.901914,31.722,-1751.811855228,-1762.869191317,-1772.944475821,-1628.965045807\r\ngdb_44421,O=C1CC2C3CC12OC3,3.46368,1.51527,1.47524,3.8912,73.9,-0.2465,-0.0182,0.2283,945.8361,0.136956,-421.749795,-421.743103,-421.742159,-421.780705,27.618,-1688.6574669411,-1699.5679659241,-1709.050881932,-1571.9138100541\r\ngdb_35776,C#CC1CN2CC2CO1,3.77914,1.21565,1.0305,0.5127,78.36,-0.2241,0.0284,0.2525,1166.7298,0.148281,-401.875926,-401.868031,-401.867087,-401.908376,30.834,-1755.534866125,-1766.579024525,-1776.654936538,-1633.020008965\r\ngdb_75862,OC1C(=O)NC=C1C#C,3.50284,1.2577,0.95407,2.7934,75.36,-0.2192,-0.0407,0.1785,1146.6507,0.099969,-436.690054,-436.681746,-436.680802,-436.723018,30.858,-1502.907900342,-1510.13805904,-1517.8419870331,-1407.2014826801\r\ngdb_127022,CCOc1ncnn1C,3.40123,1.11684,0.85428,2.9301,74.95,-0.2299,0.0293,0.2592,1352.1299,0.148375,-435.296425,-435.287201,-435.286257,-435.331756,31.673,-1721.850810514,-1732.059754435,-1742.135666448,-1600.527592945\r\ngdb_124468,CC(C)(c1cnn[nH]1)O,3.24974,1.20867,1.08415,4.4098,73.01,-0.2527,0.0022,0.2548,1166.3437,0.14742,-435.27276,-435.263912,-435.262968,-435.306418,33.178,-1707.0008100291,-1717.445697334,-1727.521609347,-1584.627769903\r\ngdb_48633,O=CC12CN(C1)CCN2,3.40141,1.38887,1.18407,2.8218,75.16,-0.2069,-0.0328,0.1741,1074.242,0.161578,-419.167359,-419.15964,-419.158696,-419.199731,30.21,-1789.464905264,-1801.5074304831,-1812.176338501,-1659.907514597\r\ngdb_119154,COCC1(CO1)C(C)O,1.94515,1.3295,1.03252,2.6849,75.17,-0.2508,0.0584,0.3092,1296.0014,0.179721,-461.32774,-461.317391,-461.316446,-461.363345,37.39,-1913.9620633551,-1926.131345392,-1937.985617911,-1773.1666140071\r\ngdb_125519,Cc1ccn(n1
)CCO,4.22004,0.83624,0.74079,0.802,79.57,-0.2303,0.0262,0.2565,1549.8303,0.15978,-419.22966,-419.220275,-419.219331,-419.264995,33.338,-1828.559343473,-1839.5564386981,-1850.225346716,-1700.861261973\r\ngdb_125516,Cc1ncn(n1)CC=O,4.3918,0.90834,0.79423,1.4384,72.29,-0.2566,-0.0442,0.2124,1403.1994,0.123864,-434.103876,-434.095115,-434.094171,-434.139395,29.778,-1601.367199987,-1610.0895750871,-1618.97949509,-1493.532915882\r\ngdb_99318,OCC(=O)CCC1CO1,5.69119,0.59625,0.56151,2.5763,72.96,-0.2682,-0.0269,0.2414,1920.7951,0.156438,-460.154254,-460.144336,-460.143392,-460.191501,34.31,-1805.4406568951,-1816.1032898231,-1826.772197841,-1679.0465390971\r\ngdb_45872,O=C1NC=CCC1C#C,2.03131,1.82063,1.27596,3.0961,76.33,-0.2228,-0.0135,0.2093,1033.0203,0.125006,-400.760454,-400.752379,-400.751435,-400.793139,30.771,-1683.417766791,-1692.571868083,-1701.461788086,-1574.421336018\r\ngdb_64296,CC1(CCC=C1C#C)C,2.27751,1.4932,1.13371,0.6582,93.35,-0.2238,-0.0143,0.2094,1187.9224,0.181081,-349.952766,-349.94328,-349.942336,-349.986418,36.773,-2088.690061878,-2101.4027667091,-2113.257666737,-1947.4371584691\r\ngdb_116116,OCC1CCNC(=O)O1,2.57471,1.31156,0.91009,5.0694,68.92,-0.2571,0.0506,0.3077,1249.1254,0.149364,-476.271291,-476.262722,-476.261777,-476.305056,31.598,-1730.2782563841,-1740.8982187001,-1750.9735032041,-1608.076526215\r\ngdb_127163,CN(C=N)c1conn1,4.29421,1.10173,0.88169,2.3474,72.66,-0.2251,-0.0456,0.1795,1231.6518,0.112862,-450.096739,-450.088581,-450.087637,-450.130233,28.98,-1448.28449691,-1456.496079684,-1464.793003682,-1345.556881047\r\ngdb_106978,CCC12CN=C(N)C1O2,4.22637,1.08908,0.96995,1.9039,76.87,-0.2286,0.0178,0.2464,1266.1842,0.159905,-419.211282,-419.202651,-419.201707,-419.244449,33.138,-1817.026983071,-1828.4972200821,-1839.1661281,-1687.968462059\r\ngdb_46852,N=C1NCC2CC2C1=O,2.29277,1.85307,1.28182,4.7913,73.73,-0.2327,-0.0476,0.1851,1008.9649,0.138141,-418.001808,-417.99407,-417.993126,-418.034542,29.384,-1685.922782719,-1696.1762797791,-1705.659195787,-
1569.963512082\r\ngdb_43225,O=C1C2OCC3OC2C13,3.01719,1.86038,1.62773,1.9375,63.66,-0.2345,-0.039,0.1955,846.8275,0.114182,-457.72257,-457.716349,-457.715405,-457.753124,25.08,-1535.2403015671,-1544.668624292,-1552.96554829,-1431.388189576\r\ngdb_55343,CC(=O)N1CCC1C=O,2.81017,1.19915,0.92561,3.8219,74.19,-0.2387,-0.0305,0.2081,1271.2264,0.145483,-439.111884,-439.102273,-439.101329,-439.1477,32.772,-1766.924781984,-1776.8915074311,-1786.967419444,-1646.266096446\r\ngdb_82757,CC1C2OC3C2NC13C,2.20089,2.17257,1.52482,1.5524,77.35,-0.2278,0.0759,0.3037,966.1479,0.172157,-403.101693,-403.09405,-403.093106,-403.133215,31.27,-1896.8630706141,-1909.84246677,-1921.104370793,-1760.39492333\r\ngdb_14352,OC1CC2CCC2C1,3.03496,1.99922,1.93228,1.3883,73.22,-0.2588,0.0691,0.3279,845.8897,0.179486,-349.008744,-349.001499,-349.000555,-349.039995,29.104,-1890.988958865,-1904.2181036031,-1915.480635135,-1757.10050108\r\ngdb_131299,c1c(nn(n1)CC#N)O,4.02746,1.02259,0.91767,3.2321,65.23,-0.2519,-0.0249,0.227,1216.4507,0.090788,-448.93225,-448.924587,-448.923643,-448.965664,27.145,-1345.408788923,-1352.153883164,-1359.264815152,-1256.001934112\r\ngdb_48713,O=CC12CC1C1NC1C2,3.60829,1.34563,1.14996,3.3051,76.51,-0.2427,-0.0129,0.2298,1075.5579,0.149357,-401.933222,-401.925945,-401.925,-401.964858,29.193,-1791.4886217891,-1802.9205807511,-1812.995865255,-1668.462972303\r\ngdb_50633,N=C1CCOC(O1)C=O,3.02831,1.2356,0.92446,2.3245,67.73,-0.2673,-0.0482,0.2192,1210.2702,0.12387,-475.033993,-475.025768,-475.024823,-475.067906,29.481,-1581.7142456161,-1590.77296554,-1599.6622580341,-1472.976348551\r\ngdb_101466,OCC(O)C1COC=N1,4.14116,0.94486,0.91741,2.4419,68.08,-0.2536,0.0042,0.2577,1301.6947,0.148757,-476.226264,-476.217666,-476.216721,-476.260103,31.563,-1702.0234086411,-1712.625173196,-1722.7004577,-1579.868114138\r\ngdb_72035,CC12CC3CN1C2(C)C3,2.32879,1.96009,1.4915,1.4859,84.91,-0.2283,0.0874,0.3157,1020.1511,0.196174,-367.230955,-367.222896,-367.221952,-367.262828,32.971,-2114.309371821,-2128.8
054572301,-2141.253353263,-1964.9465420961\r\ngdb_99795,CCC(C)(C#C)C#CC,1.91763,1.04939,0.96114,0.7422,93.5,-0.2467,0.0391,0.2858,1469.2801,0.177944,-349.90497,-349.893435,-349.892491,-349.942747,40.834,-2058.697641714,-2070.1245806041,-2081.979480632,-1920.0332129301\r\ngdb_44831,O=C1OC2CC12CC#N,3.08679,1.24812,1.01681,5.3238,65.46,-0.2929,-0.0178,0.2751,1124.8841,0.100453,-436.690125,-436.682351,-436.681406,-436.723447,28.004,-1502.9524534811,-1510.517701985,-1518.221002469,-1407.470684041\r\ngdb_31809,CCc1cnc(nc1)N,4.61862,0.99147,0.87985,0.7631,83.7,-0.2184,-0.0257,0.1927,1350.7145,0.149981,-398.205919,-398.197448,-398.196504,-398.239511,31.791,-1801.733333723,-1812.4154194301,-1822.491331443,-1679.574901675\r\ngdb_15078,OCC(C=O)C1CO1,3.11559,1.55515,1.15969,3.3343,60.9,-0.2513,-0.024,0.2272,1018.9517,0.127933,-420.846948,-420.838298,-420.837354,-420.880851,29.797,-1516.7940470031,-1525.586075602,-1534.476623114,-1410.7996192861\r\ngdb_62064,CC(CCC1CC1)C=O,3.50183,0.64949,0.5779,2.8369,87.67,-0.2436,-0.0174,0.2262,1954.8692,0.203598,-388.293215,-388.282594,-388.28165,-388.330578,37.766,-2165.306400742,-2179.0833608371,-2192.124252875,-2012.7551977881\r\ngdb_57437,CC(C)(O)C1NC1C#C,2.61109,1.28345,1.08557,2.7402,81.69,-0.2404,0.0309,0.2713,1210.5091,0.167892,-403.092434,-403.082521,-403.081576,-403.126627,38.002,-1891.0529647831,-1902.607915509,-1913.869192023,-1756.260894038\r\ngdb_71067,CC12OC3=NCC1C23O,2.47532,1.80237,1.53054,3.7743,70.62,-0.2131,-0.0538,0.1593,939.8633,0.121405,-437.743394,-437.735293,-437.734349,-437.775422,31.128,-1536.0366104881,-1545.1737690371,-1554.06368904,-1426.37188263\r\ngdb_68562,OC12CC(C=C1)C2C#C,2.52181,1.64807,1.37323,1.7473,78.29,-0.2338,0.002,0.2358,1000.9724,0.134851,-384.616448,-384.608517,-384.607573,-384.648416,31.881,-1741.656877081,-1751.790519922,-1761.27343593,-1625.8331483881\r\ngdb_101650,OCC(O)CC(O)C#N,4.02162,0.7402,0.64859,3.706,69.23,-0.2815,0.0166,0.2981,1640.6433,0.146115,-476.2198,-476.209983,-476.209039,-476.2
54953,35.502,-1697.9671904651,-1707.8040215491,-1717.8799335621,-1576.636442788\r\ngdb_127400,CN1CCc2c1cn[nH]2,3.05726,1.62626,1.09819,2.8028,78.99,-0.1803,0.0265,0.2068,1084.145,0.151471,-398.133601,-398.126039,-398.125095,-398.165471,29.246,-1756.353137861,-1767.6056292491,-1777.681541262,-1633.1141353151\r\ngdb_97821,OCC#CC1CC2OC12,5.19718,0.68004,0.65154,2.7981,78.27,-0.2439,0.0299,0.2738,1705.2631,0.134596,-421.742815,-421.733884,-421.73294,-421.779201,31.663,-1684.2774541211,-1693.782960453,-1703.265876461,-1570.970036518\r\ngdb_1521,CCOC(=N)CC,7.45443,1.25944,1.10695,1.2804,65.95,-0.2566,0.039,0.2956,1103.4198,0.158861,-326.998454,-326.98982,-326.988876,-327.031941,30.074,-1639.775770859,-1650.35494509,-1660.432112121,-1522.79491557\r\ngdb_111280,CC1C(CO)C2OC12C,3.10271,1.18317,1.00667,0.7088,78.49,-0.2432,0.0745,0.3177,1261.413,0.180424,-424.192955,-424.183213,-424.182269,-424.227178,36.343,-1966.0591155531,-1978.6099230621,-1990.4648230901,-1824.652472439\r\ngdb_44543,O=C1CC2C3CC3C2O1,3.50851,1.46748,1.37928,4.4874,70.28,-0.2633,0.015,0.2783,973.7289,0.13834,-421.843458,-421.836771,-421.835827,-421.874686,27.024,-1747.4318424081,-1758.345478936,-1767.828394944,-1630.8877333831\r\ngdb_133212,CC1=NOC(F)=CC1=N,3.23451,1.3588,0.96247,1.4179,69.48,-0.2446,-0.0542,0.1903,1122.7671,0.099819,-477.95611,-477.948466,-477.947522,-477.988411,28.598,-1424.3538136861,-1432.0000108511,-1439.703938844,-1328.010474389\r\ngdb_8150,CC12CC(O)CC1O2,3.78527,1.75091,1.56039,2.6733,67.25,-0.2419,0.0778,0.3197,893.8528,0.153396,-384.939491,-384.93173,-384.930786,-384.971118,29.905,-1711.198845239,-1722.3264623361,-1732.403001858,-1590.661268938\r\ngdb_94671,CC12CC1(O)C(O)CC2,2.17278,1.72624,1.32924,2.2919,78.1,-0.2254,0.063,0.2884,1077.4708,0.18228,-424.231879,-424.222849,-424.221905,-424.264767,35.804,-1990.4842758691,-2003.481869786,-2015.336769814,-1848.23990824\r\ngdb_128452,COC(=O)c1nnon1,4.8596,1.07392,0.88446,3.0525,58.15,-0.309,-0.0949,0.2141,1204.0549,0.075861,-505.902643,
-505.894738,-505.893794,-505.936402,26.377,-1202.4252156921,-1208.129272502,-1214.647208485,-1119.716391947\r\ngdb_98339,COC(=O)C(N)CC#C,3.28586,0.88332,0.74552,2.1217,75.22,-0.2457,0.0036,0.2493,1518.9468,0.144832,-439.07449,-439.063864,-439.06292,-439.111182,36.585,-1743.4597104381,-1752.78951425,-1762.865426263,-1623.350722784\r\ngdb_90576,CC1CC2C1=CCC2=O,3.40903,1.26653,1.01045,2.631,83.16,-0.2357,-0.0154,0.2203,1202.7858,0.159413,-385.895488,-385.88738,-385.886435,-385.928221,31.398,-1916.4143685271,-1928.2140477631,-1938.882328272,-1787.6997224471\r\ngdb_9866,C#CC#CC1(CC1)O,6.05871,0.92865,0.86186,1.2366,82.21,-0.2329,-0.0225,0.2104,1295.546,0.104655,-345.329703,-345.321545,-345.320601,-345.362262,30.354,-1465.912479738,-1473.237392295,-1480.941947797,-1372.957689041\r\ngdb_52153,O=COCC1C2CC1O2,4.54022,0.81368,0.79304,3.9342,69.96,-0.2513,0.0024,0.2537,1435.786,0.135012,-458.899931,-458.891828,-458.890884,-458.934419,29.232,-1646.1933054021,-1656.2177616771,-1665.700677685,-1531.438852045\r\ngdb_91141,CC1=CC2CC(O)C2C1,2.78111,1.38309,1.22369,1.5584,84.13,-0.2315,0.0232,0.2546,1141.0171,0.183759,-387.098912,-387.09058,-387.089636,-387.131625,33.154,-2043.722139429,-2057.1583621371,-2069.013262165,-1901.6239813971\r\ngdb_35546,C#CC1CC2OCC2O1,2.62415,1.6017,1.45729,1.0278,72.31,-0.2512,0.0342,0.2854,977.657,0.136285,-421.772007,-421.764416,-421.763472,-421.80416,29.399,-1702.595696849,-1712.942065241,-1722.424981249,-1586.6320336491\r\ngdb_17237,CCC12CC3CC1C23,4.31597,1.58593,1.52626,0.2951,78.28,-0.2383,0.0935,0.3318,950.8165,0.178303,-311.825774,-311.818471,-311.817527,-311.857169,29.125,-1912.8494898981,-1926.042866623,-1937.305398155,-1779.307417081\r\ngdb_12631,CN1C=CC(O)C1=O,3.40568,2.0121,1.34727,4.1438,63.86,-0.2191,-0.0217,0.1973,887.8831,0.118615,-399.830336,-399.822426,-399.821482,-399.862721,28.276,-1494.441548914,-1502.809381429,-1511.106932936,-1394.127960174\r\ngdb_23446,ON=C1CC2OCCC12,3.48763,1.18353,1.05267,0.7972,75.12,-0.2457,-0.001,0.2447,1167
.7802,0.148376,-439.037868,-439.029986,-439.029042,-439.070639,30.363,-1720.47907584,-1731.530764348,-1741.606676361,-1597.9096253971\r\ngdb_103620,OCC1(O)CC(O)C1=O,2.83743,1.23695,1.04495,1.4193,64.83,-0.2395,-0.0442,0.1954,1164.624,0.133507,-496.088825,-496.079411,-496.078467,-496.123193,34.215,-1628.0501376851,-1637.251302152,-1646.73421816,-1512.9643595761\r\ngdb_19893,N#CC1C2C3CN3C12,4.18947,1.94787,1.62503,3.5383,65.21,-0.2475,0.0109,0.2584,784.6817,0.108284,-341.497656,-341.491415,-341.490471,-341.527938,24.353,-1410.429388976,-1418.9566087771,-1426.661164279,-1315.68557012\r\ngdb_127940,CN=C1NN=CC=C1C,2.42314,1.63875,0.98956,1.0366,86.45,-0.2032,-0.045,0.1582,1186.2846,0.148801,-398.148772,-398.140223,-398.139278,-398.181782,31.964,-1765.8730769,-1776.506216905,-1786.581501409,-1643.349434614\r\ngdb_79840,CN1C2C3CCC1C23C,2.53284,1.68014,1.43081,0.8224,85.52,-0.1912,0.0881,0.2794,1061.3332,0.195639,-367.182748,-367.174444,-367.1735,-367.21495,33.239,-2084.059045458,-2098.401391162,-2110.849287195,-1934.902666194\r\ngdb_27192,OC1=C2C3CC3N2C=C1,3.25294,1.6474,1.23114,1.3773,76.47,-0.185,0.0212,0.2062,992.5601,0.126037,-400.707159,-400.700187,-400.699243,-400.738319,28.062,-1649.9746746361,-1659.8209183551,-1668.710838358,-1540.021292638\r\ngdb_17030,OC1C2NC1C2C#C,4.37077,1.59248,1.34423,2.108,67.71,-0.2557,0.0327,0.2884,905.8493,0.119142,-362.55073,-362.543915,-362.542971,-362.581538,27.04,-1455.662120223,-1464.7177026021,-1473.015254109,-1354.6149734621\r\ngdb_82593,CC1C2OC3C(C)N1C23,2.20331,2.11824,1.59078,0.4547,77.71,-0.207,0.0706,0.2777,959.6913,0.171606,-403.077162,-403.069365,-403.068421,-403.109072,31.063,-1881.4696473351,-1894.352407105,-1905.614311128,-1745.2449735431\r\ngdb_117611,COCC(N)(C#C)C#N,2.32161,1.22442,1.03909,4.0419,75.19,-0.2594,0.0007,0.2601,1231.3492,0.132983,-417.951662,-417.941301,-417.940356,-417.987224,36.38,-1654.455716405,-1663.063257358,-1672.545545857,-1540.27104122\r\ngdb_102639,CCC(CC(C)O)OC,2.08178,0.90451,0.75594,1.4978,86.0
9,-0.2428,0.0685,0.3113,1694.9101,0.225368,-426.625083,-426.613113,-426.612169,-426.663161,42.546,-2236.5380848771,-2251.2450133101,-2265.471897358,-2070.8085654141\r\ngdb_115833,OCC1CCC2(CC2)C1,4.13374,0.88663,0.80592,1.2774,85.98,-0.2589,0.0779,0.3368,1498.119,0.206728,-388.289443,-388.280205,-388.279261,-388.324297,35.548,-2162.9394367941,-2177.5842418361,-2190.625133874,-2008.8138137591\r\ngdb_126900,CCNc1cn[nH]c1C,3.14548,1.01215,0.77792,2.696,83.66,-0.1846,0.0319,0.2166,1489.0036,0.172022,-399.348905,-399.339267,-399.338323,-399.384074,34.695,-1891.115715683,-1902.8426038751,-1914.104507898,-1756.575903556\r\ngdb_59030,CC(O)C1=CCN2CC12,3.11879,1.17662,1.06286,0.4211,80.72,-0.2208,0.0094,0.2302,1227.9673,0.170881,-403.125624,-403.116792,-403.115847,-403.159359,33.578,-1911.8799884931,-1924.113276448,-1935.374552962,-1776.800518626\r\ngdb_133276,CN=C1N=NON=C1F,2.57204,1.62637,1.00248,2.3567,66.09,-0.2705,-0.1007,0.1698,1069.2723,0.074031,-509.979228,-509.971394,-509.97045,-510.012329,27.613,-1141.65222406,-1147.400834009,-1153.918769992,-1058.565639897\r\ngdb_106476,CNC12CC(C1)OC2=N,2.60172,1.68763,1.27649,3.2844,76.74,-0.2227,0.0313,0.254,1044.8665,0.161189,-419.187541,-419.179658,-419.178714,-419.219697,31.291,-1802.1292919021,-1814.0689056451,-1824.737813663,-1672.436359291\r\ngdb_133620,CC(C)(C)CC(F)(F)F,2.56631,1.03994,1.02563,2.03,67.82,-0.3236,0.0665,0.3901,1290.1918,0.165102,-534.679822,-534.669843,-534.668899,-534.714975,37.376,-1844.874577473,-1856.3874850961,-1867.6493891191,-1710.7426461961\r\ngdb_107218,OCC12CC1N1CC1C2,3.80282,1.2096,1.0867,1.3918,77.94,-0.2261,0.0693,0.2954,1164.4322,0.172203,-403.100195,-403.092117,-403.091173,-403.132742,31.953,-1895.923062132,-1908.629491873,-1919.891395896,-1760.0981115731\r\ngdb_49007,O=CC12CCC1C(=O)O2,3.09581,1.39189,1.12748,2.2122,66.15,-0.2736,-0.0563,0.2172,1055.6479,0.111729,-457.78093,-457.773276,-457.772332,-457.813541,28.073,-1571.8617268071,-1580.3908291351,-1588.687753133,-1469.3004008291\r\ngdb_1099
07,CCC1C(O)C=CC1=O,2.11431,1.6434,1.00533,3.9502,77.98,-0.2366,-0.0488,0.1878,1192.6321,0.158715,-423.058326,-423.049201,-423.048257,-423.092627,33.668,-1881.9208263061,-1893.08170138,-1903.750609398,-1753.934090666\r\ngdb_88796,CC1NC1C1COC1=O,2.80688,1.35657,1.05947,4.9828,71.91,-0.2572,0.012,0.2692,1159.9182,0.148066,-439.082449,-439.073999,-439.073055,-439.116064,31.053,-1748.454054569,-1759.1493179651,-1769.225229978,-1626.414221722\r\ngdb_122570,OCCOC1CC2OC12,2.93592,1.28746,1.12355,2.9678,71.56,-0.2462,0.0751,0.3213,1148.7286,0.15883,-460.100529,-460.092112,-460.091168,-460.13422,31.7,-1771.7277358701,-1783.332259807,-1794.001167825,-1643.1021960681\r\ngdb_35948,C#CC1OCCCC=C1,3.2982,1.1902,0.91931,1.6927,83.82,-0.2472,-0.0007,0.2465,1274.2315,0.159104,-385.849236,-385.840708,-385.839764,-385.882641,32.867,-1887.3908222591,-1898.926947715,-1909.595855733,-1759.097862227\r\ngdb_80082,CC1C2C=CNC(=O)C12,2.28239,1.78233,1.37133,3.1312,77.75,-0.2107,-0.001,0.2096,1011.4121,0.1499,-401.992718,-401.984866,-401.983922,-402.024952,30.671,-1828.8228972531,-1839.8940385401,-1849.969950553,-1706.1724981491\r\ngdb_119286,CCNC1(CNC1)C#N,2.10488,1.27083,0.9309,3.5469,80.27,-0.2367,0.0263,0.263,1341.6591,0.171492,-399.304619,-399.295105,-399.294161,-399.339308,34.718,-1863.325852109,-1875.130551417,-1886.39245544,-1728.484835662\r\ngdb_48553,O=CC12C3OC1COC23,3.04268,1.4793,1.43009,1.5306,66.09,-0.2373,-0.0321,0.2052,951.2271,0.112651,-457.716817,-457.709902,-457.708958,-457.748497,26.273,-1531.6302422901,-1540.623073769,-1548.9199977671,-1428.484705433\r\ngdb_5582,CNc1cc[nH]c1O,3.28712,1.85297,1.26271,0.3097,67.39,-0.1765,0.0628,0.2393,952.9441,0.131982,-379.939492,-379.931645,-379.930701,-379.971645,29.403,-1550.666982823,-1559.962901149,-1568.853448661,-1443.470247862\r\ngdb_130884,c1c(nc(=O)[nH]n1)C=O,3.88604,1.2321,0.93549,2.124,65.48,-0.268,-0.1233,0.1447,1120.6064,0.078678,-468.837321,-468.830226,-468.829281,-468.869489,25.34,-1298.110925557,-1304.323892166,-1310.8412006
4,-1214.659758665\r\ngdb_64936,C1CC1C#CC2(CC2)O,4.25622,0.70089,0.65818,1.843,88.61,-0.2275,0.0381,0.2656,1758.7067,0.157805,-385.830917,-385.821529,-385.820585,-385.866595,34.779,-1875.895484888,-1886.8919526041,-1897.560860622,-1749.028852813\r\ngdb_20830,CCc1nc(no1)N,5.9852,1.3044,1.15686,1.9403,63.51,-0.2351,-0.0047,0.2303,1037.252,0.119923,-395.999918,-395.992219,-395.991275,-396.032761,27.719,-1439.980670313,-1448.480279718,-1456.777831225,-1339.594290529\r\ngdb_79255,CC12OC3C(O1)C3C2O,2.45915,2.19097,1.62503,1.3309,65.08,-0.2361,0.0791,0.3152,876.9241,0.136683,-458.953805,-458.946729,-458.945785,-458.984839,29.028,-1679.9997252681,-1690.668633286,-1700.151549294,-1563.077855825\r\ngdb_92047,CC1CC2OC2C=C1C,2.65677,1.44823,1.02656,2.3271,84.23,-0.2341,0.0075,0.2415,1213.3873,0.183024,-387.092227,-387.083639,-387.082695,-387.12527,33.709,-2039.527241764,-2052.8028221681,-2064.657722196,-1897.636161702\r\ngdb_17589,CC12OC1(C)C1CC21,2.74393,2.42859,1.76679,1.8148,71.1,-0.2363,0.0894,0.3257,839.3904,0.152558,-347.763181,-347.755511,-347.754566,-347.79454,29.849,-1737.238586212,-1748.423934137,-1758.49984615,-1616.788861171\r\ngdb_65996,CC1(OCC1=O)C1CO1,2.4123,1.47,1.36523,2.2986,68.58,-0.254,-0.0311,0.2229,1044.5458,0.133216,-458.935862,-458.927146,-458.926201,-458.969468,31.617,-1668.7403312811,-1678.380124539,-1687.862413038,-1553.4324149861\r\ngdb_133081,COc1cccc(c1)F,3.63095,1.10928,0.85431,2.594,75.99,-0.2236,-0.0019,0.2217,1275.488,0.124609,-445.911906,-445.904227,-445.903283,-445.944258,29.333,-1693.8237485381,-1703.2263433941,-1712.116263397,-1584.7576642661\r\ngdb_35595,C#CC1CN=C(O1)C#N,3.98067,1.05475,0.90893,4.8864,71.73,-0.2837,-0.0615,0.2222,1216.8502,0.08882,-415.585472,-415.577538,-415.576594,-415.61889,28.227,-1425.353435523,-1431.9297298431,-1439.0406618311,-1336.5653045861\r\ngdb_21921,C(c1c(nno1)O)O,3.61406,1.92915,1.293,2.5445,52.98,-0.2485,-0.0446,0.2039,855.2525,0.082761,-451.758862,-451.751481,-451.750537,-451.791202,25.741,-1164.6535664551,-
1170.686437981,-1177.205001473,-1083.8040518771\r\ngdb_5666,c1coc2c1C=CC2,5.1175,2.13031,1.51829,0.3907,68.72,-0.1976,-0.0043,0.1933,781.7285,0.109835,-345.432255,-345.426553,-345.425609,-345.461824,22.966,-1530.264782706,-1539.1308573671,-1546.835412869,-1435.433740099\r\ngdb_11766,C#CC12CC(C1)CO2,4.65198,1.73576,1.58229,1.8939,71.12,-0.2386,0.0344,0.273,844.6848,0.131161,-346.545252,-346.538666,-346.537722,-346.575649,26.719,-1600.8287972651,-1610.917259458,-1619.80780697,-1493.1463703381\r\ngdb_41841,C1OC2CC3CC1C2C3,2.77338,1.94257,1.69241,1.6554,79.61,-0.2353,0.079,0.3143,918.0218,0.188254,-387.125041,-387.11864,-387.117696,-387.155584,27.745,-2060.1183220901,-2074.7662646771,-2086.621164705,-1916.658469528\r\ngdb_52760,CC#CC(OC=O)C=O,2.90548,0.83539,0.67819,5.8715,73.83,-0.2649,-0.051,0.2139,1610.8278,0.107709,-457.756474,-457.746314,-457.74537,-457.794665,32.856,-1556.5153667031,-1563.471931477,-1571.768855475,-1457.455540945\r\ngdb_88887,OC1CC1N=C1CCO1,4.55206,0.85234,0.77677,4.0661,78.65,-0.2271,0.0148,0.2419,1479.3041,0.147332,-439.056888,-439.048385,-439.047441,-439.090825,31.555,-1732.4142970201,-1743.0763024391,-1753.152214452,-1610.576522071\r\ngdb_36412,N#CCN1CC2CC1C2,4.43476,1.0383,0.98923,4.5644,79.59,-0.2285,0.0278,0.2563,1234.1828,0.161617,-382.050922,-382.043291,-382.042347,-382.083531,29.771,-1853.075492594,-1865.173866114,-1875.842774132,-1723.922845232\r\ngdb_10461,O=CCC1(CO1)C#C,2.94281,1.6162,1.17679,2.7994,64.1,-0.2538,-0.0308,0.223,992.8123,0.103312,-382.484812,-382.476555,-382.475611,-382.518325,28.959,-1426.568920456,-1433.8310821131,-1441.535637615,-1333.9567496731\r\ngdb_89787,CN1CC2(CC2)C1C=O,1.95579,1.63959,1.00956,2.7575,81.06,-0.2227,-0.0263,0.1964,1228.0075,0.168893,-403.113662,-403.104362,-403.103418,-403.148334,33.958,-1904.373725835,-1916.313339578,-1927.575243601,-1769.882231901\r\ngdb_122209,CCC1(CC1)CCOC,2.50625,0.83118,0.70375,1.0413,90.05,-0.2497,0.0807,0.3304,1754.7611,0.22714,-389.473696,-389.462689,-389.461745,-389.51060
9,39.926,-2278.2172326571,-2293.5290797661,-2307.755963814,-2112.012688881\r\ngdb_13718,CCCOC1CC1C,5.06888,0.86725,0.80514,0.8631,79.93,-0.2397,0.0931,0.3328,1527.908,0.198452,-350.191045,-350.181244,-350.1803,-350.226299,35.041,-2005.04185716,-2018.444194382,-2030.892717924,-1860.29435613\r\ngdb_6519,CC1(O)CC1(O)C#C,3.08784,1.80955,1.41536,1.9582,68.4,-0.2376,0.0373,0.2749,920.4401,0.12607,-383.682917,-383.673914,-383.67297,-383.715538,33.893,-1550.538970987,-1559.110116418,-1568.00066393,-1443.996100404\r\ngdb_56886,CC(C#N)C1CC(C)O1,2.35902,1.33471,1.07352,3.6624,78.84,-0.2586,0.0345,0.2931,1228.052,0.169939,-403.160333,-403.150794,-403.14985,-403.194956,34.769,-1933.660198374,-1945.4498374661,-1956.711741489,-1799.137956499\r\ngdb_33710,N#CC12C=CC3CC1C23,3.77377,1.50885,1.26048,4.5726,77.93,-0.2446,-0.0094,0.2352,982.8999,0.126572,-363.598669,-363.59205,-363.591105,-363.629483,27.072,-1718.572075989,-1728.640457894,-1737.529750388,-1608.657599549\r\ngdb_109901,OCC1C(O)C2CCC12,2.62572,1.41659,1.17961,2.1492,77.27,-0.2391,0.0587,0.2978,1135.1607,0.183883,-424.188863,-424.180375,-424.179431,-424.221823,33.54,-1963.4913487251,-1976.82905252,-1988.683952548,-1821.292161744\r\ngdb_124007,c1c2c(n[nH]1)C=CCN2,3.68001,1.61083,1.14578,1.8494,78.55,-0.1959,-0.0075,0.1885,1012.3994,0.1291,-396.95105,-396.944276,-396.943332,-396.982087,27.153,-1642.143362316,-1652.113225308,-1661.003145311,-1531.752606545\r\ngdb_13906,CC1C([NH3+])C1C([O-])=O,2.92784,1.69472,1.5165,6.4059,64.83,-0.2458,0.0264,0.2721,922.7366,0.143082,-401.023853,-401.015831,-401.014887,-401.056145,29.932,-1615.532588153,-1625.6072451481,-1635.090788665,-1501.789679304\r\ngdb_57682,CC(C)(CCC#C)C#N,2.98616,0.81855,0.75349,3.7331,85.56,-0.271,0.0329,0.3039,1573.1766,0.167907,-366.003409,-365.992806,-365.991861,-366.039201,38.499,-1971.864828821,-1982.9874258461,-1994.24870236,-1838.3321686391\r\ngdb_31328,COc1cc[nH]c1C=O,2.74993,1.3209,0.8974,4.2693,78.82,-0.2116,-0.0337,0.1779,1239.9734,0.12505,-437.930163,-437.
92186,-437.920916,-437.963429,30.167,-1653.235838909,-1662.24624064,-1671.136160643,-1544.3479671931\r\ngdb_122559,CCOCC1CC2OC12,4.93967,0.72411,0.69344,1.1194,79.69,-0.2482,0.0896,0.3378,1697.9204,0.181049,-424.177682,-424.168302,-424.167358,-424.212985,33.961,-1956.4751705961,-1969.2531363631,-1981.108036391,-1815.7462372021\r\ngdb_59008,CC(C)C1=NCC(=N)N1,3.66022,1.0359,0.87432,3.7542,82.92,-0.2258,0.0032,0.2289,1377.6716,0.171217,-399.370244,-399.360812,-399.359868,-399.404822,34.841,-1904.5061302341,-1916.3622852801,-1927.624189303,-1769.5954602881\r\ngdb_86793,OC1CC(CC#C)C1O,3.58481,0.92666,0.81329,1.32,77.06,-0.2504,0.0487,0.2991,1413.4477,0.157227,-422.97555,-422.965905,-422.964961,-423.010458,35.775,-1829.9781413221,-1840.8127117161,-1851.481619734,-1702.3723036451\r\ngdb_14644,CC#CCOCC#C,7.52851,0.66916,0.62169,1.2336,77.64,-0.2475,0.0183,0.2658,1749.9674,0.126233,-346.524375,-346.514528,-346.513584,-346.561082,32.489,-1587.7282918721,-1595.770447216,-1604.660994728,-1484.005446735\r\ngdb_23829,c1c(nc(=O)[nH]n1)F,4.00315,2.00435,1.33561,3.8054,51.67,-0.2745,-0.0908,0.1837,787.5502,0.061819,-454.776823,-454.771024,-454.77008,-454.807087,21.164,-1067.436107121,-1072.684592397,-1078.017163879,-998.689986135\r\ngdb_88871,CC1OC1C1OCCO1,3.73063,1.06317,0.90023,2.5919,72.24,-0.2537,0.0779,0.3316,1320.7386,0.158493,-460.139386,-460.130478,-460.129534,-460.174909,31.642,-1796.1108530831,-1807.4072701011,-1818.076178119,-1668.634909769\r\ngdb_33013,CNCc1ncncn1,3.95824,1.09353,0.96182,0.4465,74.56,-0.2268,-0.0524,0.1744,1241.2204,0.138558,-414.225659,-414.217464,-414.21652,-414.259529,29.176,-1665.516190039,-1675.482287977,-1684.965203985,-1549.90957946\r\ngdb_85223,OC1CC(=O)C1NC=O,2.50147,1.19565,0.96611,4.6855,68.53,-0.2417,-0.0295,0.2122,1228.4604,0.122723,-475.045293,-475.036384,-475.03544,-475.07982,31.782,-1588.8050973161,-1597.434601084,-1606.324521087,-1480.452490777\r\ngdb_99928,OCC(O)(C#N)C1CO1,2.37117,1.3386,1.15399,2.2801,64.52,-0.2896,0.0086,0.2982,1118.3
368,0.122224,-474.994453,-474.985284,-474.984339,-475.028696,32.993,-1556.9025397561,-1565.368891184,-1574.258183678,-1448.371720661\r\ngdb_69553,CC12NC1C1(C)OCC21,2.78497,1.52889,1.35868,3.1505,78.86,-0.2219,0.0754,0.2972,1063.4103,0.170335,-403.082229,-403.073945,-403.073001,-403.114412,32.992,-1884.649235438,-1897.226398325,-1908.488302348,-1748.5958716031\r\ngdb_17277,CC1OC1CC1CO1,5.95837,0.98909,0.92848,2.4804,68.94,-0.2615,0.0887,0.3501,1273.2692,0.152155,-384.898251,-384.889812,-384.888868,-384.932091,30.078,-1685.3203740791,-1696.022540074,-1706.099079596,-1566.171475195\r\ngdb_60789,CC(C=O)C(C)(C)C#N,2.37212,1.23587,1.04115,2.4444,79.31,-0.2658,-0.0462,0.2196,1239.7184,0.167333,-403.180187,-403.169519,-403.168575,-403.216078,38.071,-1946.1187620601,-1957.199943491,-1968.461847514,-1812.392201597\r\ngdb_52113,O=CCOC12CC1CC2,5.22115,0.76363,0.72936,2.6848,76.99,-0.2523,-0.0285,0.2238,1563.1389,0.156961,-422.989467,-422.980446,-422.979502,-423.024705,32.448,-1838.711184075,-1849.937320085,-1860.606228103,-1711.312424368\r\ngdb_66627,CC12C3CN1C1C3C21O,2.69259,1.98517,1.51269,2.6549,74.44,-0.2124,0.0666,0.2789,925.1733,0.147826,-401.853818,-401.846466,-401.845522,-401.885015,30.123,-1741.661897153,-1753.0467929401,-1763.122704953,-1618.360771216\r\ngdb_107566,CCC12CC=CCC1O2,3.29475,1.26929,1.04122,1.6947,83.33,-0.2408,0.0292,0.27,1228.749,0.183024,-387.100586,-387.091957,-387.091013,-387.133854,33.366,-2044.7725894951,-2058.0224420301,-2069.877342058,-1903.0226989581\r\ngdb_81682,CC1C2CC(O)CC1O2,2.2574,1.66087,1.44493,1.5095,76.79,-0.2358,0.0704,0.3062,1042.1036,0.184534,-424.217029,-424.208873,-424.207929,-424.249359,32.745,-1981.1657672191,-1994.711804002,-2006.56670403,-1838.5712495681\r\ngdb_83595,CC12COC1C(O)C2=O,2.2653,1.76001,1.27556,3.2675,68.1,-0.2341,-0.0384,0.1956,1027.5844,0.134471,-458.939131,-458.930648,-458.929704,-458.972319,31.375,-1670.7916582021,-1680.5776610571,-1690.0605770651,-1555.2214431451\r\ngdb_66540,OC12C3C1C1C3C21C=O,3.1293,1.58693,1
.2403,4.3561,70.88,-0.2221,-0.0216,0.2005,981.1471,0.109793,-420.514744,-420.506823,-420.505879,-420.547987,29.102,-1541.5034688961,-1549.8656538301,-1558.162577828,-1439.594752278\r\ngdb_71214,OC12CC3C(CC13)C2=O,2.43934,2.05522,1.63956,2.5598,71.37,-0.2418,-0.0147,0.2271,888.7746,0.137448,-421.809569,-421.802728,-421.801784,-421.840417,28.248,-1726.1661899071,-1736.983190049,-1746.466106057,-1609.3836274621\r\ngdb_87482,CC1C2CN1C21CC1O,4.0443,1.13074,1.0369,2.4623,80.57,-0.2326,0.061,0.2936,1223.4381,0.169852,-403.037223,-403.028459,-403.027515,-403.070397,33.737,-1856.407565384,-1868.6835239511,-1879.945427974,-1720.9760629681\r\ngdb_87796,CC1C2CC3(CC3O)N12,2.68565,1.48874,1.24465,1.1427,79.24,-0.2226,0.0691,0.2917,1092.3898,0.170271,-403.079627,-403.071022,-403.070078,-403.11239,33.772,-1883.0164570201,-1895.3921895181,-1906.654093541,-1747.3270484051\r\ngdb_52672,CC#CC(C)C1(C)CO1,2.6755,0.91999,0.83419,2.0382,87.04,-0.2443,0.0549,0.2992,1526.3141,0.178908,-387.056868,-387.04591,-387.044966,-387.097438,38.416,-2017.3391510331,-2029.1275351071,-2040.982435135,-1880.1713312141\r\ngdb_34363,C#CC12CC=CC1CO2,2.74956,1.61973,1.3057,1.7017,79.21,-0.2471,0.0127,0.2598,1014.2696,0.135948,-384.64642,-384.638791,-384.637847,-384.678395,30.031,-1760.464576829,-1770.7877273881,-1780.270643396,-1644.6452406991\r\ngdb_125586,Cc1nc([nH]n1)C2CC2,4.12812,1.06251,0.91024,2.6395,79.57,-0.2347,0.0209,0.2556,1311.1534,0.149661,-398.160047,-398.151718,-398.150774,-398.193694,30.663,-1772.948240875,-1783.71943286,-1793.795344873,-1650.824321822\r\ngdb_76335,CC1C(C)C(C)(O)C1O,1.82942,1.75328,1.19363,1.5437,80.99,-0.2422,0.0526,0.2948,1186.3598,0.20338,-425.42717,-425.416622,-425.415678,-425.461677,40.226,-2112.6885160741,-2126.5106568171,-2139.5515488551,-1958.0891237441\r\ngdb_127467,Cn1c(=O)oc(n1)CO,3.56118,1.14844,0.87876,2.1925,64.48,-0.2432,-0.0028,0.2404,1250.3812,0.112089,-491.117124,-491.108328,-491.107383,-491.151268,30.23,-1485.275524951,-1493.086756983,-1501.3830534721,-1383.0
59956432\r\ngdb_31412,CNc1ccoc(=O)c1,3.06649,1.26078,0.89849,6.8022,77.37,-0.213,-0.0267,0.1862,1238.8047,0.124759,-437.94934,-437.940777,-437.939833,-437.984044,30.776,-1665.2695790021,-1674.1168283931,-1683.0067483961,-1557.284065228\r\ngdb_94823,CC1OCC2(CC2)C=C1,3.73934,1.15658,1.01988,1.0639,86.86,-0.2291,0.021,0.2501,1258.9527,0.183181,-387.089386,-387.080974,-387.08003,-387.122163,33.41,-2037.744488695,-2051.1305106831,-2062.985410711,-1895.686491239\r\ngdb_39986,C1CC11CC2C3OC2C13,4.01995,1.37523,1.27603,2.0583,78.65,-0.2263,0.0855,0.3118,1039.7021,0.160873,-385.816258,-385.80949,-385.808546,-385.847235,28.507,-1866.6968304571,-1879.3373717531,-1890.006279771,-1736.8802785731\r\ngdb_129889,c1c(nc2n1cno2)N,5.52693,1.33523,1.07656,4.4714,67.81,-0.2034,-0.0173,0.1861,1018.17,0.092058,-448.922474,-448.915855,-448.914911,-448.953258,25.857,-1339.274260939,-1346.674474576,-1353.785406564,-1248.217057458\r\ngdb_47001,O=C1NCC2COC1C2,2.67679,1.78718,1.42817,4.8266,70.94,-0.2282,0.0241,0.2522,957.0809,0.151811,-439.131708,-439.1246,-439.123656,-439.163299,28.066,-1779.3645204,-1790.901900874,-1800.977812887,-1656.054609337\r\ngdb_53123,CC#CC1C2C3CN1C23,4.54414,0.98109,0.94476,1.7549,84.57,-0.2224,0.0357,0.2581,1318.8119,0.147781,-364.706764,-364.698604,-364.69766,-364.740861,30.373,-1786.0600414301,-1796.9385374541,-1807.014449467,-1664.8347152651\r\n"
  },
  {
    "path": "graphium/data/QM9/norm_micro_qm9.csv",
    "content": "mol_id,smiles,A,B,C,mu,alpha,homo,lumo,gap,r2,zpve,u0,u298,h298,g298,cv,u0_atom,u298_atom,h298_atom,g298_atom\ngdb_1,C,0.0817354099104199,98.68946496237167,142.91669186791543,-1.768196605616354,-7.569963931339775,-6.674818289787962,2.257884669066512,5.338921850966618,-4.125589852379649,-3.1188228706583905,9.26267898891294,9.26258973253578,9.262589752904445,9.262904106134895,-6.186302195134863,5.661252369499702,5.64471236173955,5.636546488290379,5.707883588529799\ngdb_2,N,0.1568393218386398,184.45255129690693,173.66371017088588,-0.705986479363799,-8.027962819534471,-0.7691910328246705,1.52923397261724,1.868722791403268,-4.158503396414672,-3.431109711802459,8.8621082274628,8.86201660741038,8.862016602814018,8.862353250279831,-6.223963998504721,6.159085940277457,6.141833617537457,6.132597728634267,6.22164075748973\ngdb_3,O,0.4364679324105007,275.6024525032204,257.2252235871683,-0.5586388025245081,-8.412681885618017,-2.386799722949275,1.226694794559355,2.32327888410532,-4.184083080212001,-3.8212954550969873,8.365885045629856,8.365790064916677,8.365790085279798,8.366172010993274,-6.301256849865082,6.425572258008738,6.409589666726299,6.400906652867241,6.48471871476045\ngdb_4,C#C,-0.0054239117514669,21.59619185499186,31.4754977834281,-1.768196605616354,-7.195015508204384,-2.011767540601949,0.8410638704151496,1.7677103263583682,-4.039226740304344,-3.6570225387565514,8.343325889311837,8.343232407888792,8.343232403289225,8.343579318429569,-5.6681447043273465,5.705117858128321,5.691953032968984,5.685892764383127,5.738178018379722\ngdb_5,C#N,-0.0054239117514669,27.268546789957043,39.675297975234685,0.1226236714118315,-7.596833199447198,-5.441278702067244,0.1699382289487148,2.7041800543787997,-4.0777501494546575,-3.964771287992934,7.941344651700702,7.941239437938911,7.941239458299409,7.94159431535452,-6.23331791045279,6.054791014686964,6.04124186475396,6.0351594147302965,6.0882153229649285\ngdb_131852,c1c(cn[nH]1)NCCO,-0.0023889329155309,-0.4118887228164443,-0.40
23857904573995,0.2310924712137041,0.0560228896940481,2.515729407735631,0.4618246190468151,-0.7155127726621006,1.4586591189216138,0.0362028807903958,-0.5921024973616195,-0.5920964544168864,-0.5920964590716694,-0.5921242945205567,0.1381730596478094,0.206707735855223,0.2058000047361694,0.204308873399935,0.2262987148139725\ngdb_43476,O=C1NC(=O)C2CC1C2,-0.0040015692065541,0.2624978345153555,0.1383593577747917,0.0576730768316739,-0.7024232691563699,-0.8143756331074807,-0.76537655392038,-0.3767001294906637,-0.7782626211295555,-0.6236537147402724,-0.6592971682937696,-0.6593432898265893,-0.6593432944817892,-0.6592336127357417,-1.1502052334099302,0.3410888959206847,0.3409645682972814,0.3433977050519765,0.3196298376100148\ngdb_21453,c1nc(on1)NC=N,-0.0020336401834512,0.0882075918524058,0.1001799847465993,1.0062523338941955,-1.8223832170884664,-0.3580111702510979,-0.797334917799734,-0.6229180130376082,-1.0536257135109317,-1.8791123329629964,0.0176090232363207,0.0175619826170693,0.0175619779660538,0.0176590128502,-2.07624251626878,2.341472124052768,2.338008324453182,2.336108776254511,2.363519304882029\ngdb_47651,C1CC(=O)NCCC=C1,-0.0042126480942185,0.1697522405986742,0.0192484761066323,0.5728345332400862,0.6544747702684502,0.2339070934537156,0.0506270037991264,-0.0589317498702477,-0.22645514147871,0.7688492683767794,0.2086656627335486,0.2086634005379692,0.2086634208506837,0.2086709825164921,0.131034547897968,-0.8281784931184392,-0.8286734900224964,-0.827743419217958,-0.8361976963668892\ngdb_120327,COCC1C2CCCC12,-0.0036450606459337,-0.3063068709902654,-0.2092709985543409,-1.0993492352717948,1.3066651870576982,-0.2902342698268825,1.4887533783700582,1.6056694970155074,1.0026883243772016,1.7584237865512329,0.5807952497749932,0.5808167385163491,0.5808167338688143,0.580724162637872,0.6713460406877539,-1.6899993876150767,-1.6916994232899245,-1.692001340254954,-1.6912950953280772\ngdb_18112,C1OCC2C=CCC12,-0.0034370102766345,0.4948258945438261,0.5099579267675708,-0.8193559779519007,-0.4251
812755025142,0.0622056123790369,0.3126855876098295,0.2777764669460866,-1.3252452021359398,0.2646101556142743,1.5906638796840422,1.5906223647021456,1.5906223600608511,1.5907046379455685,-1.52190015555689,-0.0965197634563808,-0.1009602672266666,-0.1002903774067669,-0.108208099442752\ngdb_45538,O=C1CC2CCNC2=C1,-0.0030843978938378,-0.1171725346995588,-0.1248528725352755,2.2559958670331204,0.832789004072253,1.2234498396472588,-0.3200900172013804,-0.88597130742537,-0.0670590492179551,0.0779172620345463,0.2381260524482999,0.2380990136539298,0.2380990339668247,0.2381639718474192,-0.6158014744820828,-0.3571299044875164,-0.3570023141470737,-0.3545213801876228,-0.3812341194174597\ngdb_46983,N=C1OCC2COC1C2,-0.0039185170642328,0.2470791715468392,0.3067757553641307,0.9382959773918176,-0.333581497863575,-0.570378791580305,0.1848521320924134,0.448235001709356,-0.855558226617658,0.0818542821663787,-0.6879512970442715,-0.6879903360196908,-0.6879903406750664,-0.6878961880662299,-0.917588212596104,-0.0452677555223741,-0.0486020298856986,-0.0482988456791099,-0.0405350564481133\ngdb_29887,Cn1cnc(=O)nc1O,-0.0035817601909065,0.0416611613020773,-0.0887088808566616,3.255607734122908,-1.026075816813954,-0.2992711898834448,-0.9507350644206334,-0.7996898268661837,-0.4139488154514159,-1.3804932413047113,-1.4610024423199834,-1.4610290556488927,-1.461029060309045,-1.4609692867619544,-0.9298959914751428,1.277538021145946,1.279033409818176,1.2797278740070828,1.274772090599154\ngdb_107310,CCC12CC1N=C(N)O2,-0.0036951417288266,-0.0717878707283487,-0.0807590281414917,-0.6148334819399361,0.1427373458589098,1.1330806390816384,0.9390695196451688,0.3977287691869067,0.048820374953052,0.3461455420086403,-0.1916881254263564,-0.1916874330886013,-0.1916874377409098,-0.1916754837081405,0.3752208808580893,-0.3075534328812455,-0.308407528167253,-0.308710714065434,-0.2991077467084007\ngdb_51556,O=CNC1CC11COC1,-0.0039717261423716,-0.0531996054624553,-0.0761862655579096,1.3859584758512324,-0.2798429616487292,-0.001052
8280168966,0.1656771137648009,0.1641374437705745,-0.0535333906161042,-0.0335214452084719,-0.6869490062539516,-0.6869480996739107,-0.6869481043292798,-0.6869571889543636,-0.0796746265111698,0.0600157102809207,0.059914283119302,0.0594520940951924,0.0666595223934458\ngdb_107668,CCC12COCCC1N2,-0.0036826021153766,-0.1047024858204353,-0.0724167027894077,-0.6167937614544278,0.7790504678574072,0.2158332533405922,1.52923397261724,1.4099578459910125,0.2030730780750333,1.4398256917607148,0.1798982046298884,0.1799038730990811,0.1799038684490689,0.1798998719987629,0.5037140923552499,-1.2264395810709687,-1.2299514914044634,-1.2310713525310235,-1.2087527871127677\ngdb_107965,CCC12CC3C1CC23C,-0.0040033155784582,-0.0341756777293924,0.1631490247947703,-1.5781801780116278,1.8501572010487384,-1.2707400959638635,1.7401591742209763,2.310652325974708,-0.0339056543168456,1.7060103276969114,1.5088695173875373,1.5088895624082836,1.5088895577664845,1.508853954775356,0.8960860830189944,-1.7920995966063016,-1.7935829131777652,-1.7931662386826388,-1.7957283814859983\ngdb_94256,OC1CCC1(C=O)C#N,-0.0042220928713829,0.0274610904518753,0.1754343310572485,0.4833151020782996,-0.6389140899933724,-1.6412538182829064,-1.6197301482951112,-0.8333606485478176,-0.470663000421603,-0.7543267188105648,-0.657759059317203,-0.6577608139953863,-0.6577608186505751,-0.6577462213680156,-0.0636745139684204,0.502656222887189,0.5057299243663429,0.5070009594122976,0.4894279654467912\ngdb_115519,OCC1COC(CO)O1,-0.0041924045490147,0.0350567708577414,-0.0089365355861856,0.1243225803243911,-0.9552386554398412,-0.6155633918631153,1.2330864673352258,1.5046570319706063,-0.1340643018999558,0.3234851360590088,-2.1404348050417843,-2.140432505108665,-2.140432509773016,-2.1404338318388976,0.3429745001950086,-0.0138704597581866,-0.0160279028765378,-0.0183932629504398,0.0165801027544119\ngdb_5417,c1ncnc(n1)CO,-0.0023667827301154,0.1191838246630285,0.111899400230271,-0.5159047091085895,-1.7881859667699298,-1.166815515313399,-1.28310204876
59153,-0.7239304780825087,-1.0402487552969968,-1.5385751183538694,0.4175412711348177,0.4175037050281038,0.4175037003795599,0.4175760311996537,-1.886210410376427,1.7765672798198482,1.7751462269912743,1.7747777883155145,1.7774042991476378\ngdb_1722,CCC1(COC1)C,-0.0031723354562957,0.3699170454422055,0.5929978349221235,-0.5059072835846822,-1.0419531116047047,-0.091422028582517,1.2842198495421924,1.3131542336563162,-1.3130460510811028,0.6713854799223263,2.512193676700593,2.5121970127264666,2.5121970080908653,2.5121819070666462,-0.5889705165257795,0.0882612147048248,0.0826639307464615,0.079597328715766,0.1072211736476858\ngdb_65065,OC1(CC1)C1(CO1)C#C,-0.0041406213061646,0.1087531812461617,0.0880498420847614,-0.9799028701887688,-0.1186273530041969,-0.6743033722307695,0.1188048467417483,0.4313995908685399,-0.4144646199220008,-0.4775632196193678,-0.2546742833702859,-0.2546642497542876,-0.2546642294444363,-0.2546714481298294,0.6176841247751457,0.2689018613133624,0.2713459162528722,0.2718338623632094,0.2637970124708564\ngdb_76797,CC1C(O)C2(CC2)C1O,-0.0043383592388407,0.2239069794524679,0.0285126557919337,-1.0917694878157604,0.3308222226108658,-0.6381556920045204,0.8368027552312358,1.1237558616971275,-0.1652134586398936,0.9852050540948186,-0.3158839412974146,-0.3158587404214892,-0.315858720112016,-0.3158925077784049,1.2377500247011,-0.9136256016091338,-0.9138917519211932,-0.9147994273470672,-0.901283274183866\ngdb_97767,CN1CC(C1)C#CCO,-0.0020156625005279,-0.542903160407236,-0.5287254704612816,-0.7173561005478506,1.456888822385559,0.9613791580069596,0.4256051399835471,-0.0294697808988182,3.159181027208356,0.6343294049410317,0.21117353647683,0.2112154467348184,0.2112154420849997,0.2110735591578542,0.9458095096903092,-0.5647443260690119,-0.5629577097000228,-0.5639043318236767,-0.5619235379033979\ngdb_52004,C(C=O)C(=O)OCC=O,-0.0031637086001496,-0.4717702132314355,-0.4556160327477216,-0.3273911624649736,-1.2642352386751965,-1.275258555992144,-1.2064019754554658,-0.597664896776383,1.
8846785812450977,-1.2243047403494882,-2.0817757315458785,-2.0817542893055982,-2.081754293969587,-2.081887011307013,-0.0144433984522661,0.9007363843817494,0.907218450307323,0.9080965550432216,0.8763736940199777\ngdb_122318,OCCCC12CCC1O2,-0.0027037109247361,-0.458018432746002,-0.4262901122266646,0.0199050248524678,0.5347843941535692,-0.2179389093743854,1.4141838626515653,1.4983437529053,2.015380502977957,0.9826204454586536,-0.3155845920456332,-0.3155591648709596,-0.3155591695240335,-0.3156298066950311,0.805254674891691,-0.8821811076385189,-0.8827003294176476,-0.8838305808896131,-0.8712937548784597\ngdb_61349,CC(OC(=N)C#N)C#N,-0.0036451324903474,-0.2971453312821093,-0.2909144343031291,0.8513249095988698,-0.6218154648341037,-3.480267049793281,-1.8114803315712356,-0.1704663466906586,0.646517294748908,-1.4738396880945752,-0.5327084555518221,-0.5326962717716895,-0.5326962764261055,-0.5327343293199973,-0.0885362273040777,1.1985177350966263,1.204216620534613,1.2054387728300595,1.1823443009098376\ngdb_75167,CC1=CCN(C=N)C1=O,-0.0036642265249309,-0.0340304569070025,-0.150162793980931,1.7357376838870302,0.5408910459961664,0.1073902126618471,-0.9869545434839012,-1.024863446862108,0.0336622291598198,-0.3716844263029831,-0.1626515725149743,-0.1626460035838738,-0.1626460082360028,-0.162659345270002,-0.0966593613642434,0.1189741364758223,0.1222217551364258,0.1237589037640964,0.1014193244162607\ngdb_49438,O=CC1C2C3COC1C23,-0.0036679403284736,0.0376202340703614,0.1668455693982648,0.7712801627571266,-0.4703704991377231,-0.0100897480734589,-0.8079877057595187,-0.7933765478008776,-0.6710299815347475,-0.3468902155490754,-0.255983062662387,-0.2560176442733331,-0.2560176489260392,-0.2559509052496926,-1.011619643231958,0.1314239744817821,0.130432212905256,0.1319114525940205,0.1177363129710062\ngdb_2635,C(COC=O)C#N,0.0033854457850931,-0.2843911547070057,-0.1592079191712105,-1.4756575594037131,-2.73715966310934,-2.644351944561292,-0.5011874125177199,0.7365414123583436,-0.2809137989302604,-1.788
3504948092974,1.2730369247429447,1.273016964132106,1.273016959488849,1.273037821090102,-1.9182106354619264,2.217634518878554,2.217362158201029,2.2163109552082587,2.211035536537616\ngdb_98188,CCC(C)(C)C(=O)CO,-0.0038140331806338,-0.2876491522875767,-0.2342979746105932,0.640725547091981,0.8547729507055974,0.1661301930295002,-0.3925289753279162,-0.4629816100498493,1.030161078497619,1.597877515221081,-0.3468809172815627,-0.3468087814600832,-0.3468087861133502,-0.3469606801637004,2.418312174778463,-1.5460815677292257,-1.5432432228087103,-1.544592211854677,-1.536067132087544\ngdb_82108,OC1C2NC2C1N1CC1,-0.0038543434232537,0.0418000681756675,0.1311761957921984,0.5802182527446714,-0.0502328523671219,0.9839714581483648,1.2330864673352258,0.7596901022644663,-0.458822374580901,0.3563036550205445,-0.1889571376393458,-0.188970309555224,-0.1889703142075157,-0.1889394134831719,-0.1060132733123118,-0.0206827342187135,-0.0255041070631016,-0.0278026307986847,0.0132374852947221\ngdb_27764,c1c(c2n(c1N)CC2)O,-0.0041735260476725,0.2546306543111085,-0.0851036089594262,-0.4389964094900331,0.3479208477701346,3.5640121342968274,0.7728860274725277,-0.8964934392008804,-0.2453415205555199,-0.331142135021745,-0.1609985116284309,-0.1610019201817423,-0.1610018998713122,-0.1609801950917022,-0.0045971753490356,0.2926163379198683,0.2934016125590184,0.293733998283039,0.293108319538973\ngdb_23737,Cc1ccncc1F,-0.003583799466959,0.5066961182869918,0.1951766174210974,-0.2383944725070516,-1.0883636656084337,-0.6246003119196788,-0.7845515722479923,-0.4840258736008703,-1.1950512209108424,-1.213575609150524,0.6188475521287921,0.6188075413875865,0.6188075367402864,0.6188975955489964,-1.6326701654682356,1.1069348597704385,1.107346734014352,1.109249535733207,1.078874190115861\ngdb_53808,CC(=O)C(=O)C1CCO1,-0.0035120987420138,-0.1938365010237704,-0.191618857163826,-0.6927872639662217,-0.7170792335786007,0.2067963332840299,-2.084191703341723,-2.154940399551931,0.3369977947536978,-0.4634079792980459,-1.183790931407769
7,-1.1837734188304394,-1.1837734234888784,-1.1838382655246034,0.113311346312152,0.2615074756394015,0.2653264733303456,0.2658542942597159,0.2498452178569533\ngdb_109868,CCC1C(O)C2CC12O,-0.0041702985755461,0.0034302013207644,-0.0851674998538076,-0.8426833041743516,0.3674621336664412,-0.2043835292895426,0.9838112290762644,1.0669363501093718,0.1691079077389278,0.9759485487466928,-0.3157420050194265,-0.3157147314763164,-0.3157147361293913,-0.3157544174711803,1.244888536450943,-0.8987162126234,-0.8988977250544385,-0.8999137351498512,-0.8855191140926443\ngdb_30981,COc1c(oc(=O)[nH]1)N,-0.0036184284743974,-0.0886461140231638,-0.1851293677488022,1.457508678130179,-1.3069818015733692,1.8424788635217595,0.7473193363690445,-0.1178556878131062,0.0184025667556064,-1.078184191639791,-1.9863387867085505,-1.986343133475325,-1.9863431131761744,-1.9863230270471064,-0.0565360022185781,1.109449475324377,1.1113596959633931,1.1108005587932848,1.1194728865768435\ngdb_100775,CC(CO)C(O)C1CC1,-0.0037995482415185,-0.3508454658223349,-0.288733016623536,-1.52433783401359,0.8266823522296556,-0.76467257279639,1.3460060197089434,1.6856376985093864,1.330973026931184,1.6537170831977603,-0.3460229841232531,-0.3459626508992345,-0.3459626555524963,-0.3460950446478081,2.022986317183751,-1.4559618368022715,-1.4551451928791082,-1.4571155435611878,-1.4372476194979005\ngdb_85839,NC1(CC(O)C1O)C#N,-0.0038421022404455,-0.084346314890666,0.0355680359857642,0.0255898354444937,-0.7341778587378677,-1.5011815574061953,0.1550243258050162,0.8522848618889587,-0.324273545216414,-0.3535320586722434,-1.0882104277356357,-1.088215158698918,-1.0882151633567665,-1.088179475229478,0.3225435872558053,0.4853003935354149,0.4847840303615844,0.483764001619977,0.5037673099120012\ngdb_19123,C#CC1(CN1)C1CN1,-0.0038756812141442,0.3831068844844783,0.2600623843007101,-0.2016065602850913,-0.4947971065081072,-0.0236451281583017,0.2572910902189492,0.2651499088154743,-1.0013489482897695,-0.5437412221407077,1.7187813404978345,1.718772677382827
,1.7187726727423245,1.7187927087572274,-0.5931551613446524,0.921493106719674,0.9188128954188431,0.9171678463180508,0.9358796938437128\ngdb_30417,CN=c1cc[nH]c(=O)o1,-0.0038205599693004,0.0281682527174256,-0.1387080693454108,0.6588908039262706,-0.0795447812115819,0.3287947540476178,-1.0508712712426094,-1.1890087025600713,-0.1454316601598854,-1.06162466421506,-1.060063684983441,-1.0600754014265608,-1.0600754060842354,-1.0600751272201112,-0.5687857591641564,0.8183597814317992,0.8215323560571028,0.8230148108757761,0.8002059002608013\ngdb_87650,CC1NC11CC(=O)C1O,-0.003972229053268,-0.0959071551426535,-0.0453817271954546,-0.7219954287321476,-0.2468670416987115,0.0170210120962267,-0.9741711979321596,-0.9722527879845556,-0.0050824433080901,-0.0899019777834193,-0.6868680032250968,-0.686854140639481,-0.6868541452948496,-0.68687571267765,0.5261142499151,0.0685244980506546,0.0696971774029768,0.0691659889340136,0.0759607188025627\ngdb_5976,CC(C)(CCO)C=O,-0.0035234501593899,-0.2043365978774323,-0.0889918262460649,0.4314983802452364,-0.4239599451339944,-0.3354188701096928,-0.6396736559949207,-0.4756081681804616,0.1558621347229866,0.746970255735755,0.6339064520351547,0.6339539673931435,0.633953962745937,0.6338454719124961,1.0501794745845547,-0.3838257344670711,-0.3832840960286526,-0.3855005495938964,-0.3715653205762597\ngdb_42578,O=C1C2C3C4C(CN14)N23,-0.0036962470275001,0.5082493496916825,0.538316356602281,0.5297083839212694,-0.9320333784379772,-0.0643112684128314,-0.6162375224833944,-0.5787250595804645,-1.412500159085888,-0.991148998496376,-0.1298724050839537,-0.1299416195642212,-0.1299416242161482,-0.1297840670095821,-1.895072011169334,0.9386365441425282,0.9342331695387944,0.9349207375497874,0.9424993872447284\ngdb_25201,N=C1NC=NC=NC1=O,-0.0039455581962782,0.2743933140537195,0.0550638860427133,0.8105510956974431,-0.6401354203618905,-0.9770401941255972,-1.990447169295618,-1.5109859348906916,-0.8019377961124147,-1.6967471561694054,-0.9340417687817644,-0.9340804550765966,-0.9340804597
33493,-0.9339977531269812,-1.380606854025529,1.6162507053232378,1.6168915378142534,1.6176418613903911,1.6147558961240145\ngdb_48641,O=CC12CC(CC1)C2=O,-0.004003719012474,0.0932145532504538,0.1335766679668134,0.7126024626233426,-0.2199977735912894,-0.2179389093743854,-1.0806990775300065,-0.9659395089192494,-0.5594800263503845,-0.3666354233858236,-0.2570234211319058,-0.2570429809734414,-0.2570429856261538,-0.2569989138285971,-0.5845397161293255,0.0221417717455173,0.0236754693561232,0.0259076719073592,-0.0019026055525625\ngdb_44445,O=C1NC2C3CC1CC23,-0.0037429735289192,0.2212298651614561,0.4060804597740596,0.8087215014839177,-0.1259553352153123,0.5773100556030728,0.600310862524016,0.3240738467583334,-0.9930771317590624,0.1106155666409123,0.2382255526248841,0.2381792432864096,0.2381792386367575,0.2382891068275547,-1.006942687257923,-0.3466781238865429,-0.3486489076037528,-0.3462294715486395,-0.3669489179465091\ngdb_104356,CC1(CO)OCCC1=O,-0.0041446114343757,0.0337813532002311,0.0592989396131367,0.1511784096729268,-0.4923544457710693,-0.3173450299965682,-0.8740349911101836,-0.7155127726621006,-0.2344109696850949,0.2823718265907104,-1.2142366374722768,-1.2142249078404377,-1.214224887536516,-1.2142387518040365,0.6048840347409459,-0.3130415354400767,-0.3121163990869756,-0.3123908453194611,-0.3087167032928347\ngdb_86839,CC1OC(CC1=O)C#N,-0.0039031423596855,-0.0782533542990943,-0.0552209249301884,0.2058048654767615,-0.7549404750026941,-1.0131878743518463,-1.176574169168069,-0.6902596564008754,-0.1441401829927168,-0.7586243820079089,-0.6584389605489208,-0.6584417923320615,-0.6584417969872558,-0.6584393438482575,-0.2283525953699543,0.4312374701955931,0.4348273324246913,0.4365984484652742,0.4103022544466171\ngdb_131498,OC1CC1N1C=NN=C1,-0.0033463426264533,-0.0999354544767704,-0.062559250513422,1.852897056203149,-0.7683751090564052,-0.4980834311278091,-0.0452480878389356,0.1872861336766973,-0.2774572298169149,-0.6929873440848368,-0.5617668755366334,-0.5617866777975503,-0.5617866824521
475,-0.5617503429476632,-0.7014636154801898,0.769693186360535,0.7684879646001521,0.767905748604736,0.7818314779418933\ngdb_131711,C(c1nc(on1)CO)O,-0.0036618114473294,-0.2142242416975374,-0.2699673482166597,-1.084451110961658,-1.459648097638267,-2.097618281139289,-0.9549961796045472,0.0315585833991421,0.4721285573670495,-1.0954048980179598,-1.9851900414435648,-1.9851940324572812,-1.985194037120673,-1.9851949650308804,-0.3615227628411491,1.2301169342998357,1.2310026217283896,1.2295970538040797,1.248250572528112\ngdb_131800,c1cnoc1CCCO,-0.0023877723519238,-0.464919578783917,-0.4442708353597123,0.8220514021824611,-0.007486289468951,-0.5387495713823388,-0.3882678601440024,-0.132586672298821,1.918546166951633,-0.0281418528145942,-0.6873282102636087,-0.6873192927775161,-0.6873192974328876,-0.6873675408271865,-0.0095202869006508,0.020183046178802,0.0212661327461899,0.0210765318599966,0.0198144342260361\ngdb_22715,ON=C1CC2(OC12)C#C,-0.0027700730570887,-0.2829200046367091,-0.2304280118652063,-0.6028104342510539,0.0511375682199705,-0.4303065307035937,-0.5928013889718681,-0.3830134085559699,0.3001251028903993,-1.518198785152476,-0.6244222315889193,-0.6244342883670468,-0.6244342930220297,-0.6243978864531853,-0.2251525728614039,1.38089684897522,1.3825179259748477,1.382482506552917,1.384511390149581\ngdb_10975,OC12CC(OC1)C2=O,-0.003861483652684,0.7145765685635806,0.6766857792911157,-0.5890231349991291,-2.2095449439090498,0.3875347344152708,-0.8314238392710449,-0.9996103306008828,-1.604298995251059,-1.247626325252558,-0.2024184681023519,-0.2024638403788525,-0.2024638450312277,-0.2023469788384319,-1.4638074392478286,1.485223240036663,1.4822207972908623,1.4814795857285783,1.49256085836702\ngdb_115819,CCC1CCC2(CC2)O1,-0.0037799181370785,-0.1791502561160026,-0.1836324953661524,-0.8583001976398019,1.3823876699058886,0.9252314777807118,1.654936870542699,1.2058284895461098,0.7004569306541741,1.7182721919242985,0.5803414331141052,0.5803652159300667,0.5803652112825292,0.5802862276505367,0.9
82979001905006,-1.737669576105801,-1.7387113764812523,-1.7386817148151583,-1.7412890260295242\ngdb_119662,COCC12CC3C1CN23,-0.0033821985154193,-0.132692220848868,-0.0864635722826872,-0.5013332980508681,0.671573395427719,0.7038269363949424,1.4802311480022303,1.1342779934726384,0.130687087558162,0.6679293172111762,0.2124233295879792,0.2124195901734034,0.2124195855235921,0.212423209687679,-0.0644129807011627,-0.4334625148930831,-0.4375838376288593,-0.4394147305391198,-0.4078496753198478\ngdb_64560,OC1(C=O)C2CC1C=C2,-0.0037423932471156,0.1205350097061335,0.1696841505629236,-0.3608465995122983,-0.1870218536412718,0.2564993935951207,-1.1424852476967575,-1.2500370668580316,-0.7326452026076326,-0.3706325506952413,-0.2559560533319646,-0.2559874146264957,-0.2559874192792016,-0.2559039765118789,-0.5042929978379951,0.1342611111125525,0.1335796850039783,0.1350367253490194,0.1230936197265188\ngdb_38013,C1C2C=C3CC4C3C1N24,-0.0035131100902999,0.4598466181942846,0.6160715750644306,-1.3503303557772122,0.7363039049592361,1.4267805409199048,-0.9486045068286764,-1.6014762681600814,-1.32369814617704,0.0639423432459803,1.1699029697579255,1.1698423307505876,1.1698423261066917,1.169983910426867,-1.3870068990426287,-0.0702932934185152,-0.0767187542122845,-0.0762172607605058,-0.0644064174543822\ngdb_53522,CC#CCC(C=O)C#C,-0.0037034977867977,-0.3683224760995064,-0.4120333155089891,-0.3526134255514333,1.1881961413113369,-0.3489742501945356,-0.9145155853573654,-0.7386614625682234,1.6220361636348328,-0.5224933348643281,0.6715979726397883,0.6716582501269784,0.671658245480005,0.671471336316884,1.0767642769632786,-0.02248675455367,-0.0127790671020658,-0.0102897481269946,-0.0674697710512839\ngdb_111810,OCC1C2CCC1CC2,-0.0039417393893615,-0.0612309483354907,0.1178686352196178,-0.9241655893267222,1.0489644793001491,-0.9499294339559116,1.3417449045250294,1.7677103263583682,-0.087276225276046,1.8377352073749451,0.5799857187346957,0.5799819908778433,0.5799819862303033,0.5799976159129864,0.3471591450138824,-1
.7750348228604074,-1.7786122779285858,-1.77830119238306,-1.7742364625763825\ngdb_51987,C(C(=O)NC=N)OC=O,-0.0025647804079797,-0.4142059420258815,-0.4170168052707373,3.3460419623914577,-0.9503533339657656,-0.9047448336731012,-0.9912156586678152,-0.5576807960294435,1.38231862746359,-1.127772613147607,-1.9863325211429903,-1.9863219901963227,-1.9863219948597213,-1.986366885447286,-0.0811515599766544,1.110107628091774,1.1135611070841755,1.1129838624685349,1.114466084465441\ngdb_83041,OC1C2CC=CC(=O)C12,-0.0041483086584383,0.2123903368420774,0.2372624622771804,0.7124064346718934,-0.0624461560523131,0.1254640527749718,-1.4130660618752884,-1.4541664233029348,-0.7094586656783354,-0.3441553389689472,-0.2570547739221194,-0.2570763559014531,-0.2570763605541658,-0.2570172110439245,-0.4193693235726303,0.0188483857859526,0.0202005146117488,0.0224572262645582,-0.003991385260962\ngdb_80797,CC1C2NC1(C#C)C2O,-0.0043321750927629,0.1649094418694147,0.0854668244976283,-0.6079071609887322,0.6190561895813947,0.0396133122376318,0.2956411268741741,0.2735676142358829,-0.2845859121155149,-0.0653782493286447,0.2421411816261849,0.2421484383524484,0.2421484337028208,0.2421505681246305,0.5270988722254225,0.0646306460204372,0.0646185973207369,0.0641232284358586,0.0738690894633648\ngdb_31322,CCc1ccoc1C=O,-0.0041432077050604,-0.0171722136121874,-0.1882874033853679,1.1494180811025705,0.6092855466332414,-0.0823851085259547,-1.3853688131798485,-1.3279008419968092,0.2552479692323577,-0.3842768800070891,-0.2578867960961619,-0.2578869647550774,-0.2578869444452461,-0.2579026415008209,-0.2315526178785039,-0.0685495819037189,-0.0641990406713667,-0.0613444722493132,-0.1050706554337268\ngdb_78546,CN1C2C3(O)CC3C12C,-0.0041041519764349,0.0899312998746845,0.1163078719425869,-0.9292623160644006,0.5640963229980308,1.5849266419097408,1.3161782134215465,0.5639784512399711,-0.2777978823882713,0.5957706505201058,0.2121024377741329,0.2121196901097362,0.212119710422472,0.2121082879024409,0.9714096897587088,-0.46716990064960
8,-0.4688090480244594,-0.4704171265802362,-0.4438006232068428\ngdb_46783,N=C1OCC2CC(C2)O1,-0.0038136960645384,0.1976535803439131,0.2183142484577858,1.114067707191237,-0.29449892607096,-0.6200818518913958,0.427735697575504,0.7112882960971177,-0.7633825736579559,0.0451588502505951,-0.68732638800749,-0.6873622283617942,-0.6873622330171658,-0.687276129636829,-0.7285407290140751,0.0203744611270888,0.0167957347205167,0.016637663867762,0.0302497838747286\ngdb_62192,CC(OCCC#N)C#C,-0.0027933174881909,-0.4956558815401567,-0.5085176932955111,0.6177249341219456,0.5860802696313753,-1.383701596670889,0.2210716111556812,0.8628069936644692,2.443448527616704,-0.1277995532509072,0.2402609128602451,0.2403090479669528,0.2403090433173139,0.240173021067705,1.0403332514813242,-0.1328781151409665,-0.1268963729814276,-0.1260409732361396,-0.1518843976569014\ngdb_133563,FC(F)(F)C1=NCCN1,-0.0035960185437937,-0.139422890269195,-0.0714035757499314,0.5068384562521998,-2.3744245436591407,0.0712425324355992,-0.3094372292415957,-0.3388204550988261,-0.267005308320096,-1.483456836508212,-3.815191772488354,-3.815230027228639,-3.81523003190334,-3.815143378322831,-0.9013419444757738,1.6283885107426816,1.626926541754027,1.6251672910791413,1.6529552030695736\ngdb_92492,CC12CCC1CC(O)C2,-0.0039925886548326,-0.0441896005254886,0.0045261885870354,-0.858234854989319,1.1356789354650108,-0.7059325924287356,1.29274207991002,1.6056694970155074,0.0768089335440544,1.7526835510918433,0.5802937299436454,0.5803071530410956,0.5803071483935576,0.5802895725712107,0.9364555977422406,-1.7426804523552049,-1.744756810090388,-1.7446845095534935,-1.7409071754416303\ngdb_76861,OC1C(O)C2C1C1CN21,-0.0036364614222544,0.0224730709002259,0.0724604638557027,-0.8221003692721891,-0.618151473728546,0.6360500359707271,0.6130942080757574,0.309342862272618,-0.5157521036229961,-0.0080059483235416,-0.6852547823490475,-0.6852795529737602,-0.685279557629119,-0.685205523891305,-0.2317987734560849,0.2379817918581843,0.2336412277890805,0.2319537303
376219,0.2666266963050179\ngdb_76877,CC1C2OC(C=C2)C1O,-0.0041075341903756,0.3095178112256507,0.4081523502061418,-0.98120972319843,-0.1906858447468295,-0.0417189682714263,0.0122769671439016,0.0294541570440402,-0.8591588492317981,0.3621039976575192,-0.2864275954935041,-0.2864462426259009,-0.286446247278795,-0.2863842915697832,-0.0976439836745659,-0.4430017968364698,-0.4446273135879129,-0.4439697317055707,-0.4445814221701469\ngdb_19532,OCC1C2CCCC12,-0.003551917126724,0.0740201488998029,0.296398048662468,-0.5228963727102767,-0.1943498358523872,-0.5974895517499906,1.2224336793754411,1.4857171947746874,-0.7334866466296406,0.9285540392207384,1.5612961254824131,1.5612892977374695,1.561289293095994,1.5612999395431753,-0.4090307893142376,-0.5578376551956301,-0.5619544692128665,-0.562910747999833,-0.5530925306510142\ngdb_94951,CC1CNC2C1OC2=O,-0.0034809735313701,-0.0314291100015854,0.045078651977965,0.4624054539237217,-0.5412076605118359,0.3559055142173034,-0.1155564883735145,-0.279896517155967,-0.3968257513111243,0.0364132559119444,-0.6882974508196591,-0.6883196170035288,-0.6883196216589064,-0.6882610839648213,-0.4811543735454025,-0.0816287293291209,-0.0828863440461271,-0.0823413501474668,-0.0821909661039333\ngdb_121098,COCC1OCC(C)=C1,-0.0034367450049529,-0.3505802799727535,-0.3571601645060026,-0.8525500443972931,0.9805699786630742,0.5818285156313546,0.2572910902189492,-0.0189476491233083,1.5282748498078298,0.974385762129477,-0.3165121703350708,-0.3164783608127265,-0.3164783654658061,-0.3165620660413618,0.8495626788562292,-0.9796165605616388,-0.9784058331959238,-0.9788610669815162,-0.9777189332812716\ngdb_84624,CC1C=CC2CC1C2=O,-0.0041567531403033,0.2514484241161322,0.3079531732748735,0.061070894656793,0.5982935733165683,0.7716038368191578,-0.3840067449600884,-0.7386614625682234,-0.6665367991402533,0.3662814464996928,0.6399518491162325,0.6399348942829618,0.6399348896357923,0.6399833763870136,-0.2369680405852804,-0.7231310183476964,-0.7226470847554645,-0.7200286097645228,-0.7501
758777350731\ngdb_94983,OC1COC2C3CC12O3,-0.0038950570998894,0.2357456334516359,0.2146085765836654,-1.132020560513323,-0.9247053962268617,-0.0778666484976742,0.7110998573057765,0.7407502650685472,-0.8446223139543996,-0.3769438043416911,-1.1808704289843734,-1.1809064201528767,-1.180906424811298,-1.1808135086758007,-0.6719249461704982,0.5682853289237554,0.5638347024256108,0.5622571237040108,0.595146435299331\ngdb_12210,OC1CC(C#N)C1O,-0.0039225403514041,0.3416431827177927,0.580000601550824,0.8264293597648258,-1.7503247253458347,-1.483107717293071,0.0911075980463082,0.7828387921705892,-1.300408662825339,-0.8925431736678041,0.2932739629141017,0.2932550637184198,0.2932550590691081,0.2933090829402264,-0.770141021625224,1.1629853520249291,1.1616100886538847,1.1606913704362434,1.1695295091653997\ngdb_83977,CN1C2CCOC2C1=O,-0.0039979825123589,0.1684389392483666,0.0890173327711082,1.1697396454028008,-0.3238108549154199,0.3062024539062127,0.2423771870752506,0.0967958004073068,-0.4965193527291405,0.0187116921130936,-0.6881344462654455,-0.6881479495540674,-0.688147954209444,-0.6881143318107776,-0.4518618598132912,-0.0645062688864828,-0.065012549149709,-0.0645936203901405,-0.065437983967489\ngdb_79038,CC1C2C3C4CCC13C24,-0.00348368151312,0.110868354094013,0.2002787617009826,-1.6423466607859885,1.3726170269577351,-0.9318555938427868,1.7870314412440291,2.1991177291542976,-0.5473098289768483,1.092045562252567,1.5392362423786103,1.5392100978862375,1.5392100932446258,1.539276382735926,-0.4110000339348836,-1.2258469813676354,-1.2297747547387456,-1.2284570657425735,-1.2346616344648826\ngdb_69272,CC12CN1C(=O)OCC2,-0.0039487746154179,0.1334407210524265,0.0783110443269127,1.7863128953609153,-0.3567867748654393,-0.7149695124852993,0.0037547367760738,0.3367004048889457,-0.4672743473508146,-0.0068038047718366,-0.6879676723869305,-0.6879828222924429,-0.6879828269478183,-0.6879389481342476,-0.2273679730596309,-0.0469878679339572,-0.0478197102307915,-0.0475220437804688,-0.0454164747843859\ngdb_22713
,ON=C1CC2(NC12)C#C,-0.0029170004097473,-0.2777236247746743,-0.2260195401528905,-0.717682813800266,0.5714243052091462,0.4191639546132382,-0.3946595329198731,-0.5850383386457707,0.3400003987404718,-1.1334226878406182,-0.1281699435828929,-0.1281761432863105,-0.1281761479382266,-0.1281534181810549,-0.0225665325124317,1.1174679261717873,1.1180522976588052,1.1174433763584086,1.1286515488425386\ngdb_57117,OC(C#N)C1OCCO1,-0.0040304782933576,0.0292668798085484,0.0918011503120116,0.6108639558212245,-1.3753763022104433,-1.5057000174344757,0.1145437315578345,0.8144051874971205,-0.4708363650565532,-0.720396217063702,-1.5846114917206982,-1.584620857406783,-1.5846208620676994,-1.584601274221515,-0.3888460319526145,0.7331014657663942,0.7338865640677206,0.7335483941973947,0.7393833707982123\ngdb_83947,CC1C2CCOC1C2C,-0.004211316209317,0.137715264389726,0.2372989713596841,-0.8575814284844884,1.1356789354650108,0.410127034556676,1.586759027600077,1.378391450664481,-0.2589125756700279,1.772909616349274,0.5806165438632604,0.5806238029751409,0.5806237983276049,0.5806306047073803,0.699900087687123,-1.7087711631600755,-1.711787624651051,-1.7119478581107594,-1.7019755132645222\ngdb_48689,O=CC12CC(C#N)C1N2,-0.0026875514581306,-0.2596467893615449,-0.1954523108267092,-0.7059211367133159,-0.4276239362395521,-1.2933323961052687,-1.2703187032141738,-0.6544844083641397,0.1132137196194144,-1.079266120836325,-0.1309066727248004,-0.130921224874565,-0.1309212045639491,-0.1308934074548729,-0.6982635929716403,0.8299941393156052,0.8322379197011407,0.8336474481595014,0.815858924733159\ngdb_80962,CC1C2CC1(C)C2C=O,-0.0041704588438538,-0.0208027341719322,0.1865239648677324,-0.2967454593884205,1.0623991133538602,-0.1863096891764192,-0.76537655392038,-0.6692153928498544,-0.1512252560429371,0.9800959440000738,0.6112730575020927,0.6112929654124062,0.6112929607650582,0.6112613912714214,0.7530696924445673,-1.1120783268986567,-1.111680873871017,-1.1111961093686278,-1.1171228941751798\ngdb_33640,C#CC12C3CC1N=CN23,-0.
0035343152453496,0.0789892266050536,0.2875628506965847,-0.5663492352815087,0.1231960599626032,-0.4800095910146845,-0.6993292685697148,-0.4671904627600536,-0.8938657342432083,-1.0484912459126867,0.7976185408711841,0.7975791575566143,0.797579152910419,0.7976705254315712,-1.090389428057804,0.7752625747184881,0.7748712771294194,0.7766828358382198,0.7609864253262388\ngdb_83113,CC12CC=CCC1C2O,-0.00419272508563,0.1339395230075915,0.2798594342883145,-0.6057508535227915,0.983012639400112,0.6767161762252556,0.1763299017245856,-0.1431088040743315,-0.4212773138962781,1.0506918240739285,0.6103504467327608,0.6103555967349674,0.6103555920876151,0.6103664252943948,0.6110379241804647,-1.2089919774285018,-1.2092784996047594,-1.208105372145616,-1.2192907234865658\ngdb_82810,OC1C2OC3CN(C3)C12,-0.0037626588982929,0.1880248084245899,0.5003925471516143,-0.6173165026582922,-0.6218154648341037,1.1827836993927303,1.2714365039904505,0.7049750170318122,-1.0323579835103254,0.049035763204843,-0.685413044044868,-0.6854602069408609,-0.6854601866336718,-0.685337498485357,-0.8011566244004011,0.2213575347060266,0.2148317681892003,0.2132795159237776,0.2515606958712185\ngdb_50736,O=C[C-]1CC[NH2+]CC1=O,-0.0040129537828904,-0.0070572676352984,-0.1142652386092168,6.385651377562247,0.8303463433352133,3.0714999912141967,-0.4692290486383659,-1.8918871051641697,-0.0183089138698306,0.0289900194801684,-0.6867487078553289,-0.6867561376721125,-0.6867561423274804,-0.686736848545494,-0.1569674778715309,0.0810556218565706,0.0799011208038509,0.0792979632000104,0.0918132174630805\ngdb_58581,CC(C)C1(C)CCCO1,-0.0041442964242538,-0.016509248988234,0.0615259936458594,-0.9170432404240692,1.5362752963393054,-0.0462374282997068,1.365181038036556,1.3699737452440723,0.1862002309336402,2.4084228049579237,0.5499322720216284,0.5499744108766822,0.5499744062289568,0.5499005438931899,1.8046463198696097,-2.308379799729644,-2.3098359130843824,-2.3106556629866994,-2.2981611158467765\ngdb_38086,C1C2C=CC3C=CC1C23,-0.0039924007540581,0.57
28726156551406,0.3871778823077947,-1.679330600959398,0.9585860320297296,0.6631607961404128,-0.0835981244941605,-0.3935355403314803,-1.0286272480567582,0.4540679793629232,1.5671117685180125,1.567064882699228,1.567064878057788,1.567170749605674,-1.1186973194795915,-0.9212769552954448,-0.9226973964043206,-0.9186679524170446,-0.9621941422152968\ngdb_1728,C1C(CO1)(CO)O,-0.0028992382600654,0.4195131132627196,0.6838050503793276,0.3123787284146256,-2.728610350529706,-0.2586050496289152,0.8772833494784176,0.9869681486154918,-1.6685168305653373,-0.7316663128609329,0.7185984761475351,0.7185729146396499,0.7185729349555152,0.7186426308029028,-1.270821466424507,1.6161327098071707,1.609840264648926,1.6057601472995475,1.6523197352997157\ngdb_77603,CC1C2(C)CC=CC12O,-0.0044068324916595,0.1604896777097253,0.4656267733375133,-0.8958722216675591,0.9915619519797472,0.7670853767908761,0.195504920052198,-0.1641530676253524,-0.4200662636216125,0.9855356435715376,0.6103656738044406,0.610388347599253,0.6103883429519008,0.6103681227168263,1.1008875235661937,-1.207392482655147,-1.205868521575875,-1.2047194449329353,-1.2190969485613663\ngdb_118388,OCCC(CC#N)C#N,-0.0036716817644831,-0.3894868324756189,-0.4209871679930093,1.6407948127351504,-0.2322110772764807,-2.151839801478663,-0.0345952998791509,0.9680283114195728,1.532647928042853,-0.4146911118652157,-0.1618710477981057,-0.1618329484004482,-0.1618329280900233,-0.1619354694892369,0.618422591507888,0.2009626652840407,0.2068760191442605,0.207818675998446,0.1840557811200485\ngdb_37416,C1C2OC3C4NC4C2C13,-0.003395998169357,0.2628387877505316,0.5100400722032039,-1.067396679185581,-0.2419817202246357,0.2564993935951207,1.4759700328183163,1.336302923562439,-1.1820099109282862,0.1132001752770773,0.2407371457676405,0.2406698566501756,0.2406698520005388,0.2408270029836427,-1.4896537748938097,-0.0828532605732226,-0.089329435154059,-0.0887389976781359,-0.0772269084608703\ngdb_54164,CC1(CC1C#N)C(N)=O,-0.0036841163745592,-0.1867080528290714,-0.135340106484449
,1.3489091930273402,-0.090536754528255,-1.1622970552851184,-0.3009149988737679,0.2420012189093515,0.12938059737106,-0.3987927633939228,-0.1625142043584914,-0.1624926836079254,-0.1624926882600549,-0.1625327874504734,0.6319611482748302,0.1334036770291315,0.1381852345978694,0.1396097916754915,0.115866954869083\ngdb_87747,OC1CC11CC(C1)C#N,-0.0028910811558555,-0.3345049663290834,-0.2723769476619006,1.6483745601911846,0.3515848388756906,-1.243629335794178,0.4170829096157194,0.9890725749705936,0.7100745571860961,-0.0377289476394382,0.2399162568296134,0.2399220535698302,0.2399220489201888,0.2399069501015628,0.2253121341114011,-0.1690817615929573,-0.1671897337911461,-0.1660501421219658,-0.1822586171819075\ngdb_111357,COC1C2CN2C1=NC,-0.0042825527088194,0.1510124405615914,-0.0964305518061837,0.652421881528448,0.533563063785051,0.1028717526335666,-0.1176870459654714,-0.1641530676253524,0.0749541106765101,0.2842351490958528,-0.1896903835467689,-0.1896734795979594,-0.1896734842502555,-0.1897091695937766,0.3646361910221164,-0.0977049628626296,-0.0987172710079712,-0.1004994171203547,-0.074636593652409\ngdb_19182,OC1C2C=CC1C2=O,-0.0033129128680755,0.3872425209481878,0.611425794315841,-0.3387607836490253,-1.584223795227225,-0.4935649710995286,-0.4585762606785812,-0.2230770055682109,-1.5524436669466124,-1.2227720073210644,0.7244840638632113,0.7244329978297671,0.7244329931831197,0.724556575364837,-1.631439387580332,1.2600405971740545,1.2578951847351547,1.2587361575833385,1.2481422865405007\ngdb_127857,CN=C1C=NNC(N)=C1,-0.0033104646315139,-0.1730951792172282,-0.2238563770145492,1.0078205575057888,1.3555184017984665,2.244621806038769,-0.8314238392710449,-1.8666339889029449,0.3629914089577734,-0.3292487589278099,-0.0660352036075878,-0.0660317277795245,-0.0660317324310565,-0.0660285559739721,0.1563885723887857,0.4515772750434142,0.4516303110502583,0.450844117836114,0.4663032078296995\ngdb_51851,O=CNC1CNCC1=O,-0.003801316719396,-0.1203800206897333,-0.2074272898879066,-0.5979750781153078,-0.
6779966617859857,0.1887224931709053,-1.0593935016104372,-1.134293617327417,0.192897825088424,-0.3442154461465325,-1.0893530572095362,-1.0893601408963758,-1.0893601455542314,-1.0893516203047344,-0.2618297539209386,0.3652753545918778,0.3655699509188188,0.3653907475111785,0.3699571751680539\ngdb_55369,NC(=O)C1OCC2NC12,-0.0036597334858234,-0.0250393938164344,0.0573457036991916,0.5570869544736698,-0.8673028689064598,0.2610178536234024,0.6067025352998868,0.4755925443256837,-0.4430908744073372,-0.3271750613011191,-1.0885941000171464,-1.0886120133028945,-1.0886120179607452,-1.0885711804176488,-0.3297686933332305,0.4449983694916061,0.4434640374475462,0.4427354415386709,0.4590508962909865\ngdb_120542,C1C(CN1C=O)CCO,-0.0023129436317328,-0.4959084394921389,-0.4766543915404502,1.592571936678655,0.4175366787757276,0.026057932152789,0.3382522787133127,0.3219694204032315,2.2994979732450584,0.6522112902726376,-0.7174984311427716,-0.7174656395885595,-0.7174656442441171,-0.7175790141264486,0.633930392895477,-0.5254282201154267,-0.5244057248654206,-0.5256242568650581,-0.5171700790760938\ngdb_56766,CC(C#N)C12CCC1O2,-0.0039347925871991,-0.0637186441625159,-0.0419133643576078,-0.3349055672705252,0.0352602734292216,-1.1984447355113677,0.3915162185122361,0.942775195158348,-0.0669421621341844,-0.052485259736613,0.2399678042116125,0.2399748992860597,0.2399748946364187,0.2399559506932257,0.1325114813634527,-0.1636670784663449,-0.1616875055235019,-0.1605867214594296,-0.1766647910319476\ngdb_127843,CN=C1C=CN(C)N=N1,-0.0027710457199213,-0.2121469525424834,-0.2192562326190892,1.3174793781449898,1.5680298859208048,1.4538913010895904,-1.2937548367257,-1.952915469462131,0.4218860563469278,-0.3913394733733533,-0.065218383542814,-0.065208213288068,-0.0652082179395949,-0.0652430238195907,-0.0585052468392241,0.5373783700823099,0.5373735848096949,0.5359826382220658,0.5559782537282295\ngdb_69763,OC12CC1N1CC(=O)C21,-0.0037624378385582,0.0772402627875765,0.1436805565496987,-0.4827759853136803,-0.820892314902
7312,0.3513870541890229,-0.9805628707080304,-1.1300847646172127,-0.7016822788054455,-0.7542365580441872,-0.65581061820248,-0.6558405699136848,-0.6558405745688616,-0.6557604867440682,-0.570262692629641,0.7073260008125817,0.7056632779277326,0.7055241688906041,0.7161161316180196\ngdb_51568,O=CCC1NC1C1CN1,-0.0032449204201796,-0.3575698212938623,-0.3181593371214781,0.8680526281225325,0.2111318464959847,0.0125025520679462,-0.697198710977758,-0.6944685091110796,1.0485452363123553,0.3286844069201307,-0.1900420040905126,-0.1900309682621652,-0.190030947951916,-0.1900774852993256,0.1608193727852397,-0.1346401815138187,-0.1359385327089363,-0.1374555738929162,-0.1166829027894267\ngdb_22315,ON=C1COCC(O)C1,-0.003653344859491,-0.1949666978588909,-0.2411981912037832,-1.298448291293666,-0.3665574178135926,-0.2947527298551631,-0.2007787920517919,-0.0610361762253496,0.4083972153892002,-0.0275407810387421,-1.6134328935979656,-1.613437349382485,-1.6134373540435794,-1.6134078808377372,0.3048203856699893,0.3291739709204783,0.3266774880954652,0.3243337993202088,0.3627761044106649\ngdb_45474,O=C1OC2CCC2C=C1,-0.0035231130432945,0.0352967009121246,0.0502811962347352,1.4539801750040933,-0.0856514330541792,-0.9951140342387216,-1.2127936482313366,-0.7344526098580191,-0.5210617083396086,-0.3076402285859194,-0.2575744414274212,-0.2576104545998223,-0.2576104342899892,-0.2575283848167624,-0.93531141418192,-0.0357389620693039,-0.0354091575723106,-0.0327576462318776,-0.0623461340580573\ngdb_14562,C#CC#CC#CC#C,128.5760646402841,-0.5277117995955037,-0.5062084938271553,-1.7619237111699808,5.401785912702542,0.0757609924638797,-1.886049847289728,-1.8982003842294763,2.0761503480474235,-2.71448188704254,2.640368850612294,2.640366247019289,2.640366242384481,2.6404763894383985,-0.9761732400603268,2.0835968568718126,2.0928823657349995,2.097589301576403,2.0261742578973845\ngdb_47102,O=C1COC2OCCC12,-0.0040201382242677,0.2807198907508748,0.1559384810002994,-0.8997927806965424,-1.043174441973223,-0.5161572712409337,
-0.997607331443686,-0.7449747416335295,-0.7606755831827368,-0.3268444718244008,-1.1844875574559215,-1.184523094101959,-1.184523098760403,-1.1844496371450903,-0.9857733075859773,0.1883319008201117,0.1872712445472038,0.1883495945255262,0.1800520492101323\ngdb_34465,N#CC12COC3CN1C23,-0.0033810158458387,0.0672263399914804,0.1320341592310342,0.9028802608300008,-0.7476124927915804,-1.026743254436688,-0.1112953731896006,0.3682668002154771,-0.8149976936434606,-1.0180770140545596,-0.1302811396277042,-0.1303274906466389,-0.1303274952985683,-0.1302190564697571,-1.341221961612606,0.8957019090295496,0.8940567668177847,0.8950277018381766,0.8928417130311284\ngdb_29140,Cc1coc(=O)c(n1)N,-0.0036805849452976,0.0283576711814123,-0.1139914204904395,-0.2817166497773176,-0.116184692267159,1.2415236797603837,-1.2831020487659153,-1.845589725351924,-0.2230060788753311,-1.0829026050802315,-1.0601277385341472,-1.0601318167872982,-1.060131821444973,-1.0601406776804825,-0.1869984583363846,0.8116314148929724,0.8156584609764442,0.8171823447925443,0.7927227685901476\ngdb_114157,OCC1CC2(CC12)C#C,-0.002851108029331,-0.3213845807236055,-0.2767489102917127,-0.5243339110209039,1.1063670066205509,0.468867014924329,0.6088330928918436,0.3851022110562932,0.7815894350939473,0.2829428447777699,0.6420740785474092,0.6420922076925014,0.6420922280078941,0.6420575017878747,0.7060539771266419,-0.5002060230254215,-0.4980303765143655,-0.4969935583151414,-0.5133971672972119\ngdb_38800,C1C2OC3CC4C2C1C34,-0.0037158108140197,0.4647967540531368,0.591473580727596,-0.8618940434163701,0.2599850612367513,0.5321254553202638,1.5079283966976704,1.2437081639379466,-1.2235859674651537,0.4373581839942289,0.6417183891304121,0.6416570855009107,0.6416570808537518,0.6418009164176722,-1.3242372267595333,-0.5375686476574496,-0.5433347416218012,-0.5419809692690033,-0.5426885269461035\ngdb_96418,OC1CCOCC1C#C,-0.0042657576954767,0.140303983397544,0.060421593900124,-1.5675946686333728,0.1573933102811406,-0.6471926120610839,0.6748803782425086
,0.9680283114195728,-0.2887648934499641,0.3316597122106001,-0.2859104242294606,-0.2859097974479248,-0.2859097771382666,-0.2858924134960577,0.4377443975636038,-0.3886766612387216,-0.3887733289527852,-0.3885071084374744,-0.3884294383311137\ngdb_73499,CC12OCC3CCC1C23,-0.0038563826993062,0.2284972202297453,0.2877910324622324,-0.8615019875134717,0.6813440383758723,0.6450869560272882,1.5462784333528952,1.2268727530971306,-0.6589187638823802,1.1040068905920284,0.6101330990063382,0.6101115129308602,0.6101115082835064,0.6101742920524033,-0.2005370151033268,-1.2318227987262116,-1.2346921925669858,-1.2333398205340325,-1.2412243352399326\ngdb_119684,CCCC12OC3CC1C23,-0.0030460053444164,-0.2866894320700441,-0.169384825919103,-0.7849204011473304,0.8755355669704239,0.6857530962818179,1.6336312946231295,1.2942143964603972,0.6916668074828924,1.016220357728798,0.6118783960074033,0.6118813326922969,0.6118813280449541,0.6118672961931948,0.2651893376794853,-1.048491854350924,-1.0504208265075827,-1.0503681334517778,-1.0479537947722968\ngdb_120904,CC1C2CC(CCO)C12,-0.0037121854343709,-0.3261768678624687,-0.2960895967480215,-0.859411022698014,1.3054438566891784,-0.3534927102228161,1.4994061663298428,1.6456535977624462,1.2344039209236908,1.71484608280194,0.581367887519811,0.5814023099907535,0.5814023053432222,0.581322004803096,1.2215037565807705,-1.6298478956466071,-1.630730471612298,-1.6314624069050412,-1.6230464268203468\ngdb_38445,C1C2CC3(CCOC3)N12,-0.0034548055852767,-0.0525555826849005,0.108047692026136,-0.4061943989475389,0.3112809367145575,0.1616117330012197,1.4610561296746176,1.3699737452440723,-0.3698927510026153,0.7302304067782681,0.2106906136428603,0.2106718373052808,0.2106718326554587,0.2106991776633217,-0.367922807858249,-0.6154719094879998,-0.6195576270285555,-0.6201050459323415,-0.6046623074353323\ngdb_127919,CN=c1c(=O)[nH]nc[nH]1,-0.0039340354576078,0.1092077855597296,-0.1121112027415015,1.5649973381748052,-0.1357259781634656,0.6089392758010402,-0.9720406403402028,-1.243723787
7927254,-0.2546074134775631,-1.0297378065060945,-0.963662966358421,-0.96367949800085,-0.9636794776953804,-0.9636347737436438,-0.5444163569836603,1.1283104030370708,1.128204259686873,1.127526316617514,1.1433499468451638\ngdb_101818,CCC(C)COC(C)C,-0.0033751522363762,-0.4995452740006832,-0.4735967558807694,-1.1500551320466463,2.3093774196119523,-0.272160429713758,1.5739756820483355,1.6814288457991826,3.178623960381286,3.0119289215024367,0.520187760126777,0.5202840050230956,0.5202840253377355,0.5200547153553549,3.0019470492224647,-2.809273387557408,-2.8080357823594158,-2.8102167059291,-2.7934042305564084\ngdb_25267,c1c2n(cn1)C(=O)CC2,-0.0036920358395543,0.2206931795134938,0.0146939680643019,0.5672804079490263,-0.3677787481821107,0.1345009728315341,-1.087090750305877,-1.136398043682519,-0.6907445788779093,-0.9708628260613604,-0.1333643969811871,-0.1334082685938233,-0.1334082732457717,-0.1333203221384528,-1.508115443212367,0.5718278165283499,0.5732901140636326,0.5765234389720783,0.5388063757976042\ngdb_18198,C#CC1OCC2OC12,-0.002956840900431,0.232626542744655,0.3062189918559501,-0.2143483771292871,-1.4816320442716124,-0.7917833329660756,0.4490412734950734,0.8101963347869169,-1.237959504240413,-1.2631940842471323,0.7256094692724242,0.7255719141278194,0.725571909481179,0.7256582774091868,-1.477345996014771,1.3782563715378948,1.3764776905033644,1.376482292551785,1.3739107618886726\ngdb_81526,CC1(C=O)C2CC1C2O,-0.0041656618476111,0.3055905350723267,0.0993585303902672,0.8094402706392311,0.1573933102811406,0.3920531944435513,-0.2412593862989737,-0.4208930829478075,-0.5197023151298821,0.2800877538424718,-0.2853303227232788,-0.2853319144386124,-0.2853319190914996,-0.2853067777950843,0.3550361234964661,-0.3277411546194678,-0.3286048904095732,-0.3287656228838408,-0.3215742395060665\ngdb_106999,OCC12OC1C(O)C2=O,-0.0037835269372472,-0.0430025781511719,-0.063499359387891,-0.5074101645457924,-1.6697169210235676,-0.0100897480734589,-1.2149242058232936,-1.1953219816253773,-0.29294494714298
04,-1.1464959989654062,-2.079628290077987,-2.079637340337789,-2.0796373450017662,-2.07961121714403,-0.1759214573452505,1.12630972350867,1.1276324645905662,1.126955973695256,1.136174575349826\ngdb_94479,NC1COC1=NCC#C,-0.0034770276151059,-0.3418165190389695,-0.3272318441236352,-0.7490472860321327,0.3418141959275372,-0.2089019893178244,0.0527575613910833,0.1494064592848591,1.0874375368096143,-0.409822430480812,-0.1599085029202693,-0.1598865435673956,-0.1598865482195076,-0.1599688059055484,0.4197750404002076,0.4071139424667213,0.4095331966189662,0.4090439173301647,0.4085668290140357\ngdb_33961,N#CC12OC1CC1CC21,-0.0035759021079373,0.0213933856555019,0.1071440922341706,1.139289970277697,-0.3006055779135556,-1.9756198603757027,-0.2753483077702847,0.6481555054440549,-0.614769046785421,-0.687096840681484,0.2697673332638112,0.2697325542466485,0.2697325495971914,0.269815733041932,-0.9131574121996512,0.3430056675257138,0.343514254613086,0.3459294082498736,0.3201712675480721\ngdb_289,CC(=O)N(C)C,-0.0027075739435998,1.2546275512360223,0.8772940603780486,0.5406859492024227,-2.467245651666599,0.2610178536234024,0.5172191164376955,0.3893110637664981,-2.0274477536914737,-0.5608116605749129,3.090849334983085,3.090847121169916,3.090847116537892,3.090853433272873,-1.4258994803003906,1.5843840496173611,1.5794519543492442,1.575581006426758,1.6046796000136456\ngdb_55665,NC(=O)CC(C=O)C#N,-0.0042633647238487,-0.1751850962698813,-0.2419557546657339,0.6558850420040501,-0.969894619862072,-1.5057000174344757,-1.3107992974613556,-0.5934560440661792,0.503925418681278,-1.1489904468351932,-1.059808893638134,-1.0597843131426905,-1.0597843178003634,-1.059866793578736,0.2883279619720778,0.8451237865979974,0.8518400952299924,0.8531087874208073,0.8239889226973218\ngdb_89329,CC1CC2(C)NC12C#N,-0.0043445820703721,0.2145939049731225,-0.013691843582286,0.5958351462101219,0.846223638125964,-0.2043835292895426,-0.0026369359997969,0.0925869476971031,-0.1031464171575704,0.3128161120376296,0.7360512464588784,0.736
068291543407,0.7360682868968316,0.7360431328074002,0.5667299202159265,-0.4448320383963835,-0.443306986729168,-0.4426587172055378,-0.4504801588641711\ngdb_59280,CC(O)C1C=CC2OC12,-0.0037124230735857,-0.0553716038495025,-0.0796089920426267,0.1999240269332866,0.0670148630107195,-0.1817912291481375,-0.3158289020174665,-0.2251814319233128,-0.048711709047342,0.3049120181851721,-0.2861699584331605,-0.2861714299239776,-0.28617143457687,-0.2861594579838889,0.260020070550289,-0.4159388696958173,-0.4160141671545735,-0.4155583958180613,-0.4189147934754382\ngdb_47547,O=C1CC2CN2CCO1,-0.0038409306238516,0.0653132135052149,-0.0458928543505056,0.8196337241145878,-0.3323601674950551,-0.2224573694026672,-0.0835981244941605,0.0210364516236316,-0.3114577879857217,0.0269764290310628,-0.687474415114149,-0.6874981744036396,-0.6874981790590121,-0.6874417531343768,-0.5188161769152606,0.0048252742324312,0.0026412070178855,0.002582969050487,0.0113424805115218\ngdb_109674,CCC1(O)C(O)C1OC,-0.0043057474014812,-0.1127527705398695,0.0185274217271852,-0.9868291911399728,-0.0111502805745087,0.6044208157727597,1.2629142736226229,0.9659238850644708,0.2904931782442541,0.9122649940951412,-1.2426849261685609,-1.2426339361288403,-1.242633940787643,-1.2427258436096178,1.821877210300264,-0.6777761640940517,-0.676900877981873,-0.6794826475401011,-0.6488486896428477\ngdb_101200,OCC(O)C1C2CCC12,-0.0036142836043721,-0.2659039126219051,-0.1846182405937511,0.0480023645601819,0.3711261247719989,-0.380603470392503,1.1691697395765177,1.3320940708522353,0.6180468897085992,1.0304657588164978,-0.3155444524861889,-0.3155304080145122,-0.315530412667586,-0.3155707713413463,0.7380542022121406,-0.8779647345310518,-0.8797062023674805,-0.8808575716296975,-0.8645543769648197\ngdb_15763,CC#CC(=O)C1CO1,-0.0037053712680492,-0.0169764811994012,-0.0898224078730229,0.3098957076962695,-0.8111216719545778,-0.8505233133337274,-1.4748522320420396,-1.0606386948988438,-0.2198633533691738,-1.3248039412719947,0.7249324636805681,0.724939812462
0817,0.7249398078154374,0.7248763897206134,-0.7839257339697473,1.3071417850659115,1.3106640749184308,1.3111328650521468,1.2846517621526166\ngdb_71579,CC12CC3N1CC3(O)C2,-0.003703923326787,0.0561074761554619,0.3306161862390143,-1.377382213077197,0.3637981425608835,0.6315315759424454,1.29274207991002,0.9806548695501862,-0.6518662190420167,0.7050755529588487,0.210656839498626,0.210636615148645,0.2106366104988213,0.2107062668981833,0.0618648305977716,-0.6190196413377536,-0.6232249128503642,-0.6237464661282976,-0.603853012159045\ngdb_14829,CN=C1CCN1C=O,-0.0040402104481772,0.5219443046379199,0.0749887178190805,0.5851189515309008,-0.5070104101932984,-0.3534927102228161,-0.7206348444892842,-0.5450542378988307,-0.7697409450528058,-0.5472875456182358,0.7890144964996172,0.7890087656089961,0.7890087859252969,0.7890181142835875,-0.80731051383992,0.845800294223867,0.8446224816734262,0.8435032835445865,0.8525450774834114\ngdb_21067,Cc1cnn(c1N)C,-0.0035309219784222,0.3744567746290864,0.1476144101894671,0.305779120715837,-0.3775493911302657,1.819886563380354,0.6045719777079298,-0.2504345481845381,-0.8630200549776474,-0.159896786081421,1.284974499616624,1.2849839601968365,1.284983955553653,1.2849674307493892,-0.295553068049504,0.5516689381393352,0.5506964105540619,0.5492089163966029,0.5629911962416956\ngdb_59727,CC(C)C1CCN=CO1,-0.0036815962935838,-0.1849906587555921,-0.1122298572596384,-0.2785802025541309,0.9329380942908256,-0.1908281492046998,0.3680800850007099,0.4524438544195609,0.4263853154399663,1.3980512033389791,0.1789213006131738,0.178942290749114,0.1789422860990958,0.1789025115097638,0.8219932541671827,-1.329056348210728,-1.3300702136996096,-1.330483930396658,-1.322609803824136\ngdb_23790,FC1=CC(=N)C=NN1,-0.0032238589539575,0.2913841502733253,0.1335310316136839,0.2147568085929402,-1.5939944381753783,1.0381929784877373,-1.1105268838174036,-1.5804320046090603,-1.2138432324337824,-1.898647165678197,-0.1813012405851898,-0.1813560581218922,-0.1813560627741368,-0.1812270248929339,-1
.91304136833273,2.2068811941813826,2.203901581820239,2.2029478979618675,2.217931644169258\ngdb_102583,OCC(CO)C1CC1O,-0.0040565191301036,-0.2331913438914043,-0.2723404385793969,0.5361119636686088,-0.047790191630084,-0.1456435489218895,1.156386394024776,1.210037342256313,0.8635548046871946,0.9921173795171204,-1.2428131830444489,-1.2427804912540088,-1.242780495912814,-1.242842266818745,1.4056281286111842,-0.6912486299063441,-0.6921600098474102,-0.6946341556694101,-0.6621393698065244\ngdb_121421,C1COCCN1CCO,-0.0036417005379664,-0.2764545210659635,-0.2069891808978627,-0.8112534892920019,0.5250137512054158,0.4191639546132382,1.4163144202435225,1.2058284895461098,0.7363509165989267,1.4363394754607712,-0.7470450659930211,-0.7470405437279313,-0.7470405234211227,-0.7470491631408347,0.5263604054926803,-1.0055362422573115,-1.0105797036380844,-1.0132442287613166,-0.9695262434290908\ngdb_708,C1C2C1C1NC21,-0.001248590747663,1.6690877783364917,2.564342253789227,-0.6834432649471448,-2.8250954496427214,1.5397420416269303,1.4653172448585317,0.7281237069379349,-2.7279266675774565,-1.1708995130650102,4.051014506386713,4.050939456587183,4.050939451961092,4.051111687725091,-3.1635117024430324,2.2296490845371904,2.2182718322178805,2.2147728358342054,2.25607110885896\ngdb_122713,C(CN=CNC=O)CO,-0.0035729564869727,-0.5241886161653513,-0.5736042601288938,-0.1027431301042279,0.5323417334165312,-0.4664542109298429,-0.6737625774662318,-0.4482506255641345,3.312235190905616,0.2906365635086796,-1.1188041858692157,-1.1187658488786365,-1.118765853536674,-1.1188684484359874,0.9012553501481908,-0.1048004265621361,-0.1029875407499365,-0.1047395683478211,-0.0877277996279268\ngdb_97216,CN=C1OCOC1C#N,-0.0040786527360389,0.0206546536459537,-0.0855325906788441,0.6120401235299197,-0.8318842882194043,-1.6051061380566598,-0.4607068182705382,0.292507451431802,-0.1410864632475964,-1.084916195529337,-1.0586256503189482,-1.0586287717870062,-1.0586287764446718,-1.0586318937977002,-0.554754891242052,0.9694150189784252
,0.9721535806988396,0.9725736933676418,0.9649630304115948\ngdb_90692,NC(=O)N1C2CC(O)C12,-0.0035132924645811,-0.1808613362406824,-0.1082503672667404,0.5536891366485508,-0.699980608419332,-0.0688297284411119,0.3574272970409252,0.3851022110562932,0.0488532606157641,-0.3718948014245316,-1.088058531455025,-1.0880668561953184,-1.088066860853166,-1.0880479000289385,0.0566955634685753,0.5012560094302764,0.5002250970304926,0.499096161354033,0.5187877162462082\ngdb_27268,c1c2c(c(o1)N)OCO2,-0.0035359566138797,0.1130466664298599,-0.0260684225510235,-0.7843976599434659,-1.2275953276196212,2.470544807452821,0.5662219410527048,-0.5913516177110768,-0.6152884257845616,-1.3822062958658905,-1.5542614915460609,-1.5542980503368702,-1.554298054997599,-1.5542040579764722,-1.0133427322750228,1.2975972588769484,1.2979312377511223,1.2984924142230958,1.297571990253866\ngdb_43962,O=C1CC23CC12N=CO3,-0.0036548922776338,0.2129585922340374,0.3770922482661613,0.4518852871959499,-1.217824684671468,-0.23601274948751,-0.8356849544549588,-0.7155127726621006,-1.1553485747900092,-1.412169723892128,-0.6267437359598007,-0.6267967440042538,-0.6267967236967023,-0.6266714589897517,-1.442638059575883,1.1370394491030458,1.136542071818477,1.1382441182289986,1.1249641260007903\ngdb_128357,Cc1c(onc1NC)C,-0.0037020664250157,-0.1184858360498665,-0.2128580159103246,0.4656072437973915,0.642261466583259,1.2370052197321015,0.1273270771095761,-0.450355051919237,0.4874529187381208,0.29328127932243,-0.1915507572698735,-0.1915175130175697,-0.1915175176698773,-0.1915852706983242,0.8042700525813685,-0.2931238923275183,-0.2907156680733769,-0.2911436359124432,-0.2888091793598495\ngdb_63348,CC1(O)CC(=O)NC1=O,-0.0039857247500706,0.028660740723791,0.0294436373957768,-0.9095288356185176,-1.1457661929288352,-0.9634848140407528,-0.8186404937193034,-0.3598647186498465,-0.4170418550106487,-0.7410430325642288,-1.5867614543922606,-1.5867716056659356,-1.5867716103268652,-1.5867367065266664,-0.0090279757454896,0.507263292258968,0.509953410
8731562,0.5111946573696668,0.495605966003149\ngdb_24270,C1C2OC3=C(NC=N3)C12,-0.0031561041452764,0.1507156849680123,0.1458163378761623,1.2422699874389926,-0.579068901935931,2.240103346010488,0.2871188965063463,-0.7597057261192444,-0.9127127935059056,-0.966655323630394,-0.1314489312134961,-0.131505448371324,-0.1315054530232607,-0.1313733786094834,-1.5098385322554326,0.773033770526108,0.7714093177285566,0.773245293881442,0.7610662150017755\ngdb_102180,CCC(CC#N)C1CO1,-0.0042264477481562,-0.1949351281148932,-0.3096983572512571,1.0057949353408142,0.5103577867831868,-1.8807321997818016,0.4788690797824704,1.3489294816930513,0.9632019372339864,0.6325261896134747,0.2095880738035803,0.2096240842361718,0.2096240795863432,0.2095311263715762,0.8564550350284904,-0.7312858195688101,-0.7286483341980683,-0.7284263286289028,-0.7380051026537991\ngdb_1935,OC12CC1(O)CC2,-0.0034428904655771,0.9316943259339202,0.949609425547154,-0.4276267883059812,-2.177790354327551,0.8303438171868108,1.3140476558295893,0.9112087998318168,-1.8865341187807363,-0.7403518000219987,1.6458006818762425,1.6457740463205606,1.645774041679607,1.645850173393985,-1.217898017244642,1.422428648508707,1.4171972985658654,1.414473324943467,1.4426039728718614\ngdb_52476,CC#CC#CC(O)C#N,-0.0029092301600731,-0.5396262209802662,-0.5669961161957331,1.3040187921454804,1.8990104157895047,-0.7059325924287356,-1.136093574920887,-0.7933765478008776,3.2346829342672154,-1.5132399430016945,0.3005550992680431,0.3005966492290424,0.300596644579776,0.3004129455887595,0.4254366186845661,0.953485624309356,0.9639171322429853,0.9668341338241938,0.9011911326017664\ngdb_24970,N=C1NC2=C(CCC2)O1,-0.002933071452459,-0.0975298399841394,-0.1139184023254321,0.5891701958608501,0.2331157931293292,2.5383217078770364,0.6812720510183795,-0.5092789898620955,-0.1130206948397903,-0.2878649671603786,-0.1621678758462147,-0.1621921345179859,-0.1621921391701121,-0.1621318712494025,-0.6367246985764483,0.1697830056943504,0.1694780207771645,0.170681867621549,0.161634
8824224304\ngdb_24551,c1cnc-2cncoc12,-0.0031896333805348,0.2177129956801033,0.0883966783685461,-0.1693272909464618,0.0083910053217996,0.0531686923224746,-1.988316611703661,-1.9886907174988664,-0.98311992501435,-1.6607730103846443,-0.1025221380302723,-0.1025893807876663,-0.1025893604768753,-0.1024381672819777,-2.1077504301991183,1.1880318669025405,1.1889860784243051,1.1927570302057324,1.152354781603977\ngdb_94187,CC1(CCC1O)C1CC1,-0.0039740306901057,-0.1110480043639892,-0.0431546731627319,-0.9105743180262468,1.254147981211372,-0.4574172908732794,1.4227060930193929,1.6182960551461198,0.2686109867849252,1.6960926433953485,0.5810878841377878,0.5811216560523651,0.5811216514048321,0.5810472220659443,1.4162128184471572,-1.659260244617869,-1.6599518001017908,-1.6604776352984718,-1.6544151676526266\ngdb_56880,OC(C#N)C1CC(=O)O1,-0.0035217866848864,-0.2326799140386403,-0.2566689149147051,1.14321052930668,-1.5255999375383034,-2.481687383543176,-0.5395374491729448,0.6229023891828303,0.2890047445536672,-1.466416451662799,-1.5552344265391738,-1.555243981168615,-1.5552439858293512,-1.5552270794987002,-0.4683542835112027,1.1953974092272937,1.1994421314804502,1.200697958584865,1.1807855526144833\ngdb_12366,CC1COC=NC1C,-0.0037829245494702,0.4244948188655693,0.3731310128145152,-0.4345531092571852,-0.4776984813488385,-0.1998650692612621,0.2935105692822171,0.3829977847011913,-1.027619588456911,0.5465128184890088,1.15974027234423,1.159733047582317,1.15973304293836,1.1597622322409311,-0.3871229429095496,-0.1634809077632164,-0.1671949319279716,-0.1684966809921045,-0.1519926836440583\ngdb_95172,OC1CCC2CC2(O)C1,-0.0040182371105494,-0.0145519248603717,-0.0754652111784624,-0.1140474086377966,0.3002889633978861,-0.0643112684128314,1.2650448312145797,1.279483411974683,0.031607232693157,1.0450718029697088,-0.3165247264286032,-0.3165221700862187,-0.3165221747392986,-0.3165043536786892,0.7705467384528032,-0.9809354882190122,-0.982967198623696,-0.9833902607759186,-0.9711305858249464\ngdb_90599,CC1CN2C
1C(O)C2=O,-0.0041121101268836,0.0621057275150403,0.0595088668375328,0.2676843554842154,-0.373885400024708,0.373979354330428,-0.5374068915809878,-0.7049906408865901,-0.3043712851240787,-0.0400430739764702,-0.6869872237076271,-0.6869876653140726,-0.6869876699694419,-0.6869808530200266,0.0547263188479293,0.0560012406120571,0.0557947593572723,0.0553616256256037,0.0639580719652118\ngdb_41069,C1CC23C4C5C4N(C12)C35,-0.0032148839287293,0.2879935597679636,0.4807141516821467,-0.7229102258389105,0.3076169456089998,1.368040560552252,1.2309559097432687,0.5787094357256866,-1.1811809777562232,0.0472926550548712,1.1700178218182524,1.1699598793938208,1.169959874749927,1.1700941180744435,-1.336791161216152,-0.0582289074317091,-0.0644797400826956,-0.0640645692631589,-0.0518252954726856\ngdb_40101,C1OC11CC2CCOC12,-0.0036824307940822,0.0528115948820936,0.1223044887438114,-0.1666482422766564,-0.1503819425856964,-0.0010528280168966,1.0328140536912738,1.018534543942023,-0.5180004820845208,0.3688660551358579,-0.2859267496472945,-0.2859553291372979,-0.2859553337901889,-0.2858871215320056,-0.5567241358626989,-0.3903915294055654,-0.3935140301148943,-0.3932169538362123,-0.3878253165054919\ngdb_70707,CC12OC1COCC2=O,-0.0041171668683145,0.2933982999403837,0.0939095498265974,-0.1660601584223088,-0.8294416274823646,-0.1501620089501701,-1.0018684466276,-0.9196421291070032,-0.6005081076596657,-0.3978611021413516,-1.1836938276227935,-1.1837093649297097,-1.1837093446255993,-1.1836597116620624,-0.286691467256596,0.2717075324727641,0.2719956834081808,0.2724790466644063,0.2702286302087241\ngdb_44745,N=C1OC2CC(C2)C1=O,-0.0039474316775297,0.2997501324327373,0.1885319644054331,1.5389256206320665,-0.5289943568266446,-0.7330433525984213,-1.5238550566570492,-1.161651159943744,-0.8510689762045749,-0.6732421362480877,-0.6578133276021746,-0.6578549228051096,-0.6578549274602989,-0.6577569800307814,-1.0177735326714767,0.496955728399855,0.4959314356705367,0.4972715801502474,0.4881997743772314\ngdb_77512,CC12OC(C)(C=C1
)C2O,-0.0040999407884891,0.1477165592882231,0.3011716111998204,-0.633390794677124,0.0596868807996058,0.3784978143587085,-0.1027731428217729,-0.2777920908008652,-0.5595654575828658,0.2426710357956646,-0.2857439498998973,-0.285733387114153,-0.2857333668044937,-0.2857254170833085,0.7720236719182878,-0.3711897257579837,-0.370405711018233,-0.3702690386112391,-0.3693654052494494\ngdb_80083,CC1C2C=CNC(=O)N12,-0.0042017553757919,0.4065884600700281,0.2474028599425693,0.7489983189424047,-0.1259553352153123,1.0833775787705475,-0.3946595329198731,-0.894389012845778,-0.7983786380295207,-0.3263335608149258,-0.1619822553461949,-0.1620002474038344,-0.1620002520559595,-0.1619521691305109,-0.4110000339348836,0.1892811091938073,0.1894570612558541,0.190519994514753,0.1821493778122902\ngdb_103687,OCC1(O)CC(C1)C=O,-0.0032673579832502,-0.3589588900297646,-0.3037017404500326,0.8520436787541834,-0.2163337824857317,-0.2314942894592295,-0.6801542502421024,-0.5639940750947497,0.9597224911378838,0.255053114378223,-1.213474485091075,-1.2134560363687383,-1.2134560410273607,-1.2134903632463918,0.8217470985896025,-0.2329828888497841,-0.2320624865353787,-0.2329041394119958,-0.2232819086986682\ngdb_32275,[NH3+]CC1=NC=NC=C1[O-],-0.0036842711163735,0.0306748903908494,-0.0889644444341872,1.112826196832059,-0.1357259781634656,0.3965716544718331,-0.9251683733171502,-1.0985183692906813,-0.2849247774225932,-0.6197467481972326,-0.5635610739039613,-0.5635904715881748,-0.5635904762427816,-0.5635181584859164,-0.7775256889526474,0.5812255038528744,0.5806792661490899,0.5814216781867648,0.5800205926087655\ngdb_233,CC#CC(C)C,-0.0012987160425029,0.1911060154387733,0.3239258968702203,-1.6926605016579417,-1.21538202393443,-0.0191266681300211,1.1116446845936805,1.1048160245012086,-1.1733030742723525,-0.2313041130526754,4.41979006878531,4.4198067495779645,4.419806769916701,4.419740043507813,-0.9894656412496888,1.0155119339220038,1.0131071049375175,1.0107944113705496,1.0100014523641925\ngdb_119488,CCCC12NC1C1NC21,-0.0031
974975805963,-0.296892773330127,-0.2293144848488449,0.0968786671215075,0.9427087372389792,1.9554403642287836,1.3076559830537184,0.3808933583460894,0.8616767473840384,1.0105402294469938,0.7084488091132111,0.708460186639619,0.7084601819928729,0.708428366359059,0.56131449750915,-0.7207160434517706,-0.7247003489645163,-0.7269449854733538,-0.6910289314487539\ngdb_96758,CN=C1C(O)C(C)N1C,-0.0042229107924013,0.0265771376199375,-0.1375215241640422,0.1079869177036272,1.0147672289816116,1.1918206194492915,0.5129580012537815,-0.0484096180947373,0.3508458758312717,0.9501926231514216,-0.220914118086885,-0.2208542494769476,-0.2208542291668875,-0.2209987557580089,1.5050749819538154,-0.7539802904923988,-0.75209193314624,-0.7541407941374083,-0.7346966807686895\ngdb_77027,OC1C(CC#C)C1C#C,-0.0037442777813538,-0.2537874448755566,-0.1921482388601289,-0.6876905372285433,0.7301972531166406,-0.4619357509015599,0.3979078912881069,0.6081714046971161,0.4657011975710749,-0.4880819756967829,0.6725551313881443,0.6725902768189308,0.6725902721719631,0.6725049916914037,0.989871358077267,0.078055913626303,0.0842623579475016,0.0860672368881693,0.0505306095013938\ngdb_105526,CCC1(CCC1)C2CC2,-0.0041222733481857,-0.089908903783075,-0.0786962649800355,-1.7357213083262752,2.0785459799618256,-0.9363740538710672,1.5398867605770246,1.9571086983175567,0.504155618320264,2.427356565897271,1.4780431844558546,1.4780817080487845,1.478081703406793,1.4779842310419835,1.5033518929107488,-2.4066307327745,-2.408130089208171,-2.4082565633345725,-2.4078576709279567\ngdb_69482,OC12CC1C(CC=O)C2,-0.0030894712147489,-0.3447461912819635,-0.2894449437323572,-0.413447433151158,0.1464013369644675,-0.2947527298551631,-0.6780236926501455,-0.5303232534131165,0.8554853064727391,0.2665035317082094,-0.2859538588273685,-0.2859421988365252,-0.2859421785268672,-0.285979406395674,0.4296212635034381,-0.3932391545270713,-0.3921469200209964,-0.391856905329289,-0.3983604032480373\ngdb_376,CNC(C)C#N,-0.0018134978466647,0.5683897120074555,0.6393
643697017594,1.1117807144243304,-2.578386715201846,-0.1094958686956416,0.3766023153685376,0.4208774590930295,-1.8197193884707537,-0.8782677189913116,3.617765825051019,3.61775413413558,3.617754129506812,3.6177833051719888,-1.6767320138551929,1.9184555768293443,1.9129800340251053,1.9091954856268896,1.939907370977361\ngdb_131984,c1c(n[nH]c1N(=O)=O)N,-0.0027859009340922,-0.2003903798777098,-0.216782742279467,0.5569562691727035,-0.8782948422231329,0.3784978143587085,-2.19071958293957,-2.3401299188009155,0.0121671591430718,-1.737409661805814,-1.8585982084000443,-1.858632461207602,-1.8586324409076624,-1.858545284995284,-0.9673116392674196,2.0878735387986027,2.0853580620813235,2.082806838867378,2.128134064127306\ngdb_46158,O=C1COC(CO1)C#N,-0.0027404344731608,-0.2313287289955352,-0.2206161959423502,-1.1694618992401138,-1.4584267672697482,-1.7813260791596175,-0.6141069648914375,0.2230613817134324,0.0574628700949836,-1.421696711539387,-1.5553244909237989,-1.555350346589898,-1.5553503512506337,-1.5552912570438686,-0.9451576372851508,1.185936790961285,1.188367500092432,1.189701437355264,1.1734591506636478\ngdb_52538,CC#CC(=N)OC(C)=O,-0.003324872199722,-0.3999174758924858,-0.4359832736313837,0.7970905096979336,0.4419632861461117,-1.0222247944084073,-1.061524059202394,-0.5724117805151583,1.821174579283389,-0.8457797895064941,-0.6579580596703715,-0.6579134849450596,-0.6579134896002492,-0.6580759955995304,0.6201456805509529,0.4817526616856611,0.489834020689423,0.491217170667815,0.4517814869647558\ngdb_99039,CC(=NC)NC(=O)OC,-0.0027619104263854,-0.3988567324941603,-0.4101713523013029,-1.455401337753966,0.5286777423109735,0.0215394721245085,-0.1773426585402656,-0.1851973311763733,1.734676350028679,0.2454660195533782,-1.119589054045874,-1.1195317248444552,-1.1195317295024974,-1.1197314379698535,1.180888086279944,-0.1872452046991468,-0.1827295650671844,-0.1839191665279168,-0.1862452513047602\ngdb_13715,CCCCC1CN1C,-0.0028747337884755,-0.3512116748527091,-0.3147457379073868,-0.934293700151
596,0.9610286927667676,0.9161945577241496,1.5505395485368092,1.1048160245012086,1.3387640693711518,1.8756027292536412,2.0279668811853893,2.028010756341351,2.0280107517027592,2.0279016958410816,0.9677173560949972,-1.3050403275067686,-1.3086149022449998,-1.3116213221668624,-1.2810080371621213\ngdb_61331,OC(CC#CC#N)C=O,-0.0013989334732225,-0.5293849960273861,-0.534521287308736,1.6492240146474646,0.3381502048219795,-2.011767540601949,-1.626121821070982,-0.6692153928498544,2.4498934026026022,-1.5041637591863246,-0.6273525941652877,-0.6273374328155589,-0.6273374374705596,-0.6273907167830195,-0.0250280882882398,1.0730832572720543,1.080246245518435,1.0823427696360886,1.0428548510787958\ngdb_78463,CC1C2C1C1OC2C=C1,-0.0035086667896328,0.0290143218565662,0.2666796555044967,-0.6522094780162441,0.2868543293441751,0.3197578339910555,-0.0409869726550217,-0.1894061838865771,-0.6294717976406589,0.3852452610278331,0.641141208226503,0.6411067611456911,0.6411067564985288,0.6411954857756961,-0.5838012493965832,-0.5981973659353468,-0.6006338084094139,-0.5988759016857572,-0.611803483354727\ngdb_111626,CC1=CCC2C(CO)C12,-0.0042544339105674,0.0580900560785225,0.0015415710923618,-0.5770000873102469,1.046521818563111,0.730937696564628,0.2274632839315521,-0.1157512614580038,0.0177312702928493,1.0368972268181174,0.6104558630011295,0.6104713231121242,0.6104713184647725,0.6104421852514496,0.6674075514464604,-1.1979187537762417,-1.1972292174843997,-1.1961410744642198,-1.2106420926339156\ngdb_16234,CC1CC=CC11CN1,-0.0038491319400085,0.5454385081210689,0.298908048084594,-0.6197341807261653,0.1989185428107918,0.7219007765080658,0.2381160718913368,-0.1031247033273915,-0.9986319491345944,0.5547775554069779,2.0875701579375168,2.0875554471827,2.087555442544476,2.0875938271965477,-0.5345701338804297,-0.2912516968059193,-0.2951522782074097,-0.295551535058221,-0.2904676647495101\ngdb_47029,N=C1CC2CCC2CO1,-0.0034673451987267,-0.0603533094523524,-0.0702626669216922,0.4621440833217895,0.5653176533665489,-0.38964
0390449064,0.4895218677422551,0.6649909162848722,-0.0849374112420625,0.7746195574249616,0.2092854045509243,0.2092656969202014,0.2092656922703706,0.2093062927856819,-0.3349379604624269,-0.7630790558424679,-0.7659631623693381,-0.7654779726780414,-0.7636717313485069\ngdb_108781,C1C(=CC(=O)CN1)CO,-0.0039684489318049,-0.1197928234513747,-0.2438542269559237,1.506254295390538,0.1842625783885627,1.2867082800431937,-1.176574169168069,-1.7635170975029422,0.4290747907252629,-0.006082518640814,-0.6878353466377908,-0.6878319236837469,-0.6878319283391229,-0.6878375021818697,0.1364499706047446,-0.0330879961416611,-0.0321083404254477,-0.0319214873779379,-0.0338355733724688\ngdb_113800,OCC1CC1C1(O)CC1,-0.0035832081321687,-0.2945124146326943,-0.2298895028982775,-0.5236804845160733,0.5653176533665489,-0.3354188701096928,1.2437392552950106,1.3868091560848894,0.814817895094726,0.9749567803165364,-0.3154823709660765,-0.3154516012473246,-0.3154516059003979,-0.3155177019282655,1.270242560941763,-0.8714435156764058,-0.8715009427354748,-0.8727101842741818,-0.8584960609215396\ngdb_118118,CCCC(C)C1CN1C,-0.003677595112386,-0.4130631172931618,-0.4087566253542865,-0.9212905127054678,2.2654095263452616,0.9433053178938364,1.527103415025283,1.0690407764644736,2.277976809169854,2.7107919618004286,1.0472426168476991,1.047318476763946,1.0473184721192943,1.047143196403441,2.361450236357307,-2.460667434888528,-2.461236858311992,-2.463427563298528,-2.4400699026115595\ngdb_26325,CC(O)C1=CNC(C)=C1,-0.0033228439766562,-0.2687893872233023,-0.2912430160456619,0.0922393389372106,1.009881907507534,1.9961065044833133,0.7004470693459919,-0.2378079900539251,0.9262377374396626,0.6663665305939594,0.2084455941041548,0.2084757320945992,0.2084757274447636,0.2084100787039265,1.0420563405243908,-0.8512951257768715,-0.848213287904063,-0.8471479822607586,-0.8659820422225396\ngdb_122768,COCCCC(=O)C#C,-0.0021551014546741,-0.5507892824578816,-0.5744074599439741,0.6354981350533366,0.7057706457462565,-0.7104510524570162,-1.44928
55409385563,-1.1006227956457832,3.4408227789977897,0.195186365503332,-0.2858324915773552,-0.2857773711254892,-0.2857773757783792,-0.2859284088364439,1.1912266205383348,-0.3804903945466572,-0.3749852699263421,-0.3748188783032798,-0.3925386065978427\ngdb_57662,CC(C)(OC=O)C1CO1,-0.0041537467279116,-0.1132515724950343,-0.0825753549960483,1.1366109216078917,-0.3128188815987485,-1.329480076331515,-0.1155564883735145,0.5050545132971133,0.2022988351898723,0.2142703943866429,-1.2136503452877725,-1.2136235350725175,-1.213623539731141,-1.2136684927533246,0.9101169509410988,-0.2514557424207456,-0.249502236973156,-0.2502208860561224,-0.2436168773200482\ngdb_110519,OCC1C2C3N2CC13O,-0.0038849049315739,0.1061328924943458,0.1515847729117389,0.2357317993980013,-0.5888395448840843,0.5999023557444779,0.8666305615186328,0.5745005830154817,-0.6155293490092146,-0.0268194949077186,-0.6854490897686494,-0.6854641510236026,-0.6854641556789639,-0.6854182258992322,-0.0661360697442276,0.2175711897011416,0.2144211153473076,0.212869178708216,0.2423449883987807\ngdb_89813,OC1CC2(CO2)C2NC12,-0.003627619032867,-0.0310123893808146,0.0055028065440079,-0.0083230001562123,-0.5155597227729336,-0.2586050496289152,1.1329502605132498,1.239499311227743,-0.3441993971972166,0.0130015102424967,-0.6864274416052933,-0.6864528676643319,-0.686452872319698,-0.6863845335425725,-0.3194301590748379,0.1148023394517845,0.1114772054626801,0.1106513395009785,0.1320329129665137\ngdb_103817,COC1(C)CC1C(C)=O,-0.0040666547189389,-0.1258100166573518,-0.0795085920657418,0.0822419134133031,0.6642454132166036,0.1390194328598146,-0.3605706114485622,-0.4208930829478075,0.391734193074025,0.8994621652694865,-0.3166637670667296,-0.3166113862363394,-0.3166113908894198,-0.3167278892356657,1.5461829634098023,-0.9955407109855496,-0.9922562698700644,-0.992613815545831,-0.996649033694956\ngdb_84802,[NH3+]C1C=CCC1C([O-])=O,-0.0039288681863095,0.0145111814639855,-0.0870659721439973,2.337151438933076,-0.2749576401746533,-0.1817912291481375,-0.28
60010957300694,-0.1978238893069857,-0.1628374695089305,0.0374350779308926,-0.6886298752681211,-0.6886428320879598,-0.6886428367433394,-0.6886181916899018,-0.1887215473794503,-0.1165475357172689,-0.1165390845328263,-0.1157567354750604,-0.1229577908083837\ngdb_33669,C#CC12C3OC1COC23,-0.0036089947502197,0.0356060844033029,0.2293034822913846,-0.9029945705702122,-0.3934266859210146,0.468867014924329,0.2849883389143894,0.0631249787256735,-0.7905990340809307,-1.0829026050802315,-0.2245072327175225,-0.2245469593502788,-0.2245469640027904,-0.2244505141967208,-0.9916810414479158,0.814180118040024,0.813992457991709,0.8155280922446861,0.8018615360187558\ngdb_28191,c1c2c([nH]c1O)ocn2,-0.0024498459254234,-0.0705503367636357,-0.0744064477858565,0.0077512918626196,-1.2092753720918328,2.4524709673396963,0.1315881922934899,-1.0122368887314952,-0.5536603364088764,-1.7311284617481573,-1.029082085845448,-1.0291161242448366,-1.02911612890232,-1.0290486415327675,-1.1730977021249416,1.4492005200430604,1.451845482333449,1.4537598874119526,1.4302308739716287\ngdb_27090,c1c[nH]c(c1OC=O)O,-0.0035735754542298,-0.1725584935692659,-0.2263481218954234,0.9379039214889188,-1.1787421128788529,1.828923483436915,-0.2135621376035336,-1.0627431212539455,0.1425924123657218,-1.44459754619936,-1.5557065156860783,-1.5557196425397803,-1.555719647200518,-1.5556924977521696,-0.4390617697790913,1.145807827008126,1.1499168789348206,1.1515220111476316,1.127654177904012\ngdb_13307,N#CCC1CCC=C1,-0.0023867554771442,-0.1328121858760595,-0.08659135407145,0.9776975956331,-0.143053960374581,-0.5613418715237439,0.0144075247358585,0.2777764669460866,-0.2626122127251608,-0.1438782232549567,2.1164310503515016,2.1164196675520874,2.116419662914042,2.116427917078905,-0.9535269269228964,0.1168239959608202,0.1170262169658916,0.1185974293545204,0.08927704564797\ngdb_101932,COC(C1CO1)C(C)=O,-0.0042890131795656,0.046346111311348,-0.0378882380115803,-0.3150414015236761,-0.416631962922879,-0.0100897480734589,-0.6780236926501455,-0.665
0065401396501,0.0428244607537482,0.2053144249264436,-1.213050323776343,-1.2130129511240175,-1.2130129557826372,-1.2131059220277445,0.8754090145022091,-0.1884277819823984,-0.1859290185382984,-0.1870960540274222,-0.1793947377722562\ngdb_86215,OC1CC(O)C(O)C=C1,-0.0040036029561133,-0.0573604977213627,-0.0980278241657184,0.2783352075129536,-0.281064292017249,-0.6968956723721746,-0.3456567083048636,-0.0147387964131034,0.014889877543936,0.3370393046044778,-1.213724608465228,-1.2137249579090987,-1.2137249625677229,-1.2137003194239164,0.6157148801544989,-0.2592565570940762,-0.2600622527744982,-0.260706421319586,-0.2472501571670828\ngdb_41,CCOC,0.0102136715058962,1.7318610573016793,2.5178205554089508,-1.0910507186604468,-4.378627678399131,-0.4664542109298429,1.7337675014451055,1.929751155701229,-2.933237936588823,-1.2210589527598854,5.424079075269798,5.424035592542227,5.42403558792462,5.424150776575758,-3.375451654740073,2.997223027167101,2.985288170678543,2.978812988997675,3.036559462355872\ngdb_40701,C1NC1C1NC1C1CO1,-0.0035955211593907,-0.2083522693139502,-0.1686272624571522,-0.7018698923833665,0.1818199176515248,0.1119086726901289,1.2629142736226229,1.1953063577705991,0.3899451415319058,0.350984169804252,-0.1890823491008982,-0.1890843135164998,-0.1890842932062434,-0.1890922563882937,-0.0624437360805159,-0.0338353010763354,-0.0373740534487109,-0.0395862768757464,-0.004210806866983\ngdb_501,N=COCC=O,0.0028976332685629,0.1064359620367245,0.3588377070143361,-0.3676422351625359,-3.6006402336524075,-1.0222247944084073,-1.2191853210072074,-0.7281393307927129,-1.5998640776720223,-1.98589273394316,2.2247167230844616,2.224685512652024,2.2246855080146477,2.2247208471907816,-2.588246117636778,2.9459028440456154,2.941197570616036,2.937477321188588,2.962796757373716\ngdb_89082,CC1CC1OCCC#N,-0.0026649757327259,-0.4936354179242987,-0.4873059163608901,0.7369752712535222,0.737525235327756,-0.9183002137579428,0.4490412734950734,0.8733291254399796,2.450733416813188,0.625944453667892,0.2096857
766864569,0.2097244336831445,0.2097244290333166,0.2096148492368021,0.884516770872699,-0.7210228317939634,-0.7182000783473682,-0.7180517650656557,-0.7284474394310089\ngdb_102316,COC(CN(C)C)C#N,-0.0040006462821618,-0.2619387527757838,-0.3189899187484362,0.3920314193534707,0.68500802948143,0.8890837975544639,0.1678076713567579,-0.2504345481845381,1.2500667901489857,0.940395153205029,-0.2207563806017289,-0.2206923174215938,-0.2206922971115326,-0.2208588681799748,1.5796601219607875,-0.737411097914405,-0.7352317750114579,-0.73739955188995,-0.7187273472272565\ngdb_22208,CC#CC(C)C(C)=NO,-0.0039787779479081,-0.347941049374539,-0.3653473262574462,-1.285445103847538,1.524061992654114,-0.0055712880451783,0.0676714645347819,0.0694382577909803,1.6272867886302766,0.5422752624692498,0.2109221899444589,0.2109986469967684,0.2109986673094973,0.2107857212452353,1.8991700616606249,-0.5911464783201056,-0.5855306206610513,-0.5863154537100543,-0.5947826358814154\ngdb_48073,O=CC(=O)C12CC1CC2,-0.0039229437854199,0.0193918638860425,-0.0688479399746758,-0.4829720132651294,0.0267109608495864,-0.1772727691198569,-2.1459778735084742,-2.0370925236662143,-0.1971490116620826,-0.4116857529859544,-0.2564726754229297,-0.2564778787892166,-0.2564778834419254,-0.2564759579471168,-0.2738913772223961,0.0799936622123843,0.0825131847665259,0.0843304007493468,0.057797169196826\ngdb_16863,O=C1CC2NC2CN1,-0.003101861612878,0.2655790415295389,0.2335659176736857,0.920130720557528,-1.337515060786349,0.2700547736799635,0.5150885588457385,0.3851022110562932,-1.1355789297070895,-0.4424906814983854,0.7889931536368525,0.7889553207915907,0.7889553161453422,0.7890334160475656,-1.3838068765340794,0.8435583794185902,0.839057875758497,0.8379753444519309,0.8542919014414594\ngdb_5181,c1c(ocn1)OC=O,-0.0010146155979786,-0.1140471300437784,-0.0168316246776001,-0.5987591899211043,-2.588157358149999,-0.8866709935599766,-0.8420766272308297,-0.4187886565927056,-0.7769958082094215,-2.28751055106585,-0.5748816528256502,-0.57493477686814
01,-0.574934781522817,-0.5748318534235031,-2.341352073323268,2.3120020883839274,2.312178780511824,2.3129002065718542,2.298376734439995\ngdb_114042,CCC1OC2(C)C(C)C12,-0.0039520960379316,-0.1382358678948784,-0.1120746936589978,-0.8457544087470553,1.3787236788003308,0.6089392758010402,1.4781005904102733,1.17636652057468,0.555079067030244,1.6715989685293662,0.581089731356319,0.5811294443676513,0.5811294397201184,0.5810567326239798,1.4644593116529872,-1.659066207547421,-1.6591408906928955,-1.659672445290577,-1.6533294581452591\ngdb_13759,[O-]C(=O)C1CN1C=[NH2+],-0.0033168864168065,0.4306446049963373,0.3814733381665992,0.0213425631630951,-2.2168729261201654,-1.0131878743518463,-0.2497816166668015,0.2251658080685343,-1.4093642251839869,-1.1857760395173549,-0.1079585770562145,-0.1080063537705137,-0.1080063584223051,-0.1078878917709822,-1.5691620264523976,1.5913038311036387,1.587067225404605,1.5855865245697207,1.6095268227748838\ngdb_46332,C1CCC2(C1)COC2=O,-0.0037707165256222,-0.0757593445232695,0.0544432316401512,0.9210455176642904,-0.1051927189504858,-1.1848893554265236,0.0719325797186958,0.6229023891828303,-0.2178877114364476,0.3849447251399074,-0.2877669287786151,-0.2877816641096676,-0.2877816687625699,-0.2877654193008765,-0.40336921102988,-0.5836891617046606,-0.5836696875290193,-0.5820314299501069,-0.6022486697640981\ngdb_40689,C1CC1C1CC1C1CC1,-0.0036862274950254,-0.2898653483162209,-0.3018945408661019,-1.7357213083262752,1.96740491642658,0.1661301930295002,1.5015367239217998,1.4057489932808085,1.2262529234581658,1.7303837882077229,1.5085238878228144,1.508549322865436,1.5085493182236345,1.508468190565696,0.8995322611051259,-1.8284055058388835,-1.8290082184626015,-1.8283416867838969,-1.8397665828688168\ngdb_41436,C1CC23CCC4C2CC134,-0.0039354281339363,0.4321031271690347,0.4603785927276136,-1.601703532185528,1.3604037232725423,0.8800468774979016,1.5761062396402925,1.1469045516032506,-0.926497605428044,1.1154272543332222,1.5398007174145103,1.539762893533806,1.539762888892197
7,1.5398571757905517,-0.5737088707157716,-1.1665529234830057,-1.172218380158209,-1.1713066403425465,-1.1683592641024505\ngdb_74705,NC1=NC2CC(C2)C1=O,-0.0038836006791393,0.2447808941838007,0.1723128045031864,-1.2051389864038624,-0.0697741382634286,0.4553116348394861,-1.393891043547676,-1.5888497100294685,-0.7643491261794128,-0.2952280964145693,-0.1619274878129728,-0.16195776114544,-0.1619577408350159,-0.1618736383809589,-0.5158623099842913,0.195034046132728,0.1938806760463744,0.1949149899745066,0.1911143177340156\ngdb_124354,CC(=O)n1c(=O)[nH]nn1,-0.003832657463281,0.1468831180466817,-0.067378449403904,0.154706912799012,-1.63674100107355,-1.5011815574061953,-1.5472911901685753,-0.8291517958376133,-0.5312391060433508,-1.7993801618961875,-1.8607406573854175,-1.8607643627422612,-1.860764367404884,-1.860709798140623,-0.926449813389012,1.8628246241875368,1.8633872056949528,1.8623989778924712,1.881036838923031\ngdb_93084,OC1CN=COC(=O)C1,-0.00409081654794,0.193202246440226,0.1049809290958293,0.0681932435594461,-1.0773716922917604,-1.0583724746346566,-0.8250321664951741,-0.319880617902907,-0.6445359333328763,-0.7236119510645118,-1.5852731703912244,-1.5852898537199394,-1.58528985838086,-1.5852454959584703,-0.3959845437024567,0.6635968625580844,0.6642315250630524,0.6643846371090806,0.6658400874225328\ngdb_126676,CCCOc1ncon1,-0.0017662684343492,-0.4241630392827816,-0.4159306600662537,-1.3058973534487344,-0.5424289908803557,-1.1080755349457472,-0.6183680800753513,-0.0947069979069828,1.6421418144019866,-0.3731871057426137,-1.0884006912443982,-1.0884046244458156,-1.0884046291036653,-1.08841159774699,-0.398938410633426,0.465314575235795,0.4650570995385792,0.4641762062356371,0.4772685888905294\ngdb_101961,CCC(C=O)(C#N)C#N,-0.0045415131350166,0.0255227081704116,0.2062388694197036,-0.3097486468345485,-0.5534209641970288,-2.8838303260601874,-1.9776638237438764,-0.6102914549069959,-0.2234990063631597,-1.1658204565590582,-0.1319871707537636,-0.1319575450962333,-0.1319575497481727,-0.13202
6986094002,0.33706676633307,0.7164955634722905,0.724337585959394,0.7265055623655288,0.6864514702749926\ngdb_2151,OC1CC=CC1O,-0.0033675035695561,1.02759058030158,0.7388516195242065,-1.090201264204167,-2.125273148481226,-0.5748972516085855,-0.0559008757987203,0.212539249937922,-1.847575689505196,-0.6735727257248069,1.6446653164644045,1.6446292138983964,1.6446292092574355,1.6447132997583709,-1.611993096951451,1.3031666472353198,1.2979988135352307,1.296115555257486,1.3128203670883078\ngdb_54427,CC12CCC1(O2)C(N)=O,-0.0040120087525247,0.0504249222358613,0.047689051376976,-0.1022203889003635,-0.251752363172789,-0.3851219304207836,-0.0473786454308925,0.1325710484440431,-0.3026758862301185,-0.0630941765804061,-0.687807912946354,-0.6878013695237734,-0.6878013492165989,-0.687798236807093,0.5056833369758966,-0.0302062834270435,-0.0289270804350852,-0.0287600843016614,-0.0293531034116056\ngdb_65761,OC1(CC=O)CNC1=O,-0.0039401146003116,-0.0217119427990684,0.0501077780928429,1.3017317993785735,-1.1311102285066044,0.0983532926052848,-1.1147879990013174,-1.1469201754580294,-0.3977615628870022,-0.7846207363135216,-1.585185727059922,-1.5851850609392448,-1.5851850656001647,-1.58518049466418,0.0138644929695218,0.6727821579528509,0.6751424151280371,0.6752185718947795,0.6732605272051687\ngdb_95580,CC1CCN=COC1C,-0.0041669882060192,0.0997684321043931,-0.0004573011747128,-0.2141523491778381,0.915839469131557,0.084797912520442,0.1912438048682841,0.1494064592848591,-0.0293288279541479,1.3977206138622609,0.179243240848349,0.1792624354400175,0.1792624307900013,0.1792310875607389,0.8603935242697822,-1.2952388333054563,-1.296737158653533,-1.2973859757452526,-1.2851001076413324\ngdb_57464,CC(C)(C)C1CCC1=O,-0.0041123090806448,-0.0480789929860151,-0.0464131087761826,-0.1130019262300675,1.282238579687314,0.170648653057782,-0.6204986376673083,-0.6923640827559772,0.25905412723845,1.6413350046152029,0.5798798032180751,0.5799249264908324,0.579924921843292,0.5798410286936768,1.7379381583452214,-1.78616048896
46705,-1.7845537487933538,-1.7842007576332055,-1.7921121994260272\ngdb_99817,CCC(N)(C#N)C(N)=O,-0.004316656699388,0.0265076841831424,0.0107144780714041,-0.5187797857298442,-0.2957202564394798,-0.8279310131923223,-0.1688204281724379,0.2188525290032287,-0.0669117786414613,-0.0588866741494398,-0.5933083566266102,-0.5932666238262838,-0.593266628481074,-0.5933440423101802,1.3399045893971184,0.0800408604188116,0.0839634650562499,0.0833316554510947,0.0870543332706831\ngdb_78336,CC1C2N1C(=O)C21CC1,-0.0038166306325164,-0.0240039062133073,-0.0491969263170859,0.8892889895295255,0.3650194729294033,-0.0191266681300211,-0.3989206481037869,-0.3851178349110717,-0.0728440662324737,-0.0465947563332603,0.2401896950971264,0.2401861823007822,0.2401862026136914,0.2402033250505269,-0.0329050667708244,-0.1403590308592077,-0.1396889887274007,-0.1387407810209006,-0.1484249453153867\ngdb_92484,CN1CC2OCC1CO2,-0.0039733619844083,0.1492382209489161,0.4660375005156794,-0.3796652828514183,-0.185800523272752,1.327374420297722,1.3502671348928572,0.7154971488073226,-0.7705691633191578,0.7594424950846905,-0.7171997558761313,-0.7172255497922931,-0.7172255544478494,-0.7171550079878901,-0.3893383431077766,-0.4940545234544533,-0.4994078828823268,-0.5008027264294089,-0.4687662426137841\ngdb_50444,C1C=CNC(=O)N1C=O,-0.0037705562573146,0.0816032014080699,-0.0538335797950495,2.113810259581991,-0.7244072157897143,0.2790916937365258,-0.7632459963284229,-0.8817624547151657,-0.4460573756556988,-1.034005416114646,-1.060039596255291,-1.0600616470620634,-1.060061651719738,-1.0600138702400084,-0.75906402063409,0.8208901297207949,0.8229644428660754,0.8244367970756141,0.8071988953554978\ngdb_80064,CC1C2C=CC3CN2C13,-0.0038482642805498,0.280959820805258,0.4744072076796412,-0.8717607836393115,0.8669862543907905,1.1963390794775732,-0.0580314333906772,-0.6145003076172002,-0.9058539881132598,0.7608550137579432,1.1378015068663136,1.137769698920985,1.137769694276892,1.1378518036463243,-0.5276777777081677,-0.818767695181437,-0.8
229529399908071,-0.8220657973674248,-0.8206472586200915\ngdb_1477,OCC(O)CC#C,-0.0012290545936101,-0.0495943406979086,0.0611700300914489,-0.4217459497625063,-1.964057539836693,-0.968003274069036,0.6067025352998868,1.0479965129134523,-0.8289440742563156,-0.7707660318801257,1.6457604424671466,1.6457611906078276,1.6457611859668726,1.6457673990883537,-0.5803550713104526,1.4182017869109225,1.415858778227198,1.4131442452830012,1.4331545956371377\ngdb_94250,CC1NCC1(C=O)C#N,-0.0043542313277911,0.0639115168717134,0.1470211375987827,0.2350130302426876,-0.1760298803245987,-0.5206757312692143,-1.364063237260279,-1.1069360747110892,-0.3176956977682455,-0.411385217098028,-0.1612467877840513,-0.1612326240595847,-0.1612326287117065,-0.1612630905095924,0.2075889325255868,0.2665367067468602,0.2693810203768863,0.2698802442991843,0.2608134485494903\ngdb_86791,CC1OC(CC#N)C1C,-0.0037254766509188,-0.262349159447755,-0.2513294615985462,1.909810504773891,0.5494403585758,-0.8776340735034155,0.5086968860698677,0.9112087998318168,0.7860876218284193,0.6378156212409754,0.2093866021715646,0.2094182430571314,0.2094182384073016,0.2093313048044517,0.8173162981931485,-0.7524489709065224,-0.7500802540346878,-0.749707087619583,-0.7608164008335225\ngdb_55061,NC(=O)N1CC1CC#C,-0.0038091201280304,-0.1472269309854464,-0.1984643101332605,0.604395033423402,0.0059483445847599,-0.1637173890350141,0.5449163651331357,0.6144846837624216,0.2859088453715898,-0.4182975425203298,-0.160682212898501,-0.1606631035041869,-0.1606631081563037,-0.1607104297355614,0.6058686570512684,0.3258412531222255,0.3286787709325397,0.3287597636264192,0.3239042844894139\ngdb_50944,O=COC(C=O)C1CO1,-0.003874940664033,-0.2162447053133954,-0.2892167619667093,0.979919245749524,-1.379040293316001,-1.5915507579718156,-1.387499370771805,-0.6292312921029144,0.525213523494652,-1.1864071648819996,-2.08052219407284,-2.080520839834489,-2.080520844498472,-2.080535164112865,-0.3125378029025767,1.0324115139449856,1.0356436289374569,1.0356159418061996,1.03
06983241542402\ngdb_101903,COC(C)CCOC=O,-0.0033875481609987,-0.4984655887559592,-0.5123237651465166,1.551798122777228,0.3222729100312306,-1.144223215171994,0.0484964462071695,0.5808138620807883,2.806661378499694,0.9210105884337916,-1.244004838696676,-1.243948638696938,-1.243948643355749,-1.2441060727053057,1.3923357274218222,-0.816423517595569,-0.8137860249695582,-0.8154023339046627,-0.8064133505112399\ngdb_80850,CC1C2CC1(C)C(C)=C2,-0.0040526340052665,0.0928799139640773,0.0959814402586795,-1.519763848479776,1.945420969793235,1.3725590205805325,0.3595578546328821,-0.2841053698661713,-0.0973292293860506,1.7042071123693543,1.5078163032755345,1.5078429076913304,1.5078429030495246,1.5078066201730065,1.216088333873992,-1.902732192470889,-1.9025592613280176,-1.9013739689421891,-1.91529035996483\ngdb_131137,c1cn[nH]c1C2(CC2)O,-0.0033664369563362,-0.0982433161984893,-0.0942400068559645,-0.2974642285437342,-0.0258062449967378,0.5953838957161974,0.1528937682130593,-0.1262733932335143,-0.1360967788367169,-0.3321339034519006,-0.1613623387919316,-0.1613774817314227,-0.161377461420995,-0.1613465138296841,-0.2655220875846503,0.2543989013269983,0.2542986251770658,0.2549068070370053,0.2512899809017356\ngdb_61918,CC(CCC#N)C1CO1,-0.003539438304701,-0.4286522568792661,-0.4514357428010537,1.7101233648976724,0.6092855466332414,-1.6051061380566598,0.5321330195813939,1.2710657065542743,1.961521293965195,0.628529062304057,0.2096033508000852,0.2096417077957641,0.2096417031459357,0.2095347458752899,0.8714705252609172,-0.7296810805502969,-0.726813391752647,-0.7266043281623219,-0.7375919061221237\ngdb_92237,OC1CN2CC3NC3C12,-0.0038970190050347,0.2768557540855464,0.2695182366691555,0.1812360288951326,-0.1711445588505211,1.331892880326004,1.192605873088044,0.5576651721746656,-0.8328478168919778,0.4342927179373824,-0.1901362122356283,-0.1901745029189068,-0.190174507571206,-0.190078908138716,-0.6687249236619479,-0.1445360721279858,-0.150883177271505,-0.152297393557651,-0.1168453317703898\ngdb_131283,
C#CCn1c(ncn1)O,-0.0037453665005472,-0.0672355136438687,-0.0709563394892616,-0.354639047716408,-0.8844014940657285,-0.1501620089501701,0.1081520587819637,0.1788684282562887,-0.2596671586481327,-1.4095550616671704,-0.5319149004555818,-0.5319291976035179,-0.5319292022579291,-0.5319151732242146,-0.6532171222743598,1.2818750118920592,1.2840834001460146,1.2847422463959857,1.2758578001065213\ngdb_11860,COC1CC2(CO2)C1,-0.0019013635647088,-0.1881602610529693,-0.0671137585557524,-0.1739012764802755,-0.7476124927915804,-0.570378791580305,1.2586531584387088,1.5088658846808105,-0.13279677407411,0.1141618901184403,0.6652202759223096,0.6652133443389263,0.665213339691913,0.6652293149458667,-0.4681081279336216,0.2819128335512859,0.2782750331926654,0.2762701496182393,0.2992549741510937\ngdb_49764,O=CC1C=CC=CC=C1,-0.0039830886127344,-0.001172667354112,0.0665277379488596,0.0388543934925539,0.9940046127167852,1.173746779336168,-1.4727216744500826,-2.0013172756294786,-0.3475272832825582,-0.3462590901844299,0.6697952121649409,0.6697846860119038,0.6697847063274676,0.6697974531405404,-0.4681081279336216,-0.2118538251060206,-0.2078521623440174,-0.2039843982951422,-0.2585574939791729\ngdb_68763,CC12CCC(=O)C1(C)N2,-0.0041045885694109,0.1897927140884657,0.0669202105857738,0.4609679156130947,0.5079151260461471,0.6857530962818179,-0.4862735093740214,-0.7996898268661837,-0.3245820270307692,0.657771204199272,0.2089616420596304,0.2089751328498602,0.2089751282000276,0.2089639875826877,0.6678998626016225,-0.7970879856960263,-0.7962163211012163,-0.7955177537417728,-0.8027487247204235\ngdb_88711,CC1OC1C1NC1C#N,-0.0034681299607848,-0.2695912587208459,-0.2672017852170082,1.7058107499657909,-0.0893154241597369,-1.6096245980849402,-0.0132897239595816,0.7365414123583436,0.6427415202355394,-0.4053744993395055,-0.1603741517647263,-0.1603647011934601,-0.1603646808830261,-0.1604033460480206,0.0542340076927681,0.3582008678725873,0.359748037210551,0.3596124769096588,0.358960448162988\ngdb_13452,CCCC1C(C)C1C,-0.003
6731960236657,-0.1996263920729634,-0.1292613442475913,-1.7293830712294187,1.2920092226354674,-0.7194879725135798,1.4738394752263593,1.790859016264491,0.6384256344574071,2.2163202653955243,2.428170021185313,2.4282277657427414,2.4282277611066228,2.4280942707526183,1.398735772438923,-1.841489897509444,-1.8428144709693828,-1.84449194021146,-1.835665963495847\ngdb_66038,NC1(CNC1C#C)C#N,-0.0041241855148908,-0.0214341290518879,0.0109244052958,0.6307281215680736,0.0755641755903547,-0.7691910328246705,-0.1709509857643948,0.1893905600317992,-0.240621713050595,-0.7391496564702942,0.3663091144823794,0.3663217426892554,0.366321763002944,0.3663020417345237,0.3542976567637238,0.6677739038268624,0.6703107465642435,0.670423562168284,0.668561485269083\ngdb_85663,CC1OC(C#C)C(C)=C1,-0.0042179258953841,0.0476152150200589,-0.0655164861962178,-0.5932050646300446,1.0807190688816486,0.0667240724073187,0.0570186765749972,0.0252453043338365,0.1015789866235193,0.2456162874973414,0.6409242598986822,0.6409530916940576,0.6409530870468944,0.6408882023913991,0.8606396798473641,-0.6209862332717883,-0.6166336748311172,-0.6147629199184309,-0.6468824440783244\ngdb_92582,CN1CC2OCC=CC12,-0.0036115645696355,0.011076393317027,-0.03123445772529,-0.7975968753410432,0.8303463433352133,1.4041882407784998,0.0612797917589111,-0.5934560440661792,-0.1435307258739721,0.7070891434079544,0.2102456836008547,0.2102354170611427,0.2102354373738668,0.2102667592988892,-0.1141364073724774,-0.6622086223405032,-0.6649971437042091,-0.6652214937464429,-0.6540264696294261\ngdb_68820,CC12CC1(O)CC2C=O,-0.0038864799821836,-0.1255385168589708,-0.0788240467687983,-0.4807503631487057,0.220902489444138,-0.0688297284411119,-0.7738987842882077,-0.7302437571478148,0.0673546629671268,0.2538810244153107,-0.2861430739148007,-0.2861285692273463,-0.2861285738802399,-0.2861521440901766,0.640330437912576,-0.4131148436779427,-0.4115515663347596,-0.4111272700378512,-0.4180798515178757\ngdb_18213,O=CC1=CCC2OC12,-0.0029259533290021,0.173673202803198
7,0.1830738565711373,0.8770045712387106,-1.2239313365140636,-0.6833402922873305,-1.6452968393985945,-1.3047521520906864,-1.087515818747368,-1.2388206237363213,0.7240280005851888,0.7239835471350521,0.7239835424884005,0.7240823954161704,-1.7658403329394312,1.2121344176507791,1.2110989542396675,1.2122699842111333,1.194010691259797\ngdb_95566,CC1OCC=CCC1=O,-0.0041387257189397,0.1697522405986742,-0.0178995153408317,-0.3214449812710155,0.2880756597126931,0.4236824146415188,-0.565104140276428,-0.75549687340904,-0.2232162611544045,0.3193978479832123,-0.2871947153948117,-0.2871947945827158,-0.2871947992356145,-0.2871910165424647,0.1824810636123476,-0.5235822458200381,-0.5225655842831739,-0.5213582982655441,-0.5366758050028712\ngdb_121329,COCC1COCC1=O,-0.0036960867591924,-0.2689409219944916,-0.3164525375144325,-0.7001056408203239,-0.3042695690191133,-0.2586050496289152,-0.8292932816790879,-0.7007817881763858,0.968145510226497,0.2852569711148011,-1.2137691414092884,-1.2137559613949545,-1.2137559660535788,-1.2137967230331888,0.2413122466541522,-0.2639344237750768,-0.2632902960000127,-0.2639116969275216,-0.2582554330663625\ngdb_120857,COCC1CN1CCO,-0.0038473303031708,-0.3348459195642594,-0.3188530096890474,0.2112936481173382,0.6092855466332414,0.6631607961404128,1.3417449045250294,1.0164301175869213,1.3834664085828203,1.3226768026470996,-0.745992800452697,-0.745946584980625,-0.7459465646738098,-0.7460743434232424,1.338427655931634,-0.8950032870507347,-0.8966781204537335,-0.9001460014982818,-0.8582424437400288\ngdb_84118,CC1=CC(O)C2OCC12,-0.0040638251543349,0.1718295297537282,0.0376490536884721,0.3018585616868537,0.180598587283005,-0.0688297284411119,0.0463658886152125,0.0778559632113889,-0.345998099966437,0.3520660990007856,-0.2865023828816241,-0.2865137163958435,-0.286513696086189,-0.2864768260543969,-0.0747515149595545,-0.4508576760839653,-0.4516525960662161,-0.4509428836333185,-0.4551450052247669\ngdb_21840,c1c(nn[nH]1)OC=O,-0.0008845495765831,-0.1063819962011172,-0.0324118756
360328,1.7455390814594884,-2.539304143409232,-1.0448170945498123,-0.7632459963284229,-0.2672699590253547,-0.777289991909554,-2.26671346762136,-0.4788586404655315,-0.4789110400854447,-0.4789110447395281,-0.4788031248866808,-2.2771054675746862,2.582277373242254,2.580101170092319,2.5789329205902054,2.594530361663501\ngdb_105443,OCC1(CN=CO1)C#C,-0.0041489552581623,0.0883086150331986,0.0736561363076972,-1.6098060208454268,-0.5607489464081442,-0.8776340735034155,-0.0452480878389356,0.3640579475052734,-0.4402126640143007,-0.7579632030544716,-0.6567439628092254,-0.6567456869778181,-0.6567456916330006,-0.6567367542631454,-0.0119818426764589,0.6092848375736499,0.6114236488500684,0.61194921858179,0.6046670533158329\ngdb_65611,CC1(CC2COC12)C#N,-0.0036482991710468,-0.0626326449689922,0.063451847747927,0.7527881926704217,0.0609082111681238,-1.0809647747760616,0.2658133205867769,0.766003381329772,-0.3471898477869019,-0.0183443828682017,0.2399628616539193,0.2399577749774475,0.2399577703278064,0.2399734491215271,-0.1515520551647544,-0.1641862587374585,-0.1634704665965313,-0.1623571071819145,-0.1746671995240959\ngdb_40313,C1CC1C1(CN1)C1CN1,-0.0038225439804192,-0.0630367376921638,-0.0619751051933636,-0.0451762550286558,0.7680584945407357,0.3649424342738657,1.1372113756971636,0.9532973269338584,0.0739754047579641,1.0668306012555624,0.7075959183622447,0.7076035967707395,0.707603592123988,0.7075811279068721,0.4281443300379552,-0.8103061056177124,-0.8138873886453081,-0.81550298265565,-0.787748265804987\ngdb_37703,C1C2C3C4CC1CC2C34,-0.0039967224718712,0.4808404979528091,0.7637873228741997,-1.7123939821038243,0.9707993357149208,0.0802794524921615,1.6059340459276894,1.5509544117828522,-1.1378358870371483,1.2015608398128583,1.5372247462096849,1.5371703081573824,1.5371703035157578,1.5372969635219074,-1.057650736239562,-1.437140240927847,-1.4421550479188752,-1.4393394256946084,-1.460628843928664\ngdb_123319,C#Cc1c(n[nH]n1)C#N,-0.0043126665711769,0.3570997293791064,-0.1181169468133519,0.73305471
2224539,-0.4117466414488032,-1.4379231170102604,-1.4301105226109438,-0.7428703152784277,-0.4599451340002414,-2.4369069409539423,0.0243331732781386,0.0243008723446739,0.024300892656249,0.0243682996696855,-1.3921761661718253,2.0734649752255105,2.0775634552912114,2.0799448013072688,2.0501368071032933\ngdb_42745,O=C1C2OC(C=C2)C1=O,-0.0039936055296121,0.3939037369317195,0.358217052611774,0.958160143138666,-1.503615990904958,0.7851592169040006,-1.960619363008221,-2.3022502444090778,-1.1946565929582948,-1.760400657232165,-1.12389774119582,-1.1239531414481478,-1.123953146106217,-1.1238223758024155,-1.5905775617019244,1.3057494379759012,1.3074828149280682,1.3104154201092155,1.277345307620552\ngdb_116299,CCC1OCCC1OC,-0.004216328738801,-0.0692875470037245,-0.2207987413548685,-0.3129504367082183,0.92805277281675,-0.1140143287239221,1.4823617055941871,1.517283590101219,0.7858130980353426,1.70438743390211,-0.3465711337413194,-0.3465343931214932,-0.3465343977747586,-0.3466251396883377,1.1530725060133156,-1.5135410265204765,-1.5146742605419046,-1.5162247484996498,-1.4977623887851723\ngdb_105618,CC1COC1(CN)C#N,-0.0043118541766519,0.0002795408697858,0.0858136607814128,1.7957222370304755,0.0804494970644305,-0.3083081099400059,-0.0367258574711079,0.1073179321828173,-0.0648199645306745,0.3073463588773739,-0.1909241757514484,-0.1908953214837836,-0.1908953261360886,-0.1909542039852125,0.8047623637365289,-0.2273059934656623,-0.2259338827304162,-0.2268187610831056,-0.2167676516549188\ngdb_68288,CC12CC(C1)C21CCO1,-0.0039305040283462,0.0714314298919849,0.2445916605897882,-0.7799543597106181,0.9304954335537878,0.3423501341324606,1.6719813312783545,1.494134900195096,-0.4286290467767529,1.0329000995086997,0.6118304182504056,0.6118266647099909,0.6118266600626476,0.6118445058008414,0.1883887974742862,-1.0535315739482822,-1.056112786784457,-1.0560199479302637,-1.0505555081066789\ngdb_55493,CC(=O)CC#CCC#C,-0.00217677636166,-0.5431430904616191,-0.5551397916526726,-0.1944842113824383,1.040415166720514
,-0.2540865896006346,-0.560843025092514,-0.4356240674335222,3.2226678714329964,-0.529736249763349,0.6714469001187925,0.6715069521174994,0.6715069474705251,0.6712828725027938,1.110733746669424,-0.038355840403416,-0.0285320220049092,-0.0259315963252143,-0.0889844870104698\ngdb_17232,O=CC12CC3CC1N23,-0.0025110021010242,0.1894580748020891,0.3716341404318654,0.3299559013945676,-1.1152329337158555,-0.4664542109298429,-0.9102544701734516,-0.6818419509804667,-1.2894266387437123,-0.8642627466139529,1.221443387243439,1.2213976329722558,1.2213976283286796,1.2214987092184226,-1.7838096901028273,1.0708806743054664,1.0669883964899711,1.066737051758736,1.0719267891216664\ngdb_102700,COC(CO)CCC=O,-0.00361721817235,-0.4529420179111589,-0.482614499259171,-0.3950208057149363,0.2135745072330226,-0.380603470392503,-0.6908070382018872,-0.5071745635069931,2.295618179950719,0.9179451223769453,-1.2434136039544783,-1.2433554286825397,-1.2433554333413483,-1.243508255502176,1.5065519154192983,-0.7543185443055422,-0.7520217582935121,-0.7540736949700844,-0.7381675316347611\ngdb_97189,CN=C1COCC(=O)N1,-0.0039352568126419,-0.0342703869613857,-0.1785486056275192,-0.9921219458291004,-0.1821365321671943,-0.7917833329660756,-0.7035903837536288,-0.3261938969682132,0.1318995676442503,-0.3551850060558373,-1.089317660508483,-1.0893235208369954,-1.089323500532302,-1.0893082611464466,-0.1997985483705844,0.3689935244088653,0.3693827845837384,0.3691792697278069,0.3749069846543977\ngdb_24902,c1nc(oc(=O)n1)C#N,-0.0033345877750614,-0.048849294739561,-0.1293708714951021,-0.1739666191307587,-1.5317065893809,-3.620339310669993,-3.1281649234006217,-1.4057646171355869,-0.4929859312518467,-2.820390733947717,-1.400595825205881,-1.4006427275055156,-1.4006427072027456,-1.400544468006334,-1.8704564534112575,2.375711800692885,2.380126231433016,2.382812377449146,2.348954839548292\ngdb_92487,OC1CC2CCC1OC2,-0.004082477069449,0.1389591123032386,0.4435296511521788,-0.3960662881226654,0.1317453725422367,0.1751671130860625,1.27356
70615824075,1.17636652057468,-0.6847222131672935,1.12838035110284,-0.3170070002398444,-0.3170251155234385,-0.3170251201765215,-0.3169791327176272,-0.0683514699424546,-1.0315948964505215,-1.0353332331705234,-1.035386953977985,-1.025330572255267\ngdb_49503,N=C1OC2CC1C2C=O,-0.0032160610718165,-0.1173619531635454,0.024505783987158,-0.0758873007556918,-0.4740344902432808,-1.1351862951154328,-1.2404908969267767,-0.6986773618212834,-0.4214185077742279,-0.6982166685347521,-0.6573689467332445,-0.6574039743574542,-0.6574039540500919,-0.65732528556709,-0.9754347733275844,0.5436347545564495,0.5428836102831994,0.5438951784919454,0.5374812972654454\ngdb_112609,CC1(CO)CC(CO)C1,-0.0035176915533013,-0.369111719699451,-0.3104011570894525,-1.495391039849596,0.8889702010241349,-0.9047448336731012,1.3012643102778476,1.7045775357053057,1.350772698053971,1.66670023355617,-0.3461147709143466,-0.3460652469756171,-0.3460652266663306,-0.3461632410902039,1.8250772328088127,-1.4656033815262486,-1.46582736490557,-1.4677197927356609,-1.445032812080911\ngdb_10742,CC(N)(C#N)C1CN1,-0.0040268473872154,0.330827388424153,0.4113468949252111,-0.1420794056950275,-0.8770735118546131,-0.8098571730792002,0.1933743624602411,0.568187303950176,-1.0066206630036922,-0.1866144265180563,1.2860925161517107,1.2861016084031405,1.286101603759964,1.2860953180309525,-0.0068125755472626,0.6691085642197108,0.6670645098583909,0.6647562632665703,0.6917489347741945\ngdb_24085,C#CCCc1ccoc1,-0.0017226809811626,-0.443420583121428,-0.4193990229041005,-1.5229002957029625,0.6630240828480854,0.6947900163383801,0.1273270771095761,-0.1957194629518838,1.5629913860463398,-0.3874024532415209,0.6704670754998051,0.6704722544615163,0.6704722498145355,0.6704370818520933,-0.0646591362787429,-0.1412793958845319,-0.1362634162867977,-0.1329031534636709,-0.1855385427540328\ngdb_104993,CC(C#N)C1(CO)CN1,-0.0039938486953203,-0.0564323472478281,-0.1067808766959685,-0.4837561250709262,-0.0490115219986021,-1.0674093946912175,0.2040271504200258,0.696557
3116114036,0.0423612018529316,0.315370667085002,-0.1909170864262722,-0.1908915771014347,-0.1908915817537384,-0.1909435451708262,0.8232240320550872,-0.226561310652732,-0.2255440224378935,-0.2264316505023879,-0.2155508591099168\ngdb_106713,CCC12OC1(C)C1OC21,-0.0040475109459151,0.0618721114094567,0.0408344711369156,-1.4131246428914288,-0.0685528078949105,-0.1230512487804844,1.088208551082154,1.1321735671175364,-0.1857980962335091,0.2446245190671849,-0.2851255310903898,-0.2851177108056574,-0.2851177154585432,-0.285124254959505,0.4244519963742436,-0.3062292609791317,-0.3063022825849815,-0.3066203169295562,-0.3007377357846248\ngdb_13880,CC(O)C1CCC=C1,-0.0032773388302713,0.0078499654804538,0.1674297147183232,-0.7686500811770494,0.1525079888070631,-0.1366066288653272,0.1081520587819637,0.1704507228358801,-0.5283637552731589,0.8700998090191009,1.5607336474395197,1.560740396247545,1.5607403916060647,1.5607250874670622,0.0399569841930836,-0.6169219432743371,-0.6191053890883345,-0.6196585783959138,-0.6187166887743412\ngdb_78014,CC1CC2(C=O)C(O)C12,-0.003876145439587,-0.0958061319618606,-0.0809324462833842,-0.2862906353111316,0.4150940180386897,-0.2314942894592295,-0.697198710977758,-0.5808294859355664,0.057434273866538,0.2845056313949857,-0.286213467918306,-0.2862031822863028,-0.2862031869391954,-0.2862092324004848,0.5145449377688046,-0.4205092293514865,-0.4193201824390896,-0.418841093542551,-0.4245969581937855\ngdb_78873,OC1C2C3C2C2(CC2)C13,-0.0039103378540495,0.2438022321198696,0.2151288310093423,-0.6023530356976725,0.3882247499312676,-0.1456435489218895,1.6101951611116034,1.6582801558930598,-0.7128144330864112,0.3520660990007856,0.6421456208218921,0.642119991009536,0.6421199863623799,0.6421906995245609,-0.2731529104896538,-0.4926910197132324,-0.4951376131403616,-0.4941237785434179,-0.4981915349315641\ngdb_64974,C1CC1(C(=O)CC#N)O,-0.0032803673486364,-0.273644813850161,-0.2638338223560464,2.124918510164111,-0.5399863301433178,-1.0809647747760616,-1.02956569532304,-0.51348784257
22992,0.5192951765650065,-0.8060489451226562,-0.6578633522770052,-0.6578503546586432,-0.6578503343512835,-0.6578872072782115,0.3264820764970971,0.4917009947509967,0.4964070652283338,0.4977464357959283,0.4733332481306834\ngdb_35082,N#CC1C=CC2CC1C2,-0.0036027940246618,0.0052107348822392,0.1072718740229334,0.9775015676816506,0.5921869214739727,-0.8008202530226366,-0.3754845145922607,-7.811927389259577e-06,-0.5157706911714851,0.0709748830234518,1.1662576587225622,1.1662299004348342,1.1662298957909172,1.1662911679270662,-0.728294573436494,-0.4532070979145736,-0.4528403704253625,-0.4496860646141757,-0.4859637672252312\ngdb_55614,CC(=O)CC(C#C)C#C,-0.0040834331528015,-0.2281843824933562,-0.2580106236967142,-0.0656938472803351,0.6300481628980661,-0.4619357509015599,-0.6460653287707915,-0.4229975093029093,0.7767463063538514,-0.53664857518565,0.6716240833233569,0.6716743260085325,0.6716743213615591,0.6715449245717101,1.2215037565807705,-0.0197440143357525,-0.0111052669110579,-0.0086277533671111,-0.0590690581171848\ngdb_131383,c12[nH]nc(n1nnn2)O,-0.0034059403309245,0.2511264127273548,0.0752625359378578,1.9641102473253105,-2.1765690239590323,-0.4980834311278091,-0.9507350644206334,-0.707095067241692,-1.0768462084615078,-2.4145470708922367,-1.733811856734341,-1.733870789590903,-1.7338707942527412,-1.7337290208824985,-1.9071336344707916,2.75597726138388,2.752310258688264,2.749929987042244,2.798717839461121\ngdb_102972,COC(COC=N)C=O,-0.0042578437569749,-0.1826481837509568,-0.2291410667069526,1.2840239410976657,-0.4801411420858764,-0.5613418715237439,-1.076437962346093,-0.8017942532212862,0.6825953689142784,-0.1345916643180384,-1.6140816417387716,-1.61404533722589,-1.6140453169244393,-1.6141451363011985,0.7023616434629302,0.2610276272073949,0.2633745727970435,0.2614799446976006,0.2786122453553054\ngdb_130835,c1nnnn1C(=O)C=O,-0.0028574966556634,-0.1682397525903695,-0.1915367117281927,-1.1993888331613534,-2.0116894242089414,-2.581093504165357,-3.072770426009741,-1.8329631672213111,-0.
2229185029257169,-2.5134534316087342,-1.8298286284913123,-1.8298661189928371,-1.829866123655269,-1.8297978884674069,-1.7008552604581086,2.486357507170371,2.487345609196848,2.4868341519629413,2.497982005182898\ngdb_95973,CC(=O)C1OCCC1O,-0.0042079782073233,0.1401019370359583,-0.0015890827323259,-0.3268684212611092,-0.5118957316673759,-0.0236451281583017,-0.9358211612769348,-0.9133288500416972,-0.1999453653511933,0.30052419422145,-1.2141745309897531,-1.2141657965244097,-1.2141658011830363,-1.2141937451475062,0.3739901029701848,-0.3065176944628515,-0.3059618045958168,-0.3062822403557289,-0.3035788181437984\ngdb_89179,CC1CC2(C)C1C2(C)O,-0.0041100653243377,-0.036884361764402,0.0051194611777197,-0.9623256972088268,1.3958223039595996,0.2339070934537156,1.354528250076771,1.2289771794522324,0.2020103707354287,1.6554000841701468,0.5809333668038548,0.5809745267885683,0.5809745221410344,0.580917319325743,1.802184764093801,-1.6754911833839723,-1.6752707105459914,-1.6756885003834898,-1.6692446486928865\ngdb_15575,OC1CNC=C1C=O,-0.0034207789656152,0.2795770660181551,0.1452869561798593,1.6216494161436148,-0.8208923149027312,1.0517483585725802,-1.0593935016104372,-1.5362390511519162,-1.0262212328859277,-0.8561182240511539,0.2926189242453247,0.292593206694276,0.2925932270075091,0.2926534784881421,-1.110328029841846,1.094178233422704,1.0926983832744157,1.0922682849257024,1.0946867939388376\ngdb_5126,Cc1c(n(cn1)C)O,-0.0034860136933209,0.3760100060337772,0.1551078993733411,0.1130183017908223,-0.905164110330555,2.231066425953927,0.8027138337599249,-0.2462256954743337,-0.9347533365802506,-0.5514349408716175,0.7883873907705276,0.7883973828589274,0.7883973782126753,0.7883629092250155,-0.438323303046349,0.7799273307874283,0.7809660930427416,0.7802932871877173,0.7777479563564416\ngdb_68127,CC12CC(N1)(C=C2)C#N,-0.0032295070301787,-0.0879452657064131,0.003102334369393,1.241551218283679,0.3491421781386526,-0.475491130986404,-0.5544513523166432,-0.3240894706131114,-0.3257426764528002,-0.36206727788
93465,0.7665755092358072,0.7665605194333714,0.7665605397495334,0.766605398421814,-0.2566604867917423,0.1379687924400595,0.1383775656753036,0.1398033469662618,0.1265502224358721\ngdb_108756,C[C-]1OC(=O)C=C1C[NH3+],-0.004069335068222,-0.0891954275687253,-0.2209539049555091,1.4867168428961046,0.0694575237477591,0.4327193346980811,-0.3200900172013804,-0.5176966952825035,0.4220597784347335,-0.0378191084058161,-0.6887239336387617,-0.6887120282737832,-0.6887120329291646,-0.6887330922112607,0.3963902605300349,-0.1264276935959601,-0.1237437027458799,-0.1229105390071438,-0.1360746434651776\ngdb_101064,CCN(C)C1(CC1)C#N,-0.0041861706644965,-0.1132768282902326,-0.0443594728853524,0.5663656108422637,1.0709484259334936,0.1932409531991871,0.4533023886789872,0.3577446684399666,0.3363940568806424,0.9830411957017506,0.7059777050036757,0.7060212207897308,0.7060212161429696,0.7059291614495723,1.301504319294519,-0.980287823941514,-0.9786423484407192,-0.9790959140671524,-0.9763340124923467\ngdb_83276,CC1C2CCC(O)CN12,-0.0039524773659739,-0.0702220114260588,0.0303107281052385,-0.2508095760988322,0.7619518426981402,1.3590036404956896,1.277828176766321,0.629215668248137,0.0008834448513453,1.4107939249870485,0.1801394413851591,0.1801523752743598,0.1801523706243491,0.1801398326139736,0.7860545398403904,-1.2010993884648966,-1.2040777632969863,-1.20538011365736,-1.1813592818787824\ngdb_62529,CC1(C)C(OC1=O)C#C,-0.0040463061703611,-0.0217245706966674,0.0220870572712911,0.8390404913080555,-0.0673314775263907,-1.30688777619011,-0.1368620642930839,0.4734881179705818,-0.2130227780721618,-0.4655117305135294,-0.2566315112542412,-0.2566126266287102,-0.256612606318871,-0.2566396094391919,0.5869146775775497,0.0633090962400674,0.068483412357634,0.0704021620546962,0.0391149867026011\ngdb_133589,FC(F)(F)C1CC2CN12,-0.0037072115903405,-0.0508318746626216,0.0820532252835369,-0.1975553159551417,-2.200995631329416,-0.4031957705339081,1.2629142736226229,1.4352109622522369,-0.6097933030359093,-1.1571349693979922,-3.4
12929757660409,-3.4129730533605493,-3.412973058032767,-3.412866543400033,-0.840541516813324,1.3082089889552573,1.3051254596901114,1.3031970981470131,1.3362329374624002\ngdb_67661,OC12C3CC4CC1C4C23,-0.0038044281351617,0.2845398297746063,0.442580415007084,-0.9699054446648612,0.2514357486571178,0.1028717526335666,1.4738394752263593,1.4078534196359105,-0.9901267158892008,0.3992201798163999,0.6413559099371907,0.6413137256394378,0.6413137209922768,0.6414213677695629,-0.7132790832040671,-0.5756444896310393,-0.5790849304840856,-0.5774790095208606,-0.5860171701473958\ngdb_80081,CC1C2C=CCC(=O)N12,-0.0041856401211333,0.3710219864821277,0.2169907942170285,0.4952728071166988,0.289296990081213,0.2067963332840299,-0.2796094229541986,-0.3724912767804594,-0.6987847659582083,0.0069006317175955,0.2390396018618603,0.2390243004576526,0.2390242958080057,0.2390685981550471,-0.3186916923420956,-0.2611680844543642,-0.26066263762562,-0.2588637749549563,-0.277963482811643\ngdb_24486,c1cncc2c1C=CC2,-0.0032592064055337,0.1157364086184708,0.0124121504078238,0.083418081121998,1.1564415517298372,0.1073902126618471,-1.025304580139126,-1.0627431212539455,-0.6160412214983885,-0.5973868781355259,1.1944688793864775,1.1944197325804806,1.1944197279367377,1.1945284640521727,-1.4839921966094518,-0.1133747673963513,-0.1108783130793872,-0.1052582965259831,-0.1743480408237677\ngdb_7127,COCC(=O)OC=N,-0.001202649008302,-0.3360203140409769,-0.303875158591925,-0.9468394890443428,-1.6001010900179735,-1.5011815574061953,-0.7483320931847244,-0.0399919126743286,0.7061118348292652,-0.972786255744088,-0.6341400738478457,-0.6341312401509617,-0.6341312198434554,-0.6341662280394296,-0.561401091836733,1.334438081115702,1.334973163717336,1.3328342842072074,1.3486772771435025\ngdb_94950,CC1CNC2C1OC2=N,-0.003489462225182,-0.0313217728719929,0.0445401430110362,-0.3775089753854775,0.0951054614866613,0.4327193346980811,0.2679438781787338,0.0631249787256735,-0.3233187886391908,0.3976573931991841,-0.1914521807025542,-0.19146778761
99642,-0.1914677922722714,-0.1914242901503682,-0.2354911071197966,-0.2827691302619787,-0.2855383233835941,-0.2860028074005055,-0.2704319074096877\ngdb_19642,O=CC12C3C4C3N1C24,-0.002099720464642,0.2479946941227749,0.5275096681812007,1.024025534825587,-1.5952157685438972,0.3920531944435513,-1.127571344553059,-1.2963344466702778,-1.5632183683720091,-1.620471147813747,1.25328356856247,1.2532314480955895,1.25323144345221,1.253341879754186,-2.1353198548881647,1.7919093190311612,1.7883572927849996,1.787895675527451,1.7951831485346834\ngdb_125246,Cc1cc(n[nH]1)CC=O,-0.0030933839720528,-0.3384448703800065,-0.3244936629358614,1.1311874816177978,0.2233451501811759,0.1390194328598146,-0.6439347711788346,-0.7007817881763858,0.9885585704494396,-0.385929827390683,-0.162381628985225,-0.1623678708629335,-0.1623678755150609,-0.1624417756536296,-0.0639206695460006,0.147329770047638,0.1511805776953411,0.1525134776998424,0.1262567104172744\ngdb_110993,COC1C2CC1(O)C=C2,-0.0039291831964314,0.0750303807077318,0.1146010723355412,-1.140580447726603,0.1341880332792763,0.7987145969888434,0.0548881189830403,-0.319880617902907,-0.4131617042634536,0.3058136258489506,-0.2854308962836045,-0.2854359084177396,-0.2854359130706275,-0.2854090225344897,0.2491892251367359,-0.33830568649135,-0.3394326102787317,-0.3395169740789869,-0.3332463291180771\ngdb_53827,CC(=O)C(=O)CC1CO1,-0.0032598308992842,-0.3520893137358474,-0.340831477356245,-0.6881479357819247,-0.674332670680428,-0.1953466092329803,-2.088452818525637,-1.9739597330131515,1.1354827751265582,-0.5004640542793414,-1.1838400075109212,-1.183813608534326,-1.1838136131927657,-1.1839131268462524,0.3922056157111629,0.2563523826485554,0.2611419728528268,0.261699307360007,0.2412991737294089\ngdb_124882,CC1(CN1)C1=NC=NO1,-0.0033720960855441,-0.040697986839334,-0.0523093256005223,0.350212123044315,-0.5289943568266446,-1.0583724746346566,-0.790943245023863,-0.2883142225763757,-0.3087822533617928,-0.6983068293011292,-0.56165177385218,-0.5616731231621568,-0.56167
31278167517,-0.5616306996283346,-0.747987019642955,0.7817837935735523,0.7803111277506084,0.7796455221493157,0.7954897605371988\ngdb_10808,CC1=CNC(=O)OC1,-0.0025183965491494,0.0781999830051091,0.0851473700257213,1.2593897618655534,-1.2324806490936988,0.911676097695869,-0.1560370826206963,-0.5787250595804645,-0.8343112288826747,-0.8598148154726459,0.291765534246105,0.2917410601591183,0.2917410555097972,0.2917997249292703,-1.1169742304365269,1.0045357288051775,1.0039739778083223,1.0041670782286545,0.9972237058260428\ngdb_226,CC(C)(C)CO,-0.00300299264654,0.7545249226183727,1.3410050445275374,-0.8880964462600754,-1.8895563873570225,-1.0764463147477783,1.3076559830537184,1.7929634426195928,-1.9269463088198733,0.4611606263179801,3.4623527399047047,3.4623635397106014,3.4623635350808737,3.4623663338191344,-0.4439848813307065,0.6567872102197175,0.6501107852528895,0.6454781563468046,0.6879047822139887\ngdb_102254,CNC(CC(C)=O)C#N,-0.0039752575716332,-0.2587312667856092,-0.3338491153274219,0.6062899702874108,0.2709770345534244,-0.0462374282997068,-0.995476773851729,-0.9617306562090452,1.1150950940563618,0.2468184310490456,-0.1919062220251563,-0.1918558304441436,-0.1918558101339056,-0.1919841649697334,1.1447032163755697,-0.3304629178563332,-0.3259408450745064,-0.3261177865117283,-0.3343462867817094\ngdb_128300,CNc1nnn(n1)C=O,-0.0015148019331578,-0.3135237144681581,-0.2856479991519775,1.3635459467355442,-0.5619702767766623,0.026057932152789,-1.8839192896977712,-1.8708428416131488,0.526220468188788,-1.4194126387911483,-1.363771398964979,-1.3637841260066483,-1.3637841306661995,-1.3637555826332606,-0.8063258915295974,1.6747083061054988,1.6741048353370065,1.672012832820862,1.7062005630306158\ngdb_123667,C1C2NC3=NOC=C3C12,-0.0031523848152403,0.1115755163595633,0.1142998724048861,0.8765471726853297,-0.5436503212488755,0.9252314777807118,-0.1837343313161364,-0.6123958812620978,-0.8536819565787797,-0.9767232758759204,-0.130649360175821,-0.1307071710169036,-0.1307071756688353,-0.1305710469
654499,-1.4921153306696169,0.857022978863145,0.8545249331140157,0.8557746889533533,0.8526590627335282\ngdb_43212,O=C1C2COC2C1C#N,-0.0041191508794333,0.2050345864905945,0.0817794071647595,1.7477607315759125,-1.04928109381582,-1.5373292376324443,-1.462068886490298,-0.7281393307927129,-0.6831283308843141,-1.435250880084857,-0.6271914617920179,-0.6272197094344831,-0.6272197140894831,-0.6271660828938813,-1.1250973644966915,1.0900090585212463,1.0925034531279465,1.094513526293868,1.0684986827243887\ngdb_82734,OC1C2OC3C2C3C1=O,-0.0039365555385832,0.3649353398393556,0.2765188532392305,0.9315003417415793,-1.407130891791942,0.1842040331426248,-0.7632459963284229,-0.8396739276131238,-1.0661530062871802,-1.063638254664165,-1.153009630668399,-1.153056527999489,-1.1530565326577384,-1.1529437544313104,-1.1216511864105612,0.8713083026745948,0.8704026433110127,0.8715404125374022,0.864804191133527\ngdb_36791,C#CCCOCC1CN1,-0.0007787227550958,-0.5658417363960236,-0.5717788060037113,-0.3507838313379076,0.9805699786630742,-0.3534927102228161,0.8751527918864607,1.0311611020726354,3.5089146883607176,0.614554143515491,0.21146382437284,0.2115056114043767,0.2115056317171087,0.2113811420872854,0.956886510681444,-0.5342516625950907,-0.5327461360656432,-0.5339032618180181,-0.5268103816047649\ngdb_130129,NC1=NNC(=N1)C1CO1,-0.002249167898276,-0.2609285209678548,-0.2089789258943118,-0.2771426642435039,-0.8147856630601356,1.2279682996755406,-0.1198176035574284,-0.6902596564008754,0.2217993182725481,-1.0055146139392463,-0.963185685029698,-0.9632102270422293,-0.9632102067367566,-0.9631511830853182,-0.7346946184535939,1.1784453867527271,1.177064150666305,1.1760415953303216,1.1985558531082237\ngdb_26397,CC(O)CC1=CN=CN1,-0.0029840478272774,-0.2925614044536314,-0.2887969075179174,1.455483055965204,0.3686834640349593,1.4448543810330294,0.662097032690767,-0.0189476491233083,0.807993762629071,0.3656503211350481,-0.1924470077315101,-0.1924403285289408,-0.1924403331812555,-0.1924529281438768,0.3245128318764503,
-0.3872685814140715,-0.3867980368023206,-0.3865483288990408,-0.3878595120809808\ngdb_116534,CCN=C(N)N(C)C=O,-0.0033216502540889,-0.3625641547943112,-0.387398812089651,0.7616094504856343,0.9256101120797104,0.2203517133688727,-0.3413955931209498,-0.4398329201437265,1.5365248617143474,0.6377855676521826,-0.6229998233947082,-0.6229511385183075,-0.6229511431732812,-0.6230617404165291,1.40439735072328,-0.4152807169280025,-0.4136230240245629,-0.4156229142481806,-0.3935616242176444\ngdb_41809,C1OC2CCC=CC1O2,-0.0041033285289232,0.3910245762791218,0.3634561059510479,-1.1120910521159908,0.0401455949032974,0.1073902126618471,0.0037547367760738,-0.0463051917396348,-0.8599892122152833,0.4230526757289438,-0.2869264693091171,-0.2869614446746777,-0.2869614493275749,-0.286874122751754,-0.7359253963414975,-0.4954049165831955,-0.4982694908269527,-0.497233566875187,-0.5004997362468933\ngdb_46116,O=C1CCC(CC#N)N1,-0.0037458141465099,-0.1940322334365567,-0.2245317950408667,1.536311914612744,-0.2676296579635379,-0.5387495713823388,0.1188048467417483,0.3682668002154771,0.345355042516886,-0.3264237215813038,-0.1630471768297029,-0.1630505965780398,-0.1630506012301714,-0.1630567168536427,-0.2586297314123882,0.0774187378391223,0.080096050950319,0.0819303151488949,0.0560560445012846\ngdb_5530,CCc1c(nc[nH]1)O,-0.0036215730491233,0.2442631503822372,0.1106489641545211,0.0173566614836287,-0.9283693873324194,2.1994372057559595,0.4554329462709442,-0.5745162068702602,-0.780762288948545,-0.5010050188776082,0.7881840968823931,0.7881730194685301,0.7881730148222768,0.7881946647075376,-0.6825096360064711,0.7585727645014302,0.757605664289568,0.7570976211910846,0.7585414417116771\ngdb_2337,CCCC1OC1C,-0.0020710932290002,-0.093949831014791,0.0287773466400851,-0.6635137565498126,-0.8331056185879223,-0.7511171927115459,1.7785092108762013,2.104418543174704,-0.3337410415489976,0.6501976998235328,2.5120319452294213,2.5120458894548303,2.512045884819229,2.5120066981247797,-0.3666920299703454,0.0712724825137304,0.066929
1693235408,0.0639735456779808,0.0872196118833533\ngdb_114238,CCC1CC2(CC2O)O1,-0.0034682349641588,-0.3086619738924998,-0.2202511051173137,-0.95533403360714,0.588522930368415,0.7670853767908761,1.2437392552950106,0.8712246990848767,0.913766564761738,0.9627550232667348,-0.315749518705617,-0.3157242921325827,-0.3157242967856577,-0.3157811019502886,1.03565629550729,-0.8995054715197605,-0.8998931683361507,-0.9009021574992848,-0.8885653699024826\ngdb_44004,O=C1CC23CC2(CN3)O1,-0.0034343133478713,0.0404109994397651,0.2484890051470529,-0.0036183293214322,-0.97233728059911,0.6993084763666607,-0.232737155931146,-0.5555763696743411,-0.8450473253996702,-0.6856242148306461,-0.656836798021648,-0.656881832720055,-0.6568818373752383,-0.6567711270673837,-1.0012811089735654,0.5995331637009242,0.5972483285988951,0.5978738778672881,0.6007431110805447\ngdb_24654,C1NC1C1=CC=CC=C1,-0.0030389645918666,-0.1856283675843473,-0.1653231904905718,-0.7299018894405974,1.445896849068886,0.4824223950091718,-0.1837343313161364,-0.4061620984620926,0.2094235855070605,0.0820646572879272,1.1655181722119932,1.165495552168399,1.1654955724870264,1.1655335683565218,-0.7081098160748708,-0.5308848572028892,-0.5292997710764981,-0.5256036109674199,-0.5724500757517305\ngdb_32540,CCC1=CNC=NC1=N,-0.0038546750128557,0.013822961044834,-0.0930352071333441,1.868056551115218,1.0049965860334584,1.53522358159865,-0.5437985643568586,-1.252141493213134,-0.051712883222694,0.0563387852814484,0.3336863116038231,0.3336834844618986,0.3336834798128367,0.333693084061194,-0.0474282458480891,-0.135463528003709,-0.1348157350656932,-0.1339044794991287,-0.1421158617213942\ngdb_4883,CNC1=CNC(=N)O1,-0.0027460549169151,0.0470090759353014,0.0493228328190143,0.9487508014691064,-1.0908063263454717,2.963056950535452,0.7771471426564417,-0.6102914549069959,-0.7014799604891938,-0.8671178375492513,0.3882220688256774,0.3882095796398664,0.3882095749911415,0.3882426994256688,-0.7223868395745553,1.3203494164976983,1.318206572052444,1.3161833677619168,1
.3406669636919628\ngdb_71527,CC12NC3C1NC23C#N,-0.0040009889247506,0.1027044182961867,0.2917613951845044,0.2085492567970499,-0.0844301026856594,-0.3941588504773458,-0.3520483810807344,-0.1641530676253524,-0.7428261748399306,-0.7086753174345819,0.3672650251101062,0.3672444084253343,0.3672444287390287,0.3672970807487269,-0.6049706290685289,0.7681854658774546,0.7663775208810552,0.7658127707316528,0.7821534862734753\ngdb_75150,CC1=CCC(O)(C1)C#C,-0.0036186163751719,-0.2119385922320981,-0.0839353183193092,-1.159072417813308,1.0758337474075712,0.3287947540476178,0.3936467761041931,0.2356879398440448,0.2869143602543031,0.2524384521532654,0.6408115795682496,0.6408368411033719,0.6408368614187568,0.640794095294828,1.0132561379474398,-0.6328224945950347,-0.6287375373929016,-0.6267788323439232,-0.6576255539018795\ngdb_115981,CCC1OCC=C1C#N,-0.0042491782153752,0.0214691530410965,-0.2026446000799284,0.3202198464725923,0.7326399138536784,-0.6743033722307695,-1.3129298550533126,-0.982774919760066,0.3263003031451034,-0.0309067829835151,0.2388378306808943,0.2388499120903503,0.2388499074407023,0.2388125369888323,0.088941944131656,-0.28236270126261,-0.2788197310015722,-0.2768928050676047,-0.3071950002037696\ngdb_122464,CCOCC1CC1C#N,-0.003382977750984,-0.4367972508306936,-0.4437049445809057,0.4676982086128494,0.7985917537537155,-0.9408925138993478,0.3787328729604945,0.8123007611420187,2.100752398284772,0.6344496192962024,0.2096287625361017,0.2096673942586839,0.2096673896088556,0.2095554394516981,0.8340548774686402,-0.7270117597650172,-0.724138950142688,-0.7239487495785955,-0.7352295618129885\ngdb_17222,C#CC12OC3CC1C23,-0.0023275722596756,0.2045105287402312,0.4186943477790708,-0.4676818330520945,-0.7854737342156739,0.505014695150577,0.1081520587819637,-0.1283778195886167,-1.3567839824884476,-1.259166903348922,1.6536953943316484,1.6536497305295537,1.653649725888649,1.6537560431793228,-1.5849159834175668,1.2773806937911898,1.2751140143398545,1.2758335415650586,1.2658213988336935\ngdb_33072,CC
Cc1c[nH]c(n1)C,-0.003082999691016,-0.3809440597498193,-0.379038232196315,0.3624965413351295,1.4654381349651926,1.5171497414855253,0.5853969593803174,-0.1283778195886167,1.7303468810420353,1.0352142258457309,0.7042179048408967,0.7042551703731932,0.7042551657264211,0.7041519851258521,0.9157785292254552,-1.1651422215349403,-1.1625212551381523,-1.1616779098314836,-1.1792135095448697\ngdb_110461,OCC1C2C3C2C1C3O,-0.0034753807200825,-0.1605177932085123,-0.0896216079192529,-1.3887518342612488,-0.1015287278449281,-0.1908281492046998,1.2820892919502354,1.355242760758358,0.0299053996477965,0.3297963897054577,-0.2848900856148396,-0.2848986894007451,-0.2848986940536296,-0.2848712391688285,0.0894342552868172,-0.2814974008114506,-0.2834980545172261,-0.283976928694747,-0.2718538734053553\ngdb_21921,C(c1c(nno1)O)O,-0.0034266038896242,0.3302528190833932,0.1534102270369214,-0.1055528640749991,-2.7127330557389566,-0.3851219304207836,-1.1872269571278533,-0.9932970515355763,-1.1948753541059027,-1.976425853473486,-1.0038603637133283,-1.0038930660851002,-1.0038930707424276,-1.0038208004924858,-1.442391903998302,2.449338380595809,2.447000267015152,2.44433199093729,2.477595743198966\ngdb_73081,OC12CNC1C=CC2=O,-0.0040626922231947,0.3220636274903689,0.2228505019588645,-0.1947455819843705,-0.5839542234100086,0.2113147933123104,-1.481243904817911,-1.5614921674131417,-0.8663386472887441,-0.692957290496044,-0.6575315269264437,-0.6575606143524195,-0.6575606190076071,-0.6574848432756543,-0.6453401437917752,0.5265568701976596,0.5265744546953522,0.5276984717946965,0.5192664542967009\ngdb_54084,CC(=O)N(CC=O)C=O,-0.0043196686382731,-0.0136490301820351,-0.2383231009566207,1.0926353178327948,-0.7769244216360404,-0.8731156134751326,-1.2554048000704754,-0.8333606485478176,0.3445929530288137,-0.8176496303966062,-1.5860887922601954,-1.586072754144176,-1.5860727588051011,-1.5861120301097684,0.2240813562234984,0.5779216294025751,0.5827169359472529,0.5834449761553187,0.5669179881082366\ngdb_52712,CC#CC(C)OC1CC1,
-0.0031942701084698,-0.4468679991659858,-0.4571767960247527,-1.1676976476770713,1.74145879825053,0.1932409531991871,0.7494498939610015,0.6523643581542599,2.380493930694004,0.919598069760539,0.6116453968482864,0.6117058709353875,0.6117058662880436,0.6115230689089171,1.4989210925142948,-1.0729667465062551,-1.0686896798345913,-1.0685081352642325,-1.0872502097506898\ngdb_15318,O=CCC1NC1C=O,-0.0024163001106849,-0.297290552104499,-0.2523699704499003,-0.203762867751032,-1.352171025208578,-0.6562295321176449,-1.2511436848865614,-0.9280598345274118,0.299970325803938,-0.9583905867124248,0.2935700420898335,0.293567245356193,0.2935672407068832,0.2935455738242906,-0.840295361235743,1.1940863479380766,1.194114040810731,1.19296606991932,1.196526915656136\ngdb_38890,C1C2OC3CNC3=NC12,-0.0034038678959118,0.1319443151869316,0.1234819066545541,-0.844120842484979,-0.169923228482003,1.0020452982614894,0.197635477644155,-0.271478811735559,-0.7338201501438858,-0.2915315049930772,-0.1603513361196184,-0.1603954800163755,-0.1603954846684906,-0.1602870226872713,-1.0140811990077652,0.3605974879104582,0.3565433856030268,0.3564278471989499,0.3722397298011993\ngdb_56864,CC(C#C)N1CC(=O)C1,-0.0038437159765087,-0.177565454967314,-0.1502631939578161,-0.0341986897475022,0.3821180980886703,0.0938348325770043,-0.7185042868973273,-0.7533924470539382,0.2797417112544686,-0.1163190823321281,0.2406789583838923,0.2406995620834839,0.2406995574338473,0.2406553386592059,0.629499592499023,-0.0889654283055018,-0.0862365434967614,-0.0856679204044386,-0.0968238225877411\ngdb_68379,OC12CC(C1)NC(=O)C2,-0.0037936293671223,0.0729594055014774,0.0676595195064727,0.1456242843818672,-0.4410585702932632,-0.032682048214864,0.3425133938972266,0.3535358157297629,-0.5302929283346615,0.0184712634027524,-0.6883100069131914,-0.6883332215927326,-0.6883332012855613,-0.6882638297952255,-0.1298903643376466,-0.0829476569860755,-0.0843028364437964,-0.0837452711868712,-0.0825044255417559\ngdb_43795,O=C1CC2(CC2)C11CO1,-0.0040598516056039,0.
1498443600336736,0.0372930901340616,0.4015714463239969,-0.1625952462708876,-0.1727543090915752,-1.1062657686334898,-1.0122368887314952,-0.4307937812700744,-0.4290266737192935,-0.2557683359892868,-0.2557799508817703,-0.2557799555344749,-0.2557504596302037,-0.2396757519386695,0.1539794729086686,0.1551805443012849,0.1564852322580111,0.1406188519315992\ngdb_21018,CN1C=NC(=N)N=N1,-0.0023408579497302,0.2009557755660811,0.2144534129830249,1.86766449521232,-1.0614943975010114,-0.3128265699682876,-1.7326497006688288,-1.5678054464784474,-1.1811799053976568,-1.5662544736318684,0.5146435334032704,0.5146101174859569,0.514610112838013,0.51466944299456,-1.7175938397336004,2.160210034392944,2.155795420740485,2.152742236379924,2.1951003985712183\ngdb_129309,N=C1ON=NC(=C1)C#N,-0.0033232584636588,-0.105561182857175,-0.1739302066908074,-0.4327235150436596,-0.5167810531414534,-2.19250594173319,-3.098337117113225,-2.0413013763764183,-0.2656105272776681,-2.508073839214856,-0.9021388818823076,-0.9021738990854808,-0.90217390374218,-0.9020973195462496,-1.37445296458601,2.343866121967643,2.345834120066729,2.346320753373856,2.344549309631258\ngdb_125528,CC1=NN2C3CC3C2=N1,-0.0026245936456922,-0.0549927669215292,0.0673491923051917,0.0874693254519476,-0.034355557576373,-0.2269758294309477,-0.4884040669659782,-0.3767001294906637,-0.5083531869655867,-0.6694854376490111,0.3658066960041777,0.3657726165363889,0.3657726118875252,0.3658496287323309,-1.135189743177503,0.6149984426738806,0.6131364350707843,0.6136499244001558,0.6169147684409363\ngdb_91578,CC1CC2CC12NC=O,-0.0039089120187608,-0.0905024149702333,-0.0361266747807792,0.5628371077161787,0.6361548147406635,0.0622056123790369,0.3659495274087529,0.332491552178742,0.0529607513790997,0.6711751048007786,0.2095577943971083,0.2095712135573933,0.2095712089075644,0.2095344712922502,0.5061756481310571,-0.7344664542578841,-0.734153161534333,-0.7338923300286436,-0.7376232520659054\ngdb_90413,CC1C2CN(C)C2C1=O,-0.0040775861228191,0.0617332045358665,0.035275963
3257349,0.2797727458235807,0.6532534398999321,1.291226740071474,-0.5714958130522987,-1.163755586298846,-0.1398364506116731,0.6455093399718843,0.2102034971235774,0.2102161709358639,0.210216166286039,0.2101934705891989,0.4121442174952041,-0.6666400094994712,-0.6670010256103174,-0.6672138228685389,-0.6623929869880363\ngdb_7865,NC(=O)OC1CC1O,-0.0031164405023805,-0.084516791508254,0.0461647971824486,0.0989696319369653,-2.054435987107113,-0.2811973497703203,0.7068387421218627,0.8291361719828347,-0.5389107592295586,-0.936030716650719,-0.634224771313763,-0.634229168230682,-0.6342291479231764,-0.6342342996713528,-0.3585688959101806,1.3255412192046545,1.3247770175227345,1.3227100521528246,1.3409063327172095\ngdb_89197,CC1CC2(O)C1CC2=O,-0.003975600214222,0.0101734986386904,0.0195770578491651,0.931173628489164,0.0731215148533168,0.1932409531991871,-0.6481958863627485,-0.7302437571478148,-0.199687641842329,0.2669543355400982,-0.2864132670686767,-0.2864048547196611,-0.2864048593725549,-0.2864157438089558,0.5300527391563918,-0.4414966984759687,-0.4403180578162396,-0.4396908694200325,-0.448171957548842\ngdb_13473,NC(=O)C1CCC1=O,-0.0036346376794432,0.2978433198952713,0.1369811399102788,-0.6254843339686743,-1.5524692056457263,-0.2902342698268825,-1.027435137731083,-0.8796580283600632,-0.966287470045636,-0.9029417153900492,0.2919952383667598,0.2919774554981336,0.2919774758113628,0.292011703036455,-0.8466954062528429,1.0286645007796256,1.028587157635719,1.0286092402952007,1.021422774425945\ngdb_35496,C#CC1CC2OC=NC12,-0.0036448285332122,0.0308769367524353,0.0908519141669167,-0.5979097354648247,-0.1772512106931185,-0.2992711898834448,0.0250603126956432,0.1641374437705745,-0.5924182346324423,-0.7014624561243544,0.2703571451484917,0.2703317802353546,0.2703317755859012,0.270379452023863,-0.7846642007024897,0.4049611798287219,0.4059048968256225,0.4078800048508058,0.3845244901330443\ngdb_28451,Cc1cnc(c(n1)N)C,-0.0037180379908467,-0.0065458377825344,-0.1382608330847411,-0.5295613230595484,1.25
04839901058162,1.187302159421011,-0.8015960329836477,-1.342631826482524,0.0892093305565765,-0.0037383387149901,0.3329685423922052,0.3329783174152441,0.3329783127661778,0.3329671364265782,0.3668515912203434,-0.2108600406484767,-0.2082368244997147,-0.2068077247971809,-0.2249888378196294\ngdb_274,CCCC(C)C,-0.0017782277659957,0.1822728010681942,0.3044300468132712,-1.70951890548257,-0.945468012491688,-2.901904166173312,1.593150700375948,2.925144821664519,-1.0109479872732028,1.190921869380273,4.359094262121349,4.359115054572689,4.359115049948502,4.359070870808337,-0.3962306992800378,-0.1130391357057861,-0.1197801231010247,-0.1238602502980941,-0.0920763369199015\ngdb_85898,CC1(C)CC(O)C2OC12,-0.0039514383852209,-0.0257528700307843,0.0877577694247323,-0.5297573510109975,0.1952545517052358,-0.5161572712409337,1.3779643835882975,1.6014606443053023,-0.1693609841230533,0.9996307767152744,-0.3165766732092026,-0.316565430183633,-0.3165654348367132,-0.3165662846353453,0.8995322611051259,-0.9863921253064748,-0.9874713845414228,-0.9878626783518166,-0.9782005209634714\ngdb_55092,CC12CC1C(O2)C(N)=O,-0.0036812315450216,-0.092712297050078,-0.0525648891780479,0.265920103921173,-0.3555654444969195,0.2836101537648076,0.4021690064720208,0.2651499088154743,-0.0830554219574953,-0.0573839947098092,-0.6875136560267409,-0.6875091329626491,-0.6875091376180218,-0.6875105985912315,0.3954056382197124,0.0007032975374,0.0015002158943054,0.001450025417585,0.0034831975159342\ngdb_127380,CN1CC1C1=CNN=N1,-0.0032643681503386,-0.1720912613580988,-0.0521632892705075,0.6554929861011518,0.0548015593255282,0.861973037384777,-0.366962284224433,-0.7639145788294486,-0.0875614726547894,-0.3208037004770849,-0.0651227525401814,-0.0651391918400872,-0.0651391964916137,-0.0651080038500004,-0.5840474049741634,0.5474237216830776,0.5445600095428258,0.5431183765933043,0.571391909174886\ngdb_108654,CCC1=C(CCC1=O)N,-0.0043606199541235,0.1661596037317267,-0.1405426507412192,1.808268025923222,1.2590333026854497,1.300263660128036
4,-0.3797456297761746,-0.980670493404964,0.2350425891185009,0.683226593906617,0.2079875088707134,0.2080165460057739,0.2080165413559354,0.2079452345786325,0.9167631515357796,-0.8994136972290686,-0.896023155161186,-0.894620643142831,-0.9190478754146466\ngdb_39166,C1CC23OC2CCC3=C1,-0.0037083445214807,0.1783960365052667,0.1152947449031104,-0.5622979909515593,0.7717224856462934,0.5140516152071393,0.1571548833969732,-0.084184866131473,-0.5631156793443703,0.3960645529931755,0.6401072900594318,0.6400795772169573,0.6400795725697886,0.6401427343988217,-0.6155553189045018,-0.7068030610465779,-0.707582883035981,-0.7050706569255727,-0.7319838318158988\ngdb_119703,CC1N2CCC12CCO,-0.0035208306015338,-0.304475825838394,-0.2618440773595975,-0.6424734230942687,0.846223638125964,0.7625669167625955,1.4440116689389624,1.0711452028195754,0.9999205669165356,1.3593722345628834,0.1808066866736931,0.1808384459710306,0.1808384413210241,0.1807665309605333,1.1131953024452312,-1.131010051920991,-1.1326449613565266,-1.1344511323209714,-1.1098164396422074\ngdb_93931,CC1COC(C1)C(N)=O,-0.0037534130748897,-0.1742064342059501,-0.1444856316516134,0.1951540134480236,-0.0587821649467572,0.4372377947263616,0.5086968860698677,0.2988207304971075,0.3576389098984928,0.693655189217655,-0.7187197920659828,-0.7187043311949575,-0.7187043108879739,-0.7187423725444133,0.6674075514464604,-0.6537234336744016,-0.6533767079060793,-0.6536830177038361,-0.6499771436190089\ngdb_89439,CC1CC2(O)CN3C1C23,-0.0038688173093822,0.1347540224027342,0.2136319586266928,-0.1492017545976805,0.2135745072330226,0.9523422379503974,1.2352170249271823,0.7765255131052824,-0.5824892666633203,0.7034526591640475,0.2105380933019351,0.2105261309067776,0.2105261262569546,0.2105743172662259,0.1391576819581319,-0.6314930784473446,-0.6347283905606133,-0.6351688089966883,-0.6189161629620459\ngdb_10667,CC(=O)C1(CO1)C#N,-0.0037833721954329,0.3740021703155183,0.205773378617782,1.5123311618854625,-1.9445162539403853,-1.6864384185657166,-1.361932679668322,-
0.5618896487396479,-1.0981218024249366,-1.678264199061947,0.3231588883797728,0.323150137511244,0.323150132862117,0.3231659946480559,-0.9498345932591848,1.6786283793610906,1.681119721541245,1.6814144584579032,1.6660378601517862\ngdb_125534,Cc1nc2n(n1)cco2,-0.0024159519416028,-0.0274323804114664,-0.025283477277195,0.062377747666454,-0.7024232691563699,0.0125025520679462,-0.3882678601440024,-0.389326687621276,-0.5394987691769688,-1.3548875836534024,-0.5329761025396133,-0.5330213091221971,-0.5330213137766151,-0.5329175011700356,-1.4965461310660706,1.1704033368021032,1.1703741480395131,1.171834993686517,1.1614337067762734\ngdb_133753,CCN(C)CC(F)(F)F,-0.0039652877775989,-0.2672866674090079,-0.2393544825373488,-0.2284623896336273,-1.1579794966140282,0.7173823164797852,1.4397505537550486,1.087980613660393,0.7349918808420557,0.169039743253757,-3.4738439355097053,-3.473823984465888,-3.47382396417593,-3.473866588663185,0.9623019333882206,0.1567195910044241,0.155658772927287,0.1520850753234368,0.1963832859202342\ngdb_113000,CC1C(CO)OC1C=O,-0.0042889689676186,0.0146248325423775,0.0318806186528954,0.9669814009538787,-0.455714534715494,-0.4303065307035937,-1.0551323864265232,-0.8417783539682256,-0.0876233119988027,0.265842352754772,-1.2131856699774068,-1.213167718927807,-1.2131677235864275,-1.213204746960192,0.5738684319657696,-0.2026449306076218,-0.2020432439792622,-0.203096624696696,-0.1906764279030701\ngdb_87733,CC1OC11CC(O)C=C1,-0.0036826794862837,-0.153692414556192,-0.1252727269840674,-0.604182629911198,0.3821180980886703,-0.3083081099400059,-0.3094372292415957,-0.16204864127025,0.2062415401867908,0.2966172276784103,-0.2864838358090696,-0.2864787937897949,-0.2864787984426892,-0.2864901807749968,0.4084518838314925,-0.4489094390075674,-0.4480164990678418,-0.4473350130206142,-0.4566695579455401\ngdb_68206,CC12CC(O1)C(C2)C=O,-0.0034251172629084,-0.1210808690064842,0.0394014896486474,-0.450758086576983,0.0096123356903177,-0.272160429713758,-0.7802904570640783,-0.6439622765886291,-
0.2164192951057724,0.3086987703730413,-0.2865518084585526,-0.2865599719991371,-0.2865599766520318,-0.2865356866734187,-0.0420128231413126,-0.4560494787909213,-0.4564686702182929,-0.4557275704105839,-0.461864435719182\ngdb_102890,CCC(OCC#N)C#C,-0.0042925667148006,-0.2332797391745981,-0.3676747802670539,1.1535346680830032,0.5579896711554352,-1.980138320403983,-0.2455205014828876,0.6797219007705864,1.2573888544424476,-0.1277995532509072,0.2403651309328097,0.2404111947174546,0.2404111900678162,0.2402934881361556,1.0329485841539008,-0.1219307533725117,-0.1162609841901178,-0.1154805965941477,-0.1381320772297954\ngdb_18015,CCN1CC2OCC12,-0.0025837031212688,-0.0352364211277178,0.0921479865957963,-0.4147542861608193,-0.3457948015487662,1.268634439930069,1.367311595628513,0.7617945286195682,-0.5058467275423407,0.5163089617524299,1.1616459778085988,1.1616325477858007,1.1616325431418553,1.1616630707812077,-0.6736480352135631,0.0366997963060462,0.0305785956101344,0.0278819358690226,0.0650038866354923\ngdb_112978,COC1CN(C1)C1CC1,-0.0027414181889801,-0.3894300069364229,-0.3426934405639311,-0.7450613843526662,1.2236147219983924,1.354485180467409,1.5761062396402925,0.9259397843175308,1.5118985046828206,1.3435941004467602,0.1810164957515141,0.1810419406704667,0.1810419360204601,0.1809766369106249,0.7936853627453957,-1.1089711116418026,-1.1114573539693198,-1.1134129626275413,-1.085831093386275\ngdb_20378,C1CC1n2nnnn2,-0.0018171176998202,0.0862060700829464,0.2375910440197132,1.5999556561832404,-1.8626871192496004,-2.888348786088468,-1.1914880723117671,0.1683462964807782,-1.1756318796263805,-1.555645556788075,0.5155202882206736,0.515468704358757,0.5154686997108183,0.5155636351466536,-2.249536042885641,2.252306845744668,2.2451903859108424,2.2415066925386022,2.297179889313309\ngdb_62694,CC1(C)C2C3CC=C1C23,-0.0037201159523527,0.1332513025884398,0.2248402469553133,-1.2038974760446843,1.7023762264579152,1.557815881740055,-0.8079877057595187,-1.5257169193764055,-0.5341001586993203,1.0081058887
54792,1.5397247318305864,1.539721156151881,1.5397211515102724,1.5397496390870942,0.3865440374268045,-1.1745346646138874,-1.1765640228901415,-1.1756216329489515,-1.1806354755405373\ngdb_128021,CN=CNc1c[nH]nc1,-0.0027793520394521,-0.3178361414982549,-0.3303168615951937,1.0121331724376703,0.8352316648092909,2.0412911047661235,0.2018965928280688,-0.7512880206988357,0.951128252131454,-0.3205933253555363,-0.0663457110577972,-0.0663429858029856,-0.0663429654919707,-0.0663533877107595,-0.0781976930456851,0.4189606922798662,0.4192225244327496,0.4186674863668191,0.4292209563352894\ngdb_88351,CC1CN1C(C#C)C#N,-0.0042339306201753,0.0041563054327133,-0.1705166074767161,1.3176100634459562,0.4676112238850139,-0.6833402922873305,-0.1965176768678781,0.1241533430236345,0.2047245102677591,-0.4542716883050908,0.7673421298888627,0.7673702046726837,0.7673702000263016,0.7673032137758367,0.6066071237840107,0.218496798972042,0.2226809554203236,0.2235095582035562,0.2062116641120521\ngdb_78049,CC1(C)C2C1C(O)C2O,-0.0037869975750818,-0.1602336655125322,-0.0329868936854652,-0.8314443682912661,0.4529552594627848,-0.1185327887522039,1.0754252055304123,1.1174425826318222,0.1874155706425723,0.981718837794875,-0.3157734576592901,-0.3157528742511861,-0.3157528789042613,-0.3157668985184721,1.2175652673394768,-0.9020200870732812,-0.9028691019055664,-0.9038571015987666,-0.8869439297195636\ngdb_48873,O=CC12CC3C(C=C1)N23,-0.0035174041756462,0.1190322898918391,0.1969381806518985,0.2029951315059903,-0.3384668193376507,0.5999023557444779,-0.9826934282999874,-1.2479326405029296,-0.8509599530836267,-0.7068721021070249,0.2704100155383575,0.2703728935535562,0.270372888904103,0.2704557611469985,-1.068727737230696,0.4105148354516174,0.4101855628420665,0.4121304790270918,0.3932358128732588\ngdb_24811,C#Cc1nc(=O)nco1,-0.0032653131807044,-0.070588220456433,-0.1394291237248578,2.0759768649523016,-0.8966147977509215,-1.5102184774627565,-2.131063970364776,-1.3994513380702809,-0.3739752200660616,-2.4980359405581223,-0.9985
856561585564,-0.9986220491262466,-0.9986220288209928,-0.9985584914095988,-1.4699613286873476,2.0290776842036164,2.033722365816184,2.036412926136907,1.9990286705819504\ngdb_110645,COC1C2C3NC3C12C,-0.0040658202184405,0.1255735408481795,0.0759470812348013,-0.3030183538347937,0.5518830193128379,1.973514204341908,1.2714365039904505,0.3388048312440476,-0.2565423057847542,0.6207451828067693,0.2123166152741576,0.2123294753715197,0.2123294707217078,0.2123085338251722,0.5497451853628547,-0.4446720889190473,-0.4469664753455327,-0.4487311918484037,-0.4209408812958199\ngdb_60439,CC(O)CCOC(C)=O,-0.0029149556072015,-0.5063264550114067,-0.5175445639445386,0.0902137167722359,0.2245664805496957,-0.9770401941255972,0.2572910902189492,0.7091838697420159,2.9501894235960577,0.9113633864313624,-1.2446905312081142,-1.2446276200296942,-1.2446276246885093,-1.2448013668877749,1.6980609547771373,-0.8884506027255551,-0.8844806914216433,-0.8855983858748924,-0.8857869794308735\ngdb_105178,CCC1(NC1C)C#CC,-0.0042318858176295,-0.239442153202965,-0.3052898855389414,-0.6863183415683992,2.234876267132282,0.7173823164797852,0.7558415667368723,0.410355327317519,1.415416259719231,1.295748787088916,1.107478466097274,1.1075525583209835,1.107552553676704,1.107350819974108,1.8991700616606249,-1.380431595905961,-1.3759957562075391,-1.376085556805258,-1.390681795193731\ngdb_44769,O=C1OC2CC1(C2)C#N,-0.003891453826214,0.1938020715761838,0.1283467418981656,1.961627226606954,-1.1152329337158555,-2.1428028814220994,-0.5118402004775046,0.4924279551665009,-0.8339123114958602,-1.3788102403323248,-0.6282184154459756,-0.6282639178216348,-0.6282639224766412,-0.6281577270632202,-1.3318680496645368,0.9821349356100496,0.9837818137024136,0.9865587082803872,0.9552942315703956\ngdb_106728,CC12OC1(CO)CC=C2,-0.0040184636967775,0.0822851078784219,-0.0063991543521819,-0.5220469182539969,0.0731215148533168,-0.1456435489218895,-0.1837343313161364,-0.1136468351029019,-0.2179113033249149,0.2889235089475005,-0.2864496373038209,-0.286444
1707343336,-0.2864441753872276,-0.2864447996870489,0.4680215336060394,-0.4453171310739665,-0.4444115908924883,-0.4437555305175726,-0.4514889283277089\ngdb_109145,C1C(=O)COC(=N1)CO,-0.0030349026192418,-0.2408943614268629,-0.2561304059477762,-0.7476097477215055,-1.0028705398120898,-0.5071203511843702,-1.16166026602437,-0.9112244236865946,0.3644283694371583,-0.7515617886416442,-1.5851113141079898,-1.5851167634051846,-1.585116768066104,-1.5851196870017803,-0.3039223576872498,0.6805987053620755,0.6822534668712001,0.68227946888708,0.6802022289373119\ngdb_101652,CC(CO)CC(C)C=O,-0.0036243749812605,-0.4618825694113305,-0.4303061113020661,0.0742047674038871,0.9903406216112276,-0.2495681295723528,-0.6311514256270929,-0.5071745635069931,2.2148477752536366,1.6134152206268626,-0.3468201837317309,-0.3467464250126851,-0.3467464047034029,-0.3469232370218278,2.1431102390431658,-1.5397019434938597,-1.5367507493965928,-1.5381429495799128,-1.5317926852076915\ngdb_79726,CN1C2C3OC3C2C1=O,-0.0039087849094133,0.2424573610255641,0.2863397964327123,1.0785213053284552,-0.8013510290064245,-0.0552743483562691,0.2615522054028631,0.2840897460113934,-0.9024360239083152,-0.7287210611592565,-0.6565832298345575,-0.6566115632020493,-0.656611567857231,-0.6565423993947346,-0.855803162623332,0.626168684861574,0.6253884445434713,0.6258155195835259,0.6268542822511639\ngdb_31067,CNc1c(oc(n1)O)O,-0.0041002613251044,0.0144101582831926,-0.1940010747971892,-0.1971632600522434,-1.2190460150399878,2.68743088881031,0.5385246923572649,-0.7197216253723043,0.0902148454392899,-1.0890335371939253,-1.985851869888568,-1.985854940904563,-1.9858549455679584,-1.9858326966232456,0.0153414264350064,1.1605965983552138,1.1621896809564642,1.1612694555701155,1.1754481932781944\ngdb_35094,N#CC1C=CC2OCC12,-0.0037196130414563,0.0552866628115196,0.0429246161102497,0.6513763991207193,-0.3018269082820754,-0.8731156134751326,-0.594931946563825,-0.180988478466169,-0.5701163935206766,-0.6652478816292523,0.2694885780024186,0.269458515383745
1,0.2694585107342863,0.2695196326760076,-1.0500199133345576,0.3137244246834157,0.3149816793073676,0.3175980752157124,0.2863689416257873\ngdb_69423,OC12CC1C(CC#C)C2,-0.0029526905039123,-0.3235060675202564,-0.2824717089741599,-0.9724538080337004,0.7399678960647939,-0.3083081099400059,0.9944640170360488,1.1258602880522297,0.7930025473193895,0.2684870685685215,0.6420851368961856,0.6421016185734741,0.6421016139263166,0.6420690093134761,0.7378080466345603,-0.499044422723223,-0.4970505276443709,-0.4960232011261414,-0.5120834872900643\ngdb_128969,c1c2cncnc2n[nH]1,-0.0031238073180389,0.2001791598637356,0.0855124608507577,1.5408205574960747,-0.3262535156524596,-0.2269758294309477,-1.6112079179272836,-1.4857328186294665,-0.987611677597422,-1.6055645677726094,-0.0073995698171047,-0.007469912342439,-0.0074699169936091,-0.0073138403939326,-2.2291051299464364,1.3637219460814152,1.362757207260202,1.365299958243439,1.3452634189029429\ngdb_81933,OC1C2CN2C(=O)C1O,-0.004122323086626,0.3584067167806145,0.1968377806750137,-0.4427862832180502,-1.3753763022104433,-1.0312617144649685,-0.7163737293053704,-0.2272858582784147,-0.8642850806335041,-0.6964735603847803,-1.584516759364918,-1.5845508873819398,-1.5845508920428557,-1.5844559948310508,-0.5953705615428795,0.743052420955145,0.7411717534083961,0.7407822005824158,0.7559682246902794\ngdb_85093,CC1CC(=O)C1(O)C#C,-0.0043072892931307,0.0819252127968472,0.1563674627197172,-0.5347233924477097,0.0169403179014331,0.026057932152789,-1.027435137731083,-1.0269678732172098,-0.3431102383463001,-0.5058737002620117,-0.2558788196274113,-0.2558470002883796,-0.2558470049410846,-0.2559190036928173,0.9123323511393256,0.1423739583728161,0.1481994459890999,0.1495533721259511,0.1213781417122536\ngdb_94377,CC1OCC11OC2CC12,-0.0037238905473225,0.0156413782991062,0.106998055904156,-0.1914784494602177,-0.077102120474544,-0.091422028582517,1.32043932860546,1.3468250553379495,-0.4058596573299033,0.2935517616215629,-0.2850803740860134,-0.2850902769643079,-0.285090281617
1936,-0.2850658436581845,-0.0917362498126273,-0.3014858412328126,-0.303445906172066,-0.3037840867414943,-0.2940695986532163\ngdb_25072,c1c(oc(=O)o1)CC#N,-0.002006858796594,-0.3095585546220368,-0.295094724249797,-0.5091090734583518,-1.841924502984774,-0.9951140342387216,-0.4543151454946673,0.014723172558326,0.3278405674997469,-2.13354601568128,-1.526108657965194,-1.5261430908721745,-1.5261430955327293,-1.526076195674011,-1.3535297404916449,1.6312964446826757,1.636262396235962,1.639314892436199,1.5966949332424747\ngdb_110246,CCC1C2C(O)C1C2=O,-0.0040577736440979,-0.0505477469666415,-0.1023450231717749,0.1058959528881693,0.1634999621237362,0.12094559274669,-0.1475148522528686,-0.2041371683722918,0.1241814455868181,0.2942129405750011,-0.2855257035266222,-0.2855192084437469,-0.2855192130966353,-0.285525645440373,0.4325751304344075,-0.3482645080474202,-0.3481057022622607,-0.348128894131364,-0.3465598063317773\ngdb_20737,Cc1c(c[nH]n1)C=O,-0.0035420081241167,0.2907780111885679,0.0820349707422851,-0.148744356044299,-1.094470317451029,-0.2857158097986008,-1.13396301732893,-0.9869837724702704,-0.9293136190242112,-1.205551300942896,0.8182517721706364,0.8182262310092601,0.8182262263631923,0.818282301055582,-1.398330055611344,1.2934123512408502,1.2935803968819504,1.2941696794050757,1.2813889343684637\ngdb_66324,CC1(CCOC=N1)C#C,-0.0040704569463755,0.0428166139323961,0.1861862558545736,-0.4261239073448709,0.3735687855090368,-0.389640390449064,0.1166742891497914,0.2946118777869038,-0.4359679113544264,-0.0334312844420948,0.2399876742920337,0.2399900515533018,0.2399900469036609,0.2399999339038777,0.2905433621703048,-0.161579868893663,-0.1601098708710039,-0.1590202139761241,-0.1716437407647343\ngdb_117734,CCNC(C)(C=O)C#N,-0.0042859017637999,-0.1594128521685898,-0.1849924586894135,0.4912215627867495,0.3063956152404817,-0.6246003119196788,-1.300146509501571,-0.9932970515355763,0.6476132452040809,0.2279748308760759,-0.191538525687704,-0.1914846123779892,-0.1914846170302965,-0.1916153999464
831,1.3199659876130774,-0.2918390522636764,-0.2872900956331883,-0.2877422242765326,-0.2922486842821383\ngdb_124067,c1cn[nH]c1C2COC2,-0.0029648708952935,-0.1759048864330308,-0.0837984092599207,0.0366980860266132,-0.1442752907430991,0.1616117330012197,0.1145437315578345,0.0378718624644488,-0.0938880307454939,-0.2771057823726215,-0.161203577847859,-0.1612320748835067,-0.1612320795356285,-0.1611810401047034,-1.004973442637277,0.2710756009311597,0.2694381998861026,0.2699370205176896,0.2701801864778789\ngdb_112038,CC1(O)CC=CC1CO,-0.0041742279123302,-0.0208279899671305,-0.0543629614913525,-0.3406557205130341,0.5225710904683779,0.229388633425435,0.327599490753528,0.2188525290032287,0.1152898058045552,0.9882404665628732,-0.3166839366961027,-0.3166596638061024,-0.3166596684591832,-0.3166903712075096,1.2439039141406203,-0.997659386029601,-0.9972828685803268,-0.99760496129989,-0.9923660379213444\ngdb_51166,O=CNC12CC1CC=C2,-0.0032056546848061,-0.1720660055629005,-0.0627874322790699,0.5952470623557743,0.3967740625109011,0.1570932729729392,-0.0047674935917538,-0.0778715870661662,-0.0431111377062975,0.0065700422408764,0.2389993374903533,0.2389880548365063,0.2389880501868591,0.2390040961026485,-0.2576451091020657,-0.2653975681743107,-0.2644364852612461,-0.2626110053763083,-0.2853269299687657\ngdb_2400,OC1CC11CCO1,-0.0024928807292733,0.4469156510527934,0.6779088335549882,-0.8154354189229174,-2.16435572027384,0.7038269363949424,1.1968669882719578,0.8564937145991625,-1.686231479181594,-0.7281800965609894,1.64594773544881,1.6459158835239702,1.6459158788830175,1.6459928567266122,-1.554638847375132,1.4378755726227723,1.4319652064627175,1.4291370737410716,1.4588924651136237\ngdb_13331,O=CCC1CN=CO1,-0.002016718060761,-0.1565589473111905,-0.0824840822897891,-1.0451148353708586,-1.609871732966128,-0.5974895517499906,-0.8996016822136669,-0.6081870285518934,-0.3963521262774964,-0.8909202798730029,0.2926746902750508,0.2926532416246174,0.2926532369753019,0.2926708770680654,-1.4123609235334482,1.1
00036055264377,1.098949143304912,1.0984723771660123,1.0966729869221314\ngdb_70398,CC12OC1CC2(O)C#C,-0.0040559609542735,0.070661128138439,0.1937892722859588,0.0354565756674351,-0.116184692267159,-0.23601274948751,0.2679438781787338,0.3745800792807827,-0.5337355567866402,-0.4923195317165418,-0.2548053110739727,-0.2547916086792764,-0.2547916133319763,-0.2547971322760471,0.6868538420753412,0.2551383398943522,0.2580854681557901,0.2586643604071768,0.2494491191123413\ngdb_98183,CCC(=O)C(C)(C)CC,-0.0038298831636107,-0.2881479542427417,-0.2356670652044801,-0.2173541390515076,1.6950482442467998,0.2203517133688727,-0.3690928418163899,-0.4671904627600536,1.242735432081279,2.305278888221667,0.5496051396046349,0.5496897629304542,0.549689758282727,0.5495007010621862,2.735360558702493,-2.342742716130985,-2.339473092553006,-2.340083809332896,-2.343806509255796\ngdb_55647,CC(O)(CC(N)=O)C#N,-0.0041316960193767,-0.1053401946491904,-0.012751734707817,1.895892520221,-0.7561618053712139,-1.6548091983677504,-0.2583038470346293,0.5155766450726237,-0.0109435977808451,-0.4314309608227026,-1.0897507084420976,-1.0897254678009671,-1.089725472458825,-1.089771058379986,0.939163309095628,0.3235049419036778,0.3275325816713063,0.3276216585191079,0.3220748212250308\ngdb_111776,CCC1C2CCC1C2=O,-0.0038344922590789,-0.1046267184348405,-0.0050026819464173,0.2595818668243165,0.8401169862833666,0.4914593150657341,-0.266826077402457,-0.4924435790212783,0.0650598156343777,1.0663196902460883,0.6101352208114086,0.6101362008918197,0.610136196244466,0.610135750578369,0.2437738024299594,-1.2315999183069732,-1.232121713702223,-1.230787471438498,-1.245624165894914\ngdb_1922,OC12CC(C1)C=C2,-0.0028134947154743,0.9679616478385712,1.2729429874701073,-0.934293700151596,-1.7747513327162188,0.56375467551823,0.0165380823278155,-0.2462256954743337,-2.1156996466556017,-0.6712886529765683,2.5727180165524715,2.572663046288796,2.57266304165357,2.5727970867555428,-1.9649801952022727,1.198800924335605,1.192055578464026,1.1909195453
159234,1.203135210531686\ngdb_57873,CC(O)C(=O)N(C)C=N,-0.0036747102828483,-0.2443607193178192,-0.234224956445586,-0.5585081172235421,0.1366306940163142,-0.8550417733620105,-0.7994654753916909,-0.389326687621276,0.6415236783566182,0.2826122553010516,-1.119151962201422,-1.119115998553437,-1.1191160032114764,-1.1191895358585873,1.100149056833451,-0.1413318383365347,-0.1394446762771603,-0.1409395691197912,-0.1243826064348507\ngdb_95472,CC1CCC=C2CCC12,-0.004086284823379,0.1482658728337844,0.0693754463841443,-1.5954959803896376,1.9979381756395596,0.6902715563100984,0.4128217944318055,0.0862736686317963,-0.1514415150205561,1.7966519514954398,1.5072225972544897,1.5072256337796974,1.5072256291378876,1.5072318429831753,0.5167603379670317,-1.9650967558966823,-1.966829030152922,-1.965190438542184,-1.9809059691953064\ngdb_58183,CC(C)C(C)C(C)(C)C,-0.0042616183519447,-0.141557004963445,-0.1654692268205864,-1.711217814395129,2.8797387017104144,-2.2105797818463144,1.3587893652606848,2.371680690272669,1.0712227603753233,3.737362447778592,1.4173115567298276,1.417405489823887,1.417405485181522,1.417230137296742,3.639489995156655,-3.538944548303961,-3.5394058947025746,-3.5413085871990435,-3.5196299057345257\ngdb_98940,[NH3+]CC(=O)[N-]C(=N)C#N,-0.0033275801814719,-0.3582832975082121,-0.3206510820023522,2.7077096098224853,-0.3177042030728243,-1.1306678350871524,-1.2106630906393796,-0.6692153928498544,0.977717740245762,-1.0704003121425036,-0.9634438463007048,-0.9634306962749826,-0.9634307009320604,-0.9634851259866276,0.0387262063051791,1.1513273950379088,1.1541091766181988,1.1532459436000184,1.1604334862060397\ngdb_60456,CC(C)CCOC1CC1,-0.0028259956434707,-0.5301616117297315,-0.5296747066063765,-1.1163383243973894,1.9271010142654463,-0.0688297284411119,1.527103415025283,1.5425367063624449,3.505519243685673,2.35023905705542,0.5508671393355423,0.550932822982926,0.5509328432977554,0.5507638329721918,2.050555741872796,-2.21017868701296,-2.2100472725043354,-2.211568258008871,-2.1996094685953618\
ngdb_20968,CN(C)c1cnno1,-0.0027861938382406,0.1374942761817416,0.0994315485552744,1.769323806235321,-1.0822570137658378,1.0291560584311752,-0.4926651821498922,-0.9659395089192494,-0.8831811109374144,-0.9224164409276638,0.3891874152453621,0.3891799488070815,0.3891799441583625,0.3892001205633378,-0.8762340755625353,1.4217521408832543,1.4192401665012673,1.4165043651236346,1.4499645703981865\ngdb_27145,CC1=C(NC=C1)C1CN1,-0.0038834514638184,-0.0003644819077687,-0.1234290183176331,-1.2219320475780078,0.9585860320297296,1.7521096629561388,0.5726136138285757,-0.2504345481845381,0.0516013581693731,0.382390170092535,0.7354261626852078,0.7354339931733573,0.735433988526778,0.7354052264803734,0.1549116389233011,-0.5104926099043186,-0.5093493203526243,-0.5082352495791973,-0.523302485532405\ngdb_14962,CN=C1CNCCO1,-0.003030901438044,0.0763247402116409,0.0657701744869088,-0.1927853024698787,-0.4483865525043786,0.7625669167625955,0.2232021687476382,-0.1346910986539229,-0.5903557316558128,0.215021734106459,0.7594653404456967,0.7594562279135272,0.7594562232670963,0.7594728293645125,-0.6517401888088751,0.3654274377014756,0.3607772683843179,0.3581904907098199,0.3916115230590871\ngdb_26929,Cc1c[nH]c(c1C=O)O,-0.00404776516461,0.142539121272587,-0.1214484005918101,0.6064206555883769,0.0108336660588375,1.6707773824470793,-0.8463377424147435,-1.6162072526457956,-0.1910998369872986,-0.7140248562396674,-0.6593541824441262,-0.659366230409119,-0.6593662350643177,-0.6593227523754923,-0.260352820455454,0.3350999679492128,0.3385760242358894,0.3410260075607767,0.3094538044057931\ngdb_60974,CC(C=O)C1CC1(C)O,-0.0041787983223448,-0.1485086625917563,-0.1813050413565448,0.1396781031879091,0.5750882963147039,-0.1004589486390793,-0.6226291952592652,-0.5682029278049541,0.5887060869649814,0.904601328953024,-0.316542100267766,-0.3164920652521261,-0.3164920699052058,-0.3166030538006631,1.6557221954332442,-0.9827604855341848,-0.9798327218680718,-0.9802778917069448,-0.9823980277985868\ngdb_62821,CC1(O)C2CC1(O2)C#
N,-0.0041075618228424,0.0872541855836727,0.2195920663454136,1.0948569679492188,-0.5815115626729689,-1.668364578452592,-0.5416680067649017,0.2420012189093515,-0.6210091012850777,-0.7694436739732505,-0.6561221740740183,-0.6561362263440216,-0.6561362309992003,-0.6560881391975433,-0.1293980531824853,0.674599288900286,0.6748799091974604,0.6749579174370967,0.6787118717915742\ngdb_17025,CC1C2OC1C2C#N,-0.0030748370603127,0.1196699987205944,0.1923289089858127,-0.196640518848379,-1.1384382107177198,-1.171333975341682,0.0058852943680308,0.5513518931093588,-0.9738883475664416,-0.897471962229793,1.2216177496953808,1.2215874981199375,1.2215874934763626,1.2216593652591483,-1.2690983773814413,1.0891962005216718,1.0867569124104763,1.0863661389383532,1.0902670158655388\ngdb_125602,Cc1c(n[nH]c1C#C)C,-0.0036104205855085,-0.0903698220454427,-0.1848555496300247,-0.8688203643675739,1.3982649646966374,0.7219007765080658,-0.5246235460292461,-0.8523004857437361,0.1882530826831692,-0.4014074256188796,0.7654415417190747,0.7654653375211645,0.7654653328747708,0.7654185007486589,0.4995294475363777,0.0188536300315299,0.0243486281290077,0.0265760828433999,-0.0089440443785586\ngdb_16067,CCC12COC1C2C,-0.0037330479468318,0.1618471767016299,0.2290844277963627,-0.585559974523527,-0.0600034953152752,0.5411623753768249,1.3353532317491588,1.0669363501093718,-0.6271980400263883,0.8344863062998517,1.5620504646284756,1.562059492224267,1.562059487582796,1.5620433356819032,0.0283876720467873,-0.4785997329722315,-0.481762806024018,-0.4832846822833086,-0.4682276623074339\ngdb_345,CCC(=O)C#C,-0.0006070919770861,0.5087607795444465,0.6189557925822189,0.0176180320855609,-2.67853580542042,-1.0854832348043422,-1.4130660618752884,-0.8922845864906761,-1.88062470817248,-1.6485712533348422,3.552825684336771,3.55280729806549,3.552807293436321,3.552845539412816,-1.963257106159207,2.2896589818863795,2.28761758205682,2.286068281853675,2.281139314991006\ngdb_94704,CC12CCC(O)CC1N2,-0.0036372406578192,-0.1396438784771794,-0.10030
96418221965,-0.1695233188979108,0.7692798249092538,0.3378316741041788,1.267175388806537,1.0942938927256982,0.2351773488450497,1.4228754676816793,0.1796784105870324,0.1796864492973049,0.1796864696098403,0.1796907645324549,0.8288856103394456,-1.2495273703814476,-1.2525893790813931,-1.2535469928475216,-1.2326241481190363\ngdb_79607,CC1C2C3NC3C(=O)C12,-0.0041601243012572,0.2438401158126669,0.2759073261072944,0.5243502865816588,0.0609082111681238,0.56375467551823,-0.8101182633514755,-1.0627431212539455,-0.6896178874771578,0.0145642968597126,0.2401297353820845,0.2401100714888867,0.2401100668392465,0.2401606897929817,-0.3903229654180992,-0.1466573692946164,-0.1476135489480774,-0.1466120294955037,-0.1532921154958497\ngdb_103,CC1OC1C,-0.0009187033055919,1.87968322659689,2.1068104318534817,-0.5816394154945437,-3.562778992228312,-0.7827464129095121,1.8594703993705648,2.2012221555094,-2.7070085264695964,-1.0622858431684972,4.473502690452004,4.473462101121769,4.473462096498289,4.473565134077064,-2.744554909400565,2.384860386371961,2.3754687004671484,2.37085840271698,2.410073730661186\ngdb_75963,CN1C(=N)CNCC1=O,-0.0042476860621661,0.3337318048719488,-0.0629152140678327,-0.0846432159204215,-0.0184782627856224,0.3513870541890229,-0.5075790852935906,-0.6650065401396501,-0.2803865559682978,0.0594042513382949,-0.5932313475837697,-0.5932369932806223,-0.5932369979354123,-0.5932203800936241,-0.0996132282952118,0.0881301085758611,0.0870485595076882,0.0863949905131771,0.1011714064977097\ngdb_103333,CCC1(C)C2CC1C2C,-0.0039181799481375,-0.0605869255579359,-0.0509128531947576,-1.7022658712789505,2.192129704234111,-1.5554030777455663,1.7124619255255362,2.4158736437298125,0.3486947245464608,2.420864990718066,1.4789416565722118,1.4789796858239024,1.478979706144467,1.4789063557779127,1.5794139663832072,-2.312252674778809,-2.314633793756121,-2.3154171231295333,-2.302589442813834\ngdb_99802,COC(C)(C#N)C#CC,-0.0040888767487681,-0.2609222070190552,-0.2304006300533286,0.4583542095937724,0.871871575864
8662,-1.275258555992144,-0.198648234459835,0.3977287691869067,0.9548493363579204,-0.1586044817633381,0.2402220963086675,0.2402876300999123,0.2402876254502733,0.2400820841571448,1.4144897294040923,-0.1369555157517324,-0.1291263738574383,-0.1282552457578476,-0.1622656043108801\ngdb_33533,C#CC1(COC1)C1CN1,-0.0042610933350748,0.1638676403174878,0.0054297883790008,-0.5319136584769384,0.2392224449719266,-0.4122326905904692,0.4767385221905136,0.6607820635746673,-0.2639816146148432,-0.042056664425575,0.2419113526934674,0.2419147389687252,0.2419147343190962,0.2419101332296225,0.155157794500883,0.0404887634326743,0.0402861169042558,0.0399623667246343,0.0464214412351202\ngdb_2796,CC1COC1CO,-0.0030584123220256,0.4052435889757225,0.3964694438049735,0.0161804937749336,-1.8846710658829455,-0.2450496695440723,1.1670391819845607,1.2668568538440703,-1.3274095791764062,-0.0042191961356724,1.6153879261963533,1.6153742048244402,1.6153742001832987,1.6154059784869397,-1.0679892704979537,0.8513408392338659,0.8451318991228636,0.8415651499037912,0.8790523473981883\ngdb_83128,CC12C(O)C1OCC2=O,-0.0042527262241169,0.2628703574945294,0.2668074372932595,-0.031715669029146,-0.8477615830101531,-0.2857158097986008,-0.7589848811445091,-0.6166047339723021,-0.6982825446961348,-0.4263819579055432,-1.1836580814479638,-1.183660662996613,-1.1836606676550514,-1.183639891758965,0.0488185849859907,0.2754624120062794,0.2770664662849758,0.2775114842137427,0.2724912374240065\ngdb_36998,C1=C2C3C4C3C2N2C1C42,-0.0035601129163874,0.6680238240644525,0.9409932820762924,-0.5172115621182507,0.0633508719051617,1.4448543810330294,-0.9848239858919444,-1.645669221617225,-1.6725925080245276,-0.634052256462517,1.1992063458974829,1.1991310891907898,1.199131109509625,1.1993006424079544,-1.9307645699185447,0.3842621441879585,0.3796621009737514,0.3818248820312673,0.3704359132180924\ngdb_121462,OCCN=C1OCC1O,-0.0034068522023301,-0.3729632284671803,-0.3891603753204521,0.0348031491626046,-0.307933560124671,-0.5071203511843702,-0.1432
537370689547,0.0946913740522049,1.424051605804051,-0.0602390856451072,-1.6139596504284437,-1.6139412433965672,-1.6139412480576647,-1.614003526338638,0.4308520413913426,0.2738419402522904,0.2742126889406808,0.2722390381043607,0.2947782034531914\ngdb_357,C(C(=O)C=O)O,0.0002021634996501,0.4489361146686514,0.5905152173118753,1.5873445246400104,-4.144132247643448,-0.5251941912974948,-2.2674196562500195,-1.9950039965641724,-2.008253607706019,-2.3703682953670917,1.7285506302891098,1.72851526070631,1.728515256065868,1.7285944745881157,-2.508245554923028,3.218385956115688,3.2148145195420232,3.2116032270811536,3.2321382041383817\ngdb_59659,CC(O)C1CCC1(C)C,-0.0041855516972394,-0.0861079066057422,-0.0946598613047566,-0.8947613966093472,1.6425310384004754,-0.6110449318348347,1.260783716030666,1.5299101482318311,0.5491099617951076,2.3583835796182187,0.5504334423792009,0.5504912604536945,0.5504912558059721,0.5503893266671881,2.311726809685991,-2.255735444705208,-2.256022197315622,-2.257221499161568,-2.242362486282653\ngdb_10125,OCC(CC=O)C=O,-0.0037305444453365,-0.054468709171166,-0.022253223429392,0.4267283667599734,-1.6526182958643,-0.5251941912974948,-0.8101182633514755,-0.5555763696743411,-0.3488412799796276,-0.6482676039614242,-0.2332255300515141,-0.233209388216053,-0.2332093679060691,-0.2332609853275933,-0.2069370601204267,0.872716382500081,0.8741608965349211,0.8728333618770007,0.8753763230814521\ngdb_52594,CC#CC(C)(C#C)C#N,-0.0042962473593831,-0.194568919084519,-0.1910803481968972,0.9325458241493084,1.059956452616822,-1.3565908365012034,-0.0068980511837108,0.6250068155379321,0.7068717796002038,-0.85446527666756,1.197845919373734,1.197903630731441,1.1979036260877196,1.1977434320616862,1.1385493269360507,0.2413590857402853,0.2518606988119023,0.2549222914602342,0.1926673667668645\ngdb_75733,CN1C(=O)C=C(C)C1=N,-0.003984978673466,0.1199351845701757,-0.1123393845071492,-0.8790138178429306,0.5311204030480131,-0.6426741520328009,-1.7390413734446994,-1.418391175266199,-0.120962582384
8082,-0.3680178884702836,-0.1631169467728921,-0.1631085596168139,-0.1631085642689458,-0.1631185479619208,0.0635879196408373,0.0700899052304789,0.0740610136156613,0.0759378433593772,0.0489975078877708\ngdb_121101,CC1CCC(CCO)C1,-0.0029120210392235,-0.4617941741281366,-0.462461485717156,-0.8352342420192833,1.7377948071449725,-0.9408925138993478,1.456795014490704,1.877140496823677,2.474549785673996,2.4356814099928257,0.5500077333948895,0.550058060378376,0.5500580557306511,0.5499365142714816,1.6606453069848603,-2.300453123172091,-2.3011264341401785,-2.302007612613456,-2.294054797211301\ngdb_105473,CC#CC1(COC1)CO,-0.0041713099238323,-0.1987992647802216,-0.2516945524235827,0.1984864886226592,0.6605814221110459,-0.4122326905904692,0.561960825868791,0.747063544133854,0.8517613626234272,0.26352822641774,-0.2853176168552712,-0.2852783947335602,-0.285278399386447,-0.2853953183445636,0.8562088794509102,-0.3264064942266194,-0.3230324872896145,-0.3232325223171864,-0.3316818815602171\ngdb_54088,NC(=O)C(OC=N)C=O,-0.0040301688097291,-0.1390251114948229,-0.2290680485419453,-0.5992819311249689,-1.1775207825103349,-0.8008202530226366,-1.2426214545187335,-0.8565093384539404,0.2986616908997024,-1.1557825579023244,-1.986218143368503,-1.986204491478188,-1.9862044961415848,-1.986238130963436,0.2376199129904405,1.122122193749575,1.1257949230765254,1.1251313924914714,1.129164482468067\ngdb_3138,OCC12CCC1O2,-0.0021791085418609,0.3842560231659975,0.6140818300679816,-0.5578546907187114,-2.2449635245961064,-0.773709492852951,1.2735670615824075,1.6182960551461198,-1.641268914291072,-0.710388371995761,1.6458734722713546,1.6458380752587416,1.6458380706177884,1.645919543054826,-1.6698396576829315,1.4300747579498596,1.4238639095750811,1.421092915873748,1.4505230981237605\ngdb_28317,c1c(c(c2n1CC2)N)N,-0.0036824086881088,0.0352398753729286,-0.0894208079654828,-0.8973097599781863,0.8962981832352503,3.1212030515252875,0.7664943546966569,-0.6965729354661815,-0.1477415205125678,0.051981014906519,0.3352200521
581862,0.3352219263566703,0.3352219217076179,0.3352267301901789,0.1558962612336254,0.0256449275118408,0.0253648639592631,0.025146354431947,0.0329626328275194\ngdb_105939,CCC12C3C1C1CC3N21,-0.0031843389998891,-0.0388669416874626,0.1361596855539467,-0.7319928542560551,0.8193543700185419,0.9884899181766468,1.616586833887474,1.13638241982774,-0.3735201825809227,0.7180887569060509,1.1385877729380776,1.1385681759757968,1.1385681962942578,1.1386108010941662,-0.4092769448918187,-0.7361760781795688,-0.7398165320563915,-0.7395131756606717,-0.7340013707434286\ngdb_31984,COc1ccc(o1)OC,-0.0036151899492843,-0.1817452890726203,-0.2166458332200783,-0.9995710079841686,-0.2285470861709247,2.2355848859822074,0.498044098110083,-0.5492630906090349,0.4398981057390175,-0.4255104038305572,-1.183781895014412,-1.183770248586716,-1.183770228282606,-1.183813852596102,0.0768803208301985,0.2624566840130972,0.2656565550446182,0.2661846286219283,0.2526321572218665\ngdb_107692,CC12CC3(CO)C1OC23,-0.0035293966662529,-0.0979213048097119,-0.0361084202395273,-0.9114891151330096,0.0560228896940481,0.3784978143587085,1.220303121783484,1.0269522493624317,-0.0802447701541605,0.2746180006822153,-0.2842733392869743,-0.2842730031101009,-0.2842729828004326,-0.2842602419797606,0.2816817613773968,-0.2167126182449915,-0.2183523995675237,-0.219288169919946,-0.2021034492273293\ngdb_73208,CC12CN=COC1CO2,-0.0040020057995302,0.2360550169428141,0.2673185644483105,-0.2277436204783136,-0.4801411420858764,-0.1546804689784518,0.195504920052198,0.2651499088154743,-0.757349484361673,-0.000131908059876,-0.6873877955424613,-0.6874071859125404,-0.6874071905679122,-0.6873592284496907,-0.4393079253566724,0.0139240395825004,0.0121148121362444,0.0119897561619387,0.0207633614337165\ngdb_65766,CC1(CC=O)OCC=C1,-0.0035358847694659,-0.2244402108552194,-0.1175784378464231,0.6897978776047559,0.3405928655590192,0.1525748129446574,-0.6354125408110068,-0.7007817881763858,0.254659601832092,0.252047755498961,-0.2870134882794065,-0.2869969164568
057,-0.2869969211097031,-0.2870420177999073,0.4478367762444162,-0.5045456358945328,-0.5019627673357855,-0.5009007944436021,-0.5196663560536632\ngdb_118698,COCC1(O)C(O)C1O,-0.0040816425689506,-0.2056562131765397,-0.1218865095818538,-0.3966543719770127,-0.7757030912675206,0.2022778732557494,0.8112360641277525,0.7091838697420159,0.4546780664110574,0.2194997188365582,-2.1393797437112503,-2.1393474829539,-2.1393474876182443,-2.1393992279047835,1.3517200571209955,0.0969561731772682,0.0969432137424184,0.0937810603928692,0.1346887692942466\ngdb_111175,CCC1C2CN1C2C#C,-0.0039357376175648,-0.110782818514408,-0.1315614164453212,-0.6957276832379592,1.3994862950651572,0.4824223950091718,0.3233383755696142,0.0925869476971031,0.3291913818409404,0.6612574204992155,1.1394650020413204,1.1394768127593384,1.1394768081152562,1.139449302813249,0.5076525815965417,-0.6440294464984198,-0.6452104343037824,-0.6455769221435977,-0.6382794073257221\ngdb_44358,N=C1CC2OC3C(O1)C23,-0.0035622240368537,0.1940735713745648,0.2783260528231611,0.7516773676122097,-0.7647111179508475,0.1299825128032523,0.3467745090811405,0.2819853196562915,-0.9835896180665666,-0.6654883103395934,-0.656704696934221,-0.656756545686632,-0.6567565253792655,-0.6566377296339414,-1.2858369566569336,0.6134094363908432,0.610293054000967,0.6108291786353227,0.6159715404962148\ngdb_23178,ON=C1CC=CC11CO1,-0.0040655659997456,0.1388138914808488,-0.0650783772061739,-0.4667016932948486,-0.0404622094189686,-0.0597928083845496,-0.5672346978683849,-0.5324276797682183,-0.3097495207889602,-0.7495181446037464,-0.6562048995093418,-0.6562238448910062,-0.6562238495461853,-0.6561727606981748,-0.3743228528753498,0.6659095746730009,0.6657571783427442,0.6658995298482909,0.6690516218445869\ngdb_33605,C#CC12C3C1C1OC3C21,-0.0025022315560505,-0.0809304685901062,0.135292594844485,-1.2110198249473372,0.0780068363273926,0.2067963332840299,0.4447801583111596,0.3430136839542524,-0.7176203867295275,-1.0748782968726032,0.7032791433906201,0.7032324048156309,0.70323
24001688525,0.7033387949706748,-1.211497972227542,0.68183896934207,0.6815543174121201,0.6840240472375164,0.6585592795708366\ngdb_91379,CC1CC2CC(C)CC12,-0.0037769946220873,-0.1633527562195129,-0.1368461061377246,-1.6313690955048352,2.0773246495933075,-1.5599215377738496,1.3950088443239528,2.1065229695298053,0.6368553440628952,2.4694916973845187,1.4770850521734071,1.4771090423270616,1.4771090376850655,1.4770594603240268,1.1277184815224977,-2.5072756637350646,-2.509402797969997,-2.508814988519132,-2.513427959955798\ngdb_91191,CC1CC2C=CCCC12,-0.0037771327844214,-0.0292129139729411,-0.0612449235432906,-1.5764812690990682,1.9112237194746973,0.3784978143587085,0.4000384488800639,0.2188525290032287,0.1333601200121026,1.8012501505807104,1.5071100916609463,1.5071134271219506,1.50711342248014,1.5071138221999945,0.4722061784249114,-1.976914662361455,-1.9785118435979208,-1.976790852277704,-1.99437902575917\ngdb_88570,CC1CC1C1=NCCO1,-0.0035170670595508,-0.1995506246873688,-0.1069725493791127,-1.1576348795026803,0.7717224856462934,0.2971655338496504,0.3062939148339587,0.1662418701256764,0.3469889595196893,0.7118977176147726,0.2092217503988198,0.2092291517484678,0.2092291720611858,0.2091993551724957,0.113311346312152,-0.7697654684196076,-0.7697681988283994,-0.7692535912086464,-0.7758795516360691\ngdb_61655,CC(CC1COC1)C#C,-0.0031867651304773,-0.3953524909104066,-0.3942168832472077,-0.8273931239613166,1.1283509532538971,-0.2540865896006346,0.8368027552312358,0.9469840478685516,1.5665526888463674,0.9692165448571468,0.6118075277180601,0.6118466846742882,0.611846680026945,0.6117457058304895,1.075287343497794,-1.0559360603534735,-1.0540283337515557,-1.0539501966920242,-1.0618343486053314\ngdb_16526,OC12CCN3CC1C23,-0.0034146224520042,0.6170765712008336,0.657463747352944,-0.4060637136465728,-1.0724863708176846,0.9975268382332076,1.2693059463984937,0.7891520712358959,-1.5398495304863722,-0.1235319436423564,1.1915072139447207,1.1914597983329973,1.1914597936892362,1.1915794421983303,-1.335806
5389058296,0.5498544293140621,0.54302655905769,0.5415931609052742,0.5683000592659087\ngdb_76445,CC1C(O)C1(C)NC=O,-0.0041377420031203,-0.1358428812998466,-0.0064721725171891,-0.1321473228216029,0.1757132658089275,0.4779039349808913,0.3382522787133127,0.1115267848930221,0.1824487632144499,0.6010300285588137,-0.7176343015545,-0.7175909515845313,-0.7175909562400898,-0.7176667309268074,1.4733209124458952,-0.5397004333148149,-0.5374530493356976,-0.5385795576330936,-0.5271836832988992\ngdb_112124,COC1C=CCOC1C,-0.0041999261064873,0.0988402816308584,-0.0724349573306594,-0.4311552914320664,0.780271798225927,0.0531686923224746,-0.0239425119193662,-0.0484096180947373,0.1134775198268234,1.0020951709962689,-0.3164275976812147,-0.3164090897392563,-0.3164090943923355,-0.3164414492003441,0.719592533893584,-0.9707328092630696,-0.971193417777011,-0.9716995212382302,-0.9639495150675564\ngdb_27123,c1nc(c(o1)NC=O)N,-0.003618677166599,-0.0712069874387895,-0.1769513332679844,0.3979122578969457,-0.7353991891063875,1.033674518459457,-0.5267541036212031,-0.9996103306008828,-0.0924060312063072,-1.3880667456804503,-1.460375236741244,-1.460392186136297,-1.4603921907964454,-1.4603739657682822,-0.7425715969361785,1.3434214730727012,1.3453434475094488,1.3455702223128354,1.342732946350339\ngdb_41601,C1OC2C1C21CC=CC1,-0.0032156465848139,-0.0696726978804974,0.0279924013662565,-0.7250011906543682,0.5604323318924731,0.5140516152071393,0.3467745090811405,0.1031090794726135,-0.3175841724773077,0.3651995173031593,0.641333793239635,0.6413081340284621,0.6413081543438499,0.6413634057859452,-0.5828166270862598,-0.5779676902366914,-0.5796671218552848,-0.5780545139175279,-0.5926340139162499\ngdb_85885,CN1CC(C)(O)C1C#C,-0.0044247493831558,0.1756810385214575,-0.046805581413097,-0.2914527046992931,0.9439300676074988,1.0517483585725802,0.5598302682768342,0.0631249787256735,0.1077947343289977,0.5583539324732995,0.2111375406778751,0.2111796504395557,0.2111796457897367,0.2111093048772939,1.5215674056517252,-0.568525426
8283196,-0.5666847741000826,-0.567605108975342,-0.557842865949199\ngdb_112764,CC1OC(CO)C1C#N,-0.004238622613044,-0.0166418419130247,-0.0968595335256016,0.3723632815580709,-0.3787707214987838,-1.166815515313399,0.0357131006554279,0.5787094357256866,0.0745383930054844,-0.053867724821073,-0.6873714701246275,-0.687354664709447,-0.6873546693648186,-0.6874004658299385,0.4121442174952041,0.0156389077489261,0.0175832525118348,0.0174196272408124,0.0160557706038714\ngdb_127648,Cn1c(ccn1)CCO,-0.0036326039298841,-0.249727575797442,-0.2743393108464718,-0.4513461704313306,0.4089873661960941,0.410127034556676,0.2508994174430784,0.0568116996603679,0.7253803310086776,0.3484296147568796,-0.1917403467934951,-0.1917184116119081,-0.1917184162642169,-0.1917931050978076,0.3572515236946931,-0.3130389133170795,-0.3116329723237346,-0.3119134089365755,-0.3125352091717644\ngdb_80178,CC1C2NC(=O)C(C)C12,-0.0041703648934665,0.1351076035355093,0.216214976213826,0.5308845516299643,0.3699047944034791,0.2519809335668401,0.3723412001846237,0.2504189243297601,-0.4355818622704128,0.7113868066052976,0.2085879297807434,0.2085924320111658,0.2085924273613308,0.2085976688447058,0.4069749503660078,-0.8363437828298704,-0.8360626421079546,-0.835083035828375,-0.8445670633567522\ngdb_55016,CC(=O)C1CC1C1CC1,-0.0038670211990379,-0.2268837090406477,-0.1776358785649279,-0.1346303435399591,1.1930814627854127,0.0486502322941941,-0.4394012423509688,-0.4566683309845432,0.699277336230798,0.9828909277577874,0.6104227378796224,0.6104515527733162,0.6104515481259645,0.6103746877477016,0.8810705927865675,-1.2013983104385155,-1.199287679831105,-1.1981850183304117,-1.2183474955413909\ngdb_3759,Cc1ncnn1C,-0.0032960515368124,1.3123622990591646,0.6820526144191524,0.4337200303616603,-2.0116894242089414,-0.4438619107884366,0.0357131006554279,0.2420012189093515,-1.8611349486754711,-1.0069872397900848,2.2653490405539256,2.2653233192565017,2.2653233146193763,2.2653540199398616,-1.9972265758653531,1.6705601081850925,1.6665727344774717,1.
6645286949269764,1.6795964056532389\ngdb_75634,CC1C(=O)C2(C)CC12O,-0.0043306166216333,0.2452039287533711,0.0225525480732127,0.5828319587639936,0.1964758820737539,0.3333132140758983,-0.4095734360635716,-0.5597852223845454,-0.2076134440088464,0.2378324080000537,-0.2865502108641466,-0.2865215546362291,-0.2865215592891236,-0.2865649422482694,0.969932756293224,-0.4558816629458477,-0.4524687036131778,-0.4517558158524152,-0.4652042035480733\ngdb_80992,CC1C2CC1(C)CC2=O,-0.0039756444261689,0.0513151890165988,0.055200795102102,0.3783748054025119,0.902404835077846,0.468867014924329,-0.2753483077702847,-0.4903391526661764,-0.2009544547624631,1.0274603999372367,0.6098892661601589,0.6098931155896742,0.6098931109423189,0.6099047263930185,0.5536836746041465,-1.2574356920801106,-1.2574314439196657,-1.2559186903387225,-1.2719975031408035\ngdb_107444,OCC12CN1CC1CC21,-0.00395074757355,0.0478930287672393,0.1851366197325937,-0.6353510741916156,0.2355584538663689,0.6586423361121322,1.3737032684043835,1.0501009392685543,-0.4300531389533366,0.728096601973992,0.2106236644522937,0.2106108038729806,0.2106107992231581,0.2106453094632163,0.0173106710556524,-0.6225044422451861,-0.625912349803007,-0.6264149483980483,-0.6108118116787059\ngdb_120374,OCCN1C=NCC1=N,-0.0039163948907799,-0.0651140268472178,-0.1404331234937082,-0.4214845791605741,0.0120549964273573,1.033674518459457,-0.198648234459835,-0.6776330982702624,0.108612943915394,0.0130315638312894,-0.5926109817060794,-0.5926057653041003,-0.5926057699588864,-0.5926324478799494,0.0421723843913097,0.1532950989158966,0.1527712076909363,0.1516540922106485,0.1682887713980272\ngdb_65961,OC1(COC11CC1)C#C,-0.0041273245631233,0.0498503528951017,0.2100723230825869,-0.3325532318531352,-0.0746594597375061,0.3107209139344932,0.2104188231958966,0.0631249787256735,-0.4864645613548576,-0.4756698435254328,-0.2546628755477326,-0.2546525423188071,-0.2546525469715047,-0.25466398446355,0.5253757831823577,0.2701001713320888,0.2725648794358683,0.2730416473750494,0.264
6490522159367\ngdb_132662,c1c(nc(c(n1)N)F)O,-0.0036740139446841,0.0181732717677281,-0.1247433452877645,-0.942592216762944,-1.0590517367639736,1.3815959406370946,-1.0956129806737052,-1.7256374231111045,-0.4017350088295,-1.751474741360758,-2.06064909319917,-2.0606903163221304,-2.0606903209859886,-2.06057649673148,-0.8398030500805818,1.6233828787388542,1.6243378694093424,1.6250356734821083,1.6234985648080107\ngdb_111382,OCC1C2CC2C1C#N,-0.0042027225121311,0.1507977663024065,0.013434404717926,0.8860871996558557,-0.0453475308930461,-1.0177063343801267,0.1592854409889301,0.6313200946032389,-0.3248097244997668,-0.0025662487520786,0.2400556968663418,0.2400547794428544,0.2400547747932139,0.2400604919453325,0.0096798481506488,-0.1544345848651509,-0.1533704859408543,-0.1523283624041088,-0.1647305353446657\ngdb_45291,O=C1OC2CC3CCC123,-0.0038096009329534,0.1787938152796388,0.1437900837972096,1.1177268956182884,-0.4068613199747257,-1.062890934662937,-0.1602981978046102,0.3388048312440476,-0.7069815173892467,-0.3298498307036621,-0.2565333840103488,-0.2565720375240378,-0.2565720421767472,-0.2564808505176552,-0.9902041079824312,0.0736166600995976,0.0727094979334811,0.0745958600128874,0.057238641471251\ngdb_69833,CC12CCN3CC3C1O2,-0.0037590887835777,0.1117333650795521,0.1315777956997386,-0.900511549851856,0.3076169456089998,0.5818285156313546,1.224564236967398,0.9385663424481444,-0.4850222390826397,0.7139113080638774,0.2103717437844344,0.2103553870716283,0.2103553824218043,0.2104097421766507,-0.0361050892793739,-0.6489669033151868,-0.652506019918935,-0.6528210514774375,-0.6377037818130836\ngdb_132628,c1c(cnc(c1F)O)N,-0.0037066423615236,-0.0256392189523923,-0.1600385007981685,0.253112944426494,-0.6547913847841214,1.7295173628147336,-0.6524570015466623,-1.4478531442376286,-0.2106743128110779,-1.4015006998707498,-1.659668274197413,-1.6597007659543386,-1.6597007706157187,-1.6596063276116375,-0.5774012043794833,1.168622915570361,1.1705742763232203,1.172033710451286,1.1576095016348382\ngdb_2
3087,ON=C1C2C3NC1C23O,-0.003657169192901,-0.0635355396473288,0.0421214162951694,0.2503032104557228,-0.6963166173137743,0.7535299967060332,0.1571548833969732,-0.1957194629518838,-0.4552900254621744,-1.0872603754551609,-1.0549098204680194,-1.0549389326068097,-1.0549389372644526,-1.0548580244905106,-0.4983852639760563,1.3597363197609231,1.356334908701641,1.3540453632933591,1.3957816817549291\ngdb_61359,CC(CC(=O)C#N)C=O,-0.0038235276962385,-0.3235439512130538,-0.3415890408181958,0.5493765217166693,-0.4813624724543962,-1.4514784970951045,-2.186458467755656,-1.4815239659192618,0.99530728036257,-0.8428044842160252,-0.6583750817351022,-0.65834516230489,-0.6583451669600824,-0.658436672904137,0.3818670814527702,0.4379474818759469,0.4448883270506604,0.4465884821850072,0.4106071649911346\ngdb_131522,OC1CCC2=NNN=C12,-0.0037408347759861,0.1267731911200952,0.0316159278047439,-0.8315750535922322,-0.6535700544156016,-0.9002263736448206,-0.5800180434201265,-0.1536309358498419,-0.594341688448256,-0.6191456764213797,-0.5623088844012024,-0.5623421444378635,-0.5623421490924626,-0.5622643623992833,-0.957219260586608,0.7127590387968301,0.71065348967732,0.7104791843237969,0.7231518711810546\ngdb_133430,FC1=NNC(=N)NC1=O,-0.0036431926911756,0.0302076581796823,-0.1123120026952715,0.445220336846678,-1.6807088943402408,-0.3489742501945356,-1.6751246456859916,-1.4920460976947725,-0.5136949624391997,-2.0985936919154686,-2.460634061713056,-2.4606784441117533,-2.4606784487780837,-2.4605686759477194,-1.163497634599292,2.1827498000843546,2.182368298307042,2.1815690709574045,2.201033330839824\ngdb_1855,CC1OC(=O)C=C1,-0.0022500410842281,0.5807082261153899,0.6225336826675766,1.334729837872517,-2.331677980760969,-1.2481477958224585,-1.1680519388002408,-0.5724117805151583,-1.797321749993696,-1.3856324049882487,1.6733617065794208,1.6733112575851947,1.67331125294441,1.6734286449568043,-2.254705310014837,1.6939625522046662,1.6912093059224431,1.6914302995496855,1.6790093816151348\ngdb_15427,CCCC1CN1CC,-0.00270
17766520576,-0.3504350591503637,-0.3120166839902389,-0.9722577800822512,1.023316541561245,0.9478237779221168,1.6634591009105268,1.201619636835905,1.3698885618640555,1.8721465665424903,2.02788969740566,2.027935818769258,2.0279358390932147,2.0278186219903143,0.9901175136548472,-1.313147930522292,-1.316417306240971,-1.3193661145184314,-1.290491609972335\ngdb_107078,CCC12CC1C(C2)C#N,-0.0035269981681316,-0.2634099028460804,-0.2114980525870636,0.9978884746323642,1.0709484259334936,-1.2933323961052687,0.4511718310870304,1.0479965129134523,0.6615599832697914,0.6781475374006649,1.1366713835611202,1.1366883712234723,1.136688366579373,1.1366440376620996,0.4660522889853926,-0.9374790507127588,-0.9355393944536516,-0.9338581716044254,-0.9585238171619735\ngdb_91749,CC1OC2OC1CC=C2,-0.0041056828150976,0.3172397706075079,0.4802760426921029,-0.5644542984175,-0.1577099247968101,0.1570932729729392,-0.0750758941263327,-0.1473176567845352,-0.8909349782804029,0.3920974792725496,-0.2869804630075494,-0.2870122184993414,-0.2870122231522389,-0.28692651818828,-0.5033083755276716,-0.50107656772174,-0.5035559963991724,-0.5024827863497251,-0.5064811122463448\ngdb_72586,CC12CCC1(O)C(=N)O2,-0.0041999924244078,0.1663048245541165,0.3227210971475999,-0.4963019139636729,-0.2187764432227696,-0.9363740538710672,0.1422409802532746,0.5787094357256866,-0.6196493506224956,-0.0658891603381196,-0.6873554692181578,-0.6873527425931747,-0.6873527222859973,-0.6873376861620674,0.3614361685135669,0.0173196883222383,0.0177833807959562,0.0176209247423748,0.0232225932045432\ngdb_130623,C1c2c(nn[nH]2)CC1=O,-0.0027591029677549,-0.0817954795756453,-0.0938292796777986,0.4817468784667061,-0.9369186999120546,-0.6471926120610839,-1.0785685199380497,-0.7639145788294486,-0.4534856034472656,-1.356450370270619,-0.5327912059495581,-0.5328392822149007,-0.5328392868693175,-0.5327344790925644,-1.504423109548655,1.189825398746344,1.1893265564130555,1.1906537293838304,1.1823272031223198\ngdb_18796,OC1C2CC3OC1C23,-0.0035276116088953,0.66
94507764931523,0.9789992369625924,-0.3115782410480741,-1.5488052145401685,-0.141125088893609,1.124428030145422,1.17636652057468,-1.7372864705424231,-0.4800276139003622,0.6953826086790941,0.6953219727384332,0.695321968091606,0.6954661746953191,-1.782578912214923,0.8266955091108822,0.820019698120094,0.8190714444268598,0.8391375624379107\ngdb_31457,CNc1cc(c[nH]1)C#N,-0.0024632145128785,-0.3394677300855345,-0.3102642480300638,2.534943641945285,0.6825653687443921,1.3002636601280364,-0.2796094229541986,-0.8838668810702675,0.8180657117404201,-0.66407579166634,0.363640957087857,0.3636440100581973,0.3636440054093205,0.3636236589101331,-0.1862599916036423,0.3875030876959375,0.3915086557417847,0.3935853014734861,0.362801751091941\ngdb_95655,OC1CCNC(=O)C1O,-0.0041535256681769,0.1540999615245744,-0.0519989983992412,0.0359793168712995,-0.8636388778009021,-0.4709726709581235,0.0506270037991264,0.2693587615256792,-0.2817277190823901,0.0282386797603531,-1.6155325070833346,-1.6155409683491284,-1.6155409480476868,-1.6155095893531106,0.1957734648017096,0.1086246186554411,0.1076513764550766,0.1068550750727361,0.1228485514387673\ngdb_60014,CC(O)CN(C)CC#N,-0.0037575634714084,-0.3659926289924702,-0.3351360604856755,0.9080423302181624,0.6886720205869877,-0.4257880706753132,0.1081520587819637,0.3072384359175162,1.420444906491367,0.9691864912683544,-0.2211901025204828,-0.2211404700637628,-0.2211404747162533,-0.2212536686678745,1.5072903821520407,-0.7829704777292331,-0.7818928539383125,-0.7837341076647111,-0.7637971151236359\ngdb_82910,CC12CC1OC1C(O)C21,-0.0038629868588799,0.0686911761129775,0.3014728111304754,-0.6573062047539224,-0.172365889219041,0.373979354330428,1.456795014490704,1.2647524274889677,-0.630019772868245,0.330547729425273,-0.2855234569094891,-0.285540950823925,-0.2855409554768135,-0.2854796153379649,0.0315876945553369,-0.3480285170152852,-0.3503694910299127,-0.3503767162367344,-0.3413050863013704\ngdb_77658,CC1C2(C)COC12C=O,-0.0042436185630479,0.0146311464911771,0.121026670856
1836,1.2521367276619344,0.4444059468831496,-0.0778666484976742,-0.7824210146560353,-0.7365570362131215,-0.1626344362869683,0.2488921286757364,-0.2859983668090164,-0.285981240263158,-0.2859812449160492,-0.2860034199306603,0.567960698103831,-0.3979143990859109,-0.3962118633420149,-0.3958957590547826,-0.4011017485129037\ngdb_125919,CC1CC2=CC(N)=NN12,-0.0030022686759089,-0.1530357638810381,-0.1523350843898983,-0.2265674527696184,0.7570665212240626,2.154252605473149,0.5065663284779107,-0.5029657107967888,0.1661017292235975,0.0361127200240179,0.3350482857971529,0.3350422708915284,0.3350422662424748,0.3350624546457393,-0.0749976705371356,0.0076021020434614,0.0066593671037521,0.0065727887690892,0.0142092095515173\ngdb_74086,NC1=CC(=O)C2(CN2)C1,-0.0037397847422463,-0.0070383257888998,-0.082401936854156,1.8703435438821248,0.4920378312553981,0.6270131159141649,-0.6162375224833944,-0.9007022919110841,-0.1949381657503921,-0.3233582555244573,-0.161990817453713,-0.1620018450069705,-0.1620018496590956,-0.1619648997987479,-0.1003516950279542,0.1883817211491182,0.1892907208646153,0.1903548273336464,0.1806960658732952\ngdb_78782,CC1C2C3N2C1C3(C)C,-0.0040019671140766,0.1677254630340167,0.1503617186478667,-0.6597892254722786,1.1808681591002217,0.0757609924638797,1.32043932860546,1.2689612801991723,-0.3332288116069682,1.3685986863222164,1.107925567869175,1.1079402017443798,1.1079402220626515,1.1079255971639377,0.7818698950215184,-1.333466758388644,-1.3356348196137102,-1.3360067080149025,-1.3250661859641633\ngdb_132703,c1c(c(nc(n1)F)O)F,-0.0036370748630181,-0.0081495807776217,-0.1380691604015969,-0.6987334451601798,-2.351219266657276,-1.1080755349457472,-1.2767103759900449,-0.7449747416335295,-0.5506981245947812,-2.478290732721374,-3.15652344410798,-3.15658812947122,-3.1565881341418507,-3.156429373153677,-1.708486083363112,2.2352394498761954,2.236356151672899,2.2376149408337827,2.2279480980237696\ngdb_37985,C1CC2C3OC2C3CO1,-0.0037251837467704,0.1526730090958747,0.1332663407655323,-0.78825
28763219662,-0.0465688612615642,0.2384255534819973,1.3055254254617616,1.1805753732848838,-0.6402622269917009,0.4008130200224085,-0.2847693424251409,-0.2848116449923876,-0.2848116246827226,-0.2847075377525643,-0.8516185178044591,-0.2688141938955178,-0.2744351022403455,-0.2749753173244467,-0.2531659916486248\ngdb_4015,C[NH2+]CC(CO)=N[O-],-0.003660374559054,-0.0294339021809256,0.0625573752265876,3.961831100543775,-0.43006659697659,4.246299598567261,-0.0196813967354524,-1.9971084229192744,-0.3853851152158037,0.135740366871539,-0.1646179116795247,-0.1646041657775032,-0.1646041704296458,-0.164627356806796,0.1598347504749172,0.8867552267890781,0.8804298500453414,0.8741779259602837,0.956057932746181\ngdb_17683,CN1CC1(C)C1CO1,-0.003170473028031,-0.0245090221172717,0.1878017827553602,-0.3159561986304388,-0.3409094800746887,0.5366439153485444,1.5846284700081203,1.31736308636652,-0.5672610601103965,0.4549094798491164,1.1616680196189155,1.1616786536138008,1.1616786489698556,1.161660050367764,0.0601417415547059,0.0390151305435418,0.0353790753504936,0.0326485574862659,0.0646590812538869\ngdb_119070,COCC1(CO)OC1C,-0.0043389118881774,-0.1000806752991601,-0.1602575552931904,0.0908018006265835,0.0804494970644305,-0.3851219304207836,1.0051168049958334,1.1721576678644765,0.6245593232841997,0.9368488297275002,-1.242755894307554,-1.2427111702554408,-1.242711174914244,-1.2428057971985615,1.429505219636519,-0.6852308585869177,-0.6849423962912584,-0.6874674484521254,-0.6579760585459904\ngdb_9882,CC1(CC1)OCC=O,-0.002752675655969,-0.2459265786201092,-0.16760500814705,0.130203418867866,-0.7439485016860227,-0.6607479921459255,-0.8079877057595187,-0.4903391526661764,0.2381056026378662,0.0487051737281239,0.6644817130210056,0.6644997149881614,0.6644997103411437,0.6644412116957441,0.0763880096750372,0.2043320927988214,0.2039728594966065,0.2024920344077643,0.209286416233966\ngdb_54843,OC12C[NH2+]C1C2C([O-])=O,-0.0039504823018684,0.0401458135901839,0.0110156780020592,1.6106718508624616,-1.249579274252
9674,-0.1230512487804844,0.1699382289487148,0.2251658080685343,-0.4501237593406446,-0.7145057136603493,-1.584105952940922,-1.584131741221709,-1.5841317458826225,-1.5840598214279498,-0.4373386807360256,0.7862046922417848,0.7848127145993016,0.7841153589880082,0.8011947223055695\ngdb_68069,CC12CC(C)(O1)C=CC2,-0.0040163525763112,0.2449071731597918,0.1382041941741512,-0.791585351496602,0.9537007105556522,0.468867014924329,0.0548881189830403,-0.1662574939804543,-0.4181982149984122,1.009007496418571,0.6102296036934121,0.6102294609748777,0.6102294563275247,0.6102580648418182,0.6203918361285348,-1.2216856728347512,-1.222411593339484,-1.22114583724141,-1.231660972755091\ngdb_35851,N#CC1CCN=COC1,-0.0035427597272146,-0.1440383868416705,-0.1901858756755577,-0.4366440740726431,-0.1308406566893881,-0.9770401941255972,-0.3307428051611651,0.1283621957338382,0.0957149725279204,-0.3189103243831498,-0.1617780628112081,-0.1617876912991134,-0.1617876959512372,-0.1617722922769576,-0.4587542159855532,0.2107300718922422,0.2115881305519681,0.2124949718135216,0.2026838206209216\ngdb_97389,CN=COC12CN(C1)C2,-0.0026056488264297,-0.3786015847451839,-0.311396029587677,-0.6166630761534617,0.7350825745907164,-0.5523049514671804,0.163546556172844,0.4187730327379276,1.200296484351069,0.3112533254204137,-0.1891330477608685,-0.1891305691197934,-0.1891305737720862,-0.1891605526790676,-0.0090279757454896,-0.0391608320348342,-0.0421901276007877,-0.0443709636534233,-0.012007397975006\ngdb_93163,OC1CC=CCOC1=O,-0.004052009511516,0.1420908309078185,-0.011720353127089,0.6981617368665873,-0.6889886351026588,-1.2255554956810533,-0.4436623575348826,0.1325710484440431,-0.3905070571832428,-0.3483327878111208,-1.1845187604716598,-1.1845393696839062,-1.18453937434235,-1.1844806650286548,-0.3285379154453269,0.1850542475956057,0.185576651806825,0.1866669538675939,0.1765099575627368\ngdb_47103,O=C1NCC2CCCC12,-0.004009665519337,0.1777141300349146,0.0410991619850671,0.6410522603443964,0.3405928655590192,0.2203517133688727,
0.5662219410527048,0.4566527071297646,-0.3720871540829497,0.8032606275443237,0.2078133960428969,0.207793905031256,0.2077939003814161,0.2078319815555175,-0.3947537658145532,-0.9177030022194816,-0.9192042481796092,-0.9176382382723336,-0.9319766524092014\ngdb_76146,CN1C(C(C)=O)C1(C)C,-0.0041807768069702,-0.0889807533095404,-0.0062896271046709,-0.6388142346672177,1.2749105974761985,0.852936117328216,-0.6716320198742748,-1.0606386948988438,0.3135177890299788,1.2606461953791408,0.1795942872566032,0.1796514767661585,0.1796514721161433,0.1795470577539495,1.9008931507036897,-1.2583639234731725,-1.2562306742170075,-1.257165186408635,-1.249029475241714\ngdb_92977,CC1CC=CC1N1CC1,-0.0037634270808709,-0.0805263758669346,-0.1006290962941035,-1.0567458271568426,1.5643658948152472,0.622494655885883,0.3062939148339587,0.014723172558326,0.2264168942607809,1.3976003995070905,1.1071394765342986,1.10715088594505,1.107150881300768,1.1071248630939574,0.5332527616649432,-1.4160400205328734,-1.4178173693651552,-1.4176121991674997,-1.41647665729573\ngdb_22363,CC1NC(C)=CC1=NO,-0.0041043288242227,-0.0555105107230929,-0.1973690376581509,-0.0255734598837389,1.3298704640595624,2.5835063081598464,-0.39679009051183,-1.5951629890947747,0.4138612397394177,0.3003438726886941,-0.1909811649393911,-0.1909584268076525,-0.1909584314599565,-0.1909929701180974,0.8825475262520521,-0.2332922993137199,-0.232504328201121,-0.2333428647368096,-0.221193128990723\ngdb_106964,CCC12CC(C)(C)C1O2,-0.0035642135744658,-0.2238719554632593,-0.1269977811323649,-0.6360698433469293,1.3176571603743712,-0.1140143287239221,1.6038034883357326,1.6351314659869358,0.649105610876079,1.649509580766794,0.580879997165737,0.5809189351919487,0.5809189305444145,0.5808497469357101,1.5680908098144903,-1.6810972814584515,-1.6810588363617902,-1.6814358021385527,-1.6769586004950283\ngdb_114977,CCC1CC=CC(=O)N1,-0.0041827110796487,0.0332888651938657,-0.0790431012638202,0.9943599715062792,0.725311931642563,0.12094559274669,-0.8719044335182267,-0.917
5377027519014,0.1097496439961013,0.7230175454680406,0.2081605732772009,0.2081679438654486,0.208167939215611,0.2081541673102789,0.3907286822456782,-0.8812345213874023,-0.8802598039838647,-0.8789684719957926,-0.895196461828056\ngdb_88593,CC1CC1C1C(C)C1O,-0.0039951805802217,-0.1933124432734072,-0.1858504221282492,-0.9814057511498792,1.3713956965892151,0.3649424342738657,1.390747729140039,1.2037240631910078,0.7430102632981647,1.678060490119779,0.5811747782960132,0.5812224049001222,0.5812224002525899,0.5811139457447593,1.6387374605801723,-1.6501326359198445,-1.6494619591535902,-1.6500617799399466,-1.6467981033139922\ngdb_3128,CCC12CC1CC2,-0.0025489912164298,0.3455325751783193,0.5456273003736372,-1.5734755071768478,-0.4776984813488385,-0.1275697088087649,1.6485451977668282,1.6877421248644882,-1.3166069964282752,0.7142118439518038,3.439342862264958,3.43933052086214,3.4393305162322694,3.439357997418554,-0.9601731275175774,-0.1110122349520103,-0.1170199122274479,-0.1186781299908799,-0.1060851241593169\ngdb_66792,OC12C3C4C3C1N1C4C21,-0.0037685998786626,0.5565447440594881,0.6506365489247612,-1.3624840887670608,-0.8172283237971735,0.970416078063522,1.118036357369551,0.6523643581542599,-1.4370457317720562,-0.6654883103395934,0.2715228398916256,0.2714583899967505,0.2714583853473041,0.2716117557470604,-1.4810383296784826,0.5274090600355051,0.5232060617647943,0.5243538363772913,0.5252022361961048\ngdb_17907,O=C1C2CNC1CN2,-0.0038116678414727,0.6091146817645932,0.962953495202238,-0.1234567503073564,-1.4474347939530752,1.1556729392230434,-0.5842791586040403,-1.115353780131498,-1.6151305316804914,-0.4219340267642366,0.7895793210092941,0.7895303081452195,0.7895303034989746,0.7896447876680527,-1.648670278010986,0.9051310618250248,0.8989248223422523,0.8974200452270146,0.9240850700882792\ngdb_41182,C1NC2C3CC=CC12O3,-0.0037174963944967,0.2283267436121573,0.4120770765752842,-0.9266486100450784,0.3051742848719619,0.7670853767908761,-0.3499178234887775,-0.7007817881763858,-0.9681029730990676,0
.0290801802465463,0.2429483162748741,0.242904054710629,0.242904075023555,0.2430172520484955,-0.8149413367449244,0.1494143574985758,0.1432924044349242,0.1422447243619911,0.1728082865651777\ngdb_104782,CCC1(CC2CC12)OC,-0.0041358519423888,0.0011256100089264,0.0166928403313769,-1.1141166742809656,1.22850004347247,-0.2224573694026672,1.620847949071388,1.7045775357053057,0.128855141673374,1.6858143160282737,0.581376299852854,0.5814071028001604,0.5814070981526291,0.5813547051469974,1.268765627476278,-1.628964240337393,-1.6302314504373394,-1.630966905361722,-1.6193134098794593\ngdb_78563,OC1C2C3(CC3)CC12O,-0.0039563127523707,0.0359280957920803,0.017815494618364,0.1854179585260481,0.0535802289570084,0.3920531944435513,1.3161782134215465,1.1174425826318222,-0.2638504294168492,0.2778337346830251,-0.285187288099138,-0.2851811656052114,-0.2851811702580977,-0.285176924979072,0.6105456130253043,-0.312716392240247,-0.3129091150159471,-0.3131805509041264,-0.3067504577287657\ngdb_6143,CC1(C)CC(O1)C#C,-0.0029572222284733,-0.0352111653325195,0.1139439088504754,-0.6248962501143267,-0.1369473085319854,-0.4709726709581235,0.3425133938972266,0.5576651721746656,-0.5331007205151508,0.072687937584631,1.5921173910443402,1.5921319249277253,1.5921319452489893,1.592101366985775,0.2415584022317324,0.0561611900889752,0.0562132094050242,0.0557771243155746,0.0512401676838283\ngdb_19087,C1C2CC3OC3CN12,-0.003559880803666,0.632110083292577,0.6917822849063752,-0.7531638730125653,-1.017526504234321,0.8890837975544639,1.752942519772718,1.31736308636652,-1.5237098191517495,-0.1102482573960196,1.1912445344770353,1.1911954449391036,1.1911954402953409,1.1913140452087414,-1.5487311135131934,0.5222618334128145,0.5155024223759533,0.5142631539065708,0.538002779784732\ngdb_89933,CC1NC2(COC12)C#N,-0.003681297862942,-0.0357478509804818,-0.016932024654485,-0.1482216148404346,-0.1784725410616366,-0.570378791580305,-0.266826077402457,0.0020966144277126,-0.2806739480641742,-0.3656436549556672,-0.1604634173521489,-0.16047
33382067007,-0.1604733178962674,-0.1604432354751616,-0.2657682431622305,0.3488241575291149,0.3484368905785658,0.3483811085944215,0.3544067374208018\ngdb_118477,COC(CCO)C(C)O,-0.0042700904662765,-0.2649820760971699,-0.3769298326817293,-0.5097624999631825,0.4688325542535338,-0.3534927102228161,1.356658807668728,1.5046570319706063,1.5865500313982837,1.5977272472771171,-1.2730600135678516,-1.2729799084442222,-1.2729799131032138,-1.273162599812477,2.4360353763642792,-1.2449071903967728,-1.2433574873442412,-1.2468215916922554,-1.2115511250036015\ngdb_101493,CC(CO)N=C(N)C#N,-0.0038696186509204,-0.2968612035861292,-0.3134587927491331,1.5766283299607893,0.3625768121923637,-0.8098571730792002,-1.2490131272946043,-0.8565093384539404,0.98105313284107,-0.0692251086941,-0.592409310374765,-0.5923702686168495,-0.5923702732716342,-0.592450574058829,1.1963958876675311,0.1744792272334069,0.177290821048028,0.1760007670006167,0.1890511847068933\ngdb_91805,CC1CC2CC2(O)C1C,-0.004062200365285,0.0016812375032873,0.0372200719690542,-0.7823720377784912,1.1002603547779553,0.2113147933123104,1.2991337526858906,1.1847842259950887,0.0223924555294602,1.7164689765967411,0.5804370391543253,0.5804609223429262,0.5804609176953891,0.5804267392809352,1.33055067744905,-1.7276268466276106,-1.7287465473937924,-1.7287871683720004,-1.7252484517067672\ngdb_60572,CC1CC1C(O)C1CO1,-0.0034488701314004,-0.2852372238461462,-0.2542501881988382,-0.8496096251255556,0.4798245275702069,-0.8414863932771663,1.5377562029850675,1.912915744860413,0.864343345686578,0.9865574655904868,-0.3156302233358477,-0.3156056700997425,-0.3156056747528167,-0.3156613837446763,0.9509787768195054,-0.8869743477134264,-0.88754239425592,-0.8886384943021328,-0.8748985384134187\ngdb_10416,COC1(COC1)C#C,-0.0039453371365436,0.3613616448188068,0.2653105649106098,-0.4203737541023621,-1.0627157278695312,-0.6471926120610839,0.2828577813224324,0.5808138620807883,-0.9595544880581502,-0.6194462123093062,0.695968651239473,0.6959710239749406,0.69597104429066
63,0.6959670141609995,-0.3893383431077766,0.8882550809048391,0.8875980812981352,0.8861757732257539,0.8963125638971999\ngdb_11989,CC1(O)C2OC2C1O,-0.003904805834189,0.4798807777352764,0.6311954624915675,-1.2240230123934654,-1.9152043250959256,-0.0778666484976742,1.1819530851282594,1.2037240631910078,-1.3689874229775514,-0.6067936514276149,-0.2318169011090534,-0.2318342013917315,-0.2318341810817392,-0.2317684268062892,-0.3900768098405189,1.0206827596479089,1.017343586787428,1.0150061744887635,1.0457643245880397\ngdb_101409,CCC(O)C1COC1=N,-0.0037369717571225,-0.2166614259341661,-0.1649124633124058,-0.6730537835203388,0.1097614259088922,-1.2617031759073023,0.1912438048682841,0.7765255131052824,0.507225423443885,0.6581919544423682,-0.7175172777642778,-0.7174963934489258,-0.7174963981044837,-0.7175396738653883,0.7698082717200608,-0.5274079226631941,-0.5276077774051541,-0.5288037251013568,-0.5126790602214722\ngdb_28566,Cc1ccc([nH]1)N2CC2,-0.0028348711918183,-0.2492603435862748,-0.2273977580174033,-0.6826591531413481,1.3115505085317756,2.967575410563732,0.8538472159668913,-0.5387409588335245,0.59835874387675,0.3538092071507574,0.7354923879658096,0.7355006182166347,0.7355006135700557,0.7354742965960802,0.2068504657928445,-0.503536118701096,-0.5024124062069727,-0.5013472619796185,-0.5154175558555403\ngdb_45200,O=C1NC2CC3CC12C3,-0.0035270810655321,0.093182983506456,0.173919204133347,0.5495725496681183,0.1354093636477944,-0.141125088893609,0.0527575613910833,0.1178400639583277,-0.7179692607165625,0.075963778763026,0.2402047973567422,0.240160271174922,0.2401602665252821,0.2402678271029255,-0.9350652586043392,-0.1387726466987494,-0.1423868219545213,-0.1414221669766755,-0.1410614981578097\ngdb_36381,C#CCC1CC1CC#C,-0.0022120519688224,-0.4730772006329437,-0.4490352706264386,-1.5171501424604537,1.580243189605996,-0.8911894535882571,0.7728860274725277,1.178470946929782,2.0072788340065006,0.2325730299613465,1.5694583601135474,1.5695021760583316,1.5695021714169068,1.5693843382289594,1.16
71033739354195,-0.6747843222311035,-0.6689295345260434,-0.6666899332159676,-0.7094945419682096\ngdb_63082,CC1(O)C2OC=NC12C,-0.0041051854306946,0.2389468054930108,0.2350719173269615,-0.4188708731412518,-0.4471652221358587,0.2971655338496504,0.0463658886152125,-0.0904981451967791,-0.6754338007153495,-0.0690748407501367,-0.6873921390022524,-0.6873903112294174,-0.6873903158847893,-0.687364919807255,0.3978671939955196,0.0134677902532899,0.0138717825230784,0.0137343345123751,0.020113645508048\ngdb_320,OC12CN(C1)C2,-0.0018434238082478,1.372344812654949,2.230229385257071,-0.6173165026582922,-3.0718041840835983,-0.2043835292895426,1.21604200659957,1.2942143964603972,-2.609621568328562,-1.1781424279640302,3.1236842684438315,3.1236168821053942,3.123616877473573,3.123776763565361,-2.9084945240693565,2.4099042791265752,2.398270329466699,2.393501790951788,2.451245203003554\ngdb_48799,O=CC12OC1CC21CN1,-0.0038788313153635,0.0481960983096179,0.023456147865178,-0.2123880976147956,-0.5412076605118359,-0.6336372319762398,-1.3086687398693986,-0.9975059042457808,-0.4501405596248561,-0.7737413371705947,-0.6557276431430312,-0.6557456623023927,-0.655745666957569,-0.6557007774138299,-0.5266931553978452,0.7160419362656593,0.7155449368193646,0.7153361317432071,0.7229324495750337\ngdb_70700,CC12NC1CNCC2=O,-0.0041648826120463,0.2820710757939799,0.0878034057778618,0.33172015295761,0.0389242645347793,1.1330806390816384,-0.5864097161959974,-1.1069360747110892,-0.4694594566569041,0.385155100261456,-0.1911888022873166,-0.1911966693752933,-0.1911966740275988,-0.1911643848215876,0.0618648305977716,-0.2551031149280934,-0.2573098391063663,-0.2579734206193045,-0.2407615468041562\ngdb_46927,O=C1NCC2OC2CO1,-0.0035322925487773,-0.0037424445155314,-0.0355699112725986,0.4731216486029428,-1.3082031319418874,-0.8234125531640417,0.6343997839953268,1.0122212648767177,-0.460841625762007,-0.6669308826016388,-1.5848044262076069,-1.584838381058755,-1.5848383857196726,-1.5847556897385944,-0.8225721596499279,0.712835080
351629,0.7112382801163109,0.7110598501948735,0.7217555518675719\ngdb_27842,c1c([nH]c(=O)nc1N)N,-0.0042862775653488,0.2776828813782883,-0.1374302514577829,3.048340846790655,-0.1528246033227343,1.331892880326004,-0.3584400538566052,-0.9743572143396574,-0.207460096733807,-1.001247004330695,-0.9653104855893692,-0.9653259778076848,-0.9653259824647744,-0.9652682432888589,-0.0528436685548665,0.9552503128052052,0.9567748916768012,0.9573034713269118,0.9568757769157724\ngdb_103104,CCC1(O)C(O)C1C#C,-0.0041771735332949,-0.1272117132908532,0.0359878904345561,-1.5074794301889614,0.4004380536164588,-0.0688297284411119,0.493782982926169,0.5197854977828275,0.0948427875603339,0.1940743827180058,-0.2853412063351689,-0.2852942709147228,-0.2852942755676098,-0.2853804409361938,1.65104523945921,-0.3288844000640303,-0.3246854949312514,-0.3248738711790204,-0.3299835013334696\ngdb_45975,O=C1COC(=O)C2CN12,-0.0037403318650897,0.1470283388690714,0.049888723597821,-0.5636701866117034,-1.5353705804864577,-1.3023693161618295,-0.993346216259772,-0.3745957031355613,-0.7569427030120364,-1.4025826290672836,-1.5552089399159197,-1.5552502717309638,-1.5552502763916984,-1.555154414841374,-1.2614675544764375,1.1980745963807298,1.1987871661887306,1.2000476128088464,1.189080829192224\ngdb_63139,CC1(O)C2COC1C=C2,-0.004090650753139,0.3958421192131833,0.4384913977666751,-1.1271198617270937,-0.1235126744782727,0.4191639546132382,0.0207991975117293,-0.1767796257559648,-0.94366785334527,0.3407358960259699,-0.2863863825503169,-0.2864066020980908,-0.2864066067509846,-0.2863323704130545,0.0377415839948566,-0.438672672457676,-0.4404999926196105,-0.4398715210243671,-0.4386541891645015\ngdb_46684,N=C1OCC2C3NC1C23,-0.0037086484786159,0.250185634356221,0.2587024209774491,0.5728998758905693,-0.3311388371265353,0.4553116348394861,0.2146799383798104,0.0020966144277126,-0.9296356840470784,-0.2628904348737143,-0.160369009507733,-0.1604160741192996,-0.1604160787714149,-0.1603041217220604,-1.1090972519539424,0.358741025123917,0.
3543991539918761,0.3542987390050004,0.3702877323933943\ngdb_83672,CN1C2COC2C1C#C,-0.0042012800973623,0.1902536323508332,0.0023630254486941,-0.2725033360592067,0.4615045720424183,0.861973037384777,0.4447801583111596,0.0378718624644488,-0.264931366852088,-0.0711184847880341,0.2419237590125246,0.2419255976775391,0.2419255930279101,0.2419267080604253,0.0864803883558488,0.0417919583549891,0.0414167117537716,0.0410849874087167,0.0483135963870668\ngdb_113732,CCC1OC1C(O)C#C,-0.0035469211767201,-0.338779509666383,-0.3188073733359178,-0.9431803006172916,0.4273073217238809,-0.9183002137579428,0.2317243991154659,0.6544687845093616,1.1986436223469212,0.2109043924418707,-0.2850657710746481,-0.2850208311529938,-0.2850208108433315,-0.2851332912377433,1.2759041392261192,-0.2999518995239391,-0.2962152972724026,-0.2966018951009807,-0.3017693022981859\ngdb_68883,CC12NC1(C)C1OCC21,-0.0041299883329263,0.2432529185743082,0.2277427190143537,-0.8552944357175815,0.4138726876701699,-0.1456435489218895,1.324700443789374,1.376287024309379,-0.5451329410864373,0.6709947832680228,0.2108288554837845,0.2108290264763241,0.210829021826503,0.2108470780737175,0.2587892926623862,-0.6009505946439989,-0.6031912919310777,-0.603854143753793,-0.5877782422612534\ngdb_110902,CCC1C2CC(O2)C1C,-0.0041069870675322,0.0423430677724294,-0.0117568622095927,-0.7376776648480811,1.1478922391502038,-0.0823851085259547,1.5356256453931103,1.555163264493057,0.1112745378779559,1.755117891784045,0.5807629983379273,0.5807733536061902,0.5807733489586551,0.5807617056281226,0.7287002902640721,-1.6933871699874945,-1.696216604551165,-1.6964866615168763,-1.6870092499241214\ngdb_75707,OC1C(=O)C2CN2C1=O,-0.0041998984740205,0.4380382390406175,0.0620097389890329,-0.1923279039164972,-1.584223795227225,-0.859560233390291,-1.4322410802029009,-1.0143413150865976,-0.7893164932351515,-1.4509388534346022,-1.554729761443839,-1.554762503523535,-1.5547625081842669,-1.5546786373186527,-0.8831264317347964,1.2484088614124096,1.249572967015269,1.250475
2177908132,1.2433948008731155\ngdb_67365,OC12C3OC1C3CC2=O,-0.0038064508317341,0.3304359235985805,0.195441308269249,-0.0678501547462761,-1.2727845512548317,-0.032682048214864,-1.0210434649552125,-0.9932970515355763,-1.0105487124335328,-1.086148392669834,-1.1525635273930004,-1.1526117203388877,-1.152611724997134,-1.152488670484997,-1.0421429348519728,0.9181682552887406,0.9167154470432578,0.9175265687895164,0.9167558185057378\ngdb_36832,C(C#CC1CO1)C1CC1,-0.0022503395148699,-0.5293407983857892,-0.5329787785729568,-0.5102852411670469,1.4300195542781369,-0.2586050496289152,0.3936467761041931,0.5092633660073169,3.062089325126021,0.284655899338949,0.6421115720911185,0.6421379390822659,0.6421379593976589,0.6420022357104707,0.4054980169005232,-0.4962675949117747,-0.4932688828033,-0.4922656477559708,-0.5197062508916587\ngdb_54015,CC(=O)N(C=N)C(=O)C,-0.0042237729253665,0.1195500336934028,-0.1400862872099236,-0.4744121260518492,-0.2859496134913265,-1.230073955709334,-0.965648967564332,-0.3809089822008674,0.0155994214622391,-0.4508155255939399,-1.0898648116300471,-1.0898397962753792,-1.0898398009332375,-1.0898975163511353,0.5999609231893297,0.3115192195942488,0.3156288473936424,0.31580188212118,0.3076385892972207\ngdb_22276,CC(=NO)C1CC(=O)N1,-0.0033629663185016,-0.3067551613550338,-0.2801168731526745,0.8046049145034849,-0.278621631280211,-0.3534927102228161,-0.3115677868335526,-0.1431088040743315,0.7077915057975811,-0.4386438221329309,-1.0879225112688211,-1.087904374963888,-1.0879043796217347,-1.0879630039452672,0.4650676666750701,0.5155439553643043,0.5171424346753208,0.5158941798199966,0.5284793121374316\ngdb_26798,C(#N)c1[nH]c(c(n1)N)N,-0.0033323771777145,-0.1750777591402888,-0.2284565214100092,2.041998686701113,0.2416651057089645,1.413225160835062,-0.8378155120469157,-1.4857328186294665,0.1443875406063866,-1.353174529092224,-0.4380728291210498,-0.4380780762728977,-0.4380780809267288,-0.4380553261992848,-0.2665067098949728,1.323058069122085,1.3257958524216094,1.3261604977956265
,1.32441551665649\ngdb_9783,CC(C#N)C1(C)CN1,-0.0038192004519321,0.2831886447315013,0.1879752008972524,0.3910512795962248,-0.4715918295062429,-0.6291187719479593,0.2956411268741741,0.5850227147909921,-0.7981695281090133,0.1409095841438682,1.6865486002784116,1.686563675148048,1.6865636705073463,1.6865346432474275,0.0835265214248794,0.1592289623127867,0.1583799977719836,0.1572207423613757,0.16525961337616\ngdb_90173,CC1CC23CC2(CO3)O1,-0.0035052016782916,-0.0070130699937015,0.2008446524797892,-0.9833006880138876,-0.1454966211116189,0.672197716196975,0.9135028285416854,0.589231567501197,-0.520897280026047,0.3248676011434688,-0.2851598793701137,-0.2851816898187405,-0.2851816944716268,-0.2851184138293735,-0.2103832382065582,-0.3098373016482076,-0.3129636954573717,-0.3132347463858385,-0.3000709220718928\ngdb_119575,CCOC12CC1OC2C,-0.003757933746464,-0.2198625979755412,-0.1945669655759958,-0.3667274380557732,0.5054724653091092,0.5185700752354198,1.4312283233872207,1.1721576678644765,0.7013745121344178,0.946706406851478,-0.3153722368017287,-0.3153460096650612,-0.3153460143181339,-0.3154048732607588,0.9541787993280548,-0.8598747108566631,-0.8605068824746631,-0.8617936658979299,-0.8456157276582855\ngdb_20373,c1ncn(n1)C2CC2,-0.0019832385639429,0.047981424050433,0.18175040233038,0.4644964187391796,-1.067601049343607,-0.5748972516085855,-0.0793370093102466,0.1893905600317992,-0.9693955226248028,-0.8055680877019747,1.3154442944092812,1.31540442075843,1.3154044161154346,1.3154373366138523,-1.8497793848944728,1.1287482975078091,1.124908640677249,1.1242487803674337,1.1294779419058896\ngdb_15465,N=C1OCOC1C#C,-0.0037302183822278,0.4404249116868497,0.3343309853837607,-0.5357035322049555,-1.81749789561439,-1.4062938968122942,-0.1411231794769978,0.5155766450726237,-1.2844058559343992,-1.595827205003802,0.3239829225817854,0.323949662993114,0.323949658343992,0.3240238419904387,-1.4721767288855745,1.7651872678258131,1.7643652903576823,1.764072890390054,1.763968287790061\ngdb_67813,CC12C=CC3CC1
C2O3,-0.0037986474230997,0.2759402315096107,0.4192876203697551,-0.8819542371146681,0.4150940180386897,0.8393807372433719,0.0932381556382651,-0.2988363543518861,-0.9183759190966752,0.3975972860215988,0.6401661763907323,0.6401303260790706,0.6401303214319023,0.640226057870534,-0.5739550262933527,-0.700617473882078,-0.7022989765327952,-0.6998240181882383,-0.7224717626940638\ngdb_111163,COC1C2OC1C21CO1,-0.0029920999281133,-0.2076956186387963,-0.1179982922952151,0.7043692886624774,-0.6828819832600632,-0.1817912291481375,1.1798225275363023,1.2500214430032537,0.0912253646619815,-0.4489221495000048,-1.1804613699292583,-1.180477189122849,-1.1804771937812677,-1.180434783776812,-0.3469995837638844,0.6112540516302647,0.6085256873396543,0.6066328999403383,0.638381040669276\ngdb_108433,CCC12COC1CC2=O,-0.0041673419015947,0.139906204623172,0.0521249049011694,-0.5084556469535213,0.0059483445847599,-0.1637173890350141,-0.8378155120469157,-0.7491835943437338,-0.2804923620135455,0.2842351490958528,-0.2863275461438429,-0.2863232022218878,-0.2863232068747811,-0.2863291503028531,0.183711841500252,-0.4324923295387534,-0.4318165043616027,-0.4312492780231714,-0.4382865867324195\ngdb_122200,COCCC1(C)CCO1,-0.0035126513913505,-0.3958639207631707,-0.3530346381830901,-1.0561577433024951,0.9475940587130566,-0.1094958686956416,1.4056616322837376,1.4394198149624418,1.7448809141494448,1.657563942563215,-0.3459647717770923,-0.3459157213071168,-0.3459157259603784,-0.3460362089908793,1.4457514877568487,-1.4498470469474125,-1.4502589438743034,-1.452263757616187,-1.430531038634738\ngdb_11926,CC12CC1(OC2)C#N,-0.003656235215522,0.3073710686338016,0.3719262130918947,1.38099243441452,-0.9882145753898608,-0.7104510524570162,-0.3818761873681315,-0.0463051917396348,-1.1573771197453595,-0.910455112588203,1.2208910938648818,1.220871472364468,1.2208714677208885,1.2209215606296069,-1.0111273320767968,1.012866212239102,1.0122052281265237,1.0123402729562176,1.0060404649221617\ngdb_131135,OC(C=O)C1=NON=C1,-0.00324281482
62068,-0.2127341497808422,-0.155292320072694,-0.5732755562327126,-1.8492524851958891,-2.2964305223836545,-1.660210742542293,-0.5703073541600558,-0.0808513676500605,-1.8177729582372684,-1.954458890191089,-1.9544902220020075,-1.954490226665209,-1.954434600233868,-1.1553745005391265,1.834649917072856,1.8347168797523972,1.8339308657864564,1.8478956274514269\ngdb_95334,CC1OCC2OCC2=C1,-0.0035212893004833,-0.0508445025602208,-0.0400970375030513,-0.5728181576793313,0.3735687855090368,-0.0281635881865834,-0.0878592396780743,-0.0736627343559625,-0.1511594847175128,0.3493312224206582,-0.2859169394191472,-0.285933511869473,-0.2859335165223639,-0.2858847501330206,-0.2219525503528535,-0.3893610352323297,-0.391242444141384,-0.3909613895192279,-0.3875546015364634\ngdb_129653,c1(c([nH]nnc1=O)N)N,-0.0041038701252732,0.3288763782450901,-0.048475871937639,3.1849069863002413,-0.5118957316673759,1.4041882407784998,-1.2852326063578725,-1.923453500490701,-0.581400822718114,-1.3750835953220404,-1.3638343292071184,-1.363854770020314,-1.363854774679866,-1.363786760289392,-0.3799844311597073,1.668097935083157,1.6667494711436028,1.664709346531312,1.7026413735957036\ngdb_24953,c1coc2c1[nH]c(=O)[nH]2,-0.0025079017382452,-0.0468351450725026,-0.0547006705045112,0.2144300953405248,-1.135995549980682,2.3078802464347032,0.0442353310232556,-1.0311767259274145,-0.612099946312892,-1.6992115504503995,-1.0297898950544406,-1.029837941311675,-1.029837945969163,-1.0297198972181547,-1.35845285204326,1.3748502343078397,1.3766908141301697,1.379135290398307,1.353601439949023\ngdb_58459,CC(C)C1(O)CC(C)C1,-0.0038311653100719,-0.263428844692479,-0.1563967198184293,-0.9468394890443428,1.6791709494560507,-0.4664542109298429,1.1840836427202162,1.3868091560848894,0.9542341599934872,2.3439578569977626,0.5503579310811163,0.5504178206345407,0.5504178159868179,0.5503077505420964,2.367357970219245,-2.263667365507501,-2.263668657194833,-2.2648140280180526,-2.251675081217236\ngdb_11112,CCC1CC(N1)C#N,-0.0017197740456515,-0.2535
475148211736,-0.1722599161662654,1.545721256282304,-0.2676296579635379,-0.3489742501945356,0.1379798650693608,0.2988207304971075,0.2067005096533407,0.1777252304148224,1.6865585103562104,1.6865644489870664,1.6865644443463648,1.6865442037296523,-0.3344456493072648,0.160269944976757,0.1584605688987769,0.1573007452147239,0.1663510221455796\ngdb_50442,O=CC1CN=COC1=O,-0.0039484927642562,0.1003240595987542,-0.0907990258299957,-0.4848016074786551,-1.5378132412234955,-1.3927385167274502,-1.1850963995358963,-0.519801121637606,-0.4313710676318171,-1.441021169133039,-1.555782076908988,-1.555808658989508,-1.5558086636502462,-1.5557581979851078,-1.042389090429554,1.1378706619610923,1.1406486002374088,1.1423191022753585,1.1201539484462948\ngdb_129384,c1(nc(no1)O)C(=N)N,-0.0030306361663624,-0.1659793589201284,-0.2004175460472057,1.962280653111785,-1.4376641510049215,-0.7782279528812316,-1.4876355775937813,-1.1069360747110892,-0.0363781557188114,-1.751474741360758,-1.8603588572848508,-1.8603868291512091,-1.8603868338138296,-1.8603147979559675,-0.726079173238267,1.902929989037482,1.902695519497439,1.9014300473776835,1.9261294038694337\ngdb_124853,CC1(CC1)c2cnoc2,-0.0034449573740964,-0.1699634606126483,-0.0687475399977907,0.3598828353158073,0.1964758820737539,-0.5071203511843702,-0.4351401271670549,-0.1936150365967814,0.0329573321286387,-0.0163307924190961,0.2399223975831107,0.2399143651047396,0.2399143854176472,0.2399197306939888,-0.2369680405852804,-0.1684367194384563,-0.1679902469259759,-0.1668424284438358,-0.1807996059799516\ngdb_25692,CC#Cc1c[nH]c(n1)O,-0.0020226479881439,-0.4127726756483822,-0.4122888790865147,-0.4389310668395499,0.7961490930166759,2.027735724681281,0.0676714645347819,-0.8775536020049614,1.5054264632799,-1.0698593475442364,-0.1320315539233489,-0.1320247442781376,-0.1320247489300774,-0.1320772098283006,-0.1033055619589235,0.7118334295263476,0.7173408932350779,0.7195582178098285,0.6807180121936387\ngdb_78595,CN1C2C3C(O)C1C23O,-0.003933195430616,0.1408343550967069,
0.1158423811406653,-0.941481391704732,-0.4068613199747257,1.6933696825884843,1.2075197762317424,0.4040420482522123,-0.561562904139781,-0.0530562779236724,-0.6852863598009736,-0.6852958285557071,-0.685295833211066,-0.6852462620296611,0.2196505558270444,0.2346648067958253,0.2319466350491161,0.230271089680101,0.261976098100686\ngdb_72962,CC12CCN1C2(C)C#N,-0.0039924228600315,0.0830238398879701,0.0693663191135184,1.526706544991735,0.6483681184258546,-0.4257880706753132,0.2849883389143894,0.4819058233909904,-0.3260390048700659,0.2954150841267053,0.7363134266783119,0.7363303983078896,0.7363303936613157,0.736311125854826,0.5935608781722307,-0.4172918849463036,-0.4160167662236073,-0.4155609765552659,-0.4198865177322343\ngdb_107399,CC1CC2NC2(CO)C1,-0.0033777054763118,-0.2351676098656654,-0.1800363507395429,-0.2687134623311896,0.7961490930166759,0.3649424342738657,1.026422380915403,0.8438671564685502,0.5843322938242469,1.4071874943319342,0.1798899171089081,0.1799064442416271,0.1799064395916149,0.1798754091460725,0.8722089919936595,-1.2273101257672865,-1.2296837873370612,-1.2308055365989308,-1.211545425741095\ngdb_87239,CN1CC1(C)CCC#N,-0.0035625500999623,-0.3869928226997942,-0.3264468988498067,1.5933560484844516,1.1246869621483395,0.084797912520442,0.6173553232596714,0.568187303950176,1.398771467499691,0.974145333419136,0.7059519937187075,0.7059995033721017,0.7059994987253404,0.7058864013815546,1.2955965854325804,-0.9829886101985814,-0.9809035381397518,-0.9813411554353172,-0.9812154308286204\ngdb_101637,CCC(O)CC(C)C#N,-0.003928741076962,-0.3693263939586359,-0.3532445654074861,1.4124222492968703,0.954922040924172,-1.7361414788768077,0.5704830562366189,1.3699737452440723,1.7096800292916392,1.306507971876672,0.1789006566979614,0.1789649317810552,0.1789649271310371,0.1788182145163625,1.9065547289880465,-1.331224843583784,-1.3277128584612383,-1.3281432017519152,-1.332233008564835\ngdb_11291,CC1COC(O1)C#C,-0.0032042343760108,0.0819946662336424,0.2038292699744627,-0.7322542248579874,-
1.1286675677695666,-0.8957079136165377,0.4831301949663843,0.8922689626358976,-0.8332892711686045,-0.575297490372954,0.6949688566904129,0.6949601405907013,0.6949601359438717,0.6949718004121347,-0.6849711917822783,0.7832338273594706,0.7823461984797376,0.7816636586434602,0.7827006154740381\ngdb_131034,C(=O)n1nc(nn1)C#N,-0.0013753850849851,-0.2885078493243163,-0.2597995687393931,0.2135152982337621,-1.3863682755271156,-3.9818161129324734,-2.804320169423168,-0.9133288500416972,0.0682558016160144,-2.826762094771752,-1.3038715937136924,-1.3039155960171442,-1.3039156006763255,-1.3038211192635976,-1.91919525777225,2.7196451309255054,2.721285176576371,2.721562523687216,2.7244051556471494\ngdb_17455,CN1C2C3NC3(C)C12,-0.0035170836390309,0.4506029971517342,0.4988226566039574,-1.425997145036591,-0.5265516960896067,0.6812346362535373,1.384356056364168,1.0501009392685543,-1.2498637566894062,0.1732472456847232,1.6884550546151575,1.6884375637762588,1.688437559135569,1.6884906979409189,-0.6662633678861407,0.3594883300594273,0.3534868809064043,0.3509515228503904,0.3885595679871965\ngdb_47752,N=C1NCCOC1C#N,-0.0043357507339714,0.2835359119154768,0.1738826950508435,1.5753214769511283,-0.5595276160396244,-0.3534927102228161,-0.3264816899772512,-0.1578397885600456,-0.6118772531838726,-0.6696958127705597,-0.5623681202062797,-0.5623772917068525,-0.5623772963614517,-0.5623598923352463,-0.4112461895124646,0.7065367419162213,0.7069940010609553,0.7068455063394556,0.7122463323765365\ngdb_118537,COCC(CO)CC#C,-0.0037092342869128,-0.3862919743830434,-0.4089756798493084,-0.2578665823510024,0.8352316648092909,-0.3760850103642212,0.864500003926676,1.0290566757175337,1.7990203662009732,0.913316869702882,-0.3152476993253174,-0.3151825798567685,-0.3151825845098415,-0.3153311102712698,1.831477277825912,-0.8467929413086819,-0.8434907802222921,-0.844897579418595,-0.8371950673058701\ngdb_12254,NC1C(CO)OC1=O,-0.0036180029344082,0.117340151613558,0.0706989006249016,-0.2155245448379822,-2.1973316402238576,-0.57941
57116368686,-0.3499178234887775,-0.0757671607110644,-0.7621557954576452,-0.8871936348627186,-0.6340614172857741,-0.6340749246404201,-0.6340749292954626,-0.6340392957884832,-0.7110636830058401,1.3427003893629843,1.3408366625231023,1.3386538466040043,1.3631676520651177\ngdb_15979,C1CC12CCN2C=O,-0.0034595031046387,0.1783013272732734,0.1430325203352589,0.9958628524673896,-0.5033464190877425,0.5818285156313546,0.3041633572420018,0.0273497306889384,-0.8057407370428068,-0.1916333758464231,1.190416032003169,1.1904042070235012,1.1904042023797337,1.1904155096521902,-0.8459569395201005,0.4352335850068199,0.4331197443412153,0.4324615267264103,0.4354274532041762\ngdb_52210,O=COCC1CC1C#C,-0.0034556290327884,-0.3866708113110168,-0.3530163836418383,0.6358901909562349,0.1573933102811406,-0.8866709935599766,-0.0878592396780743,0.3261782731134353,1.2044697464398295,-0.4463375408638407,-0.2559662629587137,-0.2559481485369211,-0.2559481531896267,-0.2560119874951322,0.3245128318764503,0.1331886629780486,0.1376680199425737,0.1390962249721503,0.1107632652950803\ngdb_68631,CC12CC(CC1)C2C=O,-0.0042009706137337,0.1955320935472623,0.2440622788934853,-0.2121267270128631,0.8584369418111552,-0.2043835292895426,-0.7376793052249397,-0.6334401448131187,-0.4692042353180281,1.0388807636784303,0.6101905375177091,0.6101916676756945,0.6101916630283413,0.6101950854771895,0.282666383687721,-1.2257892946708917,-1.2263465832295233,-1.2250530733694585,-1.2388505924062394\ngdb_63650,CC1(C)CC1CN1CC1,-0.0032774880455922,-0.3823520703321204,-0.3254702808928341,-1.1103268005529483,1.9283223446339663,0.5004962351222965,1.586759027600077,1.334198497207337,1.59717638988862,2.0298076933485523,1.0776451878632518,1.0776922074338149,1.0776922027893507,1.0775857438122427,1.624952748235648,-1.8906494516260288,-1.891890084644653,-1.8932188393750584,-1.8767063528005787\ngdb_91998,OC1CN2CC2C1C#N,-0.0038877676551381,0.0885927427291787,0.1208167436317876,-0.1884726875379969,-0.4386159095562252,-0.3534927102228161,0.174199344132
6287,0.3388048312440476,-0.6133660443273151,-0.3210441291874261,-0.161108570905541,-0.1611288048183014,-0.1611288094704211,-0.1610737030980035,-0.440538703244576,0.281055399467864,0.2801905467652933,0.2806135303338968,0.2824336008654879\ngdb_117373,CCOC(=N)OC1CC1,-0.0025734459495794,-0.4246807830843452,-0.4086379708361496,-0.9494531950636648,0.3100596063460394,-0.6065264718065542,1.054119629610843,1.3236763654318269,1.905888046430246,0.6254936498360023,-0.7173236443298154,-0.7172948208657633,-0.71729482552132,-0.717373925557368,0.8298702326497681,-0.5070681178153731,-0.506620298301654,-0.5079642721726949,-0.4937575087024557\ngdb_49361,O=CC1C(C#C)N1C=O,-0.0043902806440249,0.2440863598158496,0.2221020657675396,1.0534950701934445,-0.5546422945655469,-0.8143756331074807,-1.4855050200018245,-1.0901006638702726,-0.6678683110272452,-1.534157240801355,-0.6264357497132637,-0.6264321659761333,-0.6264321456685795,-0.6264381632537944,-0.2465681081109307,1.169391197486087,1.174501469006987,1.1759357851049248,1.1515967796911506\ngdb_122073,CCCCC(NC)C#N,-0.0041022287567431,-0.4087822600070627,-0.4790366091738134,1.4579660766835598,1.4959713941781725,0.1751671130860625,0.5278719043974801,0.4398172962889485,2.5442323602442034,1.6782107580637415,0.6756240353543871,0.675700710274327,0.6757007056273785,0.6755052607254214,1.9838475803484077,-1.5451690690716389,-1.5424479078107052,-1.5438025062700118,-1.5375688877578395\ngdb_73724,CC12OCCC3CC1C23,-0.0040089636546794,0.3084002422881292,0.2443452242828886,-0.9141681638028148,0.7534025301185049,0.8258253571585291,1.6038034883357326,1.199515210480803,-0.6208646903314282,1.0896713287379507,0.6109815714102772,0.6109624363011172,0.6109624316537686,0.6110220796706696,-0.0607206470374511,-1.1426968522568004,-1.1460951414628389,-1.145367650697224,-1.1444423089965885\ngdb_90984,CC1OC2C3CC2C13C,-0.0039619939875521,0.2815659598900153,0.2034367973375483,-0.8113188319424849,0.7900424411740803,0.3062024539062127,1.616586833887474,1.454150799448156,-0.57
82688207976243,1.0456127675679765,0.6112025386865249,0.6111907188117087,0.6111907141643615,0.6112361046696104,0.1389115263805517,-1.1194858231850964,-1.1223266589363912,-1.1217668089594393,-1.120009570633946\ngdb_43048,O=C1C2CC3C2OCC13,-0.0040581107601933,0.5106928478771108,0.4621857923115442,0.2416779805919591,-0.5644129375137019,0.1390194328598146,-0.4692290486383659,-0.528218827058014,-1.1433356566729231,-0.2820946781121957,-0.2568316848408012,-0.2568881133194562,-0.2568880930096187,-0.2567555833306179,-1.3769145203618172,0.0422822952768952,0.0398000910723944,0.0419211462630687,0.0258755999010233\ngdb_120887,OCCN1CC2(O)CC12,-0.0031357666496854,-0.3214035225700042,-0.263149277059103,-0.7932842604091614,0.1097614259088922,0.8890837975544639,1.2011281034558718,0.7723166603950787,0.8060224101306114,0.6547658453200099,-0.7158733780831545,-0.7158661143363897,-0.7158661189919374,-0.7158751265124187,0.6398381267574157,-0.3547280402049043,-0.357865204928773,-0.360258358993867,-0.322657099382181\ngdb_29108,c1c([nH]c(=O)c(=O)o1)N,-0.003707957666945,0.036957269446408,-0.1158442564274997,2.94921604600786,-1.2666778994122363,0.8122699770736862,-1.3960216011396331,-1.7593082447927375,-0.388733018661057,-1.7569144409322217,-1.9568875831616328,-1.9569181044801385,-1.9569181091433567,-1.956835254793948,-0.7248483953503634,1.5795331228457108,1.5819288667437017,1.5829257845111735,1.5738408905944108\ngdb_40497,C1CN1C12CC3CC1C23,-0.0032251411004187,-0.065524433519189,0.0925860955858399,-1.097388955757303,0.9512580498186144,1.1285621790533578,1.7976842292038138,1.2521258693583552,-0.327040587771369,0.7149030764940338,1.1388507020298897,1.1388290346128298,1.1388290549312925,1.1388726285042303,-0.3829382980906758,-0.7085572610516917,-0.7126562649818099,-0.7125444718706384,-0.7041115885309666\ngdb_106621,CCC12CC(CO1)C2C,-0.0040143243532455,0.0097757198643184,0.1107950004845356,-0.8637236376298956,1.212622748681721,0.3107209139344932,1.5718451244563785,1.4099578459910125,-0.051269284228933
7,1.7371157920972686,0.5805898091193772,0.5806048564004508,0.5806048517529147,0.5805865466104447,0.882301370674472,-1.7115794564420563,-1.7137603177333107,-1.7139066376491938,-1.7070051124259482\ngdb_8693,C(C(CN)(C#N)N)N,-0.0039327643641333,0.1456329561843695,0.1652209152268525,1.0085393266611025,-0.8966147977509215,-0.1772727691198569,0.429866255167461,0.5050545132971133,-0.6315875610927676,0.195006043970577,0.8549996386921251,0.8550281175968028,0.8550281129509625,0.8549866647764037,0.6856230641874385,0.5843668067028404,0.5816539168818456,0.5775092805843072,0.6289858064283574\ngdb_80709,CC1C2CC(C=C2)C1=O,-0.0041181063721869,0.3282828670579317,0.3472004369662975,0.0378089110848251,0.4395206254090738,0.8393807372433719,-0.6055847345236097,-0.9890881988253722,-0.794141749332469,0.3862971366355748,0.6395615118705613,0.6395381644917303,0.6395381848071072,0.6396005826666085,-0.3949999213921342,-0.764133149119752,-0.7639540823264053,-0.7610416854230114,-0.7938749729987915\ngdb_116427,OCC1COCOC1=O,-0.0039048500461359,0.0154267040399213,-0.0826301186198039,0.9352902154695968,-1.3094244623104072,-1.0674093946912175,-0.4138345512474856,0.0862736686317963,-0.2210661822281609,-0.3422018556974269,-2.1113960304756816,-2.1114164125911703,-2.111416417255342,-2.111369716254973,-0.4368463695808643,0.4128904785084362,0.4119633257782714,0.4114569066166418,0.4225841651467562\ngdb_128627,COC1=NC(=O)N=NO1,-0.0042457849484478,-0.2927760787128163,-0.378107250592472,0.5042247502328776,-2.1631343899053213,-2.305467442440216,-0.8058571481675617,0.2777764669460866,1.1055682605497532,-2.3015155234432094,-2.358829205423985,-2.3587741298862137,-2.358774134551914,-2.3590171339089583,0.084757299312783,1.9333675879376564,1.945476188975679,1.946347778427016,1.9014031534877396\ngdb_69893,CC12CC1C1OC21C=O,-0.00379651419666,0.0665444335211283,0.0249073838946981,0.9218296294700868,-0.1418326300060612,0.0667240724073187,-1.129701902145016,-1.1490246018131312,-0.4299526947009219,-0.4523783122111557,-0.25493
53153187417,-0.2549454778313029,-0.2549454575214533,-0.2549099110193662,-0.1495828105441085,0.2414823255015106,0.2420648091847159,0.2427592770140687,0.2365744851115927\ngdb_56234,CC(=O)COCC(=O)N,-0.0026845726782058,-0.4628486035776626,-0.4797485362826345,0.8303499187938089,-0.5741835804618552,-0.1591989290067323,-0.7355487476329828,-0.6523799820090378,2.156398156669665,-0.1455011170497579,-1.6155472848315875,-1.6154989314166148,-1.6154989360777232,-1.615654918667764,1.057564141911978,0.1070723220880945,0.1120282080104449,0.1111984557883936,0.1062579982837404\ngdb_51625,O=COC1CC2(CO2)C1,-0.0015802577205982,-0.4429028393198645,-0.3944085559303519,0.9835784341765746,-0.4630425169266077,-1.5057000174344757,-0.1453842946609116,0.5555607458195637,1.3954214193373038,-0.4330238010287112,-1.1833225366979274,-1.1833292601961114,-1.1833292648545477,-1.1833312104973723,-0.2591220425675504,0.3107089837168362,0.3115717012780688,0.311773351344506,0.3077297774973152\ngdb_68716,CC12CC1(C#N)C1OC21,-0.0040747455052284,0.1565497736588022,0.2094607959506506,1.4613638945086789,-0.2053418091690586,-1.5960692180000962,0.0165380823278155,0.7596901022644663,-0.6877105190398451,-0.7425757655926524,0.2704522020156347,0.2704367727164421,0.2704367680669893,0.2704790258191478,-0.4663850388905567,0.4149462226110025,0.4168365794395636,0.418734585534143,0.3958916692005375\ngdb_114534,CCC1OC2CC(C2)O1,-0.0034023978486761,-0.1218132870672326,-0.0245898047096257,-0.8952841378132115,0.3894460802997857,-0.0281635881865834,1.7124619255255362,1.7066819620604072,0.0593577276823513,1.057574095907437,-0.3162630953822453,-0.3162743668623116,-0.31627437151539,-0.3162388069165319,0.1320191702082906,-0.9534530214656762,-0.9571662444367394,-0.95777128254399,-0.9408162085557976\ngdb_11066,CC12COC(CO1)C2,-0.0035621411394532,0.5566520811890807,0.7452954725861008,-0.8612406169115395,-1.056609076026934,0.0350948522093513,1.582497912416163,1.5467455590726484,-1.4588306960546702,0.2110847139746257,0.6638862845934923,0.663
84577109205,0.6638457664450282,0.6639538018369473,-1.1002356511610345,0.14178660291506,0.1358850588695433,0.134884461853937,0.1536445163892073\ngdb_58435,CC(O)C1(O)C2CC1N2,-0.0038320163900505,-0.0616981805466578,0.1073814012704443,-0.2847224116995381,-0.0929794152652929,0.2564993935951207,0.7899304882081832,0.6607820635746673,-0.2511569210627615,0.6713253727447418,-0.7154319677413238,-0.7154314414706798,-0.7154314461262263,-0.715410981325774,0.6745460631963035,-0.3083610466360789,-0.3126076230560759,-0.315319982047307,-0.2696710558656095\ngdb_61731,OC(CC=O)C#CC#C,-0.0028334453565296,-0.5155700760539571,-0.5387472136085336,0.1732642255361998,1.2602546330539697,-0.4483803708167184,-1.1233102293691453,-0.9007022919110841,2.6264601002334387,-1.2138761450384503,-0.2253070034544977,-0.2252705237955455,-0.225270528448063,-0.2253718151835295,0.7060539771266419,0.73016993272277,0.7386558549846454,0.7407228436262944,0.696687345735526\ngdb_6586,CC12CC(C)(CC1)O2,-0.0033702391837728,0.3086780560353098,0.4322665991998027,-0.8047845668941793,-0.0673314775263907,0.1119086726901289,1.6613285433185698,1.5888340861746904,-1.015163071346065,0.8582586950348101,1.561020190973642,1.5610100916269225,1.561010111947994,1.5610553609404672,-0.1313672978031313,-0.5868225981868891,-0.5910250517228371,-0.5917737128981804,-0.5810132176672459\ngdb_36741,C1CCC(C1)OCC#N,-0.0035934100389244,-0.2469178685816394,-0.2571891693403821,0.3758917846841558,0.4688325542535338,-1.6051061380566598,0.0548881189830403,0.8017786293665082,0.7032861700059979,0.6996959605649702,0.2091271428551023,0.2091421822277571,0.2091421775779255,0.2090569464229081,0.2445122691627017,-0.7797033129954617,-0.7788233538985919,-0.778247460367332,-0.7921366979340486\ngdb_63734,CC1(C(=O)C2(O1)CC2)C,-0.0039609163213455,-0.0338726081870137,0.0878855512134949,-0.956706229267284,0.3210515796627108,-0.1320881688370467,-0.7802904570640783,-0.707095067241692,-0.161506315074794,0.2215133092856638,-0.2865736755319817,-0.2865526829348295,-0.28655
26875877242,-0.2865834641224474,0.8010700300728172,-0.458346458170362,-0.4557097421813762,-0.4549739951467857,-0.4673186299372946\ngdb_105872,CC1CC2N(C)C12CO,-0.0041984671122384,-0.0201587113943776,0.067111883268918,0.0526416927444788,0.8962981832352503,-0.0688297284411119,1.3779643835882975,1.3931224351501963,0.0487367309848488,1.3526402306733374,0.1808296021684511,0.1808565937441534,0.180856614056696,0.1808173288230058,1.1535648171684776,-1.1286029433932203,-1.13075543846968,-1.132572355635886,-1.1040174400429437\ngdb_66189,CC1(OCC=C1)C1CO1,-0.0040602384601396,0.1291093521759308,0.20180301589551,-0.833731361058173,0.1329667029107565,0.2474624735385596,0.0804548100865235,-0.0357830599641243,-0.5026761207134495,0.2920190285931396,-0.2861328642880516,-0.2861344603889107,-0.2861344650418028,-0.2861204172679643,0.1357115038720022,-0.4120423955430207,-0.4121649465289796,-0.4117363240177704,-0.4144579701958522\ngdb_30467,CN=c1[nH]c(co1)C#C,-0.0017659921096808,-0.3209426043076366,-0.3014016682523027,-0.1322126654720861,0.8425596470204063,2.1045495451620586,-0.7057209413455857,-1.6772356169437566,0.6957800174919173,-1.1292752925872374,-0.1310049497431679,-0.1309915693376422,-0.1309915739895757,-0.1310308487476385,0.088941944131656,0.8196708427214344,0.8249137443311724,0.8263723499792048,0.800168855054514\ngdb_74181,CC1=CC(=O)CC2OC12,-0.0040906286471655,0.1558236695468532,-0.0376965653284361,0.8158438503865707,0.1671639532292939,-0.0507558883279886,-1.4599383288983412,-1.418391175266199,-0.2836207894054804,-0.3808808244735235,-0.2572372241957412,-0.2572522170591469,-0.2572522217118606,-0.2572148609088202,-0.3873690984871298,-0.0003167081459362,0.0018900761868272,0.0042759326564169,-0.0265547655212269\ngdb_94268,OC1COC1(C=O)C=O,-0.0041332710699863,0.0555076510195041,0.1235184157370578,-0.6876905372285433,-1.4547627761641906,-0.7691910328246705,-1.8476998106345035,-1.4667929814335472,-0.5550000697115467,-1.190524506546588,-2.0801510529224494,-2.080150795008138,-2.080150799672
1167,-2.080163029206846,-0.1894600141121927,1.0713972324535828,1.0741722221544827,1.0738727901299745,1.0731806268725024\ngdb_92873,CC1N(C)CC=CC1=O,-0.0041377198971469,0.1128698758634723,0.028366619461919,0.6183130179762929,0.8559942810741173,1.2144129195906963,-1.263927030438303,-1.8140233300253925,-0.1860683305923184,0.6726176770628239,0.2090393999748481,0.2090511438115599,0.2090511391617278,0.2090404464783904,0.5243911608720352,-0.7889200738615979,-0.7883021571546039,-0.787659408953193,-0.7940203041922368\ngdb_55900,CC(=O)CN1CC1C=O,-0.0033042970649161,-0.3969941175982912,-0.3718459429430959,-0.3345788540181099,0.0633508719051617,-0.2540865896006346,-1.1211796717771882,-0.9890881988253722,1.3505231959607842,-0.1357337006921581,-0.6872319801630741,-0.6872044151270261,-0.6872044197823968,-0.6872909820831058,0.5408835845699467,0.0302913287223079,0.0332270465334836,0.0329530844764308,0.0285542532792339\ngdb_10013,CN=C(C=O)NC=N,-0.0030591749781102,-0.1437542591456905,-0.1785942419806487,-1.6333947176698098,-0.3250321852839397,-0.0055712880451783,-2.062886127422154,-2.034988097311112,-0.0084110442991426,-0.9435741674376656,0.3883584634480689,0.3883710374067877,0.3883710327580637,0.3883348345167687,-0.383430609245838,1.3346766942708332,1.3350173478838694,1.3328755760024835,1.3511849526465372\ngdb_33729,C#CC12CC(C1)N1CC21,-0.0035930287108821,0.0599211012303938,0.1489652462421021,-1.077328762059005,0.8364529951778107,0.3197578339910555,0.6109636504838006,0.4524438544195609,-0.6072922054054974,0.0156161724674544,1.168832756242948,1.1688041633002926,1.1688041586563915,1.168874944412995,-0.4533387932787758,-0.1827115547600056,-0.1848114190318799,-0.1835475403704279,-0.1910041354971577\ngdb_17750,OC12CC3C1C1C2N31,-0.0032442627674689,0.7562991422310481,0.9455751719305004,-1.0195005163815007,-1.528042598275342,0.5953838957161974,1.1031224542258526,0.8101963347869169,-1.8500363949629288,-0.8588831542200748,1.2226247833440922,1.2225646071753822,1.222564602531812,1.2227083972839312,
-1.8601179191528647,1.1949778696146112,1.188492255386171,1.187383935345364,1.2100227692695698\ngdb_74728,CC1=CC2CC1(O2)C=O,-0.0040417357603465,0.2363770283315915,0.1138526361442163,-0.3673155219101208,-0.199235157326463,0.1028717526335666,-1.097743538265662,-1.1321891909723143,-0.61272441645157,-0.4289365129529155,-0.2557525597445299,-0.2557623023596291,-0.2557623070123335,-0.2557352077904147,-0.2615835983433576,0.1556366543787675,0.1570180858149112,0.1583098134613842,0.1423599766271416\ngdb_87301,CC1NC1(CC#N)C#C,-0.0042219989209957,-0.0536605237248228,-0.1553379564258235,1.4020981105205472,0.5714243052091462,-0.6833402922873305,0.1571548833969732,0.4734881179705818,0.2515079400045461,-0.4387940900768941,0.7667399616099502,0.7667704794329975,0.7667704747866103,0.7666988565039277,0.704330888083577,0.1552433359918774,0.1602383318349818,0.1615073468585279,0.1372192418473048\ngdb_115928,COC1CCC2OC2C1,-0.0034631726962345,-0.1902754339008207,-0.1750984973309242,-0.1597872639759355,0.3845607588257099,-0.4303065307035937,1.4994061663298428,1.6835332721542844,0.4120718307444422,1.0517436996816694,-0.3163359856270089,-0.3163348011934374,-0.3163347808839672,-0.3163332385203332,0.2774971165585247,-0.9611096193971465,-0.96345858956515,-0.9640166665799882,-0.95159636358564\ngdb_1680,C[NH2+]C(C)C([O-])=O,-0.0031444156118049,0.3467637951942328,0.2718274361375113,1.5477468784472788,-2.2144302653831267,-0.272160429713758,0.1891132472763272,0.3156561413379248,-1.193772254593619,-0.3636300645065624,1.2131219173823926,1.2131155085404726,1.213115528859394,1.213132238863869,-0.8811571871141506,1.17110082140819,1.1667536454524423,1.163359852705994,1.1961279672802712\ngdb_29985,Cn1cnc2c1C=CC2,-0.0036067454674193,0.1489098956113392,-0.013390643651631,0.4347655127693891,0.8926341921296926,2.325954086547828,-0.3690928418163899,-1.4478531442376286,-0.4442097018452665,-0.2970313117421264,0.7646615661752831,0.7646399258759847,0.7646399212295859,0.7646878103160734,-0.7733410441337745,-0.0630772120
803625,-0.0615921748459305,-0.0587585735701154,-0.0923584504139413\ngdb_55886,NC(=O)CC1NC1C#N,-0.0034923581077064,-0.3285004010207054,-0.3281354439156005,-0.4621277077610348,-0.6022741789377953,-0.8188940931357612,0.0250603126956432,0.4061464746073141,0.8649302832754218,-0.7346716717401945,-0.5627081082657573,-0.5626912207230568,-0.562691225377658,-0.562756465131861,0.2216198004476894,0.6708234323865536,0.6743081141011535,0.6743901552520434,0.6669742406611996\ngdb_100417,CC(=O)CN=C(CO)N,-0.0038188633358367,-0.2778057061090686,-0.2791767642782056,-0.4382122976842362,0.0560228896940481,0.2881286137930881,-0.8527294151906142,-0.9743572143396574,0.9019777695850932,0.2651210666237493,-1.1189404057547196,-1.118905439452635,-1.1189054441106732,-1.1189822256630888,1.1771957526162322,-0.1191093494767981,-0.1175215324706126,-0.1191710507969943,-0.1007164188787922\ngdb_70832,CC12CC3(C)C1CCC23,-0.0041138343928141,0.2608751496738696,0.2134311586729226,-1.714550289569765,1.757336093041279,-1.2345924157376142,1.5505395485368092,2.1065229695298053,-0.3772955996414356,1.7470334763988318,1.5077409167895124,1.507750371522193,1.5077503668803869,1.5077480590991186,0.6856230641874385,-1.9106510026602872,-1.9121940087007887,-1.9109407617603364,-1.9219755948842097\ngdb_110393,CCN1C2C3C1C2N3C,-0.0023366301823043,-0.2746108480164931,-0.1520703935417468,-1.0126395380807798,1.0733910866705332,1.3770774806088142,1.3481365773009002,0.6902440325460968,0.5487911138479402,0.9937402733119216,0.708956319923581,0.7089602114586048,0.7089602068118618,0.7089587859068188,0.3513437898327545,-0.6674056692926175,-0.6726384054461805,-0.6752502385242487,-0.6304771169551925\ngdb_9380,CCCCC(O)C#C,-0.0024256675169422,-0.3968236409807032,-0.3803799409783243,-0.9037786823760088,0.3576914907182879,-1.0764463147477783,0.4852607525583413,0.9806548695501862,1.5182586633419484,0.7686990004328161,1.5618878095480375,1.561936951071234,1.5619369464297623,1.5618189763758064,1.07110269867892,-0.4956854836991769,-0.49452163387
7522,-0.4959535212216118,-0.4938401480087909\ngdb_39729,C1NC11C2C3OC1OC23,-0.0035279431984973,0.1905377600468133,0.4989960747458496,-1.162666263589876,-1.1030196300306645,0.1887224931709053,1.322569886197417,1.218455047676722,-1.1640636328616214,-0.6501309264665666,-0.6561392982890545,-0.6562022273235741,-0.6562022319787533,-0.6560537663933049,-1.5066385097468822,0.6728005128113239,0.6680079717668835,0.6681344482676371,0.6826358140273157\ngdb_105646,COC1(CCC1C)C#C,-0.0043204368208511,-0.0057502802337903,0.1234819066545541,-1.2311453612961183,1.2370493560521034,-0.484528051042965,0.6237469960355422,0.8417627301134483,0.0399237308308108,0.9341440067361644,0.6116981673885021,0.6117416921932006,0.6117416875458569,0.6116616584580333,1.4720901345579924,-1.067423579373677,-1.0649600163654978,-1.0648047773753613,-1.0714290570339546\ngdb_48634,O=CC12CN(C1)CCO2,-0.0035281863642055,0.0141007747920145,0.0719493367006517,-0.0357015707086124,-0.4654851776636473,0.970416078063522,-1.0572629440184802,-1.4962549504049767,-0.5120024231680841,-0.0035580171822342,-0.68652686689464,-0.6865501217552296,-0.6865501264105962,-0.6864969628165668,-0.5188161769152606,0.1043584252189675,0.101351234121221,0.1005967873511252,0.1191981738033063\ngdb_40551,C1NC1C1=CCC2OC12,-0.0033325706049824,-0.132679592951269,-0.0904704440874627,0.1769234139632509,0.4395206254090738,0.5456808354051067,-0.1389926218850408,-0.3914311139763779,-0.0071045541120407,0.0154959581122838,0.2406118594188505,0.2405895271674986,0.2405895225178614,0.2406343455376635,-0.5751858041812563,-0.0960136937985819,-0.0976932379722719,-0.0970438100031434,-0.0992203624710032\ngdb_108444,COC12CCC1CC2=O,-0.0041995337254583,0.2102246524038296,0.0861331152533198,-0.3393488675033728,0.0657935326422014,0.8077515170454057,-0.7675071115123369,-1.134293617327417,-0.3597982823613645,0.2765113767761504,-0.286142799328263,-0.2861351094151844,-0.2861351140680765,-0.2861529678392971,0.2550969589986746,-0.4130860003295711,-0.4122325223130891,-0.411803
4231850944,-0.4181738893496762\ngdb_14593,C(C#N)N=COC=O,-0.0022751755760618,-0.3172489442598962,-0.3114599204820583,-0.488264767954257,-1.8223832170884664,-2.7618319052965985,-1.0913518654897911,0.2062259708726164,0.4281386216965295,-1.9987556699463995,-0.0783128414279826,-0.0783246598465268,-0.0783246644981348,-0.0783107552192209,-1.3050370917082337,2.0818216798856457,2.084360019731406,2.0842494709648545,2.0740965066779484\ngdb_37709,C1C2OC3C4C(C3O2)N14,-0.0038443791557127,0.5854247458686581,0.9340109200474692,-1.095886074796193,-1.2544645957270433,0.6857530962818179,1.3353532317491588,0.9974902803910024,-1.5654270695665664,-0.5994004685846316,-0.6566036241256433,-0.656685951598064,-0.6566859312906971,-0.6564943972868552,-2.0233190670889147,0.6240264107138921,0.6176432200571317,0.6181275034504631,0.6323341231505527\ngdb_52627,CC#CC(C)(C)OC=O,-0.0038874526450162,-0.3612697952904022,-0.3420180225376136,1.229528170594797,0.904847495814884,-0.986077114182158,-0.0239425119193662,0.4356084435787437,1.4894579718630947,0.1749603002459024,-0.2866781681910827,-0.2866153139702673,-0.2866153186231623,-0.2867806896317349,1.5545522530475482,-0.4693226632867712,-0.4622308053483096,-0.4614490647940097,-0.4898335664671676\ngdb_47079,O=C1COC2OCC=C12,-0.0036392467749114,0.1144736188585595,0.005603206520893,-0.5728181576793313,-0.7524978142656562,-0.2947527298551631,-1.687907991237733,-1.52992577208661,-0.6155479365577045,-1.0540511598393207,-1.1539174886518906,-1.153958499782448,-1.153958504440703,-1.153865779318863,-1.2784522893295105,0.7759443265890982,0.7764904968798304,0.7782906351168026,0.7595473615439622\ngdb_84848,CC1C=CCC2CC1O2,-0.0040538774662741,0.224986664697192,0.1847441470956793,-0.7975968753410432,1.0037752556649386,0.1887224931709053,0.361688412224839,0.2693587615256792,-0.4701007270797936,1.0911740081775811,0.6103573113962228,0.6103435897488984,0.610343585101546,0.6103886415585729,0.0040182698662913,-1.208270893719202,-1.2105286516107765,-1.2093467067411194,-1.2167545
516714562\ngdb_48128,N=C(C=O)N1CCC=C1,-0.0037375409859393,-0.0576761951613405,-0.1784299511093823,0.4034663831880055,0.7717224856462934,1.53522358159865,-1.5707273236801018,-2.266474996372342,0.0836280616697319,-0.3480623055119869,-0.1618401443313191,-0.1618445809482804,-0.1618445856004045,-0.1618444826544877,-0.3524150064706609,0.2042088530375961,0.2056648531679515,0.2066134717238107,0.1944426870374414\ngdb_22429,ON=C1CC2CC(C1)O2,-0.0034119199967477,-0.0990325597984338,-0.0211488236836566,-0.8386974024948852,-0.1271766655838304,0.0079840920396644,-0.0772064517182896,-0.0799760134212687,-0.2528991462808016,0.0176598165053518,-0.6863671324164761,-0.686395803277321,-0.6863957829701377,-0.6863235261834149,-0.431677102451668,0.1211373876037211,0.1174186763270338,0.1165534854883284,0.1389974117491346\ngdb_48683,O=CC12CNC(=O)C1N2,-0.0033890900526481,-0.1251912496749953,-0.042625291466429,-0.2830235027869788,-1.0077558612861677,-0.8369679332488859,-1.195749187495681,-0.7933765478008776,-0.3385155393408245,-1.040857634359362,-1.0590314143356008,-1.0590586019182098,-1.059058606575878,-1.058992945533726,-0.8257721821584783,0.9267924164523804,0.92740021813834,0.9281359794383996,0.9237459639687262\ngdb_24680,c1cc2n(c1)C=CCO2,-0.0033391305526092,0.1716716810337394,0.0563417039303411,-0.2978562844466325,0.4883738401498404,2.348546386689233,-0.3776150721842177,-1.4667929814335472,-0.711962265478736,-0.6360357933228299,0.2686292719114158,0.2685869979105639,0.2685869932610997,0.2686853994750974,-1.1332204985568572,0.2234604770146014,0.2242403965920701,0.2274967971849537,0.1911342651523318\ngdb_3926,c1cn(nn1)C=O,-0.0006569685797245,0.505654316735065,0.5772076567392946,1.6031574460569102,-2.982647067181697,-1.8219922194141476,-1.8136108891631924,-0.942790819013126,-1.9727356621653247,-2.423953844184325,1.3990460352269622,1.3989772624640613,1.398977257821581,1.3991285249417382,-3.158834746468997,3.0141802938872444,3.009113832714206,3.0073559424826284,3.0271072354898947\ngdb_112733,COC1
CC(C)C1(C)O,-0.0040877548706146,-0.139422890269195,-0.1107603666888663,-1.0710558676126318,0.8926341921296926,-0.1817912291481375,1.5079283966976704,1.5741031016889762,0.5471235962767138,1.6324992495101736,-0.3462585793732785,-0.3462069344037318,-0.3462069390569952,-0.3462953904189174,1.972032112624532,-1.4807094297054275,-1.48057967839029,-1.4823706378472414,-1.4601187599343897\ngdb_71613,CC12CC3C1COC3C2,-0.0038304413394408,0.1803091629915323,0.3975099526563277,-0.6838353208500431,0.6752373865332767,0.2429440135102779,1.593150700375948,1.4625685048685646,-0.7516870736766145,1.1146759146134069,0.6103991234373122,0.61036822778476,0.6103682231374077,0.6104525944448906,-0.3652150965048607,-1.2038788383989238,-1.2079633708832542,-1.2067995191199945,-1.209453796401443\ngdb_905,Cn1cc(nc1)O,-0.0012206045852518,0.4171643243092847,0.4719702264225225,0.252916916475045,-2.332899311129488,1.9283296040590976,0.4405190431272457,-0.4650860364049511,-1.674959918286957,-1.3559995664387297,1.7691253095479038,1.769084545113787,1.7690845404735958,1.7691811429549884,-2.0186421111148807,1.9369887392196297,1.9330552400427208,1.9315704771924005,1.943628989393691\ngdb_82996,OC1C2OC3COC2C13,-0.0033342727649395,0.1271899117408658,0.3342762217600053,-1.1103268005529483,-1.1433235321917974,0.3333132140758983,1.2501309280708812,1.0816673345950858,-1.018335465439234,-0.2897282896655201,-1.182990286986353,-1.1830508279245826,-1.1830508325830171,-1.182900065199763,-1.2516213313732072,0.3456094352469293,0.3405617126612471,0.3405588941267098,0.356948608498418\ngdb_65847,CC1(OCC(=N)O1)C#N,-0.0039394569476009,-0.0169827951482008,0.1277169602249776,0.1317062998289763,-0.9906572361268986,-2.300948982411935,-0.3179594596094234,0.7575856759093645,-0.4883815810192707,-1.1072760655910427,-1.0590519833635756,-1.059063369765069,-1.0590633744227376,-1.059028890949923,-0.3551227178240492,0.9246317874466436,0.9269037960315863,0.9276430586318738,0.919642494964504\ngdb_10524,CCCC1(CO)CO1,-0.0033181188248273,-0.1692
752401934968,-0.1150775656949231,-0.5408002589426338,-0.5131170620358957,-0.9408925138993478,1.6038034883357326,2.0244503416808235,0.2081910880610613,0.8160935099587708,0.6345567977703652,0.6345780311181041,0.6345780514334504,0.6345288841379448,0.4441444425807046,-0.3155115749097493,-0.318307380539223,-0.3209795387369952,-0.2935481161334763\ngdb_92446,CC1CC2CCC1C2O,-0.0042320350329504,0.1777015021373156,0.2416983158013738,-0.7760991433321178,0.9671353446093632,-0.3399373301379733,1.1159057997775943,1.260543574778764,-0.3448135012030833,1.8111678348822733,0.5798556895275129,0.5798561297057934,0.5798561250582526,0.5798744279762276,0.6142379466890142,-1.7886934593762462,-1.7917167819089086,-1.791313269369601,-1.788299392809604\ngdb_17461,CC12CC1C1OC1C2,-0.0031344955562109,0.3500344206724029,0.4894124405886415,-0.5406042309911847,-0.6132661522544685,-0.4257880706753132,1.4994061663298428,1.6793244194440808,-1.2662944346484613,0.1930826142878494,1.5915292017164782,1.59149528007807,1.591495275436781,1.5915838778030085,-0.9951272195340464,-0.0056238842463801,-0.0100734366690859,-0.0100445780925397,-0.0078355378207136\ngdb_85749,CC1OC(C#C)C2NC12,-0.003710350638573,-0.0561924171934449,-0.0162657338987934,0.1675794149441739,0.3613554818238456,-0.1320881688370467,0.6088330928918436,0.6607820635746673,-0.1581491178552959,-0.0434691830988285,0.2410188465937199,0.2410168860063505,0.2410168813567159,0.2410288964046165,0.1298037700100636,-0.0532626072665707,-0.053197183205111,-0.0528615890571757,-0.0541790908871528\ngdb_22508,CC1(CCC1=NO)C#C,-0.0041231410076444,-0.1239221459662844,-0.0169959155488664,-1.214417642772456,0.8621009329167129,-0.3128265699682876,-0.2583038470346293,-0.1094379823926976,0.1067348866122378,-0.1259662843345583,0.2416389378848709,0.2416707550148152,0.2416707753277335,0.2415902439875626,1.0235946722058324,0.0118735397250932,0.0148828202165082,0.0147408220222422,0.0099034167283364\ngdb_109162,CC1C(O)CC=C1CO,-0.0040180934217219,-0.1186941963602518,-0.1683899534
208785,-0.3192886738050747,0.6703520650592009,0.360423974245584,0.331860605937442,0.1578241647052677,0.4190621787879371,0.9994504551825184,-0.316757101527404,-0.316729159542513,-0.3167291392330452,-0.3167771893724629,1.1511032613926704,-1.0053448273094427,-1.004518675616816,-1.0047871529408152,-1.002277055419044\ngdb_26258,CC(C)c1nccn1C,-0.0040392654178114,-0.0161809236506571,-0.0594285966887341,0.5836160705697905,1.2028521057335675,1.368040560552252,0.5385246923572649,-0.1073335560375958,0.1746927511543272,1.0335312248733444,0.7045196504842864,0.7045558193133299,0.7045558146665597,0.7044550249540696,0.8793475037435027,-1.1334460037967156,-1.1312180726839642,-1.130595510937018,-1.1446189861342706\ngdb_70622,CC12OC1CCC21CN1,-0.0041185540181497,0.2892247797838771,0.1537753178619581,-0.9078952693564414,0.3515848388756906,-0.1094958686956416,1.2565226008467518,1.2921099701052954,-0.5226820421338997,0.6994855854434224,0.2101617849321382,0.2101542638143481,0.2101542591645228,0.2101870303687961,0.1728809960866973,-0.6710215763298497,-0.6734467157868687,-0.6736140511364136,-0.6631281918512927\ngdb_129970,NC1=NC(=NO1)C1CO1,-0.0022100845371837,-0.2645716694251987,-0.2128032522865691,0.2231206678547713,-1.2959898282566962,-0.3263819500531305,-0.4841429517820644,-0.3261938969682132,0.1963815606187933,-1.4080523822275397,-1.4587899488048832,-1.458817573508027,-1.4588175781681674,-1.458772797173455,-0.9190651460615888,1.5099446117144455,1.5092900969637584,1.5083605444536494,1.525519693438436\ngdb_34276,N#CC12CN=C3CC1N23,-0.0032549731116145,0.0964220392406284,0.1123740183028185,0.3966054048872847,-0.3885413644469371,-1.1080755349457472,-1.7560858341803551,-1.2163662451763984,-0.8754526227471707,-1.399967966842326,0.3971703947916881,0.3971258526664057,0.3971258480177358,0.3972287289677316,-1.4098993677576404,1.2859760116060392,1.2844680623017128,1.285124195502294,1.287193633230687\ngdb_131242,c1c(nc2n1cno2)O,-0.0023430353881168,-0.0431856826663591,-0.0420776552288743,0.056104853220
0808,-1.5512478752772063,0.9071576376675872,-0.7675071115123369,-1.1805909971396626,-0.7032829526926807,-2.063070349962597,-1.4292981563750948,-1.4293613413265949,-1.4293613210240026,-1.4292080917447965,-1.7838096901028273,1.9842918305496733,1.9831081035185376,1.9837168531523437,1.9886702609770877\ngdb_119647,OCCC12CC3C(O1)C23,-0.0029316898291172,-0.2718327105446884,-0.1713836981861778,-0.4594486590912295,-0.1051927189504858,0.3875347344152708,1.3310921165652447,1.1342779934726384,0.377109366941301,0.3129363263928002,-0.2847728870877283,-0.2847819894841786,-0.2847819941370623,-0.2847627289436834,-0.0240434659779164,-0.2691865353017739,-0.2713474087207022,-0.2719119822623643,-0.2594665263484045\ngdb_79153,CC1CC2C3C(O)C2N13,-0.0030616674266188,-0.1077205533466232,0.0121109504771688,-1.0570071977587747,0.4358566343035161,0.410127034556676,1.3502671348928572,1.1426956988930468,-0.1897568866089281,0.714542433428523,0.2110302771909751,0.2110125511365595,0.2110125464867395,0.2110686665873159,-0.0850900492179471,-0.5797926875518639,-0.5840829394399467,-0.5848805638241922,-0.5624820656294262\ngdb_121976,OCCCC(O)CC#N,-0.0019966403103582,-0.5561119412959076,-0.5676076433276692,2.1512515983087823,0.2184598287071001,-1.47858925726479,0.3702106425926668,1.0543097919787594,3.2893456972992965,0.6027731367087855,-0.7176496284758314,-0.7175936475398235,-0.717593652195382,-0.7177341535442717,1.464705467230569,-0.5413104165784873,-0.5377337487466122,-0.53885827725121,-0.5348805373130693\ngdb_42978,O=C1C2CC2NC11CC1,-0.0039152840656131,0.2088482115655263,0.1622545522734307,-0.0327611514368751,0.1293027118051988,1.2460421397886638,-0.4905346245579353,-1.0648475476090475,-0.6581169971273405,0.021176086394088,0.2397308609913064,0.2397065019592298,0.2397065222721361,0.239768560249204,-0.4132154341331105,-0.1885562659892,-0.1896326913207812,-0.1883322271481039,-0.1980569728481649\ngdb_130459,[NH3+]CCN=C1[N-]N=NO1,-0.0034893793277815,-0.1402373896643376,-0.1812320231915375,2.235543617431924,-0
.7158579032100809,0.5276069952919821,-0.8484683000067005,-1.0837873848049666,0.086956305207929,-0.7004406341054054,-1.3924176645555146,-1.392434966507198,-1.3924349711669264,-1.392394094504572,-0.5296470223288137,1.2891776232753285,1.2841431787238513,1.279924010034236,1.348782713500315\ngdb_80345,NC12C(O)C1CC2C#N,-0.0042092437743043,0.0927788907832844,0.24941085948027,1.5712048899706956,-0.307933560124671,0.2474624735385596,0.3552967394489682,0.2356879398440448,-0.5726779006836793,-0.3677474061711507,-0.1606739004151082,-0.1606717155835909,-0.1606717202357078,-0.1606575600192381,0.2617431595933556,0.3267144199411222,0.3277820922587856,0.3278694092907674,0.3299398034831239\ngdb_68086,CC12NC(C)(C=C1)C2=O,-0.0040121082294053,0.1774931418269302,0.2286098097238152,-0.8929318023958216,0.3222729100312306,-0.0597928083845496,-0.2199538103794044,-0.1873017575314752,-0.5959001828985337,-0.0651678742070961,0.2398963368243671,0.239900510890045,0.2399005062404036,0.2399094962352093,0.2782355832912671,-0.1711742154107974,-0.1694327300098416,-0.168277318329698,-0.1819679547941083\ngdb_8236,CC(C)N=C1CCO1,-0.0031639241333909,-0.0671408044118753,0.13999313921683,-0.3972424558313603,-0.0880940937912171,-0.1275697088087649,0.3638189698167959,0.4187730327379276,-0.3916019352798486,0.4896814820821725,1.1602544231563492,1.1602653240146097,1.1602653193706556,1.160240081617502,-0.0420128231413126,-0.1094730489979784,-0.1117749917527269,-0.1134676215744135,-0.0974421925691735\ngdb_88960,CC1CC1CC1(O)CC1,-0.0036320070686005,-0.328563540508701,-0.2595166233499899,-0.8297454593787067,1.2968945441095447,-0.4348249907318743,1.356658807668728,1.5425367063624449,1.180790996928415,1.673852987688813,0.5811254525687362,0.5811665636780119,0.5811665590304792,0.581068689467284,1.5941833010380522,-1.655313950136066,-1.655276075655585,-1.6558348890670578,-1.6519644847755584\ngdb_42304,C#CC(=O)N1CCC=C1,-0.003523964123273,-0.1113195041623702,-0.1906878755599829,0.4876930596606643,0.9475940587130566,1.20989445
9562416,-0.9251683733171502,-1.4773151132090576,0.1123515433317829,-0.7316062056833476,0.2694124676064305,0.2694127590314302,0.26941277934452,0.269402460604042,-0.251737375240127,0.3057295729392192,0.3102175865276835,0.3128701646565409,0.2729927725240693\ngdb_114933,COC1CN=C(O1)C#N,-0.0035516905404959,-0.1834626831460996,-0.20541016307958,1.5659774779320512,-0.7109725817360033,-2.011767540601949,-1.4599383288983412,-0.5071745635069931,0.2239654825772923,-1.0805283715656149,-1.0587883802866238,-1.0587986419329412,-1.058798646590608,-1.058779020383163,-0.60521678464611,0.9523214018837411,0.9544669187422024,0.9550117766886488,0.9481673038068118\ngdb_25540,c1c(oc(=O)[nH]1)CC=O,-0.0029351052020181,-0.3264420537120501,-0.2960895967480215,0.7304410062052168,-1.1372168803492018,0.7761222968474383,-1.0061295618115138,-1.3531539582580343,0.5802716293849903,-1.457370321436222,-1.5561784800209204,-1.5561937562329058,-1.5561937359310976,-1.5561805815875198,-0.7103252162730977,1.0962313554018552,1.1005527686427778,1.1025086501543004,1.0719353380154248\ngdb_118790,CC1(CCO)NC1C#N,-0.0033194120242752,-0.3157967360359983,-0.271610256929324,2.2456063856063144,0.2783050167645398,-0.9815586541538776,0.2807272237304755,0.7365414123583436,0.7847771996599054,0.2907267242750576,-0.1909673856876421,-0.1909347623112022,-0.1909347669635061,-0.1910047023025508,0.956394199526282,-0.2318448876503798,-0.2300404111497621,-0.2308963258670815,-0.2225324556796012\ngdb_115326,OCC1NCC(O)C1=O,-0.0038092748698447,-0.132464918692084,-0.1614075913920553,-0.066477959086132,-0.8905081459083242,-0.1682358490632946,-1.1062657686334898,-1.0143413150865976,0.1422460405486761,-0.0096288421183428,-1.614645392864707,-1.614647883233613,-1.614647887894715,-1.6146335696058718,0.2366352906801163,0.2018096108771451,0.2006382544572755,0.1991835293112261,0.222853510629177\ngdb_55457,CC(=O)C1COCCN1,-0.0036281440497369,-0.1881160634113725,-0.1609603551313856,-0.9114891151330096,0.1464013369644675,1.3228559602694416,-0.675893
1350581886,-1.281603462184563,0.3778528688808835,0.6916415987685495,-0.7177550197817862,-0.7177476914296924,-0.7177476960852518,-0.7177612873410827,0.3912209934008386,-0.5523810181081685,-0.5537726011980235,-0.5547840065419564,-0.537978086485461\ngdb_22336,CC1CC(=NO)C(=O)N1,-0.003832160078878,-0.083266629645942,-0.1190296738759432,0.7395236346223616,-0.0856514330541792,0.2790916937365258,-1.0572629440184802,-1.1742777180743564,0.0388410061312938,-0.3716543727141904,-1.088431245237408,-1.088433256489517,-1.0884332611473668,-1.088423454741916,0.1440807935097481,0.4621050971987676,0.4620759678323389,0.4612161006617449,0.4759150140453863\ngdb_79164,CC1C2C3C(CN23)C1=O,-0.0038830922417495,0.2536898759399746,0.1721211318200422,-0.9742180595967428,-0.1406112996375414,0.1345009728315341,-0.6567181167305761,-0.7113039199518962,-0.7224084677298659,0.018591477757923,0.2400120376067212,0.2399853336315417,0.2399853289819007,0.2400501326760818,-0.6391862543522555,-0.1590206772562949,-0.1606010948405186,-0.1595079733078283,-0.1659131323150874\ngdb_34456,C#CC12OCC3CC1C23,-0.0034110191783289,0.064258784055689,0.1342977223462605,-0.4337036548009054,0.4395206254090738,0.7083453964232229,0.498044098110083,0.1620330174154715,-0.6590335062490178,-0.3602941161505821,0.6720390834326687,0.6720088491276611,0.6720088444806898,0.6720850543742616,-0.6246630752749908,0.0238487735450398,0.0237248516603099,0.0259567059142506,0.0025912629337661\ngdb_113301,CCC1(CC1CO)C#C,-0.0040623495806059,-0.2468547290936439,-0.2730341111469664,-0.6510333103075491,1.3701743662206955,-0.0055712880451783,0.8517166583749343,0.8438671564685502,1.0143916883213646,0.933062077539631,0.6116597751979389,0.611716130543027,0.6117161258956816,0.6115944604994211,1.763046027258459,-1.0714564039005947,-1.0676214626319442,-1.0674474522730646,-1.0791002643668488\ngdb_31907,COc1cnc(o1)C#N,-0.0032909063714876,-0.1797690230983591,-0.2292414666838377,1.864528047989133,-0.4154106325543591,0.0576871523507564,-1.3747160252200636,-1.38472
03535845658,0.1284830332507281,-1.7748263798526205,-1.0291174825465008,-1.0291375421118771,-1.0291375218068115,-1.0291019605667937,-1.0411583125416504,1.445482350225654,1.4496154814582662,1.4515481956274492,1.4241440616149112\ngdb_53083,CC#CC1=CCNC1=O,-0.0033548700057188,-0.2923467301944465,-0.3326443156048014,0.5540158499009661,1.3420837677447557,0.3152393739627737,-1.0402184832828247,-1.1742777180743564,1.006061249522344,-0.7390594957039163,0.2686678388388667,0.2686849759153821,0.2686849712659185,0.2685985313859551,-0.0235511548227551,0.2275116563991568,0.2344417409234959,0.2376261907137458,0.1812175483921276\ngdb_84930,CC1C=CCNCC1=O,-0.0042233086999237,0.1416741102870477,-0.0238778776008045,-0.0233518097673149,0.8242396914926178,0.9613791580069596,-0.5310152188051169,-0.9722527879845556,-0.0960870807129503,0.7177581674293326,0.2092680806365865,0.2092769550297997,0.209276950379969,0.2092556446957765,0.3850671039613198,-0.7648988089128993,-0.7647909824219092,-0.7643140601986825,-0.7694536331611603\ngdb_59845,CC(C)CC(=O)C(=N)N,-0.0036324878735234,-0.3411156707222187,-0.3419997679963618,-0.516166079710522,0.7546238604870248,-0.3263819500531305,-1.323582643013097,-1.1553378808784374,1.4332392165507253,0.9572251629288938,-0.2222486086639543,-0.2221942141446319,-0.222194218797129,-0.2223216718845378,1.7933231633008946,-0.8941589635806275,-0.8916073375769384,-0.8926747672908322,-0.8857185882808033\ngdb_103486,CC1(CO)C2COCC12,-0.0037590943100711,-0.1070765305690684,0.0078211332829899,-0.1670402981795547,0.4004380536164588,0.0305763921810695,1.294872637501977,1.2647524274889677,-0.0259823543203166,1.0298045798630606,-0.3162435248507758,-0.3162319055664661,-0.3162318852569953,-0.3162361858966018,0.6676537070240424,-0.951397277363528,-0.9527452287152528,-0.9533788678218518,-0.9405169972742392\ngdb_2525,CNC(C=O)C=O,-0.0035269871151448,0.4576809337560369,0.2474758781075766,0.9912235242830923,-2.434269731716581,-0.3760850103642212,-1.2170547634152504,-1.024863446862108,-1.25162
67141730693,-1.158397220127282,1.244084595011642,1.244087242034147,1.2440872623532595,1.2440569041674163,-1.1504513889875112,1.7999539911043634,1.7983637069703715,1.7953927171073596,1.8145292951108416\ngdb_102375,CCC(CC)(CC)C#N,-0.0045120071869294,0.03123683183401,-0.0806403736233549,0.805977110163629,1.673064297613455,-3.028421046965179,0.6791414934264225,2.08126985326858,0.5955305768834928,2.0222041353840203,1.0758336406191735,1.0759078844314098,1.0759078797869346,1.075755198530307,2.309757565065345,-2.0809395093258645,-2.077671509572029,-2.077689934773296,-2.085678361471873\ngdb_102335,CCC(C#N)(C#N)NC,-0.0044092696752344,0.0197454450188177,0.0702607916348578,0.8317221144539532,0.1732706050718896,-1.4062938968122942,-0.5587124675005571,0.1031090794726135,-0.0044029254296553,-0.0769789346025942,0.3346110441782241,0.3346574981612681,0.3346574935122108,0.3345655092668123,1.1026106126092583,-0.0383269970546265,-0.0334026765979963,-0.0332066945055109,-0.0425212494318604\ngdb_961,Cc1c(nc[nH]1)C,-0.0034833830824781,1.2358182977621444,0.5753913298847381,0.5169665670770733,-1.4230081865826918,1.5307051215703682,0.5790052866044466,-0.1431088040743315,-1.6912715644451075,-0.6714389209205315,2.66577065158884,2.6657523355865336,2.6657523309518827,2.665789850463094,-1.507869287634786,1.1570593549961945,1.1544470555383295,1.1535762779626433,1.1527081358793396\ngdb_15149,CC#CC#CC1CC1,0.0015336670730872,-0.4966724272968852,-0.46330119461474,-1.5462276219254134,2.5682994577380214,1.0743406587139854,-0.2646955198105,-0.7618101524743461,2.2838629853424983,-0.5796252071590907,2.549303448279814,2.54932758119667,2.5493276015238484,2.5492274776583006,-0.2605989760330341,0.3884916279087402,0.3934345655888914,0.3954976277422339,0.3450713454357417\ngdb_54390,CC(=O)C12CC(O1)C2=O,-0.0039673270536514,-0.0207143388887385,0.1513565911460912,-0.1648839907136139,-0.9076067710675928,-0.7240064325418603,-1.099874095857619,-0.7491835943437338,-0.503140094519977,-1.179344571515735,-1.15266489975053,-1
.1526719050445224,-1.1526719097027691,-1.1526666252572682,-0.3762920974959957,0.907519815494323,0.9104490926010436,0.9113044113883622,0.8964407973035817\ngdb_80819,OC1C2NC1(C#N)C2O,-0.0042991045564539,0.1826263822009694,0.0997875121096851,0.9484894308671742,-0.9356973695435348,-1.4740707972365097,-0.2817399805461555,0.4082509009624172,-0.5550132954672027,-1.0651709876925883,-1.0567557159918215,-1.05677894705603,-1.0567789517136843,-1.0567120591034471,-0.4395540809342526,1.1658382213915932,1.1647549616839874,1.1638166431912411,1.1841281700746271\ngdb_3716,CCc1c[nH]nc1,-0.0016352518560945,0.2594355693475709,0.4277668547812279,-0.3101407027374471,-1.570789161173514,0.3423501341324606,0.3297300483454849,0.1662418701256764,-1.412771108350409,-0.6315277550039372,2.6664426896605904,2.666412320419503,2.6664123157848563,2.6664637022063182,-1.844117806610115,1.2276521390749031,1.223163830771745,1.2218083889200293,1.229633931552251\ngdb_31810,CCc1ccc(nc1)O,-0.0028980776964583,-0.2759936028035959,-0.2448217176422705,-0.8509818207856997,0.9378234157649032,0.6496054160555699,-0.6439347711788346,-0.9385819663029225,0.6362676915683413,0.0308232883965173,0.237483295286516,0.2374775460340656,0.2374775413844091,0.2374825066653473,-0.1360442537771663,-0.4246469387811574,-0.4217087265000673,-0.4187739943752261,-0.459029052622514\ngdb_71311,CC12CC3C1C(C#N)N23,-0.0035563825333646,0.0676683164074493,0.0754359540797503,1.0714642990762848,0.2917396508182509,-0.5929710917217101,-0.064423106166548,0.212539249937922,-0.5828853244272898,-0.3725259267891763,0.7676612743338278,0.7676427457826481,0.7676427411362678,0.7676916489296172,-0.5301393334839758,0.2520206361480175,0.2510575866084531,0.2516860470054302,0.2505547760384792\ngdb_39327,C1OC2CCC3OC3C12,-0.003829811319197,0.1969843017711602,0.1567416808153796,-0.0686996092025556,-0.224883095065367,-0.1094958686956416,1.4802311480022303,1.5130747373910152,-0.6681474817074443,0.4091679177067556,-0.286051087424407,-0.286087530796793,-0.28608753544968
63,-0.2860007240244466,-0.7972181351591093,-0.4034523219733306,-0.4072786975245893,-0.4068845380727687,-0.4007939883380411\ngdb_24884,N=C1OC(=NC=C1)C#C,-0.0034985975187179,-0.0396372434410084,-0.1418661049819765,-0.6101941537556391,0.7082133064832943,0.0712425324355992,-1.9201387687610392,-1.9276623532009052,-0.2422527704305536,-1.7746160047310722,-0.1013463584724519,-0.1013744784904646,-0.1013744831422151,-0.1013108042043993,-0.974450151017262,1.3115390846313488,1.3154801490705077,1.3183563484883474,1.2810526778806175\ngdb_121471,CCOC=NC(C)C#C,-0.0039101996917153,-0.3830908023416685,-0.3840125946874375,-0.046744478640249,1.2859025707928715,-0.3128265699682876,0.2722049933626477,0.4145641800277227,1.84768721503375,0.5589249506603589,0.2104306301157349,0.2104892362593575,0.2104892565720832,0.2103385003587166,1.5646446317283609,-0.6427813161506869,-0.6385698139803488,-0.6389805578481604,-0.6458366294084983\ngdb_40612,C1OC1C1C2C3CC3C12,-0.00348913063558,-0.0316374703119707,0.0523348321255656,-0.6565220929481257,0.3747901158775566,1.4403359210047475,1.2799587343582783,0.5934404202114008,-0.3455041001200412,0.3304575686588951,0.6432079212520775,0.6431887625449025,0.6431887578977531,0.6432233564152969,-0.3871229429095496,-0.3811039712302057,-0.383858490193579,-0.383629515120426,-0.3803051396294575\ngdb_76796,CC1C(O)C(O)C11CO1,-0.0043223600405428,0.2459679165581174,0.0344545089694028,-0.1024817595022957,-0.5180023835099715,-0.1004589486390793,0.6770109358344656,0.7154971488073226,-0.3053632167982814,0.2636784943617032,-1.2126791077387158,-1.2126653476292142,-1.2126653522878332,-1.2126756004792556,0.7710390496079635,-0.1494341971064817,-0.1497369880102716,-0.1511592884507511,-0.130269944602954\ngdb_3008,C1CN1C(=O)CO,-0.0019112449348492,0.2195440408319746,0.2994191752396452,0.725017566215123,-2.426941749505466,-0.9454109739276308,-0.366962284224433,0.0778559632113889,-1.3469075600890936,-1.0753591542932852,1.2441169962231844,1.244092484169436,1.244092479526,1.244143347900952
5,-1.6078084521325782,1.8033575062122569,1.7989095113804832,1.7959320911831598,1.8243975681397464\ngdb_51145,N=COC12CC1NC2=O,-0.0036818449857853,-0.0990704434912311,-0.1202527281398155,0.831395401201538,-0.7793670823730783,-0.583934171665149,-0.4436623575348826,-0.1683619203355561,-0.1032275589557844,-1.09678736310242,-1.0577938777690323,-1.0578091264906446,-1.0578091061857562,-1.0577921439738889,-0.5872474274827137,1.0567867654423044,1.057493998822656,1.0573147802240563,1.0608274753914948\ngdb_68771,NC12CC1(O)COC2=O,-0.0040256978765951,0.2215203068062356,0.134452885946901,-0.0871262366387774,-1.3045391408363296,-0.1682358490632946,-0.1752121009483087,-0.0947069979069828,-0.7344628503781977,-0.7123719088560736,-1.5850502560467954,-1.5850696840377747,-1.5850696886986937,-1.5850007675831954,-0.1791214798538,0.687012417190553,0.6871553102877235,0.6871467392548983,0.6937778722262823\ngdb_67554,OC12C3OC3C1NC2=O,-0.0040307601445194,0.2938529042539517,0.3897700271655538,1.725152174508775,-1.7429967431347193,-0.1908281492046998,-0.1070342580056867,-0.0168432227682053,-1.106764655019724,-1.4549059271552276,-1.5533116717470064,-1.5533516951417934,-1.5533516998025163,-1.5532416194577925,-0.8924803436828658,1.397369023018199,1.396464528187499,1.3963307423933973,1.4074423728419287\ngdb_57948,CC(=N)NC(=O)N(C)C,-0.003640595239293,-0.2497907152854376,-0.311916284013354,-0.3056974025045992,0.7875997804370424,0.1254640527749718,0.0570186765749972,-0.0021122382824911,1.024407874787246,0.6420531772607342,-0.6233253831797082,-0.6232715827597997,-0.6232715624522263,-0.6234065418289821,1.35615085751745,-0.4494784396068506,-0.4469872678940748,-0.4487492570088373,-0.4329235807144003\ngdb_111275,CC12NC1C(CO)C2O,-0.0036375059295008,-0.1204179043825308,-0.0654799771137141,0.9243779928389264,-0.0697741382634286,0.3016839938779309,1.1350808181052066,0.9806548695501862,0.05137580541751,0.6861718456082938,-0.7163181333882724,-0.7163119953865984,-0.7163119750796,-0.716303900411043,0.675776841084206
2,-0.4014463981997708,-0.404289768611661,-0.4063529062085825,-0.3716052154138008\ngdb_23634,c1c[nH]c(=O)cc1F,-0.0032559678804205,0.2711921420123445,0.1145919450649152,0.893601604461407,-1.8529164763014463,0.6631607961404128,-1.0828296351219635,-1.3763026481641571,-1.192015016355645,-1.8956117532101424,-0.2784055996963004,-0.2784676627899365,-0.2784676674427813,-0.2783231076319607,-2.049903869467638,1.823018181311211,1.8227117817985703,1.8244466565588613,1.7999306342016144\ngdb_8648,NCC(O)(C#C)C#N,-0.0040065706830514,0.2633312757568969,0.2423828610983173,1.1935897128291164,-1.404688231054904,-1.5689584578304103,-0.2753483077702847,0.4587571334848664,-0.9395743032433012,-1.2936083161052594,0.8203912506289118,0.8203984720230998,0.8203984673770456,0.8203962659593937,-0.1178287410361891,1.5181492332649829,1.5197513481575566,1.518745430965715,1.5227156562855508\ngdb_46247,O=C1OCC11OC1C#N,-0.0037728773845287,-0.0238208016981201,-0.0438300911890494,1.9614311986555049,-1.81749789561439,-2.7030919249289465,-1.2724492608061306,0.0042010407828156,-0.486549992587339,-2.1614056924920355,-1.5244695011424625,-1.524502676964745,-1.5245026566627409,-1.524428922090492,-1.2858369566569336,1.8034781238504856,1.8070601905710624,1.8089106185860384,1.7847449492547385\ngdb_13051,O=CCC1CCC1=O,-0.0031346502980252,-0.0878568704232193,-0.0918760437638533,0.823358255192122,-1.117675594452893,-0.3489742501945356,-0.9230378157251932,-0.7491835943437338,-0.2923630138941158,-0.6018949164544186,0.6936259038554756,0.6936213741253687,0.6936213694785309,0.6936095939487346,-0.7871257564782977,0.6421662547172864,0.6429555493427794,0.6432561416134827,0.6271933883698094\ngdb_90410,CC1OC2C(C)(O)C12C,-0.0041897739381719,0.1109062377868103,0.1598723346400675,-0.5921595822223157,0.4150940180386897,0.672197716196975,1.322569886197417,0.9932814276807984,-0.2463931468566021,0.9439715302713504,-0.3157604272799184,-0.3157335532382619,-0.3157335578913369,-0.3157578872023283,1.3337506999575997,-0.9006513390869021,-0.9
008574227935988,-0.9018596110022614,-0.8859152128372553\ngdb_27773,Cc1cc([nH]c(=O)c1)N,-0.0043438083613007,0.2256811990651432,-0.1748429337533987,1.7948727825741957,0.8315676737037332,2.0819572450206536,-0.5459291219488155,-1.508881508535589,0.1006167235363294,-0.3330355111156792,-0.1634813730342963,-0.163482224012771,-0.1634822286649052,-0.1634671186506547,0.1974965538447744,0.0318095376952886,0.0351555554492097,0.0373067881409079,0.0092052570713684\ngdb_24240,C1C2OC2C2=C1OC=C2,-0.0031973981037156,0.1058677066447644,0.0942928951928856,-0.7203618624700713,-0.6242581255711416,1.101451418883671,-0.0154202815515385,-0.528218827058014,-0.81346922523305,-1.0008563076763908,-0.2268117127230179,-0.2268699491975195,-0.2268699538500455,-0.2267359187661172,-1.6001776292275738,0.5721110057669113,0.5721257313224765,0.5753672687043334,0.5409635466565281\ngdb_38670,C1C2OC22CCC=CC12,-0.0038209191913692,0.2195503547807741,0.1415447752232352,-0.5124415486329877,0.6202775199499128,-0.0552743483562691,0.2104188231958966,0.2335835134889429,-0.6351831793669298,0.387228797888146,0.6408463771714004,0.6408158975247626,0.6408159178401475,0.6408909731838978,-0.5985705840514289,-0.6291672557195315,-0.630918155964313,-0.6289440708587398,-0.646566135009249\ngdb_100154,CCC(C)(C)OCC=O,-0.0040222603977207,-0.3443673543539902,-0.3302894797833159,0.2428541483006543,0.9537007105556522,-0.6155633918631153,-0.7547237659605953,-0.458772757339645,1.5326754519127317,1.5651191034371297,-0.3467869587605737,-0.346712450983499,-0.3467124306742165,-0.3468943309163032,2.2437878702736995,-1.53621189834085,-1.5332134170057623,-1.5346305662446076,-1.5284928122172503\ngdb_21824,c1c(cn[nH]1)NC=O,-0.0021639548970482,0.0256679289928014,0.0336604364249483,1.8603461183582173,-1.542698562697572,0.934268397837274,-0.1027731428217729,-0.5366365324784226,-0.8898086443325074,-1.5089723333931429,0.4174852804433775,0.4174440944610957,0.4174440898125515,0.4175305752254208,-1.7555017986810386,1.7706858588745438,1.768939651127723
8,1.7686149878704809,1.7722151206355938\ngdb_34408,N#CC12CCCC1CN2,-0.003985144468267,0.0700360472072829,0.1354021220919959,1.6045949843675371,0.4285286520924007,-0.0778666484976742,0.2828577813224324,0.3156561413379248,-0.4404678853531766,0.4105804363800083,0.7354724429981508,0.735459180385297,0.7354591757387178,0.7354929432807322,-0.2581374202572269,-0.5056311946419334,-0.5067268601154705,-0.505631285739567,-0.5132888813096006\ngdb_7492,CCCC1(COC1)C,-0.003390278248722,-0.1639462674066713,-0.0652152862655627,-0.4787900836342139,0.3247155707682685,-0.0778666484976742,1.3161782134215465,1.3384073499175408,0.2124451344951797,1.5241560619127943,1.5314615492029369,1.5314941240657374,1.5314941194240763,1.531399369131924,0.6110379241804647,-1.0681918612894037,-1.0710626294838503,-1.0733057257279324,-1.0545848866983245\ngdb_25771,Cc1c[nH]c(n1)C(=O)C,-0.0032647881638345,-0.194581546982118,-0.2361325560064016,-0.2572784984966548,0.6532534398999321,0.4462747147829239,-1.2426214545187335,-1.4352265861070157,0.4528929468503493,-0.3820529144364358,-0.1633654475902266,-0.1633532175595481,-0.1633532222116815,-0.163394004675626,0.0436493178567944,0.0439866749538395,0.0485875420747705,0.0506440380152508,0.0175518270116614\ngdb_68714,CC12CC1(C#N)C1CN21,-0.0040262560524251,0.1228017173251742,0.1960163263186814,0.1748977917982762,0.316166258188635,-0.6426741520328009,0.0229297551036863,0.3240738467583334,-0.6147057776299855,-0.3653731726565342,0.7667123781440381,0.7667011334718791,0.7667011288254929,0.7667337035879632,-0.3071223801957993,0.152345890541783,0.1530181192093818,0.1543380589036271,0.1411973270763991\ngdb_65199,CC1(CC1)C1CC2CC12,-0.0036232917885605,-0.1509900444699819,-0.0305042760752171,-1.6216330405828598,1.6816136101930887,-0.3037896499117254,1.5100589542896274,1.630922613276732,0.2228502296679194,1.7433368849773396,1.5085175723324291,1.508529827114668,1.5085298224728665,1.508506757001825,0.7944238294781379,-1.8290689028514384,-1.8310380910544928,-1.8303572425408372,-1.835
3639025825808\ngdb_10940,OC12CC1C(C2)C#N,-0.0029117226085817,0.0045288284118872,0.1607485526201552,0.065840908142056,-1.160422157351066,-0.8234125531640417,0.3531661818570112,0.7302281332930368,-0.8781660473737901,-0.8846090262265536,1.2206810601253495,1.2206540735252407,1.22065406888166,1.2207184440659988,-1.059866136437788,0.9908036728575368,0.989569939518212,0.9898646326397208,0.9828530154175044\ngdb_35262,N#CC1NC1C1COC1,-0.0036235073218018,-0.2402187689053103,-0.2326368113566771,0.863217271986786,-0.2590803453839044,-0.7059325924287356,-0.0495092030228495,0.2819853196562915,0.4333316967822267,-0.35797998981355,-0.160207602547924,-0.1602083857116324,-0.1602083903637464,-0.1602482815164824,-0.440784858822157,0.3756956697218996,0.3760234049067577,0.3757704725488349,0.3766623575062041\ngdb_127685,Cn1c(=O)c(=O)ocn1,-0.0039882669370195,0.2912199876045367,-0.0283046038543722,0.9876950211570074,-1.4059095614234232,-0.8188940931357612,-1.728388585484915,-1.3257964156417072,-0.6509686549216843,-1.7846839569765984,-1.9564664922236503,-1.956493965810108,-1.956493970473322,-1.9564366101434827,-1.0546968693085923,1.6237657086354278,1.6260896416589374,1.6267750903581346,1.6193495017037418\ngdb_67664,OC12C3CN4CC1C4C23,-0.0037332966390333,0.2994723186855568,0.4679633546177468,-0.8405269967084108,-0.185800523272752,1.2279682996755406,1.322569886197417,0.7344369860032406,-1.097169905470558,0.0482844234850269,0.2411930343087739,0.2411451935082032,0.2411452138211184,0.2412657617200999,-0.930388302630304,-0.0349654359080013,-0.0398379705000698,-0.0395940190873604,-0.027138939928077\ngdb_3883,c1(c(n[nH]n1)N)N,-0.0033103485751531,1.352234885728362,0.6642179276161192,-1.151035271803892,-2.807996824483453,1.9283296040590976,0.1209354043337053,-0.7807499896702652,-1.9849916482241967,-1.6627565472449568,1.4645290584902202,1.4644910729095515,1.4644910932300268,1.4645904447629317,-1.9012259006088537,2.700002810684188,2.69350893023895,2.689102011125394,2.745711848525324\ngdb_40429,C1CC1C12CC1
COC2,-0.0035761894855924,-0.1193824167794035,-0.0129890437440908,-0.7092536118879517,0.8437809773889243,-0.1230512487804844,1.6293701794392157,1.6666978613134673,0.0121757380116057,1.0762073209588587,0.6110178417957713,0.6110097153689215,0.6110097107215732,0.6110243262591819,0.0387262063051791,-1.1388869081487023,-1.1411725054973592,-1.1404797344313562,-1.1441858421842788\ngdb_112132,CCC1CC(=O)C(=N)N1,-0.0037988132179007,-0.1596969798645699,-0.1878858034778277,1.185813937421633,0.4517339290942651,0.3559055142173034,-1.4322410802029009,-1.578327578253958,0.4233530428661803,0.3395638060630576,-0.1924083908792327,-0.1923953709781947,-0.1923953756305091,-0.1924392239539522,0.4330674415895696,-0.3832121577839397,-0.3821171142180483,-0.381900421193629,-0.3862950645231218\ngdb_116772,CCN=COC(C)CC,-0.0038257714525456,-0.4434710947118245,-0.4479126163394514,0.007359235959721,1.9417569786876772,-0.0281635881865834,0.4639551766387719,0.4692792652603769,2.6681962956496004,1.9799487895416028,0.1493844260662465,0.1494588244267237,0.1494588197765219,0.1492775975089752,2.2125261119209414,-1.8081391204237005,-1.8067238041179337,-1.8086532426485704,-1.7926336819448605\ngdb_130961,c1[nH]c(=O)n(n1)CC=O,-0.0035235662157506,-0.177723303687303,-0.2113155071745454,1.1552989196460457,-1.337515060786349,0.3152393739627737,-0.961387852380418,-1.0964139429355788,0.0903892824328073,-1.4232294445678109,-1.4604078376520866,-1.4604264846786208,-1.4604264893387695,-1.460403520888267,-0.8814033426917316,1.339996980984174,1.34177232722615,1.3420242893934573,1.3393589829468675\ngdb_48756,N=C1CC2CC2(O1)C=O,-0.0033056565822844,-0.1117235968855418,-0.0541530342669565,-1.1746239686282751,-0.4715918295062429,-0.9318555938427868,-1.099874095857619,-0.6523799820090378,-0.2702799339299575,-0.7361743511798251,-0.6579710900497446,-0.6579933151767564,-0.657993319831948,-0.6579411503646018,-0.6108783629304676,0.480383913698864,0.4815221992440152,0.4829639730869035,0.467175194993096\ngdb_95272,CC1COC2OC=NC12,-0.00
39196113099195,0.2524902256680589,0.1547975721720601,-1.078439587117217,-0.5802902323044509,-0.8324494732206054,0.097499270822179,0.4840102497460923,-0.696227905682328,0.056729481935752,-0.6883145001474577,-0.6883459275301733,-0.6883459321855511,-0.6882686974036679,-0.7797410891508744,-0.0834196390507627,-0.0856257623711608,-0.0850614471613135,-0.0830601036356228\ngdb_46689,O=C1CCC2C3CN1C23,-0.0038866955154249,0.3706936611445509,0.2747572900084294,0.8371455544440469,-0.0795447812115819,0.324276294019336,-0.0878592396780743,-0.2378079900539251,-0.9332005613756617,0.0717262227432671,0.2397808856661369,0.2397380296586147,0.2397380250089722,0.2398401265743683,-0.9956195306892076,-0.1833015323403418,-0.1863500676542555,-0.1850753367956617,-0.1898870800455533\ngdb_76311,CC1C(O)C(C)C1C#N,-0.0040716672484229,-0.0296612043377096,-0.0162200975456638,-0.0613812323484536,0.3613554818238456,-1.4017754367840136,0.3893856609202792,1.0353699547828403,-0.0663859654909198,0.645779822271018,0.2093203519285502,0.2093531157667945,0.2093531111169642,0.2092938616625805,1.0696257652134356,-0.759408084231907,-0.7568612240631649,-0.7564402309868749,-0.7650908477124673\ngdb_52154,O=COCC1C2CN1C2,-0.00316868244418,-0.2998855850611167,-0.2275894307005475,1.761744058779286,-0.172365889219041,-0.2766788897420397,-0.0047674935917538,0.1220489166685326,0.554186864702745,-0.0086370736881864,-0.6857184591629091,-0.6857333721145502,-0.6857333518073643,-0.6857208164097484,-0.5281700888633298,0.189275864948231,0.1863901602851674,0.1850385086917842,0.2078017583511883\ngdb_14982,OCC1CCN=CO1,-0.0029151656139494,0.0337497834562333,0.0366724357314995,-0.9868945337904558,-1.2483579438844477,-0.5071203511843702,0.2082882656039396,0.4419217226440504,-0.6529818294042454,-0.1343211820189054,0.2629474773321246,0.2629242686328277,0.2629242889458774,0.2629802378722425,-0.9249728799235276,0.6009595983840473,0.5967337130122189,0.5949240952422155,0.6191460297119816\ngdb_64007,CC1(O)CC=C2COC12,-0.0039464921736573,0.11875
44761446586,0.1553086993271113,-0.2983790256504969,0.1903692302311583,0.0802794524921615,-0.1624287553965671,-0.1957194629518838,-0.524431416409051,0.299923122445598,-0.2857390572670292,-0.2857444954484576,-0.2857445001013474,-0.2857080185033853,0.2821740725325589,-0.3706757897320284,-0.3715622965539452,-0.3714200474045738,-0.3673792122657023\ngdb_26234,CC(O)C1=CC=C(N)O1,-0.0036316865319852,-0.2129803937840248,-0.1517691936110916,-0.2139563212263888,0.1769345961774473,2.447952507311416,0.4149523520237624,-0.7302437571478148,0.3681873436663152,-0.0437396653979614,-0.6881932327470944,-0.688175383395417,-0.6881753880507937,-0.6882092626564693,0.6996539321095411,-0.0706813675606651,-0.0678689255626244,-0.0674298505782025,-0.0762751316228457\ngdb_77753,CC1N2C(=O)C(O)C12C,-0.0043121083953468,0.1199225566725766,0.3111477179939427,0.2171744866608134,-0.3286961763894974,0.170648653057782,-0.5267541036212031,-0.5997693231314855,-0.487719935783614,-0.0969946247384762,-0.6870933389235481,-0.6870764321383109,-0.6870764367936808,-0.6870889888137526,0.6422996825332228,0.044854597527578,0.0465524713460557,0.046184524125378,0.0516134693779626\ngdb_68802,CC12CC1(O)C(C2)C#N,-0.0037911203391337,-0.0898647061414781,-0.0445511455684966,0.5006962471067927,0.2355584538663689,-1.062890934662937,0.3680800850007099,0.8585981409542643,-0.0823551718134362,-0.0554305114382889,0.2397425434003988,0.2397538059895816,0.2397538013399392,0.2397424998224602,0.4530060433736126,-0.1873291126221012,-0.1847074562866829,-0.1834443108822359,-0.2010319878757726\ngdb_26136,Cc1ccoc1N(C)C,-0.0040626093257942,-0.0099869398782925,-0.0318459848572261,-1.6445683109024123,0.9646926838723252,1.3815959406370946,0.3254689331615711,-0.3219850442580089,0.0964463210704134,0.6452088040839586,0.2090192303454751,0.2090478737176423,0.2090478940303591,0.2089812863142329,0.7828545173318409,-0.7910387489056494,-0.7886426351433543,-0.7879949047898157,-0.8007739302616864\ngdb_94468,OC1CCC11CCCO1,-0.0039519302431305,0.02889435682937
46,0.0886796237579495,-1.0802038386802597,0.3870034195627478,0.3287947540476178,1.3012643102778476,1.1300691407624346,-0.2355283673116015,1.04726571495157,-0.3165743017800069,-0.3165724196973519,-0.3165724243504321,-0.31656663410467,0.4163288623140779,-0.986143023661444,-0.9881991237544908,-0.988585284769157,-0.9782404158005584\ngdb_36008,C#CCC(C#N)C1CC1,-0.0040143354062322,-0.2096213730226609,-0.2858579263763736,0.8863485702577879,0.8303463433352133,-1.5011815574061953,0.4128217944318055,1.1069204508563115,0.7889493893900997,-0.0760773269388157,1.1673518611160385,1.1673829455356193,1.167382940891708,1.1672976645426818,0.6605151952741992,-0.3382689767752401,-0.3327867918184735,-0.3304792323878194,-0.3710637854761979\ngdb_68203,CC12CC(C1)C(C2)C=O,-0.0037014750902254,-0.0733411021330395,0.0267876016436361,-0.1489403839957483,0.9133968083945192,-0.3580111702510979,-0.6716320198742748,-0.4966524317314826,-0.1101925278465338,1.0427276230438849,0.6101498737475989,0.6101484075782804,0.6101484029309268,0.6101568685103854,0.2351583572146316,-1.230060732352941,-1.2308507691472503,-1.2295254909453557,-1.2432133778544792\ngdb_57250,CC(O)(C#N)C(=O)C=O,-0.004346206859422,0.1238561467747001,0.1422201932495527,-0.0905893971143794,-1.475525392429017,-1.5373292376324443,-2.9662425464118947,-2.213864337494789,-0.5153228027434582,-1.5527303586751913,-1.5550734189779667,-1.555058159953872,-1.5550581646146056,-1.555081101169588,0.2664201155673898,1.2123100998640082,1.2187895982845829,1.2199089663368945,1.1974501961820858\ngdb_116760,CCN=COC(C)C#C,-0.0038289602392184,-0.3821310821241359,-0.4125444426640401,-0.0504690097177832,1.2932305530039871,-0.2089019893178244,0.2508994174430784,0.3451181103093543,1.9492163368651096,0.5536655726216517,0.2105352975117249,0.2105935048265244,0.2105935001767018,0.2104438653599438,1.5747370104091714,-0.6317867561758046,-0.6277135043563745,-0.6282033992809659,-0.6338083358898711\ngdb_85770,CC1CN(C(C)=O)C1=N,-0.0038040247011459,-0.1227098677967695,-0.1939371
839028078,-1.1613594105802147,0.5946295822110106,-0.5794157116368686,-0.6226291952592652,-0.3451337341641323,0.3941838174932353,0.2969478171551286,-0.1922867740050927,-0.1922626950302684,-0.1922626996825805,-0.1923158113583413,0.6238380142146646,-0.3704371765773152,-0.3683030645049954,-0.3681838029497698,-0.3722064876081698\ngdb_100207,CC(CO)(CC#C)C#C,-0.0041251692307101,-0.1330016043400463,-0.0898862987674043,-0.8010600358166451,0.995225943085305,-0.6471926120610839,0.6258775536274992,0.9196265052522252,0.2501274570763414,0.1784465165458451,0.6421832142152529,0.6422443045035476,0.6422443248189413,0.6421269213729047,1.9338779980995129,-0.4887421031088501,-0.4821942514152816,-0.4812691265263698,-0.5054723427828071\ngdb_276,CCOC(C)C,-0.0017030950886695,0.3296024823570391,0.461784192404004,-1.1320859031638058,-1.7100208231847016,-0.2631235096571956,1.627239621847259,1.72983065196653,-1.3410982362803967,0.4628736808791592,3.462396523976388,3.4624085471864454,3.462408567519267,3.462380362516289,-0.7489716419532784,0.6613864132237591,0.6547969059739872,0.650133806264242,0.689506274978136\ngdb_15980,CN=C1NCC11CO1,-0.0032822408298879,0.1281938295999952,0.1159975447413058,0.567933834453857,-0.7244072157897143,0.4869408550374536,0.0122769671439016,-0.2146593001478023,-0.7401767342744577,-0.563396269211078,0.7904632150766986,0.7904547212597282,0.7904547166134889,0.7904745526540328,-0.7553716869703783,0.9979778002344214,0.9951735314624348,0.9929899053947188,1.018809662567006\ngdb_95306,CC1COC2CC1CO2,-0.0039926383932729,0.1822601731705952,0.271343690794338,-0.211996041711897,0.1537293191755829,-0.3851219304207836,1.3971394019159098,1.5593721172032606,-0.5942662658957308,1.124323116615836,-0.3171352321533197,-0.3171609117899903,-0.3171609164430741,-0.3170840484011517,-0.2049678154997808,-1.0450647401402355,-1.0494721664614368,-1.049426164372031,-1.037307572411342\ngdb_91719,CC1NC2CC1OC2=N,-0.0039689352632212,0.1538852872653895,0.3370600393009086,0.7222731748948347,0.0743428452218
349,0.2836101537648076,0.3829939881444084,0.2462100716195552,-0.7035810683742246,0.4210691388686317,-0.1916592938398152,-0.1916877326391885,-0.1916877372914971,-0.1916107569968913,-0.3942614546593919,-0.3045248813021873,-0.3084387169902738,-0.3087416829118918,-0.2917186528690932\ngdb_52870,CC#CC1(CC1)C(C)C,-0.0040438137218525,-0.2588007202224043,-0.2818236727597201,-1.6783504612021525,2.4913556445213128,0.6812346362535373,1.0860779934901972,0.7554812495542614,1.3164768838264649,1.642477040989322,1.5080932612432223,1.5081653239741928,1.508165319332389,1.5079717693907568,2.0596634982432858,-1.873639742453882,-1.8689896910375576,-1.868041167205149,-1.8964371995954288\ngdb_106625,CCC12CCC(O1)C2C,-0.0041764937746107,0.1276823997472312,0.0500256326572096,-0.7972048194381448,1.1772041679946637,0.0079840920396644,1.5654534516805076,1.5425367063624449,-0.0226004928537847,1.7398206150886035,0.5805131245878995,0.5805280965622801,0.5805281168772924,0.580510137638931,0.9017476613033528,-1.719634617005993,-1.7217524537385112,-1.7218398238167123,-1.715727833691174\ngdb_57107,NC(C#N)C1OCCO1,-0.0040212987878748,0.006366187512558,0.0732910454826608,-0.251855058506561,-0.9063854406990748,-1.2120001155962092,0.3595578546328821,0.9175220788971234,-0.3551832085431217,-0.3408193906129669,-1.0884302467409042,-1.088431084747754,-1.0884310894056035,-1.0884299698486009,-0.1451520101476545,0.462209982102357,0.4623020868022423,0.461440624798973,0.4751712602888248\ngdb_110586,CCN1C2C3NC2C13C,-0.0037126938717607,-0.0417334744424611,-0.0228282414788246,-1.0564191139044272,1.0025539252964204,1.295745200099756,1.3737032684043835,0.7554812495542614,0.0129138781583544,1.0064529413711982,0.7085738458378757,0.7085806808636337,0.7085806762168884,0.7085749937026301,0.4660522889853926,-0.7075818314522038,-0.7121546447382319,-0.714487766985843,-0.6742901974690289\ngdb_66494,CC12C3C1C1=NC2C3O1,-0.0037419400746595,0.2105024661510101,0.2602084206307247,0.4033356978870394,-0.226104425433885,0.170648653057782,-0
.629020868035136,-0.6986773618212834,-0.8907869927981978,-0.7576326135777529,0.2717773067255697,0.2717553694421838,0.2717553897552881,0.2718113526553343,-0.653709433429521,0.5541389776090077,0.5541271811314894,0.5550594476398577,0.5479878876950064\ngdb_18230,O=CC1NCC2OC12,-0.0029543042399755,0.1356758589274694,0.20265185206372,0.6891444510999254,-1.6855942158143176,0.3694608943021462,-0.7589848811445091,-0.921746555462105,-1.060249314924613,-0.8836172577963971,0.2936972006195673,0.2936621529674851,0.2936621483181758,0.2937327396094616,-1.4662689950236367,1.207443440356882,1.2039956997019496,1.202778032771923,1.2178934507906238\ngdb_5280,c1c(ocn1)C(=N)N,-0.0025697653049969,0.1579072726507068,0.1187904895528349,0.7874197974264413,-1.448656124321595,0.265536313651683,-0.9315600460930208,-1.0438032840580265,-1.067533846668241,-1.5260427718273486,0.4175284654171587,0.4174888772739993,0.4174888726254553,0.4175772044180988,-1.587131383615794,1.7752221309366822,1.7736023802316598,1.7732448304158703,1.7775382318165265\ngdb_84555,CC1C=CC2N3CC12C3,-0.0038471866143432,0.2135899871139931,0.203153851948145,-0.3201381282613545,1.077055077776091,1.142117559138201,-0.1730815433563518,-0.7028862145314877,-0.6217450967146931,0.7354597312281834,1.1390408407265884,1.139015579741495,1.1390155750974096,1.1390823600229028,-0.3637381630393761,-0.6885845533653875,-0.6932334251877096,-0.6932612034764635,-0.6801689867438281\ngdb_58,C(#N)C(=N)N,0.0004807208713272,1.8396969888493,1.784389596993119,1.617532829163182,-4.504424706356608,-1.5328107776041613,-1.1701824963921978,-0.4419373464988284,-2.877388430076256,-2.762086771690043,4.227341000951392,4.227280918686119,4.227280939023665,4.227427673524305,-3.4795754640567385,4.038897931498495,4.031480865842469,4.02738716120062,4.061266913491597\ngdb_24827,N=C1C=CNC=CC1=O,-0.0039498522816245,0.1616261884936454,-0.092122480070753,3.3908670206228337,0.8156903789129842,0.7806407568757189,-1.329974315788968,-1.6772356169437566,-0.3720260296446473,-0.98180
23323818724,-0.1320916883752785,-0.1321192275260963,-0.1321192321780367,-0.1320738898697211,-0.8745109865194705,0.705516736232885,0.7075034185103928,0.709790127490118,0.6810970131507328\ngdb_75762,CC1(OC(=N)C1N)C#N,-0.0040332470665346,-0.0526692337632924,0.032327854913565,0.4238532901387187,-0.4202959540284367,-1.243629335794178,-0.3605706114485622,0.2209569553583306,-0.2558867372476422,-0.7592254537837614,-0.5623576859178163,-0.5623405717972763,-0.5623405764518754,-0.5623758680757778,0.3683285246858281,0.7076327891543565,0.7108172309999394,0.710641770767698,0.71042256837466\ngdb_95782,CC1CCOC(C)C1=O,-0.0041910118726862,0.1301006421374612,-0.1125949480846748,-0.0603357499407245,0.4663898935164959,0.5095331551788587,-0.6460653287707915,-0.8754491756498595,0.1394840023336994,1.0068436380255017,-0.3174830084855257,-0.3174648557865935,-0.3174648604396792,-0.3174921287233677,0.7929468960126533,-1.081596151914635,-1.081118425972995,-1.0808492205775275,-1.0838933441342806\ngdb_66514,CC12C3C1C1=CCC3C21,-0.0031282450922127,0.0988150258356602,0.2817031429547488,-0.9416120770056982,1.3408624373762357,2.2084741258125216,-0.861251645558442,-1.8792605470335573,-0.7727374723410353,0.3357470002863957,1.570299393718535,1.570271272192973,1.5702712675515529,1.570345728339667,-0.504046842260414,-0.5864397682898974,-0.5888522303568706,-0.587177419936453,-0.5997438438927704\ngdb_62721,CC1(C)C2C=CC1C=C2,-0.0040977467706224,0.3922242265510375,0.4255489280191311,-1.764537417189303,1.2333853649465474,1.1330806390816384,-0.3051761140576818,-0.8291517958376133,-0.7951615623294055,1.047115447007607,1.5376642344457467,1.5376556549975586,1.537655650355937,1.5376973305568973,0.3860517262716423,-1.390975150797208,-1.3916213567484372,-1.3891621522219193,-1.414923608262881\ngdb_7076,CN(CC=O)CC#N,-0.0018007040145198,-0.3287719008190863,-0.2579193509904551,-0.3685570322692989,-1.0309611382880317,-0.6110449318348347,-1.129701902145016,-0.8312562221927151,0.4034039564497651,-0.6151184955231694,0.78943810
86412721,0.7894540974831283,0.7894540928368828,0.789404876977031,-0.3019531130666039,0.8902977143940908,0.8909898658466835,0.8895410545407975,0.8966972641163464\ngdb_106967,CCC12CC1C(C)(C)C2,-0.0036865756641076,-0.2263659652390841,-0.1417018141107101,-1.5999392806224857,2.2104496597618994,-0.0597928083845496,1.593150700375948,1.6014606443053023,0.8055184016042597,2.3762955185386168,1.4777238153491767,1.4777718978531642,1.4777718932111723,1.4776816904556564,1.9090162847638552,-2.4401781690536883,-2.440387129845744,-2.440286092783194,-2.4423952017139507\ngdb_60437,CC(O)CCNC(N)=O,-0.0031631448978261,-0.4776358716662232,-0.4798854453420232,-0.1855322682662596,0.1610573013866983,-0.0869035685542365,1.064772417570628,1.0921894663705964,2.467950848507349,0.986527412001694,-1.149378473320975,-1.1493220807442317,-1.1493220854024575,-1.1494543781160642,1.9070470401432085,-0.6928559910474373,-0.6913361050950044,-0.6938160619754925,-0.6674596313557497\ngdb_130983,c1conc1NCC=O,-0.0031105492604511,-0.2811962966144302,-0.2881214894915998,-0.9091367797156196,-0.8929508066453637,0.3649424342738657,-1.1083963262254466,-1.2647680513437465,0.4833100401420839,-1.0909569668766523,-1.0581462471851546,-1.058159226240347,-1.05815923089801,-1.0581545686251157,-0.666755679041302,1.019772883113319,1.0210420614330864,1.0211173601897017,1.019453679230169\ngdb_81236,CC1C2CC11CCOC21,-0.003881157969071,0.094590994088757,0.2692170367385004,-0.78341752018622,0.8816422188130195,-0.272160429713758,1.680503561646182,1.7845457371991842,-0.5079288904260272,1.0782509649967569,0.6121009109532319,0.6120850021295757,0.6120849974822341,0.6121284496267055,-0.1079825179329586,-1.0251182536797236,-1.0292150256404522,-1.0293118985979144,-1.0181409526045728\ngdb_26837,C(=O)c1nc(c(o1)N)N,-0.0034494504132039,-0.1460588504575285,-0.2157787425106166,1.3896176642782836,0.0511375682199705,1.318337500241161,-1.4705911168581258,-2.064450066282541,0.1003068119105514,-1.414664171761915,-1.4602611585157068,-1.4602720413879675
,-1.4602720460481151,-1.4602341281144386,-0.3583227403325996,1.3554045732599682,1.3578527647754743,1.35799131047948,1.3586965806292666\ngdb_127153,CN(C)Cc1nc[nH]n1,-0.0030758594615856,-0.2732722908709872,-0.249357971143349,-0.2120613843623802,0.3564701603497681,1.557815881740055,-0.0686842213504619,-0.7933765478008776,0.6927502470881205,0.3708796455849634,-0.0961699778608728,-0.0961606257688165,-0.0961606304205362,-0.096202586131363,0.0997727895452099,-0.0903105771886677,-0.092224797596268,-0.0940527355827943,-0.06640685859394\ngdb_29149,Cc1c(=O)occ(n1)O,-0.0037409950442937,0.0364837232864412,-0.1189201466284322,0.1336012366929848,-0.6731113403119081,0.5366439153485444,-1.8562220410023311,-2.0833899034784595,-0.2990931362587564,-1.4586025185767193,-1.5558463800838211,-1.555868993470438,-1.5558689981311766,-1.555803154717449,-0.5454009792939828,1.1311160741964723,1.1343666513838904,1.1360814604509752,1.1150217625597632\ngdb_57950,CN(C)C(=O)NC(=O)N,-0.0035622406163338,-0.2760819980867897,-0.2932236337714849,2.230054834791348,-0.334802828232093,0.0486502322941941,0.1997660352361119,0.174659575546085,0.8770575863062892,-0.062763587103687,-1.5208438604882213,-1.5208007559108323,-1.5208007605713558,-1.5209059218338443,1.1124568357124889,0.2387343410388529,0.2419348557541512,0.2401888627576893,0.2563195800636159\ngdb_5336,c1c(nc2n1CC2)O,-0.0018376983611194,0.1481332799089937,0.1948480356785647,0.1930630486325657,-1.2727845512548317,2.086475705048934,0.4596940614548581,-0.5155922689274016,-1.1475478811229407,-1.1704186556443283,0.8192661697310617,0.8192203395605722,0.8192203349145106,0.8193245933148277,-1.7633787771636231,1.399967546494674,1.3970857055879915,1.3969449578485478,1.4003752873346569\ngdb_85303,CC1OC(=N)C2CC1O2,-0.0040585418266759,0.2865287236464666,0.1919364363488985,0.4475073296135849,-0.251752363172789,-0.6155633918631153,0.0911075980463082,0.3766845056358857,-0.6773411691526626,0.0212061399828807,-0.6874570911998111,-0.6874849192901209,-0.6874849239454933,-0
.6874096768428399,-0.4897698187607294,0.0066450273024454,0.0040213124548807,0.0039533405062289,0.0150042566710854\ngdb_53776,CC(=O)C(=O)C#CC=O,-0.003140292847753,-0.4304833020311374,-0.4471824346893784,-0.0489007861061897,-0.3115975512302287,-1.230073955709334,-2.7169673081529333,-2.1107474460947877,1.6760908993595915,-1.930534023387162,-1.1236757504606558,-1.1236476996986042,-1.1236477043566715,-1.123748463040361,-0.0326589111932433,1.3290679740733558,1.3392850185572145,1.3419933205469996,1.2857830657600309\ngdb_59951,CC(C)CC(C)(C)C=O,-0.0041894976135036,-0.2013248443000441,-0.2044244178519813,-0.113459324783449,1.7121468694060684,-0.2450496695440723,-0.597062504155782,-0.4756081681804616,1.0129990519960703,2.298817366631254,0.5498140999604272,0.549893657030672,0.5498936523829461,0.549731375778212,2.673329353152139,-2.320792928019912,-2.3182439000687136,-2.319004347844189,-2.3174730668478998\ngdb_116048,COC1OCC=CC=C1,-0.0036757492636014,-0.0778492615759227,-0.1477988308888197,-0.6066656506295541,0.7387465656962741,0.8077515170454057,-0.8889488942538822,-1.252141493213134,0.1623120140488621,0.32649049493827,-0.2864181347391322,-0.286419507735923,-0.286419487426268,-0.2864113255182147,0.0862342327782677,-0.4420080123789269,-0.4418437110959314,-0.4412031814220383,-0.4476675728170731\ngdb_6853,CC1(COCC=C1)C,-0.0036141288625578,0.2400580604817327,0.408864277314963,-1.016364069158314,0.1769345961774473,-0.141125088893609,0.2572910902189492,0.3198649940481285,-0.9130899062685304,0.8649606453355635,1.560824111222746,1.5608223482959056,1.5608223436544273,1.5608467277539564,-0.0107510647885553,-0.6074193710470602,-0.6105726468106767,-0.6111860181525949,-0.604830435678801\ngdb_98309,COC(=O)C(O)C1CO1,-0.0037671187784402,-0.183298520477311,-0.1588428283461739,0.069304068617658,-1.1933980773010835,-1.0222247944084073,-0.2348677135231029,0.2441056452644534,0.2901471638800638,-0.4487418279672498,-2.110726114209,-2.110712044346083,-2.1107120490102504,-2.11074648763956,0.31565123108
35441,0.4832603821683243,0.4853012450168802,0.4842775683241411,0.4937309086391735\ngdb_108366,CC1(O)C2CCC12CO,-0.0042297028527495,0.1160394781608495,0.070452464318002,0.1287658805572387,0.3918887410368253,0.0125025520679462,1.0839474358982402,1.0648319237542698,-0.1781457455015012,0.9864673048241088,-0.3156818955299094,-0.3156591648422457,-0.3156591694953218,-0.3156858465973653,1.195903576512371,-0.8924021414525165,-0.8931121983076736,-0.894169014131993,-0.8776911770408385\ngdb_14734,CCCOC=NCC,-0.0028003969261942,-0.409830375507789,-0.3908032840331164,0.1346467191007139,0.5848589392628573,-0.0597928083845496,0.4596940614548581,0.4819058233909904,1.8620310832219973,1.145781379013764,1.130267526235248,1.1303137849359504,1.1303137802918113,1.130186668301129,0.8269163657187988,-0.6358274470704607,-0.6371637178571573,-0.640028337153304,-0.6163828407781893\ngdb_99079,CC(NC(=O)CO)C=O,-0.0031864169613952,-0.4388429702417498,-0.3975483370256658,-0.1853362403148103,-0.4984610976136649,-0.8369679332488859,-1.1041352110415328,-0.7007817881763858,1.651564629127624,-0.1228106575113329,-1.61547781443735,-1.615438721748431,-1.6154387264095376,-1.6155504273398462,0.9549172660607972,0.1143696892262053,0.1182971615208642,0.117423193926342,0.11818655470897\ngdb_67280,CC12C3OC1C1OC3C21,-0.0036550691254215,0.3153013883260442,0.5472610818156755,0.1838497349144549,-0.6572340455211593,0.7354561565929086,0.5001746557020399,0.151510885639961,-1.1860130254577963,-0.3454175896982375,-0.2544311494716589,-0.2544858673791442,-0.2544858720318422,-0.254351633774053,-1.249159775597399,0.2944413352350408,0.2899188606087857,0.2902732296910069,0.3003064880838798\ngdb_68389,CC12CC(CC3OC13)O2,-0.0039673988980652,0.3355565360750206,0.3057808828659062,-1.5325056653239717,-0.2199977735912894,-0.0959404886107988,1.277828176766321,1.3068409545910096,-0.833680682045452,0.3589183172455021,-0.2860880567574519,-0.286120656099314,-0.2861206357896571,-0.2860262602472028,-0.4070615446935917,-0.4073356855128125,-0.410
7276615827682,-0.4103065956063175,-0.4037091611097915\ngdb_8566,O=C1CC2CC(C2)N1,-0.0031852177123345,0.3490368167620731,0.5016612377686162,0.7441629628066584,-0.9137134229101902,0.170648653057782,0.4234745823915902,0.3367004048889457,-1.3458545039765897,-0.0927270151299247,1.189405778203233,1.1893614215016433,1.189361416857869,1.189462681539924,-1.3776529870945595,0.3291136621007371,0.3245462518274118,0.3246538107336016,0.3266541786480388\ngdb_44565,N=C1OC2C3NC3CC12,-0.0037090463861383,0.1439345039572889,0.2982691391407802,0.9444381865372244,-0.3958693466580525,-0.2811973497703203,0.4682162918226857,0.5955448465665025,-0.8690056030441395,-0.2813132848035877,-0.1608654120445,-0.1609102577018203,-0.1609102373913897,-0.1608065587617942,-1.0898971169026426,0.3065974955129577,0.3029453925301061,0.3032104652990194,0.3129303545343703\ngdb_47117,N=C1CC(=CCO1)C#N,-0.0035905804743204,-0.142579864668973,-0.2113702707983008,1.04676477719369,0.0914414703811036,-1.469552337208229,-1.4748522320420396,-0.7723322842498567,0.107924489715569,-1.046207173164448,-0.1321839244897989,-0.1321957377387762,-0.1321957423907171,-0.1321931088334416,-0.7147560166695518,0.6958279933024797,0.6995372731909743,0.7018801679574431,0.6674871742867279\ngdb_97828,CCC#CC1CNC1=O,-0.0032486010647622,-0.445826197614059,-0.4467534529699605,0.5132420359995392,1.064841774090898,0.2519809335668401,0.3829939881444084,0.2609410561052706,2.1517627080386545,-0.0513432233624933,0.2395462889126155,0.239577844981692,0.2395778403320485,0.2394655953072701,0.4978063584933112,-0.2079442403403283,-0.2030282909864966,-0.2016359274387853,-0.2326429473650056\ngdb_110049,OCC1C(O)COC1=O,-0.0040986696950147,0.0771581814531823,-0.0514422348910606,1.8283935622720036,-1.3472857037345023,-0.9318555938427868,-0.1837343313161364,0.252523350684862,-0.264710460987347,-0.3664250482642751,-2.111611905419761,-2.1116242258115823,-2.1116242304757558,-2.1115805211437126,-0.0747515149595545,0.3902143624429034,0.3903260795202906,0.389972269386783
1,0.3985190292161958\ngdb_26080,OC(C#N)C1=COC=C1,-0.0035723319932222,-0.1327806161320618,-0.1924129297082804,0.1425531798091635,-1.0321824686565515,-0.2043835292895426,-0.39679009051183,-0.2967319279967843,-0.0256695830716948,-1.4138827784533072,-0.628212349579716,-0.6282307925191138,-0.6282307971741201,-0.6282012359940763,-0.7595563317892512,0.9827721113972304,0.9872307777610068,0.9899833465511408,0.9503273242960798\ngdb_133015,CN=C1NC=C(F)C=N1,-0.0026547572464901,-0.2511103305845449,-0.25656851493782,-0.0145958946025856,0.1708279443348516,1.856034243606602,-1.5089411535133506,-2.35486090328663,0.3159631240149217,-1.07208331311489,-1.162507054824155,-1.1625240489955602,-1.162524053653868,-1.1624887851194288,-0.5909397611464254,1.0006707201233325,1.0017491750699652,1.0019605479185614,1.0014183630303597\ngdb_31972,CNc1ccc(o1)OC,-0.0037459633618308,-0.161730071378027,-0.2252163403378102,-0.512833604535886,0.2306731323922913,2.411804827085168,0.5278719043974801,-0.6018737494865873,0.5111877883478605,-0.0496301688013142,-0.6876676241875969,-0.6876492727127257,-0.6876492773681004,-0.687714513941867,0.3456822115483977,-0.0154699545311235,-0.0130909553360013,-0.0130382332500935,-0.0197954401892706\ngdb_76650,CC(=O)NC1C(O)C1O,-0.0038401624412736,-0.1286260378219538,-0.0666208859419531,0.1313795865765609,-0.7708177697934447,0.0305763921810695,-0.0005063784078399,-0.0147387964131034,0.0458038303049103,-0.0586762990278913,-1.614773425078882,-1.614761188243517,-1.614761167942071,-1.614790955612207,0.5915916335515838,0.1883607441680661,0.1888410819930146,0.1874721438758986,0.2048865855794379\ngdb_58862,CN(C)C12CC1OC2=O,-0.0039696868663191,0.0629139129613836,0.0074560424579532,1.1246532165694925,-0.3641147570765547,0.8438991972716536,-0.1688204281724379,-0.5618896487396479,-0.2753932970288627,-0.0755664159293415,-0.6868101653151254,-0.6868070612720698,-0.686807065927438,-0.6868098127479542,0.2080812436807473,0.0745999560668249,0.0745990208194992,0.0740332593022434,0.0834837453107
587\ngdb_43920,C1CC2(C1)C=CC(=O)N2,-0.0030552622208063,-0.1678167180207993,-0.0677526674995664,1.1315795375206965,0.533563063785051,-0.032682048214864,-0.8250321664951741,-0.7996898268661837,-0.0586714179620431,0.0221979084130371,0.2385161151074332,0.2384996875279017,0.2384996828782515,0.2385362565260051,-0.3204147813851613,-0.3161566170638322,-0.31528466373424,-0.3131005480507782,-0.3387347189112253\ngdb_113474,OCC1CC11CC2NC12,-0.0032656502967998,-0.2831031091518962,-0.176367187947926,-0.7957672811275176,0.6141708681073171,0.6767161762252556,1.4141838626515653,1.079562908239984,0.5264703277348302,0.6733990703714319,0.2114572093335033,0.2114627007826476,0.2114626961328303,0.2114515851181936,0.3818670814527702,-0.5349465250785973,-0.5372139350231113,-0.5383421298102526,-0.5187687222094437\ngdb_37829,C1C2N3CC22COCC132,-0.0037383699599444,0.2925143471084458,0.5297367222139234,-0.8924090611919571,-0.0050436287319113,0.5501992954333872,1.1713002971684747,0.9006866680563062,-1.117935414209091,0.0254737495914322,0.2427276984724019,0.2426847836802274,0.2426847790306031,0.242781210482134,-1.055189180463754,0.1262400381425635,0.120462185680558,0.1195729480179307,0.1458621734374493\ngdb_41166,C1OC2C3CC12CC=C3,-0.0038088935418024,0.2301325329688304,0.4187764932147039,-0.5166234782639033,0.6105068770017594,0.4327193346980811,-0.0431175302469787,-0.2441212691192319,-0.9179426862357262,0.3869583155890122,0.6432127889225331,0.6431719128243285,0.6431719081771791,0.6432762261316202,-0.6970328150837367,-0.3805926573268304,-0.3856128615117936,-0.3853715127336578,-0.3742696206357465\ngdb_112033,CCC1C=CC=CC=C1,-0.0040543140592501,-0.0102458117790743,0.0425412707439612,-1.6386221297084542,2.1579324539155733,1.3590036404956896,-0.9443433916447626,-1.5657010201233457,-0.0059800074284229,1.0776198396321115,1.536837728964887,1.536847841949421,1.5368478373077945,1.5368280006510078,0.4278981744603733,-1.47779362939686,-1.4757298163465735,-1.4726773889060585,-1.5141648662770513\ngdb_121339,OCCC1
OCCC1=O,-0.0038197917867224,-0.2655377035915308,-0.2907866525143663,-0.0933337884346677,-0.3897626948154569,-0.1501620089501701,-0.7824210146560353,-0.7028862145314877,0.8543407424292088,0.2951446018275723,-1.214026029597255,-1.2140123267731675,-1.2140123314317937,-1.2140517857156214,0.3922056157111629,-0.2909186872387703,-0.2899827307226565,-0.2904158680206926,-0.2873729652083713\ngdb_79702,CC1C2C3CN3C1C2C,-0.0040001046858119,0.1681611255011861,0.1477787010607334,-0.7821106671765589,1.279795918950276,0.803233057017124,1.7380286166290195,1.3447206289828475,-0.3388107953995238,1.4307194543565525,1.108215206742459,1.1082114448017957,1.10821144015752,1.1082356763028276,0.3501130119448517,-1.3030422701013642,-1.3073933399937985,-1.3079669982848838,-1.2896680665402378\ngdb_2951,OCC1CCCO1,-0.0023252400794747,0.3011265732710405,0.393293153627156,0.0470875674534189,-1.8663511103551584,-0.2405312095157905,1.4440116689389624,1.5383278536522398,-1.35256854096273,0.0448282607738768,1.614691325110612,1.61466414511818,1.6146641404770343,1.614714528467036,-1.4470688599723365,0.7781678865363201,0.771201392238576,0.7681560801136003,0.8001175616919606\ngdb_92952,CC1OC=NC1(C)C=O,-0.0042585235156591,0.0927599489368857,0.269472600316026,-0.4094615314716917,-0.3714427392876684,-0.3173450299965682,-0.9230378157251932,-0.7660190051845505,-0.4845886487688352,-0.0686540905070397,-0.6882776057016489,-0.6882691427294537,-0.688269147384831,-0.6882884424204816,0.256327736886579,-0.0795441418781821,-0.0776310272973418,-0.0771230995193855,-0.0853141619566925\ngdb_46328,C1CCC2(C1)CNC2=O,-0.0038240029746681,-0.0607132045339271,0.0581854125967754,0.4576354404384588,0.4761605364646492,-0.032682048214864,0.6343997839953268,0.6418422263787493,-0.1730756341981195,0.7402382518462091,0.2085139911146508,0.2085132008806449,0.2085131962308094,0.2084916797911128,-0.0038587086162932,-0.8441105099100884,-0.8443120859064928,-0.8432742957163718,-0.8566665976567033\ngdb_63697,CC1(C)CC2(O)COC12,-0.0040425039429245,0
.0884222661115907,0.0733731909182941,-0.2471503876717812,0.3369288744534615,0.0034656320113839,0.9710278835245229,0.9575061796440624,-0.2394246034372963,0.9666619898097746,-0.3158720342266094,-0.3158565686797261,-0.3158565733328019,-0.3158658482613926,1.016210004878409,-0.9123748491388216,-0.9136656329512896,-0.9145774839478664,-0.8982398680061882\ngdb_20438,c1nc2n(n1)CCO2,-0.0026442403296124,0.5483429245688647,0.4352329621532243,0.9711633305847944,-2.098403880373804,0.1390194328598146,0.1827215745004564,0.1157356376032259,-1.6162036051529078,-1.4612472343904692,0.4180786619530579,0.4180135650913956,0.4180135604428548,0.41815230611516,-2.617046320213727,1.833016334706388,1.8282322035465528,1.8274893457233063,1.8431908862528357\ngdb_74563,CC1=NC2C(O1)C2C#N,-0.0039928705059943,0.1826705798425663,0.1936249814146923,0.8369495264925976,-0.6975379476822923,-0.9047448336731012,-0.2923927685059402,0.1325710484440431,-0.7490419225454087,-1.0610836996167927,-0.1318476807922102,-0.1318691527102304,-0.1318691573621692,-0.1318209739274227,-0.8348799385289665,0.7311479844452552,0.7335408879409026,0.7356439528076827,0.7099694770054434\ngdb_73234,NC12CNC1COC2=O,-0.0041928135095238,0.3342053510319155,0.3066662281166197,1.3262352933097192,-0.9332547088064967,-0.1863096891764192,-0.0729453365343758,0.0168275989134279,-0.8347734154249244,-0.3031021366782341,-1.0886174149105048,-1.0886438904779645,-1.0886438951358153,-1.0885683097767709,-0.4887851964504069,0.4425493070029847,0.4401450268203463,0.439439840128156,0.459378603885073\ngdb_73565,CC12CCCC1N1CC21,-0.004244944921456,0.3742610422163001,0.3124346631521965,-0.6549538693365324,1.0489644793001491,0.7670853767908761,1.4823617055941871,1.1090248772114135,-0.5694225775280186,1.4413884783779307,1.1074087211164971,1.1074010606111118,1.107401055966831,1.107432196402442,0.2181736223615597,-1.3877578063924425,-1.3917695036597515,-1.391748050901116,-1.3813919973096265\ngdb_7801,CC1COCC(=O)O1,-0.0036627288452284,0.3470163531462151,0.1469024830
806459,0.6541207904410076,-1.6697169210235676,-0.6020080117782737,-0.0559008757987203,0.2251658080685343,-0.9309732726326152,-0.5367086823632353,-0.2338495654038558,-0.2338754389484278,-0.2338754436009971,-0.2338109252333147,-1.018265843826639,0.8071659401404747,0.8048125476273621,0.8039715510416457,0.8125960969485515\ngdb_52852,CC#CC1(OCCO1)C,-0.0040181873721091,-0.1646281738770234,-0.1263862540004287,-0.7031114027425446,0.5408910459961664,-0.2224573694026672,0.7792777002483985,0.8733291254399796,0.4848238529854154,0.2571869191824984,-0.286147567149067,-0.2861116196565766,-0.2861116243094686,-0.286240110511482,0.7131924888764832,-0.4135868257422118,-0.4097867987420663,-0.4093749494753894,-0.4281219520536632\ngdb_106373,CCC12CC(O1)C(C)C2,-0.0034352694312239,-0.168915345111922,-0.0408272191531242,-0.777471338992262,1.212622748681721,0.0034656320113839,1.6421535249909571,1.6204004815012216,0.3062865177618303,1.7349218801154072,0.5804847423247852,0.5804986906795605,0.5804986860320237,0.5804822300171886,0.888701415691571,-1.722615970378207,-1.7248141565723725,-1.7248825129811567,-1.718913721431953\ngdb_98187,CCC(=O)C(C)(O)CO,-0.0038257659260522,-0.2744656271941033,-0.1825828592441725,-0.207226028226634,-0.0111502805745087,0.0622056123790369,-0.462837375862495,-0.4861302999559721,0.7875478167434161,0.8936017154549264,-1.244129251361026,-1.2440695323217372,-1.2440695369805488,-1.2441916677276248,2.075417455208453,-0.8294921765306535,-0.8263733142937567,-0.8279008441874508,-0.8161847360775455\ngdb_131928,Cc1cc(c[nH]1)N(=O)=O,-0.0029525965535251,-0.2192312030955855,-0.2411160457681499,2.607212613379546,-0.1210700137412348,0.2971655338496504,-1.5089411535133506,-1.626729384421306,0.296628856510085,-1.0611738603831706,-1.0576668690137736,-1.05768511254722,-1.05768511720488,-1.057643869132075,-0.6507555664985517,1.0701281251256338,1.070406171724716,1.070133301919826,1.0777542850339137\ngdb_131291,C(=O)n1c(nc(n1)O)O,-0.0039253809689948,0.048423400466402,-0.1518148299642212,1.94
28738859183168,-1.868793771092196,-0.4303065307035937,-1.2127936482313366,-0.9975059042457808,-0.3319162447213216,-2.10815073315152,-2.3574897972516364,-2.357529047867273,-2.357529052532965,-2.3574268987115587,-1.1287896981604033,2.074062819173584,2.0751125335829506,2.075069788727424,2.082941762087052\ngdb_53869,C(C#N)OC(=O)C(=O)N,-0.0030809327824967,-0.3577781816042475,-0.3843959400537258,1.6492893572979477,-1.7161274750272972,-1.329480076331515,-1.6474273969905513,-1.005923609666189,0.9817415870408952,-1.8281714999595131,-1.9570200087604244,-1.9570148343575084,-1.957014814058177,-1.957051950737008,-0.2244141061286616,1.565622762562261,1.5718574758428392,1.5729280085802362,1.5491032416877053\ngdb_131765,C(c1c(non1)C=O)O,-0.0040458585243984,0.1511387195375825,-0.0864270632001835,-0.6661928052196179,-1.6550609566013377,-2.2422090020442837,-2.457039281934187,-1.3847203535845658,-0.473081168989523,-1.7915962823989,-1.954552649012778,-1.9545867022538848,-1.954586706917088,-1.954503944932616,-1.151928322452996,1.8248012246655323,1.824671479537732,1.8239563164899515,1.8399793518307803\ngdb_113155,CC(=O)C1(C)OC1CO,-0.0041470651974307,-0.1538818330201787,-0.1178066196120709,-1.0415863322447736,-0.334802828232093,-0.2405312095157905,-0.7483320931847244,-0.6271268657478125,0.3722340674442048,0.1822332687337151,-1.2135162222449252,-1.213471013898138,-1.2134710185567603,-1.2135674711565534,1.3147967204838809,-0.2373670778023247,-0.2336219277075396,-0.2344525817348678,-0.2320844196394305\ngdb_58207,CC(O)C(C)C(O)C#C,-0.0039340962490348,-0.2057824921525308,-0.2809657093208844,-0.2020639588384728,0.7277545923796027,-0.4438619107884366,0.1912438048682841,0.3956243428318036,0.8410517176177441,0.918666408507968,-0.3157113511767666,-0.3156573176136203,-0.3156573222666949,-0.3157588357619214,2.065078920950062,-0.895496246096056,-0.8929198672294106,-0.8939780395788387,-0.8860234988244132\ngdb_105283,CCC1(OC1C=O)C#N,-0.0040628801239691,-0.1732025163468207,-0.1958356561929977,0.25448514008
66382,-0.4178532932913988,-1.862658359668677,-1.6985607791975177,-0.8102119586416943,0.3744738670371955,-0.8257039921930269,-0.6574640784876252,-0.6574451625633012,-0.6574451422559391,-0.6574924317524077,0.2472199805160891,0.5336418454068486,0.5385951470613105,0.5396369621040454,0.51840016639581\ngdb_118672,COCC(CCO)C=O,-0.0038673583151333,-0.4188530083423548,-0.4665596302281908,-0.255122191030714,0.2392224449719266,-0.3128265699682876,-0.6354125408110068,-0.4819214472457684,2.2638352592978586,0.9065247586357515,-1.2433128306948529,-1.2432485640102775,-1.2432485686690844,-1.2434119267791872,1.565875409616265,-0.7437330354530268,-0.7408951455335164,-0.7430255589963873,-0.7271708046301495\ngdb_44610,O=C1NC2C3CCN1C23,-0.0038578140610882,0.4217356232401634,0.4091837317868698,1.0066443897970936,-0.6645620277322747,0.3333132140758983,0.4170829096157194,0.2567322033950657,-1.1373436744550307,-0.2471423543463838,-0.1619316814982873,-0.1619831730203201,-0.161983177672445,-0.1618600590015055,-1.2228211287962572,0.1945935295394104,0.1912348241916447,0.1922852187628277,0.1926645171356106\ngdb_93289,CC1COC(=N)C11CO1,-0.0042665203515613,0.298373691594434,0.1475961556482151,0.3819033085285971,-0.2969415868079978,-0.8460048533054468,-0.1475148522528686,0.2483144979746583,-0.5229208206414195,-0.0189755082328464,-0.6874590382679939,-0.687468069569547,-0.6874680492623703,-0.6874381086685672,-0.1079825179329586,0.006440501741262,0.0057756837735096,0.0056979188566663,0.01175852667445\ngdb_27119,c1[nH]c(c(n1)N)OC=O,-0.0035754323560011,-0.1454463974239715,-0.1959178016286308,-0.0715746858238103,-1.1457661929288352,1.7024066026450468,-0.430879011983141,-1.2184706715315,0.0241057270661992,-1.4067600779094571,-1.4604324505909008,-1.460445980429389,-1.4604459601269888,-1.460417200116097,-0.5478625350697909,1.3374115681210124,1.3397424546342598,1.3400113143737231,1.3377973850202602\ngdb_82367,OC1C2CC2CC1C=O,-0.0042356217271457,0.2183128208160611,0.1980699622095118,-0.3811028211620457,-0.195571
1662209053,-0.4257880706753132,-0.7248959596731982,-0.519801121637606,-0.5589413448970433,0.3412468070354441,-0.2867371294096207,-0.2867426728952562,-0.2867426775481521,-0.2867327873722336,0.0313415389777567,-0.4755161168194266,-0.4754912534449783,-0.4746159860124262,-0.48436512409279\ngdb_66048,OC1(CNC1C#C)C#C,-0.0040692466443281,-0.0438107635975151,-0.0117294803977149,-0.1984047704114215,0.3528061692442104,0.2203517133688727,0.1017603860060929,-7.811927389259577e-06,-0.1634065344549956,-0.8258843137257827,0.2722199152632046,0.2722419394472608,0.2722419347978191,0.272205204583638,0.7358388020139153,0.6006318330620566,0.6047882266647023,0.6053605964983771,0.5929493696037745\ngdb_3029,N#CC#CC1CO1,0.0037097404158766,-0.2317770193603037,-0.0843004091443459,1.2037178236539896,-1.554911866382764,-2.1653951815635044,-1.528116171840963,-0.498756858086585,-0.5470328030137833,-2.468793798662907,2.231970400784552,2.231931766188021,2.231931761550689,2.2320080813861103,-2.438337370890089,2.7335187814924264,2.7335787711469712,2.733764249191453,2.715391771994123\ngdb_39138,C1CC2(C1)OCC1OC21,-0.0035547245853545,0.0133936125264641,0.1371089216990416,-0.4192629290441501,-0.034355557576373,-0.3128265699682876,1.6038034883357326,1.72983065196653,-0.4854812085491888,0.3499924013740956,-0.2859835141735244,-0.2860101718574479,-0.2860101765103393,-0.2859440850318412,-0.4978929528208951,-0.3963542361508269,-0.3992241838721048,-0.3988868334755421,-0.3943281750246835\ngdb_28996,C#Cc1cccnc1O,-0.0037275490859315,0.0791407613762429,-0.0942126250440868,-1.0608624141372751,0.8633222632852328,0.5592362154899495,-1.131832459736973,-1.3763026481641571,-0.3836747033019085,-1.388397335157169,0.2984003687756753,0.2983732599523049,0.2983732553030248,0.2984394423911572,-0.7844180451249085,0.7271466253892882,0.7324206893662789,0.7369704517309436,0.6758992857453838\ngdb_113555,COC1CC11CCC1O,-0.0041370180324893,-0.133702452656797,0.0479263604132497,-0.853530184154539,0.5115791171517048,0.0396133122376318,1.
0967307814499818,1.0648319237542698,0.1503412753687326,0.9744158157182696,-0.3155033393926532,-0.3154793096767124,-0.3154793143297859,-0.3155239174898162,0.9682096672501592,-0.8736460986429926,-0.8743859089032061,-0.8755748025714969,-0.8592056191030655\ngdb_29167,c1c(nc(c(=O)o1)N)O,-0.0036565170666837,0.0467565179833192,-0.1032121138812366,0.3318508382585764,-1.017526504234321,1.5713712618248978,-1.4151966194672454,-2.127582856935604,-0.4529733735052355,-1.771340163552677,-1.957012395224585,-1.957038848329644,-1.9570388280303128,-1.956961288409489,-0.5692780703193177,1.566422509948939,1.569357171830806,1.5704453393892304,1.5594531023983556\ngdb_34751,N#CC1C2CCC3C2N13,-0.0038297173688097,0.1871219137462535,0.3595496341231572,1.0286648630098838,-0.1112993707930814,-0.3489742501945356,-0.0409869726550217,0.1220489166685326,-0.9395943206032128,-0.2835673039630345,0.7662674980268577,0.766230514535613,0.7662305098892239,0.7663143903231856,-1.0679892704979537,0.1056144219344379,0.1040178785244929,0.1056834203817623,0.09332922128964\ngdb_47292,N=C1CCOC(=N)CO1,-0.0036546325324455,-0.0450419836134286,-0.1000175691621674,-0.5930743793290786,-0.3457948015487662,-1.2933323961052687,-0.2881316533220263,0.3177605676930267,-0.1663662440991015,-0.3191206995046984,-1.088614519270645,-1.088633331319739,-1.0886333359775902,-1.0885922234633802,-0.4548157267442605,0.4428534732221803,0.4412444328464277,0.4405314919657816,0.4566486571447645\ngdb_103892,CC1(CO)CC1OC=N,-0.0033661827376413,-0.3693011381634377,-0.3076994849841824,-0.0439347446694777,0.3601341514553258,-0.5929710917217101,0.1337187498854469,0.4082509009624172,1.15294863655534,0.5966422045950917,-0.7168142863009143,-0.7167719303518946,-0.716771935007448,-0.7168737351061465,1.2279038015978694,-0.4535637065857729,-0.4521776079277845,-0.4539055699440039,-0.4366565976557421\ngdb_73419,CC12OCC3C4C1N4C23,-0.0039515986535285,0.5885627784220376,0.4457384506436498,-1.064325574612877,-0.2822856223857687,-0.2992711898834448,1.3140476558295
893,1.43731538860734,-1.135096010899217,0.0370443812765891,0.2408157773672996,0.2407708301608731,0.240770825511237,0.2408839165592879,-0.9961118418443696,-0.0745935744485195,-0.0788162025882832,-0.0782999156847695,-0.0707297492041843\ngdb_69330,CC1(CC2(O)CC12)C#C,-0.0039240877695469,-0.0378630238283331,-0.0294181308707335,-0.636984640453692,0.7961490930166759,-0.3534927102228161,0.9348084044612548,1.087980613660393,-0.0775324178861083,0.2409279276456937,0.6420376583874399,0.6420580838880202,0.6420580792408638,0.6420360593486281,1.044764051877778,-0.5040316998685774,-0.5015833033173271,-0.5005240068112916,-0.515845000543935\ngdb_15249,CCC1C(C)C1C=O,-0.0042673161666062,0.2251508273659805,0.0233831297001707,0.4889345700198423,0.1903692302311583,-0.032682048214864,-0.5757569282362126,-0.5534719433192392,-0.2171399200625985,0.8071675940873643,1.5609928571318563,1.5610215494369135,1.5610215447954363,1.5609541646090344,0.4456213760461893,-0.5896938224107721,-0.5898320792264509,-0.5905917352583879,-0.5925656227666342\ngdb_117321,COCC(=O)NC(=N)N,-0.0039432591750375,-0.2256588029735337,-0.2866519989208279,-0.8205974883110788,-0.3506801230228437,0.4191639546132382,-0.3520483810807344,-0.5429498115437288,0.8071633996455853,-0.0334913916196793,-1.5201209490196113,-1.5200964126282943,-1.520096392326263,-1.5201508434348536,0.8190393872362134,0.3146710109341142,0.3152701759241414,0.3130095244655991,0.3425180758331134\ngdb_40263,C1CC12CN=COCO2,-0.0037535954491708,0.0806434811905374,0.0583770852799196,-0.6926565786652557,-0.2920562653339221,-0.2314942894592295,0.0463658886152125,0.153615311995064,-0.4804346891342753,0.0175696557389739,-0.6870742676403291,-0.6870980497057443,-0.6870980543611144,-0.6870353452724978,-0.4476772149944182,0.0468578991781406,0.0443016779210876,0.0439496057056204,0.05773732694006\ngdb_16555,C1CC2OC(C1)C=C2,-0.0037704236214737,0.7825020297492065,0.9410480457000476,-0.7508115375951753,-0.5851755537785266,0.2745732337082452,0.0357131006554279,-0.092602571551881
,-1.587332495461507,0.2755496619347865,1.5907456565476867,1.5906929587907122,1.5906929541494184,1.5908196133532106,-1.510330843410594,-0.0879296898871078,-0.0936101011709164,-0.0929920525916265,-0.0950826978917454\ngdb_131338,c1(nnc2n1nno2)O,-0.0033797502788576,0.2432971162159051,0.0748244269478142,0.0100382846295264,-2.542968134514789,-0.9905955742104412,-2.118280624813034,-1.6309382371315104,-1.118797947949577,-2.811585032431481,-2.2290252091283653,-2.229096833120752,-2.2290968377856517,-2.2289250062959285,-2.172489347102861,3.1285389259369776,3.1242369781497588,3.121672277705911,3.174270742285083\ngdb_132046,Cc1cc[nH]c(=O)c1F,-0.0041405549882442,0.265478018348746,-0.0876409901934299,1.4525426366934666,-0.2993842475450375,1.151154479194763,-0.927298930909107,-1.4520619969478323,-0.3818920859111895,-1.067996025039094,-1.2591525299046218,-1.259174320795565,-1.2591743254544698,-1.2591185010453916,-0.563370336457379,0.6650101866278928,0.6685927622067032,0.6711539107972394,0.6366570137607981\ngdb_29326,C#Cc1c(c(c[nH]1)N)N,-0.0035805775213259,-0.0806589687917252,-0.1803558052114498,-0.8464731779023689,1.2968945441095447,2.872687749969831,0.0506270037991264,-1.285812314894767,0.0489683604352567,-0.7194946093999235,0.3648658876360681,0.3648910641185101,0.364891059469641,0.3648529172577922,0.8121470310639521,0.5161732647837476,0.5213503267700021,0.5225111900130731,0.5031318421425978\ngdb_123071,COC=NCCCCO,-0.0010119960401225,-0.6403273903843884,-0.6714759830505543,-0.1319512948701536,1.2602546330539697,-0.3489742501945356,0.2615522054028631,0.4187730327379276,5.683907721184669,1.2866124960959608,-0.7470787902124304,-0.74700592067247,-0.7470059253282101,-0.7472060998294674,1.715538000785371,-1.0090787298619068,-1.0069747954623165,-1.0096673269950691,-0.987441875116732\ngdb_33268,C#CC#CC#CC1CO1,0.003066274213661,-0.6041042661213344,-0.6169587956019786,-0.6682184273845927,4.353884456513077,0.0712425324355992,-1.5451606325766187,-1.5593877410580397,3.918919905301884,-1.8502308
341332927,0.7328152940275813,0.7328254567281242,0.7328254520815286,0.7327841067009513,-0.2404142186714118,1.1608457000002446,1.1696178190708018,1.1735227958184486,1.1080800608271657\ngdb_18527,COCC1CC2CN12,-0.0023331650709631,-0.1773697225545278,-0.0662831769287944,-0.7001056408203239,-0.4154106325543591,0.4056085745283942,1.5484089909448522,1.340511776272644,-0.0520807022110742,0.5127025310973166,1.1619223616407957,1.1619197169494784,1.1619197123055347,1.161917734070128,-0.4905082854934717,0.0657319375033135,0.0604782810097506,0.0575707366729015,0.0940758246779086\ngdb_77569,CC1C2(C)OC2C1(C)C,-0.0041635562536382,0.0168473425198213,0.1458802287705437,-0.6297969489005559,1.2407133471576612,-0.0823851085259547,1.5761062396402925,1.595147365239997,-0.0117114065146058,1.6287726044998896,0.5808252296325149,0.5808690849815985,0.580869080334064,0.5808058386113417,1.7933231633008946,-1.6868502183969545,-1.6862491763950855,-1.6865895343365138,-1.681971101868937\ngdb_110579,CCC1C2C3CC2C13O,-0.0036976728627888,-0.0949095512323237,-0.0657994315856211,-0.84542769549464,0.9366020853963836,-0.5342311113540583,1.318308771013503,1.5509544117828522,0.1526111010015924,1.010450068680616,0.6121697323247423,0.6121749421936169,0.6121749375462757,0.6121713095431014,0.4783600678644303,-1.0178890617286742,-1.019850581404116,-1.020013502449063,-1.013248135743287\ngdb_89649,OC1CC2(OC12)C1CN1,-0.0027808828781148,-0.2337090876929679,-0.1324650162372865,-1.074780398690166,-0.4068613199747257,-0.2314942894592295,1.122297472553465,1.2163506213216202,0.140571016467483,-0.0199672766630029,-0.6857072760020686,-0.6857217894918158,-0.6857217691846283,-0.6856842469411868,-0.1697675679057308,0.1904505758637442,0.1875961281246509,0.1862359707548055,0.2119764681362802\ngdb_90224,CC1CC23CCC12OC3,-0.004037740105642,0.1565813434028,0.3503402180616115,-0.6851421738597042,0.6886720205869877,0.5953838957161974,1.064772417570628,0.7744210867501805,-0.571303137001164,1.0274904535260283,0.6122045049654805,0.61219768
30757537,0.6121976784284128,0.6122264008858423,0.2085735548359094,-1.0142364449757502,-1.0174828298916812,-1.0176624508555014,-1.006958999568065\ngdb_129438,c1c(nnnn1)C(=O)N,-0.0031463940964304,-0.0681384083222053,-0.1257017087034853,0.4691357469234765,-1.5023946605364389,-1.3159246962466735,-2.625353331698785,-1.9802730120784575,-0.4203711709074122,-2.134207194634718,-1.3334153079616693,-1.333449137212825,-1.333449141872189,-1.333371721108818,-1.2338981297873912,2.2398438971253944,2.2394178545067613,2.2382162526024985,2.2628646297659505\ngdb_4873,c1cc([nH]c1CO)O,-0.0028728658337174,-0.0296043787985136,-0.0217694780862187,-0.6395330038225312,-1.2337019794622168,2.5021740276507884,0.662097032690767,-0.5113834162171974,-0.5576630934855307,-0.8676287485587258,0.2924941621071992,0.2924789531075108,0.2924789484581942,0.2925175848452409,-0.6679864569292056,1.0810728647710917,1.0808024462030243,1.0804536700021838,1.0791734013974197\ngdb_85465,CC1CC(=O)NC1C#C,-0.0041061525670338,0.0018895978136726,-0.0686288854796539,0.5576750383280171,0.3002889633978861,-0.1004589486390793,0.197635477644155,0.2420012189093515,-0.0444254918562224,-0.0308767293947224,0.2390856325506767,0.2390948945462192,0.2390948898965727,0.2390807297330142,0.4195288848226274,-0.2563328904181894,-0.2533124715694563,-0.2515654501398168,-0.2765785620222638\ngdb_9593,CC1(OC1C=O)C#C,-0.002749846091365,-0.0299579599312888,0.095826276658039,0.2286747931458311,-1.035846459762109,-0.968003274069036,-1.3853688131798485,-0.9175377027519014,-0.7756596494353066,-1.3656167148523666,0.7252671846709884,0.7252658982396474,0.7252659185555541,0.725280401221413,-0.5102007316999337,1.3423018267309348,1.344615708295967,1.3448476158954952,1.330773043981782\ngdb_50920,O=CNC(C#C)C1CN1,-0.0044124639884006,0.0662413639787496,-0.1933530385827494,0.2050860963214479,0.0670148630107195,-0.3851219304207836,0.031451985471514,0.212539249937922,0.2613146590970642,-0.4146310046876312,-0.1601782966755434,-0.1601564636097163,-0.160156468261829
9,-0.1602183769271728,0.4404521089169929,0.3787740416299633,0.3814294676350642,0.381138405934794,0.3800762157472167\ngdb_2531,CCC(CO)C#N,-0.0034314119388536,0.2998637835111292,0.1818690568485169,0.3419136064329667,-1.7527673860828723,-1.9439906401777336,0.3297300483454849,1.2310816058073344,-1.0017725299236184,-0.3763126789770455,2.140505649775456,2.140512396530628,2.140512391892732,2.1404973172338537,-0.6985097485492204,0.996464835506182,0.9944925754836912,0.9923111715098596,1.004396227689219\ngdb_89511,CC12CC(O)C1CCC2,-0.0042222641926773,0.1686851832515492,0.1001708574759733,-0.7273535260717582,1.0611777829853404,-0.5071203511843702,1.0754252055304123,1.298423249170602,-0.182767968376856,1.760647752121886,0.5801641001350643,0.5801739029545423,0.5801739232695524,0.5801696796309355,0.8987937943723835,-1.7562971349093577,-1.758630638381691,-1.7584579040154469,-1.7545939543499194\ngdb_70940,CC12CC3(C=O)C1CN23,-0.0033857631036411,-0.0422385903464257,0.000583207676641,0.3633459957914093,0.3466995174016148,-0.0869035685542365,-1.035957368098911,-0.982774919760066,-0.3105609387711007,-0.0675721613105061,0.24165443954309,0.2416463666044428,0.2416463619548122,0.2416709963635337,-0.183552280250254,0.0135018778468197,0.0123435301747671,0.01221686103596,0.019121973832027\ngdb_29588,c1c(c(c[nH]1)O)N2CC2,-0.0037756461577057,-0.0200071766231882,-0.0997985146671455,0.242462092397756,0.1146467473829679,2.4615078873962584,0.6429220143631547,-0.5113834162171974,-0.1164500975361138,-0.3410898729121007,-0.1610834337560624,-0.1610926590473508,-0.1610926636994702,-0.1610849360405651,-0.1225056970102233,0.2836958769051902,0.283953998126856,0.2843504378064302,0.2811512668016687\ngdb_100241,CC(CO)(CC#N)C=O,-0.0038527960051109,-0.2105621513937948,-0.1578570831185753,-0.3845659816376474,-0.3457948015487662,-1.4108123568405748,-1.3384965461567957,-0.6650065401396501,0.2858981217859221,-0.1031255568521701,-0.6880620053441088,-0.6880298517347563,-0.688029856390132,-0.6880908674418708,0.920701640
7770716,-0.0568968691614399,-0.052716355510075,-0.0523841526742892,-0.0627593305897327\ngdb_40555,C1OC1C1=NCC2CN12,-0.0031717662274789,-0.1306212456426136,-0.066210158763787,0.3031654146965147,-0.1845791929042322,-0.1637173890350141,-0.5118402004775046,-0.429310788368216,-0.1433966810531343,-0.3373031207242307,-0.1601722308092852,-0.1601987501677192,-0.1601987548198331,-0.1601562961979496,-0.8828802761572163,0.3794112174163079,0.3770266453935005,0.3767666371098833,0.3871632486728062\ngdb_75924,OC1C(=N)OCC11CC1,-0.0039249664819923,0.1218483110564413,0.0255097837560085,-0.1458692794230446,-0.251752363172789,-1.3114062362183931,0.1656771137648009,0.7765255131052824,-0.419836421435482,0.0013707713797545,-0.6875830515337422,-0.6875976002363003,-0.6875976048916734,-0.6875504131320891,-0.0767207595802004,-0.0065862032329724,-0.0077108832934767,-0.0076961072361829,-0.0010619643324934\ngdb_87911,CC1CC11CCC(=N)O1,-0.0033610541517966,-0.1975996145083059,-0.1226349457731787,0.3281916498315251,0.7656158338036961,-0.0507558883279886,0.4405190431272457,0.4587571334848664,0.3107982877048146,0.6866527030289753,0.2091262941330749,0.2091303499795319,0.2091303453297003,0.2091271148707766,0.3550361234964661,-0.7797924651631573,-0.7800553124246853,-0.7794707298024017,-0.7841263844825093\ngdb_104183,COC1(C)COC(C)C1,-0.004071683827903,-0.0709544294868073,0.044549270281662,-0.5830116111546879,0.8083623967018688,0.1119086726901289,1.4120533050596082,1.340511776272644,0.1711071415601213,1.669795753201809,-0.346654558124195,-0.3466146227539743,-0.3466146274072402,-0.3466973550279609,1.486367158057677,-1.5223041601799814,-1.5230276670852254,-1.5245192378758377,-1.5060063719999062\ngdb_59283,CC(C)C1N=CNC1=N,-0.004048953360684,0.0544216518259803,-0.1246794543933831,0.4964489748253939,0.859658272179675,0.3830162743869891,-0.3541789386726913,-0.528218827058014,0.1085389511742912,0.694526743292641,0.3040957179450027,0.3041143965964443,0.3041143919471997,0.3040776806184396,0.6893153978511501,-0.62018
91080076904,-0.6203841308495808,-0.6209257203634646,-0.61105403033566\ngdb_110848,COC1C2OC(N)=NC12,-0.00389733954165,0.0672137120938813,0.1520685182549124,-1.1209123099312033,-0.7757030912675206,1.2189313796189782,0.824019409679494,0.2462100716195552,-0.5384671602357982,-0.3550347381118741,-1.0879603542863077,-1.087974270101083,-1.08797427475893,-1.087929005572448,-0.1958600591292926,0.5115688175341306,0.5098650425405046,0.50866811564618,0.5323605099039258\ngdb_70611,CC12CC1COC21CC1,-0.0041307178300507,0.2900140233838216,0.1863505467258401,-0.7845283452444319,0.8572156114426354,0.622494655885883,1.5761062396402925,1.2668568538440703,-0.4899668844337167,1.03554481532245,0.6108276032494212,0.6108235446784904,0.6108235400311409,0.6108516135267739,0.3902363710905161,-1.158870104325742,-1.1605563592621655,-1.1597268725046632,-1.1639024408228644\ngdb_107584,CCC12NC1COC2=O,-0.0039315982740329,0.1087279254509634,0.015104695242468,1.39830823679253,-0.4337305880821477,-0.3399373301379733,-0.1411231794769978,0.0189320252685298,-0.3360530466188137,0.0005593244823539,-0.6880465536107148,-0.6880571358008107,-0.6880571404561867,-0.68802259611319,-0.1793676354313811,-0.0552737752844527,-0.0555571375112729,-0.0552048984391223,-0.0549655891129625\ngdb_6506,CC1(C)CC1(C)C#N,-0.0038531994391267,0.2594671390915686,0.2306543183440196,1.0009595792050676,0.1647212924922561,-1.5373292376324443,0.5683524986446619,1.2773789856195812,-0.7427114324732932,0.4535871219422417,2.086810027511734,2.086839071951544,2.086839067313316,2.0867936922168355,0.583222343913838,-0.3710979514677091,-0.3697403494520355,-0.3696135313612227,-0.3818097449310058\ngdb_69762,OC12CC1C1OC(=O)C21,-0.0036658071020339,0.0636842147149294,0.1593338256731388,0.3560929615877899,-1.3765976325789622,-0.6968956723721746,-0.198648234459835,0.1283621957338382,-0.7878294893559873,-1.0832331945569504,-1.1530466499262704,-1.153087506522796,-1.1530875111810457,-1.1529819713981144,-0.904049655829162,0.8674196948895356,0.8671771991541167,
0.8683377176662598,0.860441405684833\ngdb_92759,CC1CN=C(O1)C(N)=O,-0.0034372258098758,-0.1761321885898147,-0.1919200570944811,0.3816419379266647,-0.5607489464081442,-0.6065264718065542,-0.9315600460930208,-0.637648997523323,0.2909088959152798,-0.3701516932745598,-1.0898039532681514,-1.0897982336312968,-1.0897982133266062,-1.0898204833272571,0.1140498130448944,0.3179119544425113,0.3199562966452383,0.3201013903043564,0.3164325513442246\ngdb_8777,CN=C(C(=O)C#C)N,-0.0036659342113813,0.1434420159509236,-0.0342099479493376,-0.4257318514419727,-0.6657833581007945,-0.23601274948751,-1.892441520065599,-1.7593082447927375,-0.547422784079208,-1.3210772962617103,0.8198208095764021,0.8198333948014239,0.8198333901553662,0.8198126022638929,-0.1409673653287816,1.458228488083489,1.460916231816188,1.460325282860933,1.4560855783294844\ngdb_77824,CC1N2C(=O)CCC12C,-0.0041527519591055,0.1292987706399176,0.1840230927162323,0.8082641029305362,0.4358566343035161,-0.0462374282997068,-0.2710871925863709,-0.2462256954743337,-0.3770761235881165,0.6609268310224965,0.2089557509302598,0.2089668203210434,0.2089668156712108,0.2089535284650575,0.5081448927517038,-0.797706806624317,-0.7970818109515361,-0.7963771392309668,-0.8039427202149477\ngdb_124590,Cc1ccn(n1)C(C)C,-0.003353698389125,-0.2722746869606573,-0.1719130798824807,-0.4769604894206884,1.3872729913799642,0.6179761958576026,0.4746079645985566,0.1809728546113906,0.7678167765688981,1.0184443232994511,0.7047458598671775,0.7047809066176477,0.7047809019708788,0.7046530991745721,0.9241478188632012,-1.109684328983782,-1.1077822709416518,-1.1073250035614464,-1.122007162142706\ngdb_123893,c1nc2n(n1)CCC=C2,-0.0033708857834967,0.1671445797444575,0.0544614861814032,0.6320349745777346,0.2819690078700976,0.4191639546132382,-1.0380879256908675,-1.2205750978866026,-0.7119719167058364,-0.5741554539988343,0.3635936782784115,0.3635458822780855,0.363545902591757,0.3636524651672796,-1.5164847328501128,0.3825367875312161,0.3812917169986403,0.3834430042586698,0.366
0902255578246\ngdb_86775,OC1CN(CC#C)C1=O,-0.0038647221777972,-0.2069063750388518,-0.1767687878554662,-0.2370222768469074,-0.3396881497061705,-0.3489742501945356,0.0293214278795571,0.1893905600317992,0.3151120287658134,-0.7869048090617602,-0.6568324295994458,-0.6568150579014841,-0.6568150625566669,-0.6568681297669274,0.2410660910765703,0.5999920351522967,0.6042008371566777,0.6047773498900959,0.5896694440316492\ngdb_130565,c1c(con1)C(=O)C#N,-0.00338776369424,-0.1207083460273103,-0.1928419114276983,-0.7983809871468398,-1.3546136859456177,-3.100716407417675,-2.5784810646757323,-1.1027272220008857,-0.2221989503274571,-2.476186981505891,-0.998713114237242,-0.9987507310663334,-0.9987507357236294,-0.998678359387778,-1.648916433588566,2.0156891263138594,2.020324167082263,2.023106645109021,1.9853447413049157\ngdb_93926,CC1CCC(O1)C(N)=O,-0.0035806880511933,-0.16296760534274,-0.1208003643773703,0.1875742659919891,-0.090536754528255,0.4824223950091718,0.4831301949663843,0.252523350684862,0.2274163324449494,0.6911306877590753,-0.7188035659226353,-0.7187890540862584,-0.7187890587418243,-0.7188294652924077,0.6742999076187233,-0.6625232770504333,-0.6621979468013387,-0.6624446205145024,-0.6599195070604903\ngdb_82959,OC1C2CC3CC3CC12,-0.0037576795277691,-0.0249825682772385,0.1268133604330122,-0.8696044761733707,0.6642454132166036,-0.2992711898834448,1.3502671348928572,1.473090636644075,-0.3147209751042054,1.106381124106645,0.6109740826865007,0.6109583923681797,0.6109583877208312,0.6110059541575695,0.0943573668384333,-1.1434834890301635,-1.146516190578796,-1.1457857301243997,-1.1462831707864367\ngdb_62989,CC1(C)C(C=O)C2CN12,-0.0042284096533016,0.1499327553168672,0.1625922612865896,-0.689716159393518,0.4773818668331673,0.2429440135102779,-0.7930738026158201,-0.894389012845778,-0.3746368653017184,0.6071008534949223,0.2103879194277931,0.2104056366827629,0.2104056569954881,0.2103701023704562,0.693253887092442,-0.6472677678842368,-0.6472740947873121,-0.6476234467469943,-0.6422289962422865\ng
db_98412,CCC(=O)C1(C)OC1C,-0.0042041483474198,-0.1031492544157445,-0.0858702996920028,-0.8868549359008974,0.4908165008868783,-0.0733481884693937,-0.6588486743225331,-0.6145003076172002,0.4500647798570928,0.8940825728756079,-0.3169546540606431,-0.3169012263927587,-0.3169012310458423,-0.3170140047637573,1.6200296366840337,-1.0260963054017909,-1.0224340556115616,-1.022578755231034,-1.0293115071156131\ngdb_49603,N=C1NC2CC2N1C=O,-0.003785450156939,0.1000083621587762,-0.0095936990712514,2.354467241311086,-0.3433521408117282,0.2971655338496504,-0.3222205747933373,-0.458772757339645,-0.4818351894223917,-0.6801544616703895,-0.5625591575498321,-0.5625839067249127,-0.5625839113795132,-0.5625258653021173,-0.6465709216796788,0.6864696378170623,0.6854815100967157,0.6854847444954262,0.6932991341757896\ngdb_27324,c1c(=O)c(c[nH]c1N)N,-0.0036079889284269,-0.0395741039530129,-0.1538867203963034,2.26625466315896,0.3943314017738633,2.511210947707349,-0.2178232527874474,-1.3868247799396678,-0.0843590525217525,-0.6426175292684126,-0.5637522360595777,-0.5637638864160671,-0.5637638661081261,-0.5637121888470971,0.1236498805705447,0.5611452891404015,0.5626235364493019,0.563495877561898,0.5578704088801761\ngdb_77732,OC1C23COC12CCC3,-0.0038985940556443,0.1069347639918895,0.3351889488225965,-0.3688184028712311,-0.2224404343283273,0.3287947540476178,0.8431944280071066,0.6797219007705864,-0.7046641505265968,0.3746663977728326,-0.2853644463412899,-0.285391874481307,-0.2853918791341945,-0.2853100228673801,-0.2975223126701499,-0.3313255961853315,-0.3348478532342124,-0.3349645536497407,-0.321944691568947\ngdb_112507,CC1C(CO)NC1C#N,-0.0037756682636792,-0.2419424769275892,-0.2055379448683427,0.4291460448278463,0.0816708274329503,-0.6291187719479593,-0.0282036271032801,0.2651499088154743,0.5165406448599965,0.332441105519208,-0.1909176355993493,-0.1908958956224105,-0.1908959002747156,-0.1909502100500796,0.6267918811456339,-0.226618997349476,-0.2259936613086663,-0.2268781180392271,-0.2163117106544504\ngdb_
123706,C1C2OC3=CNN=C3C12,-0.0031501244794532,0.137816287570519,0.1314773957228535,-1.303283647429412,-0.5546422945655469,1.4990759013724009,0.1784604593165425,-0.5219055479927078,-0.8890544188072581,-0.9703218614630936,-0.1310528775753406,-0.1311088433928371,-0.1311088480447713,-0.1309758073290865,-1.4583920165410522,0.8146363673692345,0.8127033199563984,0.814248046591111,0.8064522919667759\ngdb_56981,CC(C#N)C1CN2CC12,-0.0038219526456289,-0.1273632480620426,-0.1587059192867852,0.3691614916844011,0.4346353039349963,-0.3218634900248487,0.4319968127594178,0.5745005830154817,0.2301808728299153,0.3422986826431858,0.7365182432736134,0.7365258550665473,0.7365258504199746,0.7365029845137779,0.1364499706047446,-0.3957773691838054,-0.3956660589323177,-0.3953538042417768,-0.3979842519221959\ngdb_54432,[NH3+]C12CC1(NC2)C([O-])=O,-0.0040425205224046,0.138498194040871,0.0865986060552413,2.1279242720863314,-0.6902099654711769,0.6179761958576026,-0.0111591663676246,-0.300940780706988,-0.5408581623866955,-0.3371528527802674,-1.087945002402565,-1.0879614143883487,-1.0879614190461957,-1.087907513209014,-0.0762284484250392,0.5131814229203823,0.5112035628791722,0.5099971953066454,0.534814042412246\ngdb_56216,[O-]C(=O)CNC1C[NH2+]C1,-0.0038168627452378,-0.1046267184348405,-0.0716043757037014,2.0874118287868364,-0.4202959540284367,0.9568606979786792,0.2615522054028631,-0.1873017575314752,-0.0253428711617052,0.3953132132733594,-1.1181193420798048,-1.1181297032799615,-1.1181297079379948,-1.1181013882315964,-0.0181357321159777,-0.0328624935994265,-0.0367528760486315,-0.0389720614210073,-0.0001614808570201\ngdb_98331,CNC(=O)C(N)CC#N,-0.0030225288005928,-0.3750847152638312,-0.3841768855587038,0.9279718386154944,-0.0600034953152752,-0.583934171665149,0.0442353310232556,0.3156561413379248,1.312207466919556,-0.0270599236180598,-0.5933370883635013,-0.5932986008515508,-0.5932986055063413,-0.5933951147556938,0.8859937043381837,0.077022797330071,0.0806340581545712,0.0800257310917611,0.0812239877271834
\ngdb_63196,CC1(O)C=C2CC3C2C13,-0.0036035124687995,0.0623330296718244,0.2779335801862469,-0.6480928910358116,0.7900424411740803,0.9794529981200844,-1.061524059202394,-1.504672655825385,-0.680679778823671,0.3024175703153851,0.6428978880877075,0.6428833207953575,0.6428833411107551,0.6429360926308547,0.0827880546921371,-0.4136707336647481,-0.4156606938227249,-0.4152048348210051,-0.4130986960877492\ngdb_62799,CC1(C)C2C(CN12)C=O,-0.003320401266588,-0.279706204697735,-0.1712924254799186,-0.1520768312189349,0.6972213331666229,-0.0100897480734589,-0.8058571481675617,-0.7912721214457757,0.5517354530192573,0.6232396306765563,0.210404744093879,0.2104216376766702,0.2104216330268466,0.2103928178765247,0.64476123830903,-0.6455004572658083,-0.6456080918025779,-0.6459717749359304,-0.6396358318025713\ngdb_90857,OC1CN2C3C=CC2C13,-0.0032494521447407,0.0538912801268176,0.2370160259702808,-0.82190434132074,-0.0441262005245263,1.087896038798828,-0.1475148522528686,-0.6523799820090378,-0.7928406209382,0.0454894397273142,0.2409040943830433,0.2408623678280504,0.2408623631784149,0.2409672400310001,-0.8050951136416938,-0.0653165047630602,-0.0692854179598793,-0.0688363523548125,-0.0612176800823504\ngdb_8991,C#CC(=O)NCC=O,-0.0020910825555092,-0.3069319519214213,-0.2575907692479222,0.5910651327248586,-1.3338510696807913,-0.8369679332488859,-1.1723130539841549,-0.7681234315396523,0.2900771031203721,-1.6830727732687654,0.3232702457023372,0.3232747505558443,0.3232747459067181,0.3232403567278134,-0.6660172123085596,1.6903256681876349,1.6940942720897605,1.6942974985842043,1.6745269116542716\ngdb_69345,CC12CC1C(C)(O2)C#C,-0.0039727540701379,0.0398238022014065,0.0233283660764152,-0.7488512580806835,0.807141066333349,0.1932409531991871,0.4405190431272457,0.3451181103093543,-0.1708919547034527,0.2192893437150097,0.6420452719232791,0.6420670454431107,0.6420670407959543,0.6420412265022076,0.9879021134566202,-0.5032319524823186,-0.5006502376828988,-0.4995975221547727,-0.5152551268745782\ngdb_34893,N#CC1N2CC
2C11CO1,-0.0041024498164778,0.2141771843523517,0.090185623411225,0.3332230339187206,-0.7109725817360033,-0.9951140342387216,-0.3051761140576818,0.1641374437705745,-0.6625115225336977,-1.0826621763698905,-0.1296758011223936,-0.1297024533822684,-0.1297024580341939,-0.1296406597762124,-0.868849408235112,0.9592883815772822,0.9591348459829632,0.9596467807088594,0.9588705187928256\ngdb_37062,C1CC23CC=CC2(C1)N3,-0.0036740526301376,0.1783581528124694,0.2558638398127904,-0.6205836351824451,0.8621009329167129,0.7173823164797852,0.1720687865406717,-0.16204864127025,-0.7005512979704275,0.7853186350351326,1.1366112491091906,1.136579933910625,1.1365799292665233,1.1366611616589832,-0.495677552622668,-0.9437957440062223,-0.9468297485370948,-0.9450688940220244,-0.9565689701220071\ngdb_24492,c1ccc2c(c1)cc[nH]2,-0.0032719947111852,0.1466558158898977,0.0257744746041598,-0.3596050891531202,1.3127718389002938,1.8424788635217595,-0.3009149988737679,-1.1553378808784374,-0.6600240081117977,-0.5768302234013772,1.1939544040629957,1.1939071765618967,1.1939071719181509,1.1940165663785924,-1.2548213538817568,-0.167416713754702,-0.1642449890451657,-0.1582485735518923,-0.232785428927198\ngdb_48227,C(=N)N(C=N)C(=O)C#N,-0.0038976379722918,-0.0180814222393236,-0.1662267902825372,1.1900612097030314,-0.6169301433600262,-2.1428028814220994,-2.088452818525637,-1.0648475476090475,-0.152013439589465,-1.7932792833712865,-0.9336629392082963,-0.933663605470872,-0.933663610127766,-0.9336686279099268,-0.512908443053322,1.6560440375862495,1.6602933846925727,1.6607375919731433,1.6523282841939284\ngdb_108553,CCC12COC3CC1C23,-0.0036441764069949,-0.0155937264122984,0.0755728631391389,-0.6648859522099569,0.7118772975888521,0.2429440135102779,1.620847949071388,1.4878216211297892,-0.2246861072965017,1.0659891007693691,0.6109871130658724,0.6109771142799295,0.6109771096325797,0.6110060540059491,0.0158337375901677,-1.1421147410442027,-1.1445668891141134,-1.1438501772208092,-1.1462717722609712\ngdb_95781,CC1CCNC(C)C1=O,-0.
0042151958076607,0.1255545990017808,-0.1218682550406021,0.2239701223110511,0.9109541476574812,0.861973037384777,-0.4692290486383659,-0.8628226175192466,0.2287671467861421,1.3932726827209534,0.1787529790650778,0.1787753412214129,0.1787753365713937,0.1787399583498503,0.9773174236206472,-1.3467373207627489,-1.3474527846269275,-1.3477439008222774,-1.3411666025427778\ngdb_65607,CC1(CC2CCC12)C#C,-0.0037227355102087,-0.0889239277703443,0.021484657409981,-1.310536681633031,1.5900138325541495,-0.4574172908732794,0.9241556165014702,1.1258602880522297,-0.0298975354473575,1.0001717413135411,1.538841486785909,1.5388596236982997,1.5388596190566854,1.5388196512913703,0.809685475288145,-1.26731322783664,-1.266265678157609,-1.2646906160977949,-1.2868013374997869\ngdb_104230,CCC1(O)CNC1=NC,-0.00421017222519,-0.0528965359200766,-0.0591912876524602,-0.2547301351278158,0.8083623967018688,0.4553116348394861,0.3829939881444084,0.1683462964807782,0.3151270417857475,0.9650090424261808,-0.2210649909085805,-0.2210146088917128,-0.2210146135442026,-0.2211227424817951,1.5075365377296226,-0.7698283993615107,-0.768788349958404,-0.7707220306781711,-0.7488507992020056\ngdb_118406,CCN=C(CO)OCC,-0.0043710705530808,-0.243767208130661,-0.3833463039317458,-0.8402002834559956,0.9500367194500944,-0.0597928083845496,0.3126855876098295,0.3367004048889457,1.768582540643616,1.2912106951812303,-0.7474913939301324,-0.7474247173570157,-0.7474246970502095,-0.7476064918265519,1.8396004118860776,-1.0524197939750894,-1.0505793696929633,-1.0529617743425896,-1.0331499604137675\ngdb_95924,CC1OCOC1(C)C#C,-0.0041965770515069,0.1022624418802179,0.1661701513719473,-1.0140117337409242,0.180598587283005,-0.5161572712409337,0.5044357708859537,0.7386458387134455,-0.3538063001434718,0.2541815603032372,-0.2858642936909942,-0.2858439462436673,-0.2858439508965577,-0.2858830277484946,0.8242086543654097,-0.3838309787122294,-0.3819169859347547,-0.3817017044284493,-0.3873579769800114\ngdb_89684,CC1CC2(OC12C)C=O,-0.0038964884616714,-0.0
711249061043953,-0.0688023036215463,0.2017536211468122,0.3870034195627478,-0.6562295321176449,-1.0231740225471693,-0.7049906408865901,0.0563669196398106,0.2336249055690875,-0.2863457437426206,-0.2863275956305117,-0.2863276002834051,-0.2863518408468271,0.6454997050417723,-0.4344038568990414,-0.4322739404390627,-0.4317034877716246,-0.4408769015413351\ngdb_123872,c1c(cn[nH]1)N2CCC2,-0.0026436600478089,-0.2234299790472904,-0.2070621990628701,-0.044522828523825,0.5140217778887445,2.511210947707349,0.4256051399835471,-0.7491835943437338,0.305429345814178,0.0698027930605403,0.3351091940838723,0.3350926453154076,0.3350926406663544,0.335117720723142,-0.6209707416112791,0.0140000811368812,0.011904287578473,0.0117807164483509,0.020518293145964\ngdb_38718,C1C2CC3COCC12O3,-0.0039360802601536,0.3962209561411567,0.4343658714437626,-1.1693965565896307,-0.1259553352153123,0.1525748129446574,1.616586833887474,1.5257012955216274,-1.0107456689569512,0.4118126335205059,-0.2859591258964244,-0.2860049796472566,-0.2860049843001479,-0.285885374185385,-0.8082951361502434,-0.3937924223912967,-0.3986835775996474,-0.3983500401369466,-0.3876258423177862\ngdb_119512,OCCC12CC1C1OC21,-0.0031846705894911,-0.3034908498256632,-0.2366528104320787,-1.1859282471618438,-0.0709954686319484,0.5276069952919821,1.0711640903464987,0.8123007611420187,0.6409932183189546,0.2834237021984522,-0.2843903381147854,-0.2843903021278447,-0.2843903067807261,-0.2843923913084741,0.216696688896075,-0.2290025067744512,-0.2305654230109162,-0.2314176347820376,-0.2171893970803536\ngdb_123280,C#CC#Cc1conn1,-4.2245669511377713e-05,-0.461390081404965,-0.4503130885140665,0.4025515860812425,1.4104782683818289,-0.3354188701096928,-1.5749884388640154,-1.3994513380702809,1.465985472649343,-2.522770044134445,-0.0695074502364626,-0.0695332994151751,-0.0695333040667288,-0.0694847077413291,-1.1659591903750992,2.0324340011046647,2.0376157706078963,2.0402762897324744,2.002359889516629\ngdb_100457,[NH3+]CC(O)C#CC([O-])=O,-0.0031500305290659,-0
.489575548846184,-0.4937497694227844,12.288314366298607,0.6630240828480854,3.568530594325108,-1.2852326063578725,-2.931473724584603,2.3035275392858803,-0.773320586927498,-1.5820974272990955,-1.5820694851987551,-1.5820694898596557,-1.5821747339693544,0.5411297401475268,0.9971859192150636,0.9995321695370518,0.9973203824239416,1.0163931752640656\ngdb_108144,OCC12CC3CCC1C23,-0.003556780440887,-0.0988368273856476,0.0311504370028223,-0.7347372455763436,0.7509598693814671,-0.2947527298551631,1.2224336793754411,1.3447206289828475,-0.1113592539671086,1.089220524906061,0.6109202886873685,0.6109123115027287,0.6109123318179288,0.6109323659027436,0.0653110086839022,-1.1491341631889105,-1.1513140712505356,-1.1505471902672335,-1.1546838837196285\ngdb_31910,COc1ncc(o1)C#N,-0.0018282148985014,-0.2976567611348733,-0.2769223284336051,1.4734522848480445,-0.53632233903776,-0.3625296302793784,-1.197879745087638,-1.0143413150865976,0.3995391761753605,-1.7557122973805168,-1.029446836618215,-1.0294692943880668,-1.0294692990455525,-1.0294206516283138,-1.1140203635055577,1.4108860649147563,1.4150738595036711,1.4172476174386142,1.3877628194091771\ngdb_23802,c1c(cnoc1=O)F,-0.0031964420203631,0.2976854711752823,0.1423571023089414,0.7946728316300606,-2.793340860061223,-1.853621439612116,-2.243983522738493,-1.3531539582580343,-1.34131735488086,-2.6721063268449523,-1.173636521515054,-1.173705373793115,-1.1737053784514917,-1.173545944611002,-2.392798589037648,2.751522275121282,2.7502180084490897,2.7502887095137094,2.7490601652475206\ngdb_122490,CCCCC1NC1C=O,-0.0030906649373162,-0.5206401769400008,-0.5354887779950829,0.3555702203839255,1.2431560078947008,-0.3263819500531305,-0.7781598994721215,-0.6166047339723021,3.21362574399854,1.3240592677315597,0.1798213952863481,0.1798701736579332,0.1798701690079208,0.1797371690662808,1.279596472889831,-1.2345078522473851,-1.2334602340408924,-1.234555347757487,-1.2273266836198344\ngdb_32675,CCC1CC2=NC=CN12,-0.0035063511889119,-0.078442772763081,-0.12254367306691
96,0.8830160950831523,0.8059197359648309,1.2008575395058536,0.6599664750988101,0.0925869476971031,0.0874903397741482,0.3825103844477057,0.7354169765173751,0.7354124754561212,0.7354124957720906,0.7354097945436816,-0.2797991110843348,-0.5114575510130729,-0.5115897175026993,-0.5104572643125203,-0.5227810030135726\ngdb_17890,CC1C2CC1OCO2,-0.0033028988620942,0.3335739561519599,0.5588618427812103,-0.7225835125864951,-1.0456171027102623,-0.0778666484976742,1.665589658502484,1.6814288457991826,-1.270163504357133,0.2118961608720264,0.6647064496215547,0.6646689361078214,0.6646689564233537,0.664768539641989,-1.080050893799411,0.2279390623799958,0.2215919456687209,0.2199894326562255,0.2466536308532144\ngdb_126580,CCCCc1cnn[nH]1,-0.0024309121591476,-0.4755712104087684,-0.452439742569904,1.5346130057001843,0.7961490930166759,-0.3760850103642212,-0.1581676402126532,0.0189320252685298,2.257944436238092,0.7115671281380536,0.3045913716093911,0.3046115257597479,0.3046115211105063,0.3045319657777269,0.2964510960322434,-0.5681242420741092,-0.5686236792907017,-0.5695303389301141,-0.5591935911635425\ngdb_13495,CN(C)C1CNC1=O,-0.0036997950362418,0.1808521625882941,0.2744378355365224,0.1041317013251269,-0.7818097431101162,1.6255927821642704,0.4426496007192025,-0.3219850442580089,-0.8326015318744913,0.1537424665583154,0.7594021605794303,0.759409622834547,0.7594096181881159,0.7593979181186742,-0.1645983007765353,0.3587908454533416,0.3559248072711535,0.3533722543484815,0.3830597796690372\ngdb_102044,COC(=O)C(CO)C=O,-0.0040604926788345,-0.1851800772195788,-0.2887695257060396,-0.4013590428117928,-1.0578304063954538,-0.669784912202489,-0.7952043602077771,-0.4735037418253598,0.6936971397025207,-0.4914479776415559,-2.111409060855053,-2.111371080602189,-2.1113710852663616,-2.111501516114362,0.7225464008245533,0.411521730522057,0.4166832343914229,0.4161435253801254,0.407538112131273\ngdb_83997,CC12CCCOC1C2O,-0.0042336377160269,0.1651493719237977,0.2575341303373324,-0.5351807910010911,0.308838275977519
6,0.468867014924329,1.4930144935539718,1.25633472206856,-0.4221512861281428,1.0534567542428486,-0.3163236541951893,-0.3163247912112887,-0.3163247709018185,-0.3162956456058935,0.5494990297852727,-0.9598142908429872,-0.9624163630486992,-0.962981790960868,-0.9473048189187244\ngdb_87729,CC1CC2(CC2O)C=C1,-0.0036870067305902,-0.1725269238252682,-0.1494234850602321,-0.928478204258604,1.3567397321669843,0.9839714581483648,0.3169467027937434,-0.1452132304294333,0.4265933530019068,1.014928053410716,0.6104898618070767,0.6105058213548393,0.6105058167074879,0.6104698682143399,0.7959007629436227,-1.1943474228232749,-1.1936373046521445,-1.1925744956472033,-1.207481851574414\ngdb_52295,O=CNCC1OCC1=O,-0.0032954380960486,-0.3532384524173667,-0.3631385267659753,0.8327675968616822,-0.8550895652212686,-0.7827464129095121,-1.1850963995358963,-0.8060031059314905,1.0355032114240874,-0.7705256031697849,-1.5849183546586687,-1.5849164139869245,-1.5849164186478424,-1.5849546126703131,-0.2571527979469036,0.7008677128994198,0.7031135916119272,0.7029924656927069,0.6990468404129542\ngdb_79086,CN1C2C3C=CC2(O)C13,-0.0036985128897806,0.2523513187944688,0.184269529023132,-0.5143364854969963,0.1586146406496587,1.7882573431823867,-0.3350039203450789,-1.163755586298846,-0.7789664458021697,-0.048007275006513,0.2418730603525544,0.2418531064352477,0.2418531017856183,0.2419200431811718,-0.122013385855062,0.0364664273964912,0.0338690164821051,0.0335905265660127,0.0475527448425342\ngdb_42846,O=C1C2CC1N1CCC21,-0.003932985423868,0.2959996468458009,0.4516803038211189,-0.5083249616525553,-0.0917580848967748,1.4674466811744349,-0.4926651821498922,-1.1700688653641522,-0.997149234689697,0.0473227086436639,0.2403709222125301,0.2403303909463465,0.2403303862967077,0.2404248636399363,-0.9727270619741962,-0.1213224209341205,-0.1246741693116887,-0.1238344429260465,-0.1231344679456117\ngdb_106205,OCC12C3CC4C1N4C23,-0.0035432239526575,-0.0008380280677355,0.2021681067205465,-1.2069685806173878,-0.2041204788005405,0.6496054160
555699,0.9198945013175563,0.6060669783420131,-0.6155121912721476,0.0233699983759487,0.2411621807668124,0.2411331615595852,0.2411331569099513,0.2411961923625013,-0.6926020146872827,-0.0382063794163983,-0.0410907215747063,-0.0408405151572731,-0.035080862230454\ngdb_17337,CC12OC1CC1NC21,-0.0034522799778079,0.495791928710158,0.5914461989157184,0.3010744498810569,-0.9882145753898608,0.1164271327184094,1.318308771013503,1.2479170166481517,-1.3841788118863512,-0.1630824664934381,1.191148928436815,1.1911124195011344,1.1911124148573713,1.191207382178595,-1.12140503083298,0.5122191039342071,0.5068579201476535,0.5056796219634461,0.5258263054409514\ngdb_68719,CC12NC1(C#C)C1NC21,-0.0040675389578776,0.153323345822229,0.1833750565017926,-0.0588328689796142,0.5726456355776643,0.1028717526335666,0.4554329462709442,0.4019376218971104,-0.5812328198759976,-0.3999949069456269,0.7684382294256951,0.7684337839978574,0.768433779351482,0.7684556138342794,0.0522647630721213,0.3336342014278178,0.3334194720942344,0.3334670282879525,0.3377677405344754\ngdb_124092,C1C(CO1)n2nnnn2,-0.002518435234603,-0.2125321034192563,-0.0942765159384682,0.5676724638519249,-1.5109439731160734,-1.8039183793010227,-1.438632752978772,-0.5808294859355664,-0.2226811542296189,-1.391673176335564,-1.3618383846212356,-1.3618809063833646,-1.3618808860803553,-1.3618100619436828,-1.7424555530692585,1.8777576122764836,1.8722656240990148,1.868778560262296,1.9282979732524608\ngdb_8488,CC1OC2(C)CC12O,-0.0038092693433513,0.364032445161019,0.3521291631042904,-0.4855203766339686,-0.8184496541656933,0.7625669167625955,1.3438754621169864,0.9722371641297776,-0.9691245733602818,0.0847694802792628,0.6649749203689606,0.6649784217903025,0.6649784171432876,0.6649964685276121,0.113311346312152,0.2561399907196345,0.2538151984142382,0.2519828317839811,0.2726736138241944\ngdb_39036,C1N2CC=C3COCC123,-0.0036544501581644,0.1863579259415071,0.1484815008989288,-0.662664302093533,0.2526570790256376,0.8438991972716536,-0.1943871192759211,-0.585038338
6457707,-0.7094443675641126,0.0444075105307798,0.2403013020438161,0.2402662122328719,0.2402662325457816,0.2403494781143007,-0.8816494982693127,-0.1286355208072881,-0.1313563747326221,-0.1304669375423509,-0.1317403543299221\ngdb_119342,COCC12C3CC1CC23,-0.0031702685477764,-0.2552080833554568,-0.1397485781967648,-0.9599080191409536,1.036751175614958,0.3062024539062127,1.5782367972322493,1.418375551411421,0.5376804067383093,1.0337716535836856,0.6120286447687828,0.6120337040913844,0.6120336994440411,0.6120090309662275,0.133249948096195,-1.0327092985467117,-1.034556111653684,-1.0346153135537532,-1.031773588518147\ngdb_86475,CC1CC(O1)(C#N)C#N,-0.0041265011156116,0.0081277792276343,-0.0878782992297036,1.6781708088114582,-0.5448716516173936,-3.041976427050024,-1.1808352843519825,0.2504189243297601,-0.1087945297284057,-1.121281037968402,-0.1313468349460019,-0.1313418188626398,-0.1313418235145754,-0.1313610722968546,-0.1653367675092777,0.7837582518753254,0.7884462125294708,0.7901620262584151,0.762471083209471\ngdb_35439,C#CC12CC1OC2C#N,-0.0040056367056724,0.0085634416948035,-0.0156907158493608,0.9026188902280684,-0.1931285054838674,-0.7782279528812316,-0.4841429517820644,-0.1157512614580038,-0.3729235937649793,-1.4769953149178,0.3012658290793086,0.3012529645673084,0.301252959918046,0.3012828496228257,-0.5111853540102562,1.028142698385933,1.0322518443893227,1.034686876412476,1.0004979321352083\ngdb_93081,OC1CC=CCC(=O)C1,-0.0041866348899394,0.0717344994343635,-0.116957783443861,0.3905285383923604,0.2587637308682332,0.2971655338496504,-0.5672346978683849,-0.6965729354661815,0.002025149272031,0.3346350175010686,-0.2872516296955167,-0.2872505609171779,-0.2872505655700769,-0.2872449346667591,0.2792202056015896,-0.5295606853007754,-0.5283719035788953,-0.5271236651810397,-0.5428310085087503\ngdb_79461,CC12CC3OC1C3C2O,-0.0040549164470271,0.2777586487638829,0.3017557565198788,-1.342881293622144,-0.2199977735912894,0.1796855731143443,1.118036357369551,1.020638970297125,-0.781669504295977
8,0.3441920587371208,-0.2855630472958564,-0.2855861330376115,-0.2855861376905003,-0.2855138633317306,-0.1793676354313811,-0.3521872034260092,-0.3550738052313476,-0.3550478505773997,-0.3452147803803936\ngdb_93435,CC1CCC(=O)NC1=O,-0.0038385708111839,-0.0330644227406704,-0.1532934478056191,0.2384108480678064,-0.1894645143783097,-0.8821525335316961,-0.7568543235525522,-0.3367160287437237,0.0478809888486176,0.0192226031225677,-0.6899208563600451,-0.6899284033613777,-0.6899284080167652,-0.6899059362251649,-0.0700745589855203,-0.2521558491494114,-0.2503911184410518,-0.248664701522046,-0.2699645678842072\ngdb_63570,CC1(O)CC11COC1=N,-0.0039739477927052,0.0557033834322904,0.0453342155554906,-0.0197579639907468,-0.1137420315301193,-0.1230512487804844,0.0932381556382651,0.151510885639961,-0.3428278505904013,-0.0601489248787301,-0.6871426396884139,-0.6871429822939416,-0.6871429869493118,-0.6871215643471813,0.3501130119448517,0.0396759054331006,0.0396233544058491,0.0393042787374132,0.0478947005928857\ngdb_66382,CC1(CCOCO1)C=O,-0.0041078215680307,0.0364395256448443,0.1621267704846681,-1.200107602316667,-0.4947971065081072,-0.4257880706753132,-1.0061295618115138,-0.7954809741559801,-0.3910614665622292,0.3011252659973021,-1.2139839429696275,-1.2139867151978962,-1.213986719856522,-1.2139625462274928,0.2302352456630172,-0.2864977885701198,-0.2873160863193846,-0.2877680316489916,-0.2771855334791384\ngdb_7241,NC(=O)C(=N)OC=O,-0.0032678885266135,0.0905058692154441,0.0041610977619989,1.1496794517045026,-2.4721309731406755,-1.862658359668677,-1.6239912634790252,-0.7344526098580191,-0.8257627438417572,-1.978619765455347,-1.006156256648893,-1.0061767399176684,-1.00617674457501,-1.0061327647305178,-1.0704508262737618,2.208171278490384,2.2092270734217525,2.208235828494477,2.2136657461840725\ngdb_111200,COC1C2CC1C2C=O,-0.0039172072853048,-0.0858490347049604,-0.1023358959011492,-0.1670402981795547,0.2392224449719266,-0.2134204493461049,-0.8569905303745281,-0.7470791679886314,0.1185554950930273,0
.2923195644810661,-0.2851328450772787,-0.2851309908817236,-0.2851309705720621,-0.285141179259631,-0.005581797659359,-0.3069975428948581,-0.3076849870910104,-0.3079906883852982,-0.3026697857745668\ngdb_11896,CC1=NC2(CC2)CO1,-0.0027621867510538,0.0982593983412992,0.1813122933403362,-1.095755389495227,-0.4996824279821847,0.7580484567343138,0.3808634305524515,0.0210364516236316,-0.810913079862881,-0.1800326905724736,1.189882635170943,1.189865490253568,1.1898654856097968,1.189903811675366,-0.8594954962870427,0.3792040697329634,0.3770292444625343,0.3767666371098833,0.377012862150315\ngdb_76733,CC12OC1(C)C(O)C2O,-0.0043058689843353,0.254011887328752,0.0859779516526793,-0.5835343523585524,-0.4361732488191856,0.0622056123790369,0.8133666217197094,0.7723166603950787,-0.351375978178463,0.2342259773449404,-1.2129047929109438,-1.212885067985498,-1.212885072644117,-1.2128967895993417,1.0442717407226176,-0.1731408073456683,-0.1726139899997897,-0.1738749373268831,-0.1555205271351889\ngdb_32164,c1cc(nc(c1)N)CO,-0.0034595804755459,-0.1653479640401727,-0.2318518660828487,-1.0043410214694315,0.6080642162647216,1.521668201513806,-0.3925289753279162,-1.0964139429355788,0.3362703781926159,-0.3219457368512046,-0.1629228640150043,-0.1629255591701076,-0.1629255388596894,-0.1629174283658785,0.0259261162709793,0.0904769082843083,0.0931147856657811,0.0948598085448819,0.0719569868926466\ngdb_81474,CN1C2CC1C1CC2O1,-0.0039136703295499,0.3847990227627595,0.4852869142657289,-0.819682691204316,0.2783050167645398,2.023217264652999,0.9667667683406088,0.012618746203223,-0.9851238057226662,0.757338743869207,0.2118000930328426,0.211764697700431,0.2117646930506156,0.211846710113322,-0.6094014294649829,-0.498929049329317,-0.5057704028630519,-0.5071203711067289,-0.4736619091063216\ngdb_40981,C1CC2(OCC=C2)C=C1,-0.0032707512501776,-0.1097789006552785,0.0932980226946611,-0.869931189425786,1.022095211192727,0.5185700752354198,-0.098512027637859,-0.3388204550988261,-0.2759055269708928,0.368595572836724,0.63997351
64903613,0.6399557629739243,0.6399557583267549,0.6399900662283615,-0.4299540134086032,-0.7208550159484718,-0.7204742633894979,-0.7178711134613202,-0.7494121765588334\ngdb_37839,C1C2C3OC4=NCC13C24,-0.0036442316719286,0.5799884359522403,0.5476261726407119,0.7279579854868606,-0.3286961763894974,1.101451418883671,-1.364063237260279,-1.8603207098376384,-1.411835296774531,-0.6902524675047085,0.2736190335264698,0.2735524982322254,0.2735525185453408,0.2737106435457461,-1.579008249555628,0.7475991815076,0.7412419282611241,0.74085446122415,0.7648077808368757\ngdb_20443,C1NC1C1=CC=NO1,-0.0019634924031421,0.036471095388842,0.1556281537990183,1.0035732852243902,-1.2227100061455436,-0.2495681295723528,-0.5267541036212031,-0.4040576721069908,-0.9742565240076768,-1.1836722883018718,0.8202610966096663,0.8202155963891382,0.8202155917430828,0.8203036565884979,-1.9268260806772533,1.5044774861362484,1.500710571450534,1.499838950203439,1.5121435243371717\ngdb_58773,CC(C)C12CC1(C)CC2,-0.0041128562034881,-0.1063314846107208,0.0290329102176107,-1.570273717303178,2.17991640054892,0.2067963332840299,1.456795014490704,1.340511776272644,0.3930771434523955,2.3833280583160894,1.477710110984664,1.4777604650057234,1.4777604603637315,1.4776632684298563,1.887600749514328,-2.4416177143501248,-2.441577503273509,-2.4414680704229865,-2.444498229578162\ngdb_120605,CC1(OC1CCO)C=O,-0.0029350665165645,-0.4454536746348851,-0.4149084057561515,0.0870119268985661,-0.1088567100560436,-0.67882183225905,-0.9848239858919444,-0.6565888347192415,1.7544981832285114,0.2017981550377074,-1.2133400375368648,-1.2132971497443643,-1.213297129440437,-1.2134062160255576,1.0829181664027974,-0.2188601366378316,-0.2155194147721842,-0.2164751663663154,-0.2136758017450337\ngdb_111558,OCC1C2OC3C2C13O,-0.0039603802514889,0.1701247635778481,0.1329468862936255,-0.4593179737902634,-1.0590517367639736,1.042711438516018,0.9667667683406088,0.4671748389052751,-0.6752775938174663,-0.3944049394302007,-1.1814883485456302,-1.1815118618162832,-1.
1815118664747084,-1.1814423038383055,-0.2477988859988343,0.5033773065964889,0.5007968921267996,0.4996639235390863,0.5233642240384176\ngdb_44747,O=C1CC2CC(O2)C1=O,-0.0039545221685197,0.2724991294138526,0.1627656794284819,1.259847160418935,-1.2508006046214857,0.0125025520679462,-2.1076278368532497,-2.087598756188664,-0.864779437932755,-1.0652611484589662,-1.153953434526019,-1.1539942211900651,-1.1539942258483205,-1.1539024486358025,-1.1184511639020116,0.7721684700753667,0.7727712296848012,0.7745976001767505,0.7553612532329504\ngdb_93811,CC1OCC(O)CC1=O,-0.0040921815918017,0.0678387930250374,0.0121748413715502,0.074074082102921,-0.4410585702932632,0.0486502322941941,-0.8271627240871311,-0.8396739276131238,-0.2228620253745366,0.3057535186713654,-1.2142562080037476,-1.214255137487275,-1.2142551171833535,-1.21424541668329,0.4015595276592313,-0.315097279542225,-0.3152638711852835,-0.31551611807446,-0.3094775548373683\ngdb_11639,CC1C2CC1(O2)C#C,-0.0036470336040657,0.3075983707905856,0.3235334242333061,-0.4043648047340133,-0.3812133822358234,0.0396133122376318,0.0825853676784804,0.0631249787256735,-1.055654973371993,-0.5907750886011506,1.6236512086736528,1.6236347116887917,1.62363473201025,1.6236800142052592,-0.6699557015498515,0.7450085243992803,0.7431158567354255,0.7427125920115971,0.7442989847095205\ngdb_75784,CC1(NC(=O)C1O)C=O,-0.0040583870848617,-0.0733221602866408,0.0675865013414655,0.1911681117685572,-1.122560915926971,-0.6833402922873305,-1.1020046534495758,-0.7702278578947548,-0.3021089660011871,-0.8325562104377434,-1.5850766662793156,-1.5850572526883724,-1.5850572573492911,-1.5850964472917255,0.4690061559163619,0.6842382115016848,0.6884496464602721,0.6884319463832942,0.682855235633792\ngdb_96266,CC1CC(=O)NCCO1,-0.004196654422414,0.0769308792963984,-0.1384159966853816,0.1790143787787087,0.1305240421737186,0.12094559274669,0.3915162185122361,0.330387125823639,0.1354022481759648,0.7121381463251137,-0.7185079111079178,-0.7185048804284602,-0.7185048601214754,-0.718499965
643934,0.4296212635034381,-0.6314668572215513,-0.632610149635658,-0.6330629274375816,-0.6223043745222367\ngdb_75178,CC1=CCC(NC1)C#N,-0.002896403168968,-0.2813667732320182,-0.2752246560971854,1.3051296172036926,0.9146181387630388,-0.0869035685542365,-0.0431175302469787,-0.0021122382824911,0.6968927682313019,0.3695873412668805,0.7351369481729394,0.7351465993467394,0.7351465947001582,0.7351295201457216,0.2715893826965861,-0.5408725221073309,-0.5392723973698165,-0.53794727701792,-0.5547766627214977\ngdb_78665,CC1C2C3C1C(O)C23C,-0.0039720466789869,0.0817231664352615,0.0673400650345659,-0.9516095025296056,0.9109541476574812,0.0215394721245085,1.458925572082661,1.431002109542033,-0.1962421537675049,1.004499458099678,0.6114420779977409,0.6114524012128579,0.6114523965655124,0.6114536742859815,0.724023334290038,-1.0943239349139973,-1.0950806225977774,-1.0947129408416476,-1.0951721846338418\ngdb_58062,CC(C)C(C)(C#C)C#C,-0.0044439097356596,0.0178512603789507,0.1165816900613642,-1.388621148960283,1.7988613255709318,-0.9092632937013816,0.4703468494146428,0.8880601099256938,0.1124462683385091,0.8773126703293291,1.5389808768978124,1.539050861786176,1.539050882107112,1.5389220208412495,2.3774503489000565,-1.2526712953539936,-1.2463542134630288,-1.2449170076347111,-1.275114999731057\ngdb_133478,c1conc1C(F)(F)F,-0.0034434541679006,-0.1237390414510972,-0.0457103089379874,0.4671101247585018,-3.1914945601984788,-2.2241351619311587,-1.0487407136506526,0.0020966144277126,-0.5621140964430681,-2.573801037904307,-4.280917251551366,-4.28097724747021,-4.280977252147789,-4.280859279808232,-1.6644242349761549,2.474885720886061,2.4750234248710186,2.474598876874631,2.4819984234852\ngdb_61256,CC(CC#C)C1CCC1,-0.0033597443728686,-0.3699704167361907,-0.3726308882169244,-1.2478730798197808,2.071217997750712,-0.7465987326832654,0.9774195563003936,1.3131542336563162,1.5917098633684132,1.6767381322129036,1.5083346727353806,1.5083830722891065,1.508383067647304,1.5082704158903275,1.5545522530475482,-1.848281
1949901736,-1.8463180154689856,-1.845529396567784,-1.8623442112858\ngdb_7017,C(C#N)C(=N)NC=O,-0.0013455751797628,-0.2511860979701396,-0.2126480886859287,2.7603757861118283,-1.7368900912921237,-1.492144637349634,-1.099874095857619,-0.3914311139763779,-0.0438367670031012,-1.597209670088262,0.4174744217939013,0.4174654624030384,0.4174654577544942,0.4174695678662647,-1.0702046706961807,1.7695452355525614,1.7711644538664963,1.7708240989177793,1.7652506218538817\ngdb_125589,Cc1nc([nH]n1)N2CC2,-0.0030572959703654,-0.1867396225730691,-0.1580031194485899,-0.5540648169906942,0.0560228896940481,-0.0778666484976742,0.1869826896843703,0.2209569553583306,0.2058672870470112,-0.320713539710707,-0.0661611140166924,-0.0661684726229372,-0.0661684772744701,-0.0661867906915226,-0.4855851739418565,0.4383512887535726,0.4373926131522144,0.4367068394282861,0.4482393953173603\ngdb_73368,CC12CC3OC1C3CO2,-0.0040405475642725,0.4639380570163972,0.4273378730618101,-1.0788969856705986,-0.2529736935413071,0.4417562547546433,1.618717391479431,1.3931224351501963,-0.9926789292779592,0.3870184227665975,-0.2862382056691828,-0.2862803165627075,-0.286280321215602,-0.2861678202855738,-0.6881712142908287,-0.4231077528275436,-0.4273513044731694,-0.4268155715057566,-0.4198694199447166\ngdb_125187,Cc1c2c(c[nH]1)[nH]nn2,-0.0035491704595205,0.1489477793041365,-0.009840135378151,1.5411472707484903,0.1219747295940834,2.2265479659256453,-0.5714958130522987,-1.6014762681600814,-0.5915935908946466,-1.000645932554843,-0.0361802829618509,-0.0362053509215435,-0.0362053306103423,-0.0361490036463577,-0.8565416293560743,0.964068511039292,0.9639873070961285,0.9644675978074034,0.9653961743620408\ngdb_31107,NC1=C(C[NH3+])C=C([O-])O1,-0.0034291129176129,-0.4107585259813238,-0.4540643967413165,3.20647006095965,0.710655967220334,2.127141845303464,-0.7632459963284229,-1.7445772603070235,1.90130550082186,-0.5355666459891164,-1.089350261419326,-1.089273645664096,-1.0892736503219511,-1.089487389137163,2.015601649856328,0.36556903232
07559,0.3745757236852388,0.374333001925768,0.3544580307833552\ngdb_33883,N#CC12CC1C1CN2C1,-0.0035672255133509,0.1157742923112682,0.2189805392134775,1.610606508211978,0.1439586762274296,-0.3173450299965682,0.2615522054028631,0.4061464746073141,-0.8141426664129408,-0.284739393925946,0.7671182420103424,0.7670766201339126,0.7670766154875289,0.7671717883478656,-1.0374659788779388,0.1949789815585632,0.1921133093850617,0.1931575079380468,0.1912083555653629\ngdb_90478,OC1CC2C(C=O)N2C1,-0.0041325470993552,0.2636027755552779,0.2492374413383778,0.1630707720608431,-0.337245488969131,0.1977594132274676,-1.3150604126452694,-1.391033632649872,-0.7489139544231153,0.0093950795873826,-0.6871122604322918,-0.6871347196902227,-0.6871347243455929,-0.6870689692138992,-0.346261117031142,0.0428670286120731,0.0404836461193434,0.0401585027521978,0.0538988736423607\ngdb_72516,CC12CCC(CO1)C2=O,-0.0042124436139639,0.2877283739183821,0.473375826098913,-0.2217320966338726,-0.1198486833727167,0.6405684959990077,-0.66097923191449,-0.9491040980784328,-0.796390127793994,0.3541698502162692,-0.287356396941157,-0.2873707555906053,-0.2873707602435051,-0.2873228912881371,-0.1613982782679849,-0.5405657337659743,-0.5408864189825733,-0.539549914822505,-0.5517304069121127\ngdb_117811,CCC(C)(CCO)C=O,-0.0040727338616427,-0.2897516972378289,-0.2552359334264367,-0.1124791850262031,0.8169117092815024,-0.4709726709581235,-0.8101182633514755,-0.5808294859355664,1.161862438414648,1.6144370426458117,-0.3467708080796274,-0.3466984719560597,-0.346698476609326,-0.3468569626607142,2.2376339808341807,-1.534515385032062,-1.5317579385787994,-1.53318793414672,-1.5242269142311562\ngdb_105686,CCC1(CC1)CNC=O,-0.0040768068872543,-0.165032266600195,-0.2053553994558244,0.5204297275526754,1.013545898613092,-0.4935649710995286,0.3915162185122361,0.6186935364726266,0.6646751849060702,1.3625579149749003,0.179074869375428,0.1791140081236732,0.179114028436205,0.1790237773652404,1.2943658075446776,-1.3129250501026355,-1.3121912206655382,-1
.312728458427716,-1.3087662951969363\ngdb_133761,CCC(NC)C(F)(F)F,-0.0044020354954168,-0.0265736833747267,-0.1527458115680643,0.0813924589570233,-1.27400588162335,0.324276294019336,1.2565226008467518,1.0900850400154944,0.3509037831938739,0.179227909854453,-3.474033025785078,-3.474010329894159,-3.4740103345667537,-3.474042346771132,1.103841390497163,0.1368570124660305,0.1362567256821433,0.1328172913524898,0.1763190322687905\ngdb_110046,OCC1C(O)CCC1=O,-0.0041033948468436,0.0284776362086039,-0.0672050312620115,1.424902695539134,-0.4996824279821847,-0.2811973497703203,-0.7675071115123369,-0.6250224393927101,-0.0734860515610743,0.3208103666564658,-1.2143543851724652,-1.214351767514448,-1.214351772173076,-1.2143443414641155,0.4515291099081279,-0.3254100876464964,-0.3253248658116669,-0.3255087325313986,-0.320770643492739\ngdb_37230,C1C2C1N1C2C11COC1,-0.0031724736186299,-0.0282721356018072,0.0722231548194291,-0.1972939453532095,-0.0600034953152752,0.6676792561686933,0.894327810214073,0.5723961566603798,-0.5039919046747972,-0.0095987885295502,0.2426393065694206,0.2426075495536269,0.2426075449040022,0.2426830345641451,-0.7300176624795588,0.1169551020897839,0.1124206673720005,0.1115881471063169,0.1346545737192121\ngdb_36623,N#CCOC12CC1CC2,-0.002561066604437,-0.3715489039360797,-0.3149282833199051,1.1404661379863916,0.3051742848719619,-1.1848893554265236,0.0037547367760738,0.5534563194644607,0.990300080761768,-0.0448516481832885,0.2403747664240693,0.2403782441527762,0.2403782395031377,0.2403506762948404,-0.0525975129772854,-0.1209186140564948,-0.1196917547679597,-0.1188871697044677,-0.1316035720293266\ngdb_53403,CC#CC1OCC1C#N,-0.0041941564474121,-0.1159350007348458,-0.2389437553591828,-0.4393884653929314,0.3063956152404817,-1.0086694143235633,-0.1155564883735145,0.3556402420848648,0.5487911138479402,-0.7721484969645861,0.2703767406423751,0.2704014257470602,0.2704014210976073,0.2702993486623507,0.0089413814179065,0.4070195460534493,0.4131562982742424,0.4150802616521635,0.37538002
34423848\ngdb_111470,CCN1C2CC2CC1=O,-0.0041538683107656,0.1612789213096698,-0.0135275527110196,0.5518595424350254,0.52012842973134,0.672197716196975,0.4596940614548581,0.1409887538644505,-0.1447221162415815,0.7100343951096303,0.2086163120438591,0.2086189422382018,0.208618937588367,0.2086076536825387,0.1876503307415439,-0.8333624294572397,-0.8333024312339634,-0.83234229291689,-0.8434272108555788\ngdb_95768,CC1COCC(C)=C1C,-0.0043355296742367,0.1657176273157579,-0.1361341790289034,-0.8492829118731402,1.7243601730912614,0.7219007765080658,0.3680800850007099,0.0273497306889384,0.3728478139972152,1.704567755434866,0.5799558387268257,0.5799938980137153,0.5799938933661754,0.5799249013314713,1.3817510375858495,-1.778173503587795,-1.7773725221970484,-1.7770701807363771,-1.78253743841572\ngdb_36221,C#CCC1=CCC2NC12,-0.0034570659210638,-0.2084217227507452,-0.1532751932643672,-1.0253813549249755,0.9854553001371518,0.373979354330428,0.0421047734312987,-0.1304822459437186,0.2865136556032116,-0.0169919713725335,1.1678397015452873,1.1678380876915058,1.1678380830475987,1.1678325023811826,-0.0169049542280741,-0.2870248352085528,-0.2853979736781363,-0.2834246509329226,-0.3100075862508676\ngdb_127743,CN1N=NC(N)=NC1=N,-0.0035654680884602,0.0580458584369256,-0.0746985204458858,-0.3058934304560485,-0.0441262005245263,0.4643485548960485,-1.602685687559456,-1.799292345539678,-0.3726748065775037,-1.042119885088652,-0.8677410767317036,-0.867745027293373,-0.8677450319498594,-0.8677571407653228,-0.3292763821780684,1.3879634693267722,1.3868323798837596,1.3843277336543396,1.4222547560946717\ngdb_122108,CCC(CCCO)C=O,-0.0037464717992206,-0.4745925483448371,-0.5108177654932411,0.791732412358323,1.0172098897186497,-0.2495681295723528,-0.6439347711788346,-0.519801121637606,2.888763652536624,1.616931490515599,-0.3468250763645991,-0.3467485468293503,-0.3467485265200681,-0.346949921500936,2.008955449261647,-1.540215879519396,-1.53697167022967,-1.5383623122423185,-1.53483894101753\ngdb_82395,CC1C2CC2CC=C1C,
-0.0041370843504097,0.1542767520909619,-0.0281859493362353,-1.5811859399338484,1.9686262467950997,0.6857530962818179,0.4767385221905136,0.151510885639961,0.0423444015687193,1.7650656296744007,1.50707571841881,1.507091310303537,1.5070913056617266,1.5070684411120452,0.8771321035452757,-1.980525325153111,-1.9808146183948672,-1.9790773854411472,-1.9995596553770008\ngdb_29171,OC1=COC(=N)C(O)=N1,-0.003711599626074,0.0546805237267621,-0.1072828765803938,-0.3448376501439495,-0.9283693873324194,1.5849266419097408,-1.4514160985305131,-2.169671384037646,-0.4343475775601354,-1.765239285027776,-1.9561917309482708,-1.9562288885022932,-1.956228893165506,-1.9561294765317545,-0.7991873797797552,1.6526274118650424,1.6536891513302276,1.654179938735777,1.6544113646398224\ngdb_86810,CC1CC(CC(C)=O)C1,-0.0030633585335892,-0.4292520820152239,-0.3949835739797844,-0.2164393419447449,1.4336835453836947,-0.0236451281583017,-0.3946595329198731,-0.3788045558457655,2.012234560396101,1.6489686161685266,0.5798821996096848,0.5799339130084717,0.5799339083609315,0.5797893072337044,1.4922748919196138,-1.7859087651966434,-1.783618084090307,-1.783271692239483,-1.7980166353821037\ngdb_95300,CC1CCC2CC1CO2,-0.0040329928478397,0.1411374246390854,0.2230513019126345,-0.7479364609739209,1.0428578274575535,0.0983532926052848,1.4035310746917806,1.340511776272644,-0.3861958182922324,1.83824611838442,0.5796740879759198,0.5796602485838024,0.5796602439362604,0.579709178910099,0.1876503307415439,-1.8077694011404404,-1.8121116733667308,-1.8115643142155715,-1.8071639517040168\ngdb_55334,NC(=O)N1CCC1C#C,-0.004139046255555,0.0601231475919796,-0.0845742272631232,0.8359693867353517,0.0377029341662595,-0.0010528280168966,0.4682162918226857,0.4629659861950713,-0.0931859933371575,-0.3922711346259245,-0.1609588463548245,-0.160942958641008,-0.1609429632931265,-0.1609815180827145,0.4091903505642348,0.2967828906987478,0.2995406126384596,0.2998271188235432,0.2929572890825678\ngdb_125907,CC1CC2=C(NN=C2)O1,-0.002959128868685,-0.08523
65816714035,-0.0780482287655958,0.0977934642282704,-0.1418326300060612,1.7792204231258244,0.6386608991792407,-0.1957194629518838,-0.2036478620291708,-0.2860617518328215,-0.1617507788942464,-0.1617769074779464,-0.1617769121300701,-0.1617209702104988,-0.7206637505314905,0.213596051871385,0.2127109281952112,0.21360985028599,0.2085426624764959\ngdb_42647,O=C1C2C3NC1C23C#C,-0.0037674780005091,0.0448433914970536,0.0851564972963472,0.7491290042433707,-0.0184782627856224,0.4417562547546433,-0.4841429517820644,-0.6839463773355692,-0.6416066071814933,-1.4368737738796582,0.3017981774902055,0.3017660198368725,0.3017660151876133,0.3018393545594239,-0.906018900449808,1.0840620845114597,1.085670501727493,1.0877287681824812,1.0640276112885378\ngdb_116621,CCN=C1OCC2CC12,-0.0030099891871427,-0.287516559362786,-0.2520596432486192,0.0742701100543703,1.0623991133538602,0.0893163725487238,0.3702106425926668,0.3240738467583334,0.8739963600512021,0.699185049555496,0.209321799748481,0.2093269799779924,0.209326975328162,0.2092969070381189,0.1554039500784632,-0.7592560011218911,-0.7595824489078615,-0.7591422628402879,-0.7647431926996091\ngdb_102867,OCC(CC=O)OC=N,-0.004435094978739,-0.1769403740361579,-0.3366238055976993,0.2522634899702142,-0.3909840251839768,-0.5974895517499906,-0.8591210879664851,-0.5703073541600558,1.0467865682629585,-0.1196850842769011,-1.6144238015281434,-1.6143893461136367,-1.614389350774737,-1.6144974013799307,0.7309156904622992,0.2250861930133316,0.2275568081502362,0.2259122245408023,0.2383982491134692\ngdb_64637,CC1(N=COC1=O)C#C,-0.004123074689724,0.1400324835991631,0.2725302359757067,-0.7335610778676486,-0.9308120480694588,-1.4062938968122942,-1.0018684466276,-0.3367160287437237,-0.7747456424836185,-1.480661852750499,-0.6277762312693537,-0.6277878320871376,-0.6277878367421411,-0.6277506202626931,-0.410015411624561,1.0285832149796683,1.033351250415404,1.0357785282501015,1.0017688676744707\ngdb_90844,OC1CC2C3N4CC13C24,-0.0036734281363872,0.2739702794841491,0.2796403797
932926,0.0826339693162014,-0.0123716109430268,0.8574545773564964,1.21604200659957,0.8017786293665082,-0.9377076844315232,0.0289599658913757,0.2424735311874094,0.2424305151561304,0.2424305105065045,0.2425359079786836,-0.904049655829162,0.0995415860404297,0.0939880727215448,0.0932855588499615,0.1178588471144281\ngdb_40590,C1CC1N1C2C3OC2C13,-0.0022587729437481,-0.2272688599174206,-0.0915474620213203,-0.8424219335724193,0.1317453725422367,0.5773100556030728,1.1201669149615083,0.8375538774032434,0.0284548559599001,-0.0422369859583308,0.2431767722748997,0.2431445189451307,0.2431445392580582,0.243212855021635,-0.7319869071002056,0.1734120233440613,0.1683292324473117,0.167104965855713,0.1951379970631566\ngdb_69805,CC12CC1C1CC1C=C2,-0.0037325616154155,0.0901207183386712,0.1117990002533859,-1.5166927439070723,1.5313899748652295,0.8484176572999341,0.2913800116902602,-0.1073335560375958,-0.3648927004589253,1.0850130224750956,1.5375430668950336,1.5375337878333484,1.537533783191726,1.5375680518690604,0.1760810185952485,-1.4037029337974063,-1.4043100097492114,-1.401761311255694,-1.429681848521818\n"
  },
  {
    "path": "graphium/data/README.md",
    "content": "<div align=\"center\">\n    <img src=\"../../docs/images/logo-title.png\" height=\"80px\">\n    <h3>The Graph Of LIfe Library.</h3>\n</div>\n\n\n## What is in this folder?\n\n- ✅ `datamodule.py`: loading from disk and processing into a dataloader\n- `collate.py`: defining the collate function to use for the dataloader\n- `dataset.py`: defining the dataset class\n- `normalization.py`: defining label normalization\n- `sdf2csv.py`: conversion function from SDF to CSV for the OGB PCQM dataset\n- `smiles_transform.py`: transforming SMILES strings into molecule objects\n- `utils.py`: utility functions for dataset loading"
  },
  {
    "path": "graphium/data/__init__.py",
    "content": "from .utils import load_micro_zinc\nfrom .utils import load_tiny_zinc\n\nfrom .collate import graphium_collate_fn\n\nfrom .datamodule import GraphOGBDataModule\nfrom .datamodule import MultitaskFromSmilesDataModule\nfrom .datamodule import ADMETBenchmarkDataModule\nfrom .datamodule import FakeDataModule\n\nfrom .dataset import SingleTaskDataset\nfrom .dataset import MultitaskDataset\nfrom .dataset import FakeDataset\n"
  },
  {
    "path": "graphium/data/collate.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom collections.abc import Mapping, Sequence\n\n# from pprint import pprint\nimport torch\nfrom numpy import ndarray\nfrom scipy.sparse import spmatrix\nfrom torch.utils.data.dataloader import default_collate\nfrom typing import Union, List, Optional, Dict, Type, Any, Iterable\nfrom torch_geometric.data import Data, Batch\n\nfrom graphium.features import GraphDict, to_dense_array\nfrom graphium.utils.packing import fast_packing, get_pack_sizes, node_to_pack_indices_mask\nfrom loguru import logger\nfrom graphium.data.utils import get_keys\n\n\ndef graphium_collate_fn(\n    elements: Union[List[Any], Dict[str, List[Any]]],\n    labels_size_dict: Optional[Dict[str, Any]] = None,\n    labels_dtype_dict: Optional[Dict[str, Any]] = None,\n    mask_nan: Union[str, float, Type[None]] = \"raise\",\n    do_not_collate_keys: List[str] = [],\n    batch_size_per_pack: Optional[int] = None,\n) -> Union[Any, Dict[str, Any]]:\n    \"\"\"This collate function is identical to the default\n    pytorch collate function but adds support for `pyg.data.Data` to batch graphs.\n\n    Besides pyg graph collate, other objects are processed the same way\n    as the original torch collate function. 
See https://pytorch.org/docs/stable/data.html#dataloader-collate-fn\n    for more details.\n\n    Note:\n        If graphium needs to manipulate other tricky-to-batch objects, support\n        for them should be added to this single collate function.\n\n    Parameters:\n\n        elements:\n            The elements to batch. See `torch.utils.data.dataloader.default_collate`.\n\n        labels_size_dict:\n            (Note): This is an attribute of the `MultitaskDataset`.\n            A dictionary of the form Dict[tasks, sizes] which has task names as keys\n            and the size of the label tensor as value. The size of the tensor corresponds to how many\n            labels/values there are to predict for that task.\n\n        labels_dtype_dict:\n            (Note): This is an attribute of the `MultitaskDataset`.\n            A dictionary of the form Dict[tasks, dtypes] which has task names as keys\n            and the dtype of the label tensor as value. This is necessary to ensure the missing labels are added with NaNs of the right dtype.\n\n        mask_nan:\n            How to deal with NaN/Inf when calling the function `make_pyg_graph`.\n            Some values become `Inf` when changing data type. This allows dealing\n            with that.\n\n            - \"raise\": DEFAULT. Raise an error when there is a nan or inf in the featurization\n            - \"warn\": Raise a warning when there is a nan or inf in the featurization\n            - \"None\": Don't do anything\n            - \"Floating value\": Replace nans or inf by the specified value\n\n        do_not_collate_keys:\n            Keys to ignore for the collate\n\n        batch_size_per_pack: The number of graphs to pack together.\n            This is useful for using packing with the Transformer.\n            If None, no packing is done.\n            Otherwise, indices are generated to map the nodes to the pack they belong to under the key `\"pack_from_node_idx\"`,\n            with an additional mask to indicate which nodes are from the same graph under the key `\"pack_attn_mask\"`.\n\n    Returns:\n        The batched elements. See `torch.utils.data.dataloader.default_collate`.\n    \"\"\"\n\n    elem = elements[0]\n    if isinstance(elem, Mapping):\n        batch = {}\n        for key in elem:\n            # Multitask setting: We have to pad the missing labels\n            if key == \"labels\":\n                labels = [d[key] for d in elements]\n                batch[key] = collate_labels(labels, labels_size_dict, labels_dtype_dict)\n\n            # If the features are a dictionary containing GraphDict elements,\n            # Convert to pyg graphs and use the pyg batching.\n            elif isinstance(elem[key], GraphDict):\n                pyg_graphs = [d[key].make_pyg_graph(mask_nan=mask_nan) for d in elements]\n                batch[key] = collage_pyg_graph(pyg_graphs)\n\n            # If a PyG Graph is provided, use the PyG batching\n            elif isinstance(elem[key], Data):\n                pyg_graphs = [d[key] for d in elements]\n                batch[key] = collage_pyg_graph(pyg_graphs, batch_size_per_pack=batch_size_per_pack)\n\n            # Ignore the collate for specific keys\n            elif key in do_not_collate_keys:\n                batch[key] = [d[key] for d in elements]\n            # Otherwise, use the default torch batching\n            else:\n                batch[key] = default_collate([d[key] for d in elements])\n        
return batch\n    elif isinstance(elements, Sequence) and isinstance(elem, Sequence):\n        temp_elements = [{ii: sub_elem for ii, sub_elem in enumerate(elem)} for elem in elements]\n        batch = graphium_collate_fn(temp_elements)\n        return list(batch.values())\n    elif isinstance(elements, Sequence) and not isinstance(elem, Sequence):\n        temp_elements = [{\"temp_key\": elem} for elem in elements]\n        batch = graphium_collate_fn(temp_elements)\n        return batch[\"temp_key\"]\n    else:\n        return default_collate(elements)\n\n\ndef collage_pyg_graph(pyg_graphs: Iterable[Union[Data, Dict]], batch_size_per_pack: Optional[int] = None):\n    \"\"\"\n    Function to collate pytorch geometric graphs.\n    Convert all numpy types to torch\n    Convert edge indices to int64\n\n    Parameters:\n        pyg_graphs: Iterable of PyG graphs\n        batch_size_per_pack: The number of graphs to pack together.\n            This is useful for using packing with the Transformer,\n    \"\"\"\n\n    # Calculate maximum number of nodes per graph in current batch\n    num_nodes_list = []\n    for pyg_graph in pyg_graphs:\n        num_nodes_list.append(pyg_graph[\"num_nodes\"])\n    max_num_nodes_per_graph = max(num_nodes_list)\n\n    pyg_batch = []\n    for pyg_graph in pyg_graphs:\n        for pyg_key in get_keys(pyg_graph):\n            tensor = pyg_graph[pyg_key]\n\n            # Convert numpy/scipy to Pytorch\n            if isinstance(tensor, (ndarray, spmatrix)):\n                tensor = torch.as_tensor(to_dense_array(tensor, tensor.dtype))\n\n            # pad nodepair-level positional encodings\n            if pyg_key.startswith(\"nodepair_\"):\n                pyg_graph[pyg_key] = pad_nodepairs(tensor, pyg_graph[\"num_nodes\"], max_num_nodes_per_graph)\n            else:\n                pyg_graph[pyg_key] = tensor\n\n        # Convert edge index to int64\n        pyg_graph.edge_index = pyg_graph.edge_index.to(torch.int64)\n        
pyg_batch.append(pyg_graph)\n\n    # Apply the packing at the mini-batch level. This is useful for using packing with the Transformer,\n    # especially when some graphs are much larger than others.\n    # CAREFUL!!! This changes the order of the graphs in the batch, without changing the order of the labels or other objects.\n    # For now, an error is raised instead.\n    if batch_size_per_pack is not None:\n        raise NotImplementedError(\n            \"Packing is not yet functional, as it changes the order of the graphs in the batch without changing the label order\"\n        )\n        num_nodes = [g.num_nodes for g in pyg_batch]\n        packed_graph_idx = fast_packing(num_nodes, batch_size_per_pack)\n\n        # Get the node to pack indices and the mask\n        pack_from_node_idx, pack_attn_mask = node_to_pack_indices_mask(packed_graph_idx, num_nodes)\n        for pyg_graph in pyg_batch:\n            pyg_graph.pack_from_node_idx = pack_from_node_idx\n            pyg_graph.pack_attn_mask = pack_attn_mask\n\n    return Batch.from_data_list(pyg_batch)\n\n\ndef pad_to_expected_label_size(labels: torch.Tensor, label_size: List[int]):\n    \"\"\"Determine the difference between the shape of ``labels`` and the expected shape ``label_size``,\n    and pad with ``torch.nan`` accordingly.\n    \"\"\"\n    if label_size == list(labels.shape):\n        return labels\n\n    missing_dims = len(label_size) - len(labels.shape)\n    for _ in range(missing_dims):\n        # `unsqueeze` is not in-place, so the result must be assigned back\n        labels = labels.unsqueeze(-1)\n\n    pad_sizes = [(0, expected - actual) for expected, actual in zip(label_size, labels.shape)]\n    pad_sizes = [item for before_after in pad_sizes for item in before_after]\n    pad_sizes.reverse()\n\n    if any([s < 0 for s in pad_sizes]):\n        logger.warning(f\"More labels available than expected. 
Will remove data to fit expected size.\")\n\n    return torch.nn.functional.pad(labels, pad_sizes, value=torch.nan)\n\n\ndef collate_pyg_graph_labels(pyg_labels: List[Data]):\n    \"\"\"\n    Function to collate pytorch geometric labels.\n    Convert all numpy types to torch\n\n    Parameters:\n        pyg_labels: Iterable of PyG label Data objects\n    \"\"\"\n    pyg_batch = []\n    for pyg_label in pyg_labels:\n        for pyg_key in set(get_keys(pyg_label)) - set([\"x\", \"edge_index\"]):\n            tensor = pyg_label[pyg_key]\n            # Convert numpy/scipy to Pytorch\n            if isinstance(tensor, (ndarray, spmatrix)):\n                tensor = torch.as_tensor(to_dense_array(tensor, tensor.dtype))\n\n            pyg_label[pyg_key] = tensor\n\n        pyg_batch.append(pyg_label)\n\n    return Batch.from_data_list(pyg_batch)\n\n\ndef get_expected_label_size(label_data: Data, task: str, label_size: List[int]):\n    \"\"\"Determines expected label size based on the specific graph properties\n    and the number of targets in the task-dataset.\n    \"\"\"\n    if task.startswith(\"graph_\"):\n        num_labels = 1\n    elif task.startswith(\"node_\"):\n        num_labels = label_data.x.size(0)\n    elif task.startswith(\"edge_\"):\n        num_labels = label_data.edge_index.size(1)\n    elif task.startswith(\"nodepair_\"):\n        raise NotImplementedError()\n    return [num_labels] + label_size\n\n\ndef collate_labels(\n    labels: List[Data],\n    labels_size_dict: Optional[Dict[str, Any]] = None,\n    labels_dtype_dict: Optional[Dict[str, Any]] = None,\n):\n    \"\"\"Collate labels for multitask learning.\n\n    Parameters:\n        labels: List of labels\n        labels_size_dict: Dict of the form Dict[tasks, sizes] which has task names as keys\n            and the size of the label tensor as value. 
The size of the tensor corresponds to how many\n            labels/values there are to predict for that task.\n        labels_dtype_dict:\n            (Note): This is an attribute of the `MultitaskDataset`.\n            A dictionary of the form Dict[tasks, dtypes] which has task names as keys\n            and the dtype of the label tensor as value. This is necessary to ensure the missing labels are added with NaNs of the right dtype\n\n    Returns:\n        A dictionary of the form Dict[tasks, labels] where tasks is the name of the task and labels\n        is a tensor of shape (batch_size, *labels_size_dict[task]).\n    \"\"\"\n    if labels_size_dict is not None:\n        for this_label in labels:\n            for task in labels_size_dict.keys():\n                labels_size_dict[task] = list(labels_size_dict[task])\n                if len(labels_size_dict[task]) >= 2:\n                    labels_size_dict[task] = labels_size_dict[task][1:]\n                elif not task.startswith(\"graph_\"):\n                    labels_size_dict[task] = [1]\n            label_keys_set = set(get_keys(this_label))\n            empty_task_labels = set(labels_size_dict.keys()) - label_keys_set\n            for task in empty_task_labels:\n                labels_size_dict[task] = get_expected_label_size(this_label, task, labels_size_dict[task])\n                dtype = labels_dtype_dict[task]\n                this_label[task] = torch.full([*labels_size_dict[task]], torch.nan, dtype=dtype)\n\n            for task in label_keys_set - set([\"x\", \"edge_index\"]) - empty_task_labels:\n                labels_size_dict[task] = get_expected_label_size(this_label, task, labels_size_dict[task])\n\n                if not isinstance(this_label[task], (torch.Tensor)):\n                    this_label[task] = torch.as_tensor(this_label[task])\n\n                # Ensure explicit task dimension also for single task labels\n                if len(this_label[task].shape) == 1:\n                    # 
Distinguish whether target dim or entity dim is missing\n                    if labels_size_dict[task][0] == this_label[task].shape[0]:\n                        # num graphs/nodes/edges/nodepairs already matching\n                        this_label[task] = this_label[task].unsqueeze(1)\n                    else:\n                        # data lost unless entity dim is supposed to be 1\n                        if labels_size_dict[task][0] == 1:\n                            this_label[task] = this_label[task].unsqueeze(0)\n                        else:\n                            raise ValueError(\n                                f\"Labels for {labels_size_dict[task][0]} nodes/edges/nodepairs expected, got 1.\"\n                            )\n\n                this_label[task] = pad_to_expected_label_size(this_label[task], labels_size_dict[task])\n\n    return collate_pyg_graph_labels(labels)\n\n\ndef pad_nodepairs(pe: torch.Tensor, num_nodes: int, max_num_nodes_per_graph: int):\n    \"\"\"\n    This function zero-pads nodepair-level positional encodings to conform with the batching logic.\n\n    Parameters:\n        pe (torch.Tensor, [num_nodes, num_nodes, num_feat]): Nodepair pe\n        num_nodes (int): Number of nodes of processed graph\n        max_num_nodes_per_graph (int): Maximum number of nodes among graphs in current batch\n\n    Returns:\n        padded_pe (torch.Tensor, [num_nodes, max_num_nodes_per_graph, num_feat]): padded nodepair pe tensor\n    \"\"\"\n    padded_pe = torch.zeros((num_nodes, max_num_nodes_per_graph, pe.size(-1)), dtype=pe.dtype)\n    padded_pe[:, :num_nodes] = pe[:, :num_nodes]\n    # Above, pe[:, :num_nodes] in the rhs is needed to \"overwrite\" zero-padding from previous epoch\n\n    return padded_pe\n"
  },
  {
    "path": "graphium/data/datamodule.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nimport tempfile\nfrom contextlib import redirect_stderr, redirect_stdout\nfrom typing import Type, List, Dict, Union, Any, Callable, Optional, Tuple, Iterable, Literal\nfrom os import PathLike as Path\n\nfrom dataclasses import dataclass\n\nimport os\nfrom functools import partial\nimport importlib.resources\nimport zipfile\nfrom copy import deepcopy\nimport time\nimport gc\n\nimport platformdirs\nimport re\nfrom graphium.data.utils import get_keys\n\nfrom loguru import logger\nimport fsspec\nimport omegaconf\n\nimport pandas as pd\nimport numpy as np\nimport datamol as dm\nfrom tqdm import tqdm\nimport os.path as osp\nfrom fastparquet import ParquetFile\n\nfrom sklearn.model_selection import train_test_split\n\nimport lightning\nfrom lightning.pytorch.trainer.states import RunningStage\n\nimport torch\nfrom torch.utils.data.dataloader import DataLoader, Dataset\nfrom torch.utils.data import Subset\n\nfrom graphium.utils import fs\nfrom graphium.features import (\n    mol_to_graph_dict,\n    GraphDict,\n    mol_to_pyggraph,\n)\n\nfrom graphium.data.sampler import DatasetSubSampler\nfrom graphium.data.utils import graphium_package_path, found_size_mismatch\nfrom graphium.utils.arg_checker import check_arg_iterator\nfrom graphium.utils.hashing import get_md5_hash\nfrom graphium.data.smiles_transform import (\n   
 did_featurization_fail,\n    BatchingSmilesTransform,\n    smiles_to_unique_mol_ids,\n)\nfrom graphium.data.collate import graphium_collate_fn\nimport graphium.data.dataset as Datasets\nfrom graphium.data.normalization import LabelNormalization\nfrom graphium.data.multilevel_utils import extract_labels\n\ntorch.multiprocessing.set_sharing_strategy(\"file_system\")\n\n\nPCQM4M_meta = {\n    \"num tasks\": 1,\n    \"eval metric\": \"mae\",\n    \"download_name\": \"pcqm4m_kddcup2021\",\n    \"url\": \"https://dgl-data.s3-accelerate.amazonaws.com/dataset/OGB-LSC/pcqm4m_kddcup2021.zip\",  # TODO: Allow PyG\n    \"data type\": \"mol\",\n    \"has_node_attr\": True,\n    \"has_edge_attr\": True,\n    \"task type\": \"regression\",\n    \"num classes\": -1,\n    \"split\": \"scaffold\",\n    \"additional node files\": \"None\",\n    \"additional edge files\": \"None\",\n    \"binary\": False,\n    \"version\": 1,\n}\n\nPCQM4Mv2_meta = deepcopy(PCQM4M_meta)\nPCQM4Mv2_meta.update(\n    {\n        \"download_name\": \"pcqm4m-v2\",\n        \"url\": \"https://dgl-data.s3-accelerate.amazonaws.com/dataset/OGB-LSC/pcqm4m-v2.zip\",  # TODO: Allow PyG\n        \"version\": 2,\n    }\n)\n\n\nclass BaseDataModule(lightning.LightningDataModule):\n    def __init__(\n        self,\n        batch_size_training: int = 16,\n        batch_size_inference: int = 16,\n        batch_size_per_pack: Optional[int] = None,\n        num_workers: int = 0,\n        pin_memory: bool = True,\n        persistent_workers: bool = False,\n        multiprocessing_context: Optional[str] = None,\n        collate_fn: Optional[Callable] = None,\n    ):\n        \"\"\"\n        base dataset module for all datasets (to be inherited)\n\n        Parameters:\n            batch_size_training: batch size for training\n            batch_size_inference: batch size for inference\n            batch_size_per_pack: number of graphs per pack; must be a divisor of both batch sizes\n            num_workers: number of workers for data loading\n            pin_memory: whether to pin memory\n            persistent_workers: 
whether to use persistent workers\n            multiprocessing_context: multiprocessing context for data worker creation\n            collate_fn: collate function for batching\n        \"\"\"\n        super().__init__()\n\n        self.batch_size_training = batch_size_training\n        self.batch_size_inference = batch_size_inference\n        self.batch_size_per_pack = batch_size_per_pack\n        if self.batch_size_per_pack is not None:\n            # Check that batch_size_per_pack is a divisor of batch_size_training and batch_size_inference\n            assert (\n                self.batch_size_training % self.batch_size_per_pack == 0\n            ), f\"batch_size_training must be a multiple of batch_size_per_pack, provided batch_size_training={self.batch_size_training}, batch_size_per_pack={self.batch_size_per_pack}\"\n            assert (\n                self.batch_size_inference % self.batch_size_per_pack == 0\n            ), f\"batch_size_inference must be a multiple of batch_size_per_pack, provided batch_size_inference={self.batch_size_inference}, batch_size_per_pack={self.batch_size_per_pack}\"\n\n        self.num_workers = num_workers\n        self.pin_memory = pin_memory\n        self.persistent_workers = persistent_workers\n        self.multiprocessing_context = multiprocessing_context\n\n        self.collate_fn = self.get_collate_fn(collate_fn)\n\n        self.train_ds = None\n        self.val_ds = None\n        self.test_ds = None\n        self._predict_ds = None\n\n        self._data_is_prepared = False\n        self._data_is_cached = False\n\n    def prepare_data(self):\n        raise NotImplementedError()\n\n    def setup(self):\n        raise NotImplementedError()\n\n    def train_dataloader(self, **kwargs):\n        \"\"\"\n        return the training dataloader\n        \"\"\"\n        return self.get_dataloader(\n            dataset=self.train_ds,  # type: ignore\n            shuffle=True,\n            stage=RunningStage.TRAINING,\n            
**kwargs,\n        )\n\n    def val_dataloader(self, **kwargs):\n        r\"\"\"\n        return the validation dataloader\n        \"\"\"\n        return self.get_dataloader(\n            dataset=self.val_ds,  # type: ignore\n            shuffle=False,\n            stage=RunningStage.VALIDATING,\n            **kwargs,\n        )\n\n    def test_dataloader(self, **kwargs):\n        r\"\"\"\n        return the test dataloader\n        \"\"\"\n        return self.get_dataloader(\n            dataset=self.test_ds,  # type: ignore\n            shuffle=False,\n            stage=RunningStage.TESTING,\n            **kwargs,\n        )\n\n    def predict_dataloader(self, **kwargs):\n        \"\"\"\n        return the dataloader for prediction\n        \"\"\"\n        return self.get_dataloader(\n            dataset=self.predict_ds,  # type: ignore\n            shuffle=False,\n            stage=RunningStage.PREDICTING,\n            **kwargs,\n        )\n\n    def get_collate_fn(self, collate_fn):\n        if collate_fn is None:\n            # Some values become `inf` when changing data type. 
`mask_nan` deals with that\n            collate_fn = partial(\n                graphium_collate_fn, mask_nan=0, batch_size_per_pack=self.batch_size_per_pack\n            )\n            collate_fn.__name__ = graphium_collate_fn.__name__\n\n        return collate_fn\n\n    @property\n    def is_prepared(self):\n        raise NotImplementedError()\n\n    @property\n    def is_setup(self):\n        raise NotImplementedError()\n\n    @property\n    def num_node_feats(self):\n        raise NotImplementedError()\n\n    @property\n    def num_edge_feats(self):\n        raise NotImplementedError()\n\n    @property\n    def predict_ds(self):\n        \"\"\"Get the dataset used for the prediction\"\"\"\n        if self._predict_ds is None:\n            return self.test_ds\n        else:\n            return self._predict_ds\n\n    @property\n    def get_num_workers(self):\n        \"\"\"\n        get the number of workers to use\n        \"\"\"\n        if self.num_workers == -1:\n            num_workers = os.cpu_count()\n            num_workers = num_workers if num_workers is not None else 0\n        else:\n            num_workers = self.num_workers\n        return num_workers\n\n    @predict_ds.setter\n    def predict_ds(self, value):\n        \"\"\"Set the dataset for the prediction\"\"\"\n        self._predict_ds = value\n\n    def get_fake_graph(self):\n        raise NotImplementedError()\n\n    # Private methods\n\n    @staticmethod\n    def _read_csv(\n        path: str,\n        **kwargs,\n    ) -> pd.DataFrame:\n        \"\"\"\n        private method for reading a csv file\n        Parameters:\n            path: path to the csv file\n            kwargs: keyword arguments for pd.read_csv\n        Returns:\n            pd.DataFrame: the pandas dataframe storing molecules\n        \"\"\"\n\n        path = str(path)\n\n        if path.endswith((\".csv\", \".csv.gz\", \".csv.zip\", \".csv.bz2\")):\n            sep = \",\"\n        elif path.endswith((\".tsv\", \".tsv.gz\", 
\".tsv.zip\", \".tsv.bz2\")):\n            sep = \"\\t\"\n        else:\n            raise ValueError(f\"unsupported file `{path}`\")\n        kwargs.setdefault(\"sep\", sep)\n\n        if path.startswith(\"graphium://\"):\n            path = graphium_package_path(path)\n\n        df = pd.read_csv(path, **kwargs)\n        return df\n\n    @staticmethod\n    def _get_data_file_type(path):\n        # Extract the extension\n        name, ext = os.path.splitext(path)\n\n        # support compressed files\n        _, ext2 = os.path.splitext(name)\n        if ext2 != \"\":\n            ext = f\"{ext2}{ext}\"\n\n        if ext.endswith((\".parquet\")):  # Support parquet files. Compression is implicit\n            return \"parquet\"\n        elif \".sdf\" in ext:  # support compressed sdf files\n            return \"sdf\"\n        elif \".csv\" in ext:  # support compressed csv files\n            return \"csv\"\n        elif \".tsv\" in ext:  # support compressed tsv files\n            return \"tsv\"\n        elif \".pkl\" in ext:  # support compressed pickle files\n            return \"pkl\"\n        elif ext.endswith(\".pt\"):  # Pytorch tensor files, used for storing index splits\n            return \"pt\"\n        else:\n            raise ValueError(f\"unsupported file `{path}`\")\n\n    @staticmethod\n    def _get_table_columns(path: str) -> List[str]:\n        \"\"\"\n        Get the columns of a table without reading all the data.\n        Might be slow to decompress the file if the file is compressed.\n\n        Parameters:\n            path: path to the table file\n\n        Returns:\n            List[str]: the column names\n        \"\"\"\n\n        datafile_type = BaseDataModule._get_data_file_type(path)\n\n        if datafile_type == \"parquet\":\n            # Read the schema of a parquet file\n            file = ParquetFile(path)\n            schema = file.pandas_metadata[\"columns\"]\n            column_names = [s[\"name\"] for s in schema if s[\"name\"] is 
not None]\n        elif datafile_type == \"sdf\":\n            df = BaseDataModule._read_sdf(path, max_num_mols=5, discard_invalid=True, n_jobs=1)\n            column_names = df.columns\n        elif datafile_type in [\"csv\", \"tsv\"]:\n            # Read the schema of a csv / tsv file\n            df = BaseDataModule._read_csv(path, nrows=5)\n            column_names = df.columns\n        return column_names\n\n    @staticmethod\n    def _read_parquet(path, **kwargs):\n        kwargs.pop(\"dtype\", None)  # Only useful for csv\n        column_names = BaseDataModule._get_table_columns(path)\n\n        # Change the 'usecols' parameter to 'columns'\n        columns = kwargs.pop(\"columns\", None)\n        if \"usecols\" in kwargs.keys():\n            assert columns is None, \"Ambiguous value of `columns`\"\n            columns = kwargs.pop(\"usecols\")\n        if columns is None:\n            columns = column_names\n        for column in columns:\n            assert (\n                column in column_names\n            ), f\"Column `{column}` is not in the parquet file with columns {column_names}\"\n\n        # Read the parquet file per column, and convert the data to float16 to reduce memory consumption\n        all_series = {}\n        progress = tqdm(columns)\n        for col in progress:\n            # Read single column\n            progress.set_description(f\"Reading parquet column `{col}`\")\n            this_series = pd.read_parquet(path, columns=[col], engine=\"fastparquet\", **kwargs)[col]\n\n            # Check if the data is float\n            first_elem = this_series.values[0]\n            is_float = False\n            if isinstance(first_elem, (list, tuple)):\n                is_float = isinstance(first_elem[0], float) or (first_elem[0].dtype.kind == \"f\")\n            elif isinstance(first_elem, np.ndarray):\n                is_float = isinstance(first_elem, float) or (first_elem.dtype.kind == \"f\")\n\n            # Convert floats to float16\n     
       if is_float:\n                if isinstance(first_elem, (np.ndarray, list)):\n                    this_series.update(\n                        pd.Series([np.asarray(elem).astype(np.float16) for elem in this_series])\n                    )\n                else:\n                    this_series = this_series.astype(np.float16)\n\n            all_series[col] = this_series\n            gc.collect()  # Reset memory after each column\n\n        # Merge columns into a dataframe\n        df = pd.concat(all_series, axis=1)\n\n        return df\n\n    @staticmethod\n    def _read_sdf(path: str, mol_col_name: str = \"_rdkit_molecule_obj\", **kwargs):\n        r\"\"\"\n        read a given sdf file into a pandas dataframe\n        uses datamol.read_sdf to read the sdf file\n        Parameters:\n            path: path to the sdf file\n            mol_col_name: name of the column containing the molecule object\n            kwargs: arguments to pass to datamol.read_sdf\n        Returns:\n            pandas dataframe containing the molecules and their conformer coordinates from the sdf file\n\n        Note: please change mol_col_name to be the column name corresponding to the molecule object in your sdf file\n        \"\"\"\n\n        # Set default arguments for reading the SDF\n        kwargs.setdefault(\"smiles_column\", \"smiles\")\n        kwargs.setdefault(\"sanitize\", False)\n        kwargs.setdefault(\"include_private\", True)\n        kwargs.setdefault(\"include_computed\", True)\n        kwargs.setdefault(\"remove_hs\", True)\n        kwargs.setdefault(\"n_jobs\", -1)\n        kwargs.setdefault(\"max_num_mols\", kwargs.pop(\"sample_size\", None))\n        kwargs.setdefault(\"discard_invalid\", False)\n\n        # Get the interesting columns\n        mol_cols = mol_col_name  # adjust this to be the column name in your sdf file\n        kwargs.setdefault(\"mol_column\", mol_cols)\n        usecols = kwargs.pop(\"usecols\", None)\n        dtype = 
kwargs.pop(\"dtype\", None)\n        smiles_col = kwargs[\"smiles_column\"]\n\n        # Read the SDF\n        df = dm.read_sdf(path, as_df=True, **kwargs)\n\n        # Keep only the columns needed\n        if usecols is not None:\n            df = df[usecols + [mol_cols]]\n\n        # Convert the dtypes\n        if dtype is not None:\n            label_columns = list(set(usecols) - set([mol_cols, smiles_col]))\n            dtype_mapper = {col: dtype for col in label_columns}\n            df = df.astype(dtype=dtype_mapper, copy=False)\n\n        return df\n\n    @staticmethod\n    def _glob(path: str) -> List[str]:\n        \"\"\"\n        glob a given path\n        Parameters:\n            path: path to glob\n        Returns:\n            List[str]: list of paths\n        \"\"\"\n        files = dm.fs.glob(path)\n        files = [f.replace(\"file://\", \"\") for f in files]\n        return files\n\n    def _read_table(self, path: str, **kwargs) -> pd.DataFrame:\n        \"\"\"\n        a general read function which determines which reader to use, either _read_csv, _read_sdf or _read_parquet\n        Parameters:\n            path: path to the file to read\n            kwargs: keyword arguments for pd.read_csv or pd.read_parquet\n        Returns:\n            pd.DataFrame: the pandas dataframe storing molecules\n        \"\"\"\n        files = self._glob(path)\n        if len(files) == 0:\n            raise FileNotFoundError(f\"No such file or directory `{path}`\")\n\n        if len(files) > 1:\n            files = tqdm(sorted(files), desc=f\"Reading files at `{path}`\")\n        dfs = []\n        for file in files:\n            file_type = self._get_data_file_type(file)\n            if file_type == \"parquet\":\n                df = self._read_parquet(file, **kwargs)\n            elif file_type == \"sdf\":  # support compressed sdf files\n                df = self._read_sdf(file, **kwargs)\n            elif file_type in [\"csv\", \"tsv\"]:  # support compressed csv and 
tsv files\n                df = self._read_csv(file, **kwargs)\n            else:\n                raise ValueError(f\"unsupported file `{file}`\")\n            dfs.append(df)\n\n        return pd.concat(dfs, ignore_index=True)\n\n    def get_dataloader_kwargs(self, stage: RunningStage, shuffle: bool, **kwargs) -> Dict[str, Any]:\n        \"\"\"\n        Get the options for the dataloader depending on the current stage.\n\n        Parameters:\n            stage: Whether in Training, Validating, Testing, Sanity-checking, Predicting, or Tuning phase.\n            shuffle: set to ``True`` to have the data reshuffled at every epoch.\n\n        Returns:\n            Arguments to pass to the `DataLoader` during initialization\n        \"\"\"\n        loader_kwargs = {}\n\n        # Get batch size and IPU options for training set\n        # if stage in [RunningStage.TRAINING, RunningStage.TUNING]:\n        if stage in [RunningStage.TRAINING]:\n            loader_kwargs[\"batch_size\"] = self.batch_size_training\n\n        # Get batch size and IPU options for validation / testing sets\n        elif stage in [RunningStage.VALIDATING, RunningStage.TESTING, RunningStage.PREDICTING]:\n            loader_kwargs[\"batch_size\"] = self.batch_size_inference\n        else:\n            raise ValueError(f\"Wrong value for `stage`. 
Provided `{stage}`\")\n\n        # Set default parameters\n        loader_kwargs[\"shuffle\"] = shuffle\n        loader_kwargs[\"collate_fn\"] = self.collate_fn\n        loader_kwargs[\"num_workers\"] = self.get_num_workers\n        loader_kwargs[\"pin_memory\"] = self.pin_memory\n        loader_kwargs[\"persistent_workers\"] = self.persistent_workers\n        loader_kwargs[\"multiprocessing_context\"] = self.multiprocessing_context\n\n        # Update from provided parameters\n        loader_kwargs.update(**kwargs)\n\n        return loader_kwargs\n\n    def get_dataloader(self, dataset: Dataset, shuffle: bool, stage: RunningStage) -> DataLoader:\n        \"\"\"\n        Get the dataloader for a given dataset\n\n        Parameters:\n            dataset: The dataset from which to load the data\n            shuffle: set to ``True`` to have the data reshuffled at every epoch.\n            stage: Whether in Training, Validating, Testing, Sanity-checking, Predicting, or Tuning phase.\n\n        Returns:\n            The dataloader to sample from\n        \"\"\"\n        kwargs = self.get_dataloader_kwargs(stage=stage, shuffle=shuffle)\n        return self._dataloader(dataset=dataset, shuffle=shuffle, stage=stage, **kwargs)\n\n    def _dataloader(self, dataset: Dataset, **kwargs) -> DataLoader:\n        r\"\"\"\n        Get a dataloader for a given dataset\n        Parameters:\n            dataset: The dataset from which to load the data\n            kwargs: keyword arguments for DataLoader\n        Returns:\n            The dataloader to sample from\n        \"\"\"\n\n        loader = DataLoader(\n            dataset=dataset,\n            **kwargs,\n        )\n\n        return loader\n\n    def get_max_num_nodes_datamodule(self, stages: Optional[List[str]] = None) -> int:\n        \"\"\"\n        Get the maximum number of nodes across all datasets from the datamodule\n\n        Parameters:\n            datamodule: The datamodule from which to extract the maximum number 
of nodes\n            stages: The stages from which to extract the max num nodes.\n                Possible values are [\"train\", \"val\", \"test\", \"predict\"].\n                If None, all stages are considered.\n\n        Returns:\n            max_num_nodes: The maximum number of nodes across all datasets from the datamodule\n        \"\"\"\n\n        allowed_stages = [\"train\", \"val\", \"test\", \"predict\"]\n        if stages is None:\n            stages = allowed_stages\n        for stage in stages:\n            assert stage in allowed_stages, f\"stage value `{stage}` not allowed.\"\n\n        max_num_nodes = 0\n        # Max number of nodes in the training dataset\n        if (self.train_ds is not None) and (\"train\" in stages):\n            logger.info(\"Max num nodes being calculated for train\")\n            max_num_nodes = max(max_num_nodes, self.train_ds.max_num_nodes_per_graph)\n\n        # Max number of nodes in the validation dataset\n        if (\n            (self.val_ds is not None)\n            and (\"val\" in stages)\n            and (self.val_ds.max_num_nodes_per_graph is not None)\n        ):\n            logger.info(\"Max num nodes being calculated for val\")\n            max_num_nodes = max(max_num_nodes, self.val_ds.max_num_nodes_per_graph)\n\n        # Max number of nodes in the test dataset\n        if (\n            (self.test_ds is not None)\n            and (\"test\" in stages)\n            and (self.test_ds.max_num_nodes_per_graph is not None)\n        ):\n            logger.info(\"Max num nodes being calculated for test\")\n            max_num_nodes = max(max_num_nodes, self.test_ds.max_num_nodes_per_graph)\n\n        # Max number of nodes in the predict dataset\n        if (\n            (self.predict_ds is not None)\n            and (\"predict\" in stages)\n            and (self.predict_ds.max_num_nodes_per_graph is not None)\n        ):\n            max_num_nodes = max(max_num_nodes, self.predict_ds.max_num_nodes_per_graph)\n\n        
return max_num_nodes\n\n    def get_max_num_edges_datamodule(self, stages: Optional[List[str]] = None) -> int:\n        \"\"\"\n        Get the maximum number of edges across all datasets from the datamodule\n\n        Parameters:\n            stages: The stages from which to extract the max num edges.\n                Possible values are [\"train\", \"val\", \"test\", \"predict\"].\n                If None, all stages are considered.\n\n        Returns:\n            max_num_edges: The maximum number of edges across all datasets from the datamodule\n        \"\"\"\n\n        allowed_stages = [\"train\", \"val\", \"test\", \"predict\"]\n        if stages is None:\n            stages = allowed_stages\n        for stage in stages:\n            assert stage in allowed_stages, f\"stage value `{stage}` not allowed.\"\n\n        max_num_edges = 0\n        # Max number of edges in the training dataset\n        if (\n            (self.train_ds is not None)\n            and (\"train\" in stages)\n            and (self.train_ds.max_num_edges_per_graph is not None)\n        ):\n            max_num_edges = max(max_num_edges, self.train_ds.max_num_edges_per_graph)\n\n        # Max number of edges in the validation dataset\n        if (\n            (self.val_ds is not None)\n            and (\"val\" in stages)\n            and (self.val_ds.max_num_edges_per_graph is not None)\n        ):\n            max_num_edges = max(max_num_edges, self.val_ds.max_num_edges_per_graph)\n\n        # Max number of edges in the test dataset\n        if (\n            (self.test_ds is not None)\n            and (\"test\" in stages)\n            and (self.test_ds.max_num_edges_per_graph is not None)\n        ):\n            max_num_edges = max(max_num_edges, self.test_ds.max_num_edges_per_graph)\n\n        # Max number of edges in the predict dataset\n        if (\n            
(self.predict_ds is not None)\n            and (\"predict\" in stages)\n            and (self.predict_ds.max_num_edges_per_graph is not None)\n        ):\n            max_num_edges = max(max_num_edges, self.predict_ds.max_num_edges_per_graph)\n\n        return max_num_edges\n\n\n@dataclass\nclass DatasetProcessingParams:\n    def __init__(\n        self,\n        task_level: Optional[str] = None,\n        df: Optional[pd.DataFrame] = None,\n        df_path: Optional[Union[str, os.PathLike, List[Union[str, os.PathLike]]]] = None,\n        smiles_col: Optional[str] = None,\n        label_cols: List[str] = None,\n        weights_col: Optional[str] = None,  # Not needed\n        weights_type: Optional[str] = None,  # Not needed\n        idx_col: Optional[str] = None,\n        mol_ids_col: Optional[str] = None,\n        sample_size: Union[int, float, Type[None]] = None,\n        split_val: float = 0.2,\n        split_test: float = 0.2,\n        seed: int = None,\n        epoch_sampling_fraction: float = 1.0,\n        splits_path: Optional[Union[str, os.PathLike]] = None,\n        split_names: Optional[List[str]] = [\"train\", \"val\", \"test\"],\n        label_normalization: Optional[Union[Dict[str, Any], omegaconf.DictConfig]] = None,\n    ):\n        \"\"\"\n        object to store the parameters for the dataset processing\n        Parameters:\n            task_level: The task level, whether it is graph, node, edge or nodepair\n            df: The dataframe containing the data\n            df_path: The path to the dataframe containing the data. 
If list, will read all files, sort them alphabetically and concatenate them.\n            smiles_col: The column name of the smiles\n            label_cols: The column names of the labels\n            weights_col: The column name of the weights\n            weights_type: The type of weights\n            idx_col: The column name of the indices\n            mol_ids_col: The column name of the molecule ids\n            sample_size: The size of the sample\n            split_val: The fraction of the data to use for validation\n            split_test: The fraction of the data to use for testing\n            seed: The seed to use for the splits and subsampling\n            splits_path: The path to the splits, or a dictionary with the splits\n        \"\"\"\n\n        if df is None and df_path is None:\n            raise ValueError(\"Either `df` or `df_path` must be provided\")\n        if epoch_sampling_fraction <= 0 or epoch_sampling_fraction > 1:\n            raise ValueError(\"The value of epoch_sampling_fraction must be in the range of (0, 1].\")\n\n        self.df = df\n        self.task_level = task_level\n        self.df_path = df_path\n        self.smiles_col = smiles_col\n        self.label_cols = label_cols\n        self.weights_col = weights_col\n        self.weights_type = weights_type\n        self.idx_col = idx_col\n        self.mol_ids_col = mol_ids_col\n        self.sample_size = sample_size\n        self.split_val = split_val\n        self.split_test = split_test\n        self.seed = seed\n        self.splits_path = splits_path\n        self.split_names = split_names\n        self.label_normalization = label_normalization\n        self.epoch_sampling_fraction = epoch_sampling_fraction\n\n\nclass IPUDataModuleModifier:\n    def __init__(\n        self,\n        ipu_inference_opts: Optional[\"poptorch.Options\"] = None,\n        ipu_training_opts: Optional[\"poptorch.Options\"] = None,\n        ipu_dataloader_training_opts: 
Optional[\"IPUDataloaderOptions\"] = None,\n        ipu_dataloader_inference_opts: Optional[\"IPUDataloaderOptions\"] = None,\n        *args,\n        **kwargs,\n    ) -> None:\n        r\"\"\"\n        wrapper for a `DataModule` to support IPUs and IPU options. To be used in dual inheritance, for example:\n        ```\n        IPUDataModule(BaseDataModule, IPUDataModuleModifier):\n            def __init__(self, **kwargs):\n                BaseDataModule.__init__(self, **kwargs)\n                IPUDataModuleModifier.__init__(self, **kwargs)\n        ```\n\n        Parameters:\n            ipu_inference_opts: Options for the IPU in inference mode. Ignore if not using IPUs\n            ipu_training_opts: Options for the IPU in training mode. Ignore if not using IPUs\n            ipu_dataloader_training_opts: Options for the training dataloader for the IPU. Ignore if not using IPUs\n            ipu_dataloader_inference_opts: Options for the inference dataloader for the IPU. Ignore if not using IPUs\n            args: Arguments for the `DataModule`\n            kwargs: Keyword arguments for the `DataModule`\n        \"\"\"\n        self.ipu_inference_opts = ipu_inference_opts\n        self.ipu_training_opts = ipu_training_opts\n        self.ipu_dataloader_training_opts = ipu_dataloader_training_opts\n        self.ipu_dataloader_inference_opts = ipu_dataloader_inference_opts\n\n    def _dataloader(self, dataset: Dataset, **kwargs) -> \"poptorch.DataLoader\":\n        \"\"\"\n        Get a poptorch dataloader for a given dataset\n        Parameters:\n            dataset: The dataset to use\n            kwargs: Keyword arguments for the dataloader\n        Returns:\n            The poptorch dataloader\n        \"\"\"\n\n        # The IPU dataloader requires the IPU options to be provided\n        if (\"ipu_options\" not in kwargs.keys()) or (kwargs[\"ipu_options\"] is None):\n            raise ValueError(\"No IPU options provided.\")\n\n        # Initialize the IPU dataloader\n        from 
graphium.ipu.ipu_dataloader import create_ipu_dataloader\n\n        loader = create_ipu_dataloader(\n            dataset=dataset,\n            **kwargs,\n        )\n\n        return loader\n\n\nclass MultitaskFromSmilesDataModule(BaseDataModule, IPUDataModuleModifier):\n    def __init__(\n        self,\n        task_specific_args: Union[Dict[str, DatasetProcessingParams], Dict[str, Any]],\n        processed_graph_data_path: Optional[Union[str, os.PathLike]] = None,\n        dataloading_from: str = \"ram\",\n        featurization: Optional[Union[Dict[str, Any], omegaconf.DictConfig]] = None,\n        batch_size_training: int = 16,\n        batch_size_inference: int = 16,\n        batch_size_per_pack: Optional[int] = None,\n        num_workers: int = 0,\n        pin_memory: bool = True,\n        persistent_workers: bool = False,\n        multiprocessing_context: Optional[str] = None,\n        featurization_n_jobs: int = -1,\n        featurization_progress: bool = False,\n        featurization_backend: str = \"loky\",\n        featurization_batch_size: int = 1000,\n        collate_fn: Optional[Callable] = None,\n        prepare_dict_or_graph: str = \"pyg:graph\",\n        **kwargs,\n    ):\n        \"\"\"\n        only for parameters beginning with task_*, we have a dictionary where the key is the task name\n        and the value is specified below.\n        Parameters:\n            task_specific_args: A dictionary where the key is the task name (for the multi-task setting), and\n                the value is a `DatasetProcessingParams` object. The `DatasetProcessingParams` object\n                contains multiple parameters to define how to load and process the files, such as:\n\n                - `task_level`\n                - `df`\n                - `df_path`\n                - `smiles_col`\n                - `label_cols`\n            dataloading_from: Whether to load the data from RAM or from disk. 
If set to \"disk\", the data\n                must have been previously cached with `processed_graph_data_path` set. If set to \"ram\", the data\n                will be loaded in RAM and the `processed_graph_data_path` will be ignored.\n            featurization: args to apply to the SMILES to Graph featurizer.\n            batch_size_training: batch size for the training dataset.\n            batch_size_inference: batch size for the validation, test and prediction datasets.\n            num_workers: Number of workers for the dataloader. Use -1 to use all available\n                cores.\n            pin_memory: Whether to use pinned (page-locked) CPU memory for the dataloader.\n            featurization_n_jobs: Number of cores to use for the featurization.\n            featurization_progress: whether to show a progress bar during featurization.\n            featurization_backend: The backend to use for the molecular featurization.\n\n                - \"multiprocessing\": Found to cause fewer memory issues.\n                - \"loky\": joblib's default. Found to cause memory leaks.\n                - \"threading\": Found to be slow.\n            featurization_batch_size: Batch size to use for the featurization.\n\n            collate_fn: A custom torch collate function. Defaults to `graphium.data.graphium_collate_fn`\n            prepare_dict_or_graph: Whether to preprocess all molecules as Graph dict or PyG graphs.\n                Possible options:\n\n                - \"pyg:dict\": Process molecules as a `dict`. It's faster and requires less RAM during\n                  pre-processing. 
It is slower during training with `num_workers=0` since\n                  pyg `Data` will be created during data-loading, but faster with large\n                  `num_workers`, and less likely to cause memory issues with the parallelization.\n                - \"pyg:graph\": Process molecules as `pyg.data.Data`.\n        \"\"\"\n        BaseDataModule.__init__(\n            self,\n            batch_size_training=batch_size_training,\n            batch_size_inference=batch_size_inference,\n            batch_size_per_pack=batch_size_per_pack,\n            num_workers=num_workers,\n            pin_memory=pin_memory,\n            persistent_workers=persistent_workers,\n            multiprocessing_context=multiprocessing_context,\n            collate_fn=collate_fn,\n        )\n        IPUDataModuleModifier.__init__(self, **kwargs)\n\n        self.task_specific_args = task_specific_args\n\n        self.task_dataset_processing_params = {}\n        for task, ds_args in task_specific_args.items():\n            if not isinstance(ds_args, DatasetProcessingParams):\n                # This is needed as long as not all classes have been migrated\n                # to use the new `DatasetProcessingParams` class\n                ds_args = DatasetProcessingParams(**ds_args)\n\n            key = self._get_task_key(ds_args.task_level, task)\n            self.task_dataset_processing_params[key] = ds_args\n\n        self.sampler_task_dict = {\n            task: self.task_dataset_processing_params[task].epoch_sampling_fraction\n            for task in self.task_dataset_processing_params.keys()\n        }\n\n        self.featurization_n_jobs = featurization_n_jobs\n        self.featurization_progress = featurization_progress\n        self.featurization_backend = featurization_backend\n        self.featurization_batch_size = featurization_batch_size\n\n        self.task_train_indices = None\n        self.task_val_indices = None\n        self.task_test_indices = None\n\n        
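# The loop above turns each entry of `task_specific_args` into a
# `DatasetProcessingParams` keyed by a task-level-prefixed name, then derives
# `sampler_task_dict` from each task's `epoch_sampling_fraction`. A minimal
# standalone sketch of that key-prefixing logic follows (plain dicts stand in
# for graphium's classes; the task names and values are illustrative only):

```python
# Sketch of `_get_task_key`-style prefixing, NOT graphium's actual API.
def get_task_key(task_level: str, task: str) -> str:
    # Prepend the task level (e.g. "graph_") unless the name already carries it
    prefix = f"{task_level}_"
    return task if task.startswith(prefix) else prefix + task

# Hypothetical task-specific args, using dicts in place of DatasetProcessingParams
task_specific_args = {
    "zinc": {"task_level": "graph", "epoch_sampling_fraction": 1.0},
    "graph_qm9": {"task_level": "graph", "epoch_sampling_fraction": 0.5},
}

# Re-key by the prefixed task name, as the constructor does
task_processing_params = {
    get_task_key(args["task_level"], task): args
    for task, args in task_specific_args.items()
}

# Per-task sampling fractions used later by the epoch sub-sampler
sampler_task_dict = {
    task: args["epoch_sampling_fraction"] for task, args in task_processing_params.items()
}
print(sampler_task_dict)  # {'graph_zinc': 1.0, 'graph_qm9': 0.5}
```

# Note that a name already carrying its level ("graph_qm9") is left untouched,
# so keys stay stable whether or not configs pre-prefix their task names.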
self.single_task_datasets = None\n        self.train_singletask_datasets = None\n        self.val_singletask_datasets = None\n        self.test_singletask_datasets = None\n\n        self.train_ds = None\n        self.val_ds = None\n        self.test_ds = None\n\n        self._parse_caching_args(processed_graph_data_path, dataloading_from)\n\n        self.task_norms = {}\n\n        if featurization is None:\n            featurization = {}\n\n        self.featurization = featurization\n\n        # Whether to transform the smiles into a pyg `Data` graph or a dictionary compatible with pyg\n        if prepare_dict_or_graph == \"pyg:dict\":\n            self.smiles_transformer = partial(mol_to_graph_dict, **featurization)\n        elif prepare_dict_or_graph == \"pyg:graph\":\n            self.smiles_transformer = partial(mol_to_pyggraph, **featurization)\n        else:\n            raise ValueError(\n                f\"`prepare_dict_or_graph` should be either 'pyg:dict' or 'pyg:graph', Provided: `{prepare_dict_or_graph}`\"\n            )\n        self.data_hash = self.get_data_hash()\n\n        if self.processed_graph_data_path is not None:\n            if self._ready_to_load_all_from_file():\n                self._data_is_prepared = True\n                self._data_is_cached = True\n\n    def _parse_caching_args(self, processed_graph_data_path, dataloading_from):\n        \"\"\"\n        Parse the caching arguments, and raise errors if the arguments are invalid.\n        \"\"\"\n\n        # Whether to load the data from RAM or from disk\n        dataloading_from = dataloading_from.lower()\n        if dataloading_from not in [\"disk\", \"ram\"]:\n            raise ValueError(\n                f\"`dataloading_from` should be either 'disk' or 'ram', Provided: `{dataloading_from}`\"\n            )\n\n        # If loading from disk, the path to the cached data must be provided\n        if dataloading_from == \"disk\" and processed_graph_data_path is None:\n            raise 
ValueError(\n                \"When `dataloading_from` is 'disk', `processed_graph_data_path` must be provided.\"\n            )\n\n        self.processed_graph_data_path = processed_graph_data_path\n        self.dataloading_from = dataloading_from\n\n    def _get_task_key(self, task_level: str, task: str):\n        task_prefix = f\"{task_level}_\"\n        if not task.startswith(task_prefix):\n            task = task_prefix + task\n        return task\n\n    def get_task_levels(self):\n        task_level_map = {}\n\n        for task, task_args in self.task_specific_args.items():\n            if isinstance(task_args, DatasetProcessingParams):\n                task_args = task_args.__dict__  # Convert the class to a dictionary\n            task_level_map.update({task: task_args[\"task_level\"]})\n\n        return task_level_map\n\n    def prepare_data(self, save_smiles_and_ids: bool = False):\n        \"\"\"Called only from a single process in distributed settings. Steps:\n\n        - If each cache is set and exists, reload from cache and return. 
Otherwise,\n        - For each single-task dataset:\n            - Load its dataframe from a path (if provided)\n            - Subsample the dataframe\n            - Extract the smiles, labels from the dataframe\n        - In the previous step, we were also able to get the unique smiles, which we use to compute the features\n        - For each single-task dataframe and associated data (smiles, labels, etc.):\n            - Filter out the data corresponding to molecules which failed featurization.\n            - Create a corresponding SingletaskDataset\n            - Split the SingletaskDataset according to the task-specific splits for train, val and test\n        \"\"\"\n\n        def has_atoms_after_h_removal(smiles):\n            # Remove all 'H' characters from the SMILES\n            smiles_without_h = re.sub(\"H\", \"\", smiles)\n            # Check if any letters remain in the modified string\n            has_atoms = bool(re.search(\"[a-zA-Z]\", smiles_without_h))\n            if not has_atoms:\n                logger.info(f\"Removed hydrogen-only molecule: {smiles}\")\n            return has_atoms\n\n        if self._data_is_prepared:\n            logger.info(\"Data is already prepared.\")\n            self.get_label_statistics(self.processed_graph_data_path, self.data_hash, dataset=None)\n            return\n\n        \"\"\"Load all single-task dataframes.\"\"\"\n        task_df = {}\n        for task, args in self.task_dataset_processing_params.items():\n            if args.label_normalization is None:\n                args.label_normalization = {}\n            label_normalization = LabelNormalization(**args.label_normalization)\n            logger.info(f\"Reading data for task '{task}'\")\n            if args.df is None:\n                # Only load the useful columns, as some datasets can be very large when loading all columns.\n                label_cols = self._parse_label_cols(\n                    df=None, df_path=args.df_path, 
label_cols=args.label_cols, smiles_col=args.smiles_col\n                )\n                usecols = (\n                    check_arg_iterator(args.smiles_col, enforce_type=list)\n                    + label_cols\n                    + check_arg_iterator(args.idx_col, enforce_type=list)\n                    + check_arg_iterator(args.weights_col, enforce_type=list)\n                )\n                label_dtype = {col: np.float32 for col in label_cols}\n                task_df[task] = self._read_table(args.df_path, usecols=usecols, dtype=label_dtype)\n\n            else:\n                label_cols = self._parse_label_cols(\n                    df=args.df, df_path=None, label_cols=args.label_cols, smiles_col=args.smiles_col\n                )\n                task_df[task] = args.df\n            args.label_cols = label_cols\n            self.task_norms[task] = label_normalization\n        logger.info(\"Done reading datasets\")\n\n        \"\"\"Subsample the data frames and extract the necessary data to create SingleTaskDatasets for each task (smiles, labels, extras).\"\"\"\n        task_dataset_args = {}\n        for task in task_df.keys():\n            task_dataset_args[task] = {}\n\n        for task, df in task_df.items():\n            # Subsample all the dataframes\n            sample_size = self.task_dataset_processing_params[task].sample_size\n            df = self._sub_sample_df(df, sample_size, self.task_dataset_processing_params[task].seed)\n\n            logger.info(f\"Prepare single-task dataset for task '{task}' with {len(df)} data points.\")\n\n            logger.info(\"Filtering the molecules for Hydrogen\")\n            logger.info(f\"Looking at column {df.columns[0]}\")\n            logger.info(\"Filtering done\")\n            # Extract smiles, labels, extras\n            args = self.task_dataset_processing_params[task]\n            smiles, labels, sample_idx, extras = self._extract_smiles_labels(\n          
      df,\n                task_level=args.task_level,\n                smiles_col=args.smiles_col,\n                label_cols=args.label_cols,\n                idx_col=args.idx_col,\n                weights_col=args.weights_col,\n                weights_type=args.weights_type,\n            )\n\n            # Store the relevant information for each task's dataset\n            task_dataset_args[task][\"smiles\"] = smiles\n            task_dataset_args[task][\"labels\"] = labels\n            task_dataset_args[task][\"sample_idx\"] = sample_idx\n            task_dataset_args[task][\"extras\"] = extras\n\n        \"\"\"Convert SMILES to features (graphs, fingerprints, etc.) for the unique molecules found.\"\"\"\n        all_smiles = []\n        all_tasks = []\n        idx_per_task = {}\n        total_len = 0\n        for task, dataset_args in task_dataset_args.items():\n            all_smiles.extend(dataset_args[\"smiles\"])\n            num_smiles = len(dataset_args[\"smiles\"])\n            idx_per_task[task] = (total_len, total_len + num_smiles)\n            total_len += num_smiles\n            for count in range(len(dataset_args[\"smiles\"])):\n                all_tasks.append(task)\n        # Get all unique mol ids\n        all_unique_mol_ids = smiles_to_unique_mol_ids(\n            all_smiles,\n            n_jobs=self.featurization_n_jobs,\n            featurization_batch_size=self.featurization_batch_size,\n            backend=self.featurization_backend,\n        )\n        _, unique_ids_idx, unique_ids_inv = np.unique(\n            all_unique_mol_ids, return_index=True, return_inverse=True\n        )\n\n        smiles_to_featurize = [all_smiles[ii] for ii in unique_ids_idx]\n\n        # Convert SMILES to features\n        features, _ = self._featurize_molecules(smiles_to_featurize)\n\n        # Store the features (including Nones, which will be filtered in the next step)\n        for task in task_dataset_args.keys():\n            
task_dataset_args[task][\"features\"] = []\n            task_dataset_args[task][\"idx_none\"] = []\n        # Create a list of features matching up with the original smiles\n        all_features = [features[unique_idx] for unique_idx in unique_ids_inv]\n\n        # Add the features to the task-specific data\n        for all_idx, task in enumerate(all_tasks):\n            task_dataset_args[task][\"features\"].append(all_features[all_idx])\n\n        \"\"\"Filter data based on molecules which failed featurization. Create single task datasets as well.\"\"\"\n        self.single_task_datasets = {}\n        for task, args in task_dataset_args.items():\n            # Find out which molecule failed featurization, and filter them out\n            idx_none = []\n            for idx, (feat, labels, smiles) in enumerate(\n                zip(args[\"features\"], args[\"labels\"], args[\"smiles\"])\n            ):\n                if did_featurization_fail(feat) or found_size_mismatch(task, feat, labels, smiles):\n                    idx_none.append(idx)\n            this_unique_ids = all_unique_mol_ids[idx_per_task[task][0] : idx_per_task[task][1]]\n            df, features, smiles, labels, sample_idx, extras, this_unique_ids = self._filter_none_molecules(\n                idx_none,\n                task_df[task],\n                args[\"features\"],\n                args[\"smiles\"],\n                args[\"labels\"],\n                args[\"sample_idx\"],\n                args[\"extras\"],\n                this_unique_ids,\n            )\n            task_dataset_args[task][\"smiles\"] = smiles\n            task_dataset_args[task][\"labels\"] = labels\n            task_dataset_args[task][\"features\"] = features\n            task_dataset_args[task][\"sample_idx\"] = sample_idx\n            task_dataset_args[task][\"extras\"] = extras\n\n            # We have the necessary components to create single-task datasets.\n            self.single_task_datasets[task] = 
Datasets.SingleTaskDataset(\n                features=task_dataset_args[task][\"features\"],\n                labels=task_dataset_args[task][\"labels\"],\n                smiles=task_dataset_args[task][\"smiles\"],\n                unique_ids=this_unique_ids,\n                indices=task_dataset_args[task][\"sample_idx\"],\n                **task_dataset_args[task][\"extras\"],\n            )\n\n        \"\"\"We split the data up to create train, val and test datasets\"\"\"\n        self.task_train_indices = {}\n        self.task_val_indices = {}\n        self.task_test_indices = {}\n\n        for task, df in task_df.items():\n            train_indices, val_indices, test_indices = self._get_split_indices(\n                len(df),\n                split_val=self.task_dataset_processing_params[task].split_val,\n                split_test=self.task_dataset_processing_params[task].split_test,\n                split_seed=self.task_dataset_processing_params[task].seed,\n                splits_path=self.task_dataset_processing_params[task].splits_path,\n                split_names=self.task_dataset_processing_params[task].split_names,\n                sample_idx=task_dataset_args[task][\"sample_idx\"],\n            )\n            self.task_train_indices[task] = train_indices\n            self.task_val_indices[task] = val_indices\n            self.task_test_indices[task] = test_indices\n\n        (\n            self.train_singletask_datasets,\n            self.val_singletask_datasets,\n            self.test_singletask_datasets,\n        ) = self.get_subsets_of_datasets(\n            self.single_task_datasets, self.task_train_indices, self.task_val_indices, self.task_test_indices\n        )\n\n        if self.processed_graph_data_path is not None:\n            self._save_data_to_files(save_smiles_and_ids)\n            self._data_is_cached = True\n\n        self._data_is_prepared = True\n\n    def setup(\n        self,\n        stage: str = None,\n        
save_smiles_and_ids: bool = False,\n    ):\n        \"\"\"\n        Prepare the torch dataset. Called on every GPU. Setting state here is ok.\n        Parameters:\n            stage (str): Either 'fit', 'test', or None.\n        \"\"\"\n\n        # Can possibly get rid of setup because a single dataset will have molecules exclusively in train, val or test\n        # Produce the label sizes to update the collate function\n        labels_size = {}\n        labels_dtype = {}\n        if stage == \"fit\" or stage is None:\n            if self.train_ds is None:\n                self.train_ds = self._make_multitask_dataset(\n                    self.dataloading_from, \"train\", save_smiles_and_ids=save_smiles_and_ids\n                )\n\n            if self.val_ds is None:\n                self.val_ds = self._make_multitask_dataset(\n                    self.dataloading_from, \"val\", save_smiles_and_ids=save_smiles_and_ids\n                )\n\n            logger.info(self.train_ds)\n            logger.info(self.val_ds)\n            labels_size.update(\n                self.train_ds.labels_size\n            )  # Make sure that all task label sizes are contained in here. 
Maybe do the update outside these if statements.\n            labels_size.update(self.val_ds.labels_size)\n            labels_dtype.update(self.train_ds.labels_dtype)\n            labels_dtype.update(self.val_ds.labels_dtype)\n\n        if stage == \"test\" or stage is None:\n            if self.test_ds is None:\n                self.test_ds = self._make_multitask_dataset(\n                    self.dataloading_from, \"test\", save_smiles_and_ids=save_smiles_and_ids\n                )\n\n            logger.info(self.test_ds)\n\n            labels_size.update(self.test_ds.labels_size)\n            labels_dtype.update(self.test_ds.labels_dtype)\n\n        default_labels_size_dict = self.collate_fn.keywords.get(\"labels_size_dict\", None)\n\n        if default_labels_size_dict is None:\n            self.collate_fn.keywords[\"labels_size_dict\"] = labels_size\n\n        default_labels_dtype_dict = self.collate_fn.keywords.get(\"labels_dtype_dict\", None)\n\n        if default_labels_dtype_dict is None:\n            self.collate_fn.keywords[\"labels_dtype_dict\"] = labels_dtype\n\n    def _make_multitask_dataset(\n        self,\n        dataloading_from: Literal[\"disk\", \"ram\"],\n        stage: Literal[\"train\", \"val\", \"test\"],\n        save_smiles_and_ids: bool,\n    ) -> Datasets.MultitaskDataset:\n        \"\"\"\n        Create a MultitaskDataset for the given stage using the single-task datasets.\n        The single-task datasets must exist before this can be used.\n\n        Parameters:\n            dataloading_from: Whether to load the data from \"disk\" or \"ram\"\n            stage: Stage to create the multitask dataset for\n            save_smiles_and_ids: Whether to save SMILES strings and unique IDs\n        \"\"\"\n\n        allowed_stages = [\"train\", \"val\", \"test\"]\n        assert stage in allowed_stages, f\"Multitask dataset stage `{stage}` not in {allowed_stages}\"\n\n        if stage == \"train\":\n            singletask_datasets = 
self.train_singletask_datasets\n            about = \"training set\"\n        elif stage == \"val\":\n            singletask_datasets = self.val_singletask_datasets\n            about = \"validation set\"\n        elif stage == \"test\":\n            singletask_datasets = self.test_singletask_datasets\n            about = \"test set\"\n        else:\n            raise ValueError(f\"Unknown stage {stage}\")\n\n        processed_graph_data_path = self.processed_graph_data_path\n\n        multitask_dataset = Datasets.MultitaskDataset(\n            singletask_datasets,\n            n_jobs=self.featurization_n_jobs,\n            backend=self.featurization_backend,\n            featurization_batch_size=self.featurization_batch_size,\n            progress=self.featurization_progress,\n            about=about,\n            save_smiles_and_ids=save_smiles_and_ids,\n            data_path=self._path_to_load_from_file(stage) if processed_graph_data_path else None,\n            dataloading_from=dataloading_from,\n            data_is_cached=self._data_is_cached,\n        )  # type: ignore\n\n        # calculate statistics for the train split and used for all splits normalization\n        if stage == \"train\":\n            self.get_label_statistics(\n                self.processed_graph_data_path, self.data_hash, multitask_dataset, train=True\n            )\n        # Normalization has already been applied in cached data\n        if not self._data_is_prepared:\n            self.normalize_label(multitask_dataset, stage)\n\n        return multitask_dataset\n\n    def _ready_to_load_all_from_file(self) -> bool:\n        \"\"\"\n        Check if the data for all stages is ready to be loaded from files\n        \"\"\"\n\n        paths = [self._path_to_load_from_file(stage) for stage in [\"train\", \"val\", \"test\"]]\n        ready = all(self._data_ready_at_path(path) for path in paths)\n\n        return ready\n\n    def _path_to_load_from_file(self, stage: Literal[\"train\", 
\"val\", \"test\"]) -> Optional[str]:\n        \"\"\"\n        Get path from which to load the data from files\n        \"\"\"\n        if self.processed_graph_data_path is None:\n            return None\n        return osp.join(self.processed_graph_data_path, f\"{stage}_{self.data_hash}\")\n\n    def _data_ready_at_path(self, path: str) -> bool:\n        \"\"\"\n        Check if data can be loaded from this path\n        \"\"\"\n        can_load_from_file = osp.exists(path) and self.get_folder_size(path) > 0\n\n        return can_load_from_file\n\n    def _save_data_to_files(self, save_smiles_and_ids: bool = False) -> None:\n        \"\"\"\n        Save data to files so that they can be loaded from file during training/validation/test\n        \"\"\"\n\n        stages = [\"train\", \"val\", \"test\"]\n\n        # At the moment, we need to merge the `SingleTaskDataset`'s into `MultitaskDataset`s in order to save to file\n        #     This is because the combined labels need to be stored together. 
We can investigate not doing this if this is a problem\n        temp_datasets = {\n            stage: self._make_multitask_dataset(\n                dataloading_from=\"ram\", stage=stage, save_smiles_and_ids=save_smiles_and_ids\n            )\n            for stage in stages\n        }\n        for stage in stages:\n            self.save_featurized_data(temp_datasets[stage], self._path_to_load_from_file(stage))\n            temp_datasets[stage].save_metadata(self._path_to_load_from_file(stage))\n        # self.train_ds, self.val_ds, self.test_ds will be created during `setup()`\n\n        if self.dataloading_from == \"disk\":\n            del temp_datasets\n        else:\n            self.train_ds = temp_datasets[\"train\"]\n            self.val_ds = temp_datasets[\"val\"]\n            self.test_ds = temp_datasets[\"test\"]\n\n    def get_folder_size(self, path):\n        # check if the data items are actually saved into the folders\n        return sum(os.path.getsize(osp.join(path, f)) for f in os.listdir(path))\n\n    def calculate_statistics(self, dataset: Datasets.MultitaskDataset, train: bool = False):\n        \"\"\"\n        Calculate the statistics of the labels for each task, and overwrites the `self.task_norms` attribute.\n\n        Parameters:\n            dataset: the dataset to calculate the statistics from\n            train: whether the dataset is the training set\n\n        \"\"\"\n\n        if self.task_norms and train:\n            for task in dataset.labels_size.keys():\n                # if the label type is graph_*, we need to stack them as the tensor shape is (num_labels, )\n                if task.startswith(\"graph\"):\n                    labels = np.stack(\n                        np.array([datum[\"labels\"][task] for datum in dataset if task in datum[\"labels\"]]),\n                        axis=0,\n                    )\n                # for other tasks with node_ and edge_, the label shape is [num_nodes/num_edges, num_labels]\n          
      # we can concatenate them directly\n                else:\n                    labels = np.concatenate(\n                        [datum[\"labels\"][task] for datum in dataset if task in datum[\"labels\"]], axis=0\n                    )\n\n                self.task_norms[task].calculate_statistics(labels)\n\n    def get_label_statistics(\n        self,\n        data_path: Union[str, os.PathLike],\n        data_hash: str,\n        dataset: Datasets.MultitaskDataset,\n        train: bool = False,\n    ):\n        \"\"\"\n        Get the label statistics from the dataset, and save them to file, if needed.\n        `self.task_norms` will be modified in-place with the label statistics.\n\n        Parameters:\n            data_path: the path to save and load the label statistics to. If None, no saving and loading will be done.\n            data_hash: the hash of the dataset generated by `get_data_hash()`\n            dataset: the dataset to calculate the statistics from\n            train: whether the dataset is the training set\n\n        \"\"\"\n        if data_path is None:\n            self.calculate_statistics(dataset, train=train)\n        else:\n            path_with_hash = os.path.join(data_path, data_hash)\n            os.makedirs(path_with_hash, exist_ok=True)\n            filename = os.path.join(path_with_hash, \"task_norms.pkl\")\n            if self.task_norms and train and not os.path.isfile(filename):\n                self.calculate_statistics(dataset, train=train)\n                torch.save(self.task_norms, filename, pickle_protocol=4)\n            # If any of the above three conditions is not satisfied, load from file.\n            else:\n                self.task_norms = torch.load(filename)\n\n    def normalize_label(self, dataset: Datasets.MultitaskDataset, stage) -> Datasets.MultitaskDataset:\n        \"\"\"\n        Normalize the labels in the dataset using the statistics in `self.task_norms`.\n\n        Parameters:\n            dataset: the 
dataset to normalize the labels from\n\n        Returns:\n            the dataset with normalized labels\n        \"\"\"\n        for task in dataset.labels_size.keys():\n            # we normalize the dataset if (it is train split) or (it is val/test splits and normalize_val_test is set to true)\n            if (stage == \"train\") or (stage in [\"val\", \"test\"] and self.task_norms[task].normalize_val_test):\n                for i in range(len(dataset)):\n                    if task in dataset[i][\"labels\"]:\n                        dataset[i][\"labels\"][task] = self.task_norms[task].normalize(\n                            dataset[i][\"labels\"][task]\n                        )\n        return dataset\n\n    def save_featurized_data(self, dataset: Datasets.MultitaskDataset, processed_data_path):\n        os.makedirs(processed_data_path)  # In case the len(dataset) is 0\n        for i in range(0, len(dataset), 1000):\n            os.makedirs(os.path.join(processed_data_path, format(i // 1000, \"04d\")), exist_ok=True)\n        process_params = [(index, datum, processed_data_path) for index, datum in enumerate(dataset)]\n\n        # Check if \"about\" is in the Dataset object\n        about = \"\"\n        if hasattr(dataset, \"about\"):\n            about = dataset.about\n        for param in tqdm(process_params, desc=f\"Saving featurized data {about}\"):\n            self.process_func(param)\n        return\n\n    def process_func(self, param):\n        index, datum, folder = param\n        filename = os.path.join(folder, format(index // 1000, \"04d\"), format(index, \"07d\") + \".pkl\")\n        torch.save(\n            {\"graph_with_features\": datum[\"features\"], \"labels\": datum[\"labels\"]},\n            filename,\n            pickle_protocol=4,\n        )\n        return\n\n    def get_dataloader_kwargs(self, stage: RunningStage, shuffle: bool, **kwargs) -> Dict[str, Any]:\n        \"\"\"\n        Get the options for the dataloader depending on the 
current stage.\n\n        Parameters:\n            stage: Whether in Training, Validating, Testing, Sanity-checking, Predicting, or Tuning phase.\n            shuffle: set to ``True`` to have the data reshuffled at every epoch.\n\n        Returns:\n            Arguments to pass to the `DataLoader` during initialization\n        \"\"\"\n        loader_kwargs = super().get_dataloader_kwargs(stage=stage, shuffle=shuffle, **kwargs)\n\n        # Get batch size and IPU options for training set\n        # if stage in [RunningStage.TRAINING, RunningStage.TUNING]:\n        if stage in [RunningStage.TRAINING]:\n            loader_kwargs[\"ipu_dataloader_options\"] = self.ipu_dataloader_training_opts\n            loader_kwargs[\"ipu_options\"] = self.ipu_training_opts\n\n        # Get batch size and IPU options for validation / testing sets\n        elif stage in [RunningStage.VALIDATING, RunningStage.TESTING, RunningStage.PREDICTING]:\n            loader_kwargs[\"ipu_dataloader_options\"] = self.ipu_dataloader_inference_opts\n            loader_kwargs[\"ipu_options\"] = self.ipu_inference_opts\n        else:\n            raise ValueError(f\"Wrong value for `stage`. 
Provided `{stage}`\")\n\n        # Remove the IPU options if not available\n        if loader_kwargs[\"ipu_options\"] is None:\n            loader_kwargs.pop(\"ipu_options\")\n            if loader_kwargs[\"ipu_dataloader_options\"] is not None:\n                logger.warning(\n                    \"`ipu_dataloader_options` will be ignored since it is provided without `ipu_options`.\"\n                )\n            loader_kwargs.pop(\"ipu_dataloader_options\")\n        return loader_kwargs\n\n    def get_dataloader(\n        self, dataset: Dataset, shuffle: bool, stage: RunningStage\n    ) -> Union[DataLoader, \"poptorch.DataLoader\"]:\n        \"\"\"\n        Get the poptorch dataloader for a given dataset\n\n        Parameters:\n            dataset: The dataset from which to load the data\n            shuffle: set to ``True`` to have the data reshuffled at every epoch.\n            stage: Whether in Training, Validating, Testing, Sanity-checking, Predicting, or Tuning phase.\n\n        Returns:\n            The poptorch dataloader to sample from\n        \"\"\"\n        kwargs = self.get_dataloader_kwargs(stage=stage, shuffle=shuffle)\n        sampler = None\n        # use sampler only when sampler_task_dict is set in the config and during training\n        if DatasetSubSampler.check_sampling_required(self.sampler_task_dict) and stage in [\n            RunningStage.TRAINING\n        ]:\n            sampler = DatasetSubSampler(\n                dataset, self.sampler_task_dict, self.processed_graph_data_path, self.data_hash\n            )\n            # turn shuffle off when sampler is used as sampler option is mutually exclusive with shuffle\n            kwargs[\"shuffle\"] = False\n        is_ipu = (\"ipu_options\" in kwargs.keys()) and (kwargs.get(\"ipu_options\") is not None)\n        if is_ipu:\n            loader = IPUDataModuleModifier._dataloader(self, dataset=dataset, sampler=sampler, **kwargs)\n        else:\n            loader = 
BaseDataModule._dataloader(self, dataset=dataset, sampler=sampler, **kwargs)\n\n        return loader\n\n    def get_collate_fn(self, collate_fn):\n        if collate_fn is None:\n            # Some values become `inf` when changing data type. `mask_nan` deals with that\n            collate_fn = partial(\n                graphium_collate_fn,\n                mask_nan=0,\n                do_not_collate_keys=[\"smiles\", \"mol_ids\"],\n                batch_size_per_pack=self.batch_size_per_pack,\n            )\n            collate_fn.__name__ = graphium_collate_fn.__name__\n        return collate_fn\n\n    # Cannot be used as is for the multitask version, because sample_idx does not apply.\n    def _featurize_molecules(self, smiles: Iterable[str]) -> Tuple[List, List]:\n        \"\"\"\n        Precompute the features (graphs, fingerprints, etc.) from the SMILES.\n        Features are computed from `self.smiles_transformer`.\n        A warning is issued to mention which molecules failed featurization.\n\n        Note:\n            (hadim): in case of very large dataset we could:\n            - or cache the data and read from it during `next(iter(dataloader))`\n            - or compute the features on-the-fly during `next(iter(dataloader))`\n            For now we compute in advance and hold everything in memory.\n\n        Parameters:\n            smiles: A list of all the molecular SMILES to featurize\n            sample_idx: The indexes corresponding to the sampled SMILES.\n                If not provided, computed from `numpy.arange`.\n\n        Returns:\n            features: A list of all the featurized molecules\n            idx_none: A list of the indexes that failed featurization\n        \"\"\"\n\n        batch_size = BatchingSmilesTransform.parse_batch_size(\n            numel=len(smiles),\n            desired_batch_size=self.featurization_batch_size,\n            n_jobs=self.featurization_n_jobs,\n        )\n\n        # Loop all the smiles and compute the 
features\n        features = dm.parallelized_with_batches(\n            BatchingSmilesTransform(self.smiles_transformer),\n            smiles,\n            batch_size=batch_size,\n            progress=True,\n            n_jobs=self.featurization_n_jobs,\n            backend=self.featurization_backend,\n            tqdm_kwargs={\"desc\": f\"featurizing_smiles, batch={batch_size}\"},\n        )\n\n        # Warn about None molecules\n        idx_none = [ii for ii, feat in enumerate(features) if did_featurization_fail(feat)]\n        if len(idx_none) > 0:\n            mols_to_msg = [\n                f\"idx={idx} - smiles={smiles[idx]} - Error_msg[:200]=\\n{str(features[idx])[:200]}\"\n                for idx in idx_none\n            ]\n            msg = \"\\n\".join(mols_to_msg)\n            logger.warning(\n                (f\"{len(idx_none)} molecules will be removed since they failed featurization:\\n\" + msg)\n            )\n\n        return features, idx_none\n\n    @staticmethod\n    def _filter_none_molecules(\n        idx_none: Iterable,\n        *args: Union[pd.DataFrame, pd.Series, np.ndarray, torch.Tensor, list, tuple, Dict[Any, Iterable]],\n    ) -> List[Union[pd.DataFrame, pd.Series, np.ndarray, torch.Tensor, list, tuple, Dict[Any, Iterable]]]:\n        \"\"\"\n        Filter the molecules, labels, etc. for the molecules that failed featurization.\n\n        Parameters:\n            idx_none: A list of the indexes that failed featurization\n            args: Any argument from which to filter the failed SMILES.\n                Can be a `list`, `tuple`, `Tensor`, `np.array`, `Dict`, `pd.DataFrame`, `pd.Series`.\n                Otherwise, it is not filtered.\n                WARNING: If a `pd.DataFrame` or `pd.Series` is passed, it filters by the row indexes,\n                NOT by the `DataFrame.index` or `Series.index`! 
Be careful!\n\n        Returns:\n            out: All the `args` with the indexes from `idx_none` removed.\n        \"\"\"\n        if len(idx_none) == 0:\n            return args\n        idx_none = np.asarray(idx_none)\n\n        out = []\n        for arg in args:\n            if isinstance(arg, (pd.DataFrame, pd.Series)):\n                new = arg.drop(arg.index[idx_none], axis=0)\n            elif isinstance(arg, np.ndarray):\n                new = np.delete(arg, idx_none, axis=0)\n            elif isinstance(arg, torch.Tensor):\n                not_none = torch.ones(arg.shape[0], dtype=bool)\n                not_none[idx_none] = False\n                new = arg[not_none]\n            elif isinstance(arg, (list, tuple)):\n                arg = list(arg)\n                new = [elem for ii, elem in enumerate(arg) if ii not in idx_none]\n            elif isinstance(arg, dict):\n                new = {}\n                for key, val in arg.items():\n                    new[key] = MultitaskFromSmilesDataModule._filter_none_molecules(idx_none, val)  # Filter the dict values recursively\n            else:\n                new = arg\n            out.append(new)\n\n        out = tuple(out) if len(out) > 1 else out[0]\n\n        return out\n\n    def _parse_label_cols(\n        self,\n        df: pd.DataFrame,\n        df_path: Optional[Union[str, os.PathLike, List[Union[str, os.PathLike]]]],\n        label_cols: Union[Type[None], str, List[str]],\n        smiles_col: str,\n    ) -> List[str]:\n        r\"\"\"\n        Parse the choice of label columns depending on the type of input.\n        The input parameters `label_cols` and `smiles_col` are described in\n        the `__init__` method.\n        Parameters:\n            df: The dataframe containing the labels.\n            df_path: The path to the dataframe containing the labels. 
If list, the first file is used.\n            label_cols: The columns to use as labels.\n            smiles_col: The column to use as SMILES.\n        Returns:\n            the parsed label columns\n        \"\"\"\n        if df is None:\n            files = self._glob(df_path)\n            if len(files) == 0:\n                raise FileNotFoundError(f\"No such file or directory `{df_path}`\")\n\n            cols = BaseDataModule._get_table_columns(files[0])\n            for file in files[1:]:\n                _cols = BaseDataModule._get_table_columns(file)\n                if set(cols) != set(_cols):\n                    raise RuntimeError(\n                        f\"Multiple data files have different columns. \\nColumn set 1: {cols}\\nColumn set 2: {_cols}\"\n                    )\n        else:\n            cols = list(df.columns)\n\n        # A star `*` at the beginning of the string matches all columns that end with the\n        # given suffix, while a star at the end matches all columns that start with the given prefix\n        if isinstance(label_cols, str):\n            if label_cols[0] == \"*\":\n                label_cols = [col for col in cols if str(col).endswith(label_cols[1:])]\n            elif label_cols[-1] == \"*\":\n                label_cols = [col for col in cols if str(col).startswith(label_cols[:-1])]\n            else:\n                label_cols = [label_cols]\n\n        elif label_cols is None:\n            label_cols = [col for col in cols if col != smiles_col]\n\n        return check_arg_iterator(label_cols, enforce_type=list)\n\n    @property\n    def is_prepared(self):\n        if not hasattr(self, \"dataset\"):\n            return False\n        return getattr(self, \"dataset\") is not None\n\n    @property\n    def is_setup(self):\n        if not (hasattr(self, \"train_ds\") or hasattr(self, \"test_ds\")):\n            return False\n        return (getattr(self, \"train_ds\") is not None) or (getattr(self, \"test_ds\") is not None)\n\n    @property\n    def num_node_feats(self):\n  
      \"\"\"Return the number of node features in the first graph\"\"\"\n        graph = self.get_fake_graph()\n        num_feats = graph.feat.shape[1]\n        return num_feats\n\n    @property\n    def in_dims(self):\n        \"\"\"\n        Return all input dimensions for the set of graphs.\n        Including node/edge features, and\n        raw positional encoding dimensions such eigval, eigvec, rwse and more\n        \"\"\"\n\n        graph = self.get_fake_graph()\n        if isinstance(graph, (GraphDict)):\n            graph = graph.data\n\n        # get list of all keys corresponding to positional encoding\n        pe_dim_dict = {}\n        g_keys = get_keys(graph)\n        # ignore the normal keys for node feat and edge feat etc.\n        for key in g_keys:\n            prop = graph.get(key, None)\n            if hasattr(prop, \"shape\"):\n                pe_dim_dict[key] = prop.shape[-1]\n        return pe_dim_dict\n\n    @property\n    def num_edge_feats(self):\n        \"\"\"Return the number of edge features in the first graph\"\"\"\n\n        graph = self.get_fake_graph()\n        empty = torch.Tensor([])\n        num_feats = graph.get(\"edge_feat\", empty).shape[-1]\n\n        return num_feats\n\n    def get_fake_graph(self):\n        \"\"\"\n        Low memory footprint method to get the featurization of a fake graph\n        without reading the dataset. 
Useful for getting the number of node/edge features.\n\n        Returns:\n            graph: A fake graph with the right featurization\n        \"\"\"\n\n        smiles = \"C1=CC=CC=C1\"\n        trans = deepcopy(self.smiles_transformer)\n        trans.keywords.setdefault(\"on_error\", \"raise\")\n        trans.keywords.setdefault(\"mask_nan\", 0.0)\n        graph = trans(smiles)\n        return graph\n\n    ########################## Private methods ######################################\n    def _save_to_cache(self):\n        raise NotImplementedError()\n\n    def _load_from_cache(self):\n        raise NotImplementedError()\n\n    def _extract_smiles_labels(\n        self,\n        df: pd.DataFrame,\n        task_level: str,\n        smiles_col: Optional[str] = None,\n        label_cols: List[str] = [],\n        idx_col: Optional[str] = None,\n        mol_ids_col: Optional[str] = None,\n        weights_col: Optional[str] = None,\n        weights_type: Optional[str] = None,\n    ) -> Tuple[\n        np.ndarray, np.ndarray, Union[Type[None], np.ndarray], Dict[str, Union[Type[None], np.ndarray]]\n    ]:\n        \"\"\"\n        For a given dataframe extract the SMILES and labels columns. 
The SMILES are returned as a list\n        of strings while the labels are returned as a 2D numpy array.\n\n        Parameters:\n            df: Pandas dataframe\n            task_level: The level of the labels: \"graph\", \"node\", \"edge\" or \"nodepair\"\n            smiles_col: Name of the column containing the SMILES\n            label_cols: List of column names containing the labels\n            idx_col: Name of the column containing the index\n            mol_ids_col: Name of the column containing the molecule ids\n            weights_col: Name of the column containing the weights\n            weights_type: Type of weights to use.\n        Returns:\n            smiles, labels, sample_idx, extras\n        \"\"\"\n\n        if smiles_col is None:  # Should we specify which dataset has caused the potential issue?\n            smiles_col_all = [col for col in df.columns if \"smile\" in str(col).lower()]\n            if len(smiles_col_all) == 0:\n                raise ValueError(f\"No SMILES column found in dataframe. Columns are {df.columns}\")\n            elif len(smiles_col_all) > 1:\n                raise ValueError(\n                    f\"Multiple SMILES columns found in dataframe. 
SMILES columns are {smiles_col_all}\"\n                )\n\n            smiles_col = smiles_col_all[0]\n\n        if label_cols is None:\n            label_cols = df.columns.drop(smiles_col)\n\n        label_cols = check_arg_iterator(label_cols, enforce_type=list)\n        smiles = df[smiles_col].values\n        if len(label_cols) > 0:\n            if task_level in (\"graph\", \"node\", \"edge\", \"nodepair\"):\n                labels = extract_labels(df, task_level, label_cols)\n            else:\n                raise ValueError(f\"Unknown task level: {task_level}\")\n        else:\n            labels = float(\"nan\") + np.zeros([len(smiles), 0])\n\n        # Get the indices, used for sub-sampling and splitting the dataset\n        if idx_col is not None:\n            df = df.set_index(idx_col)\n        sample_idx = df.index.values\n\n        # Get the molecule ids\n        mol_ids = None\n        if mol_ids_col is not None:\n            mol_ids = df[mol_ids_col].values\n\n        # Extract the weights\n        weights = None\n        if weights_col is not None:\n            weights = df[weights_col].values\n        elif weights_type is not None:\n            if not np.all((labels == 0) | (labels == 1)):\n                raise ValueError(\"Labels must be binary for `weights_type`\")\n\n            if weights_type in (\"sample_label_balanced\", \"sample_balanced\"):\n                ratio_pos_neg = np.sum(labels, axis=0, keepdims=True) / labels.shape[0]\n                weights = np.zeros(labels.shape)\n                weights[labels == 0] = ratio_pos_neg\n                weights[labels == 1] = ratio_pos_neg**-1\n                if weights_type == \"sample_balanced\":\n                    # Combine the per-label weights into a single weight per sample\n                    weights = np.prod(weights, axis=1)\n\n            else:\n                raise ValueError(f\"Undefined `weights_type` {weights_type}\")\n\n            weights /= np.max(weights)  # Put the max weight to 1\n\n        extras = {\"weights\": weights, \"mol_ids\": mol_ids}\n        return smiles, labels, sample_idx, extras\n\n    def _get_split_indices(\n        self,\n        dataset_size: int,\n        split_val: float,\n        split_test: float,\n        sample_idx: Optional[Iterable[int]] = None,\n        split_seed: Optional[int] = None,\n        splits_path: Optional[Union[str, os.PathLike, Dict[str, Iterable[int]]]] = None,\n        split_names: Optional[List[str]] = [\"train\", \"val\", \"test\"],\n    ):\n        r\"\"\"\n        Compute indices of random splits.\n        Parameters:\n            dataset_size: Size of the dataset\n            split_val: Fraction of the dataset to use for validation\n            split_test: Fraction of the dataset to use for testing\n            sample_idx: Indices of the samples to use for splitting\n            split_seed: Seed for the random splitting\n            splits_path: Path to a file containing the splits\n            split_names: Names of the train/val/test splits when read from `splits_path`\n        Returns:\n            train_indices, val_indices, test_indices\n        \"\"\"\n\n        if sample_idx is None:\n            sample_idx = np.arange(dataset_size)\n\n        if splits_path is None:\n            # Random splitting\n            if split_test + split_val > 0:\n                train_indices, val_test_indices = train_test_split(\n                    sample_idx,\n                    test_size=split_val + split_test,\n                    random_state=split_seed,\n                )\n                sub_split_test = split_test / (split_test + split_val)\n            else:\n  
              train_indices = sample_idx\n                val_test_indices = np.array([])\n                sub_split_test = 0\n\n            if split_test > 0:\n                val_indices, test_indices = train_test_split(\n                    val_test_indices,\n                    test_size=sub_split_test,\n                    random_state=split_seed,\n                )\n            else:\n                val_indices = val_test_indices\n                test_indices = np.array([])\n\n        else:\n            train, val, test = split_names\n            if isinstance(splits_path, (Dict, pd.DataFrame)):\n                # Split from a dataframe\n                splits = splits_path\n            else:\n                # Split from an indices file\n                file_type = self._get_data_file_type(splits_path)\n\n                if file_type == \"pt\":\n                    splits = torch.load(splits_path)\n                elif file_type in [\"csv\", \"tsv\"]:\n                    splits = self._read_csv(splits_path)\n                else:\n                    raise ValueError(\n                        f\"file type `{file_type}` for `{splits_path}` not recognised, please use .pt, .csv or .tsv\"\n                    )\n            # Drop the NaN entries (the splits may have different lengths) before casting to int\n            train_indices = np.asarray(splits[train]).astype(\"float\")\n            train_indices = train_indices[~np.isnan(train_indices)].astype(\"int\").tolist()\n            val_indices = np.asarray(splits[val]).astype(\"float\")\n            val_indices = val_indices[~np.isnan(val_indices)].astype(\"int\").tolist()\n            test_indices = np.asarray(splits[test]).astype(\"float\")\n            test_indices = test_indices[~np.isnan(test_indices)].astype(\"int\").tolist()\n\n        # Filter train, val and test indices\n        _, train_idx, _ = np.intersect1d(sample_idx, train_indices, return_indices=True)\n        train_indices = train_idx.tolist()\n        _, valid_idx, _ = 
np.intersect1d(sample_idx, val_indices, return_indices=True)\n        val_indices = valid_idx.tolist()\n        _, test_idx, _ = np.intersect1d(sample_idx, test_indices, return_indices=True)\n        test_indices = test_idx.tolist()\n\n        return train_indices, val_indices, test_indices\n\n    def _sub_sample_df(\n        self, df: pd.DataFrame, sample_size: Union[int, float, None], seed: Optional[int] = None\n    ) -> pd.DataFrame:\n        r\"\"\"\n        Subsample rows from a pandas dataframe.\n        Parameters:\n            df: The pandas dataframe to subsample\n            sample_size: Number of rows (`int`) or fraction of rows (`float`) to keep\n            seed: Random seed used for the sampling\n        Returns:\n            The subsampled pandas dataframe\n        \"\"\"\n        # Sub-sample the dataframe\n        if isinstance(sample_size, int):\n            n = min(sample_size, df.shape[0])\n            df = df.sample(n=n, random_state=seed)\n        elif isinstance(sample_size, float):\n            df = df.sample(frac=sample_size, random_state=seed)\n        elif sample_size is None:\n            pass\n        else:\n            raise ValueError(\n                f\"Wrong value for `sample_size`: {sample_size}\"\n            )  # Maybe specify which task it was for?\n\n        return df\n\n    def get_data_hash(self):\n        \"\"\"\n        Get a hash specific to a dataset and smiles_transformer.\n        Useful to cache the pre-processed data.\n        \"\"\"\n        args = {}\n        # pop epoch_sampling_fraction out when creating hash\n        # so that the data cache does not need to be regenerated\n        # when epoch_sampling_fraction has changed.\n        for task_key, task_args in deepcopy(self.task_specific_args).items():\n            if isinstance(task_args, DatasetProcessingParams):\n                task_args = task_args.__dict__  # Convert the class to a dictionary\n\n            # Keep only first 5 rows of a dataframe\n            if \"df\" in task_args.keys():\n                if task_args[\"df\"] is not None:\n           
         task_args[\"df\"] = task_args[\"df\"].iloc[:5]\n\n            # Remove the `epoch_sampling_fraction`\n            task_args.pop(\"epoch_sampling_fraction\", None)\n            args[task_key] = task_args\n\n        hash_dict = {\n            \"smiles_transformer\": self.smiles_transformer,\n            \"task_specific_args\": args,\n        }\n        data_hash = get_md5_hash(hash_dict)\n        return data_hash\n\n    def get_data_cache_fullname(self, compress: bool = False) -> str:\n        \"\"\"\n        Create a hash for the dataset, and use it to generate a file name\n\n        Parameters:\n            compress: Whether to compress the data\n        Returns:\n            full path to the data cache file\n        \"\"\"\n        if self.processed_graph_data_path is None:\n            return\n        ext = \".datacache\"\n        if compress:\n            ext += \".gz\"\n        data_cache_fullname = fs.join(self.processed_graph_data_path, self.data_hash + ext)\n        return data_cache_fullname\n\n    def load_data_from_cache(self, verbose: bool = True, compress: bool = False) -> bool:\n        \"\"\"\n        Load the datasets from cache. First create a hash for the dataset, and verify if that\n        hash is available at the path given by `self.processed_graph_data_path`.\n\n        Parameters:\n            verbose: Whether to print the progress\n            compress: Whether to compress the data\n\n        Returns:\n            cache_data_exists: Whether the cache exists (if the hash matches) and the loading succeeded\n        \"\"\"\n        full_cache_data_path = self.get_data_cache_fullname(compress=compress)\n\n        if full_cache_data_path is None:\n            logger.info(\"No cache data path specified. 
Skipping loading the data from cache.\")\n            return False\n\n        cache_data_exists = fs.exists(full_cache_data_path)\n\n        if cache_data_exists:\n            try:\n                logger.info(f\"Loading the data from cache at path `{full_cache_data_path}`\")\n                now = time.time()\n                with fsspec.open(full_cache_data_path, mode=\"rb\", compression=\"infer\") as file:\n                    load_params = torch.load(file)\n                    self.__dict__.update(load_params)\n                    (\n                        self.train_singletask_datasets,\n                        self.val_singletask_datasets,\n                        self.test_singletask_datasets,\n                    ) = self.get_subsets_of_datasets(\n                        self.single_task_datasets,\n                        self.task_train_indices,\n                        self.task_val_indices,\n                        self.task_test_indices,\n                    )\n                elapsed = round(time.time() - now)\n                logger.info(\n                    f\"Successfully loaded the data from cache in {elapsed}s at path: `{full_cache_data_path}`\"\n                )\n                return True\n            except Exception as e:\n                if verbose:\n                    logger.warning(\n                        f\"Failed to load the data cache at path: `{full_cache_data_path}`.\\nThe data will be prepared and a cache will be created for future runs.\"\n                    )\n                    logger.warning(str(e))\n                return False\n        else:\n            if verbose:\n                logger.info(\n                    f\"Data cache not found at path: `{full_cache_data_path}`.\\nThe data will be prepared and a cache will be created for future runs.\"\n                )\n            return False\n\n    def get_subsets_of_datasets(\n        self,\n        single_task_datasets: Dict[str, Datasets.SingleTaskDataset],\n        
task_train_indices: Dict[str, Iterable],\n        task_val_indices: Dict[str, Iterable],\n        task_test_indices: Dict[str, Iterable],\n    ) -> Tuple[Dict[str, Subset], Dict[str, Subset], Dict[str, Subset]]:\n        \"\"\"\n        From a dictionary of datasets and their associated indices, subset the train/val/test sets\n\n        Parameters:\n            single_task_datasets: Dictionary of datasets\n            task_train_indices: Dictionary of train indices\n            task_val_indices: Dictionary of val indices\n            task_test_indices: Dictionary of test indices\n        Returns:\n            train_singletask_datasets: Dictionary of train subsets\n            val_singletask_datasets: Dictionary of val subsets\n            test_singletask_datasets: Dictionary of test subsets\n        \"\"\"\n        train_singletask_datasets = {}\n        val_singletask_datasets = {}\n        test_singletask_datasets = {}\n        for task in task_train_indices.keys():\n            train_singletask_datasets[task] = Subset(single_task_datasets[task], task_train_indices[task])\n            val_singletask_datasets[task] = Subset(single_task_datasets[task], task_val_indices[task])\n            test_singletask_datasets[task] = Subset(single_task_datasets[task], task_test_indices[task])\n        return train_singletask_datasets, val_singletask_datasets, test_singletask_datasets\n\n    def __len__(self) -> int:\n        r\"\"\"\n        Returns the number of elements of the current DataModule, which is the combined size of all single-task datasets given.\n        Returns:\n            num_elements: Number of elements in the current DataModule\n        \"\"\"\n        num_elements = 0\n        for task, args in self.task_dataset_processing_params.items():\n            if args.df is None:\n                df = self._read_table(args.df_path, usecols=[args.smiles_col])\n                num_elements += len(df)\n            else:\n                num_elements += len(args.df)\n        return num_elements\n\n    
def to_dict(self) -> Dict[str, Any]:\n        \"\"\"\n        Returns a dictionary representation of the current DataModule\n        Returns:\n            obj_repr: Dictionary representation of the current DataModule\n        \"\"\"\n        # TODO: Change to make more multi-task friendly\n        obj_repr = {}\n        obj_repr[\"name\"] = self.__class__.__name__\n        obj_repr[\"len\"] = len(self)\n        obj_repr[\"train_size\"] = len(self.train_indices) if self.train_indices is not None else None\n        obj_repr[\"val_size\"] = len(self.val_indices) if self.val_indices is not None else None\n        obj_repr[\"test_size\"] = len(self.test_indices) if self.test_indices is not None else None\n        obj_repr[\"batch_size_training\"] = self.batch_size_training\n        obj_repr[\"batch_size_inference\"] = self.batch_size_inference\n        obj_repr[\"batch_size_per_pack\"] = self.batch_size_per_pack\n        obj_repr[\"num_node_feats\"] = self.num_node_feats\n        obj_repr[\"num_node_feats_with_positional_encoding\"] = self.num_node_feats_with_positional_encoding\n        obj_repr[\"num_edge_feats\"] = self.num_edge_feats\n        obj_repr[\"num_tasks\"] = len(self.task_dataset_processing_params)\n        obj_repr[\"num_labels\"] = len(self.label_cols)\n        obj_repr[\"collate_fn\"] = self.collate_fn.__name__\n        obj_repr[\"featurization\"] = self.featurization\n        return obj_repr\n\n    def __repr__(self) -> str:\n        r\"\"\"\n        Controls how the class is printed\n\n        Returns:\n            A YAML string representation of the output of `to_dict`\n        \"\"\"\n        return omegaconf.OmegaConf.to_yaml(self.to_dict())\n\n\nclass GraphOGBDataModule(MultitaskFromSmilesDataModule):\n    def __init__(\n        self,\n        task_specific_args: Dict[str, Union[DatasetProcessingParams, Dict[str, Any]]],\n        processed_graph_data_path: Optional[Union[str, os.PathLike]] = None,\n        dataloading_from: str = \"ram\",\n        featurization: Optional[Union[Dict[str, Any], 
omegaconf.DictConfig]] = None,\n        batch_size_training: int = 16,\n        batch_size_inference: int = 16,\n        batch_size_per_pack: Optional[int] = None,\n        num_workers: int = 0,\n        pin_memory: bool = True,\n        persistent_workers: bool = False,\n        multiprocessing_context: Optional[str] = None,\n        featurization_n_jobs: int = -1,\n        featurization_progress: bool = False,\n        featurization_backend: str = \"loky\",\n        collate_fn: Optional[Callable] = None,\n        prepare_dict_or_graph: str = \"pyg:graph\",\n        **kwargs,\n    ):\n        r\"\"\"\n        Load an OGB (Open Graph Benchmark) GraphProp dataset.\n\n        Parameters:\n            task_specific_args: Arguments related to each task, with the task-name being the key,\n              and the specific arguments being the values. The arguments must be a\n              Dict containing the following keys:\n\n              - \"dataset_name\": Name of the OGB dataset to load. Examples of possible datasets are\n                \"ogbg-molhiv\", \"ogbg-molpcba\", \"ogbg-moltox21\", \"ogbg-molfreesolv\".\n              - \"sample_size\": The number of molecules to sample from the dataset. Default=None,\n                meaning that all molecules will be considered.\n            processed_graph_data_path: Path to the processed graph data. If None, the data will be\n              downloaded from the OGB website.\n            dataloading_from: Whether to load the data from RAM or disk. Default is \"ram\".\n            featurization: args to apply to the SMILES to Graph featurizer.\n            batch_size_training: Batch size for the training and val datasets.\n            batch_size_inference: Batch size for the test dataset.\n            num_workers: Number of workers for the dataloader. 
Use -1 to use all available\n                cores.\n            pin_memory: Whether to use page-locked (pinned) CPU memory for the dataloader.\n            featurization_n_jobs: Number of cores to use for the featurization.\n            featurization_progress: Whether to show a progress bar during featurization.\n            featurization_backend: The backend to use for the molecular featurization.\n\n                - \"multiprocessing\": Found to cause fewer memory issues.\n                - \"loky\": joblib's default. Found to cause memory leaks.\n                - \"threading\": Found to be slow.\n\n            collate_fn: A custom torch collate function. Defaults to `graphium.data.graphium_collate_fn`\n            sample_size:\n\n                - `int`: The maximum number of elements to take from the dataset.\n                - `float`: Value between 0 and 1 representing the fraction of the dataset to consider\n                - `None`: all elements are considered.\n        \"\"\"\n\n        new_task_specific_args = {}\n        self.metadata = {}\n        for task_name, task_args in task_specific_args.items():\n            # Get OGB metadata\n            this_metadata = self._get_dataset_metadata(task_args[\"dataset_name\"])\n            # Get dataset\n            df, mol_ids_col, smiles_col, label_cols, splits_path = self._load_dataset(\n                this_metadata, sample_size=task_args.get(\"sample_size\", None)\n            )\n            new_task_specific_args[task_name] = {\n                \"df\": df,\n                \"mol_ids_col\": mol_ids_col,\n                \"smiles_col\": smiles_col,\n                \"label_cols\": label_cols,\n                \"splits_path\": splits_path,\n                \"task_level\": task_args[\"task_level\"],\n            }\n            self.metadata[task_name] = this_metadata\n\n        # Config for datamodule\n        dm_args = {}\n        dm_args[\"task_specific_args\"] = new_task_specific_args\n        
dm_args[\"processed_graph_data_path\"] = processed_graph_data_path\n        dm_args[\"dataloading_from\"] = dataloading_from\n        dm_args[\"dataloader_from\"] = dataloading_from\n        dm_args[\"featurization\"] = featurization\n        dm_args[\"batch_size_training\"] = batch_size_training\n        dm_args[\"batch_size_inference\"] = batch_size_inference\n        dm_args[\"batch_size_per_pack\"] = batch_size_per_pack\n        dm_args[\"num_workers\"] = num_workers\n        dm_args[\"pin_memory\"] = pin_memory\n        dm_args[\"featurization_n_jobs\"] = featurization_n_jobs\n        dm_args[\"featurization_progress\"] = featurization_progress\n        dm_args[\"featurization_backend\"] = featurization_backend\n        dm_args[\"persistent_workers\"] = persistent_workers\n        dm_args[\"multiprocessing_context\"] = multiprocessing_context\n        dm_args[\"collate_fn\"] = collate_fn\n        dm_args[\"prepare_dict_or_graph\"] = prepare_dict_or_graph\n\n        super().__init__(**dm_args, **kwargs)\n\n    def to_dict(self) -> Dict[str, Any]:\n        r\"\"\"\n        geenrate a dictionary representation of the class\n        Returns:\n            dict: dictionary representation of the class\n        \"\"\"\n        # TODO: Change to make more multi-task friendly\n        obj_repr = {}\n        obj_repr[\"dataset_name\"] = self.dataset_name\n        obj_repr.update(super().to_dict())\n        return obj_repr\n\n    # Private methods\n\n    def _load_dataset(\n        self,\n        metadata: dict,\n        sample_size: Optional[int] = None,\n    ) -> Tuple[pd.DataFrame, str, str, List[str], str]:\n        \"\"\"\n        Download, extract and load an OGB dataset.\n        Parameters:\n            metadata: Metadata for the dataset to load.\n            sample_size: The number of molecules to sample from the dataset. 
Default=None,\n                meaning that all molecules will be considered.\n        Returns:\n            df: Pandas dataframe containing the dataset.\n            mol_ids_col: Name of the column containing the molecule ids.\n            smiles_col: Name of the column containing the SMILES.\n            label_cols: List of column names containing the labels.\n            splits_path: Path to the file containing the train/val/test splits.\n        \"\"\"\n\n        base_dir = fs.get_cache_dir(\"ogb\")\n        dataset_dir = base_dir / metadata[\"download_name\"]\n\n        if not dataset_dir.exists():\n            # Create cache filepath for zip file and associated folder\n            dataset_path = base_dir / f\"{metadata['download_name']}.zip\"\n\n            # Download it\n            if not dataset_path.exists():\n                logger.info(f\"Downloading {metadata['url']} to {dataset_path}\")\n                fs.copy(metadata[\"url\"], dataset_path, progress=True)\n\n            # Extract, closing the zip file once done\n            with zipfile.ZipFile(dataset_path) as zf:\n                zf.extractall(base_dir)\n\n        # Load CSV file\n        if metadata[\"download_name\"].startswith(\"pcqm4m\"):\n            df_path = dataset_dir / \"raw\" / \"data.csv.gz\"\n        else:\n            df_path = dataset_dir / \"mapping\" / \"mol.csv.gz\"\n        logger.info(f\"Loading {df_path} in memory.\")\n        df = pd.read_csv(df_path)\n\n        # Subsample the dataset\n        df = self._sub_sample_df(df, sample_size)\n\n        # Load split from the OGB dataset and save them in a single CSV file\n        if metadata[\"download_name\"].startswith(\"pcqm4m\"):\n            split_name = metadata[\"split\"]\n            split_dict = torch.load(dataset_dir / \"split_dict.pt\")\n            train_split = pd.DataFrame(split_dict[\"train\"])\n            val_split = pd.DataFrame(split_dict[\"valid\"])\n            if \"test\" in split_dict.keys():\n                test_split = 
pd.DataFrame(split_dict[\"test\"])\n            else:\n                test_split = pd.DataFrame(split_dict[\"test-dev\"])\n\n            splits = pd.concat([train_split, val_split, test_split], axis=1)  # type: ignore\n            splits.columns = [\"train\", \"val\", \"test\"]\n\n            # Save the splits inside the `split` directory, creating it if needed\n            splits_dir = dataset_dir / \"split\"\n            if not splits_dir.exists():\n                os.makedirs(splits_dir)\n            splits_path = splits_dir / f\"{split_name}.csv.gz\"\n        else:\n            split_name = metadata[\"split\"]\n            train_split = pd.read_csv(dataset_dir / \"split\" / split_name / \"train.csv.gz\", header=None)  # type: ignore\n            val_split = pd.read_csv(dataset_dir / \"split\" / split_name / \"valid.csv.gz\", header=None)  # type: ignore\n            test_split = pd.read_csv(dataset_dir / \"split\" / split_name / \"test.csv.gz\", header=None)  # type: ignore\n\n            splits = pd.concat([train_split, val_split, test_split], axis=1)  # type: ignore\n            splits.columns = [\"train\", \"val\", \"test\"]\n\n            splits_path = dataset_dir / \"split\" / f\"{split_name}.csv.gz\"\n\n        logger.info(f\"Saving splits to {splits_path}\")\n        splits.to_csv(splits_path, index=None)\n\n        # Get column names: OGB columns are predictable\n        if metadata[\"download_name\"].startswith(\"pcqm4m\"):\n            mol_ids_col = df.columns[0]\n            smiles_col = df.columns[-2]\n            label_cols = df.columns[-1:].to_list()\n        else:\n            mol_ids_col = df.columns[-1]\n            smiles_col = df.columns[-2]\n            label_cols = df.columns[:-2].to_list()\n\n        return df, mol_ids_col, smiles_col, label_cols, splits_path\n\n    def _get_dataset_metadata(self, dataset_name: str) -> Dict[str, Any]:\n        ogb_metadata = self._get_ogb_metadata()\n        if dataset_name not in 
ogb_metadata.index:\n            raise ValueError(f\"'{dataset_name}' is not a valid dataset name.\")\n\n        return ogb_metadata.loc[dataset_name].to_dict()\n\n    def _get_ogb_metadata(self):\n        \"\"\"\n        Get the metadata of OGB GraphProp datasets.\n        \"\"\"\n\n        with importlib.resources.open_text(\"ogb.graphproppred\", \"master.csv\") as f:\n            ogb_metadata = pd.read_csv(f)\n        ogb_metadata = ogb_metadata.set_index(ogb_metadata.columns[0])\n        ogb_metadata = ogb_metadata.T\n\n        # Add metadata related to PCQM4M\n        ogb_metadata = pd.concat([ogb_metadata, pd.DataFrame(PCQM4M_meta, index=[\"ogbg-lsc-pcqm4m\"])])\n        ogb_metadata = pd.concat([ogb_metadata, pd.DataFrame(PCQM4Mv2_meta, index=[\"ogbg-lsc-pcqm4mv2\"])])\n\n        # Only keep datasets of type 'mol'\n        ogb_metadata = ogb_metadata[ogb_metadata[\"data type\"] == \"mol\"]\n\n        return ogb_metadata\n\n\nclass ADMETBenchmarkDataModule(MultitaskFromSmilesDataModule):\n    \"\"\"\n    Wrapper to use the ADMET benchmark group from the TDC (Therapeutics Data Commons).\n\n    !!! warning \"Dependency\"\n\n        This class requires [PyTDC](https://pypi.org/project/PyTDC/) to be installed.\n\n    !!! note \"Citation\"\n\n        Huang, K., Fu, T., Gao, W., Zhao, Y., Roohani, Y., Leskovec, J., Coley, C., Xiao, C., Sun, J., & Zitnik, M. (2021).\n        Therapeutics Data Commons: Machine Learning Datasets and Tasks for Drug Discovery and Development.\n        Proceedings of Neural Information Processing Systems, NeurIPS Datasets and Benchmarks.\n\n\n    Parameters:\n        tdc_benchmark_names: This can be any subset of the benchmark names that make up the ADMET benchmarking group.\n            If `None`, uses the complete benchmarking group. 
For a full list of options, see\n            [the TDC website](https://tdcommons.ai/benchmark/admet_group/overview/) or use:\n\n           ```python\n           from tdc.utils import retrieve_benchmark_names\n           retrieve_benchmark_names(\"admet_group\")\n           ```\n        tdc_train_val_seed: TDC recommends a default splitting method for the train-val split. This parameter\n          is used to seed that splitting method.\n    \"\"\"\n\n    def __init__(\n        self,\n        # TDC-specific\n        tdc_benchmark_names: Optional[Union[str, List[str]]] = None,\n        tdc_train_val_seed: int = 0,\n        # Inherited arguments from superclass\n        processed_graph_data_path: Optional[Union[str, Path]] = None,\n        dataloading_from: str = \"ram\",\n        featurization: Optional[Union[Dict[str, Any], omegaconf.DictConfig]] = None,\n        batch_size_training: int = 16,\n        batch_size_inference: int = 16,\n        batch_size_per_pack: Optional[int] = None,\n        num_workers: int = 0,\n        pin_memory: bool = True,\n        persistent_workers: bool = False,\n        multiprocessing_context: Optional[str] = None,\n        featurization_n_jobs: int = -1,\n        featurization_progress: bool = False,\n        featurization_backend: str = \"loky\",\n        collate_fn: Optional[Callable] = None,\n        prepare_dict_or_graph: str = \"pyg:graph\",\n        **kwargs,\n    ):\n        try:\n            from tdc.benchmark_group import admet_group\n            from tdc.utils import retrieve_benchmark_names\n        except ImportError as error:\n            # To make sure we use the exact same train-test set and preprocessing as other benchmark entries,\n            # we rely on the PyTDC library.\n            raise RuntimeError(\n                f\"To use {self.__class__.__name__}, `PyTDC` needs to be installed. 
\"\n                f\"Please install it with `pip install PyTDC`\"\n            ) from error\n\n        # Pick a path to save the TDC data to\n        tdc_cache_dir = fs.get_cache_dir(\"tdc\")\n        tdc_cache_dir = fs.join(tdc_cache_dir, \"ADMET_Benchmark\")\n        fs.mkdir(tdc_cache_dir, exist_ok=True)\n\n        # Create the benchmark group object\n        # NOTE (cwognum): We redirect stderr and stdout to a file since TDC uses print statements,\n        #  which quickly pollute the logs. Ideally, we would use `redirect_stderr(None)`, but that breaks TQDM.\n        with tempfile.TemporaryFile(\"w\") as f:\n            with redirect_stderr(f):\n                with redirect_stdout(f):\n                    self.group = admet_group(path=tdc_cache_dir)\n\n        # By default, use all available benchmarks in a benchmark group\n        if tdc_benchmark_names is None:\n            tdc_benchmark_names = retrieve_benchmark_names(\"admet_group\")\n        if isinstance(tdc_benchmark_names, str):\n            tdc_benchmark_names = [tdc_benchmark_names]\n\n        # Create the task-specific arguments\n        logger.info(\n            f\"Preparing the TDC ADMET Benchmark Group splits for each of the {len(tdc_benchmark_names)} benchmarks.\"\n        )\n\n        task_specific_args = {\n            t: self._get_task_specific_arguments(t, tdc_train_val_seed, tdc_cache_dir)\n            for t in tdc_benchmark_names\n        }\n\n        super().__init__(\n            task_specific_args=task_specific_args,\n            featurization=featurization,\n            processed_graph_data_path=processed_graph_data_path,\n            dataloading_from=dataloading_from,\n            batch_size_training=batch_size_training,\n            batch_size_inference=batch_size_inference,\n            batch_size_per_pack=batch_size_per_pack,\n            num_workers=num_workers,\n            pin_memory=pin_memory,\n            persistent_workers=persistent_workers,\n            
multiprocessing_context=multiprocessing_context,\n            featurization_n_jobs=featurization_n_jobs,\n            featurization_progress=featurization_progress,\n            featurization_backend=featurization_backend,\n            collate_fn=collate_fn,\n            prepare_dict_or_graph=prepare_dict_or_graph,\n            **kwargs,\n        )\n\n    def _get_task_specific_arguments(self, name: str, seed: int, cache_dir: str) -> DatasetProcessingParams:\n        \"\"\"\n        Loads the data and split from the TDC benchmark group object.\n\n        For the train-test split, this is fixed.\n\n        For the train-val split, this does not have to be fixed. Here we use the default splitting method\n        from TDC to do the train-val split and allow the seed to be changed. This is likely to best match\n        other entries in the benchmarking group.\n        \"\"\"\n\n        benchmark = self.group.get(name)\n\n        # Get the default train-val-test split\n\n        # NOTE (cwognum): TDC prints by default to stderr, which pollutes the logs quite a bit.\n        #  This context manager mutes these by temporarily writing to a file.\n        #  Ideally, we would use `redirect_stderr(None)`, but that breaks TQDM.\n\n        with tempfile.TemporaryFile(\"w\") as f:\n            with redirect_stderr(f):\n                train, val = self.group.get_train_valid_split(seed, name)\n        test = benchmark[\"test\"]\n\n        # Convert to the Graphium format\n        n_val = len(val)\n        n_test = len(test)\n        n_train = len(train)\n        max_len = max(n_train, n_val, n_test)\n        total_len = n_train + n_val + n_test\n\n        data = pd.concat([train, val, test], ignore_index=True)\n\n        # NOTE (cwognum): We need to convert the labels to float, since we use NaNs down the line.\n        #  If you uncomment this line, collating the labels will raise an overflow by converting a NaN to the int dtype.\n        data[\"Y\"] = 
data[\"Y\"].astype(float)\n\n        split = pd.DataFrame(\n            {\n                \"train\": list(range(n_train)),\n                \"val\": list(range(n_train, n_train + n_val)) + [float(\"nan\")] * (max_len - n_val),\n                \"test\": list(range(n_train + n_val, total_len)) + [float(\"nan\")] * (max_len - n_test),\n            }\n        )\n        split_path = fs.join(cache_dir, f\"{name}_split.csv\")\n        split.to_csv(split_path, index=False)\n\n        return DatasetProcessingParams(\n            df=data,\n            idx_col=None,\n            smiles_col=\"Drug\",\n            label_cols=[\"Y\"],\n            splits_path=split_path,\n            split_names=[\"train\", \"val\", \"test\"],\n            task_level=\"graph\",\n        )\n\n\nclass FakeDataModule(MultitaskFromSmilesDataModule):\n    \"\"\"\n    A fake datamodule that generates artificial data by mimicking the true data coming\n    from the provided dataset.\n    It is useful to test the speed and performance of the model on a dataset without\n    having to featurize it and wait for the workers to load it.\n    \"\"\"\n\n    def __init__(\n        self,\n        task_specific_args: Dict[str, Dict[str, Any]],  # TODO: Replace this with DatasetParams\n        featurization: Optional[Union[Dict[str, Any], omegaconf.DictConfig]] = None,\n        batch_size_training: int = 16,\n        batch_size_inference: int = 16,\n        num_workers: int = 0,\n        pin_memory: bool = True,\n        persistent_workers: bool = False,\n        multiprocessing_context: Optional[str] = None,\n        collate_fn: Optional[Callable] = None,\n        prepare_dict_or_graph: str = \"pyg:graph\",\n        num_mols_to_generate: int = 1000000,\n        indexing_single_elem: bool = True,\n        **kwargs,\n    ):\n        super().__init__(\n            task_specific_args=task_specific_args,\n            featurization=featurization,\n            batch_size_training=batch_size_training,\n            
batch_size_inference=batch_size_inference,\n            num_workers=num_workers,\n            pin_memory=pin_memory,\n            persistent_workers=persistent_workers,\n            multiprocessing_context=multiprocessing_context,\n            collate_fn=collate_fn,\n            prepare_dict_or_graph=prepare_dict_or_graph,\n            **kwargs,\n        )\n        self.num_mols_to_generate = num_mols_to_generate\n        self.indexing_single_elem = indexing_single_elem\n\n    def generate_data(self, label_cols: List[str], smiles_col: str):\n        \"\"\"\n        Parameters:\n            labels_cols\n            smiles_col\n        Returns:\n            pd.DataFrame\n        \"\"\"\n        num_generated_mols = int(1)\n        # Create a dummy generated dataset - singel smiles string, duplicated N times\n        example_molecules = dict(\n            smiles=\"C1N2C3C4C5OC13C2C45\",\n            cxsmiles=\"[H]C1C2=C(NC(=O)[C@@]1([H])C1=C([H])C([H])=C(C([H])([H])[H])C([H])=C1[H])C([H])=C([H])N=C2[H] |(6.4528,-1.5789,-1.2859;5.789,-0.835,-0.8455;4.8499,-0.2104,-1.5946;3.9134,0.7241,-0.934;3.9796,1.1019,0.3172;5.0405,0.6404,1.1008;5.2985,1.1457,2.1772;5.9121,-0.5519,0.613;6.9467,-0.2303,0.8014;5.677,-1.7955,1.4745;4.7751,-2.7953,1.0929;4.2336,-2.7113,0.154;4.5521,-3.9001,1.914;3.8445,-4.6636,1.5979;5.215,-4.0391,3.1392;4.9919,-5.2514,4.0126;5.1819,-5.0262,5.0671;5.6619,-6.0746,3.7296;3.966,-5.6247,3.925;6.1051,-3.0257,3.52;6.6247,-3.101,4.4725;6.3372,-1.9217,2.7029;7.0168,-1.1395,3.0281;2.8586,1.2252,-1.7853;2.1303,1.9004,-1.3493;2.8118,0.8707,-3.0956;2.0282,1.2549,-3.7434;3.716,0.0207,-3.7371;4.6658,-0.476,-3.0127;5.3755,-1.1468,-3.5021)|\",\n        )\n        example_df_entry = {smiles_col: example_molecules[smiles_col]}\n        for label in label_cols:\n            example_df_entry[label] = np.random.random()\n        df = pd.DataFrame([example_df_entry])\n        logger.info(f\"Generating fake dataset on host... 
\\n Generating {num_generated_mols} rows in the df.\")\n        df = pd.concat([df] * num_generated_mols, ignore_index=True)\n        return df\n\n    def prepare_data(self):\n        \"\"\"Called only from a single process in distributed settings. Steps:\n\n        - If each cache is set and exists, reload from cache and return. Otherwise,\n        - For each single-task dataset:\n            - Load its dataframe from a path (if provided)\n            - Subsample the dataframe\n            - Extract the smiles, labels from the dataframe\n        - In the previous step, we were also able to get the unique smiles, which we use to compute the features\n        - For each single-task dataframe and associated data (smiles, labels, etc.):\n            - Filter out the data corresponding to molecules which failed featurization.\n            - Create a corresponding SingletaskDataset\n            - Split the SingletaskDataset according to the task-specific splits for train, val and test\n        \"\"\"\n\n        \"\"\"Load all single-task dataframes.\"\"\"\n        if self.num_mols_to_generate is None:\n            num_mols = 0\n\n        task_df = {}\n        for task, args in self.task_dataset_processing_params.items():\n            logger.info(f\"Reading data for task '{task}'\")\n            if args.df is None:\n                # Only load the useful columns, as some datasets can be very large when loading all columns.\n                label_cols = self._parse_label_cols(\n                    df=None, df_path=args.df_path, label_cols=args.label_cols, smiles_col=args.smiles_col\n                )\n                task_df[task] = self.generate_data(label_cols=args.label_cols, smiles_col=args.smiles_col)\n                if self.num_mols_to_generate is None:\n                    num_mols = max(num_mols, len(task_df[task]))\n            task_df[task] = task_df[task].iloc[0:1]\n\n            args.label_cols = label_cols\n        if self.num_mols_to_generate is None:\n     
       self.num_mols_to_generate = num_mols\n        logger.info(\"Done reading datasets\")\n\n        \"\"\"Subsample the data frames and extract the necessary data to create SingleTaskDatasets for each task (smiles, labels, extras).\"\"\"\n        task_dataset_args = {}\n        for task in task_df.keys():\n            task_dataset_args[task] = {}\n\n        for task, df in task_df.items():\n            logger.info(f\"Prepare single-task dataset for task '{task}' with {len(df)} data points.\")\n            # Extract smiles, labels, extras\n            args = self.task_dataset_processing_params[task]\n            smiles, labels, sample_idx, extras = self._extract_smiles_labels(\n                df,\n                task_level=args.task_level,\n                smiles_col=args.smiles_col,\n                label_cols=args.label_cols,\n                idx_col=args.idx_col,\n                weights_col=args.weights_col,\n                weights_type=args.weights_type,\n            )\n\n            # Store the relevant information for each task's dataset\n            task_dataset_args[task][\"smiles\"] = smiles\n            task_dataset_args[task][\"labels\"] = labels\n            task_dataset_args[task][\"sample_idx\"] = sample_idx\n            task_dataset_args[task][\"extras\"] = extras\n\n        \"\"\"Convert SMILES to features (graphs, fingerprints, etc.) 
for the unique molecules found.\"\"\"\n        all_smiles = []\n        idx_per_task = {}\n        total_len = 0\n        for task, dataset_args in task_dataset_args.items():\n            all_smiles.extend(dataset_args[\"smiles\"])\n            num_smiles = len(dataset_args[\"smiles\"])\n            idx_per_task[task] = (total_len, total_len + num_smiles)\n            total_len += num_smiles\n        # Get all unique mol ids\n        all_unique_mol_ids = smiles_to_unique_mol_ids(\n            all_smiles,\n            n_jobs=self.featurization_n_jobs,\n            featurization_batch_size=self.featurization_batch_size,\n            backend=self.featurization_backend,\n        )\n        # Convert SMILES to features, then assign each task its slice of the features,\n        # matching the per-task index ranges computed above\n        features, _ = self._featurize_molecules(all_smiles)\n        for task in task_dataset_args.keys():\n            task_dataset_args[task][\"features\"] = features[idx_per_task[task][0] : idx_per_task[task][1]]\n        \"\"\"Filter data based on molecules which failed featurization. Create single task datasets as well.\"\"\"\n        self.single_task_datasets = {}\n        for task, args in task_dataset_args.items():\n            self.single_task_datasets[task] = Datasets.SingleTaskDataset(\n                features=task_dataset_args[task][\"features\"],\n                labels=task_dataset_args[task][\"labels\"],\n                smiles=task_dataset_args[task][\"smiles\"],\n                indices=task_dataset_args[task][\"sample_idx\"],\n                unique_ids=all_unique_mol_ids[idx_per_task[task][0] : idx_per_task[task][1]],\n                **task_dataset_args[task][\"extras\"],\n            )\n\n        \"\"\"We split the data up to create train, val and test datasets\"\"\"\n        self.train_singletask_datasets = {}\n        self.val_singletask_datasets = {}\n        self.test_singletask_datasets = {}\n        for task, df in task_df.items():\n            self.train_singletask_datasets[task] = Subset(self.single_task_datasets[task], [0])\n            self.val_singletask_datasets[task] = Subset(self.single_task_datasets[task], [0])\n  
         self.test_singletask_datasets[task] = Subset(self.single_task_datasets[task], [0])\n\n    def setup(self, stage=None):\n        # TODO\n        \"\"\"\n        Prepare the torch dataset. Called on every GPU. Setting state here is ok.\n        Parameters:\n            stage (str): Either 'fit', 'test', or None.\n        \"\"\"\n        labels_size = {}\n\n        if stage == \"fit\" or stage is None:\n            self.train_ds = Datasets.FakeDataset(self.train_singletask_datasets, num_mols=self.num_mols_to_generate, indexing_same_elem=self.indexing_single_elem)  # type: ignore\n            self.val_ds = Datasets.FakeDataset(self.val_singletask_datasets, num_mols=self.num_mols_to_generate, indexing_same_elem=self.indexing_single_elem)  # type: ignore\n            print(self.train_ds)\n            print(self.val_ds)\n\n            labels_size.update(\n                self.train_ds.labels_size\n            )  # Make sure that all task label sizes are contained in here. Maybe do the update outside these if statements.\n            labels_size.update(self.val_ds.labels_size)\n\n        if stage == \"test\" or stage is None:\n            self.test_ds = Datasets.FakeDataset(self.test_singletask_datasets, num_mols=self.num_mols_to_generate, indexing_same_elem=self.indexing_single_elem)  # type: ignore\n            print(self.test_ds)\n            labels_size.update(self.test_ds.labels_size)\n\n        default_labels_size_dict = self.collate_fn.keywords.get(\"labels_size_dict\", None)\n\n        if default_labels_size_dict is None:\n            self.collate_fn.keywords[\"labels_size_dict\"] = labels_size\n\n    def get_fake_graph(self):\n        \"\"\"\n        Low memory footprint method to get the first datapoint graph.\n        The first 20 rows of the data are read in case the first one has a featurization\n        error. 
If all of the first 20 elements fail, then `None` is returned; otherwise, the first\n        graph that does not fail is returned.\n        \"\"\"\n        keys = list(self.task_dataset_processing_params.keys())\n        task = keys[0]\n        args = self.task_dataset_processing_params[task]\n        if args.df is None:\n            df = self._read_csv(args.df_path, nrows=20)\n        else:\n            df = args.df.iloc[0:20, :]\n\n        label_cols = self._parse_label_cols(\n            df, df_path=None, label_cols=args.label_cols, smiles_col=args.smiles_col\n        )\n\n        smiles, labels, sample_idx, extras = self._extract_smiles_labels(\n            df,\n            task_level=args.task_level,\n            smiles_col=args.smiles_col,\n            label_cols=label_cols,\n            idx_col=args.idx_col,\n            weights_col=args.weights_col,\n            weights_type=args.weights_type,\n        )\n\n        graph = None\n        for s in smiles:\n            graph = self.smiles_transformer(s, mask_nan=0.0)\n            # Check for `None` before accessing the graph's attributes\n            if (graph is not None) and (graph.num_edges > 0) and (graph.num_nodes > 0):\n                break\n        return graph\n"
  },
  {
    "path": "graphium/data/dataset.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nimport os\nfrom copy import deepcopy\nfrom functools import lru_cache\nfrom multiprocessing import Manager\nfrom typing import Any, Dict, List, Optional, Tuple, Union\n\nimport fsspec\nimport numpy as np\nimport torch\nfrom datamol import parallelized, parallelized_with_batches\nfrom loguru import logger\nfrom torch.utils.data.dataloader import Dataset\nfrom torch_geometric.data import Batch, Data\n\nfrom graphium.data.smiles_transform import smiles_to_unique_mol_ids\nfrom graphium.features import GraphDict\n\n\nclass SingleTaskDataset(Dataset):\n    def __init__(\n        self,\n        labels: List[Union[torch.Tensor, np.ndarray]],\n        features: Optional[List[Union[Data, \"GraphDict\"]]] = None,\n        smiles: Optional[List[str]] = None,\n        indices: Optional[List[int]] = None,\n        weights: Optional[Union[torch.Tensor, np.ndarray]] = None,\n        unique_ids: Optional[List[str]] = None,\n        mol_ids: Optional[List[str]] = None,\n    ):\n        r\"\"\"\n        dataset for a single task\n        Parameters:\n            labels: A list of labels for the given task (one per graph)\n            features: A list of graphs\n            smiles: A list of smiles\n            indices: A list of indices\n            weights: A list of weights\n            unique_ids: A list of unique ids for 
each molecule generated from `datamol.unique_id`\n            mol_ids: A list of ids coming from the original dataset. Useful to identify the molecule in the original dataset.\n        \"\"\"\n\n        # Verify that all lists are the same length\n        numel = len(labels)\n\n        def _check_if_same_length(to_check, label):\n            \"\"\"Simple utility method to throw an error if the length is not as expected.\"\"\"\n            if to_check is not None and len(to_check) != numel:\n                raise ValueError(\n                    f\"{label} must be the same length as `labels`, got {len(to_check)} and {numel}\"\n                )\n\n        _check_if_same_length(features, \"features\")\n        _check_if_same_length(indices, \"indices\")\n        _check_if_same_length(weights, \"weights\")\n        _check_if_same_length(unique_ids, \"unique_ids\")\n        _check_if_same_length(mol_ids, \"mol_ids\")\n\n        self.labels = labels\n        if smiles is not None:\n            manager = Manager()  # Avoid memory leaks with `num_workers > 0` by using the Manager\n            self.smiles = manager.list(smiles)\n        else:\n            self.smiles = None\n        self.features = features\n        self.indices = indices\n        if self.indices is not None:\n            self.indices = np.array(\n                self.indices\n            )  # Avoid memory leaks with `num_workers > 0` by using numpy array\n        self.weights = weights\n        self.unique_ids = unique_ids\n        self.mol_ids = mol_ids\n\n    def __len__(self):\n        r\"\"\"\n        return the size of the dataset\n        Returns:\n            size: the size of the dataset\n        \"\"\"\n        return len(self.labels)\n\n    def __getitem__(self, idx):\n        \"\"\"\n        get the data at the given index\n        Parameters:\n            idx: the index to get the data at\n        Returns:\n            datum: a dictionary containing the data at the given index, with keys 
\"features\", \"labels\", \"smiles\", \"indices\", \"weights\", \"unique_ids\"\n        \"\"\"\n        datum = {}\n\n        if self.features is not None:\n            datum[\"features\"] = self.features[idx]\n\n        if self.labels is not None:\n            datum[\"labels\"] = self.labels[idx]\n\n        if self.smiles is not None:\n            datum[\"smiles\"] = self.smiles[idx]\n\n        if self.indices is not None:\n            datum[\"indices\"] = self.indices[idx]\n\n        if self.weights is not None:\n            datum[\"weights\"] = self.weights[idx]\n\n        if self.unique_ids is not None:\n            datum[\"unique_ids\"] = self.unique_ids[idx]\n\n        if self.mol_ids is not None:\n            datum[\"mol_ids\"] = self.mol_ids[idx]\n\n        return datum\n\n    def __getstate__(self):\n        \"\"\"Serialize the class for pickling.\"\"\"\n        state = {}\n        state[\"labels\"] = self.labels\n        state[\"smiles\"] = list(self.smiles) if self.smiles is not None else None\n        state[\"features\"] = self.features\n        state[\"indices\"] = self.indices\n        state[\"weights\"] = self.weights\n        state[\"unique_ids\"] = self.unique_ids\n        state[\"mol_ids\"] = self.mol_ids\n        return state\n\n    def __setstate__(self, state: dict):\n        \"\"\"Reload the class from pickling.\"\"\"\n        if state[\"smiles\"] is not None:\n            manager = Manager()\n            state[\"smiles\"] = manager.list(state[\"smiles\"])\n\n        self.__dict__.update(state)\n\n\nclass MultitaskDataset(Dataset):\n    def __init__(\n        self,\n        datasets: Dict[str, SingleTaskDataset],\n        n_jobs=-1,\n        backend: str = \"loky\",\n        featurization_batch_size=1000,\n        progress: bool = True,\n        save_smiles_and_ids: bool = False,\n        about: str = \"\",\n        data_path: Optional[Union[str, os.PathLike]] = None,\n        dataloading_from: str = \"ram\",\n        
data_is_cached: bool = False,\n    ):\n        r\"\"\"\n        This class holds the information for the multitask dataset.\n        Several single-task datasets can be merged to create a multi-task dataset. After merging the dictionary of single-task datasets,\n        we will have a multitask dataset of the following form:\n        - self.mol_ids will be a list to contain the unique molecular IDs to identify the molecules\n        - self.smiles will be a list to contain the corresponding smiles for that molecular ID across all single-task datasets\n        - self.labels will be a list of dictionaries where the key is the task name and the value is the label(s) for that task.\n            At this point, any particular molecule will only have entries for tasks for which it has a label. Later, in the collate\n            function, we fill up the missing task labels with NaNs.\n        - self.features will be a list of featurized graphs corresponding to that particular unique molecule.\n            However, for testing purposes we may not require features so that we can make sure that this merge function works.\n\n        Parameters:\n            datasets: A dictionary of single-task datasets\n            n_jobs: Number of jobs to run in parallel\n            backend: Parallelization backend\n            featurization_batch_size: The batch size to use for the parallelization of the featurization\n            progress: Whether to display the progress bar\n            save_smiles_and_ids: Whether to save the smiles and ids for the dataset. 
If `False`, `mol_ids` and `smiles` are set to `None`\n            about: A description of the dataset\n            data_path: The location of the data if saved on disk\n            dataloading_from: Whether to load the data from `\"disk\"` or `\"ram\"`\n            data_is_cached: Whether the data is already cached on `\"disk\"`\n        \"\"\"\n        super().__init__()\n        self.n_jobs = n_jobs\n        self.backend = backend\n        self.featurization_batch_size = featurization_batch_size\n        self.progress = progress\n        self.about = about\n        self.save_smiles_and_ids = save_smiles_and_ids\n        self.data_path = data_path\n        self.dataloading_from = dataloading_from\n\n        logger.info(f\"Dataloading from {dataloading_from.upper()}\")\n\n        if data_is_cached:\n            self._load_metadata()\n\n            if dataloading_from == \"disk\":\n                self.features = None\n                self.labels = None\n            elif dataloading_from == \"ram\":\n                logger.info(f\"Transferring {about} from DISK to RAM...\")\n                self.transfer_from_disk_to_ram()\n\n        else:\n            task = next(iter(datasets))\n            self.features = None\n            if (len(datasets[task]) > 0) and (\"features\" in datasets[task][0]):\n                self.mol_ids, self.smiles, self.labels, self.features = self.merge(datasets)\n            else:\n                self.mol_ids, self.smiles, self.labels = self.merge(datasets)\n            # Set mol_ids and smiles to None to save memory as they are not needed.\n            if not save_smiles_and_ids:\n                self.mol_ids = None\n                self.smiles = None\n            self.labels_size = self.set_label_size_dict(datasets)\n            self.labels_dtype = self.set_label_dtype_dict(datasets)\n            self.dataset_length = len(self.labels)\n            self._num_nodes_list = None\n            self._num_edges_list = None\n            if 
self.features is not None:\n                self._num_nodes_list = get_num_nodes_per_graph(self.features)\n                self._num_edges_list = get_num_edges_per_graph(self.features)\n\n    def transfer_from_disk_to_ram(self, parallel_with_batches: bool = False):\n        \"\"\"\n        Function parallelizing transfer from DISK to RAM\n        \"\"\"\n\n        def transfer_mol_from_disk_to_ram(idx):\n            \"\"\"\n            Function transferring single mol from DISK to RAM\n            \"\"\"\n            data_dict = self.load_graph_from_index(idx)\n            mol_in_ram = {\n                \"features\": data_dict[\"graph_with_features\"],\n                \"labels\": data_dict[\"labels\"],\n            }\n\n            return mol_in_ram\n\n        if parallel_with_batches and self.featurization_batch_size:\n            data_in_ram = parallelized_with_batches(\n                transfer_mol_from_disk_to_ram,\n                range(self.dataset_length),\n                batch_size=self.featurization_batch_size,\n                n_jobs=0,\n                backend=self.backend,\n                progress=self.progress,\n                tqdm_kwargs={\"desc\": \"Transfer from DISK to RAM\"},\n            )\n        else:\n            data_in_ram = parallelized(\n                transfer_mol_from_disk_to_ram,\n                range(self.dataset_length),\n                n_jobs=0,\n                backend=self.backend,\n                progress=self.progress,\n                tqdm_kwargs={\"desc\": \"Transfer from DISK to RAM\"},\n            )\n\n        self.features = [sample[\"features\"] for sample in data_in_ram]\n        self.labels = [sample[\"labels\"] for sample in data_in_ram]\n\n    def save_metadata(self, directory: str):\n        \"\"\"\n        Save everything other than features/labels\n        \"\"\"\n        attrs_to_save = [\n            \"mol_ids\",\n            \"smiles\",\n            \"labels_size\",\n            \"labels_dtype\",\n      
      \"dataset_length\",\n            \"_num_nodes_list\",\n            \"_num_edges_list\",\n        ]\n        attrs = {attr: getattr(self, attr) for attr in attrs_to_save}\n\n        path = os.path.join(directory, \"multitask_metadata.pkl\")\n\n        torch.save(attrs, path, pickle_protocol=4)\n\n    def _load_metadata(self):\n        \"\"\"\n        Load everything other than features/labels\n        \"\"\"\n        attrs_to_load = [\n            \"mol_ids\",\n            \"smiles\",\n            \"labels_size\",\n            \"labels_dtype\",\n            \"dataset_length\",\n            \"_num_nodes_list\",\n            \"_num_edges_list\",\n        ]\n        path = os.path.join(self.data_path, \"multitask_metadata.pkl\")\n        with fsspec.open(path, \"rb\") as f:\n            # Load from the open file handle, not the raw path, so remote fsspec filesystems work\n            attrs = torch.load(f)\n\n        if not set(attrs_to_load).issubset(set(attrs.keys())):\n            raise ValueError(\n                f\"The metadata in the cache at {self.data_path} does not contain the right information. \"\n                f\"This may be because the cache was prepared using an earlier version of Graphium. \"\n                f\"You can try deleting the cache and running the data preparation again. \"\n                f\"\\nMetadata keys found: {attrs.keys()}\"\n                f\"\\nMetadata keys required: {attrs_to_load}\"\n            )\n\n        for attr, value in attrs.items():\n            setattr(self, attr, value)\n\n        if self.save_smiles_and_ids:\n            if self.smiles is None or self.mol_ids is None:\n                logger.warning(\n                    f\"Argument `save_smiles_and_ids` is set to {self.save_smiles_and_ids} but metadata in the cache at {self.data_path} does not contain smiles and mol_ids. \"\n                    f\"This may be because `Datamodule.prepare_data(save_smiles_and_ids=False)` was run followed by `Datamodule.setup(save_smiles_and_ids=True)`. 
\"\n                    f\"When loading from cached files, the `save_smiles_and_ids` argument of `Datamodule.setup()` is superseded by the value passed to `Datamodule.prepare_data()`. \"\n                )\n\n    def __len__(self):\n        r\"\"\"\n        Returns the number of molecules\n        \"\"\"\n        return self.dataset_length\n\n    @property\n    def num_nodes_list(self):\n        \"\"\"\n        The number of nodes per graph\n        \"\"\"\n        if self._num_nodes_list is None:\n            if len(self) == 0:\n                self._num_nodes_list = []\n            else:\n                self._num_nodes_list = get_num_nodes_per_graph(self.features)\n        return self._num_nodes_list\n\n    @property\n    def num_edges_list(self):\n        \"\"\"\n        The number of edges per graph\n        \"\"\"\n        if self._num_edges_list is None:\n            if len(self) == 0:\n                self._num_edges_list = []\n            else:\n                self._num_edges_list = get_num_edges_per_graph(self.features)\n        return self._num_edges_list\n\n    @property\n    def num_graphs_total(self):\n        r\"\"\"\n        The number of graphs (molecules) in the dataset\n        \"\"\"\n        return len(self)\n\n    @property\n    def num_nodes_total(self):\n        \"\"\"Total number of nodes for all graphs\"\"\"\n        if len(self) == 0:\n            return\n        return sum(self.num_nodes_list)\n\n    @property\n    def max_num_nodes_per_graph(self):\n        \"\"\"Maximum number of nodes per graph\"\"\"\n        if len(self) == 0:\n            return\n        return max(self.num_nodes_list)\n\n    @property\n    def std_num_nodes_per_graph(self):\n        \"\"\"Standard deviation of number of nodes per graph\"\"\"\n        if len(self) == 0:\n            return\n        return np.std(self.num_nodes_list)\n\n    @property\n    def min_num_nodes_per_graph(self):\n        \"\"\"Minimum number of nodes per graph\"\"\"\n        if len(self) == 0:\n            return\n        return min(self.num_nodes_list)\n\n    @property\n    def mean_num_nodes_per_graph(self):\n        \"\"\"Average number of nodes per graph\"\"\"\n        if len(self) == 0:\n            return\n        return self.num_nodes_total / self.num_graphs_total\n\n    @property\n    def num_edges_total(self):\n        \"\"\"Total number of edges for all graphs\"\"\"\n        if len(self) == 0:\n            return\n        return sum(self.num_edges_list)\n\n    @property\n    def max_num_edges_per_graph(self):\n        \"\"\"Maximum number of edges per graph\"\"\"\n        if len(self) == 0:\n            return\n        return max(self.num_edges_list)\n\n    @property\n    def min_num_edges_per_graph(self):\n        \"\"\"Minimum number of edges per graph\"\"\"\n        if len(self) == 0:\n            return\n        return min(self.num_edges_list)\n\n    @property\n    def std_num_edges_per_graph(self):\n        \"\"\"Standard deviation of number of edges per graph\"\"\"\n        if len(self) == 0:\n            return\n        return np.std(self.num_edges_list)\n\n    @property\n    def mean_num_edges_per_graph(self):\n        \"\"\"Average number of edges per graph\"\"\"\n        if len(self) == 0:\n            return\n        return self.num_edges_total / self.num_graphs_total\n\n    def __getitem__(self, idx):\n        r\"\"\"\n        Get the data at the specified index\n        Parameters:\n            idx: The index of the data to retrieve\n        Returns:\n            A dictionary containing the data for the specified index with keys \"mol_ids\", \"smiles\", \"labels\", and \"features\"\n        \"\"\"\n        datum = {}\n        if self.dataloading_from == \"disk\":\n            data_dict = self.load_graph_from_index(idx)\n            datum[\"features\"] = data_dict[\"graph_with_features\"]\n            datum[\"labels\"] = data_dict[\"labels\"]\n            if \"smiles\" in data_dict.keys():\n                datum[\"smiles\"] = 
data_dict[\"smiles\"]\n        else:\n            if self.mol_ids is not None:\n                datum[\"mol_ids\"] = self.mol_ids[idx]\n\n            if self.smiles is not None:\n                datum[\"smiles\"] = self.smiles[idx]\n\n            if self.labels is not None:\n                datum[\"labels\"] = self.labels[idx]\n\n            if self.features is not None:\n                datum[\"features\"] = self.features[idx]\n\n        return datum\n\n    def load_graph_from_index(self, data_idx):\n        r\"\"\"\n        load the graph (in pickle file) from the disk\n        Parameters:\n            data_idx: The index of the data to retrieve\n        Returns:\n            A dictionary containing the data for the specified index with keys \"graph_with_features\", \"labels\" and \"smiles\" (optional).\n        \"\"\"\n        filename = os.path.join(\n            self.data_path, format(data_idx // 1000, \"04d\"), format(data_idx, \"07d\") + \".pkl\"\n        )\n        with fsspec.open(filename, \"rb\") as f:\n            data_dict = torch.load(f)\n        return data_dict\n\n    def merge(\n        self, datasets: Dict[str, SingleTaskDataset]\n    ) -> Tuple[List[str], List[str], List[Dict[str, Any]], List[Any]]:\n        r\"\"\"This function merges several single task datasets into a multitask dataset.\n\n        The idea: for each of the smiles, labels, features and tasks, we create a corresponding list that concatenates these items across all tasks.\n        In particular, for any index, the elements in the smiles, labels, features and task lists at that index will correspond to each other (i.e. 
match up).\n        Over this list of all smiles (which we created by concatenating the smiles across all tasks), we compute their molecular ID using functions from Datamol.\n        Once again, we will have a list of molecular IDs which is the same size as the list of smiles, labels, features and tasks.\n        We then use numpy's `unique` function to find the exact list of unique molecular IDs, as these identify the molecules in our dataset. We also get the\n        inverse mapping from numpy's `unique`, which lets us map each entry of the concatenated smiles, labels, features and task lists back to its unique molecule.\n        Finally, we use this inverse to construct the list of lists of smiles, the list of label dictionaries (indexed by task) and the list of features such that\n        the indices match up. This is what the `__getitem__` method needs in order to work.\n\n        Parameters:\n            datasets: A dictionary of single-task datasets\n        Returns:\n            A tuple of (list of molecular IDs, list of smiles, list of label dictionaries, list of features)\n        \"\"\"\n\n        # Get all the smiles, labels, features and tasks.\n        all_lists = self._get_all_lists_ids(datasets=datasets)\n        mol_ids, inv = self._get_inv_of_mol_ids(all_mol_ids=all_lists[\"mol_ids\"])\n\n        # Store the smiles.\n        smiles = [[] for _ in range(len(mol_ids))]\n        for all_idx, unique_idx in enumerate(inv):\n            smiles[unique_idx].append(all_lists[\"smiles\"][all_idx])\n\n        # Store the labels.\n        labels = [Data() for _ in range(len(mol_ids))]\n        for all_idx, unique_idx in enumerate(inv):\n            task: str = all_lists[\"tasks\"][all_idx]\n            label = all_lists[\"labels\"][all_idx]\n            labels[unique_idx][task] = label\n\n            if all_idx < len(all_lists[\"features\"]):\n                features = all_lists[\"features\"][all_idx]\n                labels[unique_idx][\"x\"] = torch.empty(\n 
                   (features.num_nodes, 1)\n                )  # IPU is not happy with zero-sized tensors, so use shape (features.num_nodes, 1) here\n                labels[unique_idx][\"edge_index\"] = torch.empty((2, features.num_edges))\n\n        # Store the features\n        if len(all_lists[\"features\"]) > 0:\n            features = [-1 for i in range(len(mol_ids))]\n            for all_idx, unique_idx in enumerate(inv):\n                features[unique_idx] = all_lists[\"features\"][all_idx]\n            return mol_ids, smiles, labels, features\n        else:\n            return mol_ids, smiles, labels\n\n    def _get_all_lists_ids(self, datasets: Dict[str, SingleTaskDataset]) -> Dict[str, Any]:\n        all_smiles = []\n        all_features = []\n        all_labels = []\n        all_mol_ids = []\n        all_tasks = []\n\n        for task, ds in datasets.items():\n            if len(ds) == 0:\n                continue\n            # Get data from single task dataset\n            ds_smiles = [ds[i][\"smiles\"] for i in range(len(ds))]\n            ds_labels = [ds[i][\"labels\"] for i in range(len(ds))]\n            if \"unique_ids\" in ds[0].keys():\n                ds_mol_ids = [ds[i][\"unique_ids\"] for i in range(len(ds))]\n            else:\n                ds_mol_ids = smiles_to_unique_mol_ids(\n                    ds_smiles,\n                    n_jobs=self.n_jobs,\n                    featurization_batch_size=self.featurization_batch_size,\n                    backend=self.backend,\n                    progress=self.progress,\n                    progress_desc=f\"{task}: mol to ids\",\n                )\n            if \"features\" in ds[0]:\n                ds_features = [ds[i][\"features\"] for i in range(len(ds))]\n            else:\n                ds_features = None\n            all_smiles.extend(ds_smiles)\n            all_labels.extend(ds_labels)\n            all_mol_ids.extend(ds_mol_ids)\n            if ds_features is not None:\n             
   all_features.extend(ds_features)\n\n            task_list = [task] * len(ds)\n            all_tasks.extend(task_list)\n\n        all_lists = {\n            \"smiles\": all_smiles,\n            \"features\": all_features,\n            \"labels\": all_labels,\n            \"mol_ids\": all_mol_ids,\n            \"tasks\": all_tasks,\n        }\n\n        return all_lists\n\n    def _get_inv_of_mol_ids(self, all_mol_ids):\n        mol_ids, inv = np.unique(all_mol_ids, return_inverse=True)\n        return mol_ids, inv\n\n    def _find_valid_label(self, task, ds):\n        r\"\"\"\n        For a given dataset, find a genuine label for that dataset\n        \"\"\"\n        valid_label = None\n        for i in range(len(ds)):\n            if ds[i] is not None:\n                valid_label = ds[i][\"labels\"]\n                break\n\n        if valid_label is None:\n            raise ValueError(f\"Dataset for task {task} has no valid labels.\")\n\n        return valid_label\n\n    def set_label_size_dict(self, datasets: Dict[str, SingleTaskDataset]):\n        r\"\"\"\n        This gives the number of labels to predict for a given task.\n        \"\"\"\n        task_labels_size = {}\n        for task, ds in datasets.items():\n            if len(ds) == 0:\n                continue\n\n            valid_label = self._find_valid_label(task, ds)\n\n            # Assume for a fixed task, the label dimension is the same across data points\n            torch_label = torch.as_tensor(valid_label)\n\n            # First dimension is graph-specific\n            task_labels_size[task] = torch_label.size()\n        return task_labels_size\n\n    def set_label_dtype_dict(self, datasets: Dict[str, SingleTaskDataset]):\n        r\"\"\"\n        Gets correct dtype for a given label\n        \"\"\"\n        task_labels_dtype = {}\n        for task, ds in datasets.items():\n            if len(ds) == 0:\n                continue\n\n            valid_label = self._find_valid_label(task, 
ds)\n\n            torch_label = torch.as_tensor(valid_label)\n            task_labels_dtype[task] = torch_label.dtype\n        return task_labels_dtype\n\n    def __repr__(self) -> str:\n        \"\"\"\n        summarizes the dataset in a string\n        Returns:\n            A string representation of the dataset.\n        \"\"\"\n        if len(self) == 0:\n            out_str = (\n                f\"-------------------\\n{self.__class__.__name__}\\n\"\n                + f\"\\tabout = {self.about}\\n\"\n                + f\"\\tnum_graphs_total = {self.num_graphs_total}\\n\"\n                + f\"-------------------\\n\"\n            )\n            return out_str\n\n        # Faster to compute the statistics if we unbatch first.\n        features = self.features\n        if isinstance(self.features, Batch):\n            self.features = self.features.to_data_list()\n\n        out_str = (\n            f\"-------------------\\n{self.__class__.__name__}\\n\"\n            + f\"\\tabout = {self.about}\\n\"\n            + f\"\\tnum_graphs_total = {self.num_graphs_total}\\n\"\n            + f\"\\tnum_nodes_total = {self.num_nodes_total}\\n\"\n            + f\"\\tmax_num_nodes_per_graph = {self.max_num_nodes_per_graph}\\n\"\n            + f\"\\tmin_num_nodes_per_graph = {self.min_num_nodes_per_graph}\\n\"\n            + f\"\\tstd_num_nodes_per_graph = {self.std_num_nodes_per_graph}\\n\"\n            + f\"\\tmean_num_nodes_per_graph = {self.mean_num_nodes_per_graph}\\n\"\n            + f\"\\tnum_edges_total = {self.num_edges_total}\\n\"\n            + f\"\\tmax_num_edges_per_graph = {self.max_num_edges_per_graph}\\n\"\n            + f\"\\tmin_num_edges_per_graph = {self.min_num_edges_per_graph}\\n\"\n            + f\"\\tstd_num_edges_per_graph = {self.std_num_edges_per_graph}\\n\"\n            + f\"\\tmean_num_edges_per_graph = {self.mean_num_edges_per_graph}\\n\"\n            + f\"-------------------\\n\"\n        )\n\n        # Restore the original features.\n        
self.features = features\n\n        return out_str\n\n\nclass FakeDataset(MultitaskDataset):\n    \"\"\"\n    A dataset to hold the fake data.\n    \"\"\"\n\n    def __init__(\n        self, datasets: Dict[str, SingleTaskDataset], num_mols: int = 1234, indexing_same_elem: bool = False\n    ):\n        \"\"\"\n        Parameters:\n            datasets:\n                A dictionary of datasets. The keys are the task names and the values are the datasets.\n            num_mols:\n                The number of molecules to generate. In reality, it is the same molecule,\n                but `num_mols` will change the length of the dataset.\n            indexing_same_elem:\n                If True, the same molecule is used for all samples.\n                Otherwise, a deepcopied molecule is used for each sample.\n        \"\"\"\n        self.indexing_same_elem = indexing_same_elem\n        self.num_mols = num_mols\n        self.num_datasets = len(datasets)\n\n        self.about = \"FakeDatasets\"\n        task = next(iter(datasets))\n        if \"features\" in datasets[task][0]:\n            self.mol_ids, self.smiles, self.labels, self.features = self.merge(datasets)\n            if self.indexing_same_elem is False:\n                self.mol_ids, self.smiles, self.labels, self.features = self.deepcopy_mol(\n                    self.mol_ids, self.smiles, self.labels, self.features\n                )\n        else:\n            self.mol_ids, self.smiles, self.labels = self.merge(datasets)\n            if self.indexing_same_elem is False:\n                self.mol_ids, self.smiles, self.labels, _ = self.deepcopy_mol(\n                    self.mol_ids, self.smiles, self.labels\n                )\n            # No features were computed; keep the attribute defined so `__getitem__` can check it\n            self.features = None\n\n        self.labels_size = self.set_label_size_dict(datasets)\n        self.labels_dtype = self.set_label_dtype_dict(datasets)\n\n    def _get_inv_of_mol_ids(self, all_mol_ids):\n        # The generated data is a single molecule 
duplicated\n        mol_ids = np.array(all_mol_ids)\n        inv = list(range(len(mol_ids) // self.num_datasets)) * self.num_datasets\n        mol_ids = np.unique(inv)\n        return mol_ids, inv\n\n    def deepcopy_mol(self, mol_ids, smiles, labels, features=None):\n        \"\"\"\n        Create a deepcopy of the single molecule num_mols times\n\n        Args:\n            mol_ids (array): The single value for the mol ID\n            smiles (List[List[str]]): List of lists containing the SMILES string\n            labels (List[Dict]): List containing one dict with the label name-value pairs\n            features (List[Data], optional): List containing a Data object. Defaults to None.\n\n        Returns:\n            The deep copy of the inputs\n        \"\"\"\n        logger.info(\"Duplicating the single dataset element...\")\n        mol_ids = [deepcopy(mol_ids[0]) for _ in range(self.num_mols)]\n        logger.info(\"Finished `mol_ids`\")\n        smiles = [deepcopy(smiles[0]) for _ in range(self.num_mols)]\n        logger.info(\"Finished `smiles`\")\n        labels = [deepcopy(labels[0]) for _ in range(self.num_mols)]\n        logger.info(\"Finished `labels`\")\n        if features is not None:\n            features = [deepcopy(features[0]) for _ in range(self.num_mols)]\n            logger.info(\"Finished `features`\")\n        return mol_ids, smiles, labels, features\n\n    def __len__(self):\n        r\"\"\"\n        Returns the number of molecules\n        \"\"\"\n        return self.num_mols\n\n    def __getitem__(self, idx):\n        r\"\"\"\n        Get the data at the specified index\n        Parameters:\n            idx: The index of the data to retrieve\n        Returns:\n            A dictionary containing the data for the specified index with keys \"labels\" and \"features\"\n        \"\"\"\n        datum = {}\n        if self.indexing_same_elem is True:\n            # If using a single memory location override the idx 
value passed\n            idx = 0\n        if self.labels is not None:\n            datum[\"labels\"] = self.labels[idx]\n\n        if self.features is not None:\n            datum[\"features\"] = self.features[idx]\n\n        return datum\n\n\ndef get_num_nodes_per_graph(graphs):\n    r\"\"\"\n    number of nodes per graph\n    \"\"\"\n    if isinstance(graphs, Batch):\n        graphs = graphs.to_data_list()\n    counts = [graph.num_nodes for graph in graphs]\n    return counts\n\n\ndef get_num_edges_per_graph(graphs):\n    r\"\"\"\n    number of edges per graph\n    \"\"\"\n    if isinstance(graphs, Batch):\n        graphs = graphs.to_data_list()\n    counts = [graph.num_edges for graph in graphs]\n    return counts\n"
  },
  {
    "path": "graphium/data/make_data_splits/make_train_val_test_splits_large.ipynb",
    "content": "{\n \"cells\": [\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 1,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"from typing import List, Tuple, Dict\\n\",\n    \"\\n\",\n    \"import random\\n\",\n    \"import pandas as pd\\n\",\n    \"import numpy as np\\n\",\n    \"import torch\\n\",\n    \"from os.path import join\\n\",\n    \"import datamol as dm\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 2,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"BASE_PATH = \\\"../neurips2023/large-dataset/\\\"\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 3,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"df_L1000_VCAP.shape (15220, 984)\\n\",\n      \"df_L1000_MCF7.shape (11622, 984)\\n\",\n      \"df_PCBA.shape (1563664, 1332)\\n\",\n      \"df_PCQM4M.shape (3810323, 31)\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"df_L1000_VCAP = pd.read_csv(join(BASE_PATH, \\\"LINCS_L1000_VCAP_0-4.csv.gz\\\"))\\n\",\n    \"print(\\\"df_L1000_VCAP.shape\\\", df_L1000_VCAP.shape)\\n\",\n    \"df_L1000_MCF7 = pd.read_csv(join(BASE_PATH, \\\"LINCS_L1000_MCF7_0-4.csv.gz\\\"))\\n\",\n    \"print(\\\"df_L1000_MCF7.shape\\\", df_L1000_MCF7.shape)\\n\",\n    \"df_PCBA = pd.read_parquet(join(BASE_PATH, \\\"PCBA_1328_1564k.parquet\\\"))\\n\",\n    \"print(\\\"df_PCBA.shape\\\", df_PCBA.shape)\\n\",\n    \"df_PCQM4M = pd.read_parquet(join(BASE_PATH, \\\"PCQM4M_G25_N4.parquet\\\"))\\n\",\n    \"print(\\\"df_PCQM4M.shape\\\", df_PCQM4M.shape)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 4,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"def smiles_to_unique_ids(smiles: List):\\n\",\n    \"    return dm.parallelized_with_batches(loop_smiles_to_unique_ids, smiles, batch_size=100, n_jobs=32, progress=True)\\n\",\n    \"\\n\",\n    \"def 
loop_smiles_to_unique_ids(smiles: List):\\n\",\n    \"    unique_ids = []\\n\",\n    \"    for s in smiles:\\n\",\n    \"        if not isinstance(s, str):\\n\",\n    \"            unique_ids.append(None)\\n\",\n    \"            continue\\n\",\n    \"        mol = dm.to_mol(s)\\n\",\n    \"        if mol is None:\\n\",\n    \"            unique_ids.append(None)\\n\",\n    \"        else:\\n\",\n    \"            unique_ids.append(dm.unique_id(mol))\\n\",\n    \"    return unique_ids\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 5,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"a1bdefc329d14d288510253cc36867a8\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"  0%|          | 0/38103 [00:00<?, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"[16:03:00] WARNING: not removing hydrogen atom without neighbors\\n\",\n      \"[16:03:10] WARNING: not removing hydrogen atom without neighbors\\n\",\n      \"[16:03:37] WARNING: not removing hydrogen atom without neighbors\\n\",\n      \"[16:03:37] WARNING: not removing hydrogen atom without neighbors\\n\",\n      \"[16:03:37] WARNING: not removing hydrogen atom without neighbors\\n\",\n      \"[16:03:44] WARNING: not removing hydrogen atom without neighbors\\n\",\n      \"[16:03:44] WARNING: not removing hydrogen atom without neighbors\\n\",\n      \"[16:03:44] WARNING: not removing hydrogen atom without neighbors\\n\",\n      \"[16:03:44] WARNING: not removing hydrogen atom without neighbors\\n\",\n      \"[16:03:44] WARNING: not removing hydrogen atom without neighbors\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": 
\"7087b7c562ca46ce837695034e38cb57\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"  0%|          | 0/152 [00:00<?, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"[16:03:49] SMILES Parse Error: syntax error while parsing: restricted\\n\",\n      \"[16:03:49] SMILES Parse Error: Failed parsing SMILES 'restricted' for input: 'restricted'\\n\",\n      \"[16:03:49] SMILES Parse Error: syntax error while parsing: restricted\\n\",\n      \"[16:03:49] SMILES Parse Error: Failed parsing SMILES 'restricted' for input: 'restricted'\\n\",\n      \"[16:03:49] SMILES Parse Error: syntax error while parsing: restricted\\n\",\n      \"[16:03:49] SMILES Parse Error: Failed parsing SMILES 'restricted' for input: 'restricted'\\n\",\n      \"[16:03:49] SMILES Parse Error: syntax error while parsing: restricted\\n\",\n      \"[16:03:49] SMILES Parse Error: Failed parsing SMILES 'restricted' for input: 'restricted'\\n\",\n      \"[16:03:49] SMILES Parse Error: syntax error while parsing: restricted\\n\",\n      \"[16:03:49] SMILES Parse Error: Failed parsing SMILES 'restricted' for input: 'restricted'\\n\",\n      \"[16:03:49] SMILES Parse Error: syntax error while parsing: restricted\\n\",\n      \"[16:03:49] SMILES Parse Error: Failed parsing SMILES 'restricted' for input: 'restricted'\\n\",\n      \"[16:03:49] SMILES Parse Error: syntax error while parsing: restricted\\n\",\n      \"[16:03:49] SMILES Parse Error: Failed parsing SMILES 'restricted' for input: 'restricted'\\n\",\n      \"[16:03:49] SMILES Parse Error: syntax error while parsing: restricted\\n\",\n      \"[16:03:49] SMILES Parse Error: Failed parsing SMILES 'restricted' for input: 'restricted'\\n\",\n      \"[16:03:49] SMILES Parse Error: syntax error while parsing: restricted\\n\",\n      \"[16:03:49] SMILES Parse 
Error: Failed parsing SMILES 'restricted'[16:03:49]  for input: 'restricted'\\n\",\n      \"SMILES Parse Error: syntax error while parsing: restricted\\n\",\n      \"[16:03:49] SMILES Parse Error: Failed parsing SMILES 'restricted' for input: 'restricted'\\n\",\n      \"[16:03:49] SMILES Parse Error: syntax error while parsing: restricted\\n\",\n      \"[16:03:49] SMILES Parse Error: Failed parsing SMILES 'restricted' for input: 'restricted'\\n\",\n      \"[16:03:49] SMILES Parse Error: syntax error while parsing: restricted\\n\",\n      \"[16:03:49] SMILES Parse Error: Failed parsing SMILES 'restricted' for input: 'restricted'\\n\",\n      \"[16:03:49] SMILES Parse Error: syntax error while parsing: restricted\\n\",\n      \"[16:03:49] SMILES Parse Error: Failed parsing SMILES 'restricted' for input: 'restricted'\\n\",\n      \"[16:03:49] SMILES Parse Error: syntax error while parsing: restricted\\n\",\n      \"[16:03:49] SMILES Parse Error: Failed parsing SMILES 'restricted' for input: 'restricted'\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"f651dcb510004bb7afef0b02fb1e3b20\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"  0%|          | 0/116 [00:00<?, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"[16:03:49] SMILES Parse Error: syntax error while parsing: restricted\\n\",\n      \"[16:03:49] SMILES Parse Error: Failed parsing SMILES 'restricted' for input: 'restricted'\\n\",\n      \"[16:03:49] SMILES Parse Error: syntax error while parsing: restricted\\n\",\n      \"[16:03:49] SMILES Parse Error: Failed parsing SMILES 'restricted' for input: 'restricted'\\n\",\n      \"[16:03:49] SMILES Parse Error: syntax error while parsing: restricted\\n\",\n      \"[16:03:49] SMILES Parse 
Error: Failed parsing SMILES 'restricted' for input: 'restricted'\\n\",\n      \"[16:03:49] SMILES Parse Error: syntax error while parsing: restricted\\n\",\n      \"[16:03:49] SMILES Parse Error: Failed parsing SMILES 'restricted' for input: 'restricted'\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"1ab2cca58d1f4a36ac5994b897cfe940\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"  0%|          | 0/15636 [00:00<?, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"[16:03:51] WARNING: not removing hydrogen atom without neighbors\\n\",\n      \"[16:04:22] WARNING: not removing hydrogen atom without neighbors\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"unique_ids_QM = smiles_to_unique_ids(df_PCQM4M[\\\"ordered_smiles\\\"])\\n\",\n    \"unique_ids_L1000_VCAP = smiles_to_unique_ids(df_L1000_VCAP[\\\"SMILES\\\"])\\n\",\n    \"unique_ids_L1000_MCF7 = smiles_to_unique_ids(df_L1000_MCF7[\\\"SMILES\\\"])\\n\",\n    \"unique_ids_PCBA = smiles_to_unique_ids(df_PCBA[\\\"SMILES\\\"])\"\n   ]\n  
},\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 6,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"L1000_VCAP 726\\n\",\n      \"L1000_MCF7 1023\\n\",\n      \"PCBA 56512\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# Check the number of unique ids that intersect between unique_ids_QM and the other columns\\n\",\n    \"intersection_VCAP = set(unique_ids_QM) & set(unique_ids_L1000_VCAP)\\n\",\n    \"print(\\\"L1000_VCAP\\\", len(intersection_VCAP))\\n\",\n    \"\\n\",\n    \"intersection_MCF7 = set(unique_ids_QM) & set(unique_ids_L1000_MCF7)\\n\",\n    \"print(\\\"L1000_MCF7\\\", len(intersection_MCF7))\\n\",\n    \"\\n\",\n    \"intersection_PCBA = set(unique_ids_QM) & set(unique_ids_PCBA)\\n\",\n    \"print(\\\"PCBA\\\", len(intersection_PCBA))\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 7,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"QM_in_VCAP 1443 VCAP_in_QM 748\\n\",\n      \"QM_in_MCF7 2373 MCF7_in_QM 1065\\n\",\n      \"QM_in_PCBA 91174 PCBA_in_QM 56512\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"def find_indices(list_1, list_2):\\n\",\n    \"    intersection = set(list_1) & set(list_2)\\n\",\n    \"    intersection = {elem for elem in intersection if elem is not None}\\n\",\n    \"    is_2_in_1 = np.isin(list_2, list(intersection))\\n\",\n    \"    is_1_in_2 = np.isin(list_1, list(intersection))\\n\",\n    \"    return is_1_in_2, is_2_in_1\\n\",\n    \"\\n\",\n    \"QM_in_VCAP, VCAP_in_QM = find_indices(unique_ids_QM, unique_ids_L1000_VCAP)\\n\",\n    \"print(\\\"QM_in_VCAP\\\", sum(QM_in_VCAP), \\\"VCAP_in_QM\\\", sum(VCAP_in_QM))\\n\",\n    \"\\n\",\n    \"QM_in_MCF7, MCF7_in_QM = find_indices(unique_ids_QM, unique_ids_L1000_MCF7)\\n\",\n    \"print(\\\"QM_in_MCF7\\\", sum(QM_in_MCF7), \\\"MCF7_in_QM\\\", 
sum(MCF7_in_QM))\\n\",\n    \"\\n\",\n    \"QM_in_PCBA, PCBA_in_QM = find_indices(unique_ids_QM, unique_ids_PCBA)\\n\",\n    \"print(\\\"QM_in_PCBA\\\", sum(QM_in_PCBA), \\\"PCBA_in_QM\\\", sum(PCBA_in_QM))\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 8,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"test_seen_VCAP = np.where(VCAP_in_QM)[0].tolist()\\n\",\n    \"test_seen_MCF7 = np.where(MCF7_in_QM)[0].tolist()\\n\",\n    \"test_seen_PCBA = np.where(PCBA_in_QM)[0].tolist()\\n\",\n    \"\\n\",\n    \"train_QM_seen = np.where(QM_in_VCAP | QM_in_MCF7 | QM_in_PCBA)[0].tolist()\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 9,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"def make_random_splits(num_elem, val_ratio, test_ratio, seed=42, ignore_idx=None):\\n\",\n    \"    \\\"\\\"\\\"Make random splits of the data into train, validation, and test sets.\\n\",\n    \"\\n\",\n    \"    Args:\\n\",\n    \"        num_elem (int): Number of elements in the dataset.\\n\",\n    \"        val_ratio (float): Ratio of the dataset to use for validation.\\n\",\n    \"        test_ratio (float): Ratio of the dataset to use for testing.\\n\",\n    \"        seed (int): Random seed.\\n\",\n    \"        ignore_idx (list): List of indices to ignore.\\n\",\n    \"\\n\",\n    \"    Returns:\\n\",\n    \"        train_idx (list): List of indices for the training set.\\n\",\n    \"        val_idx (list): List of indices for the validation set.\\n\",\n    \"        test_idx (list): List of indices for the test set.\\n\",\n    \"    \\\"\\\"\\\"\\n\",\n    \"    # Create a list of indices\\n\",\n    \"    idx = list(range(num_elem))\\n\",\n    \"    # Remove the indices to ignore\\n\",\n    \"    if ignore_idx is not None:\\n\",\n    \"        idx = list(set(idx) - set(ignore_idx))\\n\",\n    \"        num_elem = len(idx)\\n\",\n    \"    # Shuffle the list of indices\\n\",\n    \"    
random.seed(seed)\\n\",\n    \"    random.shuffle(idx)\\n\",\n    \"    # Compute the number of elements in each set\\n\",\n    \"    num_val = int(num_elem * val_ratio)\\n\",\n    \"    num_test = int(num_elem * test_ratio)\\n\",\n    \"    num_train = num_elem - num_val - num_test\\n\",\n    \"    # Split the list of indices into three sets\\n\",\n    \"    train_idx = idx[:num_train]\\n\",\n    \"    val_idx = idx[num_train:(num_train + num_val)]\\n\",\n    \"    test_idx = idx[(num_train + num_val):]\\n\",\n    \"    # Return the three lists of indices\\n\",\n    \"    return train_idx, val_idx, test_idx\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 10,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"train_VCAP, val_VCAP, test_VCAP = make_random_splits(len(df_L1000_VCAP), 0.04, 0.04, seed=42, ignore_idx=test_seen_VCAP)\\n\",\n    \"train_MCF7, val_MCF7, test_MCF7 = make_random_splits(len(df_L1000_MCF7), 0.04, 0.04, seed=42, ignore_idx=test_seen_MCF7)\\n\",\n    \"train_PCBA, val_PCBA, test_PCBA = make_random_splits(len(df_PCBA), 0.04, 0.04, seed=42, ignore_idx=test_seen_PCBA)\\n\",\n    \"train_QM, val_QM, test_QM = make_random_splits(len(df_PCQM4M), 0.04, 0.04, seed=42, ignore_idx=train_QM_seen)\\n\",\n    \"train_QM = np.concatenate((train_QM, train_QM_seen))\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 11,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"def make_random_splits_file(df, out_file, train_idx, val_idx, test_idx, test_seen_idx=None):\\n\",\n    \"    # Save the splits\\n\",\n    \"    if test_seen_idx is None:\\n\",\n    \"        splits_dict = {\\\"train\\\": train_idx, \\\"val\\\": val_idx, \\\"test\\\": test_idx}\\n\",\n    \"    else:\\n\",\n    \"        splits_dict = {\\\"train\\\": train_idx, \\\"val\\\": val_idx, \\\"test\\\": test_idx, \\\"test_seen\\\": test_seen_idx}\\n\",\n    \"\\n\",\n    \"    # Check the splits validity\\n\",\n    \"    
\\n\",\n    \"    assert len(set(train_idx).intersection(set(val_idx))) == 0\\n\",\n    \"    assert len(set(train_idx).intersection(set(test_idx))) == 0\\n\",\n    \"    assert len(set(val_idx).intersection(set(test_idx))) == 0\\n\",\n    \"    assert len(train_idx) > 0\\n\",\n    \"    assert len(val_idx) > 0\\n\",\n    \"    assert len(test_idx) > 0\\n\",\n    \"\\n\",\n    \"    if test_seen_idx is None:\\n\",\n    \"        assert len(df) == len(train_idx) + len(val_idx) + len(test_idx), f\\\"{len(df)} != {len(train_idx)} + {len(val_idx)} + {len(test_idx)}\\\"\\n\",\n    \"        print(out_file, \\\"train\\\", len(train_idx), \\\"val\\\", len(val_idx), \\\"test\\\", len(test_idx))\\n\",\n    \"    else:\\n\",\n    \"        assert len(test_seen_idx) > 0\\n\",\n    \"        assert len(set(train_idx).intersection(set(test_seen_idx))) == 0\\n\",\n    \"        assert len(set(val_idx).intersection(set(test_seen_idx))) == 0\\n\",\n    \"        assert len(set(test_idx).intersection(set(test_seen_idx))) == 0\\n\",\n    \"        assert len(df) == len(train_idx) + len(val_idx) + len(test_idx) + len(test_seen_idx), f\\\"{len(df)} != {len(train_idx)} + {len(val_idx)} + {len(test_idx)} + {len(test_seen_idx)}\\\"\\n\",\n    \"        print(out_file, \\\"train\\\", len(train_idx), \\\"val\\\", len(val_idx), \\\"test\\\", len(test_idx), \\\"test_seen\\\", len(test_seen_idx))\\n\",\n    \"\\n\",\n    \"    torch.save(splits_dict, out_file)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 12,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"l1000_vcap_random_splits.pt train 13316 val 578 test 578 test_seen 748\\n\",\n      \"l1000_mcf7_random_splits.pt train 9713 val 422 test 422 test_seen 1065\\n\",\n      \"pcba_1328_random_splits.pt train 1386580 val 60286 test 60286 test_seen 56512\\n\",\n      \"pcqm4m_g25_n4_random_splits.pt train 3512805 val 148759 test 
148759\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"make_random_splits_file(df_L1000_VCAP, \\\"l1000_vcap_random_splits.pt\\\", train_VCAP, val_VCAP, test_VCAP, test_seen_VCAP)\\n\",\n    \"make_random_splits_file(df_L1000_MCF7, \\\"l1000_mcf7_random_splits.pt\\\", train_MCF7, val_MCF7, test_MCF7, test_seen_MCF7)\\n\",\n    \"make_random_splits_file(df_PCBA, \\\"pcba_1328_random_splits.pt\\\", train_PCBA, val_PCBA, test_PCBA, test_seen_PCBA)\\n\",\n    \"make_random_splits_file(df_PCQM4M, \\\"pcqm4m_g25_n4_random_splits.pt\\\", train_QM, val_QM, test_QM)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": []\n  }\n ],\n \"metadata\": {\n  \"kernelspec\": {\n   \"display_name\": \"graphium\",\n   \"language\": \"python\",\n   \"name\": \"python3\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.8.10\"\n  },\n  \"orig_nbformat\": 4\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 2\n}\n"
  },
  {
    "path": "graphium/data/make_data_splits/make_train_val_test_splits_small.ipynb",
    "content": "{\n \"cells\": [\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 3,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"import random\\n\",\n    \"import pandas as pd\\n\",\n    \"import torch\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 7,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"def make_random_splits(num_elem, val_ratio, test_ratio, seed=42):\\n\",\n    \"    \\\"\\\"\\\"Make random splits of the data into train, validation, and test sets.\\n\",\n    \"\\n\",\n    \"    Args:\\n\",\n    \"        num_elem (int): Number of elements in the dataset.\\n\",\n    \"        val_ratio (float): Ratio of the dataset to use for validation.\\n\",\n    \"        test_ratio (float): Ratio of the dataset to use for testing.\\n\",\n    \"\\n\",\n    \"    Returns:\\n\",\n    \"        train_idx (list): List of indices for the training set.\\n\",\n    \"        val_idx (list): List of indices for the validation set.\\n\",\n    \"        test_idx (list): List of indices for the test set.\\n\",\n    \"    \\\"\\\"\\\"\\n\",\n    \"    # Create a list of indices\\n\",\n    \"    idx = list(range(num_elem))\\n\",\n    \"    # Shuffle the list of indices\\n\",\n    \"    random.seed(seed)\\n\",\n    \"    random.shuffle(idx)\\n\",\n    \"    # Compute the number of elements in each set\\n\",\n    \"    num_val = int(num_elem * val_ratio)\\n\",\n    \"    num_test = int(num_elem * test_ratio)\\n\",\n    \"    num_train = num_elem - num_val - num_test\\n\",\n    \"    # Split the list of indices into three sets\\n\",\n    \"    train_idx = idx[:num_train]\\n\",\n    \"    val_idx = idx[num_train:num_train + num_val]\\n\",\n    \"    test_idx = idx[num_train + num_val:]\\n\",\n    \"    # Return the three lists of indices\\n\",\n    \"    return train_idx, val_idx, test_idx\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 10,\n   \"metadata\": {},\n   \"outputs\": 
[],\n   \"source\": [\n    \"def make_random_splits_from_file(in_file, out_file, val_ratio, test_ratio, seed=42):\\n\",\n    \"    # Make the splits for QM9\\n\",\n    \"    df = pd.read_csv(in_file, usecols=[\\\"smiles\\\"])\\n\",\n    \"    train_idx, val_idx, test_idx = make_random_splits(len(df), val_ratio, test_ratio, seed=seed)\\n\",\n    \"    # Save the splits\\n\",\n    \"    splits_dict = {\\\"train\\\": train_idx, \\\"val\\\": val_idx, \\\"test\\\": test_idx}\\n\",\n    \"\\n\",\n    \"    # Check the splits validity\\n\",\n    \"    assert len(set(train_idx).intersection(set(val_idx))) == 0\\n\",\n    \"    assert len(set(train_idx).intersection(set(test_idx))) == 0\\n\",\n    \"    assert len(set(val_idx).intersection(set(test_idx))) == 0\\n\",\n    \"    assert len(train_idx) + len(val_idx) + len(test_idx) == len(df)\\n\",\n    \"    \\n\",\n    \"    torch.save(splits_dict, out_file)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 12,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"make_random_splits_from_file(\\\"qm9.csv.gz\\\", \\\"qm9_random_splits.pt\\\", 0.1, 0.1)\\n\",\n    \"make_random_splits_from_file(\\\"Tox21-7k-12-labels.csv.gz\\\", \\\"Tox21_random_splits.pt\\\", 0.1, 0.1)\\n\",\n    \"make_random_splits_from_file(\\\"ZINC12k.csv.gz\\\", \\\"ZINC12k_random_splits.pt\\\", 0.1, 0.1)\"\n   ]\n  }\n ],\n \"metadata\": {\n  \"kernelspec\": {\n   \"display_name\": \"graphium\",\n   \"language\": \"python\",\n   \"name\": \"python3\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.10.9\"\n  },\n  \"orig_nbformat\": 4\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 2\n}\n"
  },
  {
    "path": "graphium/data/micro_ZINC/__init__.py",
    "content": ""
  },
  {
    "path": "graphium/data/micro_ZINC/micro_ZINC.csv",
    "content": "SMILES,SA,logp,score\nCc1cccc(COC(=O)Nc2ccc(-c3cn[nH]c3)cc2OCCN2CCCC2)c1.O=C(O)C(F)(F)F.O=C(O)C(F)(F)F,2.896514950230738,5.87502,2.978505049769262\nCCC(C)C1=C(C=CC=C1)OCCC[N]2C=CN=C2.O=C(O)C(=O)O,2.66931991795572,3.0213,0.35198008204428\nCCc1ccc(C(=O)/C(=C(/S)NC2CC2)[n+]2ccc(CC)cc2)cc1,2.775189478,3.7897,1.014510522\nC=CCOc1ccc(C(F)(F)F)cc1C(=O)NC[C@H]1CCC[C@H]1O,3.071497898,3.161,0.089502102\nCOc1cc(OC)cc([C@@H](NC(=O)Cn2ccccc2=O)c2nccn2C)c1,2.84000896,1.5048,-1.33520896\nCN1CCC[C@@]2(CCN(C(=O)CN3CCNC(=O)C3)C2)C1=O,3.605168457,-1.1109,-4.716068457\nCOc1ccc(Cl)cc1NC(=O)c1nn(C)cc1[N+](=O)[O-],2.132664673,2.2426,0.109935327\nCc1nn(-c2ccccc2)c(O)c1/C=[NH+]/Cc1ccncc1,3.117639223,0.98102,-2.136619223\nC=CCN1C(=O)C(C)(C)COc2ccc(NC(=O)C3(c4ccc(OC)cc4)CCOCC3)cc21,2.744789746,4.3197,1.574910254\nCON(C)C(=O)c1cnn2c(C)cc(C)nc12,2.547847184,0.97954,-1.568307184\nNC(=O)[C@@H](NC(=O)CCCc1cccs1)c1ccccc1,2.444743614,2.4136,-0.031143614\nCOc1cccc(CN2CCC[NH+](CC(=O)Nc3ccc(F)cc3)S2(=O)=O)c1,3.546311784,0.8084,-2.737911784\nCc1ccc(-c2nc3ccc(C)c(C)c3[nH]2)nc1,2.342516783,3.55016,1.207643217\nCc1n[nH]c(SCC(=O)Nc2cccc(C#Cc3ccccn3)c2)n1,2.685016252,2.63872,-0.046296252\nCOC(=O)c1cncc(C(=O)Nc2cccc(COC(C)C)c2)c1,2.04419285,3.0455,1.00130715\nCc1ccc(C)c(Oc2ccc(CNC(=O)N3CCCSCC3)cn2)c1,2.369895336,4.13924,1.769344664\nCC[NH2+][C@@]1(C(=O)[O-])CCC[C@H](Oc2ccc(CC)cc2)C1,4.165551538,0.6424,-3.523151538\nCC(=O)N1c2ccc(S(=O)(=O)N3CCCC3)cc2C[C@H]1C(=O)NCC[NH+](C)C1CCCCC1,3.989079408,0.7123,-3.276779408\nC=CCN1CC(=O)N2[C@@H](Cc3c([nH]c4ccccc34)[C@@H]2c2ccccc2OC)C1=O,3.287567863,3.0474,-0.240167863\nCOc1ccc(-c2noc3ncnc(N4CCC[C@H](C(=O)[O-])C4)c23)cc1,3.152874099,1.2597,-1.893174099\nN#Cc1cc(NC(=O)Cc2ccccc2Cl)ccc1N1CCC(O)CC1,2.173785623,3.35398,1.180194377\nCc1ncc(-c2ccc(S(=O)(=O)NC3CCCCCC3)s2)o1,2.513723357,3.71262,1.198896643\nCc1ccc(NC(=O)c2ccc(F)cc2F)cc1S(=O)(=O)Nc1ccc(Cl)cc1,1.947448768,4.97972,3.032271232\nCNC(=O)[C@@H]1CCC[NH+]1Cc1ccc(C)c(F)c1,4.391338086,0.42742,-3.963918086\nCN1
C(=O)N[C@@H](c2cccc([N+](=O)[O-])c2)C2=C1CN(c1ccc(F)cc1)C2=O,2.956072154,2.7309,-0.225172154\nCc1cc(CC(=O)N[C@H](c2ccc(F)cc2)C2CCC2)no1,2.665131639,3.32222,0.657088361\nCc1cc(N2CCN(CC(=O)NC3CC3)CC2)nc(-c2ccc([N+](=O)[O-])cc2)n1,2.249569987,1.76082,-0.488749987\nCc1cc(C(=O)N2CCN(C(=O)N[C@H]3CC(=O)N(C4CC4)C3)CC2)c(C)o1,2.921725033,1.12714,-1.794585033\nO=c1c2c3nc4ccccc4nc3n(CCC3=CCCCC3)c2ncn1C[C@H]1CCCO1,3.257260117,4.3639,1.106639883\nCC[NH+](C/C=C/c1ccc(C#N)cc1)C[C@H](C)C#N,4.36293875,1.63596,-2.72697875\nCOc1c(Cl)cc(C[NH2+][C@@H](Cc2c[nH]c3ccccc23)C(=O)[O-])cc1Cl,3.725368177,1.9079,-1.817468177\nCOc1ccccc1[C@H](C)NC(=O)C[C@H]1C[NH2+]CCO1,3.812797047,0.2247,-3.588097047\nCc1ccc(NC(=O)C(=O)NCCc2ccccc2F)c(C)c1,1.826753956,2.73994,0.913186044\nCC(C)(C)OC(=O)N[C@H]1CCN(c2cc(-c3cccs3)n[nH]2)C1,3.228025092,3.2416,0.013574908\nCC[C@H](C)C[NH+]1CCCC[C@@H]1C(=O)NC(C)(C)C,4.787067722,1.3846,-3.402467722\nCOC(=O)c1c(S(=O)(=O)NC2CC2)sc2c1CCN(Cc1ccc3ccccc3c1)C2,2.485526606,3.6869,1.201373394\nCc1cc(C)cc(NC(=O)[C@@H](Sc2nnnn2C2CC2)c2ccccc2)c1,2.708478583,4.09694,1.388461417\nC=CCn1c([S-])nnc1-c1sc(NC(=O)c2cccc(NC(C)=O)c2)nc1C,2.943459024,3.01252,0.069060976\nO=C(CNc1nc(C2CC2)no1)N1CCc2sccc2C1,2.781430253,2.0053,-0.776130253\nCc1ccccc1NC(=O)NCCn1c([N+](=O)[O-])cnc1C,2.258312512,2.22984,-0.028472512\nC[C@H](NC=C(C#N)C#N)c1ccc(-n2cncn2)cc1,3.277555708,1.84896,-1.428595708\nCN1C[C@H](C(=O)NC[C@H]2CCC(C)(C)c3ccccc32)CC1=O,3.259412472,2.4361,-0.823312472\nCC(C)n1cccc1C(=O)N[C@H]1CCc2cc(N)ccc21,2.887282024,3.0685,0.181217976\nCCN1C(=O)C(C#N)=C(C)/C(=C\\Nc2ccc([N+](=O)[O-])cc2)C1=O,2.501067124,2.11938,-0.381687124\nO=C(Nc1ccnn1Cc1cccc(Cl)c1)C1CCN(S(=O)(=O)c2ccccc2F)CC1,2.296421252,3.7633,1.466878748\nCOc1cc([C@@H]2N[C@H](C(=O)[O-])CS2)ccc1OC(=O)c1ccccc1,3.273511868,1.3679,-1.905611868\nCCOc1ccc(C2=CCN(C(=O)c3ccccc3F)CC2)cc1,2.000346516,4.1539,2.153553484\nCC(=O)N1CC[C@@H]([NH2+][C@@H](C)CSCC(C)C)C1,4.483674272,0.9483,-3.535374272\nCCO[P@](=O)(CC(=O)[O-])c1ccccc1,3.510193788,0.3764,-3.133
793788\nO=C(CCc1nc2ccccc2c(=O)[nH]1)N1CCC[C@H]1C1CCCC1,2.774610155,3.0369,0.262289845\nO=C(/C=C/SCc1ccco1)Nc1cccc(N2CCCC2=O)c1,2.584851266,3.792,1.207148734\nCC(=O)c1c(C)[nH]c(C(=O)OCC(=O)N2C[C@H](C)C[C@@H](C)C2)c1C,3.068452859,2.49544,-0.573012859\nCc1ccc(C(=O)N2CCC[C@@H](C(=O)N(C)CC(=O)NC(C)C)C2)cc1,2.492863438,1.83022,-0.662643438\nC[C@@H](C#N)Sc1nnc(C23CC4CC(CC(C4)C2)C3)o1,4.576896432,3.54158,-1.035316432\nCC1(C)CCC[C@H]1n1c(N)[nH+]c2ccccc21,4.151966768,2.7888,-1.363166768\nCCOc1ccc(C[NH+]2CCS[C@H]3COCC[C@@H]32)cc1OC,4.514122047,1.3831,-3.131022047\nCOc1cc(OC)c([C@@H](C)[NH2+]C[C@H](O)C2CCOCC2)cc1Cl,3.960984106,1.7691,-2.191884106\nCOC(C[C@@](C)(O)C[C@@H]1CCCN(S(C)(=O)=O)C1)OC,3.669896168,0.8081,-2.861796168\nC=CCSc1nnc(C2CCOCC2)n1N,2.825013358,1.164,-1.661013358\nO=C(Nc1c[nH]c2ccccc12)N1CCCC[C@@H]1CCO,2.674797385,2.9367,0.261902615\nCc1nc(-c2ccc(-c3noc(C4CC4)n3)cc2)cs1,2.133219774,4.04592,1.912700226\nC[C@H]1CCCCN1C(=O)[C@@H](C)NC(=O)C(=O)Nc1ccnn1C(C)(C)C,3.418504414,1.4823,-1.936204414\nc1cc2c(cc1N[C@@H]1CCOC3(CCC3)C1)CCC2,3.363061129,3.6889,0.325838871\nO=C(Nc1ccc(CNC(=O)N2CCCCCCC2)cc1)c1ccco1,1.931563902,4.0076,2.076036098\nCNC(=O)[C@H]1CCCCN1c1cccc(F)c1C(N)=S,2.82917551,1.5648,-1.26437551\nO=C(NCc1nnc2n1CCC2)c1cc(-c2ccccc2)[nH]n1,2.376871447,1.5444,-0.832471447\nCc1nnc(-c2cc(CC(C)C)n(-c3cccc(C(=O)N[C@@H]4C[C@@H](C)CC(C)(C)C4)c3)n2)o1,3.548164288,5.37382,1.825655712\nCOc1cccc([C@@H]2CC(=O)C3=C(C2)NC(=O)C[C@H]3c2ccc(OC)c(OC)c2OC)c1,3.226440403,3.7253,0.498859597\nC[NH2+]Cc1ccc(-c2cc(C)ccc2F)s1,3.128631552,2.55582,-0.572811552\nCCC[C@H](NC(N)=O)C(=O)NC[C@@H]1CCCO1,2.869359841,0.1186,-2.750759841\nCOc1cc(F)cc(CNC(=O)[C@H]2CCCN2C(=O)Cc2ccccc2)c1,2.427176885,2.6842,0.257023115\nCCc1cc(C#N)c(NC(=O)CCC(=O)N(CC)CC)s1,2.493551313,2.76928,0.275728687\nC#Cc1cccc(NC(=O)C[NH+](C)[C@H](C)c2nnc(-c3ccccc3)o2)c1,3.779297449,1.9323,-1.846997449\nCCOC[C@H](O)[C@](C)(CC)[NH+]1CCCC1,5.127905513,0.2312,-4.896705513\nC[NH+](C)Cc1cccc(CNC(=O)NC[C@H](O)c2ccc(F)cc2)c1,3.24179101,1.00
3,-2.23879101\nCc1ccsc1C(=O)NCCNC(=O)c1ccco1,2.114582277,1.80932,-0.305262277\nCOC(=O)[C@H](C)N(Cc1ccccc1)C(=O)Cc1ccon1,2.852054716,1.8074,-1.044654716\nC/C=C(/C)C(=O)NC[C@H]1C[C@@]12CCc1ccccc12,3.940919906,2.9729,-0.968019906\nO=C1Nc2ccccc2Oc2cc([N+](=O)[O-])cc(Oc3ccc(Cl)cc3)c21,2.400946341,5.3985,2.997553659\nCCc1onc(C)c1NC(=O)CCCC(C)(C)C,2.601600041,3.70032,1.098719959\nCOC(=O)C(C)(C)COc1ccc2c(c1)OC[C@@H]2[NH3+],3.576525387,0.94,-2.636525387\nCC#CCCC(=O)Nc1cccc2c1C(=O)c1ccccc1C2=O,2.298959953,3.204,0.905040047\nCc1nc(C)c(S(=O)(=O)/N=C(\\[O-])C[C@H]2CCCO2)s1,3.812988405,0.77654,-3.036448405\nCCC(=O)N[C@H](CCSC)C(=O)Nc1ccc2[nH]c(C)cc2c1,2.732126257,3.06272,0.330593743\nCOCC[NH+](C)Cc1c(C)cc(C)c(C(C)=O)c1C,3.955103091,1.47556,-2.479543091\nCc1cscc1CNC(=O)N[C@@H](C)c1nc(C(=O)[O-])cs1,3.628458628,1.43692,-2.191538628\nCNC(=O)NCC(=O)NC[C@H](c1cccnc1)C(C)C,2.795116489,0.8664,-1.928716489\nCOc1cccc(C(=O)N2CCC[C@H]2[C@H]2CCC[C@@H]2O)c1,3.002385916,2.4608,-0.541585916\nCOc1ccc2c(c1)[C@H]([NH2+][C@H](C)CCN1CCOCC1)CCCO2,3.789130146,1.5831,-2.206030146\nCOc1cccc(OCC(=O)N2CCN(c3nc4ccc(S(C)(=O)=O)cc4s3)CC2)c1,2.245396536,2.436,0.190603464\nCOc1cccc(-c2nc(C(=O)O[C@@H](C)CNC(C)=O)c(C)[nH]2)c1,2.86962049,2.07512,-0.79450049\nCCO[C@H]1C[C@@H]1C(=O)Nc1ccc(Sc2nncs2)c(Cl)c1,3.445502237,3.7062,0.260697763\nCc1ccc(N(CC(=O)N2CCCC[C@H]2C)S(=O)(=O)c2c(C)nn(C)c2C)cc1,2.872766629,2.94166,0.068893371\nCOc1ccc(C(=O)Nc2nc(CC(=O)Nc3ccc(N(C)C)cc3)cs2)cc1,1.983649537,3.6512,1.667550463\nO=c1c2ccccc2sn1-c1ncc(Br)s1,2.853247472,3.2712,0.417952528\nCc1ocnc1CNC(=O)N(C)Cc1ccc(OC(F)F)cc1,2.60278761,2.92602,0.32323239\nCOc1cc(Cl)c(C)cc1NC(=O)C(=O)NCc1ccccc1C[NH+](C)C,2.870575026,1.55642,-1.314155026\nC[C@@H]([NH3+])c1cccc(Oc2cc(Br)ccc2Cl)c1,2.965149266,4.1977,1.232550734\nCC(C)NS(=O)(=O)c1cccc(C(=O)NCc2ccco2)c1,1.961616674,1.8963,-0.065316674\nCN(Cc1ncnn1C)C(=O)CSCc1ccccn1,2.529083407,1.1019,-1.427183407\nCCOc1cc(/C=C(\\C#N)C(=O)c2c[nH]c3cc(Cl)ccc23)ccc1OC,2.350767662,5.01848,2.667712338\nCCOC(=O)Nc1ccc(
NC(=O)N2C[C@@H](C)CC[C@H]2C)cc1C#N,3.067347555,3.77898,0.711632445\nCn1c(=O)oc2ccc(NC(=O)[C@@H]3CCN(C(=O)OC(C)(C)C)C3)cc21,2.748880773,2.327,-0.421880773\nCC(C)N1CCO[C@@H]([C@H](O)c2cccs2)C1,3.356907136,1.8907,-1.466207136\nCCN[C@H]1C[C@H](C)C[C@H](C)[C@@H]1[NH+]1C[C@@H](C)[C@@H](C)C1,5.501545361,1.5698,-3.931745361\nCC1CCN(C(=O)N(CC[NH3+])C2CC2)CC1,3.063324,0.5446,-2.518724\nCOc1ccc(NC(=O)Cn2nc3c4scc(-c5ccc(F)cc5)c4ncn3c2=O)c(OC)c1,2.583698935,3.5677,0.984001065\nCC(C)OC(=O)N1CCC(NC(=O)[C@H]2SCCc3ccccc32)CC1,3.087047482,3.1426,0.055552518\nCOc1ccc2c(c1)SC(=O)C2=O,2.401574365,1.5102,-0.891374365\nCOc1ccc(CCC(=O)N2CCC3(CC2)Nc2ccccc2-n2cccc23)cc1,2.913337418,4.3619,1.448562582\nCOc1ccccc1Cc1nnc(NC(=O)Cc2ccc(Br)cc2)s1,2.051226236,4.0812,2.029973764\nCNC(=O)O/N=C(/N)c1ccc(Br)cc1,2.250317734,1.4254,-0.824917734\nCC1(C)CN(C[C@@H](CS)c2ccccc2)CCO1,3.027019854,2.8108,-0.216219854\nCCOC(=O)[C@H]1CCCN(C(=O)Cn2c(SC(F)F)nc3ccccc32)C1,2.799716077,3.1527,0.352983923\n[NH3+][C@@H](Cc1ccccn1)c1ccc(Cl)cn1,3.324055225,1.6557,-1.668355225\nCCOCCN(C)C(=O)C(=O)Nc1cccnc1-c1ccccc1,2.256607012,2.182,-0.074607012\nCCn1cc([C@H](NC(=O)c2ccc(OC)c(OC)c2)c2ccc(F)cc2)c2cc[nH+]cc21,3.447988059,4.151,0.703011941\nO=C([C@@H]1CC12CCOCC2)N1CC[C@@H](Oc2cccc(Cl)c2)C1,3.713785706,3.1364,-0.577385706\nC[NH2+][C@]1(C(N)=O)CCC[C@H]1CCSC1CCOCC1,4.614268117,0.5061,-4.108168117\nCc1cc(NC(=O)c2ccc(-n3cncn3)nc2)ccc1N(C)C,2.267592306,2.28902,0.021427694\nCc1ccccc1CNC(=O)N1CCC([C@@H](O)c2ccc(Cl)cc2)CC1,2.524142332,4.30362,1.779477668\nCCc1nc(CNCCc2cccs2)cs1,2.300594957,3.0993,0.798705043\nO=C(Nc1ccc2nc(-c3cc(F)ccc3F)[nH]c2c1)c1ccc([N+](=O)[O-])cc1,2.145744096,4.6686,2.522855904\nN#Cc1ccccc1NC(=O)C(=O)NC[C@H]1COC2(CCCC2)O1,3.412921889,1.29868,-2.114241889\nO=C(NC1(C(=O)[O-])CC[NH2+]CC1)OCC1c2ccccc2-c2ccccc21,3.490460281,0.371,-3.119460281\nCc1ccc(C(=O)N2CCC[C@H]2CNC(=O)OC(C)(C)C)cc1,2.424440646,3.12432,0.699879354\nC[NH+](C)CC(=O)NNC(=O)c1ccc(C2CCCCC2)cc1,2.819298295,0.6398,-2.179498295\nC[C@@H]1CCCC[C@@H]1OCC(=O)N
(C)Cc1cccc(O)c1,2.912373702,2.9459,0.033526298\nC#CCNS(=O)(=O)c1ccc(C(=O)N[C@@H]2CCO[C@@]3(CCOC3)C2)cc1,3.898168643,0.666,-3.232168643\nCc1ccc(C2(CN3CCN(S(=O)(=O)N(C)C)CC3)CCCC2)cc1,2.439754533,2.23082,-0.208934533\nCc1sc2ncn(CCC(=O)NCC(N)=O)c(=O)c2c1C,2.347207183,0.06644,-2.280767183\nCOc1ccc(CCNC(=O)c2cc(C(C)C)nc3c2c(C)nn3C)cc1OC,2.282558575,3.38982,1.107261425\nO=[N+]([O-])c1cnc(Br)s1,3.179818392,1.8138,-1.366018392\nCC(C)(C)CC(=O)N1CCN(C(=O)c2ccn3ccsc23)CC1,2.712888331,2.7214,0.008511669\nC/[NH+]=C(/NCCCc1cccc(Br)c1)NC1CC1,2.991361127,0.7897,-2.201661127\nO=C(CCC1CCCC1)N1CCC(COc2ccccc2C(=O)N2CCCC2)CC1,2.19573155,4.5104,2.31466845\nCC(C)(C)c1n[nH]cc1/C=C1\\SC(NC2CCCCC2)=NC1=O,3.076214803,3.5998,0.523585197\nCN(CC(F)(F)F)C(=O)CNC(=O)N1CCO[C@H](C#N)C1,3.449633505,-0.05892,-3.508553505\nCC[C@@H](NC(=O)NCc1ccc(C#N)cc1)c1ccc(C)c(F)c1,2.506404532,3.9563,1.449895468\nCC(C)c1noc(CCNC(=O)/C=C/c2cn(C)c3ccccc23)n1,2.489753078,3.0568,0.567046922\nCC(C)=C1Oc2c3c(cc(C)c2C1=O)OCN(Cc1ccccc1)C3,2.840364996,4.21612,1.375755004\nO=C1/C(=N/O)c2ccc(Cl)cc2N1Cc1ccccc1,2.0410745,3.0651,1.0240255\nc1ccc(CC2(Cc3ccc4c(c3)CCC4)C[NH2+]C2)cc1,2.844717318,2.5239,-0.320817318\nCC[C@H](C)NC(=O)N[C@H](c1ccccc1)C(F)(F)F,2.819638554,3.3877,0.568061446\nNC(=O)COc1ccc2ccccc2c1C[NH2+]C1CCCCCCC1,2.95041288,2.8802,-0.07021288\nO=C(COc1ccc(Cl)cc1[N+](=O)[O-])N1CCc2cc([N+](=O)[O-])ccc21,2.251658336,3.1245,0.872841664\nO=C(Cc1ccc(F)cc1F)NC1CCN(c2cccc(F)c2)CC1,2.033335093,3.4316,1.398264907\nCOC(=O)c1ccc(C[NH+](C2CC2)[C@@H](C)c2ccco2)cc1F,4.038401414,2.5138,-1.524601414\nCc1ccc([C@@H]2C[NH2+]CC[C@@H]2c2c(F)cccc2F)cc1,3.825954589,3.10772,-0.718234589\nCc1ccc(-c2ccc(=O)n(CC(=O)NCc3ccco3)n2)c(C)c1,2.180504107,2.43654,0.256035893\nCCNC(=O)NC(=O)CSc1nnc(N2C[C@@H](C)C[C@@H](C)C2)n1Cc1ccco1,3.320744633,2.3395,-0.981244633\nCC(C)(C)NC(=O)N1CCC(CNC(=O)C(=O)Nc2ccccc2[N+](=O)[O-])CC1,2.351664532,1.8696,-0.482064532\nCc1ccc(-n2cnnn2)cc1NC(=O)c1cnn(Cc2ccccc2)c1,2.221673771,2.46782,0.246146229\nO=C(Nc1ccc(F)cc1OCC(F)F)c1
cccnc1N1CCCC1,2.273178554,3.7171,1.443921446\nCCCNC(=O)CCNC(=O)CN1CCC([NH3+])CC1,2.682391179,-1.2748,-3.957191179\nCOCCNC(=O)COc1cc2c(c3oc(=O)c4c(c13)CCC4)CCC(C)(C)O2,2.819768111,2.5267,-0.293068111\nCC(=O)N[C@@H](CC(=O)Nc1cnn(C)c1)c1cccs1,2.760825185,1.6876,-1.073225185\nO=C(c1cccc(S(=O)(=O)[N-]c2ccccc2F)c1)N1CCC[C@@H]1c1nc2ccccc2[nH]1,3.413113671,5.0734,1.660286329\nC[C@H](CNC(=O)c1[nH]nc(C2CC2)c1Cl)Oc1cccc(F)c1,3.06692193,3.2769,0.20997807\nCOc1ccccc1C(=O)/C(C#N)=C\\c1cnc(-c2ccccn2)s1,2.549035852,4.00358,1.454544148\nO=C(NCc1ccco1)[C@H]1CCCN1C(=O)C[NH+]1CCc2sccc2C1,4.132206287,0.5895,-3.542706287\nCC[C@@H]1CN(C(=O)c2ccc3c(c2)OCO3)CC[NH2+]1,3.69238642,0.2131,-3.47928642\nCc1cc(C(=O)Nc2ccc(F)cc2C)on1,1.934252507,2.68284,0.748587493\nC[C@@H](N[C@H](C[C@@H]1CCOC1)c1ccccc1)c1ccc([N+](=O)[O-])cc1,3.168218916,4.4133,1.245081084\nO=C(NNS(=O)(=O)c1ccc(F)cc1)c1cc2c(s1)CCCC2,2.219578826,2.3893,0.169721174\nCc1cc(C#N)ccc1Oc1ccc([N+](=O)[O-])cc1C#N,2.231562188,3.43888,1.207317812\nCOc1ccc(-c2nc(CNC(=O)c3ccccc3)n[nH]2)cc1,1.980697063,2.4103,0.429602937\nCc1cccc(OCC(=O)N/N=C/c2cc(Br)c(O)c(Br)c2O)c1,2.453904832,3.46032,1.006415168\nCC1=Nc2ccccc2Oc2c1c(=O)n(-c1ccccc1)c1ccccc21,2.458237938,5.2371,2.778862062\nCOC[C@@](C)(O)CNC(=O)[C@@H]1CC(=O)N(c2ccc(F)cc2)C1,3.005123049,0.6922,-2.312923049\nFc1ccccc1[C@@H]([NH2+]Cc1ccoc1)C1CCCC1,3.69757193,3.4136,-0.28397193\nCC1CCN(C(=O)CN2C(=O)N[C@](C)(c3ccc(Cl)cc3Cl)C2=O)CC1,2.732754879,3.0189,0.286145121\nCOc1cccc(NC(=O)NCc2ccc3c(c2)N(C(=O)c2cccc(C)c2)CC3)c1,2.09420016,4.52822,2.43401984\nO=C(/C=C/c1ccccc1)NC(=S)Nc1cc([N+](=O)[O-])ccc1F,2.105372184,3.2603,1.154927816\nCCCn1nccc1NC(=O)c1nnn(-c2ccc(C)cc2)c1C,2.315214931,2.74294,0.427725069\nCOc1cccc(-c2nc(CC(=O)N[C@@H]3CCS(=O)(=O)C3)cs2)c1,2.722418907,1.6645,-1.057918907\nCOc1cc(NC(=O)c2cnn(-c3ccccc3)n2)cc(OC)c1OC,2.151091028,2.5454,0.394308972\nCOc1ccc2c(c1)cc(C(=O)N[C@H](C)Cc1cnccn1)n2C,2.807843567,2.3379,-0.469943567\nCC(C)COCCNC(=O)NNc1ccccc1Cl,2.165718029,2.6387,0.472981971\nCCC[C@@H](C)[N
H2+]C[C@H](C)Oc1cccnc1C,4.126301903,1.90932,-2.216981903\nCc1ccc(C)c(OCCSc2ccc(N)c(C)c2)c1,2.060838323,4.36516,2.304321677\nC[C@H](CC#N)N(C)C[C@@H]1CSc2ccccc21,3.590360435,3.10988,-0.480480435\nCCN[C@@]1(C#N)CC[C@@H](Oc2cccc(F)c2F)C1,3.52564354,2.76798,-0.75766354\nCc1cc(C[NH2+][C@@H](CO)C(C)C)c(C)n1-c1ccccc1,3.596987275,2.17444,-1.422547275\nCC1CCC(NC(=O)C(=O)Nc2cccc(-c3nncn3C)c2)CC1,2.347143544,2.1155,-0.231643544\nCC(=O)Nc1cccc(C(=O)/C=C/c2ccco2)c1,2.003848944,3.1341,1.130251056\nCCCNC(=O)c1cccc(NC(=O)C(=O)N2C[C@H]3CC=CC[C@@H]3C2)c1,3.148539376,2.1895,-0.959039376\nCc1nc(C)c(C(=O)N2CCN(C(=O)Cc3ccc(Cl)cc3)CC2)o1,2.226794753,2.47194,0.245145247\nCCc1nc(CN/C(=[NH+]/C)N2CC[NH+](CC(C)C)CC2)cs1,4.614668767,-1.282,-5.896668767\nCN(C)c1ccc(C(=O)Nc2c3c(nn2C)CCC3)cc1,2.254443099,2.2271,-0.027343099\nO=C1[C@H](Nc2ccccc2O[C@H]2CCOC2)CCCN1Cc1ccccc1,2.979200776,3.4574,0.478199224\nN#Cc1cccc(CNC(=O)CCOc2ccc(C=O)cc2)c1,1.99088313,2.45608,0.46519687\nCOc1ccc(CCn2cc(C(=O)[O-])c(=O)[nH]c2=O)cc1,2.453088713,-0.8486,-3.301688713\nCN(CCc1ccccc1)C(=O)C(=O)Nc1ccc(Oc2ccncc2)cc1,2.089518656,3.5135,1.423981344\nCc1cc(C)cc(NC(=O)[C@H]2CCCN(S(=O)(=O)c3cn(C)cn3)C2)c1,2.830417176,2.07634,-0.754077176\nCc1nc(CNC(=O)[C@@H]2CCCCN2C(=O)OC(C)(C)C)sc1C,2.787253271,3.16574,0.378486729\nCCSC1=NC(=O)[C@H]2C(=N1)NC1=C(C(=O)CCC1)[C@H]2c1cccnc1,4.10432105,2.4395,-1.66482105\nCOCC[C@H](C)C(=O)Nc1ccc(Oc2cc[nH+]c3ccccc23)cc1,3.383061791,4.0573,0.674238209\nCOC(=O)c1sccc1S(=O)(=O)N1CCC[C@@H](NC(C)=O)C1,2.758371043,0.8239,-1.934471043\nCC(=O)O[C@H]1CC[C@]2(C)[C@@H]3CC[C@@]4(C)[C@@H](C[C@@H]5CCCC[C@@]54C(C)=O)[C@@H]3C[C@@H](O)[C@@]2(O)C1,4.807691125,4.422,-0.385691125\nCCNS(=O)(=O)[C@H]1CCN(CCSc2ccccc2)C1,2.893618866,1.7923,-1.101318866\nCC(C)OCCCNC(=O)N1CCC[NH+](CC2CCCCC2)CC1,3.869396076,1.682,-2.187396076\nC[C@]1(CCc2ccccc2)NC(=O)N(CC(=O)N2CCc3sccc3[C@H]2c2cccs2)C1=O,3.330561001,4.227,0.896438999\nOc1n[nH]c(-c2ccccc2)c1/C=N\\c1cccc(Cl)c1,2.575508438,4.1863,1.610791562\nC[C@@](NC(=O)c1scnc1C1CC1)(C(N)=O)c1cccc
c1,3.152593137,2.151,-1.001593137\nCn1cc(C[NH2+]C2CN(C(=O)OC(C)(C)C)C2)c(C(C)(C)C)n1,3.531149004,1.4003,-2.130849004\nO=C(Cn1ncc2c([nH]c3ccccc32)c1=O)N1CC[C@@H](c2ccccc2)C1,2.801434255,2.8939,0.092465745\nCOc1cccn(CC(=O)N2CCC[C@H]2c2nccs2)c1=O,2.926470384,1.6771,-1.249370384\nCc1ccc(S(=O)(=O)N2CCCSCC2)c(C)c1,2.172121888,2.43104,0.258918112\nCC(C)CSc1ccc(N)c(N)n1,2.677260409,1.9941,-0.683160409\nCc1ccccc1O[C@@H](C)CC[C@H](C)[NH+]1CCCCC1,4.183468302,2.99982,-1.183648302\nCC1=CC(=O)C(C)=C(C)C1=O,2.554710776,1.4209,-1.133810776\nCOC(=O)c1c(NC(=O)c2cc3cc(Br)ccc3o2)sc2c1CCC2,2.326967163,4.7844,2.457432837\nO=C(c1cc(=O)c2ccccc2o1)N1CCN(c2ccc(-c3noc(C(F)(F)F)n3)c[nH+]2)CC1,3.316210434,2.6383,-0.677910434\nCCN1C(=O)c2cc(NC(=O)c3cccc(OC)c3OC)ccc2Oc2ccccc21,2.188834997,4.7285,2.539665003\nCCC[C@H](C)C(=O)NCc1c(O)ccc2c1CCCC2,2.79815046,3.3234,0.52524954\nCc1c(CCC(=O)NC2CC2)c(=O)n2nc(-c3ccccc3)nc2n1Cc1ccccc1,2.39384803,3.12592,0.73207197\nCC(C)(C)C(=O)N1N=C(c2cccc(NS(C)(=O)=O)c2)C[C@H]1c1cccs1,3.033061881,3.8434,0.810338119\nCOc1ccc(/C=C/C(=O)Nc2ccc(Cl)c(-c3nc4ncccc4o3)c2)cc1,2.282822064,5.2037,2.920877936\nCc1cc(C(=O)Nc2ccc(C[NH+]3CCCC3)cc2)c2cnn(Cc3cccs3)c2n1,3.308540728,3.28052,-0.028020728\nC=CCn1c(CC)nnc1Sc1nc2c([nH]1)c(=O)n(C)c(=O)n2C,2.9507363,0.4514,-2.4993363\nCCN(C(=O)[C@H](C)c1cccc(F)c1)c1ccc(C#N)c(Cl)c1,2.787640646,4.50738,1.719739354\nCOC(=O)CSc1nnc(-c2ccc(OC)cc2OC)o1,2.12901572,2.0189,-0.11011572\nC[C@@H](NC(=O)CCc1cccc(F)c1)c1ccc2c(c1)CCC(=O)N2,2.559093635,3.5204,0.961306365\nCOC(=O)[C@H](NC(=O)c1cc(C)on1)c1ccc(F)c(F)c1,2.709930274,1.90532,-0.804610274\nO=C([O-])c1ccccc1NCc1ccccc1[N+](=O)[O-],2.300858927,1.5704,-0.730458927\nCc1cc(=O)c(C(=O)NCCc2ccccc2F)nn1-c1ccccc1F,2.138447824,2.79162,0.653172176\nCC(C)[C@H](NC(=O)c1ccccc1)C(=O)N[C@H](C)C(N)=O,2.47185794,0.431,-2.04085794\nCc1ccccc1OC1CN(c2ncccc2C(=O)N2C[C@@H]3CC[C@H]2C3)C1,3.852979496,3.28212,-0.570859496\nCCc1nc(CN2CCC[C@H]([NH2+]CC3CCN(C(C)=O)CC3)C2)no1,3.812233153,0.4183,-3.393933153\nCCN(CC)C(=O)CSc1c(C)cnc
2c1c(=O)n(C)c(=O)n2C,2.632741217,0.90112,-1.731621217\nCCOc1cc2c(cc1OCC)C[NH+](Cn1nc3n(c1=S)CCCCC3)CC2,4.018751641,2.53659,-1.482161641\nO=C(CNCC(F)(F)F)N(Cc1cccc(O)c1)C1CC1,2.389515685,2.0351,-0.354415685\nCc1cc(CN2CCn3c(nnc3-c3ccccc3)C2)ccc1-n1cncn1,2.418767322,2.85002,0.431252678\nC[C@@H]1C[C@@H](C)CN(S(=O)(=O)c2cccc(C(=O)Nc3nc(C4CC4)cs3)c2)C1,3.076699104,3.9394,0.862700896\nCOC(=O)c1c(NC(=O)CCC(=O)[O-])sc2c1CCCCCC2,2.562728956,1.6623,-0.900428956\nS=C(Nc1cccnc1)N(Cc1ccccc1)C[C@H]1CCCO1,2.556251215,3.4596,0.903348785\nC[NH+]1CC[C@@H](N2CC3(CCC2=O)CC[NH+](Cc2c[nH]cn2)CC3)C1,6.165717612,-1.5158,-7.681517612\nCC[C@@](C)([C@@H]([NH2+]C)C1C[C@@H](C)C[C@@H](C)C1)[NH+]1CCCCC1,5.7683801,1.858,-3.9103801\nCn1cc(NC(=O)N2CCN(c3ccc([N+](=O)[O-])cc3)CC2)c2ccccc2c1=O,2.24714082,2.8008,0.55365918\nCC(C)C[C@H](NC(=O)c1cc(-c2cccn2C)n[nH]1)c1ncnn1C,3.319192988,2.0609,-1.258292988\nCCNC(=O)CN(C1CCCCC1)S(=O)(=O)c1ccccc1,2.045861074,2.1461,0.100238926\nCCSc1cc(C(=O)NC[C@H](C)Nc2ccccc2)ccn1,2.637650604,3.424,0.786349396\nCc1noc(C(C)C)c1C(=O)N1CCN(C[C@H](C)O)CC1,2.859488392,1.24502,-1.614468392\nCSc1ccc(Cl)c(C(=O)Nc2cc(C(=O)N(C)C)ccc2Cl)c1,2.013567479,4.6694,2.655832521\nCC(=O)c1ccc(OC[C@@H](O)CN2CCn3c(nn(C)c3=O)C2)cc1,3.001332769,0.0399,-2.961432769\nCc1ccc([C@H]2C[NH+]=C(N)N2Cc2ccccc2)c(C)c1,3.297715153,1.25564,-2.042075153\nCc1nc(SCC(=O)c2ccc3[nH]c(=O)[nH]c3c2)nc(C)c1C,2.466697225,2.54646,0.079762775\nCn1cc(C(=O)N[C@H]2CCC[C@H]3OCC[C@H]23)c(-c2ccccc2)n1,3.352877627,2.7745,-0.578377627\nCC[NH2+]C[C@]1(c2ccc(F)cc2Cl)CCC[C@@H]1C,4.165842394,3.1202,-1.045642394\nCCN(CCNC(=O)N[C@@H](C)c1ncc(C)s1)c1cccc(C)c1,2.919644026,3.64664,0.726995974\nCOCC(=O)Nc1ccc(-c2noc([C@@H]3CCCN3Cc3[nH]cc[nH+]3)n2)cn1,3.85152994,1.1958,-2.65572994\nCc1cc(NC(=O)[C@H]2COc3ccccc3O2)c2ncccc2c1,2.627262029,3.32172,0.694457971\nCC[C@H]1CC(=O)N(C[C@@H]([NH3+])c2ccc(OC(C)C)cc2)C1,3.641910716,2.0153,-1.626610716\nCc1ccc(C(=O)N(C)CCC#N)cc1,1.82915826,1.9807,0.15154174\nCN(O)[C@H]1CC[C@H](c2ccc(C(C)(C)C)cc2)C1,3.08842727
3,3.9412,0.852772727\nCc1cc(C)n(-c2nc([O-])cc(C(C)C)n2)n1,3.071945177,1.47614,-1.595805177\nCO[C@@H](C)c1nc(C[NH+](C)Cc2cccc(C)c2C)cs1,4.259931405,2.68224,-1.577691405\nCCc1ncc(CN(C)C(=O)c2c[nH]nc2-c2ccc(OC)cc2)s1,2.753719275,3.3764,0.622680725\nCc1ccc2[nH]c3c(=O)n(CC(=O)N(C)c4ccc(C)c(C)c4)ncc3c2c1,2.543058432,3.46606,0.923001568\nCC[NH2+]C[C@H]1CCO[C@@H]1c1ccc(F)cc1F,4.211152742,1.6257,-2.585452742\nCn1nc(C(C)(C)C)cc1C(=O)Nc1ccc(C(N)=O)cc1,2.062408364,2.0688,0.006391636\nO=C(Nc1cccnc1)c1ccc(NC(=O)c2ncn(-c3ccccc3)n2)cc1,2.128494967,3.1669,1.038405033\nO=c1cc(C2CCC2)nc(-c2cccc(C[NH+]3CC(O)C3)c2)[nH]1,3.57334498,0.4638,-3.10954498\nC[C@@H](OC[C@H]1CCCO1)C(=O)Oc1cc(Cl)ccc1Cl,3.004023247,3.4829,0.478876753\nCOc1ccccc1C1(C(=O)Nc2ccc3c(c2)N(C(=O)C2CC2)CC3)CC1,2.34415068,3.6646,1.32044932\nO=C([O-])[C@@H]1Cc2c([nH]c3ccccc23)[C@H](C(=O)[O-])[NH2+]1,4.686340492,-2.8031,-7.489440492\nO=C1NC2(CCCC2)C(=O)N1Cc1ccc(Br)s1,2.905374606,2.8752,-0.030174606\nCC(C)N1CCO[C@H](c2nnn[n-]2)C1,4.545882887,-0.3895,-4.935382887\nCc1ccc2c(c1)C(=O)N([C@H](C)C(=O)NCCCC(C)C)C2=O,2.543137633,2.53192,-0.011217633\nCCN(CC)S(=O)(=O)N[C@H](c1ccc(F)cc1)c1nccn1C,2.880328558,1.8248,-1.055528558\nCCN(CC)C(=O)C1CCN(c2cccc(Cl)c2)CC1,1.925061949,3.4248,1.499738051\nO=C(NCC(=O)N1CCN(c2ccc(F)cc2)CC1)N[C@@H]1CCS(=O)(=O)C1,2.699990976,-0.0394,-2.739390976\nCCn1c(SCC(N)=O)nnc1[C@H](C)Oc1cccc(Cl)c1,2.695897046,2.6688,-0.027097046\nCOC(C)(C)C(=O)Cc1cccc2ccccc12,2.090800657,3.3764,1.285599343\nCC1CCN(C(=O)c2ccccc2NCC(=O)N[C@H](C)C(C)C)CC1,2.513852635,3.1313,0.617447365\nCOCC[C@@H](C)C(=O)Nc1ccc2c(C)n[nH]c2c1,2.813255353,2.48242,-0.330835353\nCC[C@H]([NH3+])[C@H](Sc1cc2ccccc2[nH]1)c1cccnc1,3.893539633,3.4168,-0.476739633\nN#C[C@@H](c1ccc(Cl)cc1)c1ccc([C@H](O)c2ccccc2)cc1,2.754553872,5.07718,2.322626128\nCCS(=O)(=O)c1ccccc1NCc1c[nH+]cn1C1CC1,3.319676997,2.0428,-1.276876997\nCc1ccc(OC[C@@H](O)C[NH+](C)Cc2cc(C)on2)c(C)c1,3.945702345,1.05446,-2.891242345\nCOc1cccc2c1OC[C@@H](NC(=O)Nc1cncc3ccccc13)C2,2.816092277,3.3686,0.
552507723\nCc1cc([C@H]2CCCN2CCc2ccc([N+](=O)[O-])cc2)no1,2.770692362,3.27082,0.500127638\nCOc1ccccc1-n1nc(C(=O)NC2CCCCCC2)c2ccccc2c1=O,2.082487998,3.8469,1.764412002\nO=C(Cn1ccc(=O)[nH]c1=O)NC[C@H]1CCOC1,2.849533882,-1.3107,-4.160233882\nCC(=O)N1C(C)(C)C=C(CO)C1(C)C,3.451082344,1.3244,-2.126682344\nCc1cnc(CNCc2c[nH+]cn2C2CC2)cn1,3.652209805,1.02532,-2.626889805\nCc1cc(C(=O)N2CCc3cc(Cl)cc(Cl)c3C2)[nH]n1,2.669253158,3.22342,0.554166842\nCCOC(=O)[C@H](CC)N1CCC(C(=O)N2CCCCCC2)CC1,2.5693103,2.4427,-0.1266103\nCC(C)[C@H](C)NC(=O)C(=O)Nc1cc(Br)ccc1-n1cccn1,2.804510976,2.734,-0.070510976\nO=C(CSc1ccc(F)cc1)Nc1nnc(-c2ccccn2)o1,2.191540921,3.0015,0.809959079\nC/C=C/c1ccc(OCC(=O)Nc2ccc3[nH]c(=O)[nH]c3c2)c(OC)c1,2.253707868,2.9154,0.661692132\nCN(CC[NH+](C)C)C(=O)c1cc(F)c(Cl)cc1Cl,3.312684493,1.349,-1.963684493\nO=[N+]([O-])c1cnccc1NCc1cccc(F)c1,2.013819951,2.741,0.727180049\nCc1ccc(F)c(NC(=O)c2cccc(CS(C)(=O)=O)c2)c1,1.842203357,2.93102,1.088816643\nCc1nc(N2CCN(c3ccccc3O)CC2)nc2cc3c(cc12)CCC3,2.344492661,3.45912,1.114627339\nCC(C)c1ccsc1C(=O)N1CCN(c2ncccc2C#N)CC1,2.467553166,3.10058,0.633026834\nCc1ccc(S(=O)(=O)N2CCC[C@@H](c3nc(-c4ccccc4Br)no3)C2)cc1,2.679753116,4.37582,1.696066884\nCCc1ccc2nc(C)cc(C(=O)N[C@H]3CCC(=O)N(C)C3)c2c1,2.726969629,2.45622,-0.270749629\nCS(=O)(=O)[N-]c1ccc(C2=NN/C(=N\\c3ccccc3)SC2)cc1,3.686623439,3.3796,-0.307023439\nCOc1ccc(Br)cc1C[NH2+][C@H]1CCCC1(C)C,3.803916035,3.0998,-0.704116035\nO=C1CCC[C@@H]1CC[S@@](=O)c1cc(Cl)ccc1Cl,3.495879625,3.8603,0.364420375\nC[C@@H]([NH2+]Cc1ccn[nH]1)c1ccc(Cl)s1,4.249743882,1.9492,-2.300543882\nNC(=O)[C@@H]1CCCN1c1nc(-c2cccnc2)nc2scc(-c3ccccc3)c12,2.881834056,3.8744,0.992565944\nCOc1ccccc1CNC(=O)c1nn(C)c(=O)c2ccccc12,1.894141175,1.8721,-0.022041175\nCc1ccc(F)cc1S(=O)(=O)NCCCCn1ccccc1=O,2.142833523,2.05452,-0.088313523\nCc1cc(C)nc(N(C)CC(C)(C)O)n1,2.621362705,1.30054,-1.320822705\nCOC(=O)c1sc2ccc(NC(=O)NC(C)C)cc2c1OC,2.204937973,3.2264,1.021462027\nOc1c(F)ccc2cc[nH]c12,2.532800833,2.0126,-0.520200833\nCc1cccc(-c2nc(C[NH+]
3CCC[C@@H](Cn4cncn4)C3)no2)c1,4.25447131,1.13162,-3.12285131\nO[C@H]1Cc2ccccc2[C@H]1[NH2+]Cc1nccn1Cc1ccccc1,3.674403729,1.6531,-2.021303729\nCc1cccc(S(=O)(=O)[C@H](C)C(=O)Nc2cc(F)ccc2F)c1,2.496263936,3.07412,0.577856064\nO=C1C([O-])=C(O)C(=O)c2ccccc21,2.704172084,0.1955,-2.508672084\nCC(C)(C)CC(=O)Nc1cc(F)c(F)cc1Cl,2.036293505,3.9929,1.956606495\nCOc1ccc(C(=O)NC(=S)Nc2ccc3oc(-c4ccc(O)c(Br)c4)nc3c2)cc1,2.285489018,5.0983,2.812810982\nCOC(=O)C1(NCc2cnc3ccccn23)CCCC1,2.69067309,1.9097,-0.78097309\nCOC(=O)c1cc(-c2oc3ccc(Br)cc3c(=O)c2C)cc([N+](=O)[O-])c1,2.387240107,4.22572,1.838479893\nFc1ccc(CNC2=NC=NC3=NC=N[C@H]32)cc1,3.602406741,1.1647,-2.437706741\nOC[C@@H](CCc1cccs1)c1cccc(F)c1,2.630519417,3.5959,0.965380583\nC[C@H]1CCc2c(C(=O)NCc3ccc(S(N)(=O)=O)cc3)csc2C1,2.683701432,2.4503,-0.233401432\nCOC(=O)NCc1ccc(NCc2ccc(Br)c(C)c2)cc1,1.910394325,4.22562,2.315225675\nCNS(=O)(=O)CC(=O)N[C@H](C)c1ccc(SC(C)C)cc1,2.831783505,1.9135,-0.918283505\nc1cc(N2CCCCCC2)ccc1C[NH2+]Cc1nnc2n1CCC2,3.039500112,1.8683,-1.171200112\nCC[C@@H](C)NC(=O)[C@@H](C)S(=O)(=O)Cc1ncc(Cl)n1C,3.496099959,1.2915,-2.204599959\nCC(=O)c1cccc(OC(=O)/C=C/c2cn(CCC#N)c3ccccc23)c1,2.377963524,4.37638,1.998416476\nCC(C)c1ccccc1NC(=O)C1CCN(S(C)(=O)=O)CC1,1.946704896,2.4201,0.473395104\nC[NH+](C)[C@H]1CC[C@@H](NC(=O)N2CCN(c3nnnn3-c3ccccc3)CC2)C1,3.830633555,-0.4405,-4.271133555\nCCn1cc(C(=O)C(=O)N(C)c2c(Cl)cccc2Cl)cn1,2.653209612,3.0555,0.402290388\nCC(C)CCN1N[C@H](C2=NC(=O)C3=C4CCCC[C@@H]4SC3=N2)c2ccccc2C1=O,4.330683963,4.0574,-0.273283963\nCc1ccnc2nc(C(=O)N(CCC[NH+](C)C)Cc3ccccc3)nn12,3.171964209,0.60972,-2.562244209\nCC[C@H](NC(=O)c1ccc([N+](=O)[O-])cc1F)c1ccc(OC)cc1,2.384004406,3.6236,1.239595594\nCCc1ccccc1NC(=O)N[C@H](C)C(=O)N1CCCC[C@@H]1C,2.748723049,3.16,0.411276951\nO=C(CS(=O)(=O)c1ccc2ccccc2n1)N1CCCC1,2.204420544,1.6309,-0.573520544\nCc1nc(-c2ccc(S(=O)(=O)N[C@H](C)C[NH+]3C[C@@H](C)C[C@@H](C)C3)cc2)co1,4.544819199,1.87762,-2.667199199\nCCOc1ccccc1NC(=O)C[C@@H]1CSc2nc(C(C)(C)C)cc(=O)n21,3.009533648,3.6151,0.60
5566352\nC[C@H]([NH2+]Cc1ccc(F)c(CO)c1)c1ccc(-c2cccs2)cc1,3.417228851,3.8711,0.453871149\nC[C@@H](Oc1ccccc1Cl)C(=O)Nc1cccc(I)c1,2.252775185,4.3506,2.097824815\nCOc1ccc(CCNC(=O)c2c[nH]nc2-c2ccccc2)cc1,2.153124233,3.0578,0.904675767\nC[NH+]1CCN(CC(=O)NC[C@@H]2COc3ccccc3O2)CC1,3.69680263,-1.2271,-4.92390263\nCOc1ccc2c(c1)N(C(=O)c1cccc([N+](=O)[O-])c1)CCO2,2.081577406,2.6426,0.561022594\nCCc1nc(C(=O)Nc2ccc(N(C)C(=O)OC)cc2)nn1-c1c(Cl)cccc1Cl,2.515200631,4.5914,2.076199369\nO=C(Nc1sc2c(c1C(=O)Nc1ccccc1)CCC2)c1ccccc1Cl,1.933911169,5.3948,3.460888831\nCc1ccc(Sc2ccc(=O)n(-c3ccc(C(=O)[O-])cc3)n2)cc1,2.544596831,2.05562,-0.488976831\nO=C1CCCN1CC#CC[NH+]1CCCC1,4.434869482,-0.7091,-5.143969482\nCOc1cc2c(cc1C[NH+]1C[C@@H]3CCC[C@H](C1)C3(O)c1ccccn1)CCC2,5.299426023,2.2815,-3.017926023\nCC(C)CN(C)CN1C(=O)C(=O)N(C23CC4CC(CC(C4)C2)C3)C1=O,4.036679938,2.2913,-1.745379938\nC[C@@H](O)c1ccc(N(C)[C@H](C)c2ccccc2F)cc1O,3.081308678,3.782,0.700691322\nCOc1ccc(NC(=O)N[C@@H]2CCO[C@@H]2C)c(Br)c1,3.049111174,2.7566,-0.292511174\nCn1c(C[NH+]2CCC(CS(N)(=O)=O)CC2)nnc1-c1ccccc1,3.609551356,-0.4345,-4.044051356\nCCc1cccc(NC(=O)Cc2nc(Cc3ccc(F)cc3)n[nH]2)c1,2.201228159,3.2782,1.076971841\nCc1cc(O)c(-c2cc(C(=O)Nc3cccnc3)n[nH]2)cc1C,2.368833974,3.04644,0.677606026\nCc1cnc(CN2CCO[C@H](CN(C)c3cccn[nH+]3)C2)o1,4.141147342,0.52932,-3.611827342\nCC[NH2+][C@@H](Cc1ccc(Br)s1)[C@@H]1CSCCO1,4.741884708,2.137,-2.604884708\nO=C(NC(=S)Nc1ccc(O)cc1)c1ccccc1[N+](=O)[O-],1.926072916,2.4272,0.501127084\nCC(=O)Nc1cccc(NC(=O)C(=O)N(C)CCc2cccs2)c1C,2.321622474,2.65452,0.332897526\nClc1cccc(CC[NH2+]Cc2cc[nH]c2)c1,3.452394629,1.9742,-1.478194629\nO=C(CSC(=S)N1CCCC1)N1CCc2ccc([N+](=O)[O-])cc21,2.513903323,2.5978,0.083896677\nCOc1ccc(N2C[C@@H](C(=O)NN3CCCCC3)CC2=O)c(OC)c1,2.730423213,1.5738,-1.156623213\nO=C(c1cccc(O)c1)N1CCN(C/C=C/c2ccccc2[N+](=O)[O-])CC1,2.224870018,2.7716,0.546729982\nCOc1cc(-c2cccc(CN3CCOCC3)c2)ncn1,2.083956027,1.9844,-0.099556027\nCc1csc(C(=O)[O-])c1NC(=O)c1cccc([N+](=O)[O-])c1,2.643339977,1.58052,-1.0628
19977\nCCC(CC)C(=O)Oc1ccc(S(=O)(=O)N(C)C)cc1,2.090592718,2.2785,0.187907282\nCN(C)S(=O)(=O)c1ccc(C(=O)N2CCC(Oc3nc4c(F)cccc4s3)CC2)cc1,2.426432086,3.3693,0.942867914\nCOC(=O)Cc1ccc(OCC(=O)N2CCCC2)cc1,1.744125512,1.4033,-0.340825512\nCC[C@H](Oc1ccc2c(c1)CCC2)C(=O)N1CCCCCC1,2.545510082,3.7353,1.189789918\nCO[C@H](C(=O)Nc1ccc(C(=O)NCC(C)C)cc1)c1ccccc1,2.252969911,3.3986,1.145630089\nO=C1C[C@@H](CNC(=O)Nc2ccc3c(c2)COC3)c2ccccc2N1,2.934345456,2.9643,0.029954544\nC=CCN(CC(=O)[O-])C(=O)CSc1cccs1,3.162627302,0.6047,-2.557927302\nCCOC(=O)C1CCN(CN2C(=O)NC(c3ccccc3)(c3ccccc3)C2=O)CC1,2.379611396,2.7146,0.334988604\nCC(C)([C@H](O)[C@H]1CCOC2(CCCC2)C1)[NH+]1CCCCC1,5.244483813,1.9341,-3.310383813\nNc1cc(-c2nc3cc(-n4cnnc4)ccc3o2)ccc1Cl,2.535858453,3.3111,0.775241547\nC[NH+](C)[C@@H]1CC[C@H](NC(=O)C(C)(C)CCCc2ccccc2)C1,3.853817039,2.2173,-1.636517039\nCCOc1ccc([C@H](C)NC(=O)c2ccc(F)cc2F)cc1,2.159335886,3.8545,1.695164114\nCOC(=O)C(C)(C)[C@@H]1CCC[NH+](Cc2nnc(C)n2C2CC2)C1,4.613226345,0.91552,-3.697706345\nCCc1cnc(CN(C)C(=O)c2ccc(SC(F)F)cc2)s1,2.537210369,4.2924,1.755189631\nC[NH2+]Cc1cc(=O)[nH]c(-c2ccc(F)cn2)n1,3.604829082,-0.3358,-3.940629082\nC[C@@H](NC(=O)COc1ccccc1F)c1ccccc1,2.002354089,3.0819,1.079545911\nCC[C@@H](C)Oc1cccc(C(=O)Nc2cccc(C(=O)N3CC[NH+](C)CC3)c2)c1,3.343060863,2.0867,-1.256360863\nCOC(=O)c1cccc(CNC(=O)Nc2cc(F)ccc2OC)c1,1.778129704,2.9426,1.164470296\nNC(=O)c1cncc(C#CC2(NC(=O)C3CCCC3)CCCCC2)c1,2.735607005,2.5413,-0.194307005\nO=C(CN1CC2(CCOCC2)Oc2ccccc2C1=O)NCCN1CCCC1=O,3.166655355,0.809,-2.357655355\nCC[C@H]1CCC[C@@H](C(=O)[C@@](C)(CC)N2CCOCC2)C1,3.690512322,3.2728,-0.417712322\nCOc1ccc(OC)c(-n2nnnc2S[C@@H](C)c2nc3ccccc3n2C(F)F)c1,3.036983258,4.2776,1.240616742\nCC(=O)NN1C(N)=C(C#N)[C@@H](c2ccc(Cl)cc2)C2=C1CC(C)(C)CC2=O,3.23336918,3.12738,-0.10598918\nCc1cc(-c2cncnc2C2CCN(C(=O)c3cn(C)nc3C)CC2)on1,2.788500188,2.50174,-0.286760188\nC[C@H](NC(=O)Cc1cc2ccccc2[nH]c1=O)c1ccc2ccccc2c1,2.464508914,4.1012,1.636691086\nCc1ccc(CC(=O)N2CCC[C@H](C)C2)cn1,2.432306191,2.19102,-0
.241286191\nCCn1/c(=N/C(=O)Cn2cccn2)[nH]c2ccc(Br)cc21,3.05958439,2.0758,-0.98378439\nCc1cc(C(=O)N2CCN(Cc3cnc4ncccn34)CC2)c(C)o1,2.619295467,1.89714,-0.722155467\nCOc1ccccc1CNC[C@@H](c1ccc(C)o1)N1CCOCC1,2.661936754,2.75972,0.097783246\nC#CCN(Cc1ccccc1Cl)C1CCCC1,2.276202846,3.7178,1.441597154\nCCn1cc(Cn2cc(NC(=O)C3CCN(S(C)(=O)=O)CC3)cn2)cn1,2.468456254,0.7579,-1.710556254\nCC[C@@H](C)NC(=O)[C@@H](C)NCc1cccc(C(N)=O)c1,2.653152917,1.1783,-1.474852917\nNC(=O)C1CC[NH+](C/C=C/c2ccccc2)CC1,3.578814538,0.48,-3.098814538\nc1ccc(-c2cc(CSc3ncnc4sccc34)on2)cc1,2.382726154,4.6386,2.255873846\nC[C@H](NC(=O)N(C)C)c1cccc(C#CCCO)c1,2.827051719,1.7527,-1.074351719\nCC(=O)CCc1ccc(O[C@H](C)C(=O)N[C@H]2CCCC[C@@H]2C)cc1,2.92425479,3.6704,0.74614521\nO=Cc1c(C2CC2)oc2ccc(OCc3ccc(Cl)nc3)cc12,2.624109703,4.7501,2.125990297\nCN(Cc1cscn1)c1ccc(S(=O)(=O)C(C)(C)C)nc1,2.900280867,2.7467,-0.153580867\nCC(C)[NH+]1CCN(c2ccc(C(=O)N3C[C@H](C)OCC3(C)C)cc2)CC1,4.047651791,1.4394,-2.608251791\nO=C(NCCC[NH+]1CCCCC1)N1CCO[C@H](c2ccccc2)C1,3.814447251,1.2284,-2.586047251\nC[C@H](C(=O)N(C)CCC#N)[NH+](C)Cc1ccccc1,3.961452338,0.46188,-3.499572338\nO=C(/C=C/c1ccc(Cl)cc1)Nc1nnc([C@H]2CCCO2)s1,2.806479793,3.6949,0.888420207\nCC[C@@](C)([C@H](NC)c1cccc(OC)c1)[NH+]1CCCCC1,4.2931591,2.1932,-2.0999591\nCn1c(=O)c2[nH]c(NCCCCl)nc2n(C)c1=O,2.736699884,0.0011,-2.735599884\nCc1csc(C2(NC(=O)CCCc3nnc(-c4ccccc4)o3)CCCC2)n1,2.643283154,4.40992,1.766636846\nCCSc1ccc(Cl)cc1C(=O)Nc1cccnc1C,2.132314147,4.40772,2.275405853\nC[C@@H]1CCCN(C(=O)N[C@@H]2CCCCC[C@@H]2[NH3+])C1,3.76711659,1.3711,-2.39601659\nO=C(CN1CCCOC1=O)N1CCC[C@@H]([NH+]2CCCCCC2)C1,4.143119308,0.2786,-3.864519308\nCc1cc(SCCCSCC#N)nc2ccccc12,2.540944048,4.2822,1.741255952\nO=C(CCNc1ccccc1[N+](=O)[O-])OC1CCC1,2.06472993,2.4925,0.42777007\nCCCSC1=NC(=O)[C@H]2C(=N1)NC(=O)C[C@H]2c1cc(Br)ccc1OC,3.954159567,3.1153,-0.838859567\nC=CCN(Cc1cccc(C#N)c1)C[C@H](O)c1cccc(F)c1,2.763633613,3.41898,0.655346387\nO=C([O-])c1cccc2c1ccn2Cc1cscn1,2.841347083,1.5096,-1.331747083\nC[C@@H]1CC
CCN1C(=O)c1ccc(NS(=O)(=O)c2ccc3c(c2)=[NH+]C(=O)[NH+]=3)cc1,3.780275122,-1.964,-5.744275122\nCC[C@@H](Cl)CCc1cc(C)cc(F)c1,2.71715977,4.08412,1.36696023\n[NH3+][C@H](Cc1cccc(F)c1F)c1cc(Cl)sc1Cl,3.588641574,3.8588,0.270158426\nCc1cnc(N(C)CC2CCCC2)c([N+](=O)[O-])c1,2.421682195,2.92462,0.502937805\nCc1cccc2c(CCC(=O)N3CCC[C@@H](n4cncn4)C3)c[nH]c12,2.996277595,2.86412,-0.132157595\nCc1noc(C)c1[C@@H]1CCCN1C(=O)CN1CCNC1=O,3.158593888,0.98014,-2.178453888\nCCCCCCSc1nc2ncccn2n1,2.464580554,2.7967,0.332119446\nCc1ccc(OC(=O)C2CCN(C(=O)c3ccc(F)cc3)CC2)c(C)c1,1.905636655,3.90034,1.994703345\nCC[C@]1(C2CC2)CC(=O)NC(=O)[C@H]1Cc1ccccc1,3.395847148,2.6982,-0.697647148\nO=C(CSc1ncc2ccccn12)N1CCc2sccc2C1,2.587505606,3.0728,0.485294394\nCC(=O)Nc1ccc(NC(=O)c2ccccc2C(=O)c2ccc(F)cc2)cc1Cl,1.8770974,4.9208,3.0437026\nCc1ccc(Cl)cc1NC(=O)N1CC[C@@H](N(C)C(=O)OC(C)(C)C)C1,2.673373595,4.12152,1.448146405\nNc1nc(CC(=O)N2CCC[C@@H](c3[nH]ncc3-c3ccccc3)C2)cc(=O)[nH]1,3.16264224,1.6909,-1.47174224\nC[C@H](NS(=O)(=O)c1ccccc1)C(=O)N1C[C@H](C)c2ccccc21,2.879521476,2.5037,-0.375821476\nCc1cccc2cc(C(=O)NC[C@H](C)C(=O)[O-])oc12,3.101639588,0.85702,-2.244619588\nCNC(=O)c1cccc(OC(=O)c2oc3ccc(OC)cc3c2C)c1,2.081208379,3.32862,1.247411621\nCC[C@@H](NC(=O)Nc1cc(COC)ncn1)c1ccc(C)c(F)c1,2.843322787,3.34332,0.499997213\nCCc1nn(C)cc1NC(=O)N1CCN(C(=O)[C@H]2CCCO2)CC1,3.02663316,0.8376,-2.18903316\nCC(C)Nc1nnc(SCc2cn3cc(Cl)ccc3n2)s1,2.651955928,3.9518,1.299844072\nCC(=O)Nc1cccc(N[C@H](C)C(=O)NC[C@H](C)c2ccccc2)c1C,2.750741611,3.67372,0.922978389\nCC(C)(CCC(=O)OC(C)(C)C)NC(=O)c1ccc(Br)o1,2.503298166,3.6724,1.169101834\nCCOC(=O)c1c(NC(=O)c2cc3cccc(OC)c3oc2=O)sc(C)c1C,2.294956795,3.90894,1.613983205\nCC1(C)C[C@H]([NH2+]C(C)(C)C(N)=O)C(C)(C)O1,4.587654623,0.1598,-4.427854623\nFc1cccc(C[NH2+][C@H]2CCC[C@@H](C(F)(F)F)C2)c1,4.065181131,3.0102,-1.054981131\nCOc1ccc(-c2nc(S(=O)(=O)c3ccccc3)c(NCc3ccc4c(c3)OCO4)o2)cc1,2.369919411,4.5238,2.153880589\nCCN(C(=O)c1sc(NC(=O)c2ccco2)cc1C)[C@@H](C)C(C)C,3.067582502,4.40842,1.340837498\nC
C(=O)c1ccc(N2CCN(CC(=O)Nc3ccc(F)cc3)CC2)cc1,1.745600799,2.789,1.043399201\nCC(C)(C)c1ccccc1NC(=O)CNc1cc2c(cc1Cl)NC(=O)CO2,2.379469962,4.019,1.639530038\nCOc1ccc(NC(=S)NC(=O)c2ccccc2OC(C)C)cc1Cl,1.95587331,4.2626,2.30672669\nCOCCn1ccc2cc(NC(=O)c3cccs3)ccc21,2.079198728,3.6015,1.522301272\nCCOc1ccc(C(=S)NC[C@@H]2CCCO2)cc1,2.528851633,2.5294,0.000548367\nCCCc1c(C(=O)NC2CCN(C(=O)OCC)CC2)cnn1-c1ccccc1,2.229874275,3.1755,0.945625725\nC#Cc1cccc(NC(=O)NCCC[NH+]2CCCCCC2)c1,3.486046434,1.6384,-1.847646434\nCCc1ccc([C@@H](O)[C@@]2(C)CCCO2)cc1,3.291365597,2.8515,-0.439865597\nCOC(=O)[C@@]1(F)CCN(C(=O)Cc2cccc(OC)c2)C1,2.848120431,1.3513,-1.496820431\nCNC(=O)CC1CCN(C(=O)c2nc(-c3ccccc3)oc2C2CC2)CC1,2.470756132,3.2073,0.736543868\nO=C1N[C@H](CO)C(=O)N2CCN(Cc3cnn(-c4ccccc4Cl)c3)C[C@H]12,3.293818928,0.0292,-3.264618928\nCNc1nc(C)cc(-c2cccc3cnccc23)n1,2.234845867,3.04192,0.807074133\nCCn1cc(-c2nnc(SCc3ccccc3Cl)n2C)cn1,2.291699181,3.6442,1.352500819\nO=C(CSc1nc(-c2ccco2)nc2ccccc12)Nc1nccs1,2.346398845,4.0771,1.730701155\nO=C1CCc2cc(S(=O)(=O)Oc3cccc(C(F)(F)F)c3)ccc2N1,2.341990367,3.3578,1.015809633\nCC[C@@H](C)c1noc([C@H]2CCCN(C(=O)Cc3cccs3)C2)n1,3.228844727,3.5933,0.364455273\nCCCOc1ccc(Br)cc1C[NH+]1CCC(c2noc(C)n2)CC1,3.614952536,2.89182,-0.723132536\nCCNc1cc[nH+]cc1S(=O)(=O)[N-]c1ccc(C)[nH+]c1C,5.185505424,1.75754,-3.427965424\nCc1cc([C@H]([NH3+])C2CCCCCC2)sc1Br,3.776193085,4.07242,0.296226915\nCC1(C)CCC[C@H]1NS(=O)(=O)c1ccc(N)cc1Cl,3.001833462,2.7792,-0.222633462\nCc1ccc(Cl)c(-n2nnc(C(=O)Nc3ccccc3F)c2C)c1,2.064234641,3.92894,1.864705359\nCCOC(=O)[C@H]1CCCN(C(=S)Nc2ccccc2F)C1,2.457567813,2.7976,0.340032187\nC=CC[C@H](C)[C@@H](C)[NH2+][C@@H](C)CS(C)(=O)=O,4.867243714,0.5836,-4.283643714\nCOc1cc([C@@H]2CC(O)=Nc3c(C(=O)[O-])cn(-c4ccc(Cl)cc4)c32)cc(OC)c1O,3.620591927,3.3406,-0.279991927\nCCC([NH3+])(CC)C(=O)N[C@@H]1C[C@H]2C[C@@H]1[C@H]1CCC[C@@H]12,5.358506374,1.728,-3.630506374\nO=C(CNC(=O)N1CCC(OCc2ccccc2F)CC1)N1CCCCC1,2.212233062,2.5288,0.316566938\nCOC(=O)c1ccc(Oc2ncnc(Oc3cccc4cccnc34)
c2[N+](=O)[O-])cc1,2.403730151,4.3042,1.900469849\nCC[NH2+]C[C@@H](O)CN1C(=O)CCCC1=O,4.011495477,-1.5303,-5.541795477\n[NH3+][C@H]1C=C[C@@H](C(=O)NC2(CC(=O)[O-])CCCCC2)C1,4.700839766,-0.8679,-5.568739766\nCOc1cccc([C@H]2C(C(=O)c3ccc(Cl)cc3)=C([O-])C(=O)N2c2cc(C)on2)c1,3.126908422,3.23022,0.103311578\nCC(C)CC[NH2+]Cc1cc(F)ccc1F,3.319744001,2.0743,-1.245444001\nCC(C)C[C@H](NC(N)=O)C(=O)NCCc1ccc(Cl)cc1Cl,2.501281448,2.7351,0.233818552\nCc1nc(C(=O)N(C)Cc2ccc(C(F)(F)F)cc2)nn1-c1ccc(F)cc1,2.28443192,4.00582,1.72138808\nCCCc1nnc(NC(=O)c2c(Cl)ccc(Cl)c2Cl)s1,2.26792167,4.7031,2.43517833\nO=C([O-])c1ccc[nH+]c1NC[C@H](c1ccccc1)N1CCOCC1,3.813179172,0.3496,-3.463579172\nCOCCOc1ccc(CNC(=O)N2CCC[C@H](C)CC2)cn1,2.587574312,2.4384,-0.149174312\nCc1nccc2c1=C[C@@H](C(=O)N[C@@H](C)c1ccccc1)C(=O)[NH+]=2,4.094326362,-1.09548,-5.189806362\nCOc1ccccc1NC(=O)c1ccc(NC(=O)Cn2nc(C(=O)[O-])c3ccccc3c2=O)cc1,2.34156721,1.6596,-0.68196721\nO=C(NC[C@@H]1COc2ccccc2C1)c1n[nH]c(=O)c2ccccc12,2.702355993,1.9042,-0.798155993\nCC1(C)CCC[C@@H]1[NH2+][C@@H]1CCCC1(C)C,4.747288252,2.7072,-2.040088252\nCC(C)n1nccc1C(=O)OC[C@H]1CN(Cc2ccccc2)CCO1,2.811890846,2.5218,-0.290090846\nCCc1ccc([C@@H](Br)c2ccc3c(c2)C[C@@H](C)O3)o1,3.294719399,4.6497,1.354980601\nCc1ccc(S(=O)(=O)N2CCCCC2)cc1C(=O)N(C)CC[NH+](C)C,2.941929517,0.38612,-2.555809517\nCC[C@@H]1CCc2c(sc(=O)n2CN2CC[NH+](C)C[C@@H]2c2ccccc2)C1,4.511671939,1.9538,-2.557871939\nC=CCC[C@H](O)c1ccc(F)cn1,3.059250513,2.2203,-0.838950513\nCC(C)(C)NC(=S)N/N=C/c1cccnc1,2.423634839,1.6781,-0.745534839\nCC1CCN(C(=O)c2ccc(CSc3nc4ccccc4[nH]3)cc2)CC1,2.05068455,4.7273,2.67661545\nCc1c(C)n(-c2ccc(Br)cn2)c2nc[n+](Cc3ccncc3)c(N)c12,2.971114521,3.11284,0.141725479\nCOc1ccc([C@H]2Nn3c(C)nnc3S[C@@H]2C(=O)N2CCCC2)cc1,3.304234512,1.97662,-1.327614512\nCc1ncc(CC(=O)N2CCc3c(c(C(=O)[O-])nn3CC3CC3)C2)c(=O)[nH]1,3.159885346,-0.82428,-3.984165346\nCOCc1nc2n(n1)CCC[C@@H]2NC(=O)Nc1cnccc1C,3.196257529,1.78452,-1.411737529\nCc1ccc(NC(=O)C[C@H](c2cccc(C)c2)n2cccc2)cc1,2.522819487,4.72314,2.2003205
13\nCC1CCN(C(=O)CN2CCN(c3cc(C#N)ccn3)CC2)CC1,2.242431466,1.33378,-0.908651466\nN#Cc1ccnc(N2CCC(CCC(=O)NCc3cccc(Cl)c3)CC2)c1,2.250365015,3.91968,1.669314985\nCOCC[C@@H](C)C(=O)Nc1cccc(Oc2ccncc2)c1,2.484806648,3.485,1.000193352\nO=C(Cn1nc(C(=O)[O-])c2ccccc2c1=O)Nc1ccc2c(c1)C(=O)c1ccccc1C2=O,2.536245035,1.1741,-1.362145035\nCC(C)(C)c1csc(CNC(=O)N[C@@H]2CCc3c(O)cccc32)n1,3.019480995,3.6329,0.613419005\nCCn1cncc1[C@@H]1OCC[C@H]1C[NH2+]C(C)(C)C,4.646004856,1.3425,-3.303504856\nO=C(CN1C(=O)N[C@]2(CCCc3sccc32)C1=O)N[C@H](c1ccccc1)c1cccs1,3.709928512,3.7988,0.088871488\nCc1ccc(C(=O)N[C@H](CC[NH+](C)C)c2ccc(Cl)cc2)s1,3.43628494,2.71562,-0.72066494\nC/C=C/C[S@@](=O)Cc1nc(-c2cccs2)oc1C,3.517488633,3.53632,0.018831367\nNc1nonc1-c1noc(COc2ccc(Br)cc2)n1,2.399600676,2.0433,-0.356300676\nCOC[C@H]1CCCN(C(=O)N[C@@H]2C[C@](C)(OC)C2(C)C)C1,3.874154216,2.258,-1.616154216\nC[NH+](C)[C@H]1CC[C@H](NC(=O)N(Cc2c(F)cccc2Cl)C2CC2)C1,3.994602094,2.2187,-1.775902094\nCC(=O)/N=C1\\S[C@@H]2CS(=O)(=O)C[C@@H]2N1c1ccc(Br)cc1Cl,3.526675094,2.7238,-0.802875094\nCc1c(Cl)cccc1S(=O)(=O)N1CCC(C(N)=O)CC1,2.029710667,1.53442,-0.495290667\nCN(Cc1ncnn1C)C(=O)[C@@H]1CC(=O)N(CC(F)(F)F)C1,3.223841813,0.1843,-3.039541813\nCC(=O)/C=C(/NNC(=O)c1cccc([N+](=O)[O-])c1)c1ccccc1,2.269320979,2.4593,0.189979021\nCC(C)c1cc(C(=O)NC[C@@H]2CCC[NH+](C(C)C)C2)n(C)n1,4.366273556,0.9766,-3.389673556\nCC[C@](C)([NH3+])CO[C@H]1CCOC2(CCCCC2)C1,4.663770551,2.2955,-2.368270551\nCc1ccc(-n2nc(C(=O)NC[C@H]3CCCO3)c(=O)n(Cc3cccc(F)c3)c2=O)cc1Cl,2.883985307,2.45222,-0.431765307\nC[C@H](Cc1cccs1)N(C)C(=O)NCC1CC1,2.835861306,2.7305,-0.105361306\nCOc1ccc(C(=O)N[C@@H](Cc2ccccc2)c2nc3ccccc3[nH]2)cc1[N+](=O)[O-],2.584541779,4.1935,1.608958221\nCCCn1c(N)c(Br)c(=O)n(CC(=O)NC)c1=O,2.537467799,-0.4893,-3.026767799\nCc1nc(/C=C/C(=O)Nc2nc(-c3ccccc3)ns2)cs1,2.462739944,3.62192,1.159180056\nCc1ccccc1OCCCC(=O)NCc1ccccc1,1.537882208,3.47042,1.932537792\nCC[C@H](C)OCc1ccc(C(=O)NN)cn1,2.8132977,1.0002,-1.8130977\nCCCc1noc(-c2cc(S(=O)(=O)NC)ccc2OC)n1,2.26194494
7,1.6058,-0.656144947\nCc1cc(C)n2cc(CN3C[C@@H](C)C[C@H]([NH3+])C3)nc2n1,4.026418302,0.79844,-3.227978302\nCc1noc(C)c1[C@@H](C)NC(=O)NCCCC[NH+]1C[C@@H](C)C[C@H](C)C1,4.640605204,1.99264,-2.647965204\nC[C@@H](NC(=O)C(C)(C)C)C(=O)Nc1ccc(F)c(C(=O)N(C)C)c1,2.557460441,2.0168,-0.540660441\nO=c1oc2cccnc2n1CC[NH+](Cc1ccccc1F)C1CCCCC1,3.845392421,2.5464,-1.298992421\nCOc1cccc(CCC(=O)N2CCC(OCc3ccccc3F)CC2)c1,2.033454364,3.9747,1.941245636\nCCn1c(N/N=C/c2ccc(N(C)C)cc2)nc2c1c(=O)[nH]c(=O)n2C,2.59704872,0.9552,-1.64184872\nCc1cc2n(n1)CCC(=O)N2[C@@H](C)C(=O)NCc1ccccn1,3.100623436,1.02812,-2.072503436\nCc1ccc(CC(=O)NC[C@@H]2COc3ccccc3O2)cc1,2.34855287,2.49372,0.14516713\nCC[C@H]1COCCN1Cc1c[nH]nc1-c1ccccc1F,3.202033955,2.8266,-0.375433955\nCC[C@H](C)N1C(=O)c2ccccc2N[C@@H]1c1ccc(Cl)c([N+](=O)[O-])c1,3.14156896,4.6132,1.47163104\nCS(=O)(=O)c1ccc(NC(=O)Cn2c(-c3ccccc3)cc3ccccc32)cc1,1.961669185,4.3505,2.388830815\nCS[C@@H]1CC[C@@H](NC(=O)/C=C/c2ccc(OCc3cccnc3)cc2)C1,3.120059729,4.0741,0.954040271\nCS[C@H](CO)[C@@H](C)NC(=O)NCc1cc(=O)[nH]c2ccccc12,3.361951895,1.4397,-1.922251895\nO=C(NCC1CC1)C1([NH2+]Cc2cnn(-c3ccccc3)c2)CCCC1,3.257340383,1.7747,-1.482640383\nCC(C)[C@@H]1CN(C(=O)NC[C@H]2CC[NH+](C3CC3)C2)CCS1,5.10628001,0.8366,-4.26968001\n[NH3+]C[C@H](CC(=O)NC1CC1)N1CCn2cnnc2C1,3.791785348,-1.6271,-5.418885348\nO=C(NN1Cc2ccccc2C1)c1cnn(-c2ccccc2)n1,2.463468958,1.9279,-0.535568958\nCOc1ccc(S(=O)(=O)[N-]c2ccc(Cl)c(Cl)c2)cc1[N+](=O)[O-],3.09463754,4.3043,1.20966246\nO=C(NCC[C@@H]1C[C@H]2CC[C@@H]1C2)C(=O)Nc1nc(-c2ccccc2)cs1,3.907753927,3.6911,-0.216653927\nCCCS(=O)(=O)c1ncc(Cl)c(C(=O)Nc2sc3c(c2C(=O)OC)CCC3)n1,2.655303848,2.9028,0.247496152\nCC(C)(C)c1csc(CNc2cc(F)cc(F)c2)n1,2.393327346,4.3309,1.937572654\nCCOc1ccc(OCC(=O)N2CCN(c3nccn3-c3cccc(Cl)c3)CC2)cc1,2.239043704,3.652,1.412956296\nCNC(=O)[C@@H]1C[C@H]([NH3+])CN1Cc1ccc(-n2cccn2)cc1C,3.745433108,0.11152,-3.633913108\nC[C@@H](Cc1nc2ccccc2s1)NCc1ncn(C)n1,3.139160511,2.1456,-0.993560511\nCCC(CC)n1ccc(Cn2nnc(N)c2C(C)(C)C)n1,3.056509207,2.7637
,-0.292809207\nCCC[C@H]1CN(C(=O)[C@H]2C[C@@H]2c2ccccc2C)CCO1,3.194344378,3.12602,-0.068324378\nC[C@H]1CCCC[C@@H]1OCCCOc1ccccc1C[NH3+],3.348375729,2.7927,-0.555675729\nO=c1oc(-c2ccccc2)nn1CN1CCN(c2ccc(F)cn2)CC1,2.311079692,1.8171,-0.493979692\nO=C(CCc1ccco1)NCc1cnn(-c2ccccc2)c1,2.047571641,2.7143,0.666728359\nCOc1ccc2ccccc2c1/C=C1/N=C(c2cccc([N+](=O)[O-])c2)OC1=O,2.34548966,4.1011,1.75561034\nO=C([O-])[C@H]1CCO[C@H]1Cc1ccccc1,3.392413353,0.3841,-3.008313353\nO=C([O-])C1[NH+]=c2ccccc2=[NH+]1,5.228431607,-5.8234,-11.05183161\nC=CCN1C(=O)N=C(N)[C@@H]1c1ccc(OC(C)C)cc1,3.176426978,2.4937,-0.682726978\nCc1n[nH]c(SCC(=O)NC2(C)Cc3ccccc3C2)n1,2.992188236,1.87892,-1.113268236\nC[C@H](NC(=O)NC[C@@H]1C[NH+](C)CCN1C)c1ccc(C(C)(C)C)cc1,4.117589731,1.173,-2.944589731\nC=CCN1CCN(S(C)(=O)=O)[C@@H](C(=O)NCc2ccccc2)C1,2.679932748,0.4346,-2.245332748\nO=S(=O)(c1cccn2c(SCc3c(F)cccc3Cl)nnc12)N1CCCC1,2.468769119,3.5986,1.129830881\nCN[C@@H](c1c[nH+]ccc1N)[C@@H]1CN2CCC[C@@H]2CO1,4.894342078,0.2066,-4.687742078\nC[C@@H](Sc1nc2ccccc2n1C)C(=O)N(C)Cc1ccccc1,2.431248545,3.7125,1.281251455\nC[C@H]1CCCC[C@@H]1[NH+](C)Cc1cc(F)cc(C#CC[NH3+])c1,5.04524194,1.0125,-4.03274194\nO=C(Nc1ccc2c(c1)CCCN2S(=O)(=O)c1ccccc1)c1cc2ccccc2o1,2.17677755,4.8266,2.64982245\nC[C@@H]1CCCC[C@H]1[NH2+][C@@H]1CCN(C2CCCCC2)C1=O,4.263772335,2.0621,-2.201672335\nCCc1nn(C)c2c1nc(N)n2[C@H](C)c1ccccc1F,3.276272156,2.6628,-0.613472156\nCC(C)CNC(=O)CNC(=O)C(=O)Nc1ccc2ccn(C)c2c1,2.263583154,1.0052,-1.258383154\nCc1nc(-c2nnc(NC(=O)c3ccc(S(=O)(=O)N4CCC[C@H](C)C4)cc3)o2)cs1,2.887521357,3.17442,0.286898643\nO=C(NCCc1ccccc1)[C@H]1CCCN(C(=O)c2cccc(Cl)c2)C1,2.231072868,3.5511,1.320027132\nCC1(C)C[C@H]([NH2+]C[C@@H]2CCC[C@@H]2NS(C)(=O)=O)C(C)(C)O1,4.794950037,0.6138,-4.181150037\nCc1ccc2cccc(NC(=O)[C@H](C)SCc3nc4scc(-c5ccccc5)c4c(=O)[nH]3)c2n1,2.981012578,5.76862,2.787607422\nCN(C(=O)CS(=O)(=O)c1nnc(N2CCCC2)s1)C1CC1,2.737794284,0.5328,-2.204994284\nCC(C)c1ocnc1C[S@@](=O)[C@@H](C)c1ccc(F)c(F)c1,3.919654427,4.0861,0.166445573\nC/C(Cn1nnc2ccc
cc21)=N\\NC(=O)c1ccc(Br)cc1,2.193402011,2.9997,0.806297989\nCCOc1ccc(-n2ccnc(SCC(=O)Nc3ccccc3Cl)c2=O)cc1,2.118518314,4.0154,1.896881686\nCc1ccc(OC(F)F)c(CN2CCN(c3nc(C)ns3)CC2)c1,2.451202487,3.07854,0.627337513\nCc1cc2ncc([C@H](C)NC3CC[NH+](Cc4ccccn4)CC3)c(C)n2n1,4.106032583,1.63924,-2.466792583\nCC[NH2+][C@@]1(C(=O)[O-])CC[C@@H](n2cnc3ccccc32)C1,4.410980043,-0.1667,-4.577680043\nCCCN1C(=O)CCc2cc(NC(=O)N(CC)CC(C)(C)O)ccc21,2.532985399,3.0005,0.467514601\nCOC(C)(C)c1noc(-c2cc(Br)cn2C(C)C)n1,3.212215459,3.763,0.550784541\nCc1nc(NC[C@H](c2ccc(F)cc2)[NH+]2CCCC2)cc(-c2cc[nH]n2)n1,4.361277329,2.14612,-2.215157329\nC[C@@H](O)CCC(=O)Nc1cccc(NC(N)=O)c1,2.34605947,1.2767,-1.06935947\nCC(C)(C)c1csc([C@H]2CN(Cc3ccc(O)cc3O)CCO2)n1,3.143730594,3.4253,0.281569406\nCc1ccc(-c2c[n+](-c3cccc(C(F)(F)F)c3)c3n2CCCCC3)cc1,2.729851836,5.48542,2.755568164\nO=c1[nH]c(CCl)nc2ccc(O)cc12,2.419924373,1.3675,-1.052424373\nCOc1cc([C@H](C)N[C@@H]2CCCc3nc(C)sc32)ccc1F,3.302915814,4.32742,1.024504186\nCc1ccc([N-]S(=O)(=O)[C@@H]2CCC[NH2+]C2)c(C)[nH+]1,6.031753882,0.17834,-5.853413882\nCc1ccccc1-n1nc2c(c1NC(=O)c1cc([N+](=O)[O-])ccc1Cl)CS(=O)(=O)C2,2.635545395,3.42302,0.787474605\nCc1ccc(NC(=O)C[C@@H]2C(=O)Nc3nc(-c4ccccc4)nn32)cc1C,2.991716945,3.08394,0.092223055\nCC(C)[C@H]1CC[C@@H](C)C[C@H]1[NH2+]CC1(CS(C)(=O)=O)CC1,4.753834564,1.8354,-2.918434564\nO=C(NCCc1nnc(-c2ccccc2)o1)c1ccc2c(c1)C(=O)NC2=O,2.221256139,1.5927,-0.628556139\nCc1ncc(CC(=O)N2CCC(=O)N(CC3CC3)[C@@H](C(C)C)C2)c(=O)[nH]1,3.25717113,1.11632,-2.14085113\nCC[NH2+][C@@]1(C(=O)[O-])CCC[C@@H]1CCN1CCS(=O)(=O)CC1,4.746850504,-2.021,-6.767850504\nCc1ccccc1NN/C=C1/C(=O)NC(=O)c2ccccc21,2.204343137,2.22262,0.018276863\nCCN(CCC(=O)NC1CC1)C(=O)c1cnc(C)cn1,2.306046425,0.91582,-1.390226425\nCc1cccc([C@H]2CCC[NH2+]2)c1,4.094583094,1.39332,-2.701263094\nCS(=O)(=O)N1CCc2cc(C(=O)Nc3cc(Br)ccc3Br)ccc21,2.207749587,3.786,1.578250413\nCC(C)Cn1ncc(C(=O)N(C)[C@@H](C)Cc2ccsc2)c1C(F)F,3.336934928,4.2414,0.904465072\nCc1ccc(F)c(NC(=O)C(=O)NCCC[NH+]2CCCCCC2)c1,3.31232
3302,1.03782,-2.274503302\nCc1cc(NC(=O)[C@H](C)NS(=O)(=O)c2cccc(Cl)c2)ccc1N1CCCC1=O,2.630654339,3.08072,0.450065661\nO=C(Cn1nc(-c2ccccc2F)ccc1=O)NC[C@@H]1CCCO1,2.613433534,1.3446,-1.268833534\nCC(C)COC(=O)N1CCC([NH2+]CCc2nnc3ccccn23)CC1,3.358366621,1.0922,-2.266166621\nCC(=O)N1CC[C@@H]([NH2+][C@H](C)c2c[nH]c3ccc([N+](=O)[O-])cc23)C1,4.043087472,1.3213,-2.721787472\nCc1cnc(C(=O)N2CCC(c3[nH]ncc3-c3ccncc3)CC2)cn1,2.56977868,2.58992,0.02014132\nCCCNC(=O)CN1CCC([NH2+][C@H](C)c2ccc(F)cn2)CC1,3.633696098,0.8357,-2.797996098\nC1CCC([NH2+]CC[C@H]2C[NH2+]CCO2)CC1,5.438513113,-0.7652,-6.203713113\nCCOC(=O)C1(C)CCN(C(=O)CSc2ccccc2Cl)CC1,2.317976552,3.6239,1.305923448\nCCN(CC)C(=O)[C@@H]1CCC[NH+]1Cc1cnc(Cl)s1,4.76589579,1.2122,-3.55369579\nCCOc1ccccc1N/C(=C\\C(=O)OC)C(=O)NC,2.268032439,1.3001,-0.967932439\nCNC(=O)COc1ccccc1-c1noc(-c2ccc(S(=O)(=O)N(C)C)o2)n1,2.535585059,1.3717,-1.163885059\nCOC(=O)CSc1cc(Cl)ccc1Cl,1.950704472,3.2585,1.307795528\nCOC(=O)[C@@H]1C(=O)C2=C(C[C@@H]1C)Nc1ccccc1N[C@@H]2c1ccc(Cl)c(F)c1,3.528144568,4.71,1.181855432\nCOCC[C@@H](C)C(=O)Nc1ccccc1S(=O)(=O)C(F)F,2.730644768,2.294,-0.436644768\nCCN(c1ccc(C(=O)NCc2ccc([S@](C)=O)cc2)cc1)C(C)C,2.670970361,3.5887,0.917729639\nCC[C@@H]1CCCC[C@H]1[C@H]([NH3+])c1c(F)cccc1F,3.827628952,3.4642,-0.363428952\nCc1cccc(C(=O)NN2Cc3ccccc3C2)c1F,2.30064616,2.79472,0.49407384\nCCn1c(=O)n(CC(=O)Nc2cc(OC)ccc2OC)c(=O)c2ccccc21,2.001306361,1.839,-0.162306361\nCc1cc(=O)c(C(=O)N2CCC[C@H](Cn3cc(CC4CCCCC4)nn3)C2)c[nH]1,3.262503707,2.95002,-0.312483707\nCc1c(-c2ccnc(N3c4ccccc4C[C@H]3C)n2)nnc2nc(-c3ccco3)nn12,3.333401872,3.62742,0.294018128\nCc1ccc(NC(=O)c2cnn(-c3ccccc3)c2C(F)(F)F)cc1S(=O)(=O)N(C)C,2.297134473,3.70212,1.404985527\nCC[C@@H](CSC)N(C)C(=O)Nc1ccc(C(=O)N(C)C)c(C)c1,2.971901104,3.30212,0.330218896\nCOc1cc2c(cc1NC(=O)c1cc(OC)c(OC)c(OC)c1)CCCC2,2.044842359,3.8521,1.807257641\nCOc1cc(C)c(Br)cc1[C@@H](Cl)[C@H]1CCO[C@@H]1C,3.81993083,4.47102,0.65108917\nCc1cnc(SCC[NH+]2CCOCC2)nc1[C@@H]1CCCN(C(=O)/C=C/c2cccnc2)C1,4.141789854,1.60662
,-2.535169854\nCc1ccc(NC(=O)CCn2nnc3cc(C(=O)NCc4cccc(Cl)c4)ccc32)cc1,2.111270605,4.35192,2.240649395\nCc1ccc(SCCC(=O)NC[C@@H]2CN3CCCC[C@@H]3CO2)cc1,3.075083992,2.84672,-0.228363992\nCn1cc(CNC(=O)c2cc(-c3ccccc3)nc3ccc(Br)cc23)cn1,2.113376849,4.3278,2.214423151\nCc1cccn2c(=O)c(C(=O)Nc3ccc([S@](C)=O)c(F)c3)cnc12,2.962959658,2.13172,-0.831239658\nCc1nonc1C(=O)NCc1csc(=O)[nH]1,3.075637706,0.05782,-3.017817706\nCOc1cc2[nH]c(C(=O)N3CCN(c4ccc(F)cc4)CC3)cc2c(OC)c1OC,2.277064802,3.2952,1.018135198\nO=C([O-])[C@H]1C[C@@H]2CCCC[C@@H]2N1C(=O)[C@@H]1CCS(=O)(=O)C1,4.106939223,-0.6693,-4.776239223\nCCn1c([C@@H](C)[NH2+]CC2([C@H](O)c3ccccc3)CC2)nc2ccccc21,3.791835426,3.1944,-0.597435426\nCCSC[C@H](C)N(C)C(=O)CCc1c[nH]c2ccccc12,2.930423354,3.7005,0.770076646\nCOc1ccc2c(C)c(CCC(=O)Nc3c(C(C)C)cccc3C(C)C)c(=O)oc2c1,2.335432391,5.92812,3.592687609\nO=C(NNC(=O)c1cccc(NC(=O)c2ccco2)c1)c1ccccc1,1.806007951,2.6067,0.800692049\nCc1ncc([C@H](C)NC[C@H]2CN3CCCC[C@@H]3CO2)c(C)n1,3.635850378,1.99734,-1.638510378\nCC[C@H]1CCCC[C@H]1n1cc(C(=O)[O-])c(=O)[nH]c1=O,3.611392278,0.0414,-3.569992278\nN#C[C@H](C(=O)c1cccc(N2CCCC2=O)c1)c1ccc(C(F)(F)F)cn1,2.994926879,3.71728,0.722353121\nCCOc1cccc([C@@H](C)[NH2+][C@H](C)Cn2cnc(C)c2C)c1,3.8512171,2.61174,-1.2394771\nO=C(OCc1nnsc1Cl)c1cccc(N2CCCS2(=O)=O)c1,2.690322495,2.0884,-0.601922495\nCc1cc(NC(=O)c2cc(Cl)ccc2Cl)n(C(C)C)n1,2.145072206,4.33152,2.186447794\nCc1ccc(OC[C@@H](O)C[NH2+]C[C@@]2(C)CCCO2)cc1,3.972963039,0.86722,-3.105743039\nCSC(C)(C)CNC(=O)c1ccc(S(=O)(=O)NC2CC2)cc1,2.376249042,1.9987,-0.377549042\nCc1ccccc1-n1nnnc1[C@@H](c1ccccc1)N(C)Cc1cccc([N+](=O)[O-])c1,2.833420949,4.10032,1.266899051\nCOc1cc(C)c(NC(=O)N2CC[C@@H](C)C[C@@H]2C)cc1OC,2.80715971,3.66452,0.85736029\nCCC[NH2+]CC/C=C(/C)c1ccc(C)c(C)c1,3.433729102,3.07024,-0.363489102\nCS(=O)(=O)c1nc(C(=O)Nc2ccc3c(c2)Cc2ccccc2-3)c2ccccn12,2.440662697,3.5613,1.120637303\nO=C(N[C@@H](c1ccc2c(c1)OCCO2)C1CC1)NC1(c2noc(C3CC3)n2)CCCC1,3.346508313,3.938,0.591491687\nCC(=O)Oc1ccccc1-c1nc2s/c(=C\\c3cc(C)c(OC(C)=
O)c(C)c3)c(=O)n2n1,2.667587844,2.83314,0.165552156\nO=C(NCC(=O)N1CCN(C(=O)c2ccccc2)CC1)c1ccc(Br)o1,2.03066504,1.7565,-0.27416504\nO=C(NC1CCCC1)[C@@H]1CCCN1C(=O)Nc1cccc(Cl)c1,2.401089697,3.3951,0.994010303\nCc1c(C(=O)N2CCC3(CC2)C(=O)NCCC[NH+]3C)cnc2ccnn12,4.613407565,-0.95288,-5.566287565\nCC[C@@H](C)CN(C)c1ccccc1C[NH2+]C1CC1,3.802154805,2.3947,-1.407454805\nO=C(Nc1ccnn1Cc1cccc(Cl)c1Cl)c1cccc(S(=O)(=O)NC2CCCC2)c1,2.387381354,4.7114,2.324018646\nCOc1ccc(C(=O)Nc2ccccc2C(=O)Nc2ccccn2)cc1,1.688012874,3.5948,1.906787126\nCC(=O)Nc1cccc(-n2cc3c(c2-c2ccccc2Cl)c(=O)n(C)c(=O)n3C)c1,2.439185247,3.3067,0.867514753\nCc1ccc2nc(C)c(C(=O)NCC(C)(C)c3ccncc3)cc2c1,2.306965961,3.95424,1.647274039\nCSc1ccsc1C(=O)NC1(c2cccc(Cl)c2)CC1,2.584660576,4.5425,1.957839424\nO=C(OCc1nnc(-c2ccccc2Cl)o1)C1CCOCC1,2.256423598,2.8598,0.603376402\nCOc1ccc(-c2csc(NC(=O)CSc3nc4ccccc4[nH]3)n2)cc1OC,2.179890746,4.4344,2.254509254\nCCc1cc([C@@H]2CCC[NH2+]2)cc(CC)c1Cl,4.274119239,2.8631,-1.411019239\nCc1nc2sc3c(NC[C@H]4CCCO4)ncnc3c2c2c1COC(C)(C)C2,3.430342051,3.99012,0.559777949\nCc1cncc(NC(=O)NCCc2cccc(F)c2)c1,1.947457496,2.89332,0.945862504\nCN(C)c1ccc(NC(=O)[C@@H]2CCCN2C(=O)C2CC2)cn1,2.57703349,1.4871,-1.08993349\nO=C1O[C@H](CCN2CCOCC2)CN1Cc1ccccc1,2.614376716,1.7297,-0.884676716\nO=C(Cn1c(Cl)nc2ccccc21)c1ccc2c(c1)OCCO2,2.226699819,3.3438,1.117100181\nCC(=O)N[C@H](CC(=O)Nc1nc[nH]n1)c1ccc(C)cc1,3.10270471,1.31912,-1.78358471\nCCC[C@@]1(CC)NC(=O)NC1=O,3.343567709,0.7747,-2.568867709\nCC(C)[C@@H](O)c1ccn(CCC(C)(C)C)c1,3.326211986,3.6137,0.287488014\nO=C(CNS(=O)(=O)c1ccccc1Cl)NCc1ccco1,2.015111156,1.5277,-0.487411156\nCCSC1=C(C#N)[C@H](CC(C)C)C2=C(CCCC2=O)N1,3.509414391,3.74728,0.237865609\nC[C@H]1CC[C@H](NC(=O)NC[C@](C)(O)c2ccsc2)[C@H](C)C1,3.828507393,3.0795,-0.749007393\nO=C(Cc1ccc(-n2cccn2)cc1)Nc1cccc(CN2CCOC2=O)c1,2.246151822,3.0057,0.759548178\nCOCCSc1ccccc1C(=O)N1C[C@H](C)O[C@@H](C)C1,2.919174745,2.6745,-0.244674745\nCN[C@@H]1[C@@H](C[NH+](C)CCO)C(C)(C)OC1(C)C,5.466499996,-0.715,-6.181499996\nCC(C)COC1CC
N(C(=O)/C=C/c2cccc3cccnc23)CC1,2.385662775,3.9116,1.525937225\nCc1cc(-n2c(C)cc(C(=O)CN3C(=O)NC(c4ccccc4)(c4ccccc4)C3=O)c2C)no1,2.659919657,4.06886,1.408940343\nCCC[NH2+][C@@H](CC[C@H]1CCCO1)[C@@H]1C[NH+]2CCN1CC2,6.424619424,-1.1297,-7.554319424\nCC[C@@H](C)NC(=O)[C@H](NC(=O)c1cccc(C)c1)C1CCN(C(=O)c2ccccc2)CC1,2.852569634,3.56052,0.707950366\nCc1ccc([C@H]2CSCCN2C(=O)N[C@H]2CC[C@H]([NH+](C)C)C2)cc1,4.245038493,1.86012,-2.384918493\nC[C@H](NC(=O)c1ccnn1C)C12CC3CC(CC(C3)C1)C2,4.194819005,2.7548,-1.440019005\nCn1ncnc1CCNC(=O)N1CCC[C@H]1Cc1ccccc1,2.744029526,1.7743,-0.969729526\nO=C([O-])COc1cc(-c2ccncc2)nc(N2CCN(c3ccccc3F)CC2)n1,2.663513821,1.133,-1.530513821\nN#C[C@H]1CNCCN1C(=O)c1ccsc1,3.291177871,0.68568,-2.605497871\nCC[NH+]1CC[C@H](N(C)CC(=O)OC(C)(C)C)[C@H](C)C1,4.803468008,0.5731,-4.230368008\nO=C(Cc1ccc2c(c1)OCCCO2)NCCNC(=O)[C@H]1C=c2ccccc2=[NH+]1,3.610596228,-1.8141,-5.424696228\nC[C@@H]1CCN(C(=O)NCCn2c(=O)[nH]c3ccccc32)C[C@@H]1O,3.115989084,0.7419,-2.374089084\nCCC1(CC)CC(=O)N(CCOc2ccccc2)C1=O,2.279792993,2.6307,0.350907007\nCSc1cccc(NC(=O)[C@@H]2CCCN2C(=O)c2cccs2)c1,2.519958928,3.7133,1.193341072\nCCOC(=O)c1ccc(N2C(=O)c3oc4ccc(F)cc4c(=O)c3[C@@H]2c2ccc(SC)cc2)cc1,2.869982169,5.5805,2.710517831\nCc1ccsc1CN[C@@H]1COC[C@@H]1O,3.438656169,0.90582,-2.532836169\nCO[C@H](C)C(=O)Nc1cccc(NC(=O)N(C)[C@@H](C)c2cccnc2)c1,2.920062765,3.2799,0.359837235\nCN(C(=O)c1ccc2c(c1)CCO2)C1CCC(=O)CC1,2.371567617,2.2052,-0.166367617\nCC(C)c1csc(/C(C#N)=C\\c2csc(-c3ccc(F)cc3)n2)n1,2.706482804,5.59328,2.886797196\nCOc1ccc(CC(=O)N2CCO[C@@H](COCc3cccc(C)c3)C2)cc1,2.523511281,2.99032,0.466808719\nO=C(COc1ccc(Cl)cc1)N[C@@H](c1ccccc1)c1cccs1,2.260327084,4.6861,2.425772916\nCCOC(=O)[C@H]1C(=O)C(=O)N(c2ccccc2)[C@@H]1c1ccc([N+](=O)[O-])cc1,2.954844007,2.4311,-0.523744007\nC[NH+](C)CCCOc1ccc(NC(=O)[C@H]2CCC(=O)O2)cc1,3.481344542,0.2441,-3.237244542\nCC(C)(C)OC(=O)N(CCNC(=O)N1CCC([NH+]2CCCC2)CC1)C1CC1,3.689186885,1.2386,-2.450586885\nCC1=C[C@@H](c2ccc(S(=O)(=O)Nc3ccc(Cl)cc3C)o2)N=N1,3.54172818,4.45
292,0.91119182\nCc1nn(-c2ccccc2)c(COC(=O)C23C[C@@H]4C[C@@H](CC(O)(C4)C2)C3)c1C#N,4.885278242,3.4269,-1.458378242\nCC(C)C(=O)Nc1cccc(NC(=O)C(=O)NCC(C)(C)c2ccncc2)c1,2.297021131,2.7086,0.411578869\nCc1cc(C)n2nc(C)c(C(=O)O[C@@H](C)[C@@H]3CCCO3)c2n1,3.472079899,2.37886,-1.093219899\nCCCN[C@]1(C#N)CC[C@@H](Sc2nc(C)c(C)c(C)n2)C1,3.865569071,3.30844,-0.557129071\nCCOC(=O)C1(c2nc([C@H]3CC[C@H](C)C3)no2)CCCC1,3.496369973,3.3481,-0.148269973\nCCCNC(=O)c1ccc2c(c1)c(C)c(C)n2C,1.967924688,2.93494,0.967015312\nCCCOc1ccc(NCC(=O)Nc2ccc(F)cc2)cc1,1.632813467,3.6651,2.032286533\nS=c1n(CN2CCSCC2)nc(N2CCCCC2)n1-c1ccccc1,2.583555394,3.39989,0.816334606\nCc1cc(C(F)F)nn1CC(=O)N1CC(c2ccncc2)C1,2.61704593,2.15012,-0.46692593\nO=[N+]([O-])c1ccc(Cl)c(S(=O)(=O)/N=C([O-])/C=C/C2CC2)c1,3.166163675,1.6619,-1.504263675\nCC[C@@H](C)n1ncc(C(=O)N2CCO[C@H](C#N)C2)c1C1CC1,3.768596419,2.09608,-1.672516419\nO=C(NC(C1CC1)C1CC1)N1CC[C@@H](Oc2ccc(C(F)(F)F)cn2)C1,3.144221022,3.4517,0.307478978\nC[C@H]1CCCC[C@@H]1OCCN(C)Cc1nc(=O)c2ccccc2[nH]1,3.075784642,2.9502,-0.125584642\nCOc1ccc(Cl)cc1S(=O)(=O)N1CCN(C(=O)c2cc3ccccc3oc2=O)CC1,2.197795481,2.6017,0.403904519\nC[C@@H]1CCN(c2ncc([N+](=O)[O-])cc2Cl)[C@H](C)C1,3.208323663,3.268,0.059676337\nCc1nc(N2CCN(C(=O)c3cccc(F)c3)CC2)cc(-n2cnc(C)c2C)n1,2.473499443,2.68906,0.215560557\nCOC(=O)[C@H]1CCCCN1C[C@H](O)COc1ccc(Cl)cc1Cl,2.853293962,2.7606,-0.092693962\nCOC[C@](C)(CCO)[NH2+]Cc1ncccc1C,4.061519898,0.24092,-3.820599898\nO=C(NC1CCCCC1)C1CCN(C(=O)c2cc3cc(Cl)cnc3[nH]2)CC1,2.420386909,3.5174,1.097013091\nCC(C)(C)N(CCC(=O)[O-])C(=O)N1CC[NH+]2CCC[C@H]2C1,5.296421228,-1.2902,-6.586621228\nCOC(=O)c1cc(NC(=O)NCc2ncc[nH]2)ccc1C,2.150937232,1.82642,-0.324517232\nCN(Cc1ccc(C#N)cc1)C(=O)NCCN(C)c1ccccc1,2.099813573,2.83608,0.736266427\nCCCNC(=O)C1CCN(C(=O)c2cccc(NS(=O)(=O)c3ccc4c(c3)CCCC4)c2)CC1,2.239853257,3.7446,1.504746743\nO[C@H]1Cc2ccccc2[C@H]1[NH2+]Cc1cnc(-c2ccsc2)s1,4.127840834,2.5933,-1.534540834\nN#Cc1cc([N+](=O)[O-])ccc1Oc1ccc(/C=N\\NC(=O)c2ccccc2-n2cccc2)cc1,2.503753463
,4.81338,2.309626537\nC[C@H]1C[C@H](C)CN(C(=O)[C@@H](C)S(=O)(=O)c2nc3ccccc3[nH]2)C1,3.481529855,2.2296,-1.251929855\nCCN(C(=O)CN1CC[NH+](CC2CC2)CC1)c1cccc2ccccc12,3.297787028,1.8032,-1.494587028\nCc1ccc(Nc2nc3c(c(=O)[nH]2)CCCC3)c(F)c1,2.46036033,2.83982,0.37945967\nCCC[C@H]1[C@H](C)CCCN1C(=O)NCCCCSC,3.291429317,3.7398,0.448370683\nCc1noc(C)c1CN1C(=O)N(c2cccc(Cl)c2)c2ncccc2S1(=O)=O,2.790302811,3.80244,1.012137189\nCCC[NH2+][C@@H](C)c1ccc(SCC(C)C)cc1,3.752956887,3.4691,-0.283856887\nC[C@@H](C(=O)NCCc1ccc(F)cc1)[NH+](C)Cc1ccc(Cl)nc1,3.667307699,1.6362,-2.031107699\nCC(C)(C)N1C[C@@H](C(=O)n2ccnc2)CC1=O,3.241710455,1.1703,-2.071410455\nCN(c1ccc(C(=O)Nc2ccccc2C(=O)N2CCCC2)cc1)S(C)(=O)=O,1.980031213,2.5707,0.590668787\nNc1ccc(=O)n(CCN2CC[NH+]3CCC[C@@H]3C2)c1,4.656713618,-1.2066,-5.863313618\nCc1cc(NC(=O)C(=O)N2CCN(c3ccc(Br)cn3)CC2)no1,2.379687103,1.42782,-0.951867103\nCc1ccc(C(=O)N2CCN(C(=O)CN3CCOCC3)CC2)s1,2.130681032,0.67312,-1.457561032\nCOc1ccc(C(=O)NCc2ccc(C[NH+]3CCC[C@H](C)C3)cc2)cc1[N+](=O)[O-],3.752516678,2.3482,-1.404316678\nC[NH+](Cc1ccc(OC(F)(F)F)cc1)C[C@H]1CCOC1,4.01571473,1.6364,-2.37931473\nCOC(=O)c1ccc(C(=O)N2CCC(Oc3nc4c(C)ccc(C)c4s3)CC2)cc1,2.358502992,4.38334,2.024837008\nCC[C@@H](C)[C@@H]([NH2+]C)C1([NH+](C)C)CCC(C)CC1,5.323609858,0.6877,-4.635909858\nCC[C@H](C)NC(=O)[C@H](C)NC(=O)Nc1ccc2c(c1)COC2,2.971019369,2.1415,-0.829519369\nCc1nn(CC(=O)Nc2ccc(N3CCOCC3)cc2)c(=O)c(C#N)c1C,2.242124545,1.20712,-1.035004545\nO=C([O-])CCCS(=O)(=O)Nc1ccc(Cl)cc1F,2.651310618,0.7509,-1.900410618\nCCOC(=O)c1ccccc1[N-]S(=O)(=O)c1cccc(F)c1,2.877115287,3.3965,0.519384713\nCC(C)(C)C(=O)Nc1ccc(NC(=O)c2ccccc2)nc1,1.82913395,3.3185,1.48936605\nCc1ccc(/C=C/C(=O)N(C)Cc2sccc2C)o1,2.502480003,3.62974,1.127259997\nCC(C)c1noc(CN(C)C(=O)c2csc3nc(-c4cccc(F)c4)cn23)n1,2.730284857,3.9805,1.250215143\nCCC[C@H](CCCc1c([O-])[nH+]c(N)[nH]c1=O)C(=O)[O-],4.655077806,-1.6663,-6.321377806\nCc1ccsc1-c1nc(C)c(C)nc1N,2.624802162,2.71256,0.087757838\nCn1nnc2cc(C(=O)NCC(C)(C)c3ccncc3)ccc21,2.42817859,2.07
09,-0.35727859\nO=C(c1cccc(C2=NO[C@@H]([C@@H]3N=NC4=C3CCCC4)N2)c1)N1CCCC1,3.976111738,3.1926,-0.783511738\nCOC[C@H](O)CN(C)C(=O)c1cscc1C,3.166815768,1.13582,-2.030995768\nCOc1ccc2c(c1)C(N1CCN(C(=O)c3ccc(F)cc3)CC1)=Nc1ccccc1O2,2.297544373,4.4763,2.178755627\nNS(=O)(=O)N1CCC(C(=O)NCCNC(=O)c2ccco2)CC1,2.251835206,-0.9589,-3.210735206\nCCC(CC)C(C)(C)C[C@@H](C)[NH2+]C,4.576779185,2.4206,-2.156179185\nCCc1ccc2ncc(C#N)c(Nc3cc(OC)ccc3OC)c2c1,2.189201832,4.42968,2.240478168\nC[C@@H]1CCCC[C@H]1NC(=O)N(C)Cc1nccs1,3.124031281,2.8632,-0.260831281\nCCc1ccc(OCC(=O)N(C(C)C)C2CCOCC2)cc1,2.18664966,3.0438,0.85715034\nCOC1CCN(C(=O)c2ccc(N)c(C)c2)CC1,1.975924657,1.82822,-0.147704657\nCCOC(=O)c1cc(S(=O)(=O)N2CCN(c3ccc(C(C)=O)cc3)CC2)cn1C,2.234633575,1.9153,-0.319333575\nO=C(c1occc1Br)N1CCS[C@H](c2ccccc2)C1,3.097373773,3.9724,0.875026227\nClc1cc(N2CCC[C@H](OCc3cccnc3)C2)nc2[nH]ccc12,3.074239667,3.7969,0.722660333\nCC(C)[NH2+]Cc1ccc([N+](=O)[O-])c(OC/C(Cl)=C/Cl)c1,3.652626052,2.7644,-0.888226052\nCc1cc(C)nc(Nc2nc(CC(=O)Nc3ccc(-n4cnnn4)cc3)cs2)n1,2.491890746,2.45044,-0.041450746\nCc1c2c(cc3c1O/C(=C\\c1c(F)cccc1Cl)C3=O)CN(CCCN1CCOCC1)CO2,2.96219473,4.27792,1.31572527\nCSc1ncc(CNc2cc(-c3ccccc3)nc3ccnn23)n1C,2.639763986,3.4638,0.824036014\nCC[C@@H](C)c1ccc(S(=O)(=O)[N-]c2cc(F)ccc2F)cc1,3.448403351,4.8724,1.423996649\nCOCc1cccc(NC2=[NH+]CCCCC2)c1,3.269881966,1.298,-1.971881966\nC[C@@H](Oc1ccccc1F)C(=O)NNC(=O)Nc1ccccn1,2.508059175,1.8409,-0.667159175\nCc1ccc(-n2c(C)cc(/C=N\\NC(=O)CN3CCOCC3)c2C)cc1[N+](=O)[O-],2.45769911,2.09306,-0.36463911\nCCN1C(=O)Cc2cc(NC(=O)c3ccc(Cl)c([N+](=O)[O-])c3)ccc21,2.169784347,3.4095,1.239715653\nCc1cc(I)ccc1NC(=O)C[NH+](C)[C@@H]1CCc2ccccc21,3.83053583,2.74032,-1.09021583\nCCn1cc(NC(=O)[C@H]2CSCN2C(=O)CC(C)(C)C)ccc1=O,3.085154336,2.1444,-0.940754336\nN#Cc1c(Cl)nsc1N[C@@H]1CCc2cc(Cl)ccc21,3.258787912,4.42098,1.162192088\nCc1cc(C)c(NC(=O)CNC(=O)[C@@H](NC(=O)c2ccco2)C(C)C)c(C)c1,2.611430454,2.71416,0.102729546\nC[C@@H]1CCC[C@@H](N(C)C(=O)NCc2cccc(Cn3cccn3)c2)C1,3.05478
6435,3.6515,0.596713565\nC[C@H](NC(=O)CN(C)Cc1nc2ccccc2c(=O)[nH]1)c1ccccc1,2.556272893,2.2323,-0.323972893\nCc1ccc(Oc2ncnc(Nc3ncccc3C)c2[N+](=O)[O-])cc1C,2.404361839,4.24096,1.836598161\nO=C(CCn1cncn1)Nc1nc(-c2ccc3ccccc3c2)cs1,2.191495935,3.5836,1.392104065\nCc1cc(C)c(NC(=O)[C@H](C)N(C)OCc2ccccc2)c(Cl)c1,2.804898113,4.34744,1.542541887\nCc1ncc([C@@H](C)[NH2+][C@H](C)c2ccc3c(c2)CC(=O)N3)s1,4.219180742,2.33172,-1.887460742\nCCc1ccc(C(=O)OCC(=O)N2CCCC2)o1,2.044629001,1.6212,-0.423429001\nO=C(NCCCn1ccc2ccccc21)C1CCN(C(=O)/C=C/c2ccccc2)CC1,2.216937347,4.0996,1.882662653\nCC/[NH+]=C(/NCc1nnc2n1CCCCC2)N[C@@H]1C[C@@H]1C,4.195178435,-0.4514,-4.646578435\nCC[C@H]1CCCN1c1nc(C(F)(F)F)ccc1C(N)=S,3.041634251,3.1134,0.071765749\nCc1ccc(C)c(NC(=O)[C@@H](C)[S@@](=O)Cc2nccn2C(F)F)c1,3.483063654,3.17094,-0.312123654\nC[C@@H]1CCCC[NH+]1CCNC(=O)CCc1ccc2c(ccn2C)c1,4.085937819,1.6844,-2.401537819\nCc1ccc(OCCOc2ccccc2)c(C(=O)[O-])n1,2.306296362,1.21132,-1.094976362\nCc1csc([C@@H](C#N)C(=O)CSc2nnc(-c3ccccc3F)n2C2CCCCC2)n1,3.136642384,5.3229,2.186257616\nO=C(CCl)N1CCCC2(CCCCC2)C1,2.906406136,2.7981,-0.108306136\nN#CCN1CCN(CC(=O)NCCCC2CCCCC2)CC1,2.20926879,1.60428,-0.60498879\nCOc1ccc(F)cc1CSCc1nc(CC(C)C)no1,2.372806593,3.8492,1.476393407\n[NH3+][C@@H]1CCCC[C@@H]1/N=C/[C@@H]1C(=O)N=C(S)N(c2ccccc2)C1=O,4.498557498,1.0857,-3.412857498\nCNS(=O)(=O)c1cc(NC(=O)Cc2cccc(F)c2F)ccc1C,2.073399905,2.36252,0.289120095\nN#CCCN(CCC(F)(F)F)C(=O)c1ccc2c(c1)CCCC2,2.435420061,3.87368,1.438259939\nCCCC(=O)N1CCC(n2nc(C(=O)OCC)cc2C2CC2)CC1,2.485511105,2.9008,0.415288895\nCC(=O)Nc1c(N)cc2c3c(cccc13)C(=O)c1ccccc1-2,2.268598471,3.5918,1.323201529\nCCN(Cc1cccs1)CN1C(=O)N[C@](CC)(c2ccccc2)C1=O,2.892918554,3.3848,0.491881446\nC[C@@H](O)[C@H]1CCOC2(CCC2)C1,4.084328607,1.7165,-2.367828607\nO=C1CCCN1C[C@H](NC(=O)N1CCN(c2nccs2)CC1)c1ccccc1,2.838055644,2.3384,-0.499655644\nC=CCc1cc(N)ccc1O[C@H]1CCOC2(CCCC2)C1,3.696392259,3.8679,0.171507741\nCOC[C@@H]1CN(C(=O)C(=O)Nc2ccccc2-c2ccccc2)CCO1,2.611122099,2.1659,-0.445222099\nO=C(
CCc1ccc(Cl)cc1)NCCc1nc2cc(F)ccc2s1,2.030294342,4.3803,2.350005658\nCc1cn2c(=O)c(C(=O)N3CCC[C@H](n4cccn4)C3)cnc2s1,3.246150591,1.73822,-1.507930591\nBrc1cc(C[NH+]2CCC[C@@H](c3cn[nH]c3)C2)cc2c1OCCCO2,4.62014089,2.296,-2.32414089\nCOc1ccccc1OCC(=O)NCC1(c2cccs2)CCCCC1,2.253408876,4.1538,1.900391124\nCC[C@@H](C)Oc1cccc(C(=O)Nc2cccc(-c3nn4cnnc4s3)c2)c1,2.865814889,4.2824,1.416585111\nCc1ccc(CN[C@@H](C(=O)NC2CC2)c2ccccc2)s1,2.532502845,3.16602,0.633517155\nC[C@@H](NC(=O)NNC(=O)[C@@H]1CCCO1)c1ccc(Cl)cc1,2.857988826,1.9104,-0.947588826\nCC[C@H](O)Cn1nc(Cn2cncn2)nc1Cc1cccc(Cl)c1,3.119028851,1.933,-1.186028851\nCC(C)[C@H](Sc1nnc(-c2ccco2)n1C)C(=O)Nc1ccc(Cl)cc1,2.778211591,4.4839,1.705688409\nCO[C@H](C)c1noc(CNc2ccc(Sc3ccccc3)cc2C)n1,2.950654307,4.84872,1.898065693\nC[C@@H](c1ccccc1)N1C[C@@H](C(=O)NCCCN2CC[NH+](C)CC2)CC1=O,3.848030117,-0.0673,-3.915330117\nCCCC[C@@H](C(=O)OC)[C@H](O)c1ccc(Br)cc1F,3.086125432,3.601,0.514874568\nO=C(NCC1(O)CCOCC1)c1ccc(CSC(F)F)o1,2.852911417,2.0067,-0.846211417\nC[C@](O)(CNC(=O)c1cccs1)CN1CCOCC1,2.825985348,0.5611,-2.264885348\nCCOc1ccccc1NC[C@@H]1COc2ccccc21,2.555787576,3.6734,1.117612424\nCCN1C=C(C[NH+]2CCC(C(=O)Nc3ccc([C@@H]4C=c5ccccc5=[NH+]4)cc3)CC2)[C@@H](C)N1,5.152966629,-0.7319,-5.884866629\nCC(C)(C)n1nccc1NC(=O)C(=O)N1CCC[C@@H](c2cn[nH]c2)C1,3.491526175,1.7059,-1.785626175\nCc1ccc(-n2ncc3c2ncn2nc(CCc4ccccc4)nc32)cc1Cl,2.444150134,4.21022,1.766069866\nCC(=O)N1CCN(Cc2cc(NC[C@H](O)c3ccsc3)cc[nH+]2)CC1,3.795660076,1.3718,-2.423860076\nCCC[C@H]1CN(C(=O)Nc2ccc(CC)c(F)c2)CCO1,2.654457773,3.4209,0.766442227\nCC[NH+]1CCN([C@H](C)CNC(=O)COc2ccccc2C#N)CC1,3.789513477,-0.33782,-4.127333477\nCCOc1ccc(N)c2ccccc12,1.62683674,2.8207,1.19386326\nCC[C@@](C)([C@@H](N)c1cccc(Cl)c1)[NH+](C)C,4.121757026,1.653,-2.468757026\nCC1CCC(NC(=O)C2=CC(=O)CS2)CC1,2.872345398,1.8811,-0.991245398\nCOc1cccc([C@@H](CO)NC(=O)NCc2scnc2C)c1,2.77534479,1.99292,-0.78242479\nC[C@@H](CCO)C[NH2+]Cc1ncc(-c2ccccc2)s1,3.718444882,1.892,-1.826444882\nC[C@@H](c1ccccc1)[C@@H]1[NH2+][C@@H
](C(=O)[O-])CS1,4.769985386,-0.4551,-5.225085386\nO=C(NCCc1ccc(S(=O)(=O)NC(=O)c2ccc(Cl)cc2)cc1)c1ccc(Cl)cc1,1.894832086,4.0846,2.189767914\nCc1nc(-c2c[nH]c(C(=O)NC(C)(C)C)c2)cs1,2.51899596,2.97492,0.45592404\nCCSc1nnc(NC(=O)c2sc3cccc(F)c3c2C)s1,2.398658841,4.56462,2.165961159\nCn1c(=O)c(C2=NN[C@H](c3cccs3)C2)c(O)c2ccccc21,3.233543543,2.7443,-0.489243543\nCc1noc(C)c1CCC(=O)N1CCC2(CC1)Nc1ccccc1S(=O)(=O)N2,3.384083936,1.94674,-1.437343936\nCOCc1ccccc1NC(=O)NCc1ccc(C)cc1N(C)C,2.104700542,3.52912,1.424419458\nCc1cc2c3c([nH]c2cc1S(=O)(=O)N[C@H](C)Cc1ccco1)CC(C)(C)CC3=O,3.268707608,4.13392,0.865212392\nCCOc1ccc([C@H](C)NC(=O)C(=O)Nc2ccc(Cl)cc2C)cc1,2.322029223,3.86302,1.540990777\nCO[C@H](CNC(=O)C(=O)Nc1cccnc1-n1cccn1)c1ccccc1,2.837716015,1.7097,-1.128016015\nCCc1ccc([C@H]([NH3+])[C@@H](C)Oc2ccccc2C)cc1,3.183736434,3.30792,0.124183566\nCc1cccc([C@H]2CCCCC[NH+]2C2CCN(C(N)=O)CC2)c1,4.215146126,2.03812,-2.177026126\nCCC[C@@H](NC(=O)c1ccc2[nH]c3c(c2c1)CCCC3)C(=O)OC,2.682829088,3.1182,0.435370912\nC[C@H]1CN(C(C)(C)CNC(=O)[C@@H]2C[C@@H]2c2ccc(F)cc2)C[C@@H](C)O1,3.550507363,2.9332,-0.617307363\nCC(C)CN(C[C@H](O)c1ccc(F)cc1)C(=O)Nc1cccnc1Cl,2.717118187,4.0976,1.380481813\nCCCCOc1ccc(Cl)cc1,1.295802195,3.5189,2.223097805\nCS(=O)(=O)NC1CCC([NH2+][C@@H]2C[C@H]3C[C@@H]2[C@H]2CCC[C@@H]23)CC1,5.544107265,1.2349,-4.309207265\nCOC[C@@H](C)CNC(=O)[C@@H]1C=C[C@@H]([NH3+])C1,4.562448454,-0.4283,-4.990748454\nCc1ccc(C[C@@H](C)[NH2+][C@H](C)c2ccc(C)c(F)c2)s1,3.86904984,3.75964,-0.10940984\nCOCCN1C(=O)[C@@H]2[C@@H](Cc3ccc(O)c(O)c3)N[C@@]3(C(=O)Nc4ccc(F)cc43)[C@@H]2C1=O,4.296201915,0.8464,-3.449801915\nCc1ccc(C(=O)C2=C([O-])C(=O)N(CCCN3CCOCC3)[C@@H]2c2cccs2)cc1,3.18358086,2.15942,-1.02416086\nCc1cc([C@H](O)[C@@H]2CCO[C@]3(CCOC3)C2)ccc1F,4.177699502,2.75322,-1.424479502\nC[C@@H]1CCC[C@H](N(C)C(=O)c2cccc(NC(=O)C(C)(C)C)c2)C1,2.889836908,4.3219,1.432063092\nCOC(=O)[C@@H]1c2ccsc2CCN1S(=O)(=O)c1ccc(F)cc1,2.801380825,2.3483,-0.453080825\nCC[NH+](CC)C[C@@H](C)NC(=O)CC1C[C@@H]2CC[C@H](C1)[NH2+]2,6.51436
7429,-0.6897,-7.204067429\nCOC(=O)c1c(N)sc2c1CCS2,3.06293975,1.7651,-1.29783975\nCC[C@H](C(=O)N1CCCSC[C@H]1C[NH+](C)C)c1ccccc1,3.940231943,1.6588,-2.281431943\nCc1occc1C(=O)N/N=C/c1cc(Br)cc([N+](=O)[O-])c1[O-],3.024713357,2.09622,-0.928493357\nCCOc1ccccc1O[C@H]1CCC[C@H]1NC(=O)c1ccc(OC)nc1,2.886927227,3.2188,0.331872773\nCc1cc(C)c(NC(=O)NCc2ccc(N3CCOCC3)cc2)c(C)c1,1.995046745,3.77016,1.775113255\nCOc1ccc(C[C@@H](C#N)C(=O)[O-])cc1OC,2.962785706,0.13598,-2.826805706\nCC(C)NC(=O)Nc1nc2c(s1)CN(C(=O)c1ccc(C(F)(F)F)cc1)CC2,2.379930821,3.8903,1.510369179\nCc1cccnc1CSc1nc2cc(S(=O)(=O)N(C)C)ccc2o1,2.451381369,3.07382,0.622438631\nCc1ccccc1[C@@H](C)NC(=O)C(=O)Nc1ccc(N(C)C)c(C)c1,2.528571728,3.18534,0.656768272\nCC(C)(C)OC(=O)N1CC(=CC(=O)NCC#N)C1,2.850974182,0.80328,-2.047694182\nO=C1c2cccnc2C(=O)N1Cc1ccc(F)cc1S(=O)(=O)N1CCCCCC1,2.385687811,2.5816,0.195912189\nCCCN(CC(=O)NC)C(=O)[C@@H]1CC[C@H]2CCCC[C@@H]2[NH2+]1,4.358803608,0.2556,-4.103203608\nCS(=O)(=O)/N=c1/[nH]c(CC(=O)Nc2c(Cl)cccc2Cl)cs1,2.978326434,2.4245,-0.553826434\nC=CCn1c([S-])nc2cc(C(=O)N(C)CC(=O)Nc3ccccc3C(F)(F)F)ccc2c1=O,2.867110276,3.2178,0.350689724\nC[C@@H](c1ccc(S(C)(=O)=O)cc1)N(C)C(=O)c1cccc(Br)c1,2.415458229,3.6858,1.270341771\nCc1c(C(=O)NNC(=O)NC[C@H]2CCCO2)sc2ccc(F)cc12,2.847237911,2.47182,-0.375417911\nCCOc1ccc(O[C@@H](C)C(=O)NNC(=O)c2[nH]c3ccccc3c2Br)cc1,2.67368861,3.5576,0.88391139\nCc1ccc([C@@H](C)[NH2+][C@H](C)c2nc3c(s2)CCCC3)s1,4.200584091,3.77742,-0.423164091\nO=C(CN1C(=O)NC2(CCCCCC2)C1=O)NCc1cccs1,2.725502984,2.0091,-0.716402984\nCC[C@H](NC(=O)CCc1cc(F)ccc1F)C(=O)OC,2.471739379,1.9652,-0.506539379\nO=C(/C=C/c1c(F)cccc1F)N1CCN(c2ncccc2F)CC1,2.325210066,2.8609,0.535689934\nCC(C)C[C@H]1NC(=O)[C@H]2CN(C(=O)c3ccc4ccccc4c3)CCN2C1=O,3.012230743,2.0373,-0.974930743\nCCOc1cc(/C=C2/C(=N)N3N=CSC3=NC2=O)ccc1OC(=O)c1ccccc1Cl,2.837941116,4.20687,1.368928884\nCc1oc(-c2ccco2)nc1CC(=O)NC[C@H](C)Oc1ccccc1,2.739838922,3.36922,0.629381078\nCCC[C@H]1C[NH2+]CC[C@@H]1c1cnn(-c2ccccc2)c1,4.019041705,2.3393,-1.679741705\nO=C
(c1c[nH]c(=O)cn1)N(CC1CC[NH+](C2CCCC2)CC1)C[C@H]1CCCO1,4.283016619,0.6286,-3.654416619\nC[C@@H]1C[C@@H]1c1ccc(/C=C/C(=O)N2CCC[C@H](O)C2)o1,3.44240408,2.3995,-1.04290408\nC[C@H](NC(=O)NC[C@@](C)(O)c1cccs1)c1cccc(C#N)c1,3.261078337,2.88768,-0.373398337\nO=C(Nc1ccnn1C1CC[NH+](Cc2ccccc2F)CC1)c1cccnc1,3.31736458,2.0895,-1.22786458\nCC(C)[C@@H](CNC(=O)NC[C@@]1(C)CCCO1)C(=O)[O-],3.979685613,-0.1232,-4.102885613\nCc1sc(NC(=O)C2CC2)c(C(=O)NCc2n[nH]c(=S)n2C(C)C)c1C,2.734164435,3.47843,0.744265565\nCc1nc(-n2cccc2)sc1C(=O)N1CCN(c2ccccc2Cl)CC1,2.289127951,3.85802,1.568892049\nCc1cc(NC(=O)N[C@H]2CCCN(c3ccccc3F)C2=O)no1,2.730594246,2.43922,-0.291374246\nCOc1ccc(Cl)c(NC(=O)[C@@H]2CC(O)=Nc3nncn32)c1,3.229683996,2.1116,-1.118083996\nO=C(Nc1ccc(Cl)cc1)C1CCN(c2ccc(-n3cccc3)cn2)CC1,2.15321418,4.3808,2.22758582\nCCc1c(N)cnn1[C@@]1(C)CCS(=O)(=O)C1,3.768605035,0.5614,-3.207205035\nCOc1ccc(OC)c(N2C(=O)/C(=C/c3ccc(O)cc3)SC2=S)c1,2.199361566,3.8152,1.615838434\nNC(=O)c1cccc(CNC(=O)COc2ccc(F)cc2Cl)c1,1.814883042,2.2732,0.458316958\nCOC[C@H](Nc1nccc(OC)n1)c1ccc(F)c(F)c1,2.847808751,2.563,-0.284808751\nCOC(=O)[C@@]1(NC(=O)CCCc2cc(Cl)sc2Cl)CCOC1,3.349771447,2.8259,-0.523871447\nC[C@H]1C[C@H](C)CN(S(=O)(=O)CCn2cnc3ccccc32)C1,3.092222018,2.344,-0.748222018\nCCCC[C@@H]1CCC[C@@H]1NC(=O)NC1CC[NH+](CC(=O)N(C)C)CC1,4.320234072,0.78,-3.540234072\nCc1cc(C)cc(N(C)CC(=O)N(C)[C@@H]2CCS(=O)(=O)C2)c1,2.950781578,1.38514,-1.565641578\nCc1ccc(/C=C/C(=O)Nc2sc(C)c(C)c2C#N)o1,2.5068509,3.78994,1.2830891\nCn1nnc2c(=O)n(CC(=O)NC3CC3)cnc21,2.465292041,-1.1964,-3.661692041\nCOc1cc([C@@H](C)N[C@H]2CC[C@H]([NH+](C)C)C2)ccc1OC(F)F,4.124893168,2.0128,-2.112093168\nCC(C)(C)OC(=O)NC1CCN(C(=O)C[NH3+])CC1,2.72243658,-0.256,-2.97843658\nCC[C@](C)([C@@H]([NH2+]C)c1cc(Cl)cnc1N)N1CCOCC1,4.283133262,1.0524,-3.230733262\nCc1cnc([C@@H](NC(=O)N(C)[C@H](c2ccccc2)C(C)C)C2CC2)s1,3.46224187,4.94132,1.47907813\nCOc1ccc([C@H]2CC(=O)C3=C(C)Nc4ccccc4N[C@H]3C2)c(OC)c1OC,3.44999679,4.3391,0.88910321\nCC(=O)N(C)C1CCC(C)(C)CC1,2.486653325,2.4335,
-0.053153325\nCNC(=O)c1cccc(NC(=O)Cc2ccccc2Cl)c1,1.657934364,2.8808,1.222865636\nCOc1cc(Br)c(O[C@H]2CCCCO2)cc1C,2.706881191,3.67152,0.964638809\nO=C(C[C@@H]([NH2+]C[C@H]1CCCO1)C(=O)[O-])Nc1cccc(Cl)c1,4.003377104,-0.4705,-4.473877104\nCOc1ccc2cc(-c3csc(NC(=O)COc4ccccc4)n3)oc2c1,2.100462086,4.5824,2.481937914\nC[C@H](Sc1nnc(-c2ccccc2)n1C)C(=O)N1CCC[C@@H](C(N)=O)C1,2.909074093,1.6866,-1.222474093\nCCC[C@@H]1C[C@@H]1NC(=O)C(=O)Nc1cccc(-c2nnc(C)o2)c1,3.176625808,2.28832,-0.888305808\nCc1cc(C(=O)COC(=O)c2ccc(I)cc2)c(C)n1C[C@H]1CCCO1,2.718210326,3.92824,1.210029674\nCc1c(Cl)cccc1NC(=O)C(=O)NCc1ncn(C)n1,2.463583143,1.03182,-1.431763143\nCc1ccc([C@@]2(C(C)(C)C)CCC[NH2+]2)cc1,3.995007141,2.59362,-1.401387141\nCN(C)C(=O)c1ccc(NC[C@@H]2CC[C@@H](C(=O)[O-])O2)nn1,3.770737241,-1.1122,-4.882937241\nc1cncc([C@H](c2nc(-c3ccc4nc[nH]c4c3)no2)N2CCOCC2)c1,3.116341853,2.4295,-0.686841853\nC[C@@H]1[C@H](C(=O)[O-])CCN1C(=O)N1CCc2ccccc21,3.598947972,0.6294,-2.969547972\nCn1cc(CCCOC(=O)c2cc3ccccc3c(=O)[nH]2)cn1,2.31302229,2.0512,-0.26182229\nCCOc1ccc(NC(=O)Cn2cnc3sccc3c2=O)cc1,2.011111102,2.4954,0.484288898\nO=C(Nc1ccccc1O)C1CC[NH+](CC2CCCCC2)CC1,3.239610037,2.2059,-1.033710037\nCc1ccc(CCNC(=O)c2cc(NC(=O)C3CCCC3)n(C)n2)o1,2.383830001,2.42272,0.038889999\nO=C(Nc1cccnc1N1CCCC1)c1ccccc1,1.689281897,2.9341,1.244818103\nCOc1ccc(N2C(=O)NC(=O)/C(=C/Nc3ccc(F)cc3)C2=O)cc1,2.107915685,2.4131,0.305184315\nC[C@@H]([NH2+]Cc1ccc(Cl)c(Cl)c1)c1ccc2c(c1)NC(=O)CO2,3.579272193,3.1489,-0.430372193\nCc1cccc(C)c1NC(=O)CN(C)C(=O)c1cc(C)n(-c2ccccc2)c1C,2.10301896,4.42168,2.31866104\nCCc1ccsc1CNC(=O)Nc1ccc(NC(C)=O)c(OC)c1,2.285167592,3.5992,1.314032408\nCC/C(C)=N\\NC(=O)Cc1ccc(OC)cc1,1.94757722,2.1398,0.19222278\nCc1n[nH]c2cc(NC(=O)c3c(O)cc(F)cc3F)ccc12,2.437484013,3.10742,0.669935987\nCc1ccc(S(=O)(=O)NC[C@H](c2ccc(N(C)C)cc2)[NH+](C)C)c(C)c1,3.427929923,1.53354,-1.894389923\nCCC(O)(CC)C(=O)NN(C(=O)c1csc(C)c1)c1ccccc1,2.822409727,3.28562,0.463210273\nCC(C)CCNc1nc2c(c(=O)n(Cc3ccccc3Cl)c(=O)n2C)n1C,2.454505588,2.5934,0.
138894412\nCOc1cc(C(=O)NC[C@H](C2CC2)[NH+](C)C)cc(OC)c1C,3.532251982,0.66512,-2.867131982\nCc1cccc2cc([C@H]3NC(=O)c4c(sc5c4CCCC5)N3)c(Cl)nc12,3.223424868,4.99102,1.767595132\nCc1[nH]cnc1C[NH+]1CCCCCCC1,4.107993259,1.06712,-3.040873259\nCCC1CCC(C[NH3+])([NH+](CC)[C@@H](C)CC)CC1,5.245877951,1.2706,-3.975277951\nN#Cc1ccc([C@@H]([NH3+])CO)cc1F,3.721661942,-0.02732,-3.748981942\nCC1(C)SC[C@H](C(=O)N2CCC(OCc3cccnc3)CC2)NC1=O,3.235181826,1.5994,-1.635781826\nO=C(C[NH+]1CCC[C@H]1c1nc2ccccc2s1)Nc1cc2[nH]c(=O)[nH]c2cc1Br,4.250457538,2.5869,-1.663557538\nCS(=O)(=O)NC[C@H]1CCCCN1C(=O)Nc1cc(C(F)(F)F)ccc1N1CCCC1,2.897130764,3.2412,0.344069236\nC[C@H](CC(=O)[C@@H](C#N)c1nc2cc(Cl)ccc2s1)NC(=O)C1CCCC1,3.325606232,4.21098,0.885373768\nC=CC[NH+](Cc1cc(F)ccc1OC)[C@H]1CCS(=O)(=O)C1,4.383766153,0.5923,-3.791466153\nCCc1cccc(OC2CN(C(=O)[C@H]3CCCCC[NH+]3C)C2)c1,3.985533936,1.2959,-2.689633936\nC[NH+]1CCN(C(=O)[C@@H](O)c2ccccc2)CC1,3.840745614,-0.9231,-4.763845614\nCOc1ccc(C(=O)N2CCC[C@H](c3nc(O)c4nnn(Cc5ccc(F)cc5)c4n3)C2)cc1OC,3.02645316,3.1513,0.12484684\nNC(=O)CCOc1ccccc1NC(=O)c1ccc(Cl)c(Cl)c1,1.802330638,3.4999,1.697569362\nC[C@@H](O)c1ccc(-c2ccccc2)cc1,1.840468032,3.4069,1.566431968\nC[NH2+]C1CCC(N(C)C(=O)c2sccc2Br)CC1,3.545652041,2.087,-1.458652041\nC[C@H]1CCC[NH+]1CC(=O)Nc1ccc(F)c(N2CCCS2(=O)=O)c1,4.340830087,0.3713,-3.969530087\nCC[C@@H](C)NC(=O)[C@@H](C)NC(=O)C#Cc1ccccc1,2.993557533,1.4575,-1.536057533\nc1ccc2sc(C3CC[NH+](C4CCSCC4)CC3)nc2c1,3.761832631,2.9542,-0.807632631\nC=C1C[C@@]23CC[C@@H]4[C@@](C)(CCC[C@@]4(C)C(=O)[O-])[C@H]2CC[C@]1(O)C3,5.990082746,2.8203,-3.169782746\nNc1c(NCCN2CCOCC2)ncnc1Nc1ccc(Br)cn1,2.469548408,1.704,-0.765548408\nCn1c(COC(=O)CCc2ncc(-c3ccccc3F)o2)cc(=O)n(C)c1=O,2.549319379,1.5541,-0.995219379\nCOc1ccc(Br)cc1/C=C/C(=O)NC[C@@H]1CCCO1,2.604302267,2.7661,0.161797733\nCCC[C@H](C)NC(=O)[C@@H](C)Nc1cccc(C)c1,2.577453333,3.10022,0.522766667\nC[C@H]1C[C@H](C)CN(C(=O)COC(=O)c2cncc(Br)c2)C1,2.968923183,2.5054,-0.463523183\nNc1cc(-c2cccc(Cl)c2)nn1-c1ccccc1F,2.01745
5876,3.914,1.896544124\nCOc1ccc(NC(=O)C(=O)NC2CC2)cc1C,1.795305536,1.22072,-0.574585536\nCC(C)N(Cc1cccnc1)C(=O)Nc1cnn(C[C@H]2CCCO2)c1,2.91680623,2.8996,-0.01720623\nC[C@@H](CNS(=O)(=O)c1ccc2c(c1)NC(=O)CO2)[NH+](C)C1CC1,4.101762609,-0.6386,-4.740362609\nCOc1ccc(NC(=O)N2CCC[C@@H](c3nnc(C(=O)Nc4cccc(F)c4)s3)C2)cc1,2.731919328,4.3496,1.617680672\nCOC1CC[NH+](C[C@@H]2CCC[C@H](C)C2)CC1,4.704832754,1.5064,-3.198432754\nCSc1ccc([C@H](Br)Cc2ccccc2)cc1,2.313993515,5.0872,2.773206485\nC[C@@]1(c2ccccc2F)NC(=O)N(CC(=O)NCC(F)(F)F)C1=O,2.754553183,1.2712,-1.483353183\nO=c1[nH]c(Cn2c(-c3ccccc3)noc2=O)nc2sc3c(c12)CCCC3,2.594235212,2.7284,0.134164788\nCc1cncc(C(=O)NCCC[NH+]2C[C@H](C)C[C@@H](C)C2)c1,4.271486349,1.07072,-3.200766349\nCCc1ccc(NC(=O)[C@@H](C)S(=O)(=O)Cc2nncn2C)cc1,2.848858229,1.3195,-1.529358229\nCOc1ccccc1-c1ccccc1C[C@H]1CN(C(=O)c2ccccc2)CCN(C)C1=O,2.69381381,4.1353,1.44148619\nCc1nsc(N)c1-c1nc2cnccc2n1C1CC1,2.945782277,2.78032,-0.165462277\nCc1ccsc1CN(CC[NH+](C)C)C(=O)c1ccc2c(c1)nnn2C,3.351786841,1.12512,-2.226666841\nCc1cccnc1CCNC(=O)c1ccc([C@H]2CCC[NH+]2C(C)C)s1,4.198988791,2.55222,-1.646768791\nCCc1ccsc1CNC(=O)C1(c2ccccc2F)CC1,2.52839512,3.7976,1.26920488\nCOc1cccc([C@@H](C)NC(=O)COC(=O)c2cncc(Br)c2)c1,2.462586621,2.8869,0.424313379\nCS(=O)(=O)N(CC(=O)N1CCOCC1)c1ccc(F)c(Cl)c1,2.227684896,1.1039,-1.123784896\nCOc1ccc(N2C(=O)CC[C@H](C(=O)Nc3ccc(-n4ccnc4)cc3)[C@H]2c2ccccc2OC)cc1,3.144845929,5.0125,1.867654071\nCC[C@@H]1C(=O)N2CCC[C@H]2C(=O)N1CC1CCC1,3.231498394,1.3983,-1.833198394\nCn1ccc(C(=O)N2C[C@@H]3CC[C@H](C2)[NH+](Cc2ccccc2)C3)n1,5.015766837,0.7396,-4.276166837\nNC1=NC(=O)N(Cc2ccccc2)[C@@H]1c1ccccc1Br,2.854435153,3.4832,0.628764847\nCC(C)[NH+]1CCCN(C(=O)c2cc(C3CC3)n(C(C)(C)C)n2)CC1,4.137344869,1.6547,-2.482644869\nCc1cc(CNC(=O)[C@@]2(C)CCC[NH2+]2)cc(C)c1F,3.955747577,1.17464,-2.781107577\nC[C@@H]1CC(=O)N(c2ccc(NCc3cccnc3)c([N+](=O)[O-])c2)C1=O,2.76947101,2.5013,-0.26817101\nCC1CCC2(CC1)NC(=O)N(NC(=O)CNC(=O)c1cccs1)C2=O,3.018403098,1.0098,-2.008603098\nCCOc1ccc(C(
C)=O)cc1Cn1c(C(F)(F)F)nc2ccccc21,2.185433336,4.7047,2.519266664\nCc1ccccc1N1C(=O)c2ccc(C(=O)OCC(=O)NC[C@H]3CCCO3)cc2C1=O,2.664389642,2.24762,-0.416769642\nCOC1CCC(CC(=O)Nc2[nH+]cccc2[O-])CC1,3.890243815,1.1081,-2.782143815\nNc1ccc(F)cc1NC(=O)CN1CCc2ccccc21,1.963255159,2.4091,0.445844841\nCOc1c(Br)cc(Cl)cc1S(=O)(=O)[N-]c1cc(F)ccc1F,3.247974726,4.7834,1.535425274\nCCOc1ccc(NC(=O)N(C)Cc2ncnn2C)cc1C,2.274019952,2.18612,-0.087899952\nC[C@H]1OCC[C@@H]1C(=O)Nc1ccc(Cl)c(C(=O)[O-])c1,3.500226302,1.067,-2.433226302\nO=C(NCc1cccnc1)C(=O)N1CCCC1,1.957724353,0.3202,-1.637524353\nC#CCOc1ccc2ccccc2c1CN1CCN(CC#N)CC1,2.405771046,2.49298,0.087208954\nCCOc1ncccc1-c1ncnc2scc(C)c12,2.507808944,3.46042,0.952611056\nCC(=O)NC1(c2ccc(F)cc2)CCN(C(=O)C2CCCCC2)CC1,2.31470064,3.3598,1.04509936\nCc1ccc(NC(=O)C(=O)NC[C@@H](c2ccc3c(c2)CCN3C)[NH+]2CCCCC2)c([N+](=O)[O-])c1,3.902352433,1.76032,-2.142032433\nCCc1ccc(Oc2ccc3nnnn3n2)c(N)c1,2.589085867,1.4562,-1.132885867\nCc1cc(OCC(=O)OC(C)C)cc2c1C(=O)/C(=C/c1cc(Br)cc3c1OCOC3)O2,2.988519088,4.57062,1.582100912\nCOC(=O)[C@H]1[C@@H](NC(=S)NC[C@@H]2C=CN=N2)S[C@@H](C)[C@H]1C,5.238676386,1.6853,-3.553376386\nC[C@H]1C[C@@H](C)CN(C(=O)CSc2ccccc2S(C)(=O)=O)C1,3.009957402,2.6867,-0.323257402\nCC1(C)COc2cscc2OC1,3.146666448,2.5455,-0.601166448\nCc1cc(C)c(NC(=O)CNC(=O)CCc2nc3ccccc3c(=O)[nH]2)c(C)c1,2.263700819,2.53586,0.272159181\nCOC[C@@H](C)NC(=O)Nc1ccc(F)cc1F,2.338775673,2.1212,-0.217575673\nCCOc1ccc(N[C@H](C)c2ccc(F)cc2F)cc1CO,2.516868547,4.0289,1.512031453\nCCCn1nnnc1CN1C(=O)c2ccc([N+](=O)[O-])cc2C1=O,2.398476792,0.7875,-1.610976792\nCC(C)CC(C)(C)C(=O)Nc1ccc(F)c(C(=O)N(C)C)c1,2.287139552,3.5383,1.251160448\nCn1cc(/C=C/C(=O)c2cccc(F)c2)c(=O)n(C)c1=O,2.300985321,1.1192,-1.181785321\nO=C(NCc1ccnc(OC2CCC2)c1)c1[nH]ncc1Br,2.55409362,2.4285,-0.12559362\nCc1ccc([C@@H](CN2CCOCC2)NC(=O)COc2ccc(F)cc2)cc1,2.421749151,2.70262,0.280870849\n"
  },
  {
    "path": "graphium/data/multilevel_utils.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nimport pandas as pd\nimport ast\nimport numpy as np\nfrom typing import List\nimport itertools\nimport math\n\n\ndef extract_labels(df: pd.DataFrame, task_level: str, label_cols: List[str]):\n    \"\"\"Extracts labels in label_cols from dataframe df for a given task_level.\n    Returns a list of numpy arrays converted to the correct shape. 
Multiple\n    targets are concatenated for each graph.\n    \"\"\"\n\n    def unpack(graph_data):\n        # Parse string-encoded labels (e.g. \"[1.0, 2.0]\") into arrays.\n        # Note: do not coerce with pd.to_numeric here, as that would turn\n        # list-valued strings into NaN before this branch can parse them.\n        if isinstance(graph_data, str):\n            graph_data_list = ast.literal_eval(graph_data)\n            return np.array(graph_data_list)\n        elif isinstance(graph_data, (int, float)):\n            return np.array([graph_data])\n        elif isinstance(graph_data, list):\n            return np.array(graph_data)\n        elif isinstance(graph_data, np.ndarray):\n            if len(graph_data.shape) == 0:\n                graph_data = np.expand_dims(graph_data, 0)\n            if graph_data.shape[0] == 0:\n                graph_data = np.array([np.nan])\n                # TODO: Warning\n            return graph_data\n        else:\n            raise ValueError(\n                f\"Graph data should be one of str, float, int, list, np.ndarray, got {type(graph_data)}\"\n            )\n\n    def unpack_column(data: pd.Series):\n        return data.apply(unpack)\n\n    def merge_columns(data: pd.Series):\n        # Replace missing (NaN) entries with 1-element NaN arrays, then pad\n        # ragged label arrays with NaN so they stack into a single 2D array.\n        data = data.to_list()\n        data = [np.array([np.nan]) if not isinstance(d, np.ndarray) and math.isnan(d) else d for d in data]\n        padded_data = itertools.zip_longest(*data, fillvalue=np.nan)\n        data = np.stack(list(padded_data), 1).T\n        return data\n\n    unpacked_df: pd.DataFrame = df[label_cols].apply(unpack_column)\n    output = unpacked_df.apply(merge_columns, axis=\"columns\").to_list()\n\n    if task_level == \"graph\":\n        return np.concatenate(output)\n    return output\n"
  },
  {
    "path": "graphium/data/multitask/__init__.py",
    "content": ""
  },
  {
    "path": "graphium/data/multitask/tiny_ZINC_SA.csv",
    "content": "SMILES,SA\nCc1cccc(COC(=O)Nc2ccc(-c3cn[nH]c3)cc2OCCN2CCCC2)c1.O=C(O)C(F)(F)F.O=C(O)C(F)(F)F,2.896514950230738\nCCC(C)C1=C(C=CC=C1)OCCC[N]2C=CN=C2.O=C(O)C(=O)O,2.66931991795572\nCCc1ccc(C(=O)/C(=C(/S)NC2CC2)[n+]2ccc(CC)cc2)cc1,2.775189478\nC=CCOc1ccc(C(F)(F)F)cc1C(=O)NC[C@H]1CCC[C@H]1O,3.071497898\nCOc1cc(OC)cc([C@@H](NC(=O)Cn2ccccc2=O)c2nccn2C)c1,2.84000896\nCN1CCC[C@@]2(CCN(C(=O)CN3CCNC(=O)C3)C2)C1=O,3.605168457\nCOc1ccc(Cl)cc1NC(=O)c1nn(C)cc1[N+](=O)[O-],2.132664673\nCc1nn(-c2ccccc2)c(O)c1/C=[NH+]/Cc1ccncc1,3.117639223\nC=CCN1C(=O)C(C)(C)COc2ccc(NC(=O)C3(c4ccc(OC)cc4)CCOCC3)cc21,2.744789746\nCON(C)C(=O)c1cnn2c(C)cc(C)nc12,2.547847184\nNC(=O)[C@@H](NC(=O)CCCc1cccs1)c1ccccc1,2.444743614\nCOc1cccc(CN2CCC[NH+](CC(=O)Nc3ccc(F)cc3)S2(=O)=O)c1,3.546311784\nCc1ccc(-c2nc3ccc(C)c(C)c3[nH]2)nc1,2.342516783\nCc1n[nH]c(SCC(=O)Nc2cccc(C#Cc3ccccn3)c2)n1,2.685016252\nCOC(=O)c1cncc(C(=O)Nc2cccc(COC(C)C)c2)c1,2.04419285\nCc1ccc(C)c(Oc2ccc(CNC(=O)N3CCCSCC3)cn2)c1,2.369895336\nCC[NH2+][C@@]1(C(=O)[O-])CCC[C@H](Oc2ccc(CC)cc2)C1,4.165551538\nCC(=O)N1c2ccc(S(=O)(=O)N3CCCC3)cc2C[C@H]1C(=O)NCC[NH+](C)C1CCCCC1,3.989079408\nC=CCN1CC(=O)N2[C@@H](Cc3c([nH]c4ccccc34)[C@@H]2c2ccccc2OC)C1=O,3.287567863\nCOc1ccc(-c2noc3ncnc(N4CCC[C@H](C(=O)[O-])C4)c23)cc1,3.152874099\nN#Cc1cc(NC(=O)Cc2ccccc2Cl)ccc1N1CCC(O)CC1,2.173785623\nCc1ncc(-c2ccc(S(=O)(=O)NC3CCCCCC3)s2)o1,2.513723357\nCc1ccc(NC(=O)c2ccc(F)cc2F)cc1S(=O)(=O)Nc1ccc(Cl)cc1,1.947448768\nCNC(=O)[C@@H]1CCC[NH+]1Cc1ccc(C)c(F)c1,4.391338086\nCN1C(=O)N[C@@H](c2cccc([N+](=O)[O-])c2)C2=C1CN(c1ccc(F)cc1)C2=O,2.956072154\nCc1cc(CC(=O)N[C@H](c2ccc(F)cc2)C2CCC2)no1,2.665131639\nCc1cc(N2CCN(CC(=O)NC3CC3)CC2)nc(-c2ccc([N+](=O)[O-])cc2)n1,2.249569987\nCc1cc(C(=O)N2CCN(C(=O)N[C@H]3CC(=O)N(C4CC4)C3)CC2)c(C)o1,2.921725033\nO=c1c2c3nc4ccccc4nc3n(CCC3=CCCCC3)c2ncn1C[C@H]1CCCO1,3.257260117\nCC[NH+](C/C=C/c1ccc(C#N)cc1)C[C@H](C)C#N,4.36293875\nCOc1c(Cl)cc(C[NH2+][C@@H](Cc2c[nH]c3ccccc23)C(=O)[O-])cc1Cl,3.725368177\nCOc1ccccc1[C@H](C)NC(=O)C[C@H]1C[NH
2+]CCO1,3.812797047\nCc1ccc(NC(=O)C(=O)NCCc2ccccc2F)c(C)c1,1.826753956\nCC(C)(C)OC(=O)N[C@H]1CCN(c2cc(-c3cccs3)n[nH]2)C1,3.228025092\nCC[C@H](C)C[NH+]1CCCC[C@@H]1C(=O)NC(C)(C)C,4.787067722\nCOC(=O)c1c(S(=O)(=O)NC2CC2)sc2c1CCN(Cc1ccc3ccccc3c1)C2,2.485526606\nCc1cc(C)cc(NC(=O)[C@@H](Sc2nnnn2C2CC2)c2ccccc2)c1,2.708478583\nC=CCn1c([S-])nnc1-c1sc(NC(=O)c2cccc(NC(C)=O)c2)nc1C,2.943459024\nO=C(CNc1nc(C2CC2)no1)N1CCc2sccc2C1,2.781430253\nCc1ccccc1NC(=O)NCCn1c([N+](=O)[O-])cnc1C,2.258312512\nC[C@H](NC=C(C#N)C#N)c1ccc(-n2cncn2)cc1,3.277555708\nCN1C[C@H](C(=O)NC[C@H]2CCC(C)(C)c3ccccc32)CC1=O,3.259412472\nCC(C)n1cccc1C(=O)N[C@H]1CCc2cc(N)ccc21,2.887282024\nCCN1C(=O)C(C#N)=C(C)/C(=C\\Nc2ccc([N+](=O)[O-])cc2)C1=O,2.501067124\nO=C(Nc1ccnn1Cc1cccc(Cl)c1)C1CCN(S(=O)(=O)c2ccccc2F)CC1,2.296421252\nCOc1cc([C@@H]2N[C@H](C(=O)[O-])CS2)ccc1OC(=O)c1ccccc1,3.273511868\nCCOc1ccc(C2=CCN(C(=O)c3ccccc3F)CC2)cc1,2.000346516\nCC(=O)N1CC[C@@H]([NH2+][C@@H](C)CSCC(C)C)C1,4.483674272\nCCO[P@](=O)(CC(=O)[O-])c1ccccc1,3.510193788\nO=C(CCc1nc2ccccc2c(=O)[nH]1)N1CCC[C@H]1C1CCCC1,2.774610155\nO=C(/C=C/SCc1ccco1)Nc1cccc(N2CCCC2=O)c1,2.584851266\nCC(=O)c1c(C)[nH]c(C(=O)OCC(=O)N2C[C@H](C)C[C@@H](C)C2)c1C,3.068452859\nCc1ccc(C(=O)N2CCC[C@@H](C(=O)N(C)CC(=O)NC(C)C)C2)cc1,2.492863438\nC[C@@H](C#N)Sc1nnc(C23CC4CC(CC(C4)C2)C3)o1,4.576896432\nCC1(C)CCC[C@H]1n1c(N)[nH+]c2ccccc21,4.151966768\nCCOc1ccc(C[NH+]2CCS[C@H]3COCC[C@@H]32)cc1OC,4.514122047\nCOc1cc(OC)c([C@@H](C)[NH2+]C[C@H](O)C2CCOCC2)cc1Cl,3.960984106\nCOC(C[C@@](C)(O)C[C@@H]1CCCN(S(C)(=O)=O)C1)OC,3.669896168\nC=CCSc1nnc(C2CCOCC2)n1N,2.825013358\nO=C(Nc1c[nH]c2ccccc12)N1CCCC[C@@H]1CCO,2.674797385\nCc1nc(-c2ccc(-c3noc(C4CC4)n3)cc2)cs1,2.133219774\nC[C@H]1CCCCN1C(=O)[C@@H](C)NC(=O)C(=O)Nc1ccnn1C(C)(C)C,3.418504414\nc1cc2c(cc1N[C@@H]1CCOC3(CCC3)C1)CCC2,3.363061129\nO=C(Nc1ccc(CNC(=O)N2CCCCCCC2)cc1)c1ccco1,1.931563902\nCNC(=O)[C@H]1CCCCN1c1cccc(F)c1C(N)=S,2.82917551\nO=C(NCc1nnc2n1CCC2)c1cc(-c2ccccc2)[nH]n1,2.376871447\nCc1nnc(-c2cc(CC(C)C)n(-c3cccc(C(=O)N[C@@
H]4C[C@@H](C)CC(C)(C)C4)c3)n2)o1,3.548164288\nCOc1cccc([C@@H]2CC(=O)C3=C(C2)NC(=O)C[C@H]3c2ccc(OC)c(OC)c2OC)c1,3.226440403\nC[NH2+]Cc1ccc(-c2cc(C)ccc2F)s1,3.128631552\nCCC[C@H](NC(N)=O)C(=O)NC[C@@H]1CCCO1,2.869359841\nCOc1cc(F)cc(CNC(=O)[C@H]2CCCN2C(=O)Cc2ccccc2)c1,2.427176885\nCCc1cc(C#N)c(NC(=O)CCC(=O)N(CC)CC)s1,2.493551313\nC#Cc1cccc(NC(=O)C[NH+](C)[C@H](C)c2nnc(-c3ccccc3)o2)c1,3.779297449\nCCOC[C@H](O)[C@](C)(CC)[NH+]1CCCC1,5.127905513\nC[NH+](C)Cc1cccc(CNC(=O)NC[C@H](O)c2ccc(F)cc2)c1,3.24179101\nCc1ccsc1C(=O)NCCNC(=O)c1ccco1,2.114582277\nCOC(=O)[C@H](C)N(Cc1ccccc1)C(=O)Cc1ccon1,2.852054716\nC/C=C(/C)C(=O)NC[C@H]1C[C@@]12CCc1ccccc12,3.940919906\nO=C1Nc2ccccc2Oc2cc([N+](=O)[O-])cc(Oc3ccc(Cl)cc3)c21,2.400946341\nCCc1onc(C)c1NC(=O)CCCC(C)(C)C,2.601600041\nCOC(=O)C(C)(C)COc1ccc2c(c1)OC[C@@H]2[NH3+],3.576525387\nCC#CCCC(=O)Nc1cccc2c1C(=O)c1ccccc1C2=O,2.298959953\nCc1nc(C)c(S(=O)(=O)/N=C(\\[O-])C[C@H]2CCCO2)s1,3.812988405\nCCC(=O)N[C@H](CCSC)C(=O)Nc1ccc2[nH]c(C)cc2c1,2.732126257\nCOCC[NH+](C)Cc1c(C)cc(C)c(C(C)=O)c1C,3.955103091\nCc1cscc1CNC(=O)N[C@@H](C)c1nc(C(=O)[O-])cs1,3.628458628\nCNC(=O)NCC(=O)NC[C@H](c1cccnc1)C(C)C,2.795116489\nCOc1cccc(C(=O)N2CCC[C@H]2[C@H]2CCC[C@@H]2O)c1,3.002385916\nCOc1ccc2c(c1)[C@H]([NH2+][C@H](C)CCN1CCOCC1)CCCO2,3.789130146\nCOc1cccc(OCC(=O)N2CCN(c3nc4ccc(S(C)(=O)=O)cc4s3)CC2)c1,2.245396536\nCOc1cccc(-c2nc(C(=O)O[C@@H](C)CNC(C)=O)c(C)[nH]2)c1,2.86962049\nCCO[C@H]1C[C@@H]1C(=O)Nc1ccc(Sc2nncs2)c(Cl)c1,3.445502237\nCc1ccc(N(CC(=O)N2CCCC[C@H]2C)S(=O)(=O)c2c(C)nn(C)c2C)cc1,2.872766629\nCOc1ccc(C(=O)Nc2nc(CC(=O)Nc3ccc(N(C)C)cc3)cs2)cc1,1.983649537\nO=c1c2ccccc2sn1-c1ncc(Br)s1,2.853247472\nCc1ocnc1CNC(=O)N(C)Cc1ccc(OC(F)F)cc1,2.60278761\nCOc1cc(Cl)c(C)cc1NC(=O)C(=O)NCc1ccccc1C[NH+](C)C,2.870575026\nC[C@@H]([NH3+])c1cccc(Oc2cc(Br)ccc2Cl)c1,2.965149266\nCC(C)NS(=O)(=O)c1cccc(C(=O)NCc2ccco2)c1,1.961616674\nCN(Cc1ncnn1C)C(=O)CSCc1ccccn1,2.529083407\n"
  },
  {
    "path": "graphium/data/multitask/tiny_ZINC_logp.csv",
    "content": "SMILES,logp\nCc1cccc(COC(=O)Nc2ccc(-c3cn[nH]c3)cc2OCCN2CCCC2)c1.O=C(O)C(F)(F)F.O=C(O)C(F)(F)F,5.87502\nCCC(C)C1=C(C=CC=C1)OCCC[N]2C=CN=C2.O=C(O)C(=O)O,3.0213\nCCc1ccc(C(=O)/C(=C(/S)NC2CC2)[n+]2ccc(CC)cc2)cc1,3.7897\nC=CCOc1ccc(C(F)(F)F)cc1C(=O)NC[C@H]1CCC[C@H]1O,3.161\nCOc1cc(OC)cc([C@@H](NC(=O)Cn2ccccc2=O)c2nccn2C)c1,1.5048\nCN1CCC[C@@]2(CCN(C(=O)CN3CCNC(=O)C3)C2)C1=O,-1.1109\nCOc1ccc(Cl)cc1NC(=O)c1nn(C)cc1[N+](=O)[O-],2.2426\nCc1nn(-c2ccccc2)c(O)c1/C=[NH+]/Cc1ccncc1,0.98102\nC=CCN1C(=O)C(C)(C)COc2ccc(NC(=O)C3(c4ccc(OC)cc4)CCOCC3)cc21,4.3197\nCON(C)C(=O)c1cnn2c(C)cc(C)nc12,0.97954\nNC(=O)[C@@H](NC(=O)CCCc1cccs1)c1ccccc1,2.4136\nCOc1cccc(CN2CCC[NH+](CC(=O)Nc3ccc(F)cc3)S2(=O)=O)c1,0.8084\nCc1ccc(-c2nc3ccc(C)c(C)c3[nH]2)nc1,3.55016\nCc1n[nH]c(SCC(=O)Nc2cccc(C#Cc3ccccn3)c2)n1,2.63872\nCOC(=O)c1cncc(C(=O)Nc2cccc(COC(C)C)c2)c1,3.0455\nCc1ccc(C)c(Oc2ccc(CNC(=O)N3CCCSCC3)cn2)c1,4.13924\nCC[NH2+][C@@]1(C(=O)[O-])CCC[C@H](Oc2ccc(CC)cc2)C1,0.6424\nCC(=O)N1c2ccc(S(=O)(=O)N3CCCC3)cc2C[C@H]1C(=O)NCC[NH+](C)C1CCCCC1,0.7123\nC=CCN1CC(=O)N2[C@@H](Cc3c([nH]c4ccccc34)[C@@H]2c2ccccc2OC)C1=O,3.0474\nCOc1ccc(-c2noc3ncnc(N4CCC[C@H](C(=O)[O-])C4)c23)cc1,1.2597\nN#Cc1cc(NC(=O)Cc2ccccc2Cl)ccc1N1CCC(O)CC1,3.35398\nCc1ncc(-c2ccc(S(=O)(=O)NC3CCCCCC3)s2)o1,3.71262\nCc1ccc(NC(=O)c2ccc(F)cc2F)cc1S(=O)(=O)Nc1ccc(Cl)cc1,4.97972\nCNC(=O)[C@@H]1CCC[NH+]1Cc1ccc(C)c(F)c1,0.42742\nCN1C(=O)N[C@@H](c2cccc([N+](=O)[O-])c2)C2=C1CN(c1ccc(F)cc1)C2=O,2.7309\nCc1cc(CC(=O)N[C@H](c2ccc(F)cc2)C2CCC2)no1,3.32222\nCc1cc(N2CCN(CC(=O)NC3CC3)CC2)nc(-c2ccc([N+](=O)[O-])cc2)n1,1.76082\nCc1cc(C(=O)N2CCN(C(=O)N[C@H]3CC(=O)N(C4CC4)C3)CC2)c(C)o1,1.12714\nO=c1c2c3nc4ccccc4nc3n(CCC3=CCCCC3)c2ncn1C[C@H]1CCCO1,4.3639\nCC[NH+](C/C=C/c1ccc(C#N)cc1)C[C@H](C)C#N,1.63596\nCOc1c(Cl)cc(C[NH2+][C@@H](Cc2c[nH]c3ccccc23)C(=O)[O-])cc1Cl,1.9079\nCOc1ccccc1[C@H](C)NC(=O)C[C@H]1C[NH2+]CCO1,0.2247\nCc1ccc(NC(=O)C(=O)NCCc2ccccc2F)c(C)c1,2.73994\nCC(C)(C)OC(=O)N[C@H]1CCN(c2cc(-c3cccs3)n[nH]2)C1,3.2416\nCC[C@H](C)C[NH+]1CCCC[C@@H]
1C(=O)NC(C)(C)C,1.3846\nCOC(=O)c1c(S(=O)(=O)NC2CC2)sc2c1CCN(Cc1ccc3ccccc3c1)C2,3.6869\nCc1cc(C)cc(NC(=O)[C@@H](Sc2nnnn2C2CC2)c2ccccc2)c1,4.09694\nC=CCn1c([S-])nnc1-c1sc(NC(=O)c2cccc(NC(C)=O)c2)nc1C,3.01252\nO=C(CNc1nc(C2CC2)no1)N1CCc2sccc2C1,2.0053\nCc1ccccc1NC(=O)NCCn1c([N+](=O)[O-])cnc1C,2.22984\nC[C@H](NC=C(C#N)C#N)c1ccc(-n2cncn2)cc1,1.84896\nCN1C[C@H](C(=O)NC[C@H]2CCC(C)(C)c3ccccc32)CC1=O,2.4361\nCC(C)n1cccc1C(=O)N[C@H]1CCc2cc(N)ccc21,3.0685\nCCN1C(=O)C(C#N)=C(C)/C(=C\\Nc2ccc([N+](=O)[O-])cc2)C1=O,2.11938\nO=C(Nc1ccnn1Cc1cccc(Cl)c1)C1CCN(S(=O)(=O)c2ccccc2F)CC1,3.7633\nCOc1cc([C@@H]2N[C@H](C(=O)[O-])CS2)ccc1OC(=O)c1ccccc1,1.3679\nCCOc1ccc(C2=CCN(C(=O)c3ccccc3F)CC2)cc1,4.1539\nCC(=O)N1CC[C@@H]([NH2+][C@@H](C)CSCC(C)C)C1,0.9483\nCCO[P@](=O)(CC(=O)[O-])c1ccccc1,0.3764\nO=C(CCc1nc2ccccc2c(=O)[nH]1)N1CCC[C@H]1C1CCCC1,3.0369\nO=C(/C=C/SCc1ccco1)Nc1cccc(N2CCCC2=O)c1,3.792\nCC(=O)c1c(C)[nH]c(C(=O)OCC(=O)N2C[C@H](C)C[C@@H](C)C2)c1C,2.49544\nCc1ccc(C(=O)N2CCC[C@@H](C(=O)N(C)CC(=O)NC(C)C)C2)cc1,1.83022\nC[C@@H](C#N)Sc1nnc(C23CC4CC(CC(C4)C2)C3)o1,3.54158\nCC1(C)CCC[C@H]1n1c(N)[nH+]c2ccccc21,2.7888\nCCOc1ccc(C[NH+]2CCS[C@H]3COCC[C@@H]32)cc1OC,1.3831\nCOc1cc(OC)c([C@@H](C)[NH2+]C[C@H](O)C2CCOCC2)cc1Cl,1.7691\nCOC(C[C@@](C)(O)C[C@@H]1CCCN(S(C)(=O)=O)C1)OC,0.8081\nC=CCSc1nnc(C2CCOCC2)n1N,1.164\nO=C(Nc1c[nH]c2ccccc12)N1CCCC[C@@H]1CCO,2.9367\nCc1nc(-c2ccc(-c3noc(C4CC4)n3)cc2)cs1,4.04592\nC[C@H]1CCCCN1C(=O)[C@@H](C)NC(=O)C(=O)Nc1ccnn1C(C)(C)C,1.4823\nc1cc2c(cc1N[C@@H]1CCOC3(CCC3)C1)CCC2,3.6889\nO=C(Nc1ccc(CNC(=O)N2CCCCCCC2)cc1)c1ccco1,4.0076\nCNC(=O)[C@H]1CCCCN1c1cccc(F)c1C(N)=S,1.5648\nO=C(NCc1nnc2n1CCC2)c1cc(-c2ccccc2)[nH]n1,1.5444\nCc1nnc(-c2cc(CC(C)C)n(-c3cccc(C(=O)N[C@@H]4C[C@@H](C)CC(C)(C)C4)c3)n2)o1,5.37382\nCOc1cccc([C@@H]2CC(=O)C3=C(C2)NC(=O)C[C@H]3c2ccc(OC)c(OC)c2OC)c1,3.7253\nC[NH2+]Cc1ccc(-c2cc(C)ccc2F)s1,2.55582\nCCC[C@H](NC(N)=O)C(=O)NC[C@@H]1CCCO1,0.1186\nCOc1cc(F)cc(CNC(=O)[C@H]2CCCN2C(=O)Cc2ccccc2)c1,2.6842\nCCc1cc(C#N)c(NC(=O)CCC(=O)N(CC)CC)s1,2.76928\nC#Cc1cccc(N
C(=O)C[NH+](C)[C@H](C)c2nnc(-c3ccccc3)o2)c1,1.9323\nCCOC[C@H](O)[C@](C)(CC)[NH+]1CCCC1,0.2312\nC[NH+](C)Cc1cccc(CNC(=O)NC[C@H](O)c2ccc(F)cc2)c1,1.003\nCc1ccsc1C(=O)NCCNC(=O)c1ccco1,1.80932\nCOC(=O)[C@H](C)N(Cc1ccccc1)C(=O)Cc1ccon1,1.8074\nC/C=C(/C)C(=O)NC[C@H]1C[C@@]12CCc1ccccc12,2.9729\nO=C1Nc2ccccc2Oc2cc([N+](=O)[O-])cc(Oc3ccc(Cl)cc3)c21,5.3985\nCCc1onc(C)c1NC(=O)CCCC(C)(C)C,3.70032\nCOC(=O)C(C)(C)COc1ccc2c(c1)OC[C@@H]2[NH3+],0.94\nCC#CCCC(=O)Nc1cccc2c1C(=O)c1ccccc1C2=O,3.204\nCc1nc(C)c(S(=O)(=O)/N=C(\\[O-])C[C@H]2CCCO2)s1,0.77654\nCCC(=O)N[C@H](CCSC)C(=O)Nc1ccc2[nH]c(C)cc2c1,3.06272\nCOCC[NH+](C)Cc1c(C)cc(C)c(C(C)=O)c1C,1.47556\nCc1cscc1CNC(=O)N[C@@H](C)c1nc(C(=O)[O-])cs1,1.43692\nCNC(=O)NCC(=O)NC[C@H](c1cccnc1)C(C)C,0.8664\nCOc1cccc(C(=O)N2CCC[C@H]2[C@H]2CCC[C@@H]2O)c1,2.4608\nCOc1ccc2c(c1)[C@H]([NH2+][C@H](C)CCN1CCOCC1)CCCO2,1.5831\nCOc1cccc(OCC(=O)N2CCN(c3nc4ccc(S(C)(=O)=O)cc4s3)CC2)c1,2.436\nCOc1cccc(-c2nc(C(=O)O[C@@H](C)CNC(C)=O)c(C)[nH]2)c1,2.07512\nCCO[C@H]1C[C@@H]1C(=O)Nc1ccc(Sc2nncs2)c(Cl)c1,3.7062\nCc1ccc(N(CC(=O)N2CCCC[C@H]2C)S(=O)(=O)c2c(C)nn(C)c2C)cc1,2.94166\nCOc1ccc(C(=O)Nc2nc(CC(=O)Nc3ccc(N(C)C)cc3)cs2)cc1,3.6512\nO=c1c2ccccc2sn1-c1ncc(Br)s1,3.2712\nCc1ocnc1CNC(=O)N(C)Cc1ccc(OC(F)F)cc1,2.92602\nCOc1cc(Cl)c(C)cc1NC(=O)C(=O)NCc1ccccc1C[NH+](C)C,1.55642\nC[C@@H]([NH3+])c1cccc(Oc2cc(Br)ccc2Cl)c1,4.1977\nCC(C)NS(=O)(=O)c1cccc(C(=O)NCc2ccco2)c1,1.8963\nCN(Cc1ncnn1C)C(=O)CSCc1ccccn1,1.1019\n"
  },
  {
    "path": "graphium/data/multitask/tiny_ZINC_score.csv",
    "content": "SMILES,score\nCc1cccc(COC(=O)Nc2ccc(-c3cn[nH]c3)cc2OCCN2CCCC2)c1.O=C(O)C(F)(F)F.O=C(O)C(F)(F)F,2.978505049769262\nCCC(C)C1=C(C=CC=C1)OCCC[N]2C=CN=C2.O=C(O)C(=O)O,0.35198008204428\nCCc1ccc(C(=O)/C(=C(/S)NC2CC2)[n+]2ccc(CC)cc2)cc1,1.014510522\nC=CCOc1ccc(C(F)(F)F)cc1C(=O)NC[C@H]1CCC[C@H]1O,0.089502102\nCOc1cc(OC)cc([C@@H](NC(=O)Cn2ccccc2=O)c2nccn2C)c1,-1.33520896\nCN1CCC[C@@]2(CCN(C(=O)CN3CCNC(=O)C3)C2)C1=O,-4.716068457\nCOc1ccc(Cl)cc1NC(=O)c1nn(C)cc1[N+](=O)[O-],0.109935327\nCc1nn(-c2ccccc2)c(O)c1/C=[NH+]/Cc1ccncc1,-2.136619223\nC=CCN1C(=O)C(C)(C)COc2ccc(NC(=O)C3(c4ccc(OC)cc4)CCOCC3)cc21,1.574910254\nCON(C)C(=O)c1cnn2c(C)cc(C)nc12,-1.568307184\nNC(=O)[C@@H](NC(=O)CCCc1cccs1)c1ccccc1,-0.031143614\nCOc1cccc(CN2CCC[NH+](CC(=O)Nc3ccc(F)cc3)S2(=O)=O)c1,-2.737911784\nCc1ccc(-c2nc3ccc(C)c(C)c3[nH]2)nc1,1.207643217\nCc1n[nH]c(SCC(=O)Nc2cccc(C#Cc3ccccn3)c2)n1,-0.046296252\nCOC(=O)c1cncc(C(=O)Nc2cccc(COC(C)C)c2)c1,1.00130715\nCc1ccc(C)c(Oc2ccc(CNC(=O)N3CCCSCC3)cn2)c1,1.769344664\nCC[NH2+][C@@]1(C(=O)[O-])CCC[C@H](Oc2ccc(CC)cc2)C1,-3.523151538\nCC(=O)N1c2ccc(S(=O)(=O)N3CCCC3)cc2C[C@H]1C(=O)NCC[NH+](C)C1CCCCC1,-3.276779408\nC=CCN1CC(=O)N2[C@@H](Cc3c([nH]c4ccccc34)[C@@H]2c2ccccc2OC)C1=O,-0.240167863\nCOc1ccc(-c2noc3ncnc(N4CCC[C@H](C(=O)[O-])C4)c23)cc1,-1.893174099\nN#Cc1cc(NC(=O)Cc2ccccc2Cl)ccc1N1CCC(O)CC1,1.180194377\nCc1ncc(-c2ccc(S(=O)(=O)NC3CCCCCC3)s2)o1,1.198896643\nCc1ccc(NC(=O)c2ccc(F)cc2F)cc1S(=O)(=O)Nc1ccc(Cl)cc1,3.032271232\nCNC(=O)[C@@H]1CCC[NH+]1Cc1ccc(C)c(F)c1,-3.963918086\nCN1C(=O)N[C@@H](c2cccc([N+](=O)[O-])c2)C2=C1CN(c1ccc(F)cc1)C2=O,-0.225172154\nCc1cc(CC(=O)N[C@H](c2ccc(F)cc2)C2CCC2)no1,0.657088361\nCc1cc(N2CCN(CC(=O)NC3CC3)CC2)nc(-c2ccc([N+](=O)[O-])cc2)n1,-0.488749987\nCc1cc(C(=O)N2CCN(C(=O)N[C@H]3CC(=O)N(C4CC4)C3)CC2)c(C)o1,-1.794585033\nO=c1c2c3nc4ccccc4nc3n(CCC3=CCCCC3)c2ncn1C[C@H]1CCCO1,1.106639883\nCC[NH+](C/C=C/c1ccc(C#N)cc1)C[C@H](C)C#N,-2.72697875\nCOc1c(Cl)cc(C[NH2+][C@@H](Cc2c[nH]c3ccccc23)C(=O)[O-])cc1Cl,-1.817468177\nCOc1ccccc1[C@H]
(C)NC(=O)C[C@H]1C[NH2+]CCO1,-3.588097047\nCc1ccc(NC(=O)C(=O)NCCc2ccccc2F)c(C)c1,0.913186044\nCC(C)(C)OC(=O)N[C@H]1CCN(c2cc(-c3cccs3)n[nH]2)C1,0.013574908\nCC[C@H](C)C[NH+]1CCCC[C@@H]1C(=O)NC(C)(C)C,-3.402467722\nCOC(=O)c1c(S(=O)(=O)NC2CC2)sc2c1CCN(Cc1ccc3ccccc3c1)C2,1.201373394\nCc1cc(C)cc(NC(=O)[C@@H](Sc2nnnn2C2CC2)c2ccccc2)c1,1.388461417\nC=CCn1c([S-])nnc1-c1sc(NC(=O)c2cccc(NC(C)=O)c2)nc1C,0.069060976\nO=C(CNc1nc(C2CC2)no1)N1CCc2sccc2C1,-0.776130253\nCc1ccccc1NC(=O)NCCn1c([N+](=O)[O-])cnc1C,-0.028472512\nC[C@H](NC=C(C#N)C#N)c1ccc(-n2cncn2)cc1,-1.428595708\nCN1C[C@H](C(=O)NC[C@H]2CCC(C)(C)c3ccccc32)CC1=O,-0.823312472\nCC(C)n1cccc1C(=O)N[C@H]1CCc2cc(N)ccc21,0.181217976\nCCN1C(=O)C(C#N)=C(C)/C(=C\\Nc2ccc([N+](=O)[O-])cc2)C1=O,-0.381687124\nO=C(Nc1ccnn1Cc1cccc(Cl)c1)C1CCN(S(=O)(=O)c2ccccc2F)CC1,1.466878748\nCOc1cc([C@@H]2N[C@H](C(=O)[O-])CS2)ccc1OC(=O)c1ccccc1,-1.905611868\nCCOc1ccc(C2=CCN(C(=O)c3ccccc3F)CC2)cc1,2.153553484\nCC(=O)N1CC[C@@H]([NH2+][C@@H](C)CSCC(C)C)C1,-3.535374272\nCCO[P@](=O)(CC(=O)[O-])c1ccccc1,-3.133793788\nO=C(CCc1nc2ccccc2c(=O)[nH]1)N1CCC[C@H]1C1CCCC1,0.262289845\nO=C(/C=C/SCc1ccco1)Nc1cccc(N2CCCC2=O)c1,1.207148734\nCC(=O)c1c(C)[nH]c(C(=O)OCC(=O)N2C[C@H](C)C[C@@H](C)C2)c1C,-0.573012859\nCc1ccc(C(=O)N2CCC[C@@H](C(=O)N(C)CC(=O)NC(C)C)C2)cc1,-0.662643438\nC[C@@H](C#N)Sc1nnc(C23CC4CC(CC(C4)C2)C3)o1,-1.035316432\nCC1(C)CCC[C@H]1n1c(N)[nH+]c2ccccc21,-1.363166768\nCCOc1ccc(C[NH+]2CCS[C@H]3COCC[C@@H]32)cc1OC,-3.131022047\nCOc1cc(OC)c([C@@H](C)[NH2+]C[C@H](O)C2CCOCC2)cc1Cl,-2.191884106\nCOC(C[C@@](C)(O)C[C@@H]1CCCN(S(C)(=O)=O)C1)OC,-2.861796168\nC=CCSc1nnc(C2CCOCC2)n1N,-1.661013358\nO=C(Nc1c[nH]c2ccccc12)N1CCCC[C@@H]1CCO,0.261902615\nCc1nc(-c2ccc(-c3noc(C4CC4)n3)cc2)cs1,1.912700226\nC[C@H]1CCCCN1C(=O)[C@@H](C)NC(=O)C(=O)Nc1ccnn1C(C)(C)C,-1.936204414\nc1cc2c(cc1N[C@@H]1CCOC3(CCC3)C1)CCC2,0.325838871\nO=C(Nc1ccc(CNC(=O)N2CCCCCCC2)cc1)c1ccco1,2.076036098\nCNC(=O)[C@H]1CCCCN1c1cccc(F)c1C(N)=S,-1.26437551\nO=C(NCc1nnc2n1CCC2)c1cc(-c2ccccc2)[nH]n1,-0.832471447\
nCc1nnc(-c2cc(CC(C)C)n(-c3cccc(C(=O)N[C@@H]4C[C@@H](C)CC(C)(C)C4)c3)n2)o1,1.825655712\nCOc1cccc([C@@H]2CC(=O)C3=C(C2)NC(=O)C[C@H]3c2ccc(OC)c(OC)c2OC)c1,0.498859597\nC[NH2+]Cc1ccc(-c2cc(C)ccc2F)s1,-0.572811552\nCCC[C@H](NC(N)=O)C(=O)NC[C@@H]1CCCO1,-2.750759841\nCOc1cc(F)cc(CNC(=O)[C@H]2CCCN2C(=O)Cc2ccccc2)c1,0.257023115\nCCc1cc(C#N)c(NC(=O)CCC(=O)N(CC)CC)s1,0.275728687\nC#Cc1cccc(NC(=O)C[NH+](C)[C@H](C)c2nnc(-c3ccccc3)o2)c1,-1.846997449\nCCOC[C@H](O)[C@](C)(CC)[NH+]1CCCC1,-4.896705513\nC[NH+](C)Cc1cccc(CNC(=O)NC[C@H](O)c2ccc(F)cc2)c1,-2.23879101\nCc1ccsc1C(=O)NCCNC(=O)c1ccco1,-0.305262277\nCOC(=O)[C@H](C)N(Cc1ccccc1)C(=O)Cc1ccon1,-1.044654716\nC/C=C(/C)C(=O)NC[C@H]1C[C@@]12CCc1ccccc12,-0.968019906\nO=C1Nc2ccccc2Oc2cc([N+](=O)[O-])cc(Oc3ccc(Cl)cc3)c21,2.997553659\nCCc1onc(C)c1NC(=O)CCCC(C)(C)C,1.098719959\nCOC(=O)C(C)(C)COc1ccc2c(c1)OC[C@@H]2[NH3+],-2.636525387\nCC#CCCC(=O)Nc1cccc2c1C(=O)c1ccccc1C2=O,0.905040047\nCc1nc(C)c(S(=O)(=O)/N=C(\\[O-])C[C@H]2CCCO2)s1,-3.036448405\nCCC(=O)N[C@H](CCSC)C(=O)Nc1ccc2[nH]c(C)cc2c1,0.330593743\nCOCC[NH+](C)Cc1c(C)cc(C)c(C(C)=O)c1C,-2.479543091\nCc1cscc1CNC(=O)N[C@@H](C)c1nc(C(=O)[O-])cs1,-2.191538628\nCNC(=O)NCC(=O)NC[C@H](c1cccnc1)C(C)C,-1.928716489\nCOc1cccc(C(=O)N2CCC[C@H]2[C@H]2CCC[C@@H]2O)c1,-0.541585916\nCOc1ccc2c(c1)[C@H]([NH2+][C@H](C)CCN1CCOCC1)CCCO2,-2.206030146\nCOc1cccc(OCC(=O)N2CCN(c3nc4ccc(S(C)(=O)=O)cc4s3)CC2)c1,0.190603464\nCOc1cccc(-c2nc(C(=O)O[C@@H](C)CNC(C)=O)c(C)[nH]2)c1,-0.79450049\nCCO[C@H]1C[C@@H]1C(=O)Nc1ccc(Sc2nncs2)c(Cl)c1,0.260697763\nCc1ccc(N(CC(=O)N2CCCC[C@H]2C)S(=O)(=O)c2c(C)nn(C)c2C)cc1,0.068893371\nCOc1ccc(C(=O)Nc2nc(CC(=O)Nc3ccc(N(C)C)cc3)cs2)cc1,1.667550463\nO=c1c2ccccc2sn1-c1ncc(Br)s1,0.417952528\nCc1ocnc1CNC(=O)N(C)Cc1ccc(OC(F)F)cc1,0.32323239\nCOc1cc(Cl)c(C)cc1NC(=O)C(=O)NCc1ccccc1C[NH+](C)C,-1.314155026\nC[C@@H]([NH3+])c1cccc(Oc2cc(Br)ccc2Cl)c1,1.232550734\nCC(C)NS(=O)(=O)c1cccc(C(=O)NCc2ccco2)c1,-0.065316674\nCN(Cc1ncnn1C)C(=O)CSCc1ccccn1,-1.427183407\n"
  },
  {
    "path": "graphium/data/normalization.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Optional\nfrom loguru import logger\nimport numpy as np\nimport torch\nfrom torch import Tensor\n\n\nclass LabelNormalization:\n    def __init__(\n        self,\n        method: Optional[str] = None,\n        min_clipping: Optional[int] = None,\n        max_clipping: Optional[int] = None,\n        normalize_val_test: Optional[bool] = False,\n        verbose: Optional[bool] = True,\n    ):\n        \"\"\"\n        Parameters:\n        method: str\n            Normalization method. Supports the following values:\n            - `None` (default): No normalization applied\n            - `normal`: Normalize to have 0-mean and 1-variance\n            - `unit`: Normalize to have all values in the range 0-1\n\n        min_clipping: int\n            Minimum value to clip to. If `None` (default), no clipping is applied.\n            For example, if `min_clipping` is -2, all values below -2 will be clipped to -2.\n            This is applied before the normalization.\n\n        max_clipping: int\n            Maximum value to clip to. 
If `None` (default), no clipping is applied.\n            For example, if `max_clipping` is 2, all values above 2 will be clipped to 2.\n            This is applied before the normalization.\n        \"\"\"\n\n        self.method = method\n        self.min_clipping = min_clipping\n        self.max_clipping = max_clipping\n        self.normalize_val_test = normalize_val_test\n        self.verbose = verbose\n        self.data_max = None\n        self.data_min = None\n        self.data_mean = None\n        self.data_std = None\n\n    def calculate_statistics(self, array):\n        \"\"\"\n        Saves the normalization parameters (e.g. mean and variance) to the object.\n        \"\"\"\n        # add axis = 0 to make sure that the statistics for multiple column labels is a vector or list instead of a scalar\n        self.data_max = np.nanmax(array, axis=0).tolist()\n        self.data_min = np.nanmin(array, axis=0).tolist()\n        self.data_mean = np.nanmean(array, axis=0).tolist()  # 5.380503871833475 for pcqm4mv2\n        self.data_std = np.nanstd(array, axis=0).tolist()  # 1.17850688410978995 for pcqm4mv2\n        if self.verbose:\n            logger.info(f\"Max value for normalization '{self.data_max}'\")\n            logger.info(f\"Min value for normalization '{self.data_min}'\")\n            logger.info(f\"Mean value for normalization '{self.data_mean}'\")\n            logger.info(f\"STD value for normalization '{self.data_std}'\")\n\n    def normalize(self, input):\n        \"\"\"\n        Apply the normalization method to the data.\n        Saves the normalization parameters (e.g. 
mean and variance) to the object.\n        \"\"\"\n        assert self.data_max is not None, \"calculate_statistic must be called before applying normalization\"\n        if self.min_clipping is not None:\n            self.data_min = max(self.min_clipping, self.data_min)\n        if self.max_clipping is not None:\n            self.data_max = min(self.max_clipping, self.data_max)\n        clipping = self.min_clipping is not None and self.max_clipping is not None\n        # Need to check since np.clip fails if both a_min and a_max are None\n        if clipping:\n            if isinstance(input, np.ndarray):\n                input = np.clip(input, a_min=self.data_min, a_max=self.data_max)\n            elif isinstance(input, Tensor):\n                input = torch.clip(input, min=self.data_min, max=self.data_max)\n        if self.method is None:\n            return input\n        elif self.method == \"normal\":\n            return (input - self.data_mean) / self.data_std\n        elif self.method == \"unit\":\n            return (input - self.data_min) / (self.data_max - self.data_min)\n        else:\n            raise ValueError(f\"normalization method {self.method} not recognised.\")\n\n    def denormalize(self, input):\n        \"\"\"\n        Apply the inverse of the normalization method to the data.\n        \"\"\"\n        if self.method is None:\n            return input\n        elif self.method == \"normal\":\n            mean, std = torch.tensor(self.data_mean), torch.tensor(self.data_std)\n            if input.device.type != \"ipu\":  # Cast to device if not on IPU\n                mean, std = mean.to(input.device), std.to(input.device)\n            return (input * std) + mean\n        elif self.method == \"unit\":\n            dmax, dmin = torch.tensor(self.data_max), torch.tensor(self.data_min)\n            if input.device.type != \"ipu\":  # Cast to device if not on IPU\n                dmax, dmin = dmax.to(input.device), dmin.to(input.device)\n           
 return input * (dmax - dmin) + dmin\n        else:\n            raise ValueError(f\"normalization method {self.method} not recognised.\")\n"
  },
  {
    "path": "graphium/data/sampler.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Dict, Optional\nfrom torch.utils.data.dataloader import Dataset\n\nimport torch.utils.data as data_utils\nimport numpy as np\nimport time\nimport os\nimport torch\n\n\nclass DatasetSubSampler(data_utils.Sampler):\n    def __init__(\n        self, dataset: Dataset, sampler_task_dict: Dict[str, Optional[float]], data_path: str, data_hash: str\n    ):\n        \"\"\"\n        Random sample a subset of the dataset for each task each epoch and combine them for training.\n\n        Parameters:\n            dataset: the whole training dataset\n            sampler_task_dict: a dict which indicates the sampled amount of data for each task.\n            data_path: path to save the data indices.\n            data_hash: hash folder name for the data indices.\n        \"\"\"\n        self.dataset = dataset\n        self.sampler_task_dict = sampler_task_dict\n        self.task_indices = {}\n        now = time.time()\n        path_with_hash = os.path.join(data_path, data_hash)\n        os.makedirs(path_with_hash, exist_ok=True)\n        filename = os.path.join(path_with_hash, \"dataset_indices.pkl\")\n        # if file dataset_indices.pkl exists, load indices from file.\n        if os.path.isfile(filename):\n            print(f\"Sampler--loading dataset indices from disk.\")\n            self.task_indices 
= torch.load(filename)\n        # if not, iterate through all data items in dataset and save indices to disk\n        else:\n            for i in range(len(dataset)):\n                # the dataset[i][\"labels\"].keys() are not deterministic, need to sort by key length\n                task_name = sorted(dataset[i][\"labels\"].keys(), key=len)[-1]\n                if task_name not in self.task_indices:\n                    self.task_indices[task_name] = []\n                self.task_indices[task_name].append(i)\n            elapsed = round(time.time() - now)\n            print(f\"Sampler-->time spent on getting indices: {elapsed}.\")\n            torch.save(self.task_indices, filename, pickle_protocol=4)\n\n    def __iter__(self):\n        indices = []\n        for task_name in self.task_indices.keys():\n            task_size = int(len(self.task_indices[task_name]) * self.sampler_task_dict[task_name])\n            indices += np.random.choice(self.task_indices[task_name], task_size, replace=False).tolist()\n            indices_set = set(indices)\n            self.total_size = len(indices_set)\n        return iter(indices_set)\n\n    def __len__(self):\n        return self.total_size\n\n    @classmethod\n    def check_sampling_required(cls, sampler_task_dict):\n        \"\"\"\n        Check if we need subsampling: if all items in the sampler_task_dict are 1.0,\n        skip subsampling.\n        \"\"\"\n        return not all(value == 1.0 for value in sampler_task_dict.values())\n"
  },
  {
    "path": "graphium/data/sdf2csv.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs and Recursion Pharmaceuticals are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nimport datamol as dm\nimport zipfile\nimport pandas as pd\nfrom rdkit import Chem\nfrom ogb.lsc import PCQM4Mv2Dataset\nimport graphium\nfrom graphium.data.datamodule import BaseDataModule\nimport csv\n\n\ndef extract_zip(fname):\n    \"\"\"\n    #extract sdf from zip\n    \"\"\"\n    zf = zipfile.ZipFile(fname)\n    zf.extractall(\".\")\n\n\ndef extract_mols_from_sdf(fname):\n    \"\"\"\n    load sdf into mols\n    \"\"\"\n    mol_df = BaseDataModule._read_sdf(fname)\n    mols = mol_df[\"_rdkit_molecule_obj\"]\n    return mols\n\n\ndef mols2cxs(mols):\n    \"\"\"\n    convert into a smiles that contains the 3D structure\n    \"\"\"\n    cxs = []\n    for mol in mols:\n        cxs.append(dm.to_smiles(mol, cxsmiles=True))\n    return cxs\n\n\ndef write_csv(cxs: any, homos: any, fname: str):\n    \"\"\"\n    write cxsmiles and homo lomo to file\n    write the training molecules with cxsmiles first\n    \"\"\"\n    outname = fname + \".csv\"\n    fieldnames = [\"cxsmiles\", \"homo_lumo_gap\"]\n\n    with open(outname, \"w\") as outfile:\n        writer = csv.DictWriter(outfile, fieldnames=fieldnames)\n        writer.writeheader()\n        for i in range(len(cxs)):\n            writer.writerow({\"cxsmiles\": cxs[i], \"homo_lumo_gap\": homos[i]})\n\n\ndef sdf2csv(sdf_name: str = \"pcqm4m-v2-train\", outname: str = \"pcqm4m-v2-train\"):\n    \"\"\"\n    
function converting sdf file molecules into csv format using CXSmiles and combine with mols without 3d positions from ogb\n    Parameters:\n        sdf_name: name to the extracted sdf file\n        outname: output name of the cxsmile file\n    \"\"\"\n    mols = extract_mols_from_sdf(sdf_name + \".sdf\")\n\n    # download ogb smiles\n    dataset = PCQM4Mv2Dataset(root=\".\", only_smiles=True)  # (smiles, homo_lomo)\n    homos = []\n    for i in range(len(mols)):\n        homo = dataset[i][1]\n        homos.append(homo)\n\n    # write the trainning set molecules first with cxsmiles\n    cxs = mols2cxs(mols)\n    for j in range(len(mols), len(dataset)):\n        cxs.append(dataset[j][0])\n        homos.append(dataset[j][1])\n\n    write_csv(cxs, homos, outname)\n\n\nif __name__ == \"__main__\":\n    \"\"\"\n    #* main function\n    #! this script need to be located at the specific data folder as it uses relative dependencies\n    for example   #* graphium/data/PCQM4Mv2\n\n    instruction on how to generate the csv file:\n    1. download the extract the sdf file from ogb: https://ogb.stanford.edu/docs/lsc/pcqm4mv2/\n    $ wget http://ogb-data.stanford.edu/data/lsc/pcqm4m-v2-train.sdf.tar.gz\n    $ md5sum pcqm4m-v2-train.sdf.tar.gz\n    $ tar -xf pcqm4m-v2-train.sdf.tar.gz\n    2. run this function (the smiles csv file will be directed downloaded in code)\n    \"\"\"\n    sdf_name = \"pcqm4m-v2-train\"\n    outname = \"pcqm4m-v2-train\"\n    sdf2csv(sdf_name=sdf_name, outname=outname)\n\n    #! check how many warning you get from loading cxsmiles\n    # path = \"pcqm4m-v2-train.csv\"\n    # df = pd.read_csv(path)\n    # smiles = df[\"cxsmiles\"]\n    # print (smiles[0])\n    # graphium.data.datamodule.smiles_to_unique_mol_ids(smiles)\n"
  },
  {
    "path": "graphium/data/single_atom_dataset/single_atom_dataset.csv",
    "content": "SMILES,score\n[CL-],17\n[Br-],35\nC,6\nO,8\nS,16\n[O-],8\nN,7"
  },
  {
    "path": "graphium/data/smiles_transform.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs and Recursion Pharmaceuticals are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Type, List, Dict, Union, Any, Callable, Optional, Tuple, Iterable\n\nimport os\nimport datamol as dm\n\n\ndef smiles_to_unique_mol_id(smiles: str) -> Optional[str]:\n    \"\"\"\n    Convert a smiles to a unique MD5 Hash ID. Returns None if featurization fails.\n    Parameters:\n        smiles: A smiles string to be converted to a unique ID\n    Returns:\n        mol_id: a string unique ID\n    \"\"\"\n    try:\n        mol = dm.to_mol(\n            mol=smiles\n        )  # Doesn't need `ordered=True` because the unique_id doesn't depend on the atom order\n        mol_id = dm.unique_id(mol)\n    except:\n        mol_id = \"\"\n    if mol_id is None:\n        mol_id = \"\"\n    return mol_id\n\n\ndef did_featurization_fail(features: Any) -> bool:\n    \"\"\"\n    Check if a featurization failed.\n    \"\"\"\n    return (features is None) or isinstance(features, str)\n\n\nclass BatchingSmilesTransform:\n    \"\"\"\n    Class to transform a list of smiles using a transform function\n    \"\"\"\n\n    def __init__(self, transform: Callable):\n        \"\"\"\n        Parameters:\n            transform: Callable function to transform a single smiles\n        \"\"\"\n        self.transform = transform\n\n    def __call__(self, smiles_list: Iterable[str]) -> Any:\n        \"\"\"\n        Function to transform a list of smiles\n        
\"\"\"\n        mol_id_list = []\n        for smiles in smiles_list:\n            mol_id_list.append(self.transform(smiles))\n        return mol_id_list\n\n    @staticmethod\n    def parse_batch_size(numel: int, desired_batch_size: int, n_jobs: int) -> int:\n        \"\"\"\n        Function to parse the batch size.\n        The batch size is limited by the number of elements divided by the number of jobs.\n        \"\"\"\n        assert ((n_jobs >= 0) or (n_jobs == -1)) and isinstance(\n            n_jobs, int\n        ), f\"n_jobs must be a positive integer or -1, got {n_jobs}\"\n        assert (\n            isinstance(desired_batch_size, int) and desired_batch_size >= 0\n        ), f\"desired_batch_size must be a positive integer, got {desired_batch_size}\"\n\n        if n_jobs == -1:\n            n_jobs = os.cpu_count()\n        if (n_jobs == 0) or (n_jobs == 1):\n            batch_size = 1\n        else:\n            batch_size = min(desired_batch_size, numel // n_jobs)\n        batch_size = max(1, batch_size)\n        return batch_size\n\n\ndef smiles_to_unique_mol_ids(\n    smiles: Iterable[str],\n    n_jobs=-1,\n    featurization_batch_size=1000,\n    backend=\"loky\",\n    progress=True,\n    progress_desc=\"mols to ids\",\n) -> List[Optional[str]]:\n    \"\"\"\n    This function takes a list of smiles and finds the corresponding datamol unique_id\n    in an element-wise fashion, returning the corresponding unique_ids.\n\n    The ID is an MD5 hash of the non-standard InChiKey provided\n    by `dm.to_inchikey_non_standard()`. 
It guarantees uniqueness for\n    different tautomeric forms of the same molecule.\n\n    Parameters:\n        smiles: a list of smiles to be converted to mol ids\n        n_jobs: number of jobs to run in parallel\n        backend: Parallelization backend\n        progress: Whether to display the progress bar\n\n    Returns:\n        ids: A list of MD5 hash ids\n    \"\"\"\n\n    batch_size = BatchingSmilesTransform.parse_batch_size(\n        numel=len(smiles), desired_batch_size=featurization_batch_size, n_jobs=n_jobs\n    )\n\n    unique_mol_ids = dm.parallelized_with_batches(\n        BatchingSmilesTransform(smiles_to_unique_mol_id),\n        smiles,\n        batch_size=batch_size,\n        progress=progress,\n        n_jobs=n_jobs,\n        backend=backend,\n        tqdm_kwargs={\"desc\": f\"{progress_desc}, batch={batch_size}\"},\n    )\n\n    return unique_mol_ids\n"
  },
  {
    "path": "graphium/data/utils.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Union, List, Callable, Dict, Tuple, Any, Optional\n\nimport importlib.resources\nimport zipfile\n\nfrom loguru import logger\n\nimport pandas as pd\nimport numpy as np\n\nimport graphium\n\nfrom torch_geometric.data import Data\nfrom graphium.features.featurizer import GraphDict\n\nGRAPHIUM_DATASETS_BASE_URL = \"https://storage.valencelabs.com/graphium/datasets\"\nGRAPHIUM_DATASETS = {\n    \"graphium-zinc-micro\": \"zinc-micro.zip\",\n    \"graphium-zinc-bench-gnn\": \"zinc-bench-gnn.zip\",\n}\n\n\ndef load_micro_zinc() -> pd.DataFrame:\n    \"\"\"\n    Return a dataframe of micro ZINC (1000 data points).\n    Returns:\n        pd.DataFrame: A dataframe of micro ZINC.\n    \"\"\"\n\n    with importlib.resources.open_text(\"graphium.data.micro_ZINC\", \"micro_ZINC.csv\") as f:\n        df = pd.read_csv(f)\n\n    return df  # type: ignore\n\n\ndef load_tiny_zinc() -> pd.DataFrame:\n    \"\"\"\n    Return a dataframe of tiny ZINC (100 data points).\n    Returns:\n        pd.DataFrame: A dataframe of tiny ZINC.\n    \"\"\"\n\n    with importlib.resources.open_text(\"graphium.data.micro_ZINC\", \"micro_ZINC.csv\") as f:\n        df = pd.read_csv(f, nrows=100)\n\n    return df  # type: ignore\n\n\ndef graphium_package_path(graphium_path: str) -> str:\n    \"\"\"Return the path of a graphium file 
in the package.\"\"\"\n\n    assert graphium_path.startswith(\n        \"graphium://\"\n    ), f\"Invalid graphium path, must start with 'graphium://': {graphium_path}\"\n\n    graphium_path = graphium_path.replace(\"graphium://\", \"\")\n    package, ressource = graphium_path.split(\"/\")\n    with importlib.resources.path(package, ressource) as data_path:\n        pass\n    return str(data_path)\n\n\ndef list_graphium_datasets() -> set:\n    \"\"\"\n    List Graphium datasets available to download.\n    Returns:\n        set: A set of Graphium dataset names.\n    \"\"\"\n    return set(GRAPHIUM_DATASETS.keys())\n\n\ndef download_graphium_dataset(\n    name: str, output_path: str, extract_zip: bool = True, progress: bool = False\n) -> str:\n    r\"\"\"Download a Graphium dataset to a specified location.\n\n    Args:\n        name: Name of the Graphium dataset from `graphium.data.utils.get_graphium_datasets()`.\n        output_path: Directory path where to download the dataset to.\n        extract_zip: Whether to extract the dataset if it's a zip file.\n        progress: Whether to show a progress bar during download.\n\n    Returns:\n        str: Path to the downloaded dataset.\n    \"\"\"\n\n    if name not in GRAPHIUM_DATASETS:\n        raise ValueError(f\"'{name}' is not a valid Graphium dataset name. 
Choose from {GRAPHIUM_DATASETS}\")\n\n    fname = GRAPHIUM_DATASETS[name]\n\n    dataset_path_source = graphium.utils.fs.join(GRAPHIUM_DATASETS_BASE_URL, fname)\n    dataset_path_destination = graphium.utils.fs.join(output_path, fname)\n\n    if not graphium.utils.fs.exists(dataset_path_destination):\n        graphium.utils.fs.copy(dataset_path_source, dataset_path_destination, progress=progress)\n\n        if extract_zip and str(dataset_path_destination).endswith(\".zip\"):\n            # Unzip the dataset\n            with zipfile.ZipFile(dataset_path_destination, \"r\") as zf:\n                zf.extractall(output_path)\n\n    if extract_zip:\n        # Set the destination path to the folder\n        # NOTE(hadim): this is a bit fragile.\n        dataset_path_destination = dataset_path_destination.split(\".\")[0]\n\n    return dataset_path_destination\n\n\ndef get_keys(pyg_data):\n    if isinstance(type(pyg_data).keys, property):\n        return pyg_data.keys\n    else:\n        return pyg_data.keys()\n\n\ndef found_size_mismatch(task: str, features: Union[Data, GraphDict], labels: np.ndarray, smiles: str) -> bool:\n    \"\"\"Check if a size mismatch exists between features and labels with respect to node/edge/nodepair.\n\n    Args:\n        task: The task name is needed to determine the task level (graph, node, edge or nodepair)\n        features: Features/information of molecule/graph (e.g., edge_index, feat, edge_feat, num_nodes, etc.)\n        labels: Target label of molecule for the task\n        smiles: Smiles string of molecule\n\n    Returns:\n        mismatch: Boolean variable indicating if a size mismatch was found between featurs and labels.\n    \"\"\"\n\n    mismatch = False\n\n    if np.isnan(labels).any():\n        pass\n\n    elif task.startswith(\"graph_\"):\n        pass\n\n    elif task.startswith(\"node_\"):\n        if labels.shape[0] != features.num_nodes:\n            mismatch = True\n            logger.warning(\n                (\n        
            f\"Inconsistent number of nodes between labels and features in {task} task for {smiles}: {labels.shape[0]} vs {features.num_nodes}\"\n                )\n            )\n\n    elif task.startswith(\"edge_\"):\n        if labels.shape[0] != features.num_edges:\n            mismatch = True\n            logger.warning(\n                (\n                    f\"Inconsistent number of edges between labels and features in {task} task for {smiles}: {labels.shape[0]} vs {features.num_edges}\"\n                )\n            )\n\n    elif task.startswith(\"nodepair_\"):\n        if list(labels.shape[:2]) != 2 * [features.num_nodes]:\n            mismatch = True\n            logger.warning(\n                (\n                    f\"Inconsistent shape of nodepairs between labels and features in {task} task for {smiles}: {list(labels.shape[:2])} vs {2 * [features.num_nodes]}\"\n                )\n            )\n\n    else:\n        raise ValueError(\"Unkown task level\")\n\n    return mismatch\n"
  },
  {
    "path": "graphium/expts/pyg_batching_sparse.ipynb",
    "content": "{\n \"cells\": [\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 4,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"torch version:  1.13.0+cpu\\n\",\n      \"pyg version:  2.3.0.dev20230306\\n\",\n      \"batch.x =  tensor(indices=tensor([[0, 1, 1, 2, 3, 4],\\n\",\n      \"                       [1, 0, 1, 1, 0, 1]]),\\n\",\n      \"       values=tensor([1, 2, 3, 4, 5, 6]),\\n\",\n      \"       size=(5, 2), nnz=6, layout=torch.sparse_coo)\\n\",\n      \"Data(x=[2, 2])\\n\",\n      \"Data(x=[2, 2])\\n\",\n      \"[Data(x=[2, 2]), Data(x=[3, 2])]\\n\",\n      \"[tensor(indices=tensor([[0, 1, 1],\\n\",\n      \"                       [1, 0, 1]]),\\n\",\n      \"       values=tensor([1, 2, 3]),\\n\",\n      \"       size=(2, 2), nnz=3, layout=torch.sparse_coo), tensor(indices=tensor([[0, 1, 2],\\n\",\n      \"                       [1, 0, 1]]),\\n\",\n      \"       values=tensor([4, 5, 6]),\\n\",\n      \"       size=(3, 2), nnz=3, layout=torch.sparse_coo)]\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"import torch\\n\",\n    \"import torch_geometric\\n\",\n    \"from torch_geometric.data import Data, Batch\\n\",\n    \"\\n\",\n    \"data1 = Data(x=torch.sparse_coo_tensor(torch.tensor([[0, 1, 1], [1, 0, 1]]), torch.tensor([1, 2, 3]), (2, 2)))\\n\",\n    \"data2 = Data(x=torch.sparse_coo_tensor(torch.tensor([[0, 1, 2], [1, 0, 1]]), torch.tensor([4, 5, 6]), (3, 2)))\\n\",\n    \"batch = Batch.from_data_list([data1, data2])\\n\",\n    \"\\n\",\n    \"print(\\\"torch version: \\\", torch.__version__)\\n\",\n    \"print(\\\"pyg version: \\\", torch_geometric.__version__)\\n\",\n    \"print(\\\"batch.x = \\\", batch.x) # WORKS\\n\",\n    \"print(batch.get_example(0)) # FAILS\\n\",\n    \"print(batch[0]) # FAILS\\n\",\n    \"print(batch.to_data_list())\\n\",\n    \"print([b.x for b in batch.to_data_list()])\"\n   ]\n  },\n  {\n   \"cell_type\": 
\"code\",\n   \"execution_count\": 10,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/plain\": [\n       \"tensor(indices=tensor([[0, 1, 1, 2],\\n\",\n       \"                       [1, 0, 1, 0]]),\\n\",\n       \"       values=tensor([1, 2, 3, 5]),\\n\",\n       \"       size=(3, 2), nnz=4, layout=torch.sparse_coo)\"\n      ]\n     },\n     \"execution_count\": 10,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"batch.x.index_select(dim=0, index=torch.tensor([0, 1, 3]))\"\n   ]\n  }\n ],\n \"metadata\": {\n  \"kernelspec\": {\n   \"display_name\": \"graphium_ipu\",\n   \"language\": \"python\",\n   \"name\": \"python3\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.8.10\"\n  },\n  \"orig_nbformat\": 4\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 2\n}\n"
  },
  {
    "path": "graphium/features/README.md",
    "content": "<div align=\"center\">\n    <img src=\"../../docs/images/logo-title.png\" height=\"80px\">\n    <h3>The Graph Of LIfe Library.</h3>\n</div>\n\n\n## What is in this folder? \n\n- ✅ `featurizer.py`: featurization code for the molecules, adding node, edge and graph features to the mol object\n- `nmp.py`: check if a string can be converted to float, helper function for featurization\n- `positional_encoding.py`: code for computing all raw positional and structural encoding of the graph, see `graph_positional_encoder` function\n- `properties.py`: code for computing properties of the molecule\n- `rw.py`: code for computing random walk positional encoding\n- `spectral.py`: code for computing the spectral positional encoding such as the Laplacian eigenvalues and eigenvectors"
  },
  {
    "path": "graphium/features/__init__.py",
    "content": "from .featurizer import get_mol_atomic_features_onehot\nfrom .featurizer import get_mol_atomic_features_float\nfrom .featurizer import get_mol_edge_features\nfrom .featurizer import mol_to_adj_and_features\nfrom .featurizer import mol_to_graph_dict\nfrom .featurizer import mol_to_graph_signature\nfrom .featurizer import GraphDict\nfrom .featurizer import mol_to_pyggraph\nfrom .featurizer import to_dense_array\n"
  },
  {
    "path": "graphium/features/commute.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Tuple, Union, Dict, Any\n\nimport numpy as np\n\nfrom scipy.sparse import spmatrix, issparse\nfrom scipy.linalg import pinv\n\n\ndef compute_commute_distances(\n    adj: Union[np.ndarray, spmatrix], num_nodes: int, cache: Dict[str, Any]\n) -> Tuple[np.ndarray, str, Dict[str, Any]]:\n    \"\"\"\n    Compute avg. commute time/distance between nodepairs. This is the avg. number of steps a random walker, starting\n    at node i, will take before reaching a given node j for the first time, and then return to node i.\n\n    Reference: Saerens et al. \"The principal components analysis of a graph, and its relationships to spectral clustering.\" ECML. 2004.\n\n    Parameters:\n        adj [num_nodes, num_nodes]: Adjacency matrix\n        num_nodes: Number of nodes in the graph\n        cache: Dictionary of cached objects\n    Returns:\n        dist [num_nodes, num_nodes]: 2D array with avg. 
commute distances between nodepairs\n        base_level: Indicator of the output pos_level (node, edge, nodepair, graph) -> here nodepair\n        cache: Updated dictionary of cached objects\n    \"\"\"\n\n    base_level = \"nodepair\"\n\n    if \"commute\" in cache:\n        dist = cache[\"commute\"]\n\n    else:\n        if issparse(adj):\n            adj = adj.toarray()\n\n        volG = adj.sum()\n\n        if \"pinvL\" in cache:\n            pinvL = cache[\"pinvL\"]\n\n        else:\n            L = np.diagflat(np.sum(adj, axis=1)) - adj\n            pinvL = pinv(L)\n            cache[\"pinvL\"] = pinvL\n\n        dist = volG * np.asarray(\n            [\n                [pinvL[i, i] + pinvL[j, j] - 2 * pinvL[i, j] for j in range(num_nodes)]\n                for i in range(num_nodes)\n            ]\n        )\n        cache[\"commute\"] = dist\n\n    return dist, base_level, cache\n"
  },
  {
    "path": "graphium/features/electrostatic.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Tuple, Union, Dict, Any\n\nimport numpy as np\n\nfrom scipy.linalg import pinv\nfrom scipy.sparse import spmatrix, issparse\n\n\ndef compute_electrostatic_interactions(\n    adj: Union[np.ndarray, spmatrix], cache: Dict[str, Any]\n) -> Tuple[np.ndarray, str, Dict[str, Any]]:\n    \"\"\"\n    Compute electrostatic interaction of nodepairs.\n\n    Parameters:\n        adj [num_nodes, num_nodes]: Adjacency matrix\n        cache: Dictionary of cached objects\n    Returns:\n        electrostatic [num_nodes, num_nodes]: 2D array with electrostatic interactions of node nodepairs\n        base_level: Indicator of the output pos_level (node, edge, nodepair, graph) -> here nodepair\n        cache: Updated dictionary of cached objects\n    \"\"\"\n\n    base_level = \"nodepair\"\n\n    if \"electrostatic\" in cache:\n        electrostatic = cache[\"electrostatic\"]\n\n    else:\n        if \"pinvL\" in cache:\n            pinvL = cache[\"pinvL\"]\n\n        else:\n            if issparse(adj):\n                adj = adj.toarray()\n\n            L = np.diagflat(np.sum(adj, axis=1)) - adj\n            pinvL = pinv(L)\n            cache[\"pinvL\"] = pinvL\n\n        electrostatic = pinvL - np.diag(pinvL)  # This means that the \"ground\" is set to any given atom\n        cache[\"electrostatic\"] = 
electrostatic\n\n    return electrostatic, base_level, cache\n"
  },
  {
    "path": "graphium/features/featurizer.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Union, List, Callable, Dict, Tuple, Any, Optional\n\nimport inspect\nfrom loguru import logger\nimport numpy as np\nfrom scipy.sparse import issparse, coo_matrix\nimport torch\nfrom torch import Tensor\n\nfrom torch_geometric.data import Data\n\nfrom rdkit import Chem\nimport datamol as dm\n\nfrom graphium.features import nmp\nfrom graphium.utils.tensor import one_of_k_encoding\nfrom graphium.features.positional_encoding import get_all_positional_encodings\n\n\ndef to_dense_array(array: np.ndarray, dtype: str = None) -> np.ndarray:\n    r\"\"\"\n    Assign the node data\n    Parameters:\n        array: The array to convert to dense\n        dtype: The dtype of the array\n    Returns:\n        The dense array\n    \"\"\"\n    if array is not None:\n        if issparse(array):\n            if array.dtype == np.float16:  # float16 doesn't support `todense`\n                array = array.astype(np.float32)\n            array = array.todense()\n\n        if dtype is not None:\n            array = array.astype(dtype)\n    return array\n\n\ndef to_dense_tensor(tensor: Tensor, dtype: str = None) -> Tensor:\n    r\"\"\"\n    Assign the node data\n    Parameters:\n        array: The array to convert to dense\n        dtype: The dtype of the array\n    Returns:\n        The dense array\n    \"\"\"\n    
if tensor is not None:\n        if tensor.is_sparse:\n            tensor = tensor.todense()\n        if dtype is not None:\n            tensor = tensor.to(dtype)\n    return tensor\n\n\ndef _mask_nans_inf(mask_nan: Optional[str], array: np.ndarray, array_name: str) -> np.ndarray:\n    r\"\"\"\n    mask the NaNs in the array\n    Parameters:\n        mask_nan: How to mask the NaNs\n        array: The array to mask\n        array_name: The name of the array\n    Returns:\n        The masked array\n    \"\"\"\n    if (mask_nan is None) or (array is None):\n        return array\n\n    new_array = array\n    if issparse(new_array):\n        new_array = new_array.data\n    nans = ~np.isfinite(new_array)\n\n    # Mask the NaNs\n    if nans.any():\n        msg = f\"There are {np.sum(nans)} NaNs in `{array_name}`\"\n        if mask_nan == \"raise\":\n            raise ValueError(msg)\n        elif mask_nan == \"warn\":\n            logger.warning(msg)\n        else:\n            new_array[nans] = mask_nan\n            if issparse(array):\n                array.data = new_array\n                new_array = array\n    return new_array\n\n\ndef get_mol_atomic_features_onehot(mol: dm.Mol, property_list: List[str]) -> Dict[str, Tensor]:\n    r\"\"\"\n    Get the following set of features for any given atom\n\n    * One-hot representation of the atom\n    * One-hot representation of the atom degree\n    * One-hot representation of the atom implicit valence\n    * One-hot representation of the the atom hybridization\n    * Whether the atom is aromatic\n    * The atom's formal charge\n    * The atom's number of radical electrons\n\n    Additionally, the following features can be set, depending on the value of input Parameters\n\n    * One-hot representation of the number of hydrogen atom in the the current atom neighborhood if `explicit_H` is false\n    * One-hot encoding of the atom chirality, and whether such configuration is even possible\n\n    Parameters:\n\n        mol:\n     
       molecule from which to extract the properties\n\n        property_list:\n            A list of integer atomic properties to get from the molecule.\n            The integer values are converted to a one-hot vector.\n            Callables are not supported by this function.\n\n            Accepted properties are:\n\n            - \"atomic-number\"\n            - \"degree\"\n            - \"valence\", \"total-valence\"\n            - \"implicit-valence\"\n            - \"hybridization\"\n            - \"chirality\"\n            - \"phase\"\n            - \"type\"\n            - \"group\"\n            - \"period\"\n\n    Returns:\n        prop_dict:\n            A dictionary where the elements of ``property_list`` are the keys\n            and the values are np.ndarray of shape (N, OH). N is the number of atoms\n            in ``mol`` and OH the length of the one-hot encoding.\n\n    \"\"\"\n\n    prop_dict = {}\n\n    for prop in property_list:\n        prop = prop.lower()\n        prop_name = prop\n\n        property_array = []\n        for ii, atom in enumerate(mol.GetAtoms()):\n            if prop in [\"atomic-number\"]:\n                one_hot = one_of_k_encoding(atom.GetSymbol(), nmp.ATOM_LIST)\n            elif prop in [\"degree\"]:\n                one_hot = one_of_k_encoding(atom.GetDegree(), nmp.ATOM_DEGREE_LIST)\n            elif prop in [\"valence\", \"total-valence\"]:\n                prop_name = \"valence\"\n                one_hot = one_of_k_encoding(atom.GetTotalValence(), nmp.VALENCE)\n            elif prop in [\"implicit-valence\"]:\n                one_hot = one_of_k_encoding(atom.GetImplicitValence(), nmp.VALENCE)\n            elif prop in [\"hybridization\"]:\n                one_hot = one_of_k_encoding(atom.GetHybridization(), nmp.HYBRIDIZATION_LIST)\n            elif prop in [\"chirality\"]:\n                try:\n                    one_hot = one_of_k_encoding(atom.GetProp(\"_CIPCode\"), nmp.CHIRALITY_LIST)\n                    one_hot.append(int(atom.HasProp(\"_ChiralityPossible\")))\n                except Exception:\n                    one_hot = [0, 0, int(atom.HasProp(\"_ChiralityPossible\"))]\n            elif prop == \"phase\":\n                one_hot = one_of_k_encoding(nmp.PHASE[atom.GetAtomicNum() - 1], nmp.PHASE_SET)\n            elif prop == \"type\":\n                one_hot = one_of_k_encoding(nmp.TYPE[atom.GetAtomicNum() - 1], nmp.TYPE_SET)\n            elif prop == \"group\":\n                one_hot = one_of_k_encoding(nmp.GROUP[atom.GetAtomicNum() - 1], nmp.GROUP_SET)\n            elif prop == \"period\":\n                one_hot = one_of_k_encoding(nmp.PERIOD[atom.GetAtomicNum() - 1], nmp.PERIOD_SET)\n            else:\n                raise ValueError(f\"Unsupported property `{prop}`\")\n\n            property_array.append(np.asarray(one_hot, dtype=np.float16))\n\n        prop_dict[prop_name] = np.stack(property_array, axis=0)\n\n    return prop_dict\n\n\ndef get_mol_conformer_features(\n    mol: dm.Mol,\n    property_list: Union[List[str], List[Callable]],\n    mask_nan: Optional[Union[float, str]] = None,\n) -> Dict[str, np.ndarray]:\n    r\"\"\"Obtain the conformer features of a molecule\n    Parameters:\n\n        mol:\n            molecule from which to extract the properties\n\n        property_list:\n            A list of conformer properties to get from the molecule\n            Accepted properties are:\n            - \"positions_3d\"\n\n    Returns:\n        prop_dict: a dictionary where the elements of ``property_list`` are the keys\n    \"\"\"\n    prop_dict = {}\n    has_conf = True\n\n    try:\n        mol.GetConformer()\n    except Exception:\n        has_conf = False\n    # * currently only accepts \"positions_3d\", raise errors otherwise\n    for prop in property_list:\n        if isinstance(prop, str):\n            if prop in [\"positions_3d\"]:  # locating 3d conformer coordinates\n                if not has_conf:\n                    positions = 
np.full((mol.GetNumAtoms(), 3), float(\"nan\"), dtype=np.float16)\n                else:\n                    positions = [[], [], []]\n                    for i in range(mol.GetNumAtoms()):\n                        pos = mol.GetConformer().GetAtomPosition(i)\n                        positions[0].append(pos.x)\n                        positions[1].append(pos.y)\n                        positions[2].append(pos.z)\n                    positions = np.asarray(positions, dtype=np.float16).T\n                prop_dict[prop] = positions\n            else:\n                raise ValueError(\n                    str(prop) + \" is not currently supported as a conformer property in `property_list`\"\n                )\n        else:\n            raise ValueError(f\"Elements in `property_list` must be str or callable, provided `{type(prop)}`\")\n\n        prop_dict[prop] = _mask_nans_inf(mask_nan, prop_dict[prop], prop)\n\n    return prop_dict\n\n\ndef get_mol_atomic_features_float(\n    mol: dm.Mol,\n    property_list: Union[List[str], List[Callable]],\n    offset_carbon: bool = True,\n    mask_nan: Union[str, float, type(None)] = \"raise\",\n) -> Dict[str, np.ndarray]:\n    r\"\"\"\n    Get a dictionary of floating-point arrays of atomic properties.\n    To ensure all properties are at a similar scale, some of the properties\n    are divided by a constant.\n\n    There is also the possibility of offsetting by the carbon value using\n    the `offset_carbon` parameter.\n\n    Parameters:\n\n        mol:\n            molecule from which to extract the properties\n\n        property_list:\n            A list of atomic properties to get from the molecule, such as 'atomic-number',\n            'mass', 'valence', 'degree', 'electronegativity'.\n            Some elements are divided by a factor to avoid feature explosion.\n\n            Accepted properties are:\n\n            - \"atomic-number\"\n            - \"mass\", \"weight\"\n            - \"valence\", \"total-valence\"\n            - \"implicit-valence\"\n            - \"hybridization\"\n            - \"chirality\"\n            - \"aromatic\"\n            - \"ring\", \"in-ring\"\n            - \"min-ring\"\n            - \"max-ring\"\n            - \"num-ring\"\n            - \"degree\"\n            - \"radical-electron\"\n            - \"formal-charge\"\n            - \"vdw-radius\"\n            - \"covalent-radius\"\n            - \"electronegativity\"\n            - \"ionization\", \"first-ionization\"\n            - \"melting-point\"\n            - \"metal\"\n            - \"single-bond\"\n            - \"aromatic-bond\"\n            - \"double-bond\"\n            - \"triple-bond\"\n            - \"is-carbon\"\n            - \"group\"\n            - \"period\"\n\n        offset_carbon:\n            Whether to subtract the Carbon property from the desired atomic property.\n            For example, if we want the mass of Lithium (6.941), the mass of\n            Carbon (12.0107) will be subtracted, resulting in a value of -5.0697\n\n        mask_nan:\n            Deal with molecules that fail a part of the featurization.\n            NaNs can happen when taking the electronegativity of a noble gas,\n            or other properties that are not measured for specific atoms.\n\n            - \"raise\": DEFAULT. Raise an error when there is a nan or inf in the featurization\n            - \"warn\": Raise a warning when there is a nan or inf in the featurization\n            - \"None\": Don't do anything\n            - \"Floating value\": Replace nans or inf by the specified value\n\n    Returns:\n\n        prop_dict:\n            A dictionary where the elements of ``property_list`` are the keys\n            and the values are np.ndarray of shape (N,). 
N is the number of atoms\n            in ``mol``.\n\n    \"\"\"\n\n    periodic_table = Chem.GetPeriodicTable()\n    prop_dict = {}\n    C = Chem.Atom(\"C\")\n    C_num = C.GetAtomicNum()\n    offC = bool(offset_carbon)\n    atom_list = list(mol.GetAtoms())\n\n    for prop in property_list:\n        prop_name = None\n\n        property_array = np.zeros(mol.GetNumAtoms(), dtype=np.float16)\n        for ii, atom in enumerate(atom_list):\n            val = None\n            atomic_num = atom.GetAtomicNum()\n\n            if isinstance(prop, str):\n                prop = prop.lower()\n                prop_name = prop\n\n                if prop in [\"atomic-number\"]:\n                    val = (atomic_num - (offC * C_num)) / 5\n                elif prop in [\"mass\", \"weight\"]:\n                    prop_name = \"mass\"\n                    val = (atom.GetMass() - (offC * C.GetMass())) / 10\n                elif prop in [\"valence\", \"total-valence\"]:\n                    prop_name = \"valence\"\n                    val = atom.GetTotalValence() - (offC * 4)\n                elif prop in [\"implicit-valence\"]:\n                    val = atom.GetImplicitValence()\n                elif prop in [\"hybridization\"]:\n                    val = atom.GetHybridization()\n                elif prop in [\"chirality\"]:\n                    val = (atom.GetProp(\"_CIPCode\") == \"R\") if atom.HasProp(\"_CIPCode\") else 2\n                elif prop in [\"aromatic\"]:\n                    val = atom.GetIsAromatic()\n                elif prop in [\"ring\", \"in-ring\"]:\n                    prop_name = \"in-ring\"\n                    val = atom.IsInRing()\n                elif prop in [\"min-ring\"]:\n                    ring_info = mol.GetRingInfo()\n                    val = ring_info.MinAtomRingSize(atom.GetIdx())\n                elif prop in [\"max-ring\"]:\n                    rings = mol.GetRingInfo().AtomRings()\n                    val = 0\n                    for ring in rings:\n                        if atom.GetIdx() in ring:\n                            if len(ring) > val:\n                                val = len(ring)\n                elif prop in [\"num-ring\"]:\n                    ring_info = mol.GetRingInfo()\n                    val = ring_info.NumAtomRings(atom.GetIdx())\n                elif prop in [\"degree\"]:\n                    val = atom.GetTotalDegree() - (offC * 2)\n                elif prop in [\"radical-electron\"]:\n                    val = atom.GetNumRadicalElectrons()\n                elif prop in [\"formal-charge\"]:\n                    val = atom.GetFormalCharge()\n                elif prop in [\"vdw-radius\"]:\n                    val = periodic_table.GetRvdw(atom.GetAtomicNum()) - offC * periodic_table.GetRvdw(C_num)\n                elif prop in [\"covalent-radius\"]:\n                    val = periodic_table.GetRcovalent(atomic_num) - offC * periodic_table.GetRcovalent(C_num)\n                elif prop in [\"electronegativity\"]:\n                    val = (\n                        nmp.ELECTRONEGATIVITY[atom.GetAtomicNum() - 1]\n                        - offC * nmp.ELECTRONEGATIVITY[C_num - 1]\n                    )\n                elif prop in [\"ionization\", \"first-ionization\"]:\n                    prop_name = \"ionization\"\n                    val = (nmp.FIRST_IONIZATION[atomic_num - 1] - offC * nmp.FIRST_IONIZATION[C_num - 1]) / 5\n                elif prop in [\"melting-point\"]:\n                    val = (nmp.MELTING_POINT[atomic_num - 1] - offC * nmp.MELTING_POINT[C_num - 1]) / 200\n                elif prop in [\"metal\"]:\n                    val = nmp.METAL[atomic_num - 1]\n                elif prop == \"group\":\n                    val = float(nmp.GROUP[atomic_num - 1]) - offC * float(nmp.GROUP[C_num - 1])\n                elif prop == \"period\":\n                    val = 
float(nmp.PERIOD[atomic_num - 1]) - offC * float(nmp.PERIOD[C_num - 1])\n                elif \"-bond\" in prop:\n                    bonds = [bond.GetBondTypeAsDouble() for bond in atom.GetBonds()]\n                    if prop in [\"single-bond\"]:\n                        val = len([bond == 1 for bond in bonds])\n                    elif prop in [\"aromatic-bond\"]:\n                        val = len([bond == 1.5 for bond in bonds])\n                    elif prop in [\"double-bond\"]:\n                        val = len([bond == 2 for bond in bonds])\n                    elif prop in [\"triple-bond\"]:\n                        val = len([bond == 3 for bond in bonds])\n                    else:\n                        raise ValueError(f\"{prop} is not a correct bond.\")\n                    val -= offC * 1\n                elif prop in [\"is-carbon\"]:\n                    val = atom.GetAtomicNum() == 6\n                    val -= offC * 1\n                else:\n                    raise ValueError(f\"Unsupported property `{prop}`\")\n\n            elif callable(prop):\n                prop_name = str(prop)\n                val = prop(atom)\n            else:\n                ValueError(f\"Elements in `property_list` must be str or callable, provided `{type(prop)}`\")\n\n            if val is None:\n                raise ValueError(\"val is undefined.\")\n\n            property_array[ii] = val\n\n        if prop_name is None:\n            raise ValueError(\"prop_name is undefined.\")\n\n        # Mask the NaNs\n        prop_dict[prop_name] = _mask_nans_inf(mask_nan, property_array, \"atom featurization\")\n\n    return prop_dict\n\n\ndef get_simple_mol_conformer(mol: dm.Mol) -> Union[Chem.rdchem.Conformer, None]:\n    r\"\"\"\n    If the molecule has a conformer, then it will return the conformer at idx `0`.\n    Otherwise, it generates a simple molecule conformer using `rdkit.Chem.rdDistGeom.EmbedMolecule`\n    and returns it. 
This is meant to be used in simple functions like `GetBondLength`,\n    not in functions requiring complex 3D structure.\n\n    Parameters:\n\n        mol: Rdkit Molecule\n\n    Returns:\n        conf: A conformer of the molecule, or `None` if it fails\n    \"\"\"\n\n    val = 0\n    if mol.GetNumConformers() == 0:\n        val = Chem.rdDistGeom.EmbedMolecule(mol)\n    if val == -1:\n        val = Chem.rdDistGeom.EmbedMolecule(\n            mol,\n            enforceChirality=False,\n            ignoreSmoothingFailures=True,\n            useBasicKnowledge=True,\n            useExpTorsionAnglePrefs=True,\n            forceTol=0.1,\n        )\n\n    if val == -1:\n        conf = None\n        logger.warning(\"Couldn't compute conformer for molecule `{}`\".format(Chem.MolToSmiles(mol)))\n    else:\n        conf = mol.GetConformer(0)\n\n    return conf\n\n\ndef get_estimated_bond_length(bond: Chem.rdchem.Bond, mol: dm.Mol) -> float:\n    r\"\"\"\n    Estimate the bond length between atoms by looking at the estimated atomic radius\n    that depends both on the atom type and the bond type. The resulting bond-length is\n    then the sum of the radii.\n\n    Keep in mind that this function only provides an estimate of the bond length and not\n    the true one based on a conformer. The vast majority of estimated bond lengths will\n    have an error below 5% while some bonds can have an error up to 20%. This function\n    is mostly useful when conformer generation fails for some molecules, or for\n    increased computation speed.\n\n    Parameters:\n        bond: The bond to measure its length\n        mol: The molecule containing the bond (used to get neighbouring atoms)\n\n    Returns:\n        bond_length: The bond length in Angstrom, typically a value around 1-2.\n\n    \"\"\"\n\n    # Get the atoms connected by the bond\n    idx1 = bond.GetBeginAtomIdx()\n    idx2 = bond.GetEndAtomIdx()\n    atom1 = mol.GetAtomWithIdx(idx1).GetAtomicNum()\n    atom2 = mol.GetAtomWithIdx(idx2).GetAtomicNum()\n    bond_type = bond.GetBondType()\n\n    # Get single bond atomic radius\n    if bond_type == Chem.rdchem.BondType.SINGLE:\n        rad1 = [nmp.BOND_RADIUS_SINGLE[atom1 - 1]]\n        rad2 = [nmp.BOND_RADIUS_SINGLE[atom2 - 1]]\n    # Get double bond atomic radius\n    elif bond_type == Chem.rdchem.BondType.DOUBLE:\n        rad1 = [nmp.BOND_RADIUS_DOUBLE[atom1 - 1]]\n        rad2 = [nmp.BOND_RADIUS_DOUBLE[atom2 - 1]]\n    # Get triple bond atomic radius\n    elif bond_type == Chem.rdchem.BondType.TRIPLE:\n        rad1 = [nmp.BOND_RADIUS_TRIPLE[atom1 - 1]]\n        rad2 = [nmp.BOND_RADIUS_TRIPLE[atom2 - 1]]\n    # Get average of single bond and double bond atomic radius\n    elif bond_type == Chem.rdchem.BondType.AROMATIC:\n        rad1 = [nmp.BOND_RADIUS_SINGLE[atom1 - 1], nmp.BOND_RADIUS_DOUBLE[atom1 - 1]]\n        rad2 = [nmp.BOND_RADIUS_SINGLE[atom2 - 1], nmp.BOND_RADIUS_DOUBLE[atom2 - 1]]\n    else:\n        raise ValueError(f\"Unsupported bond type `{bond_type}`\")\n\n    # Average the radii, while ignoring `None` entries in case of missing values\n    rad1_float = [elem for elem in rad1 if elem is not None]\n    rad2_float = [elem for elem in rad2 if elem is not None]\n\n    if len(rad1_float) > 0:\n        rad1_float = sum(rad1_float) / len(rad1_float)\n    else:\n        rad1_float = float(nmp.BOND_RADIUS_SINGLE[atom1 - 1])\n\n    if len(rad2_float) > 0:\n        rad2_float = sum(rad2_float) / len(rad2_float)\n    else:\n        rad2_float = 
float(nmp.BOND_RADIUS_SINGLE[atom2 - 1])\n\n    bond_length = rad1_float + rad2_float\n    return bond_length\n\n\ndef get_mol_edge_features(\n    mol: dm.Mol, property_list: List[str], mask_nan: Union[str, float, type(None)] = \"raise\"\n) -> Dict[str, np.ndarray]:\n    r\"\"\"\n    Get the following set of features for any given bond.\n    See `graphium.features.nmp` for allowed values in one hot encoding\n\n    * One-hot representation of the bond type. Note that you should not kekulize your\n        molecules if you expect this to take aromatic bonds into account.\n    * Bond stereo type, following CIP classification\n    * Whether the bond is conjugated\n    * Whether the bond is in a ring\n\n    Parameters:\n        mol: rdkit.Chem.Molecule\n            the molecule of interest\n\n        property_list:\n            A list of edge properties to return for the given molecule.\n            Accepted properties are:\n\n            - \"bond-type-onehot\"\n            - \"bond-type-float\"\n            - \"stereo\"\n            - \"in-ring\"\n            - \"conjugated\"\n            - \"conformer-bond-length\" (might cause problems with complex molecules)\n            - \"estimated-bond-length\"\n\n    Returns:\n        prop_dict:\n            A dictionary where the elements of ``property_list`` are the keys\n            and the values are np.ndarray of shape (N,). N is the number of bonds\n            in ``mol``.\n\n    \"\"\"\n\n    prop_dict = {}\n\n    # Compute features for each bond\n    num_bonds = mol.GetNumBonds()\n    for prop in property_list:\n        property_array = []\n        for ii in range(num_bonds):\n            prop = prop.lower()\n            bond = mol.GetBondWithIdx(ii)\n\n            if prop in [\"bond-type-onehot\"]:\n                encoding = one_of_k_encoding(bond.GetBondType(), nmp.BOND_TYPES)\n            elif prop in [\"bond-type-float\"]:\n                encoding = [bond.GetBondTypeAsDouble()]\n            elif prop in [\"stereo\"]:\n                encoding = one_of_k_encoding(bond.GetStereo(), nmp.BOND_STEREO)\n            elif prop in [\"in-ring\"]:\n                encoding = [bond.IsInRing()]\n            elif prop in [\"conjugated\"]:\n                encoding = [bond.GetIsConjugated()]\n            elif prop in [\"conformer-bond-length\"]:\n                conf = get_simple_mol_conformer(mol)\n                if conf is not None:\n                    idx1 = bond.GetBeginAtomIdx()\n                    idx2 = bond.GetEndAtomIdx()\n                    encoding = [Chem.rdMolTransforms.GetBondLength(conf, idx1, idx2)]\n                else:\n                    encoding = [0]\n            elif prop in [\"estimated-bond-length\"]:\n                encoding = [get_estimated_bond_length(bond, mol)]\n            else:\n                raise ValueError(f\"Unsupported property `{prop}`\")\n\n            property_array.append(np.asarray(encoding, dtype=np.float16))\n\n        if num_bonds > 0:\n            property_array = np.stack(property_array, axis=0)\n            # Mask the NaNs\n            prop_dict[prop] = _mask_nans_inf(mask_nan, property_array, \"edge property\")\n        else:\n            # Add an empty vector with the right shape\n            arr_len = 1\n            if prop in [\"bond-type-onehot\"]:\n                arr_len = len(nmp.BOND_TYPES) + 1\n            elif prop in 
[\"stereo\"]:\n                arr_len = len(nmp.BOND_STEREO) + 1\n\n            prop_dict[prop] = np.zeros((0, arr_len))\n\n    return prop_dict\n\n\ndef mol_to_adj_and_features(\n    mol: Union[str, dm.Mol],\n    atom_property_list_onehot: List[str] = [],\n    atom_property_list_float: List[Union[str, Callable]] = [],\n    conformer_property_list: List[str] = [],\n    edge_property_list: List[str] = [],\n    add_self_loop: bool = False,\n    explicit_H: bool = False,\n    use_bonds_weights: bool = False,\n    pos_encoding_as_features: Dict[str, Any] = None,\n    dtype: np.dtype = np.float16,\n    mask_nan: Union[str, float, type(None)] = \"raise\",\n) -> Tuple[\n    coo_matrix,\n    Union[np.ndarray, None],\n    Union[np.ndarray, None],\n    Dict[str, np.ndarray],\n    Dict[str, np.ndarray],\n]:\n    r\"\"\"\n    Transforms a molecule into an adjacency matrix representing the molecular graph\n    and a set of atom and bond features.\n\n    It also returns the positional encodings associated to the graph.\n\n    Parameters:\n\n        mol:\n            The molecule to be converted\n\n        atom_property_list_onehot:\n            List of the properties used to get one-hot encoding of the atom type,\n            such as the atom index represented as a one-hot vector.\n            See function `get_mol_atomic_features_onehot`\n\n        atom_property_list_float:\n            List of the properties used to get floating-point encoding of the atom type,\n            such as the atomic mass or electronegativity.\n            See function `get_mol_atomic_features_float`\n\n        conformer_property_list:\n            list of properties used to encode the conformer information, outside of atom properties, currently supports \"positions_3d\"\n\n        edge_property_list:\n            List of the properties used to encode the edges, such as the edge type\n            and the stereo type.\n\n        add_self_loop:\n            Whether to add a value of `1` on the diagonal of the adjacency matrix.\n\n        explicit_H:\n            Whether to consider the Hydrogens explicitly. If `False`, the hydrogens\n            are implicit.\n\n        use_bonds_weights:\n            Whether to use the floating-point value of the bonds in the adjacency matrix,\n            such that single bonds are represented by 1, double bonds 2, triple 3, aromatic 1.5\n\n        pos_encoding_as_features: keyword arguments for function `graph_positional_encoder`\n            to generate positional encoding for node features.\n\n        dtype:\n            The numpy data type used to build the graph\n\n        mask_nan:\n            Deal with molecules that fail a part of the featurization.\n            NaNs can happen when taking a property of a noble gas,\n            or other properties that are not measured for specific atoms.\n\n            - \"raise\": Raise an error when there is a nan or inf in the featurization\n            - \"warn\": Raise a warning when there is a nan or inf in the featurization\n            - \"None\": Don't do anything\n            - \"Floating value\": Replace nans or inf by the specified value\n    Returns:\n\n        adj:\n            scipy coo sparse adjacency matrix of the molecule\n\n        ndata:\n            Concatenated node data of the atoms, based on the properties from\n            `atom_property_list_onehot` and `atom_property_list_float`.\n            If no properties are given, it returns `None`\n\n        edata:\n            Concatenated edge data of the molecule, based on the properties from\n            `edge_property_list`.\n            If no properties are given, it returns `None`\n\n        pe_dict:\n            Dictionary of all positional encodings. Currently supported keys:\n\n            - \"pos_enc_feats_sign_flip\":\n                Node positional encoding that requires augmentation via sign-flip.\n                For example, eigenvectors of the Laplacian are ambiguous in\n                sign and are returned here.\n\n            - \"pos_enc_feats_no_flip\":\n                Node positional encoding that does not use sign-flip.\n                For example, distances from the centroid are returned here.\n\n            - \"rwse\":\n                Node structural encoding corresponding to the diagonal of the random\n                walk matrix\n\n        conf_dict:\n            contains the 3d positions of a conformer of the molecule or 0s if none is found\n\n    \"\"\"\n\n    if isinstance(mol, str):\n        mol = dm.to_mol(mol, ordered=True)\n\n    # Add or remove explicit hydrogens\n    if explicit_H:\n        mol = Chem.AddHs(mol)\n    else:\n        mol = Chem.RemoveHs(mol)\n\n    num_nodes = mol.GetNumAtoms()\n\n    adj = mol_to_adjacency_matrix(\n        mol, use_bonds_weights=use_bonds_weights, add_self_loop=add_self_loop, dtype=dtype\n    )\n\n    # Get the node features\n    atom_features_onehot = get_mol_atomic_features_onehot(mol, atom_property_list_onehot)\n    atom_features_float = get_mol_atomic_features_float(mol, atom_property_list_float, mask_nan=mask_nan)\n    conf_dict = get_mol_conformer_features(mol, conformer_property_list, mask_nan=mask_nan)\n    ndata = list(atom_features_float.values()) + list(atom_features_onehot.values())\n    ndata = [d[:, np.newaxis] if d.ndim == 1 else d for d in ndata]\n\n    if len(ndata) > 0:\n        ndata = np.concatenate(ndata, axis=1).astype(dtype=dtype)\n    else:\n        ndata = None\n\n    # Get the edge features\n    edge_features = get_mol_edge_features(mol, edge_property_list, mask_nan=mask_nan)\n    edata = list(edge_features.values())\n    edata = [np.expand_dims(d, axis=1) if d.ndim == 1 else d for d in edata]\n    if len(edata) > 
0:\n        edata = np.concatenate(edata, axis=1).astype(dtype=dtype)\n    else:\n        edata = None\n\n    # Get all positional encodings\n    pe_dict = get_all_positional_encodings(adj, num_nodes, pos_encoding_as_features)\n\n    # Mask the NaNs\n    for pe_key, pe_val in pe_dict.items():\n        pe_val = np.asarray(pe_val, dtype=dtype)\n        pe_dict[pe_key] = _mask_nans_inf(mask_nan, pe_val, pe_key)\n\n    return adj, ndata, edata, pe_dict, conf_dict\n\n\ndef mol_to_adjacency_matrix(\n    mol: dm.Mol,\n    use_bonds_weights: bool = False,\n    add_self_loop: bool = False,\n    dtype: np.dtype = np.float32,\n) -> coo_matrix:\n    r\"\"\"\n    Convert a molecule to a sparse adjacency matrix, as a scipy ``coo_matrix``.\n    Instead of using the Rdkit `GetAdjacencyMatrix()` method, this method\n    uses the bond ordering from the molecule object, which is the same as\n    the bond ordering in the bond features.\n\n    Warning:\n        Do not coalesce or sort the returned adjacency matrix (e.g. with\n        `Tensor.coalesce()` after conversion to torch), as it will change\n        the ordering of the bonds.\n\n    Args:\n        mol: A molecule in the form of a SMILES string or an RDKit molecule object.\n\n        use_bonds_weights:\n            If `True`, the adjacency matrix will contain the bond type as the\n            value of the edge. 
If `False`, the adjacency matrix will contain\n            `1` as the value of the edge.\n\n        add_self_loop:\n            If `True`, the adjacency matrix will contain a self-loop for each\n            node.\n\n        dtype:\n            The data type used to build the graph\n\n    Returns:\n        adj:\n            coo sparse adjacency matrix of the molecule\n    \"\"\"\n\n    # Get the indices for the adjacency matrix, and the bond value\n    adj_idx, adj_val = [], []\n    for bond in mol.GetBonds():\n        adj_idx.append([bond.GetBeginAtomIdx(), bond.GetEndAtomIdx()])\n        adj_idx.append([bond.GetEndAtomIdx(), bond.GetBeginAtomIdx()])\n        if use_bonds_weights:\n            val = nmp.BOND_TYPES[bond.GetBondType()]\n        else:\n            val = 1\n        adj_val.extend([val, val])\n\n    # Convert to a scipy coo sparse matrix\n    if len(adj_val) > 0:  # ensure the matrix is not empty\n        adj = coo_matrix(\n            (torch.as_tensor(adj_val), torch.as_tensor(adj_idx).T.reshape(2, -1)),\n            shape=(mol.GetNumAtoms(), mol.GetNumAtoms()),\n            dtype=dtype,\n        )\n    else:\n        # Special case for molecules with one atom\n        adj = coo_matrix(([], np.array([[], []])), shape=(mol.GetNumAtoms(), mol.GetNumAtoms()), dtype=dtype)\n\n    # Add self loops\n    if add_self_loop:\n        arange = np.arange(adj.shape[0], dtype=int)\n        adj[arange, arange] = 1\n    return adj\n\n\nclass GraphDict(dict):\n    def __init__(\n        self,\n        dic: Dict,\n    ):\n        r\"\"\"\n        Store the parameters required to initialize a `pyg.data.Data`, but\n        as a dictionary to reduce memory consumption.\n\n        Possible keys for the dictionary:\n\n        - adj: A sparse matrix containing the adjacency matrix\n\n        - ndata: A dictionary containing different keys and Tensors\n            associated to the node features.\n\n        - edata: A dictionary containing different keys and Tensors\n            associated to the edge features.\n\n        - dtype: The dtype for the floating data.\n\n        - mask_nan:\n            Deal with molecules that fail a part of the featurization.\n            NaNs can happen when taking a property of a noble gas,\n            or other properties that are not measured for specific atoms.\n\n            - \"raise\": Raise an error when there is a nan or inf in the featurization\n            - \"warn\": Raise a warning when there is a nan or inf in the featurization\n            - \"None\": Don't do anything\n            - \"Floating value\": Replace nans or inf by the specified value\n        \"\"\"\n        default_dic = {\n            \"dtype\": np.float16,\n            \"mask_nan\": \"raise\",\n        }\n        data = dic.pop(\"data\", {})\n        # ndata = dic.pop(\"ndata\", {})\n        # edata = dic.pop(\"edata\", {})\n        # for key in edata.keys():\n        #     assert key.startswith(\"edge_\"), f\"Edge features must start with 'edge_' but got {key}\"\n        default_dic.update(dic)\n        default_dic.update(data)\n        # default_dic.update(ndata)\n        # default_dic.update(edata)\n        super().__init__(default_dic)\n\n    @property\n    def keys(self):\n        return list(super().keys())\n\n    @property\n    def values(self):\n        return list(super().values())\n\n    def make_pyg_graph(self, **kwargs) -> Data:\n        \"\"\"\n        Convert the current dictionary of parameters, containing an adjacency matrix with node/edge data\n        into a `pyg.data.Data` of torch Tensors.\n\n        `**kwargs` can be used to overwrite any parameter from the current dictionary. 
See `GraphDict.__init__`\n        for a list of parameters\n        \"\"\"\n\n        num_nodes = self.adj.shape[0]\n        data_dict = {}\n\n        # Convert the numpy and numpy sparse data to torch\n        for key, val in self.items():\n            if key in [\"adj\", \"dtype\", \"mask_nan\"]:  # Skip the parameters\n                continue\n            elif isinstance(val, np.ndarray):\n                # Convert the data to the specified dtype in torch format\n                val = val.astype(self.dtype)\n                data_dict[key] = torch.as_tensor(val)\n            elif issparse(val):\n                data_dict[key] = torch.as_tensor(val.astype(np.float32).todense())\n                # `torch.sparse_coo_tensor` is too slow. Slows down the multiprocessing of features by >3x on 32 cores.\n                # indices = torch.from_numpy(np.vstack((val.row, val.col)).astype(np.int64))\n                # data_dict[key] = torch.sparse_coo_tensor(indices=indices, values=val.data, size=val.shape)\n            elif isinstance(val, torch.Tensor):\n                data_dict[key] = val\n            else:\n                pass  # Skip the other parameters\n\n        # Create the PyG graph object `Data`\n        edge_index = torch.as_tensor(np.vstack((self.adj.row, self.adj.col)))\n        edge_weight = torch.as_tensor(self.adj.data)\n        data = Data(edge_index=edge_index, edge_weight=edge_weight, num_nodes=num_nodes, **data_dict)\n        return data\n\n    @property\n    def adj(self):\n        return self[\"adj\"]\n\n    @property\n    def dtype(self):\n        return self[\"dtype\"]\n\n    @property\n    def mask_nan(self):\n        return self[\"mask_nan\"]\n\n    @property\n    def num_nodes(self) -> int:\n        return self.adj.shape[0]\n\n    @property\n    def num_edges(self) -> int:\n        if issparse(self.adj):\n            return self.adj.nnz\n        else:\n            return np.count_nonzero(self.adj)  # No division by 2 because edges are counted 
twice\n\n\ndef mol_to_graph_dict(\n    mol: dm.Mol,\n    atom_property_list_onehot: List[str] = [],\n    atom_property_list_float: List[Union[str, Callable]] = [],\n    conformer_property_list: List[str] = [],\n    edge_property_list: List[str] = [],\n    add_self_loop: bool = False,\n    explicit_H: bool = False,\n    use_bonds_weights: bool = False,\n    pos_encoding_as_features: Dict[str, Any] = None,\n    dtype: np.dtype = np.float16,\n    on_error: str = \"ignore\",\n    mask_nan: Union[str, float, type(None)] = \"raise\",\n    max_num_atoms: Optional[int] = None,\n) -> Union[GraphDict, str]:\n    r\"\"\"\n    Transforms a molecule into an adjacency matrix representing the molecular graph\n    and a set of atom and bond features, and re-organizes them into a dictionary\n    that allows building a `pyg.data.Data` object.\n\n    Compared to `mol_to_pyggraph`, this function does not build the graph directly,\n    and is thus faster, lighter on memory, and compatible with other frameworks.\n\n    Parameters:\n\n        mol:\n            The molecule to be converted\n\n        atom_property_list_onehot:\n            List of the properties used to get one-hot encoding of the atom type,\n            such as the atom index represented as a one-hot vector.\n            See function `get_mol_atomic_features_onehot`\n\n        atom_property_list_float:\n            List of the properties used to get floating-point encoding of the atom type,\n            such as the atomic mass or electronegativity.\n            See function `get_mol_atomic_features_float`\n\n        conformer_property_list:\n            list of properties used to encode the conformer information, outside of atom properties, currently supports \"positions_3d\"\n\n        edge_property_list:\n            List of the properties used to encode the edges, such as the edge type\n            and the stereo type.\n\n        add_self_loop:\n            Whether to add a value of `1` on the diagonal of the adjacency matrix.\n\n        explicit_H:\n            Whether to consider the Hydrogens explicitly. If `False`, the hydrogens\n            are implicit.\n\n        use_bonds_weights:\n            Whether to use the floating-point value of the bonds in the adjacency matrix,\n            such that single bonds are represented by 1, double bonds 2, triple 3, aromatic 1.5\n\n        pos_encoding_as_features: keyword arguments for function `graph_positional_encoder`\n            to generate positional encoding for node features.\n\n        dtype:\n            The numpy data type used to build the graph\n\n        on_error:\n            What to do when the featurization fails. This can change the\n            behavior of `mask_nan`.\n\n            - \"raise\": Raise an error\n            - \"warn\": Raise a warning and return a string of the error\n            - \"ignore\": Ignore the error and return a string of the error\n\n        mask_nan:\n            Deal with molecules that fail a part of the featurization.\n            NaNs can happen when taking a property of a noble gas,\n            or other properties that are not measured for specific atoms.\n\n            - \"raise\": Raise an error when there is a nan or inf in the featurization\n            - \"warn\": Raise a warning when there is a nan or inf in the featurization\n            - \"None\": Don't do anything\n            - \"Floating value\": Replace nans or inf by the specified value\n\n        max_num_atoms:\n            Maximum number of atoms for a given molecule. If a molecule with more atoms\n            is given, an error is raised, but captured according to the rules of\n            `on_error`.\n\n    Returns:\n\n        graph_dict:\n            A dictionary `GraphDict` containing the keys required to build a graph,\n            and which can be used to build a PyG graph. If it fails\n            to featurize the molecule, it returns a string with the error.\n\n            - \"adj\": A sparse int-array containing the adjacency matrix\n\n            - \"data\": A dictionary containing different keys and numpy\n              arrays associated to the (node, edge & graph) features.\n\n            - \"dtype\": The numpy dtype for the floating data.\n    \"\"\"\n\n    input_mol = mol\n    try:\n        if isinstance(mol, str):\n            mol = dm.to_mol(mol, ordered=True)\n        if explicit_H:\n            mol = Chem.AddHs(mol)\n        else:\n            mol = Chem.RemoveHs(mol)\n        num_atoms = mol.GetNumAtoms()\n        if (max_num_atoms is not None) and (num_atoms > max_num_atoms):\n            raise ValueError(f\"Maximum number of atoms greater than permitted {num_atoms}>{max_num_atoms}\")\n        (\n            adj,\n            ndata,\n            edata,\n            pe_dict,\n            conf_dict,\n        ) = mol_to_adj_and_features(\n            mol=mol,\n            atom_property_list_onehot=atom_property_list_onehot,\n            atom_property_list_float=atom_property_list_float,\n            conformer_property_list=conformer_property_list,\n            edge_property_list=edge_property_list,\n            add_self_loop=add_self_loop,\n            explicit_H=explicit_H,\n            use_bonds_weights=use_bonds_weights,\n            pos_encoding_as_features=pos_encoding_as_features,\n            mask_nan=mask_nan,\n        )\n    except Exception as e:\n        if on_error.lower() == \"raise\":\n            raise e\n        elif on_error.lower() == \"warn\":\n            smiles = input_mol\n            if isinstance(smiles, dm.Mol):\n                smiles = Chem.MolToSmiles(input_mol)\n            msg = str(e) + \"\\nIgnoring following molecule:\" + smiles\n            logger.warning(msg)\n            return str(e)\n        elif on_error.lower() == \"ignore\":\n            return str(e)\n\n    graph_dict = {\"adj\": 
adj, \"data\": {}, \"dtype\": dtype}\n\n    # Assign the node data\n    if ndata is not None:\n        graph_dict[\"data\"][\"feat\"] = ndata\n\n    # Assign the edge data\n    if edata is not None:\n        if issparse(edata):\n            edata = to_dense_array(edata, dtype=dtype)\n        hetero_edata = edata.repeat(2, axis=0)\n        graph_dict[\"data\"][\"edge_feat\"] = hetero_edata\n\n    # Put the positional encodings as node features\n    # TODO: add support for PE on edges\n    for key, pe in pe_dict.items():\n        graph_dict[\"data\"][key] = pe\n\n    # put the conformer positions here\n    for key, val in conf_dict.items():\n        graph_dict[\"data\"][key] = val\n\n    graph_dict = GraphDict(graph_dict)\n    return graph_dict\n\n\ndef mol_to_pyggraph(\n    mol: dm.Mol,\n    atom_property_list_onehot: List[str] = [],\n    atom_property_list_float: List[Union[str, Callable]] = [],\n    conformer_property_list: List[str] = [],\n    edge_property_list: List[str] = [],\n    add_self_loop: bool = False,\n    explicit_H: bool = False,\n    use_bonds_weights: bool = False,\n    pos_encoding_as_features: Dict[str, Any] = None,\n    dtype: np.dtype = np.float16,\n    on_error: str = \"ignore\",\n    mask_nan: Union[str, float, type(None)] = \"raise\",\n    max_num_atoms: Optional[int] = None,\n) -> Union[Data, str]:\n    r\"\"\"\n    Transforms a molecule into an adjacency matrix representing the molecular graph\n    and a set of atom and bond features.\n\n    Then, the adjacency matrix and node/edge features are used to build a\n    `pyg.data.Data` with pytorch Tensors.\n\n    Parameters:\n\n        mol:\n            The molecule to be converted\n\n        atom_property_list_onehot:\n            List of the properties used to get one-hot encoding of the atom type,\n            such as the atom index represented as a one-hot vector.\n            See function `get_mol_atomic_features_onehot`\n\n        atom_property_list_float:\n            List of the 
properties used to get floating-point encoding of the atom type,\n            such as the atomic mass or electronegativity.\n            See function `get_mol_atomic_features_float`\n\n        conformer_property_list:\n            list of properties used to encode the conformer information, outside of atom properties, currently supports \"positions_3d\"\n\n        edge_property_list:\n            List of the properties used to encode the edges, such as the edge type\n            and the stereo type.\n\n        add_self_loop:\n            Whether to add a value of `1` on the diagonal of the adjacency matrix.\n\n        explicit_H:\n            Whether to consider the Hydrogens explicitly. If `False`, the hydrogens\n            are implicit.\n\n        use_bonds_weights:\n            Whether to use the floating-point value of the bonds in the adjacency matrix,\n            such that single bonds are represented by 1, double bonds 2, triple 3, aromatic 1.5\n\n        pos_encoding_as_features: keyword arguments for function `graph_positional_encoder`\n            to generate positional encoding for node features.\n\n        dtype:\n            The numpy data type used to build the graph\n\n        on_error:\n            What to do when the featurization fails. This can change the\n            behavior of `mask_nan`.\n\n            - \"raise\": Raise an error\n            - \"warn\": Raise a warning and return a string of the error\n            - \"ignore\": Ignore the error and return a string of the error\n\n        mask_nan:\n            Deal with molecules that fail a part of the featurization.\n            NaNs can happen when taking a property of a noble gas,\n            or other properties that are not measured for specific atoms.\n\n            - \"raise\": Raise an error when there is a nan in the featurization\n            - \"warn\": Raise a warning when there is a nan in the featurization\n            - \"None\": Don't do anything\n            - \"Floating value\": Replace nans by the specified value\n\n        max_num_atoms:\n            Maximum number of atoms for a given molecule. If a molecule with more atoms\n            is given, an error is raised, but captured according to the rules of\n            `on_error`.\n    Returns:\n\n        graph:\n            Pyg graph, with `graph['feat']` corresponding to the concatenated\n            node data from `atom_property_list_onehot` and `atom_property_list_float`,\n            `graph['edge_feat']` corresponding to the concatenated edge data from `edge_property_list`.\n            There are also additional entries for the positional encodings.\n\n    \"\"\"\n    graph_dict = mol_to_graph_dict(\n        mol=mol,\n        atom_property_list_onehot=atom_property_list_onehot,\n        atom_property_list_float=atom_property_list_float,\n        conformer_property_list=conformer_property_list,\n        edge_property_list=edge_property_list,\n        add_self_loop=add_self_loop,\n        explicit_H=explicit_H,\n        use_bonds_weights=use_bonds_weights,\n        pos_encoding_as_features=pos_encoding_as_features,\n        dtype=dtype,\n        on_error=on_error,\n        mask_nan=mask_nan,\n        max_num_atoms=max_num_atoms,\n    )\n\n    if (graph_dict is not None) and not isinstance(graph_dict, str):\n        return graph_dict.make_pyg_graph()\n    else:\n        return graph_dict\n\n\ndef mol_to_graph_signature(featurizer_args: Dict[str, Any] = None) -> Dict[str, Any]:\n    \"\"\"\n    Get the default arguments of `mol_to_graph_dict` and update them\n    with a provided dict of arguments in order to get the full signature\n    of the featurizer args actually used for the features computation.\n\n    Parameters:\n        featurizer_args: A dictionary of featurizer arguments to update\n    Returns:\n        A dictionary of featurizer arguments\n    \"\"\"\n\n    # Get the signature of `mol_to_graph_dict`\n    signature = 
inspect.signature(mol_to_graph_dict)\n\n    # Filter out empty arguments (without default value)\n    parameters = list(filter(lambda param: param.default is not param.empty, signature.parameters.values()))\n\n    # Convert to dict\n    parameters = {param.name: param.default for param in parameters}\n\n    # Update the parameters with the supplied ones\n    if featurizer_args is not None:\n        parameters.update(featurizer_args)\n\n    return parameters\n"
  },
  {
    "path": "graphium/features/graphormer.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Tuple, Union, Dict, Any\n\nimport numpy as np\nimport networkx as nx\n\nfrom scipy.sparse import spmatrix, issparse\n\n\ndef compute_graphormer_distances(\n    adj: Union[np.ndarray, spmatrix], num_nodes: int, cache: Dict[str, Any]\n) -> Tuple[np.ndarray, str, Dict[str, Any]]:\n    \"\"\"\n    Compute Graphormer distance between nodepairs.\n\n    Parameters:\n        adj [num_nodes, num_nodes]: Adjacency matrix\n        num_nodes: Number of nodes in the graph\n        cache: Dictionary of cached objects\n    Returns:\n        dist [num_nodes, num_nodes]: 2D array with Graphormer distances between nodepairs\n        base_level: Indicator of the output pos_level (node, edge, nodepair, graph) -> here nodepair\n        cache: Updated dictionary of cached objects\n    \"\"\"\n\n    base_level = \"nodepair\"\n\n    if \"graphormer\" in cache:\n        dist = cache[\"graphormer\"]\n\n    else:\n        if issparse(adj):\n            adj = adj.toarray()\n\n        G = nx.from_numpy_array(adj)\n        paths = nx.all_pairs_shortest_path(G)\n\n        dist_dict = {i: {j: len(path) - 1 for j, path in paths_from_i.items()} for i, paths_from_i in paths}\n        dist = np.asarray([[dist_dict[i][j] for j in range(num_nodes)] for i in range(num_nodes)])\n        cache[\"graphormer\"] = dist\n\n    return 
dist, base_level, cache\n"
  },
  {
    "path": "graphium/features/nmp.py",
"content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Tuple, Optional, Dict, Union\nimport importlib.resources\nfrom copy import deepcopy\nimport pandas as pd\nimport numpy as np\nimport math\nfrom rdkit import Chem\n\n# NOTE(hadim): usually it's best to embed this in a function.\nwith importlib.resources.open_text(\"graphium.features\", \"periodic_table.csv\") as f:\n    PERIODIC_TABLE = pd.read_csv(f)\nPERIODIC_TABLE = PERIODIC_TABLE.set_index(\"AtomicNumber\")\n\n\n# Small function to convert strings to floats\ndef float_or_none(string: str) -> Union[float, None]:\n    \"\"\"\n    Check whether a string can be converted to a float; return None if it can't.\n    Parameters:\n        string: str\n    Returns:\n        val: float or None\n    \"\"\"\n    try:\n        val = float(string)\n    except (TypeError, ValueError):\n        val = None\n    return val\n\n\n# It's much faster to index from a list than a DataFrame, which can have a big impact\n# when featurizing millions of molecules\nBOND_RADIUS_SINGLE = [float_or_none(elem) for elem in PERIODIC_TABLE[\"SingleBondRadius\"]]\nBOND_RADIUS_DOUBLE = [float_or_none(elem) for elem in PERIODIC_TABLE[\"DoubleBondRadius\"]]\nBOND_RADIUS_TRIPLE = [float_or_none(elem) for elem in PERIODIC_TABLE[\"TripleBondRadius\"]]\nELECTRONEGATIVITY = [float_or_none(elem) for elem in PERIODIC_TABLE[\"Electronegativity\"]]\nFIRST_IONIZATION = 
[float_or_none(elem) for elem in PERIODIC_TABLE[\"FirstIonization\"]]\nMELTING_POINT = [float_or_none(elem) for elem in PERIODIC_TABLE[\"MeltingPoint\"]]\nMETAL = (2 * (PERIODIC_TABLE[\"Metal\"] == \"yes\") + (PERIODIC_TABLE[\"Metalloid\"] == \"yes\")).tolist()\n\nPHASE = list(PERIODIC_TABLE[\"Phase\"].values)\nPHASE_SET = list(set(PHASE))\nTYPE = list(deepcopy(PERIODIC_TABLE[\"Type\"]))\nTYPE = [\"none\" if (isinstance(t, float) and math.isnan(t)) else t for t in TYPE]\nTYPE_SET = list(set(TYPE))\nGROUP = deepcopy(PERIODIC_TABLE[\"Group\"].values)\nGROUP[np.isnan(GROUP)] = 19\nGROUP_SET = list(set(GROUP))\nPERIOD = list(PERIODIC_TABLE[\"Period\"].values)\nPERIOD_SET = list(set(PERIOD))\n\nATOM_LIST = [\n    \"C\",\n    \"N\",\n    \"O\",\n    \"S\",\n    \"F\",\n    \"Si\",\n    \"P\",\n    \"Cl\",\n    \"Br\",\n    \"Mg\",\n    \"Na\",\n    \"Ca\",\n    \"Fe\",\n    \"As\",\n    \"Al\",\n    \"I\",\n    \"B\",\n    \"V\",\n    \"K\",\n    \"Tl\",\n    \"Yb\",\n    \"Sb\",\n    \"Sn\",\n    \"Ag\",\n    \"Pd\",\n    \"Co\",\n    \"Se\",\n    \"Ti\",\n    \"Zn\",\n    \"H\",\n    \"Li\",\n    \"Ge\",\n    \"Cu\",\n    \"Au\",\n    \"Ni\",\n    \"Cd\",\n    \"In\",\n    \"Mn\",\n    \"Zr\",\n    \"Cr\",\n    \"Pt\",\n    \"Hg\",\n    \"Pb\",\n]\n\nATOM_NUM_H = [0, 1, 2, 3, 4]\nVALENCE = [0, 1, 2, 3, 4, 5, 6]\nCHARGE_LIST = [-3, -2, -1, 0, 1, 2, 3]\nRADICAL_E_LIST = [0, 1, 2]\nATOM_DEGREE_LIST = [0, 1, 2, 3, 4]\n\nHYBRIDIZATION_LIST = [\n    Chem.rdchem.HybridizationType.names[k]\n    for k in sorted(Chem.rdchem.HybridizationType.names.keys(), reverse=True)\n    if k != \"OTHER\"\n]\n\n\nCHIRALITY_LIST = [\"R\"]  # alternative is just S\n\n\nBOND_TYPES = [\n    Chem.rdchem.BondType.SINGLE,\n    Chem.rdchem.BondType.DOUBLE,\n    Chem.rdchem.BondType.TRIPLE,\n    Chem.rdchem.BondType.AROMATIC,\n]\n\nBOND_STEREO = [\n    Chem.rdchem.BondStereo.STEREONONE,\n    Chem.rdchem.BondStereo.STEREOANY,\n    Chem.rdchem.BondStereo.STEREOZ,\n    Chem.rdchem.BondStereo.STEREOE,\n   
 Chem.rdchem.BondStereo.STEREOCIS,\n    Chem.rdchem.BondStereo.STEREOTRANS,\n]\n"
  },
  {
    "path": "graphium/features/periodic_table.csv",
    "content": "AtomicNumber,Element,Symbol,AtomicMass,NumberofNeutrons,NumberofProtons,NumberofElectrons,Period,Group,Phase,Radioactive,Natural,Metal,Nonmetal,Metalloid,Type,AtomicRadius,Electronegativity,FirstIonization,Density,MeltingPoint,BoilingPoint,NumberOfIsotopes,Discoverer,Year,SpecificHeat,NumberofShells,NumberofValence,SingleBondRadius,DoubleBondRadius,TripleBondRadius\n1,Hydrogen,H,1.007,0,1,1,1,1,gas,,yes,,yes,,Nonmetal,0.79,2.2,13.5984,8.99E-05,14.175,20.28,3,Cavendish,1766,14.304,1,1,0.32,-,-\n2,Helium,He,4.002,2,2,2,1,18,gas,,yes,,yes,,Noble Gas,0.49,,24.5874,1.79E-04,,4.22,5,Janssen,1868,5.193,1,,0.46,-,-\n3,Lithium,Li,6.941,4,3,3,2,1,solid,,yes,yes,,,Alkali Metal,2.1,0.98,5.3917,5.34E-01,453.85,1615,5,Arfvedson,1817,3.582,2,1,1.33,1.24,-\n4,Beryllium,Be,9.012,5,4,4,2,2,solid,,yes,yes,,,Alkaline Earth Metal,1.4,1.57,9.3227,1.85E+00,1560.15,2742,6,Vaulquelin,1798,1.825,2,2,1.02,0.9,0.85\n5,Boron,B,10.811,6,5,5,2,13,solid,,yes,,,yes,Metalloid,1.2,2.04,8.298,2.34E+00,2573.15,4200,6,Gay-Lussac,1808,1.026,2,3,0.85,0.78,0.73\n6,Carbon,C,12.011,6,6,6,2,14,solid,,yes,,yes,,Nonmetal,0.91,2.55,11.2603,2.27E+00,3948.15,4300,7,Prehistoric,,0.709,2,4,0.75,0.67,0.6\n7,Nitrogen,N,14.007,7,7,7,2,15,gas,,yes,,yes,,Nonmetal,0.75,3.04,14.5341,1.25E-03,63.29,77.36,8,Rutherford,1772,1.04,2,5,0.71,0.6,0.54\n8,Oxygen,O,15.999,8,8,8,2,16,gas,,yes,,yes,,Nonmetal,0.65,3.44,13.6181,1.43E-03,50.5,90.2,8,Priestley/Scheele,1774,0.918,2,6,0.63,0.57,0.53\n9,Fluorine,F,18.998,10,9,9,2,17,gas,,yes,,yes,,Halogen,0.57,3.98,17.4228,1.70E-03,53.63,85.03,6,Moissan,1886,0.824,2,7,0.64,0.59,0.53\n10,Neon,Ne,20.18,10,10,10,2,18,gas,,yes,,yes,,Noble Gas,0.51,,21.5645,9.00E-04,24.703,27.07,8,Ramsay and Travers,1898,1.03,2,8,0.67,0.96,-\n11,Sodium,Na,22.99,12,11,11,3,1,solid,,yes,yes,,,Alkali Metal,2.2,0.93,5.1391,9.71E-01,371.15,1156,7,Davy,1807,1.228,3,1,1.55,1.6,-\n12,Magnesium,Mg,24.305,12,12,12,3,2,solid,,yes,yes,,,Alkaline Earth 
Metal,1.7,1.31,7.6462,1.74E+00,923.15,1363,8,Black,1755,1.023,3,2,1.39,1.32,1.27\n13,Aluminum,Al,26.982,14,13,13,3,13,solid,,yes,yes,,,Metal,1.8,1.61,5.9858,2.70E+00,933.4,2792,8,Wshler,1827,0.897,3,3,1.26,1.13,1.11\n14,Silicon,Si,28.086,14,14,14,3,14,solid,,yes,,,yes,Metalloid,1.5,1.9,8.1517,2.33E+00,1683.15,3538,8,Berzelius,1824,0.705,3,4,1.16,1.07,1.02\n15,Phosphorus,P,30.974,16,15,15,3,15,solid,,yes,,yes,,Nonmetal,1.2,2.19,10.4867,1.82E+00,317.25,553,7,BranBrand,1669,0.769,3,5,1.11,1.02,0.94\n16,Sulfur,S,32.065,16,16,16,3,16,solid,,yes,,yes,,Nonmetal,1.1,2.58,10.36,2.07E+00,388.51,717.8,10,Prehistoric,,0.71,3,6,1.03,0.94,0.95\n17,Chlorine,Cl,35.453,18,17,17,3,17,gas,,yes,,yes,,Halogen,0.97,3.16,12.9676,3.21E-03,172.31,239.11,11,Scheele,1774,0.479,3,7,0.99,0.95,0.93\n18,Argon,Ar,39.948,22,18,18,3,18,gas,,yes,,yes,,Noble Gas,0.88,,15.7596,1.78E-03,83.96,87.3,8,Rayleigh and Ramsay,1894,0.52,3,8,0.96,1.07,0.96\n19,Potassium,K,39.098,20,19,19,4,1,solid,,yes,yes,,,Alkali Metal,2.8,0.82,4.3407,8.62E-01,336.5,1032,10,Davy,1807,0.757,4,1,1.96,1.93,-\n20,Calcium,Ca,40.078,20,20,20,4,2,solid,,yes,yes,,,Alkaline Earth Metal,2.2,1,6.1132,1.54E+00,1112.15,1757,14,Davy,1808,0.647,4,2,1.71,1.47,1.33\n21,Scandium,Sc,44.956,24,21,21,4,3,solid,,yes,yes,,,Transition Metal,2.1,1.36,6.5615,2.99E+00,1812.15,3109,15,Nilson,1878,0.568,4,,1.48,1.16,1.14\n22,Titanium,Ti,47.867,26,22,22,4,4,solid,,yes,yes,,,Transition Metal,2,1.54,6.8281,4.54E+00,1933.15,3560,9,Gregor,1791,0.523,4,,1.36,1.17,1.08\n23,Vanadium,V,50.942,28,23,23,4,5,solid,,yes,yes,,,Transition Metal,1.9,1.63,6.7462,6.11E+00,2175.15,3680,9,   del Rio,1801,0.489,4,,1.34,1.12,1.06\n24,Chromium,Cr,51.996,28,24,24,4,6,solid,,yes,yes,,,Transition Metal,1.9,1.66,6.7665,7.15E+00,2130.15,2944,9,Vauquelin,1797,0.449,4,,1.22,1.11,1.03\n25,Manganese,Mn,54.938,30,25,25,4,7,solid,,yes,yes,,,Transition Metal,1.8,1.55,7.434,7.44E+00,1519.15,2334,11,\"Gahn, 
Scheele\",1774,0.479,4,,1.19,1.05,1.03\n26,Iron,Fe,55.845,30,26,26,4,8,solid,,yes,yes,,,Transition Metal,1.7,1.83,7.9024,7.87E+00,1808.15,3134,10,Prehistoric,,0.449,4,,1.16,1.09,1.02\n27,Cobalt,Co,58.933,32,27,27,4,9,solid,,yes,yes,,,Transition Metal,1.7,1.88,7.881,8.86E+00,1768.15,3200,14,Brandt,1735,0.421,4,,1.11,1.03,0.96\n28,Nickel,Ni,58.693,31,28,28,4,10,solid,,yes,yes,,,Transition Metal,1.6,1.91,7.6398,8.91E+00,1726.15,3186,11,Cronstedt,1751,0.444,4,,1.1,1.01,1.01\n29,Copper,Cu,63.546,35,29,29,4,11,solid,,yes,yes,,,Transition Metal,1.6,1.9,7.7264,8.96E+00,1357.75,2835,11,Prehistoric,,0.385,4,,1.12,1.15,1.2\n30,Zinc,Zn,65.38,35,30,30,4,12,solid,,yes,yes,,,Transition Metal,1.5,1.65,9.3942,7.13E+00,692.88,1180,15,Prehistoric,,0.388,4,,1.18,1.2,-\n31,Gallium,Ga,69.723,39,31,31,4,13,solid,,yes,yes,,,Metal,1.8,1.81,5.9993,5.91E+00,302.91,2477,14,de Boisbaudran,1875,0.371,4,3,1.24,1.17,1.21\n32,Germanium,Ge,72.64,41,32,32,4,14,solid,,yes,,,yes,Metalloid,1.5,2.01,7.8994,5.32E+00,1211.45,3106,17,Winkler,1886,0.32,4,4,1.21,1.11,1.14\n33,Arsenic,As,74.922,42,33,33,4,15,solid,,yes,,,yes,Metalloid,1.3,2.18,9.7886,5.78E+00,1090.15,887,14,Albertus Magnus,1250,0.329,4,5,1.21,1.14,1.06\n34,Selenium,Se,78.96,45,34,34,4,16,solid,,yes,,yes,,Nonmetal,1.2,2.55,9.7524,4.81E+00,494.15,958,20,Berzelius,1817,0.321,4,6,1.16,1.07,1.07\n35,Bromine,Br,79.904,45,35,35,4,17,liq,,yes,,yes,,Halogen,1.1,2.96,11.8138,3.12E+00,266.05,332,19,Balard,1826,0.474,4,7,1.14,1.09,1.1\n36,Krypton,Kr,83.798,48,36,36,4,18,gas,,yes,,yes,,Noble Gas,1,,13.9996,3.73E-03,115.93,119.93,23,Ramsay and Travers,1898,0.248,4,8,1.17,1.21,1.08\n37,Rubidium,Rb,85.468,48,37,37,5,1,solid,,yes,yes,,,Alkali Metal,3,0.82,4.1771,1.53E+00,312.79,961,20,Bunsen and Kirchoff,1861,0.363,5,1,2.1,2.02,-\n38,Strontium,Sr,87.62,50,38,38,5,2,solid,,yes,yes,,,Alkaline Earth Metal,2.5,0.95,5.6949,2.64E+00,1042.15,1655,18,Davy,1808,0.301,5,2,1.85,1.57,1.39\n39,Yttrium,Y,88.906,50,39,39,5,3,solid,,yes,yes,,,Transition 
Metal,2.3,1.22,6.2173,4.47E+00,1799.15,3609,21,Gadolin,1794,0.298,5,,1.63,1.3,1.24\n40,Zirconium,Zr,91.224,51,40,40,5,4,solid,,yes,yes,,,Transition Metal,2.2,1.33,6.6339,6.51E+00,2125.15,4682,20,Klaproth,1789,0.278,5,,1.54,1.27,1.21\n41,Niobium,Nb,92.906,52,41,41,5,5,solid,,yes,yes,,,Transition Metal,2.1,1.6,6.7589,8.57E+00,2741.15,5017,24,Hatchett,1801,0.265,5,,1.47,1.25,1.16\n42,Molybdenum,Mo,95.96,54,42,42,5,6,solid,,yes,yes,,,Transition Metal,2,2.16,7.0924,1.02E+01,2890.15,4912,20,Scheele,1778,0.251,5,,1.38,1.21,1.13\n43,Technetium,Tc,98,55,43,43,5,7,artificial,yes,,yes,,,Transition Metal,2,1.9,7.28,1.15E+01,2473.15,5150,23,Perrier and Segr?,1937,,5,,1.28,1.2,1.1\n44,Ruthenium,Ru,101.07,57,44,44,5,8,solid,,yes,yes,,,Transition Metal,1.9,2.2,7.3605,1.24E+01,2523.15,4423,16,Klaus,1844,0.238,5,,1.25,1.14,1.03\n45,Rhodium,Rh,102.906,58,45,45,5,9,solid,,yes,yes,,,Transition Metal,1.8,2.28,7.4589,1.24E+01,2239.15,3968,20,Wollaston,1803,0.243,5,,1.25,1.1,1.06\n46,Palladium,Pd,106.42,60,46,46,5,10,solid,,yes,yes,,,Transition Metal,1.8,2.2,8.3369,1.20E+01,1825.15,3236,21,Wollaston,1803,0.244,5,,1.2,1.17,1.12\n47,Silver,Ag,107.868,61,47,47,5,11,solid,,yes,yes,,,Transition Metal,1.8,1.93,7.5762,1.05E+01,1234.15,2435,27,Prehistoric,,0.235,5,,1.28,1.39,1.37\n48,Cadmium,Cd,112.411,64,48,48,5,12,solid,,yes,yes,,,Transition Metal,1.7,1.69,8.9938,8.69E+00,594.33,1040,22,Stromeyer,1817,0.232,5,,1.36,1.44,-\n49,Indium,In,114.818,66,49,49,5,13,solid,,yes,yes,,,Metal,2,1.78,5.7864,7.31E+00,429.91,2345,34,Reich and Richter,1863,0.233,5,3,1.42,1.36,1.46\n50,Tin,Sn,118.71,69,50,50,5,14,solid,,yes,yes,,,Metal,1.7,1.96,7.3439,7.29E+00,505.21,2875,28,Prehistoric,,0.228,5,4,1.4,1.3,1.32\n51,Antimony,Sb,121.76,71,51,51,5,15,solid,,yes,,,yes,Metalloid,1.5,2.05,8.6084,6.69E+00,904.05,1860,29,Early historic times,,0.207,5,5,1.4,1.33,1.27\n52,Tellurium,Te,127.6,76,52,52,5,16,solid,,yes,,,yes,Metalloid,1.4,2.1,9.0096,6.23E+00,722.8,1261,29,von 
Reichenstein,1782,0.202,5,6,1.36,1.28,1.21\n53,Iodine,I,126.904,74,53,53,5,17,solid,,yes,,yes,,Halogen,1.3,2.66,10.4513,4.93E+00,386.65,457.4,24,Courtois,1811,0.214,5,7,1.33,1.29,1.25\n54,Xenon,Xe,131.293,77,54,54,5,18,gas,,yes,,yes,,Noble Gas,1.2,,12.1298,5.89E-03,161.45,165.03,31,Ramsay and Travers,1898,0.158,5,8,1.31,1.35,1.22\n55,Cesium,Cs,132.905,78,55,55,6,1,solid,,yes,yes,,,Alkali Metal,3.3,0.79,3.8939,1.87E+00,301.7,944,22,Bunsen and Kirchoff,1860,0.242,6,1,2.32,2.09,-\n56,Barium,Ba,137.327,81,56,56,6,2,solid,,yes,yes,,,Alkaline Earth Metal,2.8,0.89,5.2117,3.59E+00,1002.15,2170,25,Davy,1808,0.204,6,2,1.96,1.61,1.49\n57,Lanthanum,La,138.905,82,57,57,6,3,solid,,yes,yes,,,Lanthanide,2.7,1.1,5.5769,6.15E+00,1193.15,3737,19,Mosander,1839,0.195,6,,1.8,1.39,1.39\n58,Cerium,Ce,140.116,82,58,58,6,,solid,,yes,yes,,,Lanthanide,2.7,1.12,5.5387,6.77E+00,1071.15,3716,19,Berzelius,1803,0.192,6,,1.63,1.37,1.31\n59,Praseodymium,Pr,140.908,82,59,59,6,,solid,,yes,yes,,,Lanthanide,2.7,1.13,5.473,6.77E+00,1204.15,3793,15,von Welsbach,1885,0.193,6,,1.76,1.38,1.28\n60,Neodymium,Nd,144.242,84,60,60,6,,solid,,yes,yes,,,Lanthanide,2.6,1.14,5.525,7.01E+00,1289.15,3347,16,von Welsbach,1885,0.19,6,,1.74,1.37,-\n61,Promethium,Pm,145,84,61,61,6,,artificial,yes,,yes,,,Lanthanide,2.6,1.13,5.582,7.26E+00,1204.15,3273,14,Marinsky et al.,1945,,6,,1.73,1.35,-\n62,Samarium,Sm,150.36,88,62,62,6,,solid,,yes,yes,,,Lanthanide,2.6,1.17,5.6437,7.52E+00,1345.15,2067,17,Boisbaudran,1879,0.197,6,,1.72,1.34,-\n63,Europium,Eu,151.964,89,63,63,6,,solid,,yes,yes,,,Lanthanide,2.6,1.2,5.6704,5.24E+00,1095.15,1802,21,Demarcay,1901,0.182,6,,1.68,1.34,-\n64,Gadolinium,Gd,157.25,93,64,64,6,,solid,,yes,yes,,,Lanthanide,2.5,1.2,6.1501,7.90E+00,1585.15,3546,17,de 
Marignac,1880,0.236,6,,1.69,1.35,1.32\n65,Terbium,Tb,158.925,94,65,65,6,,solid,,yes,yes,,,Lanthanide,2.5,1.2,5.8638,8.23E+00,1630.15,3503,24,Mosander,1843,0.182,6,,1.68,1.35,-\n66,Dysprosium,Dy,162.5,97,66,66,6,,solid,,yes,yes,,,Lanthanide,2.5,1.22,5.9389,8.55E+00,1680.15,2840,21,de Boisbaudran,1886,0.17,6,,1.67,1.33,-\n67,Holmium,Ho,164.93,98,67,67,6,,solid,,yes,yes,,,Lanthanide,2.5,1.23,6.0215,8.80E+00,1743.15,2993,29,Delafontaine and Soret,1878,0.165,6,,1.66,1.33,-\n68,Erbium,Er,167.259,99,68,68,6,,solid,,yes,yes,,,Lanthanide,2.5,1.24,6.1077,9.07E+00,1795.15,3503,16,Mosander,1843,0.168,6,,1.65,1.33,-\n69,Thulium,Tm,168.934,100,69,69,6,,solid,,yes,yes,,,Lanthanide,2.4,1.25,6.1843,9.32E+00,1818.15,2223,18,Cleve,1879,0.16,6,,1.64,1.31,-\n70,Ytterbium,Yb,173.054,103,70,70,6,,solid,,yes,yes,,,Lanthanide,2.4,1.1,6.2542,6.97E+00,1097.15,1469,16,Marignac,1878,0.155,6,,1.7,1.29,-\n71,Lutetium,Lu,174.967,104,71,71,6,,solid,,yes,yes,,,Lanthanide,2.3,1.27,5.4259,9.84E+00,1936.15,3675,22,Urbain/ von Welsbach,1907,0.154,6,,1.62,1.31,1.31\n72,Hafnium,Hf,178.49,106,72,72,6,4,solid,,yes,yes,,,Transition Metal,2.2,1.3,6.8251,1.33E+01,2500.15,4876,17,Coster and von Hevesy,1923,0.144,6,,1.52,1.28,1.22\n73,Tantalum,Ta,180.948,108,73,73,6,5,solid,,yes,yes,,,Transition Metal,2.1,1.5,7.5496,1.67E+01,3269.15,5731,19,Ekeberg,1801,0.14,6,,1.46,1.26,1.19\n74,Wolfram,W,183.84,110,74,74,6,6,solid,,yes,yes,,,Transition Metal,2,2.36,7.864,1.93E+01,3680.15,5828,22,J. and F. 
d'Elhuyar,1783,0.132,6,,1.37,1.2,1.15\n75,Rhenium,Re,186.207,111,75,75,6,7,solid,,yes,yes,,,Transition Metal,2,1.9,7.8335,2.10E+01,3453.15,5869,21,\"Noddack, Berg, and Tacke\",1925,0.137,6,,1.31,1.19,1.1\n76,Osmium,Os,190.23,114,76,76,6,8,solid,,yes,yes,,,Transition Metal,1.9,2.2,8.4382,2.26E+01,3300.15,5285,19,Tennant,1803,0.13,6,,1.29,1.16,1.09\n77,Iridium,Ir,192.217,115,77,77,6,9,solid,,yes,yes,,,Transition Metal,1.9,2.2,8.967,2.26E+01,2716.15,4701,25,Tennant,1804,0.131,6,,1.22,1.15,1.07\n78,Platinum,Pt,195.084,117,78,78,6,10,solid,,yes,yes,,,Transition Metal,1.8,2.28,8.9587,2.15E+01,2045.15,4098,32,Ulloa/Wood,1735,0.133,6,,1.23,1.12,1.1\n79,Gold,Au,196.967,118,79,79,6,11,solid,,yes,yes,,,Transition Metal,1.8,2.54,9.2255,1.93E+01,1337.73,3129,21,Prehistoric,,0.129,6,,1.24,1.21,1.23\n80,Mercury,Hg,200.59,121,80,80,6,12,liq,,yes,yes,,,Transition Metal,1.8,2,10.4375,1.35E+01,234.43,630,26,Prehistoric,,0.14,6,,1.33,1.42,-\n81,Thallium,Tl,204.383,123,81,81,6,13,solid,,yes,yes,,,Metal,2.1,2.04,6.1082,1.19E+01,577.15,1746,28,Crookes,1861,0.129,6,3,1.44,1.42,1.5\n82,Lead,Pb,207.2,125,82,82,6,14,solid,,yes,yes,,,Metal,1.8,2.33,7.4167,1.13E+01,600.75,2022,29,Prehistoric,,0.129,6,4,1.44,1.35,1.37\n83,Bismuth,Bi,208.98,126,83,83,6,15,solid,,yes,yes,,,Metal,1.6,2.02,7.2856,9.81E+00,544.67,1837,19,Geoffroy the Younger,1753,0.122,6,5,1.51,1.41,1.35\n84,Polonium,Po,210,126,84,84,6,16,solid,yes,yes,,,yes,Metalloid,1.5,2,8.417,9.32E+00,527.15,1235,34,Curie,1898,,6,6,1.45,1.35,1.29\n85,Astatine,At,210,125,85,85,6,17,solid,yes,yes,,yes,,Noble Gas,1.4,2.2,9.3,7.00E+00,575.15,610,21,Corson et al.,1940,,6,7,1.47,1.38,1.38\n86,Radon,Rn,222,136,86,86,6,18,gas,yes,yes,yes,,,Alkali Metal,1.3,,10.7485,9.73E-03,202.15,211.3,20,Dorn,1900,0.094,6,8,1.42,1.45,1.33\n87,Francium,Fr,223,136,87,87,7,1,solid,yes,yes,yes,,,Alkaline Earth 
Metal,,0.7,4.0727,1.87E+00,300.15,950,21,Perey,1939,,7,1,2.23,2.18,-\n88,Radium,Ra,226,138,88,88,7,2,solid,yes,yes,yes,,,Actinide,,0.9,5.2784,5.50E+00,973.15,2010,15,Pierre and Marie Curie,1898,,7,2,2.01,1.73,1.59\n89,Actinium,Ac,227,138,89,89,7,3,solid,yes,yes,yes,,,Actinide,,1.1,5.17,1.01E+01,1323.15,3471,11,Debierne/Giesel,1899,0.12,7,,1.86,1.53,1.4\n90,Thorium,Th,232.038,142,90,90,7,,solid,yes,yes,yes,,,Actinide,,1.3,6.3067,1.17E+01,2028.15,5061,12,Berzelius,1828,0.113,7,,1.75,1.43,1.36\n91,Protactinium,Pa,231.036,140,91,91,7,,solid,yes,yes,yes,,,Actinide,,1.5,5.89,1.54E+01,1873.15,4300,14,Hahn and Meitner,1917,,7,,1.69,1.38,1.29\n92,Uranium,U,238.029,146,92,92,7,,solid,yes,yes,yes,,,Actinide,,1.38,6.1941,1.90E+01,1405.15,4404,15,Peligot,1841,0.116,7,,1.7,1.34,1.18\n93,Neptunium,Np,237,144,93,93,7,,artificial,yes,,yes,,,Actinide,,1.36,6.2657,2.05E+01,913.15,4273,153,McMillan and Abelson,1940,,7,,1.71,1.36,1.16\n94,Plutonium,Pu,244,150,94,94,7,,artificial,yes,,yes,,,Actinide,,1.28,6.0262,1.98E+01,913.15,3501,163,Seaborg et al.,1940,,7,,1.72,1.35,-\n95,Americium,Am,243,148,95,95,7,,artificial,yes,,yes,,,Actinide,,1.3,5.9738,1.37E+01,1267.15,2880,133,Seaborg et al.,1944,,7,,1.66,1.35,-\n96,Curium,Cm,247,151,96,96,7,,artificial,yes,,yes,,,Actinide,,1.3,5.9915,1.35E+01,1340.15,3383,133,Seaborg et al.,1944,,7,,1.66,1.36,-\n97,Berkelium,Bk,247,150,97,97,7,,artificial,yes,,yes,,,Actinide,,1.3,6.1979,1.48E+01,1259.15,983,83,Seaborg et al.,1949,,7,,1.68,1.39,-\n98,Californium,Cf,251,153,98,98,7,,artificial,yes,,yes,,,Actinide,,1.3,6.2817,1.51E+01,1925.15,1173,123,Seaborg et al.,1950,,7,,1.68,1.4,-\n99,Einsteinium,Es,252,153,99,99,7,,artificial,yes,,yes,,,Actinide,,1.3,6.42,1.35E+01,1133.15,,123,Ghiorso et al.,1952,,7,,1.65,1.4,-\n100,Fermium,Fm,257,157,100,100,7,,artificial,yes,,yes,,,Actinide,,1.3,6.5,,,,103,Ghiorso et al.,1953,,7,,1.67,-,-\n101,Mendelevium,Md,258,157,101,101,7,,artificial,yes,,yes,,,Actinide,,1.3,6.58,,,,33,Ghiorso et 
al.,1955,,7,,1.73,1.39,-\n102,Nobelium,No,259,157,102,102,7,,artificial,yes,,yes,,,Actinide,,1.3,6.65,,,,73,Ghiorso et al.,1958,,7,,1.76,-,-\n103,Lawrencium,Lr,262,159,103,103,7,,artificial,yes,,yes,,,Actinide,,,,,,,203,Ghiorso et al.,1961,,7,,1.61,1.41,-\n104,Rutherfordium,Rf,261,157,104,104,7,4,artificial,yes,,yes,,,Transactinide,,,,1.81E+01,,,,Ghiorso et al.,1969,,7,,1.57,1.4,1.31\n105,Dubnium,Db,262,157,105,105,7,5,artificial,yes,,yes,,,Transactinide,,,,3.90E+01,,,,Ghiorso et al.,1970,,7,,1.49,1.36,1.26\n106,Seaborgium,Sg,266,160,106,106,7,6,artificial,yes,,yes,,,Transactinide,,,,3.50E+01,,,,Ghiorso et al.,1974,,7,,1.43,1.28,1.21\n107,Bohrium,Bh,264,157,107,107,7,7,artificial,yes,,yes,,,Transactinide,,,,3.70E+01,,,,Armbruster and M?nzenberg,1981,,7,,1.41,1.28,1.19\n108,Hassium,Hs,267,159,108,108,7,8,artificial,yes,,yes,,,Transactinide,,,,4.10E+01,,,,Armbruster and M?nzenberg,1983,,7,,1.34,1.25,1.18\n109,Meitnerium,Mt,268,159,109,109,7,9,artificial,yes,,yes,,,Transactinide,,,,3.50E+01,,,,\"GSI, Darmstadt, West Germany\",1982,,7,,1.29,1.25,1.13\n110,Darmstadtium ,Ds ,271,161,110,110,7,10,artificial,yes,,yes,,,Transactinide,,,,,,,,,1994,,7,,1.28,1.16,1.12\n111,Roentgenium ,Rg ,272,161,111,111,7,11,artificial,yes,,yes,,,Transactinide,,,,,,,,,1994,,7,,1.21,1.16,1.18\n112,Copernicium ,Cn ,285,173,112,112,7,12,artificial,yes,,yes,,,Transactinide,,,,,,,,,1996,,7,,1.22,1.37,1.3\n113,Nihonium,Nh,284,171,113,113,7,13,artificial,yes,,yes,,,,,,,,,,,,2004,,7,3,1.36,-,-\n114,Flerovium,Fl,289,175,114,114,7,14,artificial,yes,,yes,,,Transactinide,,,,,,,,,1999,,7,4,1.43,-,-\n115,Moscovium,Mc,288,173,115,115,7,15,artificial,yes,,yes,,,,,,,,,,,,2010,,7,5,1.62,-,-\n116,Livermorium,Lv,292,176,116,116,7,16,artificial,yes,,yes,,,Transactinide,,,,,,,,,2000,,7,6,1.75,-,-\n117,Tennessine,Ts,295,178,117,117,7,17,artificial,yes,,,yes,,,,,,,,,,,2010,,7,7,1.65,-,-\n118,Oganesson,Og,294,176,118,118,7,18,artificial,yes,,,yes,,Noble Gas,,,,,,,,,2006,,7,8,1.57,-,-\n"
  },
  {
    "path": "graphium/features/positional_encoding.py",
"content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Tuple, Union, Optional, Dict, Any, OrderedDict\nfrom copy import deepcopy\nimport numpy as np\nimport torch\nfrom scipy.sparse import spmatrix\nfrom collections import OrderedDict as OderedDictClass\n\nfrom graphium.features.spectral import compute_laplacian_pe\nfrom graphium.features.rw import compute_rwse\nfrom graphium.features.electrostatic import compute_electrostatic_interactions\nfrom graphium.features.commute import compute_commute_distances\nfrom graphium.features.graphormer import compute_graphormer_distances\nfrom graphium.features.transfer_pos_level import transfer_pos_level\n\n\ndef get_all_positional_encodings(\n    adj: Union[np.ndarray, spmatrix],\n    num_nodes: int,\n    pos_kwargs: Optional[Dict] = None,\n) -> Tuple[\"OrderedDict[str, np.ndarray]\"]:\n    r\"\"\"\n    Get all the positional encodings of the features.\n\n    Parameters:\n        adj [num_nodes, num_nodes]: Adjacency matrix of the graph\n        num_nodes: Number of nodes in the graph\n        pos_kwargs: Keyword arguments for the function `graph_positional_encoder`\n            used to generate the positional encodings for the node features.\n\n    Returns:\n        pe_dict: Dictionary of positional and structural encodings\n    \"\"\"\n\n    pos_kwargs = {} if pos_kwargs is None else pos_kwargs\n\n    pe_dict = 
OderedDictClass()\n\n    # Initialize cache\n    cache = {}\n\n    # Get the positional encoding for the features\n    if len(pos_kwargs) > 0:\n        for pos_name, this_pos_kwargs in pos_kwargs[\"pos_types\"].items():\n            this_pos_kwargs = deepcopy(this_pos_kwargs)\n            pos_type = this_pos_kwargs.pop(\"pos_type\", None)\n            pos_level = this_pos_kwargs.pop(\"pos_level\", None)\n            this_pe, cache = graph_positional_encoder(\n                deepcopy(adj),\n                num_nodes,\n                pos_type=pos_type,\n                pos_level=pos_level,\n                pos_kwargs=this_pos_kwargs,\n                cache=cache,\n            )\n            if pos_level == \"node\":\n                pe_dict.update({f\"{pos_type}\": this_pe})\n            else:\n                pe_dict.update({f\"{pos_level}_{pos_type}\": this_pe})\n\n    return pe_dict\n\n\ndef graph_positional_encoder(\n    adj: Union[np.ndarray, spmatrix],\n    num_nodes: int,\n    pos_type: Optional[str] = None,\n    pos_level: Optional[str] = None,\n    pos_kwargs: Optional[Dict[str, Any]] = None,\n    cache: Optional[Dict[str, Any]] = None,\n) -> Tuple[Dict[str, np.ndarray], Dict[str, Any]]:\n    r\"\"\"\n    Get a positional encoding that depends on the parameters.\n\n    Parameters:\n        adj [num_nodes, num_nodes]: Adjacency matrix of the graph\n        num_nodes: Number of nodes in the graph\n        pos_type: The type of positional encoding to use. If None, it must be provided by `pos_kwargs[\"pos_type\"]`. Supported types are:\n            - laplacian_eigvec              \\\n            - laplacian_eigval               \\  -> cache connected comps. & eigendecomp.\n            - rwse\n            - electrostatic                 \\\n            - commute                        \\  -> cache pinvL\n            - graphormer\n        pos_level: Positional level to output. 
If None, it must be provided by `pos_kwargs[\"pos_level\"]`.\n            - node\n            - edge\n            - nodepair\n            - graph\n        pos_kwargs: Extra keyword arguments for the positional encoding. Can include the keys pos_type and pos_level.\n        cache: Dictionary of cached objects\n\n    Returns:\n        pe: Positional or structural encoding\n        cache: Updated dictionary of cached objects\n    \"\"\"\n\n    pos_kwargs = deepcopy(pos_kwargs)\n    if pos_kwargs is None:\n        pos_kwargs = {}\n    if cache is None:\n        cache = {}\n\n    # Get the positional type\n    pos_type2 = pos_kwargs.pop(\"pos_type\", None)\n    if pos_type is None:\n        pos_type = pos_type2\n    if pos_type2 is not None:\n        assert (\n            pos_type == pos_type2\n        ), f\"The positional type must be the same in `pos_type` and `pos_kwargs['pos_type']`. Provided: {pos_type} and {pos_type2}\"\n    assert pos_type is not None, \"Either `pos_type` or `pos_kwargs['pos_type']` must be provided.\"\n\n    # Get the positional level\n    pos_level2 = pos_kwargs.pop(\"pos_level\", None)\n    if pos_level is None:\n        pos_level = pos_level2\n    if pos_level2 is not None:\n        assert (\n            pos_level == pos_level2\n        ), f\"The positional level must be the same in `pos_level` and `pos_kwargs['pos_level']`. 
Provided: {pos_level} and {pos_level2}\"\n    assert pos_level is not None, \"Either `pos_level` or `pos_kwargs['pos_level']` must be provided.\"\n\n    # Convert to numpy array\n    if isinstance(adj, torch.sparse.Tensor):\n        adj = adj.to_dense().numpy()\n    elif isinstance(adj, torch.Tensor):\n        adj = adj.numpy()\n    adj = adj.astype(np.float64)\n\n    # Calculate positional encoding\n    if pos_type == \"laplacian_eigvec\":\n        _, pe, base_level, cache = compute_laplacian_pe(adj, cache=cache, **pos_kwargs)\n\n    elif pos_type == \"laplacian_eigval\":\n        pe, _, base_level, cache = compute_laplacian_pe(adj, cache=cache, **pos_kwargs)\n\n    elif pos_type == \"rw_return_probs\":\n        pe, base_level, cache = compute_rwse(\n            adj.astype(np.float32), num_nodes=num_nodes, cache=cache, pos_type=pos_type, **pos_kwargs\n        )\n\n    elif pos_type == \"rw_transition_probs\":\n        pe, base_level, cache = compute_rwse(\n            adj.astype(np.float32), num_nodes=num_nodes, cache=cache, pos_type=pos_type, **pos_kwargs\n        )\n\n    elif pos_type == \"electrostatic\":\n        pe, base_level, cache = compute_electrostatic_interactions(adj, cache, **pos_kwargs)\n\n    elif pos_type == \"commute\":\n        pe, base_level, cache = compute_commute_distances(adj, num_nodes, cache, **pos_kwargs)\n\n    elif pos_type == \"graphormer\":\n        pe, base_level, cache = compute_graphormer_distances(adj, num_nodes, cache, **pos_kwargs)\n\n    else:\n        raise ValueError(f\"Unknown `pos_type`: {pos_type}\")\n\n    # Convert to float32 and Convert between different pos levels\n    if isinstance(pe, (list, tuple)):\n        pe = [this_pe.astype(np.float32) for this_pe in pe]\n        pe = [transfer_pos_level(this_pe, base_level, pos_level, adj, num_nodes, cache) for this_pe in pe]\n    else:\n        pe = np.real(pe).astype(np.float32)\n        pe = transfer_pos_level(pe, base_level, pos_level, adj, num_nodes, cache)\n\n    return 
pe, cache\n"
  },
  {
    "path": "graphium/features/properties.py",
"content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Union, List, Callable\n\nimport numpy as np\nimport datamol as dm\n\nfrom rdkit.Chem import rdMolDescriptors as rdMD\nfrom loguru import logger\n\n\ndef get_prop_or_none(\n    prop: Callable, n: int, *args: Union[dm.Mol, str], **kwargs: Union[dm.Mol, str]\n) -> Union[List[float], List[None]]:\n    r\"\"\"\n    Return properties. If an error occurs, return a list of `None` with length `n`.\n    Parameters:\n        prop: The property to compute.\n        n: The number of elements in the property.\n        *args: The arguments to pass to the property.\n        **kwargs: The keyword arguments to pass to the property.\n    Returns:\n        The property or a list of `None` with length `n`.\n    \"\"\"\n    logger.warning(\"get_prop_or_none is deprecated. Use `datamol.to_fp` instead.\")\n    try:\n        return prop(*args, **kwargs)\n    except RuntimeError:\n        return [None] * n\n\n\ndef get_props_from_mol(\n    mol: Union[dm.Mol, str],\n    properties: Union[List[str], str] = \"autocorr3d\",\n) -> np.ndarray:\n    r\"\"\"\n    Function to get a given set of desired properties from a molecule,\n    and output a property list.\n\n    Parameters:\n        mol: The molecule from which to compute the properties.\n        properties:\n            The list of properties to compute for each molecule. 
It can be the following:\n\n            - 'descriptors'\n            - 'autocorr3d'\n            - 'rdf'\n            - 'morse'\n            - 'whim'\n            - 'all'\n\n    Returns:\n        props: np.array(float)\n            The array of properties for the desired molecule\n        classes_start_idx: list(int)\n            The list of index specifying the start of each new class of\n            descriptor or property. For example, if props has 20 elements,\n            the first 5 are rotatable bonds, the next 8 are morse, and\n            the rest are whim, then ``classes_start_idx = [0, 5, 13]``.\n            This will mainly be useful to normalize the features of\n            each class.\n        classes_names: list(str)\n            The name of the classes associated to each starting index.\n            Will be usefull to understand what property is the network learning.\n\n    \"\"\"\n\n    logger.warning(\"get_props_from_mol is deprecated. Use `datamol.to_fp` instead.\")\n\n    if isinstance(mol, str):\n        mol = dm.to_mol(\n            mol\n        )  # Doesn't need `ordered=True` because the fingerprints don't depend on the atom order\n\n    if isinstance(properties, str):\n        properties = [properties]\n\n    properties = [p.lower() for p in properties]\n\n    # Initialize arrays\n    props = []  # Property vector for the features\n    classes_start_idx = []  # The starting index for each property class\n    classes_names = []\n\n    # Generate a 3D structure for the molecule\n    mol = dm.add_hs(mol)\n\n    if (\"autocorr3d\" in properties) or (\"all\" in properties):\n        # Some kind of 3D description of the molecule\n        classes_names.append(\"autocorr3d\")\n        classes_start_idx.append(len(props))\n        props.extend(get_prop_or_none(rdMD.CalcAUTOCORR3D, 80, mol))\n\n    if (\"rdf\" in properties) or (\"all\" in properties):\n        # The radial distribution function (better than the inertia)\n        # 
https://en.wikipedia.org/wiki/Radial_distribution_function\n        classes_names.append(\"rdf\")\n        classes_start_idx.append(len(props))\n        props.extend(get_prop_or_none(rdMD.CalcRDF, 210, mol))\n\n    if (\"morse\" in properties) or (\"all\" in properties):\n        # Molecule Representation of Structures based on Electron diffraction descriptors\n        classes_names.append(\"morse\")\n        classes_start_idx.append(len(props))\n        props.extend(get_prop_or_none(rdMD.CalcMORSE, 224, mol))\n\n    if (\"whim\" in properties) or (\"all\" in properties):\n        # WHIM descriptors are 3D structural descriptors obtained from the\n        # (x,y,z)‐atomic coordinates of a molecular conformation of a chemical,\n        # and are used successfully in QSAR modelling.\n        classes_names.append(\"whim\")\n        classes_start_idx.append(len(props))\n        props.extend(get_prop_or_none(rdMD.CalcWHIM, 114, mol))\n\n    return np.array(props), classes_start_idx, classes_names\n"
  },
  {
    "path": "graphium/features/rw.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Tuple, Union, Optional, List, Dict, Any, Iterable\n\nfrom scipy.sparse import issparse, spmatrix, coo_matrix\nimport numpy as np\nimport torch\n\nfrom torch_geometric.utils import to_dense_adj, from_scipy_sparse_matrix\nfrom torch_scatter import scatter_add\nfrom torch_geometric.utils.num_nodes import maybe_num_nodes\n\n\ndef compute_rwse(\n    adj: Union[np.ndarray, spmatrix],\n    ksteps: Union[int, List[int]],\n    num_nodes: int,\n    cache: Dict[str, Any],\n    pos_type: str = \"rw_return_probs\" or \"rw_transition_probs\",\n    space_dim: int = 0,\n) -> Tuple[np.ndarray, str, Dict[str, Any]]:\n    \"\"\"\n    Compute Random Walk Spectral Embedding (RWSE) for given list of K steps.\n\n    Parameters:\n        adj [num_nodes, num_nodes]: Adjacency matrix\n        ksteps: List of numbers of steps for the random walks. If int, a list is generated from 1 to ksteps.\n        num_nodes: Number of nodes in the graph\n        cache: Dictionary of cached objects\n        pos_type: Desired output\n        space_dim: Estimated dimensionality of the space. 
Used to\n            correct the random-walk diagonal by a factor `k^(space_dim/2)`.\n            In euclidean space, this correction means that the height of\n            the gaussian distribution stays almost constant across the number of\n            steps, if `space_dim` is the dimension of the euclidean space.\n    Returns:\n        Two possible outputs:\n            rw_return_probs [num_nodes, len(ksteps)]: Random-Walk k-step landing probabilities\n            rw_transition_probs [num_nodes, num_nodes, len(ksteps)]:  Random-Walk k-step transition probabilities\n        base_level: Indicator of the output pos_level (node, edge, nodepair, graph) -> here either node or nodepair\n        cache: Updated dictionary of cached objects\n    \"\"\"\n\n    base_level = \"node\" if pos_type == \"rw_return_probs\" else \"nodepair\"\n\n    # Manually handles edge case of 1 atom molecules here\n    if not isinstance(ksteps, Iterable):\n        ksteps = list(range(1, ksteps + 1))\n    if num_nodes == 1:\n        if pos_type == \"rw_return_probs\":\n            return np.ones((1, len(ksteps))), base_level, cache\n        else:\n            return np.ones((1, 1, len(ksteps))), base_level, cache\n\n    # Get the edge indices from the adjacency matrix\n    if not issparse(adj):\n        if \"coo_adj\" in cache:\n            adj = cache[\"coo_adj\"]\n        elif \"csr_adj\" in cache:\n            adj = cache[\"csr_adj\"]\n        else:\n            adj = coo_matrix(adj, dtype=np.float64)\n            cache[\"coo_adj\"] = adj\n\n    edge_index, edge_weight = from_scipy_sparse_matrix(adj)\n\n    # Compute the random-walk transition probabilities\n    if \"ksteps\" in cache:\n        cached_k = cache[\"ksteps\"]\n        missing_k = [k for k in ksteps if k not in cached_k]\n        if missing_k == []:\n            pass\n        elif min(missing_k) < min(cached_k):\n            Pk_dict = get_Pks(missing_k, edge_index=edge_index, edge_weight=edge_weight, num_nodes=num_nodes)\n        
    cache[\"ksteps\"] = sorted(missing_k + cache[\"ksteps\"])\n            for k in missing_k:\n                cache[\"Pk\"][k] = Pk_dict[k]\n        else:\n            start_k = min([max(cached_k), min(missing_k)])\n            start_Pk = cache[\"Pk\"][start_k]\n            Pk_dict = get_Pks(\n                missing_k,\n                edge_index=edge_index,\n                edge_weight=edge_weight,\n                num_nodes=num_nodes,\n                start_Pk=start_Pk,\n                start_k=start_k,\n            )\n            cache[\"ksteps\"] = sorted(cache[\"ksteps\"] + missing_k)\n            for k in missing_k:\n                cache[\"Pk\"][k] = Pk_dict[k]\n    else:\n        Pk_dict = get_Pks(ksteps, edge_index=edge_index, edge_weight=edge_weight, num_nodes=num_nodes)\n\n        cache[\"ksteps\"] = list(Pk_dict.keys())\n        cache[\"Pk\"] = Pk_dict\n\n    pe_list = []\n    if pos_type == \"rw_return_probs\":\n        for k in ksteps:\n            pe_list.append(torch.diagonal(cache[\"Pk\"][k], dim1=-2, dim2=-1) * (k ** (space_dim / 2)))\n    else:\n        for k in ksteps:\n            pe_list.append(cache[\"Pk\"][k])\n\n    pe = torch.stack(pe_list, dim=-1).numpy()\n\n    return pe, base_level, cache\n\n\ndef get_Pks(\n    ksteps: List[int],\n    edge_index: Tuple[torch.Tensor, torch.Tensor],\n    edge_weight: Optional[torch.Tensor] = None,\n    num_nodes: Optional[int] = None,\n    start_Pk: Optional[torch.Tensor] = None,\n    start_k: Optional[int] = None,\n) -> Dict[int, np.ndarray]:\n    \"\"\"\n    Compute Random Walk landing probabilities for given list of K steps.\n\n    Parameters:\n        ksteps: List of numbers of k-steps for which to compute the RW landings\n        edge_index: PyG sparse representation of the graph\n        edge_weight: Edge weights\n        num_nodes: Number of nodes in the graph\n\n    Returns:\n        2D Tensor with shape (num_nodes, len(ksteps)) with RW landing probs\n    \"\"\"\n    if edge_weight is None:\n   
     edge_weight = torch.ones(edge_index.size(1), device=edge_index.device)\n    num_nodes = maybe_num_nodes(edge_index, num_nodes)\n    src = edge_index[0]\n    deg = scatter_add(edge_weight, src, dim=0, dim_size=num_nodes)  # Out degrees.\n    deg_inv = deg.pow(-1.0)\n    deg_inv.masked_fill_(deg_inv == float(\"inf\"), 0)\n\n    if edge_index.numel() == 0:\n        P = edge_index.new_zeros((1, num_nodes, num_nodes))\n    else:\n        # P = D^-1 * A\n        P = torch.diag(deg_inv).float() @ to_dense_adj(\n            edge_index, max_num_nodes=num_nodes\n        )  # 1 x (Num nodes) x (Num nodes)\n\n    if start_Pk is not None:\n        Pk = start_Pk @ P.clone().detach().matrix_power(min(ksteps) - start_k)\n    else:\n        Pk = P.clone().detach().matrix_power(min(ksteps))\n\n    Pk_dict = {}\n    for k in range(min(ksteps), max(ksteps) + 1):\n        Pk_dict[k] = Pk.squeeze(0)\n        Pk = Pk @ P\n\n    return Pk_dict\n"
  },
  {
    "path": "graphium/features/spectral.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Tuple, Union, Dict, Any\nfrom scipy.linalg import eig\nfrom scipy.sparse import csr_matrix, diags, issparse, spmatrix\nimport numpy as np\nimport torch\nimport networkx as nx\n\nfrom graphium.utils.tensor import is_dtype_torch_tensor, is_dtype_numpy_array\n\n\ndef compute_laplacian_pe(\n    adj: Union[np.ndarray, spmatrix],\n    num_pos: int,\n    cache: Dict[str, Any],\n    disconnected_comp: bool = True,\n    normalization: str = \"none\",\n) -> Tuple[np.ndarray, str, Dict[str, Any]]:\n    r\"\"\"\n    Compute the Laplacian eigenvalues and eigenvectors of the Laplacian of the graph.\n\n    Parameters:\n        adj [num_nodes, num_nodes]: Adjacency matrix of the graph\n        num_pos: Number of Laplacian eigenvectors to compute\n        cache: Dictionary of cached objects\n        disconnected_comp: Whether to compute the eigenvectors for each connected component\n        normalization: Normalization to apply to the Laplacian\n\n    Returns:\n        Two possible outputs:\n            eigvals [num_nodes, num_pos]: Eigenvalues of the Laplacian repeated for each node.\n                This repetition is necessary in case of disconnected components, where\n                the eigenvalues of the Laplacian are not the same for each node.\n            eigvecs [num_nodes, num_pos]: Eigenvectors 
of the Laplacian\n        base_level: Indicator of the output pos_level (node, edge, nodepair, graph) -> here node\n        cache: Updated dictionary of cached objects\n    \"\"\"\n\n    base_level = \"node\"\n\n    # Sparsify the adjacency patrix\n    if not issparse(adj):\n        if \"csr_adj\" not in cache:\n            adj = csr_matrix(adj, dtype=np.float64)\n            cache[\"csr_adj\"] = adj\n        else:\n            adj = cache[\"csr_adj\"]\n\n    # Compute the Laplacian, and normalize it\n    if f\"L_{normalization}_sp\" not in cache:\n        D = np.array(np.sum(adj, axis=1)).flatten()\n        D_mat = diags(D)\n        L = -adj + D_mat\n        L_norm = normalize_matrix(L, degree_vector=D, normalization=normalization)\n        cache[f\"L_{normalization}_sp\"] = L_norm\n    else:\n        L_norm = cache[f\"L_{normalization}_sp\"]\n\n    components = []\n\n    if disconnected_comp:\n        if \"components\" not in cache:\n            # Get the list of connected components\n            components = list(nx.connected_components(nx.from_scipy_sparse_array(adj)))\n            cache[\"components\"] = components\n\n        else:\n            components = cache[\"components\"]\n\n    # Compute the eigenvectors for each connected component, and stack them together\n    if len(components) > 1:\n        if \"lap_eig_comp\" not in cache:\n            eigvals = np.zeros((adj.shape[0], num_pos), dtype=np.complex64)\n            eigvecs = np.zeros((adj.shape[0], num_pos), dtype=np.complex64)\n            for component in components:\n                comp = list(component)\n                this_L = L_norm[comp][:, comp]\n                this_eigvals, this_eigvecs = _get_positional_eigvecs(this_L, num_pos=num_pos)\n\n                # Eigenvalues previously set to infinity are now set to 0\n                # Any NaN in the eigvals or eigvecs will be set to 0\n                this_eigvecs[~np.isfinite(this_eigvecs)] = 0.0\n                
this_eigvals[~np.isfinite(this_eigvals)] = 0.0\n\n                eigvals[comp, :] = np.expand_dims(this_eigvals, axis=0)\n                eigvecs[comp, :] = this_eigvecs\n            cache[\"lap_eig_comp\"] = (eigvals, eigvecs)\n\n        else:\n            eigvals, eigvecs = cache[\"lap_eig_comp\"]\n\n    else:\n        if \"lap_eig\" not in cache:\n            eigvals, eigvecs = _get_positional_eigvecs(L, num_pos=num_pos)\n\n            # Eigenvalues previously set to infinity are now set to 0\n            # Any NaN in the eigvals or eigvecs will be set to 0\n            eigvecs[~np.isfinite(eigvecs)] = 0.0\n            eigvals[~np.isfinite(eigvals)] = 0.0\n            eigvals = np.repeat(np.expand_dims(eigvals, axis=0), adj.shape[0], axis=0)\n\n            cache[\"lap_eig\"] = (eigvals, eigvecs)\n\n        else:\n            eigvals, eigvecs = cache[\"lap_eig\"]\n\n    return eigvals, eigvecs, base_level, cache\n\n\ndef _get_positional_eigvecs(\n    matrix: Union[np.ndarray, spmatrix],\n    num_pos: int,\n) -> Tuple[np.ndarray, np.ndarray]:\n    r\"\"\"\n    compute the eigenvalues and eigenvectors of a matrix\n    Parameters:\n        matrix: Matrix to compute the eigenvalues and eigenvectors of\n        num_pos: Number of eigenvalues and eigenvectors to compute\n    Returns:\n        eigvals: Eigenvalues of the matrix\n        eigvecs: Eigenvectors of the matrix\n    \"\"\"\n    mat_len = matrix.shape[0]\n    eigvals, eigvecs = eig(matrix.todense())\n\n    # Pad with non-sense eigenvectors if required\n    if num_pos > mat_len:\n        temp_EigVal = np.ones(num_pos - mat_len, dtype=np.float64) + float(\"inf\")\n        temp_EigVec = np.zeros((mat_len, num_pos - mat_len), dtype=np.float64)\n        eigvals = np.concatenate([eigvals, temp_EigVal], axis=0)\n        eigvecs = np.concatenate([eigvecs, temp_EigVec], axis=1)\n\n    # Sort and keep only the first `num_pos` elements\n    sort_idx = eigvals.argsort()\n    eigvals = eigvals[sort_idx]\n    eigvals = 
eigvals[:num_pos]\n    eigvecs = eigvecs[:, sort_idx]\n    eigvecs = eigvecs[:, :num_pos]\n\n    # Normalize the eigvecs\n    eigvecs = eigvecs / np.maximum(np.sqrt(np.sum(eigvecs**2, axis=0, keepdims=True)), 1e-4)\n\n    return eigvals, eigvecs\n\n\ndef normalize_matrix(\n    matrix: Union[np.ndarray, spmatrix],\n    degree_vector=None,\n    normalization: str = None,\n) -> Union[np.ndarray, spmatrix]:\n    r\"\"\"\n    Normalize a given matrix using its degree vector\n\n    Parameters\n    ---------------\n\n        matrix: torch.tensor(N, N) or scipy.sparse.spmatrix(N, N)\n            A square matrix representing either an Adjacency matrix or a Laplacian.\n\n        degree_vector: torch.tensor(N) or np.ndarray(N) or None\n            A vector representing the degree of ``matrix``.\n            ``None`` is only accepted if ``normalization==None``\n\n        normalization: str or None, Default='none'\n            Normalization to use on the eig_matrix\n\n            - 'none' or ``None``: no normalization\n\n            - 'sym': Symmetric normalization ``D^-0.5 L D^-0.5``\n\n            - 'inv': Inverse normalization ``D^-1 L``\n\n    Returns\n    -----------\n        matrix: torch.tensor(N, N) or scipy.sparse.spmatrix(N, N)\n            The normalized matrix\n\n    \"\"\"\n\n    # Transform the degree vector into a matrix\n    if degree_vector is None:\n        if not ((normalization is None) or (normalization.lower() == \"none\")):\n            raise ValueError(\"`degree_vector` cannot be `None` if `normalization` is not `None`\")\n    else:\n        if is_dtype_numpy_array(matrix.dtype):\n            with np.errstate(divide=\"ignore\", invalid=\"ignore\"):\n                degree_inv = np.expand_dims(degree_vector**-0.5, axis=1)\n                degree_inv[np.isinf(degree_inv)] = 0\n        elif is_dtype_torch_tensor(matrix.dtype):\n            degree_inv = torch.unsqueeze(degree_vector**-0.5, dim=1)\n            degree_inv[torch.isinf(degree_inv)] = 0\n\n    # 
Compute the normalized matrix\n    if (normalization is None) or (normalization.lower() == \"none\"):\n        pass\n    elif normalization.lower() == \"sym\":\n        matrix = degree_inv * matrix * degree_inv.T\n    elif normalization.lower() == \"inv\":\n        matrix = (degree_inv**2) * matrix\n    else:\n        raise ValueError(\n            f'`normalization` should be `None`, `\"None\"`, `\"sym\"` or `\"inv\"`, but `{normalization}` was provided'\n        )\n\n    return matrix\n"
  },
  {
    "path": "graphium/features/transfer_pos_level.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Tuple, Union, List, Dict, Any, Optional\n\nimport numpy as np\n\nfrom scipy.sparse import spmatrix, issparse, coo_matrix\n\nfrom torch_geometric.utils import from_scipy_sparse_matrix\n\n\ndef transfer_pos_level(\n    pe: np.ndarray,\n    in_level: str,\n    out_level: str,\n    adj: Union[np.ndarray, spmatrix],\n    num_nodes: int,\n    cache: Optional[Dict[str, Any]] = None,\n) -> np.ndarray:\n    r\"\"\"\n    Transfer positional encoding between different positional levels (node, edge, nodepair, graph)\n\n    Parameters:\n        pe: Input pe with pos_level defined by in_level\n        in_level: pos_level of input pe\n        out_level: desired pos_level of output pe\n        adj [num_nodes, num_nodes]: Adjacency matrix of the graph\n        num_nodes: Number of nodes in the graph\n        cache: Dictionary of cached objects\n\n    Returns:\n        pe: Output pe with pos_level defined by out_level\n    \"\"\"\n\n    if cache is None:\n        cache = {}\n\n    if in_level == \"node\":\n        if out_level == \"node\":\n            pass\n\n        elif out_level == \"edge\":\n            pe, cache = node_to_edge(pe, adj, cache)\n\n        elif out_level == \"nodepair\":\n            pe = node_to_nodepair(pe, num_nodes)\n\n        elif out_level == \"graph\":\n            raise 
NotImplementedError(\"Transfer function (node -> graph) not yet implemented.\")\n\n        else:\n            raise ValueError(f\"Unknown `pos_level`: {out_level}\")\n\n    elif in_level == \"edge\":\n        raise NotImplementedError(\"Transfer function (edge -> *) not yet implemented.\")\n\n    elif in_level == \"nodepair\":\n        if len(pe.shape) == 2:\n            pe = np.expand_dims(pe, -1)\n\n        if out_level == \"node\":\n            pe = nodepair_to_node(pe)\n\n        elif out_level == \"edge\":\n            pe, cache = nodepair_to_edge(pe, adj, cache)\n\n        elif out_level == \"nodepair\":\n            pass\n\n        elif out_level == \"graph\":\n            raise NotImplementedError(\"Transfer function (nodepair -> graph) not yet implemented.\")\n\n        else:\n            raise ValueError(f\"Unknown `pos_level`: {out_level}\")\n\n    elif in_level == \"graph\":\n        if out_level == \"node\":\n            pe = graph_to_node(pe, num_nodes, cache)\n\n        elif out_level in [\"edge\", \"nodepair\"]:\n            raise NotImplementedError(\"Transfer function (graph -> edge/nodepair) not yet implemented.\")\n\n        else:\n            raise ValueError(f\"Unknown `pos_level`: {out_level}\")\n\n    else:\n        raise ValueError(f\"Unknown `pos_level`: {in_level}\")\n\n    return pe\n\n\n# Transfer functions between different levels, i.e., node, edge, nodepair and graph level.\n\n# TODO:\n#   - Implement missing transfer functions below\n#   - Are transfer functions graph -> edge/nodepair and edge -> graph needed?\n\n\ndef node_to_edge(\n    pe: np.ndarray, adj: Union[np.ndarray, spmatrix], cache: Optional[Dict[str, Any]] = None\n) -> Tuple[np.ndarray, Dict[str, Any]]:\n    r\"\"\"\n    Get an edge-level positional encoding from a node-level positional encoding.\n     -> For each edge, concatenate the sum and absolute difference of pe of source and destination node.\n\n    Parameters:\n        pe [num_nodes, num_feat]: Node-level 
positional encoding\n        adj [num_nodes, num_nodes]: Adjacency matrix of the graph\n        cache: Dictionary of cached objects\n\n    Returns:\n        edge_pe [2 * num_edges, 2 * num_feat]: Edge-level positional encoding\n        cache: Updated dictionary of cached objects\n    \"\"\"\n\n    if cache is None:\n        cache = {}\n\n    if not issparse(adj):\n        if \"coo_adj\" in cache:\n            adj = cache[\"coo_adj\"]\n        elif \"csr_adj\" in cache:\n            adj = cache[\"csr_adj\"]\n        else:\n            adj = coo_matrix(adj, dtype=np.float64)\n            cache[\"coo_adj\"] = adj\n\n    edge_index, _ = from_scipy_sparse_matrix(adj)\n    src, dst = edge_index[0], edge_index[1]\n\n    pe_sum = pe[src] + pe[dst]\n    pe_abs_diff = np.abs(pe[src] - pe[dst])\n\n    edge_pe = np.concatenate((pe_sum, pe_abs_diff), axis=-1)\n\n    return edge_pe, cache\n\n\ndef node_to_nodepair(pe: np.ndarray, num_nodes: int) -> np.ndarray:\n    r\"\"\"\n    Get a nodepair-level positional encoding from a node-level positional encoding.\n     -> For each nodepair (i,j) concatenate the sum and absolute difference of pe at node i and j.\n\n    Parameters:\n        pe [num_nodes, num_feat]: Node-level positional encoding\n        num_nodes: Number of nodes in the graph\n\n    Returns:\n        nodepair_pe [num_nodes, num_nodes, 2 * num_feat]: Nodepair-level positional encoding\n    \"\"\"\n\n    expanded_pe = np.expand_dims(pe, axis=1)\n    expanded_pe = np.repeat(expanded_pe, repeats=num_nodes, axis=1)\n\n    pe_sum = expanded_pe + expanded_pe.transpose([1, 0, 2])\n    pe_abs_diff = np.abs(expanded_pe - expanded_pe.transpose([1, 0, 2]))\n\n    nodepair_pe = np.concatenate((pe_sum, pe_abs_diff), axis=-1)\n\n    return nodepair_pe\n\n\ndef node_to_graph(pe: np.ndarray, num_nodes: int) -> np.ndarray:\n    r\"\"\"\n    Get a graph-level positional encoding from a node-level positional encoding.\n     -> E.g., min/max/mean-pooling of node features.\n\n    
Parameters:\n        pe [num_nodes, num_feat]: Node-level positional encoding\n        num_nodes: Number of nodes in the graph\n\n    Returns:\n        graph_pe [1, num_feat]: Graph-level positional encoding\n    \"\"\"\n\n    raise NotImplementedError(\"Transfer function (node -> graph) not yet implemented.\")\n\n\ndef edge_to_node(pe: np.ndarray, adj: Union[np.ndarray, spmatrix]) -> np.ndarray:\n    r\"\"\"\n    Get a node-level positional encoding from an edge-level positional encoding.\n     -> E.g., min/max/mean-pooling of information from edges (i,j) that contain node i\n\n    Parameters:\n        pe [num_edges, num_feat]: Edge-level positional encoding\n        adj [num_nodes, num_nodes]: Adjacency matrix of the graph\n\n    Returns:\n        node_pe [num_edges, num_feat]: Node-level positional encoding\n    \"\"\"\n\n    raise NotImplementedError(\"Transfer function (edge -> node) not yet implemented.\")\n\n\ndef edge_to_nodepair(\n    pe: np.ndarray, adj: Union[np.ndarray, spmatrix], num_nodes: int, cache: Optional[Dict[str, Any]] = None\n) -> np.ndarray:\n    r\"\"\"\n    Get a nodepair-level positional encoding from an edge-level positional encoding.\n     -> Zero-padding of non-existing edges.\n\n    Parameters:\n        pe [num_edges, num_feat]: Edge-level positional encoding\n        adj [num_nodes, num_nodes]: Adjacency matrix of the graph\n        num_nodes: Number of nodes in the graph\n        cache: Dictionary of cached objects\n\n    Returns:\n        nodepair_pe [num_edges, num_edges, num_feat]: Nodepair-level positional encoding\n        cache: Updated dictionary of cached objects\n    \"\"\"\n\n    if cache is None:\n        cache = {}\n\n    num_feat = pe.shape[-1]\n\n    if not isinstance(adj, coo_matrix):\n        if \"coo_adj\" in cache:\n            adj = cache[\"coo_adj\"]\n        else:\n            adj = coo_matrix(adj, dtype=np.float64)\n        cache[\"coo_adj\"] = adj\n\n    dst, src = adj.row, adj.col\n\n    nodepair_pe = 
np.zeros((num_nodes, num_nodes, num_feat))\n\n    for i in range(len(dst)):\n        nodepair_pe[dst[i], src[i], ...] = pe[i, ...]\n\n    return nodepair_pe, cache\n\n\ndef edge_to_graph(pe: np.ndarray) -> np.ndarray:\n    r\"\"\"\n    Get a graph-level positional encoding from an edge-level positional encoding.\n\n    Parameters:\n        pe [num_edges, num_feat]: Edge-level positional encoding\n\n    Returns:\n        graph_pe [1, num_feat]: Graph-level positional encoding\n    \"\"\"\n\n    raise NotImplementedError(\"Transfer function (edge -> graph) not yet implemented.\")\n\n\ndef nodepair_to_node(pe: np.ndarray, stats_list: List = [np.min, np.mean, np.std]) -> np.ndarray:\n    r\"\"\"\n    Get a node-level positional encoding from a graph-level positional encoding.\n     -> Calculate statistics over rows & cols of input positional encoding\n\n    Parameters:\n        pe [num_nodes, num_nodes, num_feat]: Nodepair-level positional encoding\n        stats_list: List of statistics to calculate per row/col of nodepair-level pe\n\n    Returns:\n        node_pe [num_nodes, 2 * len(stats_list) * num_feat]: Node-level positional encoding\n    \"\"\"\n\n    num_feat = pe.shape[-1]\n\n    node_pe_list = []\n\n    for stat in stats_list:\n        for i in range(num_feat):\n            node_pe_list.append(stat(pe[..., i], axis=0))\n            node_pe_list.append(stat(pe[..., i], axis=1))\n    node_pe = np.stack(node_pe_list, axis=-1)\n\n    return node_pe\n\n\ndef nodepair_to_edge(\n    pe: np.ndarray, adj: Union[np.ndarray, spmatrix], cache: Optional[Dict[str, Any]] = None\n) -> np.ndarray:\n    r\"\"\"\n    Get a edge-level positional encoding from a nodepair-level positional encoding.\n     -> Mask and sparsify nodepair-level positional encoding\n\n    Parameters:\n        pe [num_nodes, num_nodes, num_feat]: Nodepair-level positional encoding\n        adj [num_nodes, num_nodes]: Adjacency matrix of the graph\n        cache: Dictionary of cached objects\n\n    
Returns:\n        edge_pe [num_edges, num_feat]: Edge-level positional encoding\n        cache: Updated dictionary of cached objects\n    \"\"\"\n\n    if cache is None:\n        cache = {}\n\n    num_feat = pe.shape[-1]\n\n    if not isinstance(adj, coo_matrix):\n        if \"coo_adj\" in cache:\n            adj = cache[\"coo_adj\"]\n        else:\n            adj = coo_matrix(adj, dtype=np.float64)\n        cache[\"coo_adj\"] = adj\n\n    dst, src = adj.row, adj.col\n\n    edge_pe = np.zeros((len(dst), num_feat))\n\n    for i in range(len(src)):\n        edge_pe[i, ...] = pe[dst[i], src[i]]\n\n    return edge_pe, cache\n\n\ndef nodepair_to_graph(pe: np.ndarray, num_nodes: int) -> np.ndarray:\n    r\"\"\"\n    Get a graph-level positional encoding from a nodepair-level positional encoding.\n     -> E.g., min/max/mean-pooling of entries of input pe\n\n    Parameters:\n        pe [num_nodes, num_nodes, num_feat]: Nodepair-level positional encoding\n        num_nodes: Number of nodes in the graph\n\n    Returns:\n        graph_pe [1, num_feat]: Graph-level positional encoding\n    \"\"\"\n\n    raise NotImplementedError(\"Transfer function (nodepair -> graph) not yet implemented.\")\n\n\ndef graph_to_node(\n    pe: Union[np.ndarray, List], num_nodes: int, cache: Optional[Dict[str, Any]] = None\n) -> np.ndarray:\n    r\"\"\"\n    Get a node-level positional encoding from a nodepair-level positional encoding.\n     -> E.g., expand dimension of graph-level pe\n\n    Parameters:\n        pe [num_feat]: Nodepair-level positional encoding (or list of them if graph disconnected)\n        num_nodes: Number of nodes in the graph\n        cache: Dictionary of cached objects\n\n    Returns:\n        node_pe [num_nodes, num_feat]: Node-level positional encoding\n    \"\"\"\n\n    if cache is None:\n        cache = {}\n\n    node_pe = None\n\n    # The key 'components' is only in cache if disconnected_comp == True when computing base pe\n    if \"components\" in cache:\n        
if len(cache[\"components\"]) > 1:\n            node_pe = np.zeros((num_nodes, len(pe)))\n            components = cache[\"components\"]\n\n            for i, component in enumerate(components):\n                comp = list(component)\n                node_pe[comp, :] = np.real(pe[i])\n\n    if node_pe is None:\n        node_pe = np.tile(pe, (num_nodes, 1))\n\n    return node_pe\n"
  },
  {
    "path": "graphium/finetuning/__init__.py",
    "content": "from .utils import modify_cfg_for_finetuning\nfrom .finetuning import GraphFinetuning\nfrom .finetuning_architecture import FullGraphFinetuningNetwork\n\n\nFINETUNING_CONFIG_KEY = \"finetuning\"\n"
  },
  {
    "path": "graphium/finetuning/finetuning.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Iterable, List, Dict, Tuple, Union, Callable, Any, Optional, Type\n\nfrom collections import OrderedDict\n\nimport torch.nn as nn\nimport pytorch_lightning as pl\n\nfrom torch.optim.optimizer import Optimizer\nfrom pytorch_lightning.callbacks import BaseFinetuning\n\n\nclass GraphFinetuning(BaseFinetuning):\n    def __init__(\n        self,\n        finetuning_module: str,\n        added_depth: int = 0,\n        unfreeze_pretrained_depth: Optional[int] = None,\n        epoch_unfreeze_all: int = 0,\n        train_bn: bool = False,\n    ):\n        \"\"\"\n        Finetuning training callback that (un)freezes modules as specified in the configuration file.\n        By default, the modified layers of the fineuning module and the finetuning head are unfrozen.\n\n        Parameters:\n            finetuning_module: Module to finetune from\n            added_depth: Number of layers of finetuning module that have been modified rel. to pretrained model\n            unfreeze_pretrained_depth: Number of additional layers to unfreeze before layers modified rel. 
to pretrained model\n            epoch_unfreeze_all: Epoch to unfreeze entire model\n            train_bn: Boolean value indicating if batchnorm layers stay in training mode\n\n        \"\"\"\n        super().__init__()\n\n        self.finetuning_module = finetuning_module\n        self.training_depth = added_depth\n        if unfreeze_pretrained_depth is not None:\n            self.training_depth += unfreeze_pretrained_depth\n        self.epoch_unfreeze_all = epoch_unfreeze_all\n        self.train_bn = train_bn\n\n    def freeze_before_training(self, pl_module: pl.LightningModule):\n        \"\"\"\n        Freeze everything up to finetuning module (and parts of finetuning module)\n\n        Parameters:\n            pl_module: PredictorModule used for finetuning\n        \"\"\"\n\n        # Access module map of pretrained module\n        module_map = pl_module.model.pretrained_model.net._module_map\n\n        for module_name in module_map.keys():\n            self.freeze_module(pl_module, module_name, module_map)\n\n            if module_name.startswith(self.finetuning_module):\n                # Do not freeze modules after finetuning module\n                break\n\n    def freeze_module(self, pl_module, module_name: str, module_map: Dict[str, Union[nn.ModuleList, Any]]):\n        \"\"\"\n        Freeze specific modules\n\n        Parameters:\n            pl_module: PredictorModule used for finetuning\n            module_name: Name of module to (partially) freeze\n            module_map: Dictionary mapping from module_name to corresponding module(s)\n        \"\"\"\n        modules = module_map[module_name]\n\n        if module_name == \"pe_encoders\":\n            for param in pl_module.model.pretrained_model.net.encoder_manager.parameters():\n                param.requires_grad = False\n\n        # We only partially freeze the finetuning module\n        if module_name.startswith(self.finetuning_module):\n            if self.training_depth == 0:\n                pass\n            else:\n                modules = modules[: 
-self.training_depth]\n\n        self.freeze(modules=modules, train_bn=self.train_bn)\n\n    def finetune_function(self, pl_module: pl.LightningModule, epoch: int, optimizer: Optimizer):\n        \"\"\"\n        Function unfreezing entire model at specified epoch\n\n        Parameters:\n            pl_module: PredictorModule used for finetuning\n            epoch: Current training epoch\n            optimizer: Optimizer used for finetuning\n        \"\"\"\n        if epoch == self.epoch_unfreeze_all:\n            self.unfreeze_and_add_param_group(modules=pl_module, optimizer=optimizer, train_bn=self.train_bn)\n"
  },
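The freezing schedule above reduces to simple arithmetic: every module before the finetuning module is fully frozen, and within the finetuning module the last `training_depth = added_depth + unfreeze_pretrained_depth` layers stay trainable (via the `modules[:-training_depth]` slice). A minimal, dependency-free sketch of that logic, with toy string lists standing in for `nn.ModuleList`s (names here are illustrative, not Graphium's API):

```python
def plan_freezing(module_map, finetuning_module, added_depth=0, unfreeze_pretrained_depth=0):
    """Return the layers that stay frozen before training starts."""
    training_depth = added_depth + unfreeze_pretrained_depth
    frozen = []
    for name, layers in module_map.items():
        if name.startswith(finetuning_module):
            # Only partially freeze the finetuning module: the last
            # `training_depth` layers stay trainable.
            frozen.extend(layers[:-training_depth] if training_depth > 0 else layers)
            break  # modules after the finetuning module stay trainable
        frozen.extend(layers)
    return frozen

module_map = {
    "pre_nn": ["pre0", "pre1"],
    "gnn": ["gnn0", "gnn1", "gnn2", "gnn3"],
    "task_heads": ["head0"],
}

# Finetune from the GNN, with 1 added layer and 1 extra pretrained layer unfrozen:
# pre_nn is fully frozen, only the first two gnn layers stay frozen.
frozen = plan_freezing(module_map, "gnn", added_depth=1, unfreeze_pretrained_depth=1)
```

At `epoch_unfreeze_all`, the callback then unfreezes everything via `unfreeze_and_add_param_group`.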
  {
    "path": "graphium/finetuning/finetuning_architecture.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Any, Dict, Optional, Union\n\nimport torch\nimport torch.nn as nn\nfrom torch import Tensor\nfrom torch_geometric.data import Batch\n\nfrom graphium.nn.utils import MupMixin\nfrom graphium.trainer.predictor import PredictorModule\nfrom graphium.utils.spaces import FINETUNING_HEADS_DICT\n\n\nclass FullGraphFinetuningNetwork(nn.Module, MupMixin):\n    def __init__(\n        self,\n        pretrained_model: Union[str, \"PretrainedModel\"],\n        pretrained_model_kwargs: Dict[str, Any] = {},\n        pretrained_overwriting_kwargs: Dict[str, Any] = {},\n        finetuning_head_kwargs: Optional[Dict[str, Any]] = None,\n        num_inference_to_average: int = 1,\n        last_layer_is_readout: bool = False,\n        name: str = \"FullFinetuningGNN\",\n    ):\n        r\"\"\"\n        Flexible class that allows to implement an end-to-end graph finetuning network architecture, supporting flexible pretrained models and finetuning heads.\n        The network decomposes into two parts of class PretrainedModel and FinetuningHead. The PretrainedModel class allows basic finetuning such as\n        finetuning from a specified module of the pretrained model and dropping/adding layers in this module. 
The (optional) FinetuningHead class allows more\n        flexible finetuning with a custom network applied after the pretrained model. If not specified, we fall back to basic finetuning integrated in PretrainedModel.\n\n        Parameters:\n\n            pretrained_model:\n                A PretrainedModel or an identifier of pretrained model within GRAPHIUM_PRETRAINED_MODELS_DICT or a valid .ckpt checkpoint path\n\n            pretrained_model_kwargs:\n                Key-word arguments to instantiate a model of the same class as the pretrained model (e.g., FullGraphMultitaskNetwork)\n\n            pretrained_overwriting_kwargs:\n                Key-word arguments indicating which parameters of loaded model are shared with the pretrained part of FullGraphFinetuningNetwork\n\n            finetuning_head_kwargs:\n                Key-word arguments to use for the finetuning head.\n                It must respect the following criteria:\n                - pretrained_model_kwargs[last_used_module][\"out_level\"] must be equal to finetuning_head_kwargs[\"in_level\"]\n                - pretrained_model_kwargs[last_used_module][\"out_dim\"] must be equal to finetuning_head_kwargs[\"in_dim\"]\n\n                Here, [last_used_module] represents the module that is finetuned from,\n                e.g., gnn, graph_output or (one of the) task_heads\n\n            num_inference_to_average:\n                Number of inferences to average at val/test time. This is used to avoid the noise introduced\n                by positional encodings with sign-flips. 
In case no such encoding is given,\n                this parameter is ignored.\n                NOTE: The inference time will be slowed down proportionally to this parameter.\n\n            last_layer_is_readout: Whether the last layer should be treated as a readout layer.\n                Allows using the `mup.MuReadout` from the muTransfer method https://github.com/microsoft/mup\n\n            name:\n                Name attributed to the current network, for display and printing\n                purposes.\n        \"\"\"\n\n        super().__init__()\n\n        self.name = name\n        self.num_inference_to_average = num_inference_to_average\n        self.last_layer_is_readout = last_layer_is_readout\n        self._concat_last_layers = None\n        self.pretrained_model = pretrained_model\n        self.pretrained_overwriting_kwargs = pretrained_overwriting_kwargs\n        self.finetuning_head_kwargs = finetuning_head_kwargs\n        self.max_num_nodes_per_graph = None\n        self.max_num_edges_per_graph = None\n        self.finetuning_head = None\n\n        if not isinstance(self.pretrained_model, PretrainedModel):\n            self.pretrained_model = PretrainedModel(\n                self.pretrained_model, pretrained_model_kwargs, pretrained_overwriting_kwargs\n            )\n\n        if finetuning_head_kwargs is not None:\n            self.finetuning_head = FinetuningHead(finetuning_head_kwargs)\n\n    def forward(self, g: Batch) -> Tensor:\n        r\"\"\"\n        Apply the pre-processing neural network, the graph neural network,\n        and the post-processing neural network on the graph features.\n\n        Parameters:\n\n            g:\n                pyg Batch graph on which the convolution is done.\n                Must contain the following elements:\n\n                - Node key `\"feat\"`: `torch.Tensor[..., N, Din]`.\n                  Input node feature tensor, before the network.\n                  `N` is the number of nodes, `Din` is the 
input features dimension ``self.pre_nn.in_dim``\n\n                - Edge key `\"edge_feat\"`: `torch.Tensor[..., N, Ein]` **Optional**.\n                  The edge features to use. It will be ignored if the\n                  model doesn't support edge features or if\n                  `self.in_dim_edges==0`.\n\n                - Other keys related to positional encodings `\"pos_enc_feats_sign_flip\"`,\n                  `\"pos_enc_feats_no_flip\"`.\n\n        Returns:\n\n            `torch.Tensor[..., M, Dout]` or `torch.Tensor[..., N, Dout]`:\n                Node or graph feature tensor, after the network.\n                `N` is the number of nodes, `M` is the number of graphs,\n                `Dout` is the output dimension ``self.graph_output_nn.out_dim``\n                If the `self.gnn.pooling` is [`None`], then it returns node features and the output dimension is `N`,\n                otherwise it returns graph features and the output dimension is `M`\n\n        \"\"\"\n\n        g = self.pretrained_model.forward(g)\n\n        if self.finetuning_head is not None:\n            g = self.finetuning_head.forward(g)\n\n        return g\n\n    def make_mup_base_kwargs(self, divide_factor: float = 2.0) -> Dict[str, Any]:\n        \"\"\"\n        Create a 'base' model to be used by the `mup` or `muTransfer` scaling of the model.\n        The base model is usually identical to the regular model, but with the\n        layers width divided by a given factor (2 by default)\n\n        Parameter:\n            divide_factor: Factor by which to divide the width.\n\n        Returns:\n            Dictionary with the kwargs to create the base model.\n        \"\"\"\n        kwargs = dict(\n            pretrained_model=self.pretrained_model,\n            pretrained_model_kwargs=None,\n            finetuning_head_kwargs=None,\n            num_inference_to_average=self.num_inference_to_average,\n            last_layer_is_readout=self.last_layer_is_readout,\n            
name=self.name,\n        )\n\n        kwargs[\"pretrained_model_kwargs\"] = self.pretrained_model.make_mup_base_kwargs(\n            divide_factor=divide_factor\n        )\n\n        if self.finetuning_head is not None:\n            kwargs[\"finetuning_head_kwargs\"] = self.finetuning_head.make_mup_base_kwargs(\n                divide_factor=divide_factor, factor_in_dim=True\n            )\n\n        kwargs[\"pretrained_overwriting_kwargs\"] = self.pretrained_overwriting_kwargs\n\n        return kwargs\n\n    def set_max_num_nodes_edges_per_graph(self, max_nodes: Optional[int], max_edges: Optional[int]) -> None:\n        r\"\"\"\n        Set the maximum number of nodes and edges for all gnn layers and encoder layers\n\n        Parameters:\n            max_nodes: Maximum number of nodes in the dataset.\n                This will be useful for certain architectures, but ignored by others.\n\n            max_edges: Maximum number of edges in the dataset.\n                This will be useful for certain architectures, but ignored by others.\n        \"\"\"\n\n        self.pretrained_model.net.set_max_num_nodes_edges_per_graph(max_nodes, max_edges)\n\n\nclass PretrainedModel(nn.Module, MupMixin):\n    def __init__(\n        self,\n        pretrained_model: str,\n        pretrained_model_kwargs: Dict[str, Any],\n        pretrained_overwriting_kwargs: Dict[str, Any],\n    ):\n        r\"\"\"\n        Flexible class for finetuning pretrained models from GRAPHIUM_PRETRAINED_MODELS_DICT or from a checkpoint path.\n        Can be any model that inherits from nn.Module, MupMixin and comes with a module map (e.g., FullGraphMultitaskNetwork)\n\n        Parameters:\n\n            pretrained_model:\n                Identifier of pretrained model within GRAPHIUM_PRETRAINED_MODELS_DICT or from a checkpoint path\n\n            pretrained_model_kwargs:\n                Key-word arguments to instantiate a model of the same class as the pretrained model (e.g., 
FullGraphMultitaskNetwork)\n\n            pretrained_overwriting_kwargs:\n                Key-word arguments indicating which parameters of loaded model are shared with the pretrained part of FullGraphFinetuningNetwork\n\n        \"\"\"\n\n        super().__init__()\n\n        # Load pretrained model\n        pretrained_model = PredictorModule.load_pretrained_model(pretrained_model, device=\"cpu\").model\n        pretrained_model.create_module_map()\n\n        # Initialize a new model with the architecture after the desired modifications.\n        net = type(pretrained_model)\n        self.net = net(**pretrained_model_kwargs)\n        self.net.create_module_map()\n\n        # Overwrite parameters shared between loaded and modified pretrained model\n        self.overwrite_with_pretrained(pretrained_model, **pretrained_overwriting_kwargs)\n\n    def forward(self, g: Union[torch.Tensor, Batch]):\n        g = self.net.forward(g)\n\n        return g\n\n    def overwrite_with_pretrained(\n        self,\n        pretrained_model,\n        finetuning_module: str,\n        added_depth: int = 0,\n        sub_module_from_pretrained: Optional[str] = None,\n    ):\n        \"\"\"\n        Overwrite parameters shared between loaded and modified pretrained model\n\n        Parameters:\n            pretrained_model: Model from GRAPHIUM_PRETRAINED_MODELS_DICT\n            finetuning_module: Module to finetune from\n            added_depth: Number of modified layers at the end of finetuning module\n            sub_module_from_pretrained: Optional submodule to finetune from\n        \"\"\"\n        module_map = self.net._module_map\n        module_map_from_pretrained = pretrained_model._module_map\n\n        module_names_from_pretrained = module_map_from_pretrained.keys()\n        super_module_names_from_pretrained = {\n            module_name.split(\"-\")[0] for module_name in module_names_from_pretrained\n        }\n\n        for module_name in module_map.keys():\n            # 
Below exception handles some modules (e.g., pe_encoders in FullGraphMultitaskNetwork) that do not support len();\n            # They can always be replaced entirely\n            try:\n                shared_depth = len(module_map[module_name])\n            except TypeError:\n                module_map[module_name] = module_map_from_pretrained[module_name]\n                continue\n\n            if module_name.startswith(finetuning_module):\n                shared_depth -= added_depth\n\n            if module_name in module_map_from_pretrained.keys():\n                for idx in range(shared_depth):\n                    module_map[module_name][idx] = module_map_from_pretrained[module_name][idx]\n            elif module_name.split(\"-\")[0] in super_module_names_from_pretrained:\n                for idx in range(shared_depth):\n                    module_map[module_name][idx] = module_map_from_pretrained[\n                        \"\".join([module_name.split(\"-\")[0], \"-\", sub_module_from_pretrained])\n                    ][idx]\n            else:\n                raise RuntimeError(\"Mismatch between loaded pretrained model and model to be overwritten.\")\n\n            if module_name.startswith(finetuning_module):\n                break\n\n    def make_mup_base_kwargs(self, divide_factor: float = 2.0) -> Dict[str, Any]:\n        \"\"\"\n        Create a 'base' model to be used by the `mup` or `muTransfer` scaling of the model.\n        The base model is usually identical to the regular model, but with the\n        layers width divided by a given factor (2 by default)\n\n        Parameter:\n            divide_factor: Factor by which to divide the width.\n\n        Returns:\n            Dictionary with the kwargs to create the base model.\n        \"\"\"\n        # For the post-nn network, all the dimensions are divided\n\n        return self.net.make_mup_base_kwargs(divide_factor=divide_factor)\n\n\nclass 
FinetuningHead(nn.Module, MupMixin):\n    def __init__(self, finetuning_head_kwargs: Dict[str, Any]):\n        r\"\"\"\n        Flexible class allowing the use of a custom finetuning head on top of the pretrained model.\n        Can be any model that inherits from nn.Module and MupMixin.\n\n        Parameters:\n\n            finetuning_head_kwargs: Key-word arguments needed to instantiate a custom (or existing) finetuning head from FINETUNING_HEADS_DICT\n\n        \"\"\"\n\n        super().__init__()\n        self.task = finetuning_head_kwargs.pop(\"task\", None)\n        self.previous_module = finetuning_head_kwargs.pop(\"previous_module\", \"task_heads\")\n        self.incoming_level = finetuning_head_kwargs.pop(\"incoming_level\", \"graph\")\n\n        model_type = finetuning_head_kwargs.pop(\"model_type\", \"mlp\")\n        net = FINETUNING_HEADS_DICT[model_type]\n        self.net = net(**finetuning_head_kwargs)\n\n    def forward(self, g: Union[Dict[str, Union[torch.Tensor, Batch]], torch.Tensor, Batch]):\n        if isinstance(g, (torch.Tensor, Batch)):\n            pass\n        elif isinstance(g, dict) and len(g) == 1:\n            g = list(g.values())[0]\n        else:\n            raise TypeError(\"Output type from pretrained model not appropriate for finetuning head\")\n\n        g = self.net.forward(g)\n\n        return {self.task: g}\n\n    def make_mup_base_kwargs(self, divide_factor: float = 2.0, factor_in_dim: bool = False) -> Dict[str, Any]:\n        \"\"\"\n        Create a 'base' model to be used by the `mup` or `muTransfer` scaling of the model.\n        The base model is usually identical to the regular model, but with the\n        layers width divided by a given factor (2 by default)\n\n        Parameter:\n            divide_factor: Factor by which to divide the width.\n            factor_in_dim: Whether to factor the input dimension\n\n        Returns:\n            Dictionary with the kwargs to create the base model.\n        \"\"\"\n        # For 
the post-nn network, all the dimensions are divided\n\n        return self.net.make_mup_base_kwargs(divide_factor=divide_factor, factor_in_dim=factor_in_dim)\n"
  },
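The core of `PretrainedModel.overwrite_with_pretrained` is index bookkeeping: for every module up to the finetuning module it copies all layers from the loaded checkpoint, but for the finetuning module itself it copies only the first `len(module) - added_depth` layers, leaving the newly added ones with their fresh initialization. A dependency-free sketch of that arithmetic, with plain lists standing in for module maps (names are illustrative, not Graphium's API):

```python
def overwrite_shared_layers(new_map, pretrained_map, finetuning_module, added_depth=0):
    """Copy pretrained layers into the new model, up to the finetuning module."""
    for name, layers in new_map.items():
        shared_depth = len(layers)
        if name.startswith(finetuning_module):
            # The last `added_depth` layers are new and keep their fresh init.
            shared_depth -= added_depth
        for idx in range(shared_depth):
            layers[idx] = pretrained_map[name][idx]
        if name.startswith(finetuning_module):
            break  # modules after the finetuning module are dropped or replaced
    return new_map

pretrained = {"gnn": ["p0", "p1", "p2"], "task_heads": ["ph0"]}
new = {"gnn": ["n0", "n1", "n2", "n3"], "task_heads": ["nh0"]}

# One layer was added to the gnn, so only the first 3 layers are overwritten
# and the task heads (after the finetuning module) are left untouched.
result = overwrite_shared_layers(new, pretrained, "gnn", added_depth=1)
```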
  {
    "path": "graphium/finetuning/fingerprinting.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nimport torch\n\nfrom collections import defaultdict\nfrom typing import Literal, Union, List\n\nimport tqdm\n\nfrom graphium.nn.architectures.global_architectures import FullGraphMultiTaskNetwork\nfrom graphium.trainer.predictor import PredictorModule\n\n\nclass Fingerprinter:\n    \"\"\"Extract fingerprints from a [`FullGraphMultiTaskNetwork`][graphium.nn.architectures.global_architectures.FullGraphMultiTaskNetwork]\n\n    Fingerprints are hidden representations of a pre-trained network.\n    They can be used as an additional representation for task-specific ML models\n    in downstream tasks.\n\n    Info: Connection to linear probing.\n        This two-stage process is similar in concept to linear-probing,\n        but pre-computing the fingerprints further decouples the two stages\n        and allows for more flexibility.\n\n    Note: CLI support\n        You can extract fingerprints easily with the CLI. For more info, see\n        ```sh\n        graphium finetune fingerprint --help\n        ```\n\n    To return intermediate representations, the network will be altered\n    to save the readouts of the specified layers. 
This class is designed to be used\n    as a context manager that automatically reverts the network to its original state.\n\n    Examples:\n\n        Basic usage:\n        ```python\n        # Return layer 1 of the gcn submodule.\n        # For a full list of options, see the `_module_map` attribute of the FullGraphMultiTaskNetwork.\n        fp_spec = \"gcn:1\"\n\n        # Create the object\n        fingerprinter = Fingerprinter(predictor, fp_spec)\n\n        # Setup, run and teardown the fingerprinting process\n        fingerprinter.setup()\n        fp = fingerprinter.get_fingerprints_for_batch(batch)\n        fingerprinter.teardown()\n        ```\n\n        As context manager:\n        ```python\n        with Fingerprinter(predictor, \"gcn:1\") as fingerprinter:\n            fp = fingerprinter.get_fingerprints_for_batch(batch)\n        ```\n\n        Concatenating multiple hidden representations:\n        ```python\n        with Fingerprinter(predictor, [\"gcn:0\", \"gcn:1\"]) as fingerprinter:\n            fp = fingerprinter.get_fingerprints_for_batch(batch)\n        ```\n\n        For an entire dataset (expects a PyG dataloader):\n        ```python\n        with Fingerprinter(predictor, [\"gcn:0\", \"gcn:1\"]) as fingerprinter:\n            fps = fingerprinter.get_fingerprints_for_dataset(dataloader)\n        ```\n\n        Returning numpy arrays instead of torch tensors:\n        ```python\n        with Fingerprinter(predictor, [\"gcn:0\", \"gcn:1\"], out_type=\"numpy\") as fingerprinter:\n            fps = fingerprinter.get_fingerprints_for_dataset(dataloader)\n        ```\n    \"\"\"\n\n    def __init__(\n        self,\n        model: Union[FullGraphMultiTaskNetwork, PredictorModule],\n        fingerprint_spec: Union[str, List[str]],\n        out_type: Literal[\"torch\", \"numpy\"] = \"torch\",\n    ):\n        \"\"\"\n        Args:\n            model: Either a `FullGraphMultiTaskNetwork` or a `PredictorModule` that contains one. 
The benefit\n                of using a `PredictorModule` is that it will automatically convert the features to the correct dtype.\n            fingerprint_spec: The fingerprint specification. Of the format `module:layer`. If specified as a list,\n                the fingerprints from all the specified layers will be concatenated.\n            out_type: The output type of the fingerprints. Either `torch` or `numpy`.\n        \"\"\"\n\n        network = model\n        predictor = None\n\n        if isinstance(model, PredictorModule):\n            predictor = model\n            network = model.model\n\n        if not isinstance(network, FullGraphMultiTaskNetwork):\n            raise NotImplementedError(\n                f\"{self.__class__.__name__} only supports fingerprints for the FullGraphMultiTaskNetwork\"\n            )\n\n        if out_type not in [\"torch\", \"numpy\"]:\n            raise ValueError(f\"Invalid output type {out_type}, pick from 'torch' or 'numpy'\")\n\n        self.network = network\n        self.predictor = predictor\n        self._out_type = out_type\n\n        if isinstance(fingerprint_spec, str):\n            fingerprint_spec = [fingerprint_spec]\n\n        self._spec = defaultdict(list)\n        for spec_str in fingerprint_spec:\n            module_name, layer = spec_str.split(\":\")\n            self._spec[module_name].append(int(layer.strip()))\n\n        self._module_map_backup = None\n\n    def setup(self):\n        \"\"\"Prepare the network for fingerprinting\"\"\"\n        if hasattr(self.network, \"_module_map\"):\n            self._module_map_backup = self.network._module_map\n        self.network._enable_readout_cache(list(self._spec.keys()))\n        return self\n\n    def get_fingerprints_for_batch(self, batch):\n        \"\"\"Get the fingerprints for a single batch\"\"\"\n\n        if not self.network._cache_readouts:\n            raise RuntimeError(\n                f\"To use {self.__class__.__name__}, you must enable readout 
caching in the network. \"\n                f\"Alternatively, you can use the {self.__class__.__name__} as a context manager \"\n                \"to automatically setup the network for fingerprinting and revert the changes afterwards\"\n            )\n\n        # Run the batch through the model.\n        with torch.inference_mode():\n            if self.predictor is not None:\n                batch[\"features\"] = self.predictor._convert_features_dtype(batch[\"features\"])\n            self.network(batch[\"features\"])\n\n        readout_list = []\n        for module_name, layers in self._spec.items():\n            readout_list.extend(\n                [self.network._module_map[module_name]._readout_cache[layer].cpu() for layer in layers]\n            )\n\n        feats = torch.cat(readout_list, dim=-1)\n        return self._convert_output_type(feats)\n\n    def get_fingerprints_for_dataset(self, dataloader):\n        \"\"\"Return the fingerprints for an entire dataset\"\"\"\n\n        original_out_type = self._out_type\n        self._out_type = \"torch\"\n\n        fps = []\n        for batch in tqdm.tqdm(dataloader, desc=\"Fingerprinting batches\"):\n            feats = self.get_fingerprints_for_batch(batch)\n            fps.append(feats)\n\n        fps = torch.cat(fps, dim=0)\n\n        self._out_type = original_out_type\n        return self._convert_output_type(fps)\n\n    def teardown(self):\n        \"\"\"Restore the network to its original state\"\"\"\n        self.network._disable_readout_cache()\n        if self._module_map_backup is not None:\n            self.network._module_map = self._module_map_backup\n        else:\n            del self.network._module_map\n        return self\n\n    def __enter__(self):\n        \"\"\"Setup the network for fingerprinting\"\"\"\n        return self.setup()\n\n    def __exit__(self, exc_type, exc_val, exc_tb):\n        \"\"\"Revert the network to its original state\"\"\"\n        if exc_type is not None:\n            
raise exc_val\n        return self.teardown()\n\n    def _convert_output_type(self, feats: torch.Tensor):\n        \"\"\"Small utility function to convert output types\"\"\"\n        if self._out_type == \"numpy\":\n            feats = feats.numpy()\n        return feats\n"
  },
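Each `fingerprint_spec` string is parsed into a mapping from module name to the list of layer indices whose cached readouts get concatenated. A small self-contained sketch of that parsing step, mirroring the logic in `Fingerprinter.__init__` without the network machinery:

```python
from collections import defaultdict

def parse_fingerprint_spec(fingerprint_spec):
    """Parse 'module:layer' spec strings into {module_name: [layer, ...]}."""
    if isinstance(fingerprint_spec, str):
        fingerprint_spec = [fingerprint_spec]
    spec = defaultdict(list)
    for spec_str in fingerprint_spec:
        module_name, layer = spec_str.split(":")
        spec[module_name].append(int(layer.strip()))
    return spec

# Layers from the same module are grouped; distinct modules get their own entry.
spec = parse_fingerprint_spec(["gcn:0", "gcn:1", "graph_output_nn-graph:0"])
```

Grouping by module means each module's readout cache is indexed once per requested layer, and the resulting tensors are concatenated along the feature dimension.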
  {
    "path": "graphium/finetuning/utils.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom copy import deepcopy\nfrom typing import Any, Dict, List, Union\n\nfrom loguru import logger\n\nfrom graphium.trainer import PredictorModule\n\nimport graphium\n\n\ndef filter_cfg_based_on_admet_benchmark_name(config: Dict[str, Any], names: Union[List[str], str]):\n    \"\"\"\n    Filter a base config for the full TDC ADMET benchmarking group to only\n    have settings related to a subset of the endpoints\n    \"\"\"\n\n    if config[\"datamodule\"][\"module_type\"] != \"ADMETBenchmarkDataModule\":\n        # NOTE (cwognum): For now, this implies we only support the ADMET benchmark from TDC.\n        #    It is easy to extend this in the future to support more datasets.\n        raise ValueError(\"You can only use this method for the `ADMETBenchmarkDataModule`\")\n\n    if isinstance(names, str):\n        names = [names]\n\n    def _filter(d):\n        return {k: v for k, v in d.items() if k in names}\n\n    cfg = deepcopy(config)\n\n    # Update the datamodule arguments\n    cfg[\"datamodule\"][\"args\"][\"tdc_benchmark_names\"] = names\n\n    # Filter the relevant config sections\n    if \"architecture\" in cfg and \"task_heads\" in cfg[\"architecture\"]:\n        cfg[\"architecture\"][\"task_heads\"] = _filter(cfg[\"architecture\"][\"task_heads\"])\n    if \"predictor\" in cfg and 
\"metrics_on_progress_bar\" in cfg[\"predictor\"]:\n        cfg[\"predictor\"][\"metrics_on_progress_bar\"] = _filter(cfg[\"predictor\"][\"metrics_on_progress_bar\"])\n    if \"predictor\" in cfg and \"loss_fun\" in cfg[\"predictor\"]:\n        cfg[\"predictor\"][\"loss_fun\"] = _filter(cfg[\"predictor\"][\"loss_fun\"])\n    if \"metrics\" in cfg:\n        cfg[\"metrics\"] = _filter(cfg[\"metrics\"])\n\n    return cfg\n\n\ndef modify_cfg_for_finetuning(cfg: Dict[str, Any]):\n    \"\"\"\n    Function combining information from configuration and pretrained model for finetuning.\n    \"\"\"\n    task = cfg[\"finetuning\"][\"task\"]\n\n    # Filter the config based on the task name\n    # NOTE (cwognum): This prevents the need for having many different files for each of the tasks\n    #    with lots and lots of config repetition.\n    cfg = filter_cfg_based_on_admet_benchmark_name(cfg, task)\n    cfg_finetune = cfg[\"finetuning\"]\n\n    # Load pretrained model\n    pretrained_model = cfg_finetune[\"pretrained_model\"]\n    pretrained_predictor = PredictorModule.load_pretrained_model(pretrained_model, device=\"cpu\")\n\n    # Inherit shared configuration from pretrained\n    # Architecture\n    pretrained_architecture = pretrained_predictor.model_kwargs\n    arch_keys = pretrained_architecture.keys()\n    arch_keys = [key.replace(\"_kwargs\", \"\") for key in arch_keys]\n    cfg_arch = {arch_keys[idx]: value for idx, value in enumerate(pretrained_architecture.values())}\n\n    cfg_arch_from_pretrained = deepcopy(cfg_arch)\n    # Featurization\n    cfg[\"datamodule\"][\"args\"][\"featurization\"] = pretrained_predictor.featurization\n\n    finetuning_module = cfg_finetune[\"finetuning_module\"]\n    sub_module_from_pretrained = cfg_finetune.get(\"sub_module_from_pretrained\", None)\n    new_sub_module = cfg_finetune.pop(\"new_sub_module\", None)\n    keep_modules_after_finetuning_module = cfg_finetune.pop(\"keep_modules_after_finetuning_module\", None)\n\n    # Find 
part of config of module to finetune from\n    pretrained_predictor.model.create_module_map()\n    module_map_from_pretrained = pretrained_predictor.model._module_map\n\n    if not any([module.startswith(finetuning_module) for module in module_map_from_pretrained.keys()]):\n        raise ValueError(f\"Unknown module {finetuning_module}\")\n    elif sub_module_from_pretrained is None:\n        new_module_kwargs = deepcopy(cfg_arch[finetuning_module])\n    else:\n        new_module_kwargs = deepcopy(cfg_arch[finetuning_module][sub_module_from_pretrained])\n\n    # Modify config according to desired finetuning architecture\n    out_dim = (\n        cfg_arch[finetuning_module].get(\"out_dim\")\n        if sub_module_from_pretrained is None\n        else cfg_arch[finetuning_module][sub_module_from_pretrained].get(\"out_dim\")\n    )\n\n    if new_module_kwargs[\"depth\"] is None:\n        new_module_kwargs[\"depth\"] = len(new_module_kwargs[\"hidden_dims\"]) + 1\n\n    upd_kwargs = {\n        \"out_dim\": cfg_finetune.pop(\"new_out_dim\", out_dim),\n        \"depth\": new_module_kwargs[\"depth\"]\n        + cfg_finetune.get(\"added_depth\", 0)\n        - cfg_finetune.pop(\"drop_depth\", 0),\n    }\n\n    new_last_activation = cfg_finetune.pop(\"new_last_activation\", None)\n    if new_last_activation is not None:\n        upd_kwargs[\"last_activation\"] = new_last_activation\n\n    # Update config\n    new_module_kwargs.update(upd_kwargs)\n\n    if sub_module_from_pretrained is None:\n        cfg_arch[finetuning_module] = new_module_kwargs\n    else:\n        cfg_arch[finetuning_module] = {new_sub_module: new_module_kwargs}\n\n    # Remove modules of pretrained model after module to finetune from unless specified differently\n    module_list = list(module_map_from_pretrained.keys())\n    super_module_list = []\n    for module in module_list:\n        if module.split(\"-\")[0] not in super_module_list:  # Only add each supermodule once\n            
super_module_list.append(module.split(\"-\")[0])\n\n    # Set configuration of modules after finetuning module to None\n    cutoff_idx = (\n        super_module_list.index(finetuning_module) + 1\n    )  # Index of module after module to finetune from\n    for module in super_module_list[cutoff_idx:]:\n        cfg_arch[module] = None\n\n    # If desired, we can keep specific modules after the finetuning module (specified in cfg/finetuning/keep_modules_after_finetuning_module)\n    if keep_modules_after_finetuning_module is not None:\n        for module_name, updates in keep_modules_after_finetuning_module.items():\n            cfg_arch = update_cfg_arch_for_module(cfg_arch, cfg_arch_from_pretrained, module_name, updates)\n\n    # Change architecture to FullGraphFinetuningNetwork\n    cfg_arch[\"model_type\"] = \"FullGraphFinetuningNetwork\"\n\n    cfg[\"architecture\"] = cfg_arch\n\n    pretrained_overwriting_kwargs = deepcopy(cfg[\"finetuning\"])\n    drop_keys = [\n        \"task\",\n        \"level\",\n        \"finetuning_head\",\n        \"unfreeze_pretrained_depth\",\n        \"epoch_unfreeze_all\",\n    ]\n\n    for key in drop_keys:\n        pretrained_overwriting_kwargs.pop(key, None)\n\n    finetuning_training_kwargs = deepcopy(cfg[\"finetuning\"])\n    drop_keys = [\"task\", \"level\", \"pretrained_model\", \"sub_module_from_pretrained\", \"finetuning_head\"]\n    for key in drop_keys:\n        finetuning_training_kwargs.pop(key, None)\n\n    cfg[\"finetuning\"].update(\n        {\"overwriting_kwargs\": pretrained_overwriting_kwargs, \"training_kwargs\": finetuning_training_kwargs}\n    )\n\n    return cfg\n\n\ndef update_cfg_arch_for_module(\n    cfg_arch: Dict[str, Any],\n    cfg_arch_from_pretrained: Dict[str, Any],\n    module_name: str,\n    updates: Dict[str, Any],\n):\n    \"\"\"\n    Function to modify the key-word arguments of modules after the finetuning module if they are kept.\n\n    Parameters:\n        cfg_arch: Configuration of the 
architecture of the model used for finetuning\n        cfg_arch_from_pretrained: Configuration of the architecture of the loaded pretrained model\n        module_name: Module of loaded pretrained model\n        updates: Changes to apply to key-word arguments of selected module\n    \"\"\"\n    # We need to distinguish between modules with & without submodules\n    if \"-\" not in module_name:\n        if cfg_arch[module_name] is None:\n            cfg_arch[module_name] = {}\n\n        cfg_arch_from_pretrained[module_name].update({key: value for key, value in updates.items()})\n\n        cfg_arch.update({module_name: cfg_arch_from_pretrained[module_name]})\n\n    else:\n        module_name, sub_module = module_name.split(\"-\")\n        new_sub_module = updates.pop(\"new_sub_module\", sub_module)\n\n        if cfg_arch[module_name] is None:\n            cfg_arch[module_name] = {}\n\n        cfg_arch_from_pretrained[module_name][sub_module].update(\n            {key: value for key, value in updates.items()}\n        )\n        cfg_arch[module_name].update({new_sub_module: cfg_arch_from_pretrained[module_name][sub_module]})\n\n    return cfg_arch\n"
  },
  {
    "path": "graphium/hyper_param_search/__init__.py",
    "content": "from .results import extract_main_metric_for_hparam_search\n\nHYPER_PARAM_SEARCH_CONFIG_KEY = \"hyper_param_search\"\n"
  },
  {
    "path": "graphium/hyper_param_search/results.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\n_OBJECTIVE_KEY = \"objective\"\n\n\ndef extract_main_metric_for_hparam_search(results: dict, cfg: dict):\n    \"\"\"Processes the results in the context of a hyper-parameter search.\"\"\"\n\n    # Extract the objectives\n    objectives = cfg[_OBJECTIVE_KEY]\n    if isinstance(objectives, str):\n        objectives = [objectives]\n\n    # Extract the objective values\n    objective_values = [results[k] for k in objectives]\n    if len(objective_values) == 1:\n        objective_values = objective_values[0]\n    return objective_values\n"
  },
  {
    "path": "graphium/ipu/README.md",
    "content": "<div align=\"center\">\n    <img src=\"../../docs/images/logo-title.png\" height=\"80px\">\n    <h3>The Graph Of Life Library.</h3>\n</div>\n\n\n## What is in this folder?\n\nCode for IPU acceleration support:\n\n- `ipu_dataloader.py`: code for handling the dataloader on the IPU\n- `ipu_losses.py`: code for computing losses on the IPU\n- `ipu_simple_lightning.py`: code for PyTorch Lightning support on the IPU\n- `ipu_utils.py`: utility functions for the IPU\n- `ipu_wrapper.py`: wrapper code for IPU support"
  },
  {
    "path": "graphium/ipu/__init__.py",
    "content": ""
  },
  {
    "path": "graphium/ipu/ipu_dataloader.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Callable, Iterable, Optional, List, Tuple, Dict, Any, Union\nfrom copy import deepcopy\nfrom dataclasses import dataclass\nimport numpy as np\nfrom loguru import logger\nfrom torch import Tensor\n\nimport torch\nfrom torch_geometric.data import Data, Batch, Dataset\nfrom torch_geometric.transforms import BaseTransform\n\nfrom graphium.data.utils import get_keys\nfrom graphium.ipu.ipu_utils import import_poptorch\nfrom graphium.utils.packing import (\n    fast_packing,\n    hybrid_packing,\n    get_pack_sizes,\n    node_to_pack_indices_mask,\n    estimate_max_pack_node_size,\n)\n\n\n@dataclass\nclass IPUDataloaderOptions:\n    r\"\"\"\n    This data class stores the arguments used to build the `poptorch.DataLoader` for the IPU.\n\n    Parameters:\n        batch_size:\n            Mini-batch size used by the model\n\n        max_num_nodes:\n            Maximum number of nodes in the batched padded graph. Mutually exclusive with `max_num_nodes_per_graph`.\n\n        max_num_nodes_per_graph:\n            Maximum number of nodes per graph. Mutually exclusive with `max_num_nodes`.\n\n        max_num_edges:\n            Maximum number of edges in the batched padded graph. Mutually exclusive with `max_num_edges_per_graph`.\n\n        max_num_edges_per_graph:\n            Maximum number of edges per graph. Mutually exclusive with `max_num_edges`.\n\n        mode:\n            The `poptorch.DataLoaderMode`, either as the enum or one of the strings \"Sync\", \"Async\", or \"AsyncRebatched\".\n    \"\"\"\n\n    batch_size: int\n    max_num_nodes: Optional[int] = None\n    max_num_nodes_per_graph: Optional[int] = None\n    max_num_edges: Optional[int] = None\n    max_num_edges_per_graph: Optional[int] = None\n    mode: \"poptorch.DataLoaderMode\" = \"Sync\"\n\n    def set_kwargs(self):\n        # Get the maximum number of nodes\n        if self.max_num_nodes is not 
None:\n            assert (\n                self.max_num_nodes_per_graph is None\n            ), \"Cannot use `max_num_nodes` and `max_num_nodes_per_graph` simultaneously\"\n        elif self.max_num_nodes_per_graph is not None:\n            assert (\n                self.max_num_nodes is None\n            ), \"Cannot use `max_num_nodes` and `max_num_nodes_per_graph` simultaneously\"\n            self.max_num_nodes = self.max_num_nodes_per_graph * self.batch_size\n        else:\n            raise ValueError(\"Must provide either `max_num_nodes` or `max_num_nodes_per_graph`\")\n\n        # Get the maximum number of edges\n        if self.max_num_edges is not None:\n            assert (\n                self.max_num_edges_per_graph is None\n            ), \"Cannot use `max_num_edges` and `max_num_edges_per_graph` simultaneously\"\n        elif self.max_num_edges_per_graph is not None:\n            assert (\n                self.max_num_edges is None\n            ), \"Cannot use `max_num_edges` and `max_num_edges_per_graph` simultaneously\"\n            self.max_num_edges = self.max_num_edges_per_graph * self.batch_size\n        else:\n            raise ValueError(\"Must provide either `max_num_edges` or `max_num_edges_per_graph`\")\n\n        # poptorch mode\n        poptorch = import_poptorch()\n        if isinstance(self.mode, str):\n            if self.mode.lower() == \"sync\":\n                self.mode = poptorch.DataLoaderMode.Sync\n            elif self.mode.lower() == \"async\":\n                self.mode = poptorch.DataLoaderMode.Async\n            elif self.mode.lower() == \"asyncrebatched\":\n                self.mode = poptorch.DataLoaderMode.AsyncRebatched\n            else:\n                raise ValueError(f\"`{self.mode}` not a valid parameter.\")\n\n\nclass CombinedBatchingCollator:\n    \"\"\"\n    Collator object that manages the combined batch size defined as:\n\n        combined_batch_size = batch_size * device_iterations\n                       
      * replication_factor * gradient_accumulation\n\n    This is intended to be used in combination with the poptorch.DataLoader\n    \"\"\"\n\n    def __init__(\n        self,\n        batch_size: int,\n        max_num_nodes: int,\n        max_num_edges: int,\n        dataset_max_nodes_per_graph: int,\n        dataset_max_edges_per_graph: int,\n        collate_fn: Optional[Callable] = None,\n    ):\n        \"\"\"\n        Parameters:\n            batch_size: mini batch size used by the model\n            max_num_nodes: Maximum number of nodes in the batched padded graph\n            max_num_edges: Maximum number of edges in the batched padded graph\n            dataset_max_nodes_per_graph: Maximum number of nodes per graph in the full dataset\n            dataset_max_edges_per_graph: Maximum number of edges per graph in the full dataset\n            collate_fn: Function used to collate (or batch) the single data or graphs together\n        \"\"\"\n        super().__init__()\n        self.batch_size = batch_size\n        self.collate_fn = collate_fn\n        self.max_num_nodes = max_num_nodes\n        self.max_num_edges = max_num_edges\n        self.dataset_max_nodes_per_graph = dataset_max_nodes_per_graph\n        self.dataset_max_edges_per_graph = dataset_max_edges_per_graph\n\n    def __call__(\n        self, batch: List[Dict[str, Union[Data, Dict[str, Tensor]]]]\n    ) -> Dict[str, Union[Batch, Dict[str, Tensor], Any]]:\n        \"\"\"\n        Stack tensors, batch the pyg graphs, and pad each tensor to be same size.\n\n        Parameters:\n            batch: The batch of data, including pyg-graphs `Data` and labels `Dict[str, Tensor]` to be padded\n\n        Returns:\n            out_batch: A dictionary where the graphs are batched and the labels or other Tensors are stacked\n        \"\"\"\n\n        # Sort the batch such that large graphs are paired with small graphs\n        num_nodes = [b[\"features\"].num_nodes for b in batch]\n        packed_indices = 
hybrid_packing(num_nodes, batch_size=self.batch_size)\n        packs = [[batch[idx] for idx in pack] for pack in packed_indices]\n\n        # Loop all mini-batches within the global batch\n        all_batches = []\n        for pack in packs:\n            if self.collate_fn is not None:\n                local_batch = self.collate_fn(pack)\n\n            transform = Pad(\n                max_num_nodes=self.max_num_nodes,\n                max_num_edges=self.max_num_edges,\n                dataset_max_nodes_per_graph=self.dataset_max_nodes_per_graph,\n                dataset_max_edges_per_graph=self.dataset_max_edges_per_graph,\n            )\n\n            local_batch[\"features\"] = transform(local_batch[\"features\"])\n            local_batch[\"labels\"] = transform(local_batch[\"labels\"])\n            all_batches.append(local_batch)\n\n        out_batch = {}\n\n        # Stack tensors in the first dimension to allow IPUs to differentiate between local and global graph\n        all_keys = get_keys(all_batches[0][\"labels\"])\n        out_batch[\"labels\"] = {\n            key: torch.stack([this_batch[\"labels\"][key] for this_batch in all_batches], 0) for key in all_keys\n        }\n        out_graphs = [this_batch[\"features\"] for this_batch in all_batches]\n        stacked_features = deepcopy(out_graphs[0])\n        for key, val in out_graphs[0].items():\n            if isinstance(val, torch.Tensor):\n                stacked_features[key] = torch.stack([this_graph[key] for this_graph in out_graphs], dim=0)\n\n        out_batch[\"features\"] = stacked_features\n        for key in all_batches[0].keys():\n            if key not in (\"features\", \"labels\"):\n                out_batch[key] = [this_batch[key] for this_batch in all_batches]\n\n        # Cast int64 tensors to int32, since the IPU does not support int64\n        for data_key, data_val in out_batch.items():\n            if isinstance(data_val, Batch):\n                for sub_key, sub_val in data_val.items():\n                    if isinstance(sub_val, Tensor) and 
sub_val.dtype == torch.int64:\n                        out_batch[data_key][sub_key] = sub_val.to(torch.int32)\n\n        return out_batch\n\n\ndef create_ipu_dataloader(\n    dataset: Dataset,\n    ipu_dataloader_options: IPUDataloaderOptions,\n    ipu_options: Optional[\"poptorch.Options\"] = None,\n    batch_size: Optional[int] = 1,\n    collate_fn=None,\n    num_workers: Optional[int] = 0,\n    **kwargs,\n) -> \"poptorch.DataLoader\":\n    \"\"\"\n    Creates a poptorch.DataLoader for graph datasets\n    Applies the mini-batching method of concatenating multiple graphs into a\n    single graph with multiple disconnected subgraphs. See:\n    https://pytorch-geometric.readthedocs.io/en/2.0.2/notes/batching.html\n\n    Parameters:\n\n        dataset: The torch_geometric.data.Dataset instance from which to\n            load the graph examples for the IPU.\n        ipu_dataloader_options: The options to initialize the Dataloader for IPU\n        ipu_options: The poptorch.Options used by the\n            poptorch.DataLoader. 
Will use the default options if not provided.\n        batch_size: How many graph examples to load in each batch\n            (default: 1).\n        collate_fn: The function used to collate batches\n        **kwargs (optional): Additional arguments of :class:`poptorch.DataLoader`.\n\n    Returns:\n        The dataloader\n    \"\"\"\n    poptorch = import_poptorch()\n\n    if ipu_options is None:\n        # Create IPU default options\n        ipu_options = poptorch.Options()\n\n    # Define the collater function\n    collater = CombinedBatchingCollator(\n        batch_size,\n        collate_fn=collate_fn,\n        max_num_nodes=ipu_dataloader_options.max_num_nodes,\n        max_num_edges=ipu_dataloader_options.max_num_edges,\n        dataset_max_nodes_per_graph=dataset.max_num_nodes_per_graph,\n        dataset_max_edges_per_graph=dataset.max_num_edges_per_graph,\n    )\n\n    # Get the global batch size\n    num_nodes = np.asarray(dataset.num_nodes_list)\n    accum = ipu_options.Training.gradient_accumulation\n    repli = ipu_options._values[\"replication_factor\"]\n    device_iter = ipu_options._values[\"device_iterations\"]\n    combined_batch_size = batch_size * accum * repli * device_iter\n    num_batches = len(dataset) // combined_batch_size\n    num_workers = min(num_batches, num_workers)\n    buffer_size = num_batches // num_workers if num_workers > 0 else None\n    buffer_size = 3 if buffer_size is None else buffer_size\n    async_options = {\n        \"sharing_strategy\": poptorch.SharingStrategy.ForkServer,\n        \"early_preload\": True,\n        \"buffer_size\": buffer_size,\n        \"load_indefinitely\": True,\n        \"miss_sleep_time_in_ms\": 0,\n    }\n\n    # Estimate the packing size needed\n    max_pack_size, max_pack_size_per_graph = 0, 0\n    for _ in range(4):\n        this_max_pack_size, this_max_pack_size_per_graph = estimate_max_pack_node_size(\n            num_nodes=num_nodes,\n            batch_size=batch_size,\n            
combined_batch_size=combined_batch_size,\n        )\n        max_pack_size = max(max_pack_size, this_max_pack_size)\n        max_pack_size_per_graph = max(max_pack_size_per_graph, this_max_pack_size_per_graph)\n\n    max_num_nodes = collater.max_num_nodes\n    # Log the estimated pack size, with warnings if too big or too small\n    logger.info(\n        f\"Estimating pack max_pack_size={max_pack_size} or max_pack_size_per_graph={max_pack_size_per_graph}\"\n    )\n    logger.info(f\"Provided `max_num_nodes={max_num_nodes}`\")\n    if max_pack_size > max_num_nodes - 10:\n        logger.warning(\n            f\"The value of `max_num_nodes={max_num_nodes}` seems to be insufficient compared to `max_pack_size={max_pack_size}` and will likely crash\"\n        )\n    elif max_pack_size < max_num_nodes - 20:\n        logger.warning(\n            f\"The value of `max_num_nodes={max_num_nodes}` seems to be large compared to `max_pack_size={max_pack_size}` and will likely waste memory\"\n        )\n\n    return poptorch.DataLoader(\n        options=deepcopy(ipu_options),\n        dataset=dataset,\n        batch_size=batch_size,\n        num_workers=num_workers,\n        collate_fn=collater,\n        async_options=async_options,\n        **kwargs,\n    )\n\n\nclass Pad(BaseTransform):\n    \"\"\"\n    Data transform that applies padding to enforce consistent tensor shapes.\n    \"\"\"\n\n    def __init__(\n        self,\n        max_num_nodes: int,\n        dataset_max_nodes_per_graph,\n        dataset_max_edges_per_graph,\n        max_num_edges: Optional[int] = None,\n        node_value: float = 0,\n        edge_value: float = 0,\n    ):\n        \"\"\"\n        Parameters:\n            max_num_nodes: The maximum number of nodes for the total padded graph\n            dataset_max_nodes_per_graph: the maximum number of nodes per graph in the dataset\n            dataset_max_edges_per_graph: the maximum number of edges per graph in the dataset\n            max_num_edges: The 
maximum number of edges for the total padded graph\n            node_value: Value to add to the node padding\n            edge_value: Value to add to the edge padding\n        \"\"\"\n        super().__init__()\n        self.max_num_nodes = max_num_nodes\n        self.dataset_max_nodes_per_graph = dataset_max_nodes_per_graph\n        self.dataset_max_edges_per_graph = dataset_max_edges_per_graph\n\n        if max_num_edges:\n            self.max_num_edges = max_num_edges\n        else:\n            # Assume fully connected graph\n            self.max_num_edges = max_num_nodes * (max_num_nodes - 1)\n\n        self.node_value = node_value\n        self.edge_value = edge_value\n\n    def validate(self, data):\n        \"\"\"\n        Validates that the input graph does not exceed the constraints that:\n\n          * the number of nodes must be <= max_num_nodes\n          * the number of edges must be <= max_num_edges\n\n        Returns:\n            Tuple containing the number of nodes and the number of edges\n        \"\"\"\n        num_nodes = data.num_nodes\n        num_edges = data.num_edges\n\n        assert num_nodes <= self.max_num_nodes, (\n            f\"Too many nodes. Graph has {num_nodes} nodes \" f\"and max_num_nodes is {self.max_num_nodes}.\"\n        )\n\n        assert num_edges <= self.max_num_edges, (\n            f\"Too many edges. 
Graph has {num_edges} edges defined \"\n            f\"and max_num_edges is {self.max_num_edges}.\"\n        )\n\n        return num_nodes, num_edges\n\n    def __call__(self, batch: Batch) -> Batch:\n        return self._call(batch)\n\n    def forward(self, batch: Batch) -> Batch:\n        return self._call(batch)\n\n    def _call(self, batch: Batch) -> Batch:\n        \"\"\"\n        Pad the batch with a fake graph that has the desired\n        number of nodes and edges.\n        \"\"\"\n        num_nodes, num_edges = self.validate(batch)\n        num_pad_nodes = self.max_num_nodes - num_nodes\n        num_pad_edges = self.max_num_edges - num_edges\n        # Create a copy to update with padded features\n        new_batch = deepcopy(batch)\n\n        real_graphs = new_batch.to_data_list()\n\n        for g in real_graphs:\n            g.graph_is_true = torch.tensor([1], dtype=bool)\n            g.node_is_true = torch.full([g.num_nodes], True, dtype=bool)\n            g.edge_is_true = torch.full([g.num_edges], True, dtype=bool)\n\n        # create fake graph with the needed # of nodes and edges\n        fake = Data()\n        fake.num_nodes = num_pad_nodes\n        fake.num_edges = num_pad_edges\n        fake.graph_is_true = torch.tensor([False], dtype=bool)\n        fake.node_is_true = torch.full([num_pad_nodes], False, dtype=bool)\n        fake.edge_is_true = torch.full([num_pad_edges], False, dtype=bool)\n\n        for key, value in real_graphs[0]:\n            if not torch.is_tensor(value):\n                continue\n\n            if key == \"graph_is_true\" or key == \"node_is_true\" or key == \"edge_is_true\":\n                continue\n\n            dim = real_graphs[0].__cat_dim__(key, value)\n            pad_shape = list(value.shape)\n\n            if batch.is_node_attr(key):\n                pad_shape[dim] = num_pad_nodes\n                pad_value = self.node_value\n            elif batch.is_edge_attr(key):\n                pad_shape[dim] = 
num_pad_edges\n                if key == \"edge_index\":\n                    # Padding edges are self-loops on the first padding node\n                    pad_value = 0\n                else:\n                    pad_value = self.edge_value\n            # identify graph attributes, pad nan label for the fake graph\n            elif key.startswith(\"graph_\"):\n                num_pad_graphs = 1  # we pad with one big fake graph\n                pad_shape[dim] = num_pad_graphs\n                pad_value = float(\"nan\")\n            else:\n                continue\n\n            pad_value = value.new_full(pad_shape, pad_value)\n            fake[key] = torch.cat([pad_value], dim=dim)\n        real_graphs.append(fake)\n        new_batch = Batch.from_data_list(real_graphs)\n\n        if \"num_nodes\" in new_batch:\n            new_batch.num_nodes = self.max_num_nodes\n\n        return new_batch\n\n    def __repr__(self) -> str:\n        s = f\"{self.__class__.__name__}(\"\n        s += f\"max_num_nodes={self.max_num_nodes}, \"\n        s += f\"max_num_edges={self.max_num_edges}, \"\n        s += f\"node_value={self.node_value}, \"\n        s += f\"edge_value={self.edge_value})\"\n        return s\n"
  },
  {
    "path": "graphium/ipu/ipu_losses.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nimport torch\nfrom torch import Tensor\nfrom torch.nn import BCELoss, BCEWithLogitsLoss, MSELoss, L1Loss\nfrom torch._C import _infer_size\nfrom loguru import logger\nfrom graphium.trainer.losses import HybridCELoss\n\n\nclass BCEWithLogitsLossIPU(BCEWithLogitsLoss):\n    \"\"\"\n    A modified version of the `torch.nn.BCEWithLogitsLoss` that can ignore NaNs\n    by giving them a weight of `0`. This allows it to work with compilation\n    and IPUs since it doesn't modify the tensor's shape.\n    \"\"\"\n\n    def forward(self, input: Tensor, target: Tensor) -> Tensor:\n        prev_weight = None\n\n        target = target.clone().to(input.dtype)\n        weight = self.weight\n\n        # Get the original weight matrix. If None, set all weights = 1\n        if weight is not None:\n            prev_weight = self.weight.clone()\n            new_size = _infer_size(target.size(), weight.size())\n            weight = weight.expand(new_size).clone()\n        else:\n            weight = torch.ones(target.shape, dtype=input.dtype, device=input.device)\n\n        # Replace the nan-targets by 0 or 1. 
Take the value closest to the input.\n        # Give a weight of 0 where there are nan-targets\n        nan_targets = target.isnan()\n        nan_targets_0 = (input < 0.5) & nan_targets\n        nan_targets_1 = (input >= 0.5) & nan_targets\n        target[nan_targets_0] = 0.0\n        target[nan_targets_1] = 1.0\n        weight[nan_targets] = 0.0\n\n        # Compute the loss, and rescale by the number of nan elements\n        self.weight = weight\n        loss = super().forward(input, target)\n\n        num_real_targets = (~nan_targets).sum()\n        factor1 = torch.where(num_real_targets > 0, 1, 0)\n        factor2 = torch.where(num_real_targets > 0, 0, 1)\n        loss = factor1 * loss * nan_targets.numel() / (num_real_targets + factor2)\n\n        # Reset the self.weight to its original value\n        self.weight = prev_weight\n\n        return loss\n\n\nclass BCELossIPU(BCELoss):\n    \"\"\"\n    A modified version of the `torch.nn.BCELoss` that can ignore NaNs\n    by giving them a weight of `0`. This allows it to work with compilation\n    and IPUs since it doesn't modify the tensor's shape.\n    \"\"\"\n\n    def forward(self, input: Tensor, target: Tensor) -> Tensor:\n        prev_weight = None\n\n        target = target.clone().to(input.dtype)\n        weight = self.weight\n\n        # Get the original weight matrix. If None, set all weights = 1\n        if weight is not None:\n            prev_weight = self.weight.clone()\n            new_size = _infer_size(target.size(), weight.size())\n            weight = weight.expand(new_size).clone()\n        else:\n            weight = torch.ones(target.shape, dtype=input.dtype, device=input.device)\n\n        # Replace the nan-targets by 0 or 1. 
Take the value closest to the input.\n        # Give a weight of 0 where there are nan-targets\n        nan_targets = target.isnan()\n        nan_targets_0 = (input < 0.5) & nan_targets\n        nan_targets_1 = (input >= 0.5) & nan_targets\n        target[nan_targets_0] = 0.0\n        target[nan_targets_1] = 1.0\n        weight[nan_targets] = 0.0\n\n        # Compute the loss, and rescale by the number of nan elements\n        self.weight = weight\n        loss = super().forward(input, target)\n\n        num_real_targets = (~nan_targets).sum()\n        factor1 = torch.where(num_real_targets > 0, 1, 0)\n        factor2 = torch.where(num_real_targets > 0, 0, 1)\n        loss = factor1 * loss * nan_targets.numel() / (num_real_targets + factor2)\n\n        # Reset the self.weight to its original value\n        self.weight = prev_weight\n\n        return loss\n\n\nclass MSELossIPU(MSELoss):\n    \"\"\"\n    A modified version of the `torch.nn.MSELoss` that can ignore NaNs\n    by giving them the same value for both `input` and `target`.\n    This allows it to work with compilation\n    and IPUs since it doesn't modify the tensor's shape.\n    \"\"\"\n\n    def forward(self, input: Tensor, target: Tensor) -> Tensor:\n        target = target.clone().to(input.dtype)\n        input = input.clone()\n\n        # Replace the nan-targets in the input/target tensors by 0\n        nan_targets = target.isnan()\n        input[nan_targets] = 0.0\n        target[nan_targets] = 0.0\n\n        # Compute the loss, and rescale by the number of nan elements\n        loss = super().forward(input, target)\n\n        num_real_targets = (~nan_targets).sum()\n        factor1 = torch.where(num_real_targets > 0, 1, 0)\n        factor2 = torch.where(num_real_targets > 0, 0, 1)\n        loss = factor1 * loss * nan_targets.numel() / (num_real_targets + factor2)\n\n        return loss\n\n\nclass L1LossIPU(L1Loss):\n    \"\"\"\n    A modified version of the `torch.nn.L1Loss` that can ignore NaNs\n    
by giving them the same value for both `input` and `target`.\n    This allows it to work with compilation\n    and IPUs since it doesn't modify the tensor's shape.\n    \"\"\"\n\n    def forward(self, input: Tensor, target: Tensor) -> Tensor:\n        target = target.clone().to(input.dtype)\n        input = input.clone()\n\n        # Replace the nan-targets in the input/target tensors by 0\n        nan_targets = target.isnan()\n        input[nan_targets] = 0.0\n        target[nan_targets] = 0.0\n\n        # Compute the loss, and rescale by the number of nan elements\n        loss = super().forward(input, target)\n        num_real_targets = (~nan_targets).sum()\n        factor1 = torch.where(num_real_targets > 0, 1, 0)\n        factor2 = torch.where(num_real_targets > 0, 0, 1)\n        loss = factor1 * loss * nan_targets.numel() / (num_real_targets + factor2)\n\n        return loss\n\n\nclass HybridCELossIPU(HybridCELoss):\n    def __init__(\n        self,\n        n_brackets,\n        alpha: float = 0.5,\n    ) -> None:\n        \"\"\"\n        Parameters:\n            n_brackets: the number of brackets that will be used to group the regression targets.\n                Expected to have the same size as the number of classes in the transformed regression task.\n        \"\"\"\n        super().__init__(n_brackets=n_brackets, alpha=alpha)\n\n    def forward(self, input: Tensor, target: Tensor) -> Tensor:\n        \"\"\"\n        Parameters:\n            input: (batch_size x n_classes) tensor of logits predicted for each bracket.\n            target: (batch_size) or (batch_size, 1) tensor of target brackets in {0, 1, ..., self.n_brackets}.\n        \"\"\"\n\n        target = target.clone().to(input.dtype)\n        input = input.clone()\n\n        # Replace the nan-targets in the input/target tensors by 0\n        nan_targets = target.isnan()\n\n        # Compute the loss, and rescale by the number of nan elements\n        loss = super().forward(input, target, 
nan_targets)\n        return loss\n"
  },
  {
    "path": "graphium/ipu/ipu_metrics.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Optional, Tuple, Sequence, Literal\n\nimport torch\nfrom torch import BoolTensor, IntTensor, Tensor\nfrom torchmetrics.functional import auroc, average_precision, pearson_corrcoef, r2_score\nfrom torchmetrics.utilities.checks import _input_squeeze\nfrom torchmetrics.functional.classification.accuracy import (\n    _mode,\n    _check_subset_validity,\n    _accuracy_compute,\n    _accuracy_update,\n)\nfrom torchmetrics.functional.classification.precision_recall import _precision_compute, _recall_compute\nfrom torchmetrics.functional.classification.f_beta import _fbeta_compute\nfrom torchmetrics.functional import mean_squared_error, mean_absolute_error\nfrom torchmetrics.utilities.checks import _input_squeeze\nfrom torchmetrics.utilities.enums import AverageMethod\n\nfrom graphium.utils.tensor import nan_mean\nfrom graphium.ipu.ipu_utils import import_poptorch\n\n\ndef auroc_ipu(\n    preds: Tensor,\n    target: Tensor,\n    num_classes: Optional[int] = None,\n    task: Optional[Literal[\"binary\", \"multiclass\", \"multilabel\"]] = None,\n    pos_label: Optional[int] = None,\n    average: Optional[str] = \"macro\",\n    max_fpr: Optional[float] = None,\n    sample_weights: Optional[Sequence] = None,\n):\n    \"\"\"\n    A modified version of the `torchmetrics.functional.auroc` that can ignore 
NaNs\n    by giving them the same value for both `preds` and `target`.\n    This allows it to work with compilation\n    and IPUs since it doesn't modify the tensor's shape.\n    \"\"\"\n\n    target = target.clone()\n    preds = preds.clone()\n\n    # Replace the nan-targets in the preds/target tensors by 0\n    nan_targets = target.isnan()\n    preds[nan_targets] = 0.0\n    target[nan_targets] = 0.0\n\n    # Get the original weight matrix. If None, set all weights = 1\n    if sample_weights is None:\n        sample_weights = torch.ones(target.shape[0], dtype=preds.dtype, device=preds.device)\n    sample_weights[nan_targets] = 0.0\n\n    # Compute the score; the NaN entries are masked out via zero sample weights\n    score = auroc(\n        preds=preds,\n        target=target.to(int),\n        num_classes=num_classes,\n        task=task,\n        pos_label=pos_label,\n        average=average,\n        max_fpr=max_fpr,\n        sample_weights=sample_weights,\n    )\n\n    return score\n\n\ndef average_precision_ipu(\n    preds: Tensor,\n    target: Tensor,\n    num_classes: Optional[int] = None,\n    task: Optional[Literal[\"binary\", \"multiclass\", \"multilabel\"]] = None,\n    ignore_index: Optional[int] = None,\n    pos_label: Optional[int] = None,\n    average: Optional[str] = \"macro\",\n    sample_weights: Optional[Sequence] = None,\n):\n    \"\"\"\n    A modified version of the `torchmetrics.functional.average_precision` that can ignore NaNs\n    by giving them the same value for both `preds` and `target`.\n    This allows it to work with compilation\n    and IPUs since it doesn't modify the tensor's shape.\n    \"\"\"\n\n    target = target.clone()\n    preds = preds.clone()\n\n    # Replace the nan-targets in the preds/target tensors by 0\n    # Average precision is not sensitive to true negatives\n    nan_targets = target.isnan()\n    preds[nan_targets] = 0.0\n    target[nan_targets] = 0.0\n\n    # No need to use sample weights (which is no longer supported in 
torchmetrics >=0.10)\n    # # Get the original weight matrix. If None, set all weights = 1\n    # if sample_weights is None:\n    #     sample_weights = torch.ones(target.shape[0], dtype=preds.dtype, device=preds.device)\n    # sample_weights[nan_targets] = 0.0\n\n    # Compute the score; the NaN entries were set to 0 in both `preds` and `target`\n    score = average_precision(\n        preds=preds,\n        target=target,\n        num_classes=num_classes,\n        task=task,\n        ignore_index=ignore_index,\n        pos_label=pos_label,\n        average=average,\n        # sample_weights=sample_weights,\n    )\n\n    return score\n\n\ndef precision_ipu(\n    preds: Tensor,\n    target: Tensor,\n    average: Optional[str] = \"micro\",\n    mdmc_average: Optional[str] = None,\n    ignore_index: Optional[int] = None,\n    num_classes: Optional[int] = None,\n    threshold: float = 0.5,\n    top_k: Optional[int] = None,\n    multiclass: Optional[bool] = None,\n):\n    \"\"\"\n    A modified version of the `torchmetrics.functional.precision` that can ignore NaNs\n    by giving them the same value for both `preds` and `target`.\n    This allows it to work with compilation\n    and IPUs since it doesn't modify the tensor's shape.\n    \"\"\"\n\n    (tp, fp, tn, fn), mode = get_confusion_matrix(\n        preds=preds,\n        target=target,\n        average=average,\n        mdmc_average=mdmc_average,\n        threshold=threshold,\n        top_k=top_k,\n        subset_accuracy=False,\n        num_classes=num_classes,\n        multiclass=multiclass,\n        ignore_index=ignore_index,\n    )\n\n    return _precision_compute(tp, fp, fn, average, mdmc_average)\n\n\ndef recall_ipu(\n    preds: Tensor,\n    target: Tensor,\n    average: Optional[str] = \"micro\",\n    mdmc_average: Optional[str] = None,\n    ignore_index: Optional[int] = None,\n    num_classes: Optional[int] = None,\n    threshold: float = 0.5,\n    top_k: Optional[int] = None,\n    multiclass: Optional[bool] = None,\n):\n    
\"\"\"\n    A modified version of the `torchmetrics.functional.recall` that can ignore NaNs\n    by giving them the same value for both `preds` and `target`.\n    This allows it to work with compilation\n    and IPUs since it doesn't modify the tensor's shape.\n    \"\"\"\n\n    (tp, fp, tn, fn), mode = get_confusion_matrix(\n        preds=preds,\n        target=target,\n        average=average,\n        mdmc_average=mdmc_average,\n        threshold=threshold,\n        top_k=top_k,\n        num_classes=num_classes,\n        multiclass=multiclass,\n        ignore_index=ignore_index,\n    )\n\n    return _recall_compute(tp, fp, fn, average, mdmc_average)\n\n\ndef accuracy_ipu(\n    preds: Tensor,\n    target: Tensor,\n    average: Optional[str] = \"micro\",\n    mdmc_average: Optional[str] = \"global\",\n    threshold: float = 0.5,\n    top_k: Optional[int] = None,\n    subset_accuracy: bool = False,\n    num_classes: Optional[int] = None,\n    multiclass: Optional[bool] = None,\n    ignore_index: Optional[int] = None,\n) -> Tensor:\n    \"\"\"\n    A modified version of the `torchmetrics.functional.accuracy` that can ignore NaNs\n    by giving them the same value for both `preds` and `target`.\n    This allows it to work with compilation\n    and IPUs since it doesn't modify the tensor's shape.\n\n    Args:\n        preds: Predictions from model (probabilities, logits or labels)\n        target: Ground truth labels\n        average:\n            Defines the reduction that is applied. 
Should be one of the following:\n\n            - ``'micro'`` [default]: Calculate the metric globally, across all samples and classes.\n            - ``'macro'``: Calculate the metric for each class separately, and average the\n              metrics across classes (with equal weights for each class).\n            - ``'weighted'``: Calculate the metric for each class separately, and average the\n              metrics across classes, weighting each class by its support (``tp + fn``).\n            - ``'none'`` or ``None``: Calculate the metric for each class separately, and return\n              the metric for every class.\n            - ``'samples'``: Calculate the metric for each sample, and average the metrics\n              across samples (with equal weights for each sample).\n\n            .. note:: What is considered a sample in the multi-dimensional multi-class case\n                depends on the value of ``mdmc_average``.\n\n            .. note:: If ``'none'`` and a given class doesn't occur in the ``preds`` or ``target``,\n                the value for the class will be ``nan``.\n\n        mdmc_average:\n            Defines how averaging is done for multi-dimensional multi-class inputs (on top of the\n            ``average`` parameter). 
Should be one of the following:\n\n            - ``None`` [default]: Should be left unchanged if your data is not multi-dimensional multi-class.\n\n            - ``'samplewise'``: In this case, the statistics are computed separately for each\n              sample on the ``N`` axis, and then averaged over samples.\n              The computation for each sample is done by treating the flattened extra axes ``...``\n              (see :ref:`pages/classification:input types`) as the ``N`` dimension within the sample,\n              and computing the metric for the sample based on that.\n\n            - ``'global'``: In this case the ``N`` and ``...`` dimensions of the inputs\n              (see :ref:`pages/classification:input types`)\n              are flattened into a new ``N_X`` sample axis, i.e. the inputs are treated as if they\n              were ``(N_X, C)``. From here on the ``average`` parameter applies as usual.\n\n        num_classes:\n            Number of classes. Necessary for ``'macro'``, ``'weighted'`` and ``None`` average methods.\n\n        threshold:\n            Threshold for transforming probability or logit predictions to binary (0,1) predictions, in the case\n            of binary or multi-label inputs. Default value of 0.5 corresponds to input being probabilities.\n        top_k:\n            Number of the highest probability or logit score predictions considered for finding the correct label,\n            relevant only for (multi-dimensional) multi-class inputs. The\n            default value (``None``) will be interpreted as 1 for these inputs.\n\n            Should be left at default (``None``) for all other types of inputs.\n        multiclass:\n            Used only in certain special cases, where you want to treat inputs as a different type\n            than what they appear to be. 
See the parameter's\n            :ref:`documentation section <pages/classification:using the multiclass parameter>`\n            for a more detailed explanation and examples.\n        ignore_index:\n            Integer specifying a target class to ignore. If given, this class index does not contribute\n            to the returned score, regardless of reduction method. If an index is ignored, and ``average=None``\n            or ``'none'``, the score for the ignored class will be returned as ``nan``.\n        subset_accuracy:\n            Whether to compute subset accuracy for multi-label and multi-dimensional\n            multi-class inputs (has no effect for other input types).\n\n            - For multi-label inputs, if the parameter is set to ``True``, then all labels for\n              each sample must be correctly predicted for the sample to count as correct. If it\n              is set to ``False``, then all labels are counted separately - this is equivalent to\n              flattening inputs beforehand (i.e. ``preds = preds.flatten()`` and same for ``target``).\n\n            - For multi-dimensional multi-class inputs, if the parameter is set to ``True``, then all\n              sub-samples (on the extra axis) must be correct for the sample to be counted as correct.\n              If it is set to ``False``, then all sub-samples are counted separately - this is equivalent,\n              in the case of label predictions, to flattening the inputs beforehand (i.e.\n              ``preds = preds.flatten()`` and same for ``target``). 
Note that the ``top_k`` parameter\n              still applies in both cases, if set.\n\n    Raises:\n        ValueError:\n            If ``top_k`` parameter is set for ``multi-label`` inputs.\n        ValueError:\n            If ``average`` is none of ``\"micro\"``, ``\"macro\"``, ``\"weighted\"``, ``\"samples\"``, ``\"none\"``, ``None``.\n        ValueError:\n            If ``mdmc_average`` is not one of ``None``, ``\"samplewise\"``, ``\"global\"``.\n        ValueError:\n            If ``average`` is set but ``num_classes`` is not provided.\n        ValueError:\n            If ``num_classes`` is set\n            and ``ignore_index`` is not in the range ``[0, num_classes)``.\n        ValueError:\n            If ``top_k`` is not an ``integer`` larger than ``0``.\n    \"\"\"\n\n    (tp, fp, tn, fn), mode = get_confusion_matrix(\n        preds=preds,\n        target=target,\n        average=average,\n        mdmc_average=mdmc_average,\n        threshold=threshold,\n        top_k=top_k,\n        subset_accuracy=subset_accuracy,\n        num_classes=num_classes,\n        multiclass=multiclass,\n        ignore_index=ignore_index,\n    )\n\n    return _accuracy_compute(tp, fp, tn, fn, average, mdmc_average, mode)\n\n\ndef get_confusion_matrix(\n    preds: Tensor,\n    target: Tensor,\n    average: Optional[str] = \"micro\",\n    mdmc_average: Optional[str] = \"global\",\n    threshold: float = 0.5,\n    top_k: Optional[int] = None,\n    subset_accuracy: bool = False,\n    num_classes: Optional[int] = None,\n    multiclass: Optional[bool] = None,\n    ignore_index: Optional[int] = None,\n) -> Tuple[Tuple[Tensor], Tensor]:\n    \"\"\"\n    Calculates the confusion matrix according to the specified average method.\n\n    Args:\n        preds: Predictions from model (probabilities, logits or labels)\n        target: Ground truth labels\n        average:\n            Defines the reduction that is applied. 
Should be one of the following:\n\n            - ``'micro'`` [default]: Calculate the metric globally, across all samples and classes.\n            - ``'macro'``: Calculate the metric for each class separately, and average the\n              metrics across classes (with equal weights for each class).\n            - ``'weighted'``: Calculate the metric for each class separately, and average the\n              metrics across classes, weighting each class by its support (``tp + fn``).\n            - ``'none'`` or ``None``: Calculate the metric for each class separately, and return\n              the metric for every class.\n            - ``'samples'``: Calculate the metric for each sample, and average the metrics\n              across samples (with equal weights for each sample).\n\n            .. note:: What is considered a sample in the multi-dimensional multi-class case\n                depends on the value of ``mdmc_average``.\n\n            .. note:: If ``'none'`` and a given class doesn't occur in the ``preds`` or ``target``,\n                the value for the class will be ``nan``.\n\n        mdmc_average:\n            Defines how averaging is done for multi-dimensional multi-class inputs (on top of the\n            ``average`` parameter). 
Should be one of the following:\n\n            - ``None`` [default]: Should be left unchanged if your data is not multi-dimensional multi-class.\n\n            - ``'samplewise'``: In this case, the statistics are computed separately for each\n              sample on the ``N`` axis, and then averaged over samples.\n              The computation for each sample is done by treating the flattened extra axes ``...``\n              (see :ref:`pages/classification:input types`) as the ``N`` dimension within the sample,\n              and computing the metric for the sample based on that.\n\n            - ``'global'``: In this case the ``N`` and ``...`` dimensions of the inputs\n              (see :ref:`pages/classification:input types`)\n              are flattened into a new ``N_X`` sample axis, i.e. the inputs are treated as if they\n              were ``(N_X, C)``. From here on the ``average`` parameter applies as usual.\n\n        num_classes:\n            Number of classes. Necessary for ``'macro'``, ``'weighted'`` and ``None`` average methods.\n\n        threshold:\n            Threshold for transforming probability or logit predictions to binary (0,1) predictions, in the case\n            of binary or multi-label inputs. Default value of 0.5 corresponds to input being probabilities.\n        top_k:\n            Number of the highest probability or logit score predictions considered for finding the correct label,\n            relevant only for (multi-dimensional) multi-class inputs. The\n            default value (``None``) will be interpreted as 1 for these inputs.\n\n            Should be left at default (``None``) for all other types of inputs.\n        multiclass:\n            Used only in certain special cases, where you want to treat inputs as a different type\n            than what they appear to be. 
See the parameter's\n            :ref:`documentation section <pages/classification:using the multiclass parameter>`\n            for a more detailed explanation and examples.\n        ignore_index:\n            Integer specifying a target class to ignore. If given, this class index does not contribute\n            to the returned score, regardless of reduction method. If an index is ignored, and ``average=None``\n            or ``'none'``, the score for the ignored class will be returned as ``nan``.\n    \"\"\"\n    allowed_average = [\"micro\", \"macro\", \"weighted\", \"samples\", \"none\", None]\n    if average not in allowed_average:\n        raise ValueError(f\"The `average` has to be one of {allowed_average}, got {average}.\")\n\n    if average in [\"macro\", \"weighted\", \"none\", None] and (not num_classes or num_classes < 1):\n        raise ValueError(f\"When you set `average` as {average}, you have to provide the number of classes.\")\n\n    allowed_mdmc_average = [None, \"samplewise\", \"global\"]\n    if mdmc_average not in allowed_mdmc_average:\n        raise ValueError(f\"The `mdmc_average` has to be one of {allowed_mdmc_average}, got {mdmc_average}.\")\n\n    if num_classes and ignore_index is not None and (not ignore_index < num_classes or num_classes == 1):\n        raise ValueError(\n            f\"The `ignore_index` {ignore_index} is not valid for inputs with {num_classes} classes\"\n        )\n\n    if top_k is not None and (not isinstance(top_k, int) or top_k <= 0):\n        raise ValueError(f\"The `top_k` should be an integer larger than 0, got {top_k}\")\n\n    #### ADDED ####\n    # Put all the NaNs as the 0-class\n    nans = torch.isnan(target)\n    target[nans] = 0\n    preds[nans] = 0\n    if (preds.ndim > 1) and (preds.shape[1] > 1):\n        preds[nans, 0] = 1\n    target = target.to(int)\n    #### END ADDED ####\n\n    preds, target = _input_squeeze(preds, target)\n    mode = _mode(preds, target, threshold, top_k, num_classes, multiclass, ignore_index)\n    reduce = \"macro\" if average in [\"weighted\", \"none\", None] 
else average\n\n    if subset_accuracy and _check_subset_validity(mode):\n        # correct, total = _subset_accuracy_update(preds, target, threshold, top_k, ignore_index)\n        # return _subset_accuracy_compute(correct, total)\n        raise NotImplementedError(\"subset_accuracy not implemented\")\n    tp, fp, tn, fn = _accuracy_update(\n        preds, target, reduce, mdmc_average, threshold, num_classes, top_k, multiclass, ignore_index, mode\n    )\n\n    #### ADDED ####\n    num_nans = nans.sum(0)\n    if tp.numel() > 1:\n        tp[0] = tp[0] - num_nans\n        tn[1:] = tn[1:] - num_nans\n    else:\n        tn = tn - num_nans\n        if (preds.ndim > 1) and (preds.shape[1] > 1):\n            tp = tp - num_nans\n    #### END ADDED ####\n\n    return (tp, fp, tn, fn), mode\n\n\nclass NaNTensor(Tensor):\n    \"\"\"\n    Class to create and manage a NaN tensor along with its properties\n\n    The goal of the class is to override the regular tensor such that the basic\n    operations (sum, mean, max, etc.) ignore the NaNs in the input.\n    It also supports NaNs in integer tensors (as the lowest integer possible).\n    \"\"\"\n\n    @property\n    def get_nans(self) -> BoolTensor:\n        \"\"\"\n        Gets the boolean Tensor containing the location of NaNs.\n        In the case of an integer tensor, this returns where the tensor is equal to its minimal value\n        In the case of a boolean tensor, this returns a Tensor filled with `False`\n        \"\"\"\n        if self.is_floating_point():\n            return self.isnan()\n        elif self.is_signed():\n            return self == torch.iinfo(self.dtype).min\n        else:\n            return torch.zeros(self.shape, device=self.device, dtype=bool)\n\n    def sum(self, *args, **kwargs) -> Tensor:\n        \"\"\"\n        Overloads the traditional sum to ignore the NaNs\n        \"\"\"\n        tensor = self.to(float)\n        tensor[self.get_nans] = float(\"nan\")\n        if self.is_floating_point():\n            dtype = self.dtype\n        else:\n            dtype = torch.int64\n        return tensor.nansum(*args, **kwargs).to(dtype)\n\n    def mean(self, *args, **kwargs) -> Tensor:\n        \"\"\"\n        Overloads the traditional mean to ignore the NaNs\n        \"\"\"\n        tensor = self.to(float)\n        tensor[self.get_nans] = float(\"nan\")\n        return nan_mean(tensor, *args, **kwargs).to(self.dtype)\n\n    def numel(self) -> int:\n        \"\"\"\n        Returns the number of non-NaN elements.\n        \"\"\"\n        return super(NaNTensor, ~self.get_nans).sum()\n\n    def min(self, *args, **kwargs) -> Tensor:\n        \"\"\"\n        Returns the min value of a tensor without NaNs\n        \"\"\"\n        tensor = self\n        tensor = tensor[~self.get_nans]\n        return super(NaNTensor, tensor).min(*args, **kwargs)\n\n    def max(self, *args, **kwargs) -> Tensor:\n        \"\"\"\n        Returns the max value of a tensor without NaNs\n        \"\"\"\n        tensor = self\n        tensor = tensor[~self.get_nans]\n        return super(NaNTensor, tensor).max(*args, **kwargs)\n\n    def argsort(self, dim=-1, descending=False) -> IntTensor:\n        \"\"\"\n        Return the indices that sort the tensor, while putting all the NaNs to the end of the sorting.\n        \"\"\"\n        tensor = self\n        if descending:\n            tensor[tensor.get_nans] = float(\"-inf\")\n        else:\n            tensor[tensor.get_nans] = float(\"inf\")\n        return super(NaNTensor, tensor).argsort(dim=dim, descending=descending)\n\n    def size(self, dim) -> Tensor:\n        \"\"\"\n        Instead of returning the size, return the number of non-NaN elements in\n        a specific dimension. Useful for the `r2_score` metric.\n        \"\"\"\n        return (~self.get_nans).sum(dim=dim)\n\n    def __lt__(self, other) -> Tensor:\n        \"\"\"\n        Stupid fix that allows the code to work with `r2_score`,\n        since it requires the size to be > 2. 
But since `self.size` now returns\n        a Tensor instead of a value, we check that all elements are > 2.\n        \"\"\"\n        if (not isinstance(other, Tensor)) and (other == 2):\n            return super().__lt__(other).all()\n        else:\n            return super().__lt__(other)\n\n    @classmethod\n    def __torch_function__(cls, func, types, args=(), kwargs=None):\n        \"\"\"\n        This __torch_function__ implementation wraps subclasses such that\n        methods called on subclasses return a subclass instance instead of\n        a ``torch.Tensor`` instance.\n\n        One corollary to this is that you need coverage for torch.Tensor\n        methods if implementing __torch_function__ for subclasses.\n\n        Makes the call torch.sum() behave the same way as NaNTensor.sum()\n\n        We recommend always calling ``super().__torch_function__`` as the base\n        case when doing the above.\n\n        While not mandatory, we recommend making `__torch_function__` a classmethod.\n        \"\"\"\n        if func.__name__ == \"sum\":\n            kwargs = {} if kwargs is None else kwargs\n            return args[0].sum(*args[1:], **kwargs)\n        else:\n            return super().__torch_function__(func, types, args=args, kwargs=kwargs)\n\n\ndef pearson_ipu(preds, target):\n    \"\"\"Computes the Pearson correlation coefficient.\n\n    Handles NaNs in the target without reshaping tensors in order to work on IPU.\n\n    Args:\n        preds: estimated scores\n        target: ground truth scores\n    \"\"\"\n    preds = NaNTensor(preds)\n    target = NaNTensor(target)\n    preds[target.get_nans] = float(\"nan\")\n    pearson = pearson_corrcoef(preds, target.to(preds.dtype))\n    return Tensor(pearson)\n\n\ndef spearman_ipu(preds, target):\n    \"\"\"Computes the Spearman rank correlation coefficient.\n\n    Handles NaNs in the target without reshaping tensors in order to work on IPU.\n\n    Args:\n        preds: estimated scores\n        target: 
ground truth scores\n    \"\"\"\n    nans = target.isnan()\n    dtype = preds.dtype\n    preds[nans] = float(\"inf\")\n    target[nans] = float(\"inf\")\n    preds_sort = _rank_data(preds).to(dtype=dtype)\n    target_sort = _rank_data(target).to(dtype=dtype)\n    target_sort[nans] = float(\"nan\")\n    spearman = pearson_ipu(preds_sort, target_sort)\n    return Tensor(spearman)\n\n\ndef _rank_data(data: Tensor) -> Tensor:\n    \"\"\"Calculate the rank for each element of a tensor.\n\n    The rank refers to the indices of an element in the corresponding sorted tensor (starting from 1).\n    Duplicates of the same value will be assigned the mean of their rank.\n\n    Adopted from `Rank of element tensor`_\n    \"\"\"\n    n = data.numel()\n    rank = torch.empty_like(data)\n    idx = data.argsort()\n    rank[idx] = torch.arange(1, n + 1, dtype=data.dtype, device=data.device)\n\n    # TODO: Repeats not yet supported\n    # repeats = _find_repeats(data)\n    # for r in repeats:\n    #     condition = data == r\n    #     rank[condition] = rank[condition].mean()\n    return rank\n\n\ndef r2_score_ipu(preds, target, *args, **kwargs) -> Tensor:\n    \"\"\"\n    Computes r2 score also known as `R2 Score_Coefficient Determination`_:\n\n    .. math:: R^2 = 1 - \\frac{SS_{res}}{SS_{tot}}\n\n    where :math:`SS_{res}=\\sum_i (y_i - f(x_i))^2` is the sum of residual squares, and\n    :math:`SS_{tot}=\\sum_i (y_i - \\bar{y})^2` is total sum of squares. Can also calculate\n    adjusted r2 score given by\n\n    .. 
math:: R^2_{adj} = 1 - \\frac{(1-R^2)(n-1)}{n-k-1}\n\n    where the parameter :math:`k` (the number of independent regressors) should\n    be provided as the ``adjusted`` argument.\n    Handles NaNs without reshaping tensors in order to work on IPU.\n\n    Args:\n        preds: estimated labels\n        target: ground truth labels\n        adjusted: number of independent regressors for calculating adjusted r2 score.\n        multioutput: Defines aggregation in the case of multiple output scores. Can be one of the following strings:\n\n            * ``'raw_values'`` returns full set of scores\n            * ``'uniform_average'`` scores are uniformly averaged\n            * ``'variance_weighted'`` scores are weighted by their individual variances\n    \"\"\"\n    preds = NaNTensor(preds)\n    target = NaNTensor(target)\n    preds[target.get_nans] = float(\"nan\")\n    score = r2_score(preds, target, *args, **kwargs)\n    return Tensor(score)\n\n\ndef fbeta_score_ipu(\n    preds: Tensor,\n    target: Tensor,\n    beta: float = 1.0,\n    average: Optional[str] = \"micro\",\n    mdmc_average: Optional[str] = None,\n    ignore_index: Optional[int] = None,\n    num_classes: Optional[int] = None,\n    threshold: float = 0.5,\n    top_k: Optional[int] = None,\n    multiclass: Optional[bool] = None,\n):\n    \"\"\"\n    A modified version of the `torchmetrics.functional.classification.f_beta._fbeta_compute`\n    that can ignore NaNs by giving them the same value for both `preds` and `target`.\n    This allows it to work with compilation\n    and IPUs since it doesn't modify the tensor's shape.\n\n    Args:\n        preds: Predictions from model (probabilities, logits or labels)\n        target: Ground truth labels\n        average:\n            Defines the reduction that is applied. 
Should be one of the following:\n\n            - ``'micro'`` [default]: Calculate the metric globally, across all samples and classes.\n            - ``'macro'``: Calculate the metric for each class separately, and average the\n              metrics across classes (with equal weights for each class).\n            - ``'weighted'``: Calculate the metric for each class separately, and average the\n              metrics across classes, weighting each class by its support (``tp + fn``).\n            - ``'none'`` or ``None``: Calculate the metric for each class separately, and return\n              the metric for every class.\n            - ``'samples'``: Calculate the metric for each sample, and average the metrics\n              across samples (with equal weights for each sample).\n\n            .. note:: What is considered a sample in the multi-dimensional multi-class case\n                depends on the value of ``mdmc_average``.\n\n            .. note:: If ``'none'`` and a given class doesn't occur in the ``preds`` or ``target``,\n                the value for the class will be ``nan``.\n\n        mdmc_average:\n            Defines how averaging is done for multi-dimensional multi-class inputs (on top of the\n            ``average`` parameter). 
Should be one of the following:\n\n            - ``None`` [default]: Should be left unchanged if your data is not multi-dimensional multi-class.\n\n            - ``'samplewise'``: In this case, the statistics are computed separately for each\n              sample on the ``N`` axis, and then averaged over samples.\n              The computation for each sample is done by treating the flattened extra axes ``...``\n              (see :ref:`pages/classification:input types`) as the ``N`` dimension within the sample,\n              and computing the metric for the sample based on that.\n\n            - ``'global'``: In this case the ``N`` and ``...`` dimensions of the inputs\n              (see :ref:`pages/classification:input types`)\n              are flattened into a new ``N_X`` sample axis, i.e. the inputs are treated as if they\n              were ``(N_X, C)``. From here on the ``average`` parameter applies as usual.\n\n        num_classes:\n            Number of classes. Necessary for ``'macro'``, ``'weighted'`` and ``None`` average methods.\n\n        threshold:\n            Threshold for transforming probability or logit predictions to binary (0,1) predictions, in the case\n            of binary or multi-label inputs. Default value of 0.5 corresponds to input being probabilities.\n        top_k:\n            Number of the highest probability or logit score predictions considered for finding the correct label,\n            relevant only for (multi-dimensional) multi-class inputs. The\n            default value (``None``) will be interpreted as 1 for these inputs.\n\n            Should be left at default (``None``) for all other types of inputs.\n        multiclass:\n            Used only in certain special cases, where you want to treat inputs as a different type\n            than what they appear to be. 
See the parameter's\n            :ref:`documentation section <pages/classification:using the multiclass parameter>`\n            for a more detailed explanation and examples.\n        ignore_index:\n            Integer specifying a target class to ignore. If given, this class index does not contribute\n            to the returned score, regardless of reduction method. If an index is ignored, and ``average=None``\n            or ``'none'``, the score for the ignored class will be returned as ``nan``.\n        subset_accuracy:\n            Whether to compute subset accuracy for multi-label and multi-dimensional\n            multi-class inputs (has no effect for other input types).\n\n            - For multi-label inputs, if the parameter is set to ``True``, then all labels for\n              each sample must be correctly predicted for the sample to count as correct. If it\n              is set to ``False``, then all labels are counted separately - this is equivalent to\n              flattening inputs beforehand (i.e. ``preds = preds.flatten()`` and same for ``target``).\n\n            - For multi-dimensional multi-class inputs, if the parameter is set to ``True``, then all\n              sub-samples (on the extra axis) must be correct for the sample to be counted as correct.\n              If it is set to ``False``, then all sub-samples are counted separately - this is equivalent,\n              in the case of label predictions, to flattening the inputs beforehand (i.e.\n              ``preds = preds.flatten()`` and same for ``target``). 
Note that the ``top_k`` parameter\n              still applies in both cases, if set.\n\n    Raises:\n        ValueError:\n            If ``top_k`` parameter is set for ``multi-label`` inputs.\n        ValueError:\n            If ``average`` is none of ``\"micro\"``, ``\"macro\"``, ``\"weighted\"``, ``\"samples\"``, ``\"none\"``, ``None``.\n        ValueError:\n            If ``mdmc_average`` is not one of ``None``, ``\"samplewise\"``, ``\"global\"``.\n        ValueError:\n            If ``average`` is set but ``num_classes`` is not provided.\n        ValueError:\n            If ``num_classes`` is set\n            and ``ignore_index`` is not in the range ``[0, num_classes)``.\n        ValueError:\n            If ``top_k`` is not an ``integer`` larger than ``0``.\n    \"\"\"\n\n    (tp, fp, tn, fn), mode = get_confusion_matrix(\n        preds=preds,\n        target=target,\n        average=average,\n        mdmc_average=mdmc_average,\n        ignore_index=ignore_index,\n        num_classes=num_classes,\n        threshold=threshold,\n        top_k=top_k,\n        multiclass=multiclass,\n    )\n\n    b2 = beta**2\n    fbeta = ((1 + b2) * tp) / ((1 + b2) * tp + b2 * fn + fp)\n\n    if average in (None, \"none\", AverageMethod.NONE):\n        pass\n    elif average == AverageMethod.MICRO:\n        pass\n    elif average == AverageMethod.MACRO:\n        fbeta = fbeta.mean()\n    elif average == AverageMethod.WEIGHTED:\n        weights = tp + fn\n        fbeta = (weights * fbeta).sum() / weights.sum()\n    else:\n        raise ValueError(\n            f\"`average={average}` not yet supported. 
Choose between None, micro, macro, or weighted\"\n        )\n\n    return fbeta\n\n\ndef f1_score_ipu(\n    preds: Tensor,\n    target: Tensor,\n    beta: float = 1.0,\n    average: Optional[str] = \"micro\",\n    mdmc_average: Optional[str] = None,\n    ignore_index: Optional[int] = None,\n    num_classes: Optional[int] = None,\n    threshold: float = 0.5,\n    top_k: Optional[int] = None,\n    multiclass: Optional[bool] = None,\n):\n    \"\"\"\n    A modified version of the `torchmetrics.functional.classification.f_beta._fbeta_compute`\n    that can ignore NaNs by giving them the same value for both `preds` and `target`.\n    Used to calculate the F1 score on IPU, with the `beta` parameter equal to 1.0.\n    This allows it to work with compilation and IPUs since it doesn't modify the tensor's shape.\n\n    Computes the f_beta metric from stat scores: true positives, false positives, true negatives, false negatives.\n\n    Args:\n        preds: Predictions\n        target: Ground truth labels\n        beta: The parameter `beta` (which determines the weight of recall in the combined score)\n        ignore_index: Integer specifying a target class to ignore. 
If given, this class index does not contribute\n            to the returned score, regardless of reduction method\n        average: Defines the reduction that is applied\n        mdmc_average: Defines how averaging is done for multi-dimensional multi-class inputs (on top of the\n            ``average`` parameter)\n    \"\"\"\n\n    return fbeta_score_ipu(\n        preds,\n        target,\n        beta=beta,\n        average=average,\n        mdmc_average=mdmc_average,\n        ignore_index=ignore_index,\n        num_classes=num_classes,\n        threshold=threshold,\n        top_k=top_k,\n        multiclass=multiclass,\n    )\n\n\ndef mean_squared_error_ipu(preds: Tensor, target: Tensor, squared: bool) -> Tensor:\n    \"\"\"Computes mean squared error.\n\n    Handles NaNs without reshaping tensors in order to work on IPU.\n\n    Args:\n        preds: estimated labels\n        target: ground truth labels\n        squared: returns RMSE value if set to False\n\n    Return:\n        Tensor with MSE\n    \"\"\"\n    target = target.clone()\n    preds = preds.clone()\n\n    # Replace the nan-targets in the preds/target tensors by 0\n    nan_targets = target.isnan()\n    preds[nan_targets] = 0.0\n    target[nan_targets] = 0.0\n\n    # Compute the loss, and rescale by the number of nan elements\n    loss = mean_squared_error(preds, target, squared)\n\n    if squared:\n        factor = nan_targets.numel() / ((~nan_targets).sum())\n    else:\n        factor = (nan_targets.numel() / ((~nan_targets).sum())).sqrt()\n\n    loss = loss * factor\n\n    return loss\n\n\ndef mean_absolute_error_ipu(preds: Tensor, target: Tensor) -> Tensor:\n    \"\"\"Computes mean absolute error.\n\n    Handles NaNs without reshaping tensors in order to work on IPU.\n\n    Args:\n        preds: estimated labels\n        target: ground truth labels\n\n    Return:\n        Tensor with MAE\n    \"\"\"\n    target = target.clone()\n    preds = preds.clone()\n\n    # Replace the nan-targets in the 
preds/target tensors by 0\n    nan_targets = target.isnan()\n    preds[nan_targets] = 0.0\n    target[nan_targets] = 0.0\n\n    # Compute the loss, and rescale by the number of nan elements\n    loss = mean_absolute_error(preds, target)\n    loss = loss * nan_targets.numel() / ((~nan_targets).sum())\n\n    return loss\n"
  },
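The NaN-handling trick shared by `mean_squared_error_ipu` and `mean_absolute_error_ipu` above keeps tensor shapes fixed: NaN targets are zeroed in *both* `preds` and `target` (so they contribute nothing to the error sum), and the mean is then rescaled by total/valid elements. A minimal pure-Python sketch of that arithmetic (`mse_ignoring_nan` is a hypothetical stand-in using lists, not part of the library):

```python
import math

def mse_ignoring_nan(preds, target):
    # Zero out entries whose target is NaN in *both* sequences, so they add 0
    # to the squared-error sum, then rescale the mean by total / valid to
    # recover the mean over the valid entries only.
    nan_mask = [math.isnan(t) for t in target]
    p = [0.0 if m else v for v, m in zip(preds, nan_mask)]
    t = [0.0 if m else v for v, m in zip(target, nan_mask)]
    full_mean = sum((a - b) ** 2 for a, b in zip(p, t)) / len(t)
    n_valid = len(t) - sum(nan_mask)
    return full_mean * len(t) / n_valid

print(mse_ignoring_nan([1.0, 2.0, 3.0], [0.0, float("nan"), 1.0]))  # 2.5
```

This is why the IPU versions rescale by `nan_targets.numel() / (~nan_targets).sum()` rather than dropping elements, which would change the tensor shape and break static compilation.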
  {
    "path": "graphium/ipu/ipu_simple_lightning.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nimport lightning\nfrom lightning_graphcore import IPUStrategy\nfrom lightning.pytorch.loggers import WandbLogger\n\nimport torch\nfrom torch import nn\n\nimport torchvision\nimport torchvision.transforms as transforms\n\nimport mup\n\nfrom graphium.nn.base_layers import FCLayer\nfrom graphium.utils.mup import set_base_shapes\n\n\nON_IPU = True  # Change this line to run on CPU\nSEED = 42\n\n\n# The simple PyTorch model used in each of these examples\nclass SimpleTorchModel(torch.nn.Module):\n    def __init__(self, in_dim, hidden_dim, kernel_size, num_classes):\n        super().__init__()\n        self.in_dim = in_dim\n        self.hidden_dim = hidden_dim\n        self.kernel_size = kernel_size\n        self.num_classes = num_classes\n\n        conv_block = nn.Sequential(\n            nn.Conv2d(in_channels=in_dim, out_channels=hidden_dim, kernel_size=kernel_size),\n            nn.BatchNorm2d(hidden_dim),\n            nn.ReLU(),\n            nn.MaxPool2d(kernel_size),\n            nn.MaxPool2d(kernel_size),\n        )\n\n        self.the_network = nn.Sequential(\n            conv_block,\n            torch.nn.Flatten(),\n            FCLayer(4 * hidden_dim, hidden_dim),\n            FCLayer(hidden_dim, hidden_dim),\n            FCLayer(hidden_dim, num_classes, activation=None, is_readout_layer=True),\n            
nn.LogSoftmax(1),\n        )\n\n    def make_mup_base_kwargs(self, divide_factor: float = 2.0):\n        return dict(\n            in_dim=self.in_dim,\n            hidden_dim=round(self.hidden_dim / divide_factor),\n            kernel_size=self.kernel_size,\n            num_classes=self.num_classes,\n        )\n\n    def forward(self, x):\n        return self.the_network(x)\n\n\n# This class shows a minimal lightning example. This example uses our own\n# SimpleTorchModel which is a basic 2 conv, 2 FC torch network. It can be\n# found in simple_torch_model.py.\nclass SimpleLightning(lightning.LightningModule):\n    def __init__(self, in_dim, hidden_dim, kernel_size, num_classes, on_ipu):\n        super().__init__()\n        self.model = SimpleTorchModel(\n            in_dim=in_dim, hidden_dim=hidden_dim, kernel_size=kernel_size, num_classes=num_classes\n        )\n        self.on_ipu = on_ipu\n\n    def training_step(self, batch, _):\n        x, label = batch\n        prediction = self.model(x)\n        loss = torch.nn.functional.nll_loss(prediction, label)\n        return loss\n\n    def validation_step(self, batch, _):\n        x, label = batch\n        prediction = self.model(x)\n        preds = torch.argmax(prediction, dim=1)\n        acc = torch.sum(preds == label).float() / len(label)\n        loss = torch.nn.functional.nll_loss(prediction, label)\n        return loss, acc\n\n    # PopTorch doesn't currently support logging within steps. 
Use the Lightning\n    # callback hooks instead.\n    def on_train_batch_end(self, outputs, batch, batch_idx):\n        self.log(\"StepLoss\", outputs[\"loss\"])\n\n    def validation_epoch_end(self, outputs):\n        loss = [out[0] for out in outputs]\n        self.log(\"val_loss\", torch.stack(loss).mean(), prog_bar=True)\n\n        acc = [out[1] for out in outputs]\n        self.log(\"val_acc\", torch.stack(acc).mean(), prog_bar=True)\n\n    def configure_optimizers(self):\n        adam = torch.optim.Adam\n\n        if self.on_ipu:\n            import poptorch\n\n            adam = poptorch.optim.Adam\n\n        optimizer = mup.MuAdam(self.parameters(), lr=0.01, impl=adam)\n        return optimizer\n\n\nif __name__ == \"__main__\":\n    torch.manual_seed(SEED)\n\n    # Create the model as usual.\n    predictor = SimpleLightning(in_dim=1, hidden_dim=32, kernel_size=3, num_classes=10, on_ipu=ON_IPU)\n    model = predictor.model\n    base = model.__class__(**model.make_mup_base_kwargs(divide_factor=2))\n    predictor.model = set_base_shapes(model, base, rescale_params=False)\n\n    torch.manual_seed(SEED)\n    # Normal PyTorch dataset.\n    train_set = torchvision.datasets.FashionMNIST(\n        \"out/FashionMNIST\", train=True, download=True, transform=transforms.Compose([transforms.ToTensor()])\n    )\n    val_set = torchvision.datasets.FashionMNIST(\n        \"out/FashionMNIST\", train=False, download=True, transform=transforms.Compose([transforms.ToTensor()])\n    )\n\n    # Normal PyTorch dataloader.\n    train_loader = torch.utils.data.DataLoader(train_set, batch_size=16, shuffle=True)\n    val_loader = torch.utils.data.DataLoader(val_set, batch_size=16, shuffle=False)\n\n    torch.manual_seed(SEED)\n\n    ipus = None\n    plugins = None\n    if ON_IPU:\n        import poptorch\n\n        training_opts = poptorch.Options()\n        inference_opts = poptorch.Options()\n\n        # Set the seeds\n        training_opts.randomSeed(SEED)\n        
inference_opts.randomSeed(SEED)\n        ipus = 1\n        strategy = IPUStrategy(training_opts=training_opts, inference_opts=inference_opts)\n    else:\n        strategy = \"auto\"\n\n    trainer = lightning.Trainer(\n        logger=WandbLogger(),\n        devices=ipus if ipus is not None else \"auto\",\n        strategy=strategy,\n        max_epochs=3,\n        log_every_n_steps=1,\n        plugins=plugins,\n    )\n\n    # When fit is called, the model is compiled for IPU and runs on the available IPU devices.\n    trainer.fit(predictor, train_dataloaders=train_loader, val_dataloaders=val_loader)\n"
  },
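The muP setup in the `__main__` block above hinges on `make_mup_base_kwargs`: only the width-like dimension is divided when building the base model, while architectural constants stay fixed. A sketch of that kwargs computation as a free function (hypothetical name, mirroring `SimpleTorchModel.make_mup_base_kwargs`):

```python
def make_mup_base_kwargs(in_dim, hidden_dim, kernel_size, num_classes, divide_factor=2.0):
    # Only `hidden_dim` (the width) shrinks for the muP base model; input,
    # kernel, and output sizes are architectural constants and stay unchanged.
    return dict(
        in_dim=in_dim,
        hidden_dim=round(hidden_dim / divide_factor),
        kernel_size=kernel_size,
        num_classes=num_classes,
    )

kwargs = make_mup_base_kwargs(in_dim=1, hidden_dim=32, kernel_size=3, num_classes=10)
print(kwargs)  # {'in_dim': 1, 'hidden_dim': 16, 'kernel_size': 3, 'num_classes': 10}
```

The base model built from these kwargs is only used by `set_base_shapes` to record per-dimension scaling; it is discarded afterwards.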
  {
    "path": "graphium/ipu/ipu_utils.py",
"content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nimport os\nimport tempfile\nfrom datetime import datetime\nfrom copy import deepcopy\nfrom types import ModuleType\nfrom typing import Optional, Tuple, List\nimport torch\n\n\ndef import_poptorch(raise_error=True) -> Optional[ModuleType]:\n    \"\"\"\n    Import poptorch and return it.\n    It is wrapped in a function to avoid breaking the code\n    on non-IPU devices where poptorch is not installed.\n\n    Parameters:\n        raise_error: Whether to raise an error if poptorch is unavailable.\n            If `False`, return `None`\n\n    Returns:\n        The poptorch module\n\n    \"\"\"\n    try:\n        import poptorch\n\n        return poptorch\n    except ImportError as e:\n        if raise_error:\n            raise e\n        return\n\n\ndef is_running_on_ipu() -> bool:\n    \"\"\"\n    Returns whether the current module is running on IPU.\n    Needs to be used in the `forward` or `backward` pass.\n    \"\"\"\n    poptorch = import_poptorch(raise_error=False)\n    on_ipu = (poptorch is not None) and (poptorch.isRunningOnIpu())\n    return on_ipu\n\n\ndef load_ipu_options(\n    ipu_opts: List[str],\n    seed: Optional[int] = None,\n    model_name: Optional[str] = None,\n    gradient_accumulation: Optional[int] = None,\n    precision: Optional[str] = None,\n    ipu_inference_opts: Optional[List[str]] = 
None,\n) -> Tuple[\"poptorch.Options\", \"poptorch.Options\"]:\n    \"\"\"\n    Load the IPU options from the config file.\n\n    Parameters:\n        ipu_opts: The list of configurations for the IPU, written as a list of strings to make use of `poptorch.Options.loadFromFile`.\n            A temporary config file is written, then read back. See `Options.loadFromFile`.\n            # See the tutorial for IPU options here:\n            # https://github.com/graphcore/tutorials/tree/sdk-release-2.6/tutorials/pytorch/efficient_data_loading\n            # See the full documentation for IPU options here:\n            # https://docs.graphcore.ai/projects/poptorch-user-guide/en/latest/reference.html?highlight=options#poptorch.Options\n\n            ***minibatch size***: The number of samples processed by one simple fwd/bwd pass.\n            = # of samples in a minibatch\n\n            ***device iterations***: A device iteration corresponds to one iteration of the training loop executed on the IPU, starting with data-loading and ending with a weight update.\n            In this simple case, when we set n deviceIterations, the host will prepare n mini-batches in an infeed queue so the IPU can efficiently perform n iterations.\n            = # of minibatches to be processed at a time\n            = # of training / backward passes in this call\n\n            ***gradient accumulation factor***: After each backward pass the gradients are accumulated together for K mini-batches. 
Set K with the `gradient_accumulation` argument.\n            = # of minibatches to accumulate gradients from\n\n            ***replication factor***: Replication describes the process of running multiple instances of the same model simultaneously on different IPUs to achieve data parallelism.\n            If the model requires N IPUs and the replication factor is M, N x M IPUs will be necessary.\n            = # of times the model is copied to speed up computation; each replica of the model is sent a different subset of the dataset\n\n            ***global batch size***: In a single device iteration, many mini-batches may be processed and the resulting gradients accumulated.\n            We call this total number of samples processed for one optimiser step the global batch size.\n            = total number of samples processed for *one optimiser step*\n            = (minibatch size x gradient accumulation factor) x number of replicas\n\n        seed: random seed for the IPU\n        model_name: Name of the model, to be used for IPU profiling\n        ipu_inference_opts: optional IPU configuration overrides for inference.\n            If this is provided, these options override those in `ipu_opts` for inference.\n\n    Returns:\n\n        training_opts: IPU options for training.\n\n        inference_opts: IPU options for inference.\n            It differs from `training_opts` by enforcing `gradientAccumulation` to 1\n\n    \"\"\"\n\n    poptorch = import_poptorch()\n    ipu_options = poptorch.Options()\n    ipu_opts_file = ipu_options_list_to_file(ipu_opts)\n    ipu_options.loadFromFile(ipu_opts_file.name)\n    ipu_opts_file.close()\n\n    ipu_options.outputMode(poptorch.OutputMode.All)\n    if seed is not None:\n        ipu_options.randomSeed(seed)\n    if model_name is not None:\n        ipu_options.modelName(f\"{model_name}_train\")\n    if gradient_accumulation is not None:\n        current = ipu_options.Training.gradient_accumulation\n        assert (current == 1) or 
(\n            current == gradient_accumulation\n        ), f\"Received inconsistent gradient accumulation `{current}` and `{gradient_accumulation}`\"\n        ipu_options.Training.gradientAccumulation(gradient_accumulation)\n\n    if precision == \"16-true\":\n        # IPUOptions.loadFromFile currently doesn't support setting half partials, doing it here\n        ipu_options.Precision.setPartialsType(torch.half)\n    training_opts = ipu_options\n\n    # Change the inference options to remove gradient accumulation\n    inference_opts = deepcopy(ipu_options)\n    inference_opts.Training.gradientAccumulation(1)\n    if ipu_inference_opts is not None:\n        ipu_inference_opts_file = ipu_options_list_to_file(ipu_inference_opts)\n        inference_opts.loadFromFile(ipu_inference_opts_file.name)\n        ipu_inference_opts_file.close()\n\n    return training_opts, inference_opts\n\n\ndef ipu_options_list_to_file(ipu_opts: Optional[List[str]]) -> tempfile._TemporaryFileWrapper:\n    \"\"\"\n    Create a temporary file from a list of ipu configs, such that it can be read by `poptorch.Options.loadFromFile`\n\n    Parameters:\n        ipu_opts: The list of configurations for the IPU, written as a list of strings to make use of `poptorch.Options.loadFromFile`\n    Returns:\n        tmp_file: The temporary file of ipu configs\n    \"\"\"\n    if ipu_opts is None:\n        return\n\n    tmp_file = tempfile.NamedTemporaryFile(\"w\", delete=True)\n    for s in ipu_opts:\n        tmp_file.write(s + \"\\n\")\n    tmp_file.flush()\n    return tmp_file\n"
  },
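The batching vocabulary in the `load_ipu_options` docstring above reduces to one formula: the global batch size (samples consumed per optimiser step) is the product of minibatch size, gradient accumulation factor, and replication factor. A quick arithmetic check (`global_batch_size` is a hypothetical helper, not a library function):

```python
def global_batch_size(minibatch_size, grad_accum_factor, replication_factor):
    # Samples processed for one optimiser step, per the docstring:
    # (minibatch size x gradient accumulation factor) x number of replicas
    return minibatch_size * grad_accum_factor * replication_factor

# e.g. minibatches of 16, accumulating gradients over 4, on 2 replicas
print(global_batch_size(16, 4, 2))  # 128
```

Device iterations multiply throughput but not the optimiser-step batch size, which is why they do not appear in the product.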
  {
    "path": "graphium/ipu/ipu_wrapper.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Dict, Any, Optional, Callable, Union, Type, Tuple, Iterable\n\nfrom torch_geometric.data import Batch\nfrom torch import Tensor\nfrom lightning_graphcore import IPUStrategy\nfrom lightning.pytorch.utilities.types import STEP_OUTPUT\nfrom lightning.pytorch.trainer.states import RunningStage\n\nfrom graphium.trainer.predictor import PredictorModule\nfrom graphium.ipu.ipu_utils import import_poptorch\n\nimport torch\nfrom torch_geometric.data import Data, Batch\nfrom torch_geometric.data.data import BaseData\nfrom loguru import logger\nimport functools\nimport collections\nfrom graphium.data.utils import get_keys\n\npoptorch = import_poptorch()\n\n\nclass PyGArgsParser(poptorch.ICustomArgParser):\n    \"\"\"\n    This class is responsible for converting a PyG Batch from and to\n    a tensor of tuples. This allows PyG Batch to be used as inputs to\n    IPU programs. Copied from poppyg repo, in the future import from\n    the repo directly.\n    \"\"\"\n\n    @staticmethod\n    def sortedTensorKeys(struct: BaseData) -> Iterable[str]:\n        \"\"\"\n        Find all the keys that map to a tensor value in struct. 
The keys\n        are returned in sorted order.\n        \"\"\"\n        all_keys = sorted(get_keys(struct))\n\n        def isTensor(k: str) -> bool:\n            return isinstance(struct[k], torch.Tensor)\n\n        return filter(isTensor, all_keys)\n\n    def yieldTensors(self, struct: BaseData):\n        \"\"\"\n        yield every torch.Tensor in struct in sorted order\n        \"\"\"\n        for k in self.sortedTensorKeys(struct):\n            yield struct[k]\n\n    def reconstruct(self, original_structure: BaseData, tensor_iterator: Iterable[Tensor]):\n        \"\"\"\n        Create a new instance with the same class type as the\n        original_structure. This new instance will be initialized with tensors\n        from the provided iterator and uses the same sorted keys from the\n        yieldTensors() implementation.\n        \"\"\"\n        tensor_keys = self.sortedTensorKeys(original_structure)\n        kwargs = {k: next(tensor_iterator) for k in tensor_keys}\n\n        for k in get_keys(original_structure):\n            if k not in kwargs:\n                # copy non-tensor properties to the new instance\n                kwargs[k] = original_structure[k]\n\n        cls = original_structure.__class__\n\n        if issubclass(cls, Batch):\n            kwargs[\"_base_cls\"] = Data\n            return Batch(**kwargs)\n\n        return cls(**kwargs)\n\n\n# PyG uses the BaseData object as the root for data and batch objects\npoptorch.registerCustomArgParser(BaseData, PyGArgsParser())\n\n\nclass PredictorModuleIPU(PredictorModule):\n    \"\"\"\n    This class wraps around the `PredictorModule` to make it work with IPU and the `IPUPluginGraphium`.\n    \"\"\"\n\n    def __init__(self, *args, **kwargs):\n        # Import poptorch in a safe way that will work when working with cpu/gpu\n        self.poptorch = import_poptorch()\n        super().__init__(*args, **kwargs)\n\n    @staticmethod\n    def compute_loss(\n        preds: Dict[str, Tensor],\n        
targets: Dict[str, Tensor],\n        weights: Optional[Tensor],\n        loss_fun: Dict[str, Callable],\n        target_nan_mask: Union[Type, str] = \"ignore\",\n        multitask_handling: Optional[str] = None,\n    ) -> Tuple[Tensor, Dict[str, Tensor]]:\n        return PredictorModule.compute_loss(\n            preds, targets, weights, loss_fun, target_nan_mask, multitask_handling\n        )\n\n    def on_train_batch_end(self, outputs, batch, batch_idx):\n        outputs = self.convert_from_fp16(outputs)\n        outputs[\"loss\"] = outputs[\"loss\"][outputs[\"loss\"] != 0].mean()\n        super().on_train_batch_end(outputs, batch, batch_idx)\n\n    def training_step(self, batch, batch_idx) -> Dict[str, Any]:\n        features, labels = batch[\"features\"], batch[\"labels\"]\n        features, labels = self.squeeze_input_dims(features, labels)\n        dict_input = {\"features\": features, \"labels\": labels}\n        step_dict = super().training_step(dict_input, to_cpu=False)\n\n        loss = step_dict.pop(\"loss\")\n        step_dict[\"loss\"] = self.poptorch.identity_loss(loss, reduction=\"mean\")\n        return step_dict\n\n    def validation_step(self, batch, batch_idx) -> Dict[str, Any]:\n        features, labels = batch[\"features\"], batch[\"labels\"]\n        features, labels = self.squeeze_input_dims(features, labels)\n        dict_input = {\"features\": features, \"labels\": labels}\n        step_dict = super().validation_step(dict_input, to_cpu=False)\n\n        return step_dict\n\n    def test_step(self, batch, batch_idx) -> Dict[str, Any]:\n        # Build a dictionary from the tuples\n        features, labels = batch[\"features\"], batch[\"labels\"]\n        features, labels = self.squeeze_input_dims(features, labels)\n        dict_input = {\"features\": features, \"labels\": labels}\n        step_dict = super().test_step(dict_input, to_cpu=False)\n\n        return step_dict\n\n    def predict_step(self, **inputs) -> Dict[str, Any]:\n        # 
Build a dictionary from the tuples\n        dict_input = inputs\n        step_dict = super().predict_step(dict_input, to_cpu=False)\n\n        return step_dict\n\n    def on_validation_batch_end(\n        self, outputs: Any, batch: Any, batch_idx: int, dataloader_idx: int = 0\n    ) -> None:\n        # convert data that will be tracked\n        outputs = self.convert_from_fp16(outputs)\n        super().on_validation_batch_end(outputs, batch, batch_idx, dataloader_idx)\n\n    def evaluation_epoch_end(self, outputs: Any):\n        outputs = self.convert_from_fp16(outputs)\n        super().evaluation_epoch_end(outputs)\n\n    def on_test_batch_end(self, outputs: Any, batch: Any, batch_idx: int, dataloader_idx: int = 0) -> None:\n        outputs = self.convert_from_fp16(outputs)\n        super().on_test_batch_end(outputs, batch, batch_idx, dataloader_idx)\n\n    def configure_optimizers(self, impl=None):\n        if impl is None:\n            dtype = self.precision_to_dtype(self.trainer.precision)\n            impl = functools.partial(\n                self.poptorch.optim.Adam,\n                accum_type=dtype,\n                first_order_momentum_accum_type=dtype,\n                second_order_momentum_accum_type=torch.float,\n            )\n        return super().configure_optimizers(impl=impl)\n\n    def squeeze_input_dims(self, features, labels):\n        for key, tensor in features:\n            if isinstance(tensor, torch.Tensor):\n                features[key] = features[key].squeeze(0)\n\n        for key in labels:\n            labels[key] = labels[key].squeeze(0)\n\n        return features, labels\n\n    def convert_from_fp16(self, data: Any) -> Any:\n        \"\"\"\n        Converts tensors from FP16 to FP32. 
Useful to convert the IPU program output data\"\"\"\n        if isinstance(data, collections.abc.Sequence):\n            for idx in range(len(data)):\n                data[idx] = self.convert_from_fp16(data[idx])\n        elif isinstance(data, collections.abc.Mapping):\n            for key in data:\n                data[key] = self.convert_from_fp16(data[key])\n        elif isinstance(data, torch.Tensor) and data.dtype == torch.float16:\n            data = data.float()\n        return data\n\n    def _convert_features_dtype(self, feats):\n        \"\"\"\n        Converts features to trainer precision rather than model precision.\n        Necessary to run IPU on FP16.\n        \"\"\"\n        dtype = self.precision_to_dtype(self.trainer.precision)\n\n        # Convert features to dtype\n        if isinstance(feats, torch.Tensor):\n            feats = feats.to(dtype)\n        elif isinstance(feats, (Data, Batch, dict)):\n            for key, val in feats.items():\n                if isinstance(val, torch.Tensor) and (val.is_floating_point()):\n                    feats[key] = val.to(dtype=dtype)\n        else:\n            raise ValueError(f\"Unsupported feats type `{type(feats)}` : {feats}\")\n        return feats\n\n    def precision_to_dtype(self, precision):\n        return torch.half if precision == \"16-true\" else torch.float\n\n    def get_num_graphs(self, data: Batch):\n        \"\"\"\n        IPU specific method to compute the number of graphs in a Batch,\n        that considers gradient accumulation, multiple IPUs and multiple\n        device iterations. Essential to estimate throughput in graphs/s.\n        \"\"\"\n        num_graphs = torch.max(data.batch, dim=-1).values\n        num_graphs = torch.sum(num_graphs)\n\n        return num_graphs\n"
  },
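`convert_from_fp16` in `ipu_wrapper.py` above walks arbitrarily nested sequences and mappings, casting only fp16 tensors at the leaves. The same traversal pattern, torch-free with a generic callback (`convert_nested` is a hypothetical stand-in; note it uses `collections.abc`, since the bare `collections.Sequence` alias was removed in Python 3.10):

```python
from collections.abc import Mapping, Sequence

def convert_nested(data, convert):
    # Recurse into lists/tuples and dicts; apply `convert` to every leaf.
    # Strings are Sequences too, so exclude them from the recursion.
    if isinstance(data, Sequence) and not isinstance(data, str):
        return [convert_nested(item, convert) for item in data]
    if isinstance(data, Mapping):
        return {key: convert_nested(val, convert) for key, val in data.items()}
    return convert(data)

print(convert_nested({"a": [1, 2], "b": 3}, lambda x: x * 2))  # {'a': [2, 4], 'b': 6}
```

In the IPU wrapper the callback is the fp16-to-fp32 cast, applied to whatever structure the compiled program returns.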
  {
    "path": "graphium/ipu/to_dense_batch.py",
"content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Optional, Tuple\n\nimport torch\nfrom torch import Tensor\nfrom torch_scatter import scatter_add\n\n\ndef to_sparse_batch(x: Tensor, mask_idx: Tensor):\n    \"\"\"\n    Reverse function of `to_dense_batch`\n    \"\"\"\n    return torch.index_select(x.reshape(-1, x.shape[-1]), 0, mask_idx)\n\n\ndef to_sparse_batch_from_packed(x: Tensor, pack_from_node_idx: Tensor):\n    \"\"\"\n    Reverse function of `to_packed_dense_batch`\n    \"\"\"\n    return x[pack_from_node_idx[:, 0], pack_from_node_idx[:, 1]]\n\n\ndef to_dense_batch(\n    x: Tensor,\n    batch: Optional[Tensor] = None,\n    fill_value: float = 0.0,\n    max_num_nodes_per_graph: Optional[int] = None,\n    batch_size: Optional[int] = None,\n    drop_nodes_last_graph: bool = False,\n) -> Tuple[Tensor, Tensor, Tensor]:\n    r\"\"\"Given a sparse batch of node features\n    :math:`\\mathbf{X} \\in \\mathbb{R}^{(N_1 + \\ldots + N_B) \\times F}` (with\n    :math:`N_i` indicating the number of nodes in graph :math:`i`), creates a\n    dense node feature tensor\n    :math:`\\mathbf{X} \\in \\mathbb{R}^{B \\times N_{\\max} \\times F}` (with\n    :math:`N_{\\max} = \\max_i^B N_i`).\n    In addition, a mask of shape :math:`\\mathbf{M} \\in \\{ 0, 1 \\}^{B \\times\n    N_{\\max}}` is returned, holding information about the existence of\n    fake-nodes in the 
dense representation.\n\n    Parameters:\n        x: Node feature matrix\n            :math:`\\mathbf{X} \\in \\mathbb{R}^{(N_1 + \\ldots + N_B) \\times F}`.\n        batch: Batch vector\n            :math:`\\mathbf{b} \\in {\\{ 0, \\ldots, B-1\\}}^N`, which assigns each\n            node to a specific example. Must be ordered. (default: :obj:`None`)\n        fill_value: The value for invalid entries in the\n            resulting dense output tensor. (default: :obj:`0`)\n        max_num_nodes_per_graph: The size of the output node dimension.\n            (default: :obj:`None`)\n        batch_size: The batch size. (default: :obj:`None`)\n        drop_nodes_last_graph: Whether to drop the nodes of the last graph that exceed\n            `max_num_nodes_per_graph`. Useful when the last graph is padding.\n\n    :rtype: (:class:`Tensor`, :class:`BoolTensor`, :class:`LongTensor`)\n    \"\"\"\n    if batch is None and max_num_nodes_per_graph is None:\n        mask = torch.ones(1, x.size(0), dtype=torch.bool, device=x.device)\n        return x.unsqueeze(0), mask, torch.arange(x.size(0), device=x.device)\n\n    if batch is None:\n        batch = x.new_zeros(x.size(0), dtype=torch.long)\n\n    if batch_size is None:\n        assert x.device.type != \"ipu\", (\n            \"When using the IPU the batch size must be \"\n            \"provided during compilation instead of determined at runtime\"\n        )\n        batch_size = int(batch.max()) + 1\n    if x.device.type not in (\"ipu\", \"xla\"):\n        num_nodes = scatter_add(batch.new_ones(x.size(0)), batch, dim=0, dim_size=batch_size)\n    else:\n        # Can't use scatter_add here due to PopTorch bug, will be fixed in SDK 3.3\n        arange = torch.arange(batch_size).unsqueeze(-1)\n        num_nodes = batch.eq(arange).sum(dim=-1)\n    cum_nodes = torch.cat([batch.new_zeros(1), num_nodes.cumsum(dim=0)])\n\n    if max_num_nodes_per_graph is None:  # Must be provided on IPU\n        max_num_nodes_per_graph = int(num_nodes.max())\n\n    idx = torch.arange(batch.size(0), 
dtype=torch.long, device=x.device)\n    idx = (idx - cum_nodes[batch]) + (batch * max_num_nodes_per_graph)\n\n    size = [batch_size * max_num_nodes_per_graph] + list(x.size())[1:]\n\n    out = x.new_full(size, fill_value)\n\n    ##### CHANGES FROM PYG #####\n\n    # In case the last graph represents padding. Drop the overflowing nodes.\n    if drop_nodes_last_graph:\n        num_nodes = num_nodes[:-1]\n        idx[idx >= size[0]] = size[0] - 1\n\n    # Raise error if num_nodes > max_num_nodes\n    if x.device.type != \"ipu\":\n        assert (\n            num_nodes <= max_num_nodes_per_graph\n        ).all(), f\"Encountered graphs with {num_nodes.max()} nodes, greater than `max_num_nodes = {max_num_nodes_per_graph}`\"\n\n    out[idx] = x\n    out = out.view([batch_size, max_num_nodes_per_graph] + list(x.size())[1:])\n\n    # Create a zero-mask on the right device\n    mask_sz = batch_size * max_num_nodes_per_graph\n    if x.device.type in (\"ipu\", \"xla\"):\n        mask = torch.zeros(mask_sz, dtype=torch.int32, device=\"cpu\")\n        mask = mask.to(x.device)\n        # Can't use mask[idx] here due to PopTorch bug, will be fixed in SDK 3.3\n        # mask[idx] = 1\n        # mask = mask.bool()\n        if drop_nodes_last_graph:\n            num_nodes_with_padding = torch.cat((num_nodes, torch.tensor([0], dtype=torch.int32)), dim=0)\n        else:\n            num_nodes_with_padding = num_nodes\n\n        arange = torch.arange(max_num_nodes_per_graph)\n        mask = num_nodes_with_padding.unsqueeze(-1).gt(arange).flatten()\n\n    else:\n        mask = torch.zeros(mask_sz, dtype=torch.bool, device=x.device)\n        mask[idx] = 1\n\n    ##### END CHANGES FROM PYG #####\n\n    mask = mask.view(batch_size, max_num_nodes_per_graph)\n\n    return out, mask, idx  # Added `idx` as a return\n\n\ndef to_packed_dense_batch(\n    x: Tensor,\n    pack_from_node_idx: Tensor,\n    pack_attn_mask: Tensor,\n    fill_value: float = 0.0,\n    max_num_nodes_per_pack: 
Optional[int] = None,\n) -> Tensor:\n    r\"\"\"Given a sparse batch of node features\n    :math:`\\mathbf{X} \\in \\mathbb{R}^{(N_1 + \\ldots + N_B) \\times F}` (with\n    :math:`N_i` indicating the number of nodes in graph :math:`i`), creates a\n    dense node feature tensor\n    :math:`\\mathbf{X} \\in \\mathbb{R}^{P \\times N_{\\max} \\times F}` (with\n    :math:`P` the number of packs and :math:`N_{\\max}` the maximum number of\n    nodes per pack), where each node is placed at the pack and position\n    given by `pack_from_node_idx`.\n\n    Parameters:\n        x: Node feature matrix\n            :math:`\\mathbf{X} \\in \\mathbb{R}^{(N_1 + \\ldots + N_B) \\times F}`.\n        pack_from_node_idx: Tensor of shape `[total_num_nodes, 2]`, where\n            `pack_from_node_idx[i] = [pack_idx, node_idx_in_pack]` gives the\n            pack and the position within the pack of node :math:`i`.\n        pack_attn_mask: Attention mask of shape\n            `[num_packs, max_num_nodes_per_pack, max_num_nodes_per_pack]`,\n            used to infer the number of packs and the pack size.\n        fill_value: The value for invalid entries in the\n            resulting dense output tensor. (default: :obj:`0`)\n        max_num_nodes_per_pack: The size of the output node dimension.\n            Must be provided on IPU. (default: :obj:`None`)\n\n    :rtype: :class:`Tensor`\n    \"\"\"\n\n    if max_num_nodes_per_pack is None:  # Must be provided on IPU\n        max_num_nodes_per_pack = pack_attn_mask.shape[-1]\n\n    # `pack_attn_mask.shape[0]` is the number of packs\n    size = [pack_attn_mask.shape[0], max_num_nodes_per_pack] + list(x.size())[1:]\n\n    out = x.new_full(size, fill_value)\n    out[pack_from_node_idx[:, 0], pack_from_node_idx[:, 1]] = x\n\n    return out\n"
  },
  {
    "path": "graphium/nn/README.md",
    "content": "<div align=\"center\">\n    <img src=\"../../docs/images/logo-title.png\" height=\"80px\">\n    <h3>The Graph Of Life Library.</h3>\n</div>\n\n\n## What is in this folder?\n\nCode for the base graph layer classes.\nThe subfolders contain the different GNN layers, the positional encoding layers, and the graph architectures.\n\n- ✅ `base_graph_layer.py`: contains the `BaseGraphStructure` and `BaseGraphModule` classes, which should be inherited by all GNN layers\n- ✅ `base_layer.py`: contains the base classes for the different layers, notably `FCLayer` for feedforward layers\n- `residual_connections.py`: code for the residual connections\n- `utils.py`: utilities for mixed precision\n"
  },
  {
    "path": "graphium/nn/__init__.py",
    "content": ""
  },
  {
    "path": "graphium/nn/architectures/README.md",
    "content": "<div align=\"center\">\n    <img src=\"../../docs/images/logo-title.png\" height=\"80px\">\n    <h3>The Graph Of Life Library.</h3>\n</div>\n\n\n## What is in this folder?\n\n- ✅ `encoder_manager.py`: the encoder manager, which manages the positional encoders and pools them at the correct level as input to the GNNs\n- ✅ `global_architectures.py`: `FullGraphNetwork`, the architecture that runs all the GNN layers\n- `pyg_architectures.py`: `FeedForwardPyg`, the base class for the different PyG layers"
  },
  {
    "path": "graphium/nn/architectures/__init__.py",
    "content": "from .global_architectures import FeedForwardNN\nfrom .global_architectures import FullGraphMultiTaskNetwork\nfrom .global_architectures import TaskHeads\nfrom .global_architectures import GraphOutputNN\nfrom .pyg_architectures import FeedForwardPyg\nfrom .global_architectures import EnsembleFeedForwardNN\n"
  },
  {
    "path": "graphium/nn/architectures/encoder_manager.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Iterable, Dict, Any, Optional\nfrom torch_geometric.data import Batch\n\n# Misc imports\nimport inspect\nfrom copy import deepcopy\n\n# Torch imports\nfrom torch import Tensor, nn\nimport torch\n\nfrom graphium.data.utils import get_keys\nfrom graphium.nn.encoders import (\n    laplace_pos_encoder,\n    mlp_encoder,\n    signnet_pos_encoder,\n    gaussian_kernel_pos_encoder,\n)\n\nPE_ENCODERS_DICT = {\n    \"laplacian_pe\": laplace_pos_encoder.LapPENodeEncoder,\n    \"mlp\": mlp_encoder.MLPEncoder,\n    \"cat_mlp\": mlp_encoder.CatMLPEncoder,\n    \"signnet\": signnet_pos_encoder.SignNetNodeEncoder,\n    \"gaussian_kernel\": gaussian_kernel_pos_encoder.GaussianKernelPosEncoder,\n}\n\n\nclass EncoderManager(nn.Module):\n    def __init__(\n        self,\n        pe_encoders_kwargs: Optional[Dict[str, Any]] = None,\n        max_num_nodes_per_graph: Optional[int] = None,\n        name: str = \"encoder_manager\",\n    ):\n        r\"\"\"\n        Class that runs multiple encoders in parallel and concatenates / pools their outputs.\n        Parameters:\n\n            pe_encoders_kwargs:\n                keyword arguments to use for the initialization of all positional encoding encoders\n                can use the class PE_ENCODERS_DICT: \"la_encoder\" (tested), \"mlp_encoder\" (not 
tested), \"signnet_encoder\" (not tested)\n\n            name:\n                Name attributed to the current network, for display and printing\n                purposes.\n        \"\"\"\n\n        super().__init__()\n        self.name = name\n        self.max_num_nodes_per_graph = max_num_nodes_per_graph\n        if pe_encoders_kwargs is not None:\n            max_nodes = pe_encoders_kwargs.pop(\"max_num_nodes_per_graph\", None)\n            if max_nodes is not None:\n                if self.max_num_nodes_per_graph is not None:\n                    assert (\n                        self.max_num_nodes_per_graph == max_nodes\n                    ), f\"max_num_nodes_per_graph mismatch {self.max_num_nodes_per_graph}!={max_nodes}\"\n                self.max_num_nodes_per_graph = max_nodes\n\n        self.pe_encoders_kwargs = deepcopy(pe_encoders_kwargs)\n        self.pe_encoders = self._initialize_positional_encoders(pe_encoders_kwargs)\n\n    def _initialize_positional_encoders(self, pe_encoders_kwargs: Dict[str, Any]) -> Optional[nn.ModuleDict]:\n        r\"\"\"Initialize the positional encoders for each positional/structural encodings.\n        Parameters:\n\n            pe_encoders_kwargs: key-word arguments to use for the initialization of all positional encoding encoders\n\n        Returns:\n            pe_encoders: a nn.ModuleDict containing all positional encoders specified by encoder_name in pe_encoders_kwargs[\"encoders\"]\n        \"\"\"\n\n        if (pe_encoders_kwargs is None) or (len(pe_encoders_kwargs) == 0):\n            return\n\n        pe_encoders = nn.ModuleDict()\n\n        # Pooling options here for pe encoders\n        self.pe_pool = pe_encoders_kwargs[\"pool\"]\n        pe_out_dim = pe_encoders_kwargs.get(\"out_dim\", None)\n        edge_pe_out_dim = pe_encoders_kwargs.get(\"edge_out_dim\", None)\n        in_dim_dict = pe_encoders_kwargs[\"in_dims\"]\n\n        # Loop every positional encoding to assign it\n        for encoder_name, 
encoder_kwargs in pe_encoders_kwargs[\"encoders\"].items():\n            encoder_kwargs = deepcopy(encoder_kwargs)\n            encoder_type = encoder_kwargs.pop(\"encoder_type\")\n            output_keys = encoder_kwargs[\"output_keys\"]\n            encoder = PE_ENCODERS_DICT[encoder_type]\n\n            # Get the input dimension associated with each of the encoder's input keys\n\n            this_in_dims = {}\n\n            for key in encoder_kwargs.get(\"input_keys\", []):\n                if key in in_dim_dict:\n                    this_in_dims[key] = in_dim_dict[key]\n                else:\n                    raise ValueError(\n                        f\"Key '{key}' for encoder '{encoder_name}' not found in `in_dim_dict`.\\n Available keys: {in_dim_dict.keys()}\"\n                    )\n\n            # Parse the in_dims based on Encoder's signature\n            accepted_keys = inspect.signature(encoder).parameters.keys()\n            if all([key in accepted_keys for key in this_in_dims.keys()]):\n                pass\n            elif \"in_dim\" in accepted_keys:\n                if len(this_in_dims) == 1 or encoder_type == \"laplacian_pe\":\n                    this_in_dims = {\"in_dim\": list(this_in_dims.values())[0]}\n                elif len(this_in_dims) > 1 and encoder_type == \"cat_mlp\":\n                    this_in_dims = {\"in_dim\": list(this_in_dims.values())}\n                else:\n                    raise ValueError(\n                        f\"All `in_dims` must be equal for encoder {encoder_name} unless edge_mlp is used. Provided: {this_in_dims}, {encoder_type}\"\n                    )\n            else:\n                raise ValueError(\n                    f\"`in_dim` not understood for encoder {encoder_name}. Provided: {this_in_dims}. 
Accepted keys are: {accepted_keys}\"\n                )\n\n            # Add the max_num_nodes_per_graph if it's in the accepted input keys\n            if \"max_num_nodes_per_graph\" in accepted_keys:\n                encoder_kwargs[\"max_num_nodes_per_graph\"] = self.max_num_nodes_per_graph\n\n            # Initialize the pe_encoder layer\n            if output_keys[0] == \"feat\":\n                pe_out_dim2 = encoder_kwargs.pop(\"out_dim\", None)\n                if pe_out_dim2 is not None:\n                    assert pe_out_dim == pe_out_dim2, f\"values mismatch {pe_out_dim}!={pe_out_dim2}\"\n                pe_encoders[encoder_name] = encoder(out_dim=pe_out_dim, **this_in_dims, **encoder_kwargs)\n            elif output_keys[0] == \"edge_feat\":\n                pe_out_dim2 = encoder_kwargs.pop(\"out_dim\", None)\n                if pe_out_dim2 is not None:\n                    assert edge_pe_out_dim == pe_out_dim2, f\"values mismatch {edge_pe_out_dim}!={pe_out_dim2}\"\n                pe_encoders[encoder_name] = encoder(out_dim=edge_pe_out_dim, **this_in_dims, **encoder_kwargs)\n            else:\n                pe_encoders[encoder_name] = encoder(**this_in_dims, **encoder_kwargs)\n\n        return pe_encoders\n\n    def forward(self, g: Batch) -> Batch:\n        r\"\"\"\n        Forward pass of the PE encoders and pooling\n\n        Parameters:\n            g:\n                pyg Batch on which the convolution is done.\n                Must contain the following elements:\n\n                - Node key `\"feat\"`: `torch.Tensor[..., N, Din]`.\n                  Input node feature tensor, before the network.\n                  `N` is the number of nodes, `Din` is the input features dimension ``self.pre_nn.in_dim``\n\n                - Edge key `\"edge_feat\"`: `torch.Tensor[..., N, Ein]` **Optional**.\n                  The edge features to use. It will be ignored if the\n                  model doesn't support edge features or if\n                  `self.in_dim_edges==0`.\n\n                - Other keys related to positional encodings `\"pos_enc_feats_sign_flip\"`,\n                  `\"pos_enc_feats_no_flip\"`.\n\n        Returns:\n            g:\n                pyg Batch with the positional encodings added to the graph\n        \"\"\"\n        # Apply the positional encoders\n        pe_pooled = self.forward_positional_encoding(g)\n\n        # Add the processed positional encodings to the graphs.\n        # If the key is already present, concatenate the pe_pooled to the pre-existing feature.\n        for pe_key, this_pe in pe_pooled.items():\n            feat = this_pe\n            if pe_key in get_keys(g):\n                feat = torch.cat((feat, g[pe_key]), dim=-1)\n            g[pe_key] = feat\n        return g\n\n    def forward_positional_encoding(self, g: Batch) -> Dict[str, Tensor]:\n        \"\"\"\n        Forward pass for the positional encodings (PE),\n        with each PE having its own encoder defined in `self.pe_encoders`.\n        All the positional encodings with the same keys are pooled together\n        using `self.pe_pool`.\n\n        Parameters:\n            g: pyg Batch containing the node positional encodings\n\n        Returns:\n            pe_node_pooled: The positional / structural encodings go through\n            encoders, then are pooled together according to their keys.\n\n        \"\"\"\n\n        # Return an empty dict if there are no positional encoders\n        if (self.pe_encoders is None) or len(self.pe_encoders) == 0:\n            return {}\n\n        encoder_outs = []\n        # Run every node and edge positional-encoder\n        for encoder_name, encoder in self.pe_encoders.items():\n            encoder_outs.append(encoder(g, key_prefix=encoder_name))\n\n        # list of dict to dict of list, with concatenation of the tensors\n        pe_cats = {\n            key: 
torch.stack([d[key] for d in encoder_outs if key in d], dim=-1)\n            for key in set().union(*encoder_outs)\n        }\n\n        # Pool the node and edge positional encodings\n        pe_pooled = {}\n        for key, pe_cat in pe_cats.items():\n            pe_pooled[key] = self.forward_simple_pooling(pe_cat, pooling=self.pe_pool, dim=-1)\n\n        return pe_pooled\n\n    def forward_simple_pooling(self, h: Tensor, pooling: str, dim: int) -> Tensor:\n        \"\"\"\n        Apply sum, mean, or max pooling on a Tensor.\n        Parameters:\n            h: the Tensor to pool\n            pooling: string specifying the pooling method\n            dim: the dimension to pool over\n\n        Returns:\n            pooled: the pooled Tensor\n        \"\"\"\n\n        if pooling == \"sum\":\n            pooled = torch.sum(h, dim=dim)\n        elif pooling == \"mean\":\n            pooled = torch.mean(h, dim=dim)\n        elif pooling == \"max\":\n            pooled = torch.max(h, dim=dim).values\n        else:\n            raise ValueError(f\"Pooling method `{pooling}` is not defined\")\n        return pooled\n\n    def make_mup_base_kwargs(self, divide_factor: float = 2.0) -> Dict[str, Any]:\n        \"\"\"\n        Create a 'base' model to be used by the `mup` or `muTransfer` scaling of the model.\n        The base model is usually identical to the regular model, but with the\n        layer widths divided by a given factor (2 by default)\n\n        Parameter:\n            divide_factor: Factor by which to divide the width.\n\n        Returns:\n            pe_kw: the model kwargs where the dimensions are divided by the factor\n        \"\"\"\n        # For the pe-encoders, don't factor the in_dim and in_dim_edges\n        pe_kw = {}  # Returned unchanged if there are no pe-encoders\n        if self.pe_encoders is not None:\n            pe_kw = deepcopy(self.pe_encoders_kwargs)\n            new_pe_kw = {\n                key: encoder.make_mup_base_kwargs(divide_factor=divide_factor, factor_in_dim=False)\n                for key, 
encoder in self.pe_encoders.items()\n            }\n            pe_kw[\"out_dim\"] = round(pe_kw.get(\"out_dim\", 0) / divide_factor)\n            pe_kw[\"edge_out_dim\"] = round(pe_kw.get(\"edge_out_dim\", 0) / divide_factor)\n            for key, enc in pe_kw[\"encoders\"].items():\n                new_pe_kw[key].pop(\"in_dim\", None)\n                new_pe_kw[key].pop(\"in_dim_edges\", None)\n                enc.update(new_pe_kw[key])\n        return pe_kw\n\n    @property\n    def input_keys(self) -> Iterable[str]:\n        r\"\"\"\n        Returns the input keys for all pe-encoders\n\n        Returns:\n            input_keys: the input keys for all pe-encoders\n        \"\"\"\n        if self.pe_encoders is not None:\n            return self.pe_encoders_kwargs[\"input_keys\"]\n        else:\n            raise ValueError(\"pe_encoders is not initialized, so there are no input keys.\")\n\n    @property\n    def in_dims(self) -> Iterable[int]:\n        r\"\"\"\n        Returns the input dimensions for all pe-encoders\n\n        Returns:\n            in_dims: the input dimensions for all pe-encoders\n        \"\"\"\n        if self.pe_encoders is not None:\n            return self.pe_encoders_kwargs[\"in_dims\"]\n        else:\n            raise ValueError(\"pe_encoders is not initialized, so there are no input dimensions.\")\n\n    @property\n    def out_dim(self) -> int:\n        r\"\"\"\n        Returns the output dimension of the pooled embedding from all the pe encoders\n\n        Returns:\n            out_dim: the output dimension of the pooled embedding from all the pe encoders\n        \"\"\"\n        if self.pe_encoders is not None:\n            return self.pe_encoders_kwargs[\"out_dim\"]\n        else:\n            raise ValueError(\"pe_encoders is not initialized, so there is no output dimension.\")\n"
  },
  {
    "path": "graphium/nn/architectures/global_architectures.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Iterable, List, Dict, Literal, Tuple, Union, Callable, Any, Optional, Type\nfrom torch_geometric.data import Batch\nfrom graphium.ipu.to_dense_batch import to_dense_batch\nfrom loguru import logger\n\n# Misc imports\nimport inspect\nfrom copy import deepcopy\nfrom collections import OrderedDict\n\n# Torch imports\nfrom torch import Tensor, nn\nimport torch\nfrom torch_geometric.data import Data\nfrom omegaconf import DictConfig, OmegaConf\n\n# graphium imports\nfrom graphium.data.utils import get_keys\nfrom graphium.nn.base_layers import FCLayer, get_activation, get_norm\nfrom graphium.nn.architectures.encoder_manager import EncoderManager\nfrom graphium.nn.pyg_layers import VirtualNodePyg, parse_pooling_layer_pyg\nfrom graphium.nn.base_graph_layer import BaseGraphModule, BaseGraphStructure\nfrom graphium.nn.residual_connections import (\n    ResidualConnectionBase,\n    ResidualConnectionWeighted,\n    ResidualConnectionRandom,\n)\nfrom graphium.nn.utils import MupMixin\nfrom graphium.ipu.ipu_utils import import_poptorch, is_running_on_ipu\n\npoptorch = import_poptorch(raise_error=False)\n\nimport collections\n\n\nclass FeedForwardNN(nn.Module, MupMixin):\n    def __init__(\n        self,\n        in_dim: int,\n        out_dim: int,\n        hidden_dims: Union[List[int], int],\n        
depth: Optional[int] = None,\n        activation: Union[str, Callable] = \"relu\",\n        last_activation: Union[str, Callable] = \"none\",\n        dropout: float = 0.0,\n        last_dropout: float = 0.0,\n        normalization: Union[str, Callable] = \"none\",\n        first_normalization: Union[str, Callable] = \"none\",\n        last_normalization: Union[str, Callable] = \"none\",\n        residual_type: str = \"none\",\n        residual_skip_steps: int = 1,\n        name: str = \"LNN\",\n        layer_type: Union[str, nn.Module] = \"fc\",\n        layer_kwargs: Optional[Dict] = None,\n        last_layer_is_readout: bool = False,\n    ):\n        r\"\"\"\n        A flexible neural network architecture, with variable hidden dimensions,\n        support for multiple layer types, and support for different residual\n        connections.\n\n        Parameters:\n\n            in_dim:\n                Input feature dimensions of the layer\n\n            out_dim:\n                Output feature dimensions of the layer\n\n            hidden_dims:\n                Either an integer specifying all the hidden dimensions,\n                or a list of dimensions in the hidden layers.\n                Be careful, the \"simple\" residual type only supports\n                hidden dimensions of the same value.\n\n            depth:\n                If `hidden_dims` is an integer, `depth` is 1 + the number of\n                hidden layers to use.\n                If `hidden_dims` is a list, then\n                `depth` must be `None` or equal to `len(hidden_dims) + 1`\n\n            activation:\n                activation function to use in the hidden layers.\n\n            last_activation:\n                activation function to use in the last layer.\n\n            dropout:\n                The ratio of units to dropout. Must be between 0 and 1\n\n            last_dropout:\n                The ratio of units to dropout for the last_layer. 
Must be between 0 and 1\n\n            normalization:\n                Normalization to use. Choices:\n\n                - \"none\" or `None`: No normalization\n                - \"batch_norm\": Batch normalization\n                - \"layer_norm\": Layer normalization\n                - `Callable`: Any callable function\n\n            first_normalization:\n                Normalization to use **before** the first layer.\n                Same choices as `normalization`\n\n            last_normalization:\n                Normalization to use in the last layer.\n                Same choices as `normalization`\n\n            residual_type:\n                - \"none\": No residual connection\n                - \"simple\": Residual connection similar to the ResNet architecture.\n                  See class `ResidualConnectionSimple`\n                - \"weighted\": Residual connection similar to the ResNet architecture,\n                  but with weights applied before the summation. See class `ResidualConnectionWeighted`\n                - \"concat\": Residual connection where the residual is concatenated instead\n                  of being added.\n                - \"densenet\": Residual connection where the residuals of all previous layers\n                  are concatenated. This leads to a strong increase in the number of parameters\n                  if there are multiple hidden layers.\n\n            residual_skip_steps:\n                The number of steps to skip between each residual connection.\n                If `1`, all the layers are connected. 
If `2`, half of the\n                layers are connected.\n\n            name:\n                Name attributed to the current network, for display and printing\n                purposes.\n\n            layer_type:\n                The type of layers to use in the network.\n                Either \"fc\" as the `FCLayer`, or a class representing the `nn.Module`\n                to use.\n\n            layer_kwargs:\n                The arguments to be used in the initialization of the layer provided by `layer_type`\n\n            last_layer_is_readout: Whether the last layer should be treated as a readout layer.\n                Allows to use the `mup.MuReadout` from the muTransfer method https://github.com/microsoft/mup\n\n        \"\"\"\n\n        super().__init__()\n\n        # Set the class attributes\n        self.in_dim = in_dim\n        self.out_dim = out_dim\n        if isinstance(hidden_dims, int):\n            self.hidden_dims = [hidden_dims] * (depth - 1)\n        else:\n            self.hidden_dims = list(hidden_dims)\n            assert (depth is None) or (\n                depth == len(self.hidden_dims) + 1\n            ), \"Mismatch between the provided network depth from `hidden_dims` and `depth`\"\n        self.depth = len(self.hidden_dims) + 1\n        self.activation = get_activation(activation)\n        self.last_activation = get_activation(last_activation)\n        self.dropout = dropout\n        self.last_dropout = last_dropout\n        self.normalization = normalization\n        self.first_normalization = get_norm(first_normalization, dim=in_dim)\n        self.last_normalization = last_normalization\n        self.residual_type = None if residual_type is None else residual_type.lower()\n        self.residual_skip_steps = residual_skip_steps\n        self.layer_kwargs = layer_kwargs if layer_kwargs is not None else {}\n        self.name = name\n        self.last_layer_is_readout = last_layer_is_readout\n        self._readout_cache = None\n\n     
   self.full_dims = [self.in_dim] + self.hidden_dims + [self.out_dim]\n        self._parse_layers(layer_type=layer_type, residual_type=residual_type)\n        self._create_layers()\n        self._check_bad_arguments()\n\n    def _parse_layers(self, layer_type, residual_type):\n        # Parse the layer and residuals\n        from graphium.utils.spaces import LAYERS_DICT, RESIDUALS_DICT\n\n        self.layer_class, self.layer_name = self._parse_class_from_dict(layer_type, LAYERS_DICT)\n        self.residual_class, self.residual_name = self._parse_class_from_dict(residual_type, RESIDUALS_DICT)\n\n    def _check_bad_arguments(self):\n        r\"\"\"\n        Raise comprehensive errors if the arguments seem wrong\n        \"\"\"\n        if (self.residual_type == \"simple\") and not (self.hidden_dims[:-1] == self.hidden_dims[1:]):\n            raise ValueError(\n                f\"When using the residual_type={self.residual_type}\"\n                + f\", all elements in the hidden_dims must be equal. 
Provided:{self.hidden_dims}\"\n            )\n\n    def _parse_class_from_dict(\n        self, name_or_class: Union[type, str], class_dict: Dict[str, type]\n    ) -> Tuple[type, str]:\n        r\"\"\"\n        Parse a class from either a string (looked up in `class_dict`) or a class object,\n        and return both the class and its name\n        \"\"\"\n        if isinstance(name_or_class, str):\n            obj_name = name_or_class.lower()\n            obj_class = class_dict[obj_name]\n        elif callable(name_or_class):\n            obj_name = str(name_or_class)\n            obj_class = name_or_class\n        else:\n            raise TypeError(f\"`name_or_class` must be str or callable, provided: {type(name_or_class)}\")\n\n        return obj_class, obj_name\n\n    def _create_residual_connection(self, out_dims: List[int]) -> Tuple[ResidualConnectionBase, List[int]]:\n        r\"\"\"\n        Create the residual connection classes.\n        The `out_dims` are only used if the residual class requires weights\n        \"\"\"\n        if self.residual_class == ResidualConnectionWeighted:\n            # if self.residual_class.has_weights:\n            residual_layer = self.residual_class(\n                skip_steps=self.residual_skip_steps,\n                out_dims=out_dims,\n                dropout=self.dropout,\n                activation=self.activation,\n                normalization=self.normalization,\n                bias=False,\n            )\n        elif self.residual_class == ResidualConnectionRandom:\n            residual_layer = self.residual_class(\n                out_dims=out_dims,\n                skip_steps=self.residual_skip_steps,\n            )\n        else:\n            residual_layer = self.residual_class(skip_steps=self.residual_skip_steps)\n\n        residual_out_dims = residual_layer.get_true_out_dims(self.full_dims[1:])\n\n        return residual_layer, residual_out_dims\n\n    def _create_layers(self):\n        r\"\"\"\n        Create all the necessary layers for the network.\n        It's a bit 
complicated to explain what's going on in this function,\n        but it must manage the varying features sizes caused by:\n\n        - The presence of different types of residual connections\n        \"\"\"\n\n        self.residual_layer, residual_out_dims = self._create_residual_connection(out_dims=self.full_dims[1:])\n\n        # Create a ModuleList of the GNN layers\n        self.layers = nn.ModuleList()\n        this_in_dim = self.full_dims[0]\n        this_activation = self.activation\n        this_norm = self.normalization\n        this_dropout = self.dropout\n\n        for ii in range(self.depth):\n            this_out_dim = self.full_dims[ii + 1]\n            other_kwargs = {}\n            sig = inspect.signature(self.layer_class)\n            key_args = [p.name for p in sig.parameters.values()]\n            if ii == self.depth - 1:\n                this_activation = self.last_activation\n                this_norm = self.last_normalization\n                this_dropout = self.last_dropout\n                if self.last_layer_is_readout and (\"is_readout_layer\" in key_args):\n                    other_kwargs[\"is_readout_layer\"] = self.last_layer_is_readout\n\n            # Create the layer\n            self.layers.append(\n                self.layer_class(\n                    in_dim=this_in_dim,\n                    out_dim=this_out_dim,\n                    activation=this_activation,\n                    dropout=this_dropout,\n                    normalization=this_norm,\n                    **self.layer_kwargs,\n                    **other_kwargs,\n                )\n            )\n\n            if ii < len(residual_out_dims):\n                this_in_dim = residual_out_dims[ii]\n\n    @property\n    def cache_readouts(self) -> bool:\n        \"\"\"Whether the readout cache is enabled\"\"\"\n        return isinstance(self._readout_cache, dict)\n\n    def _enable_readout_cache(self):\n        \"\"\"\n        Enable the readout cache.\n        Due to 
the usage of a dict, it only saves readouts for a single batch at a time\n        \"\"\"\n        if not self.cache_readouts:\n            self._readout_cache = {}\n\n    def _disable_readout_cache(self):\n        \"\"\"Disable the readout cache\"\"\"\n        self._readout_cache = None\n\n    def drop_layers(self, depth: int) -> None:\n        r\"\"\"\n        Remove the last `depth` layers of the model.\n        \"\"\"\n\n        assert depth >= 0\n        assert depth <= len(self.layers)\n\n        if depth > 0:\n            self.layers = self.layers[:-depth]\n\n    def add_layers(self, layers: nn.ModuleList) -> None:\n        r\"\"\"\n        Add layers to the end of the model.\n        \"\"\"\n        assert isinstance(layers, nn.ModuleList)\n        assert len(layers) > 0\n        if len(self.layers) > 0:\n            assert layers[0].in_dim == self.layers[-1].out_dim\n\n        self.layers.extend(layers)\n\n    def forward(self, h: torch.Tensor) -> torch.Tensor:\n        r\"\"\"\n        Apply the neural network on the input features.\n\n        Parameters:\n\n            h: `torch.Tensor[..., Din]`:\n                Input feature tensor, before the network.\n                `Din` is the number of input features\n\n        Returns:\n\n            `torch.Tensor[..., Dout]`:\n                Output feature tensor, after the network.\n                `Dout` is the number of output features\n\n        \"\"\"\n        feat_prev = None\n\n        # Apply a normalization before the first layer\n        if self.first_normalization is not None:\n            h = self.first_normalization(h)\n\n        # Apply all neural network layers\n        for ii, layer in enumerate(self.layers):\n            h = layer.forward(h)\n            if ii < len(self.layers) - 1:\n                h, feat_prev = self.residual_layer.forward(h, feat_prev, step_idx=ii)\n\n            if self.cache_readouts:\n                self._readout_cache[ii] = h\n\n        return h\n\n    def get_init_kwargs(self) -> 
Dict[str, Any]:\n        \"\"\"\n        Get a dictionary that can be used to instanciate a new object with identical parameters.\n        \"\"\"\n        return deepcopy(\n            dict(\n                in_dim=self.in_dim,\n                out_dim=self.out_dim,\n                hidden_dims=self.hidden_dims,\n                depth=None,\n                activation=self.activation,\n                last_activation=self.last_activation,\n                dropout=self.dropout,\n                last_dropout=self.last_dropout,\n                normalization=self.normalization,\n                first_normalization=self.first_normalization,\n                last_normalization=self.last_normalization,\n                residual_type=self.residual_type,\n                residual_skip_steps=self.residual_skip_steps,\n                name=self.name,\n                layer_type=self.layer_class,\n                layer_kwargs=self.layer_kwargs,\n                last_layer_is_readout=self.last_layer_is_readout,\n            )\n        )\n\n    def make_mup_base_kwargs(self, divide_factor: float = 2.0, factor_in_dim: bool = False) -> Dict[str, Any]:\n        \"\"\"\n        Create a 'base' model to be used by the `mup` or `muTransfer` scaling of the model.\n        The base model is usually identical to the regular model, but with the\n        layers width divided by a given factor (2 by default)\n\n        Parameter:\n            divide_factor: Factor by which to divide the width.\n            factor_in_dim: Whether to factor the input dimension\n        \"\"\"\n        kwargs = self.get_init_kwargs()\n        kwargs[\"hidden_dims\"] = [round(dim / divide_factor) for dim in kwargs[\"hidden_dims\"]]\n        if factor_in_dim:\n            kwargs[\"in_dim\"] = round(kwargs[\"in_dim\"] / divide_factor)\n        if not self.last_layer_is_readout:\n            kwargs[\"out_dim\"] = round(kwargs[\"out_dim\"] / divide_factor)\n        return kwargs\n\n    def __repr__(self):\n        
r\"\"\"\n        Controls how the class is printed\n        \"\"\"\n        class_str = f\"{self.name}(depth={self.depth}, {self.residual_layer})\\n    \"\n        layer_str = f\"{self.layer_class.__name__}[{' -> '.join(map(str, self.full_dims))}]\"\n\n        return class_str + layer_str\n\n\nclass EnsembleFeedForwardNN(FeedForwardNN):\n    def __init__(\n        self,\n        in_dim: int,\n        out_dim: int,\n        hidden_dims: Union[List[int], int],\n        num_ensemble: int,\n        reduction: Union[str, Callable],\n        subset_in_dim: Union[float, int] = 1.0,\n        depth: Optional[int] = None,\n        activation: Union[str, Callable] = \"relu\",\n        last_activation: Union[str, Callable] = \"none\",\n        dropout: float = 0.0,\n        last_dropout: float = 0.0,\n        normalization: Union[str, Callable] = \"none\",\n        first_normalization: Union[str, Callable] = \"none\",\n        last_normalization: Union[str, Callable] = \"none\",\n        residual_type: str = \"none\",\n        residual_skip_steps: int = 1,\n        name: str = \"LNN\",\n        layer_type: Union[str, nn.Module] = \"ens-fc\",\n        layer_kwargs: Optional[Dict] = None,\n        last_layer_is_readout: bool = False,\n    ):\n        r\"\"\"\n        An ensemble of flexible neural network architectures, with variable hidden dimensions,\n        support for multiple layer types, and support for different residual\n        connections.\n\n        Parameters:\n\n            in_dim:\n                Input feature dimensions of the layer\n\n            out_dim:\n                Output feature dimensions of the layer\n\n            hidden_dims:\n                Either an integer specifying all the hidden dimensions,\n                or a list of dimensions in the hidden layers.\n                Be careful, the \"simple\" residual type only supports\n                hidden dimensions of the same value.\n\n            num_ensemble:\n                Number of MLPs that 
run in parallel.\n\n            reduction:\n                Reduction to use at the end of the MLP. Choices:\n\n                - \"none\" or `None`: No reduction\n                - \"mean\": Mean reduction\n                - \"sum\": Sum reduction\n                - \"max\": Max reduction\n                - \"min\": Min reduction\n                - \"median\": Median reduction\n                - `Callable`: Any callable function. Must take `dim` as a keyword argument.\n\n            subset_in_dim:\n                If float, ratio of the subset of the ensemble to use. Must be between 0 and 1.\n                If int, number of elements to subset from in_dim.\n                If `None`, the subset_in_dim is set to `1.0`.\n                A different subset is used for each ensemble.\n                Only valid if the input shape is `[B, Din]`.\n\n            depth:\n                If `hidden_dims` is an integer, `depth` is 1 + the number of\n                hidden layers to use.\n                If `hidden_dims` is a list, then\n                `depth` must be `None` or equal to `len(hidden_dims) + 1`\n\n            activation:\n                activation function to use in the hidden layers.\n\n            last_activation:\n                activation function to use in the last layer.\n\n            dropout:\n                The ratio of units to dropout. Must be between 0 and 1\n\n            last_dropout:\n                The ratio of units to dropout for the last_layer. Must be between 0 and 1\n\n            normalization:\n                Normalization to use. 
Choices:\n\n                - \"none\" or `None`: No normalization\n                - \"batch_norm\": Batch normalization\n                - \"layer_norm\": Layer normalization\n                - `Callable`: Any callable function\n\n            first_normalization:\n                Whether to use batch normalization **before** the first layer\n\n            last_normalization:\n                Whether to use batch normalization in the last layer\n\n            residual_type:\n                - \"none\": No residual connection\n                - \"simple\": Residual connection similar to the ResNet architecture.\n                  See class `ResidualConnectionSimple`\n                - \"weighted\": Residual connection similar to the Resnet architecture,\n                  but with weights applied before the summation. See class `ResidualConnectionWeighted`\n                - \"concat\": Residual connection where the residual is concatenated instead\n                  of being added.\n                - \"densenet\": Residual connection where the residual of all previous layers\n                  are concatenated. This leads to a strong increase in the number of parameters\n                  if there are multiple hidden layers.\n\n            residual_skip_steps:\n                The number of steps to skip between each residual connection.\n                If `1`, all the layers are connected. 
If `2`, half of the\n                layers are connected.\n\n            name:\n                Name attributed to the current network, for display and printing\n                purposes.\n\n            layer_type:\n                The type of layers to use in the network.\n                Either \"ens-fc\" as the `EnsembleFCLayer`, or a class representing the `nn.Module`\n                to use.\n\n            layer_kwargs:\n                The arguments to be used in the initialization of the layer provided by `layer_type`\n\n            last_layer_is_readout: Whether the last layer should be treated as a readout layer.\n                Allows using the `mup.MuReadout` from the muTransfer method https://github.com/microsoft/mup\n\n        \"\"\"\n\n        # Parse the ensemble arguments\n        if layer_kwargs is None:\n            layer_kwargs = {}\n        layer_kwargs[\"num_ensemble\"] = self._parse_num_ensemble(num_ensemble, layer_kwargs)\n        self.num_ensemble = layer_kwargs[\"num_ensemble\"]\n\n        # Parse the subset of the input dimension\n        self.subset_in_dim, self.subset_idx = self._parse_subset_in_dim(in_dim, subset_in_dim, num_ensemble)\n\n        super().__init__(\n            in_dim=in_dim,\n            out_dim=out_dim,\n            hidden_dims=hidden_dims,\n            depth=depth,\n            activation=activation,\n            last_activation=last_activation,\n            dropout=dropout,\n            last_dropout=last_dropout,\n            normalization=normalization,\n            first_normalization=first_normalization,\n            last_normalization=last_normalization,\n            residual_type=residual_type,\n            residual_skip_steps=residual_skip_steps,\n            name=name,\n            layer_type=layer_type,\n            layer_kwargs=layer_kwargs,\n            last_layer_is_readout=last_layer_is_readout,\n        )\n\n        # Parse the reduction\n        self.reduction = reduction\n        self.reduction_fn = self._parse_reduction(reduction)\n\n    def _create_layers(self):\n 
       self.full_dims[0] = self.subset_in_dim\n        super()._create_layers()\n\n    def _parse_num_ensemble(self, num_ensemble: int, layer_kwargs) -> int:\n        r\"\"\"\n        Parse the num_ensemble argument.\n        \"\"\"\n        num_ensemble_out = num_ensemble\n\n        # Get the num_ensemble from the layer_kwargs if it exists\n        num_ensemble_2 = None\n        if layer_kwargs is None:\n            layer_kwargs = {}\n        else:\n            num_ensemble_2 = layer_kwargs.get(\"num_ensemble\", None)\n\n        if num_ensemble is None:\n            num_ensemble_out = num_ensemble_2\n\n        # Check that the two provided num_ensemble values are consistent.\n        # Only compare when both are provided, otherwise the one that is given is used.\n        if (num_ensemble is not None) and (num_ensemble_2 is not None):\n            assert (\n                num_ensemble_2 == num_ensemble\n            ), f\"num_ensemble={num_ensemble} != num_ensemble_2={num_ensemble_2}\"\n\n        # Check that `num_ensemble_out` is not None\n        assert (\n            num_ensemble_out is not None\n        ), f\"num_ensemble={num_ensemble} and num_ensemble_2={num_ensemble_2}\"\n\n        return num_ensemble_out\n\n    def _parse_reduction(self, reduction: Optional[Union[str, Callable]]) -> Optional[Callable]:\n        r\"\"\"\n        Parse the reduction argument.\n        \"\"\"\n\n        if isinstance(reduction, str):\n            reduction = reduction.lower()\n        if reduction is None or reduction == \"none\":\n            return None\n        elif reduction == \"mean\":\n            return torch.mean\n        elif reduction == \"sum\":\n            return torch.sum\n        elif reduction == \"max\":\n\n            def max_vals(x, dim):\n                return torch.max(x, dim=dim).values\n\n            return max_vals\n        elif reduction == \"min\":\n\n            def min_vals(x, dim):\n                return torch.min(x, dim=dim).values\n\n            return min_vals\n        elif reduction == \"median\":\n\n            def median_vals(x, dim):\n                return torch.median(x, 
dim=dim).values\n\n            return median_vals\n        elif callable(reduction):\n            return reduction\n        else:\n            raise ValueError(f\"Unknown reduction {reduction}\")\n\n    def _parse_subset_in_dim(\n        self, in_dim: int, subset_in_dim: Union[float, int], num_ensemble: int\n    ) -> Tuple[float, int]:\n        r\"\"\"\n        Parse the subset_in_dim argument and build the subset_idx.\n\n        If subset_in_dim is a float, it is the ratio of the input features to use by each MLP of the ensemble.\n        If subset_in_dim is an int, it is the number of input features to use by each MLP of the ensemble.\n\n        Parameters:\n\n            in_dim: The number of input features, before subsampling\n\n            subset_in_dim:\n                Ratio of the subset of features to use by each MLP of the ensemble.\n                Must be between 0 and 1. A different subset is used for each ensemble.\n                Only valid if the input shape is `[B, Din]`.\n\n                If None, the subset_in_dim is set to 1.0.\n\n            num_ensemble:\n                Number of MLPs that run in parallel.\n\n        Returns:\n\n                subset_in_dim: The number of input features to use by each MLP of the ensemble.\n                subset_idx: The indices of the features to use by each MLP of the ensemble.\n        \"\"\"\n\n        # Parse the subset_in_dim, make sure value is between 0 and 1\n        subset_idx = None\n        if subset_in_dim is None:\n            return 1.0, None\n        if isinstance(subset_in_dim, int):\n            assert (\n                subset_in_dim > 0 and subset_in_dim <= in_dim\n            ), f\"subset_in_dim={subset_in_dim}, in_dim={in_dim}\"\n        elif isinstance(subset_in_dim, float):\n            assert subset_in_dim > 0.0 and subset_in_dim <= 1.0, f\"subset_in_dim={subset_in_dim}\"\n\n            # Convert to integer value\n            subset_in_dim = int(in_dim * subset_in_dim)\n            if subset_in_dim == 
0:\n                subset_in_dim = 1\n\n        # Create the subset_idx, which is a list of indices to use for each ensemble\n        if subset_in_dim != in_dim:\n            subset_idx = torch.stack([torch.randperm(in_dim)[:subset_in_dim] for _ in range(num_ensemble)])\n\n        return subset_in_dim, subset_idx\n\n    def _parse_layers(self, layer_type, residual_type):\n        # Parse the layer and residuals\n        from graphium.utils.spaces import ENSEMBLE_LAYERS_DICT, RESIDUALS_DICT\n\n        self.layer_class, self.layer_name = self._parse_class_from_dict(layer_type, ENSEMBLE_LAYERS_DICT)\n        self.residual_class, self.residual_name = self._parse_class_from_dict(residual_type, RESIDUALS_DICT)\n\n    def forward(self, h: torch.Tensor) -> torch.Tensor:\n        r\"\"\"\n        Subset the hidden dimension for each MLP,\n        forward the ensemble MLP on the input features,\n        then reduce the output if specified.\n\n        Parameters:\n\n            h: `torch.Tensor[B, Din]` or `torch.Tensor[..., 1, B, Din]` or `torch.Tensor[..., L, B, Din]`:\n\n                Input feature tensor, before the MLP.\n                `Din` is the number of input features, `B` is the batch size, and `L` is the number of ensembles.\n\n        Returns:\n\n            `torch.Tensor[..., L, B, Dout]` or `torch.Tensor[..., B, Dout]`:\n\n                Output feature tensor, after the MLP.\n                `Dout` is the number of output features, `B` is the batch size, and `L` is the number of ensembles.\n                `L` is removed if a reduction is specified.\n        \"\"\"\n        # Subset the input features for each MLP in the ensemble\n        if self.subset_idx is not None:\n            if len(h.shape) != 2:\n                assert (\n                    h.shape[-3] == 1\n                ), f\"Expected shape to be [B, Din] or [..., 1, B, Din] when using `subset_in_dim`, got {h.shape}.\"\n            h = h[..., self.subset_idx].transpose(-2, -3)\n\n        # 
Run the standard forward pass\n        h = super().forward(h)\n\n        # Reduce the output if specified\n        if self.reduction_fn is not None:\n            h = self.reduction_fn(h, dim=-3)\n\n        return h\n\n    def get_init_kwargs(self) -> Dict[str, Any]:\n        \"\"\"\n        Get a dictionary that can be used to instantiate a new object with identical parameters.\n        \"\"\"\n        kw = super().get_init_kwargs()\n        kw[\"num_ensemble\"] = self.num_ensemble\n        kw[\"reduction\"] = self.reduction\n        kw[\"subset_in_dim\"] = self.subset_in_dim\n        return kw\n\n    def __repr__(self):\n        r\"\"\"\n        Controls how the class is printed\n        \"\"\"\n        class_str = f\"{self.name}(depth={self.depth}, {self.residual_layer}, num_ensemble={self.num_ensemble}, reduction={self.reduction})\\n    \"\n        layer_str = f\"{self.layer_class.__name__}[{' -> '.join(map(str, self.full_dims))}]\"\n\n        return class_str + layer_str\n\n\nclass FeedForwardGraph(FeedForwardNN):\n    def __init__(\n        self,\n        in_dim: int,\n        out_dim: int,\n        hidden_dims: Union[List[int], int],\n        layer_type: Union[str, nn.Module],\n        depth: Optional[int] = None,\n        activation: Union[str, Callable] = \"relu\",\n        last_activation: Union[str, Callable] = \"none\",\n        dropout: float = 0.0,\n        last_dropout: float = 0.0,\n        normalization: Union[str, Callable] = \"none\",\n        first_normalization: Union[str, Callable] = \"none\",\n        last_normalization: Union[str, Callable] = \"none\",\n        residual_type: str = \"none\",\n        residual_skip_steps: int = 1,\n        in_dim_edges: int = 0,\n        hidden_dims_edges: List[int] = [],\n        out_dim_edges: Optional[int] = None,\n        name: str = \"GNN\",\n        layer_kwargs: Optional[Dict] = None,\n        virtual_node: str = \"none\",\n        use_virtual_edges: bool = False,\n        last_layer_is_readout: bool = False,\n    ):\n        r\"\"\"\n        A 
flexible neural network architecture, with variable hidden dimensions,\n        support for multiple layer types, and support for different residual\n        connections.\n\n        This class is meant to work with different graph neural networks\n        layers. Any layer must inherit from `graphium.nn.base_graph_layer.BaseGraphStructure`\n        or `graphium.nn.base_graph_layer.BaseGraphLayer`.\n\n        Parameters:\n\n            in_dim:\n                Input feature dimensions of the layer\n\n            out_dim:\n                Output feature dimensions of the layer\n\n            hidden_dims:\n                List of dimensions in the hidden layers.\n                Be careful, the \"simple\" residual type only supports\n                hidden dimensions of the same value.\n\n            layer_type:\n                Type of layer to use. Can be a string or nn.Module.\n\n            depth:\n                If `hidden_dims` is an integer, `depth` is 1 + the number of\n                hidden layers to use. If `hidden_dims` is a `list`, `depth` must\n                be `None`.\n\n            activation:\n                activation function to use in the hidden layers.\n\n            last_activation:\n                activation function to use in the last layer.\n\n            dropout:\n                The ratio of units to dropout. Must be between 0 and 1\n\n            last_dropout:\n                The ratio of units to dropout for the last layer. Must be between 0 and 1\n\n            normalization:\n                Normalization to use. 
Choices:\n\n                - \"none\" or `None`: No normalization\n                - \"batch_norm\": Batch normalization\n                - \"layer_norm\": Layer normalization\n                - `Callable`: Any callable function\n\n            first_normalization:\n                Whether to use batch normalization **before** the first layer\n\n            last_normalization:\n                Whether to use batch normalization in the last layer\n\n            residual_type:\n                - \"none\": No residual connection\n                - \"simple\": Residual connection similar to the ResNet architecture.\n                  See class `ResidualConnectionSimple`\n                - \"weighted\": Residual connection similar to the Resnet architecture,\n                  but with weights applied before the summation. See class `ResidualConnectionWeighted`\n                - \"concat\": Residual connection where the residual is concatenated instead\n                  of being added.\n                - \"densenet\": Residual connection where the residual of all previous layers\n                  are concatenated. This leads to a strong increase in the number of parameters\n                  if there are multiple hidden layers.\n\n            residual_skip_steps:\n                The number of steps to skip between each residual connection.\n                If `1`, all the layers are connected. If `2`, half of the\n                layers are connected.\n\n            in_dim_edges:\n                Input edge-feature dimensions of the network. Keep at 0 if not using\n                edge features, or if the layer doesn't support edges.\n\n            hidden_dims_edges:\n                Hidden dimensions for the edges. Most models don't support it, so it\n                should only be used for those that do, i.e. `GatedGCNLayer`\n\n            out_dim_edges:\n                Output edge-feature dimensions of the network. 
Keep at 0 if not using\n                edge features, or if the layer doesn't support edges. Defaults to the\n                last value of hidden_dims_edges.\n\n            name:\n                Name attributed to the current network, for display and printing\n                purposes.\n\n            layer_type:\n                The type of layers to use in the network.\n                A class that inherits from `graphium.nn.base_graph_layer.BaseGraphStructure`,\n                or one of the following strings\n\n                - \"pyg:gin\": GINConvPyg\n                - \"pyg:gine\": GINEConvPyg\n                - \"pyg:gated-gcn\": GatedGCNPyg\n                - \"pyg:pna-msgpass\": PNAMessagePassingPyg\n\n            layer_kwargs:\n                The arguments to be used in the initialization of the layer provided by `layer_type`\n\n            virtual_node:\n                A string associated to the type of virtual node to use,\n                either `None`, \"none\", \"mean\", \"sum\", \"max\", \"logsum\".\n                See `graphium.nn.pooling_pyg.VirtualNode`.\n\n                The virtual node will not use any residual connection if `residual_type`\n                is \"none\". 
Otherwise, it will use a simple ResNet like residual\n                connection.\n\n            use_virtual_edges:\n                A bool flag used to select if the virtual node should use the edges or not\n\n            last_layer_is_readout: Whether the last layer should be treated as a readout layer.\n                Allows to use the `mup.MuReadout` from the muTransfer method https://github.com/microsoft/mup\n\n        \"\"\"\n\n        # Initialize the additional attributes\n        self.in_dim_edges = in_dim_edges\n        if isinstance(hidden_dims_edges, int):\n            self.hidden_dims_edges = [hidden_dims_edges] * (depth - 1)\n        elif len(hidden_dims_edges) == 0:\n            self.hidden_dims_edges = []\n        else:\n            self.hidden_dims_edges = list(hidden_dims_edges)\n            assert depth is None\n        self.out_dim_edges = (\n            out_dim_edges\n            if out_dim_edges is not None\n            else self.hidden_dims_edges[-1] if self.hidden_dims_edges else 0\n        )\n        self.full_dims_edges = None\n        if len(self.hidden_dims_edges) or self.out_dim_edges > 0:\n            assert self.out_dim_edges > 0, self.out_dim_edges\n            self.full_dims_edges = [self.in_dim_edges] + self.hidden_dims_edges + [self.out_dim_edges]\n\n        self.virtual_node = virtual_node.lower() if virtual_node is not None else \"none\"\n\n        self.use_virtual_edges = use_virtual_edges\n        self.virtual_node_class = self._parse_virtual_node_class()\n\n        # Initialize the parent `FeedForwardNN`\n        super().__init__(\n            in_dim=in_dim,\n            out_dim=out_dim,\n            hidden_dims=hidden_dims,\n            depth=depth,\n            activation=activation,\n            last_activation=last_activation,\n            normalization=normalization,\n            first_normalization=first_normalization,\n            last_normalization=last_normalization,\n            residual_type=residual_type,\n       
     residual_skip_steps=residual_skip_steps,\n            name=name,\n            layer_type=layer_type,\n            dropout=dropout,\n            last_dropout=last_dropout,\n            layer_kwargs=layer_kwargs,\n            last_layer_is_readout=last_layer_is_readout,\n        )\n\n        self.first_normalization_edges = get_norm(first_normalization, dim=in_dim_edges)\n\n    def _check_bad_arguments(self):\n        r\"\"\"\n        Raise comprehensive errors if the arguments seem wrong\n        \"\"\"\n        super()._check_bad_arguments()\n        if (\n            (self.in_dim_edges > 0) or (self.full_dims_edges is not None)\n        ) and not self.layer_class.layer_supports_edges:\n            raise ValueError(f\"Cannot use edge features with class `{self.layer_class}`\")\n\n    def get_nested_key(self, d, target_key):\n        \"\"\"\n        Get the value associated with a key in a nested dictionary.\n\n        Parameters:\n        - d: The dictionary to search in\n        - target_key: The key to search for\n\n        Returns:\n        - The value associated with the key if found, None otherwise\n        \"\"\"\n        if target_key in d:\n            return d[target_key]\n        for key, value in d.items():\n            if isinstance(value, (dict, DictConfig)):\n                nested_result = self.get_nested_key(value, target_key)\n                if nested_result is not None:\n                    return nested_result\n        return None\n\n    def _create_layers(self):\n        r\"\"\"\n        Create all the necessary layers for the network.\n        It's a bit complicated to explain what's going on in this function,\n        but it must manage the varying features sizes caused by:\n\n        - The presence of different types of residual connections\n        - The presence or absence of edges\n        - The output dimensions varying for different networks i.e. 
`GatLayer` outputs different feature sizes according to the number of heads\n        - The presence or absence of virtual nodes\n        - The different possible pooling, and the concatenation of multiple pooling together.\n        \"\"\"\n\n        residual_layer_temp, residual_out_dims = self._create_residual_connection(out_dims=self.full_dims[1:])\n\n        # Create a ModuleList of the GNN layers\n        self.layers = nn.ModuleList()\n        self.virtual_node_layers = nn.ModuleList()\n        this_in_dim = self.full_dims[0]\n        this_activation = self.activation\n        this_norm = self.normalization\n        this_dropout = self.dropout\n\n        # Find the appropriate edge dimensions, depending if edges are used,\n        # And if the residual is required for the edges\n        this_in_dim_edges, this_out_dim_edges = None, None\n        if self.full_dims_edges is not None:\n            this_in_dim_edges, this_out_dim_edges = self.full_dims_edges[0:2]\n            residual_out_dims_edges = residual_layer_temp.get_true_out_dims(self.full_dims_edges[1:])\n        elif self.in_dim_edges > 0:\n            this_in_dim_edges = self.in_dim_edges\n        layer_out_dims_edges = []\n\n        # Create all the layers in a loop\n        for ii in range(self.depth):\n            this_out_dim = self.full_dims[ii + 1]\n\n            # Find the edge key-word arguments depending on the layer type and residual connection\n            this_edge_kwargs = {}\n            if self.layer_class.layer_supports_edges and self.in_dim_edges > 0:\n                this_edge_kwargs[\"in_dim_edges\"] = this_in_dim_edges\n                if \"out_dim_edges\" in inspect.signature(self.layer_class.__init__).parameters.keys():\n                    if self.full_dims_edges is not None:\n                        this_out_dim_edges = self.full_dims_edges[ii + 1]\n                        this_edge_kwargs[\"out_dim_edges\"] = this_out_dim_edges\n                    else:\n                        
this_out_dim_edges = self.get_nested_key(self.layer_kwargs, \"out_dim_edges\")\n                        this_edge_kwargs[\"out_dim_edges\"] = this_out_dim_edges\n                    layer_out_dims_edges.append(this_out_dim_edges)\n\n            # Create the GNN layer\n            self.layers.append(\n                self.layer_class(\n                    in_dim=this_in_dim,\n                    out_dim=this_out_dim,\n                    activation=this_activation,\n                    dropout=this_dropout,\n                    normalization=this_norm,\n                    layer_idx=ii,\n                    layer_depth=self.depth,\n                    **self.layer_kwargs,\n                    **this_edge_kwargs,\n                )\n            )\n\n            # Create the Virtual Node layer, except at the last layer\n            if ii < len(residual_out_dims):\n                self.virtual_node_layers.append(\n                    self.virtual_node_class(\n                        in_dim=this_out_dim * self.layers[-1].out_dim_factor,\n                        out_dim=this_out_dim * self.layers[-1].out_dim_factor,\n                        in_dim_edges=this_out_dim_edges,\n                        out_dim_edges=this_out_dim_edges,\n                        activation=this_activation,\n                        dropout=this_dropout,\n                        normalization=this_norm,\n                        bias=True,\n                        vn_type=self.virtual_node,\n                        residual=self.residual_type is not None,\n                        use_edges=self.use_virtual_edges,\n                    )\n                )\n\n            # Get the true input dimension of the next layer,\n            # by factoring both the residual connection and the just-appended GNN layer\n            if ii < len(residual_out_dims):\n                this_in_dim = residual_out_dims[ii] * self.layers[-1].out_dim_factor\n                if self.full_dims_edges is not None:\n                   
 this_in_dim_edges = residual_out_dims_edges[ii] * self.layers[-1].out_dim_factor\n\n        layer_out_dims = [layer.out_dim_factor * layer.out_dim for layer in self.layers]\n\n        # Initialize residual and pooling layers\n        self.residual_layer, _ = self._create_residual_connection(out_dims=layer_out_dims)\n        if len(layer_out_dims_edges) > 0:\n            self.residual_edges_layer, _ = self._create_residual_connection(out_dims=layer_out_dims_edges)\n        else:\n            self.residual_edges_layer = None\n\n    def _graph_layer_forward(\n        self,\n        layer: BaseGraphModule,\n        g: Batch,\n        feat: Tensor,\n        edge_feat: Optional[Tensor],\n        feat_prev: Optional[Tensor],\n        edge_feat_prev: Optional[Tensor],\n        step_idx: int,\n    ) -> Tuple[Tensor, Optional[Tensor], Optional[Tensor], Optional[Tensor]]:\n        r\"\"\"\n        Apply the *i-th* PyG graph layer, where *i* is the index given by `step_idx`.\n        The layer must inherit from `graphium.nn.base_graph_layer.BaseGraphStructure`\n        or `graphium.nn.base_graph_layer.BaseGraphLayer`, and it is applied\n        differently depending on whether there are edge features.\n\n        Then, the residual is also applied on both the features and the edges (if applicable)\n\n        Parameters:\n\n            layer:\n                The PyG layer used for the convolution\n\n            g:\n                graph on which the convolution is done\n\n            feat (torch.Tensor[..., N, Din]):\n                Node feature tensor, before convolution.\n                `N` is the number of nodes, `Din` is the input features\n\n            edge_feat (torch.Tensor[..., E, Ein]):\n                Edge feature tensor, before convolution.\n                `E` is the number of edges, `Ein` is the input edge features\n\n            feat_prev:\n                Node feature of the previous residual connection, or `None`\n\n            edge_feat_prev:\n                Edge feature of the previous residual connection, or `None`\n\n            step_idx:\n                The current step idx in the forward loop\n\n        Returns:\n\n            feat (torch.Tensor[..., N, Dout]):\n                Node feature tensor, after convolution and residual.\n                `N` is the number of nodes, `Dout` is the output features of the layer and residual\n\n            edge_feat:\n                Edge feature tensor, after convolution and residual.\n                `E` is the number of edges, `Eout` is the output edge features\n\n            feat_prev:\n                Node feature tensor to be used at the next residual connection, or `None`\n\n            edge_feat_prev:\n                Edge feature tensor to be used at the next residual connection, or `None`\n\n        \"\"\"\n\n        # Set node / edge features into the graph\n        g[\"feat\"] = 
feat\n        g[\"edge_feat\"] = edge_feat\n\n        # Apply the GNN layer\n        g = layer(g)\n\n        # Get the node / edge features from the graph\n        feat = g[\"feat\"]\n        edge_feat = g[\"edge_feat\"]\n\n        # Apply the residual layers on the features and edges (if applicable)\n        if step_idx < len(self.layers) - 1:\n            feat, feat_prev = self.residual_layer.forward(feat, feat_prev, step_idx=step_idx)\n            if (self.residual_edges_layer is not None) and (layer.layer_outputs_edges):\n                edge_feat, edge_feat_prev = self.residual_edges_layer.forward(\n                    edge_feat, edge_feat_prev, step_idx=step_idx\n                )\n\n        return feat, edge_feat, feat_prev, edge_feat_prev\n\n    def _parse_virtual_node_class(self) -> type:\n        return VirtualNodePyg\n\n    def _virtual_node_forward(\n        self,\n        g: Data,\n        feat: torch.Tensor,\n        vn_feat: torch.Tensor,\n        step_idx: int,\n        edge_feat: torch.Tensor,\n    ) -> Tuple[torch.Tensor, torch.Tensor, torch.Tensor]:\n        r\"\"\"\n        Apply the *i-th* virtual node layer, where *i* is the index given by `step_idx`.\n\n        Parameters:\n\n            g:\n                graph on which the convolution is done\n\n            feat (torch.Tensor[..., N, Din]):\n                Node feature tensor, before convolution.\n                `N` is the number of nodes, `Din` is the input features\n\n            vn_feat (torch.Tensor[..., M, Din]):\n                Graph feature of the previous virtual node, or `None`\n                `M` is the number of graphs, `Din` is the input features\n                It is added to the result after the MLP, as a residual connection\n\n            step_idx:\n                The current step idx in the forward loop\n\n            edge_feat (torch.Tensor):\n                Edge feature tensor, updated by the virtual node layer if it uses edges\n\n        Returns:\n\n            `feat = torch.Tensor[..., N, Dout]`:\n                Node feature tensor, after convolution and 
residual.\n                `N` is the number of nodes, `Dout` is the output features of the layer and residual\n\n            `vn_feat = torch.Tensor[..., M, Dout]`:\n                Graph feature tensor to be used at the next virtual node, or `None`\n                `M` is the number of graphs, `Dout` is the output features\n\n            `edge_feat`:\n                Edge feature tensor, possibly updated by the virtual node layer\n\n        \"\"\"\n        if step_idx < len(self.virtual_node_layers):\n            feat, vn_feat, edge_feat = self.virtual_node_layers[step_idx].forward(\n                g=g, feat=feat, vn_feat=vn_feat, edge_feat=edge_feat\n            )\n\n        return feat, vn_feat, edge_feat\n\n    def forward(self, g: Batch) -> Batch:\n        r\"\"\"\n        Apply the full graph neural network on the input graph and node features.\n\n        Parameters:\n\n            g:\n                pyg Batch graph on which the convolution is done with the keys:\n\n                - `\"feat\"`: torch.Tensor[..., N, Din]\n                  Node feature tensor, before convolution.\n                  `N` is the number of nodes, `Din` is the input features\n\n                - `\"edge_feat\"` (torch.Tensor[..., N, Ein]):\n                  Edge feature tensor, before convolution.\n                  `N` is the number of nodes, `Ein` is the input edge features\n\n\n        Returns:\n\n            `g`:\n                pyg Batch graph, with the keys `\"feat\"` and `\"edge_feat\"` updated\n                after the network. The output dimension of `\"feat\"` is ``self.out_dim``\n\n        \"\"\"\n\n        # Initialize values of the residuals and virtual node\n        feat_prev = None\n        edge_feat_prev = None\n        vn_feat = 0.0\n        feat = g[\"feat\"]\n        edge_feat 
= g[\"edge_feat\"]\n        # Apply the normalization before the first network layers\n        if self.first_normalization is not None:\n            feat = self.first_normalization(feat)\n        if (self.first_normalization_edges is not None) and (self.in_dim_edges > 0):\n            edge_feat = self.first_normalization_edges(edge_feat)\n\n        # Apply the forward loop of the layers, residuals and virtual nodes\n        for ii, layer in enumerate(self.layers):\n            feat, edge_feat, feat_prev, edge_feat_prev = self._graph_layer_forward(\n                layer=layer,\n                g=g,\n                feat=feat,\n                edge_feat=edge_feat,\n                feat_prev=feat_prev,\n                edge_feat_prev=edge_feat_prev,\n                step_idx=ii,\n            )\n            feat, vn_feat, edge_feat = self._virtual_node_forward(\n                g=g, feat=feat, edge_feat=edge_feat, vn_feat=vn_feat, step_idx=ii\n            )\n\n            if self.cache_readouts:\n                self._readout_cache[ii] = feat\n\n        g[\"feat\"], g[\"edge_feat\"] = feat, edge_feat\n        return g\n\n    def get_init_kwargs(self) -> Dict[str, Any]:\n        \"\"\"\n        Get a dictionary that can be used to instantiate a new object with identical parameters.\n        \"\"\"\n        kwargs = super().get_init_kwargs()\n        new_kwargs = dict(\n            in_dim_edges=self.in_dim_edges,\n            hidden_dims_edges=self.hidden_dims_edges,\n            out_dim_edges=self.out_dim_edges,\n            virtual_node=self.virtual_node,\n            use_virtual_edges=self.use_virtual_edges,\n        )\n        kwargs.update(new_kwargs)\n        return deepcopy(kwargs)\n\n    def make_mup_base_kwargs(\n        self,\n        divide_factor: float = 2.0,\n        factor_in_dim: bool = False,\n    ) -> Dict[str, Any]:\n        \"\"\"\n        Create a 'base' model to be used by the `mup` or `muTransfer` scaling of the model.\n        The base model is 
usually identical to the regular model, but with the\n        layers width divided by a given factor (2 by default)\n\n        Parameters:\n            divide_factor: Factor by which to divide the width.\n            factor_in_dim: Whether to factor the input dimension for the nodes\n\n        Returns:\n            kwargs: Dictionary of parameters to be used to instantiate the base model divided by the factor\n        \"\"\"\n        kwargs = self.get_init_kwargs()\n        kwargs[\"hidden_dims\"] = [round(dim / divide_factor) for dim in kwargs[\"hidden_dims\"]]\n        kwargs[\"hidden_dims_edges\"] = [round(dim / divide_factor) for dim in kwargs[\"hidden_dims_edges\"]]\n        if factor_in_dim:\n            kwargs[\"in_dim\"] = round(kwargs[\"in_dim\"] / divide_factor)\n            kwargs[\"in_dim_edges\"] = round(kwargs[\"in_dim_edges\"] / divide_factor)\n        if not self.last_layer_is_readout:\n            kwargs[\"out_dim\"] = round(kwargs[\"out_dim\"] / divide_factor)\n            kwargs[\"out_dim_edges\"] = round(kwargs[\"out_dim_edges\"] / divide_factor)\n\n        def _recursive_divide_dim(x: collections.abc.Mapping):\n            for k, v in x.items():\n                if isinstance(v, collections.abc.Mapping):\n                    _recursive_divide_dim(v)\n                elif k in [\"in_dim\", \"out_dim\", \"in_dim_edges\", \"out_dim_edges\"]:\n                    x[k] = round(v / divide_factor)\n                elif k in [\"embed_dim\"]:\n                    x[k] = round(v / divide_factor)\n\n        _recursive_divide_dim(kwargs[\"layer_kwargs\"])\n\n        return kwargs\n\n    def __repr__(self):\n        r\"\"\"\n        Controls how the class is printed\n        \"\"\"\n        class_str = f\"{self.name}(depth={self.depth}, {self.residual_layer})\\n    \"\n        layer_str = f\"{self.layer_class.__name__}[{' -> '.join(map(str, self.full_dims))}]\\n    \"\n\n        return class_str + layer_str\n\n\nclass FullGraphMultiTaskNetwork(nn.Module, 
MupMixin):\n    def __init__(\n        self,\n        gnn_kwargs: Dict[str, Any],\n        pre_nn_kwargs: Optional[Dict[str, Any]] = None,\n        pre_nn_edges_kwargs: Optional[Dict[str, Any]] = None,\n        pe_encoders_kwargs: Optional[Dict[str, Any]] = None,\n        task_heads_kwargs: Optional[Dict[str, Any]] = None,\n        graph_output_nn_kwargs: Optional[Dict[str, Any]] = None,\n        accelerator_kwargs: Optional[Dict[str, Any]] = None,\n        num_inference_to_average: int = 1,\n        last_layer_is_readout: bool = False,\n        name: str = \"FullGNN\",\n    ):\n        r\"\"\"\n        Class implementing a full graph neural network architecture,\n        including the pre-processing MLP and the post-processing MLP.\n\n        Parameters:\n\n            gnn_kwargs:\n                key-word arguments to use for the initialization of the GNN network\n                using the class `FeedForwardGraph`.\n                It must respect the following criteria:\n\n                - gnn_kwargs[\"in_dim\"] must be equal to pre_nn_kwargs[\"out_dim\"]\n                - gnn_kwargs[\"out_dim\"] must be equal to graph_output_nn_kwargs[\"in_dim\"]\n\n            pe_encoders_kwargs:\n                key-word arguments to use for the initialization of all positional encoding encoders.\n                See the class `EncoderManager` for more details.\n\n            pre_nn_kwargs:\n                key-word arguments to use for the initialization of the pre-processing\n                MLP network of the node features before the GNN, using the class `FeedForwardNN`.\n                If `None`, there won't be a pre-processing MLP.\n\n            pre_nn_edges_kwargs:\n                key-word arguments to use for the initialization of the pre-processing\n                MLP network of the edge features before the GNN, using the class `FeedForwardNN`.\n                If `None`, there won't be a pre-processing MLP.\n\n            
task_heads_kwargs:\n                This argument is a list of dictionaries containing the arguments for task heads. Each argument is used to\n                initialize a task-specific MLP.\n\n            graph_output_nn_kwargs:\n                This argument is a list of dictionaries corresponding to the arguments for a FeedForwardNN.\n                Each dict of arguments is used to initialize a shared MLP.\n\n            accelerator_kwargs:\n                key-word arguments specific to the accelerator being used,\n                e.g. pipeline split points\n\n            num_inference_to_average:\n                Number of inferences to average at val/test time. This is used to avoid the noise introduced\n                by positional encodings with sign-flips. In case no such encoding is given,\n                this parameter is ignored.\n                NOTE: The inference time will be slowed down proportionally to this parameter.\n\n            last_layer_is_readout: Whether the last layer should be treated as a readout layer.\n                Allows using the `mup.MuReadout` from the muTransfer method https://github.com/microsoft/mup\n\n            name:\n                Name attributed to the current network, for display and printing\n                purposes.\n        \"\"\"\n\n        super().__init__()\n        self.name = name\n        self.num_inference_to_average = num_inference_to_average\n        self.last_layer_is_readout = last_layer_is_readout\n        self._concat_last_layers = None\n        self.pre_nn, self.pre_nn_edges, self.task_heads = None, None, None\n        self.pe_encoders_kwargs = deepcopy(pe_encoders_kwargs)\n        self.graph_output_nn_kwargs = graph_output_nn_kwargs\n        self.encoder_manager = EncoderManager(pe_encoders_kwargs)\n        self.max_num_nodes_per_graph = None\n        self.max_num_edges_per_graph = None\n        self._cache_readouts = False\n\n        # Initialize the pre-processing neural net for nodes 
(applied directly on node features)\n        if pre_nn_kwargs is not None:\n            name = pre_nn_kwargs.pop(\"name\", \"pre-NN\")\n            self.pre_nn = FeedForwardNN(**pre_nn_kwargs, name=name)\n            next_in_dim = self.pre_nn.out_dim\n            gnn_kwargs.setdefault(\"in_dim\", next_in_dim)\n            assert (\n                next_in_dim == gnn_kwargs[\"in_dim\"]\n            ), f\"Inconsistent dimensions between pre-NN output ({next_in_dim}) and GNN input ({gnn_kwargs['in_dim']})\"\n\n        # Initialize the pre-processing neural net for edges (applied directly on edge features)\n        if pre_nn_edges_kwargs is not None:\n            name = pre_nn_edges_kwargs.pop(\"name\", \"pre-NN-edges\")\n            self.pre_nn_edges = FeedForwardNN(**pre_nn_edges_kwargs, name=name)\n            next_in_dim = self.pre_nn_edges.out_dim\n            gnn_kwargs.setdefault(\"in_dim_edges\", next_in_dim)\n            assert (\n                next_in_dim == gnn_kwargs[\"in_dim_edges\"]\n            ), f\"Inconsistent dimensions between pre-NN-edges output ({next_in_dim}) and GNN input ({gnn_kwargs['in_dim_edges']})\"\n\n        # Initialize the graph neural net (applied after the pre_nn)\n        name = gnn_kwargs.pop(\"name\", \"GNN\")\n        gnn_class = self._parse_feed_forward_gnn(gnn_kwargs)\n        gnn_kwargs.setdefault(\n            \"last_layer_is_readout\", self.last_layer_is_readout and (task_heads_kwargs is None)\n        )\n        self.gnn = gnn_class(**gnn_kwargs, name=name)\n        next_in_dim = self.gnn.out_dim\n\n        if task_heads_kwargs is not None:\n            self.task_heads = TaskHeads(\n                in_dim=self.out_dim,\n                in_dim_edges=self.out_dim_edges,\n                task_heads_kwargs=task_heads_kwargs,\n                graph_output_nn_kwargs=graph_output_nn_kwargs,\n            )\n            self._task_heads_kwargs = task_heads_kwargs\n\n        if accelerator_kwargs is not None:\n            
accelerator = accelerator_kwargs[\"_accelerator\"]\n            if accelerator == \"ipu\":\n                self._apply_ipu_options(accelerator_kwargs)\n\n        self._check_bad_arguments()\n\n    @staticmethod\n    def _parse_feed_forward_gnn(\n        gnn_kwargs: Dict[str, Any],\n    ):\n        \"\"\"\n        Parse the key-word arguments to determine which `FeedForward` class to use.\n        Parameters:\n            gnn_kwargs: key-word arguments to use for the initialization of the GNN network.\n\n        Returns:\n            `FeedForwardGraph` class\n        \"\"\"\n\n        return FeedForwardGraph\n\n    def _check_bad_arguments(self):\n        r\"\"\"\n        Raise comprehensive errors if the arguments seem wrong\n        \"\"\"\n        if self.pre_nn is not None:\n            if self.pre_nn.out_dim != self.gnn.in_dim:\n                raise ValueError(\n                    f\"`self.pre_nn.out_dim` must be equal to `self.gnn.in_dim`. \"\n                    + f\"Provided: {self.pre_nn.out_dim} and {self.gnn.in_dim}\"\n                )\n\n        if self.task_heads is not None:\n            edge_level_tasks = [\n                task_name\n                for task_name, head_kwargs in self._task_heads_kwargs.items()\n                if head_kwargs[\"task_level\"] == \"edge\"\n            ]\n            if len(edge_level_tasks) > 0 and (\n                not self.gnn.layer_class.layer_supports_edges or self.out_dim_edges < 1\n            ):\n                raise ValueError(\n                    f\"Task heads have edge level tasks {', '.join(edge_level_tasks)}, but edge level tasks cannot be used with layer class `{self.gnn.layer_class}`\"\n                )\n            graph_level_tasks = [\n                task_name\n                for task_name, head_kwargs in self._task_heads_kwargs.items()\n                if head_kwargs[\"task_level\"] == \"graph\"\n            ]\n            if len(graph_level_tasks) > 0 and self.graph_output_nn_kwargs[\"graph\"][\"pooling\"] == [\"none\"]:\n                raise ValueError(\n                    f\"Task heads have graph level tasks {', '.join(graph_level_tasks)}, but pooling is none.\"\n                )\n\n    def _apply_ipu_options(self, ipu_kwargs):\n        gnn_layers_per_ipu = ipu_kwargs.get(\"gnn_layers_per_ipu\")\n        self._apply_ipu_pipeline_split(gnn_layers_per_ipu)\n\n    def _apply_ipu_pipeline_split(self, gnn_layers_per_ipu):\n        r\"\"\"\n        Apply pipeline split from accelerator options if applicable\n        \"\"\"\n\n        if gnn_layers_per_ipu is None:\n            return\n\n        if not isinstance(gnn_layers_per_ipu, collections.abc.Sequence):\n            raise ValueError(\"gnn_layers_per_ipu must be a Sequence (e.g. a list)\")\n\n        valid_ipu_pipeline_lengths = [1, 2, 4, 8, 16]\n        pipeline_length = len(gnn_layers_per_ipu)\n\n        if pipeline_length not in valid_ipu_pipeline_lengths:\n            raise ValueError(\n                f\"Length of gnn_layers_per_ipu must be one of {valid_ipu_pipeline_lengths}, \"\n                f\"got {gnn_layers_per_ipu} of length {pipeline_length} instead\"\n            )\n\n        model_depth = len(self.gnn.layers)\n\n        if sum(gnn_layers_per_ipu) != model_depth:\n            raise ValueError(\n                f\"The values in gnn_layers_per_ipu must add up to the depth of the model, \"\n                f\"got {gnn_layers_per_ipu} with total {sum(gnn_layers_per_ipu)} vs model depth \"\n                f\"of {model_depth}\"\n            )\n\n        begin_block_layer_indices = [sum(gnn_layers_per_ipu[:i]) for i in range(1, pipeline_length)]\n\n        for begin_block_layer_index, ipu_id in zip(begin_block_layer_indices, range(1, pipeline_length)):\n            self.gnn.layers[begin_block_layer_index] = poptorch.BeginBlock(\n                self.gnn.layers[begin_block_layer_index], ipu_id=ipu_id\n            )\n\n    def _enable_readout_cache(self, 
module_filter: Optional[Union[str, List[str]]]):\n        \"\"\"\n        Enable a single-batch readout cache for (a subset of) the modules.\n        This is used to extract hidden representations for fingerprinting.\n        \"\"\"\n\n        self.create_module_map(level=\"module\")\n\n        if module_filter is None:\n            module_filter = list(self._module_map.keys())\n        if isinstance(module_filter, str):\n            module_filter = [module_filter]\n\n        for k in module_filter:\n            if k not in self._module_map:\n                raise ValueError(f\"Module {k} not found in network, choose from {self._module_map.keys()}\")\n\n        for module_name, module in self._module_map.items():\n            if module_name in module_filter:\n                if not isinstance(module, FeedForwardNN):\n                    raise RuntimeError(\n                        f\"Readout cache can only be enabled for FeedForwardNN subclasses, not {type(module)}\"\n                    )\n                module._enable_readout_cache()\n\n        self._cache_readouts = True\n\n    def _disable_readout_cache(self):\n        \"\"\"Disable the readout cache\"\"\"\n        self.create_module_map(level=\"module\")\n        for _, module in self._module_map.items():\n            if isinstance(module, FeedForwardNN):\n                module._disable_readout_cache()\n        self._cache_readouts = False\n\n    def create_module_map(self, level: Union[Literal[\"layers\"], Literal[\"module\"]] = \"layers\"):\n        \"\"\"\n        Function to create mapping between each (sub)module name and corresponding nn.ModuleList() (if possible);\n        Used for finetuning when (partially) loading or freezing specific modules of the pretrained model\n\n        Args:\n            level: Whether to map to the module object or the layers of the module object\n        \"\"\"\n        self._module_map = OrderedDict()\n\n        if self.encoder_manager is not None:\n            self._module_map.update(\n                {\"pe_encoders\": self.encoder_manager}\n            )  # could be extended to submodules, e.g. pe_encoders/la_pos/linear_in/..., etc.; not necessary for current finetuning\n\n        if self.pre_nn is not None:\n            self._module_map.update({\"pre_nn\": self.pre_nn})\n\n        if self.pre_nn_edges is not None:\n            self._module_map.update({\"pre_nn_edges\": self.pre_nn_edges})\n\n        # No need to check for NoneType as GNN module is not optional in FullGraphMultiTaskNetwork\n        self._module_map.update({\"gnn\": self.gnn})\n\n        if self.task_heads is not None:\n            self._module_map.update(\n                {\n                    \"graph_output_nn-\"\n                    + output_level: self.task_heads.graph_output_nn[output_level].graph_output_nn\n                    for output_level in self.task_heads.graph_output_nn.keys()\n                }\n            )\n\n            self._module_map.update(\n                {\n                    \"task_heads-\" + task_head_name: self.task_heads.task_heads[task_head_name]\n                    for task_head_name in self.task_heads.task_heads.keys()\n                }\n            )\n\n            if level == \"layers\":\n                for module_name, module in self._module_map.items():\n                    if module_name != \"pe_encoders\":\n                        self._module_map[module_name] = module.layers\n        return self._module_map\n\n    def forward(self, g: Batch) -> Tensor:\n        r\"\"\"\n        Apply the pre-processing neural network, the graph neural network,\n        and the post-processing neural network on the graph features.\n\n        Parameters:\n\n            g:\n                pyg Batch graph on which the convolution is done.\n                Must contain the following elements:\n\n                - Node key `\"feat\"`: `torch.Tensor[..., N, Din]`.\n                  Input node feature tensor, before the network.\n   
               `N` is the number of nodes, `Din` is the input features dimension ``self.pre_nn.in_dim``\n\n                - Edge key `\"edge_feat\"`: `torch.Tensor[..., N, Ein]` **Optional**.\n                  The edge features to use. It will be ignored if the\n                  model doesn't support edge features or if\n                  `self.in_dim_edges==0`.\n\n                - Other keys related to positional encodings `\"pos_enc_feats_sign_flip\"`,\n                  `\"pos_enc_feats_no_flip\"`.\n\n        Returns:\n\n            `torch.Tensor[..., M, Dout]` or `torch.Tensor[..., N, Dout]`:\n                Node or graph feature tensor, after the network.\n                `N` is the number of nodes, `M` is the number of graphs,\n                `Dout` is the output dimension ``self.graph_output_nn.out_dim``\n                If the `self.gnn.pooling` is [`None`], then it returns node features and the output dimension is `N`,\n                otherwise it returns graph features and the output dimension is `M`\n\n        \"\"\"\n\n        # Apply the positional encoders\n        g = self.encoder_manager(g)\n\n        e = None\n\n        # Run the pre-processing network on node features\n        if self.pre_nn is not None:\n            g[\"feat\"] = self.pre_nn.forward(g[\"feat\"])\n\n        # Run the pre-processing network on edge features\n        # If there are no edges, skip the forward and change the dimension of e\n        if self.pre_nn_edges is not None:\n            e = g[\"edge_feat\"]\n            if torch.prod(torch.as_tensor(e.shape[:-1])) == 0:\n                e = torch.zeros(\n                    list(e.shape[:-1]) + [self.pre_nn_edges.out_dim], device=e.device, dtype=e.dtype\n                )\n            else:\n                e = self.pre_nn_edges.forward(e)\n            g[\"edge_feat\"] = e\n\n        # Run the graph neural network\n        g = self.gnn.forward(g)\n\n        if self.task_heads is not None:\n            return self.task_heads.forward(g)\n\n        return g\n\n    def make_mup_base_kwargs(self, divide_factor: float = 2.0) -> Dict[str, Any]:\n        \"\"\"\n        Create a 'base' model to be used by the `mup` or `muTransfer` scaling of the model.\n        The base model is usually identical to the regular model, but with the\n        layers width divided by a given factor (2 by default)\n\n        Parameter:\n            divide_factor: Factor by which to divide the width.\n\n        Returns:\n            Dictionary with the kwargs to create the base model.\n        \"\"\"\n        kwargs = dict(\n            gnn_kwargs=None,\n            pre_nn_kwargs=None,\n            pre_nn_edges_kwargs=None,\n            pe_encoders_kwargs=None,\n            num_inference_to_average=self.num_inference_to_average,\n            last_layer_is_readout=self.last_layer_is_readout,\n            name=self.name,\n        )\n\n        # For the pre-nn network, get the smaller dimensions.\n        # For the input dim, only divide the features coming from the pe-encoders\n        if self.pre_nn is not None:\n            kwargs[\"pre_nn_kwargs\"] = self.pre_nn.make_mup_base_kwargs(\n                divide_factor=divide_factor, factor_in_dim=False\n            )\n            pe_enc_outdim = 0 if self.encoder_manager is None else self.pe_encoders_kwargs.get(\"out_dim\", 0)\n            pre_nn_indim = kwargs[\"pre_nn_kwargs\"][\"in_dim\"] - pe_enc_outdim\n            kwargs[\"pre_nn_kwargs\"][\"in_dim\"] = round(pre_nn_indim + (pe_enc_outdim / divide_factor))\n\n        # For the pre-nn on the edges, factor all dimensions, except the in_dim\n        if self.pre_nn_edges is not None:\n            kwargs[\"pre_nn_edges_kwargs\"] = self.pre_nn_edges.make_mup_base_kwargs(\n                divide_factor=divide_factor, factor_in_dim=False\n            )\n            pe_enc_edge_outdim = (\n                0 if self.encoder_manager is None else self.pe_encoders_kwargs.get(\"edge_out_dim\", 0)\n            
)\n            pre_nn_edge_indim = kwargs[\"pre_nn_edges_kwargs\"][\"in_dim\"] - pe_enc_edge_outdim\n            kwargs[\"pre_nn_edges_kwargs\"][\"in_dim\"] = round(\n                pre_nn_edge_indim + (pe_enc_edge_outdim / divide_factor)\n            )\n\n        # For the pe-encoders, don't factor the in_dim and in_dim_edges\n        if self.encoder_manager is not None:\n            kwargs[\"pe_encoders_kwargs\"] = self.encoder_manager.make_mup_base_kwargs(\n                divide_factor=divide_factor\n            )\n\n        if self.task_heads is not None:\n            task_heads_kwargs = self.task_heads.make_mup_base_kwargs(\n                divide_factor=divide_factor, factor_in_dim=True\n            )\n            kwargs[\"task_heads_kwargs\"] = task_heads_kwargs[\"task_heads_kwargs\"]\n            kwargs[\"graph_output_nn_kwargs\"] = task_heads_kwargs[\"graph_output_nn_kwargs\"]\n\n        # For the gnn network, all the dimensions are divided, except the input dims if pre-nn are missing\n        if self.gnn is not None:\n            factor_in_dim = self.pre_nn is not None\n            kwargs[\"gnn_kwargs\"] = self.gnn.make_mup_base_kwargs(\n                divide_factor=divide_factor,\n                factor_in_dim=factor_in_dim,\n            )\n\n        return kwargs\n\n    def set_max_num_nodes_edges_per_graph(self, max_nodes: Optional[int], max_edges: Optional[int]) -> None:\n        \"\"\"\n        Set the maximum number of nodes and edges for all gnn layers and encoder layers\n\n        Parameters:\n            max_nodes: Maximum number of nodes in the dataset.\n                This will be useful for certain architectures, but ignored by others.\n\n            max_edges: Maximum number of edges in the dataset.\n                This will be useful for certain architectures, but ignored by others.\n        \"\"\"\n        self.max_num_nodes_per_graph = max_nodes\n        self.max_num_edges_per_graph = max_edges\n        if (self.encoder_manager is not None) and (self.encoder_manager.pe_encoders is not None):\n            for encoder in self.encoder_manager.pe_encoders.values():\n                encoder.max_num_nodes_per_graph = max_nodes\n                encoder.max_num_edges_per_graph = max_edges\n        if self.gnn is not None:\n            for layer in self.gnn.layers:\n                if isinstance(layer, BaseGraphStructure):\n                    layer.max_num_nodes_per_graph = max_nodes\n                    layer.max_num_edges_per_graph = max_edges\n\n        if self.task_heads is not None:\n            self.task_heads.set_max_num_nodes_edges_per_graph(max_nodes, max_edges)\n\n    def __repr__(self) -> str:\n        r\"\"\"\n        Controls how the class is printed\n        \"\"\"\n        pre_nn_str, pre_nn_edges_str, task_str = \"\", \"\", \"\"\n        if self.pre_nn is not None:\n            pre_nn_str = self.pre_nn.__repr__() + \"\\n\\n\"\n        if self.pre_nn_edges is not None:\n            pre_nn_edges_str = self.pre_nn_edges.__repr__() + \"\\n\\n\"\n        gnn_str = self.gnn.__repr__() + \"\\n\\n\"\n        if self.task_heads is not None:\n            task_str = self.task_heads.__repr__()\n            task_str = \"    Task heads:\\n    \" + \"    \".join(task_str.splitlines(True))\n\n        child_str = \"    \" + pre_nn_str + pre_nn_edges_str + gnn_str + task_str\n        child_str = \"    \".join(child_str.splitlines(True))\n\n        full_str = self.name + \"\\n\" + \"-\" * (len(self.name) + 2) + \"\\n\" + child_str\n\n        return full_str\n\n    @property\n    def in_dim(self) -> int:\n        r\"\"\"\n        Returns the input dimension of the network\n        \"\"\"\n        if self.pre_nn is not None:\n            return self.pre_nn.in_dim\n        else:\n            return self.gnn.in_dim\n\n    @property\n    def out_dim(self) -> int:\n        r\"\"\"\n        Returns the output dimension of the network\n        \"\"\"\n        return self.gnn.out_dim\n\n    @property\n    def out_dim_edges(self) -> int:\n        r\"\"\"\n        Returns the 
output dimension of the edges\n        of the network.\n        \"\"\"\n        if self.gnn.full_dims_edges is not None:\n            return self.gnn.full_dims_edges[-1]\n        return self.gnn.in_dim_edges\n\n    @property\n    def in_dim_edges(self) -> int:\n        r\"\"\"\n        Returns the input edge dimension of the network\n        \"\"\"\n        return self.gnn.in_dim_edges\n\n\nclass GraphOutputNN(nn.Module, MupMixin):\n    def __init__(\n        self,\n        in_dim: int,\n        in_dim_edges: int,\n        task_level: str,\n        graph_output_nn_kwargs: Dict[str, Any],\n    ):\n        r\"\"\"\n        Parameters:\n            in_dim:\n                Input feature dimensions of the layer\n\n            in_dim_edges:\n                Input edge feature dimensions of the layer\n            task_level:\n                graph/node/edge/nodepair depending on whether it is a graph/node/edge/nodepair level task\n            graph_output_nn_kwargs:\n                key-word arguments to use for the initialization of the post-processing\n                MLP network after the GNN, using the class `FeedForwardNN`.\n        \"\"\"\n        super().__init__()\n        self.task_level = task_level\n        self._concat_last_layers = None\n        self.in_dim = in_dim\n        self.in_dim_edges = in_dim_edges\n        self.graph_output_nn_kwargs = graph_output_nn_kwargs\n        self.max_num_nodes_per_graph = None\n        self.max_num_edges_per_graph = None\n        self.map_task_level = {\n            \"node\": \"feat\",\n            \"nodepair\": \"nodepair_feat\",\n            \"graph\": \"graph_feat\",\n            \"edge\": \"edge_feat\",\n        }\n\n        if self.task_level == \"nodepair\":\n            level_in_dim = 2 * self.in_dim\n        elif self.task_level == \"edge\":\n            level_in_dim = self.in_dim_edges\n        elif self.task_level == \"graph\":\n            self.global_pool_layer, self.out_pool_dim = self._parse_pooling_layer(\n     
           self.in_dim, graph_output_nn_kwargs[self.task_level][\"pooling\"]\n            )\n            level_in_dim = self.out_pool_dim\n        else:\n            level_in_dim = self.in_dim\n\n        if level_in_dim == 0:\n            raise ValueError(f\"Task head has an input dimension of 0.\")\n        # Initialize the post-processing neural net (applied after the gnn)\n        name = graph_output_nn_kwargs[self.task_level].pop(\"name\", \"post-NN\")\n        filtered_graph_output_nn_kwargs = {\n            k: v for k, v in graph_output_nn_kwargs[self.task_level].items() if k not in [\"pooling\", \"in_dim\"]\n        }\n        self.graph_output_nn = FeedForwardNN(\n            in_dim=level_in_dim, name=name, **filtered_graph_output_nn_kwargs\n        )\n\n    def forward(self, g: Batch):\n        \"\"\"\n        Parameters:\n            g: pyg Batch graph\n        Returns:\n            h: Output features after applying graph_output_nn\n        \"\"\"\n        # Check if at least one nodepair task is present\n        if self.task_level == \"nodepair\":\n            g[\"nodepair_feat\"] = self.compute_nodepairs(\n                node_feats=g[\"feat\"],\n                batch=g.batch,\n                max_num_nodes=self.max_num_nodes_per_graph,\n                drop_nodes_last_graph=is_running_on_ipu(),\n            )\n        # Check if at least one graph-level task is present\n        if self.task_level == \"graph\":\n            # pool features if the level is graph\n            g[\"graph_feat\"] = self._pool_layer_forward(g, g[\"feat\"])\n\n        h = g[self.map_task_level[self.task_level]]\n        # Run the output network\n        if self.concat_last_layers is None:\n            h = self.graph_output_nn.forward(h)\n        else:\n            # Concatenate the output of the last layers according to `self._concat_last_layers``.\n            # Useful for generating fingerprints\n            h = [h]\n            for ii in 
range(len(self.graph_output_nn.layers)):\n                h.insert(0, self.graph_output_nn.layers[ii].forward(h[0]))  # Append in reverse\n        return h\n\n    def _parse_pooling_layer(\n        self, in_dim: int, pooling: Union[str, List[str]], **kwargs\n    ) -> Tuple[nn.Module, int]:\n        r\"\"\"\n        Return the pooling layer\n        **This function is virtual, so it needs to be implemented by the child class.**\n\n        Parameters:\n\n            in_dim:\n                The dimension at the input layer of the pooling\n\n            pooling:\n                The list of pooling layers to use. The accepted strings are:\n\n                - \"sum\": `SumPooling`\n                - \"mean\": `MeanPooling`\n                - \"max\": `MaxPooling`\n                - \"min\": `MinPooling`\n                - \"std\": `StdPooling`\n                - \"s2s\": `Set2Set`\n                - \"dir{int}\": `DirPooling`\n\n            kwargs:\n                Key-word arguments for the pooling layer initialization\n\n        Returns:\n            pool_layer: Pooling layer module\n            out_pool_dim: Output dimension of the pooling layer\n\n        \"\"\"\n        return parse_pooling_layer_pyg(in_dim, pooling, **kwargs)\n\n    def _pool_layer_forward(\n        self,\n        g: Batch,\n        feat: torch.Tensor,\n    ):\n        r\"\"\"\n        Apply the graph pooling layer, followed by the linear output layer.\n\n        Parameters:\n\n            g: pyg Batch graph on which the convolution is done\n\n            feat (torch.Tensor[..., N, Din]):\n                Node feature tensor, before convolution.\n                `N` is the number of nodes, `Din` is the output size of the last Graph layer\n\n        Returns:\n            torch.Tensor[..., M, Dout] or torch.Tensor[..., N, Dout]:\n                Node feature tensor, after convolution.\n                `N` is the number of nodes, `M` is the number of graphs, `Dout` is the output dimension 
``self.out_dim``\n                If the pooling is `None`, the dimension is `N`, otherwise it is `M`\n\n        \"\"\"\n\n        if len(self.global_pool_layer) > 0:\n            pooled_feat = self.global_pool_layer(g, feat)\n        else:\n            pooled_feat = feat\n\n        return pooled_feat\n\n    def compute_nodepairs(\n        self,\n        node_feats: torch.Tensor,\n        batch: torch.Tensor,\n        max_num_nodes: int = None,\n        fill_value: float = float(\"nan\"),\n        batch_size: int = None,\n        drop_nodes_last_graph: bool = False,\n    ) -> torch.Tensor:\n        r\"\"\"\n        Vectorized implementation of nodepair-level task:\n        Parameters:\n            node_feats: Node features\n            batch: Batch vector\n            max_num_nodes: The maximum number of nodes per graph\n            fill_value: The value for invalid entries in the\n                resulting dense output tensor. (default: :obj:`NaN`)\n            batch_size: The batch size. (default: :obj:`None`)\n            drop_nodes_last_graph: Whether to drop the nodes of the last graphs that exceed\n                the `max_num_nodes_per_graph`. 
Useful when the last graph is a padding.\n        Returns:\n            result: concatenated nodepair features of shape B * (n * (n - 1) / 2) * 2h,\n            where B is the number of graphs, n is the chosen maximum number of nodes per graph, and h is the feature dim\n        \"\"\"\n        dense_feat, mask, _ = to_dense_batch(\n            node_feats,\n            batch=batch,\n            fill_value=fill_value,\n            batch_size=batch_size,\n            max_num_nodes_per_graph=max_num_nodes,\n            drop_nodes_last_graph=drop_nodes_last_graph,\n        )\n        n = dense_feat.size(1)\n        h_X = dense_feat[:, :, None].repeat(1, 1, n, 1)\n        h_Y = dense_feat[:, None, :, :].repeat(1, n, 1, 1)\n\n        nodepair_h = torch.cat((h_X + h_Y, torch.abs(h_X - h_Y)), dim=-1)\n        upper_tri_mask = torch.triu(torch.ones(n, n, device=node_feats.device), diagonal=1).bool()\n        # Mask nodepair_h using upper_tri_mask, keeping only the upper-triangular node pairs\n        batched_result = nodepair_h[:, upper_tri_mask, :]\n        return batched_result\n\n    def make_mup_base_kwargs(self, divide_factor: float = 2.0, factor_in_dim: bool = False) -> Dict[str, Any]:\n        \"\"\"\n        Create a 'base' model to be used by the `mup` or `muTransfer` scaling of the model.\n        The base model is usually identical to the regular model, but with the\n        layers' width divided by a given factor (2 by default)\n\n        Parameters:\n            divide_factor: Factor by which to divide the width.\n            factor_in_dim: Whether to factor the input dimension\n\n        Returns:\n            Dictionary with the kwargs to create the base model.\n        \"\"\"\n        # For the post-nn network, all the dimensions are divided\n        graph_output_nn_kwargs = self.graph_output_nn.make_mup_base_kwargs(\n            divide_factor=divide_factor, factor_in_dim=factor_in_dim\n        )\n        kwargs = {\n            \"pooling\": self.graph_output_nn_kwargs[self.task_level][\"pooling\"],\n            **graph_output_nn_kwargs,\n       
 }\n        return kwargs\n\n    def drop_graph_output_nn_layers(self, num_layers_to_drop: int) -> None:\n        r\"\"\"\n        Remove the last layers of the model. Useful for Transfer Learning.\n        Parameters:\n            num_layers_to_drop: The number of layers to drop from the `self.graph_output_nn` network.\n        \"\"\"\n\n        assert num_layers_to_drop >= 0\n        assert num_layers_to_drop <= len(self.graph_output_nn.layers)\n\n        if num_layers_to_drop > 0:\n            self.graph_output_nn.layers = self.graph_output_nn.layers[:-num_layers_to_drop]\n\n    def extend_graph_output_nn_layers(self, layers: nn.ModuleList):\n        r\"\"\"\n        Add layers at the end of the model. Useful for Transfer Learning.\n        Parameters:\n            layers: A ModuleList of all the layers to extend\n        \"\"\"\n\n        assert isinstance(layers, nn.ModuleList)\n        if len(self.graph_output_nn.layers) > 0:\n            assert layers[0].in_dim == self.graph_output_nn.layers[-1].out_dim\n\n        self.graph_output_nn.extend(layers)\n\n    def set_max_num_nodes_edges_per_graph(self, max_nodes: Optional[int], max_edges: Optional[int]) -> None:\n        \"\"\"\n        Set the maximum number of nodes and edges for all gnn layers and encoder layers\n\n        Parameters:\n            max_nodes: Maximum number of nodes in the dataset.\n                This will be useful for certain architectures, but ignored by others.\n\n            max_edges: Maximum number of edges in the dataset.\n                This will be useful for certain architectures, but ignored by others.\n        \"\"\"\n        self.max_num_nodes_per_graph = max_nodes\n        self.max_num_edges_per_graph = max_edges\n\n    @property\n    def concat_last_layers(self) -> Optional[Iterable[int]]:\n        \"\"\"\n        Property to control the output of the `self.forward`.\n        If set to a list of integers, the `forward` function will\n        concatenate the output of different 
layers.\n\n        If set to `None`, the output of the last layer is returned.\n\n        NOTE: The indexes are inverted. 0 is the last layer, 1 is the second last, etc.\n        \"\"\"\n        return self._concat_last_layers\n\n    @concat_last_layers.setter\n    def concat_last_layers(self, value: Union[Type[None], int, Iterable[int]]) -> None:\n        \"\"\"\n        Set the property to control the output of the `self.forward`.\n        If set to a list of integers, the `forward` function will\n        concatenate the output of different layers.\n        If a single integer is provided, it will output that specific layer.\n\n        If set to `None`, the output of the last layer is returned.\n\n        NOTE: The indexes are inverted. 0 is the last layer, 1 is the second last, etc.\n\n        Parameters:\n            value: Output layers to concatenate, in reverse order (`0` is the last layer)\n        \"\"\"\n        if (value is not None) and not isinstance(value, Iterable):\n            value = [value]\n        self._concat_last_layers = value\n\n    @property\n    def out_dim(self) -> int:\n        r\"\"\"\n        Returns the output dimension of the network\n        \"\"\"\n        return self.graph_output_nn.out_dim\n\n\nclass TaskHeads(nn.Module, MupMixin):\n    def __init__(\n        self,\n        in_dim: int,\n        in_dim_edges: int,\n        task_heads_kwargs: Dict[str, Any],\n        graph_output_nn_kwargs: Dict[str, Any],\n        last_layer_is_readout: bool = True,\n    ):\n        r\"\"\"\n        Class that groups all multi-task output heads together to provide the task-specific outputs.\n        Parameters:\n            in_dim:\n                Input feature dimensions of the layer\n\n            in_dim_edges:\n                Input edge feature dimensions of the layer\n            last_layer_is_readout: Whether the last layer should be treated as a readout layer.\n                Allows using the `mup.MuReadout` from the muTransfer method\n 
           task_heads_kwargs:\n                A dictionary mapping each task name to the arguments for a FeedForwardNN.\n                Each dict of arguments is used to initialize a task-specific MLP.\n            graph_output_nn_kwargs:\n                Keyword arguments to use for the initialization of the post-processing\n                MLP network after the GNN, using the class `FeedForwardNN`.\n        \"\"\"\n        super().__init__()\n        self.last_layer_is_readout = last_layer_is_readout\n        self.task_heads_kwargs = deepcopy(task_heads_kwargs)\n        self.graph_output_nn_kwargs = deepcopy(graph_output_nn_kwargs)\n        self.task_levels = {head_kwargs[\"task_level\"] for _, head_kwargs in self.task_heads_kwargs.items()}\n        self.in_dim = in_dim\n        self.in_dim_edges = in_dim_edges\n        self.task_heads = nn.ModuleDict()\n        self.graph_output_nn = nn.ModuleDict()\n        self._check_bad_arguments()\n\n        for task_name, head_kwargs in self.task_heads_kwargs.items():\n            task_level = self.task_heads_kwargs[task_name].get(\"task_level\")\n            self.graph_output_nn[task_level] = GraphOutputNN(\n                in_dim=self.in_dim,\n                in_dim_edges=self.in_dim_edges,\n                task_level=task_level,\n                graph_output_nn_kwargs=self.graph_output_nn_kwargs,\n            )\n            head_kwargs.setdefault(\"name\", f\"NN-{task_name}\")\n            head_kwargs.setdefault(\"last_layer_is_readout\", last_layer_is_readout)\n            # Create a new dictionary without the task_level key-value pair,\n            # and pass it while initializing the FeedForwardNN instance for tasks\n            filtered_kwargs = {k: v for k, v in head_kwargs.items() if k != \"task_level\"}\n            filtered_kwargs[\"in_dim\"] = self.graph_output_nn_kwargs[task_level][\"out_dim\"]\n            self.task_heads[task_name] = FeedForwardNN(**filtered_kwargs)\n\n    def 
forward(self, g: Batch) -> Dict[str, torch.Tensor]:\n        r\"\"\"\n        Forward function of the task heads\n        Parameters:\n            g: pyg Batch graph\n        Returns:\n            task_head_outputs: A dictionary mapping each task name to its output tensor, Dict[task_name, Tensor]\n        \"\"\"\n        features = {task_level: self.graph_output_nn[task_level](g) for task_level in self.task_levels}\n\n        task_head_outputs = {}\n        for task_name, head in self.task_heads.items():\n            task_level = self.task_heads_kwargs[task_name].get(\n                \"task_level\", None\n            )  # Get task_level without modifying head_kwargs\n            task_head_outputs[task_name] = head.forward(features[task_level])\n\n        return task_head_outputs\n\n    def make_mup_base_kwargs(self, divide_factor: float = 2.0, factor_in_dim: bool = False) -> Dict[str, Any]:\n        \"\"\"\n        Create a 'base' model to be used by the `mup` or `muTransfer` scaling of the model.\n        The base model is usually identical to the regular model, but with the\n        layers' width divided by a given factor (2 by default)\n\n        Parameters:\n            divide_factor: Factor by which to divide the width.\n            factor_in_dim: Whether to factor the input dimension\n\n        Returns:\n            kwargs: Dictionary of arguments to be used to initialize the base model\n        \"\"\"\n        graph_output_nn_kwargs = {}\n        for task_level, graph_output_nn in self.graph_output_nn.items():\n            graph_output_nn: GraphOutputNN\n            graph_output_nn_kwargs[task_level] = graph_output_nn.make_mup_base_kwargs(\n                divide_factor=divide_factor, factor_in_dim=factor_in_dim\n            )\n\n        task_heads_kwargs = {}\n        for task_name, task_nn in self.task_heads.items():\n            task_nn: FeedForwardNN\n            task_heads_kwargs[task_name] = task_nn.make_mup_base_kwargs(\n                divide_factor=divide_factor, 
factor_in_dim=factor_in_dim\n            )\n            task_heads_kwargs[task_name][\"task_level\"] = self.task_heads_kwargs[task_name][\"task_level\"]\n        kwargs = dict(\n            in_dim=self.in_dim,\n            in_dim_edges=self.in_dim_edges,\n            last_layer_is_readout=self.last_layer_is_readout,\n            task_heads_kwargs=task_heads_kwargs,\n            graph_output_nn_kwargs=graph_output_nn_kwargs,\n        )\n        return kwargs\n\n    def set_max_num_nodes_edges_per_graph(self, max_nodes: Optional[int], max_edges: Optional[int]) -> None:\n        \"\"\"\n        Set the maximum number of nodes and edges for all gnn layers and encoder layers\n\n        Parameters:\n            max_nodes: Maximum number of nodes in the dataset.\n                This will be useful for certain architectures, but ignored by others.\n\n            max_edges: Maximum number of edges in the dataset.\n                This will be useful for certain architectures, but ignored by others.\n        \"\"\"\n        for graph_output_nn in self.graph_output_nn.values():\n            graph_output_nn: GraphOutputNN\n            graph_output_nn.set_max_num_nodes_edges_per_graph(max_nodes, max_edges)\n\n        for task_head in self.task_heads.values():\n            task_head: FeedForwardNN\n            for layer in task_head.layers:\n                if isinstance(layer, BaseGraphStructure):\n                    layer.max_num_nodes_per_graph = max_nodes\n                    layer.max_num_edges_per_graph = max_edges\n\n    @property\n    def out_dim(self) -> Dict[str, int]:\n        r\"\"\"\n        Returns the output dimension of each task head\n        \"\"\"\n        return {task_name: head.out_dim for task_name, head in self.task_heads.items()}\n\n    def __repr__(self):\n        r\"\"\"\n        Returns a string representation of the task heads\n        \"\"\"\n        task_repr = []\n        for head, net in self.task_heads.items():\n            task_repr.append(head + \": \" + net.__repr__())\n        return 
\"\\n\".join(task_repr)\n\n    def _check_bad_arguments(self):\n        r\"\"\"\n        Raise comprehensive errors if the arguments seem wrong\n        \"\"\"\n        for task_name, head_kwargs in self.task_heads_kwargs.items():\n            task_level = self.task_heads_kwargs[task_name].get(\"task_level\", None)\n            if task_level is None:\n                raise ValueError(\"task_level must be specified for each task head.\")\n            if task_level not in [\"node\", \"edge\", \"graph\", \"nodepair\"]:\n                raise ValueError(f\"task_level {task_level} is not supported.\")\n"
  },
  {
    "path": "graphium/nn/architectures/pyg_architectures.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom torch import Tensor\nfrom torch.nn import Module\nfrom typing import Tuple, Union, List, Optional\n\nfrom torch_geometric.data import Data, Batch\n\nfrom graphium.nn.base_graph_layer import BaseGraphModule\nfrom graphium.nn.pyg_layers import VirtualNodePyg, parse_pooling_layer_pyg\nfrom graphium.nn.architectures.global_architectures import FeedForwardGraph\n\n\nclass FeedForwardPyg(FeedForwardGraph):\n    def _graph_layer_forward(\n        self,\n        layer: BaseGraphModule,\n        g: Batch,\n        feat: Tensor,\n        edge_feat: Optional[Tensor],\n        feat_prev: Optional[Tensor],\n        edge_feat_prev: Optional[Tensor],\n        step_idx: int,\n    ) -> Tuple[Tensor, Optional[Tensor], Optional[Tensor], Optional[Tensor]]:\n        r\"\"\"\n        A flexible neural network architecture, with variable hidden dimensions,\n        support for multiple layer types, and support for different residual\n        connections.\n\n        This class is meant to work with different PyG-based graph neural networks\n        layers. 
Any layer must inherit from `graphium.nn.base_graph_layer.BaseGraphStructure`\n        or `graphium.nn.base_graph_layer.BaseGraphLayer`.\n\n        Apply the *i-th* PyG graph layer, where *i* is the index given by `step_idx`.\n        The layer is applied differently depending on whether there are edge features or not.\n\n        Then, the residual is also applied on both the features and the edges (if applicable)\n\n        Parameters:\n\n            layer:\n                The PyG layer used for the convolution\n\n            g:\n                graph on which the convolution is done\n\n            feat (torch.Tensor[..., N, Din]):\n                Node feature tensor, before convolution.\n                `N` is the number of nodes, `Din` is the input features\n\n            edge_feat (torch.Tensor[..., E, Ein]):\n                Edge feature tensor, before convolution.\n                `E` is the number of edges, `Ein` is the input edge features\n\n            feat_prev:\n                Node feature of the previous residual connection, or `None`\n\n            edge_feat_prev:\n                Edge feature of the previous residual connection, or `None`\n\n            step_idx:\n                The current step idx in the forward loop\n\n        Returns:\n\n            feat (torch.Tensor[..., N, Dout]):\n                Node feature tensor, after convolution and residual.\n                `N` is the number of nodes, `Dout` is the output features of the layer and residual\n\n            edge_feat (torch.Tensor[..., E, Eout]):\n                Edge feature tensor, after convolution and residual.\n                `E` is the number of edges, `Eout` is the output edge features of the layer and residual\n\n            feat_prev:\n                Node feature tensor to be used at the next residual connection, or `None`\n\n            edge_feat_prev:\n                Edge feature tensor to be used at the next residual connection, or `None`\n\n        \"\"\"\n\n        # Set node / edge features into the 
graph\n        g[\"feat\"] = feat\n        g[\"edge_feat\"] = edge_feat\n\n        # Apply the GNN layer\n        g = layer(g)\n\n        # Get the node / edge features from the graph\n        feat = g[\"feat\"]\n        edge_feat = g[\"edge_feat\"]\n\n        # Apply the residual layers on the features and edges (if applicable)\n        if step_idx < len(self.layers) - 1:\n            feat, feat_prev = self.residual_layer.forward(feat, feat_prev, step_idx=step_idx)\n            if (self.residual_edges_layer is not None) and (layer.layer_outputs_edges):\n                edge_feat, edge_feat_prev = self.residual_edges_layer.forward(\n                    edge_feat, edge_feat_prev, step_idx=step_idx\n                )\n\n        return feat, edge_feat, feat_prev, edge_feat_prev\n\n    def _parse_virtual_node_class(self) -> type:\n        return VirtualNodePyg\n\n    def _parse_pooling_layer(\n        self, in_dim: int, pooling: Union[str, List[str]], **kwargs\n    ) -> Tuple[Module, int]:\n        return parse_pooling_layer_pyg(in_dim, pooling, **kwargs)\n"
  },
  {
    "path": "graphium/nn/base_graph_layer.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nimport abc\nfrom typing import Union, Callable, List, Optional, Mapping\nfrom copy import deepcopy\n\nimport torch\nimport torch.nn as nn\nfrom torch import Tensor, IntTensor\nfrom torch_sparse import SparseTensor\n\nfrom graphium.nn.base_layers import get_activation, DropPath\nfrom graphium.utils.decorators import classproperty\nimport torch_geometric as pyg\n\n\nclass BaseGraphStructure:\n    def __init__(\n        self,\n        in_dim: int,\n        out_dim: int,\n        activation: Union[str, Callable] = \"relu\",\n        dropout: float = 0.0,\n        normalization: Union[str, Callable] = \"none\",\n        layer_idx: Optional[int] = None,\n        layer_depth: Optional[int] = None,\n        droppath_rate: float = 0.0,\n    ):\n        r\"\"\"\n        Abstract class used to standardize the implementation of Pyg layers\n        in the current library. 
It will allow a network to seamlessly swap between\n        different GNN layers by better understanding the expected inputs\n        and outputs.\n\n        Parameters:\n\n            in_dim:\n                Input feature dimensions of the layer\n\n            out_dim:\n                Output feature dimensions of the layer\n\n            activation:\n                Activation function to use in the layer\n\n            dropout:\n                The ratio of units to dropout. Must be between 0 and 1\n\n            normalization:\n                Normalization to use. Choices:\n\n                - \"none\" or `None`: No normalization\n                - \"batch_norm\": Batch normalization\n                - \"layer_norm\": Layer normalization\n                - `Callable`: Any callable function\n\n            layer_idx:\n                The index of the current layer\n\n            layer_depth:\n                The total depth (number of layers) associated with this specific layer\n\n            droppath_rate:\n                Stochastic depth drop rate, between 0 and 1, see https://arxiv.org/abs/1603.09382\n        \"\"\"\n\n        super().__init__()\n\n        # Basic attributes\n        self.in_dim = in_dim\n        self.out_dim = out_dim\n        self.normalization = normalization\n        self.dropout = dropout\n        self.activation = activation\n        self.layer_idx = layer_idx\n        self.layer_depth = layer_depth\n        self.droppath_rate = droppath_rate\n        self._max_num_nodes_per_graph = None\n        self._max_num_edges_per_graph = None\n\n    def _initialize_activation_dropout_norm(self):\n        if not isinstance(self, nn.Module):\n            raise TypeError(\n                \"This function requires the current object to be an `nn.Module`. 
Use multi-inheritance or the class `BaseGraphModule` instead\"\n            )\n\n        # Build the layers\n        self.activation_layer = get_activation(self.activation)\n        self.dropout_layer = self._parse_dropout(self.dropout)\n\n        self.norm_layer = self._parse_norm(self.normalization)\n        self.droppath_layer = self._parse_droppath(self.droppath_rate)\n\n    def _parse_dropout(self, dropout):\n        if callable(dropout):\n            return deepcopy(dropout)\n        elif dropout > 0:\n            return nn.Dropout(p=dropout)\n        return\n\n    def _parse_droppath(self, droppath_rate):\n        if droppath_rate == 0:\n            return\n\n        droppath_rate = DropPath.get_stochastic_drop_rate(droppath_rate, self.layer_idx, self.layer_depth)\n        return DropPath(drop_rate=droppath_rate)\n\n    def _parse_norm(self, normalization, dim=None):\n        if dim is None:\n            dim = self.out_dim * self.out_dim_factor\n        if normalization is None or normalization == \"none\":\n            parsed_norm = None\n        elif callable(normalization):\n            parsed_norm = deepcopy(normalization)\n        elif normalization == \"batch_norm\":\n            parsed_norm = nn.BatchNorm1d(dim)\n        elif normalization == \"layer_norm\":\n            parsed_norm = nn.LayerNorm(dim)\n        else:\n            raise ValueError(\n                f\"Undefined normalization `{normalization}`, must be `None`, `Callable`, 'batch_norm', 'layer_norm', 'none'\"\n            )\n        return parsed_norm\n\n    def apply_norm_activation_dropout(\n        self,\n        feat: Tensor,\n        normalization: bool = True,\n        activation: bool = True,\n        dropout: bool = True,\n        droppath: bool = True,\n        batch_idx: Optional[IntTensor] = None,\n        batch_size: Optional[int] = None,\n    ):\n        r\"\"\"\n        Apply the normalization, activation, dropout, and droppath layers\n        to the output features.\n\n        
Parameters:\n\n            feat:\n                Feature tensor, to be normalized\n\n            batch_idx:\n                The batch index of each node, passed to the DropPath layer\n\n            normalization:\n                Whether to apply the normalization\n\n            activation:\n                Whether to apply the activation layer\n\n            dropout:\n                Whether to apply the dropout layer\n\n            droppath:\n                Whether to apply the DropPath layer\n\n        Returns:\n\n            feat:\n                Normalized and dropped-out features\n\n        \"\"\"\n\n        if normalization and (self.norm_layer is not None):\n            feat = self.norm_layer(feat)\n\n        if activation and (self.activation_layer is not None):\n            feat = self.activation_layer(feat)\n\n        if dropout and (self.dropout_layer is not None):\n            feat = self.dropout_layer(feat)\n\n        if droppath and (self.droppath_layer is not None):\n            feat = self.droppath_layer(feat, batch_idx=batch_idx, batch_size=batch_size)\n\n        return feat\n\n    @classproperty\n    def layer_supports_edges(cls) -> bool:\n        r\"\"\"\n        Abstract method. Return a boolean specifying if the layer type\n        supports output edges or not.\n\n        Returns:\n\n            bool:\n                Whether the layer supports the use of edges\n        \"\"\"\n        ...\n\n    @property\n    @abc.abstractmethod\n    def layer_inputs_edges(self) -> bool:\n        r\"\"\"\n        Abstract method. Return a boolean specifying if the layer type\n        uses edges as input or not.\n        It is different from ``layer_supports_edges`` since a layer that\n        supports edges can decide not to use them.\n\n        Returns:\n\n            bool:\n                Whether the layer uses input edges in the forward pass\n        \"\"\"\n        ...\n\n    @property\n    @abc.abstractmethod\n    def layer_outputs_edges(self) -> bool:\n        r\"\"\"\n        Abstract method. 
Return a boolean specifying if the layer type\n        outputs edges or not.\n        It is different from ``layer_supports_edges`` since a layer that\n        supports edges can decide not to use them.\n\n        Returns:\n\n            bool:\n                Whether the layer outputs edges in the forward pass\n        \"\"\"\n        ...\n\n    @property\n    @abc.abstractmethod\n    def out_dim_factor(self) -> int:\n        r\"\"\"\n        Abstract method.\n        Get the factor by which the output dimension is multiplied for\n        the next layer.\n\n        For standard layers, this will return ``1``.\n\n        But for others, such as ``GatLayer``, the output is the concatenation\n        of the outputs from each head, so the out_dim gets multiplied by\n        the number of heads, and this function should return the number\n        of heads.\n\n        Returns:\n\n            int:\n                The factor that multiplies the output dimensions\n        \"\"\"\n        ...\n\n    @property\n    def max_num_nodes_per_graph(self) -> Optional[int]:\n        \"\"\"\n        Get the maximum number of nodes per graph. Useful for reshaping a compiled model (IPU)\n        \"\"\"\n        return self._max_num_nodes_per_graph\n\n    @max_num_nodes_per_graph.setter\n    def max_num_nodes_per_graph(self, value: Optional[int]):\n        \"\"\"\n        Set the maximum number of nodes per graph. Useful for reshaping a compiled model (IPU)\n        \"\"\"\n        if value is not None:\n            assert isinstance(value, int) and (\n                value > 0\n            ), f\"Value should be a positive integer, provided {value} of type {type(value)}\"\n        self._max_num_nodes_per_graph = value\n\n    @property\n    def max_num_edges_per_graph(self) -> Optional[int]:\n        \"\"\"\n        Get the maximum number of edges per graph. 
Useful for reshaping a compiled model (IPU)\n        \"\"\"\n        return self._max_num_edges_per_graph\n\n    @max_num_edges_per_graph.setter\n    def max_num_edges_per_graph(self, value: Optional[int]):\n        \"\"\"\n        Set the maximum number of edges per graph. Useful for reshaping a compiled model (IPU)\n        \"\"\"\n        if value is not None:\n            assert isinstance(value, int) and (\n                value > 0\n            ), f\"Value should be a positive integer, provided {value} of type {type(value)}\"\n        self._max_num_edges_per_graph = value\n\n    def __repr__(self):\n        r\"\"\"\n        Controls how the class is printed\n        \"\"\"\n        f = self.out_dim_factor\n        out_dim_f_print = \"\" if f == 1 else f\" * {f}\"\n        return f\"{self.__class__.__name__}({self.in_dim} -> {self.out_dim}{out_dim_f_print}, activation={self.activation})\"\n\n\nclass BaseGraphModule(BaseGraphStructure, nn.Module):\n    def __init__(\n        self,\n        in_dim: int,\n        out_dim: int,\n        activation: Union[str, Callable] = \"relu\",\n        dropout: float = 0.0,\n        normalization: Union[str, Callable] = \"none\",\n        layer_idx: Optional[int] = None,\n        layer_depth: Optional[int] = None,\n        droppath_rate: float = 0.0,\n    ):\n        r\"\"\"\n        Abstract class used to standardize the implementation of Pyg layers\n        in the current library. It will allow a network to seamlessly swap between\n        different GNN layers by better understanding the expected inputs\n        and outputs.\n\n        Parameters:\n\n            in_dim:\n                Input feature dimensions of the layer\n\n            out_dim:\n                Output feature dimensions of the layer\n\n            activation:\n                Activation function to use in the layer\n\n            dropout:\n                The ratio of units to dropout. 
Must be between 0 and 1\n\n            normalization:\n                Normalization to use. Choices:\n\n                - \"none\" or `None`: No normalization\n                - \"batch_norm\": Batch normalization\n                - \"layer_norm\": Layer normalization\n                - `Callable`: Any callable function\n\n            layer_idx:\n                The index of the current layer\n\n            layer_depth:\n                The total depth (number of layers) associated with this specific layer\n\n            droppath_rate:\n                Stochastic depth drop rate, between 0 and 1, see https://arxiv.org/abs/1603.09382\n        \"\"\"\n\n        super().__init__(\n            in_dim=in_dim,\n            out_dim=out_dim,\n            normalization=normalization,\n            dropout=dropout,\n            activation=activation,\n            layer_idx=layer_idx,\n            layer_depth=layer_depth,\n            droppath_rate=droppath_rate,\n        )\n\n        self._initialize_activation_dropout_norm()\n\n\ndef check_intpus_allow_int(obj, edge_index, size):\n    \"\"\"\n    Overwrite the __check_input__ to allow for int32 and int16\n    TODO: Remove when PyG and PyTorch support int32.\n    \"\"\"\n    the_size: List[Optional[int]] = [None, None]\n\n    if isinstance(edge_index, Tensor):\n        # These 3 lines are different. 
They check for more int types and avoid overflow\n        assert edge_index.dtype in (torch.long, torch.int64, torch.int32, torch.int16)\n        # assert edge_index.min() >= 0\n        # assert edge_index.max() < torch.iinfo(edge_index.dtype).max\n\n        assert edge_index.dim() == 2\n        assert edge_index.size(0) == 2\n        if size is not None:\n            the_size[0] = size[0]\n            the_size[1] = size[1]\n        return the_size\n\n    elif isinstance(edge_index, SparseTensor):\n        if obj.flow == \"target_to_source\":\n            raise ValueError(\n                (\n                    'Flow direction \"target_to_source\" is invalid for '\n                    \"message propagation via `torch_sparse.SparseTensor`. If \"\n                    \"you really want to make use of a reverse message \"\n                    \"passing flow, pass in the transposed sparse tensor to \"\n                    \"the message passing module, e.g., `adj_t.t()`.\"\n                )\n            )\n        the_size[0] = edge_index.sparse_size(1)\n        the_size[1] = edge_index.sparse_size(0)\n        return the_size\n\n    raise ValueError(\n        (\n            \"`MessagePassing.propagate` only supports `torch.LongTensor` of \"\n            \"shape `[2, num_messages]` or `torch_sparse.SparseTensor` for \"\n            \"argument `edge_index`.\"\n        )\n    )\n"
  },
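The dense-tensor branch of `check_intpus_allow_int` above relaxes PyG's dtype check while keeping the shape check. A minimal, torch-free sketch of that validation logic (the function name and string dtypes here are illustrative, not part of the library):

```python
# Torch-free sketch of the relaxed validation that `check_intpus_allow_int`
# applies to dense edge indices: accept the narrower int dtypes
# (int16/int32) that stock PyG rejects, require a [2, num_messages] shape,
# and take node counts from `size` when the caller provides them.
ALLOWED_DTYPES = {"int16", "int32", "int64"}

def check_dense_edge_index(shape, dtype, size=None):
    if dtype not in ALLOWED_DTYPES:
        raise TypeError(f"edge_index dtype must be one of {sorted(ALLOWED_DTYPES)}, got {dtype}")
    if len(shape) != 2 or shape[0] != 2:
        raise ValueError(f"edge_index must have shape [2, num_messages], got {list(shape)}")
    the_size = [None, None]  # [num_source_nodes, num_target_nodes], unknown by default
    if size is not None:
        the_size[0], the_size[1] = size[0], size[1]
    return the_size
```

As in the original, the size stays `[None, None]` unless the caller supplies it; only the sparse-tensor branch can infer sizes directly.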
  {
    "path": "graphium/nn/base_layers.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Union, Callable, Optional, Type, Tuple, Iterable\nfrom copy import deepcopy\nfrom loguru import logger\n\nimport inspect\n\nimport torch\nimport torch.nn as nn\nimport torch.nn.functional as F\nfrom torch import Tensor, IntTensor\nimport mup.init as mupi\nfrom mup import set_base_shapes, MuReadout\nfrom torch.nn.functional import linear\n\nfrom graphium.ipu.ipu_utils import is_running_on_ipu\n\nSUPPORTED_ACTIVATION_MAP = {\n    \"ReLU\",\n    \"Sigmoid\",\n    \"Tanh\",\n    \"ELU\",\n    \"SELU\",\n    \"GLU\",\n    \"GELU\",\n    \"LeakyReLU\",\n    \"Softplus\",\n    \"None\",\n}\n\n\ndef get_activation(activation: Union[type(None), str, Callable]) -> Optional[Callable]:\n    r\"\"\"\n    returns the activation function represented by the input string\n\n    Parameters:\n        activation: Callable, `None`, or string with value:\n            \"none\", \"ReLU\", \"Sigmoid\", \"Tanh\", \"ELU\", \"SELU\", \"GLU\", \"GELU\", \"LeakyReLU\", \"Softplus\"\n\n    Returns:\n        Callable or None: The activation function\n    \"\"\"\n    if (activation is not None) and callable(activation):\n        # activation is already a function\n        return activation\n\n    if (activation is None) or (activation.lower() == \"none\"):\n        return None\n\n    # search in SUPPORTED_ACTIVATION_MAP a 
matching torch.nn.modules.activation class\n    activation = [x for x in SUPPORTED_ACTIVATION_MAP if activation.lower() == x.lower()]\n    assert len(activation) == 1 and isinstance(\n        activation[0], str\n    ), f\"Unhandled activation function {activation} of type {type(activation)}\"\n    activation = activation[0]\n\n    return vars(torch.nn.modules.activation)[activation]()\n\n\ndef get_activation_str(activation: Union[type(None), str, Callable]) -> str:\n    r\"\"\"\n    returns the string related to the activation function\n\n    Parameters:\n        activation: Callable, `None`, or string with value:\n            \"none\", \"ReLU\", \"Sigmoid\", \"Tanh\", \"ELU\", \"SELU\", \"GLU\", \"GELU\", \"LeakyReLU\", \"Softplus\"\n\n    Returns:\n        The name of the activation function\n    \"\"\"\n\n    if isinstance(activation, str):\n        return activation\n\n    if activation is None:\n        return \"None\"\n\n    if isinstance(activation, Callable):\n        return activation.__class__._get_name(activation)\n    else:\n        raise ValueError(f\"Unhandled activation function {activation} of type {type(activation)}\")\n\n\ndef get_norm(normalization: Union[Type[None], str, Callable], dim: Optional[int] = None):\n    r\"\"\"\n    returns the normalization function represented by the input string\n\n    Parameters:\n        normalization: Callable, `None`, or string with value:\n            \"none\", \"batch_norm\", \"layer_norm\"\n        dim: Dimension where to apply the norm. 
Mandatory for 'batch_norm' and 'layer_norm'\n\n    Returns:\n        Callable or None: The normalization function\n    \"\"\"\n    parsed_norm = None\n    if (normalization is None) or (normalization in [\"none\", \"NoneType\"]):\n        pass\n    elif callable(normalization):\n        parsed_norm = normalization\n    elif normalization in [\"batch_norm\", \"BatchNorm1d\"]:\n        parsed_norm = nn.BatchNorm1d(dim)\n    elif normalization in [\"layer_norm\", \"LayerNorm\"]:\n        parsed_norm = nn.LayerNorm(dim)\n    else:\n        raise ValueError(\n            f\"Undefined normalization `{normalization}`, must be `None`, `Callable`, 'batch_norm', 'layer_norm', 'none'\"\n        )\n    return deepcopy(parsed_norm)\n\n\nclass MultiheadAttentionMup(nn.MultiheadAttention):\n    \"\"\"\n    Modifying the MultiheadAttention to work with the muTransfer paradigm.\n    The layers are initialized using the mup package.\n    The `_scaled_dot_product_attention` normalizes the attention matrix with `1/d` instead of `1/sqrt(d)`\n    The biased self-attention option is added to have 3D attention bias.\n    \"\"\"\n\n    def __init__(self, biased_attention, **kwargs):\n        super().__init__(**kwargs)\n        self.biased_attention = biased_attention\n\n    def _reset_parameters(self):\n        set_base_shapes(self, None, rescale_params=False)  # Set the shapes of the tensors, useful for mup\n        if self._qkv_same_embed_dim:\n            mupi.xavier_uniform_(self.in_proj_weight)\n        else:\n            mupi.xavier_uniform_(self.q_proj_weight)\n            mupi.xavier_uniform_(self.k_proj_weight)\n            mupi.xavier_uniform_(self.v_proj_weight)\n\n        if self.in_proj_bias is not None:\n            nn.init.constant_(self.in_proj_bias, 0.0)\n            nn.init.constant_(self.out_proj.bias, 0.0)\n        if self.bias_k is not None:\n            mupi.xavier_normal_(self.bias_k)\n        if self.bias_v is not None:\n            
mupi.xavier_normal_(self.bias_v)\n\n    def forward(\n        self,\n        query: Tensor,\n        key: Tensor,\n        value: Tensor,\n        key_padding_mask: Optional[Tensor] = None,\n        attn_bias: Optional[Tensor] = None,\n        precision: Optional[str] = \"32\",\n        *args,\n        **kwargs,\n    ) -> Tuple[Tensor, Optional[Tensor]]:\n        # attn_bias [batch, num_heads, nodes, nodes]\n        if not self.biased_attention or attn_bias is None:\n            attn_bias = 0.0\n        # assuming source and target have the same sequence length (homogeneous graph attention)\n        batch, nodes, hidden = query.size()\n        assert (\n            hidden == self.embed_dim\n        ), f\"query hidden dimension {hidden} != embed_dim {self.embed_dim} in class\"\n        head_dim = self.embed_dim // self.num_heads\n        assert head_dim * self.num_heads == self.embed_dim, \"embed_dim must be divisible by num_heads\"\n        scaling_factor = 1 / head_dim  # use head_dim instead of (head_dim**0.5) for mup\n        b_q, b_k, b_v = self.in_proj_bias.chunk(3)\n        q_proj_weight, k_proj_weight, v_proj_weight = self.in_proj_weight.chunk(3)\n        # [batch, num_heads, nodes, head_size]\n        q = linear(query, q_proj_weight, b_q).view(batch, nodes, self.num_heads, -1).transpose(1, 2)\n        # [batch, num_heads, nodes, head_size]\n        k = linear(key, k_proj_weight, b_k).view(batch, nodes, self.num_heads, -1).transpose(1, 2)\n        # [batch, num_heads, nodes, head_size]\n        v = linear(value, v_proj_weight, b_v).view(batch, nodes, self.num_heads, -1).transpose(1, 2)\n        q = q * scaling_factor\n        # [batch, num_heads, nodes, nodes]\n        attn_weights = q @ k.transpose(-1, -2)\n        # [batch, num_heads, nodes, nodes]\n        attn_weights += attn_bias\n        key_padding_mask_value = float(\"-inf\") if precision == \"32\" else -10000\n        # key_padding_mask: [batch, 1, 1, nodes]\n        if key_padding_mask is not 
None:\n            masked_attn_weights = attn_weights.masked_fill(\n                key_padding_mask.unsqueeze(1).unsqueeze(2),\n                key_padding_mask_value,\n            )\n        else:\n            masked_attn_weights = attn_weights\n        masked_attn_weights = F.softmax(masked_attn_weights, dim=-1)\n        attn_probs = F.dropout(masked_attn_weights, p=self.dropout, training=self.training)\n        # [batch, num_heads, nodes, nodes] * [batch, num_heads, nodes, head_size] -> [batch, num_heads, nodes, head_size]\n        attn = attn_probs @ v\n        # [batch, nodes, embed_dim]\n        attn = attn.transpose(1, 2).contiguous().view(batch, nodes, self.embed_dim)\n        # [batch, nodes, embed_dim]\n        out = (self.out_proj(attn), None)\n        return out\n\n\nclass TransformerEncoderLayerMup(nn.TransformerEncoderLayer):\n    r\"\"\"\n    Modified version of ``torch.nn.TransformerEncoderLayer`` that uses :math:`1/n`-scaled attention\n    for compatibility with muP (as opposed to the original :math:`1/\\sqrt{n}` scaling factor).\n    Arguments are the same as ``torch.nn.TransformerEncoderLayer``.\n    \"\"\"\n\n    def __init__(self, biased_attention, *args, **kwargs) -> None:\n        super(TransformerEncoderLayerMup, self).__init__(*args, **kwargs)\n\n        # Extract arguments passed to __init__ as a dictionary\n        signature = inspect.signature(nn.TransformerEncoderLayer.__init__)\n\n        # `self` needs to be passed, which makes things tricky, but using this object seems fine for now\n        bound_signature = signature.bind(self, *args, **kwargs)\n        bound_signature.apply_defaults()\n\n        mha_names = [\"embed_dim\", \"num_heads\", \"dropout\", \"batch_first\", \"device\", \"dtype\"]\n        transformer_names = [\"d_model\", \"nhead\", \"dropout\", \"batch_first\", \"device\", \"dtype\"]\n\n        # Override self attention to use muP\n        self.self_attn = MultiheadAttentionMup(\n            biased_attention,\n            
**{\n                mha_name: bound_signature.arguments[transformer_name]\n                for mha_name, transformer_name in zip(mha_names, transformer_names)\n            },\n        )\n\n\nclass MuReadoutGraphium(MuReadout):\n    \"\"\"\n    PopTorch-compatible replacement for `mup.MuReadout`\n\n    Not quite a drop-in replacement for `mup.MuReadout` - you need to specify\n    `base_width`.\n\n    Set `base_width` to the width of the base model passed to `mup.set_base_shapes`\n    to get the same results on IPU and CPU. Should still \"work\" with any other\n    value, but won't give the same results as on CPU.\n    \"\"\"\n\n    def __init__(self, in_features, *args, **kwargs):\n        super().__init__(in_features, *args, **kwargs)\n        self._base_width = in_features\n\n    @property\n    def absolute_width(self):\n        return float(self.in_features)\n\n    @property\n    def base_width(self):\n        return self._base_width\n\n    @base_width.setter\n    def base_width(self, val):\n        if val is None:\n            return\n        assert isinstance(\n            val, (int, torch.int, torch.long)\n        ), f\"`base_width` must be None, int or long, provided {val} of type {type(val)}\"\n        self._base_width = val\n\n    def width_mult(self):\n        return self.absolute_width / self.base_width\n\n\nclass FCLayer(nn.Module):\n    def __init__(\n        self,\n        in_dim: int,\n        out_dim: int,\n        activation: Union[str, Callable] = \"relu\",\n        dropout: float = 0.0,\n        normalization: Union[str, Callable] = \"none\",\n        bias: bool = True,\n        init_fn: Optional[Callable] = None,\n        is_readout_layer: bool = False,\n        droppath_rate: float = 0.0,\n    ):\n        r\"\"\"\n        A simple fully connected and customizable layer. 
This layer is centered around a `torch.nn.Linear` module.\n        The order in which transformations are applied is:\n\n        - Dense Layer\n        - Normalization (if applicable)\n        - Dropout (if applicable)\n        - Activation\n        - DropPath (if applicable)\n\n        Parameters:\n            in_dim:\n                Input dimension of the layer (the `torch.nn.Linear`)\n            out_dim:\n                Output dimension of the layer.\n            dropout:\n                The ratio of units to dropout. No dropout by default.\n            activation:\n                Activation function to use.\n            normalization:\n                Normalization to use. Choices:\n\n                - \"none\" or `None`: No normalization\n                - \"batch_norm\": Batch normalization\n                - \"layer_norm\": Layer normalization\n                - `Callable`: Any callable function\n            bias:\n                Whether to enable bias for the linear layer.\n            init_fn:\n                Initialization function to use for the weight of the layer. 
Default is\n                $$\\mathcal{U}(-\\sqrt{k}, \\sqrt{k})$$ with $$k=\\frac{1}{ \\text{in_dim}}$$\n            is_readout_layer: Whether the layer should be treated as a readout layer by replacing `torch.nn.Linear`\n                with `mup.MuReadout` from the muTransfer method https://github.com/microsoft/mup\n\n            droppath_rate:\n                stochastic depth drop rate, between 0 and 1, see https://arxiv.org/abs/1603.09382\n        Attributes:\n            dropout (`torch.nn.Dropout` or `None`):\n                The dropout layer, or `None` when the dropout ratio is 0.\n            normalization (None or Callable):\n                Normalization layer\n            linear (`torch.nn.Linear`):\n                The linear layer\n            activation (`torch.nn.Module`):\n                The activation layer\n            init_fn (Callable):\n                Initialization function used for the weight of the layer\n            in_dim (int):\n                Input dimension of the linear layer\n            out_dim (int):\n                Output dimension of the linear layer\n        \"\"\"\n\n        super().__init__()\n\n        self.__params = locals()\n        del self.__params[\"__class__\"]\n        del self.__params[\"self\"]\n\n        # Basic parameters\n        self.in_dim = in_dim\n        self.out_dim = out_dim\n        self.bias = bias\n        self.dropout = None\n        self.normalization = get_norm(normalization, dim=out_dim)\n\n        # Dropout and activation\n        if dropout:\n            self.dropout = nn.Dropout(p=dropout)\n        self.activation = get_activation(activation)\n\n        self.drop_path = None\n        if droppath_rate > 0:\n            self.drop_path = DropPath(droppath_rate)\n\n        # Linear layer, or MuReadout layer\n        if not is_readout_layer:\n            self.linear = nn.Linear(in_dim, out_dim, bias=bias)\n        else:\n            self.linear = MuReadoutGraphium(in_dim, out_dim, bias=bias)\n\n            # Warn user in case of weird 
parameters\n            if self.normalization is not None:\n                logger.warning(\n                    f\"Normalization is not `None` for the readout layer. Provided {self.normalization}\"\n                )\n            if (self.dropout is not None) and (self.dropout.p > 0):\n                logger.warning(f\"Dropout is not `None` or `0` for the readout layer. Provided {self.dropout}\")\n\n        # Define the initialization function based on `muTransfer`, and reset the parameters\n        self.init_fn = init_fn if init_fn is not None else mupi.xavier_uniform_\n        self.reset_parameters()\n\n    def reset_parameters(self, init_fn=None):\n        \"\"\"\n        Reset the parameters of the linear layer using the `init_fn`.\n        \"\"\"\n        set_base_shapes(self, None, rescale_params=False)  # Set the shapes of the tensors, useful for mup\n        init_fn = init_fn or self.init_fn\n        if init_fn is not None:\n            init_fn(self.linear.weight)\n        if self.bias:\n            self.linear.bias.data.zero_()\n\n    def forward(self, h: torch.Tensor) -> torch.Tensor:\n        r\"\"\"\n        Apply the FC layer on the input features.\n\n        Parameters:\n\n            h: `torch.Tensor[..., Din]`:\n                Input feature tensor, before the FC.\n                `Din` is the number of input features\n\n        Returns:\n\n            `torch.Tensor[..., Dout]`:\n                Output feature tensor, after the FC.\n                `Dout` is the number of output features\n\n        \"\"\"\n\n        if torch.prod(torch.as_tensor(h.shape[:-1])) == 0:\n            h = torch.zeros(\n                list(h.shape[:-1]) + [self.linear.out_features],\n                device=h.device,\n                dtype=h.dtype,\n            )\n            return h\n\n        h = self.linear(h)\n\n        if self.normalization is not None:\n            if h.shape[1] != self.out_dim:\n                h = self.normalization(h.transpose(1, 
2)).transpose(1, 2)\n            else:\n                h = self.normalization(h)\n\n        if self.dropout is not None:\n            h = self.dropout(h)\n        if self.activation is not None:\n            h = self.activation(h)\n        if self.drop_path is not None:\n            h = self.drop_path(h)\n\n        return h\n\n    @property\n    def in_channels(self) -> int:\n        r\"\"\"\n        Get the input channel size. For compatibility with PyG.\n        \"\"\"\n        return self.in_dim\n\n    @property\n    def out_channels(self) -> int:\n        r\"\"\"\n        Get the output channel size. For compatibility with PyG.\n        \"\"\"\n        return self.out_dim\n\n    def __repr__(self):\n        return f\"{self.__class__.__name__}({self.in_dim} -> {self.out_dim}, activation={self.activation})\"\n\n\nclass MLP(nn.Module):\n    def __init__(\n        self,\n        in_dim: int,\n        hidden_dims: Union[Iterable[int], int],\n        out_dim: int,\n        depth: Optional[int] = None,\n        activation: Union[str, Callable] = \"relu\",\n        last_activation: Union[str, Callable] = \"none\",\n        dropout: float = 0.0,\n        last_dropout: float = 0.0,\n        normalization: Union[Type[None], str, Callable] = \"none\",\n        last_normalization: Union[Type[None], str, Callable] = \"none\",\n        first_normalization: Union[Type[None], str, Callable] = \"none\",\n        last_layer_is_readout: bool = False,\n        droppath_rate: float = 0.0,\n        constant_droppath_rate: bool = True,\n        fc_layer: FCLayer = FCLayer,\n        fc_layer_kwargs: Optional[dict] = None,\n    ):\n        r\"\"\"\n        Simple multi-layer perceptron, built of a series of FCLayers\n\n        Parameters:\n            in_dim:\n                Input dimension of the MLP\n            hidden_dims:\n                Either an integer specifying all the hidden dimensions,\n                or a list of dimensions in the hidden layers.\n            out_dim:\n  
              Output dimension of the MLP.\n            depth:\n                If `hidden_dims` is an integer, `depth` is 1 + the number of\n                hidden layers to use.\n                If `hidden_dims` is a list, then\n                `depth` must be `None` or equal to `len(hidden_dims) + 1`\n            activation:\n                Activation function to use in all the layers except the last.\n                if `depth==1`, this parameter is ignored\n            last_activation:\n                Activation function to use in the last layer.\n            dropout:\n                The ratio of units to dropout. Must be between 0 and 1\n            normalization:\n                Normalization to use. Choices:\n\n                - \"none\" or `None`: No normalization\n                - \"batch_norm\": Batch normalization\n                - \"layer_norm\": Layer normalization in the hidden layers.\n                - `Callable`: Any callable function\n\n                if `depth==1`, this parameter is ignored\n            last_normalization:\n                Normalization to use **after the last layer**. Same options as `normalization`.\n            first_normalization:\n                Normalization to use **before the first layer**. 
Same options as `normalization`.\n            last_dropout:\n                The ratio of units to dropout at the last layer.\n            last_layer_is_readout: Whether the last layer should be treated as a readout layer.\n                Allows using `mup.MuReadout` from the muTransfer method https://github.com/microsoft/mup\n            droppath_rate:\n                stochastic depth drop rate, between 0 and 1.\n                See https://arxiv.org/abs/1603.09382\n            constant_droppath_rate:\n                If `True`, drop rates will remain constant across layers.\n                Otherwise, drop rates will increase linearly with the layer index.\n                See `DropPath.get_stochastic_drop_rate`\n            fc_layer:\n                The fully connected layer to use. Must inherit from `FCLayer`.\n            fc_layer_kwargs:\n                Keyword arguments to pass to the fully connected layer.\n        \"\"\"\n\n        super().__init__()\n\n        self.in_dim = in_dim\n        self.out_dim = out_dim\n        self.fc_layer_kwargs = deepcopy(fc_layer_kwargs) or {}\n\n        # Parse the hidden dimensions and depth\n        if isinstance(hidden_dims, int):\n            self.hidden_dims = [hidden_dims] * (depth - 1)\n        else:\n            self.hidden_dims = list(hidden_dims)\n            assert (depth is None) or (\n                depth == len(self.hidden_dims) + 1\n            ), \"Mismatch between the provided network depth from `hidden_dims` and `depth`\"\n        self.depth = len(self.hidden_dims) + 1\n\n        # Parse the normalization\n        self.first_normalization = get_norm(first_normalization, dim=in_dim)\n\n        all_dims = [in_dim] + self.hidden_dims + [out_dim]\n        fully_connected = []\n        if self.depth == 0:\n            self.fully_connected = None\n            return\n        else:\n            for ii in range(self.depth):\n                if ii < (self.depth - 1):\n                    # Define the parameters for all 
intermediate layers\n                    this_activation = activation\n                    this_normalization = normalization\n                    this_dropout = dropout\n                    is_readout_layer = False\n                else:\n                    # Define the parameters for the last layer\n                    this_activation = last_activation\n                    this_normalization = last_normalization\n                    this_dropout = last_dropout\n                    is_readout_layer = last_layer_is_readout\n\n                if constant_droppath_rate:\n                    this_drop_rate = droppath_rate\n                else:\n                    this_drop_rate = DropPath.get_stochastic_drop_rate(droppath_rate, ii, self.depth)\n\n                # Add a fully-connected layer\n                fully_connected.append(\n                    fc_layer(\n                        all_dims[ii],\n                        all_dims[ii + 1],\n                        activation=this_activation,\n                        normalization=this_normalization,\n                        dropout=this_dropout,\n                        is_readout_layer=is_readout_layer,\n                        droppath_rate=this_drop_rate,\n                        **self.fc_layer_kwargs,\n                    )\n                )\n\n        self.fully_connected = nn.Sequential(*fully_connected)\n\n    def forward(self, h: torch.Tensor) -> torch.Tensor:\n        r\"\"\"\n        Apply the MLP on the input features.\n\n        Parameters:\n\n            h: `torch.Tensor[..., Din]`:\n                Input feature tensor, before the MLP.\n                `Din` is the number of input features\n\n        Returns:\n\n            `torch.Tensor[..., Dout]`:\n                Output feature tensor, after the MLP.\n                `Dout` is the number of output features\n\n        \"\"\"\n        if self.first_normalization is not None:\n            h = self.first_normalization(h)\n        if 
self.fully_connected is not None:\n            h = self.fully_connected(h)\n        return h\n\n    @property\n    def in_features(self):\n        return self.in_dim\n\n    def __getitem__(self, idx: int) -> nn.Module:\n        return self.fully_connected[idx]\n\n    def __repr__(self):\n        r\"\"\"\n        Controls how the class is printed\n        \"\"\"\n        return self.__class__.__name__ + \" (\" + str(self.in_dim) + \" -> \" + str(self.out_dim) + \")\"\n\n\nclass GRU(nn.Module):\n    def __init__(self, in_dim: int, hidden_dim: int):\n        r\"\"\"\n        Wrapper class for the GRU used by the GNN framework; `nn.GRU` is used for the Gated Recurrent Unit itself\n\n        Parameters:\n            in_dim:\n                Input dimension of the GRU layer\n            hidden_dim:\n                Hidden dimension of the GRU layer.\n        \"\"\"\n\n        super().__init__()\n        self.in_dim = in_dim\n        self.hidden_dim = hidden_dim\n        self.gru = nn.GRU(input_size=in_dim, hidden_size=hidden_dim)\n\n    def forward(self, x, y):\n        r\"\"\"\n        Parameters:\n            x:  `torch.Tensor[B, N, Din]`\n                where Din <= in_dim (difference is padded)\n            y:  `torch.Tensor[B, N, Dh]`\n                where Dh <= hidden_dim (difference is padded)\n\n        Returns:\n            torch.Tensor: `torch.Tensor[B, N, Dh]`\n\n        \"\"\"\n        assert x.shape[-1] <= self.in_dim and y.shape[-1] <= self.hidden_dim\n\n        (B, N, _) = x.shape\n        x = x.reshape(1, B * N, -1).contiguous()\n        y = y.reshape(1, B * N, -1).contiguous()\n\n        # padding if necessary\n        if x.shape[-1] < self.in_dim:\n            x = F.pad(input=x, pad=[0, self.in_dim - x.shape[-1]], mode=\"constant\", value=0)\n        if y.shape[-1] < self.hidden_dim:\n            y = F.pad(\n                input=y,\n                pad=[0, self.hidden_dim - y.shape[-1]],\n                mode=\"constant\",\n                value=0,\n        
    )\n\n        x = self.gru(x, y)[1]\n        x = x.reshape(B, N, -1)\n        return x\n\n\nclass DropPath(nn.Module):\n    def __init__(self, drop_rate: float):\n        r\"\"\"\n        DropPath class for stochastic depth\n        Deep Networks with Stochastic Depth\n        Gao Huang, Yu Sun, Zhuang Liu, Daniel Sedra and Kilian Weinberger\n        https://arxiv.org/abs/1603.09382\n\n        Parameters:\n            drop_rate:\n                Dropout probability\n        \"\"\"\n\n        super().__init__()\n        self.drop_rate = drop_rate\n\n    @staticmethod\n    def get_stochastic_drop_rate(\n        drop_rate: float, layer_idx: Optional[int] = None, layer_depth: Optional[int] = None\n    ):\n        \"\"\"\n        Get the stochastic drop rate from the nominal drop rate, the layer index, and the layer depth.\n\n        `return drop_rate * (layer_idx / (layer_depth - 1))`\n\n        Parameters:\n            drop_rate:\n                Nominal dropout probability\n\n            layer_idx:\n                The index of the current layer\n\n            layer_depth:\n                The total depth (number of layers) associated with this specific layer\n\n        \"\"\"\n        if drop_rate == 0:\n            return 0\n        else:\n            assert (layer_idx is not None) and (\n                layer_depth is not None\n            ), f\"layer_idx={layer_idx} and layer_depth={layer_depth} should be integers when `droppath_rate>0`\"\n            return drop_rate * (layer_idx / (layer_depth - 1))\n\n    def forward(\n        self,\n        input: Tensor,\n        batch_idx: IntTensor,\n        batch_size: Optional[int] = None,\n    ) -> Tensor:\n        r\"\"\"\n        Parameters:\n            input:  `torch.Tensor[total_num_nodes, hidden]`\n            batch_idx: The batch attribute of the batch object, `batch.batch`\n            batch_size: The batch size. 
Must be provided when working on IPU\n\n        Returns:\n            torch.Tensor: `torch.Tensor[total_num_nodes, hidden]`\n\n        \"\"\"\n        on_ipu = is_running_on_ipu()\n\n        if self.drop_rate > 0:\n            keep_prob = 1 - self.drop_rate\n\n            # Parse the batch size\n            if batch_size is None:\n                if on_ipu:\n                    raise ValueError(\n                        \"When using the IPU the batch size must be \"\n                        \"provided during compilation instead of determined at runtime\"\n                    )\n                else:\n                    batch_size = int(batch_idx.max()) + 1\n\n            # mask shape: [num_graphs, 1]\n            mask = input.new_empty(batch_size, 1).bernoulli_(keep_prob)\n            # if on_ipu, the last graph is a padded fake graph\n            if on_ipu:\n                mask[-1] = 0\n            # index the per-graph mask to extend it to [total_num_nodes, 1]\n            node_mask = mask[batch_idx]\n            if keep_prob == 0:\n                # avoid dividing by 0\n                input_scaled = input\n            else:\n                input_scaled = input / keep_prob\n            out = input_scaled * node_mask\n        else:\n            out = input\n        return out\n"
  },
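The linear schedule computed by `DropPath.get_stochastic_drop_rate` above is pure arithmetic, so it can be sketched without torch (standalone re-statement of the same formula, not the library code itself):

```python
# Plain-Python sketch of `DropPath.get_stochastic_drop_rate`: the drop rate
# grows linearly from 0 at the first layer (layer_idx=0) up to the nominal
# `drop_rate` at the last layer (layer_idx=layer_depth-1).
def stochastic_drop_rate(drop_rate: float, layer_idx: int, layer_depth: int) -> float:
    if drop_rate == 0:
        return 0.0
    return drop_rate * (layer_idx / (layer_depth - 1))

# With drop_rate=0.2 and a depth of 5, the per-layer rates ramp up linearly
rates = [stochastic_drop_rate(0.2, i, 5) for i in range(5)]
```

This is what `MLP` uses when `constant_droppath_rate=False`: early layers are almost never dropped, the deepest layer is dropped at the full nominal rate.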
  {
    "path": "graphium/nn/encoders/README.md",
    "content": "<div align=\"center\">\n    <img src=\"../../docs/images/logo-title.png\" height=\"80px\">\n    <h3>The Graph of Life Library.</h3>\n</div>\n\n\n## What is in this folder?\n\nCode for the positional encoder networks that produce positional and structural encodings.\n\n- ✅ `base_encoder.py`: `BaseEncoder` class to be inherited by all positional encoder networks\n- ✅ `mlp_encoder.py`: `MLPEncoder` as a simple MLP encoder for any positional encoding\n- `gaussian_kernel_pos_encoder.py`: `GaussianKernelPosEncoder` class for the Gaussian kernel positional encoder\n- `laplace_pos_encoder.py`: `LapPENodeEncoder` class for Laplacian eigenvector and eigenvalue encoding\n"
  },
  {
    "path": "graphium/nn/encoders/__init__.py",
    "content": "from .base_encoder import BaseEncoder\nfrom .laplace_pos_encoder import LapPENodeEncoder\nfrom .mlp_encoder import MLPEncoder, CatMLPEncoder\nfrom .signnet_pos_encoder import SignNetNodeEncoder\nfrom .gaussian_kernel_pos_encoder import GaussianKernelPosEncoder\nfrom .bessel_pos_encoder import BesselSphericalPosEncoder\n"
  },
  {
    "path": "graphium/nn/encoders/base_encoder.py",
    "content": "from typing import List, Dict, Any, Union, Callable\nimport abc\nimport torch\nfrom torch_geometric.data import Batch\n\n\nfrom graphium.nn.base_layers import get_norm\nfrom graphium.nn.utils import MupMixin\n\n\nclass BaseEncoder(torch.nn.Module, MupMixin):\n    def __init__(\n        self,\n        input_keys: List[str],\n        output_keys: List[str],\n        in_dim: int,\n        out_dim: int,\n        num_layers: int,\n        activation: Union[str, Callable] = \"relu\",\n        first_normalization=None,\n        use_input_keys_prefix: bool = True,  # TODO: might be redundant along with parse_input_keys_with_prefix function\n    ):\n        r\"\"\"\n        Base class for all positional and structural encoders.\n        Initialize the encoder with the following arguments:\n        Parameters:\n            input_keys: The keys from the graph to use as input\n            output_keys: The keys to return as output encodings\n            in_dim: The input dimension for the encoder\n            out_dim: The output dimension of the encodings\n            num_layers: The number of layers of the encoder\n            activation: The activation function to use\n            first_normalization: The normalization to use before the first layer\n            use_input_keys_prefix: Whether to use the `key_prefix` argument in the `forward` method.\n            This is useful when the encodings are categorized by the function `get_all_positional_encoding`\n        \"\"\"\n        super().__init__()\n\n        if type(in_dim) is list:\n            in_dim = sum(in_dim)\n\n        self.input_keys = self.parse_input_keys(input_keys)\n        self.output_keys = self.parse_output_keys(output_keys)\n        self.in_dim = in_dim\n        self.out_dim = out_dim\n        self.num_layers = num_layers\n        self.activation = activation\n        self.use_input_keys_prefix = use_input_keys_prefix\n        self.first_normalization = get_norm(first_normalization, 
dim=in_dim)\n\n    # TODO: the function below seems redundant; could be removed/replaced moving forward\n    def parse_input_keys_with_prefix(self, key_prefix):\n        \"\"\"\n        Parse the `input_keys` argument, given a certain prefix.\n        If the prefix is `None`, it is ignored\n        \"\"\"\n        ### TODO: redundant\n        input_keys = self.input_keys\n        if (key_prefix is not None) and (self.use_input_keys_prefix):\n            input_keys = [f\"{k}\" for k in input_keys]\n            # input_keys = [f\"{key_prefix}/{k}\" for k in input_keys]\n        ###\n        return input_keys\n\n    @abc.abstractmethod\n    def forward(self, graph: Batch, key_prefix=None) -> Dict[str, torch.Tensor]:\n        r\"\"\"\n        Forward pass of the encoder on a graph.\n        This is a method to be implemented by the child class.\n        Parameters:\n            graph: The input pyg Batch\n        \"\"\"\n        raise NotImplementedError(\"This method must be implemented by the child class\")\n\n    @abc.abstractmethod\n    def parse_input_keys(self, input_keys: List[str]) -> List[str]:\n        r\"\"\"\n        Parse the `input_keys` argument. This is a method to be implemented by the child class.\n        Parameters:\n            input_keys: The input keys to parse\n        \"\"\"\n        raise NotImplementedError(\"This method must be implemented by the child class\")\n\n    @abc.abstractmethod\n    def parse_output_keys(self, output_keys: List[str]) -> List[str]:\n        \"\"\"\n        Parse the `output_keys` argument. This is a method to be implemented by the child class.\n        Parameters:\n            output_keys: The output keys to parse\n        \"\"\"\n        raise NotImplementedError(\"This method must be implemented by the child class\")\n\n    def make_mup_base_kwargs(self, divide_factor: float = 2.0, factor_in_dim: bool = False) -> Dict[str, Any]:\n        \"\"\"\n        Create a 'base' model to be used by the `mup` or `muTransfer` scaling of the model.\n        The base model is usually identical to the regular model, but with the\n        layers width divided by a given factor (2 by default)\n\n        Parameters:\n            divide_factor: Factor by which to divide the width.\n            factor_in_dim: Whether to factor the input dimension\n        Returns:\n            A dictionary with the base model arguments\n        \"\"\"\n\n        base_kwargs = {\n            \"input_keys\": self.input_keys,\n            \"output_keys\": self.output_keys,\n            \"in_dim\": round(self.in_dim / divide_factor) if factor_in_dim else self.in_dim,\n            \"out_dim\": round(self.out_dim / divide_factor),\n            \"num_layers\": self.num_layers,\n            \"activation\": self.activation,\n            \"first_normalization\": type(self.first_normalization).__name__,\n            \"use_input_keys_prefix\": self.use_input_keys_prefix,\n        }\n\n        return base_kwargs\n"
  },
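The width-division rule in `BaseEncoder.make_mup_base_kwargs` above can be sketched standalone. `mup_base_dims` below is a hypothetical helper, not part of graphium; it only mirrors the `round(dim / divide_factor)` arithmetic applied to `in_dim` and `out_dim`:

```python
# Sketch of the muP width-division rule: every "width" hyper-parameter is
# divided by `divide_factor` and rounded, while the input dimension shrinks
# only when `factor_in_dim` is True (hypothetical helper, not graphium API).

def mup_base_dims(in_dim: int, out_dim: int, divide_factor: float = 2.0, factor_in_dim: bool = False) -> dict:
    return {
        "in_dim": round(in_dim / divide_factor) if factor_in_dim else in_dim,
        "out_dim": round(out_dim / divide_factor),
    }

print(mup_base_dims(64, 128))                      # in_dim untouched, out_dim halved
print(mup_base_dims(64, 128, factor_in_dim=True))  # both halved
```

Subclasses extend the returned dict with their own width-like kwargs (e.g. `hidden_dim`, `embed_dim`, `num_radial`) divided the same way.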
  {
    "path": "graphium/nn/encoders/bessel_pos_encoder.py",
    "content": "import torch\nimport torch.nn as nn\nfrom torch import Tensor\n\nfrom typing import Union, Callable, List, Dict, Any, Optional, Tuple\nfrom torch_geometric.data import Batch\nfrom torch_geometric.nn.models.dimenet import BesselBasisLayer, SphericalBasisLayer\nfrom torch_geometric.nn import radius_graph\n\nfrom graphium.nn.encoders.base_encoder import BaseEncoder\nfrom graphium.nn.pyg_layers.utils import triplets\nfrom graphium.nn.pyg_layers.dimenet_pyg import OutputBlock\nfrom graphium.nn.base_layers import FCLayer\n\n\nclass BesselSphericalPosEncoder(BaseEncoder):\n    def __init__(\n        self,\n        input_keys: List[str],  # The keys from the pyg graph\n        output_keys: List[str],  # The keys to return\n        in_dim: int,\n        out_dim: int,\n        num_layers: int,\n        out_dim_edges: int,\n        num_output_layers: int,\n        num_spherical: int,\n        num_radial: int,\n        max_num_neighbors: int = 32,\n        cutoff: float = 5.0,\n        envelope_exponent: int = 5,\n        activation: Union[str, Callable] = \"gelu\",\n        first_normalization=\"none\",\n        use_input_keys_prefix: bool = True,\n    ):\n        r\"\"\"\n        Configurable DimeNet's embedding encoder from the\n        `\"Directional Message Passing for Molecular Graphs\" <https://arxiv.org/abs/2003.03123> paper.\n\n        [!] 
code uses the pytorch-geometric implementation of BesselBasisLayer & SphericalBasisLayer\n\n        Parameters:\n            input_keys: The keys from the graph to use as input\n            output_keys: The keys to return as output encodings\n                (**should at least contain: `edge_rbf`, `triplet_sbf`, `radius_edge_index`; Optional: `node_*`, `edge_*`)\n            in_dim: The input dimension for the encoder (**not used)\n            out_dim: The output dimension of the node encodings\n            num_layers: The number of layers of the encoder (**not used)\n            out_dim_edges: The output dimension of the edge encodings\n            num_output_layers: The number of layers of the OutBlock\n            num_spherical (int): Number of spherical harmonics.\n            num_radial (int): Number of radial basis functions.\n            max_num_neighbors (int): The maximum number of neighbors to consider in radius graph. (default: 32)\n            cutoff (float): Cutoff distance for interatomic interactions. (default: 5.0)\n            envelope_exponent (int): Shape of the smooth cutoff. 
(default: 5)\n            activation: The activation function to use\n            first_normalization: The normalization to use before the first layer\n            use_input_keys_prefix: Whether to use the `key_prefix` argument in the `forward` method.\n\n        \"\"\"\n        super().__init__(\n            input_keys=input_keys,\n            output_keys=output_keys,\n            in_dim=in_dim,\n            out_dim=out_dim,\n            num_layers=num_layers,\n            activation=activation,\n            first_normalization=first_normalization,\n            use_input_keys_prefix=use_input_keys_prefix,\n        )\n        self.num_radial = num_radial\n        self.cutoff = cutoff\n        self.envelope_exponent = envelope_exponent\n        self.num_spherical = num_spherical\n        self.max_num_neighbors = max_num_neighbors\n\n        # static Bessel embeddings\n        self.rbf = BesselBasisLayer(num_radial, cutoff, envelope_exponent)  # for edges\n        self.sbf = SphericalBasisLayer(num_spherical, num_radial, cutoff, envelope_exponent)  # for triplets\n\n        # edge embedding (num_radial -> out_dim_edges)\n        # ***simplified version*** by removing atom type input\n        self.rbf_proj = FCLayer(num_radial, out_dim_edges, activation=activation)\n        # node embedding from edge embedding (out_dim_edges -> out_dim)\n        self.output_block_0 = OutputBlock(\n            num_radial, out_dim_edges, out_dim, num_output_layers, activation\n        )  # 1 output block right after the embedding, as in the original DimeNet\n\n    def forward(self, batch: Batch, key_prefix: Optional[str] = None) -> Dict[str, Any]:\n        r\"\"\"\n        Forward function of the BesselSphericalPosEncoder class\n        Parameters:\n            batch: The batch of pyg graphs\n            key_prefix: The prefix to use for the input keys\n        Returns:\n            A dictionary of the output encodings with keys specified by `output_keys`\n        \"\"\"\n        ### Get the input keys ###\n        
positions_3d_key = self.parse_input_keys_with_prefix(key_prefix)[0]\n        # be in shape [num_nodes, 3]\n        pos = batch[positions_3d_key]\n        # Create radius graph in encoder (not use chemical topology of molecules)\n        radius_edge_index = radius_graph(\n            pos, r=self.cutoff, batch=batch.batch, max_num_neighbors=self.max_num_neighbors\n        )\n\n        # Process edges and triplets.\n        i, j, idx_i, idx_j, idx_k, idx_kj, idx_ji = triplets(radius_edge_index, num_nodes=pos.size(0))\n\n        # Calculate distances.\n        dist = (pos[i] - pos[j]).pow(2).sum(dim=-1).sqrt()\n        # Calculate angles.\n        pos_i = pos[idx_i]\n        pos_ji, pos_ki = pos[idx_j] - pos_i, pos[idx_k] - pos_i\n        a = (pos_ji * pos_ki).sum(dim=-1)\n        b = torch.cross(pos_ji, pos_ki).norm(dim=-1)\n        angle = torch.atan2(b, a)\n\n        # [num_edges, num_radial]\n        rbf = self.rbf(dist)\n        # [num_triplets, num_spherical * num_radial]\n        sbf = self.sbf(dist, angle, idx_kj)\n        # initial 3D edge embedding [num_edges, out_dim] (may merge with other edge encoder's output)\n        edge_feature_3d = self.rbf_proj(rbf)\n        # initial 3D node embedding [num_nodes, out_dim] (align with original DimeNet implementation)\n        P = self.output_block_0(edge_feature_3d, rbf, i, num_nodes=pos.size(0))\n\n        # Crash if the key starts with 'graph_'\n        # Return `rbf` and `sbf` for necessary message passing in DimeNet\n        # Return `radius_edge_index` for necessary message passing in DimeNet\n        # Return `P` as node embedding (if the key starts with 'node_')\n        # Return `edge_feature_3d` otherwise\n        output = {}\n        for key in self.output_keys:\n            if isinstance(key, str) and key.startswith(\"graph_\"):\n                raise ValueError(\"Graph encodings are not supported for this encoder\")\n            elif key.startswith(\"node_\"):\n                output[key] = P\n            
elif key == \"edge_rbf\":\n                output[key] = rbf\n            elif key == \"triplet_sbf\":\n                output[key] = sbf\n            elif key == \"radius_edge_index\":\n                output[key] = radius_edge_index\n            else:\n                output[key] = edge_feature_3d\n        return output\n\n    def make_mup_base_kwargs(\n        self,\n        divide_factor: float = 2.0,\n        factor_in_dim: bool = False,\n    ) -> Dict[str, Any]:\n        \"\"\"\n        Create a 'base' model to be used by the `mup` or `muTransfer` scaling of the model.\n        The base model is usually identical to the regular model, but with the\n        layers width divided by a given factor (2 by default)\n\n        Parameter:\n            divide_factor: Factor by which to divide the width.\n            factor_in_dim: Whether to factor the input dimension\n        Returns:\n            A dictionary of the base model kwargs\n        \"\"\"\n        base_kwargs = super().make_mup_base_kwargs(divide_factor=divide_factor, factor_in_dim=factor_in_dim)\n        base_kwargs.update(\n            dict(\n                num_spherical=self.num_spherical,\n                num_radial=round(self.num_radial / divide_factor),\n                cutoff=self.cutoff,\n                envelope_exponent=self.envelope_exponent,\n            )\n        )\n        return base_kwargs\n\n    # these two below aligned with `gaussian_kernel_pos_encoder``\n    def parse_input_keys(\n        self,\n        input_keys: List[str],\n    ) -> List[str]:\n        r\"\"\"\n        Parse the `input_keys`.\n        Parameters:\n            input_keys: The input keys to parse\n        Returns:\n            The parsed input keys\n        \"\"\"\n        if len(input_keys) != 1:\n            raise ValueError(f\"`{self.__class__}` only supports one key\")\n        for key in input_keys:\n            assert not key.startswith(\n                \"edge_\"\n            ), f\"Input keys must be node 
features, not edge features, for encoder {self.__class__}\"\n            assert not key.startswith(\n                \"graph_\"\n            ), f\"Input keys must be node features, not graph features, for encoder {self.__class__}\"\n        return input_keys\n\n    def parse_output_keys(\n        self,\n        output_keys: List[str],\n    ) -> List[str]:\n        r\"\"\"\n        Parse the `output_keys`.\n        Parameters:\n            output_keys: The output keys to parse\n        Returns:\n            The parsed output keys\n        \"\"\"\n        # all of three are required to do DimeNet-style message passing\n        assert \"edge_rbf\" in output_keys, \"Edge radial basis feature should present for this encoder\"\n        assert (\n            \"triplet_sbf\" in output_keys\n        ), \"Triplet(angle) spherical radial basis feature should present for this encoder\"\n        assert (\n            \"radius_edge_index\" in output_keys\n        ), \"Radius edge index (graph) should be built in forward of this encoder\"\n\n        for key in output_keys:\n            assert not key.startswith(\"graph_\"), \"Graph encodings are not supported for this encoder\"\n        return output_keys\n"
  },
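The triplet-angle computation in `BesselSphericalPosEncoder.forward` uses `atan2(|a x b|, a . b)` rather than `acos` of the normalized dot product, which is numerically safer near 0 and pi. A torch-free sketch (`triplet_angle` is a hypothetical helper, assuming 3D positions as tuples):

```python
import math

# For a node i with neighbors j and k, compute the angle between the edge
# vectors (i -> j) and (i -> k) as atan2(|a x b|, a . b), mirroring the
# `angle = torch.atan2(b, a)` step of the encoder's forward pass.

def triplet_angle(pos_i, pos_j, pos_k):
    a = [pj - pi for pj, pi in zip(pos_j, pos_i)]  # vector i -> j
    b = [pk - pi for pk, pi in zip(pos_k, pos_i)]  # vector i -> k
    dot = sum(x * y for x, y in zip(a, b))
    cross = [
        a[1] * b[2] - a[2] * b[1],
        a[2] * b[0] - a[0] * b[2],
        a[0] * b[1] - a[1] * b[0],
    ]
    return math.atan2(math.sqrt(sum(c * c for c in cross)), dot)

# Orthogonal edges give pi/2, opposite edges give pi
print(triplet_angle((0, 0, 0), (1, 0, 0), (0, 1, 0)))
```

The real encoder computes this in batch over all triplets returned by `triplets(radius_edge_index, ...)` and feeds the angles into the spherical basis layer.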
  {
    "path": "graphium/nn/encoders/gaussian_kernel_pos_encoder.py",
    "content": "from typing import Union, Callable, List, Dict, Any, Optional\nfrom torch_geometric.data import Batch\n\nfrom graphium.nn.pyg_layers.utils import PreprocessPositions\nfrom graphium.ipu.ipu_utils import is_running_on_ipu\nfrom graphium.nn.encoders.base_encoder import BaseEncoder\n\n\nclass GaussianKernelPosEncoder(BaseEncoder):\n    def __init__(\n        self,\n        input_keys: List[str],  # The keys from the pyg graph\n        output_keys: List[str],  # The keys to return\n        in_dim: int,\n        out_dim: int,\n        embed_dim: int,\n        num_layers: int,\n        max_num_nodes_per_graph: Optional[int] = None,\n        activation: Union[str, Callable] = \"gelu\",\n        first_normalization=\"none\",\n        use_input_keys_prefix: bool = True,\n        num_heads: int = 1,\n    ):\n        r\"\"\"\n        Configurable gaussian kernel-based Positional Encoding node and edge encoder.\n        Useful for encoding 3D conformation positions.\n\n        Parameters:\n            input_keys: The keys from the pyg graph to use as input\n            output_keys: The keys to return corresponding to the output encodings\n            in_dim: The input dimension for the encoder\n            out_dim: The output dimension of the encodings\n            embed_dim: The dimension of the embedding\n            num_layers: The number of layers of the encoder\n            max_num_nodes_per_graph: The maximum number of nodes per graph\n            activation: The activation function to use\n            first_normalization: The normalization to use before the first layer\n            use_input_keys_prefix: Whether to use the `key_prefix` argument in the `forward` method.\n            num_heads: The number of heads to use for the multi-head attention\n        \"\"\"\n        super().__init__(\n            input_keys=input_keys,\n            output_keys=output_keys,\n            in_dim=in_dim,\n            out_dim=out_dim,\n            
num_layers=num_layers,\n            activation=activation,\n            first_normalization=first_normalization,\n            use_input_keys_prefix=use_input_keys_prefix,\n        )\n\n        self.embed_dim = embed_dim\n        self.num_heads = num_heads\n        self.max_num_nodes_per_graph = max_num_nodes_per_graph\n\n        # parameters for preprocessing 3d positions\n        self.preprocess_3d_positions = PreprocessPositions(\n            num_heads=self.num_heads,\n            embed_dim=self.embed_dim,\n            num_kernel=self.out_dim,\n            num_layers=self.num_layers,\n            activation=self.activation,\n            first_normalization=self.first_normalization,\n        )\n\n    def parse_input_keys(\n        self,\n        input_keys: List[str],\n    ) -> List[str]:\n        r\"\"\"\n        Parse the `input_keys`.\n        Parameters:\n            input_keys: The input keys to parse\n        Returns:\n            The parsed input keys\n        \"\"\"\n\n        if len(input_keys) != 1:\n            raise ValueError(f\"`{self.__class__}` only supports one key\")\n        for key in input_keys:\n            assert not key.startswith(\n                \"edge_\"\n            ), f\"Input keys must be node features, not edge features, for encoder {self.__class__}\"\n            assert not key.startswith(\n                \"nodepair_\"\n            ), f\"Input keys must be node features, not nodepair features, for encoder {self.__class__}\"\n            assert not key.startswith(\n                \"graph_\"\n            ), f\"Input keys must be node features, not graph features, for encoder {self.__class__}\"\n        return input_keys\n\n    def parse_output_keys(\n        self,\n        output_keys: List[str],\n    ) -> List[str]:\n        r\"\"\"\n        Parse the `output_keys`.\n        Parameters:\n            output_keys: The output keys to parse\n        Returns:\n            The parsed output keys\n        \"\"\"\n        for key in 
output_keys:\n            assert not key.startswith(\"edge_\"), \"Edge encodings are not supported for this encoder\"\n            assert not key.startswith(\"graph_\"), \"Graph encodings are not supported for this encoder\"\n        return output_keys\n\n    def forward(self, batch: Batch, key_prefix: Optional[str] = None) -> Dict[str, Any]:\n        r\"\"\"\n        forward function of the GaussianKernelPosEncoder class\n        Parameters:\n            batch: The batch of pyg graphs\n            key_prefix: The prefix to use for the input keys\n        Returns:\n            A dictionary of the output encodings with keys specified by `output_keys`\n        \"\"\"\n        input_keys = self.parse_input_keys_with_prefix(key_prefix)\n\n        on_ipu = is_running_on_ipu()\n        max_num_nodes_per_graph = None\n        if on_ipu:\n            max_num_nodes_per_graph = self.max_num_nodes_per_graph\n\n        attn_bias_3d, node_feature_3d = self.preprocess_3d_positions(\n            batch, max_num_nodes_per_graph, on_ipu, positions_3d_key=input_keys[0]\n        )\n\n        # Return `attn_bias_3d` if the key starts with 'nodepair_'\n        # Crash if the key starts with 'edge_' or 'graph_'\n        # Return `node_feature_3d` otherwise\n        output = {}\n        for key in self.output_keys:\n            if isinstance(key, str) and key.startswith(\"nodepair_\"):\n                output[key] = attn_bias_3d\n            elif isinstance(key, str) and key.startswith(\"edge_\"):\n                raise ValueError(\"Edge encodings are not supported for this encoder\")\n            else:\n                output[key] = node_feature_3d\n        return output\n\n    def make_mup_base_kwargs(\n        self,\n        divide_factor: float = 2.0,\n        factor_in_dim: bool = False,\n    ) -> Dict[str, Any]:\n        \"\"\"\n        Create a 'base' model to be used by the `mup` or `muTransfer` scaling of the model.\n        The base model is usually identical to the regular 
model, but with the\n        layers width divided by a given factor (2 by default)\n\n        Parameters:\n            divide_factor: Factor by which to divide the width.\n            factor_in_dim: Whether to factor the input dimension\n        Returns:\n            A dictionary of the base model kwargs\n        \"\"\"\n        base_kwargs = super().make_mup_base_kwargs(divide_factor=divide_factor, factor_in_dim=factor_in_dim)\n        base_kwargs.update(\n            dict(\n                num_heads=self.num_heads,\n                embed_dim=round(self.embed_dim / divide_factor),\n                max_num_nodes_per_graph=self.max_num_nodes_per_graph,\n            )\n        )\n        return base_kwargs\n"
  },
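The prefix-based output routing at the end of `GaussianKernelPosEncoder.forward` can be sketched on plain values. `route_outputs` is a hypothetical stand-in using strings in place of the real tensors:

```python
# Each requested output key selects either the node-pair attention bias or the
# per-node 3D feature, and 'edge_' keys are rejected, mirroring the routing
# loop in GaussianKernelPosEncoder.forward (hypothetical helper, not graphium API).

def route_outputs(output_keys, attn_bias_3d, node_feature_3d):
    output = {}
    for key in output_keys:
        if key.startswith("nodepair_"):
            output[key] = attn_bias_3d
        elif key.startswith("edge_"):
            raise ValueError("Edge encodings are not supported for this encoder")
        else:
            output[key] = node_feature_3d
    return output

out = route_outputs(["nodepair_gaussian_bias_3d", "node_feat_3d"], "BIAS", "FEAT")
print(out)
```

In the real encoder, `attn_bias_3d` and `node_feature_3d` come from `PreprocessPositions`, and `parse_output_keys` rejects unsupported prefixes up front.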
  {
    "path": "graphium/nn/encoders/laplace_pos_encoder.py",
    "content": "from typing import List, Dict, Any, Optional, Union, Callable\nimport torch\nimport torch.nn as nn\nfrom torch_geometric.data import Batch\n\nfrom graphium.nn.base_layers import MLP, get_norm, FCLayer, TransformerEncoderLayerMup\nfrom graphium.nn.encoders.base_encoder import BaseEncoder\n\n\nclass LapPENodeEncoder(BaseEncoder):\n    def __init__(\n        self,\n        input_keys: List[str],\n        output_keys: List[str],\n        in_dim: int,  # Size of Laplace PE embedding. Only used by the MLP model\n        hidden_dim: int,\n        out_dim: int,\n        num_layers: int,\n        activation: Optional[Union[str, Callable]] = \"relu\",\n        model_type: str = \"DeepSet\",  # 'Transformer' or 'DeepSet' or 'MLP'\n        num_layers_post=1,  # Num. layers to apply after pooling\n        dropout=0.0,\n        first_normalization=None,\n        use_input_keys_prefix: bool = True,\n        **model_kwargs,\n    ):\n        r\"\"\"\n        Laplace Positional Embedding node encoder.\n        LapPE of size dim_pe will get appended to each node feature vector.\n\n        Parameters:\n            input_keys: List of input keys to use from the data object.\n            output_keys: List of output keys to add to the data object.\n            in_dim : Size of Laplace PE embedding. 
Only used by the MLP model\n            hidden_dim: Size of hidden layer\n            out_dim: Size of final node embedding\n            num_layers: Number of layers in the MLP\n            activation: Activation function to use.\n            model_type: 'Transformer' or 'DeepSet' or 'MLP'\n            num_layers_post: Number of layers to apply after pooling\n            dropout: Dropout rate\n            first_normalization: Normalization to apply to the first layer.\n        \"\"\"\n        super().__init__(\n            input_keys=input_keys,\n            output_keys=output_keys,\n            in_dim=in_dim,\n            out_dim=out_dim,\n            num_layers=num_layers,\n            activation=activation,\n            first_normalization=first_normalization,\n            use_input_keys_prefix=use_input_keys_prefix,\n        )\n\n        # Parse the `input_keys`.\n        self.hidden_dim = hidden_dim\n        self.model_type = model_type\n        if num_layers_post == 0:\n            assert hidden_dim == out_dim, \"Hidden dim must be equal to out dim if num_layers_post == 0\"\n        self.num_layers_post = num_layers_post\n        self.dropout = dropout\n        self.model_kwargs = model_kwargs\n\n        if out_dim - in_dim < 1:\n            raise ValueError(f\"LapPE size {in_dim} is too large for \" f\"desired embedding size of {out_dim}.\")\n\n        # Initial projection of eigenvalue and the node's eigenvector value\n        self.linear_in = FCLayer(2, hidden_dim, activation=\"none\")\n\n        if self.model_type == \"Transformer\":\n            # Transformer model for LapPE\n            model_kwargs.setdefault(\"nhead\", 1)\n            encoder_layer = TransformerEncoderLayerMup(\n                None,\n                d_model=hidden_dim,\n                batch_first=True,\n                dropout=dropout,\n                activation=self.activation,\n                **model_kwargs,\n            )\n            self.pe_encoder = 
nn.TransformerEncoder(encoder_layer, num_layers=num_layers)\n        elif self.model_type == \"DeepSet\":\n            # DeepSet model for LapPE (this will be followed by a sum pooling)\n            self.pe_encoder = MLP(\n                in_dim=hidden_dim,\n                hidden_dims=hidden_dim,\n                out_dim=hidden_dim,\n                depth=num_layers,\n                dropout=dropout,\n                **model_kwargs,\n            )\n        elif self.model_type == \"MLP\":\n            # MLP that will mix all eigenvalues and eigenvectors\n            self.pe_encoder = MLP(\n                in_dim=self.in_dim * hidden_dim,\n                hidden_dims=hidden_dim,\n                out_dim=hidden_dim,\n                depth=num_layers_post,\n                dropout=dropout,\n                activation=activation,\n                last_activation=\"none\",\n                **model_kwargs,\n            )\n        else:\n            raise ValueError(f\"Unexpected PE model {self.model_type}\")\n\n        self.post_mlp = None\n        if num_layers_post > 0:\n            # MLP to apply post pooling\n            self.post_mlp = MLP(\n                in_dim=hidden_dim,\n                hidden_dims=hidden_dim,\n                out_dim=out_dim,\n                depth=num_layers_post,\n                dropout=dropout,\n                activation=activation,\n                last_activation=\"none\",\n            )\n\n    def parse_input_keys(\n        self,\n        input_keys: List[str],\n    ) -> List[str]:\n        r\"\"\"\n        Parse the input keys and make sure they are supported for this encoder\n        Parameters:\n            input_keys: List of input keys to use from the data object.\n        Returns:\n            List of parsed input keys\n        \"\"\"\n        if len(input_keys) != 2:\n            raise ValueError(f\"`{self.__class__}` only supports 2 keys\")\n        for key in input_keys:\n            assert not key.startswith(\n              
  \"edge_\"\n            ), f\"Input keys must be node features, not edge features, for encoder {self.__class__}\"\n            assert not key.startswith(\n                \"nodepair_\"\n            ), f\"Input keys must be node features, not graph features, for encoder {self.__class__}\"\n            assert not key.startswith(\n                \"graph_\"\n            ), f\"Input keys must be node features, not graph features, for encoder {self.__class__}\"\n        return input_keys\n\n    def parse_output_keys(\n        self,\n        output_keys: List[str],\n    ) -> List[str]:\n        r\"\"\"\n        parse the output keys\n        Parameters:\n            output_keys: List of output keys to add to the data object.\n        Returns:\n            List of parsed output keys\n        \"\"\"\n        for key in output_keys:\n            assert not key.startswith(\n                \"edge_\"\n            ), f\"Edge encodings are not supported for encoder {self.__class__}\"\n            assert not key.startswith(\n                \"nodepair_\"\n            ), f\"Edge encodings are not supported for encoder {self.__class__}\"\n            assert not key.startswith(\n                \"graph_\"\n            ), f\"Graph encodings are not supported for encoder {self.__class__}\"\n        return output_keys\n\n    def forward(\n        self,\n        batch: Batch,\n        key_prefix: Optional[str] = None,\n    ) -> Dict[str, torch.Tensor]:\n        r\"\"\"\n        Forward pass of the encoder.\n        Parameters:\n            batch: pyg Batches of graphs\n            key_prefix: Prefix to use for the input and output keys.\n        Returns:\n            output dictionary with keys as specified in `output_keys` and their output embeddings.\n        \"\"\"\n        # input_keys = self.parse_input_keys_with_prefix(key_prefix)\n        eigvals, eigvecs = batch[self.input_keys[0]], batch[self.input_keys[1]]\n\n        # Random flipping to the Laplacian encoder\n        if 
self.training:\n            sign_flip = torch.rand(eigvecs.size(1), device=eigvecs.device, dtype=eigvecs.dtype)\n            sign_flip[sign_flip >= 0.5] = 1.0\n            sign_flip[sign_flip < 0.5] = -1.0\n            eigvecs = eigvecs * sign_flip.unsqueeze(0)\n\n        pos_enc = torch.cat(\n            (eigvecs.unsqueeze(2), eigvals.unsqueeze(2)), dim=2\n        )  # (Num nodes) x (Num Eigenvectors) x 2\n        empty_mask = torch.isnan(pos_enc)  # (Num nodes) x (Num Eigenvectors) x 2\n\n        pos_enc[empty_mask] = 0  # (Num nodes) x (Num Eigenvectors) x 2\n        if self.first_normalization:\n            pos_enc = self.first_normalization(pos_enc)\n        pos_enc = self.linear_in(pos_enc)  # (Num nodes) x (Num Eigenvectors) x hidden_dim\n\n        # PE encoder: a Transformer or DeepSet model\n        if self.model_type == \"Transformer\":\n            pos_enc = self.pe_encoder(src=pos_enc, src_key_padding_mask=empty_mask[:, :, 0])\n            # (Num nodes) x (Num Eigenvectors) x hidden_dim\n        elif self.model_type == \"DeepSet\":\n            pos_enc = self.pe_encoder(pos_enc)\n            # (Num nodes) x (Num Eigenvectors) x hidden_dim\n        elif self.model_type == \"MLP\":\n            pos_enc = torch.flatten(pos_enc, start_dim=-2, end_dim=-1)\n            pos_enc = self.pe_encoder(pos_enc)\n            # (Num nodes) x hidden_dim\n        else:\n            raise ValueError(f\"Unexpected PE model {self.model_type}\")\n\n        if self.model_type in [\"Transformer\", \"DeepSet\"]:\n            # Mask out padded nodes\n            pos_enc[empty_mask[..., 0]] = 0  # (Num nodes) x (Num Eigenvectors) x hidden_dim\n            pos_enc = torch.sum(pos_enc, 1, keepdim=False)  # (Num nodes) x hidden_dim\n\n        # MLP post pooling\n        if self.post_mlp is not None:\n            pos_enc = self.post_mlp(pos_enc)  # (Num nodes) x out_dim\n\n        output = {key: pos_enc for key in self.output_keys}\n\n        return output\n\n    def 
make_mup_base_kwargs(\n        self,\n        divide_factor: float = 2.0,\n        factor_in_dim: bool = False,\n    ) -> Dict[str, Any]:\n        r\"\"\"\n        Create a 'base' model to be used by the `mup` or `muTransfer` scaling of the model.\n        The base model is usually identical to the regular model, but with the\n        layers width divided by a given factor (2 by default)\n\n        Parameter:\n            divide_factor: Factor by which to divide the width.\n            factor_in_dim: Whether to factor the input dimension\n        Returns:\n            Dictionary of kwargs to be used to create the base model.\n        \"\"\"\n        base_kwargs = super().make_mup_base_kwargs(divide_factor, factor_in_dim)\n        base_kwargs.update(\n            dict(\n                hidden_dim=round(self.hidden_dim / divide_factor),\n                model_type=self.model_type,\n                num_layers_post=self.num_layers_post,\n                dropout=self.dropout,\n                **self.model_kwargs,\n            )\n        )\n        return base_kwargs\n"
  },
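The sign-flip augmentation in `LapPENodeEncoder.forward` can be sketched without torch. `sign_flip_eigvecs` is a hypothetical list-based stand-in: Laplacian eigenvectors are only defined up to sign, so each eigenvector (a column of the node-by-eigenvector matrix) is randomly negated during training:

```python
import random

# Each eigenvector (column) is multiplied by +1 or -1 with equal probability,
# mirroring the `sign_flip` step of LapPENodeEncoder.forward
# (hypothetical list-based helper, not graphium API).

def sign_flip_eigvecs(eigvecs, rng=random):
    num_vecs = len(eigvecs[0])
    flips = [1.0 if rng.random() >= 0.5 else -1.0 for _ in range(num_vecs)]
    return [[v * f for v, f in zip(row, flips)] for row in eigvecs]

eigvecs = [[0.5, -0.1], [0.2, 0.3]]  # 2 nodes x 2 eigenvectors
flipped = sign_flip_eigvecs(eigvecs)
# Each column of `flipped` is either unchanged or fully negated
```

This augmentation teaches the downstream DeepSet/Transformer/MLP head to be invariant to the arbitrary sign of each eigenvector.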
  {
    "path": "graphium/nn/encoders/mlp_encoder.py",
    "content": "import torch\nimport torch.nn as nn\nfrom typing import Union, Callable, List, Dict, Any, Optional\nfrom torch_geometric.data import Batch\n\nfrom graphium.nn.base_layers import MLP, get_norm\nfrom graphium.nn.encoders.base_encoder import BaseEncoder\n\n\nclass MLPEncoder(BaseEncoder):\n    def __init__(\n        self,\n        input_keys: List[str],\n        output_keys: str,\n        in_dim: int,\n        hidden_dim: int,\n        out_dim: int,\n        num_layers: int,\n        activation: Union[str, Callable] = \"relu\",\n        dropout=0.0,\n        normalization=\"none\",\n        first_normalization=\"none\",\n        use_input_keys_prefix: bool = True,\n    ):\n        r\"\"\"\n        Configurable kernel-based Positional Encoding node/edge-level encoder.\n\n        Parameters:\n            input_keys: List of input keys to use from pyg batch graph\n            output_keys: List of output keys to add to the pyg batch graph\n            in_dim : input dimension of the mlp encoder\n            hidden_dim : hidden dimension of the mlp encoder\n            out_dim : output dimension of the mlp encoder\n            num_layers : number of layers of the mlp encoder\n            activation : activation function to use\n            dropout : dropout to use\n            normalization : normalization to use\n            first_normalization : normalization to use before the first layer\n            use_input_keys_prefix: Whether to use the `key_prefix` argument\n        \"\"\"\n        super().__init__(\n            input_keys=input_keys,\n            output_keys=output_keys,\n            in_dim=in_dim,\n            out_dim=out_dim,\n            num_layers=num_layers,\n            activation=activation,\n            first_normalization=first_normalization,\n            use_input_keys_prefix=use_input_keys_prefix,\n        )\n\n        # Check the output_keys\n        self.hidden_dim = hidden_dim\n        self.dropout = dropout\n        
self.normalization = normalization\n\n        # Initialize the MLP\n        self.pe_encoder = MLP(\n            in_dim=in_dim,\n            hidden_dims=hidden_dim,\n            out_dim=out_dim,\n            depth=num_layers,\n            dropout=dropout,\n            activation=activation,\n            last_activation=activation,\n            first_normalization=self.first_normalization,\n            normalization=normalization,\n            last_normalization=normalization,\n            last_dropout=dropout,\n        )\n\n    def parse_input_keys(\n        self,\n        input_keys: List[str],\n    ) -> List[str]:\n        r\"\"\"\n        Parse the `input_keys`.\n        Parameters:\n            input_keys: List of input keys to use from pyg batch graph\n        Returns:\n            parsed input_keys\n        \"\"\"\n        return input_keys\n\n    def parse_output_keys(\n        self,\n        output_keys: List[str],\n    ) -> List[str]:\n        r\"\"\"\n        Parse the `output_keys`.\n        Parameters:\n            output_keys: List of output keys to add to the pyg batch graph\n        Returns:\n            parsed output_keys\n        \"\"\"\n        assert len(output_keys) == len(\n            self.input_keys\n        ), f\"The number of input keys {len(self.input_keys)} and output keys {len(output_keys)} must be the same for the class {self.__class__.__name__}\"\n        for in_key, out_key in zip(self.input_keys, output_keys):\n            if in_key.startswith(\"edge_\") or out_key.startswith(\"edge_\"):\n                assert out_key.startswith(\"edge_\") and in_key.startswith(\n                    \"edge_\"\n                ), f\"The output key {out_key} and input key {in_key} must match the 'edge_' prefix for the class {self.__class__.__name__}\"\n            if in_key.startswith(\"nodepair_\") or out_key.startswith(\"nodepair_\"):\n                assert out_key.startswith(\"nodepair_\") and in_key.startswith(\n                    \"nodepair_\"\n 
               ), f\"The output key {out_key} and input key {in_key} must match the 'nodepair_' prefix for the class {self.__class__.__name__}\"\n            if in_key.startswith(\"graph_\") or out_key.startswith(\"graph_\"):\n                assert out_key.startswith(\"graph_\") and in_key.startswith(\n                    \"graph_\"\n                ), f\"The output key {out_key} and input key {in_key} must match the 'graph_' prefix for the class {self.__class__.__name__}\"\n        return output_keys\n\n    def forward(\n        self,\n        batch: Batch,\n        key_prefix: Optional[str] = None,\n    ) -> Dict[str, torch.Tensor]:\n        r\"\"\"\n        Forward function of the MLP encoder.\n        Parameters:\n            batch: pyg batch graph\n            key_prefix: Prefix to use for the input keys\n        Returns:\n            output: Dictionary of output embeddings with keys specified by output_keys\n        \"\"\"\n        # TODO: maybe we should also use the output key here? 
@Dom\n        # input_keys = self.parse_input_keys_with_prefix(key_prefix)\n\n        # TODO: maybe it makes sense to combine MLPEncoder and CatMLPEncoder into one class with\n        #   CatMLPEncoder being executed when the list input_keys contains more than one element.\n        #   Currently, the input_keys list can only contain one element in MLPEncoder.\n\n        # Run the MLP for each input key\n        output = {}\n        for input_key, output_key in zip(self.input_keys, self.output_keys):\n            output[output_key] = self.pe_encoder(batch[input_key])  # (Num nodes/edges) x dim_pe\n\n        return output\n\n    def make_mup_base_kwargs(self, divide_factor: float = 2.0, factor_in_dim: bool = False) -> Dict[str, Any]:\n        \"\"\"\n        Create a 'base' model to be used by the `mup` or `muTransfer` scaling of the model.\n        The base model is usually identical to the regular model, but with the\n        layers width divided by a given factor (2 by default)\n\n        Parameters:\n            divide_factor: Factor by which to divide the width.\n            factor_in_dim: Whether to factor the input dimension\n        Returns:\n            base_kwargs: Dictionary of kwargs to use for the base model\n        \"\"\"\n        base_kwargs = super().make_mup_base_kwargs(divide_factor=divide_factor, factor_in_dim=factor_in_dim)\n        base_kwargs.update(\n            dict(\n                hidden_dim=round(self.hidden_dim / divide_factor),\n                dropout=self.dropout,\n                normalization=self.normalization,\n            )\n        )\n        return base_kwargs\n\n\nclass CatMLPEncoder(BaseEncoder):\n    def __init__(\n        self,\n        input_keys: List[str],\n        output_keys: List[str],\n        in_dim: Union[int, List[int]],\n        hidden_dim: int,\n        out_dim: int,\n        num_layers: int,\n        activation: Union[str, Callable] = \"relu\",\n        dropout=0.0,\n        normalization=\"none\",\n        
first_normalization=\"none\",\n        use_input_keys_prefix: bool = True,\n    ):\n        r\"\"\"\n        Configurable kernel-based Positional Encoding node/edge-level encoder.\n        Concatenates the list of input (node or edge) features in the feature dimension\n\n        Parameters:\n            input_keys: List of input keys; inputs are concatenated in feat dimension and passed through mlp\n            output_keys: List of output keys to add to the pyg batch graph\n            in_dim : input dimension of the mlp encoder; sum of input dimensions of inputs\n            hidden_dim : hidden dimension of the mlp encoder\n            out_dim : output dimension of the mlp encoder\n            num_layers : number of layers of the mlp encoder\n            activation : activation function to use\n            dropout : dropout to use\n            normalization : normalization to use\n            first_normalization : normalization to use before the first layer\n            use_input_keys_prefix: Whether to use the `key_prefix` argument\n        \"\"\"\n        super().__init__(\n            input_keys=input_keys,\n            output_keys=output_keys,\n            in_dim=in_dim,\n            out_dim=out_dim,\n            num_layers=num_layers,\n            activation=activation,\n            first_normalization=first_normalization,\n            use_input_keys_prefix=use_input_keys_prefix,\n        )\n\n        if type(in_dim) is list:\n            in_dim = sum(in_dim)\n\n        # Check the output_keys\n        self.hidden_dim = hidden_dim\n        self.dropout = dropout\n        self.normalization = normalization\n\n        # Initialize the MLP\n        self.pe_encoder = MLP(\n            in_dim=in_dim,\n            hidden_dims=hidden_dim,\n            out_dim=out_dim,\n            depth=num_layers,\n            dropout=dropout,\n            activation=activation,\n            last_activation=activation,\n            first_normalization=self.first_normalization,\n    
        normalization=normalization,\n            last_normalization=normalization,\n            last_dropout=dropout,\n        )\n\n    def parse_input_keys(\n        self,\n        input_keys: List[str],\n    ) -> List[str]:\n        r\"\"\"\n        Parse the `input_keys`.\n        Parameters:\n            input_keys: List of input keys to use from pyg batch graph\n        Returns:\n            parsed input_keys\n        \"\"\"\n        return input_keys\n\n    def parse_output_keys(\n        self,\n        output_keys: List[str],\n    ) -> List[str]:\n        r\"\"\"\n        Parse the `output_keys`.\n        Parameters:\n            output_keys: List of output keys to add to the pyg batch graph\n        Returns:\n            parsed output_keys\n        \"\"\"\n\n        return output_keys\n\n    def forward(\n        self,\n        batch: Batch,\n        key_prefix: Optional[str] = None,\n    ) -> Dict[str, torch.Tensor]:\n        r\"\"\"\n        Forward function of the MLP encoder.\n        Parameters:\n            batch: pyg batch graph\n            key_prefix: Prefix to use for the input keys\n        Returns:\n            output: Dictionary of output embeddings with keys specified by output_keys\n        \"\"\"\n        # TODO: maybe we should also use the output key here? 
@Dom\n        # input_keys = self.parse_input_keys_with_prefix(key_prefix)\n\n        # Concatenate the selected positional encodings\n        input = torch.cat(\n            [batch[input_key] for input_key in self.input_keys], dim=-1\n        )  # [num_nodes/num_edges, sum(in_dims)]\n\n        output = {}\n        for output_key in self.output_keys:\n            output[output_key] = self.pe_encoder(input)\n\n        return output\n\n    def make_mup_base_kwargs(self, divide_factor: float = 2.0, factor_in_dim: bool = False) -> Dict[str, Any]:\n        \"\"\"\n        Create a 'base' model to be used by the `mup` or `muTransfer` scaling of the model.\n        The base model is usually identical to the regular model, but with the\n        layers width divided by a given factor (2 by default)\n\n        Parameters:\n            divide_factor: Factor by which to divide the width.\n            factor_in_dim: Whether to factor the input dimension\n        Returns:\n            base_kwargs: Dictionary of kwargs to use for the base model\n        \"\"\"\n        base_kwargs = super().make_mup_base_kwargs(divide_factor=divide_factor, factor_in_dim=factor_in_dim)\n        base_kwargs.update(\n            dict(\n                hidden_dim=round(self.hidden_dim / divide_factor),\n                dropout=self.dropout,\n                normalization=self.normalization,\n            )\n        )\n        return base_kwargs\n"
  },
  {
    "path": "graphium/nn/encoders/signnet_pos_encoder.py",
    "content": "\"\"\"\nSignNet https://arxiv.org/abs/2202.13013\nbased on https://github.com/cptq/SignNet-BasisNet\n\"\"\"\n\nfrom typing import Dict, Any, Optional, List\n\nimport torch\nimport torch.nn as nn\nfrom torch_geometric.nn import GINConv\nfrom torch_scatter import scatter\nfrom torch_geometric.data import Batch\n\nfrom graphium.nn.base_layers import MLP\nfrom graphium.nn.encoders.base_encoder import BaseEncoder\n\n\nclass SimpleGIN(nn.Module):\n    def __init__(\n        self, in_dim, hidden_dim, out_dim, num_layers, normalization=\"none\", dropout=0.5, activation=\"relu\"\n    ):\n        r\"\"\"\n        not supported yet\n        \"\"\"\n        # TODO: Not sure this works? Needs updating.\n        super().__init__()\n        self.layers = nn.ModuleList()\n        self.normalization = normalization\n        # input layer\n        update_net = MLP(\n            in_dim,\n            hidden_dim,\n            hidden_dim,\n            2,\n            normalization=normalization,\n            dropout=dropout,\n            activation=activation,\n            last_normalization=normalization,\n            last_dropout=dropout,\n        )\n        self.layers.append(GINConv(update_net))\n        # hidden layers\n        for i in range(num_layers - 2):\n            update_net = MLP(\n                hidden_dim,\n                hidden_dim,\n                hidden_dim,\n                2,\n                normalization=normalization,\n                dropout=dropout,\n                activation=activation,\n                last_normalization=normalization,\n                last_dropout=dropout,\n            )\n            self.layers.append(GINConv(update_net))\n        # output layer\n        update_net = MLP(\n            hidden_dim,\n            hidden_dim,\n            out_dim,\n            2,\n            normalization=normalization,\n            dropout=dropout,\n            activation=activation,\n            last_normalization=normalization,\n          
  last_dropout=dropout,\n        )\n        self.layers.append(GINConv(update_net))\n        self.dropout = nn.Dropout(p=dropout)\n\n    def forward(self, x, edge_index):\n        for ii, layer in enumerate(self.layers):\n            x = layer(x, edge_index)\n        return x\n\n\nclass GINDeepSigns(nn.Module):\n    \"\"\"Sign invariant neural network with MLP aggregation.\n    f(v1, ..., vk) = rho(enc(v1) + enc(-v1), ..., enc(vk) + enc(-vk))\n    \"\"\"\n\n    def __init__(\n        self,\n        in_channels,\n        hidden_channels,\n        out_channels,\n        num_layers,\n        k,\n        dim_pe,\n        rho_num_layers,\n        normalization=\"none\",\n        dropout=0.5,\n        activation=\"relu\",\n    ):\n        # TODO: Not sure this works? Needs updating.\n        super().__init__()\n        self.enc = SimpleGIN(\n            in_channels,\n            hidden_channels,\n            out_channels,\n            num_layers,\n            normalization=normalization,\n            dropout=dropout,\n            activation=activation,\n        )\n        rho_dim = out_channels * k\n        self.rho = MLP(\n            rho_dim,\n            hidden_channels,\n            dim_pe,\n            rho_num_layers,\n            normalization=normalization,\n            dropout=dropout,\n            activation=activation,\n        )\n\n    def forward(self, x, edge_index, batch_index):\n        N = x.shape[0]  # Total number of nodes in the batch.\n        x = x.transpose(0, 1)  # N x K x In -> K x N x In\n        x = self.enc(x, edge_index) + self.enc(-x, edge_index)\n        x = x.transpose(0, 1).reshape(N, -1)  # K x N x Out -> N x (K * Out)\n        x = self.rho(x)  # N x dim_pe (Note: in the original codebase dim_pe is always K)\n        return x\n\n\nclass MaskedGINDeepSigns(nn.Module):\n    \"\"\"Sign invariant neural network with sum pooling and DeepSet.\n    f(v1, ..., vk) = rho(enc(v1) + enc(-v1), ..., enc(vk) + enc(-vk))\n    \"\"\"\n\n    def 
__init__(\n        self,\n        in_channels,\n        hidden_channels,\n        out_channels,\n        num_layers,\n        dim_pe,\n        rho_num_layers,\n        normalization=\"none\",\n        dropout=0.5,\n        activation=\"relu\",\n    ):\n        # TODO: Not sure this works? Needs updating.\n        super().__init__()\n        self.enc = SimpleGIN(\n            in_channels,\n            hidden_channels,\n            out_channels,\n            num_layers,\n            normalization=normalization,\n            dropout=dropout,\n            activation=activation,\n        )\n        self.rho = MLP(\n            out_channels,\n            hidden_channels,\n            dim_pe,\n            rho_num_layers,\n            normalization=normalization,\n            dropout=dropout,\n            activation=activation,\n        )\n\n    def batched_n_nodes(self, batch_index):\n        batch_size = batch_index.max().item() + 1\n        one = batch_index.new_ones(batch_index.size(0))\n        n_nodes = scatter(\n            one, batch_index, dim=0, dim_size=batch_size, reduce=\"add\"\n        )  # Number of nodes in each graph.\n        n_nodes = n_nodes.unsqueeze(1)\n        return torch.cat([size * n_nodes.new_ones(size) for size in n_nodes])\n\n    def forward(self, x, edge_index, batch_index):\n        N = x.shape[0]  # Total number of nodes in the batch.\n        K = x.shape[1]  # Max. 
number of eigenvectors / frequencies.\n        x = x.transpose(0, 1)  # N x K x In -> K x N x In\n        x = self.enc(x, edge_index) + self.enc(-x, edge_index)  # K x N x Out\n        x = x.transpose(0, 1)  # K x N x Out -> N x K x Out\n\n        batched_num_nodes = self.batched_n_nodes(batch_index)\n        mask = torch.cat([torch.arange(K).unsqueeze(0) for _ in range(N)])\n        mask = (mask.to(batch_index.device) < batched_num_nodes.unsqueeze(1)).bool()\n        x[~mask] = 0\n        x = x.sum(dim=1)  # (sum over K) -> N x Out\n        x = self.rho(x)  # N x Out -> N x dim_pe (Note: in the original codebase dim_pe is always K)\n        return x\n\n\nclass SignNetNodeEncoder(BaseEncoder):\n    r\"\"\"SignNet Positional Embedding node encoder.\n    https://arxiv.org/abs/2202.13013\n    https://github.com/cptq/SignNet-BasisNet\n\n    Uses a precomputed Laplacian eigen-decomposition, but instead\n    of eigen-vector sign flipping + DeepSet/Transformer, computes the PE as:\n    SignNetPE(v_1, ... , v_k) = \\rho ( [\\phi(v_i) + \\phi(-v_i)]^k_i=1 )\n    where \\phi is a GIN network applied to the k first non-trivial eigenvectors, and\n    \\rho is an MLP if k is a constant, but if all eigenvectors are used then\n    \\rho is a DeepSet with sum-pooling.\n\n    SignNetPE of size dim_pe will get appended to each node feature vector.\n\n    Args:\n        dim_emb: Size of final node embedding\n    \"\"\"\n\n    def __init__(\n        self,\n        input_keys: List[str],\n        output_keys: List[str],\n        in_dim,  # Size of PE embedding\n        hidden_dim,\n        out_dim,\n        num_layers,  # Num. layers in \\phi GNN part\n        model_type,  # 'MLP' or 'DeepSet'\n        max_freqs,  # Num. eigenvectors (frequencies)\n        use_input_keys_prefix: bool = True,\n        num_layers_post=0,  # Num. layers in \\rho MLP/DeepSet\n        activation=\"relu\",\n        dropout=0.0,\n        normalization=\"none\",\n    ):\n        # TODO: Not sure this works? 
Needs updating.\n        super().__init__(\n            input_keys=input_keys,\n            output_keys=output_keys,\n            in_dim=in_dim,\n            out_dim=out_dim,\n            num_layers=num_layers,\n            activation=activation,\n            use_input_keys_prefix=use_input_keys_prefix,\n        )\n\n        # Store the model hyper-parameters\n        self.hidden_dim = hidden_dim\n        self.model_type = model_type\n        self.max_freqs = max_freqs\n        self.num_layers_post = num_layers_post\n        self.dropout = dropout\n        self.normalization = normalization\n\n        if model_type not in [\"MLP\", \"DeepSet\"]:\n            raise ValueError(f\"Unexpected SignNet model {model_type}\")\n\n        if num_layers_post < 1:\n            raise ValueError(\"Num layers in rho model has to be positive.\")\n\n        if out_dim - in_dim < 1:\n            raise ValueError(\n                f\"SignNet PE size {in_dim} is too large for \" f\"desired embedding size of {out_dim}.\"\n            )\n\n        # Sign invariant neural network.\n        if self.model_type == \"MLP\":\n            self.sign_inv_net = GINDeepSigns(\n                in_channels=1,\n                hidden_channels=hidden_dim,\n                out_channels=hidden_dim,\n                num_layers=num_layers,\n                k=max_freqs,\n                dim_pe=in_dim,\n                rho_num_layers=num_layers_post,\n                normalization=normalization,\n                dropout=dropout,\n                activation=activation,\n            )\n        elif self.model_type == \"DeepSet\":\n            self.sign_inv_net = MaskedGINDeepSigns(\n                in_channels=1,\n                hidden_channels=hidden_dim,\n                out_channels=out_dim,\n                num_layers=num_layers,\n                dim_pe=in_dim,\n                rho_num_layers=num_layers_post,\n                normalization=normalization,\n                
dropout=dropout,\n                activation=activation,\n            )\n        else:\n            raise ValueError(f\"Unexpected model {self.model_type}\")\n\n    def parse_input_keys(self, input_keys):\n        if len(input_keys) != 1:\n            raise ValueError(f\"`{self.__class__}` only supports 1 key\")\n        for key in input_keys:\n            assert not key.startswith(\n                \"edge_\"\n            ), f\"Input keys must be node features, not edge features, for encoder {self.__class__}\"\n            assert not key.startswith(\n                \"graph_\"\n            ), f\"Input keys must be node features, not graph features, for encoder {self.__class__}\"\n        return input_keys\n\n    def parse_output_keys(self, output_keys):\n        for key in output_keys:\n            assert not key.startswith(\n                \"edge_\"\n            ), f\"Edge encodings are not supported for encoder {self.__class__}\"\n            assert not key.startswith(\n                \"graph_\"\n            ), f\"Graph encodings are not supported for encoder {self.__class__}\"\n        return output_keys\n\n    def forward(self, batch: Batch, key_prefix: Optional[str] = None) -> Dict[str, torch.Tensor]:\n        input_keys = self.parse_input_keys_with_prefix(key_prefix)\n        eigvecs, edge_index, batch_index = batch[input_keys[0]], batch[\"edge_index\"], batch[\"batch_index\"]\n        pos_enc = eigvecs.unsqueeze(-1)  # (Num nodes) x (Num Eigenvectors) x 1\n\n        empty_mask = torch.isnan(pos_enc)\n        pos_enc[empty_mask] = 0  # (Num nodes) x (Num Eigenvectors) x 1\n\n        # SignNet\n        pos_enc = self.sign_inv_net(pos_enc, edge_index, batch_index)  # (Num nodes) x (pos_enc_dim)\n\n        output = {key: pos_enc for key in self.output_keys}\n\n        return output\n\n    def make_mup_base_kwargs(self, divide_factor: float = 2.0, factor_in_dim: bool = False) -> Dict[str, Any]:\n        \"\"\"\n        Create a 'base' model to be used by the 
`mup` or `muTransfer` scaling of the model.\n        The base model is usually identical to the regular model, but with the\n        layers width divided by a given factor (2 by default)\n\n        # TODO: Update this. It is broken\n\n        Parameters:\n            divide_factor: Factor by which to divide the width.\n            factor_in_dim: Whether to factor the input dimension\n        \"\"\"\n        base_kwargs = super().make_mup_base_kwargs(divide_factor=divide_factor, factor_in_dim=factor_in_dim)\n        base_kwargs.update(\n            dict(\n                hidden_dim=round(self.hidden_dim / divide_factor),\n                model_type=self.model_type,\n                max_freqs=self.max_freqs,\n                num_layers_post=self.num_layers_post,\n                dropout=self.dropout,\n                normalization=self.normalization,\n            )\n        )\n        return base_kwargs\n"
  },
  {
    "path": "graphium/nn/ensemble_layers.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals.\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Union, Callable, Optional, Type, Tuple, Iterable\nfrom copy import deepcopy\nfrom loguru import logger\n\n\nimport torch\nimport torch.nn as nn\nimport mup.init as mupi\nfrom mup import set_base_shapes\n\nfrom graphium.nn.base_layers import FCLayer, MLP\n\n\nclass EnsembleLinear(nn.Module):\n    def __init__(\n        self,\n        in_dim: int,\n        out_dim: int,\n        num_ensemble: int,\n        bias: bool = True,\n        init_fn: Optional[Callable] = None,\n    ):\n        r\"\"\"\n        Multiple linear layers applied in parallel via batched matrix multiplication with `torch.matmul`.\n\n        Parameters:\n            in_dim:\n                Input dimension of the linear layers\n            out_dim:\n                Output dimension of the linear layers.\n            num_ensemble:\n                Number of linear layers in the ensemble.\n            bias:\n                Whether to add a learnable bias to the linear layers.\n            init_fn:\n                Initialization function for the weights. Defaults to `mup.init.xavier_uniform_`.\n        \"\"\"\n        super(EnsembleLinear, self).__init__()\n\n        # Initialize weight and bias as learnable parameters\n        self.weight = nn.Parameter(torch.Tensor(num_ensemble, out_dim, in_dim))\n        if bias:\n            self.bias = nn.Parameter(torch.Tensor(num_ensemble, 1, out_dim))\n        else:\n            self.register_parameter(\"bias\", None)\n\n        # Initialize parameters\n        self.init_fn = init_fn if init_fn is not None else 
mupi.xavier_uniform_\n        self.reset_parameters()\n\n    def reset_parameters(self):\n        \"\"\"\n        Reset the parameters of the linear layer using the `init_fn`.\n        \"\"\"\n        set_base_shapes(self, None, rescale_params=False)  # Set the shapes of the tensors, useful for mup\n        # Initialize weight using the provided initialization function\n        self.init_fn(self.weight)\n\n        # Initialize bias if present\n        if self.bias is not None:\n            nn.init.zeros_(self.bias)\n\n    def forward(self, h: torch.Tensor) -> torch.Tensor:\n        r\"\"\"\n        Apply the batched linear transformation on the input features.\n\n        Parameters:\n                h: `torch.Tensor[B, Din]` or `torch.Tensor[..., 1, B, Din]` or `torch.Tensor[..., L, B, Din]`:\n                    Input feature tensor, before the batched linear transformation.\n                    `Din` is the number of input features, `B` is the batch size, and `L` is the number of linear layers.\n\n        Returns:\n                `torch.Tensor[..., L, B, Dout]`:\n                    Output feature tensor, after the batched linear transformation.\n                    `Dout` is the number of output features, `B` is the batch size, and `L` is the number of linear layers.\n        \"\"\"\n\n        # Perform the linear transformation using torch.matmul\n        h = torch.matmul(self.weight, h.transpose(-1, -2)).transpose(-1, -2)\n\n        # Add bias if present\n        if self.bias is not None:\n            h += self.bias\n\n        return h\n\n\nclass EnsembleFCLayer(FCLayer):\n    def __init__(\n        self,\n        in_dim: int,\n        out_dim: int,\n        num_ensemble: int,\n        activation: Union[str, Callable] = \"relu\",\n        dropout: float = 0.0,\n        normalization: Union[str, Callable] = \"none\",\n        bias: bool = True,\n        init_fn: Optional[Callable] = None,\n        is_readout_layer: bool = False,\n        droppath_rate: float 
= 0.0,\n    ):\n        r\"\"\"\n        Multiple fully connected layers running in parallel.\n        This layer is centered around a `torch.nn.Linear` module.\n        The order in which transformations are applied is:\n\n        - Dense Layer\n        - Activation\n        - Dropout (if applicable)\n        - Batch Normalization (if applicable)\n\n        Parameters:\n            in_dim:\n                Input dimension of the layer (the `torch.nn.Linear`)\n            out_dim:\n                Output dimension of the layer.\n            num_ensemble:\n                Number of linear layers in the ensemble.\n            dropout:\n                The ratio of units to dropout. No dropout by default.\n            activation:\n                Activation function to use.\n            normalization:\n                Normalization to use. Choices:\n\n                - \"none\" or `None`: No normalization\n                - \"batch_norm\": Batch normalization\n                - \"layer_norm\": Layer normalization\n                - `Callable`: Any callable function\n            bias:\n                Whether to enable bias in the linear layer.\n            init_fn:\n                Initialization function to use for the weight of the layer. 
Default is\n                $$\\mathcal{U}(-\\sqrt{k}, \\sqrt{k})$$ with $$k=\\frac{1}{\\text{in_dim}}$$\n            is_readout_layer: Whether the layer should be treated as a readout layer by replacing the `torch.nn.Linear`\n                with `mup.MuReadout` from the muTransfer method https://github.com/microsoft/mup\n\n            droppath_rate:\n                stochastic depth drop rate, between 0 and 1, see https://arxiv.org/abs/1603.09382\n        Attributes:\n            dropout (int):\n                The ratio of units to dropout.\n            normalization (None or Callable):\n                Normalization layer\n            linear (`torch.nn.Linear`):\n                The linear layer\n            activation (`torch.nn.Module`):\n                The activation layer\n            init_fn (Callable):\n                Initialization function used for the weight of the layer\n            in_dim (int):\n                Input dimension of the linear layer\n            out_dim (int):\n                Output dimension of the linear layer\n        \"\"\"\n\n        super().__init__(\n            in_dim=in_dim,\n            out_dim=out_dim,\n            activation=activation,\n            dropout=dropout,\n            normalization=normalization,\n            bias=bias,\n            init_fn=init_fn,\n            is_readout_layer=is_readout_layer,\n            droppath_rate=droppath_rate,\n        )\n\n        # Linear layer, or MuReadout layer\n        if not is_readout_layer:\n            self.linear = EnsembleLinear(\n                in_dim, out_dim, num_ensemble=num_ensemble, bias=bias, init_fn=init_fn\n            )\n        else:\n            self.linear = EnsembleMuReadoutGraphium(in_dim, out_dim, num_ensemble=num_ensemble, bias=bias)\n\n        self.reset_parameters()\n\n    def reset_parameters(self, init_fn=None):\n        \"\"\"\n        Reset the parameters of the linear layer using the `init_fn`.\n        \"\"\"\n        set_base_shapes(self, None, 
rescale_params=False)  # Set the shapes of the tensors, useful for mup\n        self.linear.reset_parameters()\n\n    def __repr__(self):\n        rep = super().__repr__()\n        rep = rep[:-1] + f\", num_ensemble={self.linear.weight.shape[0]})\"\n        return rep\n\n\nclass EnsembleMuReadoutGraphium(EnsembleLinear):\n    \"\"\"\n    This layer implements an ensemble version of μP with a 1/width multiplier and a\n    constant variance initialization for both weights and biases.\n    \"\"\"\n\n    def __init__(\n        self,\n        in_dim: int,\n        out_dim: int,\n        num_ensemble: int,\n        bias: bool = True,\n        init_fn: Optional[Callable] = None,\n        readout_zero_init=False,\n        output_mult=1.0,\n    ):\n        self.in_dim = in_dim\n        self.output_mult = output_mult\n        self.readout_zero_init = readout_zero_init\n        self._base_width = in_dim\n        super().__init__(\n            in_dim=in_dim,\n            out_dim=out_dim,\n            num_ensemble=num_ensemble,\n            bias=bias,\n            init_fn=init_fn,\n        )\n\n    def reset_parameters(self) -> None:\n        if self.readout_zero_init:\n            self.weight.data[:] = 0\n            if self.bias is not None:\n                self.bias.data[:] = 0\n        else:\n            super().reset_parameters()\n\n    def width_mult(self):\n        assert hasattr(self.weight, \"infshape\"), (\n            \"Please call set_base_shapes(...). 
If using torch.nn.DataParallel, \"\n            \"switch to distributed training with \"\n            \"torch.nn.parallel.DistributedDataParallel instead\"\n        )\n        return self.weight.infshape.width_mult()\n\n    def _rescale_parameters(self):\n        \"\"\"Rescale parameters to convert SP initialization to μP initialization.\n\n        Warning: This method is NOT idempotent and should be called only once\n        unless you know what you are doing.\n        \"\"\"\n        if hasattr(self, \"_has_rescaled_params\") and self._has_rescaled_params:\n            raise RuntimeError(\n                \"`_rescale_parameters` has been called once before already. \"\n                \"Unless you know what you are doing, usually you should not be calling `_rescale_parameters` more than once.\\n\"\n                \"If you called `set_base_shapes` on a model loaded from a checkpoint, \"\n                \"or just want to re-set the base shapes of an existing model, \"\n                \"make sure to set the flag `rescale_params=False`.\\n\"\n                \"To bypass this error and *still rescale parameters*, set `self._has_rescaled_params=False` before this call.\"\n            )\n        if self.bias is not None:\n            self.bias.data *= self.width_mult() ** 0.5\n        self.weight.data *= self.width_mult() ** 0.5\n        self._has_rescaled_params = True\n\n    def forward(self, x):\n        return super().forward(self.output_mult * x / self.width_mult())\n\n    @property\n    def absolute_width(self):\n        return float(self.in_dim)\n\n    @property\n    def base_width(self):\n        return self._base_width\n\n    @base_width.setter\n    def base_width(self, val):\n        if val is None:\n            return\n        assert isinstance(\n            val, (int, torch.int, torch.long)\n        ), f\"`base_width` must be None, int or long, provided {val} of type {type(val)}\"\n        self._base_width = val\n\n    def width_mult(self):\n        
return self.absolute_width / self.base_width\n\n\nclass EnsembleMLP(MLP):\n    def __init__(\n        self,\n        in_dim: int,\n        hidden_dims: Union[Iterable[int], int],\n        out_dim: int,\n        num_ensemble: int,\n        depth: Optional[int] = None,\n        reduction: Optional[Union[str, Callable]] = \"none\",\n        activation: Union[str, Callable] = \"relu\",\n        last_activation: Union[str, Callable] = \"none\",\n        dropout: float = 0.0,\n        last_dropout: float = 0.0,\n        normalization: Union[Type[None], str, Callable] = \"none\",\n        last_normalization: Union[Type[None], str, Callable] = \"none\",\n        first_normalization: Union[Type[None], str, Callable] = \"none\",\n        last_layer_is_readout: bool = False,\n        droppath_rate: float = 0.0,\n        constant_droppath_rate: bool = True,\n    ):\n        r\"\"\"\n        Simple multi-layer perceptron, built of a series of FCLayers\n\n        Parameters:\n            in_dim:\n                Input dimension of the MLP\n            hidden_dims:\n                Either an integer specifying all the hidden dimensions,\n                or a list of dimensions in the hidden layers.\n            out_dim:\n                Output dimension of the MLP.\n            num_ensemble:\n                Number of MLPs that run in parallel.\n            depth:\n                If `hidden_dims` is an integer, `depth` is 1 + the number of\n                hidden layers to use.\n                If `hidden_dims` is a list, then\n                `depth` must be `None` or equal to `len(hidden_dims) + 1`\n            reduction:\n                Reduction to use at the end of the MLP. 
Choices:\n\n                - \"none\" or `None`: No reduction\n                - \"mean\": Mean reduction\n                - \"sum\": Sum reduction\n                - \"max\": Max reduction\n                - \"min\": Min reduction\n                - \"median\": Median reduction\n                - `Callable`: Any callable function. Must take `dim` as a keyword argument.\n            activation:\n                Activation function to use in all the layers except the last.\n                If `layers==1`, this parameter is ignored\n            last_activation:\n                Activation function to use in the last layer.\n            dropout:\n                The ratio of units to dropout. Must be between 0 and 1\n            normalization:\n                Normalization to use. Choices:\n\n                - \"none\" or `None`: No normalization\n                - \"batch_norm\": Batch normalization\n                - \"layer_norm\": Layer normalization in the hidden layers.\n                - `Callable`: Any callable function\n\n                If `layers==1`, this parameter is ignored\n            last_normalization:\n                Normalization to use **after the last layer**. Same options as `normalization`.\n            first_normalization:\n                Normalization to use **before the first layer**. 
Same options as `normalization`.\n            last_dropout:\n                The ratio of units to dropout at the last layer.\n            last_layer_is_readout: Whether the last layer should be treated as a readout layer.\n                Allows using the `mup.MuReadout` from the muTransfer method https://github.com/microsoft/mup\n            droppath_rate:\n                Stochastic depth drop rate, between 0 and 1.\n                See https://arxiv.org/abs/1603.09382\n            constant_droppath_rate:\n                If `True`, drop rates will remain constant across layers.\n                Otherwise, drop rates will vary stochastically.\n                See `DropPath.get_stochastic_drop_rate`\n        \"\"\"\n\n        super().__init__(\n            in_dim=in_dim,\n            hidden_dims=hidden_dims,\n            out_dim=out_dim,\n            depth=depth,\n            activation=activation,\n            last_activation=last_activation,\n            dropout=dropout,\n            last_dropout=last_dropout,\n            normalization=normalization,\n            last_normalization=last_normalization,\n            first_normalization=first_normalization,\n            last_layer_is_readout=last_layer_is_readout,\n            droppath_rate=droppath_rate,\n            constant_droppath_rate=constant_droppath_rate,\n            fc_layer=EnsembleFCLayer,\n            fc_layer_kwargs={\"num_ensemble\": num_ensemble},\n        )\n\n        self.reduction_fn = self._parse_reduction(reduction)\n\n    def _parse_reduction(self, reduction: Optional[Union[str, Callable]]) -> Optional[Callable]:\n        r\"\"\"\n        Parse the reduction argument.\n        \"\"\"\n\n        if isinstance(reduction, str):\n            reduction = reduction.lower()\n        if reduction is None or reduction == \"none\":\n            return None\n        elif reduction == \"mean\":\n            return torch.mean\n        elif reduction == \"sum\":\n            return torch.sum\n        
elif reduction == \"max\":\n\n            def max_vals(x, dim):\n                return torch.max(x, dim=dim).values\n\n            return max_vals\n        elif reduction == \"min\":\n\n            def min_vals(x, dim):\n                return torch.min(x, dim=dim).values\n\n            return min_vals\n        elif reduction == \"median\":\n\n            def median_vals(x, dim):\n                return torch.median(x, dim=dim).values\n\n            return median_vals\n        elif callable(reduction):\n            return reduction\n        else:\n            raise ValueError(f\"Unknown reduction {reduction}\")\n\n    def forward(self, h: torch.Tensor) -> torch.Tensor:\n        r\"\"\"\n        Apply the ensemble MLP on the input features, then reduce the output if specified.\n\n        Parameters:\n\n            h: `torch.Tensor[B, Din]` or `torch.Tensor[..., 1, B, Din]` or `torch.Tensor[..., L, B, Din]`:\n\n                Input feature tensor, before the MLP.\n                `Din` is the number of input features, `B` is the batch size, and `L` is the number of ensembles.\n\n        Returns:\n\n            `torch.Tensor[..., L, B, Dout]` or `torch.Tensor[..., B, Dout]`:\n\n                Output feature tensor, after the MLP.\n                `Dout` is the number of output features, `B` is the batch size, and `L` is the number of ensembles.\n                `L` is removed if a reduction is specified.\n        \"\"\"\n        h = super().forward(h)\n        if self.reduction_fn is not None:\n            h = self.reduction_fn(h, dim=-3)\n        return h\n\n    def __repr__(self):\n        r\"\"\"\n        Controls how the class is printed\n        \"\"\"\n        rep = super().__repr__()\n        rep = rep[:-1] + f\", num_ensemble={self.layers[0].linear.weight.shape[0]})\"\n        return rep\n"
  },
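The `EnsembleMLP` above runs `num_ensemble` MLPs in parallel and, when a reduction is specified, collapses the ensemble axis with `self.reduction_fn(h, dim=-3)`. A minimal torch-free sketch of that reduction semantics, using nested lists in place of tensors (the helper name `reduce_ensemble` is illustrative, not part of the library):

```python
# Sketch of EnsembleMLP's output reduction: given stacked outputs of shape
# [L, B, D] (L ensembles, batch size B, D features), reduce over the L axis.
# Pure-Python stand-in for `reduction_fn(h, dim=-3)`; names are illustrative.

def reduce_ensemble(h, reduction="mean"):
    # h: nested list with shape [L][B][D]
    if reduction in (None, "none"):
        return h  # keep the ensemble axis, shape stays [L][B][D]
    L, B, D = len(h), len(h[0]), len(h[0][0])
    ops = {
        "mean": lambda vals: sum(vals) / len(vals),
        "sum": sum,
        "max": max,
        "min": min,
    }
    op = ops[reduction]
    # Collapse the leading ensemble axis, leaving shape [B][D]
    return [[op([h[l][b][d] for l in range(L)]) for d in range(D)] for b in range(B)]
```

For example, `reduce_ensemble([[[1.0, 2.0]], [[3.0, 4.0]]], "mean")` averages two ensemble members into `[[2.0, 3.0]]`, mirroring how the `"mean"` choice removes the `L` dimension in the real forward pass.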
  {
    "path": "graphium/nn/pyg_layers/README.md",
    "content": "<div align=\"center\">\n    <img src=\"../../docs/images/logo-title.png\" height=\"80px\">\n    <h3>The Graph Of Life Library.</h3>\n</div>\n\n\n## What is in this folder?\n\nCode for the different PyG GNN layers and graph transformer layers in this library.\n\n- ✅ `pna_pyg.py`: `PNAMessagePassingPyg` class implements the PNA layer; a good example script for new GNN layers\n- ✅ `gps_pyg.py`: `GPSLayerPyg` class is the implementation of the GraphGPS layer in PyG, a graph transformer layer\n- `gated_gcn_pyg.py`: `GatedGCNPyg` class for the gated GCN layer implementation.\n- `gin_pyg.py`: GIN and GINE layer implementations in the `GINConvPyg` and `GINEConvPyg` classes respectively\n- `mpnn_pyg.py`: `MPNNPlusPyg` class implements the MPNN layer\n- `pooling_pyg.py`: pooling layers in PyG\n"
  },
  {
    "path": "graphium/nn/pyg_layers/__init__.py",
    "content": "from .gcn_pyg import GCNConvPyg\nfrom .gated_gcn_pyg import GatedGCNPyg\nfrom .gin_pyg import GINConvPyg\nfrom .gin_pyg import GINEConvPyg\nfrom .pna_pyg import PNAMessagePassingPyg\nfrom .mpnn_pyg import MPNNPlusPyg\nfrom .gps_pyg import GPSLayerPyg\nfrom .pooling_pyg import scatter_logsum_pool\nfrom .pooling_pyg import scatter_std_pool\nfrom .pooling_pyg import parse_pooling_layer_pyg\nfrom .pooling_pyg import VirtualNodePyg\nfrom .dimenet_pyg import DimeNetPyg\n"
  },
  {
    "path": "graphium/nn/pyg_layers/dimenet_pyg.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Callable, Union, Optional, Tuple\nfrom functools import partial\n\nimport torch\nimport torch.nn as nn\nfrom torch import Tensor\n\nfrom torch_geometric.data import Data, Batch\nfrom torch_geometric.utils import scatter\n\nfrom graphium.nn.base_graph_layer import BaseGraphModule\nfrom graphium.utils.decorators import classproperty\nfrom graphium.nn.pyg_layers.utils import triplets\nfrom graphium.nn.base_layers import MLP, FCLayer\n\n\nclass ResidualLayer(torch.nn.Module):\n    r\"\"\"Modified from\n    https://pytorch-geometric.readthedocs.io/en/latest/_modules/torch_geometric/nn/models/dimenet.html\n    (align style with our codebase)\n    \"\"\"\n\n    def __init__(self, hidden_channels: int, activation: Union[Callable, str]):\n        super().__init__()\n        self.lin1 = FCLayer(hidden_channels, hidden_channels, activation=activation)\n        self.lin2 = FCLayer(hidden_channels, hidden_channels, activation=activation)\n\n    def forward(self, x: Tensor) -> Tensor:\n        return x + self.lin2(self.lin1(x))\n\n\nclass OutputBlock(torch.nn.Module):\n    r\"\"\"Modified from\n    https://pytorch-geometric.readthedocs.io/en/latest/_modules/torch_geometric/nn/models/dimenet.html\n    (align style)\n    \"\"\"\n\n    def __init__(\n        self,\n        num_radial: int,\n        
hidden_channels: int,\n        out_channels: int,\n        num_layers: int,\n        activation: Union[Callable, str],\n    ):\n        super().__init__()\n\n        self.lin_rbf = FCLayer(num_radial, hidden_channels, bias=False, activation=None)\n        self.lins = nn.ModuleList()\n        for _ in range(num_layers):\n            self.lins.append(FCLayer(hidden_channels, hidden_channels, activation=activation))\n        self.lin = FCLayer(hidden_channels, out_channels, bias=False, activation=None)\n\n    def forward(self, x: Tensor, rbf: Tensor, i: Tensor, num_nodes: Optional[int] = None) -> Tensor:\n        x = self.lin_rbf(rbf) * x\n        x = scatter(x, i, dim=0, dim_size=num_nodes, reduce=\"sum\")\n        for lin in self.lins:\n            x = lin(x)\n        return self.lin(x)\n\n\nclass InteractionBlock(nn.Module):\n    r\"\"\"Modified from\n    https://pytorch-geometric.readthedocs.io/en/latest/_modules/torch_geometric/nn/models/dimenet.html\n    (align style; add output linear layer to allow change of dimension)\n    \"\"\"\n\n    def __init__(\n        self,\n        hidden_dim: int,\n        output_dim: int,\n        num_bilinear: int,\n        num_spherical: int,\n        num_radial: int,\n        num_before_skip: int,\n        num_after_skip: int,\n        activation: Union[Callable, str],\n    ):\n        super().__init__()\n\n        self.lin_rbf = FCLayer(num_radial, hidden_dim, activation=None, bias=False)\n        self.lin_sbf = FCLayer(num_spherical * num_radial, num_bilinear, activation=None, bias=False)\n\n        # Dense transformations of input messages.\n        self.lin_kj = FCLayer(hidden_dim, hidden_dim, activation=activation)\n        self.lin_ji = FCLayer(hidden_dim, hidden_dim, activation=activation)\n\n        self.W = nn.Parameter(torch.Tensor(hidden_dim, num_bilinear, hidden_dim))\n        self.W.data.normal_(mean=0, std=2 / self.W.size(0))\n\n        self.layers_before_skip = nn.ModuleList(\n            
[ResidualLayer(hidden_dim, activation) for _ in range(num_before_skip)]\n        )\n        self.lin = FCLayer(hidden_dim, hidden_dim, activation=activation)\n        self.layers_after_skip = nn.ModuleList(\n            [ResidualLayer(hidden_dim, activation) for _ in range(num_after_skip)]\n        )\n        self.lin_out = FCLayer(hidden_dim, output_dim, activation=activation)\n\n    def forward(self, x: Tensor, rbf: Tensor, sbf: Tensor, idx_kj: Tensor, idx_ji: Tensor) -> Tensor:\n        \"\"\"\n        Parameters:\n            x: edge features after encodings [num_edges, hidden_dim]\n            rbf: bessel rbf of edges [num_edges, num_radial]\n            sbf: spherical bessel rbf of triplets [num_triplet, num_spherical * num_radial]\n            idx_kj: indices in edge of triplets [num_triplet] (value range from 0 to num_edges)\n            idx_ji: indices in edge of triplets [num_triplet] (value range from 0 to num_edges)\n        \"\"\"\n        rbf = self.lin_rbf(rbf)  # [num_edges, hidden_dim]\n        sbf = self.lin_sbf(sbf)  # [num_triplet, hidden_dim]\n\n        x_ji = self.lin_ji(x)\n        x_kj = self.lin_kj(x)\n        x_kj = x_kj * rbf\n\n        x_kj = torch.einsum(\"wj,wl,ijl->wi\", sbf, x_kj[idx_kj], self.W)\n        x_kj = scatter(x_kj, idx_ji, dim=0, dim_size=x.size(0), reduce=\"sum\")\n\n        h = x_ji + x_kj\n        for layer in self.layers_before_skip:\n            h = layer(h)\n        h = self.lin(h) + x\n        for layer in self.layers_after_skip:\n            h = layer(h)\n        return self.lin_out(h)  # [num_edges, output_dim]\n\n\nclass DimeNetPyg(BaseGraphModule):\n    def __init__(\n        self,\n        in_dim: int,\n        out_dim: int,\n        in_dim_edges: int,\n        out_dim_edges: int,\n        num_bilinear: int,\n        num_spherical: int,\n        num_radial: int,\n        num_before_skip: int = 1,\n        num_after_skip: int = 2,\n        num_output_layers: int = 3,\n        activation: Union[Callable, str] = 
\"relu\",\n        dropout: float = 0.0,\n        normalization: Union[str, Callable] = \"none\",\n        **kwargs,\n    ):\n        r\"\"\"\n        `\"Directional Message Passing for Molecular Graphs\" <https://arxiv.org/abs/2003.03123> paper.\n\n        Parameters:\n\n            in_dim:\n                Input feature dimensions of the layer\n\n            out_dim:\n                Output feature dimensions of the layer\n\n            in_dim_edges:\n                Input feature dimensions of the edges\n\n            out_dim_edges:\n                Output feature dimensions of the edges\n\n            num_bilinear:\n                The dimension of bilinear layer in the interaction block\n\n            num_spherical:\n                The number of spherical harmonics\n\n            num_radial:\n                The number of radial basis functions\n\n            num_before_skip:\n                The number of residual layers before skip connection (default: 1)\n\n            num_after_skip:\n                The number of residual layers after skip connection (default: 2)\n\n            num_output_layers:\n                The number of output layers for a single output block (default: 3)\n\n            activation:\n                activation function to use in the layer\n\n            dropout:\n                The ratio of units to dropout. Must be between 0 and 1\n\n            normalization:\n                Normalization to use. 
Choices:\n\n                - \"none\" or `None`: No normalization\n                - \"batch_norm\": Batch normalization\n                - \"layer_norm\": Layer normalization\n                - `Callable`: Any callable function\n\n        \"\"\"\n\n        super().__init__(\n            in_dim=in_dim,\n            out_dim=out_dim,\n            activation=activation,\n            dropout=dropout,\n            normalization=normalization,\n            **kwargs,\n        )\n\n        # transform old node feature\n        self.node_model = MLP(\n            in_dim=in_dim,\n            hidden_dims=in_dim,\n            out_dim=out_dim,\n            depth=2,\n            activation=activation,\n            normalization=self.normalization,\n        )\n\n        # update edge feature\n        self.interaction_block = InteractionBlock(\n            in_dim_edges,\n            out_dim_edges,\n            num_bilinear,\n            num_spherical,\n            num_radial,\n            num_before_skip,\n            num_after_skip,\n            activation,\n        )\n        # updated edge feature -> new node feature\n        self.output_block = OutputBlock(\n            num_radial=num_radial,\n            hidden_channels=out_dim_edges,\n            out_channels=out_dim,\n            num_layers=num_output_layers,\n            activation=activation,\n        )\n\n    def forward(self, batch: Union[Data, Batch]) -> Union[Data, Batch]:\n        r\"\"\"\n        forward function of the layer\n        Parameters:\n            batch: pyg Batch graphs to pass through the layer\n        Returns:\n            batch: pyg Batch graphs\n        \"\"\"\n        assert (\n            \"radius_edge_index\" in batch\n        ), \"radius_edge_index not in batch, make sure to use a 3D encoder first\"\n   
     # (j, i) = edge_index\n        i, j, idx_i, idx_j, idx_k, idx_kj, idx_ji = triplets(\n            batch.radius_edge_index, num_nodes=batch.feat.size(0)\n        )\n        x, P = batch.edge_feat, batch.feat\n        rbf, sbf = batch.edge_rbf, batch.triplet_sbf\n\n        # apply MLP to node embeddings\n        P = self.node_model(P)  # [num_nodes, out_dim]\n\n        # rbf and sbf should be computed during pos encoder\n        x = self.interaction_block(x, rbf, sbf, idx_kj, idx_ji)\n        P = P + self.output_block(x, rbf, i, num_nodes=batch.feat.size(0))  # [num_nodes, out_dim]\n\n        batch.edge_feat = x  # updated edge features\n        batch.feat = P  # updated node features\n\n        return batch\n\n    ############################################################################################################\n    @classproperty\n    def layer_supports_edges(cls) -> bool:\n        r\"\"\"\n        Return a boolean specifying if the layer type supports edges or not.\n\n        Returns:\n\n            supports_edges: bool\n                Always ``True`` for the current class\n        \"\"\"\n        return True\n\n    @property\n    def layer_inputs_edges(self) -> bool:\n        r\"\"\"\n        Return a boolean specifying if the layer type\n        uses edges as input or not.\n        It is different from ``layer_supports_edges`` since a layer that\n        supports edges can decide to not use them.\n\n        Returns:\n\n            bool:\n                Always ``True`` for the current class\n        \"\"\"\n        return True\n\n    @property\n    def layer_outputs_edges(self) -> bool:\n        r\"\"\"\n        Abstract method. 
Return a boolean specifying if the layer type\n        outputs edges or not.\n        It is different from ``layer_supports_edges`` since a layer that\n        supports edges can decide to not output them.\n\n        Returns:\n\n            bool:\n                Always ``True`` for the current class\n        \"\"\"\n        return True\n\n    @property\n    def out_dim_factor(self) -> int:\n        r\"\"\"\n        Get the factor by which the output dimension is multiplied for\n        the next layer.\n\n        For standard layers, this will return ``1``.\n\n        But for others, such as ``GatLayer``, the output is the concatenation\n        of the outputs from each head, so the out_dim gets multiplied by\n        the number of heads, and this function should return the number\n        of heads.\n\n        Returns:\n\n            int:\n                Always ``1`` for the current class\n        \"\"\"\n        return 1\n"
  },
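`OutputBlock.forward` above turns per-edge messages into per-node features with `scatter(x, i, dim=0, dim_size=num_nodes, reduce="sum")`: every edge's feature vector is added into the row of its target node. A torch-free sketch of that index-based sum aggregation (the helper name `scatter_sum` is illustrative, not the PyG API):

```python
# Torch-free sketch of the scatter-sum used in OutputBlock:
# out[index[e]] += x[e] for every edge e, with `dim_size` rows in the output.

def scatter_sum(x, index, dim_size):
    # x: list of per-edge feature vectors; index: target node id per edge
    width = len(x[0]) if x else 0
    out = [[0.0] * width for _ in range(dim_size)]
    for row, i in zip(x, index):
        for d, v in enumerate(row):
            out[i][d] += v
    return out
```

For example, three edges targeting nodes `[0, 1, 0]` accumulate into three node rows: `scatter_sum([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]], [0, 1, 0], 3)` gives `[[6.0, 8.0], [3.0, 4.0], [0.0, 0.0]]`, with node 2 receiving no messages and staying at zero.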
  {
    "path": "graphium/nn/pyg_layers/gated_gcn_pyg.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals.\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Union, Callable\nfrom functools import partial\n\nimport torch\nimport torch.nn as nn\n\nfrom torch_geometric.nn.conv import MessagePassing\nfrom torch_scatter import scatter\nfrom torch_geometric.data import Data, Batch\n\nfrom graphium.nn.base_graph_layer import BaseGraphStructure, check_intpus_allow_int\nfrom graphium.nn.base_layers import FCLayer\nfrom graphium.utils.decorators import classproperty\n\n\nclass GatedGCNPyg(MessagePassing, BaseGraphStructure):\n    def __init__(\n        self,\n        in_dim: int,\n        out_dim: int,\n        in_dim_edges: int,\n        out_dim_edges: int = None,\n        activation: Union[Callable, str] = \"relu\",\n        dropout: float = 0.0,\n        normalization: Union[str, Callable] = \"none\",\n        eps: float = 1e-5,\n        **kwargs,\n    ):\n        r\"\"\"\n        ResGatedGCN: Residual Gated Graph ConvNets\n        An Experimental Study of Neural Networks for Variable Graphs (Xavier Bresson and Thomas Laurent, ICLR 2018)\n        https://arxiv.org/pdf/1711.07553v2.pdf\n\n        Parameters:\n\n            in_dim:\n                Input feature dimensions of the layer\n\n            out_dim:\n                Output feature dimensions of the layer, and for the edges\n\n            in_dim_edges:\n                Input edge-feature dimensions of the layer\n\n            activation:\n  
              activation function to use in the layer\n\n            dropout:\n                The ratio of units to dropout. Must be between 0 and 1\n\n            normalization:\n                Normalization to use. Choices:\n\n                - \"none\" or `None`: No normalization\n                - \"batch_norm\": Batch normalization\n                - \"layer_norm\": Layer normalization\n                - `Callable`: Any callable function\n\n            eps:\n                Epsilon value for the normalization `sum(gate_weights * messages) / (sum(gate_weights) + eps)`,\n                where `gate_weights` are the weights of the gates and follow a sigmoid function.\n\n        \"\"\"\n        MessagePassing.__init__(self, aggr=\"add\", flow=\"source_to_target\", node_dim=-2)\n        BaseGraphStructure.__init__(\n            self,\n            in_dim=in_dim,\n            out_dim=out_dim,\n            activation=activation,\n            dropout=dropout,\n            normalization=normalization,\n            **kwargs,\n        )\n\n        self._initialize_activation_dropout_norm()\n\n        # Allow int32 in the edge_index\n        self.__check_input__ = partial(check_intpus_allow_int, self)\n        if out_dim_edges is None:\n            out_dim_edges = in_dim_edges\n\n        # Initialize the layers for the gating\n        self.A = FCLayer(in_dim, out_dim, activation=None, bias=True)\n        self.B = FCLayer(in_dim, out_dim, activation=None, bias=True)\n        self.C = FCLayer(in_dim_edges, out_dim, activation=None, bias=True)\n        self.D = FCLayer(in_dim, out_dim, activation=None, bias=True)\n        self.E = FCLayer(in_dim, out_dim, activation=None, bias=True)\n\n        self.edge_out = FCLayer(\n            in_dim=out_dim, out_dim=out_dim_edges, activation=None, dropout=dropout, bias=True\n        )\n        self.eps = eps\n\n    def forward(\n        self,\n        batch: Union[Data, Batch],\n    ) -> Union[Data, Batch]:\n        r\"\"\"\n        
Forward pass the Gated GCN layer\n        extract the following from the batch:\n        x, node features with dim [n_nodes, in_dim]\n        e, edge features with dim [n_edges, in_dim]\n        edge_index with dim [2, n_edges]\n\n        Parameters:\n            batch: pyg Batch graph to pass through the layer\n        Returns:\n            batch: pyg Batch graph\n        \"\"\"\n\n        x, e, edge_index = batch.feat, batch.edge_feat, batch.edge_index\n\n        # Apply the linear layers\n        Ax = self.A(x)\n        Bx = self.B(x)\n        Ce = self.C(e)\n        Dx = self.D(x)\n        Ex = self.E(x)\n\n        # Propagate, and apply norm, activation, dropout\n        x, e = self.propagate(edge_index, Bx=Bx, Dx=Dx, Ex=Ex, Ce=Ce, e=e, Ax=Ax)\n        x = self.apply_norm_activation_dropout(x, batch_idx=batch.batch)\n        e = self.edge_out(e)\n\n        # Output\n        batch.feat = x\n        batch.edge_feat = e\n\n        return batch\n\n    def message(self, Dx_i: torch.Tensor, Ex_j: torch.Tensor, Ce: torch.Tensor) -> torch.Tensor:\n        \"\"\"\n        message function\n        Parameters:\n            Dx_i: tensor with dimension [n_edges, out_dim]\n            Ex_j: tensor with dimension [n_edges, out_dim]\n            Ce: tensor with dimension [n_edges, out_dim]\n        Returns:\n            sigma_ij: tensor with dimension [n_edges, out_dim]\n        \"\"\"\n        e_ij = Dx_i + Ex_j + Ce\n        sigma_ij = torch.sigmoid(e_ij)\n\n        self.e = e_ij\n        return sigma_ij\n\n    def aggregate(\n        self,\n        sigma_ij: torch.Tensor,\n        index: torch.Tensor,\n        Bx_j: torch.Tensor,\n        Bx: torch.Tensor,\n    ) -> torch.Tensor:\n        r\"\"\"\n        aggregation function of the layer\n        Parameters:\n            sigma_ij: the output from message() function with dim [n_edges, out_dim]\n            index: dim [n_edges]\n            Bx_j: dim [n_edges, out_dim]\n            Bx: dim [n_nodes, out_dim]\n        
Returns:\n            out: dim [n_nodes, out_dim]\n        \"\"\"\n        dim_size = Bx.shape[0]\n\n        # Sum the messages, weighted by the gates. Sum the gates.\n        numerator_eta_xj = scatter(sigma_ij * Bx_j, index, 0, None, dim_size, reduce=\"sum\")\n        denominator_eta_xj = scatter(sigma_ij, index, 0, None, dim_size, reduce=\"sum\")\n\n        # Cast to float32 if needed\n        dtype = denominator_eta_xj.dtype\n        if dtype == torch.float16:\n            numerator_eta_xj = numerator_eta_xj.to(dtype=torch.float32)\n            denominator_eta_xj = denominator_eta_xj.to(dtype=torch.float32)\n\n        # Normalize the messages by the sum of the gates\n        out = numerator_eta_xj / (denominator_eta_xj + self.eps)\n\n        # Cast back to float16 if needed\n        if dtype == torch.float16:\n            out = out.to(dtype=dtype)\n        return out\n\n    def update(self, aggr_out: torch.Tensor, Ax: torch.Tensor):\n        r\"\"\"\n        update function of the layer\n        Parameters:\n            aggr_out: the output from aggregate() function with dim [n_nodes, out_dim]\n            Ax: tensor with dim [n_nodes, out_dim]\n        Returns:\n            x: dim [n_nodes, out_dim]\n            e_out: dim [n_edges, out_dim_edges]\n        \"\"\"\n        x = Ax + aggr_out\n        e_out = self.e\n        del self.e\n        return x, e_out\n\n    @classproperty\n    def layer_supports_edges(cls) -> bool:\n        r\"\"\"\n        Return a boolean specifying if the layer type supports edges or not.\n\n        Returns:\n            bool:\n                Always ``True`` for the current class\n        \"\"\"\n        return True\n\n    @property\n    def layer_inputs_edges(self) -> bool:\n        r\"\"\"\n        Return a boolean specifying if the layer type\n        uses edges as input or not.\n        It is different from ``layer_supports_edges`` since a layer that\n        supports edges can decide to not use them.\n\n        Returns:\n       
     bool:\n                Always ``True`` for the current class\n        \"\"\"\n        return True\n\n    @property\n    def layer_outputs_edges(self) -> bool:\n        r\"\"\"\n        Abstract method. Return a boolean specifying if the layer type\n        outputs edges or not.\n        It is different from ``layer_supports_edges`` since a layer that\n        supports edges can decide to not output them.\n\n        Returns:\n            bool:\n                Always ``True`` for the current class\n        \"\"\"\n        return True\n\n    @property\n    def out_dim_factor(self) -> int:\n        r\"\"\"\n        Get the factor by which the output dimension is multiplied for\n        the next layer.\n\n        For standard layers, this will return ``1``.\n\n        But for others, such as ``GatLayer``, the output is the concatenation\n        of the outputs from each head, so the out_dim gets multiplied by\n        the number of heads, and this function should return the number\n        of heads.\n\n        Returns:\n\n            int:\n                Always ``1`` for the current class\n        \"\"\"\n        return 1\n"
  },
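As the `eps` docstring in `GatedGCNPyg` states, messages are combined as `sum(gate_weights * messages) / (sum(gate_weights) + eps)`, where each gate is a sigmoid of `Dx_i + Ex_j + Ce` (the `message` function). A scalar, plain-Python sketch of that gated aggregation for a single destination node (the helper name `gated_aggregate` is illustrative, not part of the library):

```python
import math

# Scalar sketch of GatedGCN's gated aggregation for one destination node i:
# gates sigma_ij = sigmoid(Dx_i + Ex_j + Ce), and incoming messages Bx_j are
# combined as sum(gate * msg) / (sum(gate) + eps), matching the eps docstring.

def gated_aggregate(Dx_i, Ex_j, Ce, Bx_j, eps=1e-5):
    # One gate per incoming edge (j -> i)
    gates = [1.0 / (1.0 + math.exp(-(Dx_i + e_x + c))) for e_x, c in zip(Ex_j, Ce)]
    numerator = sum(g * m for g, m in zip(gates, Bx_j))
    denominator = sum(gates) + eps
    return numerator / denominator
```

With all gate inputs at zero every gate is 0.5, so `gated_aggregate(0.0, [0.0, 0.0], [0.0, 0.0], [1.0, 3.0])` is approximately the plain mean `2.0`; strongly positive gate inputs push the gates toward 1 and the result toward a weighted sum normalized by the edge count.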
  {
    "path": "graphium/nn/pyg_layers/gcn_pyg.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals.\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Callable, Union\nfrom functools import partial\n\nimport torch_geometric.nn as pyg_nn\nfrom torch_geometric.data import Data, Batch\n\nfrom graphium.nn.base_graph_layer import BaseGraphModule, check_intpus_allow_int\nfrom graphium.utils.decorators import classproperty\n\n\nclass GCNConvPyg(BaseGraphModule):\n    def __init__(\n        self,\n        in_dim: int,\n        out_dim: int,\n        activation: Union[Callable, str] = \"relu\",\n        dropout: float = 0.0,\n        normalization: Union[str, Callable] = \"none\",\n        **kwargs,\n    ):\n        super().__init__(\n            in_dim=in_dim,\n            out_dim=out_dim,\n            activation=activation,\n            dropout=dropout,\n            normalization=normalization,\n            **kwargs,\n        )\n\n        self.model = pyg_nn.GCNConv(\n            in_channels=self.in_dim, out_channels=out_dim, add_self_loops=False, normalize=False\n        )\n        self.model.__check_input__ = partial(check_intpus_allow_int, self)\n\n    def forward(\n        self,\n        batch: Union[Data, Batch],\n    ) -> Union[Data, Batch]:\n        r\"\"\"\n        forward function of the layer\n        Parameters:\n            batch: pyg Batch graphs to pass through the layer\n        Returns:\n            batch: pyg Batch graphs\n        \"\"\"\n        batch.feat = 
self.model(batch.feat, batch.edge_index)\n        batch.feat = self.apply_norm_activation_dropout(batch.feat, batch_idx=batch.batch)\n\n        return batch\n\n    @classproperty\n    def layer_supports_edges(cls) -> bool:\n        r\"\"\"\n        Return a boolean specifying if the layer type supports edges or not.\n\n        Returns:\n\n            supports_edges: bool\n                Always ``False`` for the current class\n        \"\"\"\n        return False\n\n    @property\n    def layer_inputs_edges(self) -> bool:\n        r\"\"\"\n        Return a boolean specifying if the layer type\n        uses edges as input or not.\n        It is different from ``layer_supports_edges`` since a layer that\n        supports edges can decide to not use them.\n\n        Returns:\n\n            bool:\n                Always ``False`` for the current class\n        \"\"\"\n        return False\n\n    @property\n    def layer_outputs_edges(self) -> bool:\n        r\"\"\"\n        Abstract method. Return a boolean specifying if the layer type\n        outputs edges or not.\n        It is different from ``layer_supports_edges`` since a layer that\n        supports edges can decide to not output them.\n\n        Returns:\n\n            bool:\n                Always ``False`` for the current class\n        \"\"\"\n        return False\n\n    @property\n    def out_dim_factor(self) -> int:\n        r\"\"\"\n        Get the factor by which the output dimension is multiplied for\n        the next layer.\n\n        For standard layers, this will return ``1``.\n\n        But for others, such as ``GatLayer``, the output is the concatenation\n        of the outputs from each head, so the out_dim gets multiplied by\n        the number of heads, and this function should return the number\n        of heads.\n\n        Returns:\n\n            int:\n                Always ``1`` for the current class\n        \"\"\"\n        return 1\n"
  },
  {
    "path": "graphium/nn/pyg_layers/gin_pyg.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals.\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Callable, Union, Optional\nfrom functools import partial\n\nimport torch_geometric.nn as pyg_nn\nfrom torch_geometric.data import Data, Batch\n\nfrom graphium.nn.base_graph_layer import BaseGraphModule, check_intpus_allow_int\nfrom graphium.nn.base_layers import MLP\nfrom graphium.utils.decorators import classproperty\n\n\nclass GINConvPyg(BaseGraphModule):\n    def __init__(\n        self,\n        in_dim: int,\n        out_dim: int,\n        activation: Union[Callable, str] = \"relu\",\n        dropout: float = 0.0,\n        normalization: Union[str, Callable] = \"none\",\n        **kwargs,\n    ):\n        r\"\"\"\n        GIN: Graph Isomorphism Networks\n        HOW POWERFUL ARE GRAPH NEURAL NETWORKS? (Keyulu Xu, Weihua Hu, Jure Leskovec and Stefanie Jegelka, ICLR 2019)\n        https://arxiv.org/pdf/1810.00826.pdf\n\n        [!] code uses the pytorch-geometric implementation of GINConv\n\n        Parameters:\n\n            in_dim:\n                Input feature dimensions of the layer\n\n            out_dim:\n                Output feature dimensions of the layer\n\n            activation:\n                activation function to use in the layer\n\n            dropout:\n                The ratio of units to dropout. Must be between 0 and 1\n\n            normalization:\n                Normalization to use. 
Choices:\n\n                - \"none\" or `None`: No normalization\n                - \"batch_norm\": Batch normalization\n                - \"layer_norm\": Layer normalization\n                - `Callable`: Any callable function\n\n            init_eps :\n                Initial :math:`\\epsilon` value, default: ``0``.\n\n            learn_eps :\n                If True, :math:`\\epsilon` will be a learnable parameter.\n\n        \"\"\"\n\n        super().__init__(\n            in_dim=in_dim,\n            out_dim=out_dim,\n            activation=activation,\n            dropout=dropout,\n            normalization=normalization,\n            **kwargs,\n        )\n\n        gin_nn = MLP(\n            in_dim=self.in_dim,\n            hidden_dims=self.in_dim,\n            out_dim=self.out_dim,\n            depth=2,\n            activation=self.activation_layer,\n            last_activation=\"none\",\n            normalization=self.normalization,\n            last_normalization=\"none\",\n        )\n\n        self.model = pyg_nn.GINConv(gin_nn)\n        self.model.__check_input__ = partial(check_intpus_allow_int, self)\n\n    def forward(\n        self,\n        batch: Union[Data, Batch],\n    ) -> Union[Data, Batch]:\n        r\"\"\"\n        forward function of the layer\n        Parameters:\n            batch: pyg Batch graphs to pass through the layer\n        Returns:\n            batch: pyg Batch graphs\n        \"\"\"\n        batch.feat = self.model(batch.feat, batch.edge_index)\n        batch.feat = self.apply_norm_activation_dropout(batch.feat, batch_idx=batch.batch)\n\n        return batch\n\n    @classproperty\n    def layer_supports_edges(cls) -> bool:\n        r\"\"\"\n        Return a boolean specifying if the layer type supports edges or not.\n\n        Returns:\n\n            supports_edges: bool\n                Always ``False`` for the current class\n        \"\"\"\n        return False\n\n    @property\n    def layer_inputs_edges(self) -> bool:\n    
    r\"\"\"\n        Return a boolean specifying if the layer type\n        uses edges as input or not.\n        It is different from ``layer_supports_edges`` since a layer that\n        supports edges can decide to not use them.\n\n        Returns:\n\n            bool:\n                Always ``False`` for the current class\n        \"\"\"\n        return False\n\n    @property\n    def layer_outputs_edges(self) -> bool:\n        r\"\"\"\n        Abstract method. Return a boolean specifying if the layer type\n        uses edges as input or not.\n        It is different from ``layer_supports_edges`` since a layer that\n        supports edges can decide to not use them.\n\n        Returns:\n\n            bool:\n                Always ``False`` for the current class\n        \"\"\"\n        return False\n\n    @property\n    def out_dim_factor(self) -> int:\n        r\"\"\"\n        Get the factor by which the output dimension is multiplied for\n        the next layer.\n\n        For standard layers, this will return ``1``.\n\n        But for others, such as ``GatLayer``, the output is the concatenation\n        of the outputs from each head, so the out_dim gets multiplied by\n        the number of heads, and this function should return the number\n        of heads.\n\n        Returns:\n\n            int:\n                Always ``1`` for the current class\n        \"\"\"\n        return 1\n\n\nclass GINEConvPyg(BaseGraphModule):\n    def __init__(\n        self,\n        in_dim: int,\n        out_dim: int,\n        in_dim_edges: Optional[int] = None,\n        activation: Union[Callable, str] = \"relu\",\n        dropout: float = 0.0,\n        normalization: Union[str, Callable] = \"none\",\n        **kwargs,\n    ):\n        r\"\"\"\n        GINE: Graph Isomorphism Networks with Edges\n        Strategies for Pre-training Graph Neural Networks\n        Weihua Hu, Bowen Liu, Joseph Gomes, Marinka Zitnik, Percy Liang, Vijay Pande, Jure Leskovec\n        
https://arxiv.org/abs/1905.12265\n\n        [!] code uses the pytorch-geometric implementation of GINEConv\n\n        Parameters:\n\n            in_dim:\n                Input feature dimensions of the layer\n\n            out_dim:\n                Output feature dimensions of the layer\n\n            in_dim_edges:\n                Input edge-feature dimensions of the layer\n\n            activation:\n                activation function to use in the layer\n\n            dropout:\n                The ratio of units to dropout. Must be between 0 and 1\n\n            normalization:\n                Normalization to use. Choices:\n\n                - \"none\" or `None`: No normalization\n                - \"batch_norm\": Batch normalization\n                - \"layer_norm\": Layer normalization\n                - `Callable`: Any callable function\n\n            init_eps :\n                Initial :math:`\\epsilon` value, default: ``0``.\n\n            learn_eps :\n                If True, :math:`\\epsilon` will be a learnable parameter.\n\n        \"\"\"\n\n        super().__init__(\n            in_dim=in_dim,\n            out_dim=out_dim,\n            activation=activation,\n            dropout=dropout,\n            normalization=normalization,\n            **kwargs,\n        )\n\n        gin_nn = MLP(\n            in_dim=self.in_dim,\n            hidden_dims=self.in_dim,\n            out_dim=self.out_dim,\n            depth=2,\n            activation=self.activation_layer,\n            last_activation=\"none\",\n            normalization=self.normalization,\n            last_normalization=\"none\",\n        )\n\n        self.model = pyg_nn.GINEConv(gin_nn, edge_dim=in_dim_edges)  # , node_dim=-1)\n        self.model.__check_input__ = partial(check_intpus_allow_int, self)\n\n    def forward(\n        self,\n        batch: Union[Data, Batch],\n    ) -> Union[Data, Batch]:\n        r\"\"\"\n        forward function of the layer\n        Parameters:\n            batch: pyg Batch graphs to pass through the layer\n        Returns:\n            batch: pyg Batch graphs\n        
\"\"\"\n        batch.feat = self.model(batch.feat, batch.edge_index, batch.edge_feat)\n        batch.feat = self.apply_norm_activation_dropout(batch.feat, batch_idx=batch.batch)\n\n        return batch\n\n    @classproperty\n    def layer_supports_edges(cls) -> bool:\n        r\"\"\"\n        Return a boolean specifying if the layer type supports edges or not.\n\n        Returns:\n\n            supports_edges: bool\n                Always ``True`` for the current class\n        \"\"\"\n        return True\n\n    @property\n    def layer_inputs_edges(self) -> bool:\n        r\"\"\"\n        Return a boolean specifying if the layer type\n        uses edges as input or not.\n        It is different from ``layer_supports_edges`` since a layer that\n        supports edges can decide to not use them.\n\n        Returns:\n\n            bool:\n                Always ``True`` for the current class\n        \"\"\"\n        return True\n\n    @property\n    def layer_outputs_edges(self) -> bool:\n        r\"\"\"\n        Abstract method. Return a boolean specifying if the layer type\n        uses edges as input or not.\n        It is different from ``layer_supports_edges`` since a layer that\n        supports edges can decide to not use them.\n\n        Returns:\n\n            bool:\n                Always ``False`` for the current class\n        \"\"\"\n        return False\n\n    @property\n    def out_dim_factor(self) -> int:\n        r\"\"\"\n        Get the factor by which the output dimension is multiplied for\n        the next layer.\n\n        For standard layers, this will return ``1``.\n\n        But for others, such as ``GatLayer``, the output is the concatenation\n        of the outputs from each head, so the out_dim gets multiplied by\n        the number of heads, and this function should return the number\n        of heads.\n\n        Returns:\n\n            int:\n                Always ``1`` for the current class\n        \"\"\"\n        return 1\n"
  },
  {
    "path": "graphium/nn/pyg_layers/gps_pyg.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nimport torch\nfrom copy import deepcopy\nfrom typing import Callable, Union, Optional, Dict, Any\nfrom torch.nn import Module\nfrom torch import Tensor\nfrom torch_geometric.data import Batch\nfrom graphium.nn.base_graph_layer import BaseGraphModule\nfrom graphium.nn.base_layers import FCLayer, MultiheadAttentionMup, MLP\nfrom graphium.nn.pyg_layers import (\n    GatedGCNPyg,\n    GINConvPyg,\n    GINEConvPyg,\n    PNAMessagePassingPyg,\n    MPNNPlusPyg,\n)\nfrom graphium.data.utils import get_keys\nfrom graphium.utils.decorators import classproperty\nfrom graphium.ipu.to_dense_batch import (\n    to_dense_batch,\n    to_sparse_batch,\n    to_packed_dense_batch,\n    to_sparse_batch_from_packed,\n)\nfrom graphium.ipu.ipu_utils import is_running_on_ipu\n\nPYG_LAYERS_DICT = {\n    \"pyg:gin\": GINConvPyg,\n    \"pyg:gine\": GINEConvPyg,\n    \"pyg:gated-gcn\": GatedGCNPyg,\n    \"pyg:pna-msgpass\": PNAMessagePassingPyg,\n    \"pyg:mpnnplus\": MPNNPlusPyg,\n}\n\nATTENTION_LAYERS_DICT = {\n    \"full-attention\": MultiheadAttentionMup,\n    \"none\": None,\n}\n\n\nclass GPSLayerPyg(BaseGraphModule):\n    def __init__(\n        self,\n        in_dim: int,\n        out_dim: int,\n        in_dim_edges: Optional[int] = None,\n        out_dim_edges: Optional[int] = None,\n        activation: Union[Callable, str] = 
\"relu\",\n        dropout: float = 0.0,\n        node_residual: Optional[bool] = True,\n        normalization: Union[str, Callable] = \"none\",\n        mpnn_type: str = \"pyg:gine\",\n        mpnn_kwargs: Optional[dict] = None,\n        attn_type: str = \"full-attention\",\n        precision: str = \"32\",\n        biased_attention_key: Optional[str] = None,\n        attn_kwargs=None,\n        force_consistent_in_dim: bool = True,\n        droppath_rate_attn: float = 0.0,\n        droppath_rate_ffn: float = 0.0,\n        hidden_dim_scaling: float = 4.0,\n        output_scale: float = 1.0,\n        **kwargs,\n    ):\n        r\"\"\"\n        GPS layer implementation in pyg\n        adapted from https://github.com/rampasek/GraphGPS/blob/main/graphgps/layer/gps_layer.py\n        GPS: Recipe for a General, Powerful, Scalable Graph Transformer\n        Ladislav Rampášek, Mikhail Galkin, Vijay Prakash Dwivedi, Anh Tuan Luu, Guy Wolf, Dominique Beaini\n        https://arxiv.org/abs/2205.12454\n\n        GPS++: An Optimised Hybrid MPNN/Transformer for Molecular Property Prediction\n        Dominic Masters, Josef Dean, Kerstin Klaser, Zhiyi Li, Sam Maddrell-Mander, Adam Sanders, Hatem Helal, Deniz Beker, Ladislav Rampášek, Dominique Beaini\n        https://arxiv.org/abs/2212.02229\n\n        Parameters:\n\n            in_dim:\n                Input node feature dimensions of the layer\n\n            out_dim:\n                Output node feature dimensions of the layer\n\n            in_dim_edges:\n                input edge-feature dimensions of the layer\n\n            out_dim_edges:\n                output edge-feature dimensions of the layer\n\n            activation:\n                activation function to use in the layer\n\n            dropout:\n                The ratio of units to dropout. 
Must be between 0 and 1\n\n            node_residual:\n                Whether to apply a residual connection to the output of the MPNN layer\n\n            normalization:\n                Normalization to use. Choices:\n\n                - \"none\" or `None`: No normalization\n                - \"batch_norm\": Batch normalization\n                - \"layer_norm\": Layer normalization\n                - `Callable`: Any callable function\n\n            mpnn_type:\n                Type of MPNN layer to use. Choices specified in PYG_LAYERS_DICT:\n                \"pyg:gin\", \"pyg:gine\", \"pyg:gated-gcn\", \"pyg:pna-msgpass\" and \"pyg:mpnnplus\"\n\n            mpnn_kwargs:\n                Keyword arguments to pass to the MPNN layer\n\n            attn_type:\n                Type of attention layer to use. Choices specified in ATTENTION_LAYERS_DICT:\n                \"full-attention\" and \"none\"\n\n            precision:\n                The floating-point precision to use, e.g. \"32\" or \"16\"\n\n            biased_attention_key:\n                indicates if biased attention is used by specifying a key corresponding to the pyg attribute in the batch (processed by the gaussian kernel encoder)\n                default: None means biased attention is not used\n\n            attn_kwargs:\n                Keyword arguments to pass to the attention layer\n\n            force_consistent_in_dim:\n                whether to force the `embed_dim` to be the same as the `in_dim` for the attention and mpnn.\n                The argument is only valid if `attn_type` is not None. If `embed_dim` is not provided,\n                it will be set to `in_dim` by default, so this parameter won't have an effect.\n\n            droppath_rate_attn:\n                stochastic depth drop rate for attention layer https://arxiv.org/abs/1603.09382\n\n            droppath_rate_ffn:\n                stochastic depth drop rate for ffn layer https://arxiv.org/abs/1603.09382\n\n            hidden_dim_scaling:\n                Factor by which `in_dim` is multiplied to obtain the hidden dimension of the output MLP\n\n            output_scale:\n                Float value that will be used to scale the activations, helps reduce growth of activations\n                as the model gets deeper. Default value of 1.0 leaves the layer unchanged.\n\n        \"\"\"\n\n        super().__init__(\n            in_dim=in_dim,\n            out_dim=out_dim,\n            activation=activation,\n            dropout=dropout,\n            normalization=normalization,\n            droppath_rate=droppath_rate_attn,\n            **kwargs,\n        )\n        # Set the other attributes\n        self.in_dim_edges = in_dim_edges\n        self.out_dim_edges = out_dim_edges\n\n        # Dropout layers\n        self.dropout_local = self.dropout_layer\n        self.dropout_attn = self._parse_dropout(dropout=self.dropout)\n\n        # DropPath layers\n        self.droppath_ffn = self._parse_droppath(droppath_rate_ffn)\n\n        # Residual connections\n        self.node_residual = node_residual\n\n        self.precision = precision\n\n        # MLP applied at the end of the GPS layer\n        self.mlp = MLP(\n            in_dim=in_dim,\n            hidden_dims=int(hidden_dim_scaling * in_dim),\n            out_dim=in_dim,\n            depth=2,\n            activation=activation,\n            dropout=self.dropout,\n            last_dropout=self.dropout,\n        )\n        self.f_out = FCLayer(in_dim, out_dim, normalization=normalization)\n\n        # Normalization layers\n        self.norm_layer_local = self._parse_norm(normalization=self.normalization, dim=in_dim)\n        self.norm_layer_attn = 
self._parse_norm(normalization=self.normalization, dim=in_dim)\n        self.norm_layer_ff = self._parse_norm(self.normalization)\n\n        self.biased_attention_key = biased_attention_key\n        # Initialize the MPNN and Attention layers\n        self.mpnn = self._parse_mpnn_layer(mpnn_type, mpnn_kwargs)\n        self.attn_layer = self._parse_attn_layer(\n            attn_type, self.biased_attention_key, attn_kwargs, force_consistent_in_dim=force_consistent_in_dim\n        )\n\n        self.output_scale = output_scale\n        self.use_edges = self.in_dim_edges is not None\n\n    def residual_add(self, feature: Tensor, input_feature: Tensor) -> Tensor:\n        r\"\"\"\n        Residual addition layer. Allows information to propagate through the model\n        by skipping the computational layers.\n        Parameters:\n            feature: The feature (typically nodes or edges) after message passing\n            input_feature: The same feature from before message passing\n        Returns:\n            The addition of the two tensors.\n        \"\"\"\n        feature += input_feature\n        return feature\n\n    def scale_activations(self, feature: Tensor, scale_factor: float) -> Tensor:\n        \"\"\"Scale activations by a constant factor to stop growth of activation scale\n        and reduce numerical stability issues at low precision\n\n        Args:\n            feature (Tensor): The feature to scale\n            scale_factor (float): The floating point scale factor\n\n        Returns:\n            Tensor: The scaled features\n        \"\"\"\n        scale_factor = torch.tensor(scale_factor).to(feature.device)\n        feature = feature / scale_factor.to(dtype=feature.dtype)\n        return feature\n\n    def forward(self, batch: Batch) -> Batch:\n        r\"\"\"\n        forward function of the layer\n        Parameters:\n            batch: pyg Batch graphs to pass through the layer\n        Returns:\n            batch: pyg Batch 
graphs\n        \"\"\"\n        # pe, feat, edge_index, edge_feat = batch.pos_enc_feats_sign_flip, batch.feat, batch.edge_index, batch.edge_feat\n        feat = batch.feat\n\n        feat_in = feat  # for first residual connection\n\n        # Local MPNN with edge attributes.\n        batch_out = batch.clone()\n        if self.mpnn is not None:\n            batch_out = self.mpnn(batch_out)\n        h_local = batch_out.feat\n        if self.dropout_local is not None:\n            h_local = self.dropout_local(h_local)\n        # Apply the residual connection for the node features and scale the activations by some value to help reduce activation growth\n        if self.node_residual:\n            if self.layer_depth < 1:\n                h_local = self.residual_add(h_local, feat_in)\n                h_local = self.scale_activations(h_local, self.output_scale)\n            else:\n                h_local = self.scale_activations(h_local, self.output_scale)\n                h_local = self.residual_add(h_local, feat_in)\n\n        if self.norm_layer_local is not None:\n            h_local = self.norm_layer_local(h_local)\n\n        # Multi-head attention.\n        if self.attn_layer is not None:\n            h_attn = self._self_attention_block(feat, feat_in, batch)\n            # Combine local and global outputs.\n            feat = h_local + h_attn\n        else:\n            feat = h_local\n\n        # MLP block, with skip connection\n        feat_mlp = self.mlp(feat)\n        # Add the droppath to the output of the MLP\n        batch_size = None if feat.device.type != \"ipu\" else batch.graph_is_true.shape[0]\n        if self.droppath_ffn is not None:\n            feat_mlp = self.droppath_ffn(feat_mlp, batch.batch, batch_size)\n        feat = feat + feat_mlp\n\n        feat = self.f_out(feat)\n\n        batch_out.feat = feat\n\n        return batch_out\n\n    def _parse_mpnn_layer(self, mpnn_type, mpnn_kwargs: Dict[str, Any]) -> Optional[Module]:\n        \"\"\"Parse 
the MPNN layer.\"\"\"\n\n        if mpnn_type is None or mpnn_type == \"none\":\n            return\n\n        if mpnn_kwargs is None:\n            mpnn_kwargs = {}\n        mpnn_kwargs = deepcopy(mpnn_kwargs)\n\n        # Set the default values\n        mpnn_kwargs.setdefault(\"in_dim\", self.in_dim)\n        mpnn_kwargs.setdefault(\"out_dim\", self.in_dim)\n        mpnn_kwargs.setdefault(\"in_dim_edges\", self.in_dim_edges)\n        mpnn_kwargs.setdefault(\"out_dim_edges\", self.out_dim_edges)\n        # TODO: The rest of default values\n        self.mpnn_kwargs = mpnn_kwargs\n\n        # Initialize the MPNN layer\n        mpnn_class = PYG_LAYERS_DICT[mpnn_type]\n        mpnn_layer = mpnn_class(**mpnn_kwargs, layer_depth=self.layer_depth, layer_idx=self.layer_idx)\n\n        return mpnn_layer\n\n    def _parse_attn_layer(\n        self,\n        attn_type,\n        biased_attention_key: str,\n        attn_kwargs: Dict[str, Any],\n        force_consistent_in_dim: bool = True,\n    ) -> Optional[Module]:\n        \"\"\"\n        parse the input attention layer and check if it is valid\n        Parameters:\n            attn_type: type of the attention layer\n            biased_attention_key: key for the attention bias\n            attn_kwargs: kwargs for the attention layer\n            force_consistent_in_dim: whether to force the `embed_dim` to be the same as the `in_dim`\n\n        Returns:\n            attn_layer: the attention layer\n        \"\"\"\n\n        # Set the default values for the Attention layer\n        if attn_kwargs is None:\n            attn_kwargs = {}\n        attn_kwargs.setdefault(\"num_heads\", 1)\n        attn_kwargs.setdefault(\"dropout\", self.dropout)\n        attn_kwargs.setdefault(\"batch_first\", True)\n        self.attn_kwargs = attn_kwargs\n\n        # Force the `embed_dim` to be the same as the `in_dim`\n        attn_kwargs.setdefault(\"embed_dim\", self.in_dim)\n        if 
force_consistent_in_dim:\n            embed_dim = attn_kwargs[\"embed_dim\"]\n            assert embed_dim == self.in_dim, f\"embed_dim={embed_dim} must be equal to in_dim={self.in_dim}\"\n\n        # Initialize the Attention layer\n        attn_layer, attn_class = None, None\n        if attn_type is not None:\n            attn_class = ATTENTION_LAYERS_DICT[attn_type]\n        if attn_class is not None:\n            attn_layer = attn_class(biased_attention_key, **attn_kwargs)\n        return attn_layer\n\n    def _use_packing(self, batch: Batch) -> bool:\n        \"\"\"\n        Check if we should use packing for the batch of graphs.\n        \"\"\"\n        batch_keys = get_keys(batch)\n        return \"pack_from_node_idx\" in batch_keys and \"pack_attn_mask\" in batch_keys\n\n    def _to_dense_batch(\n        self,\n        h: Tensor,\n        batch: Batch,\n        batch_size: Optional[int] = None,\n        max_num_nodes_per_graph: Optional[int] = None,\n        on_ipu: bool = False,\n    ) -> Tensor:\n        \"\"\"\n        Convert the batch of graphs to a dense batch.\n        \"\"\"\n\n        if self._use_packing(batch):\n            attn_mask = batch.pack_attn_mask\n            key_padding_mask = None\n            idx = batch.pack_from_node_idx\n            h_dense = to_packed_dense_batch(\n                h,\n                pack_from_node_idx=idx,\n                pack_attn_mask=attn_mask,\n                max_num_nodes_per_pack=100,  # TODO: This should be a parameter\n            )\n        else:\n            attn_mask = None\n            h_dense, key_padding_mask, idx = to_dense_batch(\n                h,\n                batch=batch.batch,  # The batch index as a vector that indicates for nodes of which graph it belongs to\n                batch_size=batch_size,\n                max_num_nodes_per_graph=max_num_nodes_per_graph,\n                drop_nodes_last_graph=on_ipu,\n            )\n            key_padding_mask = ~key_padding_mask\n        
return h_dense, attn_mask, key_padding_mask, idx\n\n    def _to_sparse_batch(self, batch: Batch, h_dense: Tensor, idx: Tensor) -> Tensor:\n        \"\"\"\n        Convert the dense batch back to a sparse batch.\n        \"\"\"\n        if self._use_packing(batch):\n            h = to_sparse_batch_from_packed(\n                h_dense,\n                pack_from_node_idx=idx,\n            )\n        else:\n            h = to_sparse_batch(\n                h_dense,\n                mask_idx=idx,\n            )\n        return h\n\n    def _self_attention_block(self, feat: Tensor, feat_in: Tensor, batch: Batch) -> Tensor:\n        \"\"\"\n        Applying the multi-head self-attention to the batch of graphs.\n        First the batch is converted from [num_nodes, hidden_dim] to [num_graphs, max_num_nodes, hidden_dim]\n        Then the self-attention is applied on each graph\n        Then the batch is converted again to [num_nodes, hidden_dim]\n        \"\"\"\n\n        # Multi-head attention.\n        on_ipu = is_running_on_ipu()\n        max_num_nodes_per_graph = None\n        if on_ipu:\n            max_num_nodes_per_graph = self.max_num_nodes_per_graph\n\n        # Convert the tensor to a dense batch, then back to a sparse batch\n        batch_size = None if feat.device.type != \"ipu\" else batch.graph_is_true.shape[0]\n\n        # h[num_nodes, hidden_dim] -> h_dense[num_graphs, max_num_nodes, hidden_dim]\n        feat_dense, attn_mask, key_padding_mask, idx = self._to_dense_batch(\n            feat,\n            batch=batch,  # The batch index as a vector that indicates for nodes of which graph it belongs to\n            batch_size=batch_size,\n            max_num_nodes_per_graph=max_num_nodes_per_graph,\n            on_ipu=on_ipu,\n        )\n\n        attn_bias = None\n        if self.biased_attention_key is not None and self.biased_attention_key != \"none\":\n            attn_bias = batch[self.biased_attention_key]\n\n        # h_dense[num_graphs, max_num_nodes, 
hidden_dim] -> feat_attn[num_graphs, max_num_nodes, hidden_dim]\n        feat_attn = self._sa_block(\n            feat_dense, attn_bias=attn_bias, attn_mask=attn_mask, key_padding_mask=key_padding_mask\n        )\n\n        # feat_attn[num_graphs, max_num_nodes, hidden_dim] -> feat_attn[num_nodes, hidden_dim]\n        feat_attn = self._to_sparse_batch(batch, feat_attn, idx)\n\n        # Dropout, residual, norm\n        if self.dropout_attn is not None:\n            feat_attn = self.dropout_attn(feat_attn)\n        feat_attn = feat_in + feat_attn\n        if self.norm_layer_attn is not None:\n            feat_attn = self.norm_layer_attn(feat_attn)\n        if self.droppath_layer is not None:\n            feat_attn = self.droppath_layer(feat_attn, batch.batch, batch_size=batch_size)\n\n        # Combine local and global outputs.\n        return feat + feat_attn\n\n    def _sa_block(\n        self, x: torch.Tensor, attn_bias: torch.Tensor, attn_mask=None, key_padding_mask=None\n    ) -> torch.Tensor:\n        \"\"\"\n        Self-attention block.\n        Parameters:\n            x: input tensor\n            attn_bias: attention bias tensor\n            attn_mask: optional attention mask\n            key_padding_mask: optional key padding mask\n        Returns:\n            x: output tensor\n        \"\"\"\n        x = self.attn_layer(\n            x,\n            x,\n            x,\n            attn_bias=attn_bias,\n            precision=self.precision,\n            attn_mask=attn_mask,\n            key_padding_mask=key_padding_mask,\n            need_weights=False,\n        )[0]\n        return x\n\n    @classproperty\n    def layer_supports_edges(cls) -> bool:\n        r\"\"\"\n        Return a boolean specifying if the layer type supports edges or not.\n\n        Returns:\n\n            supports_edges: bool\n                Always ``True`` for the current class\n        \"\"\"\n        return True\n\n    @property\n    def layer_inputs_edges(self) -> bool:\n        r\"\"\"\n        Return a boolean specifying if 
the layer type\n        uses edges as input or not.\n        It is different from ``layer_supports_edges`` since a layer that\n        supports edges can decide to not use them.\n\n        Returns:\n\n            bool:\n                Always ``True`` for the current class\n        \"\"\"\n        return True\n\n    @property\n    def layer_outputs_edges(self) -> bool:\n        r\"\"\"\n        Abstract method. Return a boolean specifying if the layer type\n        uses edges as input or not.\n        It is different from ``layer_supports_edges`` since a layer that\n        supports edges can decide to not use them.\n\n        Returns:\n\n            bool:\n                Always ``False`` for the current class\n        \"\"\"\n        if self.mpnn is None:\n            return False\n        return self.mpnn.layer_outputs_edges\n\n    @property\n    def out_dim_factor(self) -> int:\n        r\"\"\"\n        Get the factor by which the output dimension is multiplied for\n        the next layer.\n\n        For standard layers, this will return ``1``.\n\n        But for others, such as ``GatLayer``, the output is the concatenation\n        of the outputs from each head, so the out_dim gets multiplied by\n        the number of heads, and this function should return the number\n        of heads.\n\n        Returns:\n\n            int:\n                Always ``1`` for the current class\n        \"\"\"\n        return 1\n"
  },
  {
    "path": "graphium/nn/pyg_layers/mpnn_pyg.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Callable, Optional, Union, Tuple, List\n\nimport torch\nfrom torch import Tensor, IntTensor, LongTensor\nfrom graphium.nn.base_graph_layer import BaseGraphModule\nfrom graphium.nn.base_layers import MLP\nfrom graphium.utils.decorators import classproperty\nfrom torch_geometric.nn.aggr import MultiAggregation, Aggregation\nfrom torch_geometric.data import Batch\n\n\nclass MPNNPlusPyg(BaseGraphModule):\n    def __init__(\n        self,\n        in_dim: int = 64,\n        out_dim: int = 64,\n        activation: Union[str, Callable] = \"gelu\",\n        dropout: float = 0.3,\n        normalization: Union[str, Callable] = \"layer_norm\",\n        gather_from: str = \"both\",\n        scatter_to: str = \"both\",\n        node_combine_method: str = \"concat\",\n        num_node_mlp: int = 2,\n        mlp_expansion_ratio: int = 4,\n        use_edges: bool = True,\n        in_dim_edges: Optional[int] = 32,\n        out_dim_edges: Optional[int] = 32,\n        aggregation_method: Optional[List[Union[str, Aggregation]]] = [\"sum\"],\n        num_edge_mlp: Optional[int] = 2,\n        use_globals: bool = True,\n        edge_dropout_rate: Optional[float] = 0.0035,\n        **kwargs,\n    ):\n        r\"\"\"\n            MPNNPlusPyg: InteractionNetwork layer with edges and global feature, GPS++ type of GNN 
layer\n            GPS++: An Optimised Hybrid MPNN/Transformer for Molecular Property Prediction\n            Dominic Masters, Josef Dean, Kerstin Klaser, Zhiyi Li, Sam Maddrell-Mander, Adam Sanders,\n            Hatem Helal, Deniz Beker, Ladislav Rampášek, Dominique Beaini\n            https://arxiv.org/abs/2212.02229\n\n        Parameters:\n\n            in_dim:\n                Input feature dimensions of the nodes\n\n            out_dim:\n                Output feature dimensions of the nodes\n\n            activation:\n                Activation function to use in the edge and node model\n\n            dropout:\n                The ratio of units to dropout at the end within apply_norm_activation_dropout.\n\n            normalization:\n                Normalization to use with the edge, node models and at the end\n                within apply_norm_activation_dropout. Choices:\n\n                - \"none\" or `None`: No normalization\n                - \"batch_norm\": Batch normalization\n                - \"layer_norm\": Layer normalization\n                - `Callable`: Any callable function\n\n            gather_from:\n                The method to gather features from. Could choose from:\n                \"senders\", \"receivers\" and \"both\".\n\n            scatter_to:\n                The method to scatter features to. 
Could choose from:\n                \"senders\", \"receivers\" and \"both\".\n\n            node_combine_method:\n                The method to combine the node features. Could choose from:\n                \"sum\" and \"concat\".\n\n            aggregation_method:\n                Methods for aggregating (scatter) messages built from node and edge features.\n                Provide a list of `Aggregation` or strings.\n                Supported strings are:\n\n                - \"sum\" / \"add\" (Default)\n                - \"mean\"\n                - \"max\"\n                - \"min\"\n                - \"softmax\"\n                - \"median\"\n                - \"std\"\n                - \"var\"\n\n            num_node_mlp:\n                Number of MLP layers used for the node model\n\n            mlp_expansion_ratio:\n                Expansion ratio for the node and edge MLPs\n\n            use_edges:\n                Whether edge features are used\n\n            in_dim_edges:\n                Input feature dimensions of the edges\n\n            out_dim_edges:\n                Output feature dimensions of the edges\n\n            num_edge_mlp:\n                Number of MLP layers used for the edge model\n\n            edge_dropout_rate:\n                Dropout rate for the edges\n        \"\"\"\n\n        super().__init__(\n            in_dim=in_dim,\n            out_dim=out_dim,\n            activation=activation,\n            dropout=dropout,\n            normalization=normalization,\n            **kwargs,\n        )\n\n        self.gather_from = gather_from\n        self.scatter_to = scatter_to\n        self.node_combine_method = node_combine_method\n        self.num_node_mlp = num_node_mlp\n        self.mlp_expansion_ratio = mlp_expansion_ratio\n\n        self.use_edges = use_edges\n        self.in_dim_edges = in_dim_edges\n        self.out_dim_edges = out_dim_edges\n        self.num_edge_mlp = num_edge_mlp\n        self.edge_dropout_rate = edge_dropout_rate\n\n        
self.aggregator = MultiAggregation(list(aggregation_method))\n        n_agg = len(aggregation_method)\n\n        # node_model:\n        edge_dim = self.out_dim_edges if use_edges else self.in_dim_edges\n        if self.node_combine_method == \"concat\":\n            node_model_in_dim = (1 + 2 * n_agg) * self.in_dim + 2 * n_agg * edge_dim\n        elif self.node_combine_method == \"sum\":\n            node_model_in_dim = (1 + n_agg) * self.in_dim + n_agg * edge_dim\n        else:\n            raise ValueError(f\"node_combine_method {self.node_combine_method} not recognised.\")\n        node_model_hidden_dim = self.mlp_expansion_ratio * self.in_dim\n        self.node_model = MLP(\n            in_dim=node_model_in_dim,\n            hidden_dims=node_model_hidden_dim,\n            out_dim=self.out_dim,\n            depth=self.num_node_mlp,\n            activation=self.activation_layer,\n            last_dropout=self.dropout,\n            normalization=self.normalization,\n        )\n\n        # edge_model:\n        if self.node_combine_method == \"concat\":\n            edge_model_in_dim = 2 * self.in_dim + self.in_dim_edges\n        elif self.node_combine_method == \"sum\":\n            edge_model_in_dim = self.in_dim + self.in_dim_edges\n        else:\n            raise ValueError(f\"node_combine_method {self.node_combine_method} not recognised.\")\n        edge_model_hidden_dim = self.mlp_expansion_ratio * self.in_dim_edges\n        self.edge_model = MLP(\n            in_dim=edge_model_in_dim,\n            hidden_dims=edge_model_hidden_dim,\n            out_dim=self.out_dim_edges,\n            depth=self.num_edge_mlp,\n            activation=self.activation_layer,\n            last_dropout=self.edge_dropout_rate,\n            normalization=self.normalization,\n        )\n\n        self.use_globals = use_globals\n\n    def gather_features(\n        self,\n        input_features: Tensor,\n        senders: Union[IntTensor, LongTensor],\n        receivers: 
Union[IntTensor, LongTensor],\n    ) -> Tuple[List[Tensor], Tensor, Tensor]:\n        r\"\"\"\n        Function to gather node features based on the senders and receivers of the edge indices.\n\n        Parameters:\n\n            input_features:\n                Node features of the batch\n\n            senders:\n                Senders of the edge_index of the batch\n\n            receivers:\n                Receivers of the edge_index of the batch\n\n        Output:\n            Gathered node features (sender and receiver) summed up or concatenated\n            Gathered sender features\n            Gathered receiver features\n        \"\"\"\n\n        out = []\n\n        receiver_features = input_features[receivers]\n        sender_features = input_features[senders]\n\n        if self.gather_from == \"receivers\":\n            out.append(receiver_features)\n\n        if self.gather_from == \"senders\":\n            out.append(sender_features)\n\n        if self.gather_from == \"both\":\n            if self.node_combine_method == \"sum\":\n                out.append(receiver_features + sender_features)\n            elif self.node_combine_method == \"concat\":\n                out.append(torch.cat([receiver_features, sender_features], dim=-1))\n            else:\n                raise ValueError(f\"node_combine_method {self.node_combine_method} not recognised.\")\n\n        return out, sender_features, receiver_features\n\n    def aggregate_features(\n        self,\n        input_features: Tensor,\n        senders: Union[IntTensor, LongTensor],\n        receivers: Union[IntTensor, LongTensor],\n        sender_features: Tensor,\n        receiver_features: Tensor,\n        size: int,\n    ) -> List[Tensor]:\n        r\"\"\"\n        Function to aggregate (scatter) messages built from node and edge features.\n\n        Parameters:\n\n            input_features:\n                Edge features of the batch\n\n            senders:\n                Senders of the edge_index of the 
batch\n\n            receivers:\n                Receivers of the edge_index of the batch\n\n            sender_features:\n                Sender features gathered from the gather_features function\n\n            receiver_features:\n                Receiver features gathered from the gather_features function\n\n            size:\n                Size of the aggregation, equal to the total number of nodes\n\n        Returns:\n            List[Tensor]:\n                Aggregated node features\n\n        \"\"\"\n\n        out = []\n        aggregated_features = []\n\n        if self.scatter_to in [\"receivers\", \"both\"]:\n            # use direct neighbour aggregation to generate the message\n            message = torch.cat([input_features, sender_features], dim=-1)\n            # aggregate the messages with the configured aggregators\n            aggregated_features.append(self.aggregator(message, receivers, dim_size=size))\n\n        if self.scatter_to in [\"senders\", \"both\"]:\n            # use direct neighbour aggregation to generate the message\n            message = torch.cat([input_features, receiver_features], dim=-1)\n            # aggregate the messages with the configured aggregators\n            aggregated_features.append(self.aggregator(message, senders, dim_size=size))\n\n        if self.node_combine_method == \"sum\" and self.scatter_to == \"both\":\n            out.append(aggregated_features[0] + aggregated_features[1])\n        elif self.scatter_to == \"both\":\n            out.append(torch.cat([aggregated_features[0], aggregated_features[1]], dim=-1))\n        else:\n            out.extend(aggregated_features)\n\n        return out\n\n    def forward(self, batch: Batch) -> Batch:\n        r\"\"\"\n        Forward function of the MPNN Plus layer\n        Parameters:\n            batch:\n                pyg Batch graph to pass through the layer\n        Returns:\n            batch:\n                pyg Batch graph with updated node and edge features\n        \"\"\"\n        senders = 
batch.edge_index[0]\n        receivers = batch.edge_index[1]\n        # ---------------EDGE step---------------\n        edge_model_input, sender_nodes, receiver_nodes = self.gather_features(batch.feat, senders, receivers)\n\n        if self.use_edges:\n            edge_model_input.append(batch.edge_feat)\n            edge_model_input = torch.cat([edge_model_input[0], edge_model_input[1]], dim=-1)\n            # edge dropout included in the edge_model\n            batch.edge_feat = self.edge_model(edge_model_input)\n        else:\n            batch.edge_feat = edge_model_input\n\n        # ---------------NODE step---------------\n        # message + aggregate\n        node_count_per_pack = batch.feat.shape[-2]\n        node_model_input = self.aggregate_features(\n            batch.edge_feat, senders, receivers, sender_nodes, receiver_nodes, node_count_per_pack\n        )\n        node_model_input.append(batch.feat)\n        batch.feat = torch.cat([node_model_input[0], node_model_input[1]], dim=-1)\n        batch.feat = self.node_model(batch.feat)\n\n        # ---------------Apply norm activation and dropout---------------\n        # use dropout value of the layer (default 0.3)\n        batch.feat = self.apply_norm_activation_dropout(\n            batch.feat,\n            normalization=False,\n            activation=False,\n            batch_idx=batch.batch,\n            batch_size=batch.num_graphs,\n        )\n\n        return batch\n\n    @classproperty\n    def layer_supports_edges(cls) -> bool:\n        r\"\"\"\n        Return a boolean specifying if the layer type supports edges or not.\n\n        Returns:\n\n            supports_edges: bool\n                Always ``True`` for the current class\n        \"\"\"\n        return True\n\n    @property\n    def layer_inputs_edges(self) -> bool:\n        r\"\"\"\n        Return a boolean specifying if the layer type\n        uses edges as input or not.\n        It is different from ``layer_supports_edges`` since a 
layer that\n        supports edges can decide to not use them.\n\n        Returns:\n\n            bool:\n                Always ``True`` for the current class\n        \"\"\"\n        return True\n\n    @property\n    def layer_outputs_edges(self) -> bool:\n        r\"\"\"\n        Return a boolean specifying if the layer type\n        outputs edges or not.\n        It is different from ``layer_supports_edges`` since a layer that\n        supports edges can decide to not use them.\n\n        Returns:\n\n            bool:\n                Always ``True`` for the current class\n        \"\"\"\n        return True\n\n    @property\n    def out_dim_factor(self) -> int:\n        r\"\"\"\n        Get the factor by which the output dimension is multiplied for\n        the next layer.\n\n        For standard layers, this will return ``1``.\n\n        But for others, such as ``GatLayer``, the output is the concatenation\n        of the outputs from each head, so the out_dim gets multiplied by\n        the number of heads, and this function should return the number\n        of heads.\n\n        Returns:\n\n            int:\n                Always ``1`` for the current class\n        \"\"\"\n        return 1\n"
  },
  {
    "path": "graphium/nn/pyg_layers/pna_pyg.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals.\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Dict, List, Optional, Union, Callable\nfrom functools import partial\n\nimport torch\nfrom torch import Tensor\nfrom torch_scatter import scatter\n\nfrom torch_geometric.nn.conv import MessagePassing\nfrom torch_geometric.typing import OptTensor\nfrom torch_geometric.utils import degree\nfrom torch_geometric.data import Data, Batch\n\nfrom graphium.utils.decorators import classproperty\nfrom graphium.nn.base_layers import MLP, FCLayer, get_activation\nfrom graphium.nn.base_graph_layer import BaseGraphStructure, check_intpus_allow_int\n\n\nclass PNAMessagePassingPyg(MessagePassing, BaseGraphStructure):\n    def __init__(\n        self,\n        in_dim: int,\n        out_dim: int,\n        aggregators: List[str],\n        scalers: List[str],\n        activation: Union[Callable, str] = \"relu\",\n        dropout: float = 0.0,\n        normalization: Union[str, Callable] = \"none\",\n        avg_d: Dict[str, float] = {\"log\": 1.0, \"lin\": 1.0},\n        last_activation: Union[Callable, str] = \"none\",\n        posttrans_layers: int = 1,\n        pretrans_layers: int = 1,\n        in_dim_edges: int = 0,\n        **kwargs,\n    ):\n        r\"\"\"\n        Implementation of the message passing architecture of the PNA message passing layer,\n        previously known as `PNALayerComplex`. 
This layer applies an MLP as\n        pretransformation to the concatenation of $[h_u, h_v, e_{uv}]$ to generate\n        the messages, with $h_u$ the node feature, $h_v$ the neighbour node features,\n        and $e_{uv}$ the edge feature between the nodes $u$ and $v$.\n\n        After the pre-transformation, it aggregates the messages using\n        multiple aggregators and scalers,\n        concatenates their results, then applies an MLP on the concatenated\n        features.\n\n        PNA: Principal Neighbourhood Aggregation\n        Gabriele Corso, Luca Cavalleri, Dominique Beaini, Pietro Lio, Petar Velickovic\n        https://arxiv.org/abs/2004.05718\n\n        [!] code adapted from pytorch-geometric implementation of PNAConv\n\n        Parameters:\n\n            in_dim:\n                Input feature dimensions of the layer\n\n            out_dim:\n                Output feature dimensions of the layer\n\n            aggregators:\n                Set of aggregation function identifiers,\n                e.g. \"mean\", \"max\", \"min\", \"std\", \"sum\", \"var\", \"moment3\".\n                The results from all aggregators will be concatenated.\n\n            scalers:\n                Set of scaling function identifiers,\n                e.g. \"identity\", \"amplification\", \"attenuation\".\n                The results from all scalers will be concatenated\n\n            activation:\n                activation function to use in the layer\n\n            dropout:\n                The ratio of units to dropout. Must be between 0 and 1\n\n            normalization:\n                Normalization to use. 
Choices:\n\n                - \"none\" or `None`: No normalization\n                - \"batch_norm\": Batch normalization\n                - \"layer_norm\": Layer normalization\n                - `Callable`: Any callable function\n\n            avg_d:\n                Average degree of nodes in the training set, used by scalers to normalize\n\n            last_activation:\n                activation function to use in the last layer of the internal MLP\n\n            posttrans_layers:\n                number of layers in the MLP transformation after the aggregation\n\n            pretrans_layers:\n                number of layers in the transformation before the aggregation\n\n            in_dim_edges:\n                size of the edge features. If 0, edges are ignored\n\n        \"\"\"\n\n        MessagePassing.__init__(self, node_dim=0)\n        BaseGraphStructure.__init__(\n            self,\n            in_dim=in_dim,\n            out_dim=out_dim,\n            activation=activation,\n            dropout=dropout,\n            normalization=normalization,\n            **kwargs,\n        )\n\n        # Allow int32 as edge index\n        self.__check_input__ = partial(check_intpus_allow_int, self)\n\n        self.aggregators = aggregators\n        self.scalers = scalers\n\n        # Edge dimensions\n        self.in_dim_edges = in_dim_edges\n        self.edge_encoder = None\n        if self.in_dim_edges > 0:\n            self.edge_encoder = FCLayer(self.in_dim_edges, self.in_dim_edges, activation=None)\n\n        # Initializing basic attributes\n        self.avg_d = avg_d\n        self.last_activation = get_activation(last_activation)\n\n        # MLP used on each pair of nodes with their edge MLP(h_u, h_v, e_uv)\n        self.pretrans = MLP(\n            in_dim=2 * in_dim + in_dim_edges,\n            hidden_dims=in_dim,\n            out_dim=in_dim,\n            depth=pretrans_layers,\n            activation=self.activation,\n            
last_activation=self.last_activation,\n            dropout=dropout,\n            normalization=normalization,\n            last_normalization=normalization,\n        )\n\n        # MLP used on the aggregated messages of the neighbours\n        self.posttrans = MLP(\n            in_dim=(len(aggregators) * len(scalers)) * self.in_dim,\n            hidden_dims=self.out_dim,\n            out_dim=self.out_dim,\n            depth=posttrans_layers,\n            activation=self.activation,\n            last_activation=self.last_activation,\n            dropout=dropout,\n            normalization=normalization,\n            last_normalization=normalization,\n        )\n\n    def forward(self, batch: Union[Data, Batch]) -> Union[Data, Batch]:\n        r\"\"\"\n        forward function of the layer\n        Parameters:\n            batch: pyg Batch graphs\n        Returns:\n            batch: pyg Batch graphs\n        \"\"\"\n        feat, edge_index, edge_feat = batch.feat, batch.edge_index, batch.edge_feat\n\n        out = self.propagate(edge_index, x=feat, edge_feat=edge_feat, size=None)\n        out = self.posttrans(out)  # No more towers and concat with x\n        batch.feat = out\n        return batch\n\n    def message(self, x_i: Tensor, x_j: Tensor, edge_feat: OptTensor) -> Tensor:\n        r\"\"\"\n        message function\n\n        Parameters:\n            x_i: node features\n            x_j: neighbour node features\n            edge_feat: edge features\n        Returns:\n            feat: the message\n        \"\"\"\n        feat: Tensor = x_i  # Dummy.\n        if (edge_feat is not None) and (self.edge_encoder is not None):\n            edge_feat = self.edge_encoder(edge_feat)\n            feat = torch.cat([x_i, x_j, edge_feat], dim=-1)\n        else:\n            feat = torch.cat([x_i, x_j], dim=-1)\n\n        return self.pretrans(feat)  # No more towers\n\n    def aggregate(\n        self,\n        inputs: Tensor,\n        index: Tensor,\n        edge_index: 
Tensor,\n        dim_size: Optional[int] = None,\n    ) -> Tensor:\n        r\"\"\"\n        aggregate function\n\n        Parameters:\n            inputs: input features\n            index: index of the nodes\n            edge_index: edge index\n            dim_size: dimension size\n        Returns:\n            out: aggregated features\n        \"\"\"\n        outs = []\n\n        for aggregator in self.aggregators:\n            if aggregator == \"sum\":\n                out = scatter(inputs, index, 0, None, dim_size, reduce=\"sum\")\n            elif aggregator == \"mean\":\n                out = scatter(inputs, index, 0, None, dim_size, reduce=\"mean\")\n            elif aggregator == \"min\":\n                out = scatter(inputs, index, 0, None, dim_size, reduce=\"min\")\n            elif aggregator == \"max\":\n                out = scatter(inputs, index, 0, None, dim_size, reduce=\"max\")\n            elif aggregator in [\"var\", \"std\"]:\n                mean = scatter(inputs, index, 0, None, dim_size, reduce=\"mean\")\n                mean_squares = scatter(inputs * inputs, index, 0, None, dim_size, reduce=\"mean\")\n                out = mean_squares - mean * mean\n                if aggregator == \"std\":\n                    out = torch.sqrt(torch.relu(out) + 1e-5)\n            else:\n                raise ValueError(f'Unknown aggregator \"{aggregator}\".')\n            outs.append(out)\n        out = torch.cat(outs, dim=-1)\n\n        deg = degree(index, dim_size, dtype=inputs.dtype)\n        deg = deg.clamp_(1).view(-1, 1)\n\n        outs = []\n        for scaler in self.scalers:\n            if scaler == \"identity\":\n                out_scaler = out\n            elif scaler == \"amplification\":\n                out_scaler = out * (torch.log(deg + 1) / self.avg_d[\"log\"])\n            elif scaler == \"attenuation\":\n                out_scaler = out * (self.avg_d[\"log\"] / torch.log(deg + 1))\n            elif scaler == \"linear\":\n            
    out_scaler = out * (deg / self.avg_d[\"lin\"])\n            elif scaler == \"inverse_linear\":\n                out_scaler = out * (self.avg_d[\"lin\"] / deg)\n            else:\n                raise ValueError(f'Unknown scaler \"{scaler}\".')\n            outs.append(out_scaler)\n        return torch.cat(outs, dim=-1)\n\n    @property\n    def layer_outputs_edges(self) -> bool:\n        r\"\"\"\n        Return a boolean specifying if the layer type\n        outputs edges or not.\n        It is different from ``layer_supports_edges`` since a layer that\n        supports edges can decide to not use them.\n\n        Returns:\n\n            bool:\n                Always ``False`` for the current class\n        \"\"\"\n        return False\n\n    @property\n    def out_dim_factor(self) -> int:\n        r\"\"\"\n        Get the factor by which the output dimension is multiplied for\n        the next layer.\n\n        For standard layers, this will return ``1``.\n\n        But for others, such as ``GatLayer``, the output is the concatenation\n        of the outputs from each head, so the out_dim gets multiplied by\n        the number of heads, and this function should return the number\n        of heads.\n\n        Returns:\n\n            int:\n                Always ``1`` for the current class\n        \"\"\"\n        return 1\n\n    @property\n    def layer_inputs_edges(self) -> bool:\n        r\"\"\"\n        Return a boolean specifying if the layer type\n        uses edges as input or not.\n        It is different from ``layer_supports_edges`` since a layer that\n        supports edges can decide to not use them.\n\n        Returns:\n\n            bool:\n                Returns ``self.edge_features``\n        \"\"\"\n        return self.edge_features\n\n    @classproperty\n    def layer_supports_edges(cls) -> bool:\n        r\"\"\"\n        Return a boolean specifying if the layer type supports edges or not.\n\n        Returns:\n\n         
   bool:\n                Always ``True`` for the current class\n        \"\"\"\n        return True\n"
  },
  {
    "path": "graphium/nn/pyg_layers/pooling_pyg.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nimport torch\nimport torch.nn as nn\nfrom torch import Tensor, LongTensor\nfrom typing import List, Union, Callable, Tuple, Optional, Dict\nfrom copy import deepcopy\n\nfrom torch_scatter import scatter\nfrom torch_geometric.data import Data, Batch\n\nfrom graphium.nn.base_layers import MLP, FCLayer\nfrom graphium.utils.tensor import ModuleListConcat, ModuleWrap\nfrom graphium.nn.base_layers import MuReadoutGraphium\n\nEPS = 1e-6\n\n\ndef scatter_logsum_pool(x: Tensor, batch: LongTensor, dim: int = 0, dim_size: Optional[int] = None) -> Tensor:\n    r\"\"\"\n    Apply pooling over the nodes in the graph using a mean aggregation,\n    but scaled by the log of the number of nodes. This gives the same\n    expressive power as the sum, but helps deal with graphs that are\n    significantly larger than others by using a logarithmic scale.\n\n    $$r^{(i)} = \\frac{\\log N_i}{N_i}\\sum_{k=1}^{N_i} x^{(i)}_k$$\n\n    Parameters:\n        x (Tensor): Node feature matrix\n            :math:`\\mathbf{X} \\in \\mathbb{R}^{(N_1 + \\ldots + N_B) \\times F}`.\n        batch (LongTensor): Batch vector :math:`\\mathbf{b} \\in {\\{ 0, \\ldots,\n            B-1\\}}^N`, which assigns each node to a specific example.\n        dim_size (int, optional): Batch-size :math:`B`.\n            Automatically calculated if not given. 
(default: :obj:`None`)\n    Returns:\n        the pooled features tensor\n    \"\"\"\n    dim_size = int(batch.max().detach() + 1) if dim_size is None else dim_size\n    mean_pool = scatter(x, batch, dim=dim, dim_size=dim_size, reduce=\"mean\")\n    num_nodes = scatter(\n        torch.ones(x.shape[:-1], dtype=x.dtype, device=x.device),\n        batch,\n        dim=dim,\n        dim_size=dim_size,\n        reduce=\"sum\",\n    )\n    lognum = torch.log(num_nodes)\n    return mean_pool * lognum.unsqueeze(-1)\n\n\ndef scatter_std_pool(x: Tensor, batch: LongTensor, dim: int = 0, dim_size: Optional[int] = None):\n    r\"\"\"Returns batch-wise graph-level-outputs by taking the channel-wise\n    standard deviation across the node dimension, so that for a single graph\n    :math:`\\mathcal{G}_i` its output is computed by\n\n    .. math::\n        \\mathbf{r}_i = \\mathrm{std}_{n=1}^{N_i} \\, \\mathbf{x}_n\n\n    Parameters:\n        x (Tensor): Node feature matrix\n            :math:`\\mathbf{X} \\in \\mathbb{R}^{(N_1 + \\ldots + N_B) \\times F}`.\n        batch (LongTensor): Batch vector :math:`\\mathbf{b} \\in {\\{ 0, \\ldots,\n            B-1\\}}^N`, which assigns each node to a specific example.\n        dim_size (int, optional): Batch-size :math:`B`.\n            Automatically calculated if not given. 
(default: :obj:`None`)\n\n    Returns:\n        the pooled features tensor\n    \"\"\"\n    dim_size = int(batch.max().detach() + 1) if dim_size is None else dim_size\n    mean = scatter(x, batch, dim=dim, out=None, dim_size=dim_size, reduce=\"mean\")\n    mean_squares = scatter(x * x, batch, dim=dim, out=None, dim_size=dim_size, reduce=\"mean\")\n    out = mean_squares - mean * mean\n    return torch.sqrt(torch.relu(out) + 1e-5)\n\n\nclass PoolingWrapperPyg(ModuleWrap):\n    def __init__(self, func, feat_type, *args, **kwargs) -> None:\n        super().__init__(func, *args, **kwargs)\n        self.feat_type = feat_type\n\n    def forward(self, g: Batch, feature: Tensor, *args, **kwargs):\n        \"\"\"\n        forward function\n        Parameters:\n            g: the pyg batch graph\n            feature: the node features\n        Returns:\n            the pooled features\n        \"\"\"\n        dim_size = g.num_graphs\n        if self.feat_type == \"node\":\n            index = g.batch\n        elif self.feat_type == \"edge\":\n            index = g.batch[g.edge_index][0]\n        else:\n            index = g.batch\n        return self.func(feature, index, dim_size=dim_size, *args, **kwargs, **self.kwargs)\n\n\ndef parse_pooling_layer_pyg(in_dim: int, pooling: Union[str, List[str]], feat_type: str = \"node\", **kwargs):\n    r\"\"\"\n    Select the pooling layers from a list of strings, and put them\n    in a Module that concatenates their outputs.\n\n    Parameters:\n\n        in_dim:\n            The dimension at the input layer of the pooling\n\n        pooling:\n            The list of pooling layers to use. 
The accepted strings are:\n\n            - \"none\": No pooling\n            - \"sum\": Sum all the nodes for each graph\n            - \"mean\": Mean all the nodes for each graph\n            - \"logsum\": Mean all the nodes then multiply by log(num_nodes) for each graph\n            - \"max\": Max all the nodes for each graph\n            - \"min\": Min all the nodes for each graph\n            - \"std\": Standard deviation of all the nodes for each graph\n\n    \"\"\"\n\n    # Create the pooling layer\n    pool_layer = ModuleListConcat()\n    out_pool_dim = 0\n    if isinstance(pooling, str):\n        pooling = [pooling]\n    assert feat_type in [\"node\", \"edge\", \"global\"]\n    for this_pool in pooling:\n        this_pool = None if this_pool is None else this_pool.lower()\n        out_pool_dim += in_dim\n        if this_pool == \"sum\":\n            pool_layer.append(PoolingWrapperPyg(scatter, dim=0, reduce=\"add\", feat_type=feat_type, **kwargs))\n        elif this_pool == \"mean\":\n            pool_layer.append(PoolingWrapperPyg(scatter, dim=0, reduce=\"mean\", feat_type=feat_type, **kwargs))\n        elif this_pool == \"logsum\":\n            pool_layer.append(PoolingWrapperPyg(scatter_logsum_pool, dim=0, feat_type=feat_type, **kwargs))\n        elif this_pool == \"max\":\n            pool_layer.append(PoolingWrapperPyg(scatter, dim=0, reduce=\"max\", feat_type=feat_type, **kwargs))\n        elif this_pool == \"min\":\n            pool_layer.append(PoolingWrapperPyg(scatter, dim=0, reduce=\"min\", feat_type=feat_type, **kwargs))\n        elif this_pool == \"std\":\n            pool_layer.append(PoolingWrapperPyg(scatter_std_pool, dim=0, feat_type=feat_type, **kwargs))\n        elif (this_pool == \"none\") or (this_pool is None):\n            pass\n        else:\n            raise NotImplementedError(f\"Undefined pooling `{this_pool}`\")\n\n    return pool_layer, out_pool_dim\n\n\nclass VirtualNodePyg(nn.Module):\n    def __init__(\n        self,\n       
 in_dim: int,\n        out_dim: int,\n        in_dim_edges: Optional[int],\n        out_dim_edges: Optional[int],\n        vn_type: Union[type(None), str] = \"sum\",\n        activation: Union[str, Callable] = \"relu\",\n        dropout: float = 0.0,\n        normalization: Union[str, Callable] = \"none\",\n        bias: bool = True,\n        residual: bool = True,\n        use_edges: bool = False,\n        **kwargs,\n    ):\n        r\"\"\"\n        The VirtualNode is a layer that pools the features of the graph,\n        applies a neural network layer on the pooled features,\n        then adds the result back to the node features of every node.\n\n        Parameters:\n\n            in_dim:\n                Input feature dimensions of the virtual node layer.\n\n            out_dim:\n                Output feature dimensions of the virtual node layer.\n\n            in_dim_edges:\n                Input feature dimensions of the virtual node layer for the edges.\n\n            out_dim_edges:\n                Output feature dimensions of the virtual node layer for the edges.\n\n            vn_type:\n                The type of the virtual node. Choices are:\n\n                - \"none\": No pooling\n                - \"sum\": Sum all the nodes for each graph\n                - \"mean\": Mean all the nodes for each graph\n                - \"logsum\": Mean all the nodes then multiply by log(num_nodes) for each graph\n                - \"max\": Max all the nodes for each graph\n                - \"min\": Min all the nodes for each graph\n                - \"std\": Standard deviation of all the nodes for each graph\n\n            activation:\n                activation function to use in the neural network layer.\n\n            dropout:\n                The ratio of units to dropout. Must be between 0 and 1\n\n            normalization:\n                Normalization to use. 
Choices:\n\n                - \"none\" or `None`: No normalization\n                - \"batch_norm\": Batch normalization\n                - \"layer_norm\": Layer normalization\n                - `Callable`: Any callable function\n\n            bias:\n                Whether to add a bias to the neural network\n\n            residual:\n                Whether all virtual nodes should be connected together\n                via a residual connection\n\n            use_edges:\n                Boolean flag to select if edges are used in the global node\n                aggregation and update of features\n\n        \"\"\"\n        super().__init__()\n        if (vn_type is None) or (vn_type.lower() == \"none\"):\n            self.vn_type = None\n            self.fc_layer = None\n            self.residual = None\n            return\n        # Layer sizes stored here\n        self.in_dim_edges = in_dim_edges\n        self.out_dim_edges = out_dim_edges\n        self.in_dim_nodes = in_dim\n        self.out_dim_nodes = out_dim\n\n        # Use edge features in pooling\n        self.use_edges = use_edges\n        has_edges = (\n            (in_dim_edges is not None)\n            and (out_dim_edges is not None)\n            and (in_dim_edges > 0)\n            and (out_dim_edges > 0)\n        )\n        if self.use_edges and not has_edges:\n            raise ValueError(\"The edge features are not defined but `use_edges` is True\")\n        self.vn_type = vn_type.lower()\n        self.layer, out_pool_dim = parse_pooling_layer_pyg(\n            in_dim=self.in_dim_nodes, pooling=self.vn_type, feat_type=\"node\"\n        )\n        self.residual = residual\n\n        if self.use_edges:\n            self.edge_layer, out_edge_pool_dim = parse_pooling_layer_pyg(\n                in_dim=self.in_dim_edges, pooling=self.vn_type, feat_type=\"edge\"\n            )\n\n            out_pool_dim = out_pool_dim + out_edge_pool_dim\n\n        self.fc_layer = FCLayer(\n            
in_dim=out_pool_dim,\n            out_dim=out_pool_dim,\n            activation=activation,\n            dropout=dropout,\n            normalization=normalization,\n            bias=bias,\n        )\n\n        # Projection layers from the pooling layer to node and edge feature sizes\n        self.node_projection = MuReadoutGraphium(out_pool_dim, self.out_dim_nodes)\n        self.edge_projection = None\n        if self.use_edges:\n            self.edge_projection = MuReadoutGraphium(out_pool_dim, self.out_dim_edges)\n\n    def forward(\n        self, g: Union[Data, Batch], feat: Tensor, vn_feat: Tensor, edge_feat: Tensor\n    ) -> Tuple[Tensor, Tensor, Tensor]:\n        r\"\"\"\n        Apply the virtual node layer.\n\n        Parameters:\n\n            g:\n                PyG Graphs or Batched graphs.\n\n            feat (torch.Tensor[..., N, Din]):\n                Node feature tensor, before convolution.\n                `N` is the number of nodes, `Din` is the input features\n\n            vn_feat (torch.Tensor[..., M, Din]):\n                Graph feature of the previous virtual node, or `None`.\n                `M` is the number of graphs, `Din` is the input features.\n                It is added to the result after the MLP, as a residual connection\n\n            edge_feat (torch.Tensor[..., E, Din]):\n                Edge feature tensor, before convolution.\n                `E` is the number of edges, `Din` is the input features\n\n        Returns:\n\n            `feat = torch.Tensor[..., N, Dout]`:\n                Node feature tensor, after convolution and residual.\n                `N` is the number of nodes, `Dout` is the output features of the layer and residual\n\n            `vn_feat = torch.Tensor[..., M, Dout]`:\n                Graph feature tensor to be used at the next virtual node, or `None`\n                `M` is the number of graphs, `Dout` is the output features\n\n            `edge_feat = torch.Tensor[..., E, Dout]`:\n                Edge feature tensor, after convolution and residual - if edges are used, otherwise returned unchanged.\n                `E` is the number of edges, `Dout` is the output features of the layer and residual\n\n        \"\"\"\n\n        # Pool the features\n        if self.vn_type is None:\n            return feat, vn_feat, edge_feat\n        pool = self.layer(g, feat)\n        if self.use_edges:\n            edge_pool = self.edge_layer(g, edge_feat)\n            pool = torch.cat((pool, edge_pool), -1)\n\n        vn_h_temp = self.fc_layer.forward(vn_feat + pool)\n\n        if self.residual:\n            vn_feat = vn_feat + vn_h_temp\n        else:\n            vn_feat = vn_h_temp\n\n        # Add the virtual node projections to the node features\n        feat = feat + self.node_projection(vn_feat[g.batch])\n\n        if self.use_edges:\n            # Add the virtual node projections to the edge features\n            edge_feat = edge_feat + self.edge_projection(vn_feat[g.batch[g.edge_index][0]])\n        return feat, vn_feat, edge_feat\n"
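The pool → transform → broadcast cycle implemented by `VirtualNode.forward` can be illustrated with a minimal, dependency-free sketch. This is a pure-Python analogue on scalar node features; `virtual_node_step` and its `transform` stand-in for `fc_layer` are illustrative names, not part of the library:

```python
# Hypothetical, dependency-free sketch of the VirtualNode cycle:
# pool node features per graph, transform the pooled value, then
# broadcast it back onto every node of its graph (residual add).

def virtual_node_step(feat, batch, transform=lambda v: v):
    """feat: one scalar feature per node; batch: graph index of each node."""
    num_graphs = max(batch) + 1
    # "sum" pooling: aggregate the node features of each graph
    pooled = [0.0] * num_graphs
    for x, g in zip(feat, batch):
        pooled[g] += x
    # stand-in for the FCLayer applied to the pooled virtual-node feature
    vn = [transform(p) for p in pooled]
    # broadcast the virtual-node feature back to each node
    return [x + vn[g] for x, g in zip(feat, batch)]
```

With `feat=[1, 2, 3, 4]` and `batch=[0, 0, 1, 1]`, the two graphs pool to 3 and 7, and every node receives its own graph's pooled value back as a residual.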
  },
  {
    "path": "graphium/nn/pyg_layers/utils.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nimport math\nimport torch\nimport torch.nn as nn\nfrom torch_geometric.data import Batch\nfrom typing import Tuple\nfrom torch import Tensor\n\nfrom torch_geometric.typing import SparseTensor\n\nfrom graphium.nn.base_layers import MLP, get_norm\nfrom graphium.ipu.to_dense_batch import to_dense_batch, to_sparse_batch\n\n\nclass PreprocessPositions(nn.Module):\n    \"\"\"\n    Compute 3D attention bias and 3D node features according to the 3D position information.\n    \"\"\"\n\n    def __init__(\n        self,\n        num_heads,\n        embed_dim,\n        num_kernel,\n        in_dim=3,\n        num_layers=2,\n        activation=\"gelu\",\n        first_normalization=\"none\",\n    ):\n        r\"\"\"\n        Parameters:\n            num_heads:\n                Number of attention heads used in self-attention.\n            embed_dim:\n                Hidden dimension of node features.\n            num_kernel:\n                Number of gaussian kernels.\n            num_layers: The number of layers in the MLP.\n            activation: The activation function used in the MLP.\n            first_normalization: The normalization function used before the gaussian kernel.\n\n        \"\"\"\n        super().__init__()\n        self.num_heads = num_heads\n        self.num_kernel = num_kernel\n        self.embed_dim 
= embed_dim\n        self.first_normalization = get_norm(first_normalization, dim=in_dim)\n\n        self.gaussian = GaussianLayer(self.num_kernel, in_dim=in_dim)\n        self.gaussian_proj = MLP(\n            in_dim=self.num_kernel,\n            hidden_dims=self.num_kernel,\n            out_dim=self.num_heads,\n            depth=num_layers,\n            activation=activation,\n            last_layer_is_readout=True,  # Since the output is not proportional to the hidden dim, but to the number of heads\n        )\n\n        # make sure the 3D node feature has the same dimension as the embedding size\n        # so that it can be added to the original node features\n        self.node_proj = nn.Linear(self.num_kernel, self.embed_dim)\n\n    def forward(\n        self, batch: Batch, max_num_nodes_per_graph: int, on_ipu: bool, positions_3d_key: str\n    ) -> Tuple[Tensor, Tensor]:\n        r\"\"\"\n        Inputs:\n            batch:\n                Batch object.\n            max_num_nodes_per_graph:\n                Maximum number of nodes per graph.\n            on_ipu:\n                Whether the model runs on IPU.\n            positions_3d_key:\n                The key of the pyg graph object that contains the 3D positions.\n\n        \"\"\"\n\n        pos = batch[positions_3d_key]\n        if self.first_normalization is not None:\n            pos = self.first_normalization(pos)\n        batch_size = None if pos.device.type != \"ipu\" else batch.graph_is_true.shape[0]\n        # batch_size = None if batch.feat.device.type != \"ipu\" else batch.graph_is_true.shape[0] #[Andy] batch.feat is only available after passing through layers, not a good attribute to check\n        # pos: [batch, nodes, 3]\n        # padding_mask: [batch, nodes]\n        # idx: [total_nodes]\n        pos, mask, idx = to_dense_batch(\n            pos,\n            batch=batch.batch,\n            batch_size=batch_size,\n            max_num_nodes_per_graph=max_num_nodes_per_graph,\n            
drop_nodes_last_graph=on_ipu,\n        )\n        # check nan with the pos from to_dense_batch,\n        # and generate mask. 1 for nan, 0 for other values.\n        # pos consists of real nodes and padding nodes\n        # for real nodes, if the 3d positions do not exist, they are NaNs. For padding nodes, 3d positions will be 0\n        # if the first node of a molecule has 3d position as nan, the whole molecule will be masked out.\n        # [batch]\n        nan_mask = torch.isnan(pos)[:, 0, 0]\n        # apply nan_mask on pos so that it does not give nan gradient\n        # when applying gaussian kernels\n        pos.masked_fill_(nan_mask.unsqueeze(1).unsqueeze(2), 0.0)\n        # we need the opposite of mask output\n        padding_mask = ~mask\n        # [batch, nodes]\n        batch, n_node, _ = pos.shape\n        # [batch, nodes, nodes, 3]\n        delta_pos = pos.unsqueeze(1) - pos.unsqueeze(2)\n        # [batch, nodes, nodes]\n        distance = delta_pos.norm(dim=-1).view(-1, n_node, n_node)\n        # [batch, nodes, nodes, num_kernel]\n        distance_feature = self.gaussian(distance)\n        # [batch, nodes, nodes, num_heads]\n        attn_bias = self.gaussian_proj(distance_feature)\n        # [batch, num_heads, nodes, nodes]\n        attn_bias = attn_bias.permute(0, 3, 1, 2).contiguous()\n        # apply padding_mask on attn_bias\n        # unsqueezed mask size: [batch, 1, 1, nodes] apply on tensor [batch, num_heads, nodes, nodes]\n        attn_bias.masked_fill_(\n            padding_mask.unsqueeze(1).unsqueeze(2),\n            float(\"-1000\"),\n        )\n        # apply nan_mask on attn_bias\n        # unsqueezed mask size: [batch, 1, 1, 1] apply on tensor [batch, num_heads, nodes, nodes]\n        attn_bias.masked_fill_(\n            nan_mask.unsqueeze(-1).unsqueeze(-1).unsqueeze(-1),\n            0.0,\n        )\n        # apply padding_mask on distance_feature\n        # unsqueezed mask size: [batch, 1, nodes, 1] apply on tensor [batch, nodes, nodes, 
num_kernel]\n        distance_feature.masked_fill_(padding_mask.unsqueeze(1).unsqueeze(-1).to(torch.bool), 0.0)\n        # [batch, nodes, num_kernel]\n        distance_feature_sum = distance_feature.sum(dim=-2)\n        # Output of GaussianLayer is FP32, cast to dtype of self.node_proj here\n        distance_feature_sum = distance_feature_sum.to(self.node_proj.weight.dtype)\n        # [batch, nodes, embed_dim]\n        node_feature = self.node_proj(distance_feature_sum)\n        # apply nan_mask on node_feature\n        # unsqueezed mask size: [batch, 1, 1] apply on tensor [batch, nodes, embed_dim]\n        node_feature.masked_fill_(nan_mask.unsqueeze(1).unsqueeze(2).to(torch.bool), 0.0)\n        # [total_nodes, embed_dim]\n        node_feature = to_sparse_batch(node_feature, idx)\n\n        return attn_bias, node_feature\n\n\nclass GaussianLayer(nn.Module):\n    def __init__(self, num_kernels=128, in_dim=3):\n        r\"\"\"\n            Gaussian kernel function applied to the all-to-all 3D distances.\n        Parameters:\n            num_kernels:\n                Number of gaussian kernels used.\n        \"\"\"\n        super().__init__()\n        self.num_kernels = num_kernels\n        self.means = nn.Embedding(1, num_kernels)\n        self.stds = nn.Embedding(1, num_kernels)\n        nn.init.uniform_(self.means.weight, 0, in_dim)\n        nn.init.uniform_(self.stds.weight, 0, in_dim)\n\n    def forward(self, input: Tensor) -> Tensor:\n        # [batch, nodes, nodes, 1]\n        input = input.unsqueeze(-1)\n        # [batch, nodes, nodes, num_kernels]\n        expanded_input = input.expand(-1, -1, -1, self.num_kernels)\n        # [num_kernels]\n        mean = self.means.weight.float().view(-1)\n        # [num_kernels]\n        std = self.stds.weight.float().view(-1).abs() + 0.01  # epsilon of 0.01, matching the GPS++ value\n        pre_exp_factor = (2 * math.pi) ** 0.5\n        # [batch, nodes, nodes, num_kernels]\n        tensor_with_kernel = torch.exp(-0.5 
* (((expanded_input - mean) / std) ** 2)) / (pre_exp_factor * std)\n        return tensor_with_kernel\n\n\ndef triplets(\n    edge_index: Tensor,\n    num_nodes: int,\n) -> Tuple[Tensor, Tensor, Tensor, Tensor, Tensor, Tensor, Tensor]:\n    r\"\"\"Generates triplets from the given edge indices.\n        A triplet is defined as a path of length two,\n        such that if node A is connected to node B,\n        and node B is connected to node C, then there is a triplet (A, B, C).\n\n    Parameters:\n        edge_index (LongTensor): The edge indices.\n        num_nodes (int): The number of nodes.\n\n    Returns:\n        col: The sink node indices of edges from the edge indices.\n        row: The source node indices of edges from the edge indices.\n        idx_i: The sink node indices of the triplets.\n        idx_j: The middle node indices of the triplets.\n        idx_k: The source node indices of the triplets.\n        idx_kj: The indices of the edges going from the source node to the middle node.\n        idx_ji: The indices of the edges going from the middle node to the sink node.\n    \"\"\"\n    row, col = edge_index  # j->i\n\n    value = torch.arange(row.size(0), device=row.device)\n    adj_t = SparseTensor(row=col, col=row, value=value, sparse_sizes=(num_nodes, num_nodes))\n    adj_t_row = adj_t[row]\n    num_triplets = adj_t_row.set_value(None).sum(dim=1).to(torch.long)\n\n    # Node indices (k->j->i) for triplets.\n    idx_i = col.repeat_interleave(num_triplets)\n    idx_j = row.repeat_interleave(num_triplets)\n    idx_k = adj_t_row.storage.col()\n\n    # Remove self-loop triplets k->j->k, i.e. those where i == k.\n    mask = idx_i != idx_k\n    idx_i, idx_j, idx_k = idx_i[mask], idx_j[mask], idx_k[mask]\n\n    # Edge indices (k->j, j->i) for triplets.\n    idx_kj = adj_t_row.storage.value()[mask]\n    idx_ji = adj_t_row.storage.row()[mask]\n    return col, row, idx_i, idx_j, idx_k, idx_kj, idx_ji\n"
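For a single scalar distance, the expansion computed by `GaussianLayer.forward` reduces to a normalized Gaussian basis. A small pure-Python sketch, where `gaussian_basis` is a hypothetical helper name and the `means`/`stds` lists stand in for the learned `nn.Embedding` weights:

```python
import math

# Hypothetical sketch of the per-distance computation in GaussianLayer:
# expand a scalar distance d into len(means) normalized gaussian kernel
# values. The 0.01 epsilon mirrors the std offset used by the layer.

def gaussian_basis(d, means, stds, eps=0.01):
    out = []
    for mu, sigma in zip(means, stds):
        sigma = abs(sigma) + eps  # keep sigma strictly positive
        z = (d - mu) / sigma
        # exp(-z^2 / 2) / (sqrt(2*pi) * sigma), as in GaussianLayer.forward
        out.append(math.exp(-0.5 * z * z) / (math.sqrt(2.0 * math.pi) * sigma))
    return out
```

At `d` equal to a kernel's mean, the kernel responds with its peak value `1 / (sqrt(2*pi) * sigma)`; kernels with more distant means respond less, so the vector encodes where the distance falls among the kernel centers.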
  },
  {
    "path": "graphium/nn/residual_connections.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals.\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\n\"\"\"\nDifferent types of residual connections, including None, Simple (ResNet-like),\nConcat and DenseNet\n\"\"\"\n\nimport abc\nimport torch\nimport torch.nn as nn\nfrom typing import List, Union, Callable\n\nfrom graphium.nn.base_layers import FCLayer\nfrom graphium.utils.decorators import classproperty\n\n\nclass ResidualConnectionBase(nn.Module):\n    def __init__(self, skip_steps: int = 1):\n        r\"\"\"\n        Abstract class for the residual connections. Using this class,\n        we implement different types of residual connections, such as\n        the ResNet, weighted-ResNet, skip-concat and DensNet.\n\n        The following methods must be implemented in a children class\n\n        - ``h_dim_increase_type()``\n        - ``has_weights()``\n\n        Parameters:\n\n            skip_steps: int\n                The number of steps to skip between the residual connections.\n                If `1`, all the layers are connected. 
If `2`, half of the\n                layers are connected.\n        \"\"\"\n\n        super().__init__()\n        self.skip_steps = skip_steps\n\n    def _bool_apply_skip_step(self, step_idx: int):\n        r\"\"\"\n        Whether to apply the skip connection, depending on the\n        ``step_idx`` and ``self.skip_steps``.\n\n        Parameters:\n\n            step_idx: int\n                The current layer step index.\n\n        \"\"\"\n        return (self.skip_steps != 0) and ((step_idx % self.skip_steps) == 0)\n\n    def __repr__(self):\n        r\"\"\"\n        Controls how the class is printed\n        \"\"\"\n        return f\"{self.__class__.__name__}(skip_steps={self.skip_steps})\"\n\n    @classproperty\n    @abc.abstractmethod\n    def h_dim_increase_type(cls):\n        r\"\"\"\n        How does the dimension of the output features increase after each layer?\n\n        Returns:\n\n            h_dim_increase_type: None or str\n                - ``None``: The dimension of the output features does not change at each layer.\n                E.g. 
ResNet.\n\n                - \"previous\": The dimension of the output features is the concatenation of\n                the previous layer with the new layer.\n\n                - \"cumulative\": The dimension of the output features is the concatenation\n                of all previous layers.\n\n        \"\"\"\n        ...\n\n    def get_true_out_dims(self, out_dims: List) -> List:\n        r\"\"\"\n        find the true output dimensions\n        Parameters:\n            out_dims: List\n        Returns:\n            true_out_dims: List\n        \"\"\"\n        true_out_dims = [out_dims[0]]\n        out_dims_at_skip = [out_dims[0]]\n        for ii in range(1, len(out_dims) - 1):\n            # For the `None` type, don't change the output dims\n            if self.h_dim_increase_type is None:\n                true_out_dims.append(out_dims[ii])\n\n            # For the \"previous\" type, add the previous layers when the skip connection applies\n            elif self.h_dim_increase_type == \"previous\":\n                if self._bool_apply_skip_step(step_idx=ii):\n                    true_out_dims.append(out_dims[ii] + out_dims_at_skip[-1])\n                    out_dims_at_skip.append(out_dims[ii])\n                else:\n                    true_out_dims.append(out_dims[ii])\n\n            # For the \"cumulative\" type, add all previous layers when the skip connection applies\n            elif self.h_dim_increase_type == \"cumulative\":\n                if self._bool_apply_skip_step(step_idx=ii):\n                    true_out_dims.append(out_dims[ii] + out_dims_at_skip[-1])\n                    out_dims_at_skip.append(true_out_dims[ii])\n                else:\n                    true_out_dims.append(out_dims[ii])\n            else:\n                raise ValueError(f\"undefined value: {self.h_dim_increase_type}\")\n\n        return true_out_dims\n\n    @classproperty\n    @abc.abstractmethod\n    def has_weights(cls):\n        r\"\"\"\n        Returns:\n\n         
   has_weights: bool\n                Whether the residual connection uses weights\n\n        \"\"\"\n        ...\n\n\nclass ResidualConnectionNone(ResidualConnectionBase):\n    r\"\"\"\n    No residual connection.\n    This class is only used for simpler code compatibility\n    \"\"\"\n\n    def __init__(self, skip_steps: int = 1):\n        super().__init__(skip_steps=skip_steps)\n\n    def __repr__(self):\n        r\"\"\"\n        Controls how the class is printed\n        \"\"\"\n        return f\"{self.__class__.__name__}\"\n\n    @classproperty\n    def h_dim_increase_type(cls):\n        r\"\"\"\n        Returns:\n\n            None:\n                The dimension of the output features does not change at each layer.\n        \"\"\"\n\n        return None\n\n    @classproperty\n    def has_weights(cls):\n        r\"\"\"\n        Returns:\n\n            False\n                The current class does not use weights\n\n        \"\"\"\n        return False\n\n    def forward(self, h: torch.Tensor, h_prev: torch.Tensor, step_idx: int):\n        r\"\"\"\n        Ignore the skip connection.\n\n        Returns:\n\n            h: torch.Tensor(..., m)\n                Return same as input.\n\n            h_prev: torch.Tensor(..., m)\n                Return same as input.\n\n        \"\"\"\n        return h, h_prev\n\n\nclass ResidualConnectionSimple(ResidualConnectionBase):\n    def __init__(self, skip_steps: int = 1):\n        r\"\"\"\n        Class for the simple residual connections proposed by ResNet,\n        where the current layer output is summed to a\n        previous layer output.\n\n        Parameters:\n\n            skip_steps: int\n                The number of steps to skip between the residual connections.\n                If `1`, all the layers are connected. 
If `2`, half of the\n                layers are connected.\n        \"\"\"\n        super().__init__(skip_steps=skip_steps)\n\n    @classproperty\n    def h_dim_increase_type(cls):\n        r\"\"\"\n        Returns:\n\n            None:\n                The dimension of the output features does not change at each layer.\n        \"\"\"\n\n        return None\n\n    @classproperty\n    def has_weights(cls):\n        r\"\"\"\n        Returns:\n\n            False\n                The current class does not use weights\n\n        \"\"\"\n        return False\n\n    def forward(self, h: torch.Tensor, h_prev: torch.Tensor, step_idx: int):\n        r\"\"\"\n        Add ``h`` with the previous layers with skip connection ``h_prev``,\n        similar to ResNet.\n\n        Parameters:\n\n            h: torch.Tensor(..., m)\n                The current layer features\n\n            h_prev: torch.Tensor(..., m), None\n                The features from the previous layer with a skip connection.\n                At ``step_idx==0``, ``h_prev`` can be set to ``None``.\n\n            step_idx: int\n                Current layer index or step index in the forward loop of the architecture.\n\n        Returns:\n\n            h: torch.Tensor(..., m)\n                Either return ``h`` unchanged, or the sum with ``h_prev``,\n                depending on the ``step_idx`` and ``self.skip_steps``.\n\n            h_prev: torch.Tensor(..., m)\n                Either return ``h_prev`` unchanged, or the same value as ``h``,\n                depending on the ``step_idx`` and ``self.skip_steps``.\n\n        \"\"\"\n        if self._bool_apply_skip_step(step_idx):\n            if step_idx > 0:\n                h = h + h_prev\n            h_prev = h\n\n        return h, h_prev\n\n\nclass ResidualConnectionWeighted(ResidualConnectionBase):\n    def __init__(\n        self,\n        out_dims,\n        skip_steps: int = 1,\n        dropout=0.0,\n        activation: Union[str, Callable] = 
\"none\",\n        normalization=\"none\",\n        bias=False,\n    ):\n        r\"\"\"\n        Class for the simple residual connections proposed by ResNet,\n        with an added layer in the residual connection itself.\n        The layer output is summed to a a non-linear transformation\n        of a previous layer output.\n\n        Parameters:\n\n            skip_steps: int\n                The number of steps to skip between the residual connections.\n                If `1`, all the layers are connected. If `2`, half of the\n                layers are connected.\n\n            out_dims: list(int)\n                list of all output dimensions for the network\n                that will use this residual connection.\n                E.g. ``out_dims = [4, 8, 8, 8, 2]``.\n\n            dropout: float\n                value between 0 and 1.0 representing the percentage of dropout\n                to use in the weights\n\n            activation: str, Callable\n                The activation function to use after the skip weights\n\n            normalization:\n                Normalization to use. 
Choices:\n\n                - \"none\" or `None`: No normalization\n                - \"batch_norm\": Batch normalization\n                - \"layer_norm\": Layer normalization in the hidden layers.\n                - `Callable`: Any callable function\n\n            bias: bool\n                Whether to add a bias after the weights\n\n        \"\"\"\n\n        super().__init__(skip_steps=skip_steps)\n\n        self.residual_list = nn.ModuleList()\n        self.skip_count = 0\n        self.out_dims = out_dims\n\n        for ii in range(0, len(self.out_dims) - 1, self.skip_steps):\n            this_dim = self.out_dims[ii]\n            self.residual_list.append(\n                FCLayer(\n                    this_dim,\n                    this_dim,\n                    activation=activation,\n                    dropout=dropout,\n                    normalization=normalization,\n                    bias=bias,\n                )\n            )\n\n    @classproperty\n    def h_dim_increase_type(cls):\n        r\"\"\"\n        Returns:\n\n            None:\n                The dimension of the output features does not change at each layer.\n        \"\"\"\n        return None\n\n    @classproperty\n    def has_weights(cls):\n        r\"\"\"\n        Returns:\n\n            True\n                The current class uses weights\n\n        \"\"\"\n        return True\n\n    def forward(self, h: torch.Tensor, h_prev: torch.Tensor, step_idx: int):\n        r\"\"\"\n        Add ``h`` with the previous layers with skip connection ``h_prev``, after\n        a feed-forward layer.\n\n        Parameters:\n\n            h: torch.Tensor(..., m)\n                The current layer features\n\n            h_prev: torch.Tensor(..., m), None\n                The features from the previous layer with a skip connection.\n                At ``step_idx==0``, ``h_prev`` can be set to ``None``.\n\n            step_idx: int\n                Current layer index or step index in the forward 
loop of the architecture.\n\n        Returns:\n\n            h: torch.Tensor(..., m)\n                Either return ``h`` unchanged, or the sum with the output of a NN layer\n                on ``h_prev``, depending on the ``step_idx`` and ``self.skip_steps``.\n\n            h_prev: torch.Tensor(..., m)\n                Either return ``h_prev`` unchanged, or the same value as ``h``,\n                depending on the ``step_idx`` and ``self.skip_steps``.\n\n        \"\"\"\n\n        if self._bool_apply_skip_step(step_idx):\n            if step_idx > 0:\n                h = h + self.residual_list[self.skip_count].forward(h_prev)\n                self.skip_count += 1\n            h_prev = h\n\n        return h, h_prev\n\n    def _bool_apply_skip_step(self, step_idx: int):\n        return super()._bool_apply_skip_step(step_idx) and self.skip_count < len(self.residual_list)\n\n\nclass ResidualConnectionConcat(ResidualConnectionBase):\n    def __init__(self, skip_steps: int = 1):\n        r\"\"\"\n        Class for residual connections where\n        the skip connection features are concatenated to the current\n        layer features.\n\n        Parameters:\n\n            skip_steps: int\n                The number of steps to skip between the residual connections.\n                If `1`, all the layers are connected. 
If `2`, half of the\n                layers are connected.\n        \"\"\"\n\n        super().__init__(skip_steps=skip_steps)\n\n    @classproperty\n    def h_dim_increase_type(cls):\n        r\"\"\"\n        Returns:\n\n            \"previous\":\n                The dimension of the output layer is the concatenation with the previous layer.\n        \"\"\"\n\n        return \"previous\"\n\n    @classproperty\n    def has_weights(cls):\n        r\"\"\"\n        Returns:\n\n            False\n                The current class does not use weights\n\n        \"\"\"\n        return False\n\n    def forward(self, h: torch.Tensor, h_prev: torch.Tensor, step_idx: int):\n        r\"\"\"\n        Concatenate ``h`` with the previous layers with skip connection ``h_prev``.\n\n        Parameters:\n\n            h: torch.Tensor(..., m)\n                The current layer features\n\n            h_prev: torch.Tensor(..., n), None\n                The features from the previous layer with a skip connection.\n                Usually, we have ``n`` equal to ``m``.\n                At ``step_idx==0``, ``h_prev`` can be set to ``None``.\n\n            step_idx: int\n                Current layer index or step index in the forward loop of the architecture.\n\n        Returns:\n\n            h: torch.Tensor(..., m) or torch.Tensor(..., m + n)\n                Either return ``h`` unchanged, or the concatenation\n                with ``h_prev``, depending on the ``step_idx`` and ``self.skip_steps``.\n\n            h_prev: torch.Tensor(..., m) or torch.Tensor(..., m + n)\n                Either return ``h_prev`` unchanged, or the same value as ``h``,\n                depending on the ``step_idx`` and ``self.skip_steps``.\n\n        \"\"\"\n\n        if self._bool_apply_skip_step(step_idx):\n            h_in = h\n            if step_idx > 0:\n                h = torch.cat([h, h_prev], dim=-1)\n            h_prev = h_in\n\n        return h, h_prev\n\n\nclass 
ResidualConnectionDenseNet(ResidualConnectionBase):\n    def __init__(self, skip_steps: int = 1):\n        r\"\"\"\n        Class for the residual connections proposed by DenseNet, where\n        all previous skip connection features are concatenated to the current\n        layer features.\n\n        Parameters:\n\n            skip_steps: int\n                The number of steps to skip between the residual connections.\n                If `1`, all the layers are connected. If `2`, half of the\n                layers are connected.\n        \"\"\"\n\n        super().__init__(skip_steps=skip_steps)\n\n    @classproperty\n    def h_dim_increase_type(cls):\n        r\"\"\"\n        Returns:\n\n            \"cumulative\":\n                The dimension of the output layer is the concatenation of all the previous layers.\n        \"\"\"\n\n        return \"cumulative\"\n\n    @classproperty\n    def has_weights(cls):\n        r\"\"\"\n        Returns:\n\n            False\n                The current class does not use weights\n\n        \"\"\"\n        return False\n\n    def forward(self, h: torch.Tensor, h_prev: torch.Tensor, step_idx: int):\n        r\"\"\"\n        Concatenate ``h`` with all the previous layers with skip connection ``h_prev``.\n\n        Parameters:\n\n            h: torch.Tensor(..., m)\n                The current layer features\n\n            h_prev: torch.Tensor(..., n), None\n                The features from the previous layers.\n                n = ((step_idx // self.skip_steps) + 1) * m\n\n                At ``step_idx==0``, ``h_prev`` can be set to ``None``.\n\n            step_idx: int\n                Current layer index or step index in the forward loop of the architecture.\n\n        Returns:\n\n            h: torch.Tensor(..., m) or torch.Tensor(..., m + n)\n                Either return ``h`` unchanged, or the concatenation\n                with ``h_prev``, depending on the ``step_idx`` and ``self.skip_steps``.\n\n            h_prev: 
torch.Tensor(..., m) or torch.Tensor(..., m + n)\n                Either return ``h_prev`` unchanged, or the same value as ``h``,\n                depending on the ``step_idx`` and ``self.skip_steps``.\n\n        \"\"\"\n\n        if self._bool_apply_skip_step(step_idx):\n            if step_idx > 0:\n                h = torch.cat([h, h_prev], dim=-1)\n            h_prev = h\n\n        return h, h_prev\n\n\nclass ResidualConnectionRandom(ResidualConnectionBase):\n    def __init__(self, skip_steps=1, out_dims: List[int] = None, num_layers: int = None):\n        r\"\"\"\n        Class for the random residual connection, where each layer is connected\n        to each following layer with a random weight between 0 and 1.\n        Parameters:\n            skip_steps:\n                Parameter only there for compatibility with other classes of the same parent.\n            out_dims:\n                The list of output dimensions. Only required to get the number\n                of layers. Must be provided if `num_layers` is None.\n            num_layers:\n                The number of layers. 
Must be provided if `out_dims` is None.\n        \"\"\"\n        if skip_steps != 1:\n            raise ValueError(\"Only `skip_steps=1` is implemented\")\n        super().__init__(skip_steps=skip_steps)\n\n        if out_dims is not None:\n            if num_layers is not None:\n                assert num_layers == len(out_dims)\n            num_layers = len(out_dims)\n        if num_layers is None:\n            raise ValueError(\"Either `out_dims` or `num_layers` must be provided\")\n        self.num_layers = num_layers\n\n        self.random_dict_weights = {}\n        for ii in range(1, self.num_layers):\n            random_weights = torch.rand(ii)\n            self.random_dict_weights[ii] = random_weights\n\n    @classproperty\n    def h_dim_increase_type(cls):\n        r\"\"\"\n        Returns:\n            None:\n                The dimension of the output features does not change at each layer.\n        \"\"\"\n\n        return None\n\n    @classproperty\n    def has_weights(cls):\n        r\"\"\"\n        Returns:\n            False\n                The current class does not use weights\n        \"\"\"\n        return False\n\n    def forward(self, h: torch.Tensor, h_prev: torch.Tensor, step_idx: int):\n        r\"\"\"\n        Add ``h`` to a randomly weighted sum of all the previous layer outputs\n        stored in the list ``h_prev``.\n        Parameters:\n            h: torch.Tensor(..., m)\n                The current layer features\n            h_prev: list(torch.Tensor(..., m)), None\n                The list of features from all the previous layers.\n                At ``step_idx==0``, ``h_prev`` can be set to ``None``.\n            step_idx: int\n                Current layer index or step index in the forward loop of the architecture.\n        Returns:\n            h: torch.Tensor(..., m)\n                Either return ``h`` unchanged, or the sum of ``h`` with the randomly\n                weighted previous features in ``h_prev``, depending on the ``step_idx``\n                and ``self.skip_steps``.\n            h_prev: list(torch.Tensor(..., m))\n                The list of previous features, with ``h`` appended.\n        \"\"\"\n\n        if self._bool_apply_skip_step(step_idx):\n            for i in range(0, step_idx):\n                h += (\n                    self.random_dict_weights[step_idx][i].to(dtype=h_prev[i].dtype, device=h_prev[i].device)\n                    * h_prev[i]\n                )\n            if h_prev is None:\n                h_prev = [h]\n            else:\n                h_prev.append(h)\n\n        return h, h_prev\n"
  },
  {
    "path": "graphium/nn/utils.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nimport abc\nimport inspect\nfrom numbers import Real\nfrom typing import Optional\n\n\nclass MupMixin(abc.ABC):\n    @abc.abstractmethod\n    def make_mup_base_kwargs(self, divide_factor: float = 2.0, factor_in_dim: Optional[bool] = None):\n        \"\"\"\n        Create a 'base' model to be used by the `mup` or `muTransfer` scaling of the model.\n        The base model is usually identical to the regular model, but with the\n        layers width divided by a given factor (2 by default)\n\n        This is done using the `scale_kwargs()` method with `scale_factor = 1 / divide_factor`.\n\n        Parameter:\n            divide_factor: Factor by which to divide the width.\n            factor_in_dim: Whether to factor the input dimension for the nodes. 
If None, the default for `scale_kwargs` is used.\n        Returns:\n            kwargs: Dictionary of parameters to be used to instantiate the base model, with widths divided by the factor\n        \"\"\"\n        ...\n\n    def scale_kwargs(self, scale_factor: Real, scale_in_dim: bool = False):\n        \"\"\"\n        Create a \"scaled\" version of the module where the hidden dims are scaled as in muTransfer.\n\n        This can be used with `scale_factor` < 1 to create a \"base\" model to extract shape\n        information as in the `mup` package, or with `scale_factor` > 1 to create a scaled model\n        to which optimal hyperparameters can be \"muTransferred\" from the original model.\n\n        Parameters:\n            scale_factor: Factor by which to scale the width.\n            scale_in_dim: Whether to factor the input dimension for the nodes\n\n        Returns:\n            kwargs: Dictionary of parameters to be used to instantiate the scaled model\n        \"\"\"\n\n        divide_factor = 1 / scale_factor\n\n        if not scale_in_dim:\n            return self.make_mup_base_kwargs(divide_factor=divide_factor)\n\n        # If scale_in_dim is passed, check that it can be forwarded\n        try:\n            return self.make_mup_base_kwargs(divide_factor=divide_factor, factor_in_dim=scale_in_dim)\n        except TypeError as e:\n            raise RuntimeError(\n                \"This error may have been caused by passing scale_in_dim to scale_kwargs \"\n                \"for a class whose make_mup_base_kwargs does not accept factor_in_dim\"\n            ) from e\n"
  },
  {
    "path": "graphium/trainer/README.md",
    "content": "<div align=\"center\">\n    <img src=\"../../docs/images/logo-title.png\" height=\"80px\">\n    <h3>The Graph Of LIfe Library.</h3>\n</div>\n\n\n## What is in this folder?\n\nCode for the metrics, losses, and the trainer.\n\n- ✅ `predictor.py`: the `PredictorModule` class, the main class for the trainer\n- `metrics.py`: metrics for the task heads\n- `losses.py`: custom losses, such as `HybridCELoss`\n- `predictor_options.py`: options for the predictor: intervals, tracking min/max, etc.\n- `predictor_summaries.py`: per-task summaries to track losses and metrics\n"
  },
  {
    "path": "graphium/trainer/__init__.py",
    "content": "from . import predictor\nfrom . import metrics\n\nfrom .predictor import PredictorModule\n"
  },
  {
    "path": "graphium/trainer/losses.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Optional\n\nimport torch\nfrom torch import Tensor\nfrom torch.nn import functional as F\nfrom torch.nn.modules.loss import _WeightedLoss\n\n\nclass HybridCELoss(_WeightedLoss):\n    def __init__(\n        self,\n        n_brackets,\n        regression_loss: str = \"mse\",\n        alpha: float = 0.5,\n        weight: Optional[Tensor] = None,\n        reduction: str = \"mean\",\n    ) -> None:\n        \"\"\"\n        A hybrid between the regression loss (either MAE or MSE) and the cross entropy loss. Intended\n        to be used with noisy regression datasets, for which the targets are assigned to binary brackets,\n        and the task is transformed into a multi-class classification.\n\n        Note that it assumes that the brackets are consecutive integers starting at 0 up to n_brackets,\n        which has an impact on the scale of the regression loss component.\n\n        Parameters:\n            n_brackets: the number of brackets that will be used to group the regression targets.\n                Expected to have the same size as the number of classes in the transformed regression task.\n            regression_loss: type of regression loss, either 'mse' or 'mae'.\n            alpha: weight assigned to the CE loss component. 
Must be a value in the [0, 1] range.\n            weight: a manual rescaling weight given to each class in the CE loss component.\n                If given, has to be a Tensor of the same size as the number of classes.\n            reduction: specifies the reduction to apply to the output: 'none' | 'mean' | 'sum'.\n                'none': no reduction will be applied, 'mean': the sum of the output will be divided\n                by the number of elements in the output, 'sum': the output will be summed.\n        \"\"\"\n        super().__init__(weight=weight, reduction=reduction)\n\n        if regression_loss == \"mae\":\n            self.regression_loss = F.l1_loss\n        elif regression_loss == \"mse\":\n            self.regression_loss = F.mse_loss\n        else:\n            raise ValueError(\n                f\"Expected regression_loss to be in {{'mae', 'mse'}}, received {regression_loss}.\"\n            )\n\n        if alpha < 0 or alpha > 1:\n            raise ValueError(f\"Expected alpha to be in the [0, 1] range, received {alpha}.\")\n\n        self.brackets = torch.arange(n_brackets, dtype=torch.float)\n        self.alpha = alpha\n        self.softmax = torch.nn.Softmax(dim=1)\n\n    def forward(self, input: Tensor, target: Tensor, nan_targets: Tensor = None) -> Tensor:\n        \"\"\"\n        Parameters:\n            input: (batch_size x n_classes) tensor of logits predicted for each bracket.\n            target: (batch_size) or (batch_size, 1) tensor of target brackets in {0, 1, ..., n_brackets - 1}.\n            nan_targets: optional (batch_size) boolean tensor flagging NaN targets, which are\n                masked out of both loss components.\n        \"\"\"\n\n        target = target.flatten()\n\n        # set input and target with nans to 0s for regression loss\n        if nan_targets is not None:\n            input = torch.masked_fill(input, nan_targets.unsqueeze(1), 0)\n            target = torch.masked_fill(target, nan_targets, 0)\n        # the regression loss needs probabilities (normalized logits) as input to take the inner product with self.brackets\n        # we apply softmax on the raw logits first\n        softmax_input = self.softmax(input)\n        # the softmax of a row of 0s is no longer 0s, so nan_targets must be applied again to filter them out\n        if nan_targets is not None:\n            softmax_input = torch.masked_fill(softmax_input, nan_targets.unsqueeze(1), 0.0)\n        # [batch_size, n_classes] * [n_classes] ([0, 1, 2...n_brackets-1]) -> [batch_size]\n        regression_input = torch.inner(softmax_input.to(self.brackets.dtype), self.brackets.to(input.device))\n        regression_loss = self.regression_loss(regression_input, target.float(), reduction=self.reduction)\n        # rescale regression_loss by total_targets / num_real_targets;\n        # factor1 and factor2 guard against division by zero when there are no real targets\n        if nan_targets is not None:\n            num_real_targets = (~nan_targets).sum()\n            factor1 = torch.where(num_real_targets > 0, 1, 0)\n            factor2 = torch.where(num_real_targets > 0, 0, 1)\n            regression_loss = factor1 * regression_loss * nan_targets.numel() / (num_real_targets + factor2)\n\n            # set input and target with nans to -1000s for ce loss\n            input = torch.masked_fill(input, nan_targets.unsqueeze(1), -1000)\n            target = torch.masked_fill(target, nan_targets, -1000)\n        # cross_entropy loss needs raw logits as input\n        # ce_loss does not need scaling as it already ignores the -1000 masked NaN values\n        ce_loss = F.cross_entropy(\n            input, target.long(), weight=self.weight, ignore_index=-1000, reduction=self.reduction\n        )\n\n        return self.alpha * ce_loss + (1 - self.alpha) * regression_loss\n"
  },
  {
    "path": "graphium/trainer/metrics.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Union, Callable, Optional, Dict, Any\n\nimport sys\n\nimport torch\nfrom torch import Tensor\nimport operator as op\n\nfrom torchmetrics.utilities.distributed import reduce\nimport torchmetrics.functional.regression.mae\n\nfrom graphium.utils.tensor import nan_mean\n\n# NOTE(hadim): the below is a fix to be able to import previously saved Graphium model that are incompatible\n# with the current version of torchmetrics.\n# In the future, we should NOT save any torchmetrics objects during serialization.\n# See https://github.com/valence-discovery/graphium/issues/106\nsys.modules[\"torchmetrics.functional.regression.mean_absolute_error\"] = torchmetrics.functional.regression.mae\n\nEPS = 1e-5\n\n\nclass Thresholder:\n    def __init__(\n        self,\n        threshold: float,\n        operator: Union[str, Callable] = \"greater\",\n        th_on_preds: bool = True,\n        th_on_target: bool = False,\n    ):\n        # Basic params\n        self.threshold = threshold\n        self.th_on_target = th_on_target\n        self.th_on_preds = th_on_preds\n        self.operator, self.op_str = self._get_operator(operator)\n\n    def compute(self, preds: Tensor, target: Tensor):\n        # Apply the threshold on the predictions\n        if self.th_on_preds:\n            preds = self.operator(preds, 
self.threshold)\n\n        # Apply the threshold on the targets\n        if self.th_on_target:\n            target = self.operator(target, self.threshold)\n\n        return preds, target\n\n    def __call__(self, preds: Tensor, target: Tensor):\n        return self.compute(preds, target)\n\n    def __repr__(self):\n        r\"\"\"\n        Control how the class is printed\n        \"\"\"\n\n        return f\"{self.op_str}{self.threshold}\"\n\n    @staticmethod\n    def _get_operator(operator):\n        \"\"\"The operator can be either a string or a callable\"\"\"\n        if isinstance(operator, str):\n            op_name = operator.lower()\n            if op_name in [\"greater\", \"gt\"]:\n                op_str = \">\"\n                operator = op.gt\n            elif op_name in [\"lower\", \"lt\"]:\n                op_str = \"<\"\n                operator = op.lt\n            else:\n                raise ValueError(f\"operator `{op_name}` not supported\")\n        elif callable(operator):\n            op_str = operator.__name__\n            if op_str == \"lt\":\n                op_str = \"<\"\n            elif op_str == \"gt\":\n                op_str = \">\"\n        elif operator is None:\n            # Keep `op_str` defined so the return statement below does not raise\n            op_str = \"\"\n        else:\n            raise TypeError(f\"operator must be either `str` or `callable`, provided: `{type(operator)}`\")\n\n        return operator, op_str\n\n    def __getstate__(self):\n        \"\"\"Serialize the class for pickling.\"\"\"\n        state = {}\n        state[\"threshold\"] = self.threshold\n        state[\"th_on_target\"] = self.th_on_target\n        state[\"th_on_preds\"] = self.th_on_preds\n\n        # Set the operator state.\n        # If it's a callable, it's up to the user to ensure it deserializes well\n        operator = self.operator\n        if operator == op.lt:\n            operator = \"lower\"\n        elif operator == op.gt:\n            operator = \"greater\"\n        state[\"operator\"] = operator\n\n        return state\n\n    def __setstate__(self, state: dict):\n        # Restore the operator from its serialized form; updating `__dict__` fully restores\n        # the state, so `__init__` must not be called again (it does not accept `op_str`)\n        state[\"operator\"], state[\"op_str\"] = self._get_operator(state[\"operator\"])\n        self.__dict__.update(state)\n\n    def __eq__(self, obj) -> bool:\n        is_eq = [\n            self.threshold == obj.threshold,\n            self.th_on_target == obj.th_on_target,\n            self.th_on_preds == obj.th_on_preds,\n            self.operator == obj.operator,\n            self.op_str == obj.op_str,\n        ]\n        return all(is_eq)\n\n\nclass MetricWrapper:\n    r\"\"\"\n    Initialize a metric from a name or Callable, and initialize the\n    `Thresholder` in case the metric requires a threshold.\n    \"\"\"\n\n    def __init__(\n        self,\n        metric: Union[str, Callable],\n        threshold_kwargs: Optional[Dict[str, Any]] = None,\n        target_nan_mask: Optional[Union[str, int]] = None,\n        multitask_handling: Optional[str] = None,\n        squeeze_targets: bool = False,\n        target_to_int: bool = False,\n        **kwargs,\n    ):\n        r\"\"\"\n        Parameters:\n            metric:\n                The metric to use. See `METRICS_DICT`\n\n            threshold_kwargs:\n                If `None`, no threshold is applied.\n                Otherwise, the class `Thresholder` is initialized with the\n                provided arguments, and called before `compute`\n\n            target_nan_mask:\n\n                - None: Do not change behaviour if there are NaNs\n\n                - int, float: Value used to replace NaNs. 
For example, if `target_nan_mask==0`, then\n                  all NaNs will be replaced by zeros\n\n                - 'ignore': The NaN values will be removed from the tensor before computing the metrics.\n                  Must be coupled with the `multitask_handling='flatten'` or `multitask_handling='mean-per-label'`.\n\n            multitask_handling:\n                - None: Do not process the tensor before passing it to the metric.\n                  Cannot use the option `multitask_handling=None` when `target_nan_mask=ignore`.\n                  Use either 'flatten' or 'mean-per-label'.\n\n                - 'flatten': Flatten the tensor to produce the equivalent of a single task\n\n                - 'mean-per-label': Loop all the labels columns, process them as a single task,\n                    and average the results over each task\n                  *This option might slow down the computation if there are too many labels*\n\n            squeeze_targets:\n                If true, targets will be squeezed prior to computing the metric.\n                Required in classifigression task.\n\n            target_to_int:\n                If true, targets will be converted to integers prior to computing the metric.\n\n            kwargs:\n                Other arguments to call with the metric\n        \"\"\"\n\n        self.metric, self.metric_name = self._get_metric(metric)\n        self.thresholder = None\n        if threshold_kwargs is not None:\n            self.thresholder = Thresholder(**threshold_kwargs)\n\n        self.target_nan_mask = self._parse_target_nan_mask(target_nan_mask)\n        self.multitask_handling = self._parse_multitask_handling(multitask_handling, self.target_nan_mask)\n        self.squeeze_targets = squeeze_targets\n        self.target_to_int = target_to_int\n        self.kwargs = kwargs\n\n    @staticmethod\n    def _parse_target_nan_mask(target_nan_mask):\n        \"\"\"\n        Parse the `target_nan_mask` parameter\n        
\"\"\"\n\n        if (target_nan_mask is None) or isinstance(target_nan_mask, (int, float)):\n            # None, int, float, are accepted\n            pass\n        elif isinstance(target_nan_mask, Tensor) and (target_nan_mask.numel() == 1):\n            # Only single element tensors are accepted\n            target_nan_mask = target_nan_mask.flatten()[0]\n        elif isinstance(target_nan_mask, str):\n            # Only a few str options are accepted\n            target_nan_mask = target_nan_mask.lower()\n            accepted_str = [\"ignore\", \"none\"]\n            assert (\n                target_nan_mask in accepted_str\n            ), f\"Provided {target_nan_mask} not in accepted_str={accepted_str}\"\n\n            if target_nan_mask == \"none\":\n                target_nan_mask = None\n        else:\n            raise ValueError(f\"Unrecognized option `target_nan_mask={target_nan_mask}`\")\n        return target_nan_mask\n\n    @staticmethod\n    def _parse_multitask_handling(multitask_handling, target_nan_mask):\n        \"\"\"\n        Parse the `multitask_handling` parameter\n        \"\"\"\n\n        if multitask_handling is None:\n            # None is accepted\n            pass\n        elif isinstance(multitask_handling, str):\n            # Only a few str options are accepted\n            multitask_handling = multitask_handling.lower()\n            accepted_str = [\"flatten\", \"mean-per-label\", \"none\"]\n            assert (\n                multitask_handling in accepted_str\n            ), f\"Provided {multitask_handling} not in accepted_str={accepted_str}\"\n\n            if multitask_handling == \"none\":\n                multitask_handling = None\n        else:\n            raise ValueError(f\"Unrecognized option `multitask_handling={multitask_handling}`\")\n\n        if (target_nan_mask == \"ignore\") and (multitask_handling is None):\n            raise ValueError(\n                \"Cannot use the option `multitask_handling=None` when 
`target_nan_mask=ignore`. Use either 'flatten' or 'mean-per-label'\"\n            )\n\n        return multitask_handling\n\n    @staticmethod\n    def _get_metric(metric):\n        from graphium.utils.spaces import METRICS_DICT\n\n        if isinstance(metric, str):\n            metric_name = metric\n            metric = METRICS_DICT[metric]\n        else:\n            metric_name = None\n            metric = metric\n        return metric, metric_name\n\n    def compute(self, preds: Tensor, target: Tensor) -> Tensor:\n        r\"\"\"\n        Compute the metric, apply the thresholder if provided, and manage the NaNs\n        \"\"\"\n        if preds.ndim == 1:\n            preds = preds.unsqueeze(-1)\n\n        if target.ndim == 1:\n            target = target.unsqueeze(-1)\n\n        # Threshold the prediction\n        if self.thresholder is not None:\n            preds, target = self.thresholder(preds, target)\n\n        target_nans = torch.isnan(target)\n\n        # for the classifigression task, cast predictions from\n        # (batch_size, n_targets * n_brackets) to (batch_size, n_targets, n_brackets)\n        # TODO: make this more flexible to the target shape in the future\n        if preds.shape[1] != target.shape[1]:\n            preds = preds.view(target.shape[0], target.shape[1], -1)\n            classifigression = True\n        else:\n            classifigression = False\n\n        if self.multitask_handling is None:\n            # In case of no multi-task handling, apply the nan filtering, then compute the metrics\n            assert (\n                self.target_nan_mask != \"ignore\"\n            ), f\"Cannot use the option `multitask_handling=None` when `target_nan_mask=ignore`. 
Use either 'flatten' or 'mean-per-label'\"\n            preds, target = self._filter_nans(preds, target)\n            if self.squeeze_targets:\n                target = target.squeeze()\n            if self.target_to_int:\n                target = target.to(int)\n            metric_val = self.metric(preds, target, **self.kwargs)\n        elif self.multitask_handling == \"flatten\":\n            # Flatten the tensors, apply the nan filtering, then compute the metrics\n            if classifigression:\n                preds = preds.view(-1, preds.shape[-1])\n                target = target.flatten()\n            else:\n                preds, target = preds.flatten(), target.flatten()\n            preds, target = self._filter_nans(preds, target)\n            if self.squeeze_targets:\n                target = target.squeeze()\n            if self.target_to_int:\n                target = target.to(int)\n            metric_val = self.metric(preds, target, **self.kwargs)\n        elif self.multitask_handling == \"mean-per-label\":\n            # Loop the columns (last dim) of the tensors, apply the nan filtering, compute the metrics per column, then average the metrics\n            target_list = [target[..., ii][~target_nans[..., ii]] for ii in range(target.shape[-1])]\n            # TODO: make this more flexible to the target shape in the future\n            if classifigression:\n                preds_list = [preds[..., i, :][~target_nans[..., i]] for i in range(preds.shape[1])]\n            else:\n                preds_list = [preds[..., ii][~target_nans[..., ii]] for ii in range(preds.shape[-1])]\n            metric_val = []\n            for ii in range(len(target_list)):\n                try:\n                    this_preds, this_target = self._filter_nans(preds_list[ii], target_list[ii])\n                    if self.squeeze_targets:\n                        this_target = this_target.squeeze()\n                    if self.target_to_int:\n                        
this_target = this_target.to(int)\n                    metric_val.append(self.metric(this_preds, this_target, **self.kwargs))\n                except Exception:\n                    # Skip labels for which the metric cannot be computed\n                    # (e.g. a column left empty after NaN filtering)\n                    pass\n            # Average the metric\n            metric_val = nan_mean(torch.stack(metric_val))\n        else:\n            # Wrong option\n            raise ValueError(f\"Invalid option `self.multitask_handling={self.multitask_handling}`\")\n\n        return metric_val\n\n    def _filter_nans(self, preds: Tensor, target: Tensor):\n        \"\"\"Handle the NaNs according to the chosen options\"\"\"\n        target_nans = torch.isnan(target)\n\n        if self.target_nan_mask is None:\n            pass\n        elif isinstance(self.target_nan_mask, (int, float)):\n            target = target.clone()\n            target[target_nans] = self.target_nan_mask\n        elif self.target_nan_mask == \"ignore\":\n            target = target[~target_nans]\n            preds = preds[~target_nans]\n        else:\n            raise ValueError(f\"Invalid option `{self.target_nan_mask}`\")\n        return preds, target\n\n    def __call__(self, preds: Tensor, target: Tensor) -> Tensor:\n        r\"\"\"\n        Compute the metric with the method `self.compute`\n        \"\"\"\n        return self.compute(preds, target)\n\n    def __repr__(self):\n        r\"\"\"\n        Control how the class is printed\n        \"\"\"\n        full_str = f\"{self.metric.__name__}\"\n        if self.thresholder is not None:\n            full_str += f\"({self.thresholder})\"\n\n        return full_str\n\n    def __eq__(self, obj) -> bool:\n        is_eq = [\n            self.metric == obj.metric,\n            self.metric_name == obj.metric_name,\n            self.thresholder == obj.thresholder,\n            self.target_nan_mask == obj.target_nan_mask,\n            self.multitask_handling == obj.multitask_handling,\n            self.squeeze_targets == obj.squeeze_targets,\n            self.target_to_int == 
obj.target_to_int,\n            self.kwargs == obj.kwargs,\n        ]\n        return all(is_eq)\n\n    def __getstate__(self):\n        \"\"\"Serialize the class for pickling.\"\"\"\n        state = {}\n        if self.metric_name is None:\n            state[\"metric\"] = self.metric\n        else:\n            state[\"metric\"] = self.metric_name\n        state[\"target_nan_mask\"] = self.target_nan_mask\n        state[\"multitask_handling\"] = self.multitask_handling\n        state[\"squeeze_targets\"] = self.squeeze_targets\n        state[\"target_to_int\"] = self.target_to_int\n        state[\"kwargs\"] = self.kwargs\n        state[\"threshold_kwargs\"] = None\n        if self.thresholder is not None:\n            state[\"threshold_kwargs\"] = self.thresholder.__getstate__()\n        return state\n\n    def __setstate__(self, state: dict):\n        \"\"\"Reload the class from pickling.\"\"\"\n        state[\"metric\"], state[\"metric_name\"] = self._get_metric(state[\"metric\"])\n        thresholder = state.pop(\"threshold_kwargs\", None)\n        if thresholder is not None:\n            thresholder = Thresholder(**thresholder)\n        state[\"thresholder\"] = thresholder\n\n        self.__dict__.update(state)\n"
  },
  {
    "path": "graphium/trainer/predictor.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nimport time\nfrom copy import deepcopy\nfrom typing import Any, Callable, Dict, List, Optional, Tuple, Type, Union\n\nimport lightning\nimport numpy as np\nimport torch\nfrom loguru import logger\nfrom mup.optim import MuAdam\nfrom torch import Tensor, nn\nfrom torch_geometric.data import Batch, Data\n\nfrom graphium.config.config_convert import recursive_config_reformating\nfrom graphium.data.datamodule import BaseDataModule\nfrom graphium.trainer.metrics import MetricWrapper\nfrom graphium.trainer.predictor_options import (\n    EvalOptions,\n    FlagOptions,\n    ModelOptions,\n    OptimOptions,\n)\nfrom graphium.trainer.predictor_summaries import TaskSummaries\nfrom graphium.utils import fs\nfrom graphium.utils.moving_average_tracker import MovingAverageTracker\nfrom graphium.utils.spaces import GRAPHIUM_PRETRAINED_MODELS_DICT\nfrom graphium.utils.tensor import dict_tensor_fp16_to_fp32\n\n\nclass PredictorModule(lightning.LightningModule):\n    def __init__(\n        self,\n        model_class: Type[nn.Module],\n        model_kwargs: Dict[str, Any],\n        loss_fun: Dict[str, Union[str, Callable]],\n        task_levels: Dict[str, str],\n        random_seed: int = 42,\n        featurization: Dict[str, str] = None,\n        optim_kwargs: Optional[Dict[str, Any]] = None,\n        torch_scheduler_kwargs: 
Optional[Dict[str, Any]] = None,\n        scheduler_kwargs: Optional[Dict[str, Any]] = None,\n        target_nan_mask: Optional[Union[str, int]] = None,\n        multitask_handling: Optional[str] = None,\n        metrics: Dict[str, Callable] = None,\n        metrics_on_progress_bar: Dict[str, List[str]] = {},\n        metrics_on_training_set: Optional[Dict[str, List[str]]] = None,\n        flag_kwargs: Dict[str, Any] = None,\n        task_norms: Optional[Dict[Callable, Any]] = None,\n        metrics_every_n_train_steps: Optional[int] = None,\n        replicas: int = 1,\n        gradient_acc: int = 1,\n        global_bs: Optional[int] = 1,\n    ):\n        \"\"\"\n        The Lightning module responsible for handling the predictions, losses, metrics, optimization, etc.\n        It works in a multi-task setting, with different losses and metrics per task.\n\n        Parameters:\n            model_class: The torch Module containing the main forward function\n            model_kwargs: The arguments to initialize the model from `model_class`\n            loss_fun: A `dict[str, fun]`, where `str` is the task name and `fun` the loss function\n            random_seed: The seed for random initialization\n            optim_kwargs: The optimization arguments. See class `OptimOptions`\n            torch_scheduler_kwargs: The torch scheduler arguments. See class `OptimOptions`\n            scheduler_kwargs: The lightning scheduler arguments. See class `OptimOptions`\n            target_nan_mask: How to handle the NaNs. 
See `MetricWrapper` for options\n            metrics: A `dict[str, fun]`, where `str` is the task name and `fun` the metric function\n            metrics_on_progress_bar: A `dict[str, list[str2]]`, where `str` is the task name and `str2` the metrics to include on the progress bar\n            metrics_on_training_set: A `dict[str, list[str2]]`, where `str` is the task name and `str2` the metrics to include on the training set\n            flag_kwargs: Arguments related to using the FLAG adversarial augmentation\n            task_norms: the normalization for each task\n            metrics_every_n_train_steps: Compute and log metrics every n training steps.\n                Set to `None` to never log the training metrics and statistics (the default). Set to `1` to log at every step.\n        \"\"\"\n        self.save_hyperparameters()\n\n        self.random_seed = random_seed\n        torch.random.manual_seed(self.random_seed)\n        np.random.seed(self.random_seed)\n\n        self.target_nan_mask = target_nan_mask\n        self.multitask_handling = multitask_handling\n        self.task_levels = task_levels\n        self.featurization = featurization\n        self.task_norms = task_norms\n\n        super().__init__()\n\n        # Setting the model options\n        self.model_kwargs = model_kwargs\n        self._model_options = ModelOptions(model_class=model_class, model_kwargs=model_kwargs)\n        # Setting the optimizer options\n        self.optim_options = OptimOptions(\n            optim_kwargs=optim_kwargs,\n            torch_scheduler_kwargs=torch_scheduler_kwargs,\n            scheduler_kwargs=scheduler_kwargs,\n        )\n        # Setting the evaluation options\n        eval_options = {}\n        for task in loss_fun:\n            eval_options[task] = EvalOptions(\n                loss_fun=loss_fun[task],\n                metrics=metrics[task],\n                metrics_on_progress_bar=metrics_on_progress_bar[task],\n                
metrics_on_training_set=(\n                    metrics_on_training_set[task] if metrics_on_training_set is not None else None\n                ),\n            )\n            eval_options[task].check_metrics_validity()\n\n        self._eval_options_dict: Dict[str, EvalOptions] = eval_options\n        self._eval_options_dict = {\n            self._get_task_key(task_level=task_levels[key], task=key): value\n            for key, value in self._eval_options_dict.items()\n        }\n        # Setting the flag options\n        self._flag_options = FlagOptions(flag_kwargs=flag_kwargs)\n\n        self.model = self._model_options.model_class(**self._model_options.model_kwargs)\n\n        loss_fun = {\n            self._get_task_key(task_level=task_levels[key], task=key): value\n            for key, value in loss_fun.items()\n        }\n        self.tasks = list(loss_fun.keys())\n\n        # Task-specific evaluation attributes\n        self.loss_fun = {}\n        self.metrics = {}\n        self.metrics_on_progress_bar = {}\n        self.metrics_on_training_set = {}\n        for task in self.tasks:\n            self.loss_fun[task] = EvalOptions.parse_loss_fun(loss_fun[task])\n            self.metrics[task] = (\n                self._eval_options_dict[task].metrics\n                if self._eval_options_dict[task].metrics is not None\n                else {}\n            )\n            self.metrics_on_progress_bar[task] = self._eval_options_dict[task].metrics_on_progress_bar\n            self.metrics_on_training_set[task] = (\n                list(self.metrics[task].keys())\n                if self._eval_options_dict[task].metrics_on_training_set is None\n                else self._eval_options_dict[task].metrics_on_training_set\n            )\n        self.n_params = sum(p.numel() for p in self.parameters() if p.requires_grad)\n\n        # Set the parameters and default values for the FLAG adversarial augmentation, and check values\n        self._flag_options.set_kwargs()\n   
     self.flag_kwargs = self._flag_options.flag_kwargs\n\n        # Set the parameters for optimizer options\n        self.optim_options.set_kwargs()\n\n        # Initialize the epoch summary\n        monitor = self.optim_options.scheduler_kwargs[\"monitor\"].split(\"/\")[0]\n        mode = self.optim_options.scheduler_kwargs[\"mode\"]\n\n        self.task_epoch_summary = TaskSummaries(\n            task_loss_fun=self.loss_fun,\n            task_metrics=self.metrics,\n            task_metrics_on_training_set=self.metrics_on_training_set,\n            task_metrics_on_progress_bar=self.metrics_on_progress_bar,\n            monitor=monitor,\n            mode=mode,\n        )\n\n        # This helps avoid a bug when saving hparams to yaml with different dict or str formats\n        self._set_hparams(recursive_config_reformating(self.hparams))\n\n        # throughput estimation\n        self.mean_val_time_tracker = MovingAverageTracker()\n        self.mean_val_tput_tracker = MovingAverageTracker()\n        self.validation_step_outputs = []\n        self.test_step_outputs = []\n        self.epoch_start_time = None\n\n        # Decide whether to log every step or once at the end\n        # of the epoch.\n        self.metrics_every_n_train_steps = metrics_every_n_train_steps\n\n        self.samples_seen = 0\n        self.global_bs = global_bs\n\n    def forward(\n        self, inputs: Dict\n    ) -> Dict[str, Union[Tensor, Dict[str, Tensor], Dict[str, Dict[str, Tensor]]]]:\n        r\"\"\"\n        Returns the result of `self.model.forward(*inputs)` on the inputs.\n        If the output of `out = self.model.forward` is a dictionary with a `\"preds\"` key,\n        it is returned directly. 
Otherwise, a new dictionary is created and\n        returns `{\"preds\": out}`.\n\n        Returns:\n            A dict with a key `\"preds\"` representing the prediction of the network.\n\n        \"\"\"\n        # Convert to the right dtype and run the model\n        feats = self._convert_features_dtype(inputs[\"features\"])\n        # *check for nan in model output\n        out = self.model.forward(feats)\n        if isinstance(out, dict) and (\"preds\" in out.keys()):\n            out_dict = out\n        else:\n            out_dict = {\"preds\": out}\n\n        return out_dict\n\n    def _convert_features_dtype(self, feats):\n        # Convert features to dtype\n        if isinstance(feats, torch.Tensor):\n            feats = feats.to(self.dtype)\n        elif isinstance(feats, (Data, Batch, dict)):\n            for key, val in feats.items():\n                if isinstance(val, torch.Tensor) and (val.is_floating_point()):\n                    feats[key] = val.to(dtype=self.dtype)\n        return feats\n\n    def _get_task_key(self, task_level: str, task: str):\n        task_prefix = f\"{task_level}_\"\n        if not task.startswith(task_prefix):\n            task = task_prefix + task\n        return task\n\n    def configure_optimizers(self, impl=None):\n        if impl is None:\n            impl = torch.optim.Adam\n\n        # Define the optimizer and schedulers\n        optimiser = MuAdam(self.parameters(), **self.optim_options.optim_kwargs, impl=impl)\n        self.optim_options.torch_scheduler_kwargs.pop(\"module_type\")\n        torch_scheduler = self.optim_options.scheduler_class(\n            optimizer=optimiser, **self.optim_options.torch_scheduler_kwargs\n        )\n        scheduler = {\n            \"scheduler\": torch_scheduler,\n            **self.optim_options.scheduler_kwargs,\n        }\n        return [optimiser], [scheduler]\n\n    @staticmethod\n    def compute_loss(\n        preds: Dict[str, Tensor],\n        targets: Dict[str, Tensor],\n   
     weights: Optional[Tensor],\n        loss_fun: Dict[str, Callable],\n        target_nan_mask: Optional[Union[str, int]] = None,\n        multitask_handling: Optional[str] = None,\n    ) -> Tuple[Tensor, Dict[str, Tensor]]:\n        r\"\"\"\n        Compute the loss using the specified loss function, handling\n        the NaNs in the `targets`.\n\n        Parameters:\n            preds:\n                Predicted values\n\n            targets:\n                Target values\n\n            weights:\n                No longer supported, will raise an error.\n\n            target_nan_mask:\n\n                - None: Do not change behaviour if there are NaNs\n\n                - int, float: Value used to replace NaNs. For example, if `target_nan_mask==0`, then\n                  all NaNs will be replaced by zeros\n\n                - 'ignore': The NaN values will be removed from the tensor before computing the metrics.\n                  Must be coupled with the `multitask_handling='flatten'` or `multitask_handling='mean-per-label'`.\n\n            multitask_handling:\n                - None: Do not process the tensor before passing it to the metric.\n                  Cannot use the option `multitask_handling=None` when `target_nan_mask=ignore`.\n                  Use either 'flatten' or 'mean-per-label'.\n\n                - 'flatten': Flatten the tensor to produce the equivalent of a single task\n\n                - 'mean-per-label': Loop over all the label columns, process each as a single task,\n                    and average the results over the tasks.\n                  *This option might slow down the computation if there are too many labels*\n\n            loss_fun:\n                Loss function to use\n\n        Returns:\n            Tensor:\n                weighted_loss: Resulting weighted loss\n                all_task_losses: Loss per task\n        \"\"\"\n\n        wrapped_loss_fun_dict = {\n            task: MetricWrapper(\n                
metric=loss,\n                threshold_kwargs=None,\n                target_nan_mask=target_nan_mask,\n                multitask_handling=multitask_handling,\n            )\n            for task, loss in loss_fun.items()\n        }\n\n        if weights is not None:\n            raise NotImplementedError(\"Weights are no longer supported in the loss\")\n        all_task_losses = {\n            task: wrapped(preds=preds[task], target=targets[task])\n            for task, wrapped in wrapped_loss_fun_dict.items()\n        }\n        total_loss = torch.sum(torch.stack(list(all_task_losses.values())), dim=0)\n        num_tasks = len(all_task_losses.keys())\n        weighted_loss = total_loss / num_tasks\n        return weighted_loss, all_task_losses\n\n    def _general_step(self, batch: Dict[str, Tensor], step_name: str, to_cpu: bool) -> Dict[str, Any]:\n        r\"\"\"Common code for training_step, validation_step and testing_step\"\"\"\n        preds = self.forward(batch)  # The dictionary of predictions\n\n        # * check for nan in model output\n        targets_dict = batch.get(\"labels\")\n\n        # Different type of preds can be return by the forward\n        if isinstance(preds, dict) and (\"preds\" in preds.keys()):\n            preds = preds[\"preds\"]\n        elif isinstance(preds, Tensor):\n            preds = {k: preds[ii] for ii, k in enumerate(targets_dict.keys())}\n\n        preds = {\n            self._get_task_key(task_level=self.task_levels[key], task=key): value\n            for key, value in preds.items()\n        }\n        # preds = {k: preds[ii] for ii, k in enumerate(targets_dict.keys())}\n        for task, pred in preds.items():\n            task_specific_norm = self.task_norms[task] if self.task_norms is not None else None\n            if hasattr(task_specific_norm, \"normalize_val_test\"):\n                normalize_val_test = task_specific_norm.normalize_val_test\n            else:\n                normalize_val_test = False\n           
 if step_name != \"train\" and not normalize_val_test:\n                # apply denormalization for val and test predictions for correct loss and metrics evaluation\n                # if normalize_val_test is not true, only train loss will stay as the normalized version\n                # if normalize_val_test is true, no denormalization is applied, all losses and metrics are normalized version\n                preds[task] = task_specific_norm.denormalize(pred)\n            targets_dict[task] = targets_dict[task].to(dtype=pred.dtype)\n        weights = batch.get(\"weights\", None)\n\n        loss, task_losses = self.compute_loss(\n            preds=preds,\n            targets=targets_dict,\n            weights=weights,\n            loss_fun=self.loss_fun,\n            target_nan_mask=self.target_nan_mask,\n            multitask_handling=self.multitask_handling,\n        )\n\n        device = \"cpu\" if to_cpu else None\n        for task in preds:\n            task_specific_norm = self.task_norms[task] if self.task_norms is not None else None\n            if hasattr(task_specific_norm, \"normalize_val_test\"):\n                normalize_val_test = task_specific_norm.normalize_val_test\n            else:\n                normalize_val_test = False\n            if step_name == \"train\" and not normalize_val_test:\n                # apply denormalization for targets and predictions for the evaluation of training metrics (excluding loss)\n                # if normalize_val_test is not true, train loss will stay as the normalized version\n                # if normalize_val_test is true, no denormalization is applied, all losses and metrics are normalized version\n                preds[task] = task_specific_norm.denormalize(preds[task])\n                targets_dict[task] = task_specific_norm.denormalize(targets_dict[task])\n            preds[task] = preds[task].detach().to(device=device)\n            targets_dict[task] = targets_dict[task].detach().to(device=device)\n   
     if weights is not None:\n            weights = weights.detach().to(device=device)\n\n        step_dict = {\"preds\": preds, \"targets\": targets_dict, \"weights\": weights}\n        for task in self.tasks:\n            step_dict[\n                self.task_epoch_summary.metric_log_name(task, self.loss_fun[task]._get_name(), step_name)\n            ] = loss.detach()\n\n        step_dict[\"loss\"] = loss\n        step_dict[\"task_losses\"] = task_losses\n        step_dict[\"gradient_norm\"] = self.get_gradient_norm()\n        return step_dict\n\n    def flag_step(self, batch: Dict[str, Tensor], step_name: str, to_cpu: bool) -> Dict[str, Any]:\n        r\"\"\"\n        Perform adversarial data augmentation during one training step using FLAG.\n        Paper: https://arxiv.org/abs/2010.09891\n        Github: https://github.com/devnkong/FLAG\n        \"\"\"\n\n        alpha, n_steps = self.flag_kwargs[\"alpha\"], self.flag_kwargs[\"n_steps\"]\n\n        X = self._convert_features_dtype(batch[\"features\"])\n        X_shape = X[\"feat\"].shape\n\n        pert = torch.FloatTensor(X_shape).uniform_(-alpha, alpha).to(device=X[\"feat\"].device)\n        pert.requires_grad = True\n\n        # Perturb the features\n        pert_batch = deepcopy(batch)\n        features = pert_batch[\"features\"]\n        features[\"feat\"] = features[\"feat\"] + pert\n\n        preds = self.forward(pert_batch)[\"preds\"]\n        targets = batch.pop(\"labels\")\n        for key in targets.keys():\n            targets[key] = targets[key].to(dtype=preds[key].dtype)\n        weights = batch.pop(\"weights\", None)\n        loss, task_losses = self.compute_loss(\n            
preds=preds,\n            targets=targets,\n            weights=weights,\n            target_nan_mask=self.target_nan_mask,\n            multitask_handling=self.multitask_handling,\n            loss_fun=self.loss_fun,\n        )\n\n        loss = loss / n_steps\n\n        # Iteratively augment data by applying perturbations\n        # Accumulate the gradients to be applied to the weights of the network later on\n        for _ in range(n_steps - 1):\n            loss.backward()\n            pert_data = pert.detach() + alpha * torch.sign(pert.grad.detach())\n            pert.data = pert_data.data\n            pert.grad[:] = 0\n            features[\"feat\"] = features[\"feat\"] + pert\n            pert_batch[\"features\"] = features\n            preds = self.forward(pert_batch)[\"preds\"]\n            loss, _ = self.compute_loss(\n                preds=preds,\n                targets=targets,\n                weights=weights,\n                target_nan_mask=self.target_nan_mask,\n                multitask_handling=self.multitask_handling,\n                loss_fun=self.loss_fun,\n            )\n            loss = loss / n_steps\n\n        device = \"cpu\" if to_cpu else None\n        for key in preds.keys():\n            preds[key] = preds[key].detach().to(device=device)\n            targets[key] = targets[key].detach().to(device=device)\n        if weights is not None:\n            weights = weights.detach().to(device=device)\n\n        step_dict = {\"preds\": preds, \"targets\": targets, \"weights\": weights}\n        step_dict[f\"loss/{step_name}\"] = loss.detach().cpu()\n        step_dict[\"loss\"] = loss\n        step_dict[\"task_losses\"] = task_losses\n        return step_dict\n\n    def on_train_batch_start(self, batch: Any, batch_idx: int) -> Optional[int]:\n        self.train_batch_start_time = time.time()\n        self.skip_log_train_metrics = (self.metrics_every_n_train_steps is None) or (\n            (batch_idx % self.metrics_every_n_train_steps) != 
0\n        )\n        return super().on_train_batch_start(batch, batch_idx)\n\n    def on_train_batch_end(self, outputs, batch: Any, batch_idx: int) -> None:\n        train_batch_time = time.time() - self.train_batch_start_time  # To be used for throughput calculation\n\n        # Get the metrics that are logged at every step (loss, grad_norm, batch_time, batch_tput)\n        concatenated_metrics_logs = {}\n        concatenated_metrics_logs[\"train/loss\"] = outputs[\"loss\"]\n        concatenated_metrics_logs[\"epoch_count\"] = self.current_epoch\n        # Increment by the batch size\n        self.samples_seen += self.global_bs\n        concatenated_metrics_logs[\"samples_seen\"] = self.samples_seen\n\n        # Report the training loss for each individual task\n        for task in self.tasks:\n            concatenated_metrics_logs[f\"train/loss/{task}\"] = outputs[\"task_losses\"][task]\n\n        # The individual task losses may be tensors of size (gradient accumulation * replication * device_iter);\n        # take the mean after filtering out the zeros\n        for key in concatenated_metrics_logs:\n            if isinstance(concatenated_metrics_logs[key], torch.Tensor):\n                if concatenated_metrics_logs[key].numel() > 1:\n                    concatenated_metrics_logs[key] = concatenated_metrics_logs[key][\n                        concatenated_metrics_logs[key] != 0\n                    ].mean()\n\n        # If logging is skipped for this step, then log the important metrics anyway and return\n        if self.skip_log_train_metrics:\n            if self.logger is not None:\n                self.logger.log_metrics(\n                    concatenated_metrics_logs, step=self.global_step\n                )  # This is a pytorch lightning function call\n            return\n\n        ### The code below is not executed if the logging is skipped for this step ###\n\n        # Get the throughput of the batch\n        num_graphs = 
self.get_num_graphs(batch[\"features\"])\n        tput = num_graphs / train_batch_time\n        concatenated_metrics_logs[\"train/batch_time\"] = train_batch_time\n        concatenated_metrics_logs[\"train/batch_tput\"] = tput\n\n        # Compute all the metrics for the training set\n        self.task_epoch_summary.update_predictor_state(\n            step_name=\"train\",\n            targets=outputs[\"targets\"],\n            preds=outputs[\"preds\"],\n            loss=outputs[\"loss\"],  # This is the weighted loss for now, but change to task-specific loss\n            task_losses=outputs[\"task_losses\"],\n            n_epochs=self.current_epoch,\n        )\n        metrics_logs = self.task_epoch_summary.get_metrics_logs()  # Dict[task, metric_logs]\n        metrics_logs[\"_global\"][\"grad_norm\"] = self.get_gradient_norm()\n        concatenated_metrics_logs.update(metrics_logs)\n\n        # Log the metrics\n        if self.logger is not None:\n            self.logger.log_metrics(\n                concatenated_metrics_logs, step=self.global_step\n            )  # This is a pytorch lightning function call\n\n    def training_step(self, batch: Dict[str, Tensor], to_cpu: bool = True) -> Dict[str, Any]:\n        step_dict = None\n\n        # Train using FLAG\n        if self.flag_kwargs[\"n_steps\"] > 0:\n            step_dict = self.flag_step(batch=batch, step_name=\"train\", to_cpu=to_cpu)\n        # Train normally, without using FLAG\n        elif self.flag_kwargs[\"n_steps\"] == 0:\n            # step_dict = self._general_step(batch=batch, step_name=\"train\", to_cpu=True)\n            step_dict = self._general_step(batch=batch, step_name=\"train\", to_cpu=to_cpu)\n\n        # Remove the preds and targets if no logging is required\n        if self.skip_log_train_metrics:\n            step_dict.pop(\"preds\")\n            step_dict.pop(\"targets\")\n        return step_dict  # Returning the metrics_logs with the loss\n\n    def get_gradient_norm(self):\n        
# compute the norm\n        total_norm = torch.tensor(0.0)\n        for p in self.parameters():\n            if p.grad is not None:\n                param_norm = p.grad.detach().data.norm(2)\n                total_norm += param_norm.detach().cpu() ** 2\n        total_norm = total_norm**0.5\n        return total_norm\n\n    def validation_step(self, batch: Dict[str, Tensor], to_cpu: bool = True) -> Dict[str, Any]:\n        return self._general_step(batch=batch, step_name=\"val\", to_cpu=to_cpu)\n\n    def test_step(self, batch: Dict[str, Tensor], to_cpu: bool = True) -> Dict[str, Any]:\n        return self._general_step(batch=batch, step_name=\"test\", to_cpu=to_cpu)\n\n    def _general_epoch_end(self, outputs: Dict[str, Any], step_name: str, device: str) -> None:\n        r\"\"\"Common code for training_epoch_end, validation_epoch_end and testing_epoch_end\"\"\"\n        # Transform the list of dict of dict, into a dict of list of dict\n        preds = {}\n        targets = {}\n        for task in self.tasks:\n            preds[task] = torch.cat([out[\"preds\"][task].to(device) for out in outputs], dim=0)\n            targets[task] = torch.cat([out[\"targets\"][task].to(device) for out in outputs], dim=0)\n        if (\"weights\" in outputs[0].keys()) and (outputs[0][\"weights\"] is not None):\n            weights = torch.cat([out[\"weights\"].to(device) for out in outputs], dim=0)\n        else:\n            weights = None\n\n        # NOTE: Computing the loss over the entire split may cause\n        # overflow issues when using fp16\n        loss, task_losses = self.compute_loss(\n            preds=dict_tensor_fp16_to_fp32(preds),\n            targets=dict_tensor_fp16_to_fp32(targets),\n            weights=weights,\n            target_nan_mask=self.target_nan_mask,\n            multitask_handling=self.multitask_handling,\n            loss_fun=self.loss_fun,\n        )\n\n        self.task_epoch_summary.update_predictor_state(\n            step_name=step_name,\n   
         preds=preds,\n            targets=targets,\n            loss=loss,\n            task_losses=task_losses,\n            n_epochs=self.current_epoch,\n        )\n        metrics_logs = self.task_epoch_summary.get_metrics_logs()\n        self.task_epoch_summary.set_results(task_metrics=metrics_logs)\n\n        return metrics_logs  # Consider returning concatenated dict for logging\n\n    def on_train_epoch_start(self) -> None:\n        self.epoch_start_time = time.time()\n\n    def on_train_epoch_end(self) -> None:\n        if self.epoch_start_time is None:\n            logger.warning(\"epoch timer not initialized\")\n        else:\n            epoch_time = time.time() - self.epoch_start_time\n            self.epoch_start_time = None\n            self.log(\"epoch_time\", torch.tensor(epoch_time), sync_dist=True)\n\n    def on_validation_epoch_start(self) -> None:\n        self.mean_val_time_tracker.reset()\n        self.mean_val_tput_tracker.reset()\n        return super().on_validation_epoch_start()\n\n    def on_validation_batch_start(self, batch: Any, batch_idx: int, dataloader_idx: int = 0) -> None:\n        self.validation_batch_start_time = time.time()\n        return super().on_validation_batch_start(batch, batch_idx, dataloader_idx)\n\n    def on_validation_batch_end(\n        self, outputs: Any, batch: Any, batch_idx: int, dataloader_idx: int = 0\n    ) -> None:\n        val_batch_time = time.time() - self.validation_batch_start_time\n        self.validation_step_outputs.append(outputs)\n        self.mean_val_time_tracker.update(val_batch_time)\n        num_graphs = self.get_num_graphs(batch[\"features\"])\n        self.mean_val_tput_tracker.update(num_graphs / val_batch_time)\n        return super().on_validation_batch_end(outputs, batch, batch_idx, dataloader_idx)\n\n    def on_validation_epoch_end(self) -> None:\n        metrics_logs = self._general_epoch_end(\n            outputs=self.validation_step_outputs, step_name=\"val\", device=\"cpu\"\n    
    )\n        self.validation_step_outputs.clear()\n        concatenated_metrics_logs = self.task_epoch_summary.concatenate_metrics_logs(metrics_logs)\n        concatenated_metrics_logs[\"val/mean_time\"] = torch.tensor(self.mean_val_time_tracker.mean_value)\n        concatenated_metrics_logs[\"val/mean_tput\"] = self.mean_val_tput_tracker.mean_value\n        self.log_dict(concatenated_metrics_logs, sync_dist=True)\n\n        # Save yaml file with the per-task metrics summaries\n        full_dict = {}\n        full_dict.update(self.task_epoch_summary.get_dict_summary())\n\n    def on_test_batch_end(self, outputs: Any, batch: Any, batch_idx: int, dataloader_idx: int = 0) -> None:\n        self.test_step_outputs.append(outputs)\n\n    def on_test_epoch_end(self) -> None:\n        metrics_logs = self._general_epoch_end(outputs=self.test_step_outputs, step_name=\"test\", device=\"cpu\")\n        self.test_step_outputs.clear()\n        concatenated_metrics_logs = self.task_epoch_summary.concatenate_metrics_logs(metrics_logs)\n\n        self.log_dict(concatenated_metrics_logs, sync_dist=True)\n\n        # Save yaml file with the per-task metrics summaries\n        full_dict = {}\n        full_dict.update(self.task_epoch_summary.get_dict_summary())\n\n    def on_train_start(self):\n        hparams_log = deepcopy(self.hparams)\n        hparams_log[\"n_params\"] = self.n_params\n        if self.logger is not None:\n            self.logger.log_hyperparams(hparams_log)\n\n    def get_progress_bar_dict(self) -> Dict[str, float]:\n        prog_dict = {}\n        prog_dict[\"loss\"] = self.task_epoch_summary.weighted_loss.detach().cpu()\n        results_on_progress_bar = self.task_epoch_summary.get_results_on_progress_bar(\"val\")\n        for task in self.tasks:\n            prog_dict[self.task_epoch_summary.metric_log_name(task, \"loss\", \"val\")] = (\n                self.task_epoch_summary.task_summaries[task].summaries[\"val\"].loss\n            )\n            
prog_dict.update(results_on_progress_bar)\n        return prog_dict\n\n    def __repr__(self) -> str:\n        r\"\"\"\n        Controls how the class is printed\n        \"\"\"\n        model_str = self.model.__repr__()\n        summary_str = self.summarize().__repr__()\n\n        return model_str + \"\\n\\n\" + summary_str\n\n    @staticmethod\n    def list_pretrained_models():\n        \"\"\"List available pretrained models.\"\"\"\n        return GRAPHIUM_PRETRAINED_MODELS_DICT\n\n    @staticmethod\n    def load_pretrained_model(name_or_path: str, device: str = None):\n        \"\"\"Load a pretrained model from its name or a checkpoint path.\n\n        Args:\n            name_or_path: Name of the model to load or a valid checkpoint path. List the available\n                models with `graphium.trainer.PredictorModule.list_pretrained_models()`.\n        \"\"\"\n\n        name = GRAPHIUM_PRETRAINED_MODELS_DICT.get(name_or_path)\n\n        if name is not None:\n            return PredictorModule.load_from_checkpoint(\n                GRAPHIUM_PRETRAINED_MODELS_DICT[name_or_path], map_location=device\n            )\n\n        if name is None and not (fs.exists(name_or_path) and fs.get_extension(name_or_path) == \"ckpt\"):\n            raise ValueError(\n                f\"The model '{name_or_path}' is not available. 
Choose from {set(GRAPHIUM_PRETRAINED_MODELS_DICT.keys())} \"\n                \"or pass a valid checkpoint (.ckpt) path.\"\n            )\n\n        return PredictorModule.load_from_checkpoint(name_or_path, map_location=device)\n\n    def set_max_nodes_edges_per_graph(self, datamodule: BaseDataModule, stages: Optional[List[str]] = None):\n        datamodule.setup()\n\n        max_nodes = datamodule.get_max_num_nodes_datamodule(stages)\n        max_edges = datamodule.get_max_num_edges_datamodule(stages)\n\n        self.model.set_max_num_nodes_edges_per_graph(max_nodes, max_edges)\n\n    def get_num_graphs(self, data: Batch):\n        \"\"\"\n        Method to compute number of graphs in a Batch.\n        Essential to estimate throughput in graphs/s.\n        \"\"\"\n        return torch.max(data.batch) + 1\n"
  },
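The `compute_loss` docstring in `predictor.py` above describes pairing `target_nan_mask='ignore'` with `multitask_handling='mean-per-label'`: NaN targets are dropped per label column, each column is scored as its own task, and the per-column losses are averaged. A minimal pure-Python sketch of that semantics, where the hypothetical `masked_mean_per_label_loss` stands in for `MetricWrapper`'s real masking logic and squared error stands in for the configured loss:

```python
import math

def masked_mean_per_label_loss(preds, targets):
    """Toy illustration of target_nan_mask='ignore' combined with
    multitask_handling='mean-per-label': NaN targets are dropped per
    column, each column is scored as a single task (squared error here),
    and the per-column losses are averaged."""
    n_labels = len(preds[0])
    per_label_losses = []
    for col in range(n_labels):
        errors = [
            (p[col] - t[col]) ** 2
            for p, t in zip(preds, targets)
            if not math.isnan(t[col])  # 'ignore': drop NaN targets
        ]
        if errors:  # skip columns whose targets are entirely NaN
            per_label_losses.append(sum(errors) / len(errors))
    # 'mean-per-label': average the per-column losses
    return sum(per_label_losses) / len(per_label_losses)

preds = [[1.0, 2.0], [3.0, 4.0]]
targets = [[1.0, float("nan")], [2.0, 4.0]]
print(masked_mean_per_label_loss(preds, targets))  # → 0.25
```

Here the second column has a NaN target in the first row, so only its remaining entry contributes to that column's loss before the two column losses (0.5 and 0.0) are averaged.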
  {
    "path": "graphium/trainer/predictor_options.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nr\"\"\"Data classes to group together related arguments for the creation of a Predictor Module.\"\"\"\n\n\n\"\"\"\nReplace the usage of **kwargs by adding checks to make sure that everything is type-safe.\n    Stricter typing is important because:\n        - It makes finding bugs easier since incorrect types will cause an obvious error.\n        - Static analysis becomes easier, and the IDE can give hints about errors in the code before runtime.\nAdd the post-init function to do the checks immediately.\n\"\"\"\n\nfrom copy import deepcopy\nfrom dataclasses import dataclass, field\nfrom typing import Any, Callable, Dict, List, Optional, Type, Union\nfrom inspect import signature, isclass\n\nfrom torch import nn\n\nfrom graphium.utils.spaces import LOSS_DICT\nfrom graphium.utils.spaces import SCHEDULER_DICT\n\n\n@dataclass\nclass ModelOptions:\n    r\"\"\"\n    This data class stores the arguments necessary to instantiate a model for the Predictor.\n\n    Parameters:\n        model_class:\n            pytorch module used to create a model\n\n        model_kwargs:\n            Key-word arguments used to initialize the model from `model_class`.\n    \"\"\"\n\n    model_class: Type[nn.Module]\n    model_kwargs: Dict[str, Any]\n\n\n@dataclass\nclass OptimOptions:\n    r\"\"\"\n    This data class stores the 
arguments necessary to configure the optimizer for the Predictor.\n\n    Parameters:\n        optim_kwargs:\n            Dictionary used to initialize the optimizer, with possible keys below.\n\n            - lr `float`: Learning rate (Default=`1e-3`)\n            - weight_decay `float`: Weight decay used to regularize the optimizer (Default=`0.`)\n\n        torch_scheduler_kwargs:\n            Dictionary for the scheduling of the learning rate, with possible keys below.\n\n            - module_type `str`: Type of the learning rate scheduler to use from pytorch. Examples are\n                `'ReduceLROnPlateau'` (default), `'CosineAnnealingWarmRestarts'`, `'StepLR'`, etc.\n            - **kwargs: Any other argument for the learning rate scheduler\n\n        scheduler_kwargs:\n            Dictionary for the scheduling of the learning rate modification used by pytorch-lightning\n\n            - monitor `str`: metric to track (Default=`\"loss/val\"`)\n            - interval `str`: Whether to look at iterations or epochs (Default=`\"epoch\"`)\n            - strict `bool`: if set to True will enforce that value specified in monitor is available\n                while trying to call scheduler.step(), and stop training if not found. If False will\n                only give a warning and continue training (without calling the scheduler). 
(Default=`True`)\n            - frequency `int`: Number of intervals (epochs or steps, as set by `interval`) between\n                calls to the scheduler (Default=`1`)\n\n        scheduler_class: The class to use for the scheduler, or the str representing the scheduler.\n\n    \"\"\"\n\n    optim_kwargs: Optional[Dict[str, Any]] = None\n    torch_scheduler_kwargs: Optional[Dict[str, Any]] = None\n    scheduler_kwargs: Optional[Dict[str, Any]] = None\n    scheduler_class: Optional[Union[str, Type]] = None\n\n    # Instead of passing a dictionary to be processed by the predictor,\n    # this class will process the dictionary in advance and return the optimizer\n    def set_kwargs(self):\n        torch_scheduler_kwargs = deepcopy(self.torch_scheduler_kwargs)\n        # Set the parameters and default value for the optimizer, and check values\n        if self.optim_kwargs is None:\n            self.optim_kwargs = {}\n        self.optim_kwargs.setdefault(\"lr\", 1e-3)\n        self.optim_kwargs.setdefault(\"weight_decay\", 0.0)\n        assert self.optim_kwargs[\"lr\"] > 0\n        assert self.optim_kwargs[\"weight_decay\"] >= 0\n\n        # Set the lightning scheduler\n        if self.scheduler_kwargs is None:\n            self.scheduler_kwargs = {}\n        self.scheduler_kwargs.setdefault(\"interval\", \"epoch\")\n        self.scheduler_kwargs.setdefault(\"monitor\", \"loss/val\")\n        self.scheduler_kwargs.setdefault(\"mode\", \"min\")\n        self.scheduler_kwargs.setdefault(\"frequency\", 1)\n        self.scheduler_kwargs.setdefault(\"strict\", True)\n\n        # Set the pytorch scheduler arguments\n        if torch_scheduler_kwargs is None:\n            torch_scheduler_kwargs = {}\n        torch_scheduler_kwargs.setdefault(\"module_type\", \"ReduceLROnPlateau\")\n\n        # Get the class for the scheduler\n        scheduler_class = torch_scheduler_kwargs.pop(\"module_type\")\n        if self.scheduler_class is None:\n            if isinstance(scheduler_class, str):\n                self.scheduler_class = 
SCHEDULER_DICT[scheduler_class]\n            elif isclass(scheduler_class):\n                self.scheduler_class = scheduler_class\n            else:\n                raise TypeError(\"`scheduler_class` should be a str or a class\")\n\n        # Add the `monitor` and `mode` variables\n        sig = signature(self.scheduler_class.__init__)\n        key_args = [p.name for p in sig.parameters.values()]\n        if \"monitor\" in key_args:\n            torch_scheduler_kwargs.setdefault(\"monitor\", self.scheduler_kwargs[\"monitor\"])\n        if \"mode\" in key_args:\n            torch_scheduler_kwargs.setdefault(\"mode\", self.scheduler_kwargs[\"mode\"])\n\n\n@dataclass\nclass EvalOptions:\n    r\"\"\"\n    This data class stores the arguments necessary to configure the evaluation metrics and loss for the Predictor.\n\n    Parameters:\n        loss_fun:\n            Loss function used during training.\n            Acceptable strings are graphium.utils.spaces.LOSS_DICT.keys().\n            If a dict, must contain a 'name' key with one of the acceptable loss function strings\n            as a value. The rest of the dict will be used as the arguments passed to the loss object.\n            Otherwise, a callable object must be provided, with a method `loss_fun._get_name()`.\n\n        metrics:\n            A dictionary of metrics to compute on the prediction, other than the loss function.\n            These metrics will be logged into WandB or other loggers.\n\n        metrics_on_progress_bar:\n            The metric names from `metrics` to also display on the training progress bar\n\n        metrics_on_training_set:\n            The metric names from `metrics` to be computed on the training set for each iteration.\n            If `None`, all the metrics are computed. 
Using fewer metrics can significantly improve\n            performance, depending on the number of readouts.\n    \"\"\"\n\n    loss_fun: Union[str, Dict, Callable]\n    metrics: Optional[Dict[str, Callable]] = None\n    metrics_on_progress_bar: List[str] = field(default_factory=list)\n    metrics_on_training_set: Optional[List[str]] = None\n\n    def check_metrics_validity(self):\n        \"\"\"\n        Check that the metrics for the progress_bar and training_set are valid\n        \"\"\"\n        if self.metrics_on_progress_bar is not None:\n            selected = set(self.metrics_on_progress_bar)\n            assert selected.issubset(\n                set(self.metrics.keys())\n            ), f\"Metrics {selected - set(self.metrics.keys())} not in `metrics` with choices {set(self.metrics.keys())}\"\n\n        if self.metrics_on_training_set is not None:\n            selected = set(self.metrics_on_training_set)\n            assert selected.issubset(\n                set(self.metrics.keys())\n            ), f\"Metrics {selected - set(self.metrics.keys())} not in `metrics` with choices {set(self.metrics.keys())}\"\n\n    # Parse before or after?\n    @staticmethod\n    def parse_loss_fun(loss_fun: Union[str, Dict, Callable]) -> Callable:\n        r\"\"\"\n        Parse the loss function from a string or a dict\n\n        Parameters:\n            loss_fun:\n                A callable corresponding to the loss function, a string specifying the loss\n                function from `LOSS_DICT`, or a dict containing a key 'name' specifying the\n                loss function, and the rest of the dict used as the arguments for the loss.\n                Accepted strings are: graphium.utils.spaces.LOSS_DICT.keys().\n\n        Returns:\n            Callable:\n                Function or callable to compute the loss, takes `preds` and `targets` as inputs.\n        \"\"\"\n\n        if isinstance(loss_fun, str):\n            if loss_fun not in LOSS_DICT.keys():\n                
raise ValueError(\n                    f\"`loss_fun` expected to be one of the strings in {LOSS_DICT.keys()}. \"\n                    f\"Provided: {loss_fun}.\"\n                )\n            loss_fun = LOSS_DICT[loss_fun]()\n        elif isinstance(loss_fun, dict):\n            if loss_fun.get(\"name\") is None:\n                raise ValueError(\"`loss_fun` expected to have a key 'name'.\")\n            if loss_fun[\"name\"] not in LOSS_DICT.keys():\n                raise ValueError(\n                    f\"`loss_fun['name']` expected to be one of the strings in {LOSS_DICT.keys()}. \"\n                    f\"Provided: {loss_fun}.\"\n                )\n            loss_fun = deepcopy(loss_fun)\n            loss_name = loss_fun.pop(\"name\")\n            loss_fun = LOSS_DICT[loss_name](**loss_fun)\n        elif not callable(loss_fun):\n            raise ValueError(f\"`loss_fun` must be `str`, `dict` or `callable`. Provided: {type(loss_fun)}\")\n\n        return loss_fun\n\n\n@dataclass\nclass FlagOptions:\n    r\"\"\"\n    This data class stores the arguments necessary to configure FLAG adversarial augmentation for the Predictor.\n\n    Parameters:\n        flag_kwargs:\n            Keyword arguments used for FLAG, an adversarial data augmentation technique for graph networks.\n            See: https://arxiv.org/abs/2010.09891\n\n            - n_steps: An integer that specifies the number of ascent steps when running FLAG during training.\n                Default value of 0 trains GNNs without FLAG, and any value greater than 0 will use FLAG with that\n                many iterations.\n\n            - alpha: A float that specifies the ascent step size when running FLAG. 
Default=0.01\n    \"\"\"\n\n    flag_kwargs: Optional[Dict[str, Any]] = None\n\n    # Set the parameters and default values for the FLAG adversarial augmentation, and check values\n    def set_kwargs(self):\n        if self.flag_kwargs is None:\n            self.flag_kwargs = {}\n        self.flag_kwargs.setdefault(\"alpha\", 0.01)\n        self.flag_kwargs.setdefault(\"n_steps\", 0)\n        assert isinstance(self.flag_kwargs[\"n_steps\"], int) and (self.flag_kwargs[\"n_steps\"] >= 0)\n        assert self.flag_kwargs[\"alpha\"] >= 0\n"
  },
  {
    "path": "graphium/trainer/predictor_summaries.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nr\"\"\"Classes to store information about resulting evaluation metrics when using a Predictor Module.\"\"\"\n\nfrom typing import Any, Callable, Dict, List, Optional, Union\nfrom loguru import logger\n\nimport numpy as np\nimport torch\nfrom torch import Tensor\n\nfrom graphium.utils.tensor import nan_mean, nan_std, nan_median, tensor_fp16_to_fp32\n\n\nclass SummaryInterface(object):\n    r\"\"\"\n    An interface to define the functions implemented by summary classes that implement SummaryInterface.\n    \"\"\"\n\n    def set_results(self, **kwargs):\n        raise NotImplementedError()\n\n    def get_dict_summary(self):\n        raise NotImplementedError()\n\n    def update_predictor_state(self, **kwargs):\n        raise NotImplementedError()\n\n    def get_metrics_logs(self, **kwargs):\n        raise NotImplementedError()\n\n\nclass Summary(SummaryInterface):\n    # TODO (Gabriela): Default argument cannot be []\n    def __init__(\n        self,\n        loss_fun: Union[str, Callable],\n        metrics: Dict[str, Callable],\n        metrics_on_training_set: List[str] = [],\n        metrics_on_progress_bar: List[str] = [],\n        monitor: str = \"loss\",\n        mode: str = \"min\",\n        task_name: Optional[str] = None,\n    ):\n        r\"\"\"\n        A container to be used by the Predictor Module 
that stores the results for the given metrics on the predictions and targets provided.\n        Parameters:\n            loss_fun:\n            Loss function used during training. Acceptable strings are 'mse', 'bce', 'mae', 'cosine'.\n            Otherwise, a callable object must be provided, with a method `loss_fun._get_name()`.\n\n            metrics:\n            A dictionary of metrics to compute on the prediction, other than the loss function.\n            These metrics will be logged into WandB or similar.\n\n            metrics_on_training_set:\n            The metric names from `metrics` to be computed on the training set for each iteration.\n            If `None`, all the metrics are computed. Using fewer metrics can significantly improve\n            performance, depending on the number of readouts.\n\n            metrics_on_progress_bar:\n            The metric names from `metrics` to also display on the training progress bar\n\n            monitor:\n            `str` metric to track (Default=`\"loss\"`)\n\n            mode:\n            `str` whether the monitored metric should be minimized (`\"min\"`) or maximized (`\"max\"`) (Default=`\"min\"`)\n\n            task_name:\n            name of the task (Default=`None`)\n\n        \"\"\"\n        self.loss_fun = loss_fun\n        self.metrics = metrics\n        self.metrics_on_training_set = metrics_on_training_set\n        self.metrics_on_progress_bar = metrics_on_progress_bar\n        self.monitor = monitor\n        self.mode = mode\n\n        self.summaries = {}\n        self.best_summaries = {}\n\n        # Current predictor state\n        # self.predictor_outputs = None\n        self.step_name: str = None\n        self.targets: Tensor = None\n        self.preds: Tensor = None\n        self.loss = None  # What type?\n        self.n_epochs: int = None\n\n        self.task_name = task_name\n        self.logged_metrics_exceptions = []  # Track which metric exceptions have been logged\n\n    def update_predictor_state(\n        self, step_name: str, targets: Tensor, preds: Tensor, loss: Tensor, n_epochs: int\n    ):\n        r\"\"\"\n  
      update the state of the predictor\n        Parameters:\n            step_name: which stage you are in, e.g. \"train\"\n            targets: the targets tensor\n            preds: the predictions tensor\n            loss: the loss tensor\n            n_epochs: the number of epochs\n        \"\"\"\n        self.step_name = step_name\n        self.targets = targets\n        self.preds = preds\n        self.loss = loss\n        self.n_epochs = n_epochs\n\n    def set_results(\n        self,\n        metrics: Dict[str, Tensor],\n    ):\n        r\"\"\"\n        set the results from the metrics\n        [!] This function requires that self.update_predictor_state() be called before it.\n        Parameters:\n            metrics: a dictionary of metrics\n        \"\"\"\n\n        # Include the task_name in the loss for logging, and similarly for other metrics\n        metrics[self.metric_log_name(self.task_name, \"loss\", self.step_name)] = self.loss\n        self.summaries[self.step_name] = Summary.Results(\n            targets=self.targets,\n            preds=self.preds,\n            loss=self.loss,\n            metrics=metrics,  # Should include task name from get_metrics_logs()\n            monitored_metric=f\"{self.monitor}/{self.step_name}\",  # Include task name?\n            n_epochs=self.n_epochs,\n        )\n        if self.is_best_epoch(self.step_name, self.loss, metrics):\n            self.best_summaries[self.step_name] = self.summaries[self.step_name]\n\n    def is_best_epoch(self, step_name: str, loss: Tensor, metrics: Dict[str, Tensor]) -> bool:\n        r\"\"\"\n        check if the current epoch is the best epoch based on self.mode criteria\n        Parameters:\n            step_name: which stage you are in, e.g. 
\"train\"\n            loss: the loss tensor\n            metrics: a dictionary of metrics\n        \"\"\"\n\n        # TODO (Gabriela): Check for bugs related to monitor_name\n        if step_name not in self.best_summaries.keys():\n            return True\n\n        # Include the task_name in the loss for logging, and similarly for other metrics\n        metrics[self.metric_log_name(self.task_name, \"loss\", self.step_name)] = loss\n        monitor_name = f\"{self.monitor}/{step_name}\"  # Include task_name?\n        if (\n            monitor_name not in self.best_summaries.keys()\n        ):  # Feels like there's a bug here. What is this trying to do???\n            return True\n\n        if self.mode == \"max\":\n            return metrics[monitor_name] > self.best_summaries[step_name].monitored\n        elif self.mode == \"min\":\n            return metrics[monitor_name] < self.best_summaries[step_name].monitored\n        else:\n            raise ValueError(f\"Mode must be 'min' or 'max', provided `{self.mode}`\")\n\n    def get_results(\n        self,\n        step_name: str,\n    ):\n        r\"\"\"\n        retrieve the results for a given step\n        Parameters:\n            step_name: which stage you are in, e.g. \"train\"\n        Returns:\n            the results for the given step\n        \"\"\"\n        return self.summaries[step_name]\n\n    def get_best_results(\n        self,\n        step_name: str,\n    ):\n        r\"\"\"\n        retrieve the best results for a given step\n        Parameters:\n            step_name: which stage you are in, e.g. 
\"train\"\n        Returns:\n            the best results for the given step\n        \"\"\"\n        return self.best_summaries[step_name]\n\n    def get_results_on_progress_bar(\n        self,\n        step_name: str,\n    ) -> Dict[str, Tensor]:\n        r\"\"\"\n        retrieve the results to be displayed on the progress bar for a given step\n        Parameters:\n            step_name: which stage you are in, e.g. \"train\"\n        Returns:\n            the results to be displayed on the progress bar for the given step\n        \"\"\"\n        results = self.summaries[step_name]\n        results_prog = {\n            # f\"{kk}/{step_name}\": results.metrics[f\"{kk}/{step_name}\"] for kk in self.metrics_on_progress_bar\n            self.metric_log_name(self.task_name, kk, step_name): results.metrics[\n                self.metric_log_name(self.task_name, kk, step_name)\n            ]\n            for kk in self.metrics_on_progress_bar\n        }\n        return results_prog\n\n    def get_dict_summary(self) -> Dict[str, Any]:\n        r\"\"\"\n        retrieve the full summary in a dictionary\n        Returns:\n            the full summary in a dictionary\n        \"\"\"\n        full_dict = {}\n        # Get metric summaries\n        full_dict[\"metric_summaries\"] = {}\n        for key, val in self.summaries.items():\n            full_dict[\"metric_summaries\"][key] = {k: v for k, v in val.metrics.items()}\n            full_dict[\"metric_summaries\"][key][\"n_epochs\"] = val.n_epochs\n\n        # Get metric summaries at best epoch\n        full_dict[\"best_epoch_metric_summaries\"] = {}\n        for key, val in self.best_summaries.items():\n            full_dict[\"best_epoch_metric_summaries\"][key] = val.metrics\n            full_dict[\"best_epoch_metric_summaries\"][key][\"n_epochs\"] = val.n_epochs\n\n        return full_dict\n\n    def get_metrics_logs(self) -> Dict[str, Any]:\n        r\"\"\"\n        Get the data about metrics to log.\n        Note: 
This function requires that self.update_predictor_state() be called before it.\n        Returns:\n            A dictionary of metrics to log.\n        \"\"\"\n\n        targets = tensor_fp16_to_fp32(self.targets)\n        preds = tensor_fp16_to_fp32(self.preds)\n\n        targets = targets.to(dtype=preds.dtype, device=preds.device)\n\n        # Compute the metrics always used in regression tasks\n        metric_logs = {}\n        metric_logs[self.metric_log_name(self.task_name, \"mean_pred\", self.step_name)] = nan_mean(preds)\n        metric_logs[self.metric_log_name(self.task_name, \"std_pred\", self.step_name)] = nan_std(preds)\n        metric_logs[self.metric_log_name(self.task_name, \"median_pred\", self.step_name)] = nan_median(preds)\n        metric_logs[self.metric_log_name(self.task_name, \"mean_target\", self.step_name)] = nan_mean(targets)\n        metric_logs[self.metric_log_name(self.task_name, \"std_target\", self.step_name)] = nan_std(targets)\n        metric_logs[self.metric_log_name(self.task_name, \"median_target\", self.step_name)] = nan_median(\n            targets\n        )\n\n        # Specify which metrics to use\n        metrics_to_use = self.metrics\n        if self.step_name == \"train\":\n            metrics_to_use = {\n                key: metric for key, metric in metrics_to_use.items() if key in self.metrics_on_training_set\n            }\n        # Compute the additional metrics\n        for key, metric in metrics_to_use.items():\n            metric_name = self.metric_log_name(\n                self.task_name, key, self.step_name\n            )  # f\"{key}/{self.step_name}\"\n            try:\n                metric_logs[metric_name] = metric(preds, targets)\n            except Exception as e:\n                metric_logs[metric_name] = torch.as_tensor(float(\"nan\"))\n                # Warn only if it's the first warning for that metric\n                if metric_name not in self.logged_metrics_exceptions:\n                    
self.logged_metrics_exceptions.append(metric_name)\n                    logger.warning(f\"Error for metric {metric_name}. NaN is returned. Exception: {e}\")\n\n        # Convert all metrics to CPU, except for the loss\n        # metric_logs[f\"{self.loss_fun._get_name()}/{self.step_name}\"] = self.loss.detach().cpu()\n        metric_logs[self.metric_log_name(self.task_name, self.loss_fun._get_name(), self.step_name)] = (\n            self.loss.detach().cpu()\n        )\n        metric_logs = {key: metric.detach().cpu() for key, metric in metric_logs.items()}\n\n        return metric_logs\n\n    def metric_log_name(self, task_name, metric_name, step_name):\n        if task_name is None:\n            return f\"{metric_name}/{step_name}\"\n        else:\n            return f\"{task_name}/{metric_name}/{step_name}\"\n\n    class Results:\n        def __init__(\n            self,\n            targets: Tensor = None,\n            preds: Tensor = None,\n            loss: Union[Tensor, float] = None,\n            metrics: dict = None,\n            monitored_metric: str = None,\n            n_epochs: int = None,\n        ):\n            r\"\"\"\n            This inner class is used as a container for storing the results of the summary.\n            Parameters:\n                targets: the targets\n                preds: the prediction tensor\n                loss: the loss, float or tensor\n                metrics: the metrics\n                monitored_metric: the monitored metric\n                n_epochs: the number of epochs\n            \"\"\"\n            self.targets = targets.detach().cpu()\n            self.preds = preds.detach().cpu()\n            self.loss = loss.item() if isinstance(loss, Tensor) else loss\n            self.monitored_metric = monitored_metric\n            if monitored_metric in metrics.keys():\n                self.monitored = 
metrics[monitored_metric].detach().cpu()\n            self.metrics = {\n                key: value.tolist() if isinstance(value, (Tensor, np.ndarray)) else value\n                for key, value in metrics.items()\n            }\n            self.n_epochs = n_epochs\n\n\nclass TaskSummaries(SummaryInterface):\n    def __init__(\n        self,\n        task_loss_fun: Callable,\n        task_metrics: Dict[str, Callable],\n        task_metrics_on_training_set: List[str],\n        task_metrics_on_progress_bar: List[str],\n        monitor: str = \"loss\",\n        mode: str = \"min\",\n    ):\n        r\"\"\"\n        class to store the summaries of the tasks\n        Parameters:\n            task_loss_fun: the loss function for each task\n            task_metrics: the metrics for each task\n            task_metrics_on_training_set: the metrics to use on the training set\n            task_metrics_on_progress_bar: the metrics to use on the progress bar\n            monitor: the metric to monitor\n            mode: the mode of the metric to monitor\n        \"\"\"\n        self.task_loss_fun = task_loss_fun\n        self.task_metrics = task_metrics\n        self.task_metrics_on_progress_bar = task_metrics_on_progress_bar\n        self.task_metrics_on_training_set = task_metrics_on_training_set\n        self.monitor = monitor\n        self.mode = mode\n\n        self.task_summaries: Dict[str, Summary] = {}\n        self.task_best_summaries: Dict[str, Summary] = {}\n        self.tasks = list(task_loss_fun.keys())\n\n        for task in self.tasks:\n            self.task_summaries[task] = Summary(\n                self.task_loss_fun[task],\n                self.task_metrics[task],\n                self.task_metrics_on_training_set[task],\n                self.task_metrics_on_progress_bar[task],\n                self.monitor,\n                self.mode,\n                task_name=task,\n            )\n\n        # Current predictor state\n        self.weighted_loss = None\n     
   self.step_name = None\n\n    def update_predictor_state(\n        self,\n        step_name: str,\n        targets: Dict[str, Tensor],\n        preds: Dict[str, Tensor],\n        loss: Tensor,\n        task_losses: Dict[str, Tensor],\n        n_epochs: int,\n    ):\n        r\"\"\"\n        update the state for all predictors\n        Parameters:\n            step_name: the name of the step\n            targets: the target tensors\n            preds: the prediction tensors\n            loss: the loss tensor\n            task_losses: the task losses\n            n_epochs: the number of epochs\n        \"\"\"\n        self.weighted_loss = loss\n        self.step_name = step_name\n        for task in self.tasks:\n            self.task_summaries[task].update_predictor_state(\n                step_name,\n                targets[task],\n                preds[task].detach(),\n                task_losses[task].detach(),\n                n_epochs,\n            )\n\n    def set_results(self, task_metrics: Dict[str, Dict[str, Tensor]]):\n        \"\"\"\n        set the results for all tasks\n        Parameters:\n            task_metrics: the metrics for each task\n        \"\"\"\n        for task in self.tasks:\n            self.task_summaries[task].set_results(task_metrics[task])\n            step_name = self.task_summaries[task].step_name\n            loss = self.task_summaries[task].loss\n            if self.task_summaries[task].is_best_epoch(step_name, loss, task_metrics[task]):\n                self.task_summaries[task].best_summaries[step_name] = self.task_summaries[task].summaries[\n                    step_name\n                ]\n\n    def get_results(\n        self,\n        step_name: str,\n    ) -> Dict[str, Dict[str, Any]]:\n        \"\"\"\n        retrieve the results\n        Parameters:\n            step_name: the name of the step, i.e. 
\"train\"\n        Returns:\n            the results\n        \"\"\"\n        results = {}\n        for task in self.tasks:\n            results[task] = self.task_summaries[task].get_results(step_name)\n        return results\n\n    def get_best_results(\n        self,\n        step_name: str,\n    ) -> Dict[str, Dict[str, Any]]:\n        \"\"\"\n        retrieve the best results\n        Parameters:\n            step_name: the name of the step, i.e. \"train\"\n        Returns:\n            the best results\n        \"\"\"\n        results = {}\n        for task in self.tasks:\n            results[task] = self.task_summaries[task].get_best_results(step_name)\n        return results\n\n    def get_results_on_progress_bar(\n        self,\n        step_name: str,\n    ) -> Dict[str, Dict[str, Any]]:\n        r\"\"\"\n        return all results from all tasks for the progress bar\n        Combine the dictionaries. Instead of having keys as task names, we merge all the task-specific dictionaries.\n        Parameters:\n            step_name: the name of the step\n        Returns:\n            the results for the progress bar\n        \"\"\"\n        task_results_prog = {}\n        for task in self.tasks:\n            # task_results_prog[task] = self.task_summaries[task].get_results_on_progress_bar(step_name)\n            task_results_prog.update(self.task_summaries[task].get_results_on_progress_bar(step_name))\n        return task_results_prog\n\n    def get_dict_summary(\n        self,\n    ) -> Dict[str, Dict[str, Any]]:\n        r\"\"\"\n        get task summaries in a dictionary\n        Returns:\n            the task summaries\n        \"\"\"\n        task_full_dict = {}\n        for task in self.tasks:\n            task_full_dict[task] = self.task_summaries[task].get_dict_summary()\n        return task_full_dict\n\n    def get_metrics_logs(\n        self,\n    ) -> Dict[str, Dict[str, Tensor]]:\n        r\"\"\"\n        get the logs for the metrics\n        
Returns:\n            the task logs for the metrics\n        \"\"\"\n        task_metrics_logs = {}\n        for task in self.tasks:\n            task_metrics_logs[task] = self.task_summaries[task].get_metrics_logs()\n            # Average multi-element metric tensors, ignoring zero entries\n            for key in task_metrics_logs[task]:\n                if isinstance(task_metrics_logs[task][key], torch.Tensor):\n                    if task_metrics_logs[task][key].numel() > 1:\n                        task_metrics_logs[task][key] = task_metrics_logs[task][key][\n                            task_metrics_logs[task][key] != 0\n                        ].mean()\n\n        # Include the global (weighted) loss\n        task_metrics_logs[\"_global\"] = {}\n        task_metrics_logs[\"_global\"][f\"loss/{self.step_name}\"] = self.weighted_loss.detach().cpu()\n        return task_metrics_logs\n\n    # TODO (Gabriela): This works to fix the logging on TB, but make it more efficient\n    def concatenate_metrics_logs(\n        self,\n        metrics_logs: Dict[str, Dict[str, Tensor]],\n    ) -> Dict[str, Tensor]:\n        r\"\"\"\n        concatenate the metrics logs\n        Parameters:\n            metrics_logs: the metrics logs\n        Returns:\n            the concatenated metrics logs\n        \"\"\"\n        concatenated_metrics_logs = {}\n        for task in list(self.tasks) + [\"_global\"]:\n            concatenated_metrics_logs.update(metrics_logs[task])\n        concatenated_metrics_logs[f\"loss/{self.step_name}\"] = self.weighted_loss.detach().cpu()\n        return concatenated_metrics_logs\n\n    def metric_log_name(\n        self,\n        task_name: str,\n        metric_name: str,\n        step_name: str,\n    ) -> str:\n        r\"\"\"\n        assemble the log name from the task name, metric name and step name\n        Returns:\n            the formatted log name\n        \"\"\"\n        if task_name is None:\n            return f\"{metric_name}/{step_name}\"\n        else:\n            return 
f\"{task_name}/{metric_name}/{step_name}\"\n"
  },
  {
    "path": "graphium/utils/README.md",
    "content": "<div align=\"center\">\n    <img src=\"../../docs/images/logo-title.png\" height=\"80px\">\n    <h3>The Graph Of Life Library.</h3>\n</div>\n\n\n## What is in this folder?\n\nThis folder contains utility and helper functions.\n\n- ✅ `spaces.py`: the file tracking the space of all names and dictionaries\n- `mup.py`: helper functions for muP operations\n"
  },
  {
    "path": "graphium/utils/__init__.py",
    "content": "from . import fs\nfrom . import tensor\n"
  },
  {
    "path": "graphium/utils/arg_checker.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals.\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\n\"\"\" Argument checker module \"\"\"\nimport collections\nimport numpy as np\n\n# Global variable of accepted string types\nKNOWN_TYPES = {\n    \"none\": None,\n    \"str\": str,\n    \"list\": list,\n    \"tuple\": tuple,\n    \"dict\": dict,\n    \"int\": int,\n    \"float\": float,\n    \"complex\": complex,\n    \"bool\": bool,\n    \"callable\": callable,\n}\n\n\ndef _parse_type(type_to_validate, accepted_types):\n    # Check if the provided type is accepted\n    if (type_to_validate is not None) and (not isinstance(type_to_validate, accepted_types)):\n        raise TypeError(\n            \"type_to_validate should be None, type or str. {} provided\".format(type(type_to_validate))\n        )\n    if isinstance(type_to_validate, str):\n        type_to_validate = type_to_validate.lower()\n        if type_to_validate in KNOWN_TYPES.keys():\n            type_to_validate = KNOWN_TYPES[type_to_validate]\n        else:\n            raise TypeError(\n                \"type_to_validate is not a known type. 
Known types are:\"\n                \" \\n{}\\n Provided: \\n{}\".format(KNOWN_TYPES.keys(), type_to_validate)\n            )\n    return type_to_validate\n\n\ndef _enforce_iter_type(arg, enforce_type):\n    # Cast the arg to be either a list or a tuple\n    if enforce_type is not None:\n        if (enforce_type == list) and (not isinstance(arg, list)):\n            arg = list(arg)\n        elif (enforce_type == tuple) and (not isinstance(arg, tuple)):\n            arg = tuple(arg)\n        elif enforce_type not in (list, tuple):\n            raise TypeError('enforce_type should be None, \"list\" or \"tuple\", but is {}'.format(enforce_type))\n    return arg\n\n\ndef check_arg_iterator(arg, enforce_type=None, enforce_subtype=None, cast_subtype: bool = True):\n    r\"\"\"\n    Verify if the type is an iterator. If it is `None`, convert to an empty list/tuple. If it is\n    not a list/tuple/str, try to convert to an iterator. If it is a str or cannot be converted to\n    an iterator, then put the `arg` inside an iterator.\n    Possibly enforce the iterator type to `list` or `tuple`, if `enforce_type` is not None.\n    Possibly enforce the subtype to any given type if `enforce_subtype` is not None,\n    and decide whether to cast the subtype or to throw an error.\n\n    Parameters:\n        arg (any type):\n            The input to verify/convert to an iterator (list or tuple). If None, an empty iterator\n            is returned.\n        enforce_type (str or type):\n            The type to enforce the iterator. 
The valid choices are:\n            `None`, `list`, `tuple`, `'none'`, `'list'`, `'tuple'`.\n            If `None`, then the iterator type is not enforced.\n\n        enforce_subtype (type, np.dtype or str representing basic type):\n            Verify if all the elements inside the iterator are the desired type.\n            If `None`, then the sub-type is not enforced.\n            Accepted strings are ['none', 'str', 'list', 'tuple', 'dict', 'int',\n            'float', 'complex', 'bool', 'callable']\n\n        cast_subtype:\n            If True, then the type specified by `enforce_subtype` is used to cast the\n            elements inside the iterator. If False, then an error is thrown if the\n            types do not match.\n\n    Returns:\n        output (iterator):\n            An iterator based on the input of the desired type (list or tuple) and\n            the desired subtypes.\n\n    \"\"\"\n\n    # If not list or tuple, put into a list\n    if arg is None:\n        arg = []\n    elif isinstance(arg, str):\n        arg = [arg]\n    elif isinstance(arg, tuple):\n        if enforce_type is None:\n            enforce_type = tuple\n        arg = list(arg)\n    elif not isinstance(arg, (tuple, list)):\n        try:\n            arg = list(arg)\n        except Exception:\n            arg = [arg]\n\n    output = arg\n\n    # Make sure that enforce_type and enforce_subtype are valid inputs\n    enforce_type = _parse_type(enforce_type, (type, str))\n    enforce_subtype = _parse_type(enforce_subtype, (type, str, np.dtype))\n\n    # Cast all the subtypes of the list/tuple into the desired subtype\n    if enforce_subtype is not None:\n        if enforce_type is None:\n            arg2 = output\n        elif not isinstance(output, enforce_type):\n            arg2 = list(output)\n        else:\n            arg2 = output\n        try:\n            for idx, a in enumerate(output):\n                if not isinstance(a, enforce_subtype):\n                    if 
cast_subtype:\n                        arg2[idx] = enforce_subtype(a)\n                    else:\n                        raise TypeError(\n                            \"iter subtype is {}, desired subtype is {}, \"\n                            \"but cast_subtype is set to False\".format(type(arg2[idx]), enforce_subtype)\n                        )\n        except Exception as e:\n            raise TypeError(\n                \"iterator subtype is {} and cannot be casted to {}\\n{}\".format(type(a), enforce_subtype, e)\n            )\n\n        output = _enforce_iter_type(arg2, enforce_type)\n\n    output = _enforce_iter_type(output, enforce_type)\n\n    return output\n\n\ndef check_list1_in_list2(list1, list2, throw_error=True):\n    r\"\"\"\n    Verify if the list1 (iterator) is included in list2 (iterator). If not, raise an error.\n\n    Parameters:\n        list1, list2: list, tuple or object\n            A list or tuple containing the elements to verify the inclusion.\n            If an object is provided other than a list or tuple,\n            then it is considered as a list of a single element.\n        throw_error: bool\n            Whether to throw an error if list1 is not in list2\n\n    Returns:\n        list1_in_list2: bool\n            A boolean representing the inclusion of list1 in list2. 
It is returned if\n            throw_error is set to false\n\n\n    \"\"\"\n\n    list1 = check_arg_iterator(list1)\n    list2 = check_arg_iterator(list2)\n\n    # If all elements of list1 are not in list2, throw an error\n    list1_in_list2 = all(elem in list2 for elem in list1)\n    if not list1_in_list2 and throw_error:\n        raise ValueError(\n            (\"Elements in list1 should be contained in list2.\" + \"\\n\\nlist1 = {} \\n\\n list2 = {}\").format(\n                list1, list2\n            )\n        )\n\n    return list1_in_list2\n\n\ndef check_columns_choice(dataframe, columns_choice, extra_accepted_cols=None, enforce_type=\"list\"):\n    r\"\"\"\n    Verify if the choice of column `columns_choice` is inside the dataframe or\n    the extra_accepted_cols. Otherwise, errors are thrown by the sub-functions.\n\n    Parameters:\n        dataframe: (pd.DataFrame)\n            The dataframe on which to verify if the column choice is valid.\n            columns_choice: str, iterator(str)\n            The columns chosen from the dataframe\n        extra_accepted_cols: str, iterator(str)\n            A list\n\n        enforce_type: str or type\n            The type to enforce the iterator. 
The valid choices are :\n            `None`, `list`, `tuple`, `'none'`, `'list'`, `'tuple'`.\n            If `None`, then the iterator type is not enforced.\n\n\n    Returns:\n        output: iterator\n            A str iterator based on the input of the desired type (list or tuple)\n\n    \"\"\"\n    extra_accepted_cols = [] if extra_accepted_cols is None else extra_accepted_cols\n    valid_columns = list(dataframe.columns)\n    kwargs_iterator = {\n        \"enforce_type\": enforce_type,\n        \"enforce_subtype\": None,\n        \"cast_subtype\": False,\n    }\n    columns_choice = check_arg_iterator(columns_choice, **kwargs_iterator)\n    extra_accepted_cols = check_arg_iterator(extra_accepted_cols, **kwargs_iterator)\n    valid_columns = check_arg_iterator(valid_columns, **kwargs_iterator)\n    valid_columns_full = valid_columns + extra_accepted_cols\n    check_list1_in_list2(columns_choice, valid_columns_full)\n\n    return columns_choice\n"
  },
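A minimal, self-contained sketch of the normalization that `check_arg_iterator` performs (the helper name `as_iterable` is illustrative and not part of the file; the subtype enforcement step is omitted):

```python
def as_iterable(arg, enforce_type=None):
    # None -> empty list; str -> single-element list; other non-list/tuple
    # values are converted to a list when possible, otherwise wrapped.
    if arg is None:
        arg = []
    elif isinstance(arg, str):
        arg = [arg]
    elif not isinstance(arg, (list, tuple)):
        try:
            arg = list(arg)
        except TypeError:
            arg = [arg]
    # Optionally enforce the container type, as _enforce_iter_type does.
    if enforce_type is list:
        arg = list(arg)
    elif enforce_type is tuple:
        arg = tuple(arg)
    return arg
```

As in the original, a scalar like `5` comes back wrapped in a list, while a `range` or generator is expanded into one.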
  {
    "path": "graphium/utils/command_line_utils.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nimport re\nfrom collections import defaultdict\nfrom typing import List, Dict\n\n\ndef get_anchors_and_aliases(filepath):\n    \"\"\"Utility function to extract anchors and aliases from a YAML config file\n\n    In a YAML file we can specify an anchor as `some_name: &anchor_name value`.\n    This is then picked up with aliases in the rest of the config as\n    `other_name: *anchor_name`.\n    Using this format in the YAML file, all anchors and aliases will be extracted.\n\n    Args:\n        filepath (str): path to the config file\n\n    Returns:\n        anchors (Dict): A dictionary containing the YAML paths of the anchors\n                        as keys, and a list of YAML paths to aliases.\n    \"\"\"\n    anchors = defaultdict(list)\n    current_level = {}\n    anchor_to_path = {}\n\n    with open(filepath, \"r\") as file:\n        for line in file:\n            indent = len(line) - len(line.lstrip(\" \"))\n            key_match = re.search(r\"(\\w+):\", line)\n            anchor_match = re.search(r\"&(\\w+)\", line)\n            alias_match = re.search(r\"\\*(\\w+)\", line)\n            if key_match:\n                key = key_match.group(1)\n                # Compute the full path of the current key.\n                full_path = \".\".join(\n                    [current_level[i] for i in sorted(current_level.keys()) if i < indent] + [key]\n                )\n                current_level[indent] = key\n                # Remove any keys that are indented more than the current line.\n                keys_to_remove = [i for i in current_level if i > indent]\n                for i in keys_to_remove:\n                    del current_level[i]\n            else:\n                full_path = \".\".join([current_level[i] for i in sorted(current_level.keys())])\n            if anchor_match:\n                anchor = anchor_match.group(1)\n                anchor_to_path[anchor] = full_path\n            if alias_match:\n                alias = alias_match.group(1)\n                if alias in anchor_to_path:\n                    anchors[anchor_to_path[alias]].append(full_path)\n    return anchors\n\n\ndef update_config(cfg: Dict, unknown: List, anchors: Dict):\n    \"\"\"\n    Update the configuration dictionary with command line arguments of the form `--key.path=value`.\n    \"\"\"\n    for arg in unknown:\n        if arg.startswith(\"--\"):\n            key, value = arg[2:].split(\"=\")\n            if key in anchors.keys():\n                # Copy, so that appending below does not mutate the anchors mapping\n                all_refs = list(anchors[key])\n            else:\n                all_refs = []\n            all_refs.append(key)\n            for key in all_refs:\n                keys = key.split(\".\")\n                temp_cfg = cfg\n                for k in keys[:-1]:\n                    temp_cfg = temp_cfg[k]\n                temp_cfg[keys[-1]] = type(temp_cfg[keys[-1]])(value) if keys[-1] in temp_cfg else value\n\n    return cfg\n"
  },
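The dotted-key override applied by `update_config` can be isolated into a small sketch (`apply_override` is a hypothetical helper name, not part of the module):

```python
def apply_override(cfg, arg):
    # Parse "--a.b.c=value" into a key path and a value string.
    key, value = arg[2:].split("=")
    keys = key.split(".")
    node = cfg
    for k in keys[:-1]:
        node = node[k]
    leaf = keys[-1]
    # Cast the string to the type of the existing entry, as update_config does.
    node[leaf] = type(node[leaf])(value) if leaf in node else value
    return cfg

cfg = {"trainer": {"max_epochs": 10, "seed": 42}}
apply_override(cfg, "--trainer.max_epochs=25")
```

Because the cast uses the existing value's type, the string `"25"` is stored as the integer `25` here.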
  {
    "path": "graphium/utils/custom_lr.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nimport warnings\nfrom torch.optim.lr_scheduler import _LRScheduler\n\n\nclass WarmUpLinearLR(_LRScheduler):\n    \"\"\"Custom linear learning rate with warmup.\n\n    Args:\n        optimizer (Optimizer): Wrapped optimizer.\n        max_num_epochs (int): Maximum number of epochs.\n        warmup_epochs (int): The number of epochs for learning rate warmup. Default: 0.\n        min_lr (float): the minimum learning rate value. Default: 0.\n        last_epoch (int): The index of last epoch. Default: -1.\n        verbose (bool): If ``True``, prints a message to stdout for\n            each update. 
Default: ``False``.\n    \"\"\"\n\n    def __init__(self, optimizer, max_num_epochs, warmup_epochs=0, min_lr=0, last_epoch=-1, verbose=False):\n        self.max_num_epochs = max_num_epochs\n        self.warmup_epochs = warmup_epochs\n        self.min_lr = min_lr\n        self.last_epoch = last_epoch\n        super(WarmUpLinearLR, self).__init__(optimizer, last_epoch, verbose)\n\n    def get_lr(self):\n        if not self._get_lr_called_within_step:\n            warnings.warn(\n                \"To get the last learning rate computed by the scheduler, \" \"please use `get_last_lr()`.\",\n                UserWarning,\n            )\n        if self.warmup_epochs > 0 and (self.last_epoch + 1) < self.warmup_epochs:\n            return [(self.last_epoch + 1) * base_lr / self.warmup_epochs for base_lr in self.base_lrs]\n        else:\n            # check epoch_diff in case there is a division by zero error\n            epoch_diff = self.max_num_epochs - self.warmup_epochs\n            if epoch_diff <= 0:\n                factor = 0\n            else:\n                factor = ((self.last_epoch + 1) - self.warmup_epochs) / epoch_diff\n            return [factor * self.min_lr + (1 - factor) * base_lr for base_lr in self.base_lrs]\n\n    def _get_closed_form_lr(self):\n        lr_list = []\n        for base_lr in self.base_lrs:\n            if self.warmup_epochs > 0 and (self.last_epoch + 1) < self.warmup_epochs:\n                lr = (self.last_epoch + 1) * base_lr / self.warmup_epochs\n            else:\n                # Guard against division by zero, as in get_lr\n                epoch_diff = self.max_num_epochs - self.warmup_epochs\n                if epoch_diff <= 0:\n                    factor = 0\n                else:\n                    factor = ((self.last_epoch + 1) - self.warmup_epochs) / epoch_diff\n                lr = factor * self.min_lr + (1 - factor) * base_lr\n            lr_list.append(lr)\n        return lr_list\n"
  },
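The schedule computed by `WarmUpLinearLR.get_lr` can be checked with a plain function, a sketch of the same arithmetic for a single parameter group, without the scheduler machinery (`warmup_linear_lr` is an illustrative name):

```python
def warmup_linear_lr(epoch, base_lr, max_num_epochs, warmup_epochs=0, min_lr=0.0):
    # Mirrors WarmUpLinearLR.get_lr for a single parameter group.
    if warmup_epochs > 0 and (epoch + 1) < warmup_epochs:
        # Linear ramp-up toward base_lr during warmup.
        return (epoch + 1) * base_lr / warmup_epochs
    epoch_diff = max_num_epochs - warmup_epochs
    if epoch_diff <= 0:
        factor = 0.0
    else:
        factor = ((epoch + 1) - warmup_epochs) / epoch_diff
    # Linear interpolation from base_lr down to min_lr.
    return factor * min_lr + (1 - factor) * base_lr

# Learning rates over 10 epochs with a 4-epoch warmup.
lrs = [warmup_linear_lr(e, base_lr=1e-3, max_num_epochs=10, warmup_epochs=4) for e in range(10)]
```

With `warmup_epochs=4`, the first epochs ramp linearly up to `base_lr`, after which the rate decays linearly toward `min_lr` at `max_num_epochs`.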
  {
    "path": "graphium/utils/decorators.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals.\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\n\nclass classproperty(property):\n    r\"\"\"\n    Decorator used to declare a class property, defined for the class\n    without needing to instantiate an object.\n\n    !!! Example\n\n        ``` python linenums=\"1\"\n            @classproperty\n            def my_class_property(cls):\n                return 5\n        ```\n    \"\"\"\n\n    def __get__(self, cls, owner):\n        return classmethod(self.fget).__get__(None, owner)()\n"
  },
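A usage check of the `classproperty` decorator above (the decorator body is reproduced verbatim from the file; the `Config` class is only an illustration):

```python
class classproperty(property):
    # Same definition as in graphium/utils/decorators.py.
    def __get__(self, cls, owner):
        return classmethod(self.fget).__get__(None, owner)()


class Config:
    _version = "1.2"

    @classproperty
    def version(cls):
        return cls._version


# The property is readable on the class itself, without instantiation.
v = Config.version
```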
  {
    "path": "graphium/utils/fs.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Union\nfrom typing import Optional\n\nimport os\nimport io\nimport platformdirs\nimport pathlib\n\nfrom tqdm.auto import tqdm\nimport fsspec\n\n\ndef get_cache_dir(suffix: str = None, create: bool = True) -> pathlib.Path:\n    \"\"\"Get a local cache directory. You can append a suffix folder to it and optionnaly create\n    the folder if it doesn't exist.\n    \"\"\"\n\n    cache_dir = pathlib.Path(platformdirs.user_cache_dir(appname=\"graphium\"))\n\n    if suffix is not None:\n        cache_dir /= suffix\n\n    if create:\n        cache_dir.mkdir(exist_ok=True, parents=True)\n\n    return cache_dir\n\n\ndef get_mapper(path: Union[str, os.PathLike]):\n    \"\"\"Get the fsspec mapper.\n    Args:\n        path: a path supported by `fsspec` such as local, s3, gcs, etc.\n    \"\"\"\n    return fsspec.get_mapper(str(path))\n\n\ndef get_basename(path: Union[str, os.PathLike]):\n    \"\"\"Get the basename of a file or a folder.\n    Args:\n        path: a path supported by `fsspec` such as local, s3, gcs, etc.\n    \"\"\"\n    path = str(path)\n    mapper = get_mapper(path)\n    clean_path = path.rstrip(mapper.fs.sep)\n    return str(clean_path).split(mapper.fs.sep)[-1]\n\n\ndef get_extension(path: Union[str, os.PathLike]):\n    \"\"\"Get the extension of a file.\n    Args:\n        path: a path supported by `fsspec` such as local, s3, gcs, 
etc.\n    \"\"\"\n    basename = get_basename(path)\n    return basename.split(\".\")[-1]\n\n\ndef exists(path: Union[str, os.PathLike, fsspec.core.OpenFile, io.IOBase]):\n    \"\"\"Check whether a file exists.\n    Args:\n        path: a path supported by `fsspec` such as local, s3, gcs, etc.\n    \"\"\"\n\n    if isinstance(path, fsspec.core.OpenFile):\n        return path.fs.exists(path.path)\n\n    elif isinstance(path, (str, pathlib.Path)):\n        mapper = get_mapper(str(path))\n        return mapper.fs.exists(path)\n\n    else:\n        # NOTE(hadim): file-like objects always exist right?\n        return True\n\n\ndef exists_and_not_empty(path: Union[str, os.PathLike]):\n    \"\"\"Check whether a directory exists and is not empty.\"\"\"\n\n    if not exists(path):\n        return False\n\n    fs = get_mapper(path).fs\n\n    return len(fs.ls(path)) > 0\n\n\ndef mkdir(path: Union[str, os.PathLike], exist_ok: bool = True):\n    \"\"\"Create directory including potential parents.\"\"\"\n    fs = get_mapper(path).fs\n    fs.mkdirs(path, exist_ok=exist_ok)\n\n\ndef rm(path: Union[str, os.PathLike], recursive=False, maxdepth=None):\n    \"\"\"Delete a file or a directory with all nested files.\"\"\"\n    fs = get_mapper(path).fs\n    fs.rm(path, recursive=recursive, maxdepth=maxdepth)\n\n\ndef join(*paths):\n    \"\"\"Join paths together. The first element determines the\n    filesystem to use (and so the separator).\n    Args:\n        paths: a list of paths supported by `fsspec` such as local, s3, gcs, etc.\n    \"\"\"\n    paths = [str(path) for path in paths]\n    source_path = paths[0]\n    fs = get_mapper(source_path).fs\n    full_path = fs.sep.join(paths)\n    return full_path\n\n\ndef get_size(file: Union[str, os.PathLike, io.IOBase, fsspec.core.OpenFile]) -> Optional[int]:\n    \"\"\"Get the size of a file given its path. Return None if the\n    size can't be retrieved.\n    \"\"\"\n\n    if isinstance(file, io.IOBase) and hasattr(file, \"name\"):\n        fs_local = fsspec.filesystem(\"file\")\n        file_size = fs_local.size(getattr(file, \"name\"))\n\n    elif isinstance(file, (str, pathlib.Path)):\n        fs = get_mapper(str(file)).fs\n        file_size = fs.size(str(file))\n\n    elif isinstance(file, fsspec.core.OpenFile):\n        file_size = file.fs.size(file.path)\n\n    else:\n        file_size = None\n\n    return file_size\n\n\ndef copy(\n    source: Union[str, os.PathLike, io.IOBase, fsspec.core.OpenFile],\n    destination: Union[str, os.PathLike, io.IOBase, fsspec.core.OpenFile],\n    chunk_size: int = None,\n    force: bool = False,\n    progress: bool = False,\n    leave_progress: bool = True,\n):\n    \"\"\"Copy one file to another location across different filesystems (local, S3, GCS, etc).\n\n    Args:\n        source: path or file-like object to copy from.\n        destination: path or file-like object to copy to.\n        chunk_size: the chunk size to use. If progress is enabled and the chunk\n            size is `None`, it is set to 2048.\n        force: whether to overwrite the destination file if it exists.\n        progress: whether to display a progress bar.\n        leave_progress: whether to keep the progress bar displayed once the copy is done.\n    \"\"\"\n\n    if progress and chunk_size is None:\n        chunk_size = 2048\n\n    if isinstance(source, (str, os.PathLike)):\n        source_file = fsspec.open(str(source), \"rb\")\n    else:\n        source_file = source\n\n    if isinstance(destination, (str, os.PathLike)):\n        # adapt the file mode of the destination depending on the source file.\n        destination_mode = \"wb\"\n        if hasattr(source_file, \"mode\"):\n            destination_mode = \"wb\" if \"b\" in getattr(source_file, \"mode\") else \"w\"\n        elif isinstance(source_file, io.BytesIO):\n            destination_mode = \"wb\"\n        elif isinstance(source_file, io.StringIO):\n            destination_mode = \"w\"\n\n        destination_file = fsspec.open(str(destination), destination_mode)\n    else:\n        destination_file = destination\n\n    if not exists(source_file):\n        raise ValueError(f\"The file being copied does not exist: {source}\")\n\n    if not force and exists(destination_file):\n        raise ValueError(f\"The destination file to copy already exists: {destination}\")\n\n    with source_file as source_stream:\n        with destination_file as destination_stream:\n            if chunk_size is None:\n                # copy without chunks\n                destination_stream.write(source_stream.read())\n\n            else:\n                # copy with chunks\n\n                # determine the size of the source file\n                source_size = None\n                if progress:\n                    source_size = get_size(source)\n\n                # init progress bar\n                pbar = tqdm(\n                    total=source_size,\n                    leave=leave_progress,\n                    disable=not progress,\n                    unit=\"B\",\n                    unit_divisor=1024,\n                    unit_scale=True,\n                )\n\n                # start the loop\n                while True:\n                    data = source_stream.read(chunk_size)\n                    if not data:\n                        break\n                    destination_stream.write(data)\n                    # update by the number of bytes actually read\n                    pbar.update(len(data))\n\n                pbar.close()\n"
  },
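The chunked-copy loop at the heart of `copy` can be sketched with plain file-like objects (no fsspec, no progress bar; `copy_stream` is an illustrative name, not part of the module):

```python
import io


def copy_stream(src, dst, chunk_size=2048):
    # Read fixed-size chunks until the source is exhausted, as in fs.copy.
    while True:
        data = src.read(chunk_size)
        if not data:
            break
        dst.write(data)


src = io.BytesIO(b"x" * 5000)
dst = io.BytesIO()
copy_stream(src, dst)
```

The last chunk may be shorter than `chunk_size`, which is why the real `copy` should report progress by the number of bytes actually read rather than the nominal chunk size.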
  {
    "path": "graphium/utils/hashing.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import Any\nimport hashlib\nimport yaml\n\n\ndef get_md5_hash(object: Any) -> str:\n    \"\"\"\n    MD5 hash of any object.\n    The object is converted to a YAML string before being hashed.\n    This allows for nested dictionaries/lists and for hashing of classes and their attributes.\n    \"\"\"\n    dhash = hashlib.md5()\n    encoded = yaml.dump(object, sort_keys=True).encode()\n    dhash.update(encoded)\n    return dhash.hexdigest()\n"
  },
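The hashing idea, serialize canonically and then digest, can be reproduced with only the standard library. Note this sketch uses `json.dumps` as a stand-in for the `yaml.dump` call in the real function, so it avoids the extra dependency but produces different digests:

```python
import hashlib
import json


def get_md5_hash(obj):
    # Same idea as graphium.utils.hashing.get_md5_hash: serialize with
    # sorted keys so that equal objects hash identically, then MD5 digest.
    dhash = hashlib.md5()
    dhash.update(json.dumps(obj, sort_keys=True).encode())
    return dhash.hexdigest()


h1 = get_md5_hash({"b": 2, "a": 1})
h2 = get_md5_hash({"a": 1, "b": 2})  # key order does not matter
```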
  {
    "path": "graphium/utils/moving_average_tracker.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom dataclasses import dataclass\n\n\n@dataclass\nclass MovingAverageTracker:\n    num_samples: int = 0\n    mean_value: float = 0.0\n\n    def update(self, value: float):\n        self.mean_value = self.mean_value * (self.num_samples / (self.num_samples + 1)) + value / (\n            self.num_samples + 1\n        )\n        self.num_samples += 1\n\n    def reset(self):\n        # no need to update mean_value, it will be reset when update is called\n        self.num_samples = 0\n"
  },
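A short usage check of `MovingAverageTracker` (the class body is reproduced from the file above; the incremental update keeps a running mean without storing the samples):

```python
from dataclasses import dataclass


@dataclass
class MovingAverageTracker:
    # Same incremental-mean update as graphium/utils/moving_average_tracker.py.
    num_samples: int = 0
    mean_value: float = 0.0

    def update(self, value: float):
        self.mean_value = self.mean_value * (self.num_samples / (self.num_samples + 1)) + value / (
            self.num_samples + 1
        )
        self.num_samples += 1


tracker = MovingAverageTracker()
for v in [1.0, 2.0, 3.0, 4.0]:
    tracker.update(v)
# tracker.mean_value is now the mean of the four samples
```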
  {
    "path": "graphium/utils/mup.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\n##### Code adapted from the `mup` package from Microsoft https://github.com/microsoft/mup\n\nfrom torch.nn import Linear\nfrom torch.nn.modules.conv import _ConvNd\nfrom mup import get_shapes, assert_hidden_size_inf, MuReadout, rescale_linear_bias, save_base_shapes\nfrom mup.shape import _zip_infshape_dict, _extract_shapes\n\nfrom graphium.nn.base_layers import MuReadoutGraphium\n\n\ndef apply_infshapes(model, infshapes):\n    \"\"\"\n    Modified from the regular `mup.apply_infshapes` by explicitly adding `base_dim` to the `MuReadoutGraphium`.\n    This allows the code to work on IPUs.\n    \"\"\"\n    for name, p in model.named_parameters():\n        p.infshape = infshapes[name]\n    for _, module in model.named_modules():\n        if isinstance(module, MuReadoutGraphium):\n            module.base_width = module.weight.infshape[-1].base_dim\n\n\ndef set_base_shapes(model, base, rescale_params=True, delta=None, savefile=None, do_assert=True):\n    \"\"\"Sets the `p.infshape` attribute for each parameter `p` of `model`.\n\n    Code taken from the `mup` package from Microsoft https://github.com/microsoft/mup.\n    No change except in the `apply_inf_shapes`, using the one from Graphium instead of `mup`\n\n    Inputs:\n        model: nn.Module instance\n        base: The base model.\n            Can be nn.Module, 
a dict of shapes, a str, or None.\n            If None, then defaults to `model`\n            If str, then treated as filename for yaml encoding of a dict of base shapes.\n        rescale_params:\n            assuming the model is initialized using the default pytorch init (or\n            He initialization etc that scale the same way with fanin): If True\n            (default), rescales parameters to have the correct (μP) variances.\n        do_assert:\n    Output:\n        same object as `model`, after setting the `infshape` attribute of each parameter.\n    \"\"\"\n    if base is None:\n        base = model\n    base_shapes = _extract_shapes(base)\n    if delta is not None:\n        delta_shapes = _extract_shapes(delta)\n        base_shapes = _zip_infshape_dict(base_shapes, delta_shapes)\n    shapes = get_shapes(model)\n    infshapes = _zip_infshape_dict(base_shapes, shapes)\n    if savefile is not None:\n        save_base_shapes(infshapes, savefile)\n    apply_infshapes(model, infshapes)\n    if do_assert:\n        assert_hidden_size_inf(model)\n    if rescale_params:\n        for name, module in model.named_modules():\n            if isinstance(module, MuReadout):\n                module._rescale_parameters()\n            elif isinstance(module, (Linear, _ConvNd)):\n                rescale_linear_bias(module)\n    return model\n"
  },
  {
    "path": "graphium/utils/packing.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom typing import List, Tuple, Iterable, Optional\nimport numpy as np\nimport torch\n\n\nclass MolPack:\n    \"\"\"\n    Class that keeps track of the number of atoms and indices that are added\n    to each pack. Useful when doing packing, or other forms of smart batching.\n    A pack is a batch, but with optimized memory consumption.\n    \"\"\"\n\n    def __init__(self):\n        self.num_nodes = 0\n        self.num_graphs = 0\n        self.average_atom = 0\n        self.indices = []\n\n    def add_mol(self, num_nodes: int, idx: int) -> \"MolPack\":\n        \"\"\"\n        Add a molecule and it's index to the batch\n\n        Parameters:\n            num_nodes: Number of atoms of the new molecule\n\n            idx: Index associated to the molecule\n        \"\"\"\n        self.num_nodes += num_nodes\n        self.num_graphs += 1\n        self.average_atom = self.num_nodes / self.num_graphs\n        self.indices.append(idx)\n        return self\n\n    def expected_atoms(self, remaining_mean_num_nodes: float, batch_size: int) -> float:\n        \"\"\"\n        Given a desired batch size, and given the remaining mean number of\n        atoms, find the expected number of atoms of the current batch when it is full\n\n        Parameters:\n            remaining_mean_num_nodes: Average number of atoms per 
molecule\n                left to be sampled and distributed across tasks.\n\n            batch_size: Desired batch size\n\n        Returns:\n            expected_atoms: The expected number of atoms in this batch if we\n                sample randomly the remaining molecules.\n        \"\"\"\n        return self.num_nodes + ((batch_size - self.num_graphs) * remaining_mean_num_nodes)\n\n    def __repr__(self) -> str:\n        \"\"\"\n        Print the main attributes of the current class\n        \"\"\"\n        return f\"{self.__class__.__name__}(m: {self.num_graphs},\\ta: {self.num_nodes},\\tav: {self.average_atom:.1f})\"\n\n\ndef smart_packing(num_nodes: List[int], batch_size: int) -> List[List[int]]:\n    \"\"\"\n    Simple and fast algorithm for packing graphs such that each batch has roughly the\n    same number of atoms.\n    Has for-loop scalability issues `O(num_graphs * ipu_batch_size)` = `O(num_graphs^2 / batch_size)`\n\n    Parameters:\n        num_nodes: List of the number of atoms per molecule for the entire global batch.\n            Must be of length `batch_size * ipu_batch_size`.\n\n        batch_size: The batch size per iteration, considering a single device and single\n            forward pass.\n            The global batch size is `batch_size * device_iterations * replication_factor * gradient_accumulation`\n\n    Returns:\n        packed_indices: A list of packs, each containing a list of indices, such that\n            if we collect `num_nodes` from the indices, then each pack has roughly the\n            same total number of atoms.\n    \"\"\"\n\n    # Sort the list\n    num_nodes = np.asarray(num_nodes)\n    argsort_num_nodes = np.argsort(num_nodes)\n    sorted_num_nodes = num_nodes[argsort_num_nodes]\n    ipu_batch_size = int(len(num_nodes) / batch_size)\n    sorted_num_nodes, initial_num_nodes = (\n        sorted_num_nodes[:-ipu_batch_size],\n        sorted_num_nodes[-ipu_batch_size:],\n    )\n    reverse_cumsum = np.sum(sorted_num_nodes) - 
np.cumsum(sorted_num_nodes) + sorted_num_nodes[-1]\n\n    # Start with the largest element in separate packs\n    mol_batches = [\n        MolPack().add_mol(initial_num_nodes[-ii - 1], argsort_num_nodes[-ii - 1])\n        for ii in range(ipu_batch_size)\n    ]\n\n    # Loop from smallest to largest molecule, and add each molecule to the pack with smallest expected sum\n    for ii, num_atom in enumerate(sorted_num_nodes):\n        remaining_mean = reverse_cumsum[ii] / (len(sorted_num_nodes) - ii)\n        max_expected, idx_max_expected = 0, 0\n        for jj, m in enumerate(mol_batches):\n            if m.num_graphs >= batch_size:\n                continue\n            expected = m.num_nodes + (\n                (batch_size - m.num_graphs) * remaining_mean\n            )  # Faster than calling m.expected_atoms\n            if expected > max_expected:\n                max_expected = expected\n                idx_max_expected = jj\n        mol_batches[idx_max_expected].add_mol(num_atom, argsort_num_nodes[ii])\n\n    packed_indices = [batch.indices for batch in mol_batches]\n\n    return packed_indices\n\n\ndef fast_packing(num_nodes: List[int], batch_size: int) -> List[List[int]]:\n    \"\"\"\n    Super fast algorithm for packing graphs such that each batch has roughly the\n    same number of atoms. 
Not as good as `smart_packing` but\n    faster and more scalable for-loop complexity of `O(batch_size)`.\n\n    Parameters:\n        num_nodes: List of the number of atoms per molecule for the entire global batch.\n            Must be of length `batch_size * ipu_batch_size`.\n\n        batch_size: The batch size per iteration, considering a single device and single\n            forward pass.\n            The global batch size is `batch_size * device_iterations * replication_factor * gradient_accumulation`\n\n    Returns:\n        packed_indices: A list of packs, each containing a list of indices, such that\n            if we collect `num_nodes` from the indices, then each pack has roughly the\n            same total number of atoms.\n    \"\"\"\n    num_nodes = np.asarray(num_nodes)\n    argsort_num_nodes = np.argsort(num_nodes)\n    ipu_batch_size = int(len(num_nodes) / batch_size)\n\n    packed_indices = np.stack(\n        [\n            np.random.permutation(argsort_num_nodes[ii * ipu_batch_size : (ii + 1) * ipu_batch_size])\n            for ii in range(batch_size)\n        ],\n        axis=0,\n    ).T.tolist()\n    return packed_indices\n\n\ndef hybrid_packing(num_nodes: List[int], batch_size: int) -> List[List[int]]:\n    \"\"\"\n    Uses a combination of the `smart_packing` `O(n^2)` on the most important data points,\n    and the `fast_packing` `O(n)` on the average-sized data points.\n\n    Depending on the expected complexity\n\n    Parameters:\n        num_nodes: List of the number of atoms per molecule for the entire global batch.\n            Must be of length `batch_size * ipu_batch_size`.\n\n        batch_size: The batch size per iteration, considering a single device and single\n            forward pass.\n            The global batch size is `batch_size * device_iterations * replication_factor * gradient_accumulation`\n\n    Returns:\n        packed_indices: A list of packs, each containing a list of indices, such that\n            if we collect 
`num_nodes` from the indices, then each pack has roughly the\n            same total number of atoms.\n    \"\"\"\n\n    # Determine the parameters based on the complexity of the smart-packing.\n    # The bigger the complexity, the more the `fast_packing` algorithm becomes\n    # statistically powerful, and the more speed benefits it provides.\n    smart_packing_complexity = len(num_nodes) ** 2 / batch_size\n    if smart_packing_complexity < 1e4:\n        return smart_packing(num_nodes=num_nodes, batch_size=batch_size)\n    elif smart_packing_complexity < 1e5:\n        big, small = 3, 6\n    else:\n        return fast_packing(num_nodes=num_nodes, batch_size=batch_size)\n\n    # Small datasets benefit from smart-packing, without compute burden\n    ipu_batch_size = int(len(num_nodes) / batch_size)\n    if len(num_nodes) < (big + small) * ipu_batch_size:\n        return smart_packing(num_nodes=num_nodes, batch_size=batch_size)\n\n    # Sort the list\n    num_nodes = np.asarray(num_nodes)\n    argsort_num_nodes = np.argsort(num_nodes)\n\n    # Smallest and biggest graphs are often outliers and will benefit from the `smart_packing`\n    biggest_graphs = argsort_num_nodes[-big * ipu_batch_size :]\n    smallest_graphs = argsort_num_nodes[: small * ipu_batch_size]\n    big_n_small_graphs = np.concatenate([biggest_graphs, smallest_graphs])\n    big_n_small_packs = smart_packing(num_nodes[big_n_small_graphs], batch_size=big + small)\n    big_n_small_indices = [big_n_small_graphs[pack] for pack in big_n_small_packs]\n    big_n_small_nodes = [num_nodes[pack] for pack in big_n_small_indices]\n\n    # Medium graphs will be packed faster\n    medium_graphs = argsort_num_nodes[small * ipu_batch_size : -big * ipu_batch_size]\n    medium_packs = fast_packing(num_nodes[medium_graphs], batch_size=batch_size - big - small)\n    medium_indices = [medium_graphs[pack] for pack in medium_packs]\n    medium_nodes = [num_nodes[pack] for pack in medium_indices]\n\n    # Pack the big/small 
with the medium in a smart way\n    big_n_small_sort = np.argsort(np.sum(np.stack(big_n_small_nodes, axis=1), axis=0))\n    medium_sort = np.argsort(np.sum(np.stack(medium_nodes, axis=1), axis=0))\n    # Pair the smallest medium packs with the biggest big/small packs to balance totals\n    packed_indices = [\n        np.concatenate([medium_indices[medium_sort[ii]], big_n_small_indices[big_n_small_sort[-ii - 1]]])\n        for ii in range(len(medium_indices))\n    ]\n\n    return packed_indices\n\n\ndef get_pack_sizes(packed_indices, num_nodes):\n    \"\"\"\n    Get the number of atoms of each pack\n    \"\"\"\n    pack_sums = []\n    for pack in packed_indices:\n        pack_sum = 0\n        for idx in pack:\n            pack_sum += num_nodes[idx]\n        pack_sums.append(pack_sum)\n    return pack_sums\n\n\ndef estimate_max_pack_node_size(num_nodes: Iterable[int], batch_size: int, combined_batch_size: int):\n    \"\"\"\n    Estimate the value of `max_num_nodes`, which represents the maximum number of nodes\n    needed in a batch to fit the data.\n\n    Parameters:\n        num_nodes: Number of nodes for all the graphs in the dataset\n        batch_size: The regular batch size per IPU\n        combined_batch_size: batch_size * device_iterations\n                             * replication_factor * gradient_accumulation\n\n    \"\"\"\n\n    # Estimate the packing size needed\n    num_nodes = np.asarray(num_nodes)\n    rand_indices = np.arange(len(num_nodes))\n    np.random.shuffle(rand_indices)\n    max_pack_size = 0\n    for ii in range(0, len(num_nodes), combined_batch_size):\n        this_indices = rand_indices[ii : ii + combined_batch_size]\n        choice = num_nodes[this_indices]\n        if len(choice) == combined_batch_size:\n            packed_indices = hybrid_packing(choice, batch_size)\n            max_pack_size = max(max_pack_size, max(get_pack_sizes(packed_indices, num_nodes[this_indices])))\n    max_pack_size_per_graph = max_pack_size / batch_size\n\n    return max_pack_size, max_pack_size_per_graph\n\n\ndef node_to_pack_indices_mask(\n    packed_indices: Iterable[Iterable[int]], 
all_num_nodes: Iterable[int], max_pack_size: Optional[int] = None\n) -> Tuple[torch.Tensor, torch.Tensor]:\n    \"\"\"\n    Given a list of packed indices, and the number of nodes in each graph,\n    return a tensor of shape (sum(all_num_nodes), 2) where the first column\n    is the pack index, and the second column is the node index within the pack.\n\n    Can be used to generate a dense packing of the nodes as follows:\n    ```\n    # node_features: A tensor of shape (num_nodes, num_node_features)\n    # num_packs: The number of packs desired\n    # max_nodes_per_pack: The maximum number of nodes per pack\n    # dense_pack: A tensor of shape (num_packs, max_nodes_per_pack, num_node_features)\n\n    dense_pack = torch.zeros([num_packs, max_nodes_per_pack, num_node_features])\n    dense_pack[pack_from_node_idx[:, 0], pack_from_node_idx[:, 1]] = node_features\n    ```\n\n    This is useful when using a Transformer, to avoid wasteful padding when the\n    longest sequence is much longer than the average sequence length.\n\n    Parameters:\n        packed_indices: A list of lists of graph indices, where each sub-list\n                        represents a pack of graphs\n        all_num_nodes: The number of nodes in each graph\n        max_pack_size: The maximum number of nodes per pack. 
If None, will be\n                          inferred from the provided packs.\n                          Useful to determine the shape of the `pack_attn_mask`.\n\n    Returns:\n        pack_from_node_idx: A tensor of shape (num_nodes, 2) where the first column\n                        is the pack index, and the second column is the node index within the pack.\n\n        pack_attn_mask: A tensor of shape (num_packs, max_pack_size, max_pack_size),\n                            that represents the attention masking for each pack,\n                            such that the graphs in the pack are masked out from each other.\n    \"\"\"\n\n    all_num_nodes = torch.as_tensor(all_num_nodes, dtype=torch.long)\n    cumsum_num_nodes = torch.cumsum(all_num_nodes, dim=0)\n    if max_pack_size is None:\n        pack_sizes = get_pack_sizes(packed_indices, all_num_nodes)\n        max_pack_size = max(pack_sizes)\n\n    # Get the node indices associated to the packs, with 0 padding\n    pack_from_node_idx = torch.zeros(sum(all_num_nodes), 2, dtype=torch.long)\n    pack_attn_mask = []  # masks for the attention\n    for ii, pack in enumerate(packed_indices):\n        jj = 0  # Counter for the number of nodes in the pack\n        this_pack_attn_mask = torch.ones((max_pack_size, max_pack_size), dtype=torch.bool)\n        for graph_idx in pack:\n            num_nodes = all_num_nodes[graph_idx]\n            node_idx = torch.arange(cumsum_num_nodes[graph_idx] - num_nodes, cumsum_num_nodes[graph_idx])\n            this_pack_attn_mask[jj : jj + num_nodes, jj : jj + num_nodes] = False\n            pack_from_node_idx[node_idx, 0] = ii\n            pack_from_node_idx[node_idx, 1] = jj + torch.arange(num_nodes)\n            jj += num_nodes\n        pack_attn_mask.append(this_pack_attn_mask)\n    pack_attn_mask = torch.stack(pack_attn_mask, dim=0)\n\n    return pack_from_node_idx, pack_attn_mask\n"
  },
  {
    "path": "graphium/utils/safe_run.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom loguru import logger\nimport traceback as tb\n\n\nclass SafeRun:\n    def __init__(self, name: str, raise_error: bool = True, verbose: int = 2) -> None:\n        \"\"\"\n        Run some code with error handling and some printing, using the with statement.\n\n        Example:\n            In the example below, `2 + None` raises an error, which will be caught and printed.\n            ```\n            with SafeRun(name=\"Addition that fails\", raise_error=False):\n                2 + None\n            ```\n\n        Parameters:\n            name: Name of the code, used for printing\n            raise_error: Whether to raise an error, or to catch it and print instead\n            verbose: The level of verbosity\n                0: Do not print anything\n                1: Print only the traced stack when an error is caught and `raise_error` is False\n                2: Print headers and footers at the start and exit of the with statement.\n        \"\"\"\n        self.name = name\n        self.raise_error = raise_error\n        self.verbose = verbose\n\n    def __enter__(self):\n        \"\"\"\n        Print that the with-statement started, if `self.verbose >= 2`\n        \"\"\"\n        if self.verbose >= 2:\n            logger.info(f\"\\n------------ {self.name} STARTED ------------\")\n\n    def __exit__(self, type, 
value, traceback):\n        \"\"\"\n        Handle the error. Raise it if `self.raise_error==True`, otherwise ignore it\n        and print it if `self.verbose >= 1`. Also print that the with-statement is\n        completed if `self.verbose >= 2`.\n        \"\"\"\n        if traceback is not None:\n            if self.raise_error:\n                if self.verbose >= 1:\n                    logger.error(f\"------------ {self.name} ERROR: ------------\")\n                return False\n            else:\n                if self.verbose >= 1:\n                    logger.error(f\"------------ {self.name} ERROR: ------------\\nERROR skipped. Traceback:\\n\")\n                    logger.trace(\"\".join(tb.format_exception(None, value, traceback)))\n                return True\n        else:\n            if self.verbose >= 2:\n                logger.info(f\"\\n------------ {self.name} COMPLETED ------------\\n\\n\")\n            return True\n"
  },
  {
    "path": "graphium/utils/spaces.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom copy import deepcopy\nimport torch\nimport torch.optim.lr_scheduler as sc\nimport torchmetrics.functional as TorchMetrics\n\nimport graphium.nn.base_layers as BaseLayers\nimport graphium.nn.ensemble_layers as EnsembleLayers\nimport graphium.nn.architectures as Architectures\nimport graphium.utils.custom_lr as CustomLR\nimport graphium.data.datamodule as Datamodules\nimport graphium.ipu.ipu_losses as IPULosses\nimport graphium.ipu.ipu_metrics as Metrics\nimport graphium.nn.pyg_layers as PygLayers\nimport graphium.nn.residual_connections as Residuals\nimport graphium.nn.encoders as Encoders\nimport graphium.trainer.losses as Losses\n\nPE_ENCODERS_DICT = {\n    \"laplacian_pe\": Encoders.laplace_pos_encoder.LapPENodeEncoder,\n    \"mlp\": Encoders.mlp_encoder.MLPEncoder,\n    \"signnet\": Encoders.signnet_pos_encoder.SignNetNodeEncoder,\n    \"gaussian_kernel\": Encoders.gaussian_kernel_pos_encoder.GaussianKernelPosEncoder,\n    \"bessel_kernel\": Encoders.bessel_pos_encoder.BesselSphericalPosEncoder,\n}\n\n\nFC_LAYERS_DICT = {\n    \"fc\": BaseLayers.FCLayer,\n}\n\nENSEMBLE_FC_LAYERS_DICT = {\n    \"ens-fc\": EnsembleLayers.EnsembleFCLayer,\n}\n\nPYG_LAYERS_DICT = {\n    \"pyg:gcn\": PygLayers.GCNConvPyg,\n    \"pyg:gin\": PygLayers.GINConvPyg,\n    \"pyg:gine\": PygLayers.GINEConvPyg,\n    
\"pyg:gated-gcn\": PygLayers.GatedGCNPyg,\n    \"pyg:pna-msgpass\": PygLayers.PNAMessagePassingPyg,\n    \"pyg:gps\": PygLayers.GPSLayerPyg,\n    \"pyg:dimenet\": PygLayers.DimeNetPyg,\n    \"pyg:mpnnplus\": PygLayers.MPNNPlusPyg,\n}\n\nLAYERS_DICT = deepcopy(FC_LAYERS_DICT)\nLAYERS_DICT.update(deepcopy(PYG_LAYERS_DICT))\n\nENSEMBLE_LAYERS_DICT = deepcopy(ENSEMBLE_FC_LAYERS_DICT)\n\nRESIDUALS_DICT = {\n    \"none\": Residuals.ResidualConnectionNone,\n    \"simple\": Residuals.ResidualConnectionSimple,\n    \"weighted\": Residuals.ResidualConnectionWeighted,\n    \"concat\": Residuals.ResidualConnectionConcat,\n    \"densenet\": Residuals.ResidualConnectionDenseNet,\n    \"random\": Residuals.ResidualConnectionRandom,\n}\n\nLOSS_DICT = {\n    \"bce\": torch.nn.BCELoss,\n    \"bce_logits\": torch.nn.BCEWithLogitsLoss,\n    \"mse\": torch.nn.MSELoss,\n    \"l1\": torch.nn.L1Loss,\n    \"mae\": torch.nn.L1Loss,\n    \"hybrid_ce\": Losses.HybridCELoss,\n    \"bce_ipu\": IPULosses.BCELossIPU,\n    \"bce_logits_ipu\": IPULosses.BCEWithLogitsLossIPU,\n    \"mse_ipu\": IPULosses.MSELossIPU,\n    \"mae_ipu\": IPULosses.L1LossIPU,\n    \"l1_ipu\": IPULosses.L1LossIPU,\n    \"hybrid_ce_ipu\": IPULosses.HybridCELossIPU,\n}\n\n\nSCHEDULER_DICT = {\n    \"CosineAnnealingLR\": sc.CosineAnnealingLR,\n    \"CosineAnnealingWarmRestarts\": sc.CosineAnnealingWarmRestarts,\n    \"CyclicLR\": sc.CyclicLR,\n    \"ExponentialLR\": sc.ExponentialLR,\n    \"LambdaLR\": sc.LambdaLR,\n    \"MultiStepLR\": sc.MultiStepLR,\n    \"ReduceLROnPlateau\": sc.ReduceLROnPlateau,\n    \"StepLR\": sc.StepLR,\n    \"ConstantLR\": sc.ConstantLR,\n    \"WarmUpLinearLR\": CustomLR.WarmUpLinearLR,\n}\n\nMETRICS_CLASSIFICATION = {\n    \"accuracy\": TorchMetrics.accuracy,\n    \"averageprecision\": TorchMetrics.average_precision,\n    \"auroc\": TorchMetrics.auroc,\n    \"confusionmatrix\": TorchMetrics.confusion_matrix,\n    \"f1\": TorchMetrics.f1_score,\n    \"fbeta\": 
TorchMetrics.fbeta_score,\n    \"precisionrecallcurve\": TorchMetrics.precision_recall_curve,\n    \"precision\": TorchMetrics.precision,\n    \"recall\": TorchMetrics.recall,\n    \"mcc\": TorchMetrics.matthews_corrcoef,\n    \"auroc_ipu\": Metrics.auroc_ipu,\n    \"accuracy_ipu\": Metrics.accuracy_ipu,\n    \"average_precision_ipu\": Metrics.average_precision_ipu,\n    \"f1_ipu\": Metrics.f1_score_ipu,\n    \"fbeta_ipu\": Metrics.fbeta_score_ipu,\n    \"precision_ipu\": Metrics.precision_ipu,\n    \"recall_ipu\": Metrics.recall_ipu,\n}\n\nMETRICS_REGRESSION = {\n    \"mae\": TorchMetrics.mean_absolute_error,\n    \"mape\": TorchMetrics.mean_absolute_percentage_error,\n    \"mse\": TorchMetrics.mean_squared_error,\n    \"msle\": TorchMetrics.mean_squared_log_error,\n    \"pearsonr\": TorchMetrics.pearson_corrcoef,\n    \"spearmanr\": TorchMetrics.spearman_corrcoef,\n    \"r2_score\": TorchMetrics.r2_score,\n    \"cosine\": TorchMetrics.cosine_similarity,\n    \"pearsonr_ipu\": Metrics.pearson_ipu,\n    \"spearmanr_ipu\": Metrics.spearman_ipu,\n    \"r2_score_ipu\": Metrics.r2_score_ipu,\n    \"mae_ipu\": Metrics.mean_absolute_error_ipu,\n    \"mse_ipu\": Metrics.mean_squared_error_ipu,\n}\n\nMETRICS_DICT = deepcopy(METRICS_CLASSIFICATION)\nMETRICS_DICT.update(METRICS_REGRESSION)\n\n\nDATAMODULE_DICT = {\n    \"GraphOGBDataModule\": Datamodules.GraphOGBDataModule,\n    \"MultitaskFromSmilesDataModule\": Datamodules.MultitaskFromSmilesDataModule,\n    \"ADMETBenchmarkDataModule\": Datamodules.ADMETBenchmarkDataModule,\n    \"FakeDataModule\": Datamodules.FakeDataModule,\n}\n\nGRAPHIUM_PRETRAINED_MODELS_DICT = {\n    \"dummy-pretrained-model\": \"tests/dummy-pretrained-model.ckpt\",  # dummy model used for testing purposes\n}\n\nFINETUNING_HEADS_DICT = {\n    \"mlp\": Architectures.FeedForwardNN,\n    \"gnn\": Architectures.FeedForwardPyg,\n    \"task_head\": Architectures.TaskHeads,\n    \"ens-mlp\": Architectures.EnsembleFeedForwardNN,\n}\n"
  },
  {
    "path": "graphium/utils/tensor.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nimport os\nimport numpy as np\nimport pandas as pd\nfrom matplotlib import pyplot as plt\nfrom typing import Iterable, List, Union, Any, Callable, Dict\nfrom inspect import getfullargspec\nfrom copy import copy, deepcopy\nfrom loguru import logger\n\nfrom rdkit.Chem import AllChem\n\nimport torch\nfrom torch import Tensor\n\n\ndef save_im(im_dir, im_name: str, ext: List[str] = [\"svg\", \"png\"], dpi: int = 600) -> None:\n    if not os.path.exists(im_dir):\n        if im_dir[-1] not in [\"/\", \"\\\\\"]:\n            im_dir += \"/\"\n        os.makedirs(im_dir)\n\n    if isinstance(ext, str):\n        ext = [ext]\n\n    full_name = os.path.join(im_dir, im_name)\n    for this_ext in ext:\n        plt.savefig(f\"{full_name}.{this_ext}\", dpi=dpi, bbox_inches=\"tight\", pad_inches=0)\n\n\ndef is_dtype_torch_tensor(dtype: Union[np.dtype, torch.dtype]) -> bool:\n    r\"\"\"\n    Verify if the dtype is a torch dtype\n\n    Parameters:\n        dtype: dtype\n            The dtype of a value. E.g. 
np.int32, str, torch.float\n\n    Returns:\n        A boolean saying if the dtype is a torch dtype\n    \"\"\"\n    return isinstance(dtype, torch.dtype) or (dtype == Tensor)\n\n\ndef is_dtype_numpy_array(dtype: Union[np.dtype, torch.dtype]) -> bool:\n    r\"\"\"\n    Verify if the dtype is a numpy dtype\n\n    Parameters:\n        dtype: dtype\n            The dtype of a value. E.g. np.int32, str, torch.float\n\n    Returns:\n        A boolean saying if the dtype is a numpy dtype\n    \"\"\"\n    is_torch = is_dtype_torch_tensor(dtype)\n    is_num = dtype in (int, float, complex)\n    if hasattr(dtype, \"__module__\"):\n        is_numpy = dtype.__module__ == \"numpy\"\n    else:\n        is_numpy = False\n\n    return (is_num or is_numpy) and not is_torch\n\n\ndef one_of_k_encoding(val: Any, classes: Iterable[Any]) -> List[int]:\n    r\"\"\"Converts a single value to a one-hot vector.\n\n    Parameters:\n        val: Any\n            class to be converted into a one-hot vector;\n            if not found in `classes`, the last element of the vector is set.\n        classes: iterator\n            a list or 1D array of allowed\n            choices for val to take\n    Returns:\n        A list of length len(classes) + 1\n    \"\"\"\n    encoding = [False] * (len(classes) + 1)\n    found = False\n    for i, v in enumerate(classes):\n        if v == val:\n            encoding[i] = True\n            found = True\n            break\n    if not found:\n        encoding[-1] = True\n    return encoding\n\n\ndef is_device_cuda(device: torch.device, ignore_errors: bool = False) -> bool:\n    r\"\"\"Check whether the given device is a cuda device.\n\n    Parameters:\n        device: str, torch.device\n            object to check for cuda\n        ignore_errors: bool\n            Whether to ignore the error if the device is not recognized.\n            Otherwise, ``False`` is returned in case of errors.\n    Returns:\n        is_cuda: bool\n    \"\"\"\n\n    if ignore_errors:\n        is_cuda = False\n        try:\n            is_cuda = torch.device(device).type == \"cuda\"\n        except Exception:\n            pass\n    else:\n        is_cuda = torch.device(device).type == \"cuda\"\n    return is_cuda\n\n\ndef nan_mean(input: Tensor, *args, **kwargs) -> Tensor:\n    r\"\"\"\n    Return the mean of all elements, while ignoring the NaNs.\n\n    Parameters:\n\n        input: The input tensor.\n\n        dim (int or tuple(int)): The dimension or dimensions to reduce.\n\n        keepdim (bool): whether the output tensor has dim retained or not.\n\n        dtype (torch.dtype, optional):\n            The desired data type of returned tensor.\n            If specified, the input tensor is casted to dtype before the operation is performed.\n            This is useful for preventing data type overflows. Default: None.\n\n    Returns:\n        output: The resulting mean of the tensor\n    \"\"\"\n\n    sum = torch.nansum(input, *args, **kwargs)\n    num = torch.sum(~torch.isnan(input), *args, **kwargs)\n    mean = sum / num\n    return mean\n\n\ndef nan_median(input: Tensor, **kwargs) -> Tensor:\n    r\"\"\"\n    Return the median of all elements, while ignoring the NaNs.\n    Contrary to `torch.nanmedian`, this function supports a list\n    of dimensions, or `dim=None`, and does not return the index of the median\n\n    Parameters:\n\n        input: The input tensor.\n\n        dim (int or tuple(int)): The dimension or dimensions to reduce.\n\n        keepdim (bool): whether the output tensor has dim retained or not.\n\n        dtype (torch.dtype, optional):\n            The desired data type of returned tensor.\n            If specified, the input tensor is casted to dtype before the operation is performed.\n            This is useful for preventing data type overflows. 
Default: None.\n\n    Returns:\n        output: The resulting median of the tensor.\n            Contrary to `torch.median`, it does not return the index of the median\n    \"\"\"\n\n    dim = kwargs.pop(\"dim\", None)\n    keepdim = kwargs.pop(\"keepdim\", False)\n\n    if isinstance(dim, Iterable) and not isinstance(dim, str):\n        dim = list(dim)\n        dim.sort()\n        # Implement the median for a list of dimensions\n        for d in dim:\n            input = input.unsqueeze(-1)\n            input = input.transpose(d, -1)\n        input = input.flatten(-len(dim))\n        median, _ = torch.nanmedian(input, dim=-1, keepdim=False)\n        if not keepdim:\n            for d in dim[::-1]:\n                median = median.squeeze(d)\n    else:\n        if dim is None:\n            median = torch.nanmedian(input.flatten().float()).to(input.dtype)\n        else:\n            median, _ = torch.nanmedian(input, dim=dim, keepdim=keepdim)\n\n    return median\n\n\ndef nan_var(input: Tensor, unbiased: bool = True, **kwargs) -> Tensor:\n    r\"\"\"\n    Return the variance of all elements, while ignoring the NaNs.\n    If unbiased is True, Bessel’s correction will be used.\n    Otherwise, the sample deviation is calculated, without any correction.\n\n    Parameters:\n\n        input: The input tensor.\n\n        unbiased: whether to use Bessel’s correction (δN = 1).\n\n        dim (int or tuple(int)): The dimension or dimensions to reduce.\n\n        keepdim (bool): whether the output tensor has dim retained or not.\n\n        dtype (torch.dtype, optional):\n            The desired data type of returned tensor.\n            If specified, the input tensor is casted to dtype before the operation is performed.\n            This is useful for preventing data type overflows. 
Default: None.\n\n    Returns:\n        output: The resulting variance of the tensor\n    \"\"\"\n\n    mean_kwargs = deepcopy(kwargs)\n    mean_kwargs.pop(\"keepdim\", None)\n    dim = mean_kwargs.pop(\"dim\", [ii for ii in range(input.ndim)])\n    mean = nan_mean(input, dim=dim, keepdim=True, **mean_kwargs)\n    dist = input - mean\n    dist2 = dist * dist\n    var = nan_mean(dist2, **kwargs)\n\n    if unbiased:\n        num = torch.sum(~torch.isnan(input), **kwargs)\n        var = var * num / (num - 1)\n\n    return var\n\n\ndef nan_std(input: Tensor, unbiased: bool = True, **kwargs) -> Tensor:\n    r\"\"\"\n    Return the standard deviation of all elements, while ignoring the NaNs.\n    If unbiased is True, Bessel’s correction will be used.\n    Otherwise, the sample deviation is calculated, without any correction.\n\n    Parameters:\n\n        input: The input tensor.\n\n        unbiased: whether to use Bessel’s correction (δN = 1).\n\n        dim (int or tuple(int)): The dimension or dimensions to reduce.\n\n        keepdim (bool): whether the output tensor has dim retained or not.\n\n        dtype (torch.dtype, optional):\n            The desired data type of returned tensor.\n            If specified, the input tensor is casted to dtype before the operation is performed.\n            This is useful for preventing data type overflows. 
Default: None.\n\n    Returns:\n        output: The resulting standard deviation of the tensor\n    \"\"\"\n    variances = nan_var(input=input, unbiased=unbiased, **kwargs)\n    return torch.sqrt(variances.float()).to(variances.dtype)\n\n\ndef nan_mad(input: Tensor, normal: bool = True, **kwargs) -> Tensor:\n    r\"\"\"\n    Return the median absolute deviation of all elements, while ignoring the NaNs.\n\n    Parameters:\n\n        input: The input tensor.\n\n        normal: whether to multiply the result by 1.4826 to mimic the\n            standard deviation for normal distributions.\n\n        dim (int or tuple(int)): The dimension or dimensions to reduce.\n\n        keepdim (bool): whether the output tensor has dim retained or not.\n\n        dtype (torch.dtype, optional):\n            The desired data type of returned tensor.\n            If specified, the input tensor is casted to dtype before the operation is performed.\n            This is useful for preventing data type overflows. 
Default: None.\n\n    Returns:\n        output: The resulting median absolute deviation of the tensor\n    \"\"\"\n    median_kwargs = deepcopy(kwargs)\n    median_kwargs.pop(\"keepdim\", None)\n    dim = median_kwargs.pop(\"dim\", [ii for ii in range(input.ndim)])\n    median = nan_median(input, dim=dim, keepdim=True, **median_kwargs)\n    dist = (input - median).abs()\n    mad = nan_median(dist, **kwargs)\n    if normal:\n        mad = mad * 1.4826\n    return mad\n\n\nclass ModuleWrap(torch.nn.Module):\n    r\"\"\"\n    Wrap a function into a `torch.nn.Module`, with possible `*args` and `**kwargs`\n\n    Parameters:\n        func: function to wrap into a module\n    \"\"\"\n\n    def __init__(self, func, *args, **kwargs) -> None:\n        super().__init__()\n        self.func = func\n        self.__name__ = f\"ModuleWrap({self.func.__name__})\"\n        self.args = args\n        self.kwargs = kwargs\n\n    def forward(self, *args, **kwargs):\n        \"\"\"\n        Calls the function `self.func` with the arguments `self.func(*self.args, *args, **self.kwargs, **kwargs)`\n        \"\"\"\n        return self.func(*self.args, *args, **self.kwargs, **kwargs)\n\n    def __repr__(self):\n        return self.__name__\n\n\nclass ModuleListConcat(torch.nn.ModuleList):\n    r\"\"\"\n    A list of neural modules similar to `torch.nn.ModuleList`,\n    but where the modules are applied on the same input and\n    concatenated together, instead of being applied sequentially.\n\n    Parameters:\n        dim: The dimension for the concatenation\n    \"\"\"\n\n    def __init__(self, dim: int = -1):\n        super().__init__()\n        self.dim = dim\n\n    def forward(self, *args, **kwargs) -> Tensor:\n        r\"\"\"\n        Apply all layers on the `args` and `kwargs`, and concatenate\n        their output alongside the dimension `self.dim`.\n        \"\"\"\n        h = []\n        for module in self:\n            h.append(module.forward(*args, **kwargs))\n\n        return 
torch.cat(h, dim=self.dim)\n\n\ndef parse_valid_args(param_dict, fn):\n    r\"\"\"\n    Filter a dictionary of parameters, keeping only the arguments accepted by the given function.\n\n    Parameters\n    ----------\n    fn: func\n        The function against which the arguments are checked.\n    param_dict: dict\n        Dictionary of the arguments.\n\n    Returns\n    -------\n        param_dict: dict\n            Valid parameter dictionary for the given function.\n    \"\"\"\n    param_dict_cp = copy(param_dict)\n    for key, param in param_dict.items():\n        if not arg_in_func(fn=fn, arg=key):\n            logger.warning(\n                f\"{key} is not an available argument for {fn.__name__}, and is ignored by default.\"\n            )\n            param_dict_cp.pop(key)\n\n    return param_dict_cp\n\n\ndef arg_in_func(fn, arg):\n    r\"\"\"\n    Check if a function takes the given argument.\n\n    Parameters\n    ----------\n    fn: func\n        The function to check the argument.\n    arg: str\n        The name of the argument.\n\n    Returns\n    -------\n        res: bool\n            True if the function contains the argument, otherwise False.\n    \"\"\"\n    fn_args = getfullargspec(fn)\n    return (fn_args.varkw is not None) or (arg in fn_args[0])\n\n\ndef tensor_fp16_to_fp32(tensor: Tensor) -> Tensor:\n    r\"\"\"Cast a tensor from fp16 to fp32 if it is in fp16\n\n    Parameters:\n        tensor: A tensor. If it is in fp16, it will be casted to fp32\n\n    Returns:\n        tensor: The tensor casted to fp32 if it was in fp16\n    \"\"\"\n    if tensor.dtype == torch.float16:\n        return tensor.to(dtype=torch.float32)\n    return tensor\n\n\ndef dict_tensor_fp16_to_fp32(\n    dict_tensor: Union[Tensor, Dict[str, Tensor], Dict[str, Dict[str, Tensor]]],\n) -> Union[Tensor, Dict[str, Tensor], Dict[str, Dict[str, Tensor]]]:\n    r\"\"\"Recursively cast a dictionary of tensors from fp16 to fp32 if it is in fp16\n\n    Parameters:\n        dict_tensor: A recursive dictionary of tensors. 
To be casted to fp32 if it was in fp16.\n\n    Returns:\n        dict_tensor: The recursive dictionary of tensors casted to fp32 if it was in fp16\n    \"\"\"\n    if isinstance(dict_tensor, dict):\n        for key, value in dict_tensor.items():\n            dict_tensor[key] = dict_tensor_fp16_to_fp32(value)\n    else:\n        dict_tensor = tensor_fp16_to_fp32(dict_tensor)\n\n    return dict_tensor\n"
  },
  {
    "path": "install_ipu.sh",
    "content": "#!/bin/bash\n\n# --------------------------------------------------------------------------------\n# Copyright (c) 2023 Graphcore Limited.\n# Use of this software is subject to the terms and conditions outlined in the LICENSE file.\n# Unauthorized modification, distribution, or use is prohibited. Provided 'as is' without\n# warranties of any kind.\n#\n# Graphcore Limited is not liable\n# for any damages arising from its use.\n# Refer to the LICENSE file for the full terms and conditions.\n# --------------------------------------------------------------------------------\n\n# Default location for the virtual environment\ndefault_venv_name=\".graphium_ipu\"\n\n# Allow the user to specify the location of their virtual environment\n# If not specified, use the default location\nvenv_name=${1:-$default_venv_name}\n\n# Constants\nsdk_compressed_file=\"poplar_sdk-ubuntu_20_04-3.3.0-208993bbb7.tar.gz\"\nsdk_wheel_file=\"poptorch-3.3.0+113432_960e9c294b_ubuntu_20_04-cp38-cp38-linux_x86_64.whl\"\nsdk_url=\"https://downloads.graphcore.ai/direct?package=poplar-poplar_sdk_ubuntu_20_04_3.3.0_208993bbb7-3.3.0&file=${sdk_compressed_file}\"\nsdk_path=\"${venv_name}/poplar_sdk-ubuntu_20_04-3.3.0+1403-208993bbb7\"\n\n# Check for Python3 and pip\nif ! command -v python3 &>/dev/null; then\n    echo \"Python3 is required but it's not installed. Exiting.\"\n    exit 1\nfi\n\nif ! command -v pip3 &>/dev/null; then\n    echo \"pip3 is required but it's not installed. 
Exiting.\"\n    exit 1\nfi\n\n# Remove existing venv directory if it exists\nif [[ -d $venv_name ]]; then\n    echo \"Removing existing virtual environment directory...\"\n    rm -rf $venv_name\nfi\n\n# Create the virtual environment\necho \"Creating virtual environment...\"\nmkdir -p $venv_name\npython3 -m venv $venv_name\nsource $venv_name/bin/activate\n\n# Update pip to the latest version\necho \"Upgrading pip...\"\npython3 -m pip install --upgrade pip\n\n# Download the Poplar SDK\necho \"Downloading Poplar SDK...\"\nwget -q -O \"${venv_name}/${sdk_compressed_file}\" \"$sdk_url\"\n\n# Check the wget exit status\nif [ $? -ne 0 ]; then\n    echo \"Failed to download Poplar SDK. Exiting.\"\n    exit 1\nfi\n\n# Unzip the SDK file\necho \"Extracting Poplar SDK...\"\ntar -xzf \"$venv_name/$sdk_compressed_file\" -C $venv_name\n\n# Install the PopTorch wheel\necho \"Installing PopTorch...\"\npython3 -m pip install \"${sdk_path}/${sdk_wheel_file}\"\n\n# Enable Poplar SDK (including Poplar and PopART)\necho \"Enabling Poplar SDK...\"\nsource ${sdk_path}/enable\n\n# Install the IPU specific and Graphium requirements\necho \"Installing IPU specific and Graphium requirements...\"\npython3 -m pip install -r requirements_ipu.txt\n\n# Install Graphium in dev mode\necho \"Installing Graphium in dev mode...\"\npython3 -m pip install --no-deps -e .\n\n# This is a quick test to make sure poptorch is correctly installed\nif python3 -c \"import poptorch;print('poptorch installed correctly')\" &> /dev/null; then\n    echo \"Installation completed successfully.\"\nelse\n    echo \"Installation was not successful. 
Please check the logs and try again.\"\n    exit 1  # Exit with status code 1 to indicate failure\nfi\n\n# Download the datafiles (Total ~ 10Mb - nothing compared to the libraries)\necho \"Downloading the sub-datasets that make up the ToyMix dataset\"\ntoymix_dir=expts/data/neurips2023/small-dataset/\nmkdir -p $toymix_dir\n\nbase_url=\"https://storage.valencelabs.com/graphium/datasets/neurips_2023/Small-dataset/\"\nfiles=(\"ZINC12k.csv.gz\" \"Tox21-7k-12-labels.csv.gz\" \"qm9.csv.gz\" \"qm9_random_splits.pt\" \"Tox21_random_splits.pt\" \"ZINC12k_random_splits.pt\")\n\nfor file in \"${files[@]}\"; do\n    if [ ! -f \"${toymix_dir}${file}\" ]; then\n        echo \"Downloading ${file}...\"\n        wget -P \"${toymix_dir}\" \"${base_url}${file}\"\n    else\n        echo \"${file} already exists. Skipping...\"\n    fi\ndone\n\necho \"Data has been successfully downloaded.\""
  },
  {
    "path": "mkdocs.yml",
    "content": "site_name: \"graphium\"\nsite_description: \"Graphium: Scaling molecular GNNs to infinity.\"\nsite_url: \"https://github.com/datamol-io/graphium\"\nrepo_url: \"https://github.com/datamol-io/graphium\"\nrepo_name: \"valence-discovery/graphium\"\ncopyright: Copyright 2020 - 2023 datamol.io\n\nremote_branch: \"gh-pages\"\nuse_directory_urls: false\ndocs_dir: \"docs\"\n\nnav:\n  - Overview: index.md\n  - Baseline: baseline.md\n  - API:\n      - graphium.nn:\n          - graphium.nn: api/graphium.nn/graphium.nn.md\n          - graphium.nn.architectures: api/graphium.nn/architectures.md\n          - graphium.nn.encoders: api/graphium.nn/encoders.md\n          - graphium.nn.pyg_layers: api/graphium.nn/pyg_layers.md\n      - graphium.features: api/graphium.features.md\n      - graphium.trainer: api/graphium.trainer.md\n      - graphium.data: api/graphium.data.md\n      - graphium.utils: api/graphium.utils.md\n      - graphium.config: api/graphium.config.md\n      - graphium.ipu: api/graphium.ipu.md\n      - graphium.finetuning: api/graphium.finetuning.md\n  - Tutorials:\n      - feature_processing:\n          - Add new positional encoding: tutorials/feature_processing/add_new_positional_encoding.ipynb\n          - Convert CSV to Parquet files: tutorials/feature_processing/csv_to_parquet.ipynb\n          - Timing parallel processing: tutorials/feature_processing/timing_parallel.ipynb\n      - gnn_layers:\n          - Add new gnn layers: tutorials/gnn/add_new_gnn_layers.ipynb\n          - Making GNN networks: tutorials/gnn/making_gnn_networks.ipynb\n          - Using GNN layers: tutorials/gnn/using_gnn_layers.ipynb\n      - model_training:\n          - Simple Molecular Model: tutorials/model_training/simple-molecular-model.ipynb\n          - Training on IPU: tutorials/model_training/running-multitask-ipu.ipynb\n  - Design: design.md\n  - Datasets: datasets.md\n  - Pretrained Models: pretrained_models.md\n  - CLI:\n    - About: cli/reference.md\n    - 
Commands:\n      - graphium: cli/graphium.md\n      - graphium-train: cli/graphium-train.md\n  - Contribute: contribute.md\n  - License: license.md\n\ntheme:\n  name: material\n  # NOTE(hadim): to customize the material primary and secondary\n  # color check `docs/_assets/css/custom-graphium.css`.\n  features:\n    - navigation.tabs\n    - navigation.expand\n  favicon: images/logo-mini-dark.png\n  logo: images/logo-mini-white.svg\n\nextra_css:\n  - _assets/css/custom.css\n  - _assets/css/custom-graphium.css\n\nextra_javascript:\n  - https://cdn.jsdelivr.net/npm/mathjax@3/es5/tex-mml-chtml.js\n  - _assets/js/google-analytics.js\n\nmarkdown_extensions:\n  - admonition\n  - markdown_include.include\n  - pymdownx.emoji\n  - pymdownx.magiclink\n  - pymdownx.superfences\n  - pymdownx.tabbed\n  - pymdownx.tasklist\n  - pymdownx.details\n  - pymdownx.arithmatex:\n      generic: true\n  - toc:\n      permalink: true\n\nwatch:\n  - graphium/\n\nplugins:\n  - search\n\n  - mkdocstrings:\n      handlers:\n        python:\n          setup_commands:\n            - import sys\n            - sys.path.append(\"docs\")\n            - sys.path.append(\"graphium\")\n          options:\n            new_path_syntax: yes\n            show_root_heading: yes\n            heading_level: 3\n            show_source: false\n\n  - mkdocs-jupyter:\n      execute: False\n\n  - mike:\n      version_selector: true\n\nextra:\n  version:\n    # Multi versioning provider for mkdocs-material (used for the JS selector)\n    provider: mike\n"
  },
  {
    "path": "notebooks/compare-pretraining-finetuning-performance.ipynb",
    "content": "{\n \"cells\": [\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 1,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"%load_ext autoreload\\n\",\n    \"%autoreload 2\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 2,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"import fsspec\\n\",\n    \"import yaml\\n\",\n    \"import pandas as pd\\n\",\n    \"import matplotlib.pyplot as plt\\n\",\n    \"import seaborn as sns\\n\",\n    \"from tqdm import tqdm\\n\",\n    \"from typing import Literal\\n\",\n    \"from dataclasses import dataclass, field\\n\",\n    \"from collections import defaultdict\\n\",\n    \"from graphium.utils import fs\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 3,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"ROOT_DIR = \\\"gs://graphium-private/pretrained-models/ToyMix/cas\\\"\\n\",\n    \"\\n\",\n    \"PT_TASKS = [\\\"qm9\\\", \\\"tox21\\\", \\\"zinc\\\"]\\n\",\n    \"PT_FT_RELS = {\\n\",\n    \"    \\\"toymix_gcn_1\\\": {\\\"caco2\\\": \\\"finetuning_caco2_wang_gcn_1\\\", \\\"lipophilicity\\\": \\\"finetuning_lipophilicity_astrazeneca_gcn_1\\\"},\\n\",\n    \"    \\\"toymix_gcn_2\\\": {\\\"caco2\\\": \\\"finetuning_caco2_wang_gcn_2\\\", \\\"lipophilicity\\\": \\\"finetuning_lipophilicity_astrazeneca_gcn_2\\\"},\\n\",\n    \"}\\n\",\n    \"FT_METRICS = {\\\"caco2\\\": \\\"graph_caco2_wang/mae/test\\\", \\\"lipophilicity\\\": \\\"graph_lipophilicity_astrazeneca/r2_score/test\\\"}\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 4,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"@dataclass\\n\",\n    \"class FinetuningResult: \\n\",\n    \"    name: str\\n\",\n    \"    scores: dict[str, list[float]] = field(default_factory=lambda: defaultdict(list))\\n\",\n    \"\\n\",\n    \"    def best(self, metric, minimize: bool = False):\\n\",\n    \"        return 
max(self.scores[metric]) if not minimize else min(self.scores[metric])\\n\",\n    \"\\n\",\n    \"\\n\",\n    \"@dataclass\\n\",\n    \"class PretrainingResult: \\n\",\n    \"    name: str\\n\",\n    \"    loss: dict[Literal[\\\"qm9\\\", \\\"zinc\\\", \\\"tox21\\\", \\\"all\\\"], float] = field(default_factory=dict)\\n\",\n    \"    ft_results: dict[str, FinetuningResult] = field(default_factory=dict)\\n\",\n    \"\\n\",\n    \"    @property\\n\",\n    \"    def finetuning_tasks(self):\\n\",\n    \"        return sorted(list(self.ft_results.keys()))\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 5,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/Users/cas.wognum/micromamba/envs/graphium/lib/python3.10/site-packages/google/auth/_default.py:78: UserWarning: Your application has authenticated using end user credentials from Google Cloud SDK without a quota project. You might receive a \\\"quota exceeded\\\" or \\\"API not enabled\\\" error. See the following page for troubleshooting: https://cloud.google.com/docs/authentication/adc-troubleshooting/user-creds. 
\\n\",\n      \"  warnings.warn(_CLOUD_SDK_CREDENTIALS_WARNING)\\n\",\n      \"100%|██████████| 2/2 [00:20<00:00, 10.05s/it]\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"globber, _ = fsspec.core.url_to_fs(ROOT_DIR)\\n\",\n    \"\\n\",\n    \"results = {}\\n\",\n    \"\\n\",\n    \"for pt_dir, ft_dirs in tqdm(PT_FT_RELS.items()):\\n\",\n    \"    \\n\",\n    \"    # Create a new results object\\n\",\n    \"    results[pt_dir] = PretrainingResult(name=pt_dir)\\n\",\n    \"\\n\",\n    \"    # Parse the pre-training results\\n\",\n    \"    pt_results_path = fs.join(ROOT_DIR, pt_dir, \\\"results\\\", \\\"test_results.yaml\\\")\\n\",\n    \"    with fsspec.open(pt_results_path, \\\"r\\\") as f:\\n\",\n    \"        pt_results = yaml.safe_load(f)\\n\",\n    \"    results[pt_dir].loss = {k: pt_results[f\\\"graph_{k}/loss/test\\\"] for k in PT_TASKS}\\n\",\n    \"    results[pt_dir].loss[\\\"all\\\"] = pt_results[\\\"loss/test\\\"]\\n\",\n    \"\\n\",\n    \"    # Parse the associated fine-tuning results\\n\",\n    \"    for ft_label, ft_dir in ft_dirs.items():\\n\",\n    \"\\n\",\n    \"        # Create a new results object\\n\",\n    \"        ft_results = FinetuningResult(name=ft_label)\\n\",\n    \"        \\n\",\n    \"        # Find all results for all trials\\n\",\n    \"        ft_results_pattern = fs.join(ROOT_DIR, ft_dir, \\\"**\\\", \\\"test_results.yaml\\\")\\n\",\n    \"        paths = globber.glob(ft_results_pattern)\\n\",\n    \"\\n\",\n    \"        # Save all scores\\n\",\n    \"        for path in paths: \\n\",\n    \"            with globber.open(path, \\\"r\\\") as f:\\n\",\n    \"                data = yaml.safe_load(f)\\n\",\n    \"            for k, v in data.items():\\n\",\n    \"                ft_results.scores[k].append(v)\\n\",\n    \"        \\n\",\n    \"        # Save the finetuning results to the pre-training results\\n\",\n    \"        results[pt_dir].ft_results[ft_label] = ft_results\\n\",\n    \"\\n\",\n    \"            \"\n   
]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 6,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"def draw_boxplot(results: dict[str, FinetuningResult], fine_tuning_task: str,  metric_label: str = None, loss_label: str = \\\"all\\\", ax=None):\\n\",\n    \"    if ax is None: \\n\",\n    \"        _, ax = plt.subplots()\\n\",\n    \"    if metric_label is None:\\n\",\n    \"        metric_label = FT_METRICS[fine_tuning_task]\\n\",\n    \"    for pt_label, pt_results in results.items():\\n\",\n    \"        positions = [round(pt_results.loss[loss_label], 3)]\\n\",\n    \"        data = pt_results.ft_results[fine_tuning_task].scores[metric_label]\\n\",\n    \"        ax.boxplot(data, positions=positions)\\n\",\n    \"    ax.set_xlabel(f\\\"Pre-training loss on {loss_label}\\\")\\n\",\n    \"    ax.set_ylabel(metric_label)   \"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 7,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"image/png\": 
"<base64 PNG omitted: boxplot of the fine-tuning test metric vs. pre-training loss>
N/HrdUqVKqXLmymjVr5t127rnnSvq/+/fyyy/Xs88+q8GDByssLEwNGzbU1Vdf7f15lDz0iB6dqUdnuq7mzZurfPnyOa4rOzvb+2HlZcqU0dy5c/X+++/r5MmTmjx5cr7Hu/vuu7V582bNnz/fr3WgZKJbdMvXbj300EP66quv9MEHHyg8PDzf/cqVK6d69erlefx9+/bp119/VefOnc96fUB+6BbdOlu39u3bp8GDB6thw4aKiopSVFSUjh07dtbfR3HGJ5MVok6dOmn69OkqXbq0atasqdKlS+e4/O8n+ZKUnZ2tGjVqaNWqVbmOVbFiRVWsWDHHXySqVKnSWdfwz+uYOHGiJk2apMmTJ6tZs2YqX768hg8frlOnTp3xOP9cu8fjUXZ2ts8/c/rB/Pef+ecHTxpjzni8fzLGnPEYERER2rhxo1atWqVPP/1UDz/8sMaNG6cNGzaoYsWKWr58udauXatPP/1UL774osaMGaOvv/5adevW9eu6/r7dyf3ki7yOe7b7d+TIkRoxYoTS0tJ0zjnnKDU1VaNHj87z9qH4o0f0yEmP8rquvx/vtLVr10qSDh06pEOHDuX6XUvSPffco8WLF2v16tWqXbu2X+tAyUS36JYv99PcuXM1adIkrVq16qxtyev4p9dxpmEW4Cu6RbfOdj8NGDBA+/fv1+TJkxUTE6OwsDC1bdv2rL+P4oxXShWi8uXLq379+oqJicn1L2teWrZsqb179yo0NFT169fP8VWlSpVc209HqUyZMsrKyvJpTV988YV69Oihvn37qnnz5oqNjVVKSkqBbqcTjRo10vr163Ns++abb/w6RuPGjbVmzZoc29auXauGDRt6Xw0UGhqqLl266JlnntHmzZuVmpqqzz77TNJfwUhISNCjjz6qpKQklSlTRosWLcr3utauXZsjemvXrlVERIRq1arl17pPi4qKUo0aNbRu3TrvtszMTCUmJjo6Xl48Ho9q1qyp8PBwzZ8/X9HR0WrZsmXAjo+igx7ljx79Ja/fXePGjbVp06Ycfynvyy+/VEhIiBo2bChJ+umnnzRixAjNnDlTF198sW699dYcJ2PGGN19991auHChPvvsMwbj8Bndyh/d+stXX32l22+/Xa+88oouvvhix8eR/noyW6dOHa1YsaJAx0HJRrfyR7f+8sUXX2jo0KG66qqr1KRJE4WFhenAgQOOj1ccMJSySJcuXdS2bVv17NlTy5YtU2pqqtauXauxY8ee8QFbp04dbd68Wdu2bdOBAwfO+Kc/69ev750QJycn69///rf27t1bGDfnjO655x7NmjVLc+bMUUpKip544glt3rzZrz/be++992rFihV6/PHHtX37ds2ZM0dTp071/tnVjz76SFOmTNGmTZv0888/64033lB2drYaNWqkr7/+WuPHj9c333yjXbt2aeHChdq/f7/i4+PzvK677rpLu3fv1j333KOtW7fqww8/1COPPKKRI0cqJMT5w2jYsGF66qmntGjRIm3dulV33XWXDh8+7Ph4f/fss8/qu+++05YtW/T444/rqaee0pQpU3j7HnxCj0pej/L63d1yyy0qW7as+vfvr++//14rV67UPffco379+uncc89VVlaW+vXrp8svv1wDBw7U7Nmz9f3332vixIne4w4ZMkRz587VvHnzFBERob1792rv3r06efKkd59Dhw5p06ZN+uGHHyT99RL/TZs2ufLvA4ouulWyurV371716tVLffr00RVXXOFty/79+x0dT5LGjRuniRMnasqUKUpJSdHGjRtzvPUKCDS6VbK6Jf31+3jzzTeVnJysr7/+WrfcckuJf6UmQymLeDweffLJJ+rYsaNuu+02NWzYUH369FFqaqr3M4Pycscdd6hRo0Zq3bq1qlatqi+//DLfff/zn/+oZcuWuuKKK3TppZeqevXq6tmzZyHcmjO75ZZbNHr0aI0aNUotW7bUzp07NWDAAJUtW9bnY7Rs2VLvvPOOFi
xYoKZNm+rhhx/WY489pgEDBkj66yWvCxcu1GWXXab4+Hi9/PLLmj9/vpo0aaLIyEitXr1aV111lRo2bKixY8dq4sSJ6tatW57XVatWLX3yySdav369mjdvrsGDB2vQoEEaO3Zsge6He++9V7feeqsGDBigtm3bKiIiQr169SrQMU9bsmSJOnTooNatW+vjjz/Whx9+6MrvGkUTPSp5Pcrrd1euXDktW7ZMhw4d0oUXXqjrr79enTt31tSpUyVJTz75pFJTUzVjxgxJUvXq1fXqq69q7Nix3rcbTJ8+XUeOHNGll16qGjVqeL/efvtt73UvXrxYLVq08H72XZ8+fdSiRQu9/PLLBbpNKFnoVsnq1tatW/Xbb79pzpw5Odpy4YUXOj5m//79NXnyZE2bNk1NmjTRNddc48orSlBy0K2S1S1Jeu211/T777+rRYsW6tevn4YOHapq1aoV6JhFncf4+0ZOoJB07dpV1atX15tvvun2Ulw1YMAAHT58WB988IHbSwFKLHoEoKihWwCKGroFiQ86h0tOnDihl19+WVdccYVKlSql+fPn63//+5+WL1/u9tIAlDD0CEBRQ7cAFDV0C/lhKAVXnH6p6hNPPKGMjAw1atRI77//vrp06eL20gCUMPQIQFFDtwAUNXQL+eHtewAAAAAAAAg6PugcAAAAAAAAQcdQCgAAAAAAAEHHUAoAAAAAAABBx1AKAAAAAAAAQcdQCgAAAAAAAEHHUAoAAAAAAABBx1AKAAAAAAAAQcdQCgAAAAAAAEHHUAoAAAAAAABB9/8BTFHE/p4+VYsAAAAASUVORK5CYII=\",\n      \"text/plain\": [\n       \"<Figure size 1200x400 with 4 Axes>\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    }\n   ],\n   \"source\": [\n    \"fig, axs = plt.subplots(nrows=1, ncols=4, figsize=(12, 4))\\n\",\n    \"for i, loss_label in enumerate(PT_TASKS + [\\\"all\\\"]):\\n\",\n    \"    draw_boxplot(results, \\\"caco2\\\", loss_label=loss_label, ax=axs[i])\\n\",\n    \"plt.tight_layout()\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 8,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"image/png\": 
\"iVBORw0KGgoAAAANSUhEUgAABKUAAAGGCAYAAACqvTJ0AAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMiwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8pXeV/AAAACXBIWXMAAA9hAAAPYQGoP6dpAABjMklEQVR4nO3deZyN9f//8ecxmBmMdSxjacbaEFlbpBKh1CfRIhXhg+/HR6XSxkeLRJRKScmWFqUNrSp7KokGH6mxhMmeJDO2Rmbevz/6zfk0zeCca65zrvfMedxvt7llrnOd67zO6Dy65t1ZfMYYIwAAAAAAACCMink9AAAAAAAAACIPi1IAAAAAAAAIOxalAAAAAAAAEHYsSgEAAAAAACDsWJQCAAAAAABA2LEoBQAAAAAAgLBjUQoAAAAAAABhx6IUAAAAAAAAwq641wOEU3Z2tnbv3q24uDj5fD6vxwHw/xljdOjQIVWvXl3FikXmWjl9AuwV6Y2iT4C9Ir1PEo0CbBVonyJqUWr37t2qVauW12MAOIkdO3aoZs2aXo/hCfoE2C9SG0WfAPtFap8kGgXY7nR9iqhFqbi4OEl//lDKli3r8TQAcmRkZKhWrVr+x2gkok+AvSK9UfQJsFek90miUYCtAu1TRC1K5Tyds2zZsgQLsFAkP+WaPgH2i9RG0SfAfpHaJ4lGAbY7XZ8i84XHAAAAAAAA8BSLUgAAAAAAAAg7FqUAAAAAAAAQdixKAQAAAAAAIOxYlAIAAAAAAEDYOVqUqlOnjn799dc82w8ePKg6deoUeCgAcIo+AbAVfQJgMxoFwAuOFqXS0tKUlZWVZ3tmZqZ27dpV4KEAwCn6BMBW9AmAzWgUAC8UD2bnDz74wP/nzz77TOXKlfN/n5WVpUWLFikpKcm14QAgUPQJgK3oEwCb0SgAXgpqUapr166SJJ/Pp969e+e6rESJEkpKStJTTz3l2nAAECj6BMBW9AmAzWgUAC8FtSiVnZ0tSapdu7ZWrVql+Pj4kAwFAMGiTwBsRZ8A2IxGAfBSUItSObZt25Zn28GDB1W+fPmCzgMABUKfANiKPgGwGY0C4AVHi1KPP/64kpKSdMMNN0iSrr/+es2ePVsJCQmaN2+emjZt6uqQCL2jR49qw4YNebYfO3ZMaWlpSkpKUmxsbK7LkpOTVapUqXCNCASEPkUeJ/2SaBjCjz4VbbQIhR2NKtpoFGzlaFFq8uTJmjlzpiRpwYIFWrhwoT799FO9/fbbuvfeezV//nxXh0TobdiwQS1btgzqOikpKWrRokWIJgKcoU+Rx0m/JBqG8KNPRRstQmFHo4o2GgVbOVqU2rNnj2rVqiVJ+uijj9S9e3d16tRJSUlJOu+881wdEOGRnJyslJSUPNtTU1PVs2dPzZw5Uw0bNsxzHcA29CnyOOlXzvWAcKJPRRstQmFHo4o2GgVbOVqUqlChgnbs2KFatWrp008/1ahRoyRJxhhlZWW5OiDCo1SpUqdcAW/YsCEr5CgU6FPkoV8oLOhT0UaLUNjRqKKNRsFWjhalrrnmGt10002qX7++fv31V3Xu3FmStHbtWtWrV8/VAQEgGPQJgK3oEwCb0SgAXnC0KDV+/HglJSVpx44deuKJJ1SmTBlJfz7lc9CgQa4OCADBoE8AbEWfANiMRgHwgqNFqRIlSuiee+7Js/3OO+8s6DwAUCD0CYCt6BMAm9EoAF4o5vSKr732mi688EJVr15dP/30kyTpmWee0fvvv+/acADgBH0CYCv6BMBmNApAuDlalJo0aZKGDBmizp076+DBg/43vitfvryeeeYZN+cDgKDQJwC2ok8AbEajAHjB0aLUc889p6lTp2r48OGKioryb2/VqpW+++4714YDgGDRJwC2ok8AbEajAHjB0aLUtm3b1Lx58zzbo6OjdeTIkQIPBQBO0ScAtqJPAGxGowB4wdGiVO3atbV27do82z/55BM1atSooDMBgGP0CYCt6BMAm9EoAF
5w9Ol79957r2699Vb9/vvvMsZo5cqVmjVrlsaMGaNp06a5PSMABIw+AbAVfQJgMxoFwAuOFqX69u2rEydO6L777tPRo0d10003qUaNGnr22WfVo0cPt2cEgIDRJwC2ok8AbEajAHjB0aKUJA0YMEADBgzQ/v37lZ2drSpVqrg5FwA4Rp8A2Io+AbAZjQIQbo7eU6p9+/Y6ePCgJCk+Pt4fq4yMDLVv39614QAgWPQJgK3oEwCb0SgAXnC0KLV06VIdP348z/bff/9dX3zxRYGHAgCn6BMAW9EnADajUQC8ENTL99atW+f/8w8//KC9e/f6v8/KytKnn36qGjVquDcdQmLz5s06dOhQQPumpqbm+ufpxMXFqX79+o5nA5yiT0VfMO2Sgu9XDjoGt9GnoieU51ISHUJ40aiih0ahMAlqUapZs2by+Xzy+Xz5PoUzNjZWzz33nGvDwX2bN29WgwYNgr5ez549A95306ZNRAphR5+KNqftkoLrVw46BjfRp6IlHOdSEh1C+NCoooVGobAJalFq27ZtMsaoTp06WrlypSpXruy/rGTJkqpSpYqioqJcHxLuyVkxnzlzpho2bHja/Y8dO6a0tDQlJSUpNjb2lPumpqaqZ8+eQT2TAXALfSragm2XFFy/ctAxhAJ9KlpCeS4l0SGEH40qWmgUCpugFqUmT56srl27Kjs7O1TzIEwaNmyoFi1aBLRvmzZtQjwNUHD0KTIE0y6JfsEO9Klo4lwKRQWNKppoFAqLoN7ofM+ePfrHP/6hhIQE/d///Z8+/vhjZWZmhmo2AAgYfQJgK/oEwGY0CoCXglqUmjFjhn7++We9/fbbKl++vO6++27Fx8frmmuu0csvv6z9+/eHak6/F154QbVr11ZMTIxatmzJJ0EAkESfANjLhj5JNApA/mxoFH0CIldQL9+TJJ/Pp4suukgXXXSRnnjiCaWmpurDDz/U1KlT9a9//UvnnXeeunTpohtvvNH1T2l46623dOedd+qFF15QmzZtNHnyZHXu3Fk//PCDzjjjDFdvCwjE0aNHtWHDhjzbT/Xa7OTkZJUqVSpcI0YU+oRIcbL2SKd/bwga5A0v+yTRKHjPyTmTRLPChXMoRDL65DHjop9//tlMmzbNdOnSxYwbN87NQxtjjDn33HPNwIEDc21LTk42Q4cODej66enpRpJJT093fbbCIiUlxUgyKSkpherYtsq5z8F8RdLPJ1DheGzSp8ItXH0pLB1z0h4a5FyoH5+h7pMxBWsUfcot1J0oLB0KltNuFbWfg9s4h6JRf0ejgkefQiPQx2bQz5T6qx9//FFbtmzRxRdfrNjYWFWuXFn9+vVTv379CnLYfB0/flwpKSkaOnRoru2dOnXS8uXL871OZmZmrtdDZ2RkuD4XIltycrJSUlLybM/5VIr8PvUiOTk5XONFNPqEouxk7ZFO3Z+c68Jb4eyTFHyj6BNCwck5U871EF6cQyHS0CdvOVqU+vXXX3XDDTdo8eLF8vl82rx5s+rUqaP+/furQoUKevLJJ92eU/v371dWVpaqVq2aa3vVqlW1d+/efK8zZswYPfLII67PAuQoVarUKT/VIthPCkPB0SdEgtO1R6I/NvKiT1LwjaJPCAXOmezHORQiFX3yVlBvdJ7jrrvuUvHixbV9+/Zcr6G84YYb9Mknn7g2XH58Pl+u740xebblGDZsmNLT0/1fO3bsCOlsALxHnwDYyss+SYE3ij4BkYlzKABecPRMqfnz5+uzzz5TzZo1c22vX7++fvrpJ1cG+7v4+HhFRUXlWTHft29fnpX1HNHR0YqOjg7JPIWV78Tval6tmGIPbpJ2O1qTPKnYg5vUvFox+U787upxgWDQp6IplO36KzqGUPKiT1LwjaJPpxbqHtEheIVzqKKBRqGwcbQodeTIkXzfZX7//v0hC0TJkiXVsmVLLViwQN26dfNvX7Bgga6++uqQ3GZRFHN4u1b/q4y07F/SMneP3VDS6n+VUe
rh7ZIucPfgQIDoU9EUynb9FR1DKHnRJ4lGuS3UPaJD8ArnUEUDjUJh42hR6uKLL9arr76qRx99VNKfT7fMzs7WuHHj1K5dO1cH/KshQ4aoV69eatWqlVq3bq0pU6Zo+/btGjhwYMhus6j5vcwZajH5sF5//XU1dPmN2VI3bNDNN9+s6VcUzY9u3bx5sw4dOhTQvqmpqbn+eTpxcXGqX7++49nwP/SpaAplu/7Kxo4F0x4p+P5INChcvOqTRKPcFOoe2dihYIXynEmiWaHCOVTRQKNOjT7Zx9Gi1Lhx43TJJZfo22+/1fHjx3Xffffp+++/14EDB/TVV1+5PaPfDTfcoF9//VUjR47Unj171LhxY82bN0+JiYkhu82ixhSP0Zq92TpWvoFUvZmrxz62N1tr9mbLFI9x9bg22Lx5sxo0aBD09Xr27Bnwvps2bSJgLqBPRVMo2/VXtnXMaXuk4Poj0aBw8KpPEo1yU6h7ZFuHghWOcyaJZoUC51BFA406OfpkJ0eLUo0aNdK6des0adIkRUVF6ciRI7rmmmt06623KiEhwe0Zcxk0aJAGDRoU0tsA/i5nNf1kHwf6d8eOHVNaWpqSkpIUGxt7yn1zPmo0mGdC4OToE4qSYNsjBdcfiQaFk5d9kmgUwiOU50wSzQolzqFQ1NEnOwW9KPXHH3+oU6dOmjx5Mh/FiYgTzMeBtmnTJsTT4O/oE4qqYD+KmP7Yhz4h0nDOVLjQKEQS+mSXoN+Ov0SJElq/fv1JP6ITALxCnwDYij4BsBmNAuAVR58Recstt2j69OluzwIABUafANiKPgGwGY0C4AVH7yl1/PhxTZs2TQsWLFCrVq1UunTpXJc//fTTrgwHAMGiTwBsRZ8A2IxGAfCCo0Wp9evX+1+DuWnTplyX8ZRPAF6iTwBsRZ8A2IxGAfCCo0WpJUuWuD0HwuTo0aOSpNWrVwe0f7CfIgd4jT4VTcG2Swr+E1MkOobQok9FQyjPpSQ6BO/QqKKBRqGwcbQo9Vc7d+6Uz+dTjRo13JgHIbZhwwZJ0oABA0J2G3FxcSE7NhAM+lR0hKNdf0XHEGr0qfAKV4/oELxEowovGoXCxtGiVHZ2tkaNGqWnnnpKhw8flvTnv5R33323hg8frmLFHL1/OsKga9eukqTk5GSVKlXqtPunpqaqZ8+emjlzpho2bHja/ePi4lS/fv2Cjgk4Rp+KpmDbJQXfrxx0DKFCn4qGUJ9LSXQI3qBRRQONQmHjaFFq+PDhmj59usaOHas2bdrIGKOvvvpKI0aM0O+//67Ro0e7PSdcEh8fr/79+wd9vYYNG/pfYw7YjD4VTU7bJdEv2IM+FQ2cS6GoolFFA41CYeNoUeqVV17RtGnT1KVLF/+2pk2bqkaNGho0aBDBAuAZ+gTAVvQJgM1oFAAvOHoO5oEDB5ScnJxne3Jysg4cOFDgoQDAKfoEwFb0CYDNaBQALzhalGratKkmTpyYZ/vEiRPVtGnTAg8FAE7RJwC2ok8AbEajAHjB0cv3nnjiCV155ZVauHChWrduLZ/Pp+XLl2vHjh2aN2+e2zMCQMDoEwBb0ScANqNRALzg6JlSbdu21caNG9WtWzcdPHhQBw4c0DXXXKONGzfqoosucntGAAgYfQJgK/oEwGY0CoAXHD1TSpJq1KjBm90BsBJ9AmAr+gTAZjQKQLg5eqbUjBkz9M477+TZ/s477+iVV14p8FAA4BR9AmAr+gTAZjQKgBccLUqNHTtW8fHxebZXqVJFjz32WIGHAgCn6BMAW9EnADajUQC84GhR6qefflLt2rXzbE9MTNT27dsLPBQAOEWfANiKPgGwGY0C4AVHi1JVqlTRunXr8mz/73//q0qVKhV4KABwij4BsBV9AmAzGgXAC44WpXr06KHBgwdryZIlysrKUlZWlhYvXqw77rhDPXr0cHtGAAgYfQJgK/oEwGY0CoAXHH363qhRo/TTTz/p0ksvVfHifx4iOztbt9xyC683Bu
Ap+gTAVvQJgM1oFAAvOFqUKlmypN566y2NGjVKa9euVWxsrJo0aaLExES35wOAoNAnALaiTwBsRqMAeMHRolSO+vXrq379+srKytJ3332nsmXLqkKFCm7NBgCO0ScAtqJPAGxGowCEk6P3lLrzzjs1ffp0SVJWVpbatm2rFi1aqFatWlq6dKmb8wFAUOgTAFvRJwA2o1EAvODomVLvvvuuevbsKUn68MMPtXXrVm3YsEGvvvqqhg8frq+++srVIRF6R48e1YYNG/JsT01NzfXPv0pOTlapUqVCPhsQDPoUeZz0S6JhCD/6VLTRIhR2NKpoo1GwlaNFqf3796tatWqSpHnz5ql79+5q0KCB+vXrpwkTJrg6IMJjw4YNatmy5Ukvz/kP1F+lpKSoRYsWoRwLCBp9ijxO+iXRMIQffSraaBEKOxpVtNEo2MrRolTVqlX1ww8/KCEhQZ9++qleeOEFSX+uvkZFRbk6IMIjOTlZKSkpebYfO3ZMaWlpSkpKUmxsbJ7rALahT5HHSb9yrgeEE30q2mgRCjsaVbTRKNjK0aJU37591b17dyUkJMjn86ljx46SpG+++YZ/aQupUqVKnXQFvE2bNmGeBnCOPkUe+oXCgj4VbbQIhR2NKtpoFGzlaFFqxIgRaty4sXbs2KHrr79e0dHRkqSoqCgNHTrU1QEBIBj0CYCt6BMAm9EoAF5wtCglSdddd12ebb179871fZMmTTRv3jzVqlXL6c0AQNDoEwBb0ScANqNRAMKtWCgPnpaWpj/++COUNwEAjtAnALaiTwBsRqMAuCmki1IAAAAAAABAfliUAgAAAAAAQNixKAUAAAAAAICwY1EKAAAAAAAAYceiFAAAAAAAAMIupItSkydPVtWqVUN5EwDgCH0CYCv6BMBmNAqAmxwtSu3cuVOHDx/Os/2PP/7QsmXL/N/fdNNNKl26tPPpACBI9AmAregTAJvRKABeCGpRas+ePTr33HOVmJio8uXLq3fv3rnCdeDAAbVr1871IQHgdOgTAFvRJwA2o1EAvBTUotTQoUMVFRWlb775Rp9++ql++OEHXXLJJfrtt9/8+xhjXB8SAE6HPgGwFX0CYDMaBcBLQS1KLVy4UM8++6xatWqlDh066Msvv1TNmjXVvn17HThwQJLk8/lCMigAnAp9AmAr+gTAZjQKgJeCWpRKT09XhQoV/N9HR0fr3XffVVJSktq1a6d9+/a5PiAABII+AbAVfQJgMxoFwEtBLUrVqVNH69aty7WtePHieuedd1SnTh394x//cHW4vxo9erQuuOAClSpVSuXLlw/Z7QAonOgTAFt52SeJRgE4Nc6hAHgpqEWpzp07a8qUKXm250SrWbNmbs2Vx/Hjx3X99dfr3//+d8huA0DhRZ8A2MrLPkk0CsCpcQ4FwEvFg9l59OjROnr0aP4HKl5cc+bM0c6dO10Z7O8eeeQRSdLLL78ckuMDKNzoEwBbedkniUYBODXOoQB4KahnShUvXlyxsbGqU6eOfvjhhzyXR0VFKTEx0bXhACBQ9AmAregTAJvRKABeCuqZUpJUokQJZWZmFopPYMjMzFRmZqb/+4yMDA+nARBq9AmAregTAJvRKABeCeqZUjluv/12Pf744zpx4kSBbnzEiBHy+Xyn/Pr2228dH3/MmDEqV66c/6tWrVoFmheA/egTAFu51ScptI2iT0Bk4hwKgBeCfqaUJH3zzTdatGiR5s+fryZNmqh06dK5Lp8zZ05Ax7ntttvUo0ePU+6TlJTkZERJ0rBhwzRkyBD/9xkZGUQLKOLoEwBbudUnKbSNok9AZOIcCoAXHC1KlS9fXtdee22Bbzw+Pl7x8fEFPs7JREdHKzo6OmTHB2Af+gTAVm71SQpto+gTEJk4hwLghaAWpQ4fPqwyZcpoxowZoZrnpLZv364DBw5o+/btysrK0tq1ayVJ9erVU5kyZcI+DwC70CcAtvKyTxKNAnBqnEMB8FJQi1Lx8fFq166dun
TpoquvvlrVq1cP1Vx5PPTQQ3rllVf83zdv3lyStGTJEl1yySVhmwOAnegTAFt52SeJRgE4Nc6hAHgpqDc637hxo6644grNnj1btWvX1jnnnKNHH31U69atC9V8fi+//LKMMXm+iBUAiT4BsJeXfZJoFIBT4xwKgJeCWpRKTEzU7bffroULF2rfvn0aMmSIvv/+e1188cWqXbu27rjjDi1evFhZWVmhmhcA8kWfANiKPgGwGY0C4KWgFqX+qly5crrxxhv15ptvav/+/Zo8ebKys7PVt29fVa5cWa+//rqbcwJAwOgTAFvRJwA2o1EAws3Rp+/lOUjx4urUqZM6deqk5557TmvWrNGJEyfcODQAFAh9AmAr+gTAZjQKQDgUaFHq6NGj2r59u44fP55re84b1AFFhe/E72perZhiD26Sdjt+gmG+Yg9uUvNqxeQ78burx4109AlFQSjbk4MGhR99QlEW6m7RrNCjUSiq6JOdHC1K/fLLL+rbt68++eSTfC/n9cYoamIOb9fqf5WRlv1LWubusRtKWv2vMko9vF3SBe4ePALRJxQloWxPDhoUPvQJkSDU3aJZoUOjUNTRJzs5WpS688479dtvv2nFihVq166d5s6dq59//lmjRo3SU0895faMgOd+L3OGWkw+rNdff10Nk5NdPXbqhg26+eabNf2KM1w9bqSiTyhKQtmeHDQofOgTIkGou0WzQodGoaijT3ZytCi1ePFivf/++zrnnHNUrFgxJSYmqmPHjipbtqzGjBmjK6+80u05AU+Z4jFaszdbx8o3kKo3c/XYx/Zma83ebJniMa4eN1LRJxQloWxPDhoUPvQJkSDU3aJZoUOjUNTRJzs5eiHlkSNHVKVKFUlSxYoV9csvv0iSmjRpotWrV7s3HQAEiT4BsBV9AmAzGgXAC44Wpc4880xt3LhRktSsWTNNnjxZu3bt0osvvqiEhARXBwSAYNAnALaiTwBsRqMAeMHxe0rt2bNHkvTwww/rsssu0+uvv66SJUvq5ZdfdnM+AAgKfQJgK/oEwGY0CoAXHC1K3Xzzzf4/N2/eXGlpadqwYYPOOOMMxcfHuzYcAASLPgGwFX0CYDMaBcALjhal/q5UqVJq0aKFG4cCAFfRJwC2ok8AbEajAISDo/eUuu666zR27Ng828eNG6frr7++wEMBgFP0CYCt6BMAm9EoAF5wtCj1+eef5/uRoJdffrmWLVtW4KEAwCn6BMBW9AmAzWgUAC84WpQ6fPiwSpYsmWd7iRIllJGRUeChAMAp+gTAVvQJgM1oFAAvOFqUaty4sd56660829988001atSowEMBgFP0CYCt6BMAm9EoAF5w9EbnDz74oK699lpt2bJF7du3lyQtWrRIs2bN0jvvvOPqgAAQDPoEwFb0CYDNaBQALzhalOrSpYvee+89PfbYY3r33XcVGxurs88+WwsXLlTbtm3dnhEAAkafANiKPgGwGY0C4AVHi1KSdOWVV+b7RngA4DX6BMBW9AmAzWgUgHBz9J5SknTw4EFNmzZN//nPf3TgwAFJ0urVq7Vr1y7XhgMAJ+gTAFvRJwA2o1EAws3RM6XWrVunDh06qFy5ckpLS1P//v1VsWJFzZ07Vz/99JNeffVVt+cEgIDQJwC2ok8AbEajAHjB0TOlhgwZoj59+mjz5s2KiYnxb+/cubOWLVvm2nAAECz6BMBW9AmAzWgUAC84WpRatWqV/vWvf+XZXqNGDe3du7fAQwGAU/QJgK3oEwCb0SgAXnC0KBUTE6OMjIw82zdu3KjKlSsXeCgAcIo+AbAVfQJgMxoFwAuOFqWuvvpqjRw5Un/88Yckyefzafv27Ro6dKiuvfZaVwcEgGDQJwC2ok8AbEajAHjB0aLUk08+qV9++UVVqlTRsWPH1LZtW9WrV09xcXEaPXq02zMCQMDoEwBb0ScANqNRALzg6NP3ypYtqy+//FKLFy/W6tWrlZ2drRYtWqhDhw5uzwcAQaFPAG
xFnwDYjEYB8IKjRakc7du3V/v27d2aBQBcQ58A2Io+AbAZjQIQTo4XpRYtWqRFixZp3759ys7OznXZSy+9VODBAMAp+gTAVvQJgM1oFIBwc7Qo9cgjj2jkyJFq1aqVEhIS5PP53J4LAByhTwBsRZ8A2IxGAfCCo0WpF198US+//LJ69erl9jwAUCD0CYCt6BMAm9EoAF5w9Ol7x48f1wUXXOD2LABQYPQJgK3oEwCb0SgAXnC0KNW/f3+98cYbbs8CAAVGnwDYij4BsBmNAuAFRy/f+/333zVlyhQtXLhQZ599tkqUKJHr8qefftqV4QAgWPQJgK3oEwCb0SgAXnC0KLVu3To1a9ZMkrR+/fpcl/GGeAC8RJ8A2Io+AbAZjQLgBUeLUkuWLHF7DgBwBX0CYCv6BMBmNAqAFxy9p1SOH3/8UZ999pmOHTsmSTLGuDIUABQUfQJgK/oEwGY0CkA4OVqU+vXXX3XppZeqQYMGuuKKK7Rnzx5Jf7453t133+3qgAAQDPoEwFb0CYDNaBQALzhalLrrrrtUokQJbd++XaVKlfJvv+GGG/Tpp5+6NhwABIs+AbAVfQJgMxoFwAuO3lNq/vz5+uyzz1SzZs1c2+vXr6+ffvrJlcEAwAn6BMBW9AmAzWgUAC84eqbUkSNHcq2e59i/f7+io6MLPBQAOEWfANiKPgGwGY0C4AVHi1IXX3yxXn31Vf/3Pp9P2dnZGjdunNq1a+facAAQLPoEwFb0CYDNaBQALzh6+d64ceN0ySWX6Ntvv9Xx48d133336fvvv9eBAwf01VdfuT2j0tLS9Oijj2rx4sXau3evqlevrp49e2r48OEqWbKk67cHoPCiTwBsFe4+STQKQOA4hwLgBUeLUo0aNdJ///tfvfjii4qKitKRI0d0zTXX6NZbb1VCQoLbM2rDhg3Kzs7W5MmTVa9ePa1fv14DBgzQkSNH9OSTT7p+ewAKL/oEwFbh7pNEowAEjnMoAF5wtCglSQkJCXrkkUfcnOWkLr/8cl1++eX+7+vUqaONGzdq0qRJBAtAHvQJgK3C2SeJRgEIDudQAMLN0XtK1alTR3379lVmZmau7fv371edOnVcGex00tPTVbFixbDcFoDCgz4BsJUNfZJoFID82dAo+gREHkeLUmlpafrqq6900UUXac+ePf7tWVlZYfm40C1btui5557TwIEDT7lfZmamMjIycn0BKNroEwBbed0nKbBG0ScgMnndKM6hgMjkaFHK5/Pp008/Vc2aNdWqVSutWrXK0Y2PGDFCPp/vlF/ffvttruvs3r1bl19+ua6//nr179//lMcfM2aMypUr5/+qVauWozkBFB70CYCt3OqTFNpG0ScgMnEOBcALjt5TyhijMmXKaM6cORo2bJjatm2rKVOmqGPHjkEd57bbblOPHj1OuU9SUpL/z7t371a7du3UunVrTZky5bTHHzZsmIYMGeL/PiMjg2gBRRx9AmArt/okhbZR9AmITJxDAfCCo0Upn8/n//OYMWN01llnacCAAbrxxhuDOk58fLzi4+MD2nfXrl1q166dWrZsqRkzZqhYsdM/ySs6OlrR0dFBzQSgcKNPAGzlVp+k0DaKPgGRiXMoAF5w/Eypv+rZs6fq1q2rbt26uTLU3+3evVuXXHKJzjjjDD355JP65Zdf/JdVq1YtJLcJoHCiTwBsFe4+STQKQOA4hwLgBUeLUtnZ2Xm2tW7dWv/973+1YcOGAg/1d/Pnz9ePP/6oH3/8UTVr1sx12d/jCSCy0ScAtgp3nyQaBSBwnEMB8IKjNzo/mapVq6pt27ZuHlKS1KdPHxlj8v0CgEDQJwC2ClWfJBoFoOA4hwIQSo6eKSVJ7777rt5++21t375dx48fz3XZ6tWrCzwYADhFnwDYij4BsBmNAhBujp4pNWHCBPXt21dVqlTRmjVrdO6556pSpUraunWrOnfu7PaMABAw+gTAVvQJgM1oFAAvOFqUeuGFFzRlyhRNnD
hRJUuW1H333acFCxZo8ODBSk9Pd3tGAAgYfQJgK/oEwGY0CoAXHC1Kbd++XRdccIEkKTY2VocOHZIk9erVS7NmzXJvOgAIEn0CYCv6BMBmNAqAFxwtSlWrVk2//vqrJCkxMVErVqyQJG3bto03pgPgKfoEwFb0CYDNaBQALzhalGrfvr0+/PBDSVK/fv101113qWPHjrrhhhvUrVs3VwcEgGDQJwC2ok8AbEajAHjB0afvTZkyRdnZ2ZKkgQMHqmLFivryyy911VVXaeDAga4OCADBoE8AbEWfANiMRgHwQtCLUidOnNDo0aP1z3/+U7Vq1ZIkde/eXd27d3d9OAAIBn0CYCv6BMBmNAqAV4J++V7x4sU1btw4ZWVlhWIeAHCMPgGwFX0CYDMaBcArjt5TqkOHDlq6dKnLowBAwdEnALaiTwBsRqMAeMHRe0p17txZw4YN0/r169WyZUuVLl061+VdunRxZTgACBZ9AmAr+gTAZjQKgBccLUr9+9//liQ9/fTTeS7z+Xw87ROAZ+gTAFvRJwA2o1EAvOBoUSrnUxkAwDb0CYCt6BMAm9EoAF5w9J5Sr776qjIzM/NsP378uF599dUCDwUATtEnALaiTwBsRqMAeMHRolTfvn2Vnp6eZ/uhQ4fUt2/fAg8FAE7RJwC2ok8AbEajAHjB0aKUMUY+ny/P9p07d6pcuXIFHgoAnKJPAGxFnwDYjEYB8EJQ7ynVvHlz+Xw++Xw+XXrppSpe/H9Xz8rK0rZt23T55Ze7PiQAnA59AmAr+gTAZjQKgJeCWpTq2rWrJGnt2rW67LLLVKZMGf9lJUuWVFJSkq699lpXBwSAQNAnALaiTwBsRqMAeCmoRamHH35YkpSUlKQePXooOjo6JEMBQLDoEwBb0ScANqNRALzk6D2l2rdvr19++cX//cqVK3XnnXdqypQprg0GAE7QJwC2ok8AbEajAHjB0aLUTTfdpCVLlkiS9u7dqw4dOmjlypX6z3/+o5EjR7o6IAAEgz4BsBV9AmAzGgXAC44WpdavX69zzz1XkvT222+rSZMmWr58ud544w29/PLLbs4HAEGhTwBsRZ8A2IxGAfCCo0WpP/74w/9a44ULF6pLly6SpOTkZO3Zs8e96QAgSPQJgK3oEwCb0SgAXnC0KHXWWWfpxRdf1BdffKEFCxb4PyJ09+7dqlSpkqsDAkAw6BMAW9EnADajUQC84GhR6vHHH9fkyZN1ySWX6MYbb1TTpk0lSR988IH/KZ8A4AX6BMBW9AmAzWgUAC8Ud3KlSy65RPv371dGRoYqVKjg3/5///d/KlWqlGvDAUCw6BMAW9EnADajUQC84GhRSpKioqJyxUqSkpKSCjoPABQYfQJgK/oEwGY0CkC4OV6Uevfdd/X2229r+/btOn78eK7LVq9eXeDBAMAp+gTAVvQJgM1oFIBwc/SeUhMmTFDfvn1VpUoVrVmzRueee64qVaqkrVu3qnPnzm7PCAABo08AbEWfANiMRgHwgqNFqRdeeEFTpkzRxIkTVbJkSd13331asGCBBg8erPT0dLdnBICA0ScAtqJPAGxGowB4wdGi1Pbt23XBBRdIkmJjY3Xo0CFJUq9evTRr1iz3pgOAINEnALaiTwBsRqMAeMHRolS1atX066+/SpISExO1YsUKSdK2bdtkjHFvOgAIEn0CYCv6BMBmNAqAFxwtSrVv314ffvihJKlfv36666671LFjR91www3q1q2bqwMCQDDoEwBb0ScANqNRALzg6NP3pkyZouzsbEnSwIEDVbFiRX355Ze66qqrNHDgQFcHBIBg0CcAtqJPAGxGowB4wdGiVLFixVSs2P+eZNW9e3d17949z36DBg3SyJEjFR8f73xCAAgCfQJgK/oEwGY0CoAXHL18L1AzZ85URkZGKG8CAByhTwBsRZ8A2IxGAXBTSBeleEM8ALaiTwBsRZ8A2IxGAXBTSBelAAAAAAAAgPywKAUAAAAAAICwY1EKAA
AAAAAAYceiFAAAAAAAAMIupItSPXv2VNmyZUN5EwDgCH0CYCv6BMBmNAqAmxwtSiUlJWnkyJHavn37KfebNGmS4uPjHQ0GAE7QJwC2ok8AbEajAHjB0aLU3Xffrffff1916tRRx44d9eabbyozM9Pt2XLp0qWLzjjjDMXExCghIUG9evXS7t27Q3qbAAof+gTAVl70SaJRAALDORQALzhalLr99tuVkpKilJQUNWrUSIMHD1ZCQoJuu+02rV692u0ZJUnt2rXT22+/rY0bN2r27NnasmWLrrvuupDcFoDCiz4BsJUXfZJoFIDAcA4FwBPGBcePHzfPPPOMiY6ONsWKFTNnn322mT59usnOznbj8Pl6//33jc/nM8ePHw/4Ounp6UaSSU9PD9lcKJpSUlKMJJOSklKojl1YhPKxSZ9QmIWjDzTo9EL1+PSiT8YE3yj6hGCEuik0KzfOoWgUAkefwivQx2bxgixo/fHHH5o7d65mzJihBQsW6Pzzz1e/fv20e/duDR8+XAsXLtQbb7xRkJvI14EDB/T666/rggsuUIkSJU66X2ZmZq6nnGZkZLg+CwA70ScAtvKqT1JgjaJPQGTjHApAODlalFq9erVmzJihWbNmKSoqSr169dL48eOVnJzs36dTp066+OKLXRtUku6//35NnDhRR48e1fnnn6+PPvrolPuPGTNGjzzyiKszALAbfQJgK6/6JAXXKPoERCbOoQB4wdF7Sp1zzjnavHmzJk2apJ07d+rJJ5/MFStJatSokXr06HHK44wYMUI+n++UX99++61//3vvvVdr1qzR/PnzFRUVpVtuuUXGmJMef9iwYUpPT/d/7dixw8ndBVCI0CcAtnKrT1JoG0WfgMjEORQALzh6ptTWrVuVmJh4yn1Kly6tGTNmnHKf22677bRRS0pK8v85Pj5e8fHxatCggRo2bKhatWppxYoVat26db7XjY6OVnR09CmPD6BooU8AbOVWn6TQNoo+AZGJcygAXnC0KNWuXTutWrVKlSpVyrX94MGDatGihbZu3RrQcXIC5ETO6nk4PkoZQOFBnwDYyq0+STQKgPs4hwLgBUeLUmlpacrKysqzPTMzU7t27SrwUH+3cuVKrVy5UhdeeKEqVKigrVu36qGHHlLdunVPuoIOIDLRJwC2CnefJBoFIHCcQwHwQlCLUh988IH/z5999pnKlSvn/z4rK0uLFi3K9VRMt8TGxmrOnDl6+OGHdeTIESUkJOjyyy/Xm2++yVM3AUiiTwDs5VWfJBoF4PQ4hwLgpaAWpbp27SpJ8vl86t27d67LSpQooaSkJD311FOuDZejSZMmWrx4sevHBVB00CcAtvKqTxKNAnB6nEMB8FJQi1LZ2dmSpNq1a2vVqlWOXysMAG6jTwBsRZ8A2IxGAfCSo/eU2rZtm9tzAIAr6BMAW9EnADajUQC8EPCi1IQJE/R///d/iomJ0YQJE0657+DBgws8GAAEij4BsBV9AmAzGgXAawEvSo0fP14333yzYmJiNH78+JPu5/P5CBaAsKJPAGxFnwDYjEYB8FrAi1J/fTonT+0EYBP6BMBW9AmAzWgUAK8V83oAAAAAAAAARB5Hi1LXXXedxo4dm2f7uHHjdP311xd4KABwij4BsBV9AmAzGgXAC44WpT7//HNdeeWVebZffvnlWrZsWYGHAgCn6BMAW9EnADajUQC84GhR6vDhwypZsmSe7SVKlFBGRkaBhwIAp+gTAFvRJwA2o1EAvOBoUapx48Z666238mx/88031ahRowIPBQBO0ScAtqJPAGxGowB4IeBP3/urBx98UNdee622bNmi9u3bS5IWLVqkWbNm6Z133nF1QAAIBn0CYCv6BMBmNAqAFxwtSnXp0kXvvfeeHnvsMb377ruKjY3V2WefrYULF6pt27ZuzwgAAaNPAGxFnwDYjEYB8IKjRSlJuvLKK/N9IzwA8Bp9AmAr+gTAZjQKQLg5ek8pAAAAAAAAoCACfq
ZUxYoVtWnTJsXHx6tChQry+Xwn3ffAgQOuDAcAgaBPAGxFnwDYjEYB8FrAi1Ljx49XXFycJOmZZ54J1TwAEDT6BMBW9AmAzWgUAK8FvCjVu3fvfP8MAF6jTwBsRZ8A2IxGAfBawItSGRkZAR+0bNmyjoYBACfoEwBb0ScANqNRALwW8KJU+fLlT/kaY0kyxsjn8ykrK6vAgwFAoOgTAFvRJwA2o1EAvBbwotSSJUtCOQcAOEafANiKPgGwGY0C4LWAF6Xatm0byjkAwDH6BMBW9AmAzWgUAK8FvCi1bt06NW7cWMWKFdO6detOue/ZZ59d4MEAmxw9elSStHr16oD2P3bsmNLS0pSUlKTY2NhT7puamlrg+SIdfUJRFWx7pOD6I9GgUKNPiDShPGeSaJbbaBQiCX2yU8CLUs2aNdPevXtVpUoVNWvWTD6fT8aYPPvxemMURRs2bJAkDRgwIGS3kfNxvAgefUJRFY725KBBoUGfEGnC1S2a5Q4ahUhCn+wU8KLUtm3bVLlyZf+fgUjStWtXSVJycrJKlSp12v1TU1PVs2dPzZw5Uw0bNjzt/nFxcapfv35Bx4xY9AlFVbDtkYLvj0SDQok+IdKE+pxJolluolGIJPTJTgEvSiUmJub7ZyASxMfHq3///kFfr2HDhmrRokUIJsJf0ScUVU7bI9EfW9AnRBrOmQoXGoVIQp/sFPCi1N9t2rRJS5cu1b59+5SdnZ3rsoceeqjAgwGAU/QJgK3oEwCb0SgA4eZoUWrq1Kn697//rfj4eFWrVk0+n89/mc/nI1gAPEOfANiKPgGwGY0C4AVHi1KjRo3S6NGjdf/997s9DwAUCH0CYCv6BMBmNAqAF4o5udJvv/2m66+/3u1ZAKDA6BMAW9EnADajUQC84GhR6vrrr9f8+fPdngUACow+AbAVfQJgMxoFwAsBv3xvwoQJ/j/Xq1dPDz74oFasWKEmTZqoRIkSufYdPHiwexMCwGnQJwC2ok8AbEajAHgt4EWp8ePH5/q+TJky+vzzz/X555/n2u7z+QgWgLCiTwBsRZ8A2IxGAfBawItS27ZtC+UcAOAYfQJgK/oEwGY0CoDXHL2n1F8ZY2SMcWMWAHAVfQJgK/oEwGY0CkC4OF6Umj59uho3bqyYmBjFxMSocePGmjZtmpuzAYAj9AmAregTAJvRKADhFvDL9/7qwQcf1Pjx43X77berdevWkqSvv/5ad911l9LS0jRq1ChXhwSAQNEnALaiTwBsRqMAeMHRotSkSZM0depU3Xjjjf5tXbp00dlnn63bb7+dYAHwDH0CYCv6BMBmNAqAFxy9fC8rK0utWrXKs71ly5Y6ceJEgYcCAKfoEwBb0ScANqNRALzgaFGqZ8+emjRpUp7tU6ZM0c0331zgoQDAKfoEwFb0CYDNaBQALzh6+Z7055vgzZ8/X+eff74kacWKFdqxY4duueUWDRkyxL/f008/XfApASAI9AmAregTAJvRKADh5mhRav369WrRooUkacuWLZKkypUrq3Llylq/fr1/P5/P58KIABA4+gTAVvQJgM1oFAAvOFqUWrJkidtzAIAr6BMAW9EnADajUQC84Og9pf5q586d2rVrlxuzBCQzM1PNmjWTz+fT2rVrw3a7AAof+gTAVuHuk0SjAASOcygA4eJoUSo7O1sjR45UuXLllJiYqDPOOEPly5fXo48+quzsbLdnzOW+++5T9erVQ3obAAov+gTAVl72SaJRAE6NcygAXnD08r3hw4dr+vTpGjt2rNq0aSNjjL766iuNGDFCv//+u0aPHu32nJKkTz75RPPnz9fs2bP1ySefhOQ2ABRu9AmArbzqk0SjAJwe51AAvOBoUeqVV17RtGnT1KVLF/+2pk2bqkaNGho0aFBIgvXzzz9rwIABeu+991SqVKmArpOZmanMzEz/9xkZGa7PBcAu9AmArbzokxR8o+gTEJk4hwLgBUcv3ztw4ICSk5PzbE9OTtaBAwcKPN
TfGWPUp08fDRw4UK1atQr4emPGjFG5cuX8X7Vq1XJ9NgB2oU8AbBXuPknOGkWfgMjEORQALzhalGratKkmTpyYZ/vEiRPVtGnTgI8zYsQI+Xy+U359++23eu6555SRkaFhw4YFNeewYcOUnp7u/9qxY0dQ1wdQ+NAnALZyq09SaBtFn4DIxDkUAC84evneE088oSuvvFILFy5U69at5fP5tHz5cu3YsUPz5s0L+Di33XabevToccp9kpKSNGrUKK1YsULR0dG5LmvVqpVuvvlmvfLKK/leNzo6Os91ABRt9AmArdzqkxTaRtEnIDJxDgXACz5jjHFyxd27d+v555/Xhg0bZIxRo0aNNGjQoJB8asL27dtzvVZ49+7duuyyy/Tuu+/qvPPOU82aNQM6TkZGhsqVK6f09HSVLVvW9TmBHKtXr1bLli2VkpKiFi1aeD2O9dx+bNInRDL64z43H5/h7JPkTqPoE0KJZhUM51A0CqFDnwom0Memo2dKSVL16tVD+ikxf3XGGWfk+r5MmTKSpLp16wYcKwCRgz4BsFU4+yTRKADB4RwKQLg5XpT67bffNH36dKWmpsrn86lhw4bq27evKlas6OZ8ABA0+gTAVvQJgM1oFIBwc/RG559//rlq166tCRMm6LffftOBAwc0YcIE1a5dW59//rnbM+aRlJQkY4yaNWsW8tsCULjQJwC28rpPEo0CcHJeN4o+AZHJ0TOlbr31VnXv3l2TJk1SVFSUJCkrK0uDBg3SrbfeqvXr17s6JAAEij4BsBV9AmAzGgXAC46eKbVlyxbdfffd/lhJUlRUlIYMGaItW7a4NhwABIs+AbAVfQJgMxoFwAuOFqVatGih1NTUPNtTU1N5uiUAT9EnALaiTwBsRqMAeMHRy/cGDx6sO+64Qz/++KPOP/98SdKKFSv0/PPPa+zYsVq3bp1/37PPPtudSQEgAPQJgK3oEwCb0SgAXnC0KHXjjTdKku677758L/P5fDLGyOfzKSsrq2ATAkAQ6BMAW9EnADajUQC84GhRatu2bW7PAQCuoE8AbEWfANiMRgHwgqNFqcTERLfnAABX0CcAtqJPAGxGowB4IeBFqQ8++ECdO3dWiRIl9MEHH5xy3y5duhR4MAAIFH0CYCv6BMBmNAqA1wJelOratav27t2rKlWqqGvXrifdj9cYAwg3+gTAVvQJgM1oFACvBbwolZ2dne+fAcBr9AmAregTAJvRKABeK+b1AAAAAAAAAIg8AT9TasKECQEfdPDgwY6GAQAn6BMAW9EnADajUQC8FvCi1Pjx4wPaz+fzESwAYUWfANiKPgGwGY0C4LWAF6W2bdsWyjkAwDH6BMBW9AmAzWgUAK+F9D2lypYtq61bt4byJgDAEfoEwFb0CYDNaBQAN4V0UcoYE8rDA4Bj9AmAregTAJvRKABu4tP3AAAAAAAAEHYsSgEAAAAAACDsWJQCAAAAAABA2IV0Ucrn84Xy8ADgGH0CYCv6BMBmNAqAm3ijcwARiT4BsBV9AmAzGgXATSFdlPrkk09Uo0aNUN4EADhCnwDYij4BsBmNAuCm4k6ulJWVpZdfflmLFi3Svn37lJ2dnevyxYsXS5IuvPDCgk8IAEGgTwBsRZ8A2IxGAfCCo0WpO+64Qy+//LKuvPJKNW7cmNcVA7AGfQJgK/oEwGY0CoAXHC1Kvfnmm3r77bd1xRVXuD0PABQIfQJgK/oEwGY0CoAXHL2nVMmSJVWvXj23ZwGAAqNPAGxFnwDYjEYB8IKjRam7775bzz77LJ+8AMA69AmAregTAJvRKABeCPjle9dcc02u7xcvXqxPPvlEZ511lkqUKJHrsjlz5rgzHQAEgD4BsBV9AmAzGgXAawEvSpUrVy7X9926dXN9GABwgj4BsBV9AmAzGgXAawEvSs2YMSOUcwCAY/QJgK3oEwCb0SgAXnP06Xs59u3bp40bN8rn86lBgwaqUqWKW3MBQIHQJwC2ok8AbEajAISToz
c6z8jIUK9evVSjRg21bdtWF198sWrUqKGePXsqPT3d7RkBIGD0CYCt6BMAm9EoAF5wtCjVv39/ffPNN/roo4908OBBpaen66OPPtK3336rAQMGuD0jAASMPgGwFX0CYDMaBcALjl6+9/HHH+uzzz7ThRde6N922WWXaerUqbr88stdGw4AgkWfANiKPgGwGY0C4AVHz5SqVKlSnk9qkP789IYKFSoUeCgAcIo+AbAVfQJgMxoFwAuOFqUeeOABDRkyRHv27PFv27t3r+699149+OCDrg0HAMGiTwBsRZ8A2IxGAfCCo5fvTZo0ST/++KMSExN1xhlnSJK2b9+u6Oho/fLLL5o8ebJ/39WrV7szKQAEgD4BsBV9AmAzGgXAC44Wpbp27eryGADgDvoEwFb0CYDNaBQALzhalHr44YfdngMAXEGfANiKPgGwGY0C4AVH7ykFAAAAAAAAFISjZ0plZWVp/Pjxevvtt7V9+3YdP3481+UHDhxwZTgACBZ9AmAr+gTAZjQKgBccPVPqkUce0dNPP63u3bsrPT1dQ4YM0TXXXKNixYppxIgRLo8IAIGjTwBsRZ8A2IxGAfCCo0Wp119/XVOnTtU999yj4sWL68Ybb9S0adP00EMPacWKFW7PKElKSkqSz+fL9TV06NCQ3BaAwos+AbCVF32SaBSAwHAOBcALjl6+t3fvXjVp0kSSVKZMGaWnp0uS/vGPf+jBBx90b7q/GTlypAYMGOD/vkyZMiG7LQCFE30CYCuv+iTRKACnxzkUAC84eqZUzZo1tWfPHklSvXr1NH/+fEnSqlWrFB0d7d50fxMXF6dq1ar5vwgWgL+jTwBs5VWfJBoF4PQ4hwLgBUeLUt26ddOiRYskSXfccYcefPBB1a9fX7fccov++c9/ujrgXz3++OOqVKmSmjVrptGjR+d58z0AoE8AbOVVnyQaBeD0OIcC4AVHL98bO3as/8/XXXedatWqpa+++kr16tVTly5dXBvur+644w61aNFCFSpU0MqVKzVs2DBt27ZN06ZNO+l1MjMzlZmZ6f8+IyMjJLMBsAd9AmArL/okBd8o+gREJs6hAHjCBOn48eOmT58+ZsuWLcFeNY+HH37YSDrl16pVq/K97rvvvmskmf379wd9/PT09ALPDpxKSkqKkWRSUlK8HqVQSE9Pd+WxSZ8A+hMKbjTKzT4ZE9pG0SeEE80qGM6haBRChz4VTKB98hljTLALWeXLl9fq1atVp06dYK+ay/79+7V///5T7pOUlKSYmJg823ft2qWaNWtqxYoVOu+88/K9bn6r6LVq1VJ6errKli1boNmBU1m9erVatmyplJQUtWjRwutxrJeRkaFy5cq58tikT4h09Md9bjXKrT5JoW0UfUI40ayC4RyKRiF06FPBBNonRy/f69atm9577z0NGTLE8YCSFB8fr/j4eEfXXbNmjSQpISHhpPtER0eH/I1DAdiFPgGwlVt9kkLbKPoERCbOoQB4wdGiVL169fToo49q+fLlatmypUqXLp3r8sGDB7syXI6vv/5aK1asULt27VSuXDmtWrVKd911l7p06aIzzjjD1dsCULjRJwC2CnefJBoFIHCcQwHwgqNFqWnTpql8+fJKSUlRSkpKrst8Pp/rwYqOjtZbb72lRx55RJmZmUpMTNSAAQN03333uXo7QLCOHj2qDRs25Nmempqa659/lZycrFKlSoV8tkhFnxAJTtYe6dT9kWiQl8LdJ4lGwR5OzpkkmhVOnEMhUtEnbzl6T6nCys3XXAPS/15nHAxek5wXj01+BgiOk/bkoEHBi/THZ6Tff7jDabdo1qnx+ORngIKjT6ER0veUOtnrjH0+n2JiYlSvXj1dffXVqlixopPDA4VGcnJynv+TJEnHjh1TWlqakpKSFBsbm+c6CB36hEhwsvZIp+5PznXhDfqESObknCnneggPGoVIRZ+85eiZUu3atdPq1auVlZWlM888U8YYbd68WVFRUUpOTt
bGjRvl8/n05ZdfqlGjRqGY2xFW0QE7ufnYpE8A3ObW45M+AXAb51A0CrBVoI/NYk4OfvXVV6tDhw7avXu3UlJStHr1au3atUsdO3bUjTfeqF27duniiy/WXXfd5fgOAIAT9AmAregTAJvRKABecPRMqRo1amjBggV5Vsi///57derUSbt27dLq1avVqVMn7d+/37VhC4pVdMBObj426RMAt7n1+KRPANzGORSNAmwV0mdKpaena9++fXm2//LLL8rIyJAklS9fXsePH3dyeABwjD4BsBV9AmAzGgXAC45fvvfPf/5Tc+fO1c6dO7Vr1y7NnTtX/fr1U9euXSVJK1euVIMGDdycFQBOiz4BsBV9AmAzGgXAC45evnf48GHdddddevXVV3XixAlJUvHixdW7d2+NHz9epUuX1tq1ayVJzZo1c3PeAuGpnYCd3Hxs0icAbnPr8UmfALiNcygaBdgq0Memo0WpHIcPH9bWrVtljFHdunVVpkwZp4cKC4IF2CkUj036BMAtbj8+6RMAt3AORaMAWwX62CxekBspU6aMzj777IIcAgBCgj4BsBV9AmAzGgUgnBy9pxQAAAAAAABQECxKAQAAAAAAIOxYlAIAAAAAAEDYsSgFAAAAAACAsCvQG50XNjkfNJiRkeHxJAD+KucxWYAPAy306BNgr0hvFH0C7BXpfZJoFGCrQPsUUYtShw4dkiTVqlXL40kA5OfQoUMqV66c12N4gj4B9ovURtEnwH6R2ieJRgG2O12ffCaCltWzs7O1e/duxcXFyefzeT1OoZCRkaFatWppx44dKlu2rNfjFBr83IJjjNGhQ4dUvXp1FSsWma8qpk/ui9THYaTe71CK9EbRp4LhMXlq/HwKJtL7JNGoguIxeHL8bAom0D5F1DOlihUrppo1a3o9RqFUtmxZHogO8HMLXKT+370c9Cl0IvVxGKn3O1QiuVH0yR08Jk+Nn49zkdwniUa5hcfgyfGzcS6QPkXmcjoAAAAAAAA8xaIUAAAAAAAAwo5FKZxSdHS0Hn74YUVHR3s9SqHCzw3wXqQ+DiP1fgO24jF5avx8AG/xGDw5fjbhEVFvdA4AAAAAAAA78EwpAAAAAAAAhB2LUgAAAAAAAAg7FqUAAAAAAAAQdixKFXEvvPCCateurZiYGLVs2VJffPHFSfedM2eOOnbsqMqVK6ts2bJq3bq1Pvvsszz7tGrVSuXLl1fp0qXVrFkzvfbaa3mOtWvXLvXs2VOVKlVSqVKl1KxZM6WkpLh+/0IlmJ9bnz595PP58nydddZZ+e7/5ptvyufzqWvXrrm2jxgxIs8xqlWr5ubdAgqVYB6He/bs0U033aQzzzxTxYoV05133plnnz/++EMjR45U3bp1FRMTo6ZNm+rTTz/Ntc+JEyf0wAMPqHbt2oqNjVWdOnU0cuRIZWdnu333TiqY+7106dJ8+7Nhw4Zc+x08eFC33nqrEhISFBMTo4YNG2revHn+y5ctW6arrrpK1atXl8/n03vvvRequwcUOsE8Jr/88ku1adNGlSpVUmxsrJKTkzV+/Phc+3z//fe69tprlZSUJJ/Pp2eeeSbPcQrTYzIU50ynaxbnTMD/uP373l+d7PcWG86XAhHMz0aSMjMzNXz4cCUmJio6Olp169bVSy+95L88kHPJwtRvW7AoVYS99dZbuvPOOzV8+HCtWbNGF110kTp37qzt27fnu/+yZcvUsWNHzZs3TykpKWrXrp2uuuoqrVmzxr9PxYoVNXz4cH399ddat26d+vbtq759++aK2W+//aY2bdqoRIkS+uSTT/TDDz/oqaeeUvny5UN9l10R7M/t2Wef1Z49e/xfO3bsUMWKFXX99dfn2fenn37SPffco4suuijfY5111lm5jvXdd9+5et+AwiLYx2FmZqYqV66s4cOHq2nTpvnu88ADD2jy5Ml67rnn9MMPP2jgwIHq1q1brsY9/vjjevHFFzVx4kSlpqbqiSee0Lhx4/Tcc8
+F5H7+XbD3O8fGjRtztaN+/fr+y44fP66OHTsqLS1N7777rjZu3KipU6eqRo0a/n2OHDmipk2bauLEiSG7b0BhFOxjsnTp0rrtttu0bNkypaam6oEHHtADDzygKVOm+Pc5evSo6tSpo7Fjx550IaWwPCZDcc4USLMkzpkAKTS/7+U41e8tXp8vBcLJOVX37t21aNEiTZ8+XRs3btSsWbOUnJzsvzyQc8nC0m+rGBRZ5557rhk4cGCubcnJyWbo0KEBH6NRo0bmkUceOeU+zZs3Nw888ID/+/vvv99ceOGFwQ1rkYL+3ObOnWt8Pp9JS0vLtf3EiROmTZs2Ztq0aaZ3797m6quvznX5ww8/bJo2bVqQ0YEioyCPw7Zt25o77rgjz/aEhAQzceLEXNuuvvpqc/PNN/u/v/LKK80///nPXPtcc801pmfPnkFM71yw93vJkiVGkvntt99OesxJkyaZOnXqmOPHjwc0gyQzd+7cQEcGijQ3zqW6det20oYkJiaa8ePHn/L6Nj8mQ3HOFEizOGcC/hSq3/dO93uL1+dLgQj2Z/PJJ5+YcuXKmV9//fWkxwzkXPKvbO63TXimVBF1/PhxpaSkqFOnTrm2d+rUScuXLw/oGNnZ2Tp06JAqVqyY7+XGGC1atEgbN27UxRdf7N/+wQcfqFWrVrr++utVpUoVNW/eXFOnTnV+Z8LIjZ/b9OnT1aFDByUmJubaPnLkSFWuXFn9+vU76XU3b96s6tWrq3bt2urRo4e2bt0a/J0ACjk3Hof5yczMVExMTK5tsbGx+vLLL/3fX3jhhVq0aJE2bdokSfrvf/+rL7/8UldccYXj2w1UQe538+bNlZCQoEsvvVRLlizJddkHH3yg1q1b69Zbb1XVqlXVuHFjPfbYY8rKynL9PgBFiRstWrNmjZYvX662bduGYkRPheqcKdBmcc6ESBfK3/dO93uLl+dLgXDys8n5HfaJJ55QjRo11KBBA91zzz06duyYf59AziURvOJeD4DQ2L9/v7KyslS1atVc26tWraq9e/cGdIynnnpKR44cUffu3XNtT09PV40aNZSZmamoqCi98MIL6tixo//yrVu3atKkSRoyZIj+85//aOXKlRo8eLCio6N1yy23FPzOhVBBf2579uzRJ598ojfeeCPX9q+++krTp0/X2rVrT3rd8847T6+++qoaNGign3/+WaNGjdIFF1yg77//XpUqVXJ0f4DCyI1+5eeyyy7T008/rYsvvlh169bVokWL9P777+f6Ref+++9Xenq6kpOTFRUVpaysLI0ePVo33nij49sNlJP7nZCQoClTpqhly5bKzMzUa6+9pksvvVRLly71/8+CrVu3avHixbr55ps1b948bd68WbfeeqtOnDihhx56KOT3CyisCtKimjVr6pdfftGJEyc0YsQI9e/fP5SjeiJU50yBNItzJiB0v+8F8nuLl+dLgXDys9m6dau+/PJLxcTEaO7cudq/f78GDRqkAwcO+N9XKpBzSQSPRakizufz5freGJNnW35mzZqlESNG6P3331eVKlVyXRYXF6e1a9fq8OHDWrRokYYMGaI6derokksukfTninurVq302GOPSfrz/+B///33mjRpkvWLUjmc/txefvlllS9fPtebAR46dEg9e/bU1KlTFR8ff9Lrdu7c2f/nJk2aqHXr1qpbt65eeeUVDRkyJPg7ARRyTh+HJ/Pss89qwIABSk5Ols/nU926ddW3b1/NmDHDv89bb72lmTNn6o033tBZZ52ltWvX6s4771T16tXVu3dvx7cdjGDu95lnnqkzzzzT/33r1q21Y8cOPfnkk/5FqezsbFWpUkVTpkxRVFSUWrZsqd27d2vcuHEsSgEBcNKiL774QocPH9aKFSs0dOhQ1atXz5pf1tzm5jmTFFizOGcC/sfN3/cC/b3FhvOlQATzs8nOzpbP59Prr7+ucuXKSZKefvppXXfddXr++ecVGxsb0LkkgseiVBEVHx
+vqKioPCvB+/bty7Ni/HdvvfWW+vXrp3feeUcdOnTIc3mxYsVUr149SVKzZs2UmpqqMWPG+BelEhIS1KhRo1zXadiwoWbPnl2AexQeBfm5GWP00ksvqVevXipZsqR/+5YtW5SWlqarrrrKvy3nkymKFy+ujRs3qm7dunmOV7p0aTVp0kSbN28uyF0CCp2CPA5PpXLlynrvvff0+++/69dff1X16tU1dOhQ1a5d27/Pvffeq6FDh6pHjx6S/vxl56efftKYMWNCfpLl1v0+//zzNXPmTP/3CQkJKlGihKKiovzbGjZsqL179+r48eO5egXgfwrymMzpSpMmTfTzzz9rxIgRRW5RKhTnTJKzZnHOhEgUit/3Av29xcvzpUA4+dkkJCSoRo0a/gUp6c/2GGO0c+dO1a9fP6BzSQSP95QqokqWLKmWLVtqwYIFubYvWLBAF1xwwUmvN2vWLPXp00dvvPGGrrzyyoBuyxijzMxM//dt2rTRxo0bc+2zadOmPO+xZCOnPzdJ+vzzz/Xjjz/mee11cnKyvvvuO61du9b/1aVLF7Vr105r165VrVq18j1eZmamUlNTlZCQULA7BRQyBXkcBiImJkY1atTQiRMnNHv2bF199dX+y44ePapixXL/pzEqKiosH3Hs1v1es2ZNrm60adNGP/74Y677sGnTJiUkJLAgBZyCW4/Jv58nFRWhOGeSnDWLcyZEolD8vhfo7y1eni8FwsnPpk2bNtq9e7cOHz7s37Zp0yYVK1ZMNWvWzLXvqc4l4YAX766O8HjzzTdNiRIlzPTp080PP/xg7rzzTlO6dGn/J5wMHTrU9OrVy7//G2+8YYoXL26ef/55s2fPHv/XwYMH/fs89thjZv78+WbLli0mNTXVPPXUU6Z48eJm6tSp/n1WrlxpihcvbkaPHm02b95sXn/9dVOqVCkzc+bM8N35Agj255ajZ8+e5rzzzgvoNvL7FIu7777bLF261GzdutWsWLHC/OMf/zBxcXF5PsUPiAROHodr1qwxa9asMS1btjQ33XSTWbNmjfn+++/9l69YscLMnj3bbNmyxSxbtsy0b9/e1K5dO9cn1/Xu3dvUqFHDfPTRR2bbtm1mzpw5Jj4+3tx3331W3u/x48ebuXPnmk2bNpn169eboUOHGklm9uzZ/n22b99uypQpY2677TazceNG89FHH5kqVaqYUaNG+fc5dOiQ/+cnyTz99NNmzZo15qeffgrL/QZsFexjcuLEieaDDz4wmzZtMps2bTIvvfSSKVu2rBk+fLh/n8zMTP/jLSEhwdxzzz1mzZo1ZvPmzf59CstjMhTnTIE0i3Mm4E+h+H3v7/L7vcXr86VABPuzOXTokKlZs6a57rrrzPfff28+//xzU79+fdO/f3//PoGcSxaWftuERaki7vnnnzeJiYmmZMmSpkWLFubzzz/3X9a7d2/Ttm1b//dt27Y1kvJ89e7d27/P8OHDTb169UxMTIypUKGCad26tXnzzTfz3O6HH35oGjdubKKjo01ycrKZMmVKKO+m64L5uRljzMGDB01sbGzA9zO/uN9www0mISHBlChRwlSvXt1cc801uX6hBiJNsI/D/PqVmJjov3zp0qWmYcOGJjo62lSqVMn06tXL7Nq1K9cxMjIyzB133GHOOOMMExMTY+rUqWOGDx9uMjMzQ3lXcwnmfj/++OOmbt26/iZfeOGF5uOPP85zzOXLl5vzzjvPREdHmzp16pjRo0ebEydO+C9fsmTJafsPRKpgHpMTJkwwZ511lilVqpQpW7asad68uXnhhRdMVlaWf59t27bl+3j763EK02MyFOdMp2sW50zA/7j9+97f5fd7iw3nS4EItk+pqammQ4cOJjY21tSsWdMMGTLEHD161H95IOeShanftvAZY0wIn4gFAAAAAAAA5MF7SgEAAAAAACDsWJQCAAAAAABA2LEoBQAAAAAAgLBjUQoAAAAAAABhx6IUAAAAAAAAwo5FKQAAAA
AAAIQdi1IAAAAAAAAIOxalAAAAAAAAEHYsSiFgl1xyie68886A909LS5PP59PatWtDNpMkLV26VD6fTwcPHgzp7QCwBz0CUNjQLTtE2v0FCoJu2cXn8+m9996TFL6fdTiwKBVmffr0kc/nk8/nU4kSJVSnTh3dc889OnLkSEhuq2vXrq4db86cOXr00UcD3r9WrVras2ePGjdu7NoMKLhFixbpggsuUFxcnBISEnT//ffrxIkTXo8FD9AjBMLtv7scY8aM0TnnnKO4uDhVqVJFXbt21caNG3PtM2fOHF122WWKj48vMideKBi6hYK64IILtGfPHpUrV87rURAh6BZwaixKeeDyyy/Xnj17tHXrVo0aNUovvPCC7rnnnnz3/eOPP0I+T6C3UbFiRcXFxQV83KioKFWrVk3Fixd3Ohpctm7dOl1xxRW6/PLLtWbNGr355pv64IMPNHToUK9Hg0foEbzy+eef69Zbb9WKFSu0YMECnThxQp06dcp1kn7kyBG1adNGY8eO9XBS2IZuoSBKliypatWqyefzeT0KIgjdAk7BIKx69+5trr766lzb+vfvb6pVq2aMMebhhx82TZs2NdOnTze1a9c2Pp/PZGdnm4MHD5oBAwaYypUrm7i4ONOuXTuzdu3ak97Oww8/bCTl+lqyZInZtm2bkWTeeust07ZtWxMdHW1eeukls3//ftOjRw9To0YNExsbaxo3bmzeeOONXMds27atueOOO/zfJyYmmtGjR5u+ffuaMmXKmFq1apnJkyf7L8+5rTVr1hhjjFmyZImRZBYuXGhatmxpYmNjTevWrc2GDRty3c6jjz5qKleubMqUKWP69etn7r//ftO0adOT3tec4/7222/+be+++65p1KiRKVmypElMTDRPPvlkrus8//zzpl69eiY6OtpUqVLFXHvttf7L3nnnHdO4cWMTExNjKlasaC699FJz+PDhk97+0qVLzTnnnGNKlixpqlWrZu6//37zxx9/5Pq53X777ebee+81FSpUMFWrVjUPP/zwSY9njDEnTpwwd911lylXrpypWLGiuffee80tt9yS69+dtm3bmttuu83ccccdpnz58qZKlSpm8uTJ5vDhw6ZPnz6mTJkypk6dOmbevHn+6wwbNsy0atUq123NnTvXxMTEmIyMjFPOhKKHHtGj0/XoZH93xhizbt06065dO/9sAwYMMIcOHfL/HEqUKGGWLVvmP9aTTz5pKlWqZHbv3p3vbe3bt89IMp9//nmey/7+94fIRbfoViDnUX//u5NkEhMT872/M2bMMOXKlTOffvqpSU5ONqVLlzaXXXZZnlZNnz7d/zOpVq2aufXWW085A5CDbtGtQLq1cuVK06FDB1OpUiVTtmxZc/HFF5uUlJRc+0gyc+fONcYUrXMjFqXCLL8o3X777aZSpUrGmD9jkvMfw9WrV5v//ve/Jjs727Rp08ZcddVVZtWqVWbTpk3m7rvvNpUqVTK//vprvrdz6NAh0717d3P55ZebPXv2mD179pjMzEz/v7xJSUlm9uzZZuvWrWbXrl1m586dZty4cWbNmjVmy5YtZsKECSYqKsqsWLHCf8z8olSxYkXz/PPPm82bN5sxY8aYYsWKmdTUVGPMyaN03nnnmaVLl5rvv//eXHTRReaCCy7wH3PmzJkmJibGvPTSS2bjxo3mkUceMWXLlg0qSt9++60pVqyYGTlypNm4caOZMWOGiY2NNTNmzDDGGLNq1SoTFRVl3njjDZOWlmZWr15tnn32WWOMMbt37zbFixc3Tz/9tNm2bZtZt26def755/2/aP3dzp07TalSpcygQYNMamqqmTt3romPj88VnbZt25qyZcuaESNGmE2bNplXXnnF+Hw+M3/+/JPep8cff9yUK1fOvPvuu+aHH34w/fr1M3FxcXkWpeLi4syjjz5qNm3aZB599FFTrFgx07lzZzNlyhSzadMm8+9//9tUqlTJHDlyxBhjzJAhQ8yFF16Y67
Y+/fTTXL9oInLQI3p0uh6d7O/uyJEjpnr16uaaa64x3333nVm0aJGpXbu26d27t/+69957r0lMTDQHDx40a9euNdHR0WbOnDkn/dlt3rzZSDLfffddnsuK0okXCoZu0a1AzqNy/s727NljfvzxR1OvXj3Tq1evfO/vjBkzTIkSJUyHDh3MqlWrTEpKimnYsKG56aab/Md74YUXTExMjHnmmWfMxo0bzcqVK8348eNPevvAX9EtuhVItxYtWmRee+0188MPP/h//6tatWquJw6wKAVX/D1K33zzjalUqZLp3r27MebPKJUoUcLs27fPv8+iRYtM2bJlze+//57rWHXr1s21Mn262zLmf//yPvPMM6ed9YorrjB33323//v8otSzZ0//99nZ2aZKlSpm0qRJuW4rv5XyHB9//LGRZI4dO2aMMea8887L83+e2rRpE1SUbrrpJtOxY8dc+9x7772mUaNGxhhjZs+ebcqWLZvvM4NSUlKMJJOWlnbS2/ur//znP+bMM8802dnZ/m3PP/+8KVOmjMnKyjLG/Plz+/tC0DnnnGPuv//+kx43ISHBjB071v/9H3/8YWrWrJlnUeqvxz1x4oQpXbq0/6TLmD9PyiSZr7/+2hhjzGeffWaKFStm3njjDXPixAmzc+dOc+GFFxpJef7PCIo+ekSPjDl9j/L7u5syZYqpUKFCrv+L+PHHH5tixYqZvXv3GmOMyczMNM2bNzfdu3c3Z511lunfv/9JbyM7O9tcddVVeWbLUZROvFAwdItuGXP6buXIzs423bp1My1btjRHjx7N9/7OmDHDSDI//vhjrhmqVq3q/7569epm+PDhAd0n4O/oFt0yJvBu5Thx4oSJi4szH374oX9bUV2U4j2lPPDRRx+pTJkyiomJUevWrXXxxRfrueee81+emJioypUr+79PSUnR4cOHValSJZUpU8b/tW3bNm3ZskXbt2/Ptf2xxx477QytWrXK9X1WVpZGjx6ts88+23878+fP1/bt2095nLPPPtv/Z5/Pp2rVqmnfvn0BXychIUGS/NfZuHGjzj333Fz7//3700lNTVWbNm1ybWvTpo02b96srKwsdezYUYmJiapTp4569eql119/XUePHpUkNW3aVJdeeqmaNGmi66+/XlOnTtVvv/12yttq3bp1rvclaNOmjQ4fPqydO3fme59z7vfJfk7p6enas2ePWrdu7d9WvHjxPH9nfz9uVFSUKlWqpCZNmvi3Va1aVdL/fr6dOnXSuHHjNHDgQEVHR6tBgwa68sor/ddH5KFH9OhUPTrVbTVt2lSlS5fOdVvZ2dn+NysvWbKkZs6cqdmzZ+vYsWN65plnTnq82267TevWrdOsWbOCmgORiW7RrUC79Z///Edff/213nvvPcXGxp50v1KlSqlu3br5Hn/fvn3avXu3Lr300tPeHnAydItuna5b+/bt08CBA9WgQQOVK1dO5cqV0+HDh0/791EU8A5kHmjXrp0mTZqkEiVKqHr16ipRokSuy/96ki9J2dnZSkhI0NKlS/Mcq3z58ipfvnyuTySqWLHiaWf4+2089dRTGj9+vJ555hk1adJEpUuX1p133qnjx4+f8jh/n93n8yk7Ozvg6+Q8mP96nb+/8aQx5pTH+ztjzCmPERcXp9WrV2vp0qWaP3++HnroIY0YMUKrVq1S+fLltWDBAi1fvlzz58/Xc889p+HDh+ubb75R7dq1g7qtv2538nMKRH7HPd3Pd8iQIbrrrru0Z88eVahQQWlpaRo2bFi+9w9FHz2iR056lN9t/fV4OZYvXy5JOnDggA4cOJDn71qSbr/9dn3wwQdatmyZatasGdQciEx0i24F8nOaOXOmxo8fr6VLl562LfkdP2eOUy1mAYGiW3TrdD+nPn366JdfftEzzzyjxMRERUdHq3Xr1qf9+ygKeKaUB0qXLq169eopMTExz7+s+WnRooX27t2r4sWLq169erm+4uPj82zPiVLJkiWVlZ
UV0ExffPGFrr76avXs2VNNmzZVnTp1tHnz5gLdTyfOPPNMrVy5Mte2b7/9NqhjNGrUSF9++WWubcuXL1eDBg38zwYqXry4OnTooCeeeELr1q1TWlqaFi9eLOnPYLRp00aPPPKI1qxZo5IlS2ru3Lknva3ly5fnit7y5csVFxenGjVqBDV3jnLlyikhIUErVqzwbztx4oRSUlIcHS8/Pp9P1atXV2xsrGbNmqVatWqpRYsWrh0fhQc9Ojl69Kf8/u4aNWqktWvX5vqkvK+++krFihVTgwYNJElbtmzRXXfdpalTp+r888/XLbfckutkzBij2267TXPmzNHixYtZGEfA6NbJ0a0/ff311+rfv78mT56s888/3/FxpD9/mU1KStKiRYsKdBxENrp1cnTrT1988YUGDx6sK664QmeddZaio6O1f/9+x8crTFiUKgQ6dOig1q1bq2vXrvrss8+Ulpam5cuX64EHHjjlAzYpKUnr1q3Txo0btX///lN+9Ge9evX8K8Spqan617/+pb1794bi7pzS7bffrunTp+uVV17R5s2bNWrUKK1bty6oj+29++67tWjRIj366KPatGmTXnnlFU2cONH/sasfffSRJkyYoLVr1+qnn37Sq6++quzsbJ155pn65ptv9Nhjj+nbb7/V9u3bNWfOHP3yyy9q2LBhvrc1aNAg7dixQ7fffrs2bNig999/Xw8//LCGDBmiYsWcP7zuuOMOjR07VnPnztWGDRs0aNAgHTx40PHx/mrcuHH67rvv9P333+vRRx/V2LFjNWHCBF6+h4DQo8jrUX5/dzfffLNiYmLUu3dvrV+/XkuWLNHtt9+uXr16qWrVqsrKylKvXr3UqVMn9e3bVzNmzND69ev11FNP+Y976623aubMmXrjjTcUFxenvXv3au/evTp27Jh/nwMHDmjt2rX64YcfJP35FP+1a9d68u8DCi+6FVnd2rt3r7p166YePXrosssu87fll19+cXQ8SRoxYoSeeuopTZgwQZs3b9bq1atzvfQKcBvdiqxuSX/+fbz22mtKTU3VN998o5tvvjlinqnJolQh4PP5NG/ePF188cX65z//qQYNGqhHjx5KS0vzv2dQfgYMGKAzzzxTrVq1UuXKlfXVV1+ddN8HH3xQLVq00GWXXaZLLrlE1apVU9euXUNwb07t5ptv1rBhw3TPPfeoRYsW2rZtm/r06aOYmJiAj9GiRQu9/fbbevPNN9W4cWM99NBDGjlypPr06SPpz6e8zpkzR+3bt1fDhg314osvatasWTrrrLNUtmxZLVu2TFdccYUaNGigBx54QE899ZQ6d+6c723VqFFD8+bN08qVK9W0aVMNHDhQ/fr10wMPPFCgn8Pdd9+tW265RX369FHr1q0VFxenbt26FeiYOT755BNddNFFatWqlT7++GO9//77nvxdo3CiR5HXo/z+7kqVKqXPPvtMBw4c0DnnnKPrrrtOl156qSZOnChJGj16tNLS0jRlyhRJUrVq1TRt2jQ98MAD/pcbTJo0Senp6brkkkuUkJDg/3rrrbf8t/3BBx+oefPm/ve+69Gjh5o3b64XX3yxQPcJkYVuRVa3NmzYoJ9//lmvvPJKrracc845jo/Zu3dvPfPMM3rhhRd01lln6R//+IcnzyhB5KBbkdUtSXrppZf022+/qXnz5urVq5cGDx6sKlWqFOiYhYXPBPuCTSDMOnbsqGrVqum1117zehRP9enTRwcPHtR7773n9ShAxKJHAAobugWgsKFbkYU3OodVjh49qhdffFGXXXaZoqKiNGvWLC1cuFALFizwejQAEYYeAShs6BaAwoZugUUpWCXnqaqjRo1SZmamzjzzTM2ePVsdOnTwejQAEYYeAShs6BaAwoZugZfvAQAAAAAAIOx4o3MAAAAAAACEHYtSAAAAAAAACDsWpQAAAAAAABB2LEoBAAAAAAAg7FiUAgAAAAAAQNixKAUAAAAAAICwY1EKAAAAAAAAYc
eiFAAAAAAAAMKORSkAAAAAAACE3f8DHGTvV2Jked0AAAAASUVORK5CYII=\",\n      \"text/plain\": [\n       \"<Figure size 1200x400 with 4 Axes>\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    }\n   ],\n   \"source\": [\n    \"fig, axs = plt.subplots(nrows=1, ncols=4, figsize=(12, 4))\\n\",\n    \"for i, loss_label in enumerate(PT_TASKS + [\\\"all\\\"]):\\n\",\n    \"    draw_boxplot(results, \\\"lipophilicity\\\", loss_label=loss_label, ax=axs[i])\\n\",\n    \"plt.tight_layout()\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 9,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<div>\\n\",\n       \"<style scoped>\\n\",\n       \"    .dataframe tbody tr th:only-of-type {\\n\",\n       \"        vertical-align: middle;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe tbody tr th {\\n\",\n       \"        vertical-align: top;\\n\",\n       \"    }\\n\",\n       \"\\n\",\n       \"    .dataframe thead th {\\n\",\n       \"        text-align: right;\\n\",\n       \"    }\\n\",\n       \"</style>\\n\",\n       \"<table border=\\\"1\\\" class=\\\"dataframe\\\">\\n\",\n       \"  <thead>\\n\",\n       \"    <tr style=\\\"text-align: right;\\\">\\n\",\n       \"      <th></th>\\n\",\n       \"      <th>toymix_gcn_1</th>\\n\",\n       \"      <th>toymix_gcn_2</th>\\n\",\n       \"    </tr>\\n\",\n       \"  </thead>\\n\",\n       \"  <tbody>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>caco2</th>\\n\",\n       \"      <td>0.502565</td>\\n\",\n       \"      <td>0.523741</td>\\n\",\n       \"    </tr>\\n\",\n       \"    <tr>\\n\",\n       \"      <th>lipophilicity</th>\\n\",\n       \"      <td>0.478970</td>\\n\",\n       \"      <td>0.041120</td>\\n\",\n       \"    </tr>\\n\",\n       \"  </tbody>\\n\",\n       \"</table>\\n\",\n       \"</div>\"\n      ],\n      \"text/plain\": [\n       \"               toymix_gcn_1  toymix_gcn_2\\n\",\n     
  \"caco2              0.502565      0.523741\\n\",\n       \"lipophilicity      0.478970      0.041120\"\n      ]\n     },\n     \"execution_count\": 9,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"cols = sorted(list(results.keys()))\\n\",\n    \"\\n\",\n    \"rows = set()\\n\",\n    \"for k in cols: \\n\",\n    \"    rows.update(results[k].ft_results.keys())\\n\",\n    \"rows = sorted(list(rows))\\n\",\n    \"\\n\",\n    \"data = pd.DataFrame(columns=cols, index=rows, dtype=float)\\n\",\n    \"\\n\",\n    \"for pt_label, pt_results in results.items(): \\n\",\n    \"    for ft_label, ft_results in pt_results.ft_results.items(): \\n\",\n    \"        data.loc[ft_label, pt_label] = ft_results.best(FT_METRICS[ft_label], minimize=\\\"mae\\\" in FT_METRICS[ft_label])\\n\",\n    \"\\n\",\n    \"data\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 10,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/plain\": [\n       \"<Axes: >\"\n      ]\n     },\n     \"execution_count\": 10,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    },\n    {\n     \"data\": {\n      \"image/png\": 
\"iVBORw0KGgoAAAANSUhEUgAAAgMAAAGdCAYAAACPX3D5AAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIHZlcnNpb24zLjcuMiwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy8pXeV/AAAACXBIWXMAAA9hAAAPYQGoP6dpAAAz3UlEQVR4nO3deVhV5f7//xcgk6iIIoiKgFrOI1qhOXQ0yzLTsmzSyilLLaO+Fulx6hRm5tD55NjRstScU0tTMqdS00PY6JBNKIIKOGQm4/r94a992htQ2K7NBtfz0bWuy33vte71Xl4tfPO+73UvD8MwDAEAAMvydHcAAADAvUgGAACwOJIBAAAsjmQAAACLIxkAAMDiSAYAALA4kgEAACyOZAAAAIsjGQAAwOIquDuAv2T9uMvdIQBljmdQTXeHAJRJ3sH1XNp/TvrPpvXl6ljNUGaSAQAAyoz8PHdHUKpIBgAAcGTkuzuCUsWcAQAALI7KAAAAjvKtVRkgGQAAwIHBMAEAALASKgMAADhimAAAAItjmAAAAFgJlQEAAByx6BAAABbHMAEAALASKgMAADiy2NMEVAYAAHBgGPmmbSU1a9YsRUVFyc/PT9HR0dq5c2eR+27btk0eHh4FtoMHD5bonFQGAABw5KbKwLJlyzRq1CjNmjVLHTp00Ny5c9WjRw/98MMPqlu3bpHHHTp0SFWqVLF9rlGjRonOS2UAAIAyYtq0aRo0aJAGDx6sxo0ba8aMGQoPD9fs2bMve1xISIhq1qxp27y8vEp0XpIBAAAcGfmmbVlZWTp37pzdlpWVVeCU2dnZSkxMVPfu3e3au3fvrl27dl023NatWyssLExdu3bV1q1bS3y5JAMAADjKzzNti4+PV2BgoN0WHx9f4JTp6enKy8tTaGioXXtoaKjS0tIKDTMsLEzz5s3TqlWrtHr1ajVs2FBdu3bVjh07SnS5zBkAAMCF4uLiFBsba9fm6+tb5P4eHh52nw3DKND2l4YNG6phw4a2zzExMTp69KimTp2qTp06FTtGkgEAAByZuOiQr6/vZf/x/0twcLC8vLwKVAFOnjxZoFpwOTfddJPef//9EsXIMAEAAI7y883bisnHx0fR0dFKSEiwa09ISFD79u2L3U9SUpLCwsKKvb9EZQAAgDIjNjZW/fv3V9u2bRUTE6N58+YpOTlZw4YNk3RpyCElJUWLFi2SJM2YMUORkZFq2rSpsrOz9f7772vVqlVatWpVic5LMgAAgCM3vZugX79+ysjI0KRJk5SamqpmzZppw4YNioiIkCSlpqYqOTnZtn92draef/55paSkyN/fX02bNtXHH3+sO+64o0Tn9TAMwzD1SpyU9ePlH5sArMgzqKa7QwDKJO/gei7tP+ubTab15dviNtP6chXmDAAAYHEMEwAA4MAw8twdQqkiGQAAwJGb5gy4C8kAAACOeIUxAACwEioDAAA4YpgAAACLy7fWBEKGCQAAsDgqAwAAOGKYAAAAi+NpAgAAYCVUBgAAcMQwAQAAFscwAQAAsBIqAwAAOLJYZYBkAAAAB7y1EAAAq7NYZYA5AwAAWByVAQAAHPFoIQAAFscwAQAAsBIqAwAAOGKYAAAAi2OYAAAAWAmVAQAAHDFMAACAxTFMAAAArITKAAAAjixWGSAZAADAEXMGAACwOItVBpgzAACAxVEZAADAEcMEAABYHMMEAADASqgMAADgiGECAAAsjmECAABgJVQGAABwZLHKAMkAAACODMPdEZQqhgkAALA4KgMAADhimAAAAIsjGQAAwOIsts4AcwYAALA4KgMAADhimAAAAIvj0UIAAGAlVAYAAHDEMAEAABZnsWSAYQIAACyuxMnAxx9/rMGDB2v06NE6ePCg3XenT5/WP/7xD9OCAwDALYx887ZyoETJwJIlS3T33XcrLS1Nu3fvVuvWrbV48WLb99nZ2dq+fbvpQQIAUJqMfMO0rTwo0ZyBqVOnavr06Ro5cqQkaeXKlX
r88cd18eJFDRo0yCUBAgBQ6iw2Z6BEycDhw4fVs2dP2+e+ffsqODhYvXr1Uk5Ojvr06WN6gAAAwLVKlAxUqVJFJ06cUFRUlK2tS5cuWr9+vXr27Kljx46ZHiAAAKWunIz1m6VEcwZuuOEGbdy4sUB7586dtX79es2YMcOsuAAAcJ98w7ytHChRMvDss8/Kz8+v0O+6dOmijz76SAMGDDAlMAAAUDo8DKNsLMCc9eMud4cAlDmeQTXdHQJQJnkH13Np/xf+/ZRpfVUcOcu0vlzF6RUI8/Ly9OGHH+rAgQPy8PBQ48aNdffdd8vLy8vM+AAAKH08TXBlR44c0Z133qljx46pYcOGMgxDhw8fVnh4uD7++GPVr1/f7DgBAICLOLUc8dNPP6169erp6NGj+uqrr5SUlKTk5GRFRUXp6aefNjtGAABKl2GYt5UDTlUGtm/frj179qhatWq2turVq2vy5Mnq0KGDacHBNT74+DO9s3qj0jPPqH7d2ho95CFFN7u+0H33fXNQg156rUD72tmvKio8zNWhAi7zweqPtHDJSp3KyFSDqAi98PQTim7VrNB99371jQaOfKFA+7ol81QvIlyStHLdRq3buEVHfvlNktSkYQM988Rjat6koesuAq7DMMGV+fr66vfffy/Qfv78efn4+Fx1UHCdT3Z8qSnzl2jMk/3Vusl1WrFxm56aME0fznpFYSHVizxu3dx4Varob/scVKVyaYQLuMTGT7dr8sy5GvvccLVu0UQrPtygYc//U+ven6uwmiFFHvfR0vmqFFDR9jmoaqDtz/u++kZ33NpFrZo1lo+vjxYsXqGhz47Rh+/PUWiNYJdeD3C1nBom6Nmzp4YOHaovv/xShmHIMAzt2bNHw4YNU69evcyOESZa9OFm9bm1k+69rbPqhdfSC0MfUs3galq+4bPLHlctsIqCgwJtm5cXL7xE+bVo2Rrd07O7+va6XfUj6+rFUcNUM6SGPljz8WWPqxZUVcHVq9m2v0+Yfm3CC3rgnp5qdH191YsI18QXnlF+fr72/He/i68GLmGxdQacqgy8+eabevTRRxUTEyNvb29JUm5urnr16qWZM2eaGiDMk5OTqwNHftWgvnfYtce0bqr9B3+67LH3PzNe2dk5qhdeS0MfuEs3tGjsylABl8nJydEPh37UoEfus2tvf0Mbff3dD5c99r7HRygrO1v1I+vqiUcf1A3RLYvc9+LFLOXm5imQKlr5xAqEV1a1alWtXbtWhw8f1sqVK7VixQodOnRIa9asUWBg4JU7gFucPve78vLzVT2oil179aBApZ8+W+gxwdUCNW7EY5oWN1zTXxqhyDo1NWTM6/rvd4dKI2TAdKfPnFNeXr6qVwuya68eVFXpGacLPaZG9Wqa8MLTmv7KWM149Z+KrFtHg56J03/3f1vkeabPWaiQGtUV07a1qfGjlLixMjBr1ixFRUXJz89P0dHR2rlzZ7GO++KLL1ShQgW1atWqxOd0ep0BSWrQoIEaNGhQ4uOysrKUlZVl35idLV/mG5QKD3nYfTYMQx4ehe8bVSdMUXX+N1GwZeMGSjuVqXdXf6K2zZgYhfLLw+F/ekNGgba/REXUUVREHdvnVs0aK+3kKb2zZJXatmpeYP8Fi1doQ8I2Lfy/KfL15ecaim/ZsmUaNWqUZs2apQ4dOmju3Lnq0aOHfvjhB9WtW7fI486ePasBAwaoa9euOnHiRInP61RloG/fvpo8eXKB9tdff1333XdfIUfYi4+PV2BgoN02Zc57zoSCEgiqUllenp4FqgCZZ86petXiV3RaNKqv5OMl/58NKAuCqlaRl5en0jMy7dozT59V9WpVi91Pi6aN9Nux4wXaFy5ZqfmLlmne9FfUsEFUIUeiPDDy803bSmLatGkaNGiQBg8erMaNG2vGjBkKDw/X7NmzL3vcE088oYceekgxMTFOXa9TycD27dt15513Fmi//fbbtWPHjiseHxcXp7Nnz9pto4f1dy
YUlIC3dwU1bhCp3fu/t2vfs/8HtWpU/IWiDv6UrOBqDAehfPL29laThtdp974ku/bd+75Sy2ZNit3PwcM/qUb1anZtCxav1Nx3lmrOGy+rWePCH9dFOWHiMEFWVpbOnTtntxWojkvKzs5WYmKiunfvbtfevXt37dpV9JL9Cxcu1E8//aTx48c7fblOJQNFPULo7e2tc+fOXfF4X19fValSxW5jiKB0DOjdXas379CazTv089HjmjJ/qVJPZei+O26RJM18Z4VeemO+bf/31m7WZ7u/0m8paTryW4pmvrNCn+76rx7s2dVdlwBctQH9+mjV+k1a/dEm/fRrsl6bOVepJ06pX59Lk2unz16ouJen2vZ/b9kabdmxS78dTdGRn3/T9NkLlbDtCz147122fRYsXqF/z39XL8c9q9phoUrPyFR6RqYuXPiz1K8PZUth1fD4+PgC+6WnpysvL0+hoaF27aGhoUpLSyu07x9//FEvvviiFi9erAoVnB/5d+rIZs2aadmyZRo3bpxd+wcffKAmTYqfWaP03d7pRp35/Q/N/WCdTmWeVYOI2nprwrOqFXLpOehTp88q7VSGbf+cnFy9sWCZTmaclq+Pj+rXraW3xo9Sx3ZFz6IGyroe3Trr7LnfNWfhEp3KyNR19SI1e+ok1ap56YdwekamUk+ctO2fk5urqf/3tk6eypCvr48aREVo1usT1an9DbZ9Plj9kXJycvXs2FfszvXkwIc1fNAjpXNhMI+JTxPExcUpNjbWrs3X17fI/QvMZzEKn8+Sl5enhx56SBMnTtT1119dJcqptxauW7dO9957rx566CH94x//kCRt2bJFS5cu1YoVK9S7d+8SB8JbC4GCeGshUDhXv7Xwj0kPm9ZXwLjFxdovOztbFStW1IoVK9SnTx9b+zPPPKP9+/dr+/btdvufOXNGQUFBdutd5OfnyzAMeXl5afPmzbZ/o6/EqcpAr1699OGHH+rVV1/VypUr5e/vrxYtWujTTz9V586dnekSAABL8/HxUXR0tBISEuySgYSEBN19990F9q9SpYq+/db+8dZZs2bps88+08qVKxUVVfwJrE4PMNx5552FTiIEAKDcc9O7CWJjY9W/f3+1bdtWMTExmjdvnpKTkzVs2DBJl4YcUlJStGjRInl6eqpZM/v3aYSEhMjPz69A+5Vc1ToDAABck9y0jHC/fv2UkZGhSZMmKTU1Vc2aNdOGDRsUEREhSUpNTVVycrLp53VqzkBeXp6mT5+u5cuXKzk5WdnZ2XbfZ2ZmFnFk0ZgzABTEnAGgcC6fMzDuAdP6Cpj0gWl9uYpTjxZOnDhR06ZN0/3336+zZ88qNjZW99xzjzw9PTVhwgSTQwQAoJQZ+eZt5YBTycDixYs1f/58Pf/886pQoYIefPBBvf322xo3bpz27NljdowAAJQui7210KlkIC0tTc2bX1qPu1KlSjp79tLytj179tTHH1/+FaAAAJR17lqO2F2cSgbq1Kmj1NRUSZdeVrR582ZJ0r59+y67kAIAACh7nEoG+vTpoy1btki6tBjCP//5T1133XUaMGCABg4caGqAAACUOosNEzj1aOHf31jYt29fhYeH64svvlCDBg3Uq1cv04IDAMAtysk/4mZxqjIQHx+vBQsW2D7feOONio2NVXp6ul577TXTggMAAK7nVDIwd+5cNWrUqEB706ZNNWfOnKsOCgAAt7LYo4VODROkpaUpLCysQHuNGjVsEwsBACi3GCa4sr/mCDj64osvVKtWrasOCgAAlB6nKgODBw/WqFGjlJOTY/cK49GjR+u5554zNUAAAEqbYbHKgFPJwOjRo5WZmamnnnrK9l4CPz8/vfDCC4qLizM1QAAASp3FkgGnXlT0l/Pnz+vAgQPy9/fXddddd1ULDvGiIqAgXlQEFM7VLyr6/emepvVV+c2PTOvLVa7qFcaVKlVSu3btzIoFAICyoZwsI2yWq0oGAAC4JllsmIBkAAAARxZLBp
x6tBAAAFw7qAwAAODgKubWl0skAwAAOGKYAAAAWAmVAQAAHFmsMkAyAACAA6stR8wwAQAAFkdlAAAARxarDJAMAADgyFqrETNMAACA1VEZAADAgdUmEJIMAADgiGQAAACLY84AAACwEioDAAA4YM4AAABWxzABAACwEioDAAA4YJgAAACrY5gAAABYCZUBAAAcGBarDJAMAADgyGLJAMMEAABYHJUBAAAcMEwAAIDVkQwAAGBtVqsMMGcAAACLozIAAIADq1UGSAYAAHBgtWSAYQIAACyOygAAAI4MD3dHUKpIBgAAcMAwAQAAsBQqAwAAODDyGSYAAMDSGCYAAACWQmUAAAAHBk8TAABgbVYbJiAZAADAgdUmEDJnAAAAi6MyAACAA8NwdwSli2QAAAAHDBMAAABLoTIAAIADq1UGSAYAAHBgtTkDDBMAAGBxVAYAAHDAMAEAABZnteWIGSYAAMDiqAwAAODAau8moDIAAICDfMPDtK2kZs2apaioKPn5+Sk6Olo7d+4sct/PP/9cHTp0UPXq1eXv769GjRpp+vTpJT4nlQEAABy4a87AsmXLNGrUKM2aNUsdOnTQ3Llz1aNHD/3www+qW7dugf0DAgI0YsQItWjRQgEBAfr888/1xBNPKCAgQEOHDi32eT0Mo2w8TZn14y53hwCUOZ5BNd0dAlAmeQfXc2n/hxr1MK2vhgc3FnvfG2+8UW3atNHs2bNtbY0bN1bv3r0VHx9frD7uueceBQQE6L333iv2eRkmAADAgZHvYdqWlZWlc+fO2W1ZWVkFzpmdna3ExER1797drr179+7atat4vzAnJSVp165d6ty5c4mul2QAAAAHhmHeFh8fr8DAQLutsN/y09PTlZeXp9DQULv20NBQpaWlXTbeOnXqyNfXV23bttXw4cM1ePDgEl0vcwYAAHChuLg4xcbG2rX5+voWub+Hh/18BcMwCrQ52rlzp86fP689e/boxRdfVIMGDfTggw8WO0aSAQAAHJi5AqGvr+9l//H/S3BwsLy8vApUAU6ePFmgWuAoKipKktS8eXOdOHFCEyZMKFEywDABAAAO3PFooY+Pj6Kjo5WQkGDXnpCQoPbt2xe7H8MwCp2TcDlUBgAAKCNiY2PVv39/tW3bVjExMZo3b56Sk5M1bNgwSZeGHFJSUrRo0SJJ0ltvvaW6deuqUaNGki6tOzB16lSNHDmyROclGQAAwIG71hno16+fMjIyNGnSJKWmpqpZs2basGGDIiIiJEmpqalKTk627Z+fn6+4uDj98ssvqlChgurXr6/JkyfriSeeKNF5WWcAKMNYZwAonKvXGfgm8i7T+mrx63rT+nIV5gwAAGBxDBMAAODAmXcKlGckAwAAOHDXnAF3IRkAAMBB2ZhNV3qYMwAAgMVRGQAAwAFzBtzEOJ3i7hCAMse/6X3uDgEok3KzXftvhtXmDDBMAACAxZWZygAAAGUFwwQAAFicxR4mYJgAAACrozIAAIADhgkAALA4niYAAACWQmUAAAAH+e4OoJSRDAAA4MCQtYYJSAYAAHCQb7FnC5kzAACAxVEZAADAQT7DBAAAWJvV5gwwTAAAgMVRGQAAwAGPFgIAYHEMEwAAAEuhMgAAgAOGCQAAsDirJQMMEwAAYHFUBgAAcGC1CYQkAwAAOMi3Vi5AMgAAgCOrLUfMnAEAACyOygAAAA4s9gZjkgEAABzxaCEAALAUKgMAADjI97DWBEKSAQAAHFhtzgDDBAAAWByVAQAAHFhtAiHJAAAADqy2AiHDBAAAWByVAQAAHFhtOWKSAQAAHFjtaQKSAQAAHDBnAAAAWAqVAQAAHPBoIQAAFme1OQMMEwAAYHFUBgAAcGC1CYQkAwAAOLDanAGGCQAAsDgqAwAAOLBaZYBkAAAAB4bF5gwwTAAAgMVRGQAAwAHDBAAAWBzJAAAAFscKhAAAwFKoDAAA4IAVCAEAsDirzRlgmAAAAIujMgAAgAOrVQZIBgAAcM
DTBMXwxx9/mB0HAABwE6eSgdDQUA0cOFCff/652fEAAOB2+R7mbeWBU8nA0qVLdfbsWXXt2lXXX3+9Jk+erOPHj5sdGwAAbpFv4lYeOJUM3HXXXVq1apWOHz+uJ598UkuXLlVERIR69uyp1atXKzc31+w4AQCwhFmzZikqKkp+fn6Kjo7Wzp07i9x39erVuvXWW1WjRg1VqVJFMTEx2rRpU4nPeVWPFlavXl3PPvusvv76a02bNk2ffvqp+vbtq1q1amncuHG6cOHC1XQPAIBbGCZuJbFs2TKNGjVKY8aMUVJSkjp27KgePXooOTm50P137NihW2+9VRs2bFBiYqJuueUW3XXXXUpKSirReT0Mw3B60mRaWpoWLVqkhQsXKjk5WX369NGgQYN0/PhxTZ48WWFhYdq8eXOx+rq4d4WzYQDXrEo3j3J3CECZlJud4tL+X4l42LS+xvy2uNj73njjjWrTpo1mz55ta2vcuLF69+6t+Pj4YvXRtGlT9evXT+PGjSv2eZ16tHD16tVauHChNm3apCZNmmj48OF65JFHVLVqVds+rVq1UuvWrZ3pHgAAtzJzrD8rK0tZWVl2bb6+vvL19bVry87OVmJiol588UW79u7du2vXrl3FOld+fr5+//13VatWrUQxOjVM8Pjjj6tWrVr64osvtH//fo0YMcIuEZCkevXqacyYMc50DwDANSM+Pl6BgYF2W2G/5aenpysvL0+hoaF27aGhoUpLSyvWud544w398ccfuv/++0sUo1OVgdTUVFWsWPGy+/j7+2v8+PHOdA8AgFuZuehQXFycYmNj7docqwJ/5+Fh/zyiYRgF2gqzdOlSTZgwQWvXrlVISEiJYnQqGahcubJSU1MLnCwjI0MhISHKy8tzplsAAMoEM4cJChsSKExwcLC8vLwKVAFOnjxZoFrgaNmyZRo0aJBWrFihbt26lThGp4YJippzmJWVJR8fH2e6BADA0nx8fBQdHa2EhAS79oSEBLVv377I45YuXarHHntMS5Ys0Z133unUuUtUGXjzzTclXSphvP3226pUqZLtu7y8PO3YsUONGjVyKhAAAMoKd60cGBsbq/79+6tt27aKiYnRvHnzlJycrGHDhkm6NOSQkpKiRYsWSbqUCAwYMEAzZ87UTTfdZKsq+Pv7KzAwsNjnLVEyMH36dEmXKgNz5syRl5eX7TsfHx9FRkZqzpw5JekSAIAyJ99Nryrq16+fMjIyNGnSJKWmpqpZs2basGGDIiIiJF2as/f3NQfmzp2r3NxcDR8+XMOHD7e1P/roo3rnnXeKfV6n1hm45ZZbtHr1agUFBZX00CKxzgBQEOsMAIVz9ToDYyMfMq2vf/26xLS+XMWpCYRbt241Ow4AAMoMq73CuNjJQGxsrF5++WUFBAQUeETC0bRp0646MAAA3KW8vGDILMVOBpKSkpSTk2P7c1GK8ywkAAAoO4qdDPx9aIBhAgDAtcxdEwjdxal1Bs6ePavMzMwC7ZmZmTp37txVBwUAgDu5662F7uJUMvDAAw/ogw8+KNC+fPlyPfDAA1cdFAAA7pRv4lYeOJUMfPnll7rlllsKtHfp0kVffvnlVQcFAABKj1OPFmZlZSk3N7dAe05Ojv7888+rDgoAAHdizkAxtGvXTvPmzSvQPmfOHEVHR191UAAAuJPV5gw4VRl45ZVX1K1bN3399dfq2rWrJGnLli3at2+fNm/ebGqAAADAtZyqDHTo0EG7d+9WeHi4li9frvXr16tBgwb65ptv1LFjR7NjBACgVFltAqFTlQFJatWqlRYvXmxmLAAAlAlGuSnwm6PYycC5c+dUpUoV258v56/9AABA2VfsZCAoKEipqakKCQlR1apVC1122DAMeXh4KC8vz9QgAQAoTeWlvG+WYicDn332mapVqyaJ5YgBANc2qz1aWOxkoHPnzoX+GQAAlG/FTga++eabYnfaokULp4IBAKAssFZdoATJQKtWreTh4SHDuPxfEX
MGypZln36pdz7eqfSz51W/dohGP3KH2jSMvOJxSYd/06BX/qMGdUK0/JURdt+9/8kuLd+yV2kZZ1S1ckXd2q6Znr7/Vvn6eLvoKoCrM+yJR/Vc7DCFhYXo+x8O67nnxuvzL/YWuX+njjfp9dfHq2mT63X8+AlNfWO25s1/r9B977+/l5a8P1tr132ie/sOsrV3vPlGPffck2rTurlq1aqpe/oO1Lp1m0y/NrgGwwRF+OWXX1wZB1zgkz3fasr7GzTmsbvU6rq6Wrl1n556fZHWTH5aYcFVizzu9wsXNXbuSt3QtJ4yz563++7jL/Zr5vLNmji4j1peV1e/paVr3LzVkqT/98gdrrwcwCn33ddL096YoBEjX9Ku3fs0ZHB/fbT+fTVv2UVHjx4vsH9kZLjWr3tPb/9niR59bKTax7TT//37VZ1Kz9CaNRvs9q1bt7amTB6nnTv3FOgnIKCivvnmB73z7jKtXP62y64PrsEEwiJERES4Mg64wHsbv1CfztG6p0tbSdLoR+7Urm+PaPmWvXqmX/cij3t5wVr1iGkpL08PbU08YPfd10eOqtV1dXVH+5aSpNo1gnR7TAt999Mx110IcBWefWaIFiz8QAsWLpUkPff8eHXv3lnDnhigMWMnF9j/iaH9lXw0Rc89P16SdPDgEUVHt9Rzzw6zSwY8PT313rv/p4mTpurmm29U1ar2j1R/smmrPtnEZGuUD8VOBtatW6cePXrI29tb69atu+y+vXr1uurAcHVycnN14NfjGnhXJ7v2mGYN9PWPyUUe9+GORB07malXn+yr+Wu3Ffi+9fUR2rDra3370zE1r19Hx05m6vOvD+uum1uZfAXA1fP29labNi302utv2bUnJGxXzE1tCz3mphujlZCw3a5tc8I2DXz8AVWoUMH2krZ/jn1Wp9IztPCdD3TzzTe65gLgNiw6VITevXsrLS1NISEh6t27d5H7MWegbDj9+wXl5eerepVKdu3VAwOU7lD6/8tvaemauWyzFo4dogpeXoXu0yOmhU7//ocee3m+JEO5efm6v+sNGnQXT5ig7AkOrqYKFSro5Il0u/aTJ9MVWjOk0GNCa4bo5EmH/U+ky9vbW8HB1ZSWdlLtY9rq8cceVHS7W10WO9yLYYIi5OfnF/pnZ2RlZSkrK8uuzcjOYQKaCziuDWUYBdskKS8/X3GzVujJe7oqMiy4yP72HfhZb6/brjGP3aXm9eso+USmprz/sYI/3Konet9icvSAORwnPl9pMnTB/f/XXqlSgN59598a9uT/U0bGadNjBdzB6XcTXI34+HhNnDjRrm3M4L4aO+R+d4RzTQqqXFFenp4FqgCZ5/4oUC2QpD/+zNL3v6To4G+pmrzoI0lSvmHIMAy1eXScZo9+VDc2ra+3Vm5Rzw6tbPMQrguvqT+zsvXygrUa0quzPD2devcV4BLp6ZnKzc1VaM0adu01alTXyROnCj3mRNpJhYY67B8SrJycHGVknFbTpg0VFVVXH655x/b9X//fX7zwm5o066Sff/7N3AtBqWOYoJi2bNmiLVu26OTJkwUqBQsWLLjssXFxcYqNjbVrM775yNlQUAjvChXUOLKW9nx3RF3bNrG17/nuiLq0aVxg/0r+vlr56ki7tuVbvtTeH37W1JEPqnaNIEnSxeycAktRe3le+i3LWrcOyoOcnBx99dU36ta1k9au/cTW3q1bJ61fX/hjfnu+TNSdd9qX/2/t1lmJid8oNzdXBw8eUcvW/7D7ftLE0apcqZKefW5coU8ooPxhmKAYJk6cqEmTJqlt27YKCwsr9D0Fl+Pr6ytfX1+7tosMEZiuf48OGjNnpZpE1VbLBuFatfW/Ss04q/u6tpMkzVy2WSdPn9Mrw/rK09NT14WH2h1frUqAfL0r2LV3bt1Q723cpUYRYWpev46OnsjUWyu3qHObRvKiKoAyaPrM+Xp34UwlJn6tPV8masigR1Q3vLbmzru0bsAr/3pRtWqF6fGBz0
iS5s57T089+bimThmvtxcs1k03Rmvg4w/o4f7DJV0a5vz++0N25zhz5tLL2/7eHhBQUQ0aRNk+R0XWVcuWTZWZeZqEAWWOU8nAnDlz9M4776h///5mxwMT3X5Tc509f0HzPtyqU2d+V4M6oXrr+f6qFXzpt/z0M78rLeNMifoccncXechDb638VCdPn1NQlQB1btVII+7r5oIrAK7eihXrVL1akMaOeVZhYSH67vtDuqtXfyUnp0iSatYMVd3wWrb9f/31qO7q1V9Tp07Qk08+quPHT2jUs+MKrDFwJW2jW2rLpyttn9+YOkGS9O6i5Ro0+NmrvzC4VP4VFti71ngYV1pSsBDVq1fX3r17Vb9+fdMCubh3hWl9AdeKSjePcncIQJmUm53i0v4fibjHtL7e/221aX25ilN13cGDB2vJkiVmxwIAANyg2MMEf5/wl5+fr3nz5unTTz9VixYt5O1tP94/bdo08yIEAKCU8W6CIiQlJdl9btWqlSTpu+++s2sv6WRCAADKGqs9H1XsZGDrVtbYBgBYg9UeLbzqZ8GOHj2qY8d4SQ0AAOWVU8lAbm6u/vnPfyowMFCRkZGKiIhQYGCgxo4dq5ycHLNjBACgVOXLMG0rD5xaZ2DEiBFas2aNpkyZopiYGEnS7t27NWHCBKWnp2vOnDmmBgkAQGlizkAxLF26VB988IF69Ohha2vRooXq1q2rBx54gGQAAIByxKlkwM/PT5GRkQXaIyMj5ePjc7UxAQDgVkwgLIbhw4fr5ZdftnsNcVZWll555RWNGDHCtOAAAHAH4/9/a6sZW3ngVGUgKSlJW7ZsUZ06ddSyZUtJ0tdff63s7Gx17dpV99zzv2UcV68u+8swAgBgZU4lA1WrVtW9995r1xYeHm5KQAAAuFt5eQrALE4lAwsXLjQ7DgAAygyrzRlwKhn4y6lTp3To0CF5eHjo+uuvV40aNcyKCwAAlBKnJhD+8ccfGjhwoMLCwtSpUyd17NhRtWrV0qBBg3ThwgWzYwQAoFQZJv5XHjiVDMTGxmr79u1av369zpw5ozNnzmjt2rXavn27nnvuObNjBACgVLECYTGsWrVKK1euVJcuXWxtd9xxh/z9/XX//fdr9uzZZsUHAECpKy+PBJrFqcrAhQsXFBoaWqA9JCSEYQIAAMoZp5KBmJgYjR8/XhcvXrS1/fnnn5o4caLtXQUAAJRX+SZu5YFTwwQzZ87U7bffblt0yMPDQ/v375efn582bdpkdowAAJSq8jLxzyxOJQPNmjXTjz/+qPfff18HDx6UYRh64IEH9PDDD8vf39/sGAEAgAs5vc6Av7+/hgwZYmYsAACUCeXlKQCzOJ0MHDp0SP/+97914MABeXh4qFGjRhoxYoQaNWpkZnwAAJQ6niYohpUrV6pZs2ZKTExUy5Yt1aJFC3311Vdq3ry5VqxYYXaMAADAhZyqDIwePVpxcXGaNGmSXfv48eP1wgsv6L777jMlOAAA3MFqwwROVQbS0tI0YMCAAu2PPPKI0tLSrjooAADcieWIi6FLly7auXNngfbPP/9cHTt2vOqgAABwp3zDMG0rD5waJujVq5deeOEFJSYm6qabbpIk7dmzRytWrNDEiRO1bt06u30BAEDZ5WE4MWXS07N4BQUPDw/l5eUVa9+Le5l4CDiqdPMod4cAlEm52Sku7b9j7a6m9bUzZYtpfbmKU5WB/PzyssAiAAAlxwRCAABgKcWuDLz55psaOnSo/Pz89Oabb15236effvqqAwMAwF2sVhko9pyBqKgo/fe//1X16tUVFRVVdIceHvr5559LHAhzBoCCmDMAFM7VcwZuqtXFtL72HN9mWl+uUuzKwC+//FLonwEAQPlW7GQgNja2WPt5eHjojTfecDogAADczWrDBMVOBpKSkoq1n4eHh9PBAABQFpSXlQPNUuxkYOvWra6MAwAAuInTrzAGAOBaxSuMAQCwuHwZpm0lNWvWLEVFRcnPz0/R0dGFvg
voL6mpqXrooYfUsGFDeXp6atSoUU5dL8kAAAAODMMwbSuJZcuWadSoURozZoySkpLUsWNH9ejRQ8nJyYXun5WVpRo1amjMmDFq2bKl09dLMgAAQBkxbdo0DRo0SIMHD1bjxo01Y8YMhYeHa/bs2YXuHxkZqZkzZ2rAgAEKDAx0+rzMGQAAwIGZjxZmZWUpKyvLrs3X11e+vr52bdnZ2UpMTNSLL75o1969e3ft2rXLtHgKQ2UAAAAHhon/xcfHKzAw0G6Lj48vcM709HTl5eUpNDTUrj00NFRpaWkuvV4qAwAAuFBcXFyBhfscqwJ/57hej2EYLl/Dh2QAAAAH+SY+WljYkEBhgoOD5eXlVaAKcPLkyQLVArMxTAAAgAMzhwmKy8fHR9HR0UpISLBrT0hIUPv27c2+RDtUBgAAKCNiY2PVv39/tW3bVjExMZo3b56Sk5M1bNgwSZeGHFJSUrRo0SLbMfv375cknT9/XqdOndL+/fvl4+OjJk2aFPu8JAMAADgwc5igJPr166eMjAxNmjRJqampatasmTZs2KCIiAhJlxYZclxzoHXr1rY/JyYmasmSJYqIiNCvv/5a7PN6GGVkzcWLe1e4OwSgzKl08yh3hwCUSbnZKS7tv1FIO9P6Onhyn2l9uQpzBgAAsDiGCQAAcOCuYQJ3IRkAAMBBSZ4CuBaQDAAA4MBqlQHmDAAAYHFUBgAAcMAwAQAAFmcY+e4OoVQxTAAAgMVRGQAAwEE+wwQAAFhbGVmct9QwTAAAgMVRGQAAwAHDBAAAWBzDBAAAwFKoDAAA4MBqyxGTDAAA4IAVCAEAsDjmDAAAAEuhMgAAgAMeLQQAwOIYJgAAAJZCZQAAAAc8WggAgMUxTAAAACyFygAAAA54mgAAAItjmAAAAFgKlQEAABzwNAEAABbHi4oAALA4q1UGmDMAAIDFURkAAMCB1Z4mIBkAAMCB1eYMMEwAAIDFURkAAMABwwQAAFic1ZIBhgkAALA4KgMAADiwVl1A8jCsVgvBZWVlZSk+Pl5xcXHy9fV1dzhAmcB9gWsdyQDsnDt3ToGBgTp79qyqVKni7nCAMoH7Atc65gwAAGBxJAMAAFgcyQAAABZHMgA7vr6+Gj9+PJOkgL/hvsC1jgmEAABYHJUBAAAsjmQAAACLIxkAAMDiSAYswMPDQx9++KG7wwDKHO4N4BKSARfp0qWLRo0a5e4wJEmpqanq0aOHu8Mwxffff697771XkZGR8vDw0IwZM9wdEkqIe8M15s+fr44dOyooKEhBQUHq1q2b9u7d6+6wUE6QDFhAzZo1r5lHoi5cuKB69epp8uTJqlmzprvDQTl3Ld0b27Zt04MPPqitW7dq9+7dqlu3rrp3766UlBR3h4bywIDpHn30UUOXXnpl23755Rdj27ZtRrt27QwfHx+jZs2axgsvvGDk5OQYhmEY7777rlGtWjXj4sWLdn3dc889Rv/+/Q3DMIzx48cbLVu2NP7zn/8Y4eHhRkBAgDFs2DAjNzfXeO2114zQ0FCjRo0axr/+9S+7PiQZa9assZ0nICDAOHz4sO37ESNGGNddd51x/vz5K17b8ePHjTvuuMPw8/MzIiMjjcWLFxsRERHG9OnTbfucPn3aGDJkiBESEmL4+voaTZs2NdavX28YhmEsXLjQCAwMND755BOjUaNGRkBAgHHbbbcZx48fL/Hfs+N5UfZxb5TOvWEYhpGbm2tUrlzZePfdd506HtZCMuACZ86cMWJiYowhQ4YYqampRmpqqnHs2DGjYsWKxlNPPWUcOHDAWLNmjREcHGyMHz/eMAzDuHDhghEYGGgsX77c1s+pU6cMHx8f47PPPjMM49IPvEqVKhl9+/Y1vv/+e2PdunWGj4+PcdtttxkjR440Dh48aCxYsMCQZOzevdvWz99/4BmGYdx3331Gu3btjJycHGPjxo2Gt7e3sXfv3mJdW7du3YxWrVoZe/bsMRITE43OnTsb/v7+th94eXl5xk
033WQ0bdrU2Lx5s/HTTz8Z69evNzZs2GAYxqUfeN7e3ka3bt2Mffv2GYmJiUbjxo2Nhx56qMR/zyQD5Q/3RuncG4ZhGOfOnTP8/PxsyQZwOSQDLtK5c2fjmWeesX1+6aWXjIYNGxr5+fm2trfeesuoVKmSkZeXZxiGYTz55JNGjx49bN/PmDHDqFevnu2Y8ePHGxUrVjTOnTtn2+e2224zIiMjbX0YhmE0bNjQiI+Pt312/IGXmZlp1KlTx3jyySeN0NDQAr8tFeXAgQOGJGPfvn22th9//NGQZPuBt2nTJsPT09M4dOhQoX0sXLjQkGQcOXLE7u8hNDS0WDH8HclA+cS94fp7wzAM46mnnjLq169v/Pnnn04dD2upUJpDElZ24MABxcTEyMPDw9bWoUMHnT9/XseOHVPdunU1ZMgQtWvXTikpKapdu7YWLlyoxx57zO6YyMhIVa5c2fY5NDRUXl5e8vT0tGs7efJkkbEEBQXpP//5j2677Ta1b99eL774YrGu4dChQ6pQoYLatGlja2vQoIGCgoJsn/fv3686dero+uuvL7KfihUrqn79+rbPYWFhl40X1zbujf8x696YMmWKli5dqm3btsnPz6/Ex8N6mEBYSgzDsPvB9VebJFt769at1bJlSy1atEhfffWVvv32Wz322GN2x3h7e9t99vDwKLQtPz//svHs2LFDXl5eOn78uP74449iX8OV2v39/a/YT2HxFtU3rn3cG/9jxr0xdepUvfrqq9q8ebNatGhRomNhXSQDLuLj46O8vDzb5yZNmmjXrl12N/auXbtUuXJl1a5d29Y2ePBgLVy4UAsWLFC3bt0UHh5uemy7du3SlClTtH79elWpUkUjR44s1nGNGjVSbm6ukpKSbG1HjhzRmTNnbJ9btGihY8eO6fDhw2aHjWsE94br7o3XX39dL7/8sj755BO1bdvWZefBtYdkwEUiIyP15Zdf6tdff1V6erqeeuopHT16VCNHjtTBgwe1du1ajR8/XrGxsXZlzIcfflgpKSmaP3++Bg4caHpcv//+u/r376+RI0eqR48eWrJkiZYvX64VK1Zc8dhGjRqpW7duGjp0qPbu3aukpCQNHTpU/v7+tt/gOnfurE6dOunee+9VQkKCfvnlF23cuFGffPKJKfFnZ2dr//792r9/v7Kzs5WSkqL9+/fryJEjpvQP1+PecM29MWXKFI0dO1YLFixQZGSk0tLSlJaWpvPnz5vSP65tJAMu8vzzz8vLy0tNmjRRjRo1lJOTow0bNmjv3r1q2bKlhg0bpkGDBmns2LF2x1WpUkX33nuvKlWqpN69e5se1zPPPKOAgAC9+uqrkqSmTZvqtdde07Bhw4r1PPKiRYsUGhqqTp06qU+fPhoyZIgqV65sNy65atUqtWvXTg8++KCaNGmi0aNH2/0meDWOHz+u1q1bq3Xr1kpNTdXUqVPVunVrDR482JT+4XrcG665N2bNmqXs7Gz17dtXYWFhtm3q1Kmm9I9rG68wLoNuvfVWNW7cWG+++aa7Q7miY8eOKTw8XJ9++qm6du3q7nBwjePeAFyDZKAMyczM1ObNm/Xwww/rhx9+UMOGDd0dUgGfffaZzp8/r+bNmys1NVWjR49WSkqKDh8+XGDyE2AW7g3AtXi0sAxp06aNTp8+rddee80tP+x27tx52XXaz58/r5ycHL300kv6+eefVblyZbVv316LFy827YddpUqVivxu48aN6tixoynnQfnCvcG9AdeiMgCbP//887Jjow0aNHB5DJebCFi7du1iPZ4FmI17A9c6kgEAACyOpwkAALA4kgEAACyOZAAAAIsjGQAAwOJIBgAAsDiSAQAALI5kAAAAiyMZAADA4v4/YDgYj30wX4gAAAAASUVORK5CYII=\",\n      \"text/plain\": [\n       \"<Figure size 640x480 with 2 Axes>\"\n      ]\n     },\n    
 \"metadata\": {},\n     \"output_type\": \"display_data\"\n    }\n   ],\n   \"source\": [\n    \"sns.heatmap(data, annot=True)\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"The End.\"\n   ]\n  }\n ],\n \"metadata\": {\n  \"kernelspec\": {\n   \"display_name\": \"Python 3\",\n   \"language\": \"python\",\n   \"name\": \"python3\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.10.12\"\n  },\n  \"orig_nbformat\": 4\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 2\n}\n"
  },
  {
    "path": "notebooks/dev-datamodule-invalidate-cache.ipynb",
    "content": "{\n \"cells\": [\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 1,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"Using backend: pytorch\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"%load_ext autoreload\\n\",\n    \"%autoreload 2\\n\",\n    \"\\n\",\n    \"import pathlib\\n\",\n    \"import functools\\n\",\n    \"import tempfile\\n\",\n    \"\\n\",\n    \"import numpy as np\\n\",\n    \"import lightning\\n\",\n    \"import torch\\n\",\n    \"import datamol as dm\\n\",\n    \"\\n\",\n    \"import graphium\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 6,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# Setup a temporary cache file. Only for\\n\",\n    \"# demo purposes, use a known path in prod.\\n\",\n    \"cache_data_path = pathlib.Path(tempfile.mkdtemp()) / \\\"cache.pkl\\\"\\n\",\n    \"cache_data_path = \\\"/home/hadim/test-cache.pkl\\\"\\n\",\n    \"\\n\",\n    \"# Load a dataframe\\n\",\n    \"df = graphium.data.load_tiny_zinc()\\n\",\n    \"df.head()\\n\",\n    \"\\n\",\n    \"# Setup the featurization\\n\",\n    \"featurization_args = {}\\n\",\n    \"featurization_args[\\\"atom_property_list_onehot\\\"] = [\\\"atomic-number\\\", \\\"valence\\\"]\\n\",\n    \"featurization_args[\\\"atom_property_list_float\\\"] = [\\\"mass\\\", \\\"electronegativity\\\", \\\"in-ring\\\"]\\n\",\n    \"featurization_args[\\\"edge_property_list\\\"] = [\\\"bond-type-onehot\\\", \\\"stereo\\\", \\\"in-ring\\\"]\\n\",\n    \"featurization_args[\\\"add_self_loop\\\"] = False\\n\",\n    \"featurization_args[\\\"use_bonds_weights\\\"] = False\\n\",\n    \"featurization_args[\\\"explicit_H\\\"] = False\\n\",\n    \"\\n\",\n    \"# Config for datamodule\\n\",\n    \"dm_args = {}\\n\",\n    \"dm_args[\\\"df\\\"] = df\\n\",\n    \"dm_args[\\\"cache_data_path\\\"] = cache_data_path\\n\",\n    
\"dm_args[\\\"featurization\\\"] = featurization_args\\n\",\n    \"dm_args[\\\"smiles_col\\\"] = \\\"SMILES\\\"\\n\",\n    \"dm_args[\\\"label_cols\\\"] = [\\\"SA\\\"]\\n\",\n    \"dm_args[\\\"split_val\\\"] = 0.2\\n\",\n    \"dm_args[\\\"split_test\\\"] = 0.2\\n\",\n    \"dm_args[\\\"split_seed\\\"] = 19\\n\",\n    \"dm_args[\\\"batch_size_training\\\"] = 16\\n\",\n    \"dm_args[\\\"batch_size_inference\\\"] = 16\\n\",\n    \"dm_args[\\\"num_workers\\\"] = 0\\n\",\n    \"dm_args[\\\"pin_memory\\\"] = True\\n\",\n    \"dm_args[\\\"featurization_n_jobs\\\"] = 16\\n\",\n    \"dm_args[\\\"featurization_progress\\\"] = True\\n\",\n    \"\\n\",\n    \"datam = graphium.data.DGLFromSmilesDataModule(**dm_args)\\n\",\n    \"# datam\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 7,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"2021-04-30 14:19:26.972 | INFO     | graphium.data.datamodule:_load_from_cache:460 - Try reloading the data module from /home/hadim/test-cache.pkl.\\n\",\n      \"2021-04-30 14:19:27.001 | INFO     | graphium.data.datamodule:_load_from_cache:485 - Cache featurizer arguments are different than the provided ones.\\n\",\n      \"2021-04-30 14:19:27.001 | INFO     | graphium.data.datamodule:_load_from_cache:486 - Cache featurizer arguments: {'atom_property_list_onehot': ['atomic-number', 'valence'], 'atom_property_list_float': ['mass', 'electronegativity', 'in-ring'], 'edge_property_list': ['bond-type-onehot', 'stereo'], 'add_self_loop': False, 'explicit_H': False, 'use_bonds_weights': False, 'pos_encoding_as_features': None, 'pos_encoding_as_directions': None, 'dtype': torch.float32}\\n\",\n      \"2021-04-30 14:19:27.002 | INFO     | graphium.data.datamodule:_load_from_cache:487 - Provided featurizer arguments: {'atom_property_list_onehot': ['atomic-number', 'valence'], 'atom_property_list_float': ['mass', 'electronegativity', 'in-ring'], 
'edge_property_list': ['bond-type-onehot', 'stereo', 'in-ring'], 'add_self_loop': False, 'explicit_H': False, 'use_bonds_weights': False, 'pos_encoding_as_features': None, 'pos_encoding_as_directions': None, 'dtype': torch.float32}.\\n\",\n      \"2021-04-30 14:19:27.002 | INFO     | graphium.data.datamodule:_load_from_cache:488 - Fallback to regular data preparation steps.\\n\",\n      \"2021-04-30 14:19:27.003 | INFO     | graphium.data.datamodule:prepare_data:313 - Prepare dataset with 100 data points.\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"00871e6c7c86454181162e37d837daf2\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"  0%|          | 0/100 [00:00<?, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"2021-04-30 14:19:27.099 | INFO     | graphium.data.datamodule:_save_to_cache:433 - Write prepared datamodule to /home/hadim/test-cache.pkl\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# Load and prepare the data\\n\",\n    \"datam.prepare_data()\\n\",\n    \"\\n\",\n    \"# Create the split torch datasets\\n\",\n    \"datam.setup()\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 27,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"2021-04-30 14:35:26.199 | INFO     | graphium.data.datamodule:_load_from_cache:460 - Try reloading the data module from /home/hadim/test-cache.pkl.\\n\",\n      \"2021-04-30 14:35:26.226 | INFO     | graphium.data.datamodule:_load_from_cache:485 - Cache featurizer arguments are different than the provided ones.\\n\",\n      \"2021-04-30 14:35:26.227 | INFO     | graphium.data.datamodule:_load_from_cache:486 - Cache featurizer 
arguments: {'atom_property_list_onehot': [], 'atom_property_list_float': ['mass', 'electronegativity', 'in-ring'], 'edge_property_list': ['bond-type-onehot', 'stereo', 'in-ring'], 'add_self_loop': False, 'explicit_H': False, 'use_bonds_weights': False, 'pos_encoding_as_features': None, 'pos_encoding_as_directions': None, 'dtype': torch.float32}\\n\",\n      \"2021-04-30 14:35:26.228 | INFO     | graphium.data.datamodule:_load_from_cache:487 - Provided featurizer arguments: {'atom_property_list_onehot': [], 'atom_property_list_float': ['mass', 'electronegativity'], 'edge_property_list': ['stereo', 'in-ring'], 'add_self_loop': False, 'explicit_H': False, 'use_bonds_weights': False, 'pos_encoding_as_features': None, 'pos_encoding_as_directions': None, 'dtype': torch.float32}.\\n\",\n      \"2021-04-30 14:35:26.228 | INFO     | graphium.data.datamodule:_load_from_cache:488 - Fallback to regular data preparation steps.\\n\",\n      \"2021-04-30 14:35:26.230 | INFO     | graphium.data.datamodule:prepare_data:313 - Prepare dataset with 100 data points.\\n\",\n      \"2021-04-30 14:35:30.104 | INFO     | graphium.data.datamodule:_save_to_cache:433 - Write prepared datamodule to /home/hadim/test-cache.pkl\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# Setup a temporary cache file. 
Only for\\n\",\n    \"# demo purposes, use a known path in prod.\\n\",\n    \"cache_data_path = pathlib.Path(tempfile.mkdtemp()) / \\\"cache.pkl\\\"\\n\",\n    \"cache_data_path = \\\"/home/hadim/test-cache.pkl\\\"\\n\",\n    \"\\n\",\n    \"# Load a dataframe\\n\",\n    \"df = graphium.data.load_tiny_zinc()\\n\",\n    \"df.head()\\n\",\n    \"\\n\",\n    \"# Setup the featurization\\n\",\n    \"featurization_args = {}\\n\",\n    \"featurization_args[\\\"atom_property_list_float\\\"] = [\\\"mass\\\", \\\"electronegativity\\\"]\\n\",\n    \"featurization_args[\\\"edge_property_list\\\"] = [\\\"stereo\\\", \\\"in-ring\\\"]\\n\",\n    \"\\n\",\n    \"# Config for datamodule\\n\",\n    \"dm_args = {}\\n\",\n    \"dm_args[\\\"df\\\"] = df\\n\",\n    \"dm_args[\\\"cache_data_path\\\"] = cache_data_path\\n\",\n    \"dm_args[\\\"featurization\\\"] = featurization_args\\n\",\n    \"\\n\",\n    \"datam = graphium.data.DGLFromSmilesDataModule(**dm_args)\\n\",\n    \"datam.prepare_data()\\n\",\n    \"datam.setup()\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 26,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"assert datam.num_node_feats == 3\\n\",\n    \"assert datam.num_edge_feats == 13\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 16,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"g = batch[\\\"features\\\"]\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 28,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/plain\": [\n       \"2\"\n      ]\n     },\n     \"execution_count\": 28,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"datam.num_node_feats\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 29,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/plain\": [\n       \"8\"\n      ]\n     },\n     \"execution_count\": 
29,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"datam.num_edge_feats\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": []\n  }\n ],\n \"metadata\": {\n  \"kernelspec\": {\n   \"display_name\": \"Python [conda env:graphium]\",\n   \"language\": \"python\",\n   \"name\": \"conda-env-graphium-py\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.8.8\"\n  },\n  \"widgets\": {\n   \"application/vnd.jupyter.widget-state+json\": {\n    \"state\": {\n     \"00871e6c7c86454181162e37d837daf2\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": \"HBoxModel\",\n      \"state\": {\n       \"children\": [\n        \"IPY_MODEL_37d9c98862ec47ed9d7fb861ca143b13\",\n        \"IPY_MODEL_89ab73566bed4fe390806729088f2c25\",\n        \"IPY_MODEL_c5f7310ae3dd4c9eb49ee170ba94737d\"\n       ],\n       \"layout\": \"IPY_MODEL_e3545fe78e9649d2831a7e7abb080c7d\"\n      }\n     },\n     \"0e81947ca6274213823944bae84a159d\": {\n      \"model_module\": \"@jupyter-widgets/base\",\n      \"model_module_version\": \"1.2.0\",\n      \"model_name\": \"LayoutModel\",\n      \"state\": {}\n     },\n     \"152a150013ac4b8183686b44cafb4d70\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": \"DescriptionStyleModel\",\n      \"state\": {\n       \"description_width\": \"\"\n      }\n     },\n     \"2bb452e722d7402c8411536653ef2607\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": 
\"DescriptionStyleModel\",\n      \"state\": {\n       \"description_width\": \"\"\n      }\n     },\n     \"2dc57ad3de9d49c38ff0c755becdfbe4\": {\n      \"model_module\": \"@jupyter-widgets/base\",\n      \"model_module_version\": \"1.2.0\",\n      \"model_name\": \"LayoutModel\",\n      \"state\": {}\n     },\n     \"361dd11acc754be1b7d1feb857b099b1\": {\n      \"model_module\": \"@jupyter-widgets/base\",\n      \"model_module_version\": \"1.2.0\",\n      \"model_name\": \"LayoutModel\",\n      \"state\": {}\n     },\n     \"37d9c98862ec47ed9d7fb861ca143b13\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": \"HTMLModel\",\n      \"state\": {\n       \"layout\": \"IPY_MODEL_3bacc61224d84ae1b6f8419670ed764a\",\n       \"style\": \"IPY_MODEL_152a150013ac4b8183686b44cafb4d70\",\n       \"value\": \"100%\"\n      }\n     },\n     \"3bacc61224d84ae1b6f8419670ed764a\": {\n      \"model_module\": \"@jupyter-widgets/base\",\n      \"model_module_version\": \"1.2.0\",\n      \"model_name\": \"LayoutModel\",\n      \"state\": {}\n     },\n     \"406707a37c634af3a04210a3a70717a5\": {\n      \"model_module\": \"@jupyter-widgets/base\",\n      \"model_module_version\": \"1.2.0\",\n      \"model_name\": \"LayoutModel\",\n      \"state\": {}\n     },\n     \"4cd44552f1c74407855cf630b0dda913\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": \"HTMLModel\",\n      \"state\": {\n       \"layout\": \"IPY_MODEL_2dc57ad3de9d49c38ff0c755becdfbe4\",\n       \"style\": \"IPY_MODEL_901bf951a68147ca973bcaa9b4e41e15\",\n       \"value\": \"100%\"\n      }\n     },\n     \"4f38767301c74ec99da43bb05c647cea\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": \"DescriptionStyleModel\",\n      \"state\": {\n       \"description_width\": \"\"\n      }\n     },\n     
\"62ffd0e5adf0467eaa8447d685bed061\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": \"HTMLModel\",\n      \"state\": {\n       \"layout\": \"IPY_MODEL_e4b0e4ebb9d1480f8c8e9a9e9eef6763\",\n       \"style\": \"IPY_MODEL_2bb452e722d7402c8411536653ef2607\",\n       \"value\": \" 100/100 [00:03&lt;00:00, 32.28it/s]\"\n      }\n     },\n     \"75bf0c6796ee4c6a89973add825f1607\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": \"ProgressStyleModel\",\n      \"state\": {\n       \"description_width\": \"\"\n      }\n     },\n     \"89ab73566bed4fe390806729088f2c25\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": \"FloatProgressModel\",\n      \"state\": {\n       \"bar_style\": \"success\",\n       \"layout\": \"IPY_MODEL_406707a37c634af3a04210a3a70717a5\",\n       \"style\": \"IPY_MODEL_cf416a8e58e9435ea059d869ab44fef4\",\n       \"value\": 100\n      }\n     },\n     \"901bf951a68147ca973bcaa9b4e41e15\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": \"DescriptionStyleModel\",\n      \"state\": {\n       \"description_width\": \"\"\n      }\n     },\n     \"90b1460eeee3435996df0cb5616e35de\": {\n      \"model_module\": \"@jupyter-widgets/base\",\n      \"model_module_version\": \"1.2.0\",\n      \"model_name\": \"LayoutModel\",\n      \"state\": {}\n     },\n     \"b2fb93d6e2734502b742860c78777bfe\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": \"FloatProgressModel\",\n      \"state\": {\n       \"bar_style\": \"success\",\n       \"layout\": \"IPY_MODEL_0e81947ca6274213823944bae84a159d\",\n       \"style\": \"IPY_MODEL_75bf0c6796ee4c6a89973add825f1607\",\n       \"value\": 100\n      
}\n     },\n     \"c5f7310ae3dd4c9eb49ee170ba94737d\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": \"HTMLModel\",\n      \"state\": {\n       \"layout\": \"IPY_MODEL_361dd11acc754be1b7d1feb857b099b1\",\n       \"style\": \"IPY_MODEL_4f38767301c74ec99da43bb05c647cea\",\n       \"value\": \" 100/100 [00:00&lt;00:00, 1083.51it/s]\"\n      }\n     },\n     \"cf416a8e58e9435ea059d869ab44fef4\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": \"ProgressStyleModel\",\n      \"state\": {\n       \"description_width\": \"\"\n      }\n     },\n     \"e3545fe78e9649d2831a7e7abb080c7d\": {\n      \"model_module\": \"@jupyter-widgets/base\",\n      \"model_module_version\": \"1.2.0\",\n      \"model_name\": \"LayoutModel\",\n      \"state\": {}\n     },\n     \"e4b0e4ebb9d1480f8c8e9a9e9eef6763\": {\n      \"model_module\": \"@jupyter-widgets/base\",\n      \"model_module_version\": \"1.2.0\",\n      \"model_name\": \"LayoutModel\",\n      \"state\": {}\n     },\n     \"ec0a8b9534304715b2bb07db46f676fd\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": \"HBoxModel\",\n      \"state\": {\n       \"children\": [\n        \"IPY_MODEL_4cd44552f1c74407855cf630b0dda913\",\n        \"IPY_MODEL_b2fb93d6e2734502b742860c78777bfe\",\n        \"IPY_MODEL_62ffd0e5adf0467eaa8447d685bed061\"\n       ],\n       \"layout\": \"IPY_MODEL_90b1460eeee3435996df0cb5616e35de\"\n      }\n     }\n    },\n    \"version_major\": 2,\n    \"version_minor\": 0\n   }\n  }\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 4\n}\n"
  },
  {
    "path": "notebooks/dev-datamodule-ogb.ipynb",
    "content": "{\n \"cells\": [\n  {\n   \"attachments\": {},\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"```yaml\\n\",\n    \"data:\\n\",\n    \"  module_type: \\\"GraphOGBDataModule\\\"\\n\",\n    \"  args:\\n\",\n    \"    cache_data_path: null\\n\",\n    \"  \\n\",\n    \"    dataset_name: \\\"ogbg-moltox21\\\"\\n\",\n    \"  \\n\",\n    \"    batch_size_training: 16\\n\",\n    \"    batch_size_inference: 16\\n\",\n    \"  \\n\",\n    \"    featurization:\\n\",\n    \"      atom_property_list_float: []\\n\",\n    \"      atom_property_list_onehot: [\\\"atomic-number\\\", \\\"degree\\\"]\\n\",\n    \"      edge_property_list: [\\\"ring\\\", \\\"bond-type-onehot\\\"]\\n\",\n    \"      add_self_loop: false\\n\",\n    \"      use_bonds_weights: false\\n\",\n    \"      explicit_H: false\\n\",\n    \"```\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 2,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"Using backend: pytorch\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"%load_ext autoreload\\n\",\n    \"%autoreload 2\\n\",\n    \"\\n\",\n    \"import pathlib\\n\",\n    \"import functools\\n\",\n    \"import tempfile\\n\",\n    \"import importlib\\n\",\n    \"\\n\",\n    \"import numpy as np\\n\",\n    \"import pandas as pd\\n\",\n    \"import lightning\\n\",\n    \"import torch\\n\",\n    \"import datamol as dm\\n\",\n    \"\\n\",\n    \"import graphium\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 3,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"2021-04-15 14:08:11.044 | INFO     | graphium.data.datamodule:_load_dataset:585 - Loading /home/hadim/.cache/graphium/ogb/freesolv/mapping/mol.csv.gz in memory.\\n\",\n      \"2021-04-15 14:08:11.053 | INFO     | 
graphium.data.datamodule:_load_dataset:598 - Saving splits to /home/hadim/.cache/graphium/ogb/freesolv/split/scaffold.csv.gz\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"text/plain\": [\n       \"dataset_name: ogbg-molfreesolv\\n\",\n       \"name: GraphOGBDataModule\\n\",\n       \"len: 642\\n\",\n       \"batch_size_training: 16\\n\",\n       \"batch_size_inference: 16\\n\",\n       \"num_node_feats: 50\\n\",\n       \"num_edge_feats: 5\\n\",\n       \"collate_fn: graphium_collate_fn\\n\",\n       \"featurization:\\n\",\n       \"  atom_property_list_float: []\\n\",\n       \"  atom_property_list_onehot:\\n\",\n       \"  - atomic-number\\n\",\n       \"  - degree\\n\",\n       \"  edge_property_list:\\n\",\n       \"  - bond-type-onehot\\n\",\n       \"  add_self_loop: false\\n\",\n       \"  use_bonds_weights: false\\n\",\n       \"  explicit_H: false\"\n      ]\n     },\n     \"execution_count\": 3,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"dataset_names = [\\\"ogbg-molhiv\\\", \\\"ogbg-molpcba\\\", \\\"ogbg-moltox21\\\", \\\"ogbg-molfreesolv\\\"]\\n\",\n    \"dataset_name = dataset_names[3]\\n\",\n    \"\\n\",\n    \"# Setup a temporary cache file. 
Only for\\n\",\n    \"# demo purposes, use a known path in prod.\\n\",\n    \"cache_data_path = pathlib.Path(tempfile.mkdtemp()) / \\\"cache.pkl\\\"\\n\",\n    \"\\n\",\n    \"# Setup the featurization\\n\",\n    \"featurization_args = {}\\n\",\n    \"featurization_args[\\\"atom_property_list_float\\\"] = []  # [\\\"weight\\\", \\\"valence\\\"]\\n\",\n    \"featurization_args[\\\"atom_property_list_onehot\\\"] = [\\\"atomic-number\\\", \\\"degree\\\"]\\n\",\n    \"featurization_args[\\\"edge_property_list\\\"] = [\\\"bond-type-onehot\\\"]\\n\",\n    \"featurization_args[\\\"add_self_loop\\\"] = False\\n\",\n    \"featurization_args[\\\"use_bonds_weights\\\"] = False\\n\",\n    \"featurization_args[\\\"explicit_H\\\"] = False\\n\",\n    \"\\n\",\n    \"# Config for datamodule\\n\",\n    \"dm_args = {}\\n\",\n    \"dm_args[\\\"dataset_name\\\"] = dataset_name\\n\",\n    \"dm_args[\\\"cache_data_path\\\"] = cache_data_path\\n\",\n    \"dm_args[\\\"featurization\\\"] = featurization_args\\n\",\n    \"dm_args[\\\"batch_size_training\\\"] = 16\\n\",\n    \"dm_args[\\\"batch_size_inference\\\"] = 16\\n\",\n    \"dm_args[\\\"num_workers\\\"] = 0\\n\",\n    \"dm_args[\\\"pin_memory\\\"] = True\\n\",\n    \"dm_args[\\\"featurization_n_jobs\\\"] = 16\\n\",\n    \"dm_args[\\\"featurization_progress\\\"] = True\\n\",\n    \"\\n\",\n    \"ds = graphium.data.GraphOGBDataModule(**dm_args)\\n\",\n    \"ds\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 4,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/plain\": [\n       \"{'num tasks': '1',\\n\",\n       \" 'eval metric': 'rmse',\\n\",\n       \" 'download_name': 'freesolv',\\n\",\n       \" 'version': '1',\\n\",\n       \" 'url': 'http://snap.stanford.edu/ogb/data/graphproppred/csv_mol_download/freesolv.zip',\\n\",\n       \" 'add_inverse_edge': 'True',\\n\",\n       \" 'data type': 'mol',\\n\",\n       \" 'has_node_attr': 'True',\\n\",\n       \" 'has_edge_attr': 
'True',\\n\",\n       \" 'task type': 'regression',\\n\",\n       \" 'num classes': '-1',\\n\",\n       \" 'split': 'scaffold',\\n\",\n       \" 'additional node files': 'None',\\n\",\n       \" 'additional edge files': 'None',\\n\",\n       \" 'binary': 'False'}\"\n      ]\n     },\n     \"execution_count\": 4,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"# Access to the OGB metadata with\\n\",\n    \"ds.metadata\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 5,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"2021-04-15 14:08:12.006 | INFO     | graphium.data.datamodule:prepare_data:291 - Prepare dataset with 642 data points.\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"f6fe0d8a6ba34cb58d598b57dd10eee3\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"  0%|          | 0/642 [00:00<?, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"2021-04-15 14:08:14.918 | INFO     | graphium.data.datamodule:prepare_data:326 - Write prepared data to /tmp/tmppuh1m6te/cache.pkl\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# Load and prepare the data\\n\",\n    \"ds.prepare_data()\\n\",\n    \"\\n\",\n    \"# Create the split torch datasets\\n\",\n    \"ds.setup()\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 6,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/plain\": [\n       \"{'smiles': 'CN(C)C(=O)c1ccc(cc1)OC',\\n\",\n       \" 'indices': '4-methoxy-N,N-dimethyl-benzamide',\\n\",\n       \" 'features': Graph(num_nodes=13, num_edges=26,\\n\",\n       \"       
ndata_schemes={'feat': Scheme(shape=(50,), dtype=torch.float32)}\\n\",\n       \"       edata_schemes={'feat': Scheme(shape=(5,), dtype=torch.float32)}),\\n\",\n       \" 'labels': array([-11.01])}\"\n      ]\n     },\n     \"execution_count\": 6,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"ds.train_ds[0]\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 7,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/plain\": [\n       \"{'smiles': ['CCCCO[N+](=O)[O-]',\\n\",\n       \"  'CC(=O)OC',\\n\",\n       \"  'CC(=O)Oc1ccccc1C(=O)O',\\n\",\n       \"  'CCl',\\n\",\n       \"  'CC(C)(C)c1ccc(cc1)O',\\n\",\n       \"  'C(CBr)Br',\\n\",\n       \"  'c1ccc(cc1)C(=O)N',\\n\",\n       \"  'CCCCC[N+](=O)[O-]',\\n\",\n       \"  'CCCCBr',\\n\",\n       \"  'c1cc(c(cc1c2ccc(cc2F)F)C(=O)O)O',\\n\",\n       \"  'c1ccc(cc1)C=O',\\n\",\n       \"  'CCCc1ccc(c(c1)OC)O',\\n\",\n       \"  'CC[C@@H](C)CO',\\n\",\n       \"  'CCOc1ccccc1',\\n\",\n       \"  'c1c(c(cc(c1Cl)Cl)Cl)Cl',\\n\",\n       \"  'C(CO[N+](=O)[O-])CO[N+](=O)[O-]'],\\n\",\n       \" 'indices': ['butyl nitrate',\\n\",\n       \"  'methyl acetate',\\n\",\n       \"  'acetylsalicylic acid',\\n\",\n       \"  'chloromethane',\\n\",\n       \"  '4-tert-butylphenol',\\n\",\n       \"  '1,2-dibromoethane',\\n\",\n       \"  'benzamide',\\n\",\n       \"  '1-nitropentane',\\n\",\n       \"  '1-bromobutane',\\n\",\n       \"  'diflunisal',\\n\",\n       \"  'benzaldehyde',\\n\",\n       \"  '4-propylguaiacol',\\n\",\n       \"  '2-methylbutan-1-ol',\\n\",\n       \"  'ethoxybenzene',\\n\",\n       \"  '1,2,4,5-tetrachlorobenzene',\\n\",\n       \"  '3-nitrooxypropyl nitrate'],\\n\",\n       \" 'features': Graph(num_nodes=139, num_edges=264,\\n\",\n       \"       ndata_schemes={'feat': Scheme(shape=(50,), dtype=torch.float32)}\\n\",\n       \"       edata_schemes={'feat': Scheme(shape=(5,), 
dtype=torch.float32)}),\\n\",\n       \" 'labels': tensor([[ -2.0900],\\n\",\n       \"         [ -3.1300],\\n\",\n       \"         [ -9.9400],\\n\",\n       \"         [ -0.5500],\\n\",\n       \"         [ -5.9100],\\n\",\n       \"         [ -2.3300],\\n\",\n       \"         [-11.0000],\\n\",\n       \"         [ -2.8200],\\n\",\n       \"         [ -0.4000],\\n\",\n       \"         [ -9.4000],\\n\",\n       \"         [ -4.0200],\\n\",\n       \"         [ -5.2600],\\n\",\n       \"         [ -4.4200],\\n\",\n       \"         [ -2.2200],\\n\",\n       \"         [ -1.3400],\\n\",\n       \"         [ -4.8000]], dtype=torch.float64)}\"\n      ]\n     },\n     \"execution_count\": 7,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"# Load a dataloader and get the first batch from it\\n\",\n    \"dl = ds.train_dataloader()\\n\",\n    \"it = iter(dl)\\n\",\n    \"batch = next(it)\\n\",\n    \"batch\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": []\n  }\n ],\n \"metadata\": {\n  \"kernelspec\": {\n   \"display_name\": \"Python [conda env:graphium]\",\n   \"language\": \"python\",\n   \"name\": \"conda-env-graphium-py\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.8.6\"\n  },\n  \"widgets\": {\n   \"application/vnd.jupyter.widget-state+json\": {\n    \"state\": {\n     \"2d051530f9634105b12d36afc5d11654\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": \"ProgressStyleModel\",\n      \"state\": {\n       \"description_width\": \"\"\n      }\n     },\n     
\"32e5f96ac5fd4c16b62e4368fb992809\": {\n      \"model_module\": \"@jupyter-widgets/base\",\n      \"model_module_version\": \"1.2.0\",\n      \"model_name\": \"LayoutModel\",\n      \"state\": {}\n     },\n     \"610f08033bcf44398b601e0d7e713711\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": \"HTMLModel\",\n      \"state\": {\n       \"layout\": \"IPY_MODEL_c24b1f15dd104d289e1aeab380edab1c\",\n       \"style\": \"IPY_MODEL_f65d6dd792de4a4c897faeb334c437da\",\n       \"value\": \" 642/642 [00:02&lt;00:00, 845.57it/s]\"\n      }\n     },\n     \"6b74cba4cbce45f287badc40b74a86f8\": {\n      \"model_module\": \"@jupyter-widgets/base\",\n      \"model_module_version\": \"1.2.0\",\n      \"model_name\": \"LayoutModel\",\n      \"state\": {}\n     },\n     \"8c764c51e7ea4d50805261a23d038659\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": \"FloatProgressModel\",\n      \"state\": {\n       \"bar_style\": \"success\",\n       \"layout\": \"IPY_MODEL_e1dbad6e70ae4c7ba9c7b766bc07723a\",\n       \"max\": 642,\n       \"style\": \"IPY_MODEL_2d051530f9634105b12d36afc5d11654\",\n       \"value\": 642\n      }\n     },\n     \"a782964c18c442b78224af256bc97c8f\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": \"HTMLModel\",\n      \"state\": {\n       \"layout\": \"IPY_MODEL_6b74cba4cbce45f287badc40b74a86f8\",\n       \"style\": \"IPY_MODEL_bb9a948a917f4dc9bb2c5801d69bf912\",\n       \"value\": \"100%\"\n      }\n     },\n     \"bb9a948a917f4dc9bb2c5801d69bf912\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": \"DescriptionStyleModel\",\n      \"state\": {\n       \"description_width\": \"\"\n      }\n     },\n     \"c24b1f15dd104d289e1aeab380edab1c\": {\n      
\"model_module\": \"@jupyter-widgets/base\",\n      \"model_module_version\": \"1.2.0\",\n      \"model_name\": \"LayoutModel\",\n      \"state\": {}\n     },\n     \"e1dbad6e70ae4c7ba9c7b766bc07723a\": {\n      \"model_module\": \"@jupyter-widgets/base\",\n      \"model_module_version\": \"1.2.0\",\n      \"model_name\": \"LayoutModel\",\n      \"state\": {}\n     },\n     \"f65d6dd792de4a4c897faeb334c437da\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": \"DescriptionStyleModel\",\n      \"state\": {\n       \"description_width\": \"\"\n      }\n     },\n     \"f6fe0d8a6ba34cb58d598b57dd10eee3\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": \"HBoxModel\",\n      \"state\": {\n       \"children\": [\n        \"IPY_MODEL_a782964c18c442b78224af256bc97c8f\",\n        \"IPY_MODEL_8c764c51e7ea4d50805261a23d038659\",\n        \"IPY_MODEL_610f08033bcf44398b601e0d7e713711\"\n       ],\n       \"layout\": \"IPY_MODEL_32e5f96ac5fd4c16b62e4368fb992809\"\n      }\n     }\n    },\n    \"version_major\": 2,\n    \"version_minor\": 0\n   }\n  }\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 4\n}\n"
  },
  {
    "path": "notebooks/dev-datamodule.ipynb",
    "content": "{\n \"cells\": [\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 1,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"Using backend: pytorch\\n\",\n      \"/home/hadim/local/conda/envs/graphium/lib/python3.8/site-packages/ray/autoscaler/_private/cli_logger.py:57: FutureWarning: Not all Ray CLI dependencies were found. In Ray 1.4+, the Ray CLI, autoscaler, and dashboard will only be usable via `pip install 'ray[default]'`. Please update your install command.\\n\",\n      \"  warnings.warn(\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"%load_ext autoreload\\n\",\n    \"%autoreload 2\\n\",\n    \"\\n\",\n    \"import pathlib\\n\",\n    \"import functools\\n\",\n    \"import tempfile\\n\",\n    \"\\n\",\n    \"import numpy as np\\n\",\n    \"import lightning\\n\",\n    \"import torch\\n\",\n    \"import datamol as dm\\n\",\n    \"\\n\",\n    \"import graphium\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 6,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# Setup a temporary cache file. 
Only for\\n\",\n    \"# demo purposes, use a known path in prod.\\n\",\n    \"cache_data_path = pathlib.Path(tempfile.mkdtemp()) / \\\"cache.pkl\\\"\\n\",\n    \"cache_data_path = \\\"/home/hadim/test-cache.pkl\\\"\\n\",\n    \"\\n\",\n    \"# Load a dataframe\\n\",\n    \"df = graphium.data.load_tiny_zinc()\\n\",\n    \"df.head()\\n\",\n    \"\\n\",\n    \"# Setup the featurization\\n\",\n    \"featurization_args = {}\\n\",\n    \"featurization_args[\\\"atom_property_list_onehot\\\"] = [\\\"atomic-number\\\", \\\"valence\\\"]\\n\",\n    \"featurization_args[\\\"atom_property_list_float\\\"] = [\\\"mass\\\", \\\"electronegativity\\\", \\\"in-ring\\\"]\\n\",\n    \"featurization_args[\\\"edge_property_list\\\"] = [\\\"bond-type-onehot\\\", \\\"stereo\\\", \\\"in-ring\\\"]\\n\",\n    \"featurization_args[\\\"add_self_loop\\\"] = False\\n\",\n    \"featurization_args[\\\"use_bonds_weights\\\"] = False\\n\",\n    \"featurization_args[\\\"explicit_H\\\"] = False\\n\",\n    \"\\n\",\n    \"# Config for datamodule\\n\",\n    \"dm_args = {}\\n\",\n    \"dm_args[\\\"df\\\"] = df\\n\",\n    \"dm_args[\\\"cache_data_path\\\"] = cache_data_path\\n\",\n    \"dm_args[\\\"featurization\\\"] = featurization_args\\n\",\n    \"dm_args[\\\"smiles_col\\\"] = \\\"SMILES\\\"\\n\",\n    \"dm_args[\\\"label_cols\\\"] = [\\\"SA\\\"]\\n\",\n    \"dm_args[\\\"split_val\\\"] = 0.2\\n\",\n    \"dm_args[\\\"split_test\\\"] = 0.2\\n\",\n    \"dm_args[\\\"split_seed\\\"] = 19\\n\",\n    \"dm_args[\\\"batch_size_training\\\"] = 16\\n\",\n    \"dm_args[\\\"batch_size_inference\\\"] = 16\\n\",\n    \"dm_args[\\\"num_workers\\\"] = 0\\n\",\n    \"dm_args[\\\"pin_memory\\\"] = True\\n\",\n    \"dm_args[\\\"featurization_n_jobs\\\"] = 16\\n\",\n    \"dm_args[\\\"featurization_progress\\\"] = True\\n\",\n    \"\\n\",\n    \"datam = graphium.data.DGLFromSmilesDataModule(**dm_args)\\n\",\n    \"# datam\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 7,\n   \"metadata\": {},\n   
\"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"2021-04-30 12:18:14.443 | INFO     | graphium.data.datamodule:prepare_data:326 - Prepare dataset with 100 data points.\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"44fb42081f30493b83381737c36b749b\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"  0%|          | 0/100 [00:00<?, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"2021-04-30 12:18:14.538 | INFO     | graphium.data.datamodule:prepare_data:364 - Write prepared data to /home/hadim/test-cache.pkl\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# Load and prepare the data\\n\",\n    \"datam.prepare_data()\\n\",\n    \"\\n\",\n    \"# Create the split torch datasets\\n\",\n    \"datam.setup()\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 7,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/plain\": [\n       \"{'smiles': ['c1cc2c(cc1N[C@@H]1CCOC3(CCC3)C1)CCC2',\\n\",\n       \"  'CC(=O)N1CC[C@@H]([NH2+][C@@H](C)CSCC(C)C)C1',\\n\",\n       \"  'Cc1ccc(-c2nc3ccc(C)c(C)c3[nH]2)nc1',\\n\",\n       \"  'CCOC[C@H](O)[C@](C)(CC)[NH+]1CCCC1',\\n\",\n       \"  'Cc1cc(CC(=O)N[C@H](c2ccc(F)cc2)C2CCC2)no1',\\n\",\n       \"  'CCc1ccc(C(=O)/C(=C(/S)NC2CC2)[n+]2ccc(CC)cc2)cc1',\\n\",\n       \"  'Cc1ccc(NC(=O)c2ccc(F)cc2F)cc1S(=O)(=O)Nc1ccc(Cl)cc1',\\n\",\n       \"  'CC#CCCC(=O)Nc1cccc2c1C(=O)c1ccccc1C2=O',\\n\",\n       \"  'CCOc1cc(/C=C(\\\\\\\\C#N)C(=O)c2c[nH]c3cc(Cl)ccc23)ccc1OC',\\n\",\n       \"  'CNC(=O)[C@@H]1CCC[NH+]1Cc1ccc(C)c(F)c1',\\n\",\n       \"  'CC(C)(C)OC(=O)N[C@H]1CCN(c2cc(-c3cccs3)n[nH]2)C1',\\n\",\n       \"  
'Cc1nn(-c2ccccc2)c(O)c1/C=[NH+]/Cc1ccncc1',\\n\",\n       \"  'COCC[NH+](C)Cc1c(C)cc(C)c(C(C)=O)c1C',\\n\",\n       \"  'Cc1nc(C)c(S(=O)(=O)/N=C(\\\\\\\\[O-])C[C@H]2CCCO2)s1',\\n\",\n       \"  'COc1cccc(CN2CCC[NH+](CC(=O)Nc3ccc(F)cc3)S2(=O)=O)c1',\\n\",\n       \"  'COc1cc(F)cc(CNC(=O)[C@H]2CCCN2C(=O)Cc2ccccc2)c1'],\\n\",\n       \" 'features': Graph(num_nodes=352, num_edges=754,\\n\",\n       \"       ndata_schemes={'feat': Scheme(shape=(50,), dtype=torch.float32)}\\n\",\n       \"       edata_schemes={'feat': Scheme(shape=(6,), dtype=torch.float32)}),\\n\",\n       \" 'labels': tensor([[3.3631],\\n\",\n       \"         [4.4837],\\n\",\n       \"         [2.3425],\\n\",\n       \"         [5.1279],\\n\",\n       \"         [2.6651],\\n\",\n       \"         [2.7752],\\n\",\n       \"         [1.9474],\\n\",\n       \"         [2.2990],\\n\",\n       \"         [2.3508],\\n\",\n       \"         [4.3913],\\n\",\n       \"         [3.2280],\\n\",\n       \"         [3.1176],\\n\",\n       \"         [3.9551],\\n\",\n       \"         [3.8130],\\n\",\n       \"         [3.5463],\\n\",\n       \"         [2.4272]], dtype=torch.float64)}\"\n      ]\n     },\n     \"execution_count\": 7,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"# Load a dataloader and get the first batch from it\\n\",\n    \"dl = datam.train_dataloader()\\n\",\n    \"it = iter(dl)\\n\",\n    \"batch = next(it)\\n\",\n    \"batch\"\n   ]\n  },\n  {\n   \"attachments\": {},\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"---\\n\",\n    \"\\n\",\n    \"## Launch a training\\n\",\n    \"\\n\",\n    \"In `graphium.cli.train` I have added a `click` CLI command that takes a config file as input and builds the datamodule. 
Once I am done with the PL module, I will complete the command.\\n\",\n    \"\\n\",\n    \"The way to use it is quite simple: install graphium with `pip install -e .` (omit `-e` in prod) and then:\\n\",\n    \"\\n\",\n    \"```bash\\n\",\n    \"graphium train -c my_config.yaml\\n\",\n    \"```\\n\",\n    \"\\n\",\n    \"It's not here for now, but usually I \\\"augment\\\" the CLI command with various config keys you might want to set without having to modify the config file itself:\\n\",\n    \"\\n\",\n    \"```bash\\n\",\n    \"graphium train -c my_config.yaml --training-path /home/hadim/data/graphium/runs/exp_1\\n\",\n    \"```\\n\",\n    \"\\n\",\n    \"Later, the same strategy could be used to launch a hyperparameter tuning run (with a config file as above plus a config file defining the search space).\\n\",\n    \"\\n\",\n    \"```bash\\n\",\n    \"graphium tune -c my_config.yaml --tune-config tuning_space.yaml\\n\",\n    \"```\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": []\n  }\n ],\n \"metadata\": {\n  \"kernelspec\": {\n   \"display_name\": \"Python [conda env:graphium]\",\n   \"language\": \"python\",\n   \"name\": \"conda-env-graphium-py\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.8.8\"\n  },\n  \"widgets\": {\n   \"application/vnd.jupyter.widget-state+json\": {\n    \"state\": {\n     \"0030b0ce3c8449ef85fa60d666d1b0f2\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": \"HBoxModel\",\n      \"state\": {\n       \"children\": [\n        \"IPY_MODEL_ddbdfc31d8d5490fb76222ebc9a24acd\",\n        
\"IPY_MODEL_6e4d111760234fbdb4917b08658d3051\",\n        \"IPY_MODEL_e46ddaffeb8e49a7bc5567c88909e13d\"\n       ],\n       \"layout\": \"IPY_MODEL_eba0f18e613146399666443cbc2c076b\"\n      }\n     },\n     \"018709257e0e4e70acb33fa130116082\": {\n      \"model_module\": \"@jupyter-widgets/base\",\n      \"model_module_version\": \"1.2.0\",\n      \"model_name\": \"LayoutModel\",\n      \"state\": {}\n     },\n     \"0610cb6d623d4803bf0f9f815fab0920\": {\n      \"model_module\": \"@jupyter-widgets/base\",\n      \"model_module_version\": \"1.2.0\",\n      \"model_name\": \"LayoutModel\",\n      \"state\": {}\n     },\n     \"0d0ffe3e74d644d1925fcaedddf0ece3\": {\n      \"model_module\": \"@jupyter-widgets/base\",\n      \"model_module_version\": \"1.2.0\",\n      \"model_name\": \"LayoutModel\",\n      \"state\": {}\n     },\n     \"131843a571704dee959fc517d8c34418\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": \"DescriptionStyleModel\",\n      \"state\": {\n       \"description_width\": \"\"\n      }\n     },\n     \"3d49c4343fb94ad2a21e3a7f7a46b618\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": \"ProgressStyleModel\",\n      \"state\": {\n       \"description_width\": \"\"\n      }\n     },\n     \"44fb42081f30493b83381737c36b749b\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": \"HBoxModel\",\n      \"state\": {\n       \"children\": [\n        \"IPY_MODEL_612f570826554086b613aea51d44f859\",\n        \"IPY_MODEL_ec5b76e0bc2e48eb8c065e97fe0f3202\",\n        \"IPY_MODEL_58ac0b8fac834dc7b09621fd42f16a7e\"\n       ],\n       \"layout\": \"IPY_MODEL_018709257e0e4e70acb33fa130116082\"\n      }\n     },\n     \"5400f0683b3c4a3087f1e2f01ff7bb1e\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      
\"model_module_version\": \"1.5.0\",\n      \"model_name\": \"DescriptionStyleModel\",\n      \"state\": {\n       \"description_width\": \"\"\n      }\n     },\n     \"58ac0b8fac834dc7b09621fd42f16a7e\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": \"HTMLModel\",\n      \"state\": {\n       \"layout\": \"IPY_MODEL_0610cb6d623d4803bf0f9f815fab0920\",\n       \"style\": \"IPY_MODEL_f8a5df3c91a24f8d93f85ebf9d0238a6\",\n       \"value\": \" 100/100 [00:00&lt;00:00, 1098.73it/s]\"\n      }\n     },\n     \"612f570826554086b613aea51d44f859\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": \"HTMLModel\",\n      \"state\": {\n       \"layout\": \"IPY_MODEL_69a0f2f7d9e44b00a36397d27403ffee\",\n       \"style\": \"IPY_MODEL_131843a571704dee959fc517d8c34418\",\n       \"value\": \"100%\"\n      }\n     },\n     \"69a0f2f7d9e44b00a36397d27403ffee\": {\n      \"model_module\": \"@jupyter-widgets/base\",\n      \"model_module_version\": \"1.2.0\",\n      \"model_name\": \"LayoutModel\",\n      \"state\": {}\n     },\n     \"6e4d111760234fbdb4917b08658d3051\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": \"FloatProgressModel\",\n      \"state\": {\n       \"bar_style\": \"success\",\n       \"layout\": \"IPY_MODEL_a478bfea4d2341c58800220f4d103ccf\",\n       \"style\": \"IPY_MODEL_3d49c4343fb94ad2a21e3a7f7a46b618\",\n       \"value\": 100\n      }\n     },\n     \"8c2f56fda6ba4a178adb1972b14b798b\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": \"DescriptionStyleModel\",\n      \"state\": {\n       \"description_width\": \"\"\n      }\n     },\n     \"a478bfea4d2341c58800220f4d103ccf\": {\n      \"model_module\": \"@jupyter-widgets/base\",\n      
\"model_module_version\": \"1.2.0\",\n      \"model_name\": \"LayoutModel\",\n      \"state\": {}\n     },\n     \"c4b8f1c9733c48dca039c687273f0d08\": {\n      \"model_module\": \"@jupyter-widgets/base\",\n      \"model_module_version\": \"1.2.0\",\n      \"model_name\": \"LayoutModel\",\n      \"state\": {}\n     },\n     \"ddb4d8cfeec84be2967e79056b2a505e\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": \"ProgressStyleModel\",\n      \"state\": {\n       \"description_width\": \"\"\n      }\n     },\n     \"ddbdfc31d8d5490fb76222ebc9a24acd\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": \"HTMLModel\",\n      \"state\": {\n       \"layout\": \"IPY_MODEL_c4b8f1c9733c48dca039c687273f0d08\",\n       \"style\": \"IPY_MODEL_5400f0683b3c4a3087f1e2f01ff7bb1e\",\n       \"value\": \"100%\"\n      }\n     },\n     \"e163ebc8b9fb485aa3dcb6d1e12d15b9\": {\n      \"model_module\": \"@jupyter-widgets/base\",\n      \"model_module_version\": \"1.2.0\",\n      \"model_name\": \"LayoutModel\",\n      \"state\": {}\n     },\n     \"e46ddaffeb8e49a7bc5567c88909e13d\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": \"HTMLModel\",\n      \"state\": {\n       \"layout\": \"IPY_MODEL_e163ebc8b9fb485aa3dcb6d1e12d15b9\",\n       \"style\": \"IPY_MODEL_8c2f56fda6ba4a178adb1972b14b798b\",\n       \"value\": \" 100/100 [00:03&lt;00:00, 21.48it/s]\"\n      }\n     },\n     \"eba0f18e613146399666443cbc2c076b\": {\n      \"model_module\": \"@jupyter-widgets/base\",\n      \"model_module_version\": \"1.2.0\",\n      \"model_name\": \"LayoutModel\",\n      \"state\": {}\n     },\n     \"ec5b76e0bc2e48eb8c065e97fe0f3202\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": 
\"FloatProgressModel\",\n      \"state\": {\n       \"bar_style\": \"success\",\n       \"layout\": \"IPY_MODEL_0d0ffe3e74d644d1925fcaedddf0ece3\",\n       \"style\": \"IPY_MODEL_ddb4d8cfeec84be2967e79056b2a505e\",\n       \"value\": 100\n      }\n     },\n     \"f8a5df3c91a24f8d93f85ebf9d0238a6\": {\n      \"model_module\": \"@jupyter-widgets/controls\",\n      \"model_module_version\": \"1.5.0\",\n      \"model_name\": \"DescriptionStyleModel\",\n      \"state\": {\n       \"description_width\": \"\"\n      }\n     }\n    },\n    \"version_major\": 2,\n    \"version_minor\": 0\n   }\n  }\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 4\n}\n"
  },
  {
    "path": "notebooks/dev-pretrained.ipynb",
    "content": "{\n \"cells\": [\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 2,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"Using backend: pytorch\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"%load_ext autoreload\\n\",\n    \"%autoreload 2\\n\",\n    \"\\n\",\n    \"import pathlib\\n\",\n    \"import functools\\n\",\n    \"import tempfile\\n\",\n    \"import yaml\\n\",\n    \"\\n\",\n    \"from loguru import logger\\n\",\n    \"\\n\",\n    \"import numpy as np\\n\",\n    \"import lightning\\n\",\n    \"import torch\\n\",\n    \"import datamol as dm\\n\",\n    \"import pandas as pd\\n\",\n    \"\\n\",\n    \"import graphium\"\n   ]\n  },\n  {\n   \"attachments\": {},\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Training\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 2,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# Load a config\\n\",\n    \"with open(\\\"../expts/config_micro_ZINC.yaml\\\", \\\"r\\\") as file:\\n\",\n    \"    yaml_config = yaml.load(file, Loader=yaml.FullLoader)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 3,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"training_dir = \\\"/home/hadim/Drive/Data/graphium/test-training\\\"\\n\",\n    \"\\n\",\n    \"# Tweak config and paths\\n\",\n    \"yaml_config[\\\"datamodule\\\"][\\\"args\\\"][\\\"df_path\\\"] = \\\"../graphium/data/micro_ZINC/micro_ZINC.csv\\\"\\n\",\n    \"yaml_config[\\\"datamodule\\\"][\\\"args\\\"][\\\"cache_data_path\\\"] = None\\n\",\n    \"\\n\",\n    \"yaml_config[\\\"trainer\\\"][\\\"trainer\\\"][\\\"min_epochs\\\"] = 1\\n\",\n    \"yaml_config[\\\"trainer\\\"][\\\"trainer\\\"][\\\"max_epochs\\\"] = 1\\n\",\n    \"\\n\",\n    \"yaml_config[\\\"trainer\\\"][\\\"logger\\\"][\\\"save_dir\\\"] = training_dir\\n\",\n    
\"yaml_config[\\\"trainer\\\"][\\\"model_checkpoint\\\"][\\\"dirpath\\\"] = None\\n\",\n    \"yaml_config[\\\"trainer\\\"][\\\"trainer\\\"][\\\"default_root_dir\\\"] = training_dir\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 4,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/home/hadim/Drive/Documents/valence/Platform/Libs/graphium/graphium/features/spectral.py:43: ComplexWarning: Casting complex values to real discards the imaginary part\\n\",\n      \"  eigvecs[comp, :] = this_eigvecs\\n\",\n      \"/home/hadim/Drive/Documents/valence/Platform/Libs/graphium/graphium/features/spectral.py:44: ComplexWarning: Casting complex values to real discards the imaginary part\\n\",\n      \"  eigvals_tile[comp, :] = this_eigvals\\n\",\n      \"2021-06-08 13:34:09.135 | WARNING  | graphium.config._loader:load_trainer:124 - Number of GPUs selected is `1`, but will be ignored since no GPU are available on this device\\n\",\n      \"GPU available: False, used: False\\n\",\n      \"TPU available: False, using: 0 TPU cores\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# Load datamodule\\n\",\n    \"datamodule = graphium.config.load_datamodule(yaml_config)\\n\",\n    \"\\n\",\n    \"# Initialize the network\\n\",\n    \"model_class, model_kwargs = graphium.config.load_architecture(\\n\",\n    \"    yaml_config,\\n\",\n    \"    in_dim_nodes=datamodule.num_node_feats_with_positional_encoding,\\n\",\n    \"    in_dim_edges=datamodule.num_edge_feats,\\n\",\n    \")\\n\",\n    \"\\n\",\n    \"# Init trainer\\n\",\n    \"metrics = graphium.config.load_metrics(yaml_config)\\n\",\n    \"predictor = graphium.config.load_predictor(yaml_config, model_class, model_kwargs, metrics)\\n\",\n    \"trainer = graphium.config.load_trainer(yaml_config)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 5,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n  
   \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"2021-06-08 13:34:10.118 | INFO     | graphium.data.datamodule:prepare_data:347 - Prepare dataset with 1000 data points.\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"9fc35f5ef68542048dd5e89b857130bd\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"  0%|          | 0/1000 [00:00<?, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/home/hadim/local/conda/envs/graphium/lib/python3.9/site-packages/pytorch_lightning/utilities/distributed.py:69: RuntimeWarning: Found unsupported keys in the lr scheduler dict: ['mode']\\n\",\n      \"  warnings.warn(*args, **kwargs)\\n\",\n      \"\\n\",\n      \"  | Name     | Type           | Params\\n\",\n      \"--------------------------------------------\\n\",\n      \"0 | model    | FullDGLNetwork | 74.4 K\\n\",\n      \"1 | loss_fun | MSELoss        | 0     \\n\",\n      \"--------------------------------------------\\n\",\n      \"74.4 K    Trainable params\\n\",\n      \"0         Non-trainable params\\n\",\n      \"74.4 K    Total params\\n\",\n      \"0.298     Total estimated model params size (MB)\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Validation sanity check: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      
\"/home/hadim/local/conda/envs/graphium/lib/python3.9/site-packages/pytorch_lightning/utilities/distributed.py:69: UserWarning: The dataloader, val dataloader 0, does not have many workers which may be a bottleneck. Consider increasing the value of the `num_workers` argument` (try 16 which is the number of cpus on this machine) in the `DataLoader` init to improve performance.\\n\",\n      \"  warnings.warn(*args, **kwargs)\\n\",\n      \"/home/hadim/local/conda/envs/graphium/lib/python3.9/site-packages/dgl/base.py:45: DGLWarning: DGLGraph.in_degree is deprecated. Please use DGLGraph.in_degrees\\n\",\n      \"  return warnings.warn(message, category=category, stacklevel=1)\\n\",\n      \"/home/hadim/local/conda/envs/graphium/lib/python3.9/site-packages/pytorch_lightning/utilities/distributed.py:69: UserWarning: The dataloader, train dataloader, does not have many workers which may be a bottleneck. Consider increasing the value of the `num_workers` argument` (try 16 which is the number of cpus on this machine) in the `DataLoader` init to improve performance.\\n\",\n      \"  warnings.warn(*args, **kwargs)\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"8aca5a77af224bdbb7478a4021fd83da\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Training: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Validating: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    }\n   ],\n   \"source\": [\n    \"# Training\\n\",\n    \"trainer.fit(model=predictor, datamodule=datamodule)\"\n   ]\n  },\n  {\n   \"attachments\": {},\n   
\"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Load pretrained (hard coded path)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 3,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# Load a config\\n\",\n    \"with open(\\\"../expts/config_micro_ZINC.yaml\\\", \\\"r\\\") as file:\\n\",\n    \"    yaml_config = yaml.load(file, Loader=yaml.FullLoader)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 4,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"training_dir = \\\"/home/hadim/Drive/Data/graphium/test-training\\\"\\n\",\n    \"\\n\",\n    \"# Tweak config and paths\\n\",\n    \"yaml_config[\\\"datamodule\\\"][\\\"args\\\"][\\\"df_path\\\"] = \\\"../graphium/data/micro_ZINC/micro_ZINC.csv\\\"\\n\",\n    \"yaml_config[\\\"datamodule\\\"][\\\"args\\\"][\\\"cache_data_path\\\"] = None\\n\",\n    \"\\n\",\n    \"yaml_config[\\\"trainer\\\"][\\\"trainer\\\"][\\\"min_epochs\\\"] = 1\\n\",\n    \"yaml_config[\\\"trainer\\\"][\\\"trainer\\\"][\\\"max_epochs\\\"] = 1\\n\",\n    \"\\n\",\n    \"yaml_config[\\\"trainer\\\"][\\\"logger\\\"][\\\"save_dir\\\"] = training_dir\\n\",\n    \"yaml_config[\\\"trainer\\\"][\\\"model_checkpoint\\\"][\\\"dirpath\\\"] = None\\n\",\n    \"yaml_config[\\\"trainer\\\"][\\\"trainer\\\"][\\\"default_root_dir\\\"] = training_dir\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 7,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"2021-06-08 13:37:06.943 | WARNING  | graphium.config._loader:load_trainer:124 - Number of GPUs selected is `1`, but will be ignored since no GPU are available on this device\\n\",\n      \"GPU available: False, used: False\\n\",\n      \"TPU available: False, using: 0 TPU cores\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# Load datamodule\\n\",\n    \"datamodule = 
graphium.config.load_datamodule(yaml_config)\\n\",\n    \"\\n\",\n    \"# Load a trainer\\n\",\n    \"trainer = graphium.config.load_trainer(yaml_config)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 9,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/home/hadim/local/conda/envs/graphium/lib/python3.9/site-packages/pytorch_lightning/utilities/distributed.py:69: UserWarning: The dataloader, predict dataloader 0, does not have many workers which may be a bottleneck. Consider increasing the value of the `num_workers` argument` (try 16 which is the number of cpus on this machine) in the `DataLoader` init to improve performance.\\n\",\n      \"  warnings.warn(*args, **kwargs)\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"b4a4fa9d68f94d878bd04ac8883b17cd\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Predicting: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/home/hadim/local/conda/envs/graphium/lib/python3.9/site-packages/dgl/base.py:45: DGLWarning: DGLGraph.in_degree is deprecated. 
Please use DGLGraph.in_degrees\\n\",\n      \"  return warnings.warn(message, category=category, stacklevel=1)\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# Load a pretrained model\\n\",\n    \"model_path = \\\"https://storage.valencelabs.com/graphium/pretrained-models/ZINC-micro-dummy-test.ckpt\\\"\\n\",\n    \"# model_path = \\\"/home/hadim/Drive/Data/graphium/test-training/default/version_0/checkpoints/model.ckpt\\\"\\n\",\n    \"predictor = graphium.trainer.predictor.PredictorModule.load_from_checkpoint(model_path)\\n\",\n    \"\\n\",\n    \"# Inference\\n\",\n    \"results = trainer.predict(predictor, datamodule=datamodule, return_predictions=True)\"\n   ]\n  },\n  {\n   \"attachments\": {},\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Load pretrained (from graphium available models)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 16,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# Load a config\\n\",\n    \"with open(\\\"../expts/config_micro_ZINC.yaml\\\", \\\"r\\\") as file:\\n\",\n    \"    yaml_config = yaml.load(file, Loader=yaml.FullLoader)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 17,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"training_dir = \\\"/home/hadim/Drive/Data/graphium/test-training\\\"\\n\",\n    \"\\n\",\n    \"# Tweak config and paths\\n\",\n    \"yaml_config[\\\"datamodule\\\"][\\\"args\\\"][\\\"df_path\\\"] = \\\"../graphium/data/micro_ZINC/micro_ZINC.csv\\\"\\n\",\n    \"yaml_config[\\\"datamodule\\\"][\\\"args\\\"][\\\"cache_data_path\\\"] = None\\n\",\n    \"\\n\",\n    \"yaml_config[\\\"trainer\\\"][\\\"trainer\\\"][\\\"min_epochs\\\"] = 1\\n\",\n    \"yaml_config[\\\"trainer\\\"][\\\"trainer\\\"][\\\"max_epochs\\\"] = 1\\n\",\n    \"\\n\",\n    \"yaml_config[\\\"trainer\\\"][\\\"logger\\\"][\\\"save_dir\\\"] = training_dir\\n\",\n    
\"yaml_config[\\\"trainer\\\"][\\\"model_checkpoint\\\"][\\\"dirpath\\\"] = None\\n\",\n    \"yaml_config[\\\"trainer\\\"][\\\"trainer\\\"][\\\"default_root_dir\\\"] = training_dir\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 21,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"yaml_config[\\\"trainer\\\"][\\\"trainer\\\"][\\\"gpus\\\"] = 0\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 7,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"2021-06-08 13:37:06.943 | WARNING  | graphium.config._loader:load_trainer:124 - Number of GPUs selected is `1`, but will be ignored since no GPU are available on this device\\n\",\n      \"GPU available: False, used: False\\n\",\n      \"TPU available: False, using: 0 TPU cores\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# Load datamodule\\n\",\n    \"datamodule = graphium.config.load_datamodule(yaml_config)\\n\",\n    \"\\n\",\n    \"# Load a trainer\\n\",\n    \"trainer = graphium.config.load_trainer(yaml_config)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 18,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"0d7d7a795c5e4a50a516d62b23605290\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Predicting: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    }\n   ],\n   \"source\": [\n    \"# Load a pretrained model\\n\",\n    \"predictor = graphium.trainer.PredictorModule.load_pretrained_models(\\\"ZINC-micro-dummy-test\\\")\\n\",\n    \"\\n\",\n    \"# Inference\\n\",\n    \"results = trainer.predict(predictor, datamodule=datamodule, return_predictions=True)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 
38,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/plain\": [\n       \"graphium.trainer.predictor.PredictorModule\"\n      ]\n     },\n     \"execution_count\": 38,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"type(predictor)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 31,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"assert tuple(results[0].shape) == (128, 1)\"\n   ]\n  }\n ],\n \"metadata\": {\n  \"kernelspec\": {\n   \"display_name\": \"Python [conda env:graphium]\",\n   \"language\": \"python\",\n   \"name\": \"conda-env-graphium-py\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.9.4\"\n  },\n  \"widgets\": {\n   \"application/vnd.jupyter.widget-state+json\": {\n    \"state\": {},\n    \"version_major\": 2,\n    \"version_minor\": 0\n   }\n  }\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 4\n}\n"
  },
  {
    "path": "notebooks/dev-training-loop.ipynb",
    "content": "{\n \"cells\": [\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 1,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"Using backend: pytorch\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"import graphium\\n\",\n    \"# from graphium.config._loader import (load_datamodule, load_metrics, load_architecture, load_predictor, load_trainer)\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Constants\\n\",\n    \"\\n\",\n    \"First, we define the constants such as the random seed and whether the model should raise or ignore an error.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 4,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"constants:\\n\",\n      \"  seed: 42\\n\",\n      \"  raise_train_error: true\\n\",\n      \"\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"print_config_with_key(yaml_config, \\\"constants\\\")\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Datamodule\\n\",\n    \"\\n\",\n    \"Here, we define all the parameters required by the datamodule to run correctly, such as the dataset path, whether to cache, the columns for the training, the molecular featurization to use, the train/val/test splits and the batch size.\\n\",\n    \"\\n\",\n    \"For more details, see class `graphium.data.datamodule.DGLFromSmilesDataModule`\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 5,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"datamodule:\\n\",\n      \"  df_path: graphium/data/micro_ZINC/micro_ZINC.csv\\n\",\n      \"  cache_data_path: graphium/data/cache/micro_ZINC/full.cache\\n\",\n      \"  
label_cols:\\n\",\n      \"  - score\\n\",\n      \"  smiles_col: SMILES\\n\",\n      \"  featurization_n_jobs: -1\\n\",\n      \"  featurization_progress: true\\n\",\n      \"  featurization:\\n\",\n      \"    atom_property_list_onehot:\\n\",\n      \"    - atomic-number\\n\",\n      \"    - valence\\n\",\n      \"    atom_property_list_float:\\n\",\n      \"    - mass\\n\",\n      \"    - electronegativity\\n\",\n      \"    - in-ring\\n\",\n      \"    edge_property_list: []\\n\",\n      \"    add_self_loop: false\\n\",\n      \"    explicit_H: false\\n\",\n      \"    use_bonds_weights: false\\n\",\n      \"  split_val: 0.2\\n\",\n      \"  split_test: 0.2\\n\",\n      \"  split_seed: 42\\n\",\n      \"  splits_path: null\\n\",\n      \"  batch_size_training: 128\\n\",\n      \"  batch_size_inference: 256\\n\",\n      \"  num_workers: -1\\n\",\n      \"  pin_memory: false\\n\",\n      \"\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"print_config_with_key(yaml_config, \\\"datamodule\\\")\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Architecture\\n\",\n    \"\\n\",\n    \"In the architecture, we define all the layers for the model, including the layers for the pre-processing MLP (input layers `pre-nn`), the post-processing MLP (output layers `post-nn`), and the main GNN (graph neural network `gnn`).\\n\",\n    \"\\n\",\n    \"The parameters allow choosing the feature size, the depth, the skip connections, the pooling and the virtual node. 
It also supports different GNN layers such as `gcn`, `gin`, `gat`, `gated-gcn`, `pna-conv` and `pna-msgpass`.\\n\",\n    \"\\n\",\n    \"For more details, see the following classes:\\n\",\n    \"\\n\",\n    \"-  `graphium.nn.architecture.FullDGLNetwork`: Main class for the architecture\\n\",\n    \"-  `graphium.nn.architecture.FeedForwardNN`: Main class for the inputs and outputs MLP\\n\",\n    \"-  `graphium.nn.architecture.FeedForwardDGL`: Main class for the GNN layers\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 6,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"architecture:\\n\",\n      \"  model_type: fulldglnetwork\\n\",\n      \"  pre_nn:\\n\",\n      \"    out_dim: 32\\n\",\n      \"    hidden_dims: 32\\n\",\n      \"    depth: 1\\n\",\n      \"    activation: relu\\n\",\n      \"    last_activation: none\\n\",\n      \"    dropout: 0.1\\n\",\n      \"    normalization: batch_norm\\n\",\n      \"    last_normalization: batch_norm\\n\",\n      \"    residual_type: none\\n\",\n      \"  gnn:\\n\",\n      \"    out_dim: 32\\n\",\n      \"    hidden_dims: 32\\n\",\n      \"    depth: 4\\n\",\n      \"    activation: relu\\n\",\n      \"    last_activation: none\\n\",\n      \"    dropout: 0.1\\n\",\n      \"    normalization: batch_norm\\n\",\n      \"    last_normalization: batch_norm\\n\",\n      \"    residual_type: simple\\n\",\n      \"    pooling: sum\\n\",\n      \"    virtual_node: sum\\n\",\n      \"    layer_type: pna-msgpass\\n\",\n      \"    layer_kwargs:\\n\",\n      \"      aggregators:\\n\",\n      \"      - mean\\n\",\n      \"      - max\\n\",\n      \"      - min\\n\",\n      \"      - std\\n\",\n      \"      scalers:\\n\",\n      \"      - identity\\n\",\n      \"      - amplification\\n\",\n      \"      - attenuation\\n\",\n      \"  graph_output_nn:\\n\",\n      \"    out_dim: 1\\n\",\n      \"    
hidden_dims: 32\\n\",\n      \"    depth: 2\\n\",\n      \"    activation: relu\\n\",\n      \"    last_activation: none\\n\",\n      \"    dropout: 0.1\\n\",\n      \"    normalization: \"batch_norm\"\\n\",\n      \"    last_normalization: \"none\"\\n\",\n      \"    residual_type: none\\n\",\n      \"\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"print_config_with_key(yaml_config, \\\"architecture\\\")\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Predictor\\n\",\n    \"\\n\",\n    \"In the predictor, we define the loss functions, the metrics to track on the progress bar, and all the parameters necessary for the optimizer.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 7,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"predictor:\\n\",\n      \"  metrics_on_progress_bar:\\n\",\n      \"  - mae\\n\",\n      \"  - pearsonr\\n\",\n      \"  - f1 > 3\\n\",\n      \"  - precision > 3\\n\",\n      \"  loss_fun: mse\\n\",\n      \"  random_seed: 42\\n\",\n      \"  optim_kwargs:\\n\",\n      \"    lr: 0.01\\n\",\n      \"    weight_decay: 1.0e-07\\n\",\n      \"  lr_reduce_on_plateau_kwargs:\\n\",\n      \"    factor: 0.5\\n\",\n      \"    patience: 7\\n\",\n      \"  scheduler_kwargs:\\n\",\n      \"    monitor: loss/val\\n\",\n      \"    frequency: 1\\n\",\n      \"  target_nan_mask: 0\\n\",\n      \"\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"print_config_with_key(yaml_config, \\\"predictor\\\")\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Metrics\\n\",\n    \"\\n\",\n    \"All the metrics can be defined there. 
If we want to use a classification metric, we can also define a threshold.\\n\",\n    \"\\n\",\n    \"See class `graphium.trainer.metrics.MetricWrapper` for more details.\\n\",\n    \"\\n\",\n    \"See `graphium.trainer.metrics.METRICS_CLASSIFICATION` and `graphium.trainer.metrics.METRICS_REGRESSION` for a dictionary of accepted metrics.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 8,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"metrics:\\n\",\n      \"- name: mae\\n\",\n      \"  metric: mae\\n\",\n      \"  threshold_kwargs: null\\n\",\n      \"- name: pearsonr\\n\",\n      \"  metric: pearsonr\\n\",\n      \"  threshold_kwargs: null\\n\",\n      \"- name: f1 > 3\\n\",\n      \"  metric: f1\\n\",\n      \"  num_classes: 2\\n\",\n      \"  average: micro\\n\",\n      \"  threshold_kwargs:\\n\",\n      \"    operator: greater\\n\",\n      \"    threshold: 3\\n\",\n      \"    th_on_preds: true\\n\",\n      \"    th_on_target: true\\n\",\n      \"- name: f1 > 5\\n\",\n      \"  metric: f1\\n\",\n      \"  num_classes: 2\\n\",\n      \"  average: micro\\n\",\n      \"  threshold_kwargs:\\n\",\n      \"    operator: greater\\n\",\n      \"    threshold: 5\\n\",\n      \"    th_on_preds: true\\n\",\n      \"    th_on_target: true\\n\",\n      \"- name: precision > 3\\n\",\n      \"  metric: precision\\n\",\n      \"  average: micro\\n\",\n      \"  threshold_kwargs:\\n\",\n      \"    operator: greater\\n\",\n      \"    threshold: 3\\n\",\n      \"    th_on_preds: true\\n\",\n      \"    th_on_target: true\\n\",\n      \"\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"print_config_with_key(yaml_config, \\\"metrics\\\")\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"### Trainer\\n\",\n    \"\\n\",\n    \"Finally, the Trainer defines the parameters for the number of epochs to train, the 
checkpoints, and the patience.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 9,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"trainer:\\n\",\n      \"  logger:\\n\",\n      \"    save_dir: logs/micro_ZINC\\n\",\n      \"  early_stopping:\\n\",\n      \"    monitor: loss/val\\n\",\n      \"    min_delta: 0\\n\",\n      \"    patience: 10\\n\",\n      \"    mode: min\\n\",\n      \"  model_checkpoint:\\n\",\n      \"    dirpath: models_checkpoints/micro_ZINC/\\n\",\n      \"    filename: bob\\n\",\n      \"    monitor: loss/val\\n\",\n      \"    mode: min\\n\",\n      \"    save_top_k: 1\\n\",\n      \"    period: 1\\n\",\n      \"  trainer:\\n\",\n      \"    max_epochs: 25\\n\",\n      \"    min_epochs: 5\\n\",\n      \"    gpus: 1\\n\",\n      \"\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"print_config_with_key(yaml_config, \\\"trainer\\\")\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Training the model\\n\",\n    \"\\n\",\n    \"Now that we have defined all the configuration files, we can train the model. 
The steps are fairly easy using the config loaders, and are given below.\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 10,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"Using backend: pytorch\\n\",\n      \"2021-03-25 09:44:37.314 | WARNING  | graphium.config._loader:load_trainer:111 - Number of GPUs selected is `1`, but will be ignored since no GPU are available on this device\\n\",\n      \"/home/dominique/anaconda3/envs/graphium/lib/python3.8/site-packages/pytorch_lightning/utilities/distributed.py:51: UserWarning: Checkpoint directory models_checkpoints/micro_ZINC/ exists and is not empty.\\n\",\n      \"  warnings.warn(*args, **kwargs)\\n\",\n      \"GPU available: False, used: False\\n\",\n      \"TPU available: None, using: 0 TPU cores\\n\",\n      \"2021-03-25 09:44:37.331 | INFO     | graphium.data.datamodule:prepare_data:153 - Reload data from graphium/data/cache/micro_ZINC/full.cache.\\n\",\n      \"\\n\",\n      \"datamodule:\\n\",\n      \" name: DGLFromSmilesDataModule\\n\",\n      \"len: 1000\\n\",\n      \"batch_size_training: 128\\n\",\n      \"batch_size_inference: 256\\n\",\n      \"num_node_feats: 55\\n\",\n      \"num_edge_feats: 0\\n\",\n      \"collate_fn: graphium_collate_fn\\n\",\n      \"featurization:\\n\",\n      \"  atom_property_list_onehot:\\n\",\n      \"  - atomic-number\\n\",\n      \"  - valence\\n\",\n      \"  atom_property_list_float:\\n\",\n      \"  - mass\\n\",\n      \"  - electronegativity\\n\",\n      \"  - in-ring\\n\",\n      \"  edge_property_list: []\\n\",\n      \"  add_self_loop: false\\n\",\n      \"  explicit_H: false\\n\",\n      \"  use_bonds_weights: false\\n\",\n      \" \\n\",\n      \"\\n\",\n      \"{'mae': mean_absolute_error, 'pearsonr': pearsonr, 'f1 > 3': f1(>3), 'f1 > 5': f1(>5), 'precision > 3': precision(>3)}\\n\",\n      \"DGL_GNN\\n\",\n      \"---------\\n\",\n      \"    
pre-NN(depth=1, ResidualConnectionNone)\\n\",\n      \"        [FCLayer[55 -> 32]\\n\",\n      \"    \\n\",\n      \"    GNN(depth=4, ResidualConnectionSimple(skip_steps=1))\\n\",\n      \"        PNAMessagePassingLayer[32 -> 32 -> 32 -> 32 -> 32]\\n\",\n      \"        -> Pooling(sum) -> FCLayer(32 -> 32, activation=None)\\n\",\n      \"    \\n\",\n      \"    post-NN(depth=2, ResidualConnectionNone)\\n\",\n      \"        [FCLayer[32 -> 32 -> 1]\\n\",\n      \"   | Name                            | Type                     | Params\\n\",\n      \"------------------------------------------------------------------------------\\n\",\n      \"0  | model                           | FullDGLNetwork           | 69.7 K\\n\",\n      \"1  | model.pre_nn                    | FeedForwardNN            | 1.9 K \\n\",\n      \"2  | model.pre_nn.activation         | ReLU                     | 0     \\n\",\n      \"3  | model.pre_nn.residual_layer     | ResidualConnectionNone   | 0     \\n\",\n      \"4  | model.pre_nn.layers             | ModuleList               | 1.9 K \\n\",\n      \"5  | model.pre_nn.layers.0           | FCLayer                  | 1.9 K \\n\",\n      \"6  | model.gnn                       | FeedForwardDGL           | 66.7 K\\n\",\n      \"7  | model.gnn.activation            | ReLU                     | 0     \\n\",\n      \"8  | model.gnn.layers                | ModuleList               | 62.2 K\\n\",\n      \"9  | model.gnn.layers.0              | PNAMessagePassingLayer   | 15.6 K\\n\",\n      \"10 | model.gnn.layers.1              | PNAMessagePassingLayer   | 15.6 K\\n\",\n      \"11 | model.gnn.layers.2              | PNAMessagePassingLayer   | 15.6 K\\n\",\n      \"12 | model.gnn.layers.3              | PNAMessagePassingLayer   | 15.6 K\\n\",\n      \"13 | model.gnn.virtual_node_layers   | ModuleList               | 3.4 K \\n\",\n      \"14 | model.gnn.virtual_node_layers.0 | VirtualNode              | 1.1 K \\n\",\n      \"15 | 
model.gnn.virtual_node_layers.1 | VirtualNode              | 1.1 K \\n\",\n      \"16 | model.gnn.virtual_node_layers.2 | VirtualNode              | 1.1 K \\n\",\n      \"17 | model.gnn.residual_layer        | ResidualConnectionSimple | 0     \\n\",\n      \"18 | model.gnn.global_pool_layer     | ModuleListConcat         | 0     \\n\",\n      \"19 | model.gnn.global_pool_layer.0   | SumPooling               | 0     \\n\",\n      \"20 | model.gnn.out_linear            | FCLayer                  | 1.1 K \\n\",\n      \"21 | model.gnn.out_linear.linear     | Linear                   | 1.1 K \\n\",\n      \"22 | model.gnn.out_linear.dropout    | Dropout                  | 0     \\n\",\n      \"23 | model.gnn.out_linear.batch_norm | BatchNorm1d              | 64    \\n\",\n      \"24 | model.graph_output_nn                   | FeedForwardNN            | 1.2 K \\n\",\n      \"25 | model.graph_output_nn.activation        | ReLU                     | 0     \\n\",\n      \"26 | model.graph_output_nn.residual_layer    | ResidualConnectionNone   | 0     \\n\",\n      \"27 | model.graph_output_nn.layers            | ModuleList               | 1.2 K \\n\",\n      \"28 | model.graph_output_nn.layers.0          | FCLayer                  | 1.1 K \\n\",\n      \"29 | model.graph_output_nn.layers.1          | FCLayer                  | 33    \\n\",\n      \"30 | loss_fun                        | MSELoss                  | 0     \\n\",\n      \"------------------------------------------------------------------------------\\n\",\n      \"69.7 K    Trainable params\\n\",\n      \"0         Non-trainable params\\n\",\n      \"69.7 K    Total params\\n\",\n      \"0.279     Total estimated model params size (MB)\\n\",\n      \"\\n\",\n      \"  | Name     | Type           | Params\\n\",\n      \"--------------------------------------------\\n\",\n      \"0 | model    | FullDGLNetwork | 69.7 K\\n\",\n      \"1 | loss_fun | MSELoss        | 0     \\n\",\n      
\"--------------------------------------------\\n\",\n      \"69.7 K    Trainable params\\n\",\n      \"0         Non-trainable params\\n\",\n      \"69.7 K    Total params\\n\",\n      \"0.279     Total estimated model params size (MB)\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"61dc1894ee264599ab493d982b390430\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Validation sanity check: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/home/dominique/anaconda3/envs/graphium/lib/python3.8/site-packages/pytorch_lightning/utilities/distributed.py:51: UserWarning: The validation_epoch_end should not return anything as of 9.1. To log, use self.log(...) or self.write(...) directly in the LightningModule\\n\",\n      \"  warnings.warn(*args, **kwargs)\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"657567b3d0b546a1a648173c2bfb1e4a\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Training: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"01792491c7fd49b08ce5086832135c7b\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Validating: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"b9d96f2469234fe380d07ffd806350ed\",\n       \"version_major\": 2,\n       
\"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Validating: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"eed00b1f81524fe99e07e2084c952532\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Validating: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"c435a81142c24be09a0e38d7575b365b\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Validating: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"ae0357d542ec4021b0c9c38fb38bd11c\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Validating: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"a1d24f83f1504d3b80b0741d1c0f404b\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Validating: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"647aee1f810f407c90697c394d0f604f\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Validating: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": 
\"display_data\"\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"c69086a4802e421f816cab1ebbc20ae9\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Validating: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"ebb4a32a0f78470ba21ff5bea8b450f4\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Validating: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"840d27d095344fcd9a74862e61a2fe7f\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Validating: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"905c7ee70f4b4282871c130b0c7b9f0a\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Validating: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"daab3285c0854c23a4e3dd47846c820e\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Validating: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": 
\"e0e0d9096fc64198a470ae1b3cd7f351\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Validating: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"99ac351f4e334e8c838a6913ef6bee08\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Validating: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"69b47fad071248eab8095d67e33b5d5e\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Validating: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"8c68dcc01135429e845427bb6908f414\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Validating: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"096aaea9ce2649fba9bf70b99b7e7955\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Validating: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"d8ecff999d934a119157a3e0ca7a1c6a\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Validating: 0it 
[00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"4287a56d059b4eb2966eb2e90498a210\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Validating: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"0a5ffab4db4e4768a4876b01a8b10f96\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Validating: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"6177ef595f9542598e5b065d6d77bb32\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Validating: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"e86938e35b0b443791119e37dd2e2199\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Validating: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"98aff21b49cc434dbaaaf12c355ab783\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Validating: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      
\"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"e4c88c49c0c843c09934e9786e9b6aa5\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Validating: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"dcb917418a084d4ba36d57f5b0406819\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Validating: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"text/plain\": [\n       \"1\"\n      ]\n     },\n     \"execution_count\": 10,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"\\n\",\n    \"\\n\",\n    \"MAIN_DIR = dirname(dirname(abspath(graphium.__file__)))\\n\",\n    \"os.chdir(MAIN_DIR)\\n\",\n    \"\\n\",\n    \"cfg = dict(deepcopy(yaml_config))\\n\",\n    \"\\n\",\n    \"# Load and initialize the dataset\\n\",\n    \"datamodule = load_datamodule(cfg)\\n\",\n    \"print(\\\"\\\\ndatamodule:\\\\n\\\", datamodule, \\\"\\\\n\\\")\\n\",\n    \"\\n\",\n    \"# Initialize the network\\n\",\n    \"model_class, model_kwargs = load_architecture(\\n\",\n    \"    cfg,\\n\",\n    \"    in_dim_nodes=datamodule.num_node_feats_with_positional_encoding,\\n\",\n    \"    in_dim_edges=datamodule.num_edge_feats,\\n\",\n    \")\\n\",\n    \"\\n\",\n    \"metrics = load_metrics(cfg)\\n\",\n    \"print(metrics)\\n\",\n    \"\\n\",\n    \"predictor = load_predictor(cfg, model_class, model_kwargs, metrics)\\n\",\n    \"\\n\",\n    \"print(predictor.model)\\n\",\n    \"print(predictor.summarize(max_depth=4))\\n\",\n    \"\\n\",\n    \"trainer = load_trainer(cfg, metrics)\\n\",\n    \"\\n\",\n    \"# Run the model training\\n\",\n    
\"trainer.fit(model=predictor, datamodule=datamodule)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": []\n  }\n ],\n \"metadata\": {\n  \"kernelspec\": {\n   \"display_name\": \"Python [conda env:graphium]\",\n   \"language\": \"python\",\n   \"name\": \"conda-env-graphium-py\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.8.8\"\n  },\n  \"widgets\": {\n   \"application/vnd.jupyter.widget-state+json\": {\n    \"state\": {},\n    \"version_major\": 2,\n    \"version_minor\": 0\n   }\n  }\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 4\n}\n"
  },
  {
    "path": "notebooks/dev.ipynb",
    "content": "{\n \"cells\": [\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 1,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"Using backend: pytorch\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"%load_ext autoreload\\n\",\n    \"%autoreload 2\\n\",\n    \"\\n\",\n    \"import pathlib\\n\",\n    \"import functools\\n\",\n    \"import tempfile\\n\",\n    \"import yaml\\n\",\n    \"\\n\",\n    \"from loguru import logger\\n\",\n    \"\\n\",\n    \"import numpy as np\\n\",\n    \"import lightning\\n\",\n    \"import torch\\n\",\n    \"import datamol as dm\\n\",\n    \"import pandas as pd\\n\",\n    \"\\n\",\n    \"import graphium\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 9,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/home/hadim/test-data/graphium-zinc-micro\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"import graphium\\n\",\n    \"\\n\",\n    \"dataset_dir = \\\"/home/hadim/test-data\\\"\\n\",\n    \"data_path = graphium.data.utils.download_graphium_dataset(\\\"graphium-zinc-micro\\\", output_path=dataset_dir)\\n\",\n    \"print(data_path)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 8,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/plain\": [\n       \"'/home/hadim/test-data/graphium-zinc-micro'\"\n      ]\n     },\n     \"execution_count\": 8,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"data_path\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": []\n  }\n ],\n \"metadata\": {\n  \"kernelspec\": {\n   \"display_name\": \"Python [conda env:graphium]\",\n   \"language\": \"python\",\n   \"name\": \"conda-env-graphium-py\"\n 
 },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.9.4\"\n  },\n  \"widgets\": {\n   \"application/vnd.jupyter.widget-state+json\": {\n    \"state\": {},\n    \"version_major\": 2,\n    \"version_minor\": 0\n   }\n  }\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 4\n}\n"
  },
  {
    "path": "notebooks/finetuning-on-tdc-admet-benchmark.ipynb",
    "content": "{\n \"cells\": [\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 1,\n   \"id\": \"55ab828f\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"%load_ext autoreload\\n\",\n    \"%autoreload 2\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 2,\n   \"id\": \"07e0ed51\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"import yaml\\n\",\n    \"import omegaconf\\n\",\n    \"from datetime import datetime\\n\",\n    \"\\n\",\n    \"from typing import Union, List\\n\",\n    \"from copy import deepcopy\\n\",\n    \"from tdc.utils import retrieve_benchmark_names\\n\",\n    \"\\n\",\n    \"from graphium.config._loader import (\\n\",\n    \"    load_datamodule,\\n\",\n    \"    load_metrics,\\n\",\n    \"    load_architecture,\\n\",\n    \"    load_predictor,\\n\",\n    \"    load_trainer,\\n\",\n    \"    save_params_to_wandb,\\n\",\n    \"    load_accelerator,\\n\",\n    \"    load_yaml_config,\\n\",\n    \")\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"fba4e4dc\",\n   \"metadata\": {},\n   \"source\": [\n    \"# Fine-tuning on the TDC ADMET benchmarking group\\n\",\n    \"\\n\",\n    \"[TDC](https://tdcommons.ai/) hosts a variety of ML-ready datasets and benchmarks for ML for drug discovery. The [TDC ADMET benchmarking group](https://tdcommons.ai/benchmark/admet_group/overview/) is a popular collection of benchmarks for evaluating new _foundation models_ (see e.g. [MolE](https://arxiv.org/abs/2211.02657)) due to the variety and relevance of the included tasks.\\n\",\n    \"\\n\",\n    \"The ADMET benchmarking group is integrated in `graphium` through the `ADMETBenchmarkDataModule` data-module. This notebook shows how to easily fine-tune and test a model using that data-module. 
\\n\",\n    \"\\n\",\n    \"<div style=\\\"background-color: #fff3cd; border-radius: 10px; border-color: #ffeeba; padding: 20px; margin: 20px 0;  color: #856404\\\">\\n\",\n    \"    <b>NOTE:</b> This notebook is still <i>work in progress</i>. While the <b>fine-tuning logic is unfinished</b>, the notebook does demo how one could use the data-module to easily loop over each of the datasets in the benchmarking group and get the prescribed train-test split. Once the fine-tuning logic is finalized, we will finish this notebook and officially provide it as a tutorial within Graphium. \\n\",\n    \"</div>\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 3,\n   \"id\": \"4d5af838\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"# First, let's read the yaml configuration file\\n\",\n    \"with open(\\\"../expts/configs/config_tdc_admet_demo.yaml\\\", \\\"r\\\") as file:\\n\",\n    \"    config = yaml.load(file, Loader=yaml.FullLoader)\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"7b2047a3\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Get all TDC benchmark names\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 4,\n   \"id\": \"faa5d4b4\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"data\": {\n      \"text/plain\": [\n       \"22\"\n      ]\n     },\n     \"execution_count\": 4,\n     \"metadata\": {},\n     \"output_type\": \"execute_result\"\n    }\n   ],\n   \"source\": [\n    \"benchmarks = retrieve_benchmark_names(\\\"admet_group\\\")\\n\",\n    \"len(benchmarks)\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"f80de284\",\n   \"metadata\": {},\n   \"source\": [\n    \"While there is a total of 22, let's just use two for practicality sake: One regression and one classification task! 
\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 5,\n   \"id\": \"3ac3c111\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"benchmarks = [\\\"caco2_wang\\\", \\\"hia_hou\\\"]\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"3643aa30\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Initialize all training components per task\\n\",\n    \"**NOTE**: Since we do not have fine-tuning logic, this for now just creates a new model. Ultimately, we will want to use fine-tuning code to evaluate how well the pre-trained model transfers to downstream tasks. \"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 6,\n   \"id\": \"9538abfb\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"def training_testing_loop(cfg):\\n\",\n    \"    \\\"\\\"\\\"\\n\",\n    \"    Simple loop to train a model from scratch and test it. \\n\",\n    \"    \\\"\\\"\\\"\\n\",\n    \"    \\n\",\n    \"    # Initialize object from config\\n\",\n    \"    cfg, accelerator_type = load_accelerator(cfg)\\n\",\n    \"    datamodule = load_datamodule(cfg, accelerator_type)\\n\",\n    \"    model_class, model_kwargs = load_architecture(cfg, in_dims=datamodule.in_dims)\\n\",\n    \"    metrics = load_metrics(cfg)\\n\",\n    \"    \\n\",\n    \"    # Prepare data\\n\",\n    \"    datamodule.prepare_data()\\n\",\n    \"    \\n\",\n    \"    # Initialize the predictor\\n\",\n    \"    predictor = load_predictor(\\n\",\n    \"        cfg,\\n\",\n    \"        model_class,\\n\",\n    \"        model_kwargs,\\n\",\n    \"        metrics,\\n\",\n    \"        datamodule.get_task_levels(),\\n\",\n    \"        accelerator_type,\\n\",\n    \"        datamodule.featurization,\\n\",\n    \"        datamodule.task_norms\\n\",\n    \"    )\\n\",\n    \"    \\n\",\n    \"    # Initialize the trainer\\n\",\n    \"    date_time_suffix = datetime.now().strftime(\\\"%d.%m.%Y_%H.%M.%S\\\")\\n\",\n    \"    trainer = 
load_trainer(cfg, \\\"tdc-admet\\\", accelerator_type, date_time_suffix)\\n\",\n    \"        \\n\",\n    \"    # Train\\n\",\n    \"    predictor.set_max_nodes_edges_per_graph(datamodule, stages=[\\\"train\\\", \\\"val\\\"])\\n\",\n    \"    trainer.fit(model=predictor, datamodule=datamodule)\\n\",\n    \"    \\n\",\n    \"    # Test\\n\",\n    \"    predictor.set_max_nodes_edges_per_graph(datamodule, stages=[\\\"test\\\"])\\n\",\n    \"    results = trainer.test(model=predictor, datamodule=datamodule)\\n\",\n    \"    \\n\",\n    \"    return results\\n\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 7,\n   \"id\": \"1ee4586e\",\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": [\n    \"def filter_cfg_based_on_benchmark_name(config, names: Union[List[str], str]):\\n\",\n    \"    \\\"\\\"\\\"\\n\",\n    \"    Filter a base config for the full TDC ADMET benchmarking group to only \\n\",\n    \"    have settings related to a subset of the endpoints\\n\",\n    \"    \\\"\\\"\\\"\\n\",\n    \"    \\n\",\n    \"    if config[\\\"datamodule\\\"][\\\"module_type\\\"] != \\\"ADMETBenchmarkDataModule\\\":\\n\",\n    \"        raise ValueError(\\\"You can only use this method for the `ADMETBenchmarkDataModule`\\\")\\n\",\n    \"        \\n\",\n    \"    if isinstance(names, str):\\n\",\n    \"        names = [names]\\n\",\n    \"    \\n\",\n    \"    def _filter(d):\\n\",\n    \"        return {k: v for k, v in d.items() if k in names}\\n\",\n    \"         \\n\",\n    \"    cfg = deepcopy(config)\\n\",\n    \"    \\n\",\n    \"    # Update the datamodule arguments\\n\",\n    \"    cfg[\\\"datamodule\\\"][\\\"args\\\"][\\\"tdc_benchmark_names\\\"] = names\\n\",\n    \"    \\n\",\n    \"    # Filter the relevant config sections\\n\",\n    \"    cfg[\\\"architecture\\\"][\\\"task_heads\\\"] = _filter(cfg[\\\"architecture\\\"][\\\"task_heads\\\"])\\n\",\n    \"    cfg[\\\"predictor\\\"][\\\"metrics_on_progress_bar\\\"] = 
_filter(cfg[\\\"predictor\\\"][\\\"metrics_on_progress_bar\\\"])\\n\",\n    \"    cfg[\\\"predictor\\\"][\\\"loss_fun\\\"] = _filter(cfg[\\\"predictor\\\"][\\\"loss_fun\\\"])\\n\",\n    \"    cfg[\\\"metrics\\\"] = _filter(cfg[\\\"metrics\\\"])\\n\",\n    \"    \\n\",\n    \"    return cfg\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 8,\n   \"id\": \"96bc606a\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"\\u001b[32m2023-07-13 14:02:22.438\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mgraphium.data.datamodule\\u001b[0m:\\u001b[36m__init__\\u001b[0m:\\u001b[36m2425\\u001b[0m - \\u001b[1mPreparing the TDC ADMET Benchmark Group splits for each of the 1 benchmarks.\\u001b[0m\\n\",\n      \"\\u001b[34m\\u001b[1mwandb\\u001b[0m: Currently logged in as: \\u001b[33mcwognum\\u001b[0m (\\u001b[33mvalence-ood\\u001b[0m). Use \\u001b[1m`wandb login --relogin`\\u001b[0m to force relogin\\n\",\n      \"\\u001b[34m\\u001b[1mwandb\\u001b[0m: \\u001b[33mWARNING\\u001b[0m Path logs/tdc-admet-demo/wandb/ wasn't writable, using system temp directory.\\n\",\n      \"wandb: WARNING Path logs/tdc-admet-demo/wandb/ wasn't writable, using system temp directory\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"text/html\": [\n       \"Tracking run with wandb version 0.15.5\"\n      ],\n      \"text/plain\": [\n       \"<IPython.core.display.HTML object>\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"text/html\": [\n       \"Run data is saved locally in <code>/tmp/wandb/run-20230713_140223-iablpj4z</code>\"\n      ],\n      \"text/plain\": [\n       \"<IPython.core.display.HTML object>\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"text/html\": [\n       \"Syncing run <strong><a 
href='https://wandb.ai/valence-ood/tdc_admet_demo/runs/iablpj4z' target=\\\"_blank\\\">tdc_admet_demo_13.07.2023_14.02.22</a></strong> to <a href='https://wandb.ai/valence-ood/tdc_admet_demo' target=\\\"_blank\\\">Weights & Biases</a> (<a href='https://wandb.me/run' target=\\\"_blank\\\">docs</a>)<br/>\"\n      ],\n      \"text/plain\": [\n       \"<IPython.core.display.HTML object>\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"text/html\": [\n       \" View project at <a href='https://wandb.ai/valence-ood/tdc_admet_demo' target=\\\"_blank\\\">https://wandb.ai/valence-ood/tdc_admet_demo</a>\"\n      ],\n      \"text/plain\": [\n       \"<IPython.core.display.HTML object>\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"text/html\": [\n       \" View run at <a href='https://wandb.ai/valence-ood/tdc_admet_demo/runs/iablpj4z' target=\\\"_blank\\\">https://wandb.ai/valence-ood/tdc_admet_demo/runs/iablpj4z</a>\"\n      ],\n      \"text/plain\": [\n       \"<IPython.core.display.HTML object>\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"GPU available: True (cuda), used: True\\n\",\n      \"TPU available: False, using: 0 TPU cores\\n\",\n      \"IPU available: False, using: 0 IPUs\\n\",\n      \"HPU available: False, using: 0 HPUs\\n\",\n      \"\\u001b[32m2023-07-13 14:02:28.065\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mgraphium.data.datamodule\\u001b[0m:\\u001b[36msetup\\u001b[0m:\\u001b[36m1168\\u001b[0m - \\u001b[1m-------------------\\n\",\n      \"MultitaskDataset\\n\",\n      \"\\tabout = training set\\n\",\n      \"\\tnum_graphs_total = 634\\n\",\n      \"\\tnum_nodes_total = 18351\\n\",\n      \"\\tmax_num_nodes_per_graph = 67\\n\",\n      
\"\\tmin_num_nodes_per_graph = 2\\n\",\n      \"\\tstd_num_nodes_per_graph = 11.441048173903344\\n\",\n      \"\\tmean_num_nodes_per_graph = 28.944794952681388\\n\",\n      \"\\tnum_edges_total = 39466\\n\",\n      \"\\tmax_num_edges_per_graph = 144\\n\",\n      \"\\tmin_num_edges_per_graph = 2\\n\",\n      \"\\tstd_num_edges_per_graph = 25.32877270037731\\n\",\n      \"\\tmean_num_edges_per_graph = 62.24921135646688\\n\",\n      \"-------------------\\n\",\n      \"\\u001b[0m\\n\",\n      \"\\u001b[32m2023-07-13 14:02:28.065\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mgraphium.data.datamodule\\u001b[0m:\\u001b[36msetup\\u001b[0m:\\u001b[36m1169\\u001b[0m - \\u001b[1m-------------------\\n\",\n      \"MultitaskDataset\\n\",\n      \"\\tabout = validation set\\n\",\n      \"\\tnum_graphs_total = 91\\n\",\n      \"\\tnum_nodes_total = 2791\\n\",\n      \"\\tmax_num_nodes_per_graph = 67\\n\",\n      \"\\tmin_num_nodes_per_graph = 13\\n\",\n      \"\\tstd_num_nodes_per_graph = 11.461406761225364\\n\",\n      \"\\tmean_num_nodes_per_graph = 30.67032967032967\\n\",\n      \"\\tnum_edges_total = 6034\\n\",\n      \"\\tmax_num_edges_per_graph = 148\\n\",\n      \"\\tmin_num_edges_per_graph = 28\\n\",\n      \"\\tstd_num_edges_per_graph = 25.619228830769817\\n\",\n      \"\\tmean_num_edges_per_graph = 66.3076923076923\\n\",\n      \"-------------------\\n\",\n      \"\\u001b[0m\\n\",\n      \"\\u001b[32m2023-07-13 14:02:28.066\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mgraphium.data.datamodule\\u001b[0m:\\u001b[36msetup\\u001b[0m:\\u001b[36m1186\\u001b[0m - \\u001b[1m-------------------\\n\",\n      \"MultitaskDataset\\n\",\n      \"\\tabout = test set\\n\",\n      \"\\tnum_graphs_total = 181\\n\",\n      \"\\tnum_nodes_total = 5503\\n\",\n      \"\\tmax_num_nodes_per_graph = 68\\n\",\n      \"\\tmin_num_nodes_per_graph = 8\\n\",\n      \"\\tstd_num_nodes_per_graph = 9.244248028261115\\n\",\n      \"\\tmean_num_nodes_per_graph = 
30.403314917127073\\n\",\n      \"\\tnum_edges_total = 11736\\n\",\n      \"\\tmax_num_edges_per_graph = 144\\n\",\n      \"\\tmin_num_edges_per_graph = 16\\n\",\n      \"\\tstd_num_edges_per_graph = 19.961337435561646\\n\",\n      \"\\tmean_num_edges_per_graph = 64.83977900552486\\n\",\n      \"-------------------\\n\",\n      \"\\u001b[0m\\n\",\n      \"\\u001b[32m2023-07-13 14:02:28.066\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mgraphium.data.datamodule\\u001b[0m:\\u001b[36mget_max_num_nodes_datamodule\\u001b[0m:\\u001b[36m556\\u001b[0m - \\u001b[1mMax num nodes being calcuated train\\u001b[0m\\n\",\n      \"\\u001b[32m2023-07-13 14:02:28.067\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mgraphium.data.datamodule\\u001b[0m:\\u001b[36mget_max_num_nodes_datamodule\\u001b[0m:\\u001b[36m565\\u001b[0m - \\u001b[1mMax num nodes being calcuated val\\u001b[0m\\n\",\n      \"\\u001b[32m2023-07-13 14:02:28.067\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mgraphium.data.datamodule\\u001b[0m:\\u001b[36mprepare_data\\u001b[0m:\\u001b[36m943\\u001b[0m - \\u001b[1mData is already prepared. Skipping the preparation\\u001b[0m\\n\",\n      \"You are using a CUDA device ('NVIDIA GeForce RTX 3070') that has Tensor Cores. To properly utilize them, you should set `torch.set_float32_matmul_precision('medium' | 'high')` which will trade-off precision for performance. 
For more details, read https://pytorch.org/docs/stable/generated/torch.set_float32_matmul_precision.html#torch.set_float32_matmul_precision\\n\",\n      \"\\u001b[32m2023-07-13 14:02:28.069\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mgraphium.data.datamodule\\u001b[0m:\\u001b[36msetup\\u001b[0m:\\u001b[36m1168\\u001b[0m - \\u001b[1m-------------------\\n\",\n      \"MultitaskDataset\\n\",\n      \"\\tabout = training set\\n\",\n      \"\\tnum_graphs_total = 634\\n\",\n      \"\\tnum_nodes_total = 18351\\n\",\n      \"\\tmax_num_nodes_per_graph = 67\\n\",\n      \"\\tmin_num_nodes_per_graph = 2\\n\",\n      \"\\tstd_num_nodes_per_graph = 11.441048173903344\\n\",\n      \"\\tmean_num_nodes_per_graph = 28.944794952681388\\n\",\n      \"\\tnum_edges_total = 39466\\n\",\n      \"\\tmax_num_edges_per_graph = 144\\n\",\n      \"\\tmin_num_edges_per_graph = 2\\n\",\n      \"\\tstd_num_edges_per_graph = 25.32877270037731\\n\",\n      \"\\tmean_num_edges_per_graph = 62.24921135646688\\n\",\n      \"-------------------\\n\",\n      \"\\u001b[0m\\n\",\n      \"\\u001b[32m2023-07-13 14:02:28.069\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mgraphium.data.datamodule\\u001b[0m:\\u001b[36msetup\\u001b[0m:\\u001b[36m1169\\u001b[0m - \\u001b[1m-------------------\\n\",\n      \"MultitaskDataset\\n\",\n      \"\\tabout = validation set\\n\",\n      \"\\tnum_graphs_total = 91\\n\",\n      \"\\tnum_nodes_total = 2791\\n\",\n      \"\\tmax_num_nodes_per_graph = 67\\n\",\n      \"\\tmin_num_nodes_per_graph = 13\\n\",\n      \"\\tstd_num_nodes_per_graph = 11.461406761225364\\n\",\n      \"\\tmean_num_nodes_per_graph = 30.67032967032967\\n\",\n      \"\\tnum_edges_total = 6034\\n\",\n      \"\\tmax_num_edges_per_graph = 148\\n\",\n      \"\\tmin_num_edges_per_graph = 28\\n\",\n      \"\\tstd_num_edges_per_graph = 25.619228830769817\\n\",\n      \"\\tmean_num_edges_per_graph = 66.3076923076923\\n\",\n      \"-------------------\\n\",\n      \"\\u001b[0m\\n\",\n      
\"LOCAL_RANK: 0 - CUDA_VISIBLE_DEVICES: [0]\\n\",\n      \"\\n\",\n      \"  | Name  | Type                      | Params\\n\",\n      \"----------------------------------------------------\\n\",\n      \"0 | model | FullGraphMultiTaskNetwork | 111 K \\n\",\n      \"----------------------------------------------------\\n\",\n      \"111 K     Trainable params\\n\",\n      \"0         Non-trainable params\\n\",\n      \"111 K     Total params\\n\",\n      \"0.448     Total estimated model params size (MB)\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Sanity Checking: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"a99952d7025f418bae86c250887d7b60\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Training: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"`Trainer.fit` stopped: `max_epochs=10` reached.\\n\",\n      \"\\u001b[32m2023-07-13 14:02:49.408\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mgraphium.data.datamodule\\u001b[0m:\\u001b[36msetup\\u001b[0m:\\u001b[36m1168\\u001b[0m - \\u001b[1m-------------------\\n\",\n      \"MultitaskDataset\\n\",\n      \"\\tabout = training set\\n\",\n      \"\\tnum_graphs_total = 634\\n\",\n      \"\\tnum_nodes_total = 18351\\n\",\n      \"\\tmax_num_nodes_per_graph = 67\\n\",\n      \"\\tmin_num_nodes_per_graph = 2\\n\",\n      \"\\tstd_num_nodes_per_graph = 11.441048173903344\\n\",\n      \"\\tmean_num_nodes_per_graph = 28.944794952681388\\n\",\n      
\"\\tnum_edges_total = 39466\\n\",\n      \"\\tmax_num_edges_per_graph = 144\\n\",\n      \"\\tmin_num_edges_per_graph = 2\\n\",\n      \"\\tstd_num_edges_per_graph = 25.32877270037731\\n\",\n      \"\\tmean_num_edges_per_graph = 62.24921135646688\\n\",\n      \"-------------------\\n\",\n      \"\\u001b[0m\\n\",\n      \"\\u001b[32m2023-07-13 14:02:49.409\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mgraphium.data.datamodule\\u001b[0m:\\u001b[36msetup\\u001b[0m:\\u001b[36m1169\\u001b[0m - \\u001b[1m-------------------\\n\",\n      \"MultitaskDataset\\n\",\n      \"\\tabout = validation set\\n\",\n      \"\\tnum_graphs_total = 91\\n\",\n      \"\\tnum_nodes_total = 2791\\n\",\n      \"\\tmax_num_nodes_per_graph = 67\\n\",\n      \"\\tmin_num_nodes_per_graph = 13\\n\",\n      \"\\tstd_num_nodes_per_graph = 11.461406761225364\\n\",\n      \"\\tmean_num_nodes_per_graph = 30.67032967032967\\n\",\n      \"\\tnum_edges_total = 6034\\n\",\n      \"\\tmax_num_edges_per_graph = 148\\n\",\n      \"\\tmin_num_edges_per_graph = 28\\n\",\n      \"\\tstd_num_edges_per_graph = 25.619228830769817\\n\",\n      \"\\tmean_num_edges_per_graph = 66.3076923076923\\n\",\n      \"-------------------\\n\",\n      \"\\u001b[0m\\n\",\n      \"\\u001b[32m2023-07-13 14:02:49.410\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mgraphium.data.datamodule\\u001b[0m:\\u001b[36msetup\\u001b[0m:\\u001b[36m1186\\u001b[0m - \\u001b[1m-------------------\\n\",\n      \"MultitaskDataset\\n\",\n      \"\\tabout = test set\\n\",\n      \"\\tnum_graphs_total = 181\\n\",\n      \"\\tnum_nodes_total = 5503\\n\",\n      \"\\tmax_num_nodes_per_graph = 68\\n\",\n      \"\\tmin_num_nodes_per_graph = 8\\n\",\n      \"\\tstd_num_nodes_per_graph = 9.244248028261115\\n\",\n      \"\\tmean_num_nodes_per_graph = 30.403314917127073\\n\",\n      \"\\tnum_edges_total = 11736\\n\",\n      \"\\tmax_num_edges_per_graph = 144\\n\",\n      \"\\tmin_num_edges_per_graph = 16\\n\",\n      \"\\tstd_num_edges_per_graph 
= 19.961337435561646\\n\",\n      \"\\tmean_num_edges_per_graph = 64.83977900552486\\n\",\n      \"-------------------\\n\",\n      \"\\u001b[0m\\n\",\n      \"\\u001b[32m2023-07-13 14:02:49.410\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mgraphium.data.datamodule\\u001b[0m:\\u001b[36mget_max_num_nodes_datamodule\\u001b[0m:\\u001b[36m574\\u001b[0m - \\u001b[1mMax num nodes being calcuated test\\u001b[0m\\n\",\n      \"\\u001b[32m2023-07-13 14:02:49.411\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mgraphium.data.datamodule\\u001b[0m:\\u001b[36mprepare_data\\u001b[0m:\\u001b[36m943\\u001b[0m - \\u001b[1mData is already prepared. Skipping the preparation\\u001b[0m\\n\",\n      \"You are using a CUDA device ('NVIDIA GeForce RTX 3070') that has Tensor Cores. To properly utilize them, you should set `torch.set_float32_matmul_precision('medium' | 'high')` which will trade-off precision for performance. For more details, read https://pytorch.org/docs/stable/generated/torch.set_float32_matmul_precision.html#torch.set_float32_matmul_precision\\n\",\n      \"\\u001b[32m2023-07-13 14:02:49.412\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mgraphium.data.datamodule\\u001b[0m:\\u001b[36msetup\\u001b[0m:\\u001b[36m1186\\u001b[0m - \\u001b[1m-------------------\\n\",\n      \"MultitaskDataset\\n\",\n      \"\\tabout = test set\\n\",\n      \"\\tnum_graphs_total = 181\\n\",\n      \"\\tnum_nodes_total = 5503\\n\",\n      \"\\tmax_num_nodes_per_graph = 68\\n\",\n      \"\\tmin_num_nodes_per_graph = 8\\n\",\n      \"\\tstd_num_nodes_per_graph = 9.244248028261115\\n\",\n      \"\\tmean_num_nodes_per_graph = 30.403314917127073\\n\",\n      \"\\tnum_edges_total = 11736\\n\",\n      \"\\tmax_num_edges_per_graph = 144\\n\",\n      \"\\tmin_num_edges_per_graph = 16\\n\",\n      \"\\tstd_num_edges_per_graph = 19.961337435561646\\n\",\n      \"\\tmean_num_edges_per_graph = 64.83977900552486\\n\",\n      \"-------------------\\n\",\n      \"\\u001b[0m\\n\",\n      
\"LOCAL_RANK: 0 - CUDA_VISIBLE_DEVICES: [0]\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"e92ded2754004662a47e2fb8cb8112be\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Testing: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<pre style=\\\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\\\">┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓\\n\",\n       \"┃<span style=\\\"font-weight: bold\\\">             Test metric             </span>┃<span style=\\\"font-weight: bold\\\">            DataLoader 0             </span>┃\\n\",\n       \"┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┩\\n\",\n       \"│<span style=\\\"color: #008080; text-decoration-color: #008080\\\">          gpu_allocated_GB           </span>│<span style=\\\"color: #800080; text-decoration-color: #800080\\\">        0.0019927024841308594        </span>│\\n\",\n       \"│<span style=\\\"color: #008080; text-decoration-color: #008080\\\">    graph_caco2_wang/L1Loss/test     </span>│<span style=\\\"color: #800080; text-decoration-color: #800080\\\">          2.312683582305908          </span>│\\n\",\n       \"│<span style=\\\"color: #008080; text-decoration-color: #008080\\\">     graph_caco2_wang/loss/test      </span>│<span style=\\\"color: #800080; text-decoration-color: #800080\\\">          2.312683582305908          </span>│\\n\",\n       \"│<span style=\\\"color: #008080; text-decoration-color: #008080\\\">      graph_caco2_wang/mae/test      </span>│<span style=\\\"color: #800080; text-decoration-color: #800080\\\">          2.312683582305908          </span>│\\n\",\n       \"│<span style=\\\"color: 
#008080; text-decoration-color: #008080\\\">   graph_caco2_wang/mean_pred/test   </span>│<span style=\\\"color: #800080; text-decoration-color: #800080\\\">         -3.0216422080993652         </span>│\\n\",\n       \"│<span style=\\\"color: #008080; text-decoration-color: #008080\\\">  graph_caco2_wang/mean_target/test  </span>│<span style=\\\"color: #800080; text-decoration-color: #800080\\\">         -5.334325790405273          </span>│\\n\",\n       \"│<span style=\\\"color: #008080; text-decoration-color: #008080\\\">  graph_caco2_wang/median_pred/test  </span>│<span style=\\\"color: #800080; text-decoration-color: #800080\\\">          -3.02557373046875          </span>│\\n\",\n       \"│<span style=\\\"color: #008080; text-decoration-color: #008080\\\"> graph_caco2_wang/median_target/test </span>│<span style=\\\"color: #800080; text-decoration-color: #800080\\\">         -5.300000190734863          </span>│\\n\",\n       \"│<span style=\\\"color: #008080; text-decoration-color: #008080\\\">    graph_caco2_wang/pearson/test    </span>│<span style=\\\"color: #800080; text-decoration-color: #800080\\\">         0.11455658078193665         </span>│\\n\",\n       \"│<span style=\\\"color: #008080; text-decoration-color: #008080\\\">   graph_caco2_wang/r2_score/test    </span>│<span style=\\\"color: #800080; text-decoration-color: #800080\\\">         -11.43954086303711          </span>│\\n\",\n       \"│<span style=\\\"color: #008080; text-decoration-color: #008080\\\">   graph_caco2_wang/spearman/test    </span>│<span style=\\\"color: #800080; text-decoration-color: #800080\\\">         0.2947588562965393          </span>│\\n\",\n       \"│<span style=\\\"color: #008080; text-decoration-color: #008080\\\">   graph_caco2_wang/std_pred/test    </span>│<span style=\\\"color: #800080; text-decoration-color: #800080\\\">        0.014227418228983879         </span>│\\n\",\n       \"│<span style=\\\"color: #008080; text-decoration-color: #008080\\\">  
graph_caco2_wang/std_target/test   </span>│<span style=\\\"color: #800080; text-decoration-color: #800080\\\">         0.6855407357215881          </span>│\\n\",\n       \"│<span style=\\\"color: #008080; text-decoration-color: #008080\\\">              loss/test              </span>│<span style=\\\"color: #800080; text-decoration-color: #800080\\\">          2.312683582305908          </span>│\\n\",\n       \"└─────────────────────────────────────┴─────────────────────────────────────┘\\n\",\n       \"</pre>\\n\"\n      ],\n      \"text/plain\": [\n       \"┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓\\n\",\n       \"┃\\u001b[1m \\u001b[0m\\u001b[1m            Test metric            \\u001b[0m\\u001b[1m \\u001b[0m┃\\u001b[1m \\u001b[0m\\u001b[1m           DataLoader 0            \\u001b[0m\\u001b[1m \\u001b[0m┃\\n\",\n       \"┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┩\\n\",\n       \"│\\u001b[36m \\u001b[0m\\u001b[36m         gpu_allocated_GB          \\u001b[0m\\u001b[36m \\u001b[0m│\\u001b[35m \\u001b[0m\\u001b[35m       0.0019927024841308594       \\u001b[0m\\u001b[35m \\u001b[0m│\\n\",\n       \"│\\u001b[36m \\u001b[0m\\u001b[36m   graph_caco2_wang/L1Loss/test    \\u001b[0m\\u001b[36m \\u001b[0m│\\u001b[35m \\u001b[0m\\u001b[35m         2.312683582305908         \\u001b[0m\\u001b[35m \\u001b[0m│\\n\",\n       \"│\\u001b[36m \\u001b[0m\\u001b[36m    graph_caco2_wang/loss/test     \\u001b[0m\\u001b[36m \\u001b[0m│\\u001b[35m \\u001b[0m\\u001b[35m         2.312683582305908         \\u001b[0m\\u001b[35m \\u001b[0m│\\n\",\n       \"│\\u001b[36m \\u001b[0m\\u001b[36m     graph_caco2_wang/mae/test     \\u001b[0m\\u001b[36m \\u001b[0m│\\u001b[35m \\u001b[0m\\u001b[35m         2.312683582305908         \\u001b[0m\\u001b[35m \\u001b[0m│\\n\",\n       \"│\\u001b[36m \\u001b[0m\\u001b[36m  graph_caco2_wang/mean_pred/test  \\u001b[0m\\u001b[36m \\u001b[0m│\\u001b[35m \\u001b[0m\\u001b[35m        
-3.0216422080993652        \\u001b[0m\\u001b[35m \\u001b[0m│\\n\",\n       \"│\\u001b[36m \\u001b[0m\\u001b[36m graph_caco2_wang/mean_target/test \\u001b[0m\\u001b[36m \\u001b[0m│\\u001b[35m \\u001b[0m\\u001b[35m        -5.334325790405273         \\u001b[0m\\u001b[35m \\u001b[0m│\\n\",\n       \"│\\u001b[36m \\u001b[0m\\u001b[36m graph_caco2_wang/median_pred/test \\u001b[0m\\u001b[36m \\u001b[0m│\\u001b[35m \\u001b[0m\\u001b[35m         -3.02557373046875         \\u001b[0m\\u001b[35m \\u001b[0m│\\n\",\n       \"│\\u001b[36m \\u001b[0m\\u001b[36mgraph_caco2_wang/median_target/test\\u001b[0m\\u001b[36m \\u001b[0m│\\u001b[35m \\u001b[0m\\u001b[35m        -5.300000190734863         \\u001b[0m\\u001b[35m \\u001b[0m│\\n\",\n       \"│\\u001b[36m \\u001b[0m\\u001b[36m   graph_caco2_wang/pearson/test   \\u001b[0m\\u001b[36m \\u001b[0m│\\u001b[35m \\u001b[0m\\u001b[35m        0.11455658078193665        \\u001b[0m\\u001b[35m \\u001b[0m│\\n\",\n       \"│\\u001b[36m \\u001b[0m\\u001b[36m  graph_caco2_wang/r2_score/test   \\u001b[0m\\u001b[36m \\u001b[0m│\\u001b[35m \\u001b[0m\\u001b[35m        -11.43954086303711         \\u001b[0m\\u001b[35m \\u001b[0m│\\n\",\n       \"│\\u001b[36m \\u001b[0m\\u001b[36m  graph_caco2_wang/spearman/test   \\u001b[0m\\u001b[36m \\u001b[0m│\\u001b[35m \\u001b[0m\\u001b[35m        0.2947588562965393         \\u001b[0m\\u001b[35m \\u001b[0m│\\n\",\n       \"│\\u001b[36m \\u001b[0m\\u001b[36m  graph_caco2_wang/std_pred/test   \\u001b[0m\\u001b[36m \\u001b[0m│\\u001b[35m \\u001b[0m\\u001b[35m       0.014227418228983879        \\u001b[0m\\u001b[35m \\u001b[0m│\\n\",\n       \"│\\u001b[36m \\u001b[0m\\u001b[36m graph_caco2_wang/std_target/test  \\u001b[0m\\u001b[36m \\u001b[0m│\\u001b[35m \\u001b[0m\\u001b[35m        0.6855407357215881         \\u001b[0m\\u001b[35m \\u001b[0m│\\n\",\n       \"│\\u001b[36m \\u001b[0m\\u001b[36m             loss/test             \\u001b[0m\\u001b[36m \\u001b[0m│\\u001b[35m \\u001b[0m\\u001b[35m         2.312683582305908  
       \\u001b[0m\\u001b[35m \\u001b[0m│\\n\",\n       \"└─────────────────────────────────────┴─────────────────────────────────────┘\\n\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"\\u001b[32m2023-07-13 14:02:50.286\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mgraphium.data.datamodule\\u001b[0m:\\u001b[36m__init__\\u001b[0m:\\u001b[36m2425\\u001b[0m - \\u001b[1mPreparing the TDC ADMET Benchmark Group splits for each of the 1 benchmarks.\\u001b[0m\\n\",\n      \"GPU available: True (cuda), used: True\\n\",\n      \"TPU available: False, using: 0 TPU cores\\n\",\n      \"IPU available: False, using: 0 IPUs\\n\",\n      \"HPU available: False, using: 0 HPUs\\n\",\n      \"\\u001b[32m2023-07-13 14:02:50.421\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mgraphium.data.datamodule\\u001b[0m:\\u001b[36msetup\\u001b[0m:\\u001b[36m1168\\u001b[0m - \\u001b[1m-------------------\\n\",\n      \"MultitaskDataset\\n\",\n      \"\\tabout = training set\\n\",\n      \"\\tnum_graphs_total = 403\\n\",\n      \"\\tnum_nodes_total = 9065\\n\",\n      \"\\tmax_num_nodes_per_graph = 101\\n\",\n      \"\\tmin_num_nodes_per_graph = 7\\n\",\n      \"\\tstd_num_nodes_per_graph = 8.379156239363269\\n\",\n      \"\\tmean_num_nodes_per_graph = 22.49379652605459\\n\",\n      \"\\tnum_edges_total = 19420\\n\",\n      \"\\tmax_num_edges_per_graph = 220\\n\",\n      \"\\tmin_num_edges_per_graph = 14\\n\",\n      \"\\tstd_num_edges_per_graph = 18.678530787820403\\n\",\n      \"\\tmean_num_edges_per_graph = 48.188585607940446\\n\",\n      \"-------------------\\n\",\n      \"\\u001b[0m\\n\",\n      \"\\u001b[32m2023-07-13 14:02:50.422\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mgraphium.data.datamodule\\u001b[0m:\\u001b[36msetup\\u001b[0m:\\u001b[36m1169\\u001b[0m - \\u001b[1m-------------------\\n\",\n      
\"MultitaskDataset\\n\",\n      \"\\tabout = validation set\\n\",\n      \"\\tnum_graphs_total = 58\\n\",\n      \"\\tnum_nodes_total = 1431\\n\",\n      \"\\tmax_num_nodes_per_graph = 67\\n\",\n      \"\\tmin_num_nodes_per_graph = 9\\n\",\n      \"\\tstd_num_nodes_per_graph = 9.610317607573146\\n\",\n      \"\\tmean_num_nodes_per_graph = 24.67241379310345\\n\",\n      \"\\tnum_edges_total = 3102\\n\",\n      \"\\tmax_num_edges_per_graph = 144\\n\",\n      \"\\tmin_num_edges_per_graph = 18\\n\",\n      \"\\tstd_num_edges_per_graph = 21.938976007372833\\n\",\n      \"\\tmean_num_edges_per_graph = 53.48275862068966\\n\",\n      \"-------------------\\n\",\n      \"\\u001b[0m\\n\",\n      \"\\u001b[32m2023-07-13 14:02:50.423\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mgraphium.data.datamodule\\u001b[0m:\\u001b[36msetup\\u001b[0m:\\u001b[36m1186\\u001b[0m - \\u001b[1m-------------------\\n\",\n      \"MultitaskDataset\\n\",\n      \"\\tabout = test set\\n\",\n      \"\\tnum_graphs_total = 117\\n\",\n      \"\\tnum_nodes_total = 2716\\n\",\n      \"\\tmax_num_nodes_per_graph = 66\\n\",\n      \"\\tmin_num_nodes_per_graph = 3\\n\",\n      \"\\tstd_num_nodes_per_graph = 9.966464331904731\\n\",\n      \"\\tmean_num_nodes_per_graph = 23.213675213675213\\n\",\n      \"\\tnum_edges_total = 5790\\n\",\n      \"\\tmax_num_edges_per_graph = 136\\n\",\n      \"\\tmin_num_edges_per_graph = 4\\n\",\n      \"\\tstd_num_edges_per_graph = 22.25440142180863\\n\",\n      \"\\tmean_num_edges_per_graph = 49.48717948717949\\n\",\n      \"-------------------\\n\",\n      \"\\u001b[0m\\n\",\n      \"\\u001b[32m2023-07-13 14:02:50.423\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mgraphium.data.datamodule\\u001b[0m:\\u001b[36mget_max_num_nodes_datamodule\\u001b[0m:\\u001b[36m556\\u001b[0m - \\u001b[1mMax num nodes being calcuated train\\u001b[0m\\n\",\n      \"\\u001b[32m2023-07-13 14:02:50.423\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | 
\\u001b[36mgraphium.data.datamodule\\u001b[0m:\\u001b[36mget_max_num_nodes_datamodule\\u001b[0m:\\u001b[36m565\\u001b[0m - \\u001b[1mMax num nodes being calcuated val\\u001b[0m\\n\",\n      \"\\u001b[32m2023-07-13 14:02:50.424\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mgraphium.data.datamodule\\u001b[0m:\\u001b[36mprepare_data\\u001b[0m:\\u001b[36m943\\u001b[0m - \\u001b[1mData is already prepared. Skipping the preparation\\u001b[0m\\n\",\n      \"You are using a CUDA device ('NVIDIA GeForce RTX 3070') that has Tensor Cores. To properly utilize them, you should set `torch.set_float32_matmul_precision('medium' | 'high')` which will trade-off precision for performance. For more details, read https://pytorch.org/docs/stable/generated/torch.set_float32_matmul_precision.html#torch.set_float32_matmul_precision\\n\",\n      \"\\u001b[32m2023-07-13 14:02:50.426\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mgraphium.data.datamodule\\u001b[0m:\\u001b[36msetup\\u001b[0m:\\u001b[36m1168\\u001b[0m - \\u001b[1m-------------------\\n\",\n      \"MultitaskDataset\\n\",\n      \"\\tabout = training set\\n\",\n      \"\\tnum_graphs_total = 403\\n\",\n      \"\\tnum_nodes_total = 9065\\n\",\n      \"\\tmax_num_nodes_per_graph = 101\\n\",\n      \"\\tmin_num_nodes_per_graph = 7\\n\",\n      \"\\tstd_num_nodes_per_graph = 8.379156239363269\\n\",\n      \"\\tmean_num_nodes_per_graph = 22.49379652605459\\n\",\n      \"\\tnum_edges_total = 19420\\n\",\n      \"\\tmax_num_edges_per_graph = 220\\n\",\n      \"\\tmin_num_edges_per_graph = 14\\n\",\n      \"\\tstd_num_edges_per_graph = 18.678530787820403\\n\",\n      \"\\tmean_num_edges_per_graph = 48.188585607940446\\n\",\n      \"-------------------\\n\",\n      \"\\u001b[0m\\n\",\n      \"\\u001b[32m2023-07-13 14:02:50.426\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mgraphium.data.datamodule\\u001b[0m:\\u001b[36msetup\\u001b[0m:\\u001b[36m1169\\u001b[0m - \\u001b[1m-------------------\\n\",\n      
\"MultitaskDataset\\n\",\n      \"\\tabout = validation set\\n\",\n      \"\\tnum_graphs_total = 58\\n\",\n      \"\\tnum_nodes_total = 1431\\n\",\n      \"\\tmax_num_nodes_per_graph = 67\\n\",\n      \"\\tmin_num_nodes_per_graph = 9\\n\",\n      \"\\tstd_num_nodes_per_graph = 9.610317607573146\\n\",\n      \"\\tmean_num_nodes_per_graph = 24.67241379310345\\n\",\n      \"\\tnum_edges_total = 3102\\n\",\n      \"\\tmax_num_edges_per_graph = 144\\n\",\n      \"\\tmin_num_edges_per_graph = 18\\n\",\n      \"\\tstd_num_edges_per_graph = 21.938976007372833\\n\",\n      \"\\tmean_num_edges_per_graph = 53.48275862068966\\n\",\n      \"-------------------\\n\",\n      \"\\u001b[0m\\n\",\n      \"LOCAL_RANK: 0 - CUDA_VISIBLE_DEVICES: [0]\\n\",\n      \"\\n\",\n      \"  | Name  | Type                      | Params\\n\",\n      \"----------------------------------------------------\\n\",\n      \"0 | model | FullGraphMultiTaskNetwork | 111 K \\n\",\n      \"----------------------------------------------------\\n\",\n      \"111 K     Trainable params\\n\",\n      \"0         Non-trainable params\\n\",\n      \"111 K     Total params\\n\",\n      \"0.448     Total estimated model params size (MB)\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Sanity Checking: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"\\u001b[32m2023-07-13 14:02:51.236\\u001b[0m | \\u001b[33m\\u001b[1mWARNING \\u001b[0m | \\u001b[36mgraphium.trainer.predictor_summaries\\u001b[0m:\\u001b[36mget_metrics_logs\\u001b[0m:\\u001b[36m273\\u001b[0m - \\u001b[33m\\u001b[1mError for metric graph_hia_hou/mcc/val. NaN is returned. 
Exception: stack expects a non-empty TensorList\\u001b[0m\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"9ab41b110acd4f3f8977249dc715e49d\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Training: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": \"display_data\"\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"\\u001b[32m2023-07-13 14:02:52.078\\u001b[0m | \\u001b[33m\\u001b[1mWARNING \\u001b[0m | \\u001b[36mgraphium.trainer.predictor_summaries\\u001b[0m:\\u001b[36mget_metrics_logs\\u001b[0m:\\u001b[36m273\\u001b[0m - \\u001b[33m\\u001b[1mError for metric graph_hia_hou/mcc/train. NaN is returned. Exception: stack expects a non-empty TensorList\\u001b[0m\\n\",\n      \"`Trainer.fit` stopped: `max_epochs=10` reached.\\n\",\n      \"\\u001b[32m2023-07-13 14:03:06.289\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mgraphium.data.datamodule\\u001b[0m:\\u001b[36msetup\\u001b[0m:\\u001b[36m1168\\u001b[0m - \\u001b[1m-------------------\\n\",\n      \"MultitaskDataset\\n\",\n      \"\\tabout = training set\\n\",\n      \"\\tnum_graphs_total = 403\\n\",\n      \"\\tnum_nodes_total = 9065\\n\",\n      \"\\tmax_num_nodes_per_graph = 101\\n\",\n      \"\\tmin_num_nodes_per_graph = 7\\n\",\n      \"\\tstd_num_nodes_per_graph = 8.379156239363269\\n\",\n      \"\\tmean_num_nodes_per_graph = 22.49379652605459\\n\",\n      \"\\tnum_edges_total = 19420\\n\",\n      \"\\tmax_num_edges_per_graph = 220\\n\",\n      \"\\tmin_num_edges_per_graph = 14\\n\",\n      \"\\tstd_num_edges_per_graph = 18.678530787820403\\n\",\n      \"\\tmean_num_edges_per_graph = 48.188585607940446\\n\",\n      \"-------------------\\n\",\n      \"\\u001b[0m\\n\",\n      \"\\u001b[32m2023-07-13 14:03:06.290\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | 
\\u001b[36mgraphium.data.datamodule\\u001b[0m:\\u001b[36msetup\\u001b[0m:\\u001b[36m1169\\u001b[0m - \\u001b[1m-------------------\\n\",\n      \"MultitaskDataset\\n\",\n      \"\\tabout = validation set\\n\",\n      \"\\tnum_graphs_total = 58\\n\",\n      \"\\tnum_nodes_total = 1431\\n\",\n      \"\\tmax_num_nodes_per_graph = 67\\n\",\n      \"\\tmin_num_nodes_per_graph = 9\\n\",\n      \"\\tstd_num_nodes_per_graph = 9.610317607573146\\n\",\n      \"\\tmean_num_nodes_per_graph = 24.67241379310345\\n\",\n      \"\\tnum_edges_total = 3102\\n\",\n      \"\\tmax_num_edges_per_graph = 144\\n\",\n      \"\\tmin_num_edges_per_graph = 18\\n\",\n      \"\\tstd_num_edges_per_graph = 21.938976007372833\\n\",\n      \"\\tmean_num_edges_per_graph = 53.48275862068966\\n\",\n      \"-------------------\\n\",\n      \"\\u001b[0m\\n\",\n      \"\\u001b[32m2023-07-13 14:03:06.290\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mgraphium.data.datamodule\\u001b[0m:\\u001b[36msetup\\u001b[0m:\\u001b[36m1186\\u001b[0m - \\u001b[1m-------------------\\n\",\n      \"MultitaskDataset\\n\",\n      \"\\tabout = test set\\n\",\n      \"\\tnum_graphs_total = 117\\n\",\n      \"\\tnum_nodes_total = 2716\\n\",\n      \"\\tmax_num_nodes_per_graph = 66\\n\",\n      \"\\tmin_num_nodes_per_graph = 3\\n\",\n      \"\\tstd_num_nodes_per_graph = 9.966464331904731\\n\",\n      \"\\tmean_num_nodes_per_graph = 23.213675213675213\\n\",\n      \"\\tnum_edges_total = 5790\\n\",\n      \"\\tmax_num_edges_per_graph = 136\\n\",\n      \"\\tmin_num_edges_per_graph = 4\\n\",\n      \"\\tstd_num_edges_per_graph = 22.25440142180863\\n\",\n      \"\\tmean_num_edges_per_graph = 49.48717948717949\\n\",\n      \"-------------------\\n\",\n      \"\\u001b[0m\\n\",\n      \"\\u001b[32m2023-07-13 14:03:06.291\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mgraphium.data.datamodule\\u001b[0m:\\u001b[36mget_max_num_nodes_datamodule\\u001b[0m:\\u001b[36m574\\u001b[0m - \\u001b[1mMax num nodes being calcuated 
test\\u001b[0m\\n\",\n      \"\\u001b[32m2023-07-13 14:03:06.291\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mgraphium.data.datamodule\\u001b[0m:\\u001b[36mprepare_data\\u001b[0m:\\u001b[36m943\\u001b[0m - \\u001b[1mData is already prepared. Skipping the preparation\\u001b[0m\\n\",\n      \"You are using a CUDA device ('NVIDIA GeForce RTX 3070') that has Tensor Cores. To properly utilize them, you should set `torch.set_float32_matmul_precision('medium' | 'high')` which will trade-off precision for performance. For more details, read https://pytorch.org/docs/stable/generated/torch.set_float32_matmul_precision.html#torch.set_float32_matmul_precision\\n\",\n      \"\\u001b[32m2023-07-13 14:03:06.292\\u001b[0m | \\u001b[1mINFO    \\u001b[0m | \\u001b[36mgraphium.data.datamodule\\u001b[0m:\\u001b[36msetup\\u001b[0m:\\u001b[36m1186\\u001b[0m - \\u001b[1m-------------------\\n\",\n      \"MultitaskDataset\\n\",\n      \"\\tabout = test set\\n\",\n      \"\\tnum_graphs_total = 117\\n\",\n      \"\\tnum_nodes_total = 2716\\n\",\n      \"\\tmax_num_nodes_per_graph = 66\\n\",\n      \"\\tmin_num_nodes_per_graph = 3\\n\",\n      \"\\tstd_num_nodes_per_graph = 9.966464331904731\\n\",\n      \"\\tmean_num_nodes_per_graph = 23.213675213675213\\n\",\n      \"\\tnum_edges_total = 5790\\n\",\n      \"\\tmax_num_edges_per_graph = 136\\n\",\n      \"\\tmin_num_edges_per_graph = 4\\n\",\n      \"\\tstd_num_edges_per_graph = 22.25440142180863\\n\",\n      \"\\tmean_num_edges_per_graph = 49.48717948717949\\n\",\n      \"-------------------\\n\",\n      \"\\u001b[0m\\n\",\n      \"LOCAL_RANK: 0 - CUDA_VISIBLE_DEVICES: [0]\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"application/vnd.jupyter.widget-view+json\": {\n       \"model_id\": \"47a42b8385de4cd484d3aee1ee77a8ca\",\n       \"version_major\": 2,\n       \"version_minor\": 0\n      },\n      \"text/plain\": [\n       \"Testing: 0it [00:00, ?it/s]\"\n      ]\n     },\n     \"metadata\": {},\n     \"output_type\": 
\"display_data\"\n    },\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"\\u001b[32m2023-07-13 14:03:07.087\\u001b[0m | \\u001b[33m\\u001b[1mWARNING \\u001b[0m | \\u001b[36mgraphium.trainer.predictor_summaries\\u001b[0m:\\u001b[36mget_metrics_logs\\u001b[0m:\\u001b[36m273\\u001b[0m - \\u001b[33m\\u001b[1mError for metric graph_hia_hou/mcc/test. NaN is returned. Exception: stack expects a non-empty TensorList\\u001b[0m\\n\"\n     ]\n    },\n    {\n     \"data\": {\n      \"text/html\": [\n       \"<pre style=\\\"white-space:pre;overflow-x:auto;line-height:normal;font-family:Menlo,'DejaVu Sans Mono',consolas,'Courier New',monospace\\\">┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓\\n\",\n       \"┃<span style=\\\"font-weight: bold\\\">           Test metric            </span>┃<span style=\\\"font-weight: bold\\\">           DataLoader 0           </span>┃\\n\",\n       \"┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┩\\n\",\n       \"│<span style=\\\"color: #008080; text-decoration-color: #008080\\\">         gpu_allocated_GB         </span>│<span style=\\\"color: #800080; text-decoration-color: #800080\\\">      0.0018811225891113281       </span>│\\n\",\n       \"│<span style=\\\"color: #008080; text-decoration-color: #008080\\\">    graph_hia_hou/BCELoss/test    </span>│<span style=\\\"color: #800080; text-decoration-color: #800080\\\">        0.5804430842399597        </span>│\\n\",\n       \"│<span style=\\\"color: #008080; text-decoration-color: #008080\\\">   graph_hia_hou/accuracy/test    </span>│<span style=\\\"color: #800080; text-decoration-color: #800080\\\">        0.7692307829856873        </span>│\\n\",\n       \"│<span style=\\\"color: #008080; text-decoration-color: #008080\\\">     graph_hia_hou/auprc/test     </span>│<span style=\\\"color: #800080; text-decoration-color: #800080\\\">        0.8519759774208069        </span>│\\n\",\n       \"│<span 
style=\\\"color: #008080; text-decoration-color: #008080\\\">     graph_hia_hou/auroc/test     </span>│<span style=\\\"color: #800080; text-decoration-color: #800080\\\">        0.7008230686187744        </span>│\\n\",\n       \"│<span style=\\\"color: #008080; text-decoration-color: #008080\\\">     graph_hia_hou/loss/test      </span>│<span style=\\\"color: #800080; text-decoration-color: #800080\\\">        0.5804430842399597        </span>│\\n\",\n       \"│<span style=\\\"color: #008080; text-decoration-color: #008080\\\">      graph_hia_hou/mcc/test      </span>│<span style=\\\"color: #800080; text-decoration-color: #800080\\\">               nan                </span>│\\n\",\n       \"│<span style=\\\"color: #008080; text-decoration-color: #008080\\\">   graph_hia_hou/mean_pred/test   </span>│<span style=\\\"color: #800080; text-decoration-color: #800080\\\">        0.8799146413803101        </span>│\\n\",\n       \"│<span style=\\\"color: #008080; text-decoration-color: #008080\\\">  graph_hia_hou/mean_target/test  </span>│<span style=\\\"color: #800080; text-decoration-color: #800080\\\">        0.7692307829856873        </span>│\\n\",\n       \"│<span style=\\\"color: #008080; text-decoration-color: #008080\\\">  graph_hia_hou/median_pred/test  </span>│<span style=\\\"color: #800080; text-decoration-color: #800080\\\">        0.8844738006591797        </span>│\\n\",\n       \"│<span style=\\\"color: #008080; text-decoration-color: #008080\\\"> graph_hia_hou/median_target/test </span>│<span style=\\\"color: #800080; text-decoration-color: #800080\\\">               1.0                </span>│\\n\",\n       \"│<span style=\\\"color: #008080; text-decoration-color: #008080\\\">   graph_hia_hou/std_pred/test    </span>│<span style=\\\"color: #800080; text-decoration-color: #800080\\\">       0.010796700604259968       </span>│\\n\",\n       \"│<span style=\\\"color: #008080; text-decoration-color: #008080\\\">  graph_hia_hou/std_target/test   </span>│<span 
style=\\\"color: #800080; text-decoration-color: #800080\\\">       0.42313718795776367        </span>│\\n\",\n       \"│<span style=\\\"color: #008080; text-decoration-color: #008080\\\">            loss/test             </span>│<span style=\\\"color: #800080; text-decoration-color: #800080\\\">        0.5804430842399597        </span>│\\n\",\n       \"└──────────────────────────────────┴──────────────────────────────────┘\\n\",\n       \"</pre>\\n\"\n      ],\n      \"text/plain\": [\n       \"┏━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┳━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┓\\n\",\n       \"┃\\u001b[1m \\u001b[0m\\u001b[1m          Test metric           \\u001b[0m\\u001b[1m \\u001b[0m┃\\u001b[1m \\u001b[0m\\u001b[1m          DataLoader 0          \\u001b[0m\\u001b[1m \\u001b[0m┃\\n\",\n       \"┡━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━╇━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━┩\\n\",\n       \"│\\u001b[36m \\u001b[0m\\u001b[36m        gpu_allocated_GB        \\u001b[0m\\u001b[36m \\u001b[0m│\\u001b[35m \\u001b[0m\\u001b[35m     0.0018811225891113281      \\u001b[0m\\u001b[35m \\u001b[0m│\\n\",\n       \"│\\u001b[36m \\u001b[0m\\u001b[36m   graph_hia_hou/BCELoss/test   \\u001b[0m\\u001b[36m \\u001b[0m│\\u001b[35m \\u001b[0m\\u001b[35m       0.5804430842399597       \\u001b[0m\\u001b[35m \\u001b[0m│\\n\",\n       \"│\\u001b[36m \\u001b[0m\\u001b[36m  graph_hia_hou/accuracy/test   \\u001b[0m\\u001b[36m \\u001b[0m│\\u001b[35m \\u001b[0m\\u001b[35m       0.7692307829856873       \\u001b[0m\\u001b[35m \\u001b[0m│\\n\",\n       \"│\\u001b[36m \\u001b[0m\\u001b[36m    graph_hia_hou/auprc/test    \\u001b[0m\\u001b[36m \\u001b[0m│\\u001b[35m \\u001b[0m\\u001b[35m       0.8519759774208069       \\u001b[0m\\u001b[35m \\u001b[0m│\\n\",\n       \"│\\u001b[36m \\u001b[0m\\u001b[36m    graph_hia_hou/auroc/test    \\u001b[0m\\u001b[36m \\u001b[0m│\\u001b[35m \\u001b[0m\\u001b[35m       0.7008230686187744       \\u001b[0m\\u001b[35m \\u001b[0m│\\n\",\n       \"│\\u001b[36m \\u001b[0m\\u001b[36m    
graph_hia_hou/loss/test     \\u001b[0m\\u001b[36m \\u001b[0m│\\u001b[35m \\u001b[0m\\u001b[35m       0.5804430842399597       \\u001b[0m\\u001b[35m \\u001b[0m│\\n\",\n       \"│\\u001b[36m \\u001b[0m\\u001b[36m     graph_hia_hou/mcc/test     \\u001b[0m\\u001b[36m \\u001b[0m│\\u001b[35m \\u001b[0m\\u001b[35m              nan               \\u001b[0m\\u001b[35m \\u001b[0m│\\n\",\n       \"│\\u001b[36m \\u001b[0m\\u001b[36m  graph_hia_hou/mean_pred/test  \\u001b[0m\\u001b[36m \\u001b[0m│\\u001b[35m \\u001b[0m\\u001b[35m       0.8799146413803101       \\u001b[0m\\u001b[35m \\u001b[0m│\\n\",\n       \"│\\u001b[36m \\u001b[0m\\u001b[36m graph_hia_hou/mean_target/test \\u001b[0m\\u001b[36m \\u001b[0m│\\u001b[35m \\u001b[0m\\u001b[35m       0.7692307829856873       \\u001b[0m\\u001b[35m \\u001b[0m│\\n\",\n       \"│\\u001b[36m \\u001b[0m\\u001b[36m graph_hia_hou/median_pred/test \\u001b[0m\\u001b[36m \\u001b[0m│\\u001b[35m \\u001b[0m\\u001b[35m       0.8844738006591797       \\u001b[0m\\u001b[35m \\u001b[0m│\\n\",\n       \"│\\u001b[36m \\u001b[0m\\u001b[36mgraph_hia_hou/median_target/test\\u001b[0m\\u001b[36m \\u001b[0m│\\u001b[35m \\u001b[0m\\u001b[35m              1.0               \\u001b[0m\\u001b[35m \\u001b[0m│\\n\",\n       \"│\\u001b[36m \\u001b[0m\\u001b[36m  graph_hia_hou/std_pred/test   \\u001b[0m\\u001b[36m \\u001b[0m│\\u001b[35m \\u001b[0m\\u001b[35m      0.010796700604259968      \\u001b[0m\\u001b[35m \\u001b[0m│\\n\",\n       \"│\\u001b[36m \\u001b[0m\\u001b[36m graph_hia_hou/std_target/test  \\u001b[0m\\u001b[36m \\u001b[0m│\\u001b[35m \\u001b[0m\\u001b[35m      0.42313718795776367       \\u001b[0m\\u001b[35m \\u001b[0m│\\n\",\n       \"│\\u001b[36m \\u001b[0m\\u001b[36m           loss/test            \\u001b[0m\\u001b[36m \\u001b[0m│\\u001b[35m \\u001b[0m\\u001b[35m       0.5804430842399597       \\u001b[0m\\u001b[35m \\u001b[0m│\\n\",\n       \"└──────────────────────────────────┴──────────────────────────────────┘\\n\"\n      ]\n     },\n     
\"metadata\": {},\n     \"output_type\": \"display_data\"\n    }\n   ],\n   \"source\": [\n    \"results = {}\\n\",\n    \"\\n\",\n    \"for name in benchmarks: \\n\",\n    \"    \\n\",\n    \"    # Run the training-testing loop\\n\",\n    \"    cfg = filter_cfg_based_on_benchmark_name(config, name)\\n\",\n    \"    benchmark_results = training_testing_loop(cfg)\\n\",\n    \"    \\n\",\n    \"    # Extract the main metric from the config\\n\",\n    \"    metric = cfg[\\\"predictor\\\"][\\\"metrics_on_progress_bar\\\"][name][0]\\n\",\n    \"    key = f\\\"graph_{name}/{metric}/test\\\"\\n\",\n    \"    results[f\\\"{name}/{metric}\\\"] = benchmark_results[0][key]\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 9,\n   \"id\": \"c2cea93b\",\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"caco2_wang/mae: 2.312683582305908\\n\",\n      \"hia_hou/auroc: 0.7008230686187744\\n\",\n      \"\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"print(omegaconf.OmegaConf.to_yaml(results))\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"id\": \"2150be90\",\n   \"metadata\": {},\n   \"source\": [\n    \"The End. \"\n   ]\n  }\n ],\n \"metadata\": {\n  \"kernelspec\": {\n   \"display_name\": \"Python 3 (ipykernel)\",\n   \"language\": \"python\",\n   \"name\": \"python3\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.10.12\"\n  }\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 5\n}\n"
  },
  {
    "path": "notebooks/running-fingerprints-from-pretrained-model.ipynb",
    "content": "{\n \"metadata\": {\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.8.8\"\n  },\n  \"orig_nbformat\": 4,\n  \"kernelspec\": {\n   \"name\": \"python3\",\n   \"display_name\": \"Python 3.8.8 64-bit ('graphium': conda)\"\n  },\n  \"interpreter\": {\n   \"hash\": \"f4a99d018a205fcbcc0480c84566beaebcb91b08d0414b39a842df533e2a1d25\"\n  }\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 2,\n \"cells\": [\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 1,\n   \"source\": [\n    \"# General imports\\r\\n\",\n    \"import yaml\\r\\n\",\n    \"import numpy as np\\r\\n\",\n    \"import torch\\r\\n\",\n    \"import fsspec\\r\\n\",\n    \"\\r\\n\",\n    \"# Current project imports\\r\\n\",\n    \"import graphium\\r\\n\",\n    \"from graphium.config._loader import load_datamodule, load_trainer\\r\\n\",\n    \"from graphium.trainer.predictor import PredictorModule\\r\\n\"\n   ],\n   \"outputs\": [\n    {\n     \"output_type\": \"stream\",\n     \"name\": \"stderr\",\n     \"text\": [\n      \"Using backend: pytorch\\n\"\n     ]\n    }\n   ],\n   \"metadata\": {}\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 2,\n   \"source\": [\n    \"# Path containing the model and its configurations\\r\\n\",\n    \"MODEL_PATH = \\\"https://storage.valencelabs.com/graphium/pretrained-models/graphium-zinc-micro-dummy-test\\\"\\r\\n\",\n    \"MODEL_FILE = f\\\"{MODEL_PATH}/model.ckpt\\\"\\r\\n\",\n    \"CONFIG_FILE = f\\\"{MODEL_PATH}/configs.yaml\\\"\\r\\n\",\n    \"\\r\\n\",\n    \"# Path containing the SMILES data to infer\\r\\n\",\n    \"SMILES_DF_PATH = f\\\"https://storage.valencelabs.com/graphium/datasets/graphium-zinc-bench-gnn/smiles_score.csv.gz\\\"\\r\\n\",\n    \"SMILES_COL = 
\\\"SMILES\\\"\\r\\n\",\n    \"\\r\\n\",\n    \"# Number of layers to drop when inferring the fingerprints\\r\\n\",\n    \"NUM_LAYERS_TO_DROP = 1\"\n   ],\n   \"outputs\": [],\n   \"metadata\": {}\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 3,\n   \"source\": [\n    \"# Load the configuration file of the trained model\\r\\n\",\n    \"with fsspec.open(CONFIG_FILE, \\\"rb\\\") as f:\\r\\n\",\n    \"    cfg = yaml.safe_load(f)\\r\\n\",\n    \"\\r\\n\",\n    \"# Overwrite configurations of the datamodule\\r\\n\",\n    \"cfg[\\\"datamodule\\\"][\\\"module_type\\\"] = \\\"DGLFromSmilesDataModule\\\"\\r\\n\",\n    \"args = cfg[\\\"datamodule\\\"][\\\"args\\\"]\\r\\n\",\n    \"cfg[\\\"datamodule\\\"][\\\"args\\\"] = {\\r\\n\",\n    \"        \\\"df_path\\\": SMILES_DF_PATH,\\r\\n\",\n    \"        \\\"smiles_col\\\": SMILES_COL,\\r\\n\",\n    \"        \\\"label_cols\\\": [],\\r\\n\",\n    \"        \\\"featurization\\\": args[\\\"featurization\\\"],\\r\\n\",\n    \"    }\\r\\n\",\n    \"\\r\\n\",\n    \"# Load and initialize the dataset\\r\\n\",\n    \"datamodule = load_datamodule(cfg)\"\n   ],\n   \"outputs\": [],\n   \"metadata\": {}\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 4,\n   \"source\": [\n    \"# Load the model, drop the layers, and load the trainer\\r\\n\",\n    \"predictor = PredictorModule.load_from_checkpoint(MODEL_FILE)\\r\\n\",\n    \"predictor.model.drop_graph_output_nn_layers(num_layers_to_drop=NUM_LAYERS_TO_DROP)\\r\\n\",\n    \"trainer = load_trainer(cfg)\\r\\n\",\n    \"\\r\\n\",\n    \"predictor\"\n   ],\n   \"outputs\": [\n    {\n     \"output_type\": \"error\",\n     \"ename\": \"AssertionError\",\n     \"evalue\": \"\",\n     \"traceback\": [\n      \"\\u001b[1;31m---------------------------------------------------------------------------\\u001b[0m\",\n      \"\\u001b[1;31mAssertionError\\u001b[0m                            Traceback (most recent call last)\",\n      
\"\\u001b[1;32m<ipython-input-4-5f92b03a935d>\\u001b[0m in \\u001b[0;36m<module>\\u001b[1;34m\\u001b[0m\\n\\u001b[0;32m      1\\u001b[0m \\u001b[1;31m# Load the model, drop the layers, and load the trainer\\u001b[0m\\u001b[1;33m\\u001b[0m\\u001b[1;33m\\u001b[0m\\u001b[1;33m\\u001b[0m\\u001b[0m\\n\\u001b[0;32m      2\\u001b[0m \\u001b[0mpredictor\\u001b[0m \\u001b[1;33m=\\u001b[0m \\u001b[0mPredictorModule\\u001b[0m\\u001b[1;33m.\\u001b[0m\\u001b[0mload_from_checkpoint\\u001b[0m\\u001b[1;33m(\\u001b[0m\\u001b[0mMODEL_FILE\\u001b[0m\\u001b[1;33m)\\u001b[0m\\u001b[1;33m\\u001b[0m\\u001b[1;33m\\u001b[0m\\u001b[0m\\n\\u001b[1;32m----> 3\\u001b[1;33m \\u001b[0mpredictor\\u001b[0m\\u001b[1;33m.\\u001b[0m\\u001b[0mmodel\\u001b[0m\\u001b[1;33m.\\u001b[0m\\u001b[0mdrop_graph_output_nn_layers\\u001b[0m\\u001b[1;33m(\\u001b[0m\\u001b[0mnum_layers_to_drop\\u001b[0m\\u001b[1;33m=\\u001b[0m\\u001b[0mNUM_LAYERS_TO_DROP\\u001b[0m\\u001b[1;33m)\\u001b[0m\\u001b[1;33m\\u001b[0m\\u001b[1;33m\\u001b[0m\\u001b[0m\\n\\u001b[0m\\u001b[0;32m      4\\u001b[0m \\u001b[0mtrainer\\u001b[0m \\u001b[1;33m=\\u001b[0m \\u001b[0mload_trainer\\u001b[0m\\u001b[1;33m(\\u001b[0m\\u001b[0mcfg\\u001b[0m\\u001b[1;33m)\\u001b[0m\\u001b[1;33m\\u001b[0m\\u001b[1;33m\\u001b[0m\\u001b[0m\\n\\u001b[0;32m      5\\u001b[0m \\u001b[1;33m\\u001b[0m\\u001b[0m\\n\",\n      \"\\u001b[1;32mc:\\\\users\\\\domin\\\\documents\\\\gits\\\\graphium_windows\\\\graphium\\\\nn\\\\architectures.py\\u001b[0m in \\u001b[0;36mdrop_graph_output_nn_layers\\u001b[1;34m(self, num_layers_to_drop)\\u001b[0m\\n\\u001b[0;32m    867\\u001b[0m \\u001b[1;33m\\u001b[0m\\u001b[0m\\n\\u001b[0;32m    868\\u001b[0m         \\u001b[1;32massert\\u001b[0m \\u001b[0mnum_layers_to_drop\\u001b[0m \\u001b[1;33m>=\\u001b[0m \\u001b[1;36m0\\u001b[0m\\u001b[1;33m\\u001b[0m\\u001b[1;33m\\u001b[0m\\u001b[0m\\n\\u001b[1;32m--> 869\\u001b[1;33m         \\u001b[1;32massert\\u001b[0m \\u001b[0mnum_layers_to_drop\\u001b[0m \\u001b[1;33m<=\\u001b[0m 
\\u001b[0mlen\\u001b[0m\\u001b[1;33m(\\u001b[0m\\u001b[0mself\\u001b[0m\\u001b[1;33m.\\u001b[0m\\u001b[0mgraph_output_nn\\u001b[0m\\u001b[1;33m.\\u001b[0m\\u001b[0mlayers\\u001b[0m\\u001b[1;33m)\\u001b[0m\\u001b[1;33m\\u001b[0m\\u001b[1;33m\\u001b[0m\\u001b[0m\\n\\u001b[0m\\u001b[0;32m    870\\u001b[0m \\u001b[1;33m\\u001b[0m\\u001b[0m\\n\\u001b[0;32m    871\\u001b[0m         \\u001b[1;32mif\\u001b[0m \\u001b[0mnum_layers_to_drop\\u001b[0m \\u001b[1;33m>\\u001b[0m \\u001b[1;36m0\\u001b[0m\\u001b[1;33m:\\u001b[0m\\u001b[1;33m\\u001b[0m\\u001b[1;33m\\u001b[0m\\u001b[0m\\n\",\n      \"\\u001b[1;31mAssertionError\\u001b[0m: \"\n     ]\n    }\n   ],\n   \"metadata\": {}\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"source\": [\n    \"# Run the model prediction, and concatenate the batched results\\r\\n\",\n    \"preds = trainer.predict(model=predictor, datamodule=datamodule)\\r\\n\",\n    \"if isinstance(preds[0], torch.Tensor):\\r\\n\",\n    \"    preds = [p.detach().cpu().numpy() for p in preds]\\r\\n\",\n    \"preds = np.concatenate(preds, axis=0)\\r\\n\",\n    \"\\r\\n\",\n    \"preds\"\n   ],\n   \"outputs\": [],\n   \"metadata\": {}\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"source\": [\n    \"preds.shape\"\n   ],\n   \"outputs\": [],\n   \"metadata\": {}\n  }\n ]\n}\n"
  },
  {
    "path": "notebooks/running-model-from-config.ipynb",
    "content": "{\n \"cells\": [\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 1,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"Using backend: pytorch\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"%load_ext autoreload\\n\",\n    \"%autoreload 2\\n\",\n    \"\\n\",\n    \"# General imports\\n\",\n    \"import os\\n\",\n    \"from os.path import dirname, abspath\\n\",\n    \"import yaml\\n\",\n    \"from copy import deepcopy\\n\",\n    \"from omegaconf import DictConfig, OmegaConf\\n\",\n    \"\\n\",\n    \"\\n\",\n    \"# Current project imports\\n\",\n    \"import graphium\\n\",\n    \"from graphium.utils.config_loader import (\\n\",\n    \"    config_load_constants,\\n\",\n    \"    config_load_dataset,\\n\",\n    \"    config_load_architecture,\\n\",\n    \"    config_load_metrics,\\n\",\n    \"    config_load_predictor,\\n\",\n    \"    config_load_training,\\n\",\n    \")\\n\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Read the config file\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 2,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stderr\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"/home/dominique/anaconda3/envs/graphium_env/lib/python3.7/site-packages/torch/cuda/__init__.py:52: UserWarning: CUDA initialization: Found no NVIDIA driver on your system. 
Please check that you have an NVIDIA GPU and installed a driver from http://www.nvidia.com/Download/index.aspx (Triggered internally at  /opt/conda/conda-bld/pytorch_1607370156314/work/c10/cuda/CUDAFunctions.cpp:100.)\\n\",\n      \"  return torch._C._cuda_getDeviceCount() > 0\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# Set up the working directory\\n\",\n    \"MAIN_DIR = dirname(dirname(abspath(graphium.__file__)))\\n\",\n    \"os.chdir(MAIN_DIR)\\n\",\n    \"\\n\",\n    \"with open(os.path.join(MAIN_DIR, \\\"expts/config_micro_ZINC.yaml\\\"), \\\"r\\\") as f:\\n\",\n    \"    cfg = yaml.safe_load(f)\\n\",\n    \"\\n\",\n    \"cfg = dict(deepcopy(cfg))\\n\",\n    \"\\n\",\n    \"# Get the general parameters and generate the train/val/test datasets\\n\",\n    \"data_device, model_device, dtype, exp_name, seed, raise_train_error = config_load_constants(\\n\",\n    \"    **cfg[\\\"constants\\\"], main_dir=MAIN_DIR\\n\",\n    \")\"\n   ]\n  },\n  {\n   \"cell_type\": \"markdown\",\n   \"metadata\": {},\n   \"source\": [\n    \"## Load a dataset\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 3,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"\\n\",\n      \"datamodule:\\n\",\n      \" name: DGLFromSmilesDataModule\\n\",\n      \"len: 1000\\n\",\n      \"batch_size_training: 128\\n\",\n      \"batch_size_inference: 256\\n\",\n      \"num_node_feats: 55\\n\",\n      \"num_edge_feats: 13\\n\",\n      \"collate_fn: graphium_collate_fn\\n\",\n      \"featurization:\\n\",\n      \"  atom_property_list_onehot:\\n\",\n      \"  - atomic-number\\n\",\n      \"  - valence\\n\",\n      \"  atom_property_list_float:\\n\",\n      \"  - mass\\n\",\n      \"  - electronegativity\\n\",\n      \"  - in-ring\\n\",\n      \"  edge_property_list:\\n\",\n      \"  - bond-type-onehot\\n\",\n      \"  - stereo\\n\",\n      \"  - in-ring\\n\",\n      \"  add_self_loop: 
false\\n\",\n      \"  explicit_H: false\\n\",\n      \"  use_bonds_weights: false\\n\",\n      \" \\n\",\n      \"\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"\\n\",\n    \"# Load and initialize the dataset\\n\",\n    \"datamodule = config_load_dataset(**cfg[\\\"datasets\\\"], main_dir=MAIN_DIR,)\\n\",\n    \"print(\\\"\\\\ndatamodule:\\\\n\\\", datamodule, \\\"\\\\n\\\")\\n\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 4,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"\\n\",\n      \"model:\\n\",\n      \" DGL_GNN\\n\",\n      \"---------\\n\",\n      \"    pre-trans-NN(depth=1, ResidualConnectionNone)\\n\",\n      \"        [FCLayer[55 -> 32] -> Linear(32)\\n\",\n      \"    \\n\",\n      \"    main-GNN(depth=4, ResidualConnectionSimple(skip_steps=1))\\n\",\n      \"        PNAMessagePassingLayer[32 -> 32 -> 32 -> 32 -> 32]\\n\",\n      \"        -> Pooling(sum) -> FCLayer(32 -> 32, activation=None)\\n\",\n      \"    \\n\",\n      \"    post-trans-NN(depth=2, ResidualConnectionNone)\\n\",\n      \"        [FCLayer[32 -> 32 -> 32] -> Linear(32) \\n\",\n      \"\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    \"# Initialize the network\\n\",\n    \"model = config_load_architecture(\\n\",\n    \"    **cfg[\\\"architecture\\\"],\\n\",\n    \"    in_dim_nodes=datamodule.num_node_feats_with_positional_encoding,\\n\",\n    \"    in_dim_edges=datamodule.num_edge_feats\\n\",\n    \")\\n\",\n    \"\\n\",\n    \"print(\\\"\\\\nmodel:\\\\n\\\", model, \\\"\\\\n\\\")\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": 5,\n   \"metadata\": {},\n   \"outputs\": [\n    {\n     \"name\": \"stdout\",\n     \"output_type\": \"stream\",\n     \"text\": [\n      \"{'mae': mean_absolute_error, 'pearsonr': pearsonr, 'f1 > 5': f1(>5), 'precision > 5': precision(>5), 'auroc > 5': auroc(>5)}\\n\"\n     ]\n    }\n   ],\n   \"source\": [\n    
\"metrics = config_load_metrics(cfg[\\\"metrics\\\"])\\n\",\n    \"print(metrics)\"\n   ]\n  },\n  {\n   \"cell_type\": \"code\",\n   \"execution_count\": null,\n   \"metadata\": {},\n   \"outputs\": [],\n   \"source\": []\n  }\n ],\n \"metadata\": {\n  \"kernelspec\": {\n   \"display_name\": \"Python 3\",\n   \"language\": \"python\",\n   \"name\": \"python3\"\n  },\n  \"language_info\": {\n   \"codemirror_mode\": {\n    \"name\": \"ipython\",\n    \"version\": 3\n   },\n   \"file_extension\": \".py\",\n   \"mimetype\": \"text/x-python\",\n   \"name\": \"python\",\n   \"nbconvert_exporter\": \"python\",\n   \"pygments_lexer\": \"ipython3\",\n   \"version\": \"3.8.6\"\n  },\n  \"widgets\": {\n   \"application/vnd.jupyter.widget-state+json\": {\n    \"state\": {},\n    \"version_major\": 2,\n    \"version_minor\": 0\n   }\n  }\n },\n \"nbformat\": 4,\n \"nbformat_minor\": 4\n}\n"
  },
  {
    "path": "profiling/configs_profiling.yaml",
    "content": "constants:\n  seed: &seed 42\n  raise_train_error: true   # Whether the code should raise an error if it crashes during training\n\ndatamodule:\n  module_type: \"DGLFromSmilesDataModule\"\n  args:\n    df_path: https://storage.valencelabs.com/graphium/datasets/graphium-zinc-bench-gnn/smiles_score.csv.gz\n    processed_graph_data_path: null\n    label_cols: ['score']\n    smiles_col: SMILES\n\n    # Featurization\n    featurization_n_jobs: -1\n    featurization_progress: True\n    featurization:\n      atom_property_list_onehot: [atomic-number, valence]\n      atom_property_list_float: [mass, electronegativity]\n      edge_property_list: [bond-type-onehot, stereo]\n      add_self_loop: False\n      explicit_H: False\n      use_bonds_weights: False\n      pos_encoding_as_features: &pos_enc\n        pos_type: laplacian_eigvec\n        num_pos: 3\n        normalization: \"none\"\n        disconnected_comp: True\n      pos_encoding_as_directions: *pos_enc\n      on_error: warn\n\n    # Train, val, test parameters\n    split_val: null\n    split_test: null\n    split_seed: *seed\n    splits_path: https://storage.valencelabs.com/graphium/datasets/graphium-zinc-bench-gnn/indexes_train_val_test.csv.gz\n    batch_size_training: 128\n    batch_size_inference: 256\n\n    # Data loading\n    num_workers: 0\n    pin_memory: False\n    persistent_workers: False  # Keep True on Windows if running multiple workers, false for single worker\n\n\narchitecture:\n  model_type: fulldglnetwork\n  pre_nn:   # Set as null to avoid a pre-nn network\n    out_dim: &middle_dim 200\n    hidden_dims: *middle_dim\n    depth: 0\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.\n    normalization: &normalization \"batch_norm\"\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges:   # Set as null to avoid a pre-nn network\n    out_dim: 16\n    hidden_dims: 16\n    depth: 2\n    activation: relu\n    last_activation: none\n    
dropout: *dropout\n    normalization: *normalization\n    last_normalization: *normalization\n    residual_type: none\n\n  gnn:  # Set as null to avoid a post-nn network\n    out_dim: *middle_dim\n    hidden_dims: *middle_dim\n    depth: 6\n    activation: relu\n    last_activation: none\n    dropout: *dropout\n    normalization: *normalization\n    last_normalization: *normalization\n    residual_type: simple\n    pooling: 'sum'\n    virtual_node: none\n    layer_type: 'dgl:dgn-msgpass'\n    layer_kwargs:\n      # num_heads: 3\n      aggregators: [mean, max, dir1/dx_abs, dir1/smooth]\n      scalers: [identity, amplification, attenuation]\n\n  graph_output_nn:\n    out_dim: 1\n    hidden_dims: *middle_dim\n    depth: 0\n    activation: relu\n    last_activation: none\n    dropout: *dropout\n    normalization: *normalization\n    last_normalization: \"none\"\n    residual_type: none\n\n\npredictor:\n  metrics_on_progress_bar: [\"mae\", \"pearsonr\", \"f1 < 0\", \"precision < 0\"]\n  loss_fun: mse\n  random_seed: *seed\n  optim_kwargs:\n    lr: 1.e-3\n    weight_decay: 3.e-6\n  lr_reduce_on_plateau_kwargs:\n    factor: 0.5\n    patience: 50\n    min_lr: 1.e-5\n  scheduler_kwargs:\n    monitor: &monitor loss/val\n    frequency: 1\n  target_nan_mask: 0 # null: no mask, 0: 0 mask, ignore: ignore nan values from loss\n\n\nmetrics:\n  - name: mae\n    metric: mae\n    threshold_kwargs: null\n\n  - name: pearsonr\n    metric: pearsonr\n    threshold_kwargs: null\n\n  - name: f1 < 0\n    metric: f1\n    target_to_int: True\n    num_classes: 2\n    average: micro\n    threshold_kwargs: &threshold_0\n      operator: lower\n      threshold: 0\n      th_on_preds: True\n      th_on_target: True\n\n  - name: f1 < -1\n    metric: f1\n    target_to_int: True\n    num_classes: 2\n    average: micro\n    threshold_kwargs:\n      operator: lower\n      threshold: -1\n      th_on_preds: True\n      th_on_target: True\n\n  - name: precision < 0\n    metric: precision\n    average: 
micro\n    threshold_kwargs: *threshold_0\n\ntrainer:\n  logger:\n    save_dir: logs/ZINC_bench_gnn\n  early_stopping:\n    monitor: *monitor\n    min_delta: 0\n    patience: 200\n    mode: &mode min\n  model_checkpoint:\n    dirpath: models_checkpoints/ZINC_bench_gnn/\n    filename: \"model\"\n    monitor: *monitor\n    mode: *mode\n    save_top_k: 1\n    every_n_val_epochs: 1\n  trainer:\n    max_epochs: 2000\n    min_epochs: 100\n    gpus: 1\n\n"
  },
  {
    "path": "profiling/profile_mol_to_graph.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals.\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom tqdm import tqdm\nimport datamol as dm\nimport pickle\n\nfrom graphium.data.utils import load_micro_zinc\nfrom graphium.features.featurizer import mol_to_pyggraph, mol_to_adj_and_features, mol_to_graph_dict\n\n# Check out this profiling tool: https://kirillstrelkov.medium.com/python-profiling-with-vscode-3a17c0407833\n\n\ndef main():\n    df = load_micro_zinc()\n    smiles = df[\"SMILES\"].values.tolist()\n    smiles = smiles * 10\n    print(\"Num smiles: \", len(smiles))\n\n    pos_enc = {\n        \"pos_type\": \"laplacian_eigvec\",\n        \"num_pos\": 3,\n        \"normalization\": \"none\",\n        \"disconnected_comp\": True,\n    }\n    atom_property_list_float1 = [\n        \"mass\",\n        \"in-ring\",\n        \"hybridization\",\n        \"chirality\",\n        \"aromatic\",\n        \"degree\",\n        \"formal-charge\",\n        \"single-bond\",\n        \"double-bond\",\n        \"radical-electron\",\n    ]\n    atom_property_list_float2 = [\"electronegativity\", \"vdw-radius\", \"covalent-radius\", \"metal\"]\n\n    featurizer = {\n        \"atom_property_list_onehot\": [\"atomic-number\", \"valence\"],\n        \"atom_property_list_float\": atom_property_list_float1 + atom_property_list_float2,\n        \"edge_property_list\": [\n            \"bond-type-onehot\",\n            \"bond-type-float\",\n            \"stereo\",\n         
   \"in-ring\",\n            \"conjugated\",\n            \"estimated-bond-length\",\n        ],\n        \"add_self_loop\": False,\n        \"explicit_H\": False,\n        \"use_bonds_weights\": False,\n        \"pos_encoding_as_features\": pos_enc,\n        # \"on_error\": \"raise\",\n    }\n\n    graphs = []\n    for s in tqdm(smiles):\n        mol = dm.to_mol(\n            s\n        )  # Doesn't need `ordered=True` because this is just to test the speed of the featurizer\n        graphs.append(mol_to_graph_dict(mol, **featurizer))\n\n    print(graphs[0])\n\n\nif __name__ == \"__main__\":\n    main()\n"
  },
  {
    "path": "profiling/profile_one_of_k_encoding.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom tqdm import tqdm\nfrom graphium.utils.tensor import one_of_k_encoding\n\n\ndef main():\n    CLASSES = [\"AA\", \"BB\", \"CC\", \"DD\", \"EE\"]\n    CHOICES = CLASSES + [\"FF\"]\n\n    for ii in tqdm(range(500000)):\n        for choice in CHOICES:\n            one_of_k_encoding(choice, CLASSES)\n\n    print(\"DONE :)\")\n\n\nif __name__ == \"__main__\":\n    main()\n"
  },
  {
    "path": "profiling/profile_predictor.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals.\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom tqdm import tqdm\nimport os\nfrom time import time\nimport yaml\nimport fsspec\n\nfrom graphium.config._loader import (\n    load_datamodule,\n    load_metrics,\n    load_trainer,\n    load_predictor,\n    load_architecture,\n)\nfrom lightning import Trainer\n\n\ndef main():\n    CONFIG_PATH = \"expts/config_micro-PCBA.yaml\"\n    # DATA_PATH = \"https://storage.valencelabs.com/graphium/datasets/graphium-zinc-bench-gnn/smiles_score.csv.gz\"\n\n    with fsspec.open(CONFIG_PATH, \"r\") as f:\n        cfg = yaml.safe_load(f)\n    cfg[\"datamodule\"][\"args\"][\n        \"processed_graph_data_path\"\n    ] = \"graphium/data/cache/profiling/predictor_data.cache\"\n    # cfg[\"datamodule\"][\"args\"][\"df_path\"] = DATA_PATH\n    cfg[\"trainer\"][\"trainer\"][\"max_epochs\"] = 5\n    cfg[\"trainer\"][\"trainer\"][\"min_epochs\"] = 5\n\n    datamodule = load_datamodule(cfg)\n\n    # Initialize the network\n    model_class, model_kwargs = load_architecture(\n        cfg,\n        in_dim_nodes=datamodule.num_node_feats_with_positional_encoding,\n        in_dim_edges=datamodule.num_edge_feats,\n    )\n\n    metrics = load_metrics(cfg)\n    print(metrics)\n    predictor = load_predictor(cfg, model_class, model_kwargs, metrics)\n    trainer = load_trainer(cfg)\n    trainer.fit(model=predictor, datamodule=datamodule)\n\n    print(\"Done :)\")\n\n\nif __name__ == 
\"__main__\":\n    main()\n"
  },
  {
    "path": "pyproject.toml",
    "content": "[build-system]\nrequires = [\"setuptools\", \"setuptools-scm\"]\nbuild-backend = \"setuptools.build_meta\"\n\n[project]\nname = \"graphium\"\ndescription = \"Graphium: Scaling molecular GNNs to infinity.\"\ndynamic = [\"version\"]\nauthors = [\n    { name = \"Dominique Beaini\", email = \"dominique@valencediscovery.com\" },\n]\nreadme = \"README.md\"\nrequires-python = \">=3.8\"\nclassifiers = [\n    \"Development Status :: 5 - Production/Stable\",\n    \"Intended Audience :: Developers\",\n    \"Intended Audience :: Healthcare Industry\",\n    \"Intended Audience :: Science/Research\",\n    \"Topic :: Scientific/Engineering :: Artificial Intelligence\",\n    \"Topic :: Scientific/Engineering :: Bio-Informatics\",\n    \"Topic :: Scientific/Engineering :: Information Analysis\",\n    \"Topic :: Scientific/Engineering :: Medical Science Apps.\",\n    \"Natural Language :: English\",\n    \"Operating System :: OS Independent\",\n    \"Programming Language :: Python\",\n    \"Programming Language :: Python :: 3\",\n    \"Programming Language :: Python :: 3.8\",\n    \"Programming Language :: Python :: 3.9\",\n    \"Programming Language :: Python :: 3.10\",\n    \"Programming Language :: Python :: 3.11\",\n]\ndependencies = [\n    \"typer\",\n    \"loguru\",\n    \"omegaconf >=2.0.0\",\n    \"tqdm\",\n    \"platformdirs\",\n    # scientific\n    \"numpy\",\n    \"scipy >=1.4\",\n    \"pandas >=1.0\",\n    \"scikit-learn\",\n    \"fastparquet\",\n    # viz\n    \"matplotlib >=3.0.1\",\n    \"seaborn\",\n    # cloud IO\n    \"fsspec >=2021.6\",\n    \"s3fs >=2021.6\",\n    \"gcsfs >=2021.6\",\n    # ML packages\n    \"lightning >=2.0\",\n    \"torchmetrics >=0.7.0,<0.11\",\n    \"ogb\",\n    \"torch-geometric >=2.0\",\n    \"wandb\",\n    \"mup\",\n    \"torch_sparse >=0.6\",\n    \"torch_cluster >=1.5\",\n    \"torch_scatter >=2.0\",\n    # chemistry\n    \"datamol >=0.10\",\n]\n\n[project.scripts]\ngraphium = \"graphium.cli.main:app\"\ngraphium-train = 
\"graphium.cli.train_finetune_test:cli\"\n\n[project.urls]\nWebsite = \"https://graphium.datamol.io/\"\n\"Source Code\" = \"https://github.com/datamol-io/graphium\"\n\"Bug Tracker\" = \"https://github.com/datamol-io/graphium/issues\"\nDocumentation = \"https://graphium-docs.datamol.io/\"\n\n[tool.setuptools]\ninclude-package-data = true\n\n[tool.setuptools_scm]\nfallback_version = \"dev\"\n\n[tool.setuptools.packages.find]\nwhere = [\".\"]\ninclude = [\"graphium\", \"graphium.*\", \"expts\", \"expts.*\"]\nexclude = []\nnamespaces = true\n\n[tool.setuptools.package-data]\n\"graphium\" = [\"**/*.csv\", \"**/*.parquet\", \"**/*.yaml\"]\n\n[tool.black]\nline-length = 110\ntarget-version = ['py310', 'py311']\ninclude = '\\.pyi?$'\n\n[tool.pytest.ini_options]\nminversion = \"6.0\"\naddopts = \"--verbose --durations=10 -n 1 --cov=graphium --cov-fail-under=60 --cov-report xml --cov-report term\"\ntestpaths = [\"tests\"]\nfilterwarnings = [\n    \"ignore::DeprecationWarning:ray.*:\",\n    \"ignore::DeprecationWarning:numba.*:\",\n    \"ignore::DeprecationWarning:lightning_fabric.*:\",\n    \"ignore::DeprecationWarning:pytorch_lightning.*:\",\n    \"ignore::DeprecationWarning:pkg_resources.*:\",\n]\nmarkers = [\n    \"ipu: marks tests that are specific to the IPU (deselect with '-m \\\"not ipu\\\"')\",\n]\n\n[tool.coverage.run]\nsource = [\"graphium/\"]\ndisable_warnings = [\"no-data-collected\"]\ndata_file = \".coverage/coverage\"\n\n[tool.coverage.report]\nomit = [\"graphium/__init__.py\", \"graphium/_version.py\"]\n\n[tool.coverage.xml]\noutput = \"coverage.xml\"\n"
  },
  {
    "path": "scripts/balance_params_and_train.sh",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\n\n#!/bin/bash\n\nset -e\n\nold_dim=0\nnum_params=10000000\nrel_error=0.005\nrel_step=0.5\n\nout=$(graphium params balance \"${@}\" \"$num_params\" \"$rel_error\" \"$rel_step\" \"$old_dim\")\nread -r new_dim new_edge_dim rel_step stop <<< \"$out\"\n\nwhile true; do\n    tmp_dim=$new_dim\n    out=$(graphium params balance \"${@}\" constants.gnn_dim=\"$new_dim\" constants.gnn_edge_dim=\"$new_edge_dim\" \"$num_params\" \"$rel_error\" \"$rel_step\" \"$old_dim\")\n    read -r new_dim new_edge_dim rel_step stop <<< \"$out\"\n    old_dim=$tmp_dim\n    [[ $stop == \"true\" ]] && break\ndone\n\ngraphium-train \"${@}\" constants.gnn_dim=\"$new_dim\" constants.gnn_edge_dim=\"$new_edge_dim\""
  },
  {
    "path": "scripts/convert_yml.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\n\"\"\"\nConvert the dependencies from conda's `env.yml` to pip `requirements.txt`\n\"\"\"\n\nimport ruamel.yaml\n\nyaml = ruamel.yaml.YAML()\ndata = yaml.load(open(\"env.yml\"))\n\nrequirements = []\nfor dep in data[\"dependencies\"]:\n    if isinstance(dep, str):\n        outputs = dep.split(\"=\")\n        if len(outputs) == 1:\n            package = outputs[0]\n            requirements.append(package)\n        elif len(outputs) == 2:\n            package, package_version = outputs[0], outputs[1]\n            requirements.append(package + \"==\" + package_version)\n        elif len(outputs) == 3:\n            package, package_version, python_version = outputs[0], outputs[1], outputs[2]\n            requirements.append(package + \"==\" + package_version)\n    elif isinstance(dep, dict):\n        for preq in dep.get(\"pip\", []):\n            requirements.append(preq)\n\nwith open(\"requirements.txt\", \"w\") as fp:\n    for requirement in requirements:\n        print(requirement, file=fp)\n"
  },
  {
    "path": "scripts/ipu_start.sh",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\n\n\"\"\"\nStart the ipu environment and SDK\n\"\"\"\n\nsource /opt/gc/sdk-3.0.0+1128/poplar-ubuntu_20_04-3.0.0+5468-0379b9a65d/enable.sh\nsource /opt/gc/sdk-3.0.0+1128/popart-ubuntu_20_04-3.0.0+5468-0379b9a65d/enable.sh\n\nsource ~/.venv/graphium_ipu/bin/activate # Change to your path\n\nexport VISUAL=vim\nexport EDITOR=\"$VISUAL\"\n"
  },
  {
    "path": "scripts/ipu_venv.sh",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\n\n\"\"\"\nCreate the pip environment for IPU\n\"\"\"\n\n## Uncomment this to create the folder for the environment\n# mkdir ~/.venv                              # Create the folder for the environment\n# python3 -m venv ~/.venv/graphium_ipu           # Create the environment\n# source ~/.venv/graphium_ipu/bin/activate       # Activate the environment\n\n# Installing the dependencies for the IPU environment\npip install torch==1.10+cpu torchvision==0.11+cpu torchaudio==0.10 -f https://download.pytorch.org/whl/torch_stable.html\npip install torch-scatter torch-sparse torch-cluster torch-spline-conv torch-geometric -f https://data.pyg.org/whl/torch-1.10.0+cpu.html\npip install dgl dglgo -f https://data.dgl.ai/wheels/repo.html\npip install /opt/gc/sdk-3.0.0+1128/poptorch-3.0.0+84519_672c9cbc7f_ubuntu_20_04-cp38-cp38-linux_x86_64.whl\npip install -r requirements.txt\npip install -e .\n"
  },
  {
    "path": "scripts/scale_mpnn.sh",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\n\n#!/bin/bash\n\ngraphium-train \\\n    --config-path=/home/frederik_valencediscovery_com/projects/graphium_hps/expts/configs/ \\\n    --config-name=config_mpnn_base.yaml \\\n    constants.max_epochs=100 \\\n    trainer.model_checkpoint.dirpath=model_checkpoints/large-dataset/scale_mpnn/ \\\n    +architecture.mup_scale_factor=2 +architecture.mup_base_path=mup/mpnn_base/base_shapes.yaml \\\n    datamodule.args.batch_size_inference=1024 datamodule.args.batch_size_training=1024 +trainer.trainer.accumulate_grad_batches=2 \\"
  },
  {
    "path": "tests/.gitignore",
    "content": "temp_cache_*"
  },
  {
    "path": "tests/__init__.py",
    "content": ""
  },
  {
    "path": "tests/config_test_ipu_dataloader.yaml",
    "content": "# Testing the multitask pipeline with the QM9 dataset on IPU, by splitting it up into three tasks: homo, alpha and cv.\nconstants:\n  name: &name test_ipu #qm9_full\n  seed: &seed 42\n  raise_train_error: true   # Whether the code should raise an error if it crashes during training\n\naccelerator:\n  type: ipu  # cpu or ipu or gpu\n  config_override:\n    datamodule:\n      args:\n        ipu_dataloader_training_opts:\n          mode: async\n          max_num_nodes_per_graph: 20 # train max nodes: 20, max_edges: 54\n          max_num_edges_per_graph: 60\n        ipu_dataloader_inference_opts:\n          mode: async\n          max_num_nodes_per_graph: 16 # valid max nodes: 51, max_edges: 118\n          max_num_edges_per_graph: 120\n        # Data handling-related\n        batch_size_training: 6\n        batch_size_inference: 6\n    trainer:\n      trainer:\n        precision: 16\n        accumulate_grad_batches: 4\n\n  ipu_config:\n    - deviceIterations(2)\n\ndatamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      homo:\n        df: null\n        df_path: &df_path https://storage.valencelabs.com/datasets-public-research/PCQM4M/cxsmiles/pcqm4mv2-2k-lumo-alpha.csv\n        smiles_col: \"cxsmiles\"\n        label_cols: [\"homo_lumo_gap\", \"lumo\"]\n        split_val: 0.2\n        split_test: 0.2\n        seed: *seed\n        splits_path: null                 # This may not always be provided\n        sample_size: null                 # This may not always be provided\n        idx_col: null                     # This may not always be provided\n        weights_col: null                 # This may not always be provided\n        weights_type: null                # This may not always be provided\n        task_level: graph\n      alpha:\n        df: null\n        df_path: *df_path\n        smiles_col: 
\"cxsmiles\"\n        label_cols: [\"alpha\"]\n        split_val: 0.2\n        split_test: 0.2\n        seed: *seed\n        splits_path: null                 # This may not always be provided\n        sample_size: null                 # This may not always be provided\n        idx_col: null                     # This may not always be provided\n        weights_col: null                 # This may not always be provided\n        weights_type: null                # This may not always be provided\n        task_level: graph\n    # Featurization\n    prepare_dict_or_graph: pyg:graph\n    featurization_n_jobs: 0\n    featurization_progress: True\n    featurization:\n      atom_property_list_onehot: [atomic-number, valence]\n      atom_property_list_float: [mass, electronegativity, in-ring]\n      edge_property_list: [bond-type-onehot, stereo, in-ring]\n      conformer_property_list: [positions_3d]\n      add_self_loop: False\n      explicit_H: False\n      use_bonds_weights: False\n      pos_encoding_as_features: # encoder dropout 0.18\n        pos_types:\n          node_laplacian_eigvec:\n            pos_type: laplacian_eigvec\n            pos_level: node\n            num_pos: 5\n            normalization: \"none\"\n            disconnected_comp: True\n          node_laplacian_eigval:\n            pos_type: laplacian_eigval\n            pos_level: node\n            num_pos: 5\n            normalization: \"none\"\n            disconnected_comp: True\n          rw_return_probs:\n            pos_type: rw_return_probs\n            pos_level: node\n            ksteps: [4, 8]\n          edge_rw_transition_probs:\n            pos_type: rw_transition_probs\n            pos_level: edge\n            ksteps: [2, 4]\n          nodepair_rw_return_probs:\n            pos_type: rw_return_probs\n            pos_level: nodepair\n            ksteps: [4]\n          electrostatic:\n            pos_type: electrostatic\n            pos_level: node\n          edge_commute:\n            
pos_type: commute\n            pos_level: edge\n          nodepair_graphormer:\n            pos_type: graphormer\n            pos_level: nodepair\n\n    num_workers: -1\n\narchitecture:\n  model_type: FullGraphMultiTaskNetwork\n  mup_base_path: null\n\n  pre_nn:   # Set as null to avoid a pre-nn network\n    out_dim: 16\n    hidden_dims: 16\n    depth: 1\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.1\n    normalization: &normalization batch_norm\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges:   # Set as null to avoid a pre-nn network\n    out_dim: 16\n    hidden_dims: 16\n    depth: 1\n    activation: relu\n    last_activation: none\n    dropout: *dropout\n    normalization: *normalization\n    last_normalization: *normalization\n    residual_type: none\n\n  pe_encoders:\n    out_dim: &pe_out_dim 16\n    edge_out_dim: &edge_pe_out_dim 8\n    pool: \"sum\" #\"mean\" \"max\"\n    last_norm: None #\"batch_norm\", \"layer_norm\"\n    max_num_nodes_per_graph: 30\n    encoders:\n      emb_la_pos:\n        encoder_type: \"laplacian_pe\"\n        input_keys: [\"laplacian_eigvec\", \"laplacian_eigval\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 32\n        model_type: 'DeepSet' #'Transformer' or 'DeepSet'\n        num_layers: 2\n        num_layers_post: 1 # Num. 
layers to apply after pooling\n        dropout: 0.1\n        first_normalization: \"none\" #\"batch_norm\" or \"layer_norm\"\n      emb_rwse:\n        encoder_type: \"mlp\"\n        input_keys: [\"rw_return_probs\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 32\n        num_layers: 2\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n        first_normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n      emb_electrostatic:\n        encoder_type: \"mlp\"\n        input_keys: [\"electrostatic\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 32\n        num_layers: 1\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n        first_normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n      emb_edge_rwse:\n        encoder_type: \"mlp\"\n        input_keys: [\"edge_rw_transition_probs\"]\n        output_keys: [\"edge_feat\"]\n        hidden_dim: 32\n        num_layers: 1\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n      emb_edge_pes:\n        encoder_type: \"cat_mlp\"\n        input_keys: [\"edge_rw_transition_probs\", \"edge_commute\"]\n        output_keys: [\"edge_feat\"]\n        hidden_dim: 32\n        num_layers: 1\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n      gaussian_pos:\n        encoder_type: \"gaussian_kernel\"\n        input_keys: [\"positions_3d\"]\n        output_keys: [\"feat\", \"nodepair_gaussian_bias_3d\"]\n        num_heads: &num_heads 2\n        num_layers: 2\n        embed_dim: *pe_out_dim\n        use_input_keys_prefix: False\n\n  gnn:  # Set as null to avoid a post-nn network\n    out_dim: 8\n    hidden_dims: 16\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: *dropout\n    normalization: *normalization\n    last_normalization: *normalization\n    residual_type: simple\n    
virtual_node: 'none'\n    layer_type: 'pyg:gps' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n    layer_kwargs:  # Parameters for the model itself. You could define dropout_attn: 0.1\n      mpnn_type: 'pyg:gine'\n      mpnn_kwargs: null\n        #out_dim_edges: 10\n      attn_type: \"none\" # \"full-attention\", \"none\"\n      attn_kwargs: null\n\n  graph_output_nn:\n    graph:\n      pooling: [sum, mean, max]\n      out_dim: 8\n      hidden_dims: 8\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n  task_heads:\n    homo:\n      out_dim: 2\n      hidden_dims: 8\n      depth: 1                          # Not needed if we have hidden_dims\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n      task_level: graph\n    alpha:\n      out_dim: 1\n      hidden_dims: 8\n      depth: 1                          # Not needed if we have hidden_dims\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n      task_level: graph\n    cv:\n      out_dim: 1\n      hidden_dims: 8\n      depth: 2                          # Not needed if we have hidden_dims\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n      task_level: graph\n\n#Task-specific\npredictor:\n  metrics_on_progress_bar:\n    homo: [\"mae\"]\n    alpha: [\"mae\"]\n  loss_fun:\n    homo: mse_ipu\n    alpha: mse_ipu\n  random_seed: *seed\n  optim_kwargs:\n    lr: 1.e-3\n  target_nan_mask: null\n\n# Task-specific\nmetrics:\n  homo:\n    - name: mae\n      metric: mae\n      threshold_kwargs: 
null\n      target_nan_mask: null\n  alpha:\n    - name: mae\n      metric: mae\n      threshold_kwargs: null\n\ntrainer:\n  seed: *seed\n  logger:\n    save_dir: logs/QM9\n    name: *name\n  model_checkpoint:\n    dirpath: models_checkpoints/QM9/\n    filename: *name\n    save_top_k: 1\n    every_n_epochs: 1\n  trainer:\n    max_epochs: 2\n    min_epochs: 1\n"
  },
  {
    "path": "tests/config_test_ipu_dataloader_multitask.yaml",
    "content": "# Testing the gcn model with the PCQMv2 dataset on IPU.\nconstants:\n  name: &name neurips2023_small_data_gcn\n  seed: &seed 42\n  raise_train_error: true   # Whether the code should raise an error if it crashes during training\n\naccelerator:\n  type: ipu  # cpu or ipu or gpu\n  config_override:\n    datamodule:\n      args:\n        ipu_dataloader_training_opts:\n          mode: async\n          max_num_nodes_per_graph: 44 # train max nodes: 20, max_edges: 54\n          max_num_edges_per_graph: 80\n        ipu_dataloader_inference_opts:\n          mode: async\n          max_num_nodes_per_graph: 44 # valid max nodes: 51, max_edges: 118\n          max_num_edges_per_graph: 80\n        # Data handling-related\n        batch_size_training: 50\n        batch_size_inference: 50\n    predictor:\n      optim_kwargs:\n        loss_scaling: 1024\n    trainer:\n      trainer:\n        precision: 16\n        accumulate_grad_batches: 4\n\n  ipu_config:\n    - deviceIterations(5) # IPU would require large batches to be ready for the model.\n    - replicationFactor(1)\n    # - enableProfiling(\"graph_analyser\")       # The folder where the profile will be stored\n    # - enableExecutableCaching(\"pop_compiler_cache\")\n    - TensorLocations.numIOTiles(128)\n    - _Popart.set(\"defaultBufferingDepth\", 128)\n    - Precision.enableStochasticRounding(True)\n    - useIpuModel(True)\n\n# accelerator:\n#   type: cpu  # cpu or ipu or gpu\n#   config_override:\n#     datamodule:\n#       batch_size_training: 64\n#       batch_size_inference: 256\n#     trainer:\n#       trainer:\n#         precision: 32\n#         accumulate_grad_batches: 1\n\ndatamodule:\n  module_type: \"MultitaskFromSmilesDataModule\"\n  # module_type: \"FakeDataModule\"  # Option to use generated data\n  args: # Matches that in the test_multitask_datamodule.py case.\n    task_specific_args:   # To be replaced by a new class \"DatasetParams\"\n      qm9:\n        df: null\n        df_path: 
qm9.csv.gz\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Small-dataset/qm9.csv.gz\n        # or set path as the URL directly\n        smiles_col: \"smiles\"\n        label_cols: [\"A\", \"B\", \"C\", \"mu\", \"alpha\", \"homo\", \"lumo\", \"gap\", \"r2\", \"zpve\", \"u0\", \"u298\", \"h298\", \"g298\", \"cv\", \"u0_atom\", \"u298_atom\", \"h298_atom\", \"g298_atom\"]\n        sample_size: 2000 # use sample_size for test\n        seed: *seed\n        task_level: graph\n        label_normalization:\n          normalize_val_test: True\n          method: \"normal\"\n\n      tox21:\n        df: null\n        df_path: Tox21-7k-12-labels.csv.gz\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Small-dataset/Tox21-7k-12-labels.csv.gz\n        # or set path as the URL directly\n        smiles_col: \"smiles\"\n        label_cols: [\"NR-AR\", \"NR-AR-LBD\", \"NR-AhR\", \"NR-Aromatase\", \"NR-ER\", \"NR-ER-LBD\", \"NR-PPAR-gamma\", \"SR-ARE\", \"SR-ATAD5\", \"SR-HSE\", \"SR-MMP\", \"SR-p53\"]\n        sample_size: 2000 # use sample_size for test\n        seed: *seed\n        task_level: graph\n\n      zinc:\n        df: null\n        df_path: ZINC12k.csv.gz\n        # df_path: data/neurips2023/small-dataset/ZINC12k.csv.gz\n        # wget https://storage.valencelabs.com/graphium/datasets/neurips_2023/Small-dataset/ZINC12k.csv.gz\n        # or set path as the URL directly\n        smiles_col: \"smiles\"\n        label_cols: [\"SA\", \"logp\", \"score\"]\n        sample_size: 2000 # use sample_size for test\n        seed: *seed\n        task_level: graph\n        label_normalization:\n          normalize_val_test: True\n          method: \"normal\"\n\n    # Featurization\n    prepare_dict_or_graph: pyg:graph\n    featurization_n_jobs: 30\n    featurization_progress: True\n    featurization_backend: \"loky\"\n    # processed_graph_data_path: \"../datacache/neurips2023-small/\"\n    featurization:\n    # OGB: 
['atomic_num', 'degree', 'possible_formal_charge', 'possible_numH' (total-valence),\n    # 'possible_number_radical_e', 'possible_is_aromatic', 'possible_is_in_ring',\n    # 'num_chiral_centers (not included yet)']\n      atom_property_list_onehot: [atomic-number, group, period, total-valence]\n      atom_property_list_float: [degree, formal-charge, radical-electron, aromatic, in-ring]\n      # OGB: ['possible_bond_type', 'possible_bond_stereo', 'possible_is_in_ring']\n      edge_property_list: [bond-type-onehot, stereo, in-ring]\n      add_self_loop: False\n      explicit_H: False # whether explicit H atoms are included\n      use_bonds_weights: False\n      pos_encoding_as_features: # encoder dropout 0.18\n        pos_types:\n          lap_eigvec:\n            pos_level: node\n            pos_type: laplacian_eigvec\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # whether eigenvalues/eigenvectors for disconnected graphs are included\n          lap_eigval:\n            pos_level: node\n            pos_type: laplacian_eigval\n            num_pos: 8\n            normalization: \"none\" # normalization already applied on the eigenvectors\n            disconnected_comp: True # whether eigenvalues/eigenvectors for disconnected graphs are included\n          rw_pos: # use same name as pe_encoder\n            pos_level: node\n            pos_type: rw_return_probs\n            ksteps: 16\n\n    num_workers: -1 # -1 to use all\n    persistent_workers: False # whether to keep dataloader workers alive between epochs.\n    # Setting persistent_workers to False might make the start of each epoch very long.\n\n\narchitecture:\n  model_type: FullGraphMultiTaskNetwork\n  mup_base_path: null\n  pre_nn:   # Set as null to avoid a pre-nn network\n    out_dim: 16\n    hidden_dims: 16\n    depth: 1\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.1\n    normalization: 
&normalization layer_norm\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges:  null # Set as null to avoid a pre-nn network\n\n  pe_encoders:\n    out_dim: 32\n    pool: \"sum\" #\"mean\" \"max\"\n    last_norm: None #\"batch_norm\", \"layer_norm\"\n    encoders: #la_pos |  rw_pos\n      la_pos:  # Set as null to avoid a pre-nn network\n        encoder_type: \"laplacian_pe\"\n        input_keys: [\"laplacian_eigvec\", \"laplacian_eigval\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        model_type: 'DeepSet' #'Transformer' or 'DeepSet'\n        num_layers: 2\n        num_layers_post: 1 # Num. layers to apply after pooling\n        dropout: 0.1\n        first_normalization: \"none\" #\"batch_norm\" or \"layer_norm\"\n      rw_pos:\n        encoder_type: \"mlp\"\n        input_keys: [\"rw_return_probs\"]\n        output_keys: [\"feat\"]\n        hidden_dim: 64\n        out_dim: 32\n        num_layers: 2\n        dropout: 0.1\n        normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n        first_normalization: \"layer_norm\" #\"batch_norm\" or \"layer_norm\"\n\n\n\n  gnn:  # Set as null to avoid a post-nn network\n    in_dim: 16 # or otherwise the correct value\n    out_dim: &gnn_dim 16\n    hidden_dims: *gnn_dim\n    depth: 1\n    activation: gelu\n    last_activation: none\n    dropout: 0.1\n    normalization: \"layer_norm\"\n    last_normalization: *normalization\n    residual_type: simple\n    virtual_node: 'none'\n    layer_type: 'pyg:gcn' #pyg:gine #'pyg:gps' # pyg:gated-gcn, pyg:gine,pyg:gps\n    layer_kwargs: null # Parameters for the model itself. 
You could define dropout_attn: 0.1\n\n\n  graph_output_nn:\n    graph:\n      pooling: [sum]\n      out_dim: *gnn_dim\n      hidden_dims: *gnn_dim\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n  task_heads:\n    qm9:\n      task_level: graph\n      out_dim: 19\n      hidden_dims: 16\n      depth: 1\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n    tox21:\n      task_level: graph\n      out_dim: 12\n      hidden_dims: 16\n      depth: 1\n      activation: relu\n      last_activation: sigmoid\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n    zinc:\n      task_level: graph\n      out_dim: 3\n      hidden_dims: 16\n      depth: 2\n      activation: relu\n      last_activation: none\n      dropout: *dropout\n      normalization: *normalization\n      last_normalization: \"none\"\n      residual_type: none\n\n#Task-specific\npredictor:\n  metrics_on_progress_bar:\n    qm9: [\"mae\"]\n    tox21: [\"auroc\"]\n    zinc: [\"mae\"]\n  loss_fun:\n    qm9: mae_ipu\n    tox21: bce_ipu\n    zinc: mae_ipu\n  random_seed: *seed\n  optim_kwargs:\n    lr: 4.e-5 # warmup can be scheduled using torch_scheduler_kwargs\n    # weight_decay: 1.e-7\n  torch_scheduler_kwargs:\n    module_type: WarmUpLinearLR\n    max_num_epochs: &max_epochs 1\n    warmup_epochs: 1\n    verbose: False\n  scheduler_kwargs:\n  #  monitor: &monitor qm9/mae/train\n  #  mode: min\n  #  frequency: 1\n  target_nan_mask: null # null: no mask, 0: 0 mask, ignore-flatten, ignore-mean-per-label\n  multitask_handling: flatten # flatten, mean-per-label\n\n# Task-specific\nmetrics:\n  qm9: &qm9_metrics\n    - name: mae\n      metric: mae_ipu\n      target_nan_mask: 
null\n      multitask_handling: flatten\n      threshold_kwargs: null\n    - name: pearsonr\n      metric: pearsonr_ipu\n      threshold_kwargs: null\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n    - name: r2_score\n      metric: r2_score_ipu\n      target_nan_mask: null\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n  tox21:\n    - name: auroc\n      metric: auroc_ipu\n      task: binary\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n    - name: avpr\n      metric: average_precision_ipu\n      task: binary\n      multitask_handling: mean-per-label\n      threshold_kwargs: null\n    - name: f1 > 0.5\n      metric: f1\n      multitask_handling: mean-per-label\n      target_to_int: True\n      num_classes: 2\n      average: micro\n      threshold_kwargs: &threshold_05\n        operator: greater\n        threshold: 0.5\n        th_on_preds: True\n        th_on_target: True\n    - name: precision > 0.5\n      metric: precision\n      multitask_handling: mean-per-label\n      average: micro\n      threshold_kwargs: *threshold_05\n  zinc: *qm9_metrics\n\ntrainer:\n  seed: *seed\n  logger:\n    save_dir: logs/neurips2023-small/\n    name: *name\n    project: *name\n  #early_stopping:\n  #  monitor: *monitor\n  #  min_delta: 0\n  #  patience: 10\n  #  mode: &mode min\n  model_checkpoint:\n    dirpath: models_checkpoints/neurips2023-small-gcn/\n    filename: *name\n    # monitor: *monitor\n    # mode: *mode\n    # save_top_k: 1\n    save_last: True\n  trainer:\n    max_epochs: *max_epochs\n    min_epochs: 1\n    check_val_every_n_epoch: 20\n"
  },
  {
    "path": "tests/conftest.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals.\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nimport pathlib\n\nimport pytest\n\nTEST_DIR_PATH = pathlib.Path(__file__).parent / \"data\"\nDATA_DIR_PATH = TEST_DIR_PATH.absolute()\n\n\n@pytest.fixture\ndef datadir(request):\n    return DATA_DIR_PATH\n"
  },
  {
    "path": "tests/data/config_micro_ZINC.yaml",
    "content": "constants:\n  seed: &seed 42\n  raise_train_error: true   # Whether the code should raise an error if it crashes during training\n\ndatamodule:\n  module_type: \"DGLFromSmilesDataModule\"\n  args:\n    df_path: graphium/data/micro_ZINC/micro_ZINC.csv\n    processed_graph_data_path: graphium/data/cache/micro_ZINC/\n    label_cols: ['score']\n    smiles_col: SMILES\n\n    # Featurization\n    featurization_n_jobs: -1\n    featurization_progress: True\n    featurization:\n      atom_property_list_onehot: [atomic-number, valence]\n      atom_property_list_float: [mass, electronegativity, in-ring]\n      edge_property_list: [bond-type-onehot, stereo, in-ring]\n      add_self_loop: False\n      explicit_H: False\n      use_bonds_weights: False\n      pos_encoding_as_features: &pos_enc\n        pos_type: laplacian_eigvec\n        num_pos: 3\n        normalization: \"none\"\n        disconnected_comp: True\n      pos_encoding_as_directions: *pos_enc\n\n    # Train, val, test parameters\n    split_val: 0.2\n    split_test: 0.2\n    split_seed: *seed\n    splits_path: null\n    batch_size_training: 128\n    batch_size_inference: 128\n\n    # Data loading\n    num_workers: 0\n    pin_memory: False\n    persistent_workers: False  # Keep True on Windows if running multiple workers\n\n\narchitecture:\n  model_type: fulldglnetwork\n  pre_nn:   # Set as null to avoid a pre-nn network\n    out_dim: 32\n    hidden_dims: 32\n    depth: 1\n    activation: relu\n    last_activation: none\n    dropout: &dropout 0.1\n    normalization: &normalization \"batch_norm\"\n    last_normalization: *normalization\n    residual_type: none\n\n  pre_nn_edges:   # Set as null to avoid a pre-nn network\n    out_dim: 16\n    hidden_dims: 16\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: *dropout\n    normalization: *normalization\n    last_normalization: *normalization\n    residual_type: none\n\n  gnn:  # Set as null to avoid a post-nn network\n    
out_dim: 32\n    hidden_dims: 32\n    depth: 4\n    activation: relu\n    last_activation: none\n    dropout: *dropout\n    normalization: *normalization\n    last_normalization: *normalization\n    residual_type: simple\n    pooling: [sum, max, dir1]\n    virtual_node: 'sum'\n    layer_type: 'dgl:dgn-msgpass'\n    layer_kwargs:\n      # num_heads: 3\n      aggregators: [mean, max, dir1/dx_abs, dir1/smooth]\n      scalers: [identity, amplification, attenuation]\n\n  graph_output_nn:\n    out_dim: 1\n    hidden_dims: 32\n    depth: 2\n    activation: relu\n    last_activation: none\n    dropout: *dropout\n    normalization: *normalization\n    last_normalization: \"none\"\n    residual_type: none\n\npredictor:\n  metrics_on_progress_bar: [\"mae\", \"pearsonr\", \"f1 > 3\", \"precision > 3\"]\n  loss_fun: mse\n  random_seed: *seed\n  optim_kwargs:\n    lr: 1.e-2\n    weight_decay: 1.e-7\n  lr_reduce_on_plateau_kwargs:\n    factor: 0.5\n    patience: 7\n  scheduler_kwargs:\n    monitor: &monitor loss/val\n    frequency: 1\n  target_nan_mask: 0 # null: no mask, 0: 0 mask, ignore: ignore nan values from loss\n\n\nmetrics:\n  - name: mae\n    metric: mae\n    threshold_kwargs: null\n\n  - name: pearsonr\n    metric: pearsonr\n    threshold_kwargs: null\n\n  - name: f1 > 3\n    metric: f1\n    target_to_int: True\n    num_classes: 2\n    average: micro\n    threshold_kwargs: &threshold_3\n      operator: greater\n      threshold: 3\n      th_on_preds: True\n      th_on_target: True\n\n  - name: f1 > 5\n    metric: f1\n    target_to_int: True\n    num_classes: 2\n    average: micro\n    threshold_kwargs:\n      operator: greater\n      threshold: 5\n      th_on_preds: True\n      th_on_target: True\n\n  - name: precision > 3\n    metric: precision\n    average: micro\n    threshold_kwargs: *threshold_3\n\ntrainer:\n  logger:\n    save_dir: logs/micro_ZINC\n  early_stopping:\n    monitor: *monitor\n    min_delta: 0\n    patience: 10\n    mode: &mode min\n  
model_checkpoint:\n    dirpath: models_checkpoints/micro_ZINC/\n    filename: \"model\"\n    monitor: *monitor\n    mode: *mode\n    save_top_k: 1\n    every_n_epochs: 1\n  trainer:\n    max_epochs: 25\n    min_epochs: 5\n    gpus: 1\n\n"
  },
  {
    "path": "tests/data/micro_ZINC.csv",
    "content": "SMILES,SA,logp,score\nCCc1ccc(C(=O)/C(=C(/S)NC2CC2)[n+]2ccc(CC)cc2)cc1,2.775189478,3.7897,1.014510522\nC=CCOc1ccc(C(F)(F)F)cc1C(=O)NC[C@H]1CCC[C@H]1O,3.071497898,3.161,0.089502102\nCOc1cc(OC)cc([C@@H](NC(=O)Cn2ccccc2=O)c2nccn2C)c1,2.84000896,1.5048,-1.33520896\nCN1CCC[C@@]2(CCN(C(=O)CN3CCNC(=O)C3)C2)C1=O,3.605168457,-1.1109,-4.716068457\nCOc1ccc(Cl)cc1NC(=O)c1nn(C)cc1[N+](=O)[O-],2.132664673,2.2426,0.109935327\nCc1nn(-c2ccccc2)c(O)c1/C=[NH+]/Cc1ccncc1,3.117639223,0.98102,-2.136619223\nC=CCN1C(=O)C(C)(C)COc2ccc(NC(=O)C3(c4ccc(OC)cc4)CCOCC3)cc21,2.744789746,4.3197,1.574910254\nCON(C)C(=O)c1cnn2c(C)cc(C)nc12,2.547847184,0.97954,-1.568307184\nNC(=O)[C@@H](NC(=O)CCCc1cccs1)c1ccccc1,2.444743614,2.4136,-0.031143614\nCOc1cccc(CN2CCC[NH+](CC(=O)Nc3ccc(F)cc3)S2(=O)=O)c1,3.546311784,0.8084,-2.737911784\nCc1ccc(-c2nc3ccc(C)c(C)c3[nH]2)nc1,2.342516783,3.55016,1.207643217\nCc1n[nH]c(SCC(=O)Nc2cccc(C#Cc3ccccn3)c2)n1,2.685016252,2.63872,-0.046296252\nCOC(=O)c1cncc(C(=O)Nc2cccc(COC(C)C)c2)c1,2.04419285,3.0455,1.00130715\nCc1ccc(C)c(Oc2ccc(CNC(=O)N3CCCSCC3)cn2)c1,2.369895336,4.13924,1.769344664\nCC[NH2+][C@@]1(C(=O)[O-])CCC[C@H](Oc2ccc(CC)cc2)C1,4.165551538,0.6424,-3.523151538\nCC(=O)N1c2ccc(S(=O)(=O)N3CCCC3)cc2C[C@H]1C(=O)NCC[NH+](C)C1CCCCC1,3.989079408,0.7123,-3.276779408\nC=CCN1CC(=O)N2[C@@H](Cc3c([nH]c4ccccc34)[C@@H]2c2ccccc2OC)C1=O,3.287567863,3.0474,-0.240167863\nCOc1ccc(-c2noc3ncnc(N4CCC[C@H](C(=O)[O-])C4)c23)cc1,3.152874099,1.2597,-1.893174099\nN#Cc1cc(NC(=O)Cc2ccccc2Cl)ccc1N1CCC(O)CC1,2.173785623,3.35398,1.180194377\nCc1ncc(-c2ccc(S(=O)(=O)NC3CCCCCC3)s2)o1,2.513723357,3.71262,1.198896643\nCc1ccc(NC(=O)c2ccc(F)cc2F)cc1S(=O)(=O)Nc1ccc(Cl)cc1,1.947448768,4.97972,3.032271232\nCNC(=O)[C@@H]1CCC[NH+]1Cc1ccc(C)c(F)c1,4.391338086,0.42742,-3.963918086\nCN1C(=O)N[C@@H](c2cccc([N+](=O)[O-])c2)C2=C1CN(c1ccc(F)cc1)C2=O,2.956072154,2.7309,-0.225172154\nCc1cc(CC(=O)N[C@H](c2ccc(F)cc2)C2CCC2)no1,2.665131639,3.32222,0.657088361\nCc1cc(N2CCN(CC(=O)NC3CC3)CC2)nc(-c2ccc([N+](=O)
[O-])cc2)n1,2.249569987,1.76082,-0.488749987\nCc1cc(C(=O)N2CCN(C(=O)N[C@H]3CC(=O)N(C4CC4)C3)CC2)c(C)o1,2.921725033,1.12714,-1.794585033\nO=c1c2c3nc4ccccc4nc3n(CCC3=CCCCC3)c2ncn1C[C@H]1CCCO1,3.257260117,4.3639,1.106639883\nCC[NH+](C/C=C/c1ccc(C#N)cc1)C[C@H](C)C#N,4.36293875,1.63596,-2.72697875\nCOc1c(Cl)cc(C[NH2+][C@@H](Cc2c[nH]c3ccccc23)C(=O)[O-])cc1Cl,3.725368177,1.9079,-1.817468177\nCOc1ccccc1[C@H](C)NC(=O)C[C@H]1C[NH2+]CCO1,3.812797047,0.2247,-3.588097047\nCc1ccc(NC(=O)C(=O)NCCc2ccccc2F)c(C)c1,1.826753956,2.73994,0.913186044\nCC(C)(C)OC(=O)N[C@H]1CCN(c2cc(-c3cccs3)n[nH]2)C1,3.228025092,3.2416,0.013574908\nCC[C@H](C)C[NH+]1CCCC[C@@H]1C(=O)NC(C)(C)C,4.787067722,1.3846,-3.402467722\nCOC(=O)c1c(S(=O)(=O)NC2CC2)sc2c1CCN(Cc1ccc3ccccc3c1)C2,2.485526606,3.6869,1.201373394\nCc1cc(C)cc(NC(=O)[C@@H](Sc2nnnn2C2CC2)c2ccccc2)c1,2.708478583,4.09694,1.388461417\nC=CCn1c([S-])nnc1-c1sc(NC(=O)c2cccc(NC(C)=O)c2)nc1C,2.943459024,3.01252,0.069060976\nO=C(CNc1nc(C2CC2)no1)N1CCc2sccc2C1,2.781430253,2.0053,-0.776130253\nCc1ccccc1NC(=O)NCCn1c([N+](=O)[O-])cnc1C,2.258312512,2.22984,-0.028472512\nC[C@H](NC=C(C#N)C#N)c1ccc(-n2cncn2)cc1,3.277555708,1.84896,-1.428595708\nCN1C[C@H](C(=O)NC[C@H]2CCC(C)(C)c3ccccc32)CC1=O,3.259412472,2.4361,-0.823312472\nCC(C)n1cccc1C(=O)N[C@H]1CCc2cc(N)ccc21,2.887282024,3.0685,0.181217976\nCCN1C(=O)C(C#N)=C(C)/C(=C\\Nc2ccc([N+](=O)[O-])cc2)C1=O,2.501067124,2.11938,-0.381687124\nO=C(Nc1ccnn1Cc1cccc(Cl)c1)C1CCN(S(=O)(=O)c2ccccc2F)CC1,2.296421252,3.7633,1.466878748\nCOc1cc([C@@H]2N[C@H](C(=O)[O-])CS2)ccc1OC(=O)c1ccccc1,3.273511868,1.3679,-1.905611868\nCCOc1ccc(C2=CCN(C(=O)c3ccccc3F)CC2)cc1,2.000346516,4.1539,2.153553484\nCC(=O)N1CC[C@@H]([NH2+][C@@H](C)CSCC(C)C)C1,4.483674272,0.9483,-3.535374272\nCCO[P@](=O)(CC(=O)[O-])c1ccccc1,3.510193788,0.3764,-3.133793788\nO=C(CCc1nc2ccccc2c(=O)[nH]1)N1CCC[C@H]1C1CCCC1,2.774610155,3.0369,0.262289845\nO=C(/C=C/SCc1ccco1)Nc1cccc(N2CCCC2=O)c1,2.584851266,3.792,1.207148734\nCC(=O)c1c(C)[nH]c(C(=O)OCC(=O)N2C[C@H](C)C[C@@H](C)C2)c1C,
3.068452859,2.49544,-0.573012859\nCc1ccc(C(=O)N2CCC[C@@H](C(=O)N(C)CC(=O)NC(C)C)C2)cc1,2.492863438,1.83022,-0.662643438\nC[C@@H](C#N)Sc1nnc(C23CC4CC(CC(C4)C2)C3)o1,4.576896432,3.54158,-1.035316432\nCC1(C)CCC[C@H]1n1c(N)[nH+]c2ccccc21,4.151966768,2.7888,-1.363166768\nCCOc1ccc(C[NH+]2CCS[C@H]3COCC[C@@H]32)cc1OC,4.514122047,1.3831,-3.131022047\nCOc1cc(OC)c([C@@H](C)[NH2+]C[C@H](O)C2CCOCC2)cc1Cl,3.960984106,1.7691,-2.191884106\nCOC(C[C@@](C)(O)C[C@@H]1CCCN(S(C)(=O)=O)C1)OC,3.669896168,0.8081,-2.861796168\nC=CCSc1nnc(C2CCOCC2)n1N,2.825013358,1.164,-1.661013358\nO=C(Nc1c[nH]c2ccccc12)N1CCCC[C@@H]1CCO,2.674797385,2.9367,0.261902615\nCc1nc(-c2ccc(-c3noc(C4CC4)n3)cc2)cs1,2.133219774,4.04592,1.912700226\nC[C@H]1CCCCN1C(=O)[C@@H](C)NC(=O)C(=O)Nc1ccnn1C(C)(C)C,3.418504414,1.4823,-1.936204414\nc1cc2c(cc1N[C@@H]1CCOC3(CCC3)C1)CCC2,3.363061129,3.6889,0.325838871\nO=C(Nc1ccc(CNC(=O)N2CCCCCCC2)cc1)c1ccco1,1.931563902,4.0076,2.076036098\nCNC(=O)[C@H]1CCCCN1c1cccc(F)c1C(N)=S,2.82917551,1.5648,-1.26437551\nO=C(NCc1nnc2n1CCC2)c1cc(-c2ccccc2)[nH]n1,2.376871447,1.5444,-0.832471447\nCc1nnc(-c2cc(CC(C)C)n(-c3cccc(C(=O)N[C@@H]4C[C@@H](C)CC(C)(C)C4)c3)n2)o1,3.548164288,5.37382,1.825655712\nCOc1cccc([C@@H]2CC(=O)C3=C(C2)NC(=O)C[C@H]3c2ccc(OC)c(OC)c2OC)c1,3.226440403,3.7253,0.498859597\nC[NH2+]Cc1ccc(-c2cc(C)ccc2F)s1,3.128631552,2.55582,-0.572811552\nCCC[C@H](NC(N)=O)C(=O)NC[C@@H]1CCCO1,2.869359841,0.1186,-2.750759841\nCOc1cc(F)cc(CNC(=O)[C@H]2CCCN2C(=O)Cc2ccccc2)c1,2.427176885,2.6842,0.257023115\nCCc1cc(C#N)c(NC(=O)CCC(=O)N(CC)CC)s1,2.493551313,2.76928,0.275728687\nC#Cc1cccc(NC(=O)C[NH+](C)[C@H](C)c2nnc(-c3ccccc3)o2)c1,3.779297449,1.9323,-1.846997449\nCCOC[C@H](O)[C@](C)(CC)[NH+]1CCCC1,5.127905513,0.2312,-4.896705513\nC[NH+](C)Cc1cccc(CNC(=O)NC[C@H](O)c2ccc(F)cc2)c1,3.24179101,1.003,-2.23879101\nCc1ccsc1C(=O)NCCNC(=O)c1ccco1,2.114582277,1.80932,-0.305262277\nCOC(=O)[C@H](C)N(Cc1ccccc1)C(=O)Cc1ccon1,2.852054716,1.8074,-1.044654716\nC/C=C(/C)C(=O)NC[C@H]1C[C@@]12CCc1ccccc12,3.940919906,2.9729,-0
.968019906\nO=C1Nc2ccccc2Oc2cc([N+](=O)[O-])cc(Oc3ccc(Cl)cc3)c21,2.400946341,5.3985,2.997553659\nCCc1onc(C)c1NC(=O)CCCC(C)(C)C,2.601600041,3.70032,1.098719959\nCOC(=O)C(C)(C)COc1ccc2c(c1)OC[C@@H]2[NH3+],3.576525387,0.94,-2.636525387\nCC#CCCC(=O)Nc1cccc2c1C(=O)c1ccccc1C2=O,2.298959953,3.204,0.905040047\nCc1nc(C)c(S(=O)(=O)/N=C(\\[O-])C[C@H]2CCCO2)s1,3.812988405,0.77654,-3.036448405\nCCC(=O)N[C@H](CCSC)C(=O)Nc1ccc2[nH]c(C)cc2c1,2.732126257,3.06272,0.330593743\nCOCC[NH+](C)Cc1c(C)cc(C)c(C(C)=O)c1C,3.955103091,1.47556,-2.479543091\nCc1cscc1CNC(=O)N[C@@H](C)c1nc(C(=O)[O-])cs1,3.628458628,1.43692,-2.191538628\nCNC(=O)NCC(=O)NC[C@H](c1cccnc1)C(C)C,2.795116489,0.8664,-1.928716489\nCOc1cccc(C(=O)N2CCC[C@H]2[C@H]2CCC[C@@H]2O)c1,3.002385916,2.4608,-0.541585916\nCOc1ccc2c(c1)[C@H]([NH2+][C@H](C)CCN1CCOCC1)CCCO2,3.789130146,1.5831,-2.206030146\nCOc1cccc(OCC(=O)N2CCN(c3nc4ccc(S(C)(=O)=O)cc4s3)CC2)c1,2.245396536,2.436,0.190603464\nCOc1cccc(-c2nc(C(=O)O[C@@H](C)CNC(C)=O)c(C)[nH]2)c1,2.86962049,2.07512,-0.79450049\nCCO[C@H]1C[C@@H]1C(=O)Nc1ccc(Sc2nncs2)c(Cl)c1,3.445502237,3.7062,0.260697763\nCc1ccc(N(CC(=O)N2CCCC[C@H]2C)S(=O)(=O)c2c(C)nn(C)c2C)cc1,2.872766629,2.94166,0.068893371\nCOc1ccc(C(=O)Nc2nc(CC(=O)Nc3ccc(N(C)C)cc3)cs2)cc1,1.983649537,3.6512,1.667550463\nO=c1c2ccccc2sn1-c1ncc(Br)s1,2.853247472,3.2712,0.417952528\nCc1ocnc1CNC(=O)N(C)Cc1ccc(OC(F)F)cc1,2.60278761,2.92602,0.32323239\nCOc1cc(Cl)c(C)cc1NC(=O)C(=O)NCc1ccccc1C[NH+](C)C,2.870575026,1.55642,-1.314155026\nC[C@@H]([NH3+])c1cccc(Oc2cc(Br)ccc2Cl)c1,2.965149266,4.1977,1.232550734\nCC(C)NS(=O)(=O)c1cccc(C(=O)NCc2ccco2)c1,1.961616674,1.8963,-0.065316674\nCN(Cc1ncnn1C)C(=O)CSCc1ccccn1,2.529083407,1.1019,-1.427183407\nCCOc1cc(/C=C(\\C#N)C(=O)c2c[nH]c3cc(Cl)ccc23)ccc1OC,2.350767662,5.01848,2.667712338\nCCOC(=O)Nc1ccc(NC(=O)N2C[C@@H](C)CC[C@H]2C)cc1C#N,3.067347555,3.77898,0.711632445\nCn1c(=O)oc2ccc(NC(=O)[C@@H]3CCN(C(=O)OC(C)(C)C)C3)cc21,2.748880773,2.327,-0.421880773\nCC(C)N1CCO[C@@H]([C@H](O)c2cccs2)C1,3.356907136,1.8907,-1.466
207136\nCCN[C@H]1C[C@H](C)C[C@H](C)[C@@H]1[NH+]1C[C@@H](C)[C@@H](C)C1,5.501545361,1.5698,-3.931745361\nCC1CCN(C(=O)N(CC[NH3+])C2CC2)CC1,3.063324,0.5446,-2.518724\nCOc1ccc(NC(=O)Cn2nc3c4scc(-c5ccc(F)cc5)c4ncn3c2=O)c(OC)c1,2.583698935,3.5677,0.984001065\nCC(C)OC(=O)N1CCC(NC(=O)[C@H]2SCCc3ccccc32)CC1,3.087047482,3.1426,0.055552518\nCOc1ccc2c(c1)SC(=O)C2=O,2.401574365,1.5102,-0.891374365\nCOc1ccc(CCC(=O)N2CCC3(CC2)Nc2ccccc2-n2cccc23)cc1,2.913337418,4.3619,1.448562582\nCOc1ccccc1Cc1nnc(NC(=O)Cc2ccc(Br)cc2)s1,2.051226236,4.0812,2.029973764\nCNC(=O)O/N=C(/N)c1ccc(Br)cc1,2.250317734,1.4254,-0.824917734\nCC1(C)CN(C[C@@H](CS)c2ccccc2)CCO1,3.027019854,2.8108,-0.216219854\nCCOC(=O)[C@H]1CCCN(C(=O)Cn2c(SC(F)F)nc3ccccc32)C1,2.799716077,3.1527,0.352983923\n[NH3+][C@@H](Cc1ccccn1)c1ccc(Cl)cn1,3.324055225,1.6557,-1.668355225\nCCOCCN(C)C(=O)C(=O)Nc1cccnc1-c1ccccc1,2.256607012,2.182,-0.074607012\nCCn1cc([C@H](NC(=O)c2ccc(OC)c(OC)c2)c2ccc(F)cc2)c2cc[nH+]cc21,3.447988059,4.151,0.703011941\nO=C([C@@H]1CC12CCOCC2)N1CC[C@@H](Oc2cccc(Cl)c2)C1,3.713785706,3.1364,-0.577385706\nC[NH2+][C@]1(C(N)=O)CCC[C@H]1CCSC1CCOCC1,4.614268117,0.5061,-4.108168117\nCc1cc(NC(=O)c2ccc(-n3cncn3)nc2)ccc1N(C)C,2.267592306,2.28902,0.021427694\nCc1ccccc1CNC(=O)N1CCC([C@@H](O)c2ccc(Cl)cc2)CC1,2.524142332,4.30362,1.779477668\nCCc1nc(CNCCc2cccs2)cs1,2.300594957,3.0993,0.798705043\nO=C(Nc1ccc2nc(-c3cc(F)ccc3F)[nH]c2c1)c1ccc([N+](=O)[O-])cc1,2.145744096,4.6686,2.522855904\nN#Cc1ccccc1NC(=O)C(=O)NC[C@H]1COC2(CCCC2)O1,3.412921889,1.29868,-2.114241889\nO=C(NC1(C(=O)[O-])CC[NH2+]CC1)OCC1c2ccccc2-c2ccccc21,3.490460281,0.371,-3.119460281\nCc1ccc(C(=O)N2CCC[C@H]2CNC(=O)OC(C)(C)C)cc1,2.424440646,3.12432,0.699879354\nC[NH+](C)CC(=O)NNC(=O)c1ccc(C2CCCCC2)cc1,2.819298295,0.6398,-2.179498295\nC[C@@H]1CCCC[C@@H]1OCC(=O)N(C)Cc1cccc(O)c1,2.912373702,2.9459,0.033526298\nC#CCNS(=O)(=O)c1ccc(C(=O)N[C@@H]2CCO[C@@]3(CCOC3)C2)cc1,3.898168643,0.666,-3.232168643\nCc1ccc(C2(CN3CCN(S(=O)(=O)N(C)C)CC3)CCCC2)cc1,2.439754533,2.23082,-0.208934533\n
Cc1sc2ncn(CCC(=O)NCC(N)=O)c(=O)c2c1C,2.347207183,0.06644,-2.280767183\nCOc1ccc(CCNC(=O)c2cc(C(C)C)nc3c2c(C)nn3C)cc1OC,2.282558575,3.38982,1.107261425\nO=[N+]([O-])c1cnc(Br)s1,3.179818392,1.8138,-1.366018392\nCC(C)(C)CC(=O)N1CCN(C(=O)c2ccn3ccsc23)CC1,2.712888331,2.7214,0.008511669\nC/[NH+]=C(/NCCCc1cccc(Br)c1)NC1CC1,2.991361127,0.7897,-2.201661127\nO=C(CCC1CCCC1)N1CCC(COc2ccccc2C(=O)N2CCCC2)CC1,2.19573155,4.5104,2.31466845\nCC(C)(C)c1n[nH]cc1/C=C1\\SC(NC2CCCCC2)=NC1=O,3.076214803,3.5998,0.523585197\nCN(CC(F)(F)F)C(=O)CNC(=O)N1CCO[C@H](C#N)C1,3.449633505,-0.05892,-3.508553505\nCC[C@@H](NC(=O)NCc1ccc(C#N)cc1)c1ccc(C)c(F)c1,2.506404532,3.9563,1.449895468\nCC(C)c1noc(CCNC(=O)/C=C/c2cn(C)c3ccccc23)n1,2.489753078,3.0568,0.567046922\nCC(C)=C1Oc2c3c(cc(C)c2C1=O)OCN(Cc1ccccc1)C3,2.840364996,4.21612,1.375755004\nO=C1/C(=N/O)c2ccc(Cl)cc2N1Cc1ccccc1,2.0410745,3.0651,1.0240255\nc1ccc(CC2(Cc3ccc4c(c3)CCC4)C[NH2+]C2)cc1,2.844717318,2.5239,-0.320817318\nCC[C@H](C)NC(=O)N[C@H](c1ccccc1)C(F)(F)F,2.819638554,3.3877,0.568061446\nNC(=O)COc1ccc2ccccc2c1C[NH2+]C1CCCCCCC1,2.95041288,2.8802,-0.07021288\nO=C(COc1ccc(Cl)cc1[N+](=O)[O-])N1CCc2cc([N+](=O)[O-])ccc21,2.251658336,3.1245,0.872841664\nO=C(Cc1ccc(F)cc1F)NC1CCN(c2cccc(F)c2)CC1,2.033335093,3.4316,1.398264907\nCOC(=O)c1ccc(C[NH+](C2CC2)[C@@H](C)c2ccco2)cc1F,4.038401414,2.5138,-1.524601414\nCc1ccc([C@@H]2C[NH2+]CC[C@@H]2c2c(F)cccc2F)cc1,3.825954589,3.10772,-0.718234589\nCc1ccc(-c2ccc(=O)n(CC(=O)NCc3ccco3)n2)c(C)c1,2.180504107,2.43654,0.256035893\nCCNC(=O)NC(=O)CSc1nnc(N2C[C@@H](C)C[C@@H](C)C2)n1Cc1ccco1,3.320744633,2.3395,-0.981244633\nCC(C)(C)NC(=O)N1CCC(CNC(=O)C(=O)Nc2ccccc2[N+](=O)[O-])CC1,2.351664532,1.8696,-0.482064532\nCc1ccc(-n2cnnn2)cc1NC(=O)c1cnn(Cc2ccccc2)c1,2.221673771,2.46782,0.246146229\nO=C(Nc1ccc(F)cc1OCC(F)F)c1cccnc1N1CCCC1,2.273178554,3.7171,1.443921446\nCCCNC(=O)CCNC(=O)CN1CCC([NH3+])CC1,2.682391179,-1.2748,-3.957191179\nCOCCNC(=O)COc1cc2c(c3oc(=O)c4c(c13)CCC4)CCC(C)(C)O2,2.819768111,2.5267,-0.293068111\nCC(=O)N[C@@H](CC
(=O)Nc1cnn(C)c1)c1cccs1,2.760825185,1.6876,-1.073225185\nO=C(c1cccc(S(=O)(=O)[N-]c2ccccc2F)c1)N1CCC[C@@H]1c1nc2ccccc2[nH]1,3.413113671,5.0734,1.660286329\nC[C@H](CNC(=O)c1[nH]nc(C2CC2)c1Cl)Oc1cccc(F)c1,3.06692193,3.2769,0.20997807\nCOc1ccccc1C(=O)/C(C#N)=C\\c1cnc(-c2ccccn2)s1,2.549035852,4.00358,1.454544148\nO=C(NCc1ccco1)[C@H]1CCCN1C(=O)C[NH+]1CCc2sccc2C1,4.132206287,0.5895,-3.542706287\nCC[C@@H]1CN(C(=O)c2ccc3c(c2)OCO3)CC[NH2+]1,3.69238642,0.2131,-3.47928642\nCc1cc(C(=O)Nc2ccc(F)cc2C)on1,1.934252507,2.68284,0.748587493\nC[C@@H](N[C@H](C[C@@H]1CCOC1)c1ccccc1)c1ccc([N+](=O)[O-])cc1,3.168218916,4.4133,1.245081084\nO=C(NNS(=O)(=O)c1ccc(F)cc1)c1cc2c(s1)CCCC2,2.219578826,2.3893,0.169721174\nCc1cc(C#N)ccc1Oc1ccc([N+](=O)[O-])cc1C#N,2.231562188,3.43888,1.207317812\nCOc1ccc(-c2nc(CNC(=O)c3ccccc3)n[nH]2)cc1,1.980697063,2.4103,0.429602937\nCc1cccc(OCC(=O)N/N=C/c2cc(Br)c(O)c(Br)c2O)c1,2.453904832,3.46032,1.006415168\nCC1=Nc2ccccc2Oc2c1c(=O)n(-c1ccccc1)c1ccccc21,2.458237938,5.2371,2.778862062\nCOC[C@@](C)(O)CNC(=O)[C@@H]1CC(=O)N(c2ccc(F)cc2)C1,3.005123049,0.6922,-2.312923049\nFc1ccccc1[C@@H]([NH2+]Cc1ccoc1)C1CCCC1,3.69757193,3.4136,-0.28397193\nCC1CCN(C(=O)CN2C(=O)N[C@](C)(c3ccc(Cl)cc3Cl)C2=O)CC1,2.732754879,3.0189,0.286145121\nCOc1cccc(NC(=O)NCc2ccc3c(c2)N(C(=O)c2cccc(C)c2)CC3)c1,2.09420016,4.52822,2.43401984\nO=C(/C=C/c1ccccc1)NC(=S)Nc1cc([N+](=O)[O-])ccc1F,2.105372184,3.2603,1.154927816\nCCCn1nccc1NC(=O)c1nnn(-c2ccc(C)cc2)c1C,2.315214931,2.74294,0.427725069\nCOc1cccc(-c2nc(CC(=O)N[C@@H]3CCS(=O)(=O)C3)cs2)c1,2.722418907,1.6645,-1.057918907\nCOc1cc(NC(=O)c2cnn(-c3ccccc3)n2)cc(OC)c1OC,2.151091028,2.5454,0.394308972\nCOc1ccc2c(c1)cc(C(=O)N[C@H](C)Cc1cnccn1)n2C,2.807843567,2.3379,-0.469943567\nCC(C)COCCNC(=O)NNc1ccccc1Cl,2.165718029,2.6387,0.472981971\nCCC[C@@H](C)[NH2+]C[C@H](C)Oc1cccnc1C,4.126301903,1.90932,-2.216981903\nCc1ccc(C)c(OCCSc2ccc(N)c(C)c2)c1,2.060838323,4.36516,2.304321677\nC[C@H](CC#N)N(C)C[C@@H]1CSc2ccccc21,3.590360435,3.10988,-0.480480435\nCCN[C@@]1(C#N)CC[C@@H]
(Oc2cccc(F)c2F)C1,3.52564354,2.76798,-0.75766354\nCc1cc(C[NH2+][C@@H](CO)C(C)C)c(C)n1-c1ccccc1,3.596987275,2.17444,-1.422547275\nCC1CCC(NC(=O)C(=O)Nc2cccc(-c3nncn3C)c2)CC1,2.347143544,2.1155,-0.231643544\nCC(=O)Nc1cccc(C(=O)/C=C/c2ccco2)c1,2.003848944,3.1341,1.130251056\nCCCNC(=O)c1cccc(NC(=O)C(=O)N2C[C@H]3CC=CC[C@@H]3C2)c1,3.148539376,2.1895,-0.959039376\nCc1nc(C)c(C(=O)N2CCN(C(=O)Cc3ccc(Cl)cc3)CC2)o1,2.226794753,2.47194,0.245145247\nCCc1nc(CN/C(=[NH+]/C)N2CC[NH+](CC(C)C)CC2)cs1,4.614668767,-1.282,-5.896668767\nCN(C)c1ccc(C(=O)Nc2c3c(nn2C)CCC3)cc1,2.254443099,2.2271,-0.027343099\nO=C1[C@H](Nc2ccccc2O[C@H]2CCOC2)CCCN1Cc1ccccc1,2.979200776,3.4574,0.478199224\nN#Cc1cccc(CNC(=O)CCOc2ccc(C=O)cc2)c1,1.99088313,2.45608,0.46519687\nCOc1ccc(CCn2cc(C(=O)[O-])c(=O)[nH]c2=O)cc1,2.453088713,-0.8486,-3.301688713\nCN(CCc1ccccc1)C(=O)C(=O)Nc1ccc(Oc2ccncc2)cc1,2.089518656,3.5135,1.423981344\nCc1cc(C)cc(NC(=O)[C@H]2CCCN(S(=O)(=O)c3cn(C)cn3)C2)c1,2.830417176,2.07634,-0.754077176\nCc1nc(CNC(=O)[C@@H]2CCCCN2C(=O)OC(C)(C)C)sc1C,2.787253271,3.16574,0.378486729\nCCSC1=NC(=O)[C@H]2C(=N1)NC1=C(C(=O)CCC1)[C@H]2c1cccnc1,4.10432105,2.4395,-1.66482105\nCOCC[C@H](C)C(=O)Nc1ccc(Oc2cc[nH+]c3ccccc23)cc1,3.383061791,4.0573,0.674238209\nCOC(=O)c1sccc1S(=O)(=O)N1CCC[C@@H](NC(C)=O)C1,2.758371043,0.8239,-1.934471043\nCC(=O)O[C@H]1CC[C@]2(C)[C@@H]3CC[C@@]4(C)[C@@H](C[C@@H]5CCCC[C@@]54C(C)=O)[C@@H]3C[C@@H](O)[C@@]2(O)C1,4.807691125,4.422,-0.385691125\nCCNS(=O)(=O)[C@H]1CCN(CCSc2ccccc2)C1,2.893618866,1.7923,-1.101318866\nCC(C)OCCCNC(=O)N1CCC[NH+](CC2CCCCC2)CC1,3.869396076,1.682,-2.187396076\nC[C@]1(CCc2ccccc2)NC(=O)N(CC(=O)N2CCc3sccc3[C@H]2c2cccs2)C1=O,3.330561001,4.227,0.896438999\nOc1n[nH]c(-c2ccccc2)c1/C=N\\c1cccc(Cl)c1,2.575508438,4.1863,1.610791562\nC[C@@](NC(=O)c1scnc1C1CC1)(C(N)=O)c1ccccc1,3.152593137,2.151,-1.001593137\nCn1cc(C[NH2+]C2CN(C(=O)OC(C)(C)C)C2)c(C(C)(C)C)n1,3.531149004,1.4003,-2.130849004\nO=C(Cn1ncc2c([nH]c3ccccc32)c1=O)N1CC[C@@H](c2ccccc2)C1,2.801434255,2.8939,0.092465745\nCOc1cccn(CC
(=O)N2CCC[C@H]2c2nccs2)c1=O,2.926470384,1.6771,-1.249370384\nCc1ccc(S(=O)(=O)N2CCCSCC2)c(C)c1,2.172121888,2.43104,0.258918112\nCC(C)CSc1ccc(N)c(N)n1,2.677260409,1.9941,-0.683160409\nCc1ccccc1O[C@@H](C)CC[C@H](C)[NH+]1CCCCC1,4.183468302,2.99982,-1.183648302\nCC1=CC(=O)C(C)=C(C)C1=O,2.554710776,1.4209,-1.133810776\nCOC(=O)c1c(NC(=O)c2cc3cc(Br)ccc3o2)sc2c1CCC2,2.326967163,4.7844,2.457432837\nO=C(c1cc(=O)c2ccccc2o1)N1CCN(c2ccc(-c3noc(C(F)(F)F)n3)c[nH+]2)CC1,3.316210434,2.6383,-0.677910434\nCCN1C(=O)c2cc(NC(=O)c3cccc(OC)c3OC)ccc2Oc2ccccc21,2.188834997,4.7285,2.539665003\nCCC[C@H](C)C(=O)NCc1c(O)ccc2c1CCCC2,2.79815046,3.3234,0.52524954\nCc1c(CCC(=O)NC2CC2)c(=O)n2nc(-c3ccccc3)nc2n1Cc1ccccc1,2.39384803,3.12592,0.73207197\nCC(C)(C)C(=O)N1N=C(c2cccc(NS(C)(=O)=O)c2)C[C@H]1c1cccs1,3.033061881,3.8434,0.810338119\nCOc1ccc(/C=C/C(=O)Nc2ccc(Cl)c(-c3nc4ncccc4o3)c2)cc1,2.282822064,5.2037,2.920877936\nCc1cc(C(=O)Nc2ccc(C[NH+]3CCCC3)cc2)c2cnn(Cc3cccs3)c2n1,3.308540728,3.28052,-0.028020728\nC=CCn1c(CC)nnc1Sc1nc2c([nH]1)c(=O)n(C)c(=O)n2C,2.9507363,0.4514,-2.4993363\nCCN(C(=O)[C@H](C)c1cccc(F)c1)c1ccc(C#N)c(Cl)c1,2.787640646,4.50738,1.719739354\nCOC(=O)CSc1nnc(-c2ccc(OC)cc2OC)o1,2.12901572,2.0189,-0.11011572\nC[C@@H](NC(=O)CCc1cccc(F)c1)c1ccc2c(c1)CCC(=O)N2,2.559093635,3.5204,0.961306365\nCOC(=O)[C@H](NC(=O)c1cc(C)on1)c1ccc(F)c(F)c1,2.709930274,1.90532,-0.804610274\nO=C([O-])c1ccccc1NCc1ccccc1[N+](=O)[O-],2.300858927,1.5704,-0.730458927\nCc1cc(=O)c(C(=O)NCCc2ccccc2F)nn1-c1ccccc1F,2.138447824,2.79162,0.653172176\nCC(C)[C@H](NC(=O)c1ccccc1)C(=O)N[C@H](C)C(N)=O,2.47185794,0.431,-2.04085794\nCc1ccccc1OC1CN(c2ncccc2C(=O)N2C[C@@H]3CC[C@H]2C3)C1,3.852979496,3.28212,-0.570859496\nCCc1nc(CN2CCC[C@H]([NH2+]CC3CCN(C(C)=O)CC3)C2)no1,3.812233153,0.4183,-3.393933153\nCCN(CC)C(=O)CSc1c(C)cnc2c1c(=O)n(C)c(=O)n2C,2.632741217,0.90112,-1.731621217\nCCOc1cc2c(cc1OCC)C[NH+](Cn1nc3n(c1=S)CCCCC3)CC2,4.018751641,2.53659,-1.482161641\nO=C(CNCC(F)(F)F)N(Cc1cccc(O)c1)C1CC1,2.389515685,2.0351,-0.354415685\nCc1cc(CN2
CCn3c(nnc3-c3ccccc3)C2)ccc1-n1cncn1,2.418767322,2.85002,0.431252678\nC[C@@H]1C[C@@H](C)CN(S(=O)(=O)c2cccc(C(=O)Nc3nc(C4CC4)cs3)c2)C1,3.076699104,3.9394,0.862700896\nCOC(=O)c1c(NC(=O)CCC(=O)[O-])sc2c1CCCCCC2,2.562728956,1.6623,-0.900428956\nS=C(Nc1cccnc1)N(Cc1ccccc1)C[C@H]1CCCO1,2.556251215,3.4596,0.903348785\nC[NH+]1CC[C@@H](N2CC3(CCC2=O)CC[NH+](Cc2c[nH]cn2)CC3)C1,6.165717612,-1.5158,-7.681517612\nCC[C@@](C)([C@@H]([NH2+]C)C1C[C@@H](C)C[C@@H](C)C1)[NH+]1CCCCC1,5.7683801,1.858,-3.9103801\nCn1cc(NC(=O)N2CCN(c3ccc([N+](=O)[O-])cc3)CC2)c2ccccc2c1=O,2.24714082,2.8008,0.55365918\nCC(C)C[C@H](NC(=O)c1cc(-c2cccn2C)n[nH]1)c1ncnn1C,3.319192988,2.0609,-1.258292988\nCCNC(=O)CN(C1CCCCC1)S(=O)(=O)c1ccccc1,2.045861074,2.1461,0.100238926\nCCSc1cc(C(=O)NC[C@H](C)Nc2ccccc2)ccn1,2.637650604,3.424,0.786349396\nCc1noc(C(C)C)c1C(=O)N1CCN(C[C@H](C)O)CC1,2.859488392,1.24502,-1.614468392\nCSc1ccc(Cl)c(C(=O)Nc2cc(C(=O)N(C)C)ccc2Cl)c1,2.013567479,4.6694,2.655832521\nCC(=O)c1ccc(OC[C@@H](O)CN2CCn3c(nn(C)c3=O)C2)cc1,3.001332769,0.0399,-2.961432769\nCc1ccc([C@H]2C[NH+]=C(N)N2Cc2ccccc2)c(C)c1,3.297715153,1.25564,-2.042075153\nCc1nc(SCC(=O)c2ccc3[nH]c(=O)[nH]c3c2)nc(C)c1C,2.466697225,2.54646,0.079762775\nCn1cc(C(=O)N[C@H]2CCC[C@H]3OCC[C@H]23)c(-c2ccccc2)n1,3.352877627,2.7745,-0.578377627\nCC[NH2+]C[C@]1(c2ccc(F)cc2Cl)CCC[C@@H]1C,4.165842394,3.1202,-1.045642394\nCCN(CCNC(=O)N[C@@H](C)c1ncc(C)s1)c1cccc(C)c1,2.919644026,3.64664,0.726995974\nCOCC(=O)Nc1ccc(-c2noc([C@@H]3CCCN3Cc3[nH]cc[nH+]3)n2)cn1,3.85152994,1.1958,-2.65572994\nCc1cc(NC(=O)[C@H]2COc3ccccc3O2)c2ncccc2c1,2.627262029,3.32172,0.694457971\nCC[C@H]1CC(=O)N(C[C@@H]([NH3+])c2ccc(OC(C)C)cc2)C1,3.641910716,2.0153,-1.626610716\nCc1ccc(C(=O)N(C)CCC#N)cc1,1.82915826,1.9807,0.15154174\nCN(O)[C@H]1CC[C@H](c2ccc(C(C)(C)C)cc2)C1,3.088427273,3.9412,0.852772727\nCc1cc(C)n(-c2nc([O-])cc(C(C)C)n2)n1,3.071945177,1.47614,-1.595805177\nCO[C@@H](C)c1nc(C[NH+](C)Cc2cccc(C)c2C)cs1,4.259931405,2.68224,-1.577691405\nCCc1ncc(CN(C)C(=O)c2c[nH]nc2-c2ccc(OC)cc2)s1,2.
753719275,3.3764,0.622680725\nCc1ccc2[nH]c3c(=O)n(CC(=O)N(C)c4ccc(C)c(C)c4)ncc3c2c1,2.543058432,3.46606,0.923001568\nCC[NH2+]C[C@H]1CCO[C@@H]1c1ccc(F)cc1F,4.211152742,1.6257,-2.585452742\nCn1nc(C(C)(C)C)cc1C(=O)Nc1ccc(C(N)=O)cc1,2.062408364,2.0688,0.006391636\nO=C(Nc1cccnc1)c1ccc(NC(=O)c2ncn(-c3ccccc3)n2)cc1,2.128494967,3.1669,1.038405033\nO=c1cc(C2CCC2)nc(-c2cccc(C[NH+]3CC(O)C3)c2)[nH]1,3.57334498,0.4638,-3.10954498\nC[C@@H](OC[C@H]1CCCO1)C(=O)Oc1cc(Cl)ccc1Cl,3.004023247,3.4829,0.478876753\nCOc1ccccc1C1(C(=O)Nc2ccc3c(c2)N(C(=O)C2CC2)CC3)CC1,2.34415068,3.6646,1.32044932\nO=C([O-])[C@@H]1Cc2c([nH]c3ccccc23)[C@H](C(=O)[O-])[NH2+]1,4.686340492,-2.8031,-7.489440492\nO=C1NC2(CCCC2)C(=O)N1Cc1ccc(Br)s1,2.905374606,2.8752,-0.030174606\nCC(C)N1CCO[C@H](c2nnn[n-]2)C1,4.545882887,-0.3895,-4.935382887\nCc1ccc2c(c1)C(=O)N([C@H](C)C(=O)NCCCC(C)C)C2=O,2.543137633,2.53192,-0.011217633\nCCN(CC)S(=O)(=O)N[C@H](c1ccc(F)cc1)c1nccn1C,2.880328558,1.8248,-1.055528558\nCCN(CC)C(=O)C1CCN(c2cccc(Cl)c2)CC1,1.925061949,3.4248,1.499738051\nO=C(NCC(=O)N1CCN(c2ccc(F)cc2)CC1)N[C@@H]1CCS(=O)(=O)C1,2.699990976,-0.0394,-2.739390976\nCCn1c(SCC(N)=O)nnc1[C@H](C)Oc1cccc(Cl)c1,2.695897046,2.6688,-0.027097046\nCOC(C)(C)C(=O)Cc1cccc2ccccc12,2.090800657,3.3764,1.285599343\nCC1CCN(C(=O)c2ccccc2NCC(=O)N[C@H](C)C(C)C)CC1,2.513852635,3.1313,0.617447365\nCOCC[C@@H](C)C(=O)Nc1ccc2c(C)n[nH]c2c1,2.813255353,2.48242,-0.330835353\nCC[C@H]([NH3+])[C@H](Sc1cc2ccccc2[nH]1)c1cccnc1,3.893539633,3.4168,-0.476739633\nN#C[C@@H](c1ccc(Cl)cc1)c1ccc([C@H](O)c2ccccc2)cc1,2.754553872,5.07718,2.322626128\nCCS(=O)(=O)c1ccccc1NCc1c[nH+]cn1C1CC1,3.319676997,2.0428,-1.276876997\nCc1ccc(OC[C@@H](O)C[NH+](C)Cc2cc(C)on2)c(C)c1,3.945702345,1.05446,-2.891242345\nCOc1cccc2c1OC[C@@H](NC(=O)Nc1cncc3ccccc13)C2,2.816092277,3.3686,0.552507723\nCc1cc([C@H]2CCCN2CCc2ccc([N+](=O)[O-])cc2)no1,2.770692362,3.27082,0.500127638\nCOc1ccccc1-n1nc(C(=O)NC2CCCCCC2)c2ccccc2c1=O,2.082487998,3.8469,1.764412002\nO=C(Cn1ccc(=O)[nH]c1=O)NC[C@H]1CCOC1,2.849533882,
-1.3107,-4.160233882\nCC(=O)N1C(C)(C)C=C(CO)C1(C)C,3.451082344,1.3244,-2.126682344\nCc1cnc(CNCc2c[nH+]cn2C2CC2)cn1,3.652209805,1.02532,-2.626889805\nCc1cc(C(=O)N2CCc3cc(Cl)cc(Cl)c3C2)[nH]n1,2.669253158,3.22342,0.554166842\nCCOC(=O)[C@H](CC)N1CCC(C(=O)N2CCCCCC2)CC1,2.5693103,2.4427,-0.1266103\nCC(C)[C@H](C)NC(=O)C(=O)Nc1cc(Br)ccc1-n1cccn1,2.804510976,2.734,-0.070510976\nO=C(CSc1ccc(F)cc1)Nc1nnc(-c2ccccn2)o1,2.191540921,3.0015,0.809959079\nC/C=C/c1ccc(OCC(=O)Nc2ccc3[nH]c(=O)[nH]c3c2)c(OC)c1,2.253707868,2.9154,0.661692132\nCN(CC[NH+](C)C)C(=O)c1cc(F)c(Cl)cc1Cl,3.312684493,1.349,-1.963684493\nO=[N+]([O-])c1cnccc1NCc1cccc(F)c1,2.013819951,2.741,0.727180049\nCc1ccc(F)c(NC(=O)c2cccc(CS(C)(=O)=O)c2)c1,1.842203357,2.93102,1.088816643\nCc1nc(N2CCN(c3ccccc3O)CC2)nc2cc3c(cc12)CCC3,2.344492661,3.45912,1.114627339\nCC(C)c1ccsc1C(=O)N1CCN(c2ncccc2C#N)CC1,2.467553166,3.10058,0.633026834\nCc1ccc(S(=O)(=O)N2CCC[C@@H](c3nc(-c4ccccc4Br)no3)C2)cc1,2.679753116,4.37582,1.696066884\nCCc1ccc2nc(C)cc(C(=O)N[C@H]3CCC(=O)N(C)C3)c2c1,2.726969629,2.45622,-0.270749629\nCS(=O)(=O)[N-]c1ccc(C2=NN/C(=N\\c3ccccc3)SC2)cc1,3.686623439,3.3796,-0.307023439\nCOc1ccc(Br)cc1C[NH2+][C@H]1CCCC1(C)C,3.803916035,3.0998,-0.704116035\nO=C1CCC[C@@H]1CC[S@@](=O)c1cc(Cl)ccc1Cl,3.495879625,3.8603,0.364420375\nC[C@@H]([NH2+]Cc1ccn[nH]1)c1ccc(Cl)s1,4.249743882,1.9492,-2.300543882\nNC(=O)[C@@H]1CCCN1c1nc(-c2cccnc2)nc2scc(-c3ccccc3)c12,2.881834056,3.8744,0.992565944\nCOc1ccccc1CNC(=O)c1nn(C)c(=O)c2ccccc12,1.894141175,1.8721,-0.022041175\nCc1ccc(F)cc1S(=O)(=O)NCCCCn1ccccc1=O,2.142833523,2.05452,-0.088313523\nCc1cc(C)nc(N(C)CC(C)(C)O)n1,2.621362705,1.30054,-1.320822705\nCOC(=O)c1sc2ccc(NC(=O)NC(C)C)cc2c1OC,2.204937973,3.2264,1.021462027\nOc1c(F)ccc2cc[nH]c12,2.532800833,2.0126,-0.520200833\nCc1cccc(-c2nc(C[NH+]3CCC[C@@H](Cn4cncn4)C3)no2)c1,4.25447131,1.13162,-3.12285131\nO[C@H]1Cc2ccccc2[C@H]1[NH2+]Cc1nccn1Cc1ccccc1,3.674403729,1.6531,-2.021303729\nCc1cccc(S(=O)(=O)[C@H](C)C(=O)Nc2cc(F)ccc2F)c1,2.496263936,3.07412,0.577856
064\nO=C1C([O-])=C(O)C(=O)c2ccccc21,2.704172084,0.1955,-2.508672084\nCC(C)(C)CC(=O)Nc1cc(F)c(F)cc1Cl,2.036293505,3.9929,1.956606495\nCOc1ccc(C(=O)NC(=S)Nc2ccc3oc(-c4ccc(O)c(Br)c4)nc3c2)cc1,2.285489018,5.0983,2.812810982\nCOC(=O)C1(NCc2cnc3ccccn23)CCCC1,2.69067309,1.9097,-0.78097309\nCOC(=O)c1cc(-c2oc3ccc(Br)cc3c(=O)c2C)cc([N+](=O)[O-])c1,2.387240107,4.22572,1.838479893\nFc1ccc(CNC2=NC=NC3=NC=N[C@H]32)cc1,3.602406741,1.1647,-2.437706741\nOC[C@@H](CCc1cccs1)c1cccc(F)c1,2.630519417,3.5959,0.965380583\nC[C@H]1CCc2c(C(=O)NCc3ccc(S(N)(=O)=O)cc3)csc2C1,2.683701432,2.4503,-0.233401432\nCOC(=O)NCc1ccc(NCc2ccc(Br)c(C)c2)cc1,1.910394325,4.22562,2.315225675\nCNS(=O)(=O)CC(=O)N[C@H](C)c1ccc(SC(C)C)cc1,2.831783505,1.9135,-0.918283505\nc1cc(N2CCCCCC2)ccc1C[NH2+]Cc1nnc2n1CCC2,3.039500112,1.8683,-1.171200112\nCC[C@@H](C)NC(=O)[C@@H](C)S(=O)(=O)Cc1ncc(Cl)n1C,3.496099959,1.2915,-2.204599959\nCC(=O)c1cccc(OC(=O)/C=C/c2cn(CCC#N)c3ccccc23)c1,2.377963524,4.37638,1.998416476\nCC(C)c1ccccc1NC(=O)C1CCN(S(C)(=O)=O)CC1,1.946704896,2.4201,0.473395104\nC[NH+](C)[C@H]1CC[C@@H](NC(=O)N2CCN(c3nnnn3-c3ccccc3)CC2)C1,3.830633555,-0.4405,-4.271133555\nCCn1cc(C(=O)C(=O)N(C)c2c(Cl)cccc2Cl)cn1,2.653209612,3.0555,0.402290388\nCC(C)CCN1N[C@H](C2=NC(=O)C3=C4CCCC[C@@H]4SC3=N2)c2ccccc2C1=O,4.330683963,4.0574,-0.273283963\nCc1ccnc2nc(C(=O)N(CCC[NH+](C)C)Cc3ccccc3)nn12,3.171964209,0.60972,-2.562244209\nCC[C@H](NC(=O)c1ccc([N+](=O)[O-])cc1F)c1ccc(OC)cc1,2.384004406,3.6236,1.239595594\nCCc1ccccc1NC(=O)N[C@H](C)C(=O)N1CCCC[C@@H]1C,2.748723049,3.16,0.411276951\nO=C(CS(=O)(=O)c1ccc2ccccc2n1)N1CCCC1,2.204420544,1.6309,-0.573520544\nCc1nc(-c2ccc(S(=O)(=O)N[C@H](C)C[NH+]3C[C@@H](C)C[C@@H](C)C3)cc2)co1,4.544819199,1.87762,-2.667199199\nCCOc1ccccc1NC(=O)C[C@@H]1CSc2nc(C(C)(C)C)cc(=O)n21,3.009533648,3.6151,0.605566352\nC[C@H]([NH2+]Cc1ccc(F)c(CO)c1)c1ccc(-c2cccs2)cc1,3.417228851,3.8711,0.453871149\nC[C@@H](Oc1ccccc1Cl)C(=O)Nc1cccc(I)c1,2.252775185,4.3506,2.097824815\nCOc1ccc(CCNC(=O)c2c[nH]nc2-c2ccccc2)cc1,2.153124233,3.05
78,0.904675767\nC[NH+]1CCN(CC(=O)NC[C@@H]2COc3ccccc3O2)CC1,3.69680263,-1.2271,-4.92390263\nCOc1ccc2c(c1)N(C(=O)c1cccc([N+](=O)[O-])c1)CCO2,2.081577406,2.6426,0.561022594\nCCc1nc(C(=O)Nc2ccc(N(C)C(=O)OC)cc2)nn1-c1c(Cl)cccc1Cl,2.515200631,4.5914,2.076199369\nO=C(Nc1sc2c(c1C(=O)Nc1ccccc1)CCC2)c1ccccc1Cl,1.933911169,5.3948,3.460888831\nCc1ccc(Sc2ccc(=O)n(-c3ccc(C(=O)[O-])cc3)n2)cc1,2.544596831,2.05562,-0.488976831\nO=C1CCCN1CC#CC[NH+]1CCCC1,4.434869482,-0.7091,-5.143969482\nCOc1cc2c(cc1C[NH+]1C[C@@H]3CCC[C@H](C1)C3(O)c1ccccn1)CCC2,5.299426023,2.2815,-3.017926023\nCC(C)CN(C)CN1C(=O)C(=O)N(C23CC4CC(CC(C4)C2)C3)C1=O,4.036679938,2.2913,-1.745379938\nC[C@@H](O)c1ccc(N(C)[C@H](C)c2ccccc2F)cc1O,3.081308678,3.782,0.700691322\nCOc1ccc(NC(=O)N[C@@H]2CCO[C@@H]2C)c(Br)c1,3.049111174,2.7566,-0.292511174\nCn1c(C[NH+]2CCC(CS(N)(=O)=O)CC2)nnc1-c1ccccc1,3.609551356,-0.4345,-4.044051356\nCCc1cccc(NC(=O)Cc2nc(Cc3ccc(F)cc3)n[nH]2)c1,2.201228159,3.2782,1.076971841\nCc1cc(O)c(-c2cc(C(=O)Nc3cccnc3)n[nH]2)cc1C,2.368833974,3.04644,0.677606026\nCc1cnc(CN2CCO[C@H](CN(C)c3cccn[nH+]3)C2)o1,4.141147342,0.52932,-3.611827342\nCC[NH2+][C@@H](Cc1ccc(Br)s1)[C@@H]1CSCCO1,4.741884708,2.137,-2.604884708\nO=C(NC(=S)Nc1ccc(O)cc1)c1ccccc1[N+](=O)[O-],1.926072916,2.4272,0.501127084\nCC(=O)Nc1cccc(NC(=O)C(=O)N(C)CCc2cccs2)c1C,2.321622474,2.65452,0.332897526\nClc1cccc(CC[NH2+]Cc2cc[nH]c2)c1,3.452394629,1.9742,-1.478194629\nO=C(CSC(=S)N1CCCC1)N1CCc2ccc([N+](=O)[O-])cc21,2.513903323,2.5978,0.083896677\nCOc1ccc(N2C[C@@H](C(=O)NN3CCCCC3)CC2=O)c(OC)c1,2.730423213,1.5738,-1.156623213\nO=C(c1cccc(O)c1)N1CCN(C/C=C/c2ccccc2[N+](=O)[O-])CC1,2.224870018,2.7716,0.546729982\nCOc1cc(-c2cccc(CN3CCOCC3)c2)ncn1,2.083956027,1.9844,-0.099556027\nCc1csc(C(=O)[O-])c1NC(=O)c1cccc([N+](=O)[O-])c1,2.643339977,1.58052,-1.062819977\nCCC(CC)C(=O)Oc1ccc(S(=O)(=O)N(C)C)cc1,2.090592718,2.2785,0.187907282\nCN(C)S(=O)(=O)c1ccc(C(=O)N2CCC(Oc3nc4c(F)cccc4s3)CC2)cc1,2.426432086,3.3693,0.942867914\nCOC(=O)Cc1ccc(OCC(=O)N2CCCC2)cc1,1.744125512,1.403
3,-0.340825512\nCC[C@H](Oc1ccc2c(c1)CCC2)C(=O)N1CCCCCC1,2.545510082,3.7353,1.189789918\nCO[C@H](C(=O)Nc1ccc(C(=O)NCC(C)C)cc1)c1ccccc1,2.252969911,3.3986,1.145630089\nO=C1C[C@@H](CNC(=O)Nc2ccc3c(c2)COC3)c2ccccc2N1,2.934345456,2.9643,0.029954544\nC=CCN(CC(=O)[O-])C(=O)CSc1cccs1,3.162627302,0.6047,-2.557927302\nCCOC(=O)C1CCN(CN2C(=O)NC(c3ccccc3)(c3ccccc3)C2=O)CC1,2.379611396,2.7146,0.334988604\nCC(C)([C@H](O)[C@H]1CCOC2(CCCC2)C1)[NH+]1CCCCC1,5.244483813,1.9341,-3.310383813\nNc1cc(-c2nc3cc(-n4cnnc4)ccc3o2)ccc1Cl,2.535858453,3.3111,0.775241547\nC[NH+](C)[C@@H]1CC[C@H](NC(=O)C(C)(C)CCCc2ccccc2)C1,3.853817039,2.2173,-1.636517039\nCCOc1ccc([C@H](C)NC(=O)c2ccc(F)cc2F)cc1,2.159335886,3.8545,1.695164114\nCOC(=O)C(C)(C)[C@@H]1CCC[NH+](Cc2nnc(C)n2C2CC2)C1,4.613226345,0.91552,-3.697706345\nCCc1cnc(CN(C)C(=O)c2ccc(SC(F)F)cc2)s1,2.537210369,4.2924,1.755189631\nC[NH2+]Cc1cc(=O)[nH]c(-c2ccc(F)cn2)n1,3.604829082,-0.3358,-3.940629082\nC[C@@H](NC(=O)COc1ccccc1F)c1ccccc1,2.002354089,3.0819,1.079545911\nCC[C@@H](C)Oc1cccc(C(=O)Nc2cccc(C(=O)N3CC[NH+](C)CC3)c2)c1,3.343060863,2.0867,-1.256360863\nCOC(=O)c1cccc(CNC(=O)Nc2cc(F)ccc2OC)c1,1.778129704,2.9426,1.164470296\nNC(=O)c1cncc(C#CC2(NC(=O)C3CCCC3)CCCCC2)c1,2.735607005,2.5413,-0.194307005\nO=C(CN1CC2(CCOCC2)Oc2ccccc2C1=O)NCCN1CCCC1=O,3.166655355,0.809,-2.357655355\nCC[C@H]1CCC[C@@H](C(=O)[C@@](C)(CC)N2CCOCC2)C1,3.690512322,3.2728,-0.417712322\nCOc1ccc(OC)c(-n2nnnc2S[C@@H](C)c2nc3ccccc3n2C(F)F)c1,3.036983258,4.2776,1.240616742\nCC(=O)NN1C(N)=C(C#N)[C@@H](c2ccc(Cl)cc2)C2=C1CC(C)(C)CC2=O,3.23336918,3.12738,-0.10598918\nCc1cc(-c2cncnc2C2CCN(C(=O)c3cn(C)nc3C)CC2)on1,2.788500188,2.50174,-0.286760188\nC[C@H](NC(=O)Cc1cc2ccccc2[nH]c1=O)c1ccc2ccccc2c1,2.464508914,4.1012,1.636691086\nCc1ccc(CC(=O)N2CCC[C@H](C)C2)cn1,2.432306191,2.19102,-0.241286191\nCCn1/c(=N/C(=O)Cn2cccn2)[nH]c2ccc(Br)cc21,3.05958439,2.0758,-0.98378439\nCc1cc(C(=O)N2CCN(Cc3cnc4ncccn34)CC2)c(C)o1,2.619295467,1.89714,-0.722155467\nCOc1ccccc1CNC[C@@H](c1ccc(C)o1)N1CCOCC1,2.661936754,2.
75972,0.097783246\nC#CCN(Cc1ccccc1Cl)C1CCCC1,2.276202846,3.7178,1.441597154\nCCn1cc(Cn2cc(NC(=O)C3CCN(S(C)(=O)=O)CC3)cn2)cn1,2.468456254,0.7579,-1.710556254\nCC[C@@H](C)NC(=O)[C@@H](C)NCc1cccc(C(N)=O)c1,2.653152917,1.1783,-1.474852917\nNC(=O)C1CC[NH+](C/C=C/c2ccccc2)CC1,3.578814538,0.48,-3.098814538\nc1ccc(-c2cc(CSc3ncnc4sccc34)on2)cc1,2.382726154,4.6386,2.255873846\nC[C@H](NC(=O)N(C)C)c1cccc(C#CCCO)c1,2.827051719,1.7527,-1.074351719\nCC(=O)CCc1ccc(O[C@H](C)C(=O)N[C@H]2CCCC[C@@H]2C)cc1,2.92425479,3.6704,0.74614521\nO=Cc1c(C2CC2)oc2ccc(OCc3ccc(Cl)nc3)cc12,2.624109703,4.7501,2.125990297\nCN(Cc1cscn1)c1ccc(S(=O)(=O)C(C)(C)C)nc1,2.900280867,2.7467,-0.153580867\nCC(C)[NH+]1CCN(c2ccc(C(=O)N3C[C@H](C)OCC3(C)C)cc2)CC1,4.047651791,1.4394,-2.608251791\nO=C(NCCC[NH+]1CCCCC1)N1CCO[C@H](c2ccccc2)C1,3.814447251,1.2284,-2.586047251\nC[C@H](C(=O)N(C)CCC#N)[NH+](C)Cc1ccccc1,3.961452338,0.46188,-3.499572338\nO=C(/C=C/c1ccc(Cl)cc1)Nc1nnc([C@H]2CCCO2)s1,2.806479793,3.6949,0.888420207\nCC[C@@](C)([C@H](NC)c1cccc(OC)c1)[NH+]1CCCCC1,4.2931591,2.1932,-2.0999591\nCn1c(=O)c2[nH]c(NCCCCl)nc2n(C)c1=O,2.736699884,0.0011,-2.735599884\nCc1csc(C2(NC(=O)CCCc3nnc(-c4ccccc4)o3)CCCC2)n1,2.643283154,4.40992,1.766636846\nCCSc1ccc(Cl)cc1C(=O)Nc1cccnc1C,2.132314147,4.40772,2.275405853\nC[C@@H]1CCCN(C(=O)N[C@@H]2CCCCC[C@@H]2[NH3+])C1,3.76711659,1.3711,-2.39601659\nO=C(CN1CCCOC1=O)N1CCC[C@@H]([NH+]2CCCCCC2)C1,4.143119308,0.2786,-3.864519308\nCc1cc(SCCCSCC#N)nc2ccccc12,2.540944048,4.2822,1.741255952\nO=C(CCNc1ccccc1[N+](=O)[O-])OC1CCC1,2.06472993,2.4925,0.42777007\nCCCSC1=NC(=O)[C@H]2C(=N1)NC(=O)C[C@H]2c1cc(Br)ccc1OC,3.954159567,3.1153,-0.838859567\nC=CCN(Cc1cccc(C#N)c1)C[C@H](O)c1cccc(F)c1,2.763633613,3.41898,0.655346387\nO=C([O-])c1cccc2c1ccn2Cc1cscn1,2.841347083,1.5096,-1.331747083\nC[C@@H]1CCCCN1C(=O)c1ccc(NS(=O)(=O)c2ccc3c(c2)=[NH+]C(=O)[NH+]=3)cc1,3.780275122,-1.964,-5.744275122\nCC[C@@H](Cl)CCc1cc(C)cc(F)c1,2.71715977,4.08412,1.36696023\n[NH3+][C@H](Cc1cccc(F)c1F)c1cc(Cl)sc1Cl,3.588641574,3.8588,0.270
158426\nCc1cnc(N(C)CC2CCCC2)c([N+](=O)[O-])c1,2.421682195,2.92462,0.502937805\nCc1cccc2c(CCC(=O)N3CCC[C@@H](n4cncn4)C3)c[nH]c12,2.996277595,2.86412,-0.132157595\nCc1noc(C)c1[C@@H]1CCCN1C(=O)CN1CCNC1=O,3.158593888,0.98014,-2.178453888\nCCCCCCSc1nc2ncccn2n1,2.464580554,2.7967,0.332119446\nCc1ccc(OC(=O)C2CCN(C(=O)c3ccc(F)cc3)CC2)c(C)c1,1.905636655,3.90034,1.994703345\nCC[C@]1(C2CC2)CC(=O)NC(=O)[C@H]1Cc1ccccc1,3.395847148,2.6982,-0.697647148\nO=C(CSc1ncc2ccccn12)N1CCc2sccc2C1,2.587505606,3.0728,0.485294394\nCC(=O)Nc1ccc(NC(=O)c2ccccc2C(=O)c2ccc(F)cc2)cc1Cl,1.8770974,4.9208,3.0437026\nCc1ccc(Cl)cc1NC(=O)N1CC[C@@H](N(C)C(=O)OC(C)(C)C)C1,2.673373595,4.12152,1.448146405\nNc1nc(CC(=O)N2CCC[C@@H](c3[nH]ncc3-c3ccccc3)C2)cc(=O)[nH]1,3.16264224,1.6909,-1.47174224\nC[C@H](NS(=O)(=O)c1ccccc1)C(=O)N1C[C@H](C)c2ccccc21,2.879521476,2.5037,-0.375821476\nCc1cccc2cc(C(=O)NC[C@H](C)C(=O)[O-])oc12,3.101639588,0.85702,-2.244619588\nCNC(=O)c1cccc(OC(=O)c2oc3ccc(OC)cc3c2C)c1,2.081208379,3.32862,1.247411621\nCC[C@@H](NC(=O)Nc1cc(COC)ncn1)c1ccc(C)c(F)c1,2.843322787,3.34332,0.499997213\nCCc1nn(C)cc1NC(=O)N1CCN(C(=O)[C@H]2CCCO2)CC1,3.02663316,0.8376,-2.18903316\nCC(C)Nc1nnc(SCc2cn3cc(Cl)ccc3n2)s1,2.651955928,3.9518,1.299844072\nCC(=O)Nc1cccc(N[C@H](C)C(=O)NC[C@H](C)c2ccccc2)c1C,2.750741611,3.67372,0.922978389\nCC(C)(CCC(=O)OC(C)(C)C)NC(=O)c1ccc(Br)o1,2.503298166,3.6724,1.169101834\nCCOC(=O)c1c(NC(=O)c2cc3cccc(OC)c3oc2=O)sc(C)c1C,2.294956795,3.90894,1.613983205\nCC1(C)C[C@H]([NH2+]C(C)(C)C(N)=O)C(C)(C)O1,4.587654623,0.1598,-4.427854623\nFc1cccc(C[NH2+][C@H]2CCC[C@@H](C(F)(F)F)C2)c1,4.065181131,3.0102,-1.054981131\nCOc1ccc(-c2nc(S(=O)(=O)c3ccccc3)c(NCc3ccc4c(c3)OCO4)o2)cc1,2.369919411,4.5238,2.153880589\nCCN(C(=O)c1sc(NC(=O)c2ccco2)cc1C)[C@@H](C)C(C)C,3.067582502,4.40842,1.340837498\nCC(=O)c1ccc(N2CCN(CC(=O)Nc3ccc(F)cc3)CC2)cc1,1.745600799,2.789,1.043399201\nCC(C)(C)c1ccccc1NC(=O)CNc1cc2c(cc1Cl)NC(=O)CO2,2.379469962,4.019,1.639530038\nCOc1ccc(NC(=S)NC(=O)c2ccccc2OC(C)C)cc1Cl,1.95587331,4.2626,2.30
672669\nCOCCn1ccc2cc(NC(=O)c3cccs3)ccc21,2.079198728,3.6015,1.522301272\nCCOc1ccc(C(=S)NC[C@@H]2CCCO2)cc1,2.528851633,2.5294,0.000548367\nCCCc1c(C(=O)NC2CCN(C(=O)OCC)CC2)cnn1-c1ccccc1,2.229874275,3.1755,0.945625725\nC#Cc1cccc(NC(=O)NCCC[NH+]2CCCCCC2)c1,3.486046434,1.6384,-1.847646434\nCCc1ccc([C@@H](O)[C@@]2(C)CCCO2)cc1,3.291365597,2.8515,-0.439865597\nCOC(=O)[C@@]1(F)CCN(C(=O)Cc2cccc(OC)c2)C1,2.848120431,1.3513,-1.496820431\nCNC(=O)CC1CCN(C(=O)c2nc(-c3ccccc3)oc2C2CC2)CC1,2.470756132,3.2073,0.736543868\nO=C1N[C@H](CO)C(=O)N2CCN(Cc3cnn(-c4ccccc4Cl)c3)C[C@H]12,3.293818928,0.0292,-3.264618928\nCNc1nc(C)cc(-c2cccc3cnccc23)n1,2.234845867,3.04192,0.807074133\nCCn1cc(-c2nnc(SCc3ccccc3Cl)n2C)cn1,2.291699181,3.6442,1.352500819\nO=C(CSc1nc(-c2ccco2)nc2ccccc12)Nc1nccs1,2.346398845,4.0771,1.730701155\nO=C1CCc2cc(S(=O)(=O)Oc3cccc(C(F)(F)F)c3)ccc2N1,2.341990367,3.3578,1.015809633\nCC[C@@H](C)c1noc([C@H]2CCCN(C(=O)Cc3cccs3)C2)n1,3.228844727,3.5933,0.364455273\nCCCOc1ccc(Br)cc1C[NH+]1CCC(c2noc(C)n2)CC1,3.614952536,2.89182,-0.723132536\nCCNc1cc[nH+]cc1S(=O)(=O)[N-]c1ccc(C)[nH+]c1C,5.185505424,1.75754,-3.427965424\nCc1cc([C@H]([NH3+])C2CCCCCC2)sc1Br,3.776193085,4.07242,0.296226915\nCC1(C)CCC[C@H]1NS(=O)(=O)c1ccc(N)cc1Cl,3.001833462,2.7792,-0.222633462\nCc1ccc(Cl)c(-n2nnc(C(=O)Nc3ccccc3F)c2C)c1,2.064234641,3.92894,1.864705359\nCCOC(=O)[C@H]1CCCN(C(=S)Nc2ccccc2F)C1,2.457567813,2.7976,0.340032187\nC=CC[C@H](C)[C@@H](C)[NH2+][C@@H](C)CS(C)(=O)=O,4.867243714,0.5836,-4.283643714\nCOc1cc([C@@H]2CC(O)=Nc3c(C(=O)[O-])cn(-c4ccc(Cl)cc4)c32)cc(OC)c1O,3.620591927,3.3406,-0.279991927\nCCC([NH3+])(CC)C(=O)N[C@@H]1C[C@H]2C[C@@H]1[C@H]1CCC[C@@H]12,5.358506374,1.728,-3.630506374\nO=C(CNC(=O)N1CCC(OCc2ccccc2F)CC1)N1CCCCC1,2.212233062,2.5288,0.316566938\nCOC(=O)c1ccc(Oc2ncnc(Oc3cccc4cccnc34)c2[N+](=O)[O-])cc1,2.403730151,4.3042,1.900469849\nCC[NH2+]C[C@@H](O)CN1C(=O)CCCC1=O,4.011495477,-1.5303,-5.541795477\n[NH3+][C@H]1C=C[C@@H](C(=O)NC2(CC(=O)[O-])CCCCC2)C1,4.700839766,-0.8679,-5.568739766\nCOc1cccc([C
@H]2C(C(=O)c3ccc(Cl)cc3)=C([O-])C(=O)N2c2cc(C)on2)c1,3.126908422,3.23022,0.103311578\nCC(C)CC[NH2+]Cc1cc(F)ccc1F,3.319744001,2.0743,-1.245444001\nCC(C)C[C@H](NC(N)=O)C(=O)NCCc1ccc(Cl)cc1Cl,2.501281448,2.7351,0.233818552\nCc1nc(C(=O)N(C)Cc2ccc(C(F)(F)F)cc2)nn1-c1ccc(F)cc1,2.28443192,4.00582,1.72138808\nCCCc1nnc(NC(=O)c2c(Cl)ccc(Cl)c2Cl)s1,2.26792167,4.7031,2.43517833\nO=C([O-])c1ccc[nH+]c1NC[C@H](c1ccccc1)N1CCOCC1,3.813179172,0.3496,-3.463579172\nCOCCOc1ccc(CNC(=O)N2CCC[C@H](C)CC2)cn1,2.587574312,2.4384,-0.149174312\nCc1nccc2c1=C[C@@H](C(=O)N[C@@H](C)c1ccccc1)C(=O)[NH+]=2,4.094326362,-1.09548,-5.189806362\nCOc1ccccc1NC(=O)c1ccc(NC(=O)Cn2nc(C(=O)[O-])c3ccccc3c2=O)cc1,2.34156721,1.6596,-0.68196721\nO=C(NC[C@@H]1COc2ccccc2C1)c1n[nH]c(=O)c2ccccc12,2.702355993,1.9042,-0.798155993\nCC1(C)CCC[C@@H]1[NH2+][C@@H]1CCCC1(C)C,4.747288252,2.7072,-2.040088252\nCC(C)n1nccc1C(=O)OC[C@H]1CN(Cc2ccccc2)CCO1,2.811890846,2.5218,-0.290090846\nCCc1ccc([C@@H](Br)c2ccc3c(c2)C[C@@H](C)O3)o1,3.294719399,4.6497,1.354980601\nCc1ccc(S(=O)(=O)N2CCCCC2)cc1C(=O)N(C)CC[NH+](C)C,2.941929517,0.38612,-2.555809517\nCC[C@@H]1CCc2c(sc(=O)n2CN2CC[NH+](C)C[C@@H]2c2ccccc2)C1,4.511671939,1.9538,-2.557871939\nC=CCC[C@H](O)c1ccc(F)cn1,3.059250513,2.2203,-0.838950513\nCC(C)(C)NC(=S)N/N=C/c1cccnc1,2.423634839,1.6781,-0.745534839\nCC1CCN(C(=O)c2ccc(CSc3nc4ccccc4[nH]3)cc2)CC1,2.05068455,4.7273,2.67661545\nCc1c(C)n(-c2ccc(Br)cn2)c2nc[n+](Cc3ccncc3)c(N)c12,2.971114521,3.11284,0.141725479\nCOc1ccc([C@H]2Nn3c(C)nnc3S[C@@H]2C(=O)N2CCCC2)cc1,3.304234512,1.97662,-1.327614512\nCc1ncc(CC(=O)N2CCc3c(c(C(=O)[O-])nn3CC3CC3)C2)c(=O)[nH]1,3.159885346,-0.82428,-3.984165346\nCOCc1nc2n(n1)CCC[C@@H]2NC(=O)Nc1cnccc1C,3.196257529,1.78452,-1.411737529\nCc1ccc(NC(=O)C[C@H](c2cccc(C)c2)n2cccc2)cc1,2.522819487,4.72314,2.200320513\nCC1CCN(C(=O)CN2CCN(c3cc(C#N)ccn3)CC2)CC1,2.242431466,1.33378,-0.908651466\nN#Cc1ccnc(N2CCC(CCC(=O)NCc3cccc(Cl)c3)CC2)c1,2.250365015,3.91968,1.669314985\nCOCC[C@@H](C)C(=O)Nc1cccc(Oc2ccncc2)c1,2.484806648,3.485,1.
000193352\nO=C(Cn1nc(C(=O)[O-])c2ccccc2c1=O)Nc1ccc2c(c1)C(=O)c1ccccc1C2=O,2.536245035,1.1741,-1.362145035\nCC(C)(C)c1csc(CNC(=O)N[C@@H]2CCc3c(O)cccc32)n1,3.019480995,3.6329,0.613419005\nCCn1cncc1[C@@H]1OCC[C@H]1C[NH2+]C(C)(C)C,4.646004856,1.3425,-3.303504856\nO=C(CN1C(=O)N[C@]2(CCCc3sccc32)C1=O)N[C@H](c1ccccc1)c1cccs1,3.709928512,3.7988,0.088871488\nCc1ccc(C(=O)N[C@H](CC[NH+](C)C)c2ccc(Cl)cc2)s1,3.43628494,2.71562,-0.72066494\nC/C=C/C[S@@](=O)Cc1nc(-c2cccs2)oc1C,3.517488633,3.53632,0.018831367\nNc1nonc1-c1noc(COc2ccc(Br)cc2)n1,2.399600676,2.0433,-0.356300676\nCOC[C@H]1CCCN(C(=O)N[C@@H]2C[C@](C)(OC)C2(C)C)C1,3.874154216,2.258,-1.616154216\nC[NH+](C)[C@H]1CC[C@H](NC(=O)N(Cc2c(F)cccc2Cl)C2CC2)C1,3.994602094,2.2187,-1.775902094\nCC(=O)/N=C1\\S[C@@H]2CS(=O)(=O)C[C@@H]2N1c1ccc(Br)cc1Cl,3.526675094,2.7238,-0.802875094\nCc1c(Cl)cccc1S(=O)(=O)N1CCC(C(N)=O)CC1,2.029710667,1.53442,-0.495290667\nCN(Cc1ncnn1C)C(=O)[C@@H]1CC(=O)N(CC(F)(F)F)C1,3.223841813,0.1843,-3.039541813\nCC(=O)/C=C(/NNC(=O)c1cccc([N+](=O)[O-])c1)c1ccccc1,2.269320979,2.4593,0.189979021\nCC(C)c1cc(C(=O)NC[C@@H]2CCC[NH+](C(C)C)C2)n(C)n1,4.366273556,0.9766,-3.389673556\nCC[C@](C)([NH3+])CO[C@H]1CCOC2(CCCCC2)C1,4.663770551,2.2955,-2.368270551\nCc1ccc(-n2nc(C(=O)NC[C@H]3CCCO3)c(=O)n(Cc3cccc(F)c3)c2=O)cc1Cl,2.883985307,2.45222,-0.431765307\nC[C@H](Cc1cccs1)N(C)C(=O)NCC1CC1,2.835861306,2.7305,-0.105361306\nCOc1ccc(C(=O)N[C@@H](Cc2ccccc2)c2nc3ccccc3[nH]2)cc1[N+](=O)[O-],2.584541779,4.1935,1.608958221\nCCCn1c(N)c(Br)c(=O)n(CC(=O)NC)c1=O,2.537467799,-0.4893,-3.026767799\nCc1nc(/C=C/C(=O)Nc2nc(-c3ccccc3)ns2)cs1,2.462739944,3.62192,1.159180056\nCc1ccccc1OCCCC(=O)NCc1ccccc1,1.537882208,3.47042,1.932537792\nCC[C@H](C)OCc1ccc(C(=O)NN)cn1,2.8132977,1.0002,-1.8130977\nCCCc1noc(-c2cc(S(=O)(=O)NC)ccc2OC)n1,2.261944947,1.6058,-0.656144947\nCc1cc(C)n2cc(CN3C[C@@H](C)C[C@H]([NH3+])C3)nc2n1,4.026418302,0.79844,-3.227978302\nCc1noc(C)c1[C@@H](C)NC(=O)NCCCC[NH+]1C[C@@H](C)C[C@H](C)C1,4.640605204,1.99264,-2.647965204\nC[C@@H](NC(=O)C(C
)(C)C)C(=O)Nc1ccc(F)c(C(=O)N(C)C)c1,2.557460441,2.0168,-0.540660441\nO=c1oc2cccnc2n1CC[NH+](Cc1ccccc1F)C1CCCCC1,3.845392421,2.5464,-1.298992421\nCOc1cccc(CCC(=O)N2CCC(OCc3ccccc3F)CC2)c1,2.033454364,3.9747,1.941245636\nCCn1c(N/N=C/c2ccc(N(C)C)cc2)nc2c1c(=O)[nH]c(=O)n2C,2.59704872,0.9552,-1.64184872\nCc1cc2n(n1)CCC(=O)N2[C@@H](C)C(=O)NCc1ccccn1,3.100623436,1.02812,-2.072503436\nCc1ccc(CC(=O)NC[C@@H]2COc3ccccc3O2)cc1,2.34855287,2.49372,0.14516713\nCC[C@H]1COCCN1Cc1c[nH]nc1-c1ccccc1F,3.202033955,2.8266,-0.375433955\nCC[C@H](C)N1C(=O)c2ccccc2N[C@@H]1c1ccc(Cl)c([N+](=O)[O-])c1,3.14156896,4.6132,1.47163104\nCS(=O)(=O)c1ccc(NC(=O)Cn2c(-c3ccccc3)cc3ccccc32)cc1,1.961669185,4.3505,2.388830815\nCS[C@@H]1CC[C@@H](NC(=O)/C=C/c2ccc(OCc3cccnc3)cc2)C1,3.120059729,4.0741,0.954040271\nCS[C@H](CO)[C@@H](C)NC(=O)NCc1cc(=O)[nH]c2ccccc12,3.361951895,1.4397,-1.922251895\nO=C(NCC1CC1)C1([NH2+]Cc2cnn(-c3ccccc3)c2)CCCC1,3.257340383,1.7747,-1.482640383\nCC(C)[C@@H]1CN(C(=O)NC[C@H]2CC[NH+](C3CC3)C2)CCS1,5.10628001,0.8366,-4.26968001\n[NH3+]C[C@H](CC(=O)NC1CC1)N1CCn2cnnc2C1,3.791785348,-1.6271,-5.418885348\nO=C(NN1Cc2ccccc2C1)c1cnn(-c2ccccc2)n1,2.463468958,1.9279,-0.535568958\nCOc1ccc(S(=O)(=O)[N-]c2ccc(Cl)c(Cl)c2)cc1[N+](=O)[O-],3.09463754,4.3043,1.20966246\nO=C(NCC[C@@H]1C[C@H]2CC[C@@H]1C2)C(=O)Nc1nc(-c2ccccc2)cs1,3.907753927,3.6911,-0.216653927\nCCCS(=O)(=O)c1ncc(Cl)c(C(=O)Nc2sc3c(c2C(=O)OC)CCC3)n1,2.655303848,2.9028,0.247496152\nCC(C)(C)c1csc(CNc2cc(F)cc(F)c2)n1,2.393327346,4.3309,1.937572654\nCCOc1ccc(OCC(=O)N2CCN(c3nccn3-c3cccc(Cl)c3)CC2)cc1,2.239043704,3.652,1.412956296\nCNC(=O)[C@@H]1C[C@H]([NH3+])CN1Cc1ccc(-n2cccn2)cc1C,3.745433108,0.11152,-3.633913108\nC[C@@H](Cc1nc2ccccc2s1)NCc1ncn(C)n1,3.139160511,2.1456,-0.993560511\nCCC(CC)n1ccc(Cn2nnc(N)c2C(C)(C)C)n1,3.056509207,2.7637,-0.292809207\nCCC[C@H]1CN(C(=O)[C@H]2C[C@@H]2c2ccccc2C)CCO1,3.194344378,3.12602,-0.068324378\nC[C@H]1CCCC[C@@H]1OCCCOc1ccccc1C[NH3+],3.348375729,2.7927,-0.555675729\nO=c1oc(-c2ccccc2)nn1CN1CCN(c2ccc(F)cn2)CC1,2.3110
79692,1.8171,-0.493979692\nO=C(CCc1ccco1)NCc1cnn(-c2ccccc2)c1,2.047571641,2.7143,0.666728359\nCOc1ccc2ccccc2c1/C=C1/N=C(c2cccc([N+](=O)[O-])c2)OC1=O,2.34548966,4.1011,1.75561034\nO=C([O-])[C@H]1CCO[C@H]1Cc1ccccc1,3.392413353,0.3841,-3.008313353\nO=C([O-])C1[NH+]=c2ccccc2=[NH+]1,5.228431607,-5.8234,-11.05183161\nC=CCN1C(=O)N=C(N)[C@@H]1c1ccc(OC(C)C)cc1,3.176426978,2.4937,-0.682726978\nCc1n[nH]c(SCC(=O)NC2(C)Cc3ccccc3C2)n1,2.992188236,1.87892,-1.113268236\nC[C@H](NC(=O)NC[C@@H]1C[NH+](C)CCN1C)c1ccc(C(C)(C)C)cc1,4.117589731,1.173,-2.944589731\nC=CCN1CCN(S(C)(=O)=O)[C@@H](C(=O)NCc2ccccc2)C1,2.679932748,0.4346,-2.245332748\nO=S(=O)(c1cccn2c(SCc3c(F)cccc3Cl)nnc12)N1CCCC1,2.468769119,3.5986,1.129830881\nCN[C@@H](c1c[nH+]ccc1N)[C@@H]1CN2CCC[C@@H]2CO1,4.894342078,0.2066,-4.687742078\nC[C@@H](Sc1nc2ccccc2n1C)C(=O)N(C)Cc1ccccc1,2.431248545,3.7125,1.281251455\nC[C@H]1CCCC[C@@H]1[NH+](C)Cc1cc(F)cc(C#CC[NH3+])c1,5.04524194,1.0125,-4.03274194\nO=C(Nc1ccc2c(c1)CCCN2S(=O)(=O)c1ccccc1)c1cc2ccccc2o1,2.17677755,4.8266,2.64982245\nC[C@@H]1CCCC[C@H]1[NH2+][C@@H]1CCN(C2CCCCC2)C1=O,4.263772335,2.0621,-2.201672335\nCCc1nn(C)c2c1nc(N)n2[C@H](C)c1ccccc1F,3.276272156,2.6628,-0.613472156\nCC(C)CNC(=O)CNC(=O)C(=O)Nc1ccc2ccn(C)c2c1,2.263583154,1.0052,-1.258383154\nCc1nc(-c2nnc(NC(=O)c3ccc(S(=O)(=O)N4CCC[C@H](C)C4)cc3)o2)cs1,2.887521357,3.17442,0.286898643\nO=C(NCCc1ccccc1)[C@H]1CCCN(C(=O)c2cccc(Cl)c2)C1,2.231072868,3.5511,1.320027132\nCC1(C)C[C@H]([NH2+]C[C@@H]2CCC[C@@H]2NS(C)(=O)=O)C(C)(C)O1,4.794950037,0.6138,-4.181150037\nCc1ccc2cccc(NC(=O)[C@H](C)SCc3nc4scc(-c5ccccc5)c4c(=O)[nH]3)c2n1,2.981012578,5.76862,2.787607422\nCN(C(=O)CS(=O)(=O)c1nnc(N2CCCC2)s1)C1CC1,2.737794284,0.5328,-2.204994284\nCC(C)c1ocnc1C[S@@](=O)[C@@H](C)c1ccc(F)c(F)c1,3.919654427,4.0861,0.166445573\nC/C(Cn1nnc2ccccc21)=N\\NC(=O)c1ccc(Br)cc1,2.193402011,2.9997,0.806297989\nCCOc1ccc(-n2ccnc(SCC(=O)Nc3ccccc3Cl)c2=O)cc1,2.118518314,4.0154,1.896881686\nCc1ccc(OC(F)F)c(CN2CCN(c3nc(C)ns3)CC2)c1,2.451202487,3.07854,0.627337513\nCc1cc
2ncc([C@H](C)NC3CC[NH+](Cc4ccccn4)CC3)c(C)n2n1,4.106032583,1.63924,-2.466792583\nCC[NH2+][C@@]1(C(=O)[O-])CC[C@@H](n2cnc3ccccc32)C1,4.410980043,-0.1667,-4.577680043\nCCCN1C(=O)CCc2cc(NC(=O)N(CC)CC(C)(C)O)ccc21,2.532985399,3.0005,0.467514601\nCOC(C)(C)c1noc(-c2cc(Br)cn2C(C)C)n1,3.212215459,3.763,0.550784541\nCc1nc(NC[C@H](c2ccc(F)cc2)[NH+]2CCCC2)cc(-c2cc[nH]n2)n1,4.361277329,2.14612,-2.215157329\nC[C@@H](O)CCC(=O)Nc1cccc(NC(N)=O)c1,2.34605947,1.2767,-1.06935947\nCC(C)(C)c1csc([C@H]2CN(Cc3ccc(O)cc3O)CCO2)n1,3.143730594,3.4253,0.281569406\nCc1ccc(-c2c[n+](-c3cccc(C(F)(F)F)c3)c3n2CCCCC3)cc1,2.729851836,5.48542,2.755568164\nO=c1[nH]c(CCl)nc2ccc(O)cc12,2.419924373,1.3675,-1.052424373\nCOc1cc([C@H](C)N[C@@H]2CCCc3nc(C)sc32)ccc1F,3.302915814,4.32742,1.024504186\nCc1ccc([N-]S(=O)(=O)[C@@H]2CCC[NH2+]C2)c(C)[nH+]1,6.031753882,0.17834,-5.853413882\nCc1ccccc1-n1nc2c(c1NC(=O)c1cc([N+](=O)[O-])ccc1Cl)CS(=O)(=O)C2,2.635545395,3.42302,0.787474605\nCc1ccc(NC(=O)C[C@@H]2C(=O)Nc3nc(-c4ccccc4)nn32)cc1C,2.991716945,3.08394,0.092223055\nCC(C)[C@H]1CC[C@@H](C)C[C@H]1[NH2+]CC1(CS(C)(=O)=O)CC1,4.753834564,1.8354,-2.918434564\nO=C(NCCc1nnc(-c2ccccc2)o1)c1ccc2c(c1)C(=O)NC2=O,2.221256139,1.5927,-0.628556139\nCc1ncc(CC(=O)N2CCC(=O)N(CC3CC3)[C@@H](C(C)C)C2)c(=O)[nH]1,3.25717113,1.11632,-2.14085113\nCC[NH2+][C@@]1(C(=O)[O-])CCC[C@@H]1CCN1CCS(=O)(=O)CC1,4.746850504,-2.021,-6.767850504\nCc1ccccc1NN/C=C1/C(=O)NC(=O)c2ccccc21,2.204343137,2.22262,0.018276863\nCCN(CCC(=O)NC1CC1)C(=O)c1cnc(C)cn1,2.306046425,0.91582,-1.390226425\nCc1cccc([C@H]2CCC[NH2+]2)c1,4.094583094,1.39332,-2.701263094\nCS(=O)(=O)N1CCc2cc(C(=O)Nc3cc(Br)ccc3Br)ccc21,2.207749587,3.786,1.578250413\nCC(C)Cn1ncc(C(=O)N(C)[C@@H](C)Cc2ccsc2)c1C(F)F,3.336934928,4.2414,0.904465072\nCc1ccc(F)c(NC(=O)C(=O)NCCC[NH+]2CCCCCC2)c1,3.312323302,1.03782,-2.274503302\nCc1cc(NC(=O)[C@H](C)NS(=O)(=O)c2cccc(Cl)c2)ccc1N1CCCC1=O,2.630654339,3.08072,0.450065661\nO=C(Cn1nc(-c2ccccc2F)ccc1=O)NC[C@@H]1CCCO1,2.613433534,1.3446,-1.268833534\nCC(C)COC(=O)N1CCC([NH2+
]CCc2nnc3ccccn23)CC1,3.358366621,1.0922,-2.266166621\nCC(=O)N1CC[C@@H]([NH2+][C@H](C)c2c[nH]c3ccc([N+](=O)[O-])cc23)C1,4.043087472,1.3213,-2.721787472\nCc1cnc(C(=O)N2CCC(c3[nH]ncc3-c3ccncc3)CC2)cn1,2.56977868,2.58992,0.02014132\nCCCNC(=O)CN1CCC([NH2+][C@H](C)c2ccc(F)cn2)CC1,3.633696098,0.8357,-2.797996098\nC1CCC([NH2+]CC[C@H]2C[NH2+]CCO2)CC1,5.438513113,-0.7652,-6.203713113\nCCOC(=O)C1(C)CCN(C(=O)CSc2ccccc2Cl)CC1,2.317976552,3.6239,1.305923448\nCCN(CC)C(=O)[C@@H]1CCC[NH+]1Cc1cnc(Cl)s1,4.76589579,1.2122,-3.55369579\nCCOc1ccccc1N/C(=C\\C(=O)OC)C(=O)NC,2.268032439,1.3001,-0.967932439\nCNC(=O)COc1ccccc1-c1noc(-c2ccc(S(=O)(=O)N(C)C)o2)n1,2.535585059,1.3717,-1.163885059\nCOC(=O)CSc1cc(Cl)ccc1Cl,1.950704472,3.2585,1.307795528\nCOC(=O)[C@@H]1C(=O)C2=C(C[C@@H]1C)Nc1ccccc1N[C@@H]2c1ccc(Cl)c(F)c1,3.528144568,4.71,1.181855432\nCOCC[C@@H](C)C(=O)Nc1ccccc1S(=O)(=O)C(F)F,2.730644768,2.294,-0.436644768\nCCN(c1ccc(C(=O)NCc2ccc([S@](C)=O)cc2)cc1)C(C)C,2.670970361,3.5887,0.917729639\nCC[C@@H]1CCCC[C@H]1[C@H]([NH3+])c1c(F)cccc1F,3.827628952,3.4642,-0.363428952\nCc1cccc(C(=O)NN2Cc3ccccc3C2)c1F,2.30064616,2.79472,0.49407384\nCCn1c(=O)n(CC(=O)Nc2cc(OC)ccc2OC)c(=O)c2ccccc21,2.001306361,1.839,-0.162306361\nCc1cc(=O)c(C(=O)N2CCC[C@H](Cn3cc(CC4CCCCC4)nn3)C2)c[nH]1,3.262503707,2.95002,-0.312483707\nCc1c(-c2ccnc(N3c4ccccc4C[C@H]3C)n2)nnc2nc(-c3ccco3)nn12,3.333401872,3.62742,0.294018128\nCc1ccc(NC(=O)c2cnn(-c3ccccc3)c2C(F)(F)F)cc1S(=O)(=O)N(C)C,2.297134473,3.70212,1.404985527\nCC[C@@H](CSC)N(C)C(=O)Nc1ccc(C(=O)N(C)C)c(C)c1,2.971901104,3.30212,0.330218896\nCOc1cc2c(cc1NC(=O)c1cc(OC)c(OC)c(OC)c1)CCCC2,2.044842359,3.8521,1.807257641\nCOc1cc(C)c(Br)cc1[C@@H](Cl)[C@H]1CCO[C@@H]1C,3.81993083,4.47102,0.65108917\nCc1cnc(SCC[NH+]2CCOCC2)nc1[C@@H]1CCCN(C(=O)/C=C/c2cccnc2)C1,4.141789854,1.60662,-2.535169854\nCc1ccc(NC(=O)CCn2nnc3cc(C(=O)NCc4cccc(Cl)c4)ccc32)cc1,2.111270605,4.35192,2.240649395\nCc1ccc(SCCC(=O)NC[C@@H]2CN3CCCC[C@@H]3CO2)cc1,3.075083992,2.84672,-0.228363992\nCn1cc(CNC(=O)c2cc(-c3ccccc3)nc3ccc
(Br)cc23)cn1,2.113376849,4.3278,2.214423151\nCc1cccn2c(=O)c(C(=O)Nc3ccc([S@](C)=O)c(F)c3)cnc12,2.962959658,2.13172,-0.831239658\nCc1nonc1C(=O)NCc1csc(=O)[nH]1,3.075637706,0.05782,-3.017817706\nCOc1cc2[nH]c(C(=O)N3CCN(c4ccc(F)cc4)CC3)cc2c(OC)c1OC,2.277064802,3.2952,1.018135198\nO=C([O-])[C@H]1C[C@@H]2CCCC[C@@H]2N1C(=O)[C@@H]1CCS(=O)(=O)C1,4.106939223,-0.6693,-4.776239223\nCCn1c([C@@H](C)[NH2+]CC2([C@H](O)c3ccccc3)CC2)nc2ccccc21,3.791835426,3.1944,-0.597435426\nCCSC[C@H](C)N(C)C(=O)CCc1c[nH]c2ccccc12,2.930423354,3.7005,0.770076646\nCOc1ccc2c(C)c(CCC(=O)Nc3c(C(C)C)cccc3C(C)C)c(=O)oc2c1,2.335432391,5.92812,3.592687609\nO=C(NNC(=O)c1cccc(NC(=O)c2ccco2)c1)c1ccccc1,1.806007951,2.6067,0.800692049\nCc1ncc([C@H](C)NC[C@H]2CN3CCCC[C@@H]3CO2)c(C)n1,3.635850378,1.99734,-1.638510378\nCC[C@H]1CCCC[C@H]1n1cc(C(=O)[O-])c(=O)[nH]c1=O,3.611392278,0.0414,-3.569992278\nN#C[C@H](C(=O)c1cccc(N2CCCC2=O)c1)c1ccc(C(F)(F)F)cn1,2.994926879,3.71728,0.722353121\nCCOc1cccc([C@@H](C)[NH2+][C@H](C)Cn2cnc(C)c2C)c1,3.8512171,2.61174,-1.2394771\nO=C(OCc1nnsc1Cl)c1cccc(N2CCCS2(=O)=O)c1,2.690322495,2.0884,-0.601922495\nCc1cc(NC(=O)c2cc(Cl)ccc2Cl)n(C(C)C)n1,2.145072206,4.33152,2.186447794\nCc1ccc(OC[C@@H](O)C[NH2+]C[C@@]2(C)CCCO2)cc1,3.972963039,0.86722,-3.105743039\nCSC(C)(C)CNC(=O)c1ccc(S(=O)(=O)NC2CC2)cc1,2.376249042,1.9987,-0.377549042\nCc1ccccc1-n1nnnc1[C@@H](c1ccccc1)N(C)Cc1cccc([N+](=O)[O-])c1,2.833420949,4.10032,1.266899051\nCOc1cc(C)c(NC(=O)N2CC[C@@H](C)C[C@@H]2C)cc1OC,2.80715971,3.66452,0.85736029\nCCC[NH2+]CC/C=C(/C)c1ccc(C)c(C)c1,3.433729102,3.07024,-0.363489102\nCS(=O)(=O)c1nc(C(=O)Nc2ccc3c(c2)Cc2ccccc2-3)c2ccccn12,2.440662697,3.5613,1.120637303\nO=C(N[C@@H](c1ccc2c(c1)OCCO2)C1CC1)NC1(c2noc(C3CC3)n2)CCCC1,3.346508313,3.938,0.591491687\nCC(=O)Oc1ccccc1-c1nc2s/c(=C\\c3cc(C)c(OC(C)=O)c(C)c3)c(=O)n2n1,2.667587844,2.83314,0.165552156\nO=C(NCC(=O)N1CCN(C(=O)c2ccccc2)CC1)c1ccc(Br)o1,2.03066504,1.7565,-0.27416504\nO=C(NC1CCCC1)[C@@H]1CCCN1C(=O)Nc1cccc(Cl)c1,2.401089697,3.3951,0.994010303\nCc1c(C(=O)
N2CCC3(CC2)C(=O)NCCC[NH+]3C)cnc2ccnn12,4.613407565,-0.95288,-5.566287565\nCC[C@@H](C)CN(C)c1ccccc1C[NH2+]C1CC1,3.802154805,2.3947,-1.407454805\nO=C(Nc1ccnn1Cc1cccc(Cl)c1Cl)c1cccc(S(=O)(=O)NC2CCCC2)c1,2.387381354,4.7114,2.324018646\nCOc1ccc(C(=O)Nc2ccccc2C(=O)Nc2ccccn2)cc1,1.688012874,3.5948,1.906787126\nCC(=O)Nc1cccc(-n2cc3c(c2-c2ccccc2Cl)c(=O)n(C)c(=O)n3C)c1,2.439185247,3.3067,0.867514753\nCc1ccc2nc(C)c(C(=O)NCC(C)(C)c3ccncc3)cc2c1,2.306965961,3.95424,1.647274039\nCSc1ccsc1C(=O)NC1(c2cccc(Cl)c2)CC1,2.584660576,4.5425,1.957839424\nO=C(OCc1nnc(-c2ccccc2Cl)o1)C1CCOCC1,2.256423598,2.8598,0.603376402\nCOc1ccc(-c2csc(NC(=O)CSc3nc4ccccc4[nH]3)n2)cc1OC,2.179890746,4.4344,2.254509254\nCCc1cc([C@@H]2CCC[NH2+]2)cc(CC)c1Cl,4.274119239,2.8631,-1.411019239\nCc1nc2sc3c(NC[C@H]4CCCO4)ncnc3c2c2c1COC(C)(C)C2,3.430342051,3.99012,0.559777949\nCc1cncc(NC(=O)NCCc2cccc(F)c2)c1,1.947457496,2.89332,0.945862504\nCN(C)c1ccc(NC(=O)[C@@H]2CCCN2C(=O)C2CC2)cn1,2.57703349,1.4871,-1.08993349\nO=C1O[C@H](CCN2CCOCC2)CN1Cc1ccccc1,2.614376716,1.7297,-0.884676716\nO=C(Cn1c(Cl)nc2ccccc21)c1ccc2c(c1)OCCO2,2.226699819,3.3438,1.117100181\nCC(=O)N[C@H](CC(=O)Nc1nc[nH]n1)c1ccc(C)cc1,3.10270471,1.31912,-1.78358471\nCCC[C@@]1(CC)NC(=O)NC1=O,3.343567709,0.7747,-2.568867709\nCC(C)[C@@H](O)c1ccn(CCC(C)(C)C)c1,3.326211986,3.6137,0.287488014\nO=C(CNS(=O)(=O)c1ccccc1Cl)NCc1ccco1,2.015111156,1.5277,-0.487411156\nCCSC1=C(C#N)[C@H](CC(C)C)C2=C(CCCC2=O)N1,3.509414391,3.74728,0.237865609\nC[C@H]1CC[C@H](NC(=O)NC[C@](C)(O)c2ccsc2)[C@H](C)C1,3.828507393,3.0795,-0.749007393\nO=C(Cc1ccc(-n2cccn2)cc1)Nc1cccc(CN2CCOC2=O)c1,2.246151822,3.0057,0.759548178\nCOCCSc1ccccc1C(=O)N1C[C@H](C)O[C@@H](C)C1,2.919174745,2.6745,-0.244674745\nCN[C@@H]1[C@@H](C[NH+](C)CCO)C(C)(C)OC1(C)C,5.466499996,-0.715,-6.181499996\nCC(C)COC1CCN(C(=O)/C=C/c2cccc3cccnc23)CC1,2.385662775,3.9116,1.525937225\nCc1cc(-n2c(C)cc(C(=O)CN3C(=O)NC(c4ccccc4)(c4ccccc4)C3=O)c2C)no1,2.659919657,4.06886,1.408940343\nCCC[NH2+][C@@H](CC[C@H]1CCCO1)[C@@H]1C[NH+]2CCN1CC2,6.42
4619424,-1.1297,-7.554319424\nCC[C@@H](C)NC(=O)[C@H](NC(=O)c1cccc(C)c1)C1CCN(C(=O)c2ccccc2)CC1,2.852569634,3.56052,0.707950366\nCc1ccc([C@H]2CSCCN2C(=O)N[C@H]2CC[C@H]([NH+](C)C)C2)cc1,4.245038493,1.86012,-2.384918493\nC[C@H](NC(=O)c1ccnn1C)C12CC3CC(CC(C3)C1)C2,4.194819005,2.7548,-1.440019005\nCn1ncnc1CCNC(=O)N1CCC[C@H]1Cc1ccccc1,2.744029526,1.7743,-0.969729526\nO=C([O-])COc1cc(-c2ccncc2)nc(N2CCN(c3ccccc3F)CC2)n1,2.663513821,1.133,-1.530513821\nN#C[C@H]1CNCCN1C(=O)c1ccsc1,3.291177871,0.68568,-2.605497871\nCC[NH+]1CC[C@H](N(C)CC(=O)OC(C)(C)C)[C@H](C)C1,4.803468008,0.5731,-4.230368008\nO=C(Cc1ccc2c(c1)OCCCO2)NCCNC(=O)[C@H]1C=c2ccccc2=[NH+]1,3.610596228,-1.8141,-5.424696228\nC[C@@H]1CCN(C(=O)NCCn2c(=O)[nH]c3ccccc32)C[C@@H]1O,3.115989084,0.7419,-2.374089084\nCCC1(CC)CC(=O)N(CCOc2ccccc2)C1=O,2.279792993,2.6307,0.350907007\nCSc1cccc(NC(=O)[C@@H]2CCCN2C(=O)c2cccs2)c1,2.519958928,3.7133,1.193341072\nCCOC(=O)c1ccc(N2C(=O)c3oc4ccc(F)cc4c(=O)c3[C@@H]2c2ccc(SC)cc2)cc1,2.869982169,5.5805,2.710517831\nCc1ccsc1CN[C@@H]1COC[C@@H]1O,3.438656169,0.90582,-2.532836169\nCO[C@H](C)C(=O)Nc1cccc(NC(=O)N(C)[C@@H](C)c2cccnc2)c1,2.920062765,3.2799,0.359837235\nCN(C(=O)c1ccc2c(c1)CCO2)C1CCC(=O)CC1,2.371567617,2.2052,-0.166367617\nCC(C)c1csc(/C(C#N)=C\\c2csc(-c3ccc(F)cc3)n2)n1,2.706482804,5.59328,2.886797196\nCOc1ccc(CC(=O)N2CCO[C@@H](COCc3cccc(C)c3)C2)cc1,2.523511281,2.99032,0.466808719\nO=C(COc1ccc(Cl)cc1)N[C@@H](c1ccccc1)c1cccs1,2.260327084,4.6861,2.425772916\nCCOC(=O)[C@H]1C(=O)C(=O)N(c2ccccc2)[C@@H]1c1ccc([N+](=O)[O-])cc1,2.954844007,2.4311,-0.523744007\nC[NH+](C)CCCOc1ccc(NC(=O)[C@H]2CCC(=O)O2)cc1,3.481344542,0.2441,-3.237244542\nCC(C)(C)OC(=O)N(CCNC(=O)N1CCC([NH+]2CCCC2)CC1)C1CC1,3.689186885,1.2386,-2.450586885\nCC1=C[C@@H](c2ccc(S(=O)(=O)Nc3ccc(Cl)cc3C)o2)N=N1,3.54172818,4.45292,0.91119182\nCc1nn(-c2ccccc2)c(COC(=O)C23C[C@@H]4C[C@@H](CC(O)(C4)C2)C3)c1C#N,4.885278242,3.4269,-1.458378242\nCC(C)C(=O)Nc1cccc(NC(=O)C(=O)NCC(C)(C)c2ccncc2)c1,2.297021131,2.7086,0.411578869\nCc1cc(C)n2nc(C)c(C(=
O)O[C@@H](C)[C@@H]3CCCO3)c2n1,3.472079899,2.37886,-1.093219899\nCCCN[C@]1(C#N)CC[C@@H](Sc2nc(C)c(C)c(C)n2)C1,3.865569071,3.30844,-0.557129071\nCCOC(=O)C1(c2nc([C@H]3CC[C@H](C)C3)no2)CCCC1,3.496369973,3.3481,-0.148269973\nCCCNC(=O)c1ccc2c(c1)c(C)c(C)n2C,1.967924688,2.93494,0.967015312\nCCCOc1ccc(NCC(=O)Nc2ccc(F)cc2)cc1,1.632813467,3.6651,2.032286533\nS=c1n(CN2CCSCC2)nc(N2CCCCC2)n1-c1ccccc1,2.583555394,3.39989,0.816334606\nCc1cc(C(F)F)nn1CC(=O)N1CC(c2ccncc2)C1,2.61704593,2.15012,-0.46692593\nO=[N+]([O-])c1ccc(Cl)c(S(=O)(=O)/N=C([O-])/C=C/C2CC2)c1,3.166163675,1.6619,-1.504263675\nCC[C@@H](C)n1ncc(C(=O)N2CCO[C@H](C#N)C2)c1C1CC1,3.768596419,2.09608,-1.672516419\nO=C(NC(C1CC1)C1CC1)N1CC[C@@H](Oc2ccc(C(F)(F)F)cn2)C1,3.144221022,3.4517,0.307478978\nC[C@H]1CCCC[C@@H]1OCCN(C)Cc1nc(=O)c2ccccc2[nH]1,3.075784642,2.9502,-0.125584642\nCOc1ccc(Cl)cc1S(=O)(=O)N1CCN(C(=O)c2cc3ccccc3oc2=O)CC1,2.197795481,2.6017,0.403904519\nC[C@@H]1CCN(c2ncc([N+](=O)[O-])cc2Cl)[C@H](C)C1,3.208323663,3.268,0.059676337\nCc1nc(N2CCN(C(=O)c3cccc(F)c3)CC2)cc(-n2cnc(C)c2C)n1,2.473499443,2.68906,0.215560557\nCOC(=O)[C@H]1CCCCN1C[C@H](O)COc1ccc(Cl)cc1Cl,2.853293962,2.7606,-0.092693962\nCOC[C@](C)(CCO)[NH2+]Cc1ncccc1C,4.061519898,0.24092,-3.820599898\nO=C(NC1CCCCC1)C1CCN(C(=O)c2cc3cc(Cl)cnc3[nH]2)CC1,2.420386909,3.5174,1.097013091\nCC(C)(C)N(CCC(=O)[O-])C(=O)N1CC[NH+]2CCC[C@H]2C1,5.296421228,-1.2902,-6.586621228\nCOC(=O)c1cc(NC(=O)NCc2ncc[nH]2)ccc1C,2.150937232,1.82642,-0.324517232\nCN(Cc1ccc(C#N)cc1)C(=O)NCCN(C)c1ccccc1,2.099813573,2.83608,0.736266427\nCCCNC(=O)C1CCN(C(=O)c2cccc(NS(=O)(=O)c3ccc4c(c3)CCCC4)c2)CC1,2.239853257,3.7446,1.504746743\nO[C@H]1Cc2ccccc2[C@H]1[NH2+]Cc1cnc(-c2ccsc2)s1,4.127840834,2.5933,-1.534540834\nN#Cc1cc([N+](=O)[O-])ccc1Oc1ccc(/C=N\\NC(=O)c2ccccc2-n2cccc2)cc1,2.503753463,4.81338,2.309626537\nC[C@H]1C[C@H](C)CN(C(=O)[C@@H](C)S(=O)(=O)c2nc3ccccc3[nH]2)C1,3.481529855,2.2296,-1.251929855\nCCN(C(=O)CN1CC[NH+](CC2CC2)CC1)c1cccc2ccccc12,3.297787028,1.8032,-1.494587028\nCc1ccc(Nc2nc3c(c(=O)
[nH]2)CCCC3)c(F)c1,2.46036033,2.83982,0.37945967\nCCC[C@H]1[C@H](C)CCCN1C(=O)NCCCCSC,3.291429317,3.7398,0.448370683\nCc1noc(C)c1CN1C(=O)N(c2cccc(Cl)c2)c2ncccc2S1(=O)=O,2.790302811,3.80244,1.012137189\nCCC[NH2+][C@@H](C)c1ccc(SCC(C)C)cc1,3.752956887,3.4691,-0.283856887\nC[C@@H](C(=O)NCCc1ccc(F)cc1)[NH+](C)Cc1ccc(Cl)nc1,3.667307699,1.6362,-2.031107699\nCC(C)(C)N1C[C@@H](C(=O)n2ccnc2)CC1=O,3.241710455,1.1703,-2.071410455\nCN(c1ccc(C(=O)Nc2ccccc2C(=O)N2CCCC2)cc1)S(C)(=O)=O,1.980031213,2.5707,0.590668787\nNc1ccc(=O)n(CCN2CC[NH+]3CCC[C@@H]3C2)c1,4.656713618,-1.2066,-5.863313618\nCc1cc(NC(=O)C(=O)N2CCN(c3ccc(Br)cn3)CC2)no1,2.379687103,1.42782,-0.951867103\nCc1ccc(C(=O)N2CCN(C(=O)CN3CCOCC3)CC2)s1,2.130681032,0.67312,-1.457561032\nCOc1ccc(C(=O)NCc2ccc(C[NH+]3CCC[C@H](C)C3)cc2)cc1[N+](=O)[O-],3.752516678,2.3482,-1.404316678\nC[NH+](Cc1ccc(OC(F)(F)F)cc1)C[C@H]1CCOC1,4.01571473,1.6364,-2.37931473\nCOC(=O)c1ccc(C(=O)N2CCC(Oc3nc4c(C)ccc(C)c4s3)CC2)cc1,2.358502992,4.38334,2.024837008\nCC[C@@H](C)[C@@H]([NH2+]C)C1([NH+](C)C)CCC(C)CC1,5.323609858,0.6877,-4.635909858\nCC[C@H](C)NC(=O)[C@H](C)NC(=O)Nc1ccc2c(c1)COC2,2.971019369,2.1415,-0.829519369\nCc1nn(CC(=O)Nc2ccc(N3CCOCC3)cc2)c(=O)c(C#N)c1C,2.242124545,1.20712,-1.035004545\nO=C([O-])CCCS(=O)(=O)Nc1ccc(Cl)cc1F,2.651310618,0.7509,-1.900410618\nCCOC(=O)c1ccccc1[N-]S(=O)(=O)c1cccc(F)c1,2.877115287,3.3965,0.519384713\nCC(C)(C)C(=O)Nc1ccc(NC(=O)c2ccccc2)nc1,1.82913395,3.3185,1.48936605\nCc1ccc(/C=C/C(=O)N(C)Cc2sccc2C)o1,2.502480003,3.62974,1.127259997\nCC(C)c1noc(CN(C)C(=O)c2csc3nc(-c4cccc(F)c4)cn23)n1,2.730284857,3.9805,1.250215143\nCCC[C@H](CCCc1c([O-])[nH+]c(N)[nH]c1=O)C(=O)[O-],4.655077806,-1.6663,-6.321377806\nCc1ccsc1-c1nc(C)c(C)nc1N,2.624802162,2.71256,0.087757838\nCn1nnc2cc(C(=O)NCC(C)(C)c3ccncc3)ccc21,2.42817859,2.0709,-0.35727859\nO=C(c1cccc(C2=NO[C@@H]([C@@H]3N=NC4=C3CCCC4)N2)c1)N1CCCC1,3.976111738,3.1926,-0.783511738\nCOC[C@H](O)CN(C)C(=O)c1cscc1C,3.166815768,1.13582,-2.030995768\nCOc1ccc2c(c1)C(N1CCN(C(=O)c3ccc(F)cc3)CC1)=Nc
1ccccc1O2,2.297544373,4.4763,2.178755627\nNS(=O)(=O)N1CCC(C(=O)NCCNC(=O)c2ccco2)CC1,2.251835206,-0.9589,-3.210735206\nCCC(CC)C(C)(C)C[C@@H](C)[NH2+]C,4.576779185,2.4206,-2.156179185\nCCc1ccc2ncc(C#N)c(Nc3cc(OC)ccc3OC)c2c1,2.189201832,4.42968,2.240478168\nC[C@@H]1CCCC[C@H]1NC(=O)N(C)Cc1nccs1,3.124031281,2.8632,-0.260831281\nCCc1ccc(OCC(=O)N(C(C)C)C2CCOCC2)cc1,2.18664966,3.0438,0.85715034\nCOC1CCN(C(=O)c2ccc(N)c(C)c2)CC1,1.975924657,1.82822,-0.147704657\nCCOC(=O)c1cc(S(=O)(=O)N2CCN(c3ccc(C(C)=O)cc3)CC2)cn1C,2.234633575,1.9153,-0.319333575\nO=C(c1occc1Br)N1CCS[C@H](c2ccccc2)C1,3.097373773,3.9724,0.875026227\nClc1cc(N2CCC[C@H](OCc3cccnc3)C2)nc2[nH]ccc12,3.074239667,3.7969,0.722660333\nCC(C)[NH2+]Cc1ccc([N+](=O)[O-])c(OC/C(Cl)=C/Cl)c1,3.652626052,2.7644,-0.888226052\nCc1cc(C)nc(Nc2nc(CC(=O)Nc3ccc(-n4cnnn4)cc3)cs2)n1,2.491890746,2.45044,-0.041450746\nCc1c2c(cc3c1O/C(=C\\c1c(F)cccc1Cl)C3=O)CN(CCCN1CCOCC1)CO2,2.96219473,4.27792,1.31572527\nCSc1ncc(CNc2cc(-c3ccccc3)nc3ccnn23)n1C,2.639763986,3.4638,0.824036014\nCC[C@@H](C)c1ccc(S(=O)(=O)[N-]c2cc(F)ccc2F)cc1,3.448403351,4.8724,1.423996649\nCOCc1cccc(NC2=[NH+]CCCCC2)c1,3.269881966,1.298,-1.971881966\nC[C@@H](Oc1ccccc1F)C(=O)NNC(=O)Nc1ccccn1,2.508059175,1.8409,-0.667159175\nCc1ccc(-n2c(C)cc(/C=N\\NC(=O)CN3CCOCC3)c2C)cc1[N+](=O)[O-],2.45769911,2.09306,-0.36463911\nCCN1C(=O)Cc2cc(NC(=O)c3ccc(Cl)c([N+](=O)[O-])c3)ccc21,2.169784347,3.4095,1.239715653\nCc1cc(I)ccc1NC(=O)C[NH+](C)[C@@H]1CCc2ccccc21,3.83053583,2.74032,-1.09021583\nCCn1cc(NC(=O)[C@H]2CSCN2C(=O)CC(C)(C)C)ccc1=O,3.085154336,2.1444,-0.940754336\nN#Cc1c(Cl)nsc1N[C@@H]1CCc2cc(Cl)ccc21,3.258787912,4.42098,1.162192088\nCc1cc(C)c(NC(=O)CNC(=O)[C@@H](NC(=O)c2ccco2)C(C)C)c(C)c1,2.611430454,2.71416,0.102729546\nC[C@@H]1CCC[C@@H](N(C)C(=O)NCc2cccc(Cn3cccn3)c2)C1,3.054786435,3.6515,0.596713565\nC[C@H](NC(=O)CN(C)Cc1nc2ccccc2c(=O)[nH]1)c1ccccc1,2.556272893,2.2323,-0.323972893\nCc1ccc(Oc2ncnc(Nc3ncccc3C)c2[N+](=O)[O-])cc1C,2.404361839,4.24096,1.836598161\nO=C(CCn1cncn1)Nc1nc(-c2ccc3cc
ccc3c2)cs1,2.191495935,3.5836,1.392104065\nCc1cc(C)c(NC(=O)[C@H](C)N(C)OCc2ccccc2)c(Cl)c1,2.804898113,4.34744,1.542541887\nCc1ncc([C@@H](C)[NH2+][C@H](C)c2ccc3c(c2)CC(=O)N3)s1,4.219180742,2.33172,-1.887460742\nCCc1ccc(C(=O)OCC(=O)N2CCCC2)o1,2.044629001,1.6212,-0.423429001\nO=C(NCCCn1ccc2ccccc21)C1CCN(C(=O)/C=C/c2ccccc2)CC1,2.216937347,4.0996,1.882662653\nCC/[NH+]=C(/NCc1nnc2n1CCCCC2)N[C@@H]1C[C@@H]1C,4.195178435,-0.4514,-4.646578435\nCC[C@H]1CCCN1c1nc(C(F)(F)F)ccc1C(N)=S,3.041634251,3.1134,0.071765749\nCc1ccc(C)c(NC(=O)[C@@H](C)[S@@](=O)Cc2nccn2C(F)F)c1,3.483063654,3.17094,-0.312123654\nC[C@@H]1CCCC[NH+]1CCNC(=O)CCc1ccc2c(ccn2C)c1,4.085937819,1.6844,-2.401537819\nCc1ccc(OCCOc2ccccc2)c(C(=O)[O-])n1,2.306296362,1.21132,-1.094976362\nCc1csc([C@@H](C#N)C(=O)CSc2nnc(-c3ccccc3F)n2C2CCCCC2)n1,3.136642384,5.3229,2.186257616\nO=C(CCl)N1CCCC2(CCCCC2)C1,2.906406136,2.7981,-0.108306136\nN#CCN1CCN(CC(=O)NCCCC2CCCCC2)CC1,2.20926879,1.60428,-0.60498879\nCOc1ccc(F)cc1CSCc1nc(CC(C)C)no1,2.372806593,3.8492,1.476393407\n[NH3+][C@@H]1CCCC[C@@H]1/N=C/[C@@H]1C(=O)N=C(S)N(c2ccccc2)C1=O,4.498557498,1.0857,-3.412857498\nCNS(=O)(=O)c1cc(NC(=O)Cc2cccc(F)c2F)ccc1C,2.073399905,2.36252,0.289120095\nN#CCCN(CCC(F)(F)F)C(=O)c1ccc2c(c1)CCCC2,2.435420061,3.87368,1.438259939\nCCCC(=O)N1CCC(n2nc(C(=O)OCC)cc2C2CC2)CC1,2.485511105,2.9008,0.415288895\nCC(=O)Nc1c(N)cc2c3c(cccc13)C(=O)c1ccccc1-2,2.268598471,3.5918,1.323201529\nCCN(Cc1cccs1)CN1C(=O)N[C@](CC)(c2ccccc2)C1=O,2.892918554,3.3848,0.491881446\nC[C@@H](O)[C@H]1CCOC2(CCC2)C1,4.084328607,1.7165,-2.367828607\nO=C1CCCN1C[C@H](NC(=O)N1CCN(c2nccs2)CC1)c1ccccc1,2.838055644,2.3384,-0.499655644\nC=CCc1cc(N)ccc1O[C@H]1CCOC2(CCCC2)C1,3.696392259,3.8679,0.171507741\nCOC[C@@H]1CN(C(=O)C(=O)Nc2ccccc2-c2ccccc2)CCO1,2.611122099,2.1659,-0.445222099\nO=C(CCc1ccc(Cl)cc1)NCCc1nc2cc(F)ccc2s1,2.030294342,4.3803,2.350005658\nCc1cn2c(=O)c(C(=O)N3CCC[C@H](n4cccn4)C3)cnc2s1,3.246150591,1.73822,-1.507930591\nBrc1cc(C[NH+]2CCC[C@@H](c3cn[nH]c3)C2)cc2c1OCCCO2,4.62014089,2.296,-
2.32414089\nCOc1ccccc1OCC(=O)NCC1(c2cccs2)CCCCC1,2.253408876,4.1538,1.900391124\nCC[C@@H](C)Oc1cccc(C(=O)Nc2cccc(-c3nn4cnnc4s3)c2)c1,2.865814889,4.2824,1.416585111\nCc1ccc(CN[C@@H](C(=O)NC2CC2)c2ccccc2)s1,2.532502845,3.16602,0.633517155\nC[C@@H](NC(=O)NNC(=O)[C@@H]1CCCO1)c1ccc(Cl)cc1,2.857988826,1.9104,-0.947588826\nCC[C@H](O)Cn1nc(Cn2cncn2)nc1Cc1cccc(Cl)c1,3.119028851,1.933,-1.186028851\nCC(C)[C@H](Sc1nnc(-c2ccco2)n1C)C(=O)Nc1ccc(Cl)cc1,2.778211591,4.4839,1.705688409\nCO[C@H](C)c1noc(CNc2ccc(Sc3ccccc3)cc2C)n1,2.950654307,4.84872,1.898065693\nC[C@@H](c1ccccc1)N1C[C@@H](C(=O)NCCCN2CC[NH+](C)CC2)CC1=O,3.848030117,-0.0673,-3.915330117\nCCCC[C@@H](C(=O)OC)[C@H](O)c1ccc(Br)cc1F,3.086125432,3.601,0.514874568\nO=C(NCC1(O)CCOCC1)c1ccc(CSC(F)F)o1,2.852911417,2.0067,-0.846211417\nC[C@](O)(CNC(=O)c1cccs1)CN1CCOCC1,2.825985348,0.5611,-2.264885348\nCCOc1ccccc1NC[C@@H]1COc2ccccc21,2.555787576,3.6734,1.117612424\nCCN1C=C(C[NH+]2CCC(C(=O)Nc3ccc([C@@H]4C=c5ccccc5=[NH+]4)cc3)CC2)[C@@H](C)N1,5.152966629,-0.7319,-5.884866629\nCC(C)(C)n1nccc1NC(=O)C(=O)N1CCC[C@@H](c2cn[nH]c2)C1,3.491526175,1.7059,-1.785626175\nCc1ccc(-n2ncc3c2ncn2nc(CCc4ccccc4)nc32)cc1Cl,2.444150134,4.21022,1.766069866\nCC(=O)N1CCN(Cc2cc(NC[C@H](O)c3ccsc3)cc[nH+]2)CC1,3.795660076,1.3718,-2.423860076\nCCC[C@H]1CN(C(=O)Nc2ccc(CC)c(F)c2)CCO1,2.654457773,3.4209,0.766442227\nCC[NH+]1CCN([C@H](C)CNC(=O)COc2ccccc2C#N)CC1,3.789513477,-0.33782,-4.127333477\nCCOc1ccc(N)c2ccccc12,1.62683674,2.8207,1.19386326\nCC[C@@](C)([C@@H](N)c1cccc(Cl)c1)[NH+](C)C,4.121757026,1.653,-2.468757026\nCC1CCC(NC(=O)C2=CC(=O)CS2)CC1,2.872345398,1.8811,-0.991245398\nCOc1cccc([C@@H](CO)NC(=O)NCc2scnc2C)c1,2.77534479,1.99292,-0.78242479\nC[C@@H](CCO)C[NH2+]Cc1ncc(-c2ccccc2)s1,3.718444882,1.892,-1.826444882\nC[C@@H](c1ccccc1)[C@@H]1[NH2+][C@@H](C(=O)[O-])CS1,4.769985386,-0.4551,-5.225085386\nO=C(NCCc1ccc(S(=O)(=O)NC(=O)c2ccc(Cl)cc2)cc1)c1ccc(Cl)cc1,1.894832086,4.0846,2.189767914\nCc1nc(-c2c[nH]c(C(=O)NC(C)(C)C)c2)cs1,2.51899596,2.97492,0.45592404\nCCSc1nn
c(NC(=O)c2sc3cccc(F)c3c2C)s1,2.398658841,4.56462,2.165961159\nCn1c(=O)c(C2=NN[C@H](c3cccs3)C2)c(O)c2ccccc21,3.233543543,2.7443,-0.489243543\nCc1noc(C)c1CCC(=O)N1CCC2(CC1)Nc1ccccc1S(=O)(=O)N2,3.384083936,1.94674,-1.437343936\nCOCc1ccccc1NC(=O)NCc1ccc(C)cc1N(C)C,2.104700542,3.52912,1.424419458\nCc1cc2c3c([nH]c2cc1S(=O)(=O)N[C@H](C)Cc1ccco1)CC(C)(C)CC3=O,3.268707608,4.13392,0.865212392\nCCOc1ccc([C@H](C)NC(=O)C(=O)Nc2ccc(Cl)cc2C)cc1,2.322029223,3.86302,1.540990777\nCO[C@H](CNC(=O)C(=O)Nc1cccnc1-n1cccn1)c1ccccc1,2.837716015,1.7097,-1.128016015\nCCc1ccc([C@H]([NH3+])[C@@H](C)Oc2ccccc2C)cc1,3.183736434,3.30792,0.124183566\nCc1cccc([C@H]2CCCCC[NH+]2C2CCN(C(N)=O)CC2)c1,4.215146126,2.03812,-2.177026126\nCCC[C@@H](NC(=O)c1ccc2[nH]c3c(c2c1)CCCC3)C(=O)OC,2.682829088,3.1182,0.435370912\nC[C@H]1CN(C(C)(C)CNC(=O)[C@@H]2C[C@@H]2c2ccc(F)cc2)C[C@@H](C)O1,3.550507363,2.9332,-0.617307363\nCC(C)CN(C[C@H](O)c1ccc(F)cc1)C(=O)Nc1cccnc1Cl,2.717118187,4.0976,1.380481813\nCCCCOc1ccc(Cl)cc1,1.295802195,3.5189,2.223097805\nCS(=O)(=O)NC1CCC([NH2+][C@@H]2C[C@H]3C[C@@H]2[C@H]2CCC[C@@H]23)CC1,5.544107265,1.2349,-4.309207265\nCOC[C@@H](C)CNC(=O)[C@@H]1C=C[C@@H]([NH3+])C1,4.562448454,-0.4283,-4.990748454\nCc1ccc(C[C@@H](C)[NH2+][C@H](C)c2ccc(C)c(F)c2)s1,3.86904984,3.75964,-0.10940984\nCOCCN1C(=O)[C@@H]2[C@@H](Cc3ccc(O)c(O)c3)N[C@@]3(C(=O)Nc4ccc(F)cc43)[C@@H]2C1=O,4.296201915,0.8464,-3.449801915\nCc1ccc(C(=O)C2=C([O-])C(=O)N(CCCN3CCOCC3)[C@@H]2c2cccs2)cc1,3.18358086,2.15942,-1.02416086\nCc1cc([C@H](O)[C@@H]2CCO[C@]3(CCOC3)C2)ccc1F,4.177699502,2.75322,-1.424479502\nC[C@@H]1CCC[C@H](N(C)C(=O)c2cccc(NC(=O)C(C)(C)C)c2)C1,2.889836908,4.3219,1.432063092\nCOC(=O)[C@@H]1c2ccsc2CCN1S(=O)(=O)c1ccc(F)cc1,2.801380825,2.3483,-0.453080825\nCC[NH+](CC)C[C@@H](C)NC(=O)CC1C[C@@H]2CC[C@H](C1)[NH2+]2,6.514367429,-0.6897,-7.204067429\nCOC(=O)c1c(N)sc2c1CCS2,3.06293975,1.7651,-1.29783975\nCC[C@H](C(=O)N1CCCSC[C@H]1C[NH+](C)C)c1ccccc1,3.940231943,1.6588,-2.281431943\nCc1occc1C(=O)N/N=C/c1cc(Br)cc([N+](=O)[O-])c1[O-],3.0247
13357,2.09622,-0.928493357\nCCOc1ccccc1O[C@H]1CCC[C@H]1NC(=O)c1ccc(OC)nc1,2.886927227,3.2188,0.331872773\nCc1cc(C)c(NC(=O)NCc2ccc(N3CCOCC3)cc2)c(C)c1,1.995046745,3.77016,1.775113255\nCOc1ccc(C[C@@H](C#N)C(=O)[O-])cc1OC,2.962785706,0.13598,-2.826805706\nCC(C)NC(=O)Nc1nc2c(s1)CN(C(=O)c1ccc(C(F)(F)F)cc1)CC2,2.379930821,3.8903,1.510369179\nCc1cccnc1CSc1nc2cc(S(=O)(=O)N(C)C)ccc2o1,2.451381369,3.07382,0.622438631\nCc1ccccc1[C@@H](C)NC(=O)C(=O)Nc1ccc(N(C)C)c(C)c1,2.528571728,3.18534,0.656768272\nCC(C)(C)OC(=O)N1CC(=CC(=O)NCC#N)C1,2.850974182,0.80328,-2.047694182\nO=C1c2cccnc2C(=O)N1Cc1ccc(F)cc1S(=O)(=O)N1CCCCCC1,2.385687811,2.5816,0.195912189\nCCCN(CC(=O)NC)C(=O)[C@@H]1CC[C@H]2CCCC[C@@H]2[NH2+]1,4.358803608,0.2556,-4.103203608\nCS(=O)(=O)/N=c1/[nH]c(CC(=O)Nc2c(Cl)cccc2Cl)cs1,2.978326434,2.4245,-0.553826434\nC=CCn1c([S-])nc2cc(C(=O)N(C)CC(=O)Nc3ccccc3C(F)(F)F)ccc2c1=O,2.867110276,3.2178,0.350689724\nC[C@@H](c1ccc(S(C)(=O)=O)cc1)N(C)C(=O)c1cccc(Br)c1,2.415458229,3.6858,1.270341771\nCc1c(C(=O)NNC(=O)NC[C@H]2CCCO2)sc2ccc(F)cc12,2.847237911,2.47182,-0.375417911\nCCOc1ccc(O[C@@H](C)C(=O)NNC(=O)c2[nH]c3ccccc3c2Br)cc1,2.67368861,3.5576,0.88391139\nCc1ccc([C@@H](C)[NH2+][C@H](C)c2nc3c(s2)CCCC3)s1,4.200584091,3.77742,-0.423164091\nO=C(CN1C(=O)NC2(CCCCCC2)C1=O)NCc1cccs1,2.725502984,2.0091,-0.716402984\nCC[C@H](NC(=O)CCc1cc(F)ccc1F)C(=O)OC,2.471739379,1.9652,-0.506539379\nO=C(/C=C/c1c(F)cccc1F)N1CCN(c2ncccc2F)CC1,2.325210066,2.8609,0.535689934\nCC(C)C[C@H]1NC(=O)[C@H]2CN(C(=O)c3ccc4ccccc4c3)CCN2C1=O,3.012230743,2.0373,-0.974930743\nCCOc1cc(/C=C2/C(=N)N3N=CSC3=NC2=O)ccc1OC(=O)c1ccccc1Cl,2.837941116,4.20687,1.368928884\nCc1oc(-c2ccco2)nc1CC(=O)NC[C@H](C)Oc1ccccc1,2.739838922,3.36922,0.629381078\nCCC[C@H]1C[NH2+]CC[C@@H]1c1cnn(-c2ccccc2)c1,4.019041705,2.3393,-1.679741705\nO=C(c1c[nH]c(=O)cn1)N(CC1CC[NH+](C2CCCC2)CC1)C[C@H]1CCCO1,4.283016619,0.6286,-3.654416619\nC[C@@H]1C[C@@H]1c1ccc(/C=C/C(=O)N2CCC[C@H](O)C2)o1,3.44240408,2.3995,-1.04290408\nC[C@H](NC(=O)NC[C@@](C)(O)c1cccs1)c1cccc(C#N)c
1,3.261078337,2.88768,-0.373398337\nO=C(Nc1ccnn1C1CC[NH+](Cc2ccccc2F)CC1)c1cccnc1,3.31736458,2.0895,-1.22786458\nCC(C)[C@@H](CNC(=O)NC[C@@]1(C)CCCO1)C(=O)[O-],3.979685613,-0.1232,-4.102885613\nCc1sc(NC(=O)C2CC2)c(C(=O)NCc2n[nH]c(=S)n2C(C)C)c1C,2.734164435,3.47843,0.744265565\nCc1nc(-n2cccc2)sc1C(=O)N1CCN(c2ccccc2Cl)CC1,2.289127951,3.85802,1.568892049\nCc1cc(NC(=O)N[C@H]2CCCN(c3ccccc3F)C2=O)no1,2.730594246,2.43922,-0.291374246\nCOc1ccc(Cl)c(NC(=O)[C@@H]2CC(O)=Nc3nncn32)c1,3.229683996,2.1116,-1.118083996\nO=C(Nc1ccc(Cl)cc1)C1CCN(c2ccc(-n3cccc3)cn2)CC1,2.15321418,4.3808,2.22758582\nCCc1c(N)cnn1[C@@]1(C)CCS(=O)(=O)C1,3.768605035,0.5614,-3.207205035\nCOc1ccc(OC)c(N2C(=O)/C(=C/c3ccc(O)cc3)SC2=S)c1,2.199361566,3.8152,1.615838434\nNC(=O)c1cccc(CNC(=O)COc2ccc(F)cc2Cl)c1,1.814883042,2.2732,0.458316958\nCOC[C@H](Nc1nccc(OC)n1)c1ccc(F)c(F)c1,2.847808751,2.563,-0.284808751\nCOC(=O)[C@@]1(NC(=O)CCCc2cc(Cl)sc2Cl)CCOC1,3.349771447,2.8259,-0.523871447\nC[C@H]1C[C@H](C)CN(S(=O)(=O)CCn2cnc3ccccc32)C1,3.092222018,2.344,-0.748222018\nCCCC[C@@H]1CCC[C@@H]1NC(=O)NC1CC[NH+](CC(=O)N(C)C)CC1,4.320234072,0.78,-3.540234072\nCc1cc(C)cc(N(C)CC(=O)N(C)[C@@H]2CCS(=O)(=O)C2)c1,2.950781578,1.38514,-1.565641578\nCc1ccc(/C=C/C(=O)Nc2sc(C)c(C)c2C#N)o1,2.5068509,3.78994,1.2830891\nCn1nnc2c(=O)n(CC(=O)NC3CC3)cnc21,2.465292041,-1.1964,-3.661692041\nCOc1cc([C@@H](C)N[C@H]2CC[C@H]([NH+](C)C)C2)ccc1OC(F)F,4.124893168,2.0128,-2.112093168\nCC(C)(C)OC(=O)NC1CCN(C(=O)C[NH3+])CC1,2.72243658,-0.256,-2.97843658\nCC[C@](C)([C@@H]([NH2+]C)c1cc(Cl)cnc1N)N1CCOCC1,4.283133262,1.0524,-3.230733262\nCc1cnc([C@@H](NC(=O)N(C)[C@H](c2ccccc2)C(C)C)C2CC2)s1,3.46224187,4.94132,1.47907813\nCOc1ccc([C@H]2CC(=O)C3=C(C)Nc4ccccc4N[C@H]3C2)c(OC)c1OC,3.44999679,4.3391,0.88910321\nCC(=O)N(C)C1CCC(C)(C)CC1,2.486653325,2.4335,-0.053153325\nCNC(=O)c1cccc(NC(=O)Cc2ccccc2Cl)c1,1.657934364,2.8808,1.222865636\nCOc1cc(Br)c(O[C@H]2CCCCO2)cc1C,2.706881191,3.67152,0.964638809\nO=C(C[C@@H]([NH2+]C[C@H]1CCCO1)C(=O)[O-])Nc1cccc(Cl)c1,4.003377104,-0.4
705,-4.473877104\nCOc1ccc2cc(-c3csc(NC(=O)COc4ccccc4)n3)oc2c1,2.100462086,4.5824,2.481937914\nC[C@H](Sc1nnc(-c2ccccc2)n1C)C(=O)N1CCC[C@@H](C(N)=O)C1,2.909074093,1.6866,-1.222474093\nCCC[C@@H]1C[C@@H]1NC(=O)C(=O)Nc1cccc(-c2nnc(C)o2)c1,3.176625808,2.28832,-0.888305808\nCc1cc(C(=O)COC(=O)c2ccc(I)cc2)c(C)n1C[C@H]1CCCO1,2.718210326,3.92824,1.210029674\nCc1c(Cl)cccc1NC(=O)C(=O)NCc1ncn(C)n1,2.463583143,1.03182,-1.431763143\nCc1ccc([C@@]2(C(C)(C)C)CCC[NH2+]2)cc1,3.995007141,2.59362,-1.401387141\nCN(C)C(=O)c1ccc(NC[C@@H]2CC[C@@H](C(=O)[O-])O2)nn1,3.770737241,-1.1122,-4.882937241\nc1cncc([C@H](c2nc(-c3ccc4nc[nH]c4c3)no2)N2CCOCC2)c1,3.116341853,2.4295,-0.686841853\nC[C@@H]1[C@H](C(=O)[O-])CCN1C(=O)N1CCc2ccccc21,3.598947972,0.6294,-2.969547972\nCn1cc(CCCOC(=O)c2cc3ccccc3c(=O)[nH]2)cn1,2.31302229,2.0512,-0.26182229\nCCOc1ccc(NC(=O)Cn2cnc3sccc3c2=O)cc1,2.011111102,2.4954,0.484288898\nO=C(Nc1ccccc1O)C1CC[NH+](CC2CCCCC2)CC1,3.239610037,2.2059,-1.033710037\nCc1ccc(CCNC(=O)c2cc(NC(=O)C3CCCC3)n(C)n2)o1,2.383830001,2.42272,0.038889999\nO=C(Nc1cccnc1N1CCCC1)c1ccccc1,1.689281897,2.9341,1.244818103\nCOc1ccc(N2C(=O)NC(=O)/C(=C/Nc3ccc(F)cc3)C2=O)cc1,2.107915685,2.4131,0.305184315\nC[C@@H]([NH2+]Cc1ccc(Cl)c(Cl)c1)c1ccc2c(c1)NC(=O)CO2,3.579272193,3.1489,-0.430372193\nCc1cccc(C)c1NC(=O)CN(C)C(=O)c1cc(C)n(-c2ccccc2)c1C,2.10301896,4.42168,2.31866104\nCCc1ccsc1CNC(=O)Nc1ccc(NC(C)=O)c(OC)c1,2.285167592,3.5992,1.314032408\nCC/C(C)=N\\NC(=O)Cc1ccc(OC)cc1,1.94757722,2.1398,0.19222278\nCc1n[nH]c2cc(NC(=O)c3c(O)cc(F)cc3F)ccc12,2.437484013,3.10742,0.669935987\nCc1ccc(S(=O)(=O)NC[C@H](c2ccc(N(C)C)cc2)[NH+](C)C)c(C)c1,3.427929923,1.53354,-1.894389923\nCCC(O)(CC)C(=O)NN(C(=O)c1csc(C)c1)c1ccccc1,2.822409727,3.28562,0.463210273\nCC(C)CCNc1nc2c(c(=O)n(Cc3ccccc3Cl)c(=O)n2C)n1C,2.454505588,2.5934,0.138894412\nCOc1cc(C(=O)NC[C@H](C2CC2)[NH+](C)C)cc(OC)c1C,3.532251982,0.66512,-2.867131982\nCc1cccc2cc([C@H]3NC(=O)c4c(sc5c4CCCC5)N3)c(Cl)nc12,3.223424868,4.99102,1.767595132\nCc1[nH]cnc1C[NH+]1CCCCCCC1,4.107993259,1.
06712,-3.040873259\nCCC1CCC(C[NH3+])([NH+](CC)[C@@H](C)CC)CC1,5.245877951,1.2706,-3.975277951\nN#Cc1ccc([C@@H]([NH3+])CO)cc1F,3.721661942,-0.02732,-3.748981942\nCC1(C)SC[C@H](C(=O)N2CCC(OCc3cccnc3)CC2)NC1=O,3.235181826,1.5994,-1.635781826\nO=C(C[NH+]1CCC[C@H]1c1nc2ccccc2s1)Nc1cc2[nH]c(=O)[nH]c2cc1Br,4.250457538,2.5869,-1.663557538\nCS(=O)(=O)NC[C@H]1CCCCN1C(=O)Nc1cc(C(F)(F)F)ccc1N1CCCC1,2.897130764,3.2412,0.344069236\nC[C@H](CC(=O)[C@@H](C#N)c1nc2cc(Cl)ccc2s1)NC(=O)C1CCCC1,3.325606232,4.21098,0.885373768\nC=CC[NH+](Cc1cc(F)ccc1OC)[C@H]1CCS(=O)(=O)C1,4.383766153,0.5923,-3.791466153\nCCc1cccc(OC2CN(C(=O)[C@H]3CCCCC[NH+]3C)C2)c1,3.985533936,1.2959,-2.689633936\nC[NH+]1CCN(C(=O)[C@@H](O)c2ccccc2)CC1,3.840745614,-0.9231,-4.763845614\nCOc1ccc(C(=O)N2CCC[C@H](c3nc(O)c4nnn(Cc5ccc(F)cc5)c4n3)C2)cc1OC,3.02645316,3.1513,0.12484684\nNC(=O)CCOc1ccccc1NC(=O)c1ccc(Cl)c(Cl)c1,1.802330638,3.4999,1.697569362\nC[C@@H](O)c1ccc(-c2ccccc2)cc1,1.840468032,3.4069,1.566431968\nC[NH2+]C1CCC(N(C)C(=O)c2sccc2Br)CC1,3.545652041,2.087,-1.458652041\nC[C@H]1CCC[NH+]1CC(=O)Nc1ccc(F)c(N2CCCS2(=O)=O)c1,4.340830087,0.3713,-3.969530087\nCC[C@@H](C)NC(=O)[C@@H](C)NC(=O)C#Cc1ccccc1,2.993557533,1.4575,-1.536057533\nc1ccc2sc(C3CC[NH+](C4CCSCC4)CC3)nc2c1,3.761832631,2.9542,-0.807632631\nC=C1C[C@@]23CC[C@@H]4[C@@](C)(CCC[C@@]4(C)C(=O)[O-])[C@H]2CC[C@]1(O)C3,5.990082746,2.8203,-3.169782746\nNc1c(NCCN2CCOCC2)ncnc1Nc1ccc(Br)cn1,2.469548408,1.704,-0.765548408\nCn1c(COC(=O)CCc2ncc(-c3ccccc3F)o2)cc(=O)n(C)c1=O,2.549319379,1.5541,-0.995219379\nCOc1ccc(Br)cc1/C=C/C(=O)NC[C@@H]1CCCO1,2.604302267,2.7661,0.161797733\nCCC[C@H](C)NC(=O)[C@@H](C)Nc1cccc(C)c1,2.577453333,3.10022,0.522766667\nC[C@H]1C[C@H](C)CN(C(=O)COC(=O)c2cncc(Br)c2)C1,2.968923183,2.5054,-0.463523183\nNc1cc(-c2cccc(Cl)c2)nn1-c1ccccc1F,2.017455876,3.914,1.896544124\nCOc1ccc(NC(=O)C(=O)NC2CC2)cc1C,1.795305536,1.22072,-0.574585536\nCC(C)N(Cc1cccnc1)C(=O)Nc1cnn(C[C@H]2CCCO2)c1,2.91680623,2.8996,-0.01720623\nC[C@@H](CNS(=O)(=O)c1ccc2c(c1)NC(=O)CO2)[NH+](C)C1C
C1,4.101762609,-0.6386,-4.740362609\nCOc1ccc(NC(=O)N2CCC[C@@H](c3nnc(C(=O)Nc4cccc(F)c4)s3)C2)cc1,2.731919328,4.3496,1.617680672\nCOC1CC[NH+](C[C@@H]2CCC[C@H](C)C2)CC1,4.704832754,1.5064,-3.198432754\nCSc1ccc([C@H](Br)Cc2ccccc2)cc1,2.313993515,5.0872,2.773206485\nC[C@@]1(c2ccccc2F)NC(=O)N(CC(=O)NCC(F)(F)F)C1=O,2.754553183,1.2712,-1.483353183\nO=c1[nH]c(Cn2c(-c3ccccc3)noc2=O)nc2sc3c(c12)CCCC3,2.594235212,2.7284,0.134164788\nCc1cncc(C(=O)NCCC[NH+]2C[C@H](C)C[C@@H](C)C2)c1,4.271486349,1.07072,-3.200766349\nCCc1ccc(NC(=O)[C@@H](C)S(=O)(=O)Cc2nncn2C)cc1,2.848858229,1.3195,-1.529358229\nCOc1ccccc1-c1ccccc1C[C@H]1CN(C(=O)c2ccccc2)CCN(C)C1=O,2.69381381,4.1353,1.44148619\nCc1nsc(N)c1-c1nc2cnccc2n1C1CC1,2.945782277,2.78032,-0.165462277\nCc1ccsc1CN(CC[NH+](C)C)C(=O)c1ccc2c(c1)nnn2C,3.351786841,1.12512,-2.226666841\nCc1cccnc1CCNC(=O)c1ccc([C@H]2CCC[NH+]2C(C)C)s1,4.198988791,2.55222,-1.646768791\nCCc1ccsc1CNC(=O)C1(c2ccccc2F)CC1,2.52839512,3.7976,1.26920488\nCOc1cccc([C@@H](C)NC(=O)COC(=O)c2cncc(Br)c2)c1,2.462586621,2.8869,0.424313379\nCS(=O)(=O)N(CC(=O)N1CCOCC1)c1ccc(F)c(Cl)c1,2.227684896,1.1039,-1.123784896\nCOc1ccc(N2C(=O)CC[C@H](C(=O)Nc3ccc(-n4ccnc4)cc3)[C@H]2c2ccccc2OC)cc1,3.144845929,5.0125,1.867654071\nCC[C@@H]1C(=O)N2CCC[C@H]2C(=O)N1CC1CCC1,3.231498394,1.3983,-1.833198394\nCn1ccc(C(=O)N2C[C@@H]3CC[C@H](C2)[NH+](Cc2ccccc2)C3)n1,5.015766837,0.7396,-4.276166837\nNC1=NC(=O)N(Cc2ccccc2)[C@@H]1c1ccccc1Br,2.854435153,3.4832,0.628764847\nCC(C)[NH+]1CCCN(C(=O)c2cc(C3CC3)n(C(C)(C)C)n2)CC1,4.137344869,1.6547,-2.482644869\nCc1cc(CNC(=O)[C@@]2(C)CCC[NH2+]2)cc(C)c1F,3.955747577,1.17464,-2.781107577\nC[C@@H]1CC(=O)N(c2ccc(NCc3cccnc3)c([N+](=O)[O-])c2)C1=O,2.76947101,2.5013,-0.26817101\nCC1CCC2(CC1)NC(=O)N(NC(=O)CNC(=O)c1cccs1)C2=O,3.018403098,1.0098,-2.008603098\nCCOc1ccc(C(C)=O)cc1Cn1c(C(F)(F)F)nc2ccccc21,2.185433336,4.7047,2.519266664\nCc1ccccc1N1C(=O)c2ccc(C(=O)OCC(=O)NC[C@H]3CCCO3)cc2C1=O,2.664389642,2.24762,-0.416769642\nCOC1CCC(CC(=O)Nc2[nH+]cccc2[O-])CC1,3.890243815,1.1081,-2.782
143815\nNc1ccc(F)cc1NC(=O)CN1CCc2ccccc21,1.963255159,2.4091,0.445844841\nCOc1c(Br)cc(Cl)cc1S(=O)(=O)[N-]c1cc(F)ccc1F,3.247974726,4.7834,1.535425274\nCCOc1ccc(NC(=O)N(C)Cc2ncnn2C)cc1C,2.274019952,2.18612,-0.087899952\nC[C@H]1OCC[C@@H]1C(=O)Nc1ccc(Cl)c(C(=O)[O-])c1,3.500226302,1.067,-2.433226302\nO=C(NCc1cccnc1)C(=O)N1CCCC1,1.957724353,0.3202,-1.637524353\nC#CCOc1ccc2ccccc2c1CN1CCN(CC#N)CC1,2.405771046,2.49298,0.087208954\nCCOc1ncccc1-c1ncnc2scc(C)c12,2.507808944,3.46042,0.952611056\nCC(=O)NC1(c2ccc(F)cc2)CCN(C(=O)C2CCCCC2)CC1,2.31470064,3.3598,1.04509936\nCc1ccc(NC(=O)C(=O)NC[C@@H](c2ccc3c(c2)CCN3C)[NH+]2CCCCC2)c([N+](=O)[O-])c1,3.902352433,1.76032,-2.142032433\nCCc1ccc(Oc2ccc3nnnn3n2)c(N)c1,2.589085867,1.4562,-1.132885867\nCc1cc(OCC(=O)OC(C)C)cc2c1C(=O)/C(=C/c1cc(Br)cc3c1OCOC3)O2,2.988519088,4.57062,1.582100912\nCOC(=O)[C@H]1[C@@H](NC(=S)NC[C@@H]2C=CN=N2)S[C@@H](C)[C@H]1C,5.238676386,1.6853,-3.553376386\nC[C@H]1C[C@@H](C)CN(C(=O)CSc2ccccc2S(C)(=O)=O)C1,3.009957402,2.6867,-0.323257402\nCC1(C)COc2cscc2OC1,3.146666448,2.5455,-0.601166448\nCc1cc(C)c(NC(=O)CNC(=O)CCc2nc3ccccc3c(=O)[nH]2)c(C)c1,2.263700819,2.53586,0.272159181\nCOC[C@@H](C)NC(=O)Nc1ccc(F)cc1F,2.338775673,2.1212,-0.217575673\nCCOc1ccc(N[C@H](C)c2ccc(F)cc2F)cc1CO,2.516868547,4.0289,1.512031453\nCCCn1nnnc1CN1C(=O)c2ccc([N+](=O)[O-])cc2C1=O,2.398476792,0.7875,-1.610976792\nCC(C)CC(C)(C)C(=O)Nc1ccc(F)c(C(=O)N(C)C)c1,2.287139552,3.5383,1.251160448\nCn1cc(/C=C/C(=O)c2cccc(F)c2)c(=O)n(C)c1=O,2.300985321,1.1192,-1.181785321\nO=C(NCc1ccnc(OC2CCC2)c1)c1[nH]ncc1Br,2.55409362,2.4285,-0.12559362\nCc1ccc([C@@H](CN2CCOCC2)NC(=O)COc2ccc(F)cc2)cc1,2.421749151,2.70262,0.280870849\n"
  },
  {
    "path": "tests/data/micro_ZINC_corrupt.csv",
    "content": "SMILES1,SMILES2,SMILES3,SA,logp,score\nCCc1ccc(C(=O)/C(=C(/S)NC2CC2)[n+]2ccc(CC)cc2)cc1,CCc1ccc(C(=O)/C(=C(/S)NC2CC2)[n+]2ccc(CC)cc2)cc1,CCc1ccc(C(=O)/C(=C(/S)NC2CC2)[n+]2ccc(CC)cc2)cc1,2.775189478,3.7897,1.014510522\nXXX,C=CCOc1ccc(C(F)(F)F)cc1C(=O)NC[C@H]1CCC[C@H]1O,C=CCOc1ccc(C(F)(F)F)cc1C(=O)NC[C@H]1CCC[C@H]1O,3.071497898,3.161,0.089502102\nCOc1cc(OC)cc([C@@H](NC(=O)Cn2ccccc2=O)c2nccn2C)c1,COc1cc(OC)cc([C@@H](NC(=O)Cn2ccccc2=O)c2nccn2C)c1,COc1cc(OC)cc([C@@H](NC(=O)Cn2ccccc2=O)c2nccn2C)c1,2.84000896,1.5048,-1.33520896\nCN1CCC[C@@]2(CCN(C(=O)CN3CCNC(=O)C3)C2)C1=O,CN1CCC[C@@]2(CCN(C(=O)CN3CCNC(=O)C3)C2)C1=O,CN1CCC[C@@]2(CCN(C(=O)CN3CCNC(=O)C3)C2)C1=O,3.605168457,-1.1109,-4.716068457\nCOc1ccc(Cl)cc1NC(=O)c1nn(C)cc1[N+](=O)[O-],COc1ccc(Cl)cc1NC(=O)c1nn(C)cc1[N+](=O)[O-],COc1ccc(Cl)cc1NC(=O)c1nn(C)cc1[N+](=O)[O-],2.132664673,2.2426,0.109935327\nCc1nn(-c2ccccc2)c(O)c1/C=[NH+]/Cc1ccncc1,Cc1nn(-c2ccccc2)c(O)c1/C=[NH+]/Cc1ccncc1,Cc1nn(-c2ccccc2)c(O)c1/C=[NH+]/Cc1ccncc1,3.117639223,0.98102,-2.136619223\nC=CCN1C(=O)C(C)(C)COc2ccc(NC(=O)C3(c4ccc(OC)cc4)CCOCC3)cc21,C=CCN1C(=O)C(C)(C)COc2ccc(NC(=O)C3(c4ccc(OC)cc4)CCOCC3)cc21,C=CCN1C(=O)C(C)(C)COc2ccc(NC(=O)C3(c4ccc(OC)cc4)CCOCC3)cc21,2.744789746,4.3197,1.574910254\nCON(C)C(=O)c1cnn2c(C)cc(C)nc12,XXX,CON(C)C(=O)c1cnn2c(C)cc(C)nc12,2.547847184,0.97954,-1.568307184\nNC(=O)[C@@H](NC(=O)CCCc1cccs1)c1ccccc1,NC(=O)[C@@H](NC(=O)CCCc1cccs1)c1ccccc1,NC(=O)[C@@H](NC(=O)CCCc1cccs1)c1ccccc1,2.444743614,2.4136,-0.031143614\nCOc1cccc(CN2CCC[NH+](CC(=O)Nc3ccc(F)cc3)S2(=O)=O)c1,COc1cccc(CN2CCC[NH+](CC(=O)Nc3ccc(F)cc3)S2(=O)=O)c1,COc1cccc(CN2CCC[NH+](CC(=O)Nc3ccc(F)cc3)S2(=O)=O)c1,3.546311784,0.8084,-2.737911784\nCc1ccc(-c2nc3ccc(C)c(C)c3[nH]2)nc1,XXX,Cc1ccc(-c2nc3ccc(C)c(C)c3[nH]2)nc1,2.342516783,3.55016,1.207643217\nCc1n[nH]c(SCC(=O)Nc2cccc(C#Cc3ccccn3)c2)n1,Cc1n[nH]c(SCC(=O)Nc2cccc(C#Cc3ccccn3)c2)n1,Cc1n[nH]c(SCC(=O)Nc2cccc(C#Cc3ccccn3)c2)n1,2.685016252,2.63872,-0.046296252\nCOC(=O)c1cncc(C(=O)Nc2cccc(COC(C)C)c2)c1,COC(=O)c1c
ncc(C(=O)Nc2cccc(COC(C)C)c2)c1,COC(=O)c1cncc(C(=O)Nc2cccc(COC(C)C)c2)c1,2.04419285,3.0455,1.00130715\nCc1ccc(C)c(Oc2ccc(CNC(=O)N3CCCSCC3)cn2)c1,Cc1ccc(C)c(Oc2ccc(CNC(=O)N3CCCSCC3)cn2)c1,XXX,2.369895336,4.13924,1.769344664\nCC[NH2+][C@@]1(C(=O)[O-])CCC[C@H](Oc2ccc(CC)cc2)C1,CC[NH2+][C@@]1(C(=O)[O-])CCC[C@H](Oc2ccc(CC)cc2)C1,CC[NH2+][C@@]1(C(=O)[O-])CCC[C@H](Oc2ccc(CC)cc2)C1,4.165551538,0.6424,-3.523151538\nCC(=O)N1c2ccc(S(=O)(=O)N3CCCC3)cc2C[C@H]1C(=O)NCC[NH+](C)C1CCCCC1,CC(=O)N1c2ccc(S(=O)(=O)N3CCCC3)cc2C[C@H]1C(=O)NCC[NH+](C)C1CCCCC1,CC(=O)N1c2ccc(S(=O)(=O)N3CCCC3)cc2C[C@H]1C(=O)NCC[NH+](C)C1CCCCC1,3.989079408,0.7123,-3.276779408\nC=CCN1CC(=O)N2[C@@H](Cc3c([nH]c4ccccc34)[C@@H]2c2ccccc2OC)C1=O,C=CCN1CC(=O)N2[C@@H](Cc3c([nH]c4ccccc34)[C@@H]2c2ccccc2OC)C1=O,C=CCN1CC(=O)N2[C@@H](Cc3c([nH]c4ccccc34)[C@@H]2c2ccccc2OC)C1=O,3.287567863,3.0474,-0.240167863\nCOc1ccc(-c2noc3ncnc(N4CCC[C@H](C(=O)[O-])C4)c23)cc1,COc1ccc(-c2noc3ncnc(N4CCC[C@H](C(=O)[O-])C4)c23)cc1,COc1ccc(-c2noc3ncnc(N4CCC[C@H](C(=O)[O-])C4)c23)cc1,3.152874099,1.2597,-1.893174099\nN#Cc1cc(NC(=O)Cc2ccccc2Cl)ccc1N1CCC(O)CC1,N#Cc1cc(NC(=O)Cc2ccccc2Cl)ccc1N1CCC(O)CC1,N#Cc1cc(NC(=O)Cc2ccccc2Cl)ccc1N1CCC(O)CC1,2.173785623,3.35398,1.180194377\nXXX,XXX,XXX,2.513723357,3.71262,1.198896643\nCc1ccc(NC(=O)c2ccc(F)cc2F)cc1S(=O)(=O)Nc1ccc(Cl)cc1,Cc1ccc(NC(=O)c2ccc(F)cc2F)cc1S(=O)(=O)Nc1ccc(Cl)cc1,Cc1ccc(NC(=O)c2ccc(F)cc2F)cc1S(=O)(=O)Nc1ccc(Cl)cc1,1.947448768,4.97972,3.032271232\nCNC(=O)[C@@H]1CCC[NH+]1Cc1ccc(C)c(F)c1,CNC(=O)[C@@H]1CCC[NH+]1Cc1ccc(C)c(F)c1,CNC(=O)[C@@H]1CCC[NH+]1Cc1ccc(C)c(F)c1,4.391338086,0.42742,-3.963918086\nCN1C(=O)N[C@@H](c2cccc([N+](=O)[O-])c2)C2=C1CN(c1ccc(F)cc1)C2=O,CN1C(=O)N[C@@H](c2cccc([N+](=O)[O-])c2)C2=C1CN(c1ccc(F)cc1)C2=O,CN1C(=O)N[C@@H](c2cccc([N+](=O)[O-])c2)C2=C1CN(c1ccc(F)cc1)C2=O,2.956072154,2.7309,-0.225172154\nCc1cc(CC(=O)N[C@H](c2ccc(F)cc2)C2CCC2)no1,Cc1cc(CC(=O)N[C@H](c2ccc(F)cc2)C2CCC2)no1,Cc1cc(CC(=O)N[C@H](c2ccc(F)cc2)C2CCC2)no1,2.665131639,3.32222,0.657088361\nCc1cc(N2CCN(CC(=O)NC3
CC3)CC2)nc(-c2ccc([N+](=O)[O-])cc2)n1,Cc1cc(N2CCN(CC(=O)NC3CC3)CC2)nc(-c2ccc([N+](=O)[O-])cc2)n1,Cc1cc(N2CCN(CC(=O)NC3CC3)CC2)nc(-c2ccc([N+](=O)[O-])cc2)n1,2.249569987,1.76082,-0.488749987\nCc1cc(C(=O)N2CCN(C(=O)N[C@H]3CC(=O)N(C4CC4)C3)CC2)c(C)o1,Cc1cc(C(=O)N2CCN(C(=O)N[C@H]3CC(=O)N(C4CC4)C3)CC2)c(C)o1,Cc1cc(C(=O)N2CCN(C(=O)N[C@H]3CC(=O)N(C4CC4)C3)CC2)c(C)o1,2.921725033,1.12714,-1.794585033\nO=c1c2c3nc4ccccc4nc3n(CCC3=CCCCC3)c2ncn1C[C@H]1CCCO1,O=c1c2c3nc4ccccc4nc3n(CCC3=CCCCC3)c2ncn1C[C@H]1CCCO1,O=c1c2c3nc4ccccc4nc3n(CCC3=CCCCC3)c2ncn1C[C@H]1CCCO1,3.257260117,4.3639,1.106639883\nCC[NH+](C/C=C/c1ccc(C#N)cc1)C[C@H](C)C#N,CC[NH+](C/C=C/c1ccc(C#N)cc1)C[C@H](C)C#N,CC[NH+](C/C=C/c1ccc(C#N)cc1)C[C@H](C)C#N,4.36293875,1.63596,-2.72697875\nCOc1c(Cl)cc(C[NH2+][C@@H](Cc2c[nH]c3ccccc23)C(=O)[O-])cc1Cl,COc1c(Cl)cc(C[NH2+][C@@H](Cc2c[nH]c3ccccc23)C(=O)[O-])cc1Cl,COc1c(Cl)cc(C[NH2+][C@@H](Cc2c[nH]c3ccccc23)C(=O)[O-])cc1Cl,3.725368177,1.9079,-1.817468177\nCOc1ccccc1[C@H](C)NC(=O)C[C@H]1C[NH2+]CCO1,COc1ccccc1[C@H](C)NC(=O)C[C@H]1C[NH2+]CCO1,COc1ccccc1[C@H](C)NC(=O)C[C@H]1C[NH2+]CCO1,3.812797047,0.2247,-3.588097047\nCc1ccc(NC(=O)C(=O)NCCc2ccccc2F)c(C)c1,Cc1ccc(NC(=O)C(=O)NCCc2ccccc2F)c(C)c1,Cc1ccc(NC(=O)C(=O)NCCc2ccccc2F)c(C)c1,1.826753956,2.73994,0.913186044\nXXX,XXX,XXX,3.228025092,3.2416,0.013574908\nCC[C@H](C)C[NH+]1CCCC[C@@H]1C(=O)NC(C)(C)C,CC[C@H](C)C[NH+]1CCCC[C@@H]1C(=O)NC(C)(C)C,CC[C@H](C)C[NH+]1CCCC[C@@H]1C(=O)NC(C)(C)C,4.787067722,1.3846,-3.402467722\nCOC(=O)c1c(S(=O)(=O)NC2CC2)sc2c1CCN(Cc1ccc3ccccc3c1)C2,COC(=O)c1c(S(=O)(=O)NC2CC2)sc2c1CCN(Cc1ccc3ccccc3c1)C2,COC(=O)c1c(S(=O)(=O)NC2CC2)sc2c1CCN(Cc1ccc3ccccc3c1)C2,2.485526606,3.6869,1.201373394\nCc1cc(C)cc(NC(=O)[C@@H](Sc2nnnn2C2CC2)c2ccccc2)c1,Cc1cc(C)cc(NC(=O)[C@@H](Sc2nnnn2C2CC2)c2ccccc2)c1,Cc1cc(C)cc(NC(=O)[C@@H](Sc2nnnn2C2CC2)c2ccccc2)c1,2.708478583,4.09694,1.388461417\nC=CCn1c([S-])nnc1-c1sc(NC(=O)c2cccc(NC(C)=O)c2)nc1C,C=CCn1c([S-])nnc1-c1sc(NC(=O)c2cccc(NC(C)=O)c2)nc1C,C=CCn1c([S-])nnc1-c1sc(NC(=O)c2cccc(NC(C)=O)c2)
nc1C,2.943459024,3.01252,0.069060976\nO=C(CNc1nc(C2CC2)no1)N1CCc2sccc2C1,O=C(CNc1nc(C2CC2)no1)N1CCc2sccc2C1,O=C(CNc1nc(C2CC2)no1)N1CCc2sccc2C1,2.781430253,2.0053,-0.776130253\nCc1ccccc1NC(=O)NCCn1c([N+](=O)[O-])cnc1C,Cc1ccccc1NC(=O)NCCn1c([N+](=O)[O-])cnc1C,Cc1ccccc1NC(=O)NCCn1c([N+](=O)[O-])cnc1C,2.258312512,2.22984,-0.028472512\nC[C@H](NC=C(C#N)C#N)c1ccc(-n2cncn2)cc1,C[C@H](NC=C(C#N)C#N)c1ccc(-n2cncn2)cc1,C[C@H](NC=C(C#N)C#N)c1ccc(-n2cncn2)cc1,3.277555708,1.84896,-1.428595708\nCN1C[C@H](C(=O)NC[C@H]2CCC(C)(C)c3ccccc32)CC1=O,CN1C[C@H](C(=O)NC[C@H]2CCC(C)(C)c3ccccc32)CC1=O,CN1C[C@H](C(=O)NC[C@H]2CCC(C)(C)c3ccccc32)CC1=O,3.259412472,2.4361,-0.823312472\nCC(C)n1cccc1C(=O)N[C@H]1CCc2cc(N)ccc21,CC(C)n1cccc1C(=O)N[C@H]1CCc2cc(N)ccc21,CC(C)n1cccc1C(=O)N[C@H]1CCc2cc(N)ccc21,2.887282024,3.0685,0.181217976\nCCN1C(=O)C(C#N)=C(C)/C(=C\\Nc2ccc([N+](=O)[O-])cc2)C1=O,CCN1C(=O)C(C#N)=C(C)/C(=C\\Nc2ccc([N+](=O)[O-])cc2)C1=O,CCN1C(=O)C(C#N)=C(C)/C(=C\\Nc2ccc([N+](=O)[O-])cc2)C1=O,2.501067124,2.11938,-0.381687124\nO=C(Nc1ccnn1Cc1cccc(Cl)c1)C1CCN(S(=O)(=O)c2ccccc2F)CC1,O=C(Nc1ccnn1Cc1cccc(Cl)c1)C1CCN(S(=O)(=O)c2ccccc2F)CC1,O=C(Nc1ccnn1Cc1cccc(Cl)c1)C1CCN(S(=O)(=O)c2ccccc2F)CC1,2.296421252,3.7633,1.466878748\nCOc1cc([C@@H]2N[C@H](C(=O)[O-])CS2)ccc1OC(=O)c1ccccc1,COc1cc([C@@H]2N[C@H](C(=O)[O-])CS2)ccc1OC(=O)c1ccccc1,COc1cc([C@@H]2N[C@H](C(=O)[O-])CS2)ccc1OC(=O)c1ccccc1,3.273511868,1.3679,-1.905611868\nCCOc1ccc(C2=CCN(C(=O)c3ccccc3F)CC2)cc1,CCOc1ccc(C2=CCN(C(=O)c3ccccc3F)CC2)cc1,CCOc1ccc(C2=CCN(C(=O)c3ccccc3F)CC2)cc1,2.000346516,4.1539,2.153553484\nCC(=O)N1CC[C@@H]([NH2+][C@@H](C)CSCC(C)C)C1,CC(=O)N1CC[C@@H]([NH2+][C@@H](C)CSCC(C)C)C1,CC(=O)N1CC[C@@H]([NH2+][C@@H](C)CSCC(C)C)C1,4.483674272,0.9483,-3.535374272\nCCO[P@](=O)(CC(=O)[O-])c1ccccc1,CCO[P@](=O)(CC(=O)[O-])c1ccccc1,CCO[P@](=O)(CC(=O)[O-])c1ccccc1,3.510193788,0.3764,-3.133793788\nO=C(CCc1nc2ccccc2c(=O)[nH]1)N1CCC[C@H]1C1CCCC1,O=C(CCc1nc2ccccc2c(=O)[nH]1)N1CCC[C@H]1C1CCCC1,O=C(CCc1nc2ccccc2c(=O)[nH]1)N1CCC[C@H]1C1CCCC1,2.774610155,
3.0369,0.262289845\nO=C(/C=C/SCc1ccco1)Nc1cccc(N2CCCC2=O)c1,O=C(/C=C/SCc1ccco1)Nc1cccc(N2CCCC2=O)c1,O=C(/C=C/SCc1ccco1)Nc1cccc(N2CCCC2=O)c1,2.584851266,3.792,1.207148734\nCC(=O)c1c(C)[nH]c(C(=O)OCC(=O)N2C[C@H](C)C[C@@H](C)C2)c1C,CC(=O)c1c(C)[nH]c(C(=O)OCC(=O)N2C[C@H](C)C[C@@H](C)C2)c1C,CC(=O)c1c(C)[nH]c(C(=O)OCC(=O)N2C[C@H](C)C[C@@H](C)C2)c1C,3.068452859,2.49544,-0.573012859\nCc1ccc(C(=O)N2CCC[C@@H](C(=O)N(C)CC(=O)NC(C)C)C2)cc1,Cc1ccc(C(=O)N2CCC[C@@H](C(=O)N(C)CC(=O)NC(C)C)C2)cc1,Cc1ccc(C(=O)N2CCC[C@@H](C(=O)N(C)CC(=O)NC(C)C)C2)cc1,2.492863438,1.83022,-0.662643438\nC[C@@H](C#N)Sc1nnc(C23CC4CC(CC(C4)C2)C3)o1,C[C@@H](C#N)Sc1nnc(C23CC4CC(CC(C4)C2)C3)o1,C[C@@H](C#N)Sc1nnc(C23CC4CC(CC(C4)C2)C3)o1,4.576896432,3.54158,-1.035316432\nCC1(C)CCC[C@H]1n1c(N)[nH+]c2ccccc21,CC1(C)CCC[C@H]1n1c(N)[nH+]c2ccccc21,CC1(C)CCC[C@H]1n1c(N)[nH+]c2ccccc21,4.151966768,2.7888,-1.363166768\nCCOc1ccc(C[NH+]2CCS[C@H]3COCC[C@@H]32)cc1OC,CCOc1ccc(C[NH+]2CCS[C@H]3COCC[C@@H]32)cc1OC,CCOc1ccc(C[NH+]2CCS[C@H]3COCC[C@@H]32)cc1OC,4.514122047,1.3831,-3.131022047\nCOc1cc(OC)c([C@@H](C)[NH2+]C[C@H](O)C2CCOCC2)cc1Cl,COc1cc(OC)c([C@@H](C)[NH2+]C[C@H](O)C2CCOCC2)cc1Cl,COc1cc(OC)c([C@@H](C)[NH2+]C[C@H](O)C2CCOCC2)cc1Cl,3.960984106,1.7691,-2.191884106\nCOC(C[C@@](C)(O)C[C@@H]1CCCN(S(C)(=O)=O)C1)OC,COC(C[C@@](C)(O)C[C@@H]1CCCN(S(C)(=O)=O)C1)OC,COC(C[C@@](C)(O)C[C@@H]1CCCN(S(C)(=O)=O)C1)OC,3.669896168,0.8081,-2.861796168\nC=CCSc1nnc(C2CCOCC2)n1N,C=CCSc1nnc(C2CCOCC2)n1N,C=CCSc1nnc(C2CCOCC2)n1N,2.825013358,1.164,-1.661013358\nO=C(Nc1c[nH]c2ccccc12)N1CCCC[C@@H]1CCO,O=C(Nc1c[nH]c2ccccc12)N1CCCC[C@@H]1CCO,O=C(Nc1c[nH]c2ccccc12)N1CCCC[C@@H]1CCO,2.674797385,2.9367,0.261902615\nCc1nc(-c2ccc(-c3noc(C4CC4)n3)cc2)cs1,Cc1nc(-c2ccc(-c3noc(C4CC4)n3)cc2)cs1,Cc1nc(-c2ccc(-c3noc(C4CC4)n3)cc2)cs1,2.133219774,4.04592,1.912700226\nC[C@H]1CCCCN1C(=O)[C@@H](C)NC(=O)C(=O)Nc1ccnn1C(C)(C)C,C[C@H]1CCCCN1C(=O)[C@@H](C)NC(=O)C(=O)Nc1ccnn1C(C)(C)C,C[C@H]1CCCCN1C(=O)[C@@H](C)NC(=O)C(=O)Nc1ccnn1C(C)(C)C,3.418504414,1.4823,-1.936204414\nc1cc2c(
cc1N[C@@H]1CCOC3(CCC3)C1)CCC2,c1cc2c(cc1N[C@@H]1CCOC3(CCC3)C1)CCC2,c1cc2c(cc1N[C@@H]1CCOC3(CCC3)C1)CCC2,3.363061129,3.6889,0.325838871\nO=C(Nc1ccc(CNC(=O)N2CCCCCCC2)cc1)c1ccco1,O=C(Nc1ccc(CNC(=O)N2CCCCCCC2)cc1)c1ccco1,O=C(Nc1ccc(CNC(=O)N2CCCCCCC2)cc1)c1ccco1,1.931563902,4.0076,2.076036098\nCNC(=O)[C@H]1CCCCN1c1cccc(F)c1C(N)=S,CNC(=O)[C@H]1CCCCN1c1cccc(F)c1C(N)=S,CNC(=O)[C@H]1CCCCN1c1cccc(F)c1C(N)=S,2.82917551,1.5648,-1.26437551\nO=C(NCc1nnc2n1CCC2)c1cc(-c2ccccc2)[nH]n1,O=C(NCc1nnc2n1CCC2)c1cc(-c2ccccc2)[nH]n1,O=C(NCc1nnc2n1CCC2)c1cc(-c2ccccc2)[nH]n1,2.376871447,1.5444,-0.832471447\nCc1nnc(-c2cc(CC(C)C)n(-c3cccc(C(=O)N[C@@H]4C[C@@H](C)CC(C)(C)C4)c3)n2)o1,Cc1nnc(-c2cc(CC(C)C)n(-c3cccc(C(=O)N[C@@H]4C[C@@H](C)CC(C)(C)C4)c3)n2)o1,Cc1nnc(-c2cc(CC(C)C)n(-c3cccc(C(=O)N[C@@H]4C[C@@H](C)CC(C)(C)C4)c3)n2)o1,3.548164288,5.37382,1.825655712\nCOc1cccc([C@@H]2CC(=O)C3=C(C2)NC(=O)C[C@H]3c2ccc(OC)c(OC)c2OC)c1,COc1cccc([C@@H]2CC(=O)C3=C(C2)NC(=O)C[C@H]3c2ccc(OC)c(OC)c2OC)c1,COc1cccc([C@@H]2CC(=O)C3=C(C2)NC(=O)C[C@H]3c2ccc(OC)c(OC)c2OC)c1,3.226440403,3.7253,0.498859597\nC[NH2+]Cc1ccc(-c2cc(C)ccc2F)s1,C[NH2+]Cc1ccc(-c2cc(C)ccc2F)s1,C[NH2+]Cc1ccc(-c2cc(C)ccc2F)s1,3.128631552,2.55582,-0.572811552\nCCC[C@H](NC(N)=O)C(=O)NC[C@@H]1CCCO1,CCC[C@H](NC(N)=O)C(=O)NC[C@@H]1CCCO1,CCC[C@H](NC(N)=O)C(=O)NC[C@@H]1CCCO1,2.869359841,0.1186,-2.750759841\nCOc1cc(F)cc(CNC(=O)[C@H]2CCCN2C(=O)Cc2ccccc2)c1,COc1cc(F)cc(CNC(=O)[C@H]2CCCN2C(=O)Cc2ccccc2)c1,COc1cc(F)cc(CNC(=O)[C@H]2CCCN2C(=O)Cc2ccccc2)c1,2.427176885,2.6842,0.257023115\nCCc1cc(C#N)c(NC(=O)CCC(=O)N(CC)CC)s1,CCc1cc(C#N)c(NC(=O)CCC(=O)N(CC)CC)s1,CCc1cc(C#N)c(NC(=O)CCC(=O)N(CC)CC)s1,2.493551313,2.76928,0.275728687\nC#Cc1cccc(NC(=O)C[NH+](C)[C@H](C)c2nnc(-c3ccccc3)o2)c1,C#Cc1cccc(NC(=O)C[NH+](C)[C@H](C)c2nnc(-c3ccccc3)o2)c1,C#Cc1cccc(NC(=O)C[NH+](C)[C@H](C)c2nnc(-c3ccccc3)o2)c1,3.779297449,1.9323,-1.846997449\nCCOC[C@H](O)[C@](C)(CC)[NH+]1CCCC1,CCOC[C@H](O)[C@](C)(CC)[NH+]1CCCC1,CCOC[C@H](O)[C@](C)(CC)[NH+]1CCCC1,5.127905513,0.2312,-4.896705513\nC[NH+
](C)Cc1cccc(CNC(=O)NC[C@H](O)c2ccc(F)cc2)c1,C[NH+](C)Cc1cccc(CNC(=O)NC[C@H](O)c2ccc(F)cc2)c1,C[NH+](C)Cc1cccc(CNC(=O)NC[C@H](O)c2ccc(F)cc2)c1,3.24179101,1.003,-2.23879101\nCc1ccsc1C(=O)NCCNC(=O)c1ccco1,Cc1ccsc1C(=O)NCCNC(=O)c1ccco1,Cc1ccsc1C(=O)NCCNC(=O)c1ccco1,2.114582277,1.80932,-0.305262277\nCOC(=O)[C@H](C)N(Cc1ccccc1)C(=O)Cc1ccon1,COC(=O)[C@H](C)N(Cc1ccccc1)C(=O)Cc1ccon1,COC(=O)[C@H](C)N(Cc1ccccc1)C(=O)Cc1ccon1,2.852054716,1.8074,-1.044654716\nC/C=C(/C)C(=O)NC[C@H]1C[C@@]12CCc1ccccc12,C/C=C(/C)C(=O)NC[C@H]1C[C@@]12CCc1ccccc12,C/C=C(/C)C(=O)NC[C@H]1C[C@@]12CCc1ccccc12,3.940919906,2.9729,-0.968019906\nO=C1Nc2ccccc2Oc2cc([N+](=O)[O-])cc(Oc3ccc(Cl)cc3)c21,O=C1Nc2ccccc2Oc2cc([N+](=O)[O-])cc(Oc3ccc(Cl)cc3)c21,O=C1Nc2ccccc2Oc2cc([N+](=O)[O-])cc(Oc3ccc(Cl)cc3)c21,2.400946341,5.3985,2.997553659\nCCc1onc(C)c1NC(=O)CCCC(C)(C)C,CCc1onc(C)c1NC(=O)CCCC(C)(C)C,CCc1onc(C)c1NC(=O)CCCC(C)(C)C,2.601600041,3.70032,1.098719959\nCOC(=O)C(C)(C)COc1ccc2c(c1)OC[C@@H]2[NH3+],COC(=O)C(C)(C)COc1ccc2c(c1)OC[C@@H]2[NH3+],COC(=O)C(C)(C)COc1ccc2c(c1)OC[C@@H]2[NH3+],3.576525387,0.94,-2.636525387\nCC#CCCC(=O)Nc1cccc2c1C(=O)c1ccccc1C2=O,CC#CCCC(=O)Nc1cccc2c1C(=O)c1ccccc1C2=O,CC#CCCC(=O)Nc1cccc2c1C(=O)c1ccccc1C2=O,2.298959953,3.204,0.905040047\nCc1nc(C)c(S(=O)(=O)/N=C(\\[O-])C[C@H]2CCCO2)s1,Cc1nc(C)c(S(=O)(=O)/N=C(\\[O-])C[C@H]2CCCO2)s1,Cc1nc(C)c(S(=O)(=O)/N=C(\\[O-])C[C@H]2CCCO2)s1,3.812988405,0.77654,-3.036448405\nCCC(=O)N[C@H](CCSC)C(=O)Nc1ccc2[nH]c(C)cc2c1,CCC(=O)N[C@H](CCSC)C(=O)Nc1ccc2[nH]c(C)cc2c1,CCC(=O)N[C@H](CCSC)C(=O)Nc1ccc2[nH]c(C)cc2c1,2.732126257,3.06272,0.330593743\nCOCC[NH+](C)Cc1c(C)cc(C)c(C(C)=O)c1C,COCC[NH+](C)Cc1c(C)cc(C)c(C(C)=O)c1C,COCC[NH+](C)Cc1c(C)cc(C)c(C(C)=O)c1C,3.955103091,1.47556,-2.479543091\nCc1cscc1CNC(=O)N[C@@H](C)c1nc(C(=O)[O-])cs1,Cc1cscc1CNC(=O)N[C@@H](C)c1nc(C(=O)[O-])cs1,Cc1cscc1CNC(=O)N[C@@H](C)c1nc(C(=O)[O-])cs1,3.628458628,1.43692,-2.191538628\nCNC(=O)NCC(=O)NC[C@H](c1cccnc1)C(C)C,CNC(=O)NCC(=O)NC[C@H](c1cccnc1)C(C)C,CNC(=O)NCC(=O)NC[C@H](c1cccnc1)C(C)C,2.
795116489,0.8664,-1.928716489\nCOc1cccc(C(=O)N2CCC[C@H]2[C@H]2CCC[C@@H]2O)c1,COc1cccc(C(=O)N2CCC[C@H]2[C@H]2CCC[C@@H]2O)c1,COc1cccc(C(=O)N2CCC[C@H]2[C@H]2CCC[C@@H]2O)c1,3.002385916,2.4608,-0.541585916\nCOc1ccc2c(c1)[C@H]([NH2+][C@H](C)CCN1CCOCC1)CCCO2,COc1ccc2c(c1)[C@H]([NH2+][C@H](C)CCN1CCOCC1)CCCO2,COc1ccc2c(c1)[C@H]([NH2+][C@H](C)CCN1CCOCC1)CCCO2,3.789130146,1.5831,-2.206030146\nCOc1cccc(OCC(=O)N2CCN(c3nc4ccc(S(C)(=O)=O)cc4s3)CC2)c1,COc1cccc(OCC(=O)N2CCN(c3nc4ccc(S(C)(=O)=O)cc4s3)CC2)c1,COc1cccc(OCC(=O)N2CCN(c3nc4ccc(S(C)(=O)=O)cc4s3)CC2)c1,2.245396536,2.436,0.190603464\nCOc1cccc(-c2nc(C(=O)O[C@@H](C)CNC(C)=O)c(C)[nH]2)c1,COc1cccc(-c2nc(C(=O)O[C@@H](C)CNC(C)=O)c(C)[nH]2)c1,COc1cccc(-c2nc(C(=O)O[C@@H](C)CNC(C)=O)c(C)[nH]2)c1,2.86962049,2.07512,-0.79450049\nCCO[C@H]1C[C@@H]1C(=O)Nc1ccc(Sc2nncs2)c(Cl)c1,CCO[C@H]1C[C@@H]1C(=O)Nc1ccc(Sc2nncs2)c(Cl)c1,CCO[C@H]1C[C@@H]1C(=O)Nc1ccc(Sc2nncs2)c(Cl)c1,3.445502237,3.7062,0.260697763\nCc1ccc(N(CC(=O)N2CCCC[C@H]2C)S(=O)(=O)c2c(C)nn(C)c2C)cc1,Cc1ccc(N(CC(=O)N2CCCC[C@H]2C)S(=O)(=O)c2c(C)nn(C)c2C)cc1,Cc1ccc(N(CC(=O)N2CCCC[C@H]2C)S(=O)(=O)c2c(C)nn(C)c2C)cc1,2.872766629,2.94166,0.068893371\nCOc1ccc(C(=O)Nc2nc(CC(=O)Nc3ccc(N(C)C)cc3)cs2)cc1,COc1ccc(C(=O)Nc2nc(CC(=O)Nc3ccc(N(C)C)cc3)cs2)cc1,COc1ccc(C(=O)Nc2nc(CC(=O)Nc3ccc(N(C)C)cc3)cs2)cc1,1.983649537,3.6512,1.667550463\nO=c1c2ccccc2sn1-c1ncc(Br)s1,O=c1c2ccccc2sn1-c1ncc(Br)s1,O=c1c2ccccc2sn1-c1ncc(Br)s1,2.853247472,3.2712,0.417952528\nCc1ocnc1CNC(=O)N(C)Cc1ccc(OC(F)F)cc1,Cc1ocnc1CNC(=O)N(C)Cc1ccc(OC(F)F)cc1,Cc1ocnc1CNC(=O)N(C)Cc1ccc(OC(F)F)cc1,2.60278761,2.92602,0.32323239\nCOc1cc(Cl)c(C)cc1NC(=O)C(=O)NCc1ccccc1C[NH+](C)C,COc1cc(Cl)c(C)cc1NC(=O)C(=O)NCc1ccccc1C[NH+](C)C,COc1cc(Cl)c(C)cc1NC(=O)C(=O)NCc1ccccc1C[NH+](C)C,2.870575026,1.55642,-1.314155026\nC[C@@H]([NH3+])c1cccc(Oc2cc(Br)ccc2Cl)c1,C[C@@H]([NH3+])c1cccc(Oc2cc(Br)ccc2Cl)c1,C[C@@H]([NH3+])c1cccc(Oc2cc(Br)ccc2Cl)c1,2.965149266,4.1977,1.232550734\nCC(C)NS(=O)(=O)c1cccc(C(=O)NCc2ccco2)c1,CC(C)NS(=O)(=O)c1cccc(C(=O)NCc2ccco2)c1,C
C(C)NS(=O)(=O)c1cccc(C(=O)NCc2ccco2)c1,1.961616674,1.8963,-0.065316674\nCN(Cc1ncnn1C)C(=O)CSCc1ccccn1,CN(Cc1ncnn1C)C(=O)CSCc1ccccn1,CN(Cc1ncnn1C)C(=O)CSCc1ccccn1,2.529083407,1.1019,-1.427183407\nCCOc1cc(/C=C(\\C#N)C(=O)c2c[nH]c3cc(Cl)ccc23)ccc1OC,CCOc1cc(/C=C(\\C#N)C(=O)c2c[nH]c3cc(Cl)ccc23)ccc1OC,CCOc1cc(/C=C(\\C#N)C(=O)c2c[nH]c3cc(Cl)ccc23)ccc1OC,2.350767662,5.01848,2.667712338\nCCOC(=O)Nc1ccc(NC(=O)N2C[C@@H](C)CC[C@H]2C)cc1C#N,CCOC(=O)Nc1ccc(NC(=O)N2C[C@@H](C)CC[C@H]2C)cc1C#N,CCOC(=O)Nc1ccc(NC(=O)N2C[C@@H](C)CC[C@H]2C)cc1C#N,3.067347555,3.77898,0.711632445\nCn1c(=O)oc2ccc(NC(=O)[C@@H]3CCN(C(=O)OC(C)(C)C)C3)cc21,Cn1c(=O)oc2ccc(NC(=O)[C@@H]3CCN(C(=O)OC(C)(C)C)C3)cc21,Cn1c(=O)oc2ccc(NC(=O)[C@@H]3CCN(C(=O)OC(C)(C)C)C3)cc21,2.748880773,2.327,-0.421880773\nCC(C)N1CCO[C@@H]([C@H](O)c2cccs2)C1,CC(C)N1CCO[C@@H]([C@H](O)c2cccs2)C1,CC(C)N1CCO[C@@H]([C@H](O)c2cccs2)C1,3.356907136,1.8907,-1.466207136\nCCN[C@H]1C[C@H](C)C[C@H](C)[C@@H]1[NH+]1C[C@@H](C)[C@@H](C)C1,CCN[C@H]1C[C@H](C)C[C@H](C)[C@@H]1[NH+]1C[C@@H](C)[C@@H](C)C1,CCN[C@H]1C[C@H](C)C[C@H](C)[C@@H]1[NH+]1C[C@@H](C)[C@@H](C)C1,5.501545361,1.5698,-3.931745361\nCC1CCN(C(=O)N(CC[NH3+])C2CC2)CC1,CC1CCN(C(=O)N(CC[NH3+])C2CC2)CC1,CC1CCN(C(=O)N(CC[NH3+])C2CC2)CC1,3.063324,0.5446,-2.518724\nCOc1ccc(NC(=O)Cn2nc3c4scc(-c5ccc(F)cc5)c4ncn3c2=O)c(OC)c1,COc1ccc(NC(=O)Cn2nc3c4scc(-c5ccc(F)cc5)c4ncn3c2=O)c(OC)c1,COc1ccc(NC(=O)Cn2nc3c4scc(-c5ccc(F)cc5)c4ncn3c2=O)c(OC)c1,2.583698935,3.5677,0.984001065\nCC(C)OC(=O)N1CCC(NC(=O)[C@H]2SCCc3ccccc32)CC1,CC(C)OC(=O)N1CCC(NC(=O)[C@H]2SCCc3ccccc32)CC1,CC(C)OC(=O)N1CCC(NC(=O)[C@H]2SCCc3ccccc32)CC1,3.087047482,3.1426,0.055552518\nCOc1ccc2c(c1)SC(=O)C2=O,COc1ccc2c(c1)SC(=O)C2=O,COc1ccc2c(c1)SC(=O)C2=O,2.401574365,1.5102,-0.891374365\nCOc1ccc(CCC(=O)N2CCC3(CC2)Nc2ccccc2-n2cccc23)cc1,COc1ccc(CCC(=O)N2CCC3(CC2)Nc2ccccc2-n2cccc23)cc1,COc1ccc(CCC(=O)N2CCC3(CC2)Nc2ccccc2-n2cccc23)cc1,2.913337418,4.3619,1.448562582\nCOc1ccccc1Cc1nnc(NC(=O)Cc2ccc(Br)cc2)s1,COc1ccccc1Cc1nnc(NC(=O)Cc2ccc(Br)cc2)s1,COc1ccccc1
Cc1nnc(NC(=O)Cc2ccc(Br)cc2)s1,2.051226236,4.0812,2.029973764\nCNC(=O)O/N=C(/N)c1ccc(Br)cc1,CNC(=O)O/N=C(/N)c1ccc(Br)cc1,CNC(=O)O/N=C(/N)c1ccc(Br)cc1,2.250317734,1.4254,-0.824917734\nCC1(C)CN(C[C@@H](CS)c2ccccc2)CCO1,CC1(C)CN(C[C@@H](CS)c2ccccc2)CCO1,CC1(C)CN(C[C@@H](CS)c2ccccc2)CCO1,3.027019854,2.8108,-0.216219854\nCCOC(=O)[C@H]1CCCN(C(=O)Cn2c(SC(F)F)nc3ccccc32)C1,CCOC(=O)[C@H]1CCCN(C(=O)Cn2c(SC(F)F)nc3ccccc32)C1,CCOC(=O)[C@H]1CCCN(C(=O)Cn2c(SC(F)F)nc3ccccc32)C1,2.799716077,3.1527,0.352983923\n[NH3+][C@@H](Cc1ccccn1)c1ccc(Cl)cn1,[NH3+][C@@H](Cc1ccccn1)c1ccc(Cl)cn1,[NH3+][C@@H](Cc1ccccn1)c1ccc(Cl)cn1,3.324055225,1.6557,-1.668355225\nCCOCCN(C)C(=O)C(=O)Nc1cccnc1-c1ccccc1,CCOCCN(C)C(=O)C(=O)Nc1cccnc1-c1ccccc1,CCOCCN(C)C(=O)C(=O)Nc1cccnc1-c1ccccc1,2.256607012,2.182,-0.074607012\nCCn1cc([C@H](NC(=O)c2ccc(OC)c(OC)c2)c2ccc(F)cc2)c2cc[nH+]cc21,CCn1cc([C@H](NC(=O)c2ccc(OC)c(OC)c2)c2ccc(F)cc2)c2cc[nH+]cc21,CCn1cc([C@H](NC(=O)c2ccc(OC)c(OC)c2)c2ccc(F)cc2)c2cc[nH+]cc21,3.447988059,4.151,0.703011941\nO=C([C@@H]1CC12CCOCC2)N1CC[C@@H](Oc2cccc(Cl)c2)C1,O=C([C@@H]1CC12CCOCC2)N1CC[C@@H](Oc2cccc(Cl)c2)C1,O=C([C@@H]1CC12CCOCC2)N1CC[C@@H](Oc2cccc(Cl)c2)C1,3.713785706,3.1364,-0.577385706\nC[NH2+][C@]1(C(N)=O)CCC[C@H]1CCSC1CCOCC1,C[NH2+][C@]1(C(N)=O)CCC[C@H]1CCSC1CCOCC1,C[NH2+][C@]1(C(N)=O)CCC[C@H]1CCSC1CCOCC1,4.614268117,0.5061,-4.108168117\nCc1cc(NC(=O)c2ccc(-n3cncn3)nc2)ccc1N(C)C,Cc1cc(NC(=O)c2ccc(-n3cncn3)nc2)ccc1N(C)C,Cc1cc(NC(=O)c2ccc(-n3cncn3)nc2)ccc1N(C)C,2.267592306,2.28902,0.021427694\nCc1ccccc1CNC(=O)N1CCC([C@@H](O)c2ccc(Cl)cc2)CC1,Cc1ccccc1CNC(=O)N1CCC([C@@H](O)c2ccc(Cl)cc2)CC1,Cc1ccccc1CNC(=O)N1CCC([C@@H](O)c2ccc(Cl)cc2)CC1,2.524142332,4.30362,1.779477668\nCCc1nc(CNCCc2cccs2)cs1,CCc1nc(CNCCc2cccs2)cs1,CCc1nc(CNCCc2cccs2)cs1,2.300594957,3.0993,0.798705043\nO=C(Nc1ccc2nc(-c3cc(F)ccc3F)[nH]c2c1)c1ccc([N+](=O)[O-])cc1,O=C(Nc1ccc2nc(-c3cc(F)ccc3F)[nH]c2c1)c1ccc([N+](=O)[O-])cc1,O=C(Nc1ccc2nc(-c3cc(F)ccc3F)[nH]c2c1)c1ccc([N+](=O)[O-])cc1,2.145744096,4.6686,2.522855904\nN#Cc1ccccc1N
C(=O)C(=O)NC[C@H]1COC2(CCCC2)O1,N#Cc1ccccc1NC(=O)C(=O)NC[C@H]1COC2(CCCC2)O1,N#Cc1ccccc1NC(=O)C(=O)NC[C@H]1COC2(CCCC2)O1,3.412921889,1.29868,-2.114241889\nO=C(NC1(C(=O)[O-])CC[NH2+]CC1)OCC1c2ccccc2-c2ccccc21,O=C(NC1(C(=O)[O-])CC[NH2+]CC1)OCC1c2ccccc2-c2ccccc21,O=C(NC1(C(=O)[O-])CC[NH2+]CC1)OCC1c2ccccc2-c2ccccc21,3.490460281,0.371,-3.119460281\nCc1ccc(C(=O)N2CCC[C@H]2CNC(=O)OC(C)(C)C)cc1,Cc1ccc(C(=O)N2CCC[C@H]2CNC(=O)OC(C)(C)C)cc1,Cc1ccc(C(=O)N2CCC[C@H]2CNC(=O)OC(C)(C)C)cc1,2.424440646,3.12432,0.699879354\nC[NH+](C)CC(=O)NNC(=O)c1ccc(C2CCCCC2)cc1,C[NH+](C)CC(=O)NNC(=O)c1ccc(C2CCCCC2)cc1,C[NH+](C)CC(=O)NNC(=O)c1ccc(C2CCCCC2)cc1,2.819298295,0.6398,-2.179498295\nC[C@@H]1CCCC[C@@H]1OCC(=O)N(C)Cc1cccc(O)c1,C[C@@H]1CCCC[C@@H]1OCC(=O)N(C)Cc1cccc(O)c1,C[C@@H]1CCCC[C@@H]1OCC(=O)N(C)Cc1cccc(O)c1,2.912373702,2.9459,0.033526298\nC#CCNS(=O)(=O)c1ccc(C(=O)N[C@@H]2CCO[C@@]3(CCOC3)C2)cc1,C#CCNS(=O)(=O)c1ccc(C(=O)N[C@@H]2CCO[C@@]3(CCOC3)C2)cc1,C#CCNS(=O)(=O)c1ccc(C(=O)N[C@@H]2CCO[C@@]3(CCOC3)C2)cc1,3.898168643,0.666,-3.232168643\nCc1ccc(C2(CN3CCN(S(=O)(=O)N(C)C)CC3)CCCC2)cc1,Cc1ccc(C2(CN3CCN(S(=O)(=O)N(C)C)CC3)CCCC2)cc1,Cc1ccc(C2(CN3CCN(S(=O)(=O)N(C)C)CC3)CCCC2)cc1,2.439754533,2.23082,-0.208934533\nCc1sc2ncn(CCC(=O)NCC(N)=O)c(=O)c2c1C,Cc1sc2ncn(CCC(=O)NCC(N)=O)c(=O)c2c1C,Cc1sc2ncn(CCC(=O)NCC(N)=O)c(=O)c2c1C,2.347207183,0.06644,-2.280767183\nCOc1ccc(CCNC(=O)c2cc(C(C)C)nc3c2c(C)nn3C)cc1OC,COc1ccc(CCNC(=O)c2cc(C(C)C)nc3c2c(C)nn3C)cc1OC,COc1ccc(CCNC(=O)c2cc(C(C)C)nc3c2c(C)nn3C)cc1OC,2.282558575,3.38982,1.107261425\nO=[N+]([O-])c1cnc(Br)s1,O=[N+]([O-])c1cnc(Br)s1,O=[N+]([O-])c1cnc(Br)s1,3.179818392,1.8138,-1.366018392\nCC(C)(C)CC(=O)N1CCN(C(=O)c2ccn3ccsc23)CC1,CC(C)(C)CC(=O)N1CCN(C(=O)c2ccn3ccsc23)CC1,CC(C)(C)CC(=O)N1CCN(C(=O)c2ccn3ccsc23)CC1,2.712888331,2.7214,0.008511669\nC/[NH+]=C(/NCCCc1cccc(Br)c1)NC1CC1,C/[NH+]=C(/NCCCc1cccc(Br)c1)NC1CC1,C/[NH+]=C(/NCCCc1cccc(Br)c1)NC1CC1,2.991361127,0.7897,-2.201661127\nO=C(CCC1CCCC1)N1CCC(COc2ccccc2C(=O)N2CCCC2)CC1,O=C(CCC1CCCC1)N1CCC(COc2ccccc2C(=O
)N2CCCC2)CC1,O=C(CCC1CCCC1)N1CCC(COc2ccccc2C(=O)N2CCCC2)CC1,2.19573155,4.5104,2.31466845\nCC(C)(C)c1n[nH]cc1/C=C1\\SC(NC2CCCCC2)=NC1=O,CC(C)(C)c1n[nH]cc1/C=C1\\SC(NC2CCCCC2)=NC1=O,CC(C)(C)c1n[nH]cc1/C=C1\\SC(NC2CCCCC2)=NC1=O,3.076214803,3.5998,0.523585197\nCN(CC(F)(F)F)C(=O)CNC(=O)N1CCO[C@H](C#N)C1,CN(CC(F)(F)F)C(=O)CNC(=O)N1CCO[C@H](C#N)C1,CN(CC(F)(F)F)C(=O)CNC(=O)N1CCO[C@H](C#N)C1,3.449633505,-0.05892,-3.508553505\nCC[C@@H](NC(=O)NCc1ccc(C#N)cc1)c1ccc(C)c(F)c1,CC[C@@H](NC(=O)NCc1ccc(C#N)cc1)c1ccc(C)c(F)c1,CC[C@@H](NC(=O)NCc1ccc(C#N)cc1)c1ccc(C)c(F)c1,2.506404532,3.9563,1.449895468\nCC(C)c1noc(CCNC(=O)/C=C/c2cn(C)c3ccccc23)n1,CC(C)c1noc(CCNC(=O)/C=C/c2cn(C)c3ccccc23)n1,CC(C)c1noc(CCNC(=O)/C=C/c2cn(C)c3ccccc23)n1,2.489753078,3.0568,0.567046922\nCC(C)=C1Oc2c3c(cc(C)c2C1=O)OCN(Cc1ccccc1)C3,CC(C)=C1Oc2c3c(cc(C)c2C1=O)OCN(Cc1ccccc1)C3,CC(C)=C1Oc2c3c(cc(C)c2C1=O)OCN(Cc1ccccc1)C3,2.840364996,4.21612,1.375755004\nO=C1/C(=N/O)c2ccc(Cl)cc2N1Cc1ccccc1,O=C1/C(=N/O)c2ccc(Cl)cc2N1Cc1ccccc1,O=C1/C(=N/O)c2ccc(Cl)cc2N1Cc1ccccc1,2.0410745,3.0651,1.0240255\nc1ccc(CC2(Cc3ccc4c(c3)CCC4)C[NH2+]C2)cc1,c1ccc(CC2(Cc3ccc4c(c3)CCC4)C[NH2+]C2)cc1,c1ccc(CC2(Cc3ccc4c(c3)CCC4)C[NH2+]C2)cc1,2.844717318,2.5239,-0.320817318\nCC[C@H](C)NC(=O)N[C@H](c1ccccc1)C(F)(F)F,CC[C@H](C)NC(=O)N[C@H](c1ccccc1)C(F)(F)F,CC[C@H](C)NC(=O)N[C@H](c1ccccc1)C(F)(F)F,2.819638554,3.3877,0.568061446\nNC(=O)COc1ccc2ccccc2c1C[NH2+]C1CCCCCCC1,NC(=O)COc1ccc2ccccc2c1C[NH2+]C1CCCCCCC1,NC(=O)COc1ccc2ccccc2c1C[NH2+]C1CCCCCCC1,2.95041288,2.8802,-0.07021288\nO=C(COc1ccc(Cl)cc1[N+](=O)[O-])N1CCc2cc([N+](=O)[O-])ccc21,O=C(COc1ccc(Cl)cc1[N+](=O)[O-])N1CCc2cc([N+](=O)[O-])ccc21,O=C(COc1ccc(Cl)cc1[N+](=O)[O-])N1CCc2cc([N+](=O)[O-])ccc21,2.251658336,3.1245,0.872841664\nO=C(Cc1ccc(F)cc1F)NC1CCN(c2cccc(F)c2)CC1,O=C(Cc1ccc(F)cc1F)NC1CCN(c2cccc(F)c2)CC1,O=C(Cc1ccc(F)cc1F)NC1CCN(c2cccc(F)c2)CC1,2.033335093,3.4316,1.398264907\nCOC(=O)c1ccc(C[NH+](C2CC2)[C@@H](C)c2ccco2)cc1F,COC(=O)c1ccc(C[NH+](C2CC2)[C@@H](C)c2ccco2)cc1F,COC(=O)c1ccc(C[NH+](C2
CC2)[C@@H](C)c2ccco2)cc1F,4.038401414,2.5138,-1.524601414\nCc1ccc([C@@H]2C[NH2+]CC[C@@H]2c2c(F)cccc2F)cc1,Cc1ccc([C@@H]2C[NH2+]CC[C@@H]2c2c(F)cccc2F)cc1,Cc1ccc([C@@H]2C[NH2+]CC[C@@H]2c2c(F)cccc2F)cc1,3.825954589,3.10772,-0.718234589\nCc1ccc(-c2ccc(=O)n(CC(=O)NCc3ccco3)n2)c(C)c1,Cc1ccc(-c2ccc(=O)n(CC(=O)NCc3ccco3)n2)c(C)c1,Cc1ccc(-c2ccc(=O)n(CC(=O)NCc3ccco3)n2)c(C)c1,2.180504107,2.43654,0.256035893\nCCNC(=O)NC(=O)CSc1nnc(N2C[C@@H](C)C[C@@H](C)C2)n1Cc1ccco1,CCNC(=O)NC(=O)CSc1nnc(N2C[C@@H](C)C[C@@H](C)C2)n1Cc1ccco1,CCNC(=O)NC(=O)CSc1nnc(N2C[C@@H](C)C[C@@H](C)C2)n1Cc1ccco1,3.320744633,2.3395,-0.981244633\nCC(C)(C)NC(=O)N1CCC(CNC(=O)C(=O)Nc2ccccc2[N+](=O)[O-])CC1,CC(C)(C)NC(=O)N1CCC(CNC(=O)C(=O)Nc2ccccc2[N+](=O)[O-])CC1,CC(C)(C)NC(=O)N1CCC(CNC(=O)C(=O)Nc2ccccc2[N+](=O)[O-])CC1,2.351664532,1.8696,-0.482064532\nCc1ccc(-n2cnnn2)cc1NC(=O)c1cnn(Cc2ccccc2)c1,Cc1ccc(-n2cnnn2)cc1NC(=O)c1cnn(Cc2ccccc2)c1,Cc1ccc(-n2cnnn2)cc1NC(=O)c1cnn(Cc2ccccc2)c1,2.221673771,2.46782,0.246146229\nO=C(Nc1ccc(F)cc1OCC(F)F)c1cccnc1N1CCCC1,O=C(Nc1ccc(F)cc1OCC(F)F)c1cccnc1N1CCCC1,O=C(Nc1ccc(F)cc1OCC(F)F)c1cccnc1N1CCCC1,2.273178554,3.7171,1.443921446\nCCCNC(=O)CCNC(=O)CN1CCC([NH3+])CC1,CCCNC(=O)CCNC(=O)CN1CCC([NH3+])CC1,CCCNC(=O)CCNC(=O)CN1CCC([NH3+])CC1,2.682391179,-1.2748,-3.957191179\nCOCCNC(=O)COc1cc2c(c3oc(=O)c4c(c13)CCC4)CCC(C)(C)O2,COCCNC(=O)COc1cc2c(c3oc(=O)c4c(c13)CCC4)CCC(C)(C)O2,COCCNC(=O)COc1cc2c(c3oc(=O)c4c(c13)CCC4)CCC(C)(C)O2,2.819768111,2.5267,-0.293068111\nCC(=O)N[C@@H](CC(=O)Nc1cnn(C)c1)c1cccs1,CC(=O)N[C@@H](CC(=O)Nc1cnn(C)c1)c1cccs1,CC(=O)N[C@@H](CC(=O)Nc1cnn(C)c1)c1cccs1,2.760825185,1.6876,-1.073225185\nO=C(c1cccc(S(=O)(=O)[N-]c2ccccc2F)c1)N1CCC[C@@H]1c1nc2ccccc2[nH]1,O=C(c1cccc(S(=O)(=O)[N-]c2ccccc2F)c1)N1CCC[C@@H]1c1nc2ccccc2[nH]1,O=C(c1cccc(S(=O)(=O)[N-]c2ccccc2F)c1)N1CCC[C@@H]1c1nc2ccccc2[nH]1,3.413113671,5.0734,1.660286329\nC[C@H](CNC(=O)c1[nH]nc(C2CC2)c1Cl)Oc1cccc(F)c1,C[C@H](CNC(=O)c1[nH]nc(C2CC2)c1Cl)Oc1cccc(F)c1,C[C@H](CNC(=O)c1[nH]nc(C2CC2)c1Cl)Oc1cccc(F)c1,3.06692193,3.27
69,0.20997807\nCOc1ccccc1C(=O)/C(C#N)=C\\c1cnc(-c2ccccn2)s1,COc1ccccc1C(=O)/C(C#N)=C\\c1cnc(-c2ccccn2)s1,COc1ccccc1C(=O)/C(C#N)=C\\c1cnc(-c2ccccn2)s1,2.549035852,4.00358,1.454544148\nO=C(NCc1ccco1)[C@H]1CCCN1C(=O)C[NH+]1CCc2sccc2C1,O=C(NCc1ccco1)[C@H]1CCCN1C(=O)C[NH+]1CCc2sccc2C1,O=C(NCc1ccco1)[C@H]1CCCN1C(=O)C[NH+]1CCc2sccc2C1,4.132206287,0.5895,-3.542706287\nCC[C@@H]1CN(C(=O)c2ccc3c(c2)OCO3)CC[NH2+]1,CC[C@@H]1CN(C(=O)c2ccc3c(c2)OCO3)CC[NH2+]1,CC[C@@H]1CN(C(=O)c2ccc3c(c2)OCO3)CC[NH2+]1,3.69238642,0.2131,-3.47928642\nCc1cc(C(=O)Nc2ccc(F)cc2C)on1,Cc1cc(C(=O)Nc2ccc(F)cc2C)on1,Cc1cc(C(=O)Nc2ccc(F)cc2C)on1,1.934252507,2.68284,0.748587493\nC[C@@H](N[C@H](C[C@@H]1CCOC1)c1ccccc1)c1ccc([N+](=O)[O-])cc1,C[C@@H](N[C@H](C[C@@H]1CCOC1)c1ccccc1)c1ccc([N+](=O)[O-])cc1,C[C@@H](N[C@H](C[C@@H]1CCOC1)c1ccccc1)c1ccc([N+](=O)[O-])cc1,3.168218916,4.4133,1.245081084\nO=C(NNS(=O)(=O)c1ccc(F)cc1)c1cc2c(s1)CCCC2,O=C(NNS(=O)(=O)c1ccc(F)cc1)c1cc2c(s1)CCCC2,O=C(NNS(=O)(=O)c1ccc(F)cc1)c1cc2c(s1)CCCC2,2.219578826,2.3893,0.169721174\nCc1cc(C#N)ccc1Oc1ccc([N+](=O)[O-])cc1C#N,Cc1cc(C#N)ccc1Oc1ccc([N+](=O)[O-])cc1C#N,Cc1cc(C#N)ccc1Oc1ccc([N+](=O)[O-])cc1C#N,2.231562188,3.43888,1.207317812\nCOc1ccc(-c2nc(CNC(=O)c3ccccc3)n[nH]2)cc1,COc1ccc(-c2nc(CNC(=O)c3ccccc3)n[nH]2)cc1,COc1ccc(-c2nc(CNC(=O)c3ccccc3)n[nH]2)cc1,1.980697063,2.4103,0.429602937\nCc1cccc(OCC(=O)N/N=C/c2cc(Br)c(O)c(Br)c2O)c1,Cc1cccc(OCC(=O)N/N=C/c2cc(Br)c(O)c(Br)c2O)c1,Cc1cccc(OCC(=O)N/N=C/c2cc(Br)c(O)c(Br)c2O)c1,2.453904832,3.46032,1.006415168\nCC1=Nc2ccccc2Oc2c1c(=O)n(-c1ccccc1)c1ccccc21,CC1=Nc2ccccc2Oc2c1c(=O)n(-c1ccccc1)c1ccccc21,CC1=Nc2ccccc2Oc2c1c(=O)n(-c1ccccc1)c1ccccc21,2.458237938,5.2371,2.778862062\nCOC[C@@](C)(O)CNC(=O)[C@@H]1CC(=O)N(c2ccc(F)cc2)C1,COC[C@@](C)(O)CNC(=O)[C@@H]1CC(=O)N(c2ccc(F)cc2)C1,COC[C@@](C)(O)CNC(=O)[C@@H]1CC(=O)N(c2ccc(F)cc2)C1,3.005123049,0.6922,-2.312923049\nFc1ccccc1[C@@H]([NH2+]Cc1ccoc1)C1CCCC1,Fc1ccccc1[C@@H]([NH2+]Cc1ccoc1)C1CCCC1,Fc1ccccc1[C@@H]([NH2+]Cc1ccoc1)C1CCCC1,3.69757193,3.4136,-0.28397193\nC
C1CCN(C(=O)CN2C(=O)N[C@](C)(c3ccc(Cl)cc3Cl)C2=O)CC1,CC1CCN(C(=O)CN2C(=O)N[C@](C)(c3ccc(Cl)cc3Cl)C2=O)CC1,CC1CCN(C(=O)CN2C(=O)N[C@](C)(c3ccc(Cl)cc3Cl)C2=O)CC1,2.732754879,3.0189,0.286145121\nCOc1cccc(NC(=O)NCc2ccc3c(c2)N(C(=O)c2cccc(C)c2)CC3)c1,COc1cccc(NC(=O)NCc2ccc3c(c2)N(C(=O)c2cccc(C)c2)CC3)c1,COc1cccc(NC(=O)NCc2ccc3c(c2)N(C(=O)c2cccc(C)c2)CC3)c1,2.09420016,4.52822,2.43401984\nO=C(/C=C/c1ccccc1)NC(=S)Nc1cc([N+](=O)[O-])ccc1F,O=C(/C=C/c1ccccc1)NC(=S)Nc1cc([N+](=O)[O-])ccc1F,O=C(/C=C/c1ccccc1)NC(=S)Nc1cc([N+](=O)[O-])ccc1F,2.105372184,3.2603,1.154927816\nCCCn1nccc1NC(=O)c1nnn(-c2ccc(C)cc2)c1C,CCCn1nccc1NC(=O)c1nnn(-c2ccc(C)cc2)c1C,CCCn1nccc1NC(=O)c1nnn(-c2ccc(C)cc2)c1C,2.315214931,2.74294,0.427725069\nCOc1cccc(-c2nc(CC(=O)N[C@@H]3CCS(=O)(=O)C3)cs2)c1,COc1cccc(-c2nc(CC(=O)N[C@@H]3CCS(=O)(=O)C3)cs2)c1,COc1cccc(-c2nc(CC(=O)N[C@@H]3CCS(=O)(=O)C3)cs2)c1,2.722418907,1.6645,-1.057918907\nCOc1cc(NC(=O)c2cnn(-c3ccccc3)n2)cc(OC)c1OC,COc1cc(NC(=O)c2cnn(-c3ccccc3)n2)cc(OC)c1OC,COc1cc(NC(=O)c2cnn(-c3ccccc3)n2)cc(OC)c1OC,2.151091028,2.5454,0.394308972\nCOc1ccc2c(c1)cc(C(=O)N[C@H](C)Cc1cnccn1)n2C,COc1ccc2c(c1)cc(C(=O)N[C@H](C)Cc1cnccn1)n2C,COc1ccc2c(c1)cc(C(=O)N[C@H](C)Cc1cnccn1)n2C,2.807843567,2.3379,-0.469943567\nCC(C)COCCNC(=O)NNc1ccccc1Cl,CC(C)COCCNC(=O)NNc1ccccc1Cl,CC(C)COCCNC(=O)NNc1ccccc1Cl,2.165718029,2.6387,0.472981971\nCCC[C@@H](C)[NH2+]C[C@H](C)Oc1cccnc1C,CCC[C@@H](C)[NH2+]C[C@H](C)Oc1cccnc1C,CCC[C@@H](C)[NH2+]C[C@H](C)Oc1cccnc1C,4.126301903,1.90932,-2.216981903\nCc1ccc(C)c(OCCSc2ccc(N)c(C)c2)c1,Cc1ccc(C)c(OCCSc2ccc(N)c(C)c2)c1,Cc1ccc(C)c(OCCSc2ccc(N)c(C)c2)c1,2.060838323,4.36516,2.304321677\nC[C@H](CC#N)N(C)C[C@@H]1CSc2ccccc21,C[C@H](CC#N)N(C)C[C@@H]1CSc2ccccc21,C[C@H](CC#N)N(C)C[C@@H]1CSc2ccccc21,3.590360435,3.10988,-0.480480435\nCCN[C@@]1(C#N)CC[C@@H](Oc2cccc(F)c2F)C1,CCN[C@@]1(C#N)CC[C@@H](Oc2cccc(F)c2F)C1,CCN[C@@]1(C#N)CC[C@@H](Oc2cccc(F)c2F)C1,3.52564354,2.76798,-0.75766354\nCc1cc(C[NH2+][C@@H](CO)C(C)C)c(C)n1-c1ccccc1,Cc1cc(C[NH2+][C@@H](CO)C(C)C)c(C)n1-c1ccccc1
,Cc1cc(C[NH2+][C@@H](CO)C(C)C)c(C)n1-c1ccccc1,3.596987275,2.17444,-1.422547275\nCC1CCC(NC(=O)C(=O)Nc2cccc(-c3nncn3C)c2)CC1,CC1CCC(NC(=O)C(=O)Nc2cccc(-c3nncn3C)c2)CC1,CC1CCC(NC(=O)C(=O)Nc2cccc(-c3nncn3C)c2)CC1,2.347143544,2.1155,-0.231643544\nCC(=O)Nc1cccc(C(=O)/C=C/c2ccco2)c1,CC(=O)Nc1cccc(C(=O)/C=C/c2ccco2)c1,CC(=O)Nc1cccc(C(=O)/C=C/c2ccco2)c1,2.003848944,3.1341,1.130251056\nCCCNC(=O)c1cccc(NC(=O)C(=O)N2C[C@H]3CC=CC[C@@H]3C2)c1,CCCNC(=O)c1cccc(NC(=O)C(=O)N2C[C@H]3CC=CC[C@@H]3C2)c1,CCCNC(=O)c1cccc(NC(=O)C(=O)N2C[C@H]3CC=CC[C@@H]3C2)c1,3.148539376,2.1895,-0.959039376\nCc1nc(C)c(C(=O)N2CCN(C(=O)Cc3ccc(Cl)cc3)CC2)o1,Cc1nc(C)c(C(=O)N2CCN(C(=O)Cc3ccc(Cl)cc3)CC2)o1,Cc1nc(C)c(C(=O)N2CCN(C(=O)Cc3ccc(Cl)cc3)CC2)o1,2.226794753,2.47194,0.245145247\nCCc1nc(CN/C(=[NH+]/C)N2CC[NH+](CC(C)C)CC2)cs1,CCc1nc(CN/C(=[NH+]/C)N2CC[NH+](CC(C)C)CC2)cs1,CCc1nc(CN/C(=[NH+]/C)N2CC[NH+](CC(C)C)CC2)cs1,4.614668767,-1.282,-5.896668767\nCN(C)c1ccc(C(=O)Nc2c3c(nn2C)CCC3)cc1,CN(C)c1ccc(C(=O)Nc2c3c(nn2C)CCC3)cc1,CN(C)c1ccc(C(=O)Nc2c3c(nn2C)CCC3)cc1,2.254443099,2.2271,-0.027343099\nO=C1[C@H](Nc2ccccc2O[C@H]2CCOC2)CCCN1Cc1ccccc1,O=C1[C@H](Nc2ccccc2O[C@H]2CCOC2)CCCN1Cc1ccccc1,O=C1[C@H](Nc2ccccc2O[C@H]2CCOC2)CCCN1Cc1ccccc1,2.979200776,3.4574,0.478199224\nN#Cc1cccc(CNC(=O)CCOc2ccc(C=O)cc2)c1,N#Cc1cccc(CNC(=O)CCOc2ccc(C=O)cc2)c1,N#Cc1cccc(CNC(=O)CCOc2ccc(C=O)cc2)c1,1.99088313,2.45608,0.46519687\nCOc1ccc(CCn2cc(C(=O)[O-])c(=O)[nH]c2=O)cc1,COc1ccc(CCn2cc(C(=O)[O-])c(=O)[nH]c2=O)cc1,COc1ccc(CCn2cc(C(=O)[O-])c(=O)[nH]c2=O)cc1,2.453088713,-0.8486,-3.301688713\nCN(CCc1ccccc1)C(=O)C(=O)Nc1ccc(Oc2ccncc2)cc1,CN(CCc1ccccc1)C(=O)C(=O)Nc1ccc(Oc2ccncc2)cc1,CN(CCc1ccccc1)C(=O)C(=O)Nc1ccc(Oc2ccncc2)cc1,2.089518656,3.5135,1.423981344\nCc1cc(C)cc(NC(=O)[C@H]2CCCN(S(=O)(=O)c3cn(C)cn3)C2)c1,Cc1cc(C)cc(NC(=O)[C@H]2CCCN(S(=O)(=O)c3cn(C)cn3)C2)c1,Cc1cc(C)cc(NC(=O)[C@H]2CCCN(S(=O)(=O)c3cn(C)cn3)C2)c1,2.830417176,2.07634,-0.754077176\nCc1nc(CNC(=O)[C@@H]2CCCCN2C(=O)OC(C)(C)C)sc1C,Cc1nc(CNC(=O)[C@@H]2CCCCN2C(=O)OC(C)(C)C)sc1C,Cc1n
c(CNC(=O)[C@@H]2CCCCN2C(=O)OC(C)(C)C)sc1C,2.787253271,3.16574,0.378486729\nCCSC1=NC(=O)[C@H]2C(=N1)NC1=C(C(=O)CCC1)[C@H]2c1cccnc1,CCSC1=NC(=O)[C@H]2C(=N1)NC1=C(C(=O)CCC1)[C@H]2c1cccnc1,CCSC1=NC(=O)[C@H]2C(=N1)NC1=C(C(=O)CCC1)[C@H]2c1cccnc1,4.10432105,2.4395,-1.66482105\nCOCC[C@H](C)C(=O)Nc1ccc(Oc2cc[nH+]c3ccccc23)cc1,COCC[C@H](C)C(=O)Nc1ccc(Oc2cc[nH+]c3ccccc23)cc1,COCC[C@H](C)C(=O)Nc1ccc(Oc2cc[nH+]c3ccccc23)cc1,3.383061791,4.0573,0.674238209\nCOC(=O)c1sccc1S(=O)(=O)N1CCC[C@@H](NC(C)=O)C1,COC(=O)c1sccc1S(=O)(=O)N1CCC[C@@H](NC(C)=O)C1,COC(=O)c1sccc1S(=O)(=O)N1CCC[C@@H](NC(C)=O)C1,2.758371043,0.8239,-1.934471043\nCC(=O)O[C@H]1CC[C@]2(C)[C@@H]3CC[C@@]4(C)[C@@H](C[C@@H]5CCCC[C@@]54C(C)=O)[C@@H]3C[C@@H](O)[C@@]2(O)C1,CC(=O)O[C@H]1CC[C@]2(C)[C@@H]3CC[C@@]4(C)[C@@H](C[C@@H]5CCCC[C@@]54C(C)=O)[C@@H]3C[C@@H](O)[C@@]2(O)C1,CC(=O)O[C@H]1CC[C@]2(C)[C@@H]3CC[C@@]4(C)[C@@H](C[C@@H]5CCCC[C@@]54C(C)=O)[C@@H]3C[C@@H](O)[C@@]2(O)C1,4.807691125,4.422,-0.385691125\nCCNS(=O)(=O)[C@H]1CCN(CCSc2ccccc2)C1,CCNS(=O)(=O)[C@H]1CCN(CCSc2ccccc2)C1,CCNS(=O)(=O)[C@H]1CCN(CCSc2ccccc2)C1,2.893618866,1.7923,-1.101318866\nCC(C)OCCCNC(=O)N1CCC[NH+](CC2CCCCC2)CC1,CC(C)OCCCNC(=O)N1CCC[NH+](CC2CCCCC2)CC1,CC(C)OCCCNC(=O)N1CCC[NH+](CC2CCCCC2)CC1,3.869396076,1.682,-2.187396076\n"
  },
  {
    "path": "tests/data/micro_ZINC_shard_1.csv",
    "content": "SMILES,SA,logp,score\nCCc1ccc(C(=O)/C(=C(/S)NC2CC2)[n+]2ccc(CC)cc2)cc1,2.775189478,3.7897,1.014510522\nC=CCOc1ccc(C(F)(F)F)cc1C(=O)NC[C@H]1CCC[C@H]1O,3.071497898,3.161,0.089502102\nCOc1cc(OC)cc([C@@H](NC(=O)Cn2ccccc2=O)c2nccn2C)c1,2.84000896,1.5048,-1.33520896\nCN1CCC[C@@]2(CCN(C(=O)CN3CCNC(=O)C3)C2)C1=O,3.605168457,-1.1109,-4.716068457\nCOc1ccc(Cl)cc1NC(=O)c1nn(C)cc1[N+](=O)[O-],2.132664673,2.2426,0.109935327\nCc1nn(-c2ccccc2)c(O)c1/C=[NH+]/Cc1ccncc1,3.117639223,0.98102,-2.136619223\nC=CCN1C(=O)C(C)(C)COc2ccc(NC(=O)C3(c4ccc(OC)cc4)CCOCC3)cc21,2.744789746,4.3197,1.574910254\nCON(C)C(=O)c1cnn2c(C)cc(C)nc12,2.547847184,0.97954,-1.568307184\nNC(=O)[C@@H](NC(=O)CCCc1cccs1)c1ccccc1,2.444743614,2.4136,-0.031143614\nCOc1cccc(CN2CCC[NH+](CC(=O)Nc3ccc(F)cc3)S2(=O)=O)c1,3.546311784,0.8084,-2.737911784\n"
  },
  {
    "path": "tests/data/micro_ZINC_shard_2.csv",
    "content": "SMILES,SA,logp,score\nCn1c(=O)oc2ccc(NC(=O)[C@@H]3CCN(C(=O)OC(C)(C)C)C3)cc21,2.748880773,2.327,-0.421880773\nCC(C)N1CCO[C@@H]([C@H](O)c2cccs2)C1,3.356907136,1.8907,-1.466207136\nCCN[C@H]1C[C@H](C)C[C@H](C)[C@@H]1[NH+]1C[C@@H](C)[C@@H](C)C1,5.501545361,1.5698,-3.931745361\nCC1CCN(C(=O)N(CC[NH3+])C2CC2)CC1,3.063324,0.5446,-2.518724\nCOc1ccc(NC(=O)Cn2nc3c4scc(-c5ccc(F)cc5)c4ncn3c2=O)c(OC)c1,2.583698935,3.5677,0.984001065\nCC(C)OC(=O)N1CCC(NC(=O)[C@H]2SCCc3ccccc32)CC1,3.087047482,3.1426,0.055552518\nCOc1ccc2c(c1)SC(=O)C2=O,2.401574365,1.5102,-0.891374365\nCOc1ccc(CCC(=O)N2CCC3(CC2)Nc2ccccc2-n2cccc23)cc1,2.913337418,4.3619,1.448562582\nCOc1ccccc1Cc1nnc(NC(=O)Cc2ccc(Br)cc2)s1,2.051226236,4.0812,2.029973764\nCNC(=O)O/N=C(/N)c1ccc(Br)cc1,2.250317734,1.4254,-0.824917734\n"
  },
  {
    "path": "tests/data/pcqm4mv2-2k.csv",
    "content": "smiles,homo\r\n\"[H]C1C2=C(NC(=O)[C@@]1([H])C1=C([H])C([H])=C(C([H])([H])[H])C([H])=C1[H])C([H])=C([H])N=C2[H] |(6.4528,-1.5789,-1.2859;5.789,-0.835,-0.8455;4.8499,-0.2104,-1.5946;3.9134,0.7241,-0.934;3.9796,1.1019,0.3172;5.0405,0.6404,1.1008;5.2985,1.1457,2.1772;5.9121,-0.5519,0.613;6.9467,-0.2303,0.8014;5.677,-1.7955,1.4745;4.7751,-2.7953,1.0929;4.2336,-2.7113,0.154;4.5521,-3.9001,1.914;3.8445,-4.6636,1.5979;5.215,-4.0391,3.1392;4.9919,-5.2514,4.0126;5.1819,-5.0262,5.0671;5.6619,-6.0746,3.7296;3.966,-5.6247,3.925;6.1051,-3.0257,3.52;6.6247,-3.101,4.4725;6.3372,-1.9217,2.7029;7.0168,-1.1395,3.0281;2.8586,1.2252,-1.7853;2.1303,1.9004,-1.3493;2.8118,0.8707,-3.0956;2.0282,1.2549,-3.7434;3.716,0.0207,-3.7371;4.6658,-0.476,-3.0127;5.3755,-1.1468,-3.5021)|\",3.0476751256\r\n\"[H]C1=C(OC([H])([H])[H])C([H])=C(OC([H])([H])[H])C(/C([H])=C(\\[H])N(C(=O)C([H])([H])[H])C([H])([H])[H])=C1[H] |(2.1307,2.7235,7.6863;1.9474,1.819,7.119;1.0098,0.873,7.538;0.2593,0.9658,8.6759;0.4432,2.1011,9.5057;-0.2404,1.9714,10.3469;0.1957,3.0318,8.978;1.473,2.1641,9.8816;0.7754,-0.2729,6.7631;0.0389,-0.9814,7.1205;1.4835,-0.4763,5.5827;1.3108,-1.5731,4.7848;0.3366,-2.535,5.1551;0.3625,-3.3009,4.3776;-0.6682,-2.0951,5.1999;0.5723,-2.9938,6.1242;2.4537,0.4611,5.1348;3.183,0.2123,3.8873;2.7052,-0.4677,3.1921;4.387,0.7439,3.5935;4.8961,1.3664,4.3173;5.0856,0.5626,2.3959;6.3568,1.0842,2.1622;6.9369,0.8984,1.1006;7.0068,1.897,3.2727;7.1466,1.3059,4.1845;6.4174,2.7835,3.5327;7.9816,2.217,2.9038;4.465,-0.2257,1.3353;4.3066,-1.2574,1.6696;5.134,-0.2156,0.4776;3.4967,0.2086,1.0638;2.6396,1.6,5.9257;3.3352,2.3628,5.587)|\",4.410965516605\r\n\"[H]C([H])=C([H])C([H])([H])N(C(=O)C([H])([H])[H])/C([H])=C(\\[H])C1=C(C([H])([H])[H])C([H])=C([H])C([H])=C1[H] 
|(5.6398,-3.9451,-3.6978;5.5212,-2.8688,-3.6095;6.4169,-2.2685,-3.7605;4.3388,-2.3123,-3.3477;3.4569,-2.937,-3.2114;4.0912,-0.8272,-3.2721;3.4597,-0.5217,-4.1102;5.0313,-0.2687,-3.3174;3.3696,-0.4136,-2.0566;1.9835,-0.2888,-2.1584;1.409,-0.5259,-3.2127;1.2104,0.1511,-0.925;1.3377,-0.5489,-0.0921;0.157,0.1862,-1.2039;1.5195,1.1441,-0.5796;4.0697,-0.1952,-0.866;3.4734,0.2457,-0.0772;5.369,-0.4755,-0.639;5.9379,-1.0039,-1.3951;6.0662,-0.1542,0.6153;7.1526,-0.9547,1.0515;7.6182,-2.1495,0.2508;8.4165,-2.683,0.7758;8.0123,-1.8565,-0.7318;6.8033,-2.8593,0.0652;7.7894,-0.6353,2.2548;8.6184,-1.2554,2.5888;7.3853,0.4487,3.0347;7.9,0.6726,3.9652;6.3255,1.2451,2.602;6.0086,2.1051,3.1863;5.685,0.9485,1.4018;4.892,1.6015,1.0472)|\",4.639541151025\r\n\"[H]C([H])=C([H])C([H])([H])N(C(=O)C([H])([H])[H])/C([H])=C(\\[H])C1=C(F)C([H])=C([H])C([H])=C1[H] |(5.8405,2.9679,-4.1686;5.4898,1.975,-3.902;6.1513,1.1454,-4.1461;4.3093,1.7822,-3.3149;3.6657,2.6305,-3.0846;3.7368,0.4303,-2.9728;4.4721,-0.3627,-3.1422;2.8623,0.2362,-3.5985;3.2588,0.3282,-1.5827;1.891,0.5151,-1.3701;1.1412,0.7684,-2.3024;1.3602,0.392,0.0503;1.8224,1.1205,0.7253;1.5301,-0.608,0.4653;0.2869,0.5781,0.0063;4.1604,0.0686,-0.5486;3.6884,-0.1102,0.4087;5.5037,0.0223,-0.656;5.9814,0.2798,-1.5923;6.4088,-0.3052,0.4474;7.7788,-0.0242,0.3174;8.2146,0.5382,-0.8366;8.7136,-0.2875,1.3076;9.754,-0.0358,1.1291;8.2861,-0.865,2.5032;9.005,-1.0788,3.2885;6.9338,-1.1738,2.6743;6.5937,-1.6376,3.5959;6.0178,-0.9052,1.6608;4.9773,-1.1831,1.8028)|\",4.492599671754999\r\n\"[H]C([H])=C([H])C([H])([H])N(C(=O)C([H])([H])[H])/C([H])=C(\\[H])C1=C([H])C([H])=C([H])C([H])=C1Cl 
|(-0.0541,0.451,-1.5824;0.8271,-0.1777,-1.4924;1.1729,-0.6618,-2.4037;1.4505,-0.3465,-0.3274;1.0864,0.1623,0.5654;2.6492,-1.2429,-0.1242;3.0469,-1.5762,-1.089;2.3502,-2.145,0.4138;3.7386,-0.61,0.6303;3.9548,-0.8423,1.9948;4.8671,-0.2947,2.5954;3.0165,-1.8039,2.7083;3.1354,-2.8311,2.3438;1.9626,-1.5277,2.5961;3.2808,-1.7789,3.7659;4.6049,0.2792,-0.0164;5.3824,0.63,0.6503;4.5334,0.6611,-1.3058;3.7509,0.2781,-1.9483;5.4798,1.5887,-1.9347;6.2554,2.4982,-1.1848;6.1239,2.528,-0.1076;7.1571,3.3707,-1.7824;7.7335,4.053,-1.164;7.3083,3.3803,-3.1708;8.0046,4.0633,-3.6486;6.5488,2.509,-3.9482;6.6448,2.4985,-5.0287;5.6551,1.6343,-3.333;4.7423,0.5451,-4.3871)|\",4.612329765975001\r\n\"[H]C1=C([H])C(Cl)=C(/C([H])=C(\\[H])N(C(=O)C([H])([H])[H])C([H])([H])C([H])([H])[H])C([H])=C1[H] |(6.2238,4.1383,2.4712;5.24,3.877,2.0928;4.1152,4.5511,2.5596;4.2053,5.3382,3.3006;2.8531,4.2139,2.0726;1.4816,5.1234,2.7256;2.6547,3.203,1.1079;1.3156,2.8732,0.6116;0.5108,3.4577,1.0391;1.045,1.9255,-0.3086;1.8126,1.3044,-0.7521;-0.2167,1.6082,-0.8198;-0.2885,0.5673,-1.755;0.7109,-0.0361,-2.1169;-1.6578,0.2217,-2.3172;-2.0697,1.0453,-2.9119;-2.3833,-0.0224,-1.5341;-1.5276,-0.6451,-2.9658;-1.4044,2.3178,-0.3288;-1.1326,3.3714,-0.2126;-2.1722,2.2863,-1.103;-1.9495,1.7548,0.988;-2.7974,2.3567,1.3335;-1.1826,1.7638,1.7677;-2.2931,0.7222,0.8633;3.822,2.5453,0.6613;3.7301,1.7593,-0.0812;5.0854,2.8684,1.1383;5.9525,2.3323,0.763)|\",4.478993979229999\r\n\"[H]O[C@@]1([H])[C@@]([H])(OC([H])([H])[H])O[C@]([H])(C([H])([H])C([H])([H])C(=O)OC([H])([H])[H])[C@@]1([H])O[H] 
|(-1.0691,5.6873,4.0522;-0.2843,5.4719,4.5894;0.1104,4.1994,4.1602;1.1869,4.098,4.3392;-0.5856,2.9983,4.8301;-0.235,2.7521,5.8396;-1.9792,3.2653,4.8608;-2.7429,2.2366,5.4837;-3.7884,2.5498,5.4425;-2.4435,2.1157,6.5344;-2.6222,1.2796,4.9633;-0.2712,1.9088,3.9942;-0.1505,2.3686,2.611;-0.9931,1.976,2.0339;1.1608,1.834,2.0323;1.208,2.1117,0.9731;2.002,2.3242,2.5389;1.3228,0.3177,2.1752;1.1655,0.0119,3.215;2.3454,0.012,1.9207;0.3773,-0.4731,1.2912;-0.4063,-0.0046,0.4915;0.5221,-1.8009,1.497;-0.3233,-2.6443,0.697;-0.0694,-3.6662,0.9799;-0.1335,-2.4814,-0.3673;-1.3771,-2.439,0.9035;-0.2223,3.9072,2.6778;0.473,4.3879,1.9857;-1.5087,4.4384,2.3655;-2.1346,3.9966,2.9719)|\",7.102171498050001\r\n\"[H]S[C@@]([H])(N([H])/N=C(\\C1=NC([H])=C([H])C([H])=C1[H])C([H])([H])[H])N([H])C([H])([H])[H] |(6.9999,1.9411,-0.3126;6.6745,1.253,-1.4292;4.8621,1.6138,-1.2944;4.4614,1.4174,-2.3018;4.2381,0.6742,-0.3937;4.6743,0.4429,0.5014;2.9027,0.622,-0.4733;2.1575,0.0451,0.4276;2.6134,-0.5662,1.6918;3.9294,-0.4989,2.0095;4.3507,-1.0353,3.1575;5.4174,-0.9482,3.3566;3.5111,-1.6758,4.0659;3.9092,-2.0996,4.9824;2.1533,-1.7526,3.7518;1.4558,-2.2419,4.4266;1.6984,-1.1954,2.5634;0.6467,-1.2462,2.3085;0.6793,0.0293,0.1039;0.0796,0.5347,0.8727;0.2883,-0.9926,0.0067;0.5242,0.5428,-0.847;4.6445,2.9821,-0.8485;3.673,3.0285,-0.542;4.8872,4.0021,-1.8678;4.5894,4.9768,-1.4695;4.3407,3.8279,-2.8121;5.9548,4.0409,-2.1026)|\",4.242254929294999\r\n\"[H]S[C@]([H])(N([H])[H])N([H])/N=C(\\C1=NC([H])=C([H])C([H])=C1[H])C([H])([H])[H] 
|(7.2606,-1.6234,-0.6765;6.8058,-2.1066,0.5019;5.0453,-2.2046,-0.0304;4.5019,-2.4401,0.8951;4.8022,-3.2785,-0.9619;5.3219,-3.095,-1.8209;3.8147,-3.2345,-1.219;4.6641,-0.8796,-0.5384;5.0086,-0.0455,-0.0564;3.4174,-0.8193,-1.0338;2.779,0.298,-1.2309;3.2458,1.6577,-0.8874;4.4246,1.793,-0.2334;4.8472,3.0145,0.1017;5.7989,3.0576,0.6284;4.1447,4.182,-0.189;4.5382,5.1505,0.1028;2.9297,4.0577,-0.8642;2.3434,4.9376,-1.1156;2.4734,2.7927,-1.2139;1.5306,2.6817,-1.7358;1.4175,0.142,-1.8737;1.3644,0.6342,-2.8543;0.6159,0.5656,-1.2544;1.2167,-0.9215,-2.0162)|\",4.280350868365\r\n\"[H]C1=C([H])C(=O)C([H])=C(N([H])[H])C1C1=NNC(=S)N1N([H])[H] |(-0.9934,2.2369,0.0519;-0.5239,1.2597,0.0598;0.8124,1.1192,0.211;1.4702,1.9759,0.3162;1.4433,-0.2124,0.2642;2.6569,-0.3543,0.4315;0.5557,-1.3545,0.0222;1.0388,-2.3201,-0.102;-0.7895,-1.2143,-0.1355;-1.6248,-2.3012,-0.4796;-1.0917,-3.1467,-0.6625;-2.2162,-2.098,-1.2829;-1.4008,0.1045,0.0152;-2.7666,0.3424,0.1501;-3.3225,1.5873,-0.1933;-4.5951,1.5351,-0.0812;-4.9697,0.256,0.4096;-6.4944,-0.2362,0.8084;-3.8047,-0.4721,0.5273;-3.7065,-1.6264,1.3287;-3.1524,-2.3094,0.8021;-4.6705,-1.9496,1.4408)|\",1.9592197236000004\r\n\"[H]OC1=NC(=O)N(C([H])([H])C([H])([H])C([H])([H])[H])[C@@]([H])(Cl)[C@@]1([H])N(=O)=O |(8.9512,-1.772,-2.0038;8.701,-0.9866,-1.4655;7.4018,-1.0378,-1.218;6.8307,-0.0601,-0.6283;5.4532,-0.07,-0.3653;4.9458,0.6918,0.4322;4.6528,-0.9912,-1.0773;3.1877,-0.8438,-1.0219;2.9801,0.2046,-0.7998;2.7956,-1.0612,-2.0224;2.5405,-1.7455,0.0339;2.8024,-2.7942,-0.1662;2.9676,-1.4908,1.0107;1.0172,-1.5877,0.0655;0.5729,-2.2322,0.831;0.5664,-1.8544,-0.8981;0.7316,-0.5542,0.2936;5.2069,-1.8749,-2.0096;4.5698,-2.7394,-2.1783;5.3767,-1.1336,-3.748;6.5988,-2.2992,-1.5474;6.4863,-2.8865,-0.6268;7.1937,-3.3122,-2.5213;8.3849,-3.2177,-2.8316;6.4504,-4.2033,-2.8849)|\",4.174226466670001\r\n\"[H]OC1=C(C([H])([H])[H])C([H])=C2C(=C1[H])[C@]([H])(C([H])([H])[H])[C@]([H])(C([H])([H])[H])C([H])([H])[C@@]2([H])C(=C([H])[H])C([H])([H])[H] 
|(0.9653,-6.556,2.0573;0.8703,-5.8018,2.6594;1.4279,-4.6954,2.0681;1.3896,-3.4948,2.8016;0.7662,-3.4601,4.1734;0.8154,-2.453,4.5989;1.272,-4.1492,4.8606;-0.2857,-3.7688,4.1423;1.9412,-2.3687,2.2;1.9073,-1.4268,2.7449;2.529,-2.3808,0.9229;2.5717,-3.5916,0.2167;2.0078,-4.7356,0.8054;2.026,-5.6789,0.2583;3.2413,-3.7278,-1.1475;2.5814,-4.3387,-1.7817;4.5703,-4.5054,-1.0142;5.0039,-4.731,-1.9943;4.4067,-5.4563,-0.4967;5.3109,-3.9447,-0.4332;3.3908,-2.3482,-1.8324;2.373,-1.9975,-2.0546;4.1536,-2.4044,-3.1618;4.0879,-1.4406,-3.6804;3.7386,-3.1697,-3.8297;5.2174,-2.6272,-3.0189;4.008,-1.3381,-0.8557;4.2199,-0.3965,-1.3759;4.9753,-1.7124,-0.495;3.0783,-1.0765,0.357;3.6941,-0.6093,1.141;1.9954,-0.0512,0.0018;0.7269,-0.3789,-0.2667;-0.0052,0.3782,-0.5377;0.3719,-1.403,-0.2103;2.4563,1.3879,-0.0317;1.6575,2.0588,-0.3623;3.3149,1.5307,-0.7006;2.783,1.7148,0.9657)|\",5.776977046115\r\n\"[H]ON1C(=O)C2/C(C([H])([H])[H])=N\\N[C@@]2([H])C2=C([H])C([H])=C([H])N=C21 |(8.6974,0.7359,-1.0912;8.0344,1.4638,-1.0143;6.8869,0.7543,-0.677;5.6742,1.413,-0.9112;5.6111,2.5596,-1.3166;4.5322,0.5218,-0.6381;3.3005,0.4958,-1.1911;2.594,1.4495,-2.0878;3.2566,2.2705,-2.3669;1.7133,1.8641,-1.5817;2.234,0.935,-2.9864;2.6429,-0.7292,-0.7996;3.3894,-1.4481,-0.0826;4.658,-0.7229,0.1697;4.5973,-0.491,1.2499;5.9545,-1.4356,-0.099;6.1928,-2.7934,0.0493;5.3786,-3.4503,0.3395;7.482,-3.2948,-0.1784;7.6971,-4.3528,-0.0751;8.488,-2.4025,-0.5292;9.5063,-2.7425,-0.7;8.2793,-1.0854,-0.6827;7.0436,-0.6272,-0.4865)|\",3.68442153577\r\n\"[H]O/C(=C(\\[H])C(=O)C([H])([H])[H])C([H])([H])/C(O[H])=C(/[H])C(=O)C([H])([H])[H] 
|(3.5127,-0.0343,-1.4386;3.6234,0.3026,-0.5021;2.5036,0.9121,-0.1289;2.3475,1.3124,1.1682;3.1957,1.1731,1.8321;1.1566,1.9252,1.7163;0.0343,1.9315,1.17;1.3017,2.5847,3.0757;0.3582,3.0526,3.3616;2.1018,3.3341,3.0659;1.5743,1.8338,3.8281;1.4871,1.1484,-1.2273;2.0247,1.3036,-2.1647;0.897,2.0378,-0.9982;0.5425,-0.028,-1.3674;-0.5485,0.0153,-0.6106;-0.4593,0.7672,0.0445;0.7319,-1.073,-2.2274;-0.0719,-1.7995,-2.3047;1.9025,-1.2737,-3.0545;2.9946,-0.6868,-2.9092;1.7774,-2.2992,-4.1663;1.5478,-3.286,-3.7448;2.7141,-2.3535,-4.7236;0.9557,-2.042,-4.8452)|\",4.93614524807\r\n\"[H]OC(=O)/C([H])=N/N([H])OC([H])([H])C1=C([H])C([H])=NC([H])=C1[H] |(1.8008,1.4792,5.7199;1.6872,1.608,4.7598;1.7389,2.9436,4.5672;1.9014,3.7447,5.4669;1.5757,3.3618,3.158;1.645,4.4408,2.992;1.3723,2.5038,2.2254;1.3225,2.948,0.9515;0.8792,3.8607,0.8235;0.5858,2.0618,0.1301;1.4927,1.1704,-0.54;2.1208,1.7297,-1.2451;2.1463,0.7032,0.2081;0.6685,0.1254,-1.2479;0.9521,-0.2527,-2.5618;1.7522,0.2268,-3.1198;0.189,-1.2593,-3.1561;0.3973,-1.5709,-4.1786;-0.8205,-1.8877,-2.5437;-1.0908,-1.5119,-1.2857;-1.9169,-2.0299,-0.8011;-0.385,-0.5265,-0.5978;-0.6529,-0.2616,0.4199)|\",5.09397128136\r\n\"[H]OC1=NN([H])[C@]([H])(C2=C([H])C([H])=C([H])C([H])=C2[H])C([H])([H])[C@@]1([H])C([H])([H])O[H] 
|(6.9482,0.5215,1.2332;6.353,1.2768,1.0499;5.1514,0.7677,0.6577;4.2602,1.6098,0.2998;3.026,1.1368,-0.1727;2.346,1.8684,0.0075;2.559,-0.1707,0.2937;1.7168,-0.4366,-0.3589;2.056,-0.2368,1.74;1.2855,-1.3398,2.1383;1.0609,-2.1247,1.4179;0.7901,-1.4401,3.4375;0.1897,-2.3004,3.7224;1.0571,-0.4316,4.3664;0.6707,-0.5045,5.3795;1.8177,0.6719,3.9808;2.0294,1.4638,4.6948;2.3121,0.7713,2.6775;2.9076,1.6332,2.3925;3.6967,-1.169,0.012;3.4085,-2.1799,0.3183;3.8658,-1.1829,-1.0723;4.9852,-0.7435,0.7331;4.9126,-1.0148,1.797;6.1924,-1.4997,0.1691;6.4116,-1.1549,-0.8515;5.9681,-2.575,0.1284;7.3145,-1.2668,1.0375;8.1241,-1.545,0.5826)|\",5.0068948492\r\n\"[H]C(=O)C1=C([H])C([H])=C(/C([H])=N/N2C([H])([H])C([H])([H])C([H])([H])C([H])([H])C2([H])[H])C([H])=C1[H] |(-3.3172,-5.0004,6.7885;-3.1619,-5.7033,5.9386;-3.4535,-6.8812,6.0492;-2.5947,-5.0821,4.7279;-2.3458,-5.8507,3.5802;-2.5785,-6.9111,3.6049;-1.8142,-5.2556,2.4459;-1.6227,-5.8559,1.5591;-1.5142,-3.8778,2.4216;-0.9647,-3.2903,1.2004;-0.7932,-3.9798,0.3696;-0.7017,-2.0268,1.1403;-0.1203,-1.4452,0.0699;-0.3423,0.0045,0.0254;-0.3164,0.3688,1.0546;0.4977,0.4529,-0.522;-1.672,0.3477,-0.6662;-2.4941,-0.0194,-0.0374;-1.7831,1.4369,-0.7398;-1.7507,-0.2997,-2.0593;-1.0152,0.1782,-2.7237;-2.736,-0.123,-2.507;-1.4594,-1.8097,-1.9917;-1.4198,-2.2431,-2.999;-2.2663,-2.3222,-1.452;-0.1268,-2.0615,-1.2655;0.6966,-1.5997,-1.8247;0.109,-3.125,-1.1837;-1.7634,-3.1122,3.5785;-1.5295,-2.0532,3.5633;-2.2959,-3.7093,4.7113;-2.486,-3.112,5.6012)|\",3.7551711369\r\n\"[H]C#C[C@@]1(C([H])([H])O[H])C([H])=C([H])[C@]([H])(N2C(=O)N=C(O[H])C([H])([H])C2([H])[H])C1([H])[H] 
|(0.2884,1.3436,-0.8489;1.2118,0.8783,-0.5869;2.286,0.3913,-0.3188;3.5513,-0.2615,0.0573;3.2517,-1.2409,1.2252;2.9826,-0.6556,2.1197;4.1617,-1.8072,1.4493;2.261,-2.1949,0.9031;1.459,-1.6969,0.6683;4.1563,-0.9756,-1.1482;3.6821,-1.8547,-1.5714;5.2529,-0.3684,-1.6021;5.8337,-0.6795,-2.4662;5.5929,0.8782,-0.8117;5.3594,1.7763,-1.3905;7.014,1.0179,-0.4796;7.6756,2.1877,-0.8359;7.0902,3.1428,-1.3198;9.0749,2.2497,-0.617;9.6228,1.4333,0.193;10.9466,1.5776,0.4219;11.2459,0.8992,1.0472;8.8894,0.3287,0.9209;9.5554,-0.5112,1.153;8.5006,0.72,1.871;7.748,-0.1372,0.018;7.0608,-0.7826,0.572;8.1559,-0.7388,-0.8097;4.6634,0.7699,0.4305;4.2563,1.7395,0.725;5.2354,0.3821,1.2834)|\",5.67629492143\r\n\"[H]C#C[C@@]1(C([H])([H])O[H])C([H])=C([H])[C@]([H])(N([H])[C@@]2([H])N([H])C(O[H])=NC([H])([H])C2([H])[H])C1([H])[H] |(0.1511,-1.2521,-0.2052;1.1467,-0.8862,-0.0958;2.278,-0.4753,0.0121;3.6492,0.0204,0.1818;3.7169,0.8567,1.4953;3.3733,0.235,2.3359;3.0456,1.7168,1.4176;5.0112,1.3845,1.7288;5.6166,0.6299,1.8153;4.6394,-1.1384,0.2469;4.5669,-1.8947,1.0252;5.5426,-1.1065,-0.7335;6.3303,-1.8429,-0.867;5.3482,0.0633,-1.6841;5.0771,-0.3146,-2.6799;6.493,0.9541,-1.8867;6.8835,1.2612,-0.997;7.5419,0.5667,-2.8146;7.0339,0.0945,-3.672;8.2161,1.7944,-3.2185;7.6324,2.5926,-3.4281;9.4962,1.7729,-3.7093;9.8805,3.0026,-4.1423;10.7929,2.8754,-4.4579;10.3061,0.785,-3.7676;9.7905,-0.5009,-3.3115;9.4638,-1.0924,-4.1822;10.6141,-1.067,-2.8598;8.6362,-0.3876,-2.2998;9.0179,0.0119,-1.3508;8.2081,-1.3753,-2.0982;4.1599,0.845,-1.0479;3.3588,1.0222,-1.7681;4.5276,1.8152,-0.7055)|\",5.711669721995\r\n\"[H]OC1=C([H])C([H])=C(/C([H])=C(\\[H])C(=O)O[C@]([H])(O[H])C([H])([H])[H])C([H])=C1O[H] 
|(9.4264,3.3736,4.155;9.7326,2.8289,3.4128;8.6472,2.3007,2.7844;7.3321,2.5407,3.1776;7.1445,3.1848,4.0347;6.2659,1.9659,2.488;5.2482,2.1641,2.8116;6.4903,1.1337,1.3816;5.3448,0.5521,0.6941;4.3684,0.8081,1.1033;5.3403,-0.2644,-0.3784;6.2435,-0.5939,-0.8821;4.0629,-0.7607,-0.9187;2.9552,-0.5127,-0.4724;4.2707,-1.555,-2.0025;3.097,-2.184,-2.5892;2.4136,-2.4397,-1.7785;3.51,-3.3855,-3.1588;4.0851,-3.1754,-3.9132;2.4529,-1.2264,-3.5784;1.5983,-1.7111,-4.0601;3.1749,-0.9294,-4.3496;2.1073,-0.328,-3.0602;7.826,0.8976,0.9906;8.035,0.2548,0.1383;8.8933,1.4635,1.6712;10.2002,1.2635,1.3336;10.2346,0.6715,0.5661)|\",4.144293943115\r\n\"[H]OC1=C([H])C([H])=C(/C([H])=C(\\[H])C(=O)OC([H])([H])C([H])([H])C([H])([H])O[H])C([H])=C1O[H] |(12.1432,2.2868,1.8222;11.6907,2.8547,1.179;10.461,2.33,0.9225;9.9807,1.1595,1.5151;10.615,0.629,2.2232;8.7149,0.6669,1.2139;8.3741,-0.245,1.693;7.8871,1.3405,0.3015;6.552,0.8815,-0.0644;6.0219,1.4898,-0.7965;5.8933,-0.2014,0.3934;6.3095,-0.8849,1.1262;4.5399,-0.5063,-0.1005;3.9154,0.1391,-0.9248;4.0569,-1.6287,0.498;2.7338,-2.0444,0.1056;2.3598,-2.6152,0.9605;2.1109,-1.1564,-0.036;2.7621,-2.899,-1.1591;3.1954,-2.3027,-1.9723;3.4082,-3.7712,-1.0057;1.3653,-3.3767,-1.5664;0.6994,-2.5113,-1.7216;0.9229,-3.9857,-0.77;1.3878,-4.2172,-2.7134;1.7293,-3.6899,-3.4524;8.3817,2.5207,-0.2914;7.7563,3.0589,-1.0023;9.644,3.0206,0.004;10.1564,4.1588,-0.548;9.4979,4.5324,-1.1541)|\",4.100755727035001\r\n\"[H]O[C@]1([H])C([H])([H])N([C@]([H])(C2=C([H])C([H])=C([H])C([H])=C2[H])C([H])([H])[H])C([H])([H])C([H])([H])[C@@]1([H])O[H] 
|(2.6724,5.5109,-0.6301;3.5482,5.0951,-0.5574;3.4235,3.9931,0.3487;4.405,3.5081,0.3376;2.3627,2.9881,-0.1122;2.6422,2.6402,-1.1079;1.3865,3.5156,-0.2105;2.2851,1.8598,0.8161;1.3849,0.77,0.3897;0.3286,1.0904,0.4963;1.5848,-0.4616,1.2698;2.8731,-0.9259,1.5686;3.7298,-0.3536,1.2248;3.0535,-2.088,2.3178;4.0594,-2.4323,2.5449;1.948,-2.8082,2.7777;2.0894,-3.7136,3.3618;0.6615,-2.3536,2.4864;-0.2058,-2.9022,2.8449;0.4841,-1.1871,1.7394;-0.5219,-0.8342,1.5207;1.591,0.3679,-1.083;1.0061,-0.5323,-1.2936;2.6446,0.1395,-1.2795;1.2626,1.1455,-1.7795;1.95,2.3276,2.1705;0.9511,2.8118,2.1813;1.8858,1.4588,2.8296;3.0007,3.3044,2.706;3.9698,2.7993,2.7924;2.7178,3.6509,3.7065;3.1465,4.4939,1.7643;2.1819,5.0416,1.7389;4.1819,5.3506,2.2131;4.4094,5.9054,1.4466)|\",5.994668126515\r\n\"[H]OC([H])([H])C([H])([H])N1C([H])([H])C([H])([H])[C@@]([H])(O[H])[C@]([H])(O[H])C1([H])[H] |(0.657,-2.6941,-1.4077;0.9897,-3.5857,-1.192;0.8383,-3.6745,0.212;-0.1961,-3.9431,0.487;1.4863,-4.4834,0.5672;1.2441,-2.3551,0.8763;2.3086,-2.1903,0.6774;1.1133,-2.4065,1.9724;0.5003,-1.237,0.2777;1.234,0.0338,0.3239;1.374,0.3803,1.367;2.2314,-0.1296,-0.0974;0.5022,1.1183,-0.4775;0.4846,0.8264,-1.5376;1.037,2.072,-0.4044;-0.9305,1.2811,0.0286;-0.918,1.6622,1.0574;-1.6948,2.2486,-0.6981;-1.6448,2.011,-1.6393;-1.6506,-0.0683,0.0402;-1.7227,-0.4277,-1.0037;-2.9382,0.0497,0.6178;-3.3021,0.8822,0.2691;-0.8468,-1.0919,0.841;-1.3642,-2.0549,0.8105;-0.8152,-0.7704,1.8997)|\",7.86409027945\r\n\"[H]O[C@]1([H])C([H])([H])N(C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])C([H])([H])C([H])([H])[C@@]1([H])O[H] 
|(8.5313,-5.555,-0.0139;8.314,-4.6409,0.2391;7.1243,-4.3307,-0.4667;7.3502,-4.0866,-1.5215;6.4763,-3.1046,0.1759;7.1806,-2.2695,0.129;6.2976,-3.3195,1.2489;5.2405,-2.7536,-0.5258;4.6752,-1.4799,-0.0668;3.6423,-1.4307,-0.4345;4.6162,-1.4432,1.039;5.4291,-0.247,-0.5844;4.9662,0.6415,-0.1314;6.467,-0.2567,-0.2245;5.4183,-0.1149,-2.1121;5.8372,-1.0292,-2.5485;4.3762,-0.0605,-2.4595;6.1886,1.11,-2.6142;6.1623,1.1781,-3.7079;7.2419,1.0691,-2.3096;5.7663,2.04,-2.2131;4.2712,-3.8493,-0.4512;3.9835,-4.0701,0.5979;3.3592,-3.5406,-0.9737;4.8296,-5.1224,-1.1003;4.9905,-4.9309,-2.1711;4.1105,-5.9451,-1.0139;6.1563,-5.514,-0.4515;5.9895,-5.7972,0.5952;6.7688,-6.6689,-1.0374;6.8443,-6.5014,-1.9919)|\",7.575649597920001\r\n\"[H]O[C@]1([H])C([H])([H])N(C([H])([H])C([H])([H])C([H])([H])[H])C([H])([H])C([H])([H])[C@@]1([H])O[H] |(7.6268,2.6972,0.3305;7.5124,2.6519,-0.6339;6.9378,1.3752,-0.935;6.741,1.4003,-2.0116;5.6236,1.1368,-0.1874;4.9129,1.926,-0.4553;5.8124,1.2282,0.9069;5.0556,-0.1626,-0.5404;3.722,-0.3518,0.0355;3.7757,-0.5541,1.1254;3.1826,0.5975,-0.0742;2.9015,-1.45,-0.6515;2.8814,-1.2432,-1.7286;3.3894,-2.425,-0.529;1.4743,-1.5336,-0.0996;0.9025,-2.322,-0.6009;0.935,-0.5892,-0.2434;1.4733,-1.7551,0.9748;5.9827,-1.2512,-0.2127;6.1726,-1.2949,0.881;5.521,-2.2026,-0.4903;7.3102,-1.0954,-0.9621;7.1367,-1.1648,-2.0422;7.9992,-1.901,-0.684;7.9409,0.2578,-0.6518;8.1791,0.2946,0.4312;9.1222,0.4296,-1.4158;9.3139,1.3832,-1.3834)|\",7.61374553699\r\n\"[H]/N=C(/S[C@]([H])(C([H])([H])[H])C([H])([H])C([H])([H])[H])N([H])/N=C(\\[H])C1=C([H])C([H])=NC([H])=C1[H] 
|(4.0774,2.0139,-3.6171;4.2436,1.475,-2.7629;4.9152,2.1655,-1.9342;5.4012,1.5568,-0.3206;4.4101,-0.0136,-0.2294;4.4741,-0.4339,-1.2373;2.9486,0.2838,0.1072;2.3603,-0.6412,0.0901;2.517,0.9639,-0.631;2.8547,0.7335,1.1022;5.118,-0.9355,0.7784;6.1757,-1.0169,0.5007;5.0903,-0.4784,1.7775;4.507,-2.3416,0.8422;5.0898,-2.979,1.5162;4.504,-2.8167,-0.1462;3.4764,-2.3277,1.2117;5.3996,3.4655,-2.1223;5.8351,3.958,-1.3441;5.1848,4.1012,-3.2873;5.5928,5.3157,-3.4066;6.0994,5.8303,-2.5794;5.4056,6.0733,-4.6434;4.798,5.529,-5.7869;4.4424,4.5045,-5.7825;4.6655,6.3255,-6.9189;4.1977,5.9218,-7.8152;5.083,7.5991,-6.9994;5.6602,8.1114,-5.9067;5.9947,9.1449,-5.9817;5.8441,7.4029,-4.7199;6.3222,7.8795,-3.8678)|\",4.443619178665\r\n\"[H]SC1=N[C@@]([H])(C([H])([H])N([H])C2=C([H])C([H])=C(F)C([H])=C2[H])N=N1 |(-2.1111,-6.9961,-0.6683;-3.4412,-6.9886,-0.4447;-3.4949,-5.3067,0.0421;-2.55,-4.4482,0.1254;-3.2223,-3.248,0.5802;-2.8507,-2.9629,1.5745;-3.0394,-2.0671,-0.3847;-3.5777,-1.209,0.0306;-3.51,-2.32,-1.3506;-1.6258,-1.7673,-0.4926;-1.0702,-2.6125,-0.5699;-1.1821,-0.738,-1.337;-2.0335,0.2702,-1.8201;-3.0913,0.2608,-1.5807;-1.5348,1.303,-2.6178;-2.1899,2.0833,-2.9914;-0.1868,1.325,-2.9427;0.296,2.3218,-3.7206;0.6802,0.3368,-2.486;1.7302,0.3761,-2.7574;0.1814,-0.6858,-1.6865;0.8557,-1.4554,-1.3182;-4.6551,-3.5878,0.7217;-4.814,-4.7898,0.411)|\",2.95515641643\r\n\"[H]C1=C([H])C([H])=C([H])[C@]2([H])C1/N=C1/NC(=S)N([H])N12 |(2.5888,-1.038,0.0961;2.2138,-0.0238,0.1823;0.9806,0.3401,-0.2594;0.356,-0.3928,-0.7628;0.413,1.6643,-0.0178;-0.6152,1.8382,-0.3214;1.1214,2.6476,0.5712;0.702,3.6356,0.7391;2.5737,2.419,0.8536;3.1461,2.8876,0.0301;2.9771,0.9462,0.9181;3.969,0.7107,1.7318;4.2237,1.9346,2.3312;5.2945,2.4272,2.8813;5.0452,3.7894,3.002;5.9531,4.9368,3.7683;3.8287,4.0858,2.3301;3.2524,4.8,2.7649;3.1843,2.8531,2.1272)|\",2.397323022905\r\n\"[H]OC1=NC(=O)N([C@@]2([H])C([H])=C(C([H])([H])[H])[C@](C([H])([H])[H])(C([H])([H])O[H])C2([H])[H])C([H])([H])C1([H])[H] 
|(5.52,-2.213,6.1172;6.1558,-2.3241,5.3933;5.6915,-1.7069,4.2835;6.4045,-1.7731,3.2305;5.9851,-1.081,2.0649;6.7961,-0.7941,1.1973;4.6416,-0.7603,1.9535;4.163,-0.1433,0.7094;4.9668,-0.3218,-0.0096;2.8562,-0.6926,0.19;2.6644,-1.7607,0.1193;2.0331,0.2619,-0.2586;0.7074,0.042,-0.9262;0.4554,-1.0222,-0.966;0.713,0.4212,-1.9569;-0.1042,0.5635,-0.4021;2.6456,1.6562,-0.0958;1.6712,2.6788,0.5169;2.1806,3.6286,0.7207;1.2602,2.3096,1.4636;0.8301,2.8928,-0.1532;3.1178,2.2111,-1.4657;3.5197,3.2263,-1.3129;2.2602,2.3034,-2.1417;4.0502,1.3877,-2.1477;4.9025,1.4495,-1.6903;3.8591,1.373,0.8361;4.7351,1.9947,0.6139;3.5789,1.5924,1.874;3.6922,-1.0959,3.0044;2.8374,-0.4168,2.9386;3.3015,-2.1181,2.8786;4.3696,-0.9754,4.3699;3.7258,-1.3951,5.1525;4.5563,0.078,4.6192)|\",5.567449381229999\r\n\"[H]NC1=C([H])C([H])N([C@@]2([H])C([H])=C(C([H])([H])[H])[C@](C([H])([H])[H])(C([H])([H])O[H])C2([H])[H])C(O[H])=N1 |(4.2662,-0.7769,6.7281;5.142,-0.6754,6.2063;4.8732,-0.5798,4.9509;3.5399,-0.6094,4.333;2.6602,-0.7159,4.9597;3.3984,-0.5142,2.9966;2.441,-0.5416,2.492;4.4994,-0.3915,2.1572;4.3837,-0.1972,0.6849;5.2468,-0.7192,0.2665;3.0915,-0.7237,0.1192;2.849,-1.7834,0.131;2.3151,0.2337,-0.4044;1.0016,0.0222,-1.095;0.6847,-1.0238,-1.0399;1.0844,0.3001,-2.1527;0.208,0.641,-0.6572;2.933,1.6243,-0.2377;2.1603,2.4603,0.8066;2.5975,3.4622,0.898;2.1951,1.9919,1.7957;1.1076,2.5824,0.5267;2.9362,2.4123,-1.5678;3.5202,3.3377,-1.4341;1.913,2.7105,-1.8193;3.3905,1.6718,-2.6905;4.3171,1.4282,-2.5409;4.3768,1.2963,0.2474;5.0989,1.42,-0.5697;4.7089,1.9556,1.0546;5.7215,-0.3412,2.7911;6.7598,-0.1769,1.9375;7.5451,-0.1724,2.5159;5.9457,-0.4283,4.0521)|\",5.13750949744\r\n\"[H]OC1=C([H])/C(=C(\\[H])N([H])N2C([H])([H])C([H])([H])N(N([H])[H])C([H])([H])C2([H])[H])C([H])=C([H])C1=O 
|(0.4949,0.5517,-4.0083;-0.2985,0.6383,-3.4561;0.0869,0.707,-2.1511;1.3765,0.6574,-1.7252;2.1857,0.5586,-2.4497;1.7268,0.7375,-0.3222;3.0548,0.6949,0.0374;3.8217,0.5759,-0.7225;3.5825,0.8355,1.2836;2.9639,0.8317,2.094;4.9179,0.4628,1.507;5.0149,-0.8671,2.1287;4.4923,-0.8899,3.1027;4.5348,-1.5914,1.4635;6.4833,-1.2325,2.3289;6.9921,-1.3196,1.3537;6.543,-2.2067,2.83;7.126,-0.2276,3.1793;8.5485,-0.5946,3.2783;8.5441,-1.4587,3.8281;8.9376,0.0932,3.9297;7.0638,1.0741,2.5097;7.5567,1.8197,3.146;7.5902,1.0534,1.5399;5.6056,1.4797,2.3121;5.5465,2.4322,1.7781;5.1229,1.6046,3.2986;0.6441,0.8638,0.6243;0.8649,0.916,1.6892;-0.6485,0.9144,0.2163;-1.4665,1.0057,0.9247;-1.0397,0.8442,-1.1965;-2.2141,0.8892,-1.5661)|\",3.447682485835\r\n\"[H]OC1=NC(S[H])=N[C@]1([H])/C(=C(\\[H])C1=C([H])C([H])=C([H])C([H])=C1[H])C([H])([H])[H] |(3.7635,1.5671,-0.5454;3.9805,1.5393,-1.4966;3.5747,0.3653,-1.9748;3.5819,0.0613,-3.2269;3.0398,-1.2493,-3.2119;2.8977,-2.017,-4.7924;2.3694,-3.1601,-4.3082;2.6919,-1.7842,-2.0882;3.0553,-0.7737,-1.0947;3.9024,-1.1518,-0.5019;1.958,-0.3145,-0.1492;2.2914,-0.0808,1.1397;3.2937,-0.3827,1.4498;1.4685,0.4643,2.2351;0.4934,1.4616,2.0523;0.3325,1.8841,1.0658;-0.2457,1.9452,3.1314;-0.991,2.7188,2.9667;-0.0237,1.4501,4.4174;-0.5986,1.8306,5.2571;0.9559,0.4751,4.6197;1.1444,0.0899,5.6182;1.6974,-0.0037,3.5427;2.4594,-0.762,3.7073;0.5961,-0.1272,-0.7646;-0.1877,-0.073,-0.0067;0.3866,-0.9692,-1.4319;0.5424,0.7849,-1.3757)|\",5.134788358934999\r\n\"[H]OC1=NC(S[H])=N[C@]1([H])/C([H])=C(\\[H])C1=C([H])C([H])=C([H])C([H])=C1[H] 
|(3.8438,0.0556,-2.4506;4.7653,0.0671,-2.7675;5.5678,-0.0906,-1.7159;6.8365,-0.2902,-1.8125;7.2175,-0.4238,-0.4524;8.9345,-0.7185,-0.1842;8.8252,-0.7669,1.1596;6.3223,-0.3317,0.4743;5.0872,-0.053,-0.2636;4.7534,0.9709,-0.0335;3.9606,-1.0152,-0.0085;4.2345,-2.0675,-0.0468;2.6914,-0.6279,0.2052;2.489,0.4433,0.2625;1.5127,-1.484,0.4028;1.5669,-2.8895,0.3554;2.5114,-3.3904,0.1647;0.4205,-3.6529,0.5509;0.4841,-4.737,0.5108;-0.809,-3.0334,0.7953;-1.7023,-3.6332,0.9458;-0.8805,-1.6411,0.8436;-1.8302,-1.1485,1.0336;0.2691,-0.8768,0.6492;0.2083,0.2084,0.6909)|\",4.944308663585\r\n\"[H]C1=C([H])C(/C([H])=N/N(C2=C([H])C([H])=C([H])C([H])=C2[H])C([H])([H])[H])=C([H])S1 |(4.776,4.8794,2.8532;4.7791,4.3958,1.8855;3.9927,3.3719,1.4521;3.2284,2.8926,2.0517;4.2753,2.9781,0.0968;3.5888,1.9135,-0.6259;3.9154,1.7165,-1.6505;2.6468,1.245,-0.054;2.0432,0.1948,-0.6947;2.092,0.08,-2.1305;1.4539,1.0176,-2.9543;0.937,1.8608,-2.5045;1.491,0.8673,-4.3401;0.9915,1.5938,-4.9755;2.1639,-0.2168,-4.9108;2.1902,-0.3324,-5.9909;2.8034,-1.1482,-4.0917;3.3303,-1.9907,-4.5316;2.7713,-0.9981,-2.7034;3.2649,-1.7119,-2.0505;0.7818,-0.1855,-0.0662;0.5065,-1.1906,-0.3981;-0.0435,0.5027,-0.3092;0.9284,-0.1881,1.016;5.285,3.735,-0.4528;5.7004,3.6584,-1.449;5.8948,4.9193,0.6557)|\",4.636820012520001\r\n\"[H]C(C1=C(C([H])([H])[H])NC2=C1C([H])=C([H])C([H])=C2[H])N([H])N(C([H])([H])[H])C([H])([H])[H] 
|(3.5667,-2.9001,-0.188;2.9188,-2.0274,-0.2085;3.4631,-0.7745,-0.1846;2.8234,0.549,-0.1924;1.3498,0.8309,-0.2337;0.8282,0.4163,0.6419;0.8764,0.4126,-1.1335;1.1918,1.9119,-0.2384;3.6887,1.5307,-0.1559;4.9646,0.9424,-0.1235;4.8933,-0.4724,-0.1402;6.0608,-1.2349,-0.1125;6.029,-2.3228,-0.1275;7.291,-0.5724,-0.0678;8.2102,-1.1521,-0.0468;7.3562,0.8289,-0.0501;8.326,1.3183,-0.0148;6.1924,1.6008,-0.078;6.2293,2.6862,-0.0654;1.6082,-2.3714,-0.2875;0.8861,-1.6583,-0.2061;1.2265,-3.7117,-0.1057;0.6099,-3.9066,1.2084;0.3778,-4.9683,1.3355;-0.3245,-3.3289,1.3335;1.317,-3.6073,1.9861;0.361,-4.1425,-1.2034;0.1392,-5.2067,-1.0784;0.8859,-3.999,-2.1505;-0.5962,-3.5907,-1.2391)|\",4.225928098265001\r\n\"[H]/N=C(/O[H])[C@@]([H])(N([H])[H])C([H])([H])SC(=O)C([H])([H])[H] |(6.8727,0.5866,-3.8194;5.8807,0.6488,-4.0552;5.2559,1.2749,-3.1435;3.9205,1.4757,-3.3238;3.5303,1.82,-2.5001;5.8715,1.9085,-1.8833;6.9491,1.7111,-1.9416;5.7178,3.3665,-1.8017;4.7306,3.6209,-1.7815;6.1006,3.7977,-2.6421;5.4202,1.2601,-0.5614;6.012,1.6843,0.2524;5.5487,0.1735,-0.589;3.6536,1.5338,-0.1186;3.0117,-0.1723,-0.3017;3.7238,-1.1108,-0.5548;1.5144,-0.2657,-0.1123;1.2818,-1.1865,0.4294;1.046,-0.3261,-1.102;1.1044,0.5981,0.417)|\",5.58921848927\r\n\"[H]C1=C([H])C([H])=C(C2=C([H])C([H])([H])[C@]([H])(C([H])([H])C([H])([H])[H])C([H])([H])C2=O)C([H])=C1[H] 
|(0.194,7.8663,4.7016;0.5905,6.9318,4.3135;0.8301,6.7832,2.9469;0.6157,7.5994,2.2618;1.3324,5.5802,2.4527;1.4902,5.4628,1.3838;1.6137,4.5039,3.3113;2.197,3.2496,2.765;3.1459,3.2741,1.8006;3.5271,4.2414,1.4752;3.7457,2.0653,1.1363;4.7427,1.8871,1.5729;3.9296,2.288,0.0763;2.8813,0.7985,1.2837;3.5051,-0.0735,1.0418;1.693,0.7977,0.2981;1.0332,1.6493,0.5095;2.0923,0.966,-0.7117;0.8733,-0.4977,0.2976;0.1043,-0.4685,-0.4822;0.3621,-0.6614,1.2523;1.5098,-1.3703,0.1036;2.453,0.6866,2.7572;3.3513,0.5576,3.3815;1.8124,-0.1778,2.9509;1.7301,1.9239,3.2822;0.8449,1.8199,4.1183;1.3547,4.6629,4.6836;1.5449,3.8405,5.3626;0.8534,5.8664,5.1768;0.6658,5.9708,6.2423)|\",4.786482630295\r\n\"[H]O[C@]1([H])C([H])([H])[C@@]([H])(C([H])([H])C([H])([H])[H])C([H])([H])C(=O)[C@@]1([H])C1=C([H])C([H])=C([H])C([H])=C1[H] |(6.7185,1.987,0.4019;6.3297,1.3145,0.9849;4.9229,1.3382,0.7969;4.5503,0.5736,1.4873;4.5467,0.9707,-0.6465;5.0204,1.7049,-1.3178;4.9959,0.0011,-0.886;3.0276,0.9645,-0.915;2.8811,0.8396,-1.9969;2.2692,-0.1852,-0.2162;2.3605,-0.0928,0.8746;1.1997,-0.0602,-0.4317;2.704,-1.5903,-0.6477;2.0602,-2.3508,-0.1925;3.7341,-1.8155,-0.3499;2.6391,-1.7084,-1.7367;2.4365,2.3413,-0.5185;1.3473,2.3573,-0.619;2.8535,3.1137,-1.1807;2.7934,2.6914,0.9199;1.9478,2.8962,1.7672;4.3076,2.7146,1.2106;4.732,3.4478,0.5041;4.6716,3.1356,2.6168;5.5302,4.2207,2.8245;5.917,4.7686,1.9673;5.8977,4.6127,4.1127;6.5669,5.458,4.2502;5.4064,3.9189,5.2182;5.6884,4.2202,6.2236;4.5472,2.8351,5.0243;4.1552,2.2916,5.88;4.1821,2.448,3.736;3.5016,1.6131,3.6014)|\",5.825957539205\r\n\"[H]O[C@@]1([H])C([H])([H])C(=O)C([H])([H])[C@]([H])(C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])C1([H])[H] 
|(8.5473,-2.4087,-3.0755;7.5905,-2.4014,-3.2428;6.9559,-3.0019,-2.1186;5.8877,-2.9316,-2.3515;7.3338,-4.4927,-1.9857;8.4318,-4.5572,-1.9094;7.0213,-5.0615,-2.8651;6.7455,-5.127,-0.7292;6.1132,-6.1651,-0.7581;7.0005,-4.3484,0.5552;6.481,-4.8493,1.3775;8.0803,-4.3888,0.764;6.5701,-2.8649,0.4222;6.9456,-2.3329,1.3072;5.031,-2.7384,0.4319;4.5995,-3.2677,-0.4297;4.6577,-3.2723,1.3173;4.4983,-1.2992,0.4659;4.9718,-0.7572,1.2985;4.788,-0.7654,-0.4496;2.973,-1.229,0.6195;2.4984,-1.7661,-0.215;2.6749,-1.7648,1.5329;2.4283,0.2042,0.6737;2.9022,0.7398,1.5088;2.7264,0.7402,-0.2387;0.9052,0.2652,0.8276;0.5481,1.3005,0.8672;0.58,-0.2352,1.7481;0.4017,-0.2291,-0.0122;7.2471,-2.2402,-0.8185;8.3376,-2.2348,-0.6595;6.948,-1.1946,-0.946)|\",6.027321788575\r\n\"[H]C1=C([H])C([H])=C(/C([H])=C(\\[H])[C@]([H])(C([H])([H])C([H])([H])[H])C([H])([H])C([H])(OC([H])([H])[H])OC([H])([H])[H])C([H])=C1[H] |(-1.0754,-5.6375,-4.8269;-0.566,-5.0707,-4.0522;0.1135,-3.8916,-4.3731;0.1314,-3.5375,-5.4008;0.7651,-3.1619,-3.3835;1.2771,-2.2425,-3.6521;0.7589,-3.5918,-2.043;1.4377,-2.8645,-0.9584;1.2389,-3.2477,0.0432;2.2702,-1.8181,-1.0732;2.5174,-1.4258,-2.0618;2.9251,-1.0994,0.0808;2.5537,-1.5429,1.0141;2.5212,0.3963,0.0646;2.9607,0.8765,-0.8218;1.4331,0.4519,-0.0666;2.9089,1.1819,1.3231;2.5383,2.2116,1.2658;3.9941,1.2345,1.4633;2.477,0.725,2.2223;4.4606,-1.2803,0.0436;4.9447,-0.6487,0.7963;4.853,-0.954,-0.928;4.9094,-2.7267,0.2619;4.4814,-3.3841,-0.5066;6.3162,-2.7374,0.1781;6.8706,-4.0427,0.0891;7.9531,-3.9211,0.0013;6.4946,-4.574,-0.7983;6.6396,-4.6406,0.9788;4.4337,-3.2965,1.4723;4.8701,-2.6615,2.6679;4.6316,-3.3483,3.4842;4.3494,-1.7102,2.8491;5.9506,-2.4769,2.6543;0.0645,-4.7759,-1.7377;0.0439,-5.1241,-0.7074;-0.5881,-5.5089,-2.7279;-1.1157,-6.4219,-2.4641)|\",5.099413558369999\r\n\"[H]C1=C(C#C/C([H])=N/N2C([H])([H])C([H])([H])OC([H])([H])C2([H])[H])C([H])([H])C([H])([H])C([H])([H])C1([H])[H] 
|(-2.3862,-0.3718,-0.4269;-1.3268,-0.181,-0.2671;-0.4932,-1.2343,-0.1215;-0.9704,-2.5721,-0.2078;-1.304,-3.7426,-0.2681;-1.7882,-5.0711,-0.3643;-2.8435,-5.1857,-0.6226;-0.9943,-6.0725,-0.1419;-1.4501,-7.3496,-0.1673;-0.3731,-8.2996,-0.4576;-0.0834,-8.2427,-1.5201;0.4911,-8.0159,0.1489;-0.8301,-9.7175,-0.1349;-1.0004,-9.8196,0.9495;-0.0727,-10.4436,-0.4433;-2.0179,-10.0398,-0.8484;-3.0671,-9.1625,-0.4789;-3.9488,-9.4674,-1.05;-3.2901,-9.2707,0.5955;-2.7336,-7.6973,-0.7773;-3.5242,-7.0731,-0.3481;-2.7099,-7.5173,-1.8665;0.9983,-1.0549,0.1336;1.3194,-1.7724,0.8984;1.5486,-1.3264,-0.7794;1.3509,0.3778,0.558;1.0457,0.5347,1.6023;2.4381,0.5182,0.5237;0.6419,1.405,-0.3328;0.943,2.4254,-0.0675;0.9449,1.2461,-1.3771;-0.8828,1.2566,-0.2201;-1.3831,1.8191,-1.0212;-1.2369,1.714,0.7188)|\",4.220485821255\r\n\"[H]C(=O)C([H])([H])[C@@]([H])(C(=C([H])[H])C([H])([H])[H])/C([H])=C(\\[H])C([H])([H])C(=O)OC([H])([H])[H] |(6.3774,0.9522,1.1642;5.2963,0.7592,0.9758;4.4605,1.1515,1.7603;5.0079,0.0326,-0.3202;5.7228,-0.7976,-0.4124;5.272,0.7331,-1.1289;3.5512,-0.4406,-0.4616;2.9139,0.3722,-0.0884;3.1306,-0.7441,-1.8986;3.9954,-0.9648,-2.8944;3.6485,-1.1997,-3.8975;5.0724,-0.9291,-2.7603;1.6388,-0.8121,-2.1197;1.3952,-1.0822,-3.1515;1.1659,0.1547,-1.8976;1.1764,-1.5485,-1.4508;3.3089,-1.6591,0.412;3.8245,-2.5683,0.1099;2.5182,-1.6626,1.4868;2.0094,-0.7392,1.7636;2.2269,-2.8124,2.4255;1.1904,-3.1528,2.2694;2.2702,-2.4646,3.4629;3.107,-4.0416,2.293;3.4426,-4.577,1.2578;3.4543,-4.5118,3.513;4.2456,-5.7115,3.5068;4.4304,-5.9448,4.5557;5.1875,-5.5472,2.9769;3.7038,-6.5272,3.0204)|\",5.99194698801\r\n\"[H]O[C@@]([H])([C@@]([H])(O[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C(=O)C([H])([H])[H])[C@]([H])(O[H])C([H])([H])[H] 
|(2.8089,-7.8292,-6.7047;2.754,-7.0985,-6.072;3.0087,-5.8701,-6.7758;2.4818,-5.8913,-7.7458;2.3388,-4.7607,-5.9447;2.4088,-3.827,-6.5121;0.9461,-5.0643,-5.8602;0.9128,-5.9943,-5.5742;2.9586,-4.5494,-4.5542;2.9212,-5.4979,-3.9994;4.0176,-4.2902,-4.6732;2.2361,-3.4532,-3.7594;2.3191,-2.5006,-4.3046;1.1681,-3.696,-3.7305;2.7503,-3.2622,-2.324;2.0767,-2.566,-1.8031;2.6742,-4.2182,-1.7825;4.1875,-2.7343,-2.2198;4.2803,-1.7957,-2.7829;4.8921,-3.4308,-2.6874;4.6248,-2.4954,-0.7742;4.5247,-3.4193,-0.181;3.965,-1.769,-0.2754;6.0651,-2.0163,-0.6292;6.8227,-1.9522,-1.5792;6.5104,-1.6188,0.7698;5.9755,-0.716,1.0923;7.5839,-1.4185,0.7741;6.2754,-2.4048,1.4978;4.5126,-5.6813,-7.0414;5.0378,-5.6732,-6.0818;4.7587,-4.3917,-7.6066;4.336,-4.3689,-8.481;5.0987,-6.791,-7.9165;5.0231,-7.7689,-7.4274;6.1562,-6.5916,-8.1128;4.5775,-6.8494,-8.883)|\",6.337531578145\r\n\"[H]C(=O)C([H])([H])[C@]([H])(C(=C([H])[H])C([H])([H])[H])C([H])([H])C([H])(OC([H])([H])[H])OC([H])([H])[H] |(0.0306,0.0192,-1.3324;0.6568,-0.3616,-2.1688;0.1479,-0.9489,-3.0992;2.1411,-0.0906,-2.0467;2.3006,0.9872,-2.2038;2.6435,-0.6154,-2.866;2.742,-0.5137,-0.6802;2.5394,-1.5831,-0.5469;2.1074,0.215,0.5027;1.5562,-0.4734,1.5089;1.112,0.0256,2.367;1.5386,-1.5601,1.5156;2.1338,1.7274,0.5125;1.7226,2.1196,1.4471;1.5464,2.1536,-0.3115;3.1528,2.1202,0.4037;4.2782,-0.3459,-0.7195;4.5553,0.7138,-0.7595;4.6745,-0.7956,-1.6375;4.9936,-0.9676,0.4822;4.6435,-0.5121,1.4185;4.7109,-2.3442,0.6734;5.0945,-3.2094,-0.3904;4.4281,-3.1185,-1.2596;5.0175,-4.2267,0.0011;6.1253,-3.0207,-0.7116;6.3685,-0.7186,0.3053;7.1534,-0.9975,1.4569;8.1861,-0.7487,1.201;7.0915,-2.0536,1.7451;6.8355,-0.3813,2.3116)|\",5.962014464455001\r\n\"[H]C([H])=C(C([H])([H])[H])[C@@]([H])(C([H])([H])C(=O)OC([H])([H])[H])C([H])([H])C([H])(OC([H])([H])[H])OC([H])([H])[H] 
|(2.137,-1.9918,-2.7779;2.6556,-1.7337,-1.858;3.6089,-2.2276,-1.6939;2.1375,-0.8553,-0.9931;0.8138,-0.178,-1.2572;0.344,-0.5459,-2.1744;0.9452,0.9087,-1.3499;0.1097,-0.3337,-0.4284;2.7967,-0.4903,0.3365;2.4681,0.5225,0.5992;2.258,-1.4487,1.4434;2.701,-2.4406,1.3185;1.1709,-1.537,1.351;2.5904,-0.9565,2.8358;3.5338,-1.3173,3.5081;1.7107,-0.0092,3.2389;1.9793,0.5724,4.5249;1.1856,1.3021,4.6883;2.9575,1.0608,4.5285;1.9641,-0.1943,5.3041;4.3368,-0.4788,0.325;4.7318,-1.4484,0.0006;4.6926,-0.323,1.3484;4.9336,0.6044,-0.5744;4.5473,0.5143,-1.5985;4.5557,1.8518,-0.0282;4.8226,2.9545,-0.8829;5.8933,3.049,-1.0993;4.4752,3.8497,-0.3609;4.2795,2.8596,-1.8356;6.334,0.4946,-0.7445;7.1127,0.5425,0.4467;6.8191,1.3825,1.0873;8.1494,0.6782,0.1278;7.0415,-0.3894,1.0233)|\",6.759308046419999\r\n\"[H]O[C@@]([H])([C@@]([H])(O[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])=C([H])[H])[C@]([H])(O[H])C([H])([H])[H] |(11.476,6.0421,-0.3479;11.1205,5.6673,-1.173;9.712,5.5608,-0.9581;9.2539,6.5628,-0.994;9.1725,4.7546,-2.1506;9.3489,5.3892,-3.0369;9.8884,3.5326,-2.3028;10.825,3.7741,-2.1937;7.6801,4.4375,-2.0749;7.4949,3.768,-1.2243;7.1322,5.3693,-1.8725;7.1408,3.7861,-3.3557;7.3109,4.4639,-4.2056;7.7263,2.8828,-3.5582;5.6456,3.4513,-3.2672;5.4764,2.7618,-2.4264;5.0904,4.3683,-3.0219;5.0482,2.8417,-4.5462;3.9546,2.8034,-4.4401;5.2472,3.5136,-5.3946;5.56,1.4327,-4.8806;6.6478,1.4516,-5.0231;5.371,0.7692,-4.0248;4.8989,0.8465,-6.132;3.8046,0.8485,-6.0013;5.0785,1.5175,-6.9874;5.3276,-0.5457,-6.5216;4.8379,-0.9402,-7.4138;6.2206,-1.3152,-5.897;6.4594,-2.3105,-6.2622;6.7491,-0.9914,-5.0046;9.4438,5.0157,0.4564;8.3581,5.0319,0.6345;10.0796,5.9948,1.3034;10.1288,5.6279,2.1994;9.9933,3.6211,0.7462;9.5361,2.8694,0.0996;9.7984,3.3482,1.7923;11.0745,3.5984,0.579)|\",7.58653415194\r\n\"[H]C1([H])C([H])([H])[C@@]2([H])C([H])([H])[C@]1([H])[C@]([H])(C([H])([H])Cl)[C@]2([H])C([H])([H])Cl 
|(-0.9631,-1.1488,-1.6151;-1.1898,-0.7714,-0.612;-2.1552,-1.1972,-0.3177;-1.2146,0.7906,-0.5403;-2.1928,1.1568,-0.2102;-1.001,1.2649,-1.5044;-0.1323,1.1104,0.5167;-0.2129,2.1142,0.9403;-0.2973,-0.0711,1.4952;0.4529,-0.0963,2.2948;-1.2888,-0.1086,1.9604;-0.0961,-1.152,0.4125;-0.1441,-2.1924,0.7424;1.2874,-0.7672,-0.1809;1.3506,-1.0731,-1.2309;2.4623,-1.4272,0.5415;2.4211,-1.2974,1.6248;3.4264,-1.0699,0.1776;2.4932,-3.2298,0.2624;1.2618,0.826,-0.1081;1.3133,1.2283,-1.1257;2.4178,1.4534,0.6717;3.3906,1.1719,0.2668;2.3928,1.2123,1.7363;2.3797,3.275,0.5775)|\",7.92939760357\r\n\"[H]C1=C(C([H])([H])[H])[C@@]([H])(OC(=O)C([H])([H])Cl)[C@@]2([H])OC2([H])C1=O |(-0.4886,1.2461,-2.6486;0.5926,1.2381,-2.5293;1.1916,0.163,-1.9537;0.4246,-1.0514,-1.5109;-0.6378,-0.9739,-1.7561;0.5212,-1.2039,-0.4275;0.829,-1.95,-1.9943;2.7208,0.11,-1.6913;2.9413,-0.2891,-0.6968;3.2958,-0.7743,-2.6746;4.5247,-1.2612,-2.3764;5.14,-1.0042,-1.3709;5.0422,-2.161,-3.4854;6.0436,-2.4957,-3.2217;5.0526,-1.6263,-4.4371;4.003,-3.6246,-3.7052;3.1236,1.5609,-1.8782;2.7922,2.1727,-1.0374;4.0387,2.2799,-2.7099;2.7189,1.9952,-3.2202;2.7567,1.212,-3.9756;1.3108,2.499,-3.0045;0.7901,3.5866,-3.0933)|\",4.982404602655\r\n\"[H]O[C@@]1([H])C([H])=C([C@@]([H])(OC([H])([H])OC([H])([H])[H])C([H])([H])[H])[C@]([H])(OC([H])([H])OC([H])([H])[H])C1([H])[H] 
|(6.0197,-0.8795,-4.0111;5.2779,-0.9015,-4.6378;4.1799,-0.2459,-4.0056;3.3419,-0.4186,-4.6919;3.8561,-0.7674,-2.6251;3.7806,-1.828,-2.4106;3.6394,0.2097,-1.7381;3.2434,0.0627,-0.2951;2.2854,0.5822,-0.154;3.0606,-1.3347,-0.0177;2.0951,-1.6168,0.9659;2.2054,-2.6923,1.1648;2.2725,-1.0441,1.8853;0.7827,-1.3018,0.5897;0.2952,-2.0827,-0.4949;-0.7486,-1.7963,-0.6422;0.3448,-3.1575,-0.2613;0.8592,-1.8938,-1.4153;4.2907,0.6529,0.6613;4.4264,1.7257,0.4874;5.2525,0.1486,0.523;3.9786,0.5229,1.7031;3.7579,1.5763,-2.3897;4.349,2.284,-1.7959;2.4214,2.1105,-2.5158;2.3797,3.5053,-2.6945;3.0903,3.8329,-3.4641;1.3508,3.7239,-3.0126;2.7187,4.2383,-1.5461;1.7467,4.1578,-0.5123;2.1012,4.7917,0.304;1.6205,3.131,-0.148;0.77,4.5297,-0.8579;4.4047,1.2683,-3.7579;4.0071,1.9008,-4.5573;5.4857,1.4481,-3.7068)|\",6.748423492400001\r\n\"[H]OC([H])([H])C([H])([H])[C@@]([H])(O[H])C(=C([H])[H])[C@]([H])(OC([H])([H])OC([H])([H])[H])C([H])([H])[H] |(0.7775,3.2577,3.1107;1.3219,4.0648,3.0834;2.6496,3.6127,2.8877;3.2633,4.5103,2.7538;3.0328,3.0911,3.7837;2.8179,2.6967,1.6664;3.8906,2.5324,1.5042;2.4365,3.2096,0.7719;2.1059,1.3374,1.8193;2.492,0.8501,2.725;0.7131,1.5352,2.1169;0.3135,1.9402,1.3276;2.301,0.398,0.6305;1.2656,-0.2052,0.0407;1.4181,-0.8747,-0.797;0.2519,-0.0696,0.4018;3.7335,0.1469,0.1804;4.194,1.1049,-0.0928;3.7203,-0.6891,-0.9853;4.7456,-0.4149,-1.9098;5.7311,-0.3918,-1.4271;4.6988,-1.241,-2.633;4.622,0.8267,-2.5471;3.4788,0.9245,-3.3882;3.5362,1.897,-3.8827;2.5476,0.8596,-2.8137;3.4819,0.1316,-4.1517;4.5734,-0.5298,1.275;4.6548,0.0966,2.17;5.5892,-0.7221,0.913;4.1188,-1.4849,1.5562)|\",7.19469020722\r\n\"[H]OC([H])([H])[C@@]1([H])O[C@]1([H])C(=C([H])[H])[C@]([H])(OC([H])([H])OC([H])([H])[H])C([H])([H])[H] 
|(2.5922,3.8903,-3.7865;2.9324,3.1391,-4.2997;2.4753,1.9653,-3.6443;1.3778,1.88,-3.6846;2.9018,1.1209,-4.1968;2.9274,1.9322,-2.2052;4.006,1.8986,-2.0435;2.2465,2.8672,-1.3461;2.0541,1.4668,-1.1048;1.0602,1.107,-1.3871;2.5953,0.8955,0.1661;3.2642,1.6494,1.0417;3.66,1.2226,1.957;3.419,2.7074,0.8566;2.3387,-0.5934,0.3626;1.3052,-0.8204,0.0636;2.5171,-0.9936,1.7261;1.406,-0.7844,2.5366;1.7401,-0.9617,3.5694;1.0267,0.2501,2.4494;0.3868,-1.6962,2.1778;-0.8156,-1.4692,2.8848;-1.5419,-2.2018,2.5251;-1.2108,-0.4558,2.7074;-0.6847,-1.6032,3.9706;3.2872,-1.4578,-0.4707;3.2139,-1.2056,-1.534;3.039,-2.5152,-0.341;4.3201,-1.295,-0.1461)|\",6.7783560159550005\r\n\"[H]OC([H])([H])/C([H])=C(\\[H])C(=C([H])[H])[C@]([H])(OC([H])([H])OC([H])([H])[H])C([H])([H])[H] |(-0.3818,2.901,-5.3195;-0.9789,3.5732,-4.9547;-0.5031,3.8813,-3.6436;0.4753,4.387,-3.6758;-1.2287,4.6015,-3.2459;-0.4303,2.667,-2.7642;-1.3485,2.0834,-2.6931;0.6755,2.2756,-2.1133;1.5953,2.8456,-2.2492;0.7918,1.1059,-1.2222;-0.2239,0.6276,-0.488;-1.2078,1.088,-0.4968;-0.1007,-0.2441,0.1502;2.1732,0.4583,-1.15;2.1502,-0.3199,-0.3693;3.1796,1.4189,-0.8219;3.1172,1.9093,0.5094;2.2909,2.6202,0.6314;2.9718,1.0553,1.2005;4.2781,2.6064,0.8029;5.4433,1.7934,0.8465;6.2602,2.4373,1.1802;5.6822,1.3781,-0.1387;5.3211,0.9659,1.5631;2.5961,-0.1733,-2.475;1.8874,-0.9547,-2.7678;3.593,-0.6148,-2.3776;2.629,0.5794,-3.2691)|\",5.518468888139999\r\n\"[H]OC([H])([H])C(=C([H])[H])[C@]([H])(OC([H])([H])OC([H])([H])[H])C([H])([H])[H] 
|(0.958,-0.1915,-3.5042;1.1103,0.7202,-3.2082;1.8774,0.6474,-2.019;2.8809,0.2307,-2.2192;2.0359,1.6894,-1.7136;1.2316,-0.1393,-0.8926;-0.0337,-0.5591,-0.9584;-0.6561,-0.3233,-1.8159;-0.478,-1.129,-0.1493;2.1207,-0.3928,0.3183;3.1324,-0.6499,-0.0292;1.6356,-1.4641,1.1383;1.9728,-2.7514,0.6729;1.3599,-3.4393,1.272;1.7394,-2.8623,-0.3922;3.3368,-3.0565,0.7921;3.7771,-3.1705,2.1388;4.8357,-3.4384,2.1028;3.224,-3.9601,2.6706;3.6565,-2.228,2.6861;2.2115,0.8364,1.2269;2.578,1.7095,0.6769;2.8948,0.6392,2.0588;1.2226,1.0726,1.6323)|\",7.23278614629\r\n\"[H]O[C@]1([H])[C@]([H])(O[H])[C@@]([H])(C([H])([H])[H])O[C@]([H])(C2=C([H])C([H])=C([H])C([H])=C2[H])[C@]1([H])O[H] |(3.9371,2.1184,-2.7589;3.074,1.8856,-3.1465;2.5901,0.828,-2.3469;1.5066,0.779,-2.5129;2.8771,1.0956,-0.8556;2.3394,1.9985,-0.5453;4.2604,1.3987,-0.6873;4.7456,0.6469,-1.0822;2.4343,-0.0841,0.0305;2.9276,0.0425,0.9995;0.9199,-0.1584,0.2556;0.5759,0.732,0.7933;0.6791,-1.0394,0.8585;0.3495,-0.2183,-0.6776;2.9667,-1.327,-0.4607;2.6525,-1.6419,-1.8119;1.5618,-1.7082,-1.9555;3.2659,-2.9892,-2.1429;2.7271,-3.7697,-3.175;1.8324,-3.4316,-3.6941;3.3196,-4.9793,-3.5348;2.8868,-5.5755,-4.3337;4.4621,-5.4264,-2.8661;4.9229,-6.3703,-3.1441;5.0031,-4.6574,-1.8357;5.886,-5.0029,-1.3041;4.4114,-3.4438,-1.4758;4.8106,-2.8568,-0.6553;3.1793,-0.5332,-2.7596;2.8773,-0.7602,-3.7925;4.6044,-0.4284,-2.6823;4.9732,-1.3224,-2.7876)|\",6.353858409175\r\n\"[H]C([H])=C([H])C1=C(C([H])([H])[H])C([H])([H])C([H])([H])[C@]2([H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])C([H])([H])C([H])([H])[C@@]12C([H])([H])[H] 
|(-0.2187,-2.2405,3.0515;0.8606,-2.1529,2.9553;1.4586,-2.8893,3.4857;1.4164,-1.1919,2.208;0.7418,-0.5243,1.6688;2.8645,-0.9704,1.9808;3.7524,-0.8499,2.9956;3.3972,-0.9149,4.4637;2.3232,-0.8522,4.6445;3.8908,-0.0972,5.0062;3.7583,-1.8482,4.9201;5.2345,-0.6351,2.7615;5.5915,0.1592,3.433;5.7628,-1.5453,3.0925;5.6238,-0.3173,1.315;6.6957,-0.4984,1.1838;5.4683,0.7491,1.1151;4.7907,-1.1706,0.3499;4.8427,-2.2024,0.7349;5.3712,-1.2866,-1.1023;6.7427,-1.9989,-1.0386;7.0834,-2.2479,-2.0512;6.6808,-2.9346,-0.4694;7.5185,-1.3756,-0.5812;5.5959,0.0651,-1.8143;4.6667,0.5702,-2.0869;6.161,-0.0941,-2.7414;6.1782,0.7548,-1.193;4.4314,-2.1924,-1.9387;4.769,-2.1928,-2.9845;4.5412,-3.2271,-1.5807;2.9506,-1.8117,-1.8556;2.3532,-2.5379,-2.4222;2.7784,-0.8417,-2.3393;2.461,-1.7773,-0.4016;2.5345,-2.7878,0.0239;1.3969,-1.5153,-0.3782;3.2738,-0.7999,0.4918;2.9141,0.66,0.1027;1.8671,0.8658,0.3524;3.0264,0.8554,-0.9659;3.5211,1.3898,0.6455)|\",5.825957539205\r\n\"[H]C1N(O)[C@]([H])(C([H])([H])/C([H])=C(\\[H])C(=O)OC([H])([H])[H])C([H])([H])C1([H])[H] |(-4.4121,-3.4487,2.9254;-3.5657,-2.9952,2.4265;-2.4055,-3.5999,2.4734;-2.0922,-4.6837,3.0458;-1.3627,-2.858,1.6604;-1.2581,-3.4415,0.7368;-0.0293,-2.857,2.412;-0.1232,-2.2616,3.3285;0.1556,-3.8964,2.7154;1.0927,-2.3513,1.5607;1.275,-2.8665,0.6169;1.8931,-1.3216,1.8643;1.7843,-0.7561,2.7856;2.9817,-0.9154,0.9469;3.2381,-1.4249,-0.1272;3.6776,0.1274,1.4627;4.7623,0.599,0.6503;5.2073,1.4252,1.2058;4.3955,0.9424,-0.321;5.4965,-0.1951,0.4892;-2.0239,-1.4974,1.4045;-1.6715,-0.7745,2.1485;-1.7802,-1.0984,0.4168;-3.5424,-1.7529,1.5943;-4.0524,-1.907,0.631;-4.0464,-0.9094,2.0806)|\",4.669473674580001\r\n\"[H]C1N[C@]([H])(C([H])([H])/C([H])=C(\\[H])C(=O)OC([H])([H])[H])C([H])([H])C1([H])[H] 
|(-4.2537,-5.1318,1.6965;-3.5189,-4.3598,1.4609;-2.3592,-4.6824,1.0431;-1.593,-3.4451,0.7749;-1.5279,-3.3471,-0.3192;-0.1607,-3.5843,1.3251;-0.2002,-3.7003,2.4154;0.2572,-4.5152,0.9169;0.7184,-2.4327,0.9467;0.8439,-2.2311,-0.1179;1.3586,-1.6274,1.8041;1.2915,-1.759,2.8807;2.2047,-0.517,1.313;2.3893,-0.2174,0.1488;2.7679,0.1507,2.3511;3.6143,1.2486,1.9825;3.9854,1.6617,2.9212;3.0486,2.0027,1.4281;4.4442,0.9034,1.3597;-2.4146,-2.2733,1.3815;-1.9908,-1.9846,2.3503;-2.4112,-1.383,0.7472;-3.8216,-2.8806,1.5715;-4.5231,-2.5823,0.7789;-4.2955,-2.6171,2.5243)|\",6.03548520409\r\n\"[H]OC([H])([H])[C@@]1([H])OC(C([H])([H])[H])(C([H])([H])[H])O[C@]1([H])[C@@]([H])(OC(=O)C([H])([H])[H])C([H])([H])[H] |(4.8861,3.6857,-3.342;5.5067,3.4112,-2.6452;5.099,2.1116,-2.2681;5.362,1.3641,-3.0369;5.6463,1.8563,-1.353;3.5922,2.062,-2.0094;3.3472,2.725,-1.1727;2.9025,2.5611,-3.1587;2.2096,1.5015,-3.83;2.8778,1.1984,-5.1715;2.8417,2.0777,-5.8224;2.3616,0.3725,-5.671;3.9224,0.9133,-5.017;0.7435,1.8988,-3.9758;0.6541,2.8003,-4.5904;0.3231,2.0947,-2.9871;0.1782,1.0916,-4.4522;2.317,0.3466,-2.9927;3.042,0.6369,-1.8037;3.8698,-0.0851,-1.7248;2.1877,0.4483,-0.5475;2.8224,0.6098,0.3301;1.1614,1.4749,-0.5459;0.7366,1.9203,0.6649;1.1847,1.534,1.7214;-0.3698,2.9352,0.5018;-0.5979,3.382,1.4701;-1.2653,2.4397,0.1098;-0.0833,3.7094,-0.2161;1.533,-0.926,-0.4732;2.2973,-1.7114,-0.482;0.8725,-1.0757,-1.3313;0.9583,-1.0196,0.4522)|\",7.099450359545\r\n\"[H]O[C@@]([H])(C([H])([H])[H])[C@@]1([H])OC(C([H])([H])[H])(C([H])([H])[H])O[C@]1([H])C([H])([H])OC(=O)C([H])([H])[H] 
|(3.4505,-1.5693,0.2359;2.5333,-1.3709,-0.0188;2.4234,0.0463,-0.0019;3.0117,0.4811,-0.8296;0.9539,0.3989,-0.1906;0.8084,1.4803,-0.2827;0.3591,0.0258,0.6514;0.5812,-0.0809,-1.1006;3.0138,0.5827,1.3091;2.4109,0.2076,2.1501;4.3443,0.0701,1.4545;5.2655,1.1511,1.6814;6.1347,0.8164,2.8858;6.8159,1.6442,3.106;6.7264,-0.0823,2.6876;5.4982,0.6391,3.7564;6.0784,1.4081,0.4095;6.7752,2.2376,0.5662;5.4174,1.6621,-0.4252;6.6484,0.5149,0.1348;4.4612,2.2781,2.0105;3.1918,2.1127,1.3818;3.2043,2.5461,0.3718;2.1249,2.8087,2.2151;2.2961,2.5983,3.2728;1.1293,2.4699,1.9199;2.1918,4.2433,2.1016;1.5774,4.7928,1.0264;1.0021,4.1489,0.1745;1.7063,6.2965,1.0488;1.2281,6.7157,0.1632;2.7625,6.5826,1.0767;1.2358,6.6994,1.9519)|\",6.95523001878\r\n\"[H]/C(=C1\\OC([H])([H])C([H])([H])N1C([H])([H])[H])C([H])([H])[H] |(3.1306,-0.4053,-0.7612;2.4758,0.1342,-0.0845;3.1145,0.8659,0.8425;4.496,0.8624,0.8857;4.9038,1.4266,2.1334;5.0053,0.6333,2.8872;5.8691,1.918,1.9925;3.764,2.3785,2.4745;3.9153,3.3637,1.9988;3.6325,2.5304,3.5515;2.6228,1.6719,1.8925;1.4465,2.5043,1.6816;1.1658,2.9633,2.6353;0.6032,1.9075,1.3372;1.6297,3.3095,0.949;0.9903,-0.0638,-0.2439;0.8029,-0.9676,-0.8336;0.4827,0.7603,-0.768;0.4794,-0.2056,0.7167)|\",6.479030780405001\r\n\"[H]C1=C([H])C([H])=C([C@@]([H])(SC(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C([H])=C1[H] 
|(-0.2516,5.9758,-0.9799;0.3927,5.1008,-0.969;0.2345,4.0973,-1.9248;-0.5339,4.1853,-2.6887;1.0672,2.9769,-1.9064;0.9333,2.2,-2.6563;2.0741,2.8323,-0.94;2.9369,1.5753,-0.9417;2.8465,1.1176,-1.9329;4.7482,1.9583,-0.7149;5.4161,2.2298,-2.4462;5.3216,0.945,-3.2815;5.7806,1.1069,-4.266;4.2832,0.6419,-3.4543;5.8411,0.1159,-2.7913;4.7087,3.395,-3.1521;5.1821,3.5782,-4.1264;4.7724,4.3132,-2.5605;3.65,3.1844,-3.3305;6.894,2.5845,-2.2046;7.3913,2.7575,-3.1671;7.422,1.7741,-1.6905;6.9912,3.4947,-1.6035;2.4779,0.4617,0.081;2.7353,0.8727,1.5434;2.4097,0.0716,2.2177;2.1808,1.7777,1.8122;3.7997,1.0513,1.7331;3.2277,-0.8521,-0.2197;2.8734,-1.6473,0.4471;4.3063,-0.7441,-0.081;3.0491,-1.1805,-1.2516;0.9679,0.1995,-0.1027;0.6542,-0.6231,0.551;0.7326,-0.09,-1.1346;0.3634,1.0761,0.1468;2.2305,3.858,0.0043;3.0311,3.7936,0.734;1.3959,4.9756,-0.0057;1.5372,5.7566,0.7371)|\",5.65452581339\r\n\"[H]OC1=NC(=O)N([C@]2([H])OC([H])([H])C([H])([H])[C@@]2([H])F)C([H])([H])[C@@]1([H])C([H])([H])[H] |(1.6278,-2.1499,0.9024;2.4918,-2.3435,0.5055;2.9859,-1.2271,-0.0698;4.1328,-1.3113,-0.6232;4.6395,-0.1827,-1.3034;5.4493,-0.3223,-2.2106;4.2092,1.0695,-0.8939;4.5932,2.2713,-1.6377;4.3666,3.1143,-0.9802;3.8437,2.4469,-2.8254;4.6328,2.0348,-3.9584;4.4569,0.9763,-4.1732;4.3,2.6362,-4.8093;6.0979,2.287,-3.5671;6.4554,3.2498,-3.9441;6.7559,1.4951,-3.9287;6.0852,2.3656,-2.0341;6.7063,1.6167,-1.5461;6.5452,3.6241,-1.6163;3.1453,1.1914,0.1041;3.5699,1.1965,1.119;2.6331,2.1457,-0.0468;2.1449,0.0357,-0.0287;1.5216,0.0216,0.8764;1.2371,0.1627,-1.2679;0.5565,1.012,-1.1462;0.6349,-0.7398,-1.4174;1.8334,0.3361,-2.1682)|\",5.79058273864\r\n\"[H]SC1=N[C@@]2([H])C3=C([H])C([H])=C([H])C([H])=C3C([H])([H])[C@@]2([H])S1 
|(3.5364,-2.7007,1.4934;4.0996,-1.475,1.5136;2.8771,-0.7564,0.4429;1.8932,-1.3885,-0.0463;1.0357,-0.5466,-0.8815;1.0502,-0.9763,-1.892;-0.3811,-0.4594,-0.3497;-1.272,-1.5135,-0.1483;-0.9736,-2.5345,-0.3703;-2.5443,-1.2361,0.3537;-3.2505,-2.0458,0.5162;-2.9153,0.081,0.6496;-3.9088,0.2868,1.0393;-2.018,1.1319,0.4512;-2.3092,2.1527,0.688;-0.7441,0.8543,-0.0482;0.377,1.8323,-0.3269;0.6914,2.3542,0.5842;0.0745,2.6057,-1.0443;1.5227,0.9587,-0.9115;1.7674,1.2663,-1.9294;3.0932,0.9864,0.0556)|\",5.9484087719300005\r\n\"[H]C1=C([H])C([H])=C([C@@]([H])(SC(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C([H])([H])C([H])([H])[H])C([H])=C1[H] |(-3.5756,-3.949,-0.0236;-2.669,-3.385,-0.2256;-1.5412,-4.0259,-0.7364;-1.5619,-5.0945,-0.9344;-0.3786,-3.2949,-0.9958;0.4987,-3.8038,-1.3902;-0.3237,-1.9158,-0.7558;0.9615,-1.1487,-1.0103;1.6803,-1.8187,-1.4943;0.7744,0.3428,-2.1056;0.3681,-0.2798,-3.8289;0.6056,0.9558,-4.7155;0.357,0.7131,-5.7564;1.6504,1.2808,-4.6781;-0.0267,1.7961,-4.4074;1.3122,-1.4131,-4.2501;1.1194,-1.6821,-5.2971;1.1575,-2.3158,-3.6501;2.3615,-1.1123,-4.1631;-1.0987,-0.7237,-3.9393;-1.3252,-0.9944,-4.9799;-1.776,0.0829,-3.6418;-1.3093,-1.593,-3.3115;1.6121,-0.6277,0.2942;2.5035,-0.0484,0.0262;0.9223,0.069,0.7872;1.9999,-1.7472,1.2666;2.4838,-1.3289,2.1563;2.7051,-2.4485,0.8037;1.1261,-2.3187,1.5955;-1.4663,-1.2834,-0.2421;-1.4485,-0.2105,-0.0713;-2.6255,-2.0097,0.0221;-3.501,-1.501,0.4175)|\",5.583776212260001\r\n\"[H]O[C@@]1(C2=C([H])C([H])=C([H])C([H])=C2[H])C(C([H])([H])[H])(C([H])([H])[H])[C@@]1([H])C([H])(C#N)C#N 
|(3.6313,1.9541,1.512;3.8852,1.0381,1.3149;2.6984,0.3122,1.0363;1.6768,0.4167,2.1329;0.7475,1.4665,2.1175;0.7256,2.149,1.2709;-0.1521,1.6363,3.1701;-0.869,2.4525,3.1424;-0.1318,0.7547,4.252;-0.834,0.8827,5.0713;0.7921,-0.2929,4.278;0.8121,-0.9791,5.1202;1.6939,-0.4594,3.2264;2.4199,-1.268,3.2537;2.3252,0.1515,-0.4395;0.8745,-0.0937,-0.8302;0.8309,-0.6675,-1.764;0.3447,0.8524,-0.9939;0.3284,-0.6531,-0.0652;3.1044,0.9392,-1.4819;3.2259,0.3575,-2.4029;4.092,1.2416,-1.1264;2.5563,1.8536,-1.741;2.9917,-0.9883,0.3304;2.3477,-1.8309,0.5699;4.4511,-1.3995,0.0939;5.07,-0.496,0.0447;4.9479,-2.2056,1.2253;5.304,-2.8261,2.1384;4.6009,-2.1357,-1.176;4.6869,-2.6886,-2.1921)|\",6.462703949375001\r\n\"[H]C1=NC([H])=C(/C([H])=C(\\[H])C2=C([H])C([H])=C(N(C([H])([H])[H])C([H])([H])[H])C([H])=C2[H])O1 |(8.3713,4.2403,7.4059;7.6283,4.6205,6.7199;7.2177,5.8392,6.5699;6.2675,5.7501,5.5636;5.7347,6.6208,5.2081;6.1472,4.4512,5.1439;5.3353,3.7983,4.1536;4.675,4.4688,3.6089;5.3552,2.4723,3.8852;6.0403,1.8543,4.4636;4.5592,1.7552,2.8979;4.7261,0.3658,2.7512;5.4472,-0.147,3.3842;4.0007,-0.3778,1.8307;4.1737,-1.4458,1.7739;3.0542,0.2409,0.9821;2.3425,-0.4826,0.0384;2.421,-1.9335,0.0414;1.8146,-2.3261,-0.7771;2.059,-2.3775,0.9823;3.4519,-2.2722,-0.1179;1.2512,0.1523,-0.681;0.8303,-0.5607,-1.3924;1.6069,1.0182,-1.2524;0.4423,0.4929,-0.0153;2.8709,1.638,1.1341;2.1466,2.1631,0.5229;3.6044,2.3632,2.0592;3.4218,3.4319,2.1326;7.0362,3.7133,5.9005)|\",3.692584951285\r\n\"[H]C([H])([H])C([H])([H])OC(=O)[C@@]([H])(C([H])([H])[H])[C@@]1([H])O[C@]([H])(C([H])([H])[H])C([H])([H])C1([H])[H] 
|(1.2992,-1.2609,-3.4805;1.9654,-1.2185,-2.6111;1.5615,-0.4854,-1.907;1.9721,-2.2039,-2.1336;3.3664,-0.8217,-3.0529;3.3736,0.1781,-3.4932;3.7773,-1.5375,-3.7692;4.2981,-0.8501,-1.9445;4.3708,0.2655,-1.1811;3.7117,1.2654,-1.3835;5.341,0.0747,-0.0256;6.1362,-0.5982,-0.366;5.9478,1.4075,0.4191;6.6025,1.2437,1.2791;5.1713,2.1302,0.6843;6.5424,1.8489,-0.3875;4.6122,-0.6932,1.1138;4.2372,-1.6315,0.6835;5.5823,-1.0084,2.1216;5.0741,-0.7041,3.43;5.9346,-0.3618,4.0177;4.4886,-1.951,4.0949;4.212,-1.7401,5.1351;5.2273,-2.759,4.0931;3.5937,-2.3105,3.5726;4.058,0.4252,3.1976;3.3007,0.4865,3.986;4.5768,1.3886,3.1551;3.4663,0.0715,1.8218;3.1422,0.9509,1.2587;2.5942,-0.5824,1.93)|\",6.9661145728000005\r\n\"[H]C([H])([H])C([H])([H])OC(=S)C([H])([H])[C@@]1([H])O[C@]([H])(C([H])([H])[H])C([H])([H])C1([H])[H] |(1.1852,-2.305,-5.3469;1.3946,-1.2545,-5.1184;1.4107,-0.6946,-6.0589;0.5782,-0.8686,-4.4995;2.7248,-1.1472,-4.397;3.5592,-1.52,-4.9995;2.7336,-1.6922,-3.4479;2.9518,0.2558,-4.1237;4.0667,0.6398,-3.4923;5.2561,-0.3714,-2.9544;4.0672,2.1333,-3.285;5.0754,2.5182,-3.4554;3.3835,2.6023,-4.0008;3.6638,2.5366,-1.8402;4.3298,2.0112,-1.1472;3.882,3.9394,-1.6794;2.6371,4.6647,-1.6034;2.6755,5.4569,-2.3624;2.509,5.3036,-0.2221;1.638,5.9689,-0.1817;3.4037,5.8918,0.0053;2.3981,4.5408,0.5574;1.536,3.6365,-1.9272;0.6011,3.8441,-1.3973;1.3142,3.6367,-3.0005;2.1801,2.3047,-1.5107;1.7577,1.4425,-2.0342;2.0608,2.1371,-0.4346)|\",4.726617583185\r\n\"[H]O[C@@]1(C([H])([H])[H])O[C@@]([H])([C@@]([H])(C([H])=C([H])[H])C([H])([H])C([H])([H])[H])[C@]([H])(C([H])([H])[H])C([H])([H])C1([H])[H] 
|(5.8799,-2.2517,1.7579;6.6418,-1.6624,1.8872;6.4694,-0.5784,1.0073;6.5066,-1.0732,-0.4454;7.4441,-1.607,-0.6283;6.4271,-0.2591,-1.1721;5.6701,-1.7597,-0.6203;5.1795,-0.031,1.3176;4.8364,1.2339,0.736;4.861,1.1507,-0.3641;3.3618,1.5052,1.1299;3.0919,2.4807,0.7028;2.4701,0.4637,0.4943;2.6645,-0.5636,0.8015;1.5071,0.7139,-0.3939;0.901,-0.0807,-0.8219;1.284,1.7255,-0.7288;3.1504,1.5689,2.6617;3.3901,0.5901,3.0928;3.8683,2.2769,3.0943;1.7345,1.9931,3.0656;1.6348,2.0301,4.1564;0.9807,1.299,2.6798;1.4927,2.9903,2.676;5.8639,2.3127,1.1402;5.8393,2.4128,2.2336;5.5684,3.6795,0.5097;6.3727,4.3886,0.7379;4.6335,4.1138,0.8789;5.4944,3.6056,-0.5832;7.2702,1.8178,0.7533;7.3548,1.7895,-0.3433;8.0241,2.5367,1.0983;7.5658,0.4359,1.3475;7.6045,0.4989,2.441;8.5364,0.0569,1.0073)|\",7.322583716955\r\n\"[H]C1=C([H])[C@]([H])(C([H])([H])C([H])([H])[H])[C@]([H])([C@@]([H])(C([H])([H])[H])C([H])([H])C([H])([H])C(=O)C([H])([H])[H])OC1=O |(4.0783,-0.6913,-5.1005;3.58,-0.2946,-4.2223;3.7038,-0.8364,-3.0038;4.3259,-1.7169,-2.8573;2.9625,-0.2682,-1.8158;3.6106,-0.3606,-0.9338;1.6576,-1.0528,-1.5297;1.1923,-0.6363,-0.6285;0.9542,-0.8824,-2.3534;1.8681,-2.5581,-1.3286;0.9268,-3.0435,-1.0496;2.5926,-2.7572,-0.5289;2.231,-3.0446,-2.2402;2.7612,1.2368,-2.0844;3.7642,1.6834,-2.1538;1.943,2.0265,-1.0504;0.9647,1.5381,-0.9555;2.6237,1.9864,0.3293;2.0529,2.5842,1.0478;3.6422,2.3928,0.2959;2.6895,0.9688,0.7301;1.6496,3.4755,-1.5052;0.9709,3.4486,-2.3628;1.109,3.9854,-0.7007;2.8682,4.3243,-1.8805;3.3358,3.9721,-2.8091;3.6546,4.2649,-1.1106;2.5409,5.8095,-2.0289;1.5391,6.302,-1.5458;3.5411,6.6447,-2.8098;4.5735,6.3935,-2.5399;3.354,7.707,-2.6389;3.4292,6.4327,-3.8815;2.1201,1.4397,-3.3728;2.6536,0.8329,-4.4732;2.3192,1.1943,-5.5788)|\",5.276287561195\r\n\"[H]C([H])([H])N(NC1C(C([H])([H])[H])(C([H])([H])[H])[C@]2([H])O[C@@]1([H])C([H])([H])C2([H])[H])C([H])([H])[H] 
|(1.7813,2.1475,-3.7242;2.3863,2.2966,-2.8237;3.2906,1.687,-2.9095;2.6827,3.3574,-2.759;1.6088,1.8593,-1.6609;2.411,2.1929,-0.4823;2.7523,1.1911,0.2249;2.4976,-0.328,0.1324;1.0125,-0.6432,0.4104;0.8588,-1.7301,0.4277;0.3816,-0.2196,-0.3752;0.7002,-0.237,1.3766;2.9272,-0.9714,-1.1948;2.9333,-2.0654,-1.0992;3.9271,-0.6576,-1.5113;2.2252,-0.6979,-1.9852;3.3465,-0.7621,1.3745;3.0323,-1.7151,1.8058;3.077,0.2994,2.3136;3.5741,1.3738,1.4929;3.4456,2.3349,1.9891;5.0329,0.9369,1.2027;5.4176,1.3686,0.2739;5.6894,1.246,2.0213;4.8686,-0.6117,1.1507;5.4066,-1.092,1.9729;5.221,-1.0562,0.2169;0.384,2.6573,-1.551;-0.2218,2.4901,-2.4479;0.5896,3.7362,-1.4475;-0.1878,2.3279,-0.6788)|\",5.477651810565\r\n\"[H]C1C(C([H])([H])C2=C([H])C([H])=C([H])C([H])=C2[H])C(=O)N2N([H])N([H])N([H])[C@@]2([H])N1[H] |(3.5155,-2.1421,1.8502;3.711,-2.307,0.7916;3.6025,-1.2979,-0.1092;3.0984,0.0804,0.2539;3.0165,0.1547,1.345;3.8442,0.8132,-0.0734;1.7538,0.4151,-0.3769;1.6827,0.9005,-1.6901;2.6022,1.0403,-2.2519;0.4491,1.1914,-2.274;0.4119,1.57,-3.2924;-0.733,1.0026,-1.555;-1.6931,1.2336,-2.0091;-0.6728,0.5191,-0.247;-1.5866,0.3724,0.3235;0.5623,0.2273,0.3341;0.6018,-0.1467,1.3554;4.2798,-1.4706,-1.4143;4.7464,-0.5459,-2.0763;4.4506,-2.7879,-1.8242;5.7823,-3.1808,-2.3507;5.8623,-2.7208,-3.2596;5.5759,-4.5703,-2.5869;4.822,-4.6533,-3.2928;5.003,-4.9462,-1.312;4.5699,-5.8597,-1.4243;4.0232,-3.895,-0.9707;2.9826,-4.1411,-1.2289;4.0583,-3.6103,0.4512;4.8659,-4.0062,0.9276)|\",5.189211129035001\r\n\"[H]O[C@]([H])(C([H])([H])[H])C([H])([H])C([H])([H])/C([H])=C(/C(=O)OC([H])([H])C([H])([H])[H])C([H])([H])[H] 
|(7.1113,4.637,1.7819;7.8605,4.0186,1.7996;8.053,3.5488,0.4613;8.9534,2.9251,0.5285;8.3232,4.7114,-0.4953;8.569,4.3524,-1.5014;9.157,5.3185,-0.1288;7.4395,5.3589,-0.5806;6.8786,2.6724,0.0009;5.9693,3.2931,-0.0372;7.0591,2.3321,-1.0274;6.6415,1.4582,0.92;7.528,0.8155,0.9084;6.5448,1.8342,1.9481;5.4011,0.6979,0.5566;4.4848,1.2848,0.5666;5.2885,-0.6006,0.2189;3.9521,-1.178,-0.1131;3.7928,-2.3396,-0.4435;2.9284,-0.2928,-0.0082;1.6104,-0.794,-0.3218;1.5188,-1.813,0.0623;0.9333,-0.1367,0.2302;1.334,-0.7456,-1.8177;0.3018,-1.0573,-2.0159;2.0061,-1.4211,-2.3537;1.4671,0.2699,-2.2056;6.4126,-1.602,0.1311;7.3822,-1.1567,0.3622;6.4598,-2.0411,-0.8716;6.242,-2.4359,0.8215)|\",5.9484087719300005\r\n\"[H]O[C@@]1([H])C([H])=C([H])[C@@]23C([H])([H])C(=C([H])[H])[C@@]([H])(C([H])([H])C2([H])[H])C([H])([H])[C@@]3([H])C1([H])[H] |(6.2038,5.2702,1.6498;5.3597,5.7008,1.4364;4.5009,4.6866,0.9043;3.6015,5.2424,0.6043;5.112,4.0364,-0.3149;5.6456,4.7088,-0.9857;5.027,2.7299,-0.588;5.5128,2.3359,-1.4822;4.295,1.7273,0.2678;3.3527,0.8368,-0.5841;2.7071,1.46,-1.2157;3.9535,0.2164,-1.2658;2.5185,-0.042,0.3431;1.4434,-0.7288,-0.0478;1.0836,-0.6995,-1.0743;0.8823,-1.3509,0.6457;3.0448,-0.0277,1.7647;2.508,-0.7568,2.3816;4.5548,-0.354,1.7299;4.9325,-0.4506,2.7551;4.7116,-1.3224,1.2413;5.3065,0.779,0.976;5.9925,0.3624,0.2282;5.9276,1.3635,1.6642;2.8819,1.3991,2.3466;3.449,1.4662,3.2861;1.8331,1.5977,2.5936;3.403,2.4441,1.3233;2.5375,2.841,0.7716;4.1181,3.6407,1.9607;5.0355,3.3021,2.4652;3.4929,4.1073,2.7316)|\",6.723933245855\r\n\"[H]C1=C([H])C([H])=C(C([H])([H])O/C([H])=C(\\[H])[C@@]([H])(C#N)C([H])([H])[H])C([H])=C1[H] 
|(4.3913,-8.8141,-5.7345;4.1274,-8.0498,-5.0085;3.0293,-7.2193,-5.2351;2.4358,-7.333,-6.1382;2.6927,-6.2383,-4.3007;1.8388,-5.5891,-4.4801;3.4436,-6.0824,-3.1312;3.0939,-5.0163,-2.1271;3.1735,-5.4058,-1.1027;2.0649,-4.6626,-2.2797;4.0083,-3.9161,-2.2824;3.8016,-2.8653,-1.448;2.9022,-2.9158,-0.831;4.6358,-1.823,-1.3934;5.5279,-1.812,-2.0148;4.3842,-0.6277,-0.5022;3.4522,-0.7933,0.0557;5.4596,-0.4974,0.5026;6.3159,-0.3915,1.28;4.2457,0.6894,-1.3012;3.4101,0.601,-2.0019;5.1562,0.8911,-1.8742;4.0643,1.5382,-0.6341;4.5446,-6.9202,-2.9124;5.1359,-6.8008,-2.0078;4.8864,-7.8978,-3.8452;5.7424,-8.5425,-3.665)|\",5.77969818462\r\n\"[H]C1=C([H])C([H])=C(C2=C(C([H])([H])[H])C(=O)N3N([H])N([H])N([H])[C@]3([H])N2[H])C([H])=C1[H] |(0.7301,5.8119,0.1302;1.1685,4.825,0.0099;1.6883,4.435,-1.2258;1.6488,5.114,-2.0731;2.2553,3.1706,-1.3793;2.6488,2.8634,-2.344;2.2984,2.2715,-0.3003;2.9326,0.9368,-0.4676;2.3763,-0.2481,-0.0722;1.0021,-0.4251,0.523;0.3584,0.4338,0.3204;1.0406,-0.5766,1.6093;0.5302,-1.3207,0.1057;3.1945,-1.4495,-0.1711;2.947,-2.5063,0.4099;4.3254,-1.3994,-1.0283;5.4154,-2.2734,-0.656;4.9642,-3.0438,-0.1504;6.1503,-1.4553,0.3103;5.5372,-1.3003,1.1207;6.2077,-0.1858,-0.3539;7.0463,-0.2377,-0.9317;5.0192,-0.1549,-1.2957;5.4078,-0.15,-2.3213;4.1501,0.9854,-1.1232;4.6485,1.863,-1.048;1.7757,2.6749,0.9383;1.8253,1.9979,1.7851;1.216,3.9426,1.0909;0.822,4.2433,2.0578)|\",4.525253333815\r\n\"[H]C1=C([H])C([H])=C(C([H])([H])[C@@]2([H])C([H])=C([H])C(=O)[C@]([H])(OC(=O)C([H])([H])[H])C2([H])[H])O1 
|(8.4858,0.2482,-5.0709;8.8401,0.268,-4.0516;9.9666,0.7377,-3.4536;10.7991,1.2256,-3.941;9.8249,0.4443,-2.0566;10.5327,0.665,-1.269;8.6216,-0.1818,-1.9025;7.8987,-0.7283,-0.7147;7.5605,-1.7515,-0.9251;8.6253,-0.8,0.1033;6.6776,0.0965,-0.2209;6.3639,-0.3689,0.7275;7.0517,1.5235,0.0937;8.0589,1.6967,0.4721;6.2217,2.5703,-0.057;6.5331,3.5855,0.1748;4.8367,2.412,-0.5351;4.0928,3.3595,-0.7304;4.3513,0.9685,-0.7489;3.5312,0.9702,-1.4686;3.8264,0.5653,0.547;2.8,-0.3265,0.5371;2.3475,-0.8238,-0.4688;2.3204,-0.5939,1.9443;1.9692,0.3368,2.4017;3.1438,-0.9695,2.5605;1.5113,-1.3243,1.9188;5.48,0.0396,-1.1857;5.0987,-0.9835,-1.274;5.8043,0.3474,-2.1867;8.009,-0.2962,-3.1256)|\",4.52253219531\r\n\"[H]OC([H])([H])[C@]1([H])O[C@@]([H])(C([H])([H])[H])C2=C(C(OC([H])([H])[H])=C([H])C([H])=C2OC([H])([H])[H])C1([H])[H] |(2.1717,3.7488,0.3763;2.1579,4.3667,-0.3746;2.9535,3.7452,-1.3671;2.7989,4.2967,-2.3008;4.0261,3.7956,-1.1143;2.5627,2.2822,-1.5525;1.4999,2.2436,-1.8344;2.7376,1.6878,-0.2612;2.2417,0.3503,-0.1517;2.6774,-0.0227,0.7816;0.712,0.3321,-0.0012;0.3709,-0.6806,0.2274;0.2104,0.6624,-0.9165;0.4171,0.9979,0.817;2.7438,-0.5257,-1.2917;3.2857,0.0442,-2.4435;3.7613,-0.7906,-3.4791;4.2849,-0.1339,-4.5644;4.7815,-0.9118,-5.6385;5.1515,-0.2006,-6.3797;3.9927,-1.5285,-6.0907;5.606,-1.5634,-5.3186;3.6863,-2.1736,-3.3571;4.0418,-2.8264,-4.1457;3.1505,-2.7459,-2.1954;3.1115,-3.8259,-2.1146;2.6868,-1.9319,-1.1679;2.1663,-2.4022,0.0147;2.1336,-3.8035,0.2205;1.7121,-3.9514,1.2167;3.1406,-4.2404,0.1839;1.4971,-4.3065,-0.52;3.4053,1.546,-2.5884;4.4611,1.8358,-2.4837;3.1073,1.8465,-3.5995)|\",5.434113594485\r\n\"[H]C1=C(OC([H])([H])[H])C([H])=C(C([H])([H])[C@@]2([H])O[C@@]([H])(C([H])([H])[H])OC2([H])[H])C(OC([H])([H])[H])=C1[H] 
|(3.915,-2.1365,-8.7281;3.5384,-1.7794,-7.7748;3.4964,-0.4029,-7.5382;3.9461,0.3966,-8.5551;3.9247,1.7997,-8.3611;4.3226,2.2361,-9.2793;4.556,2.1025,-7.5144;2.9042,2.1722,-8.1976;3.0044,0.0624,-6.3141;2.9478,1.1271,-6.1121;2.5592,-0.8211,-5.3226;2.0461,-0.2778,-4.0048;1.3328,-0.9719,-3.5518;1.5178,0.6663,-4.1817;3.1593,0.0084,-2.9938;3.9064,0.6824,-3.4463;2.592,0.6384,-1.8429;3.5458,0.467,-0.8158;4.322,1.2501,-0.9012;2.8753,0.5112,0.5402;3.6128,0.3223,1.326;2.0973,-0.2561,0.5887;2.4233,1.4938,0.7086;4.1612,-0.8012,-1.0354;3.8673,-1.2097,-2.3737;4.7948,-1.4695,-2.8961;3.2106,-2.0867,-2.3584;2.6106,-2.2036,-5.5759;2.1627,-3.0245,-4.568;2.1352,-4.4222,-4.8002;1.7232,-4.8685,-3.8931;3.142,-4.825,-4.975;1.4929,-4.6777,-5.6537;3.0974,-2.6725,-6.8027;3.1388,-3.7354,-7.011)|\",5.27356642269\r\n\"[H]O[C@@]12O[C@@]3(C([H])([H])C(=O)OC([H])([H])C([H])([H])[H])C([H])([H])[C@@]([H])(C1([H])[H])C([H])([H])[C@@]([H])(C2([H])[H])C3([H])[H] |(5.7791,0.6158,-4.4827;5.8792,1.5566,-4.2591;4.7975,1.8911,-3.4381;4.819,0.9624,-2.3343;3.7936,1.1894,-1.3303;4.0799,0.0887,-0.303;3.9927,-0.8902,-0.793;5.1256,0.1633,0.0174;3.2035,0.0682,0.9344;2.2141,0.7428,1.1368;3.6822,-0.8389,1.8158;2.9451,-1.0036,3.0514;3.6927,-1.3599,3.7644;2.573,-0.0277,3.3723;1.8079,-2.0013,2.8907;1.3275,-2.178,3.86;2.181,-2.9587,2.5123;1.0528,-1.6172,2.1988;3.9745,2.6084,-0.7575;3.2055,2.7985,-0.0022;4.9541,2.6764,-0.2662;3.887,3.6305,-1.9078;4.0201,4.6449,-1.5119;5.0072,3.3145,-2.9187;5.9948,3.3673,-2.4474;4.997,4.0166,-3.7607;2.5115,3.5155,-2.6008;2.4411,4.2455,-3.4189;1.7106,3.7533,-1.8884;2.3202,2.085,-3.1507;1.343,2.0023,-3.6427;3.4448,1.7682,-4.1556;3.4261,2.4544,-5.0108;3.3398,0.7458,-4.5445;2.4093,1.0679,-1.996;2.28,0.0471,-2.38;1.6287,1.2457,-1.2503)|\",6.707606414825001\r\n\"[H]/C(C(=O)C([H])([H])[H])=C(/[H])C([H])([H])C([H])([H])/C([H])=C(\\[H])C(=O)C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H] 
|(11.921,-0.4813,4.2718;12.3033,-1.3182,3.6882;13.3912,-2.1428,4.2879;13.8817,-3.0965,3.7022;13.8604,-1.7336,5.6733;13.0266,-1.7641,6.3865;14.2313,-0.7004,5.6618;14.6543,-2.4035,6.0082;11.8102,-1.5881,2.4716;12.238,-2.4396,1.9404;10.7137,-0.8285,1.7886;10.4093,0.0317,2.3961;11.0913,-0.4297,0.8347;9.4805,-1.7202,1.4919;9.0763,-2.117,2.4299;9.8098,-2.5832,0.8942;8.4121,-0.9744,0.7453;8.7069,-0.5734,-0.2252;7.1681,-0.7644,1.2011;6.8636,-1.1614,2.1688;6.0897,-0.0075,0.5152;5.0036,0.11,1.068;6.3194,0.5948,-0.8666;5.6371,1.4477,-0.9454;7.3429,0.9757,-0.9693;6.0134,-0.4097,-1.9969;6.6686,-1.2869,-1.9056;4.9854,-0.7737,-1.8692;6.1689,0.2094,-3.3921;5.5022,1.0803,-3.4738;7.1918,0.5985,-3.5077;5.8662,-0.7694,-4.5334;6.5284,-1.643,-4.4502;4.842,-1.1519,-4.4209;6.0251,-0.1399,-5.9216;5.796,-0.8596,-6.7155;7.0503,0.2165,-6.0808;5.3534,0.7184,-6.0463)|\",4.971520048635\r\n\"[H]OC([H])([H])[C@@]1([H])N(C([H])([H])[C@@]([H])(N([H])C([H])([H])[H])C([H])([H])C([H])(C([H])([H])[H])C([H])([H])[H])C([H])([H])C([H])([H])C1([H])[H] |(2.691,6.1354,2.5879;3.0289,5.5979,1.8564;3.1005,4.2469,2.3068;3.825,4.1323,3.1284;2.1183,3.9082,2.6703;3.5533,3.4202,1.11;4.4392,3.9357,0.7123;3.9965,2.0422,1.4211;3.1187,1.2575,2.3066;2.7603,1.9136,3.1024;2.2305,0.8522,1.7897;3.8911,0.1008,2.9685;4.2893,-0.5429,2.1553;4.9937,0.6791,3.7388;5.3799,1.4184,3.1509;6.0529,-0.2629,4.0802;5.7094,-0.9663,4.8466;6.9023,0.2871,4.4992;6.4183,-0.8582,3.221;2.9583,-0.8182,3.79;2.2717,-1.2789,3.0647;3.5508,-1.6499,4.1976;2.1171,-0.2104,4.936;1.6282,0.702,4.564;1.0013,-1.1912,5.3331;0.3842,-0.7823,6.1423;1.4224,-2.1415,5.6879;0.341,-1.4163,4.4867;2.9574,0.1709,6.1646;2.3262,0.6136,6.9455;3.7462,0.8781,5.8976;3.4339,-0.7208,6.5951;4.132,1.4467,0.0781;5.0993,1.7537,-0.3391;4.1353,0.3554,0.1391;2.9713,2.0134,-0.7932;3.3133,2.2298,-1.8105;2.1507,1.2935,-0.88;2.5204,3.2898,-0.0377;1.5121,3.1672,0.377;2.5029,4.1824,-0.6684)|\",7.46136178071\r\n\"[H]NC1=NNN([H])N1C([H])([H])/C(C#N)=C(\\[H])C1=C([H])C([H])=C([H])C([H])=C1[H] 
|(7.3512,1.0349,2.5039;6.5302,0.7586,1.9651;6.697,-0.4184,1.5067;7.8004,-1.3134,1.6516;7.6016,-2.3392,0.9324;6.4644,-2.2055,0.2053;5.9194,-3.0651,0.1312;5.7672,-1.0905,0.7029;4.87,-0.3587,-0.1895;4.9131,0.6846,0.1357;5.2546,-0.4294,-1.2158;3.4374,-0.8688,-0.1146;3.3241,-2.271,-0.3623;3.3749,-3.4195,-0.5592;2.4058,-0.0403,0.1756;2.6971,0.9913,0.3673;0.9667,-0.272,0.2831;0.3267,-1.502,0.0282;0.9013,-2.3687,-0.2763;-1.0533,-1.6191,0.1626;-1.5296,-2.5746,-0.0381;-1.8257,-0.5206,0.5509;-2.9026,-0.6209,0.6538;-1.2081,0.705,0.8048;-1.7993,1.5646,1.1071;0.1717,0.8257,0.6701;0.6503,1.7815,0.8692)|\",3.95109310926\r\n\"[H]OC1=C(C2=NC([H])([H])C([H])([H])C([H])([H])N2[H])C([H])=C(C([H])([H])[H])C([H])=C1[H] |(6.7379,0.9197,0.0189;6.461,-0.0512,0.0496;5.1203,-0.0453,0.0777;4.3477,1.145,-0.037;5.0242,2.4463,-0.2649;6.3136,2.5132,-0.1138;7.0098,3.7735,-0.3506;7.8492,3.8318,0.3524;7.4537,3.7522,-1.3579;6.1083,5.0035,-0.2008;5.8491,5.1456,0.8551;6.6232,5.9085,-0.5421;4.829,4.7905,-1.0083;5.0597,4.7975,-2.0855;4.105,5.5919,-0.8221;4.2305,3.5221,-0.6067;3.3314,3.3008,-1.0075;2.9486,1.0442,0.0657;2.3484,1.9515,0.0541;2.2866,-0.1715,0.2306;0.7804,-0.2298,0.3539;0.4282,-1.2655,0.3963;0.2816,0.2528,-0.4961;0.4262,0.2739,1.2628;3.0729,-1.3326,0.2942;2.5878,-2.2988,0.4159;4.4572,-1.2741,0.2231;5.0648,-2.1711,0.2909)|\",4.726617583185\r\n\"[H]O[C@@]1([H])C(=C([H])[H])C2=C([H])C(OC([H])([H])[H])=C(OC([H])([H])[H])C([H])=C2OC1([H])[H] 
|(1.6449,2.3849,-1.1034;2.1628,2.6167,-0.3144;3.0145,1.5155,-0.0402;3.4084,1.6955,0.9685;2.3194,0.1631,-0.0805;0.9834,0.0834,0.0179;0.4482,-0.859,-0.032;0.3877,0.9763,0.1744;3.2364,-0.9849,-0.2317;2.8264,-2.3232,-0.0593;1.7955,-2.552,0.1896;3.7011,-3.3908,-0.1503;3.2402,-4.6617,0.1063;3.1999,-5.539,-1.02;4.2016,-5.7265,-1.4202;2.5594,-5.1291,-1.8131;2.7705,-6.4766,-0.6589;5.0713,-3.1447,-0.4201;5.882,-4.2376,-0.4657;7.2722,-4.0297,-0.6676;7.731,-5.0192,-0.6275;7.6998,-3.3972,0.1205;7.4737,-3.5745,-1.646;5.5021,-1.8356,-0.615;6.5347,-1.598,-0.8384;4.5956,-0.7719,-0.5209;5.131,0.4742,-0.7129;4.1943,1.5065,-1.0177;4.7457,2.447,-0.9613;3.8197,1.3741,-2.0454)|\",4.69124278262\r\n\"[H]O[C@@]([H])(C([H])=C([H])[H])C([H])([H])OC1=C([H])C(OC([H])([H])[H])=C(OC([H])([H])[H])C([H])=C1[H] |(-0.1235,0.8271,-0.3572;0.3837,0.6741,0.4578;1.6609,1.2704,0.2383;2.1323,1.3471,1.225;2.5273,0.4193,-0.6601;2.1462,0.2602,-1.6683;3.6853,-0.1232,-0.2863;4.2796,-0.7322,-0.9622;4.086,0.0193,0.7158;1.4753,2.6917,-0.3026;2.4441,3.172,-0.4944;0.9125,3.292,0.4253;0.7279,2.5637,-1.5112;0.3662,3.6979,-2.196;0.6915,4.9971,-1.7923;1.2657,5.1993,-0.8964;0.2879,6.0954,-2.5531;0.6948,7.3391,-2.1372;-0.3414,8.2483,-1.7565;0.1659,9.15,-1.4058;-0.9433,7.8301,-0.9385;-0.9905,8.4955,-2.6011;-0.4518,5.9063,-3.7383;-0.7798,7.0361,-4.4416;-1.4825,6.877,-5.662;-1.6207,7.8834,-6.0616;-2.4653,6.4109,-5.5076;-0.9098,6.277,-6.3819;-0.783,4.6027,-4.1219;-1.3512,4.4268,-5.0281;-0.3785,3.5065,-3.3603;-0.6304,2.4965,-3.6671)|\",5.43683473299\r\n\"[H]C1=C(/C(=C(/[H])C#N)N2C([H])([H])C([H])([H])OC([H])([H])C2([H])[H])C([H])([H])C([H])([H])C([H])([H])C1([H])[H] 
|(-1.2432,-1.2699,2.2269;-0.576,-0.8134,1.4984;-0.2205,-1.5192,0.4127;-0.7547,-2.8936,0.1947;-2.099,-3.1304,0.114;-2.486,-4.1386,0.0119;-3.0838,-2.1073,0.1297;-3.9346,-1.3091,0.1361;0.1892,-3.8921,-0.0193;-0.2468,-5.2216,-0.445;-0.6853,-5.7866,0.3952;-1.0124,-5.1112,-1.2187;0.9379,-5.9988,-1.0236;1.2667,-5.5211,-1.9614;0.6401,-7.0286,-1.2405;2.0202,-6.071,-0.1092;2.4739,-4.7673,0.2236;3.3068,-4.893,0.9215;2.8429,-4.252,-0.6789;1.3615,-3.947,0.8691;1.7068,-2.9322,1.0767;1.0823,-4.4091,1.8306;0.672,-0.9398,-0.6757;1.6796,-1.3788,-0.6183;0.2801,-1.2386,-1.6564;0.7674,0.5907,-0.5857;1.5641,0.9513,-1.2475;-0.1706,1.0338,-0.946;1.019,1.0419,0.8585;1.9596,0.5984,1.215;1.1439,2.1298,0.9091;-0.1332,0.599,1.7717;-1.0023,1.2621,1.6369;0.1499,0.7001,2.8284)|\",5.09941355837\r\n\"[H]C(=O)C([H])([H])C([H])([H])/C([H])=C(\\[H])C(=O)C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H] |(10.4626,-4.6775,-3.2277;9.7709,-5.1267,-2.4795;9.4454,-6.2903,-2.5608;9.3027,-4.177,-1.3995;10.1982,-3.7891,-0.8889;8.8559,-3.3001,-1.8922;8.3235,-4.8179,-0.4039;7.4354,-5.1721,-0.9383;8.8046,-5.7107,0.0194;7.9433,-3.8804,0.7001;8.7485,-3.4801,1.3182;6.6969,-3.4863,0.9977;5.8467,-3.8498,0.422;6.4253,-2.5536,2.1293;7.3253,-2.1019,2.8249;4.9714,-2.1653,2.36;4.3418,-3.0631,2.2938;4.8898,-1.7634,3.3751;4.4715,-1.1209,1.335;4.5972,-1.5151,0.3162;3.3902,-0.9912,1.4779;5.1673,0.2416,1.4492;5.014,0.6398,2.4626;6.253,0.1163,1.3396;4.6648,1.2619,0.4196;4.8242,0.8642,-0.5931;3.5776,1.3843,0.5291;5.3477,2.6277,0.5444;4.9721,3.3329,-0.2059;6.4326,2.5417,0.4081;5.1736,3.0689,1.5333)|\",5.10485583538\r\n\"[H]O/C(=N/C1=C([H])C(=O)[C@@]2([H])O[C@]2([H])[C@]1([H])O[H])OC([H])([H])C([H])=C([H])[H] 
|(6.9971,-1.8121,-2.8076;6.6842,-0.9101,-3.0011;5.3555,-1.0238,-3.1942;4.7961,-2.1713,-3.1119;3.4459,-2.4086,-3.3255;2.699,-1.8701,-4.3413;3.1676,-1.1789,-5.0348;1.2514,-2.1987,-4.6233;0.7039,-2.0378,-5.6935;0.7811,-2.7528,-3.2947;0.8462,-2.0114,-2.5005;0.2377,-3.9928,-2.7848;1.6395,-3.9221,-3.0858;1.8826,-4.5214,-3.965;2.8815,-3.4729,-2.3346;3.6355,-4.263,-2.2073;2.6246,-2.8229,-1.0988;2.1315,-3.4494,-0.5441;4.7418,0.1314,-3.4354;5.4961,1.3772,-3.3602;5.9316,1.4786,-2.362;6.3099,1.3259,-4.0919;4.546,2.4907,-3.6683;4.0175,2.4142,-4.6172;4.3609,3.5405,-2.8699;3.693,4.3524,-3.1423;4.8747,3.632,-1.9151)|\",4.726617583185\r\n\"[H]O[C@@]1([H])C([C@@]([H])(O[H])C([H])([H])[H])=C([H])C(=O)C1([H])[H] |(0.7658,2.8486,-0.2481;1.0961,2.7231,0.6569;2.5204,2.6732,0.5924;2.8261,2.5501,1.6374;3.0272,1.4981,-0.2405;2.4903,0.1031,-0.0365;1.4326,0.1109,-0.3296;3.0986,-0.8407,-0.9108;4.023,-0.9432,-0.6307;2.5666,-0.3482,1.4299;1.9701,0.304,2.0751;3.6034,-0.3369,1.7905;2.1822,-1.3688,1.5126;3.9518,1.8804,-1.1394;4.4605,1.2398,-1.8505;4.1952,3.3379,-1.0623;5.0185,3.9779,-1.6879;3.2024,3.9064,-0.0362;2.4652,4.5346,-0.5516;3.7152,4.5485,0.6863)|\",5.129346081925\r\n\"[H]/C(=C(/[H])[C@]([H])(C([H])([H])[H])C([H])([H])C([H])(C([H])([H])[H])C([H])([H])[H])N(=O)=O |(5.6738,2.0418,-0.5186;5.2966,1.4867,-1.3673;4.7646,0.2645,-1.3483;4.4317,-0.155,-2.297;4.6094,-0.5841,-0.1185;4.9429,-0.0086,0.754;5.5104,-1.832,-0.2384;6.5635,-1.5532,-0.3479;5.2297,-2.4407,-1.1063;5.4122,-2.4569,0.6563;3.1325,-1.0002,0.084;2.8181,-1.5845,-0.7934;3.0907,-1.6894,0.94;2.1239,0.1436,0.3114;2.2063,0.8436,-0.5332;0.6918,-0.4122,0.3191;-0.0422,0.3905,0.4544;0.5523,-1.1304,1.1378;0.4566,-0.9273,-0.6197;2.406,0.927,1.6024;1.6602,1.7167,1.7485;3.3906,1.4075,1.5912;2.3658,0.2663,2.4786;5.4292,2.2361,-2.6113;5.0426,1.7268,-3.6643;5.933,3.3571,-2.5108)|\",5.46948839505\r\n\"[H]/C(=C(/[H])[C@]([H])(C([H])([H])[H])C([H])([H])C([H])([H])C([H])([H])[H])N(=O)=O 
|(5.8715,-4.3721,-0.4133;5.9375,-4.2482,0.6596;5.501,-3.2083,1.3696;5.6466,-3.2355,2.4485;4.8253,-2.0004,0.7849;4.7577,-2.1294,-0.3044;5.6753,-0.7462,1.0756;5.2349,0.1423,0.6132;6.6904,-0.8608,0.6816;5.7502,-0.5648,2.1549;3.3854,-1.8908,1.3475;2.8814,-2.856,1.2029;3.4374,-1.7324,2.4349;2.5322,-0.787,0.7068;2.9841,0.1951,0.8942;2.5274,-0.9212,-0.3842;1.0909,-0.7843,1.2278;0.4995,0.0063,0.7527;1.0608,-0.6176,2.3113;0.5932,-1.7405,1.0258;6.5849,-5.3863,1.3013;6.9432,-6.2945,0.548;6.7359,-5.3814,2.5239)|\",5.469488395049999\r\n\"[H]C1=C([H])[C@]2(C([H])(C([H])([H])[H])C([H])([H])[H])OC([H])([H])C([H])([H])O[C@@]2([H])C([H])([H])C1=O |(0.6721,2.4286,2.5518;1.4603,1.701,2.7264;2.708,1.883,2.2619;2.9681,2.7588,1.6721;3.8252,0.9012,2.5503;4.5124,1.3407,3.9001;3.679,1.483,4.6005;5.4808,0.3461,4.5728;5.6715,0.6843,5.5986;6.4511,0.3102,4.0666;5.0879,-0.671,4.6107;5.202,2.7058,3.7332;5.6545,3.0159,4.6817;4.5006,3.4901,3.4292;5.9936,2.6539,2.9788;4.7148,0.9678,1.4197;5.6256,-0.1292,1.3027;6.078,-0.031,0.3102;6.4243,-0.0488,2.049;4.9274,-1.4835,1.428;5.6638,-2.2888,1.5103;4.3008,-1.667,0.5401;4.1142,-1.5424,2.5989;3.1579,-0.4938,2.544;2.6463,-0.5623,1.5687;2.1032,-0.6208,3.634;1.5862,-1.5836,3.5744;2.5558,-0.5633,4.632;1.0612,0.4978,3.501;-0.0503,0.3961,3.9945)|\",5.08308672734\r\n\"[H]C1=C([H])[C@]2(C([H])([H])C([H])([H])[H])OC([H])([H])C([H])([H])O[C@@]2([H])C([H])([H])C1=O 
|(1.3307,3.5682,1.9146;2.1757,2.9056,2.0805;3.1666,2.7804,1.1819;3.1598,3.3331,0.2444;4.3755,1.907,1.4364;5.4424,2.8174,2.1221;5.673,3.6078,1.3959;4.9395,3.3228,2.9543;6.7579,2.2183,2.6483;7.2731,2.9735,3.2538;7.4399,1.939,1.8407;6.5898,1.3358,3.2692;4.807,1.4286,0.1487;5.6996,0.3121,0.1813;5.7445,-0.059,-0.8483;6.7078,0.6305,0.4695;5.2197,-0.7988,1.1173;6.009,-1.5422,1.2639;4.3369,-1.3003,0.6891;4.8876,-0.2832,2.4063;3.873,0.6995,2.2524;3.064,0.2547,1.6475;3.2795,1.1293,3.5857;2.8636,0.2763,4.1301;4.0472,1.577,4.2303;2.162,2.1576,3.3645;1.2957,2.3484,4.2028)|\",5.096692419865\r\n\"[H]O[C@]1([H])[C@]([H])(O[H])[C@@]([H])(C([H])([H])[H])O[C@]([H])(C([H])([H])[C@@]([H])(N([H])[H])C([H])([H])[H])[C@]1([H])O[H] |(2.7895,4.4778,3.4177;1.9423,4.1992,3.8091;1.9844,2.7881,3.8017;0.948,2.4464,3.9079;2.7922,2.2262,4.9865;2.3155,2.5326,5.9291;4.1022,2.8129,4.8984;4.6596,2.4221,5.5893;2.8876,0.6901,4.9111;3.6591,0.368,5.6256;1.5872,-0.0324,5.2855;1.3078,0.1988,6.3197;1.7336,-1.1135,5.2018;0.7443,0.2459,4.6457;3.4121,0.2864,3.646;2.6704,0.7237,2.4946;1.6494,0.3136,2.542;3.4077,0.1387,1.2874;4.4291,0.5353,1.3065;3.4798,-0.9476,1.4368;2.7953,0.4189,-0.1032;2.7297,1.5058,-0.241;3.6415,-0.0736,-1.2035;3.7905,-1.0774,-1.0866;4.566,0.3519,-1.1348;1.3937,-0.1767,-0.2721;0.6767,0.2503,0.4378;1.4111,-1.2637,-0.1133;1.0299,0.0109,-1.2865;2.5565,2.2599,2.4699;1.879,2.5571,1.662;3.7957,2.9063,2.186;4.3678,2.7306,2.9588)|\",7.597418705959999\r\n\"[H]O[C@]1([H])[C@]([H])(O[H])[C@@]([H])(C([H])([H])[H])O[C@]([H])(C([H])([H])/C(=N\\OC([H])([H])[H])C([H])([H])[H])[C@]1([H])O[H] 
|(1.9691,6.5793,2.1325;1.9418,6.3385,1.1857;1.4523,4.9988,1.1777;0.7478,4.9009,0.3429;0.7408,4.7048,2.5143;-0.1808,5.2949,2.5767;1.5661,5.1805,3.5827;2.4028,4.6786,3.5087;0.4143,3.213,2.6836;0.168,3.0603,3.7395;-0.7645,2.7285,1.8321;-1.6858,3.2312,2.1474;-0.8967,1.6509,1.9694;-0.6328,2.9194,0.7623;1.6029,2.4348,2.4918;2.2177,2.5775,1.2071;1.5122,2.2941,0.4143;3.4029,1.6031,1.1666;3.9308,1.7495,0.2186;4.0842,1.8531,1.9841;2.9616,0.1604,1.2727;2.4905,-0.5278,0.2964;2.4259,0.2303,-0.8959;1.9862,-0.6207,-1.9461;1.9418,0.0106,-2.8378;0.9932,-1.0341,-1.733;2.6907,-1.4447,-2.1106;3.0521,-0.5359,2.6013;4.0913,-0.5602,2.9549;2.6776,-1.5598,2.5276;2.4675,0.0166,3.3459;2.6364,4.0301,0.9606;2.9941,4.1235,-0.0765;3.691,4.37,1.8662;3.8439,5.3197,1.7112)|\",6.669510475755\r\n\"[H]C1=C([H])C2=C(C([H])=C1[H])N1C([H])([H])C([H])([H])N(C([H])([H])[H])C([H])([H])[C@]1([H])C([H])([H])C([H])([H])O2 |(2.361,-3.3782,6.3707;2.7823,-2.8602,5.5143;3.4353,-3.5676,4.5095;3.5502,-4.6463,4.5588;3.9962,-2.9266,3.4039;3.9113,-1.5166,3.2462;3.2162,-0.8335,4.271;3.0897,0.2388,4.2129;2.675,-1.4794,5.3808;2.161,-0.8887,6.1344;4.5779,-0.812,2.2051;4.5035,0.6565,2.2394;4.6415,1.0164,3.2608;5.3483,1.0355,1.6528;3.1989,1.2013,1.6328;3.2509,2.2967,1.581;2.3435,0.9464,2.2871;2.9988,0.6857,0.2763;1.7553,1.1703,-0.3004;1.7598,2.2661,-0.3181;0.8553,0.8417,0.2564;1.664,0.8139,-1.3327;3.0971,-0.7764,0.2257;2.2851,-1.2715,0.7953;2.9975,-1.0831,-0.8231;4.4705,-1.2018,0.7765;5.1894,-0.5488,0.267;4.9072,-2.6241,0.4183;4.7434,-2.7584,-0.6597;5.9855,-2.7127,0.5949;4.238,-3.7508,1.1815;4.5454,-4.7186,0.7717;3.1409,-3.6958,1.1371;4.6767,-3.7536,2.5377)|\",5.396017655414999\r\n\"[H]C1=C2OC([H])([H])C([H])([H])[C@]3([H])N(C2=C([H])C(N([H])[H])=C1[H])C([H])([H])C([H])([H])N([H])C3([H])[H] 
|(-1.2625,-2.711,0.4918;-1.3929,-1.6554,0.2728;-0.2377,-0.8782,0.2158;0.9347,-1.5415,0.5211;1.9501,-1.5159,-0.4769;2.5746,-2.3953,-0.2862;1.5048,-1.6238,-1.4766;2.7754,-0.2516,-0.3403;3.6795,-0.3323,-0.9607;3.1066,-0.1943,0.7034;2.0839,1.0638,-0.7098;2.7738,1.8426,-0.3538;0.8106,1.3769,-0.0066;-0.3199,0.5157,-0.0484;-1.6201,1.031,-0.2448;-1.7584,2.0773,-0.4834;-2.781,0.2503,-0.158;-4.0361,0.833,-0.408;-4.8082,0.3294,0.0129;-4.0882,1.8171,-0.1711;-2.6633,-1.1181,0.1053;-3.5457,-1.7496,0.1679;0.5618,2.8176,-0.1872;-0.3182,3.1194,0.3866;1.417,3.3474,0.254;0.4422,3.2015,-1.6792;0.3641,4.2901,-1.7889;-0.4719,2.7663,-2.099;1.5686,2.7325,-2.4995;2.3862,3.3052,-2.2863;1.9126,1.3294,-2.2217;1.1173,0.6914,-2.6286;2.836,1.093,-2.7654)|\",5.13750949744\r\n\"[H]C1=C2C(=C([H])\\C(C([H])([H])[H])=C/1[H])\\OC([H])([H])C([H])([H])[C@@]1([H])N\\2C([H])([H])C([H])([H])N([H])C1([H])[H] |(4.9,2.1764,0.0011;4.3399,1.25,-0.0679;4.8941,0.0755,0.4572;4.135,-1.1129,0.3288;2.8876,-1.0847,-0.2962;2.3473,-2.0251,-0.3677;2.3319,0.0996,-0.7934;0.9665,0.0955,-1.4407;0.726,1.0746,-1.8673;0.179,-0.1543,-0.718;0.9069,-0.6453,-2.2475;3.0809,1.2711,-0.6706;2.6875,2.2099,-1.0527;4.5058,-2.3317,0.8298;5.8817,-2.7316,0.8997;6.2278,-2.6211,1.9337;5.8694,-3.8009,0.6605;6.8003,-1.993,-0.07;6.2897,-1.9001,-1.0349;7.6902,-2.6086,-0.2548;7.2506,-0.6137,0.4431;7.5791,0.0009,-0.415;6.1435,0.0582,1.1522;6.5525,1.337,1.7378;6.8588,2.0685,0.9689;5.6995,1.7521,2.2839;7.731,1.1309,2.6845;7.4046,0.4975,3.5308;8.0465,2.1009,3.0858;8.8361,0.547,1.9269;9.6544,0.4446,2.5235;8.4524,-0.7557,1.3901;9.2959,-1.1652,0.8215;8.1931,-1.4839,2.1827)|\",5.423229040464999\r\n\"[H]C1=C([H])C(=O)[C@@]([H])(OC([H])([H])[H])C(Br)=N1 
|(1.7189,2.8984,4.1397;1.994,2.3035,3.2734;1.6898,0.9892,3.2009;1.1232,0.4905,3.9805;2.0074,0.2333,1.9874;1.5945,-0.8891,1.7529;2.9811,0.9373,1.0018;3.9888,0.6045,1.3321;2.7534,0.6084,-0.332;3.3762,-0.5967,-0.7743;3.1602,-0.6718,-1.8419;2.9708,-1.4672,-0.2515;4.4662,-0.5466,-0.6314;3.0102,2.4417,1.217;3.8155,3.4762,-0.1887;2.626,3.0483,2.2635)|\",4.111640281055001\r\n\"[H]OC([H])([H])C([H])([H])C([H])([H])[C@]1([H])C(=O)N=C([H])C(F)=C1[H] |(2.1294,-2.3222,-1.0322;1.1836,-2.1736,-1.1887;0.977,-0.7633,-1.2746;1.5933,-0.3205,-2.072;-0.0677,-0.6336,-1.5683;1.2421,-0.0329,0.0454;2.3226,-0.0291,0.2535;0.9463,1.0153,-0.0776;0.5435,-0.66,1.2612;0.7956,-0.0947,2.1666;0.9222,-1.6785,1.4025;-1.0136,-0.7723,1.1556;-1.2301,-1.3556,0.2531;-1.6582,0.6013,0.9404;-1.6609,1.1233,-0.1576;-2.2255,1.299,2.0439;-2.5374,0.6015,3.082;-3.0051,1.1124,3.9275;-2.287,-0.8261,3.2395;-2.7838,-1.3849,4.3611;-1.5529,-1.5022,2.351;-1.3041,-2.5493,2.5005)|\",4.410965516605\r\n\"[H]C1=N[C@]2([H])C(=O)NC(C([H])([H])[H])=NC2=C([H])C1Br |(7.0057,-1.8524,2.1645;6.1495,-1.6742,1.5139;5.959,-0.4868,1.0793;4.8451,-0.2434,0.1958;5.2894,-0.0409,-0.799;4.0928,1.0845,0.4745;4.4707,1.9093,1.2689;2.9719,1.273,-0.3515;2.384,0.2304,-0.8572;1.2098,0.4196,-1.7685;0.3143,-0.0188,-1.3122;1.3777,-0.1205,-2.7082;1.0446,1.479,-1.9675;2.6766,-1.1156,-0.5737;3.8225,-1.3391,-0.0041;4.1523,-2.6644,0.4697;3.4608,-3.4733,0.2615;5.2804,-2.8183,1.1993;5.7917,-4.5097,1.9037)|\",3.44496134733\r\n\"[H]C1=C([H])C(=O)[C@]2([H])OC([H])([H])C([H])([H])OC2=N1 
|(2.7739,-1.943,-0.063;2.0373,-1.1425,-0.0613;2.4255,0.1376,0.1806;3.4494,0.3801,0.4446;1.4172,1.192,0.2391;1.5706,2.3102,0.6937;0.0964,0.7799,-0.47;0.3673,0.8846,-1.5383;-0.973,1.6667,-0.2302;-2.0286,1.0807,0.5133;-1.7036,0.8241,1.5338;-2.8189,1.8331,0.5777;-2.518,-0.1556,-0.2125;-2.8508,0.1035,-1.2242;-3.3322,-0.6578,0.3167;-1.4611,-1.1344,-0.3084;-0.1923,-0.7006,-0.3109;0.7321,-1.5926,-0.2248)|\",4.427292347635\r\n\"[H]C1=C([H])/C(=C(/[H])NO)[C@]2([H])OC([H])([H])C([H])([H])O\\C2=N\\1 |(2.6808,-2.0866,-0.2667;1.9719,-1.2697,-0.1638;2.3786,-0.062,0.3134;3.3913,0.0892,0.6667;1.4237,1.0084,0.435;1.631,2.1818,1.0997;0.8765,2.9616,1.1716;2.8699,2.4014,1.7145;2.9432,3.4826,2.3074;0.143,0.7932,-0.3588;0.41,1.0515,-1.4012;-0.9085,1.6737,0.0069;-2.0353,1.0142,0.5676;-1.7869,0.5609,1.5397;-2.7969,1.7826,0.7227;-2.5232,-0.044,-0.399;-2.7784,0.4091,-1.3642;-3.3914,-0.587,-0.0162;-1.5038,-1.042,-0.6074;-0.2196,-0.6871,-0.4427;0.657,-1.624,-0.4493)|\",2.78644582912\r\n\"[H]C1=NC([H])=C([H])C([C@@]2([H])N([H])N3C(=O)C([H])=C([H])N([H])[C@@]3([H])C2([H])[H])=C1[H] |(-2.7825,1.4824,0.3522;-1.7857,1.0781,0.1832;-1.7142,-0.2472,0.0016;-0.4949,-0.7539,-0.2175;-0.4442,-1.8309,-0.3719;0.6678,0.0142,-0.2619;1.6258,-0.4624,-0.4562;0.5881,1.3971,-0.0595;1.8577,2.2286,-0.0671;2.6192,1.6595,-0.6128;1.7684,3.572,-0.7484;0.8364,3.7663,-1.1182;1.984,4.58,0.25;0.8466,5.2334,0.7585;-0.1905,5.2644,0.0932;1.0335,5.8792,2.0505;0.2543,6.5371,2.4106;2.1473,5.6215,2.7765;2.3226,6.0757,3.7476;3.1483,4.8075,2.3264;3.8798,4.5349,2.9652;2.9674,3.9826,1.1496;3.9228,3.9206,0.6129;2.3977,2.5554,1.3511;3.1421,1.8292,1.6904;1.5845,2.5943,2.0836;-0.6854,1.9361,0.165;-0.8314,3.0039,0.304)|\",4.879001339465\r\n\"[H]C1=NC([H])=C([H])C([H])=C1[C@@]1([H])N([H])N2C(=O)C([H])=C([H])N([H])[C@@]2([H])C1([H])[H] 
|(-1.5428,0.4106,-0.522;-0.5801,-0.0356,-0.2691;0.457,0.8068,-0.2297;1.6543,0.2848,0.0629;2.4888,0.9838,0.0885;1.8566,-1.0724,0.3165;2.8523,-1.4477,0.5343;0.7636,-1.9357,0.2783;0.8991,-3.0008,0.4505;-0.5027,-1.4124,-0.0159;-1.7694,-2.2443,-0.0534;-2.5205,-1.673,-0.6111;-1.6588,-3.5819,-0.7492;-0.7146,-3.7683,-1.0908;-1.8995,-4.6008,0.2326;-0.7739,-5.2454,0.7766;0.2886,-5.2573,0.1509;-0.9999,-5.9069,2.0544;-0.2282,-6.5626,2.4343;-2.1412,-5.6671,2.7432;-2.3473,-6.1342,3.7022;-3.1313,-4.8552,2.267;-3.888,-4.5958,2.8814;-2.9122,-4.0141,1.1078;-3.8492,-3.9459,0.5408;-2.3482,-2.591,1.3445;-3.1013,-1.8686,1.6729;-1.5562,-2.6401,2.0996)|\",4.821857430860001\r\n\"[H]C1=NC([C@@]2([H])N([H])N3C(=O)C([H])=C([H])N([H])[C@@]3([H])C2([H])[H])=C([H])C([H])=C1[H] |(-0.1981,-1.7054,-1.1168;-0.3013,-0.7489,-0.6066;0.7224,0.1027,-0.7507;0.642,1.2943,-0.1377;1.8325,2.2182,-0.3485;2.4219,1.7841,-1.1588;1.4908,3.615,-0.7325;0.5643,3.6965,-1.1507;1.4316,4.3055,0.5252;0.9387,5.6006,0.5932;0.1993,6.0592,-0.2732;1.3051,6.2599,1.8583;0.8988,7.2461,2.0411;2.0255,5.6041,2.7969;2.224,6.0333,3.775;2.5012,4.316,2.6057;3.2703,4.0433,3.205;2.6499,3.9204,1.2135;3.5204,4.4258,0.7495;2.708,2.4056,0.938;3.7328,2.058,0.7844;2.277,1.8609,1.782;-0.4666,1.6702,0.6367;-0.4835,2.64,1.1243;-1.5264,0.7771,0.7685;-2.3992,1.0445,1.3582;-1.4467,-0.4638,0.1349;-2.249,-1.1915,0.2112)|\",4.6858005056100005\r\n\"[H]/N=C(O[H])\\C([H])=C(/[H])C1=C2N=C([H])C([H])=C([H])N2N=C1[H] 
|(3.369,4.4865,0.9638;2.9396,5.267,0.4648;1.8299,4.8929,-0.0392;1.0821,5.8532,-0.6636;0.3453,5.4262,-1.1263;1.2168,3.5443,-0.0309;0.13,3.4844,-0.0552;1.9404,2.4075,-0.0132;3.0278,2.4849,-0.0177;1.4234,1.0568,-0.0001;0.0907,0.6067,-0.0015;-1.0963,1.244,-0.0224;-2.1823,0.4939,-0.0184;-3.1348,1.0193,-0.0347;-2.1622,-0.9278,0.0054;-3.0823,-1.4996,0.0083;-0.9494,-1.5653,0.0232;-0.7922,-2.6371,0.0407;0.1694,-0.797,0.019;1.4425,-1.2555,0.0318;2.1773,-0.1384,0.0192;3.257,-0.2262,0.0261)|\",3.855853261585\r\n\"[H]C1=NC([H])=C(C(=O)[C@]2([H])C([H])([H])N([H])N([H])[C@]2([H])N([H])[H])C([H])=C1[H] |(-4.0719,-4.2049,5.1256;-3.6034,-4.2432,4.1436;-4.2989,-3.6895,3.1394;-3.749,-3.7274,1.9245;-4.318,-3.2821,1.1127;-2.4938,-4.2971,1.6396;-2.0129,-4.2764,0.2214;-2.7803,-3.9369,-0.676;-0.5977,-4.6822,-0.1222;0.0267,-4.7606,0.7728;0.0948,-3.722,-1.1171;0.5008,-2.8225,-0.6434;-0.6198,-3.4156,-1.889;1.1673,-4.5203,-1.7143;1.9539,-4.5509,-1.0659;0.6721,-5.8644,-1.8043;0.3396,-6.0142,-2.7556;-0.5083,-6.0557,-0.9127;-0.3053,-6.864,-0.2004;-1.6494,-6.458,-1.711;-2.1585,-5.6436,-2.0537;-2.3071,-7.0061,-1.1615;-1.7884,-4.8652,2.7086;-0.8196,-5.3314,2.5585;-2.354,-4.8422,3.981;-1.8389,-5.2792,4.831)|\",3.94020855524\r\n\"[H]C1=C(N2C([H])([H])C([H])([H])C([H])([H])C([H])([H])C2([H])[H])OC([C@]2([H])N([H])C([H])([H])C([H])([H])C([H])([H])C2([H])[H])=C1[H] 
|(-0.2943,-4.098,-0.0256;-0.9044,-3.6537,0.7463;-1.0738,-2.3176,0.9951;-0.508,-1.1632,0.4725;0.4394,-1.3885,-0.6197;1.1034,-2.2073,-0.3254;-0.0877,-1.7081,-1.539;1.2583,-0.1244,-0.9009;1.9032,0.0771,-0.0356;1.9149,-0.3079,-1.7604;0.3503,1.0858,-1.1564;-0.2073,0.9343,-2.0924;0.9504,1.9941,-1.2896;-0.6399,1.2565,0.0024;-1.3449,2.0716,-0.2037;-0.0953,1.5189,0.9188;-1.4254,-0.0335,0.2452;-2.0812,-0.2396,-0.6218;-2.0656,0.0655,1.1237;-1.9455,-2.1406,2.033;-2.3515,-3.3979,2.4551;-3.3405,-3.4309,3.5733;-3.5448,-4.4928,3.7747;-4.6208,-2.8338,3.1376;-4.4335,-1.8849,2.8135;-5.6243,-2.7936,4.2064;-6.5148,-2.2868,3.8152;-5.9215,-3.8317,4.4171;-5.1433,-2.1319,5.5081;-4.9697,-1.0606,5.3265;-5.9217,-2.2015,6.2798;-3.841,-2.789,5.9876;-4.0517,-3.8255,6.2919;-3.4517,-2.2769,6.8767;-2.7846,-2.7939,4.8721;-1.8849,-3.3357,5.1912;-2.4727,-1.7642,4.649;-1.729,-4.3409,1.7023;-1.8527,-5.4112,1.7988)|\",5.711669721995\r\n\"[H]C1=C(N2C([H])([H])C([H])([H])C([H])([H])C2([H])[H])OC([C@]2([H])N([H])C([H])([H])C([H])([H])C([H])([H])C2([H])[H])=C1[H] |(-5.9455,1.4434,0.6795;-5.0583,0.8295,0.6374;-4.927,-0.4587,1.0925;-5.79,-1.3289,1.7031;-7.2281,-1.0568,1.7008;-7.4918,-0.3687,2.5172;-7.5366,-0.5817,0.7584;-7.8701,-2.4462,1.8864;-8.7862,-2.4062,2.4831;-8.1285,-2.8696,0.9086;-6.7567,-3.2844,2.5399;-6.8849,-4.3605,2.3893;-6.7203,-3.0952,3.6192;-5.4838,-2.7486,1.8735;-5.2977,-3.2459,0.9067;-4.5877,-2.881,2.4887;-3.6546,-0.908,0.8946;-2.9409,0.124,0.2874;-1.512,-0.1637,-0.032;-1.1046,0.7548,-0.4794;-1.4199,-1.211,-1.0715;-1.9269,-2.0327,-0.7431;-0.0389,-1.5798,-1.3953;-0.0694,-2.3986,-2.1244;0.4189,-0.7206,-1.9083;0.8235,-1.9594,-0.1806;0.4459,-2.8974,0.2534;1.8588,-2.1495,-0.4947;0.7722,-0.8485,0.8782;1.2826,0.0457,0.4896;1.3196,-1.149,1.7808;-0.6797,-0.4928,1.2334;-0.7109,0.3585,1.9255;-1.153,-1.3385,1.7512;-3.767,1.1881,0.1188;-3.4931,2.1323,-0.3332)|\",5.61098759731\r\n\"[H]C1=C(N(=O)=O)OC([C@]2([H])N([H])C([H])([H])C([H])([H])C([H])([H])C2([H])[H])=C1[H] 
|(-5.9847,0.5795,1.1368;-4.9088,0.4965,1.1435;-4.1116,0.4871,2.2526;-4.4411,0.5783,3.6364;-5.6424,0.7022,3.8987;-3.5235,0.5283,4.4559;-2.8012,0.3693,1.9251;-2.7488,0.2941,0.5622;-1.3874,0.1661,-0.0424;-1.5371,0.0934,-1.129;-0.7697,-1.1022,0.3844;-0.7142,-1.1065,1.4034;0.5784,-1.2745,-0.174;0.9882,-2.2098,0.2233;0.4682,-1.4141,-1.2593;1.5275,-0.0958,0.0969;1.7385,-0.0429,1.1751;2.4888,-0.2608,-0.4068;0.8933,1.2245,-0.3671;0.8139,1.2226,-1.4644;1.5319,2.0755,-0.1017;-0.5057,1.4088,0.2406;-0.9878,2.3098,-0.1599;-0.4283,1.5425,1.328;-4.016,0.3689,0.0427;-4.275,0.3302,-1.0059)|\",3.893949200655\r\n\"[H]C1=C(N(C([H])([H])[H])C([H])([H])[H])OC([C@]2([H])N([H])C([H])([H])C([H])([H])C([H])([H])C2([H])[H])=C1[H] |(4.5269,0.3046,1.2927;4.496,-0.7482,1.056;5.1887,-1.3893,0.0636;6.0259,-0.9519,-0.9521;6.2997,0.4754,-0.9308;6.8446,0.7455,-1.84;6.8995,0.7873,-0.0576;5.3557,1.0289,-0.9173;7.1849,-1.7895,-1.2549;7.6155,-1.4627,-2.2064;6.8691,-2.8286,-1.357;7.9661,-1.7331,-0.4775;4.9101,-2.726,0.0821;4.0178,-2.9501,1.1213;3.5861,-4.3649,1.3244;2.9129,-4.3616,2.194;4.7431,-5.2071,1.6933;5.4465,-5.1182,0.96;4.3881,-6.6209,1.8595;5.3081,-7.1786,2.0713;3.7601,-6.6977,2.7595;3.6367,-7.2348,0.6658;4.3112,-7.2763,-0.2027;3.3466,-8.2696,0.8922;2.4058,-6.3864,0.3185;1.6762,-6.4566,1.1395;1.9046,-6.7765,-0.5765;2.796,-4.9156,0.1099;1.9058,-4.2965,-0.0598;3.4242,-4.8194,-0.7864;3.7369,-1.7682,1.7272;3.0753,-1.6315,2.572)|\",5.7171119990050006\r\n\"[H]C1=C([H])C([H])=C2C(=C1[H])O/C([C@]1([H])N([H])C([H])([H])C([H])([H])C([H])([H])C1([H])[H])=C\\2C([H])([H])[H] 
|(6.3216,3.4526,-0.2555;5.4323,2.8353,-0.1647;4.1873,3.4455,0.0727;4.1354,4.5271,0.1626;3.0231,2.69,0.195;2.0671,3.1725,0.3801;3.1149,1.2956,0.0768;4.3736,0.7178,-0.1606;5.5471,1.4497,-0.286;6.4993,0.9629,-0.4706;4.2562,-0.6426,-0.2481;2.9127,-0.934,-0.0585;2.5769,-2.3902,-0.1254;1.5038,-2.4871,0.0905;3.2876,-3.1124,0.947;4.2856,-2.9266,0.8491;3.0522,-4.5601,0.9073;3.662,-5.0211,1.6931;2.0012,-4.7315,1.1836;3.3305,-5.2117,-0.4573;4.4085,-5.1577,-0.6702;3.0655,-6.2771,-0.4295;2.551,-4.4869,-1.564;1.4738,-4.6561,-1.4156;2.7985,-4.9015,-2.5491;2.8348,-2.9775,-1.5369;2.216,-2.4516,-2.2755;3.8824,-2.7881,-1.8066;2.1754,0.1936,0.1417;0.7025,0.3094,0.3906;0.2078,0.8906,-0.3985;0.5018,0.8204,1.3407;0.2189,-0.6709,0.4343)|\",5.344316023819999\r\n\"[H]C1=C(N(C([H])([H])C([H])([H])[H])C([H])([H])C([H])([H])[H])OC([C@]2([H])N([H])C([H])([H])C([H])([H])C([H])([H])C2([H])[H])=C1[H] |(5.7259,2.7611,-2.6215;5.8627,1.7269,-2.3443;4.9188,0.8672,-1.8371;3.6037,1.0048,-1.4406;2.7344,-0.1793,-1.5389;1.737,0.163,-1.8377;3.0974,-0.8289,-2.3443;2.6474,-0.9774,-0.2356;1.989,-1.8457,-0.3649;2.2488,-0.3618,0.5779;3.6378,-1.3366,0.0575;2.9942,2.3199,-1.6191;3.7938,3.0617,-1.5438;2.5574,2.4203,-2.6284;1.9374,2.637,-0.5591;1.5349,3.6427,-0.7255;2.3745,2.5983,0.444;1.0962,1.9365,-0.5916;5.4588,-0.3719,-1.6644;6.785,-0.3157,-2.0665;7.5664,-1.5841,-1.959;6.9412,-2.4044,-2.3472;7.817,-1.9155,-0.5393;8.297,-1.1238,-0.1098;8.6254,-3.1283,-0.3748;8.8125,-3.2682,0.6965;8.0117,-3.9809,-0.7011;9.9444,-3.1202,-1.1656;10.6092,-2.349,-0.7483;10.4641,-4.0812,-1.0527;9.6765,-2.8196,-2.6476;9.1214,-3.6601,-3.0906;10.6186,-2.7407,-3.2043;8.8541,-1.5316,-2.8099;8.5983,-1.3665,-3.864;9.459,-0.671,-2.4888;7.0633,0.9512,-2.4777;8.0128,1.3043,-2.8566)|\",5.61642987432\r\n\"[H]C1=C(C([H])([H])OC(=O)C([H])([H])[H])OC([C@]2([H])N([H])C([H])([H])C([H])([H])C([H])([H])C2([H])[H])=C1[H] 
|(7.5168,2.6029,1.381;6.692,3.2077,1.0314;5.4988,2.7322,0.5725;4.9503,1.3677,0.3791;3.8646,1.3668,0.4921;5.4045,0.674,1.088;5.1675,0.8641,-0.9716;6.3728,0.2896,-1.2005;7.2234,0.134,-0.3511;6.5054,-0.111,-2.6512;7.3861,-0.7425,-2.7744;6.6163,0.7884,-3.2675;5.6084,-0.6349,-2.9932;4.7001,3.7838,0.1895;5.4015,4.9371,0.4177;4.7269,6.2152,0.032;5.4077,7.028,0.3238;3.492,6.3979,0.8169;2.8917,5.5873,0.6644;2.7679,7.6208,0.4518;1.8421,7.6523,1.0379;3.3797,8.4736,0.7806;2.4731,7.7589,-1.0513;1.7537,6.9827,-1.3518;1.9986,8.7279,-1.2561;3.7651,7.6053,-1.8676;4.4242,8.4639,-1.6682;3.5485,7.6262,-2.9429;4.4979,6.3073,-1.4982;5.4604,6.2379,-2.0213;3.9016,5.4399,-1.8131;6.628,4.634,0.9337;7.3939,5.3445,1.2126)|\",5.902149417345\r\n\"[H]OC([H])([H])[C@]1([H])N(C(=O)OC(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C([H])([H])C([H])([H])[C@@]([H])(O[H])C1([H])[H] |(7.1488,1.7842,0.4155;7.9175,1.1891,0.3497;7.8806,0.5873,-0.9289;8.3591,1.2208,-1.6959;8.4794,-0.3283,-0.8552;6.4448,0.2842,-1.4104;5.9565,1.2451,-1.5891;5.6375,-0.3792,-0.3715;5.03,0.4317,0.5528;5.0694,1.6617,0.5013;4.367,-0.2778,1.4892;3.6883,0.3868,2.6182;4.7112,1.1544,3.4609;4.2315,1.5208,4.3755;5.1149,2.0062,2.9115;5.5377,0.4961,3.7493;3.1224,-0.8015,3.4011;2.585,-0.4469,4.2869;3.9267,-1.4684,3.7282;2.4254,-1.3757,2.7819;2.56,1.2858,2.1045;1.8715,0.712,1.4743;2.9577,2.1208,1.5269;1.9923,1.68,2.9551;5.8736,-1.8087,-0.1387;6.8704,-1.9521,0.306;5.1416,-2.1633,0.584;5.7706,-2.5982,-1.4478;4.7347,-2.5544,-1.8155;6.0132,-3.6516,-1.2675;6.7048,-2.0254,-2.5166;7.7426,-2.1583,-2.1884;6.6328,-2.7453,-3.7435;5.7285,-2.6473,-4.0842;6.4129,-0.5324,-2.7161;7.1252,-0.109,-3.4337;5.412,-0.4355,-3.1614)|\",7.57020732091\r\n\"[H]C1=C(C([H])([H])C([H])([H])C([H])([H])[H])OC([C@]2([H])N([H])C([H])([H])C([H])([H])C([H])([H])C2([H])[H])=C1[H] 
|(5.203,-1.254,-1.2606;5.5045,-0.2235,-1.1292;4.6672,0.842,-0.9746;3.1804,0.9845,-0.9552;2.759,0.0055,-0.6978;2.894,1.6716,-0.1469;2.5447,1.4674,-2.28;2.833,0.7753,-3.0819;1.4539,1.389,-2.1778;2.9153,2.9,-2.674;2.4043,3.1948,-3.5977;3.9927,3.0059,-2.8351;2.6267,3.6121,-1.8908;5.4157,1.986,-0.8264;6.7425,1.6307,-0.8806;7.7339,2.739,-0.7235;8.731,2.2768,-0.7622;7.6021,3.3435,0.6168;6.6409,3.6672,0.7256;8.5226,4.4661,0.826;8.3167,4.8906,1.8158;9.5416,4.0535,0.8666;8.4582,5.5525,-0.2608;7.4828,6.0587,-0.2077;9.2235,6.3186,-0.0777;8.6383,4.9278,-1.652;9.6647,4.5408,-1.7422;8.5219,5.6856,-2.4369;7.6441,3.777,-1.8706;7.8278,3.2812,-2.8326;6.6191,4.1714,-1.9054;6.844,0.2859,-1.0674;7.7635,-0.2791,-1.138)|\",6.163378713825\r\n\"[H]OC1=NN([H])[C@@]2([H])C([H])([H])C([H])([H])N(C([H])([H])C3=C([H])C([H])=C([H])C([H])=C3[H])C([H])([H])[C@]12[H] |(8.0334,-3.325,-0.7003;7.7353,-4.1038,-1.1931;6.4024,-4.2551,-1.0032;5.7557,-5.1757,-1.6163;4.3522,-5.0618,-1.2662;4.1552,-5.839,-0.6305;4.2137,-3.7553,-0.587;4.063,-3.0214,-1.3894;3.1313,-3.5401,0.465;2.1593,-3.3713,-0.0093;3.0329,-4.4205,1.114;3.5523,-2.3117,1.3273;4.0319,-2.6553,2.2641;2.6657,-1.7451,1.6277;4.4256,-1.4014,0.5711;4.5239,-0.0661,1.1528;3.5048,0.2515,1.4107;5.1008,-0.0674,2.0997;5.1323,0.9519,0.2034;4.6622,1.0627,-1.1128;3.8864,0.3821,-1.4528;5.1909,2.0198,-1.9772;4.8173,2.0928,-2.9953;6.1967,2.8852,-1.5374;6.6081,3.6319,-2.2113;6.6715,2.7823,-0.23;7.4568,3.4474,0.1199;6.1435,1.8177,0.6318;6.5201,1.7381,1.6496;5.7523,-1.9741,0.2655;6.1497,-1.4557,-0.6158;6.4647,-1.7996,1.0957;5.5891,-3.4775,0.0109;5.6937,-4.0141,0.9704)|\",5.308941223254999\r\n\"[H]OC([H])([H])[C@@]([H])(C1=C([H])N([H])C2=NC([H])=C([H])C2=C1[H])N([H])C([H])([H])[H] 
|(2.8352,3.3805,-0.2073;2.8718,3.5164,-1.178;2.6679,2.2113,-1.6931;1.5911,1.9853,-1.7633;3.0783,2.1756,-2.7072;3.3524,1.1541,-0.7961;2.9687,0.1619,-1.0921;4.8657,1.1096,-0.9272;5.6037,2.2739,-1.1034;5.1332,3.2448,-1.1918;6.9577,2.2288,-1.1888;7.4886,3.0824,-1.3272;7.657,1.0635,-1.0982;8.9698,0.9248,-1.1683;9.1381,-0.4411,-1.0237;10.1435,-0.8487,-1.0422;7.9604,-1.1591,-0.8667;7.8529,-2.2279,-0.7394;6.9254,-0.1835,-0.9093;5.5434,-0.1306,-0.828;4.9584,-1.0394,-0.6952;2.975,1.4636,0.6035;3.7061,1.1303,1.2262;1.6823,0.9172,1.0202;1.5188,1.1476,2.0774;1.5983,-0.1747,0.8825;0.8749,1.389,0.4508)|\",4.078986618995\r\n\"[H]C1=C([H])/C2=C([H])/C([C@@]([H])(N([H])[H])C([H])([H])C(=O)OC([H])([H])[H])=C(/[H])N([H])C2=N1 |(-5.7581,6.8409,-2.2501;-4.8576,6.319,-1.9429;-4.1385,5.4019,-2.6974;-4.3558,5.0618,-3.7009;-3.054,5.0077,-1.8635;-1.9616,4.157,-1.9151;-1.7765,3.5574,-2.805;-1.0827,4.0513,-0.8101;0.142,3.133,-0.8403;0.5011,3.026,0.1886;-0.0986,1.7746,-1.3454;-0.3916,1.8046,-2.3215;-0.8726,1.3543,-0.8328;1.3005,3.7429,-1.6492;0.9931,3.995,-2.6706;2.0946,2.9891,-1.7341;1.8988,4.9702,-0.9912;1.8389,5.2336,0.1931;2.5532,5.7333,-1.8894;3.2119,6.8937,-1.3542;3.9727,6.6011,-0.6254;2.4895,7.5544,-0.8678;3.6704,7.3898,-2.2098;-1.3311,4.8121,0.3224;-0.678,4.7962,1.1856;-2.4004,5.6487,0.3862;-2.5713,6.2011,1.2199;-3.2757,5.7844,-0.6484;-4.3422,6.5649,-0.6824)|\",4.062659787965\r\n\"[H]OC(=O)[C@@](C1=C(\\[H])N([H])C2=NC([H])=C([H])\\C2=C\\1[H])(N([H])[H])C([H])([H])[H] 
|(0.8002,-0.6406,-1.347;0.1904,-0.3076,-0.6413;0.9986,0.1898,0.3063;0.5957,0.736,1.305;2.5212,-0.0327,0.0188;2.9467,-1.2709,0.8158;3.6584,-1.1417,1.9991;3.959,-0.1899,2.4129;4.0161,-2.234,2.7229;4.5299,-2.1256,3.5912;3.702,-3.4997,2.3337;4.0103,-4.6179,2.9635;3.4488,-5.5879,2.1491;3.5499,-6.63,2.4333;2.7977,-5.099,1.0257;2.2893,-5.6684,0.2596;2.9411,-3.6854,1.1027;2.5708,-2.5684,0.3755;1.993,-2.6716,-0.5383;2.6496,-0.3418,-1.4285;3.4938,-0.8817,-1.6075;2.7243,0.5227,-1.9639;3.2756,1.2454,0.4139;3.0247,1.5531,1.4314;2.9791,2.0674,-0.2477;4.3582,1.1044,0.3261)|\",4.03816954142\r\n\"[H]C1=NC([H])=C([H])C(C([H])([H])[C@@]([H])(C2=C([H])N([H])C3=NC([H])=C([H])C3=C2[H])N([H])[H])=C1[H] |(0.5339,-1.5653,1.5355;0.4253,-0.8845,0.6925;-0.2357,-1.364,-0.3684;-0.3717,-0.5366,-1.413;-0.9044,-0.9382,-2.274;0.1271,0.765,-1.4437;-0.0108,1.3789,-2.3308;0.8149,1.2652,-0.3307;1.3476,2.6788,-0.3175;1.7564,2.9362,-1.3011;2.1696,2.7604,0.4057;0.2744,3.7476,0.0156;-0.5581,3.5977,-0.6841;-0.2806,3.5986,1.4307;-1.5809,3.1578,1.6133;-2.2336,2.9018,0.7871;-2.1075,3.0104,2.8586;-3.0619,2.6881,2.9802;-1.3906,3.2866,3.982;-1.8102,3.1817,5.2303;-0.702,3.5821,5.9572;-0.764,3.5926,7.0404;0.3999,3.937,5.1919;1.3645,4.2765,5.5443;-0.0139,3.754,3.8425;0.5082,3.9008,2.5689;1.5285,4.2551,2.4304;0.8409,5.0687,-0.2948;1.6703,5.2345,0.2752;0.1743,5.7945,-0.0335;0.9613,0.4022,0.7605;1.4856,0.7251,1.6558)|\",4.03272726441\r\n\"[H]C1=C([H])/C2=C([H])/C([C@@]([H])(N([H])C([H])([H])[H])C([H])([H])[H])=C(/[H])N([H])C2=N1 
|(6.422,6.14,1.4113;5.7787,5.2983,1.1762;6.1334,3.9572,1.2057;7.0952,3.536,1.4654;4.9557,3.2552,0.8224;4.5456,1.9473,0.6282;5.2406,1.1239,0.7843;3.2165,1.6634,0.2252;2.7434,0.2241,0.0002;1.6936,0.2739,-0.3229;3.4544,-0.5063,-1.0552;4.4285,-0.6304,-0.7828;3.3881,0.102,-2.3804;3.9544,-0.516,-3.0846;3.773,1.1344,-2.4381;2.3442,0.1155,-2.7171;2.7974,-0.6007,1.2939;2.213,-0.124,2.0875;3.828,-0.6995,1.6573;2.4054,-1.6056,1.1113;2.3409,2.7161,0.0205;1.3131,2.5704,-0.2914;2.7288,4.0071,0.2098;2.0785,4.7702,0.0548;3.9922,4.3292,0.5997;4.4676,5.545,0.8054)|\",4.03816954142\r\n\"[H]OC([H])([H])C([H])([H])C([H])([H])C([H])([H])[C@@]([H])(C1=C(\\[H])N([H])C2=NC([H])=C([H])\\C2=C\\1[H])N([H])[H] |(0.4778,-0.11,-4.9792;0.5243,-0.6745,-4.1917;1.181,0.0694,-3.1719;0.5853,0.9493,-2.8722;1.2171,-0.6024,-2.3084;2.5894,0.5097,-3.5809;3.1677,-0.3818,-3.8539;2.5088,1.1105,-4.4998;3.332,1.3482,-2.5263;2.7577,2.2657,-2.3357;4.2897,1.6743,-2.9508;3.5838,0.657,-1.1754;2.6379,0.3771,-0.6981;4.0529,1.3856,-0.4967;4.4565,-0.6123,-1.225;3.9597,-1.3381,-1.8805;5.8492,-0.3749,-1.8157;6.7085,0.5569,-1.2485;6.4384,1.172,-0.3981;7.9615,0.7497,-1.7393;8.5803,1.4345,-1.3176;8.4373,0.0504,-2.8054;9.6356,0.168,-3.3528;9.5894,-0.7581,-4.3785;10.4614,-0.8794,-5.0129;8.3902,-1.4519,-4.4844;8.1333,-2.2145,-5.207;7.5742,-0.9392,-3.4391;6.2992,-1.1277,-2.9268;5.6251,-1.859,-3.3697;4.4529,-1.2206,0.1182;4.8929,-0.5854,0.7845;5.0299,-2.0611,0.1127)|\",4.0272849874\r\n\"[H]C1=C([H])C2=C([H])C([C@@]([H])(N([H])[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])=C([H])N([H])C2=N1 
|(6.712,9.4431,2.9817;6.3266,8.5544,2.4926;7.0572,7.4356,2.1212;8.1166,7.2653,2.2575;6.1092,6.5548,1.5259;6.0887,5.3016,0.9386;6.9932,4.7111,0.8217;4.8695,4.7671,0.4495;4.824,3.3823,-0.1776;3.8191,3.255,-0.6236;5.9039,3.2767,-1.1721;5.8639,2.3658,-1.6276;5.7747,3.9772,-1.9008;5.0097,2.2517,0.856;5.0901,1.3007,0.3075;5.9736,2.3965,1.3606;3.8827,2.138,1.888;3.7987,3.0797,2.4474;2.9213,2.0039,1.3685;4.0854,0.9811,2.8752;4.1646,0.0366,2.3181;5.0475,1.1108,3.3903;2.9625,0.8703,3.9115;3.135,0.0354,4.6;1.9908,0.708,3.4285;2.8849,1.7858,4.5106;3.7107,5.5174,0.5526;2.7513,5.1685,0.1888;3.72,6.7509,1.1273;2.8679,7.2971,1.1926;4.8614,7.3056,1.6222;4.9755,8.4923,2.1917)|\",4.068102064975\r\n\"[H]OC(=O)C([H])([H])C([H])([H])C([H])([H])[C@@]([H])(C1=C([H])N([H])C2=NC([H])=C([H])C2=C1[H])N([H])[H] |(3.1433,4.9481,-1.5139;2.4571,5.264,-2.1254;1.26,4.7018,-1.8158;0.2857,4.984,-2.4726;1.2713,3.7563,-0.618;2.0686,3.0129,-0.7665;1.5667,4.348,0.261;-0.0755,3.0683,-0.3699;-0.0467,2.5906,0.6176;-0.8533,3.8368,-0.3236;-0.4173,2.0175,-1.4351;0.2825,1.1765,-1.3636;-0.2778,2.4455,-2.4359;-1.8307,1.423,-1.3301;-1.9359,0.9847,-0.3278;-2.9749,2.4252,-1.5048;-2.8207,3.608,-2.2162;-1.8728,3.9565,-2.6066;-3.8771,4.44,-2.4221;-3.7455,5.3093,-2.9289;-5.1248,4.1669,-1.9532;-6.2058,4.9145,-2.1008;-7.1711,4.1862,-1.4279;-8.184,4.5737,-1.3877;-6.7186,3.0027,-0.8608;-7.293,2.2821,-0.2945;-5.335,2.9422,-1.19;-4.2528,2.1038,-0.9815;-4.3672,1.184,-0.4083;-1.9046,0.2834,-2.2716;-1.8757,0.6494,-3.2248;-2.8185,-0.1619,-2.1865)|\",4.057217510955\r\n\"[H]C1=C([H])C2=C([H])C([C@@]3([H])N([H])C([H])([H])C([H])([H])OC3([H])[H])=C([H])N([H])C2=N1 
|(-3.4606,-0.3932,0.0198;-2.3922,-0.2035,0.011;-1.7738,1.0301,0.1527;-2.2493,1.9913,0.2937;-0.3787,0.7563,0.0754;0.8163,1.4531,0.125;0.843,2.5327,0.2432;2.0467,0.7591,-0.0046;3.3685,1.4918,0.0859;4.1804,0.814,-0.2425;3.3337,2.7231,-0.7153;3.1084,2.4937,-1.6814;4.6023,3.4537,-0.6453;5.4545,2.8698,-1.0404;4.5161,4.3744,-1.2342;4.8858,3.793,0.8151;4.1085,4.4778,1.1916;5.8628,4.2728,0.9259;4.9251,2.612,1.6076;3.6987,1.9001,1.5349;3.8147,1.0137,2.1652;2.8738,2.5132,1.9327;2.0371,-0.614,-0.1893;2.9428,-1.1979,-0.3049;0.8682,-1.3063,-0.2395;0.8682,-2.3106,-0.3824;-0.3405,-0.6906,-0.1138;-1.526,-1.2717,-0.1555)|\",4.062659787965\r\n\"[H]C1=C([H])C2=C([H])C([C@]3([H])N([H])C([H])([H])C([H])([H])N([H])C3([H])[H])=C([H])N([H])C2=N1 |(-2.1959,-2.651,-0.8071;-1.3981,-1.9215,-0.7117;-0.0412,-2.1573,-0.8852;0.4375,-3.0932,-1.1402;0.5842,-0.9001,-0.6511;1.8592,-0.358,-0.6446;2.7252,-0.9784,-0.8735;2.0629,1.0141,-0.3513;3.4948,1.5524,-0.3161;4.0754,0.9281,-1.0073;4.1702,1.4348,0.995;4.073,0.4806,1.3371;3.6659,2.3751,1.999;2.5976,2.2314,2.2401;4.2383,2.229,2.9221;3.8695,3.7992,1.4851;4.9543,3.9852,1.3831;3.4628,4.5204,2.2029;3.1546,3.9455,0.2112;3.2592,4.8971,-0.1352;3.6666,3.0049,-0.7935;3.1195,3.155,-1.7303;4.742,3.1578,-0.9951;0.9578,1.8108,-0.075;1.0538,2.8664,0.1456;-0.2972,1.2879,-0.0849;-1.0974,1.8794,0.1135;-0.5414,-0.0239,-0.3541;-1.7237,-0.6165,-0.3864)|\",4.0653809264700005\r\n\"[H]OC(=O)C([H])([H])C([H])([H])[C@@]([H])(C1=C(\\[H])N([H])C2=NC([H])=C([H])\\C2=C\\1[H])N([H])[H] 
|(2.2434,-1.1139,-2.203;2.0973,-0.4814,-2.9591;1.0453,0.3146,-2.6792;0.7264,1.2222,-3.4115;0.2991,-0.025,-1.386;-0.6512,0.5136,-1.4118;0.0723,-1.0982,-1.387;1.048,0.3467,-0.0843;0.4613,-0.0404,0.7597;1.0668,1.4357,0.034;2.5039,-0.1566,0.0202;3.1044,0.42,-0.692;3.1112,0.0763,1.3991;2.5432,-0.4943,2.5305;1.6501,-1.1074,2.5037;3.0927,-0.3109,3.7592;2.6688,-0.7283,4.5815;4.2142,0.4373,3.9462;4.8181,0.6738,5.0967;5.8825,1.4735,4.7199;6.5664,1.8271,5.4843;5.9571,1.7415,3.3591;6.699,2.338,2.846;4.8473,1.0616,2.7882;4.2795,0.8647,1.5398;4.7189,1.3126,0.6504;2.5836,-1.5594,-0.4627;1.9746,-2.1667,0.0858;3.5313,-1.914,-0.337)|\",4.0218427103900005\r\n\"[H]C1=C([H])C2=C([H])C([C@@]3([H])N([H])C3([H])[H])=C([H])N([H])C2=N1 |(-3.4369,-0.167,-0.0341;-2.3593,-0.0427,-0.0013;-1.6719,1.1527,0.1521;-2.0907,2.1436,0.2643;-0.2946,0.7927,0.1354;0.938,1.4169,0.2243;1.0325,2.4944,0.3277;2.1248,0.6465,0.1555;3.4724,1.279,0.2854;4.2853,0.6908,-0.142;3.5513,2.7423,0.1418;4.4063,3.009,-0.3448;3.8535,2.1413,1.4465;4.8853,2.1091,1.7944;3.117,2.3427,2.2214;2.0404,-0.7257,-0.0165;2.912,-1.3663,-0.0843;0.8335,-1.3454,-0.1097;0.7778,-2.3493,-0.2436;-0.3383,-0.6558,-0.038;-1.5556,-1.164,-0.1214)|\",4.051775233945\r\n\"[H]C1=C([H])/C2=C([H])/C([C@]3([H])N([H])C([H])([H])C([H])([H])C([H])([H])C3([H])[H])=C(/[H])N([H])C2=N1 
|(-8.179,1.5546,0.8717;-7.1547,1.2082,0.7788;-6.1309,1.8379,0.0833;-6.1853,2.764,-0.473;-4.9949,0.9988,0.2592;-3.6606,0.9711,-0.1188;-3.2534,1.7713,-0.7293;-2.8224,-0.0932,0.2914;-1.355,-0.2048,-0.1215;-1.2911,-0.9033,-0.9724;-0.5915,-0.8483,0.9661;-0.6663,-0.2574,1.7962;0.8278,-1.0334,0.6371;1.3224,-1.4762,1.5094;0.8844,-1.7761,-0.1715;1.5352,0.2587,0.1994;1.5949,0.9443,1.0577;2.5677,0.0428,-0.1053;0.7618,0.9292,-0.9439;0.834,0.3008,-1.844;1.2138,1.8944,-1.2029;-0.7211,1.1235,-0.5861;-1.2546,1.5221,-1.4579;-0.8203,1.8687,0.2171;-3.3613,-1.1052,1.077;-2.7604,-1.9344,1.4254;-4.666,-1.0852,1.4512;-5.0446,-1.8284,2.0289;-5.5124,-0.0842,1.0826;-6.7919,0.0264,1.3999)|\",4.0817077575\r\n\"[H]C1=C2/C([H])=C([H])\\N=C/2N([H])C([H])=C1[C@]([H])(N([H])[H])C(F)(F)F |(1.1147,2.3538,-0.556;0.9701,1.3091,-0.2984;-0.3014,0.7636,-0.2621;-1.6436,1.1863,-0.472;-1.9871,2.1758,-0.7413;-2.4157,0.0541,-0.2555;-3.4965,-0.0122,-0.3228;-1.698,-1.083,0.078;-0.4514,-0.6495,0.0712;0.6647,-1.3843,0.3347;0.537,-2.3648,0.5625;1.908,-0.8409,0.2992;2.727,-1.5089,0.5361;2.0963,0.4957,-0.013;3.51,1.0474,0.025;4.2073,0.2044,0.1014;3.7201,1.8919,1.2005;3.0762,2.6809,1.1827;4.6637,2.277,1.1854;3.9147,1.7462,-1.2818;3.7269,0.9635,-2.3602;3.2357,2.9017,-1.4839;5.2305,2.073,-1.2389)|\",4.05449637245\r\n\"[H]OC([H])([H])C([H])([H])C([H])([H])[C@@]([H])(C1=C(\\[H])N([H])C2=NC([H])=C([H])\\C2=C\\1[H])N([H])[H] 
|(2.3405,3.5691,2.1096;2.8423,2.8253,1.7405;2.3624,2.6105,0.4146;2.5267,3.5028,-0.211;2.9907,1.8098,0.0083;0.8859,2.2097,0.3702;0.2732,3.0139,0.7983;0.5788,2.115,-0.6818;0.5991,0.9034,1.1195;1.0655,0.9555,2.1125;1.0793,0.0637,0.6005;-0.903,0.5966,1.2739;-1.3623,0.663,0.2773;-1.1472,-0.8197,1.7924;-1.8387,-1.725,1.0052;-2.2109,-1.4783,0.0175;-2.0898,-2.9915,1.4369;-2.5992,-3.6453,0.852;-1.6773,-3.4309,2.657;-1.8728,-4.6334,3.1685;-1.2598,-4.5386,4.4067;-1.2662,-5.4053,5.0595;-0.6874,-3.3047,4.6806;-0.1592,-3.009,5.5769;-0.9426,-2.5143,3.5245;-0.6958,-1.2293,3.073;-0.1521,-0.5217,3.6964;-1.5332,1.6566,2.0823;-1.1456,1.6376,3.0262;-2.5275,1.4577,2.188)|\",4.03272726441\r\n\"[H]C1=C([H])N([H])/C2=N\\C([H])=C(C([H])([H])[C@]([H])(C(=O)OC([H])([H])[H])N([H])[H])/C([H])=C\\12 |(8.5625,-2.0441,-4.9528;8.5865,-2.2681,-3.8952;9.712,-2.5268,-3.1539;10.7487,-2.5578,-3.459;9.3629,-2.7702,-1.8386;9.9885,-2.9924,-1.0794;7.9953,-2.6699,-1.7125;7.3066,-2.8487,-0.588;5.9851,-2.7029,-0.7191;5.3996,-2.8474,0.1879;5.3176,-2.3847,-1.9252;3.8155,-2.2066,-1.9336;3.3589,-2.8037,-1.1346;3.4001,-2.567,-2.8841;3.3336,-0.735,-1.7518;3.7835,-0.113,-2.5321;3.7998,-0.2234,-0.3934;3.33,-0.607,0.6593;4.799,0.6676,-0.493;5.3299,1.143,0.7569;5.7601,0.3146,1.3259;4.5441,1.617,1.3502;6.1013,1.8653,0.4891;1.8799,-0.5681,-1.7915;1.48,-1.0265,-0.9733;1.5016,-1.0354,-2.6139;6.0793,-2.2087,-3.0824;5.5967,-1.9716,-4.0288;7.4696,-2.3517,-2.9979)|\",5.151115189965001\r\n\"[H]/N=C(O[H])\\C([H])=C(/[H])C1=C([H])N=C2C(=C1[H])C([H])=C([H])N2[H] 
|(-1.2523,0.7685,-0.9008;-1.8626,0.3226,-0.2146;-1.1891,-0.5021,0.4849;-1.885,-1.2637,1.3834;-1.2586,-1.7007,1.9802;0.2687,-0.7705,0.4431;0.5791,-1.7896,0.6745;1.1841,0.1751,0.1605;0.8191,1.1882,-0.0118;2.6359,0.0229,0.0774;3.4193,1.197,-0.0768;2.9154,2.1613,-0.1301;4.7477,1.2307,-0.1633;5.3338,0.0369,-0.0963;4.6888,-1.2283,0.0506;3.2969,-1.2141,0.1362;2.7381,-2.1406,0.2363;5.7251,-2.2226,0.0647;5.6046,-3.2928,0.1598;6.9152,-1.5562,-0.0668;7.9243,-1.9422,-0.1004;6.6849,-0.1945,-0.1633;7.38,0.5285,-0.2745)|\",4.435455763149999\r\n\"[H]OC1=C([H])C([H])=C([C@@]([H])(/N=C(/O[H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C([H])([H])[H])C([H])=C1[H] |(2.6874,-1.2132,6.4598;2.9609,-0.3299,6.1675;2.9118,-0.2865,4.7985;2.5303,-1.3836,4.0203;2.2584,-2.3223,4.5015;2.5023,-1.2796,2.6299;2.2214,-2.1346,2.0241;2.8524,-0.0858,1.9906;2.7747,0.0602,0.471;3.508,0.8269,0.1861;2.9836,-1.2287,-0.1583;3.8665,-1.5523,-1.0053;3.79,-2.8628,-1.4081;4.4855,-3.0535,-2.0516;5.0112,-0.709,-1.6435;5.98,-0.2384,-0.5332;6.7949,0.348,-0.9751;6.4235,-1.096,-0.0149;5.4901,0.3819,0.2196;5.8429,-1.5229,-2.6623;6.6207,-0.8818,-3.0911;5.2364,-1.8854,-3.5026;6.3603,-2.3717,-2.1963;4.4097,0.4944,-2.4071;5.2104,1.0674,-2.8902;3.8626,1.1761,-1.7534;3.7195,0.1556,-3.1881;1.3846,0.5776,0.0448;1.1545,1.5247,0.545;0.6173,-0.1557,0.3112;1.3477,0.7345,-1.0396;3.2377,1.0014,2.7858;3.5214,1.9395,2.3118;3.2694,0.9128,4.1756;3.5728,1.7557,4.789)|\",5.8041884311650005\r\n\"[H]C1N=C(C2C([H])([H])C([H])([H])N([H])C([H])([H])C2([H])[H])N=C([H])C1=O 
|(-4.1395,2.5985,2.8582;-3.1038,2.7355,2.5431;-2.582,1.8844,1.7221;-1.2588,2.0722,1.3284;-0.7079,1.1706,0.4432;0.7132,1.2641,-0.0287;0.705,1.4526,-1.112;1.237,2.0836,0.4643;1.4372,-0.0812,0.2004;1.5685,-0.2295,1.2894;2.4349,-0.0356,-0.249;0.6793,-1.1557,-0.4361;1.1918,-2.0322,-0.3688;-0.6503,-1.3062,0.1501;-1.1595,-2.1458,-0.335;-0.6239,-1.5151,1.2367;-1.4643,-0.0137,-0.082;-2.4525,-0.0816,0.3738;-1.5855,0.1081,-1.1681;-0.4644,3.1257,1.7757;-0.983,3.9786,2.5969;-0.37,4.8078,2.9539;-2.3727,3.8965,3.0878;-2.8674,4.7066,3.8681)|\",3.24903937497\r\n\"[H]C1N=C(C2N([H])C([H])([H])C([H])([H])C([H])([H])C2([H])[H])N=C([H])C1=O |(-4.706,2.1295,-2.477;-4.1906,1.1872,-2.2897;-3.0355,1.2207,-1.6805;-2.4038,0.0301,-1.4609;-1.1604,0.0482,-0.7883;-0.5413,-1.1121,-0.5527;-1.0387,-1.9181,-0.9288;0.7948,-1.298,0.01;0.7696,-2.1797,0.6599;1.4991,-1.517,-0.8066;1.2548,-0.0595,0.7788;0.7279,-0.0054,1.7404;2.323,-0.1576,1.0003;0.9657,1.2039,-0.0387;1.5067,1.1575,-0.9938;1.3252,2.096,0.4849;-0.5414,1.3302,-0.2957;-0.7841,2.1265,-1.0022;-1.0546,1.6028,0.6381;-2.8824,-1.1997,-1.8362;-4.0346,-1.2425,-2.4458;-4.429,-2.2121,-2.7515;-4.8369,-0.0484,-2.7433;-5.9313,-0.0879,-3.3171)|\",3.43407679331\r\n\"[H]OC1=NC([H])([H])C([H])([H])[C@@]([H])(C2([H])C([H])([H])C2([H])[H])N1[H] |(-3.3776,-4.6817,-1.4751;-2.4812,-4.3211,-1.5948;-2.6138,-2.9919,-1.3312;-3.7499,-2.5094,-1.0012;-3.7919,-1.0682,-0.776;-4.796,-0.7098,-1.033;-3.6523,-0.853,0.2933;-2.7463,-0.3037,-1.6033;-2.7502,0.7613,-1.3446;-3.0084,-0.385,-2.6657;-1.3315,-0.8804,-1.3973;-0.702,-0.5563,-2.2375;-0.6673,-0.381,-0.1139;-0.5676,0.7042,-0.1021;0.4525,-1.1403,0.5536;0.7221,-2.1117,0.1481;1.288,-0.5748,0.9573;-0.8937,-1.0335,1.2276;-0.9921,-0.3935,2.1003;-1.5032,-1.9318,1.2593;-1.4116,-2.3423,-1.4564;-0.6779,-2.8542,-1.924)|\",7.725312215695\r\n\"[H]OC1=NC([H])([H])C([H])([H])[C@@]([H])(C2([H])C([H])([H])C([H])([H])C([H])([H])C2([H])[H])N1[H] 
|(-2.7271,4.9057,-2.2331;-2.8108,3.9421,-2.3456;-1.5312,3.482,-2.2791;-0.5656,4.3,-2.0997;0.7769,3.7336,-2.0125;1.474,4.4177,-2.5129;1.0884,3.7115,-0.9564;0.9052,2.3315,-2.6311;1.8768,1.8885,-2.3886;0.8476,2.4138,-3.7232;-0.237,1.4109,-2.1647;-0.228,0.4991,-2.779;-0.1457,0.9804,-0.6858;-0.1914,1.8875,-0.0692;-1.3036,0.0272,-0.2426;-2.0369,0.5675,0.3652;-1.8496,-0.3721,-1.1076;-0.6429,-1.1245,0.5627;-1.2243,-1.4107,1.4454;-0.5568,-2.0194,-0.0663;0.7647,-0.6084,0.9095;1.4748,-1.4121,1.1339;0.7221,0.0464,1.7904;1.1494,0.215,-0.3281;1.4235,-0.4616,-1.1516;2.0027,0.8792,-0.1549;-1.4817,2.1217,-2.4703;-2.3572,1.6331,-2.3386)|\",7.442313811175\r\n\"[H]OC1=NC([H])([H])C([H])([H])[C@@]([H])(C2([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C2([H])[H])N1[H] |(0.1242,1.2554,5.276;0.6022,0.8356,4.539;-0.3762,0.2854,3.768;-1.6013,0.366,4.1211;-2.5417,-0.2253,3.1792;-3.464,-0.4747,3.7169;-2.8205,0.5203,2.4197;-1.9878,-1.4948,2.507;-2.6919,-1.874,1.7574;-1.8992,-2.2659,3.2816;-0.598,-1.2917,1.8585;-0.0528,-2.2419,1.9668;-0.6336,-1.0355,0.3252;-1.1351,-1.9393,-0.0542;0.7729,-1.0153,-0.3269;1.4045,-1.7966,0.1206;0.6529,-1.2955,-1.3833;1.4925,0.3456,-0.2798;1.7856,0.5906,0.748;2.4222,0.2821,-0.8604;0.6042,1.4752,-0.8205;0.4224,1.3127,-1.8939;1.1236,2.4381,-0.7342;-0.7408,1.5234,-0.0816;-1.3791,2.307,-0.5104;-0.5702,1.7911,0.9677;-1.4657,0.1703,-0.1705;-1.709,-0.0155,-1.2274;-2.4281,0.2142,0.3517;0.1469,-0.2828,2.6326;1.1538,-0.3722,2.6422)|\",7.5103422738\r\n\"[H]OC1=NN([H])[C@]([H])(C2=C([H])N=C([H])N=C2[H])C([H])([H])C1([H])[H] 
|(-3.0436,-0.0236,0.4122;-2.4772,-0.7968,0.2718;-1.2019,-0.364,0.0199;-0.3282,-1.2737,-0.1654;0.9792,-0.8936,-0.4947;1.5908,-1.6504,-0.2073;1.4541,0.4175,-0.049;2.3942,0.5969,-0.5866;1.7487,0.5418,1.4449;1.3085,-0.3641,2.4161;0.702,-1.2245,2.1415;1.597,-0.2316,3.7183;2.3437,0.8243,4.0502;2.5806,0.9405,5.106;2.8379,1.7525,3.2226;2.532,1.5953,1.9315;2.9403,2.3418,1.2483;0.4253,1.4552,-0.5402;0.728,2.468,-0.257;0.4042,1.4061,-1.634;-0.9659,1.1279,0.0199;-1.7407,1.6279,-0.5776;-1.0724,1.5032,1.0484)|\",4.400080962585001\r\n\"[H]NC1=NC([H])=C([C@@]2([H])N=NC(=O)C([H])=C2[H])C([H])N1[H] |(3.6283,-1.3458,5.7843;3.6541,-0.4143,5.3613;3.1167,-0.4096,4.2004;3.0519,0.7694,3.4575;2.4749,0.7682,2.2966;2.4696,1.7236,1.7642;1.8617,-0.3672,1.6596;1.1848,-0.2625,0.3056;1.6456,0.5906,-0.2189;1.6135,-1.4371,-0.5144;0.7973,-2.1985,-1.069;-0.6657,-1.9803,-0.9734;-1.3701,-2.7697,-1.564;-1.1605,-0.8534,-0.1835;-2.2372,-0.7322,-0.1145;-0.2887,-0.0329,0.4173;-0.6215,0.8149,1.0139;1.9501,-1.5382,2.3533;1.5687,-2.4857,1.9857;2.5589,-1.5623,3.5662;2.6353,-2.4447,4.0559)|\",3.2517605134750007\r\n\"[H]C1=NC(C([H])([H])[H])=NC([H])=C1[C@@]1([H])N=NC(=O)C([H])=C1[H] |(4.9311,1.9326,0.9522;4.3791,1.123,0.4736;3.0554,1.1363,0.6256;2.3697,0.1378,0.0417;0.8732,0.1521,0.1848;0.4728,-0.8609,0.1064;0.4263,0.7531,-0.6181;0.5852,0.6054,1.1362;2.9132,-0.8604,-0.6755;4.2405,-0.8516,-0.8109;4.6709,-1.6672,-1.3914;5.0531,0.138,-0.2524;6.5678,0.1294,-0.3753;6.9177,1.1432,-0.1251;7.1005,-0.6915,0.7684;7.8326,-1.6816,0.5903;8.236,-2.1203,-0.7654;8.887,-3.1392,-0.8377;7.8457,-1.3029,-1.9164;8.2203,-1.6091,-2.8884;7.0553,-0.2367,-1.7366;6.7409,0.3881,-2.5713)|\",3.5293166409850003\r\n\"[H]OC1=NN([H])[C@]([H])(C2=C([H])N=C(C3([H])C([H])([H])C3([H])[H])N=C2[H])C([H])([H])C1([H])[H] 
|(-1.0131,0.5407,-7.2951;-1.9455,0.6262,-7.0467;-2.3364,-0.5279,-6.419;-3.5433,-0.5617,-6.0106;-4.0192,-1.7415,-5.4236;-4.7947,-1.4815,-4.8232;-3.0539,-2.6135,-4.7508;-3.5884,-3.5507,-4.5471;-2.5117,-2.0931,-3.4208;-2.6609,-0.7791,-2.9632;-3.1767,-0.0345,-3.5666;-2.1906,-0.3607,-1.7852;-1.5531,-1.2652,-1.0232;-1.0263,-0.8242,0.2895;-0.524,-1.6162,0.8347;-1.7916,0.2106,1.1012;-2.7044,0.5935,0.6555;-1.8245,0.0678,2.1778;-0.4964,0.591,0.4599;0.3835,0.7162,1.0851;-0.5374,1.2303,-0.4164;-1.3614,-2.5565,-1.3517;-1.8399,-2.9432,-2.5355;-1.6907,-3.9949,-2.7873;-1.9402,-2.9167,-5.7722;-1.1881,-3.5824,-5.3379;-2.4012,-3.4366,-6.6185;-1.2952,-1.6094,-6.2548;-0.7758,-1.7747,-7.209;-0.5341,-1.2624,-5.5406)|\",4.470830563714999\r\n\"[H]C1=NC([H])=C(N([H])[H])C([H])=C1[C@@]1([H])N=NC(=O)C([H])=C1[H] |(1.2777,-2.3589,1.8726;1.7221,-1.4876,2.3513;2.0479,-1.6156,3.6418;2.5823,-0.5683,4.2638;2.8357,-0.704,5.3158;2.8299,0.6736,3.6429;3.4381,1.713,4.3361;3.2627,2.6433,3.9779;3.3538,1.6728,5.344;2.4914,0.7854,2.2883;2.6662,1.7201,1.7598;1.9289,-0.3086,1.6308;1.5526,-0.2231,0.1542;1.0346,-1.1603,-0.0987;0.4801,0.811,0.0141;0.6202,1.8222,-0.7005;1.8611,2.0664,-1.4702;1.941,3.1271,-2.0521;2.892,1.0284,-1.4917;3.751,1.1967,-2.1342;2.7477,-0.0602,-0.7243;3.4977,-0.8492,-0.7025)|\",3.480336147895\r\n\"[H]C1=C([H])C([H])=C([C@@]2([H])N=NC(=O)C([H])=C2[H])N=C1F |(-1.9208,1.8382,0.1342;-1.1589,1.0732,0.2285;-1.4251,-0.2078,0.698;-2.4346,-0.4813,0.9911;-0.3894,-1.1434,0.7913;-0.5593,-2.1466,1.1603;0.8885,-0.7477,0.4;2.1076,-1.6731,0.478;2.7013,-1.3189,1.3399;1.6753,-3.0323,0.8983;2.2216,-4.062,0.4621;3.3299,-3.9882,-0.5205;3.9596,-5.0062,-0.7077;3.5491,-2.7188,-1.212;4.1954,-2.731,-2.0843;2.9685,-1.6067,-0.7422;3.1009,-0.6313,-1.2016;1.16,0.4877,-0.0543;0.1625,1.3405,-0.1272;0.4658,2.5641,-0.5853)|\",3.5837394110850003\r\n\"[H]C1=C(F)C([H])=C([H])C([C@@]2([H])N=NC(=O)C([H])=C2[H])=N1 
|(1.1366,-1.9373,0.7941;0.5473,-1.1352,0.3556;-0.7064,-1.4022,-0.1862;-1.1837,-2.659,-0.162;-1.4581,-0.3789,-0.7455;-2.4364,-0.5862,-1.167;-0.9126,0.9038,-0.7433;-1.4478,1.7417,-1.1725;0.3527,1.0944,-0.1778;1.0466,2.4625,-0.1536;1.8187,2.4295,-0.9433;0.1069,3.5056,-0.6458;0.1,4.6602,-0.1813;1.0259,5.0569,0.9068;1.1266,6.2435,1.1299;1.7063,3.9958,1.6474;2.1892,4.2696,2.5805;1.7202,2.7526,1.1483;2.2149,1.9217,1.6427;1.0655,0.095,0.3581)|\",3.5837394110850003\r\n\"[H]OC(=O)[C@]1([H])/C([H])=C(/[H])C2=NC([H])=C([H])C(=O)/C2=C/1[H] |(-4.7782,0.5375,0.4211;-4.5695,1.0156,1.2404;-3.2562,1.3547,1.2894;-2.8188,1.9918,2.2078;-2.4056,0.8642,0.0871;-2.9176,1.2659,-0.8097;-2.3958,-0.6418,-0.0051;-3.3533,-1.158,-0.015;-1.2571,-1.3497,-0.0796;-1.2633,-2.4333,-0.147;0.0523,-0.7096,-0.0769;1.1166,-1.4616,-0.1514;2.3515,-0.824,-0.1383;3.1966,-1.5059,-0.2039;2.5609,0.516,-0.0537;3.5611,0.9365,-0.0506;1.4319,1.4377,0.0237;1.5419,2.6606,0.0779;0.0954,0.754,0.0203;-1.0409,1.474,0.1241;-0.9641,2.553,0.2308)|\",3.455845901350001\r\n\"[H]C1N=C(SC([H])([H])[H])N=C2NC(=O)OC(=O)[C@]12[H] |(5.4131,4.7905,-0.9842;4.7268,4.0061,-1.3008;4.2368,3.2174,-0.4199;3.3791,2.2051,-0.9105;2.5557,1.4207,0.4062;1.5251,0.2094,-0.4877;1.0007,-0.3649,0.2794;2.1552,-0.4459,-1.0908;0.8077,0.7191,-1.1329;3.2163,1.8209,-2.1434;3.836,2.5503,-3.1257;3.8899,2.0761,-4.3273;4.549,2.7956,-5.3063;4.4098,2.6601,-6.4863;5.5597,3.7339,-4.8961;5.5992,4.2655,-3.6544;6.49,4.9935,-3.3043;4.4035,3.9095,-2.7699;3.6365,4.6746,-2.9892)|\",4.046332956935\r\n\"[H]C#CC([H])([H])N1C(=O)OC(=O)[C@@]2([H])C([H])([H])C([H])([H])[C@@]([H])(C([H])([H])[H])N([H])[C@]12[H] 
|(7.1485,-1.6683,-0.3411;6.2032,-1.1875,-0.2246;5.1205,-0.6753,-0.07;3.8207,-0.0101,0.0923;3.06,-0.7285,0.4056;3.5057,0.4312,-0.859;3.8351,1.0454,1.1124;3.2176,0.7706,2.3035;2.5703,-0.2181,2.5531;3.3681,1.7269,3.309;3.7816,3.0295,3.1015;3.8774,3.7629,4.0469;4.061,3.3891,1.6622;3.0888,3.5017,1.1586;4.8624,4.6822,1.5041;4.3086,5.5136,1.9501;5.8018,4.606,2.0646;5.1232,4.9283,0.0124;5.7526,5.8149,-0.1249;4.1691,5.1303,-0.4904;5.7885,3.719,-0.6674;5.7399,3.8776,-1.7519;7.2754,3.5667,-0.2927;7.8423,4.449,-0.612;7.7124,2.6938,-0.7927;7.4307,3.4453,0.7846;4.9769,2.5006,-0.4218;5.4701,1.7141,-0.8442;4.7465,2.198,0.9849;5.6816,1.9581,1.5261)|\",6.201474652895\r\n\"[H]N1[C@]([H])(C([H])([H])[H])C([H])([H])C([H])([H])[C@]2([H])C(=O)OC(=O)N(C([H])([H])C([H])([H])C([H])([H])[H])[C@@]12[H] |(4.2344,0.2296,-2.472;4.8212,1.0262,-2.716;5.5758,0.6633,-3.9418;4.8153,0.5737,-4.7274;6.3054,-0.6913,-3.8555;6.7307,-0.958,-4.83;5.6076,-1.489,-3.5714;7.1237,-0.6907,-3.1278;6.5112,1.8223,-4.3265;7.0982,1.5376,-5.2073;5.8965,2.6849,-4.613;7.4343,2.2247,-3.1699;8.0292,3.1055,-3.4279;8.1517,1.4272,-2.9416;6.5839,2.5274,-1.9346;5.9257,3.3833,-2.149;7.4084,2.8799,-0.7191;8.5638,3.2108,-0.7238;6.7411,2.8027,0.4839;5.3873,2.4619,0.6203;4.84,2.7431,1.6608;4.8127,1.8241,-0.4403;3.4152,1.3975,-0.2817;2.9428,2.1317,0.3736;2.9384,1.4722,-1.2626;3.2539,-0.0056,0.3182;3.7566,-0.7494,-0.3175;3.7625,-0.0282,1.2893;1.7797,-0.3841,0.4892;1.6748,-1.3867,0.9169;1.2475,-0.3747,-0.4705;1.2687,0.3183,1.1581;5.6505,1.3681,-1.567;6.2646,0.5135,-1.2222)|\",6.261339700005\r\n\"[H]N1[C@]([H])(C([H])([H])[H])C([H])([H])C([H])([H])[C@]2([H])C(=O)OC(=O)N(C([H])(C([H])([H])[H])C([H])([H])[H])[C@@]12[H] 
|(3.9985,2.521,0.0091;3.4047,2.8386,0.7737;3.8513,4.2076,1.1381;3.6013,4.8269,0.2676;5.3701,4.3298,1.367;5.6521,5.3798,1.5066;5.9244,3.953,0.4984;5.713,3.7754,2.2468;3.0239,4.7163,2.3299;3.3778,5.7124,2.6191;1.9804,4.8236,2.0081;3.0892,3.7548,3.5217;2.4362,4.084,4.335;4.102,3.7155,3.9401;2.6668,2.3569,3.0619;1.6136,2.3819,2.7443;2.7957,1.3232,4.1579;2.8199,1.5529,5.3372;2.9181,0.0196,3.7292;2.7534,-0.4027,2.3979;2.536,-1.5786,2.2202;2.8636,0.5646,1.4429;2.6829,0.1819,0.0151;2.6659,1.1339,-0.5166;3.8529,-0.6699,-0.498;3.7177,-0.8934,-1.5622;3.9092,-1.614,0.05;4.8095,-0.1449,-0.3833;1.3183,-0.4745,-0.2357;1.1707,-0.5853,-1.3159;0.5126,0.157,0.1534;1.2517,-1.4581,0.2311;3.4653,1.8613,1.8497;4.5048,1.6613,2.1732)|\",6.3021567775800005\r\n\"[H]N1C([H])([H])C([H])([H])C([H])([H])[C@]2([H])C(=O)OC(=O)N(C([H])(C([H])([H])[H])C([H])([H])[H])[C@@]12[H] |(2.0472,0.8996,2.7339;1.62,0.0005,2.9487;1.3581,-0.0275,4.4004;2.2851,0.1064,4.9894;0.7035,0.8207,4.6257;0.7012,-1.3459,4.8067;0.5172,-1.3473,5.8872;-0.2714,-1.4326,4.3061;1.6033,-2.5205,4.4139;1.133,-3.4858,4.6208;2.5306,-2.4943,5.0018;1.9442,-2.427,2.9224;1.0267,-2.5557,2.3293;2.9272,-3.4922,2.4915;3.1341,-4.5279,3.0648;3.6471,-3.207,1.3516;3.4091,-2.1069,0.51;3.8596,-2.1658,-0.6104;2.6973,-1.0769,1.0517;2.4127,0.1132,0.2021;1.7266,0.7143,0.7993;3.6815,0.9269,-0.0881;3.4243,1.8356,-0.6439;4.3849,0.3419,-0.6862;4.1831,1.2291,0.8394;1.6445,-0.2641,-1.0722;1.3245,0.6528,-1.58;0.7488,-0.8419,-0.8208;2.2602,-0.8478,-1.7576;2.5203,-1.0607,2.5235;3.5189,-0.954,2.9952)|\",6.370185240205\r\n\"[H]C1=NC(=O)C(=C([H])[H])N=C1Cl |(6.0433,0.0416,-0.0539;4.9546,0.1155,-0.0539;4.412,1.2795,-0.0521;2.9853,1.3425,-0.0504;2.4308,2.4232,-0.0459;2.2284,0.0426,-0.054;0.8801,0.0382,-0.055;0.3342,0.9748,-0.0545;0.3369,-0.9006,-0.0563;2.9034,-1.1862,-0.0559;4.1764,-1.1358,-0.0561;5.1098,-2.6216,-0.06)|\",3.54292233351\r\n\"[H]C1C(C2N([H])N([H])N([H])N2[H])=NC([H])=C1[H] 
|(2.2683,0.5693,0.253;1.2275,0.3023,0.1082;0.0824,1.1677,0.2063;0.0254,2.5221,0.4493;-1.1255,3.2752,0.4763;-1.8611,3.0368,-0.1787;-0.7631,4.6736,0.5675;-0.8165,4.8914,1.5646;0.6287,4.7164,0.2662;0.6717,4.7094,-0.7614;1.0606,3.4006,0.72;2.0074,3.1538,0.4615;-1.1011,0.4693,-0.0343;-0.7115,-0.7696,-0.2715;-1.4354,-1.5451,-0.5062;0.7171,-0.9379,-0.191;1.2688,-1.858,-0.341)|\",4.381032993050001\r\n\"[H]OB(O[H])[C@]1([H])C([H])C2=C([H])C([H])=C([H])C([H])=C2NC1=O |(1.7175,1.6444,1.9709;1.6682,2.6103,2.0384;2.0372,3.2541,0.9055;2.0527,4.6155,0.9413;2.364,5.0208,0.1223;2.4103,2.4456,-0.4806;3.1078,3.0711,-1.0495;1.1236,2.2456,-1.1822;0.702,3.0651,-1.7632;0.4298,1.0807,-1.0381;-0.8784,0.8627,-1.6117;-1.2944,1.6368,-2.2521;-1.5743,-0.2702,-1.3359;-2.5598,-0.4322,-1.7615;-1.0095,-1.2698,-0.4617;-1.5944,-2.16,-0.2448;0.2305,-1.1325,0.0847;0.6667,-1.8936,0.723;1.0424,0.0283,-0.1983;2.2688,0.0755,0.2875;3.0965,1.1421,-0.0937;4.3095,1.055,-0.0411)|\",3.0694442336400005\r\n\"[H]C1=C([H])C(=O)[C@]2([H])C(=N1)/C([H])=C(C#N)\\C([H])=C/2[H] |(-2.1974,-4.4216,-0.5693;-1.5435,-3.5925,-0.3073;-2.0498,-2.5208,0.369;-3.0779,-2.5092,0.7164;-1.1827,-1.4189,0.7624;-1.5,-0.4941,1.4938;0.1941,-1.4578,0.0688;-0.0716,-0.9506,-0.8883;0.6251,-2.8319,-0.3902;-0.2204,-3.794,-0.6534;2.031,-3.0642,-0.5826;2.3296,-4.0035,-1.035;2.9539,-2.1603,-0.1269;4.3523,-2.4305,-0.286;5.4922,-2.6294,-0.4084;2.5586,-0.9391,0.5684;3.3385,-0.3161,0.9937;1.2575,-0.6192,0.6948;0.93,0.2663,1.2294)|\",3.31434669909\r\n\"[H]C1=C([H])C(=O)[C@@]([H])(C(=O)OC([H])([H])[H])C(OC([H])([H])[H])=N1 
|(6.0978,-0.3868,1.4031;5.3189,-1.1457,1.4214;5.5292,-2.3204,2.0635;6.4614,-2.5276,2.5773;4.4854,-3.3322,2.1215;4.5719,-4.3895,2.7263;3.1889,-2.9971,1.3323;3.1223,-3.7241,0.5134;1.9821,-3.1971,2.25;1.639,-2.4032,3.0955;1.3857,-4.3747,2.0126;0.2916,-4.6951,2.8925;-0.0678,-5.6703,2.5643;-0.4966,-3.9427,2.8092;0.6404,-4.7394,3.927;3.1884,-1.6104,0.7285;2.0423,-1.3359,0.1056;1.9193,-0.0371,-0.5029;0.9228,-0.0238,-0.9447;2.6881,0.0997,-1.2671;2.0217,0.7446,0.2534;4.1598,-0.7677,0.7473)|\",4.58239724242\r\n\"[H]OC(=O)/C(C#N)=C(\\[H])C1=C(C([H])([H])[H])ON=C1C([H])([H])[H] |(2.2093,-4.8741,2.3894;1.5372,-5.0303,1.7013;1.5642,-4.067,0.7569;0.7903,-4.0853,-0.1709;2.6003,-2.9806,0.9239;3.4384,-3.0385,2.0745;4.0556,-3.1531,3.0577;2.6051,-1.9669,0.0144;1.7977,-2.0404,-0.7131;3.4748,-0.8313,-0.118;4.8048,-0.6338,0.2172;5.8798,-1.4856,0.7965;5.7682,-2.5227,0.4684;5.8478,-1.4855,1.8914;6.8541,-1.1094,0.4741;5.175,0.5923,-0.1668;4.091,1.2628,-0.7747;3.1027,0.4076,-0.7615;1.7745,0.7822,-1.3405;1.4685,0.0709,-2.1164;1.8324,1.7781,-1.7853;0.9962,0.787,-0.5693)|\",4.438176901655\r\n\"[H]OC1=NN([H])[C@]([H])(C2=C(Cl)C([H])=C([H])C([H])=C2[H])[C@@]1([H])C#N |(2.6253,5.6305,1.8499;2.0566,5.8983,1.1081;1.6465,4.7988,0.4462;1.1989,4.8358,-0.7463;0.6967,3.5489,-1.0605;0.877,3.3746,-2.0465;1.3745,2.555,-0.1862;2.3818,2.3268,-0.5582;0.5736,1.2802,-0.0269;1.1722,0.0239,0.1539;2.9275,-0.1464,0.1881;0.4101,-1.1341,0.3139;0.9104,-2.0866,0.4505;-0.9807,-1.0513,0.2965;-1.5727,-1.9535,0.4207;-1.6016,0.1851,0.1155;-2.6851,0.2565,0.0937;-0.8286,1.3321,-0.0454;-1.2985,2.2959,-0.2127;1.5354,3.4168,1.101;0.6123,3.3753,1.6991;2.6658,3.1074,1.971;3.5732,2.9662,2.6824)|\",5.298056669235\r\n\"[H]OC1=NN([H])[C@]([H])(C2=C([H])C([H])=C(Cl)C([H])=C2[H])[C@@]1([H])C#N 
|(2.6059,5.6145,1.8868;2.0575,5.888,1.1326;1.649,4.7942,0.4599;1.2172,4.8423,-0.7387;0.7131,3.5592,-1.0681;0.8922,3.394,-2.0557;1.3783,2.5599,-0.1984;2.4016,2.3475,-0.5495;0.612,1.2649,-0.0598;-0.7869,1.2383,-0.127;-1.3249,2.1595,-0.3264;-1.4852,0.0432,0.0374;-2.5683,0.0226,-0.0212;-0.776,-1.1339,0.2757;-1.6508,-2.6432,0.4846;0.6165,-1.1331,0.3445;1.1545,-2.0576,0.5244;1.3015,0.0688,0.1716;2.388,0.0708,0.2198;1.5137,3.4088,1.1008;0.5773,3.3663,1.6778;2.624,3.0507,1.9762;3.5254,2.8276,2.6742)|\",5.385133101395001\r\n\"[H]OC([H])([H])[C@@]1([H])N([H])N([H])[C@]([H])(C2=C([H])C([H])=C(F)C([H])=C2[H])C1([H])[H] |(3.2334,4.5648,3.6417;3.9427,3.9626,3.9238;3.949,2.8945,2.9922;4.5391,3.1489,2.0908;4.4484,2.0471,3.4742;2.537,2.5068,2.5768;1.9567,2.2833,3.4779;1.8701,3.6586,1.9282;2.5194,3.9724,1.1867;0.757,2.9971,1.2982;0.2731,3.6602,0.6964;1.3236,1.8474,0.5566;1.8428,2.1818,-0.3605;0.2648,0.8392,0.1649;-0.7876,0.5243,1.0371;-0.8569,1.0452,1.9866;-1.7468,-0.4251,0.689;-2.5687,-0.674,1.3526;-1.6426,-1.0623,-0.5429;-2.5688,-1.9837,-0.8853;-0.6164,-0.774,-1.4335;-0.5723,-1.2879,-2.3882;0.3314,0.1838,-1.0704;1.1354,0.4229,-1.7628;2.4051,1.3369,1.5628;2.0906,0.4122,2.0537;3.3485,1.1353,1.0423)|\",5.93752421791\r\n\"[H]C1=NOC2=NC(C([H])([H])[H])=NC(=S)[C@]12[H] |(7.524,1.2504,-0.0579;6.5005,1.6011,-0.1187;6.2654,2.8565,-0.0256;4.8228,3.0441,-0.1214;4.2404,1.8405,-0.2289;2.983,1.6378,-0.0929;2.6713,0.2744,0.0201;1.2213,-0.048,-0.146;0.6406,0.4718,0.6259;1.0541,-1.1228,-0.073;0.8651,0.3313,-1.1113;3.5021,-0.6839,0.3543;4.8533,-0.4408,0.4267;5.9211,-1.3935,1.2337;5.2843,0.7797,-0.389;5.3006,0.4439,-1.4422)|\",3.240875959455\r\n\"[H]SC1=NC([H])([H])N([H])[C@@]2([H])ON([H])C([H])([H])[C@]12[H] 
|(-0.1564,3.475,-0.2374;1.1125,3.1643,-0.5749;1.0236,1.426,-0.1443;2.1026,0.7605,-0.2413;2.1504,-0.6612,0.0449;2.4407,-1.1449,-0.9045;2.9706,-0.8291,0.7523;0.919,-1.2692,0.6183;0.9445,-2.2763,0.4615;-0.2183,-0.6788,-0.0281;-0.1996,-0.7857,-1.1379;-1.4709,-1.1434,0.4456;-2.4185,-0.1496,-0.0483;-2.5931,-0.4571,-1.0089;-1.675,1.1588,-0.1274;-1.695,1.549,-1.1515;-2.1593,1.8844,0.532;-0.2599,0.7975,0.3434;-0.2402,0.8242,1.4396)|\",6.15521529831\r\n\"[H]C1/C([H])=C([H])\\N=C2\\NC(=O)C(N([H])[H])=C([H])[C@@]12[H] |(2.1964,-0.3464,3.3565;2.2981,0.1097,2.3744;2.2663,-0.6123,1.2443;2.126,-1.6888,1.2433;2.4731,0.084,-0.0248;2.6513,-0.5164,-0.9193;2.4614,1.3663,-0.1853;2.213,2.1571,0.9454;1.7261,3.3311,0.7295;1.365,4.1412,1.816;0.9963,5.2932,1.6367;1.4288,3.5996,3.2165;0.8467,4.4262,4.1517;1.1589,4.3454,5.1094;0.7198,5.3743,3.8157;2.0113,2.4038,3.4412;2.1217,2.005,4.448;2.5709,1.5966,2.3089;3.6817,1.6651,2.3571)|\",3.2871353140399995\r\n\"[H]C1=NC(C(F)(F)F)[C@]([H])(N(=O)=O)C(=O)N1 |(-0.5102,2.0763,-1.0587;-0.3348,1.1129,-0.5846;1.0412,0.8305,-0.3686;1.3302,-0.3865,-0.1294;2.7757,-0.785,0.13;3.091,-1.826,-0.6687;3.6328,0.206,-0.0847;2.8921,-1.2001,1.4096;0.2976,-1.4848,-0.0373;0.5485,-2.2598,0.6841;0.1671,-2.2049,-1.393;0.2855,-1.5138,-2.3964;-0.0825,-3.3961,-1.3421;-1.0754,-0.8668,0.3148;-1.8588,-1.452,1.0205;-1.3427,0.3943,-0.2499)|\",4.0218427103900005\r\n\"[H]C1=C2/C([H])=C(/[H])[C@]3([H])N4C([H])([H])C([H])([H])C([H])([H])C([H])([H])[C@]4(C([H])([H])[H])[C@@]2(OC1=O)C3([H])[H] 
|(5.2345,-2.6135,0.5888;4.374,-2.189,0.0882;4.2717,-1.0223,-0.5751;5.2326,-0.0002,-0.9412;6.2263,-0.0133,-0.5007;4.8683,0.9143,-1.864;5.5625,1.6914,-2.1774;3.4616,0.8853,-2.4651;3.4379,1.4459,-3.4054;2.4257,1.3921,-1.5599;2.3464,2.8024,-1.203;2.9199,3.3699,-1.9452;1.2997,3.1351,-1.2986;2.8178,3.127,0.2218;3.9047,2.9885,0.3003;2.6052,4.1809,0.4433;2.1058,2.2042,1.2188;1.0229,2.3877,1.1783;2.4202,2.4268,2.2457;2.4242,0.7378,0.8896;1.9033,0.0628,1.5793;3.4947,0.5874,1.0598;2.0498,0.3537,-0.5697;0.5349,0.08,-0.6586;0.2376,-0.7189,0.0264;0.2455,-0.2088,-1.6728;-0.0247,0.9861,-0.4019;2.8605,-0.8933,-1.1185;2.232,-2.1263,-0.7841;3.0942,-2.9115,-0.025;2.7658,-3.9915,0.3974;2.9958,-0.5772,-2.619;3.7204,-1.226,-3.1177;2.0369,-0.6297,-3.1401)|\",3.719796336335\r\n\"[H]OC(=N/C([H])([H])C1(C([H])([H])O[H])C([H])([H])C1([H])[H])/C([H])=C(\\[H])C1=C([H])SC([H])=C1[H] |(-1.8684,-3.0588,0.6832;-2.201,-2.2841,0.2037;-1.2191,-1.3251,0.2018;-1.5981,-0.1277,0.0012;-0.621,0.9408,-0.12;-0.0225,1.047,0.7986;0.0935,0.7344,-0.9339;-1.2854,2.2871,-0.398;-0.3035,3.4394,-0.4587;-0.8277,4.3584,-0.767;0.4789,3.2452,-1.2025;0.3951,3.6386,0.7712;-0.2834,3.7432,1.4579;-2.6523,2.5679,0.19;-3.1192,1.772,0.7616;-2.8726,3.5772,0.5337;-2.4954,2.3397,-1.2946;-2.6052,3.1894,-1.9647;-2.8596,1.3945,-1.6852;0.1637,-1.7967,0.4045;0.8691,-1.0514,0.76;0.583,-3.0497,0.1302;-0.131,-3.7591,-0.288;1.9328,-3.5683,0.3025;2.2816,-4.8532,-0.0581;1.6421,-5.6071,-0.4982;3.9431,-5.201,0.2579;4.1967,-3.6011,0.8922;5.1733,-3.3113,1.2557;3.0572,-2.8562,0.8542;3.0098,-1.8327,1.208)|\",4.530695610825\r\n\"[H]O/C(=N/C([H])([H])C1(C([H])([H])O[H])C([H])([H])C1([H])[H])[C@@]1([H])N=NC(=O)C([H])=C1[H] 
|(4.2115,0.4151,2.1405;3.9024,-0.394,2.598;2.8928,-0.9188,1.8706;2.1808,-1.8269,2.3835;1.0498,-2.4274,1.7091;0.571,-1.7493,0.9785;1.3766,-3.3129,1.1448;-0.022,-2.8928,2.6935;-1.1926,-3.573,2.0132;-1.653,-2.9081,1.2722;-1.9657,-3.8146,2.7601;-0.8095,-4.7358,1.2786;-0.3754,-5.3338,1.9086;-0.2948,-2.0831,3.9356;0.3134,-1.1971,4.0951;-1.3217,-2.0006,4.2839;0.4023,-3.4179,4.0482;-0.1544,-4.2511,4.4734;1.4625,-3.4047,4.2788;2.7463,-0.325,0.4404;1.7241,0.0862,0.3811;3.6333,0.8661,0.3549;4.2046,1.2028,-0.6975;4.0128,0.4214,-1.9397;4.2972,0.9648,-2.9828;3.5532,-0.9594,-1.7903;3.703,-1.6323,-2.6287;2.9409,-1.3245,-0.6558;2.5538,-2.3294,-0.5161)|\",3.578297134075\r\n\"[H]O/C(=N/C([H])([H])C1(O[H])C([H])([H])C([H])([H])C([H])([H])C1([H])[H])[C@@]1([H])N=NC(=O)C([H])=C1[H] |(-1.9973,0.1389,3.68;-1.7201,-0.0988,2.7709;-1.2878,1.0298,2.1672;-1.1597,1.0313,0.9077;-0.7112,2.1892,0.1621;-1.4921,2.44,-0.5689;-0.5434,3.0924,0.7706;0.5906,1.8702,-0.6061;1.5751,1.4054,0.3356;1.2421,0.5545,0.6708;1.1637,3.135,-1.2955;0.3851,3.9076,-1.3533;1.9908,3.545,-0.7086;1.566,2.6955,-2.7145;1.6049,3.5283,-3.4249;2.5602,2.231,-2.6952;0.5057,1.6384,-3.0619;0.777,1.011,-3.9177;-0.4452,2.1306,-3.3092;0.3723,0.8381,-1.7533;1.1813,0.1003,-1.6947;-0.5747,0.2994,-1.6535;-0.9673,2.1933,3.1412;-1.5094,3.08,2.7764;-1.6122,1.8764,4.4483;-1.1475,2.282,5.5274;0.0833,3.1073,5.548;0.2707,3.7943,6.5264;0.9882,2.9537,4.4088;2.0288,3.2287,4.5497;0.4965,2.5128,3.2419;1.1125,2.3838,2.3533)|\",3.57013371856\r\n\"[H]O/C(=N/C([H])([H])C1=C([H])SC([H])=C1[H])[C@@]1([H])N=NC(=O)C([H])=C1[H] 
|(2.8142,1.3099,1.8715;2.8587,0.341,2.0069;1.8315,-0.217,1.3313;1.531,-1.4207,1.5736;0.4251,-2.1064,0.9263;-0.1364,-2.6081,1.7267;-0.2965,-1.4414,0.4234;0.9093,-3.1508,-0.0592;0.2707,-3.4601,-1.2325;-0.6341,-3.0218,-1.634;1.0655,-4.7453,-2.0878;2.2836,-4.8617,-0.8553;3.0836,-5.5834,-0.952;2.0738,-3.9619,0.1499;2.7184,-3.8531,1.014;1.1596,0.7221,0.2917;0.0855,0.753,0.537;1.6383,2.1069,0.559;1.7778,2.9494,-0.3448;1.4607,2.6028,-1.7492;1.2596,3.5205,-2.5119;1.483,1.1821,-2.0974;1.5941,0.9222,-3.1454;1.3344,0.2705,-1.1261;1.3185,-0.796,-1.3356)|\",3.3524426381600008\r\n\"[H]O/C(=N/C([H])([H])C1=NC(C([H])([H])[H])=NO1)[C@]1([H])C(=O)N=C([H])C([H])=C1[H] |(5.2278,-6.1771,1.1059;4.7199,-5.3849,0.8643;5.5664,-4.4071,0.4531;5.0587,-3.2842,0.1727;5.8804,-2.1877,-0.3117;6.6608,-2.5172,-1.0131;5.238,-1.4913,-0.858;6.528,-1.4278,0.8125;7.1899,-1.8975,1.8322;7.5685,-0.7463,2.4961;8.3627,-0.7512,3.7592;9.3213,-1.258,3.6053;7.8236,-1.2909,4.5451;8.5491,0.2719,4.0925;7.1708,0.3643,1.9266;6.4663,-0.0901,0.7922;7.0586,-4.8288,0.3586;7.643,-3.9064,0.3986;7.3259,-5.4766,-1.0164;7.3709,-4.7831,-2.0117;7.4981,-6.8783,-1.1124;7.7094,-7.5361,-0.0208;7.8578,-8.6136,-0.1256;7.7616,-6.9804,1.334;8.0411,-7.6352,2.1539;7.455,-5.685,1.5232;7.4641,-5.2199,2.5063)|\",4.035448402915\r\n\"[H]O/C(=N/C([H])([H])C1=NC(C([H])([H])[H])=NO1)[C@@]1([H])C([H])=NC(=O)C([H])=C1[H] 
|(5.3706,-6.0021,0.1237;5.0026,-5.1051,0.0516;6.0067,-4.1949,0.0559;5.6798,-2.9844,-0.1214;6.6458,-1.908,-0.1195;7.6493,-2.1997,-0.4737;6.2915,-1.1236,-0.7949;6.8039,-1.3003,1.247;7.0912,-1.8976,2.3662;7.1263,-0.8469,3.264;7.4152,-1.0204,4.717;6.6727,-1.683,5.1742;7.3939,-0.0542,5.2251;8.3997,-1.479,4.8578;6.8817,0.3256,2.7326;6.6611,0.0246,1.3723;7.4063,-4.8062,0.3048;8.1451,-4.012,0.1325;7.5177,-5.2045,1.7685;7.4038,-4.3781,2.473;7.7242,-6.3782,2.2352;7.8826,-7.4707,1.3263;8.0248,-8.599,1.7555;7.8758,-7.1941,-0.1329;8.0587,-8.0485,-0.7775;7.6708,-5.961,-0.6169;7.6736,-5.7653,-1.6876)|\",4.688521644115\r\n\"[H]OC1=NC(=O)N(C([H])([H])[H])[C@@]2([H])NC3S[C@]([H])(C([H])([H])[H])C([H])([H])N3[C@@]12[H] |(3.3741,-3.7083,1.8849;4.3208,-3.7617,1.6789;4.9698,-2.6869,2.1967;6.2347,-2.6773,2.1017;7.0004,-1.6473,2.7081;8.2106,-1.6403,2.5817;6.3384,-0.7388,3.5272;7.15,0.3087,4.1448;6.6285,0.6787,5.0338;7.3186,1.1462,3.4583;8.1108,-0.1176,4.4301;4.9513,-0.398,3.3164;4.5548,-0.0278,4.2704;4.7627,0.6723,2.3073;3.7959,0.277,1.5785;3.0659,1.0354,0.1607;1.811,-0.34,0.032;1.7054,-0.5698,-1.0314;0.4622,0.0635,0.6281;-0.2684,-0.7431,0.4855;0.0721,0.9628,0.1426;0.5536,0.2658,1.6997;2.4772,-1.5311,0.7539;3.1559,-2.061,0.0706;1.7063,-2.2335,1.0977;3.1801,-0.9426,1.8896;4.0937,-1.6152,2.8318;3.5277,-2.0639,3.6585)|\",5.325268054285001\r\n\"[H]OC(=N/C([H])([H])C1=NOC(C([H])([H])[H])=C1[H])/C([H])=C(\\[H])C([H])([H])[H] 
|(8.0465,-2.1434,3.1943;7.9087,-2.368,2.2609;6.6856,-1.8822,1.883;6.1927,-2.3759,0.8173;4.9373,-1.8364,0.3;4.7953,-0.7648,0.4928;4.9517,-1.9668,-0.7888;3.7533,-2.5794,0.8633;2.8824,-1.9408,1.614;1.9383,-2.9068,1.9971;2.2823,-4.1009,1.4614;1.3819,-5.2513,1.7557;0.3794,-5.081,1.346;1.2761,-5.4001,2.8363;1.785,-6.1669,1.3161;3.4247,-3.9605,0.7333;3.9681,-4.7184,0.1894;6.0993,-0.8359,2.7572;5.0136,-0.7765,2.776;6.8268,0.0229,3.4874;7.9163,-0.0172,3.4289;6.2584,1.0892,4.374;6.6036,0.9645,5.4092;5.1644,1.0756,4.3718;6.5929,2.0841,4.0513)|\",5.668131505915\r\n\"[H]C1=NC([H])=C([H])C([C@@]2([H])C(=C3C([H])=C([H])C(=O)C([H])=C3[H])N([H])N([H])C2([H])[H])=C1[H] |(2.2167,-1.9711,5.3697;2.1969,-2.4776,4.4061;1.7348,-3.7351,4.4145;1.7053,-4.3703,3.2381;1.3273,-5.3913,3.2533;2.1233,-3.7978,2.0356;2.0739,-4.3743,1.1149;2.6028,-2.4844,2.0333;3.1008,-1.8493,0.7403;2.894,-2.536,-0.0849;2.4919,-0.4759,0.4669;1.2056,-0.1942,0.0565;0.7703,1.1622,-0.2089;1.4917,1.9729,-0.1233;-0.4984,1.4442,-0.591;-0.8167,2.4627,-0.7945;-1.5129,0.3917,-0.7665;-2.6706,0.6472,-1.115;-1.0464,-0.9774,-0.5101;-1.7762,-1.7711,-0.6406;0.2268,-1.2442,-0.1273;0.5226,-2.2717,0.0648;3.4687,0.4677,0.6924;3.1965,1.3305,1.1515;4.6314,-0.1316,1.3206;5.4373,0.3605,0.9326;4.6082,-1.5068,0.781;5.0193,-1.5588,-0.238;5.1806,-2.1672,1.4354;2.6362,-1.8131,3.2619;3.0121,-0.7977,3.3338)|\",3.42319223929\r\n\"[H]C1=C([H])[C@@]([H])(O[C@@]2([H])C([H])([H])N([H])C([H])([H])C([H])([H])C2([H])[H])N=NC1=O 
|(-0.317,-6.3164,2.1567;-0.2576,-5.5436,1.3966;-0.5038,-4.2491,1.632;-0.775,-3.861,2.6105;-0.438,-3.274,0.5025;-1.4411,-3.1657,0.0497;0.0469,-2.0542,0.9792;-0.1624,-0.9165,0.1146;-0.1944,-1.2621,-0.9275;-1.4807,-0.206,0.4744;-1.4647,0.0194,1.5496;-2.3419,-0.8627,0.2976;-1.6887,1.0462,-0.2559;-1.8405,0.841,-1.2433;-0.5482,1.961,-0.1224;-0.746,2.8454,-0.7387;-0.522,2.3016,0.9226;0.8064,1.3301,-0.4874;0.8287,1.1209,-1.5671;1.6239,2.0351,-0.2883;1.0262,0.0252,0.2951;1.1313,0.2444,1.367;1.9443,-0.4822,-0.0217;0.4193,-3.7106,-0.6542;0.661,-4.9115,-0.8644;0.1135,-5.9557,0.0381;0.0621,-7.0862,-0.3908)|\",3.202780020385\r\n\"[H]C1=C([H])[C@@]([H])(OC([H])([H])C2([H])C([H])([H])C([H])([H])N([H])C([H])([H])C2([H])[H])N=NC1=O |(-3.6706,-2.3403,3.1183;-3.4684,-1.3666,2.6832;-3.6104,-1.0882,1.3818;-3.9504,-1.8142,0.6477;-3.2539,0.2748,0.8841;-2.1888,0.2907,0.5797;-4.0833,0.6091,-0.186;-3.6204,1.7147,-0.9748;-3.5083,2.5992,-0.3365;-2.63,1.4666,-1.3937;-4.6241,1.9758,-2.0913;-5.5952,2.1762,-1.6132;-4.7938,0.7686,-3.0314;-3.8167,0.4974,-3.455;-5.164,-0.0961,-2.4702;-5.7496,1.1017,-4.1806;-6.7671,1.256,-3.7667;-5.8083,0.2592,-4.8794;-5.2552,2.2729,-4.9073;-5.8487,2.4544,-5.7138;-5.201,3.4623,-4.0563;-4.8649,4.3102,-4.6642;-6.1915,3.7318,-3.636;-4.2236,3.2238,-2.8995;-4.1944,4.108,-2.2497;-3.2167,3.0861,-3.3162;-3.3361,1.3584,1.9238;-3.2205,1.0992,3.1339;-3.0129,-0.3006,3.5834;-2.5364,-0.4646,4.6836)|\",3.29801986806\r\n\"[H]C1=C([H])[C@@]([H])(OC2([H])C([H])([H])C([H])([H])N([H])C([H])([H])C2([H])[H])N=NC1=O 
|(2.5539,3.3424,-2.8545;1.7144,3.2208,-2.1771;0.4632,3.5969,-2.4683;0.1797,4.048,-3.4158;-0.6097,3.4263,-1.4434;-0.6891,4.3485,-0.8381;-1.8127,3.1259,-2.0847;-3.0038,3.2972,-1.2865;-2.7685,3.0332,-0.2467;-4.0527,2.324,-1.8199;-4.1852,2.5006,-2.8939;-3.7006,1.2953,-1.6861;-5.3908,2.5361,-1.1009;-5.2806,2.239,-0.0384;-6.1561,1.8864,-1.5398;-5.8155,3.9286,-1.2552;-6.7377,4.0564,-0.8444;-4.8677,4.8613,-0.6466;-5.2577,5.8795,-0.7553;-4.7236,4.6807,0.4381;-3.5122,4.7419,-1.3575;-2.7915,5.4344,-0.903;-3.6302,5.0205,-2.4111;-0.3329,2.3614,-0.4177;0.8276,1.9992,-0.1583;1.9796,2.6079,-0.8706;3.0674,2.5079,-0.3495)|\",3.4177499622800007\r\n\"[H]OC1=C([H])C(N([H])C(=O)C(=O)N([H])C([H])([H])[C@@]([H])(O[H])C([H])([H])[H])=C([H])C([H])=C1[H] |(3.0652,-8.7612,-6.3635;3.1009,-8.2958,-5.5135;3.196,-6.9524,-5.7484;3.2549,-6.118,-4.6322;3.2221,-6.5629,-3.6418;3.3528,-4.7319,-4.8021;3.4069,-3.9496,-3.6321;3.3636,-4.4314,-2.7373;3.5073,-2.6051,-3.4988;3.5743,-1.7567,-4.3845;3.5318,-2.2053,-2.0101;3.4694,-3.0497,-1.108;3.6449,-0.8808,-1.8371;3.6467,-0.3278,-2.687;3.724,-0.2677,-0.519;4.3469,-0.9169,0.1086;4.2394,0.6927,-0.6181;2.3568,-0.0689,0.1602;1.7751,0.6503,-0.434;1.5936,-1.2666,0.172;2.2021,-2.0275,0.1593;2.5381,0.4876,1.5753;3.039,1.464,1.5676;3.1306,-0.2013,2.1896;1.5594,0.6053,2.0493;3.3939,-4.1721,-6.089;3.47,-3.101,-6.2098;3.3338,-5.0263,-7.1879;3.3652,-4.6033,-8.1882;3.2353,-6.4093,-7.037;3.1902,-7.0601,-7.9079)|\",4.5633492728850005\r\n\"[H]OC(=O)[C@@]1([H])C([H])N=C(C2([H])C([H])([H])C2([H])[H])NC1=O 
|(-1.6488,-0.7189,-5.7287;-2.0232,-1.5695,-6.0921;-2.4526,-2.3135,-5.0812;-2.9016,-3.4276,-5.2091;-2.4127,-1.6403,-3.6766;-3.4233,-1.2067,-3.564;-2.2544,-2.6655,-2.5899;-2.6467,-3.6625,-2.7881;-1.6815,-2.4334,-1.4663;-1.1739,-1.1237,-1.2794;-0.7878,-0.8344,0.1036;-0.9639,-1.6507,0.7942;-0.9249,0.5983,0.6288;-1.3314,1.3195,-0.0732;-1.2637,0.6969,1.6559;0.4307,0.0633,0.3605;1.0639,-0.2252,1.1942;0.9462,0.4124,-0.5283;-0.9943,-0.2196,-2.2051;-1.4361,-0.4786,-3.4833;-1.11,0.234,-4.4322)|\",4.38103299305\r\n\"[H]OC(=O)C([H])([H])C([H])([H])/C1=C(\\C([H])([H])[H])N([H])[C@@]2([H])N(N=C(O[H])C2([H])[H])C1=O |(5.0434,-2.6709,1.0682;4.5492,-2.7196,1.9157;3.2278,-2.8481,1.6596;2.4144,-2.7966,2.5544;2.8486,-3.0282,0.1928;3.6627,-3.4917,-0.365;1.9812,-3.6943,0.1666;2.4413,-1.6938,-0.4819;1.4576,-1.4235,-0.0883;2.3051,-1.9045,-1.5526;3.374,-0.5056,-0.3016;2.8816,0.7679,-0.2231;1.4177,1.1222,-0.3135;0.9032,0.5321,-1.0751;1.2922,2.179,-0.5726;0.91,0.9457,0.6432;3.7319,1.8667,-0.0936;3.2846,2.7044,0.2605;5.0157,1.5748,0.4995;4.9136,1.3232,1.574;5.5738,0.4181,-0.2039;6.9647,0.4609,-0.241;7.2901,1.6666,0.048;8.5816,2.0519,0.0305;8.6507,2.9591,0.3628;6.127,2.6125,0.3068;6.2694,3.2594,1.1781;5.9399,3.2381,-0.5777;4.8366,-0.7291,-0.3797;5.3608,-1.8215,-0.6274)|\",4.917097278535\r\n\"[H]C1=C(F)C([H])=C(OC([H])([H])[H])C([C@@]2([H])N=NC(=O)C([H])=C2[H])=C1[H] |(4.8734,3.5873,3.3955;4.318,2.8549,2.8209;3.0487,3.1543,2.3523;2.5104,4.3606,2.622;2.2899,2.256,1.6075;1.3045,2.5528,1.2717;2.836,1.0023,1.3231;2.1838,0.0331,0.6151;0.8986,0.3295,0.082;0.5772,-0.5729,-0.4403;0.9466,1.1648,-0.6275;0.1804,0.5659,0.8768;4.1278,0.6626,1.7762;4.6908,-0.7182,1.4498;4.3725,-0.9554,0.4224;6.1691,-0.6166,1.3358;6.9434,-1.3573,1.973;6.4397,-2.3865,2.9125;7.276,-3.0374,3.5018;4.9956,-2.5522,3.0677;4.6571,-3.32,3.7569;4.162,-1.7685,2.3712;3.0811,-1.8622,2.4487;4.8456,1.5965,2.5221;5.8443,1.3447,2.8636)|\",3.5565280260350005\r\n\"[H]C1=C(F)C([C@@]2([H])N=NC(=O)C([H])=C2[H])=C([H])C([H])=C1OC([H])([H])[H] 
|(4.6943,0.1124,0.8493;4.3582,1.1357,0.9684;5.2402,2.1271,1.3571;6.5266,1.7819,1.5895;4.8599,3.4617,1.5307;5.8695,4.5245,1.9392;6.8297,4.013,2.092;6.1273,5.3842,0.7396;5.9431,6.6167,0.7502;5.4645,7.322,1.9597;5.2575,8.5117,1.8507;5.2807,6.56,3.1957;4.9827,7.1161,4.0794;5.4761,5.2346,3.1903;5.3427,4.6332,4.0881;3.5204,3.7721,1.2837;3.1853,4.799,1.4022;2.5986,2.8041,0.8862;1.5707,3.0919,0.7032;3.0193,1.4762,0.7261;2.2172,0.4484,0.345;0.8465,0.7187,0.0825;0.4058,-0.2373,-0.2046;0.7293,1.4346,-0.7407;0.3345,1.1015,0.9745)|\",3.49938411743\r\n\"[H]C1=C([H])C([C@@]2([H])N=NC(=O)C([H])=C2[H])=C([H])C([H])=C1SC([H])([H])[H] |(1.2109,2.5015,1.5107;2.2548,2.4279,1.8035;2.9125,3.5501,2.2928;2.3702,4.4884,2.38;4.2583,3.4858,2.6746;4.9684,4.7157,3.2324;4.3082,5.5782,3.0574;4.9702,4.5965,4.7283;6.0224,4.592,5.3953;7.3496,4.719,4.7545;8.317,4.6209,5.4786;7.4168,4.9692,3.3134;8.4009,5.1395,2.8871;6.2894,4.971,2.5898;6.3001,5.145,1.5149;4.925,2.2628,2.5613;5.9686,2.1845,2.8549;4.2741,1.1304,2.0728;4.826,0.2001,2.0002;2.9292,1.2009,1.6855;1.9901,-0.1666,1.0406;3.1792,-1.5475,1.0482;2.6296,-2.4087,0.6601;3.5244,-1.7771,2.0598;4.032,-1.35,0.393)|\",3.3660483306850004\r\n\"[H]OC(=O)C([H])([H])[C@@]1([H])C([H])N=C(C([H])(C([H])([H])[H])C([H])([H])[H])NC1=O |(6.4929,2.0565,2.7588;6.8271,2.8689,2.3168;5.8831,3.826,2.3438;6.0249,4.8749,1.7596;4.6183,3.4999,3.1455;4.8872,3.1042,4.1306;4.0843,4.4423,3.2886;3.6837,2.491,2.4517;2.7684,2.4114,3.0673;3.2111,2.898,1.0725;3.2465,3.9561,0.8036;2.7783,2.0613,0.2047;2.7547,0.6975,0.6156;1.8772,-0.1937,-0.2244;1.9105,-1.1822,0.2447;0.4241,0.32,-0.2333;-0.1992,-0.3491,-0.8362;0.3662,1.3242,-0.6647;0.0025,0.352,0.7778;2.4428,-0.2968,-1.6562;1.8132,-0.9659,-2.2528;3.4607,-0.7006,-1.6526;2.4606,0.6848,-2.1395;3.4462,0.1915,1.5876;4.216,1.0648,2.3543;5.2285,0.6947,2.931)|\",4.280350868365001\r\n\"[H]C1=C(OC([H])([H])[H])C(OC([H])([H])[H])=C([H])[C@]2([H])C(=O)N(C([H])([H])[H])C(=O)/N=C/12 
|(-0.5218,-3.7922,-2.2859;-0.5479,-2.7263,-2.4725;-1.7061,-2.0027,-2.3956;-2.8996,-2.489,-2.0325;-2.9926,-3.8657,-1.6663;-4.0336,-4.0228,-1.3833;-2.3365,-4.0906,-0.8176;-2.7347,-4.5129,-2.5122;-1.7462,-0.5724,-2.7653;-2.9987,-0.053,-2.7604;-3.1297,1.3123,-3.1341;-4.1947,1.539,-3.0683;-2.7783,1.4765,-4.1607;-2.5694,1.9649,-2.4523;-0.6097,0.0737,-3.0874;-0.591,1.1132,-3.3871;0.7188,-0.6124,-3.0005;1.1861,-0.2974,-2.0455;1.715,-0.1192,-4.0526;1.6094,0.9779,-4.5767;2.7721,-0.9779,-4.2678;3.8915,-0.5453,-5.1107;4.8202,-0.5935,-4.5387;3.6883,0.4742,-5.434;3.9851,-1.2058,-5.9765;2.853,-2.2978,-3.7203;3.9017,-2.9044,-3.8244;1.708,-2.8691,-3.1705;0.675,-2.1214,-2.9152)|\",4.01640043338\r\n\"[H]C1=C(/C([H])=C(/C#N)C(=O)OC([H])([H])[H])N=C(C2([H])C([H])([H])C2([H])[H])S1 |(4.9089,3.2257,-1.1103;5.3915,2.5675,-1.8144;4.7871,1.8429,-2.8232;3.3999,1.7502,-3.2266;3.2977,1.0738,-4.0717;2.2237,2.3032,-2.7986;1.0403,1.9168,-3.5148;0.1024,1.5822,-4.1169;2.0406,3.2572,-1.6727;2.9193,3.694,-0.951;0.7452,3.595,-1.537;0.4478,4.5149,-0.474;0.7418,4.0922,0.4902;-0.6307,4.6645,-0.5172;0.9761,5.4595,-0.6274;5.6644,1.0675,-3.5809;6.8918,1.183,-3.1821;8.053,0.4972,-3.7807;9.0003,0.6849,-3.2839;8.0956,0.3009,-5.2899;7.2492,0.6903,-5.847;9.0605,0.4091,-5.777;7.8596,-0.8782,-4.3997;8.6586,-1.601,-4.2609;6.854,-1.2843,-4.3561;7.0866,2.2837,-1.8046)|\",3.967419940290001\r\n\"[H]OC1=NC([H])([H])C([H])([H])C([H])([H])[C@@]1([H])/N=C(/O[H])[C@@]1([H])C([H])=NC(=O)C([H])=C1[H] 
|(0.5913,-2.8153,-0.219;-0.2945,-2.7121,-0.6084;-0.6022,-1.3786,-0.6169;-1.7059,-1.0325,-1.1309;-2.0717,0.3808,-1.1749;-2.5379,0.5677,-2.1497;-2.8669,0.5432,-0.4314;-0.9222,1.373,-0.9385;-0.2837,1.4146,-1.8314;-1.3237,2.382,-0.7874;-0.0851,0.9334,0.2651;-0.7141,0.9205,1.1654;0.7433,1.6205,0.4661;0.4784,-0.4732,-0.0074;1.2459,-0.3623,-0.7874;1.1816,-1.0646,1.1302;0.6065,-1.5115,2.1687;1.4123,-2.0143,3.138;0.883,-2.2597,3.9156;-0.9043,-1.6036,2.512;-1.4568,-1.145,1.6834;-1.3248,-3.0635,2.5668;-1.2177,-3.6109,1.6275;-1.789,-3.6978,3.576;-1.9621,-3.002,4.8154;-2.3359,-3.6081,5.8;-1.6792,-1.5456,4.8611;-1.8884,-1.0518,5.8051;-1.2001,-0.886,3.7973;-0.9901,0.1809,3.841)|\",4.225928098265\r\n\"[H]C1=C(C(=O)/C([H])=C(\\[H])C2=C([H])C(C([H])([H])[H])=C([H])C(C([H])([H])[H])=C2[H])OC(C([H])([H])[H])=C1[H] |(9.4029,-3.5755,-1.5866;8.9951,-4.3112,-0.9088;7.7496,-4.2502,-0.3398;6.683,-3.2495,-0.4689;6.8623,-2.2686,-1.1934;5.4327,-3.4797,0.2894;5.3679,-4.3799,0.892;4.4242,-2.588,0.2152;4.5998,-1.7208,-0.4206;3.1232,-2.6419,0.881;2.2124,-1.5957,0.6555;2.5018,-0.7813,-0.0058;0.9511,-1.5797,1.2572;-0.0128,-0.4451,0.9959;-0.9449,-0.5761,1.5544;-0.2683,-0.3752,-0.0686;0.4184,0.5207,1.2866;0.6081,-2.6403,2.1024;-0.3697,-2.6416,2.5802;1.4897,-3.7022,2.3533;1.0954,-4.837,3.2719;0.0858,-4.6968,3.6705;1.7818,-4.9191,4.1236;1.116,-5.8007,2.7481;2.741,-3.6901,1.738;3.4308,-4.5085,1.9274;7.5636,-5.3515,0.4687;8.6968,-6.1029,0.4053;8.7176,-7.3572,1.2062;9.6871,-7.851,1.1013;7.9385,-8.0556,0.8763;8.5474,-7.1536,2.2704;9.6049,-5.5037,-0.4292;10.591,-5.879,-0.6659)|\",4.111640281055\r\n\"[H]OC1=C(Cl)C([H])=C(/C([H])=C(C#N)\\C(=N/C([H])([H])[H])O[H])C([H])=C1[H] 
|(8.8633,2.505,2.5437;8.3646,3.3341,2.619;7.0386,3.0619,2.5802;6.1325,4.1316,2.6815;6.738,5.7623,2.8548;4.7653,3.8976,2.6519;4.0894,4.7429,2.7368;4.249,2.5964,2.4983;2.798,2.4325,2.4682;2.2331,3.2426,2.926;2.0672,1.4183,1.9413;0.6433,1.4333,2.0821;-0.5174,1.432,2.1795;2.6183,0.2601,1.1547;2.9974,0.2289,-0.0522;2.9583,1.4003,-0.9064;3.9748,1.6231,-1.2518;2.5479,2.3009,-0.431;2.3597,1.1696,-1.795;2.6572,-0.8733,1.9085;2.9437,-1.5813,1.299;5.1616,1.5277,2.4182;4.8028,0.5066,2.3666;6.5291,1.7616,2.458;7.2227,0.9247,2.4069)|\",4.2993988379\r\n\"[H]OC1=C([H])C([H])=C(/C([H])=C(/C#N)C(=O)OC([H])([H])[H])C([H])=C1Cl |(9.3477,3.0414,-2.2815;8.825,2.9155,-3.0898;7.5435,2.6485,-2.7549;7.0991,2.5773,-1.4267;7.8138,2.7446,-0.623;5.7764,2.3003,-1.1188;5.459,2.2508,-0.0871;4.8321,2.0802,-2.1458;3.4191,1.7807,-1.9932;2.9346,1.66,-2.9606;2.5357,1.6086,-0.9573;1.1833,1.3114,-1.3442;0.1026,1.0729,-1.7042;2.789,1.689,0.508;3.8507,1.9274,1.0544;1.6534,1.4574,1.1934;1.7667,1.507,2.6247;0.7659,1.2936,2.9992;2.0977,2.4974,2.9478;2.4802,0.7579,2.9778;5.2912,2.1548,-3.4835;4.5976,1.9933,-4.3027;6.6128,2.4318,-3.7869;7.1344,2.515,-5.4534)|\",3.93476627823\r\n\"[H]C1=C(/C([H])=C(\\C#N)C(=O)OC([H])([H])[H])C([H])=C(C([H])([H])[H])C([H])=C1C([H])([H])[H] 
|(6.1927,0.6197,-0.0271;5.1064,0.6662,-0.0325;4.3643,-0.5206,-0.0161;5.0567,-1.8323,-0.0274;5.2889,-2.2921,-0.9882;5.4371,-2.5321,1.061;5.2046,-2.0628,2.3963;5.0177,-1.684,3.4796;6.1279,-3.8509,0.8596;6.3783,-4.3255,-0.2284;6.4288,-4.4324,2.0313;7.0933,-5.7053,1.9401;7.2727,-6.0112,2.9704;8.035,-5.6047,1.3949;6.4566,-6.4297,1.4259;2.9654,-0.4653,-0.0226;2.3919,-1.389,-0.0095;2.3003,0.7656,-0.0385;0.7898,0.8192,-0.0364;0.4278,1.8518,-0.0443;0.3751,0.3261,0.8512;0.3703,0.3106,-0.9132;3.0651,1.9372,-0.0528;2.5566,2.8991,-0.064;4.464,1.9092,-0.0486;5.271,3.187,-0.0577;4.6222,4.0682,-0.0643;5.921,3.2433,-0.9396;5.9187,3.2551,0.8249)|\",4.59328179644\r\n\"[H]O/C(=N/C(C1=C([H])C([H])=C([H])C([H])=C1[H])(C([H])([H])[H])C([H])([H])[H])[C@@]1([H])N([H])N([H])C([H])([H])C1([H])[H] |(5.5765,1.7105,0.0966;4.6826,1.656,-0.2766;4.1649,0.3965,-0.05;2.9291,0.2746,-0.2814;2.1419,-0.9611,-0.099;2.7136,-2.1793,-0.8484;3.2164,-2.0155,-2.1494;3.2581,-1.0198,-2.5811;3.6717,-3.1052,-2.8884;4.0606,-2.9487,-3.8913;3.6338,-4.3907,-2.3443;3.9902,-5.2414,-2.919;3.1366,-4.5701,-1.055;3.1028,-5.564,-0.6158;2.6788,-3.4753,-0.3164;2.2951,-3.6456,0.6846;0.7532,-0.6483,-0.7079;0.0833,-1.5103,-0.6138;0.8498,-0.3994,-1.7686;0.3072,0.2113,-0.1964;1.9548,-1.199,1.4146;1.1844,-1.9524,1.6088;1.6335,-0.2629,1.8814;2.8801,-1.5233,1.9025;5.193,-0.6172,0.4346;4.7152,-1.5883,0.5452;6.2946,-0.7906,-0.5428;6.5237,0.1434,-0.916;7.3693,-1.2335,0.3122;8.2292,-1.2472,-0.2314;7.3888,-0.3012,1.4455;8.0006,-0.6954,2.2627;7.7805,0.7004,1.1754;5.8883,-0.2167,1.7918;5.5996,0.7777,2.148;5.6222,-0.9346,2.572)|\",6.196032375884999\r\n\"[H]O/C(=N/C([H])([H])[C@@]1([H])C([H])([H])C([H])=C([H])C([H])([H])C1([H])[H])[C@@]1([H])C([H])=NC(=O)C([H])=C1[H] 
|(3.2709,4.5386,-0.4122;2.5225,3.9354,-0.5568;2.8619,2.6809,-0.1517;1.9718,1.7891,-0.1976;2.1835,0.4018,0.1602;2.3851,-0.1651,-0.7602;3.0507,0.2407,0.8228;0.9211,-0.1914,0.8184;0.1111,-0.087,0.0859;1.1225,-1.6871,1.1355;0.1424,-2.184,1.2004;1.6416,-2.1854,0.3032;1.8813,-1.921,2.4209;2.2726,-2.9246,2.5845;2.0788,-0.9779,3.3481;2.6441,-1.2191,4.2478;1.5551,0.4308,3.2188;2.3979,1.1209,3.0458;1.1174,0.7545,4.1731;0.5121,0.5648,2.0961;0.3363,1.6201,1.8614;-0.4425,0.1522,2.4477;4.3524,2.5351,0.2619;4.4329,1.5831,0.8047;5.191,2.4195,-0.9937;4.9504,1.5709,-1.6408;6.1244,3.2026,-1.3831;6.4691,4.3259,-0.5656;7.3175,5.1096,-0.9438;5.7788,4.4978,0.7372;6.1203,5.3275,1.3486;4.8088,3.6623,1.1372;4.3144,3.7886,2.0987)|\",4.280350868365\r\n\"[H]C#CC([H])([H])OC1=C([H])C([H])=C(/C([H])=C(C#N)/C(=N\\C([H])([H])[H])O[H])C([H])=C1[H] |(0.2854,-2.5909,-0.0496;1.289,-2.593,0.312;2.4256,-2.5838,0.7136;3.7923,-2.6121,1.2209;4.302,-3.5285,0.8908;3.7894,-2.6062,2.3202;4.4909,-1.4613,0.7302;5.794,-1.3045,1.0779;6.5195,-2.1843,1.8964;6.0574,-3.0726,2.3107;7.8539,-1.9235,2.1884;8.3854,-2.6201,2.8247;8.5072,-0.7837,1.6779;9.8954,-0.4305,1.9217;10.2103,0.4777,1.409;10.8715,-1.0036,2.6849;10.6855,-2.1984,3.4439;10.5702,-3.1811,4.0599;12.2418,-0.3727,2.7433;12.9615,-0.1251,3.7512;12.4973,-0.3457,5.1064;12.8281,0.4967,5.724;12.9675,-1.2489,5.5157;11.4095,-0.4494,5.2189;12.7274,0.0127,1.5177;12.2682,-0.4959,0.8309;7.7524,0.0845,0.8538;8.2284,0.9723,0.4443;6.4249,-0.1625,0.5573;5.8508,0.5073,-0.0742)|\",3.956535386270001\r\n\"[H]O/C(=N/C([H])([H])[C@]([H])(C([H])([H])[H])C([H])([H])OC([H])([H])[H])[C@@]1([H])C([H])=NC(=O)C([H])=C1[H] 
|(1.4724,-0.303,-5.1442;1.0633,-0.0834,-4.2903;1.9969,0.4631,-3.4631;1.6028,0.9005,-2.3482;2.4754,1.45,-1.3342;3.5271,1.5454,-1.6411;2.1147,2.4618,-1.0979;2.425,0.6025,-0.044;2.704,-0.426,-0.3133;1.0238,0.5897,0.5782;0.2848,0.264,-0.1581;0.7358,1.5926,0.9203;0.9787,-0.0859,1.4401;3.4631,1.1025,0.9605;3.3808,0.53,1.8999;3.2737,2.163,1.2088;4.7617,0.9536,0.4104;5.7785,1.4151,1.2741;6.734,1.2495,0.7693;5.7799,0.8658,2.2293;5.6711,2.4896,1.4946;3.4411,0.4206,-4.0374;4.0513,1.0833,-3.4094;3.9889,-0.9831,-3.886;4.0508,-1.3513,-2.8578;4.3718,-1.7761,-4.8137;4.3096,-1.3295,-6.1718;4.6208,-2.0863,-7.0707;3.8715,0.0617,-6.4454;3.9007,0.3713,-7.4858;3.4848,0.8868,-5.4612;3.1745,1.9092,-5.6689)|\",4.59328179644\r\n\"[H]O/C(=N/[C@@]1([H])N([H])N([H])C([H])([H])C1([H])[H])C([H])([H])C([H])([H])C1=C([H])C([H])=C([H])C(C([H])([H])[H])=C1[H] |(6.0607,-4.0582,-2.3219;6.3636,-4.7849,-1.7527;5.7605,-4.689,-0.533;6.1755,-5.4994,0.3551;5.6047,-5.5287,1.6844;5.1143,-4.5859,1.9654;4.5918,-6.6354,1.7633;4.6504,-7.1341,0.8754;5.0047,-7.5866,2.7687;4.5422,-7.2923,3.6295;6.4571,-7.4005,2.9294;6.7857,-7.7825,3.9017;6.9692,-7.9677,2.1431;6.6812,-5.8937,2.7359;7.6811,-5.6335,2.3787;6.493,-5.3561,3.6735;4.6665,-3.635,-0.4441;3.9716,-3.7995,-1.2788;4.088,-3.7811,0.47;5.1807,-2.17,-0.4895;5.9054,-2.0223,0.3192;4.3257,-1.5186,-0.2682;5.7969,-1.7537,-1.8122;4.9895,-1.5623,-2.9454;3.9117,-1.6899,-2.8685;5.5594,-1.1886,-4.1618;4.9258,-1.0333,-5.0312;6.9389,-1.0019,-4.2645;7.3747,-0.7061,-5.2158;7.7691,-1.1894,-3.1532;9.2671,-1.0228,-3.2599;9.5406,-0.3963,-4.1149;9.7637,-1.9932,-3.3908;9.6853,-0.5659,-2.3564;7.1787,-1.5635,-1.9382;7.8127,-1.7116,-1.066)|\",5.53479571917\r\n\"[H]O/C(=N/[C@@]1([H])N([H])N([H])C([H])([H])C1([H])[H])C([H])([H])C([H])([H])N([H])C1=NC([H])=C([H])C([H])=N1 
|(4.7958,0.0384,-2.2225;5.4692,-0.4005,-1.6823;5.6675,0.3221,-0.5374;6.59,-0.0989,0.2279;6.8385,0.5317,1.5074;6.4007,1.5346,1.5925;6.2256,-0.328,2.5804;6.0225,-1.2236,2.134;7.2265,-0.6131,3.5764;7.1685,0.1492,4.2521;8.5226,-0.5347,2.8778;9.3322,-0.3599,3.5942;8.7053,-1.4974,2.3852;8.3542,0.5824,1.8337;8.9536,0.4318,0.9319;8.6154,1.5564,2.2643;4.7122,1.4892,-0.3288;5.2612,2.3668,0.0216;4.2423,1.7698,-1.2808;3.606,1.171,0.7085;4.0699,0.9402,1.6721;3.042,0.2868,0.3941;2.6596,2.2586,0.8657;1.7359,2.1934,0.4645;2.9007,3.3967,1.5741;4.1256,3.5511,2.118;4.3087,4.6712,2.8217;5.2949,4.7951,3.2676;3.3214,5.639,2.9862;3.4907,6.544,3.5579;2.1012,5.3733,2.3596;1.2728,6.078,2.4264;1.8691,4.2685,1.6555)|\",5.09397128136\r\n\"[H]C1=C([H])C([H])=C([H])/C(=C2/ON([H])[C@]([H])(C([H])([H])SC([H])([H])[H])N2[H])C1=O |(10.0965,-0.9958,2.8918;9.331,-1.4327,2.2571;9.1082,-2.7741,2.221;9.7061,-3.438,2.8428;8.1056,-3.3491,1.3662;7.9766,-4.4268,1.3365;7.3398,-2.5379,0.5857;6.6063,-2.9725,-0.0902;7.513,-1.1073,0.5967;6.6387,-0.3224,-0.1331;6.8251,0.9999,-0.3551;5.5796,1.5654,-0.8627;5.0811,1.8144,-0.0006;4.9144,0.4081,-1.474;5.2221,0.3676,-2.5268;3.3925,0.4918,-1.4199;2.9663,-0.3142,-2.0268;3.0839,1.4469,-1.8585;2.765,0.3676,0.3069;0.998,0.7066,-0.0085;0.4895,0.6361,0.956;0.569,-0.0344,-0.6891;0.8555,1.7126,-0.4126;5.4928,-0.7393,-0.767;4.8915,-1.407,-0.2998;8.5776,-0.4858,1.4293;8.8131,0.728,1.4465)|\",3.545643472015\r\n\"[H]/N=C(O[H])\\C(C#N)=C(/[H])C1=C([H])C([H])=C(O[H])C(Cl)=C1[H] 
|(-1.9933,-4.3601,-1.1811;-2.2687,-4.9746,-0.4157;-2.2245,-4.3685,0.7002;-2.655,-5.0447,1.7945;-2.5495,-4.5087,2.5973;-1.7334,-2.9623,0.9494;-2.2622,-2.3207,2.1102;-2.7172,-1.8771,3.0883;-0.8082,-2.392,0.1295;-0.4543,-3.044,-0.6669;-0.1858,-1.0794,0.1287;-0.4943,-0.0262,1.0152;-1.2505,-0.1502,1.7796;0.1636,1.1903,0.9229;-0.0879,1.9921,1.614;1.1503,1.4125,-0.0497;1.813,2.5836,-0.1714;1.4964,3.2056,0.5033;1.463,0.3703,-0.9406;2.6842,0.6176,-2.1644;0.8057,-0.8459,-0.8491;1.0696,-1.6312,-1.5505)|\",3.904833754675\r\n\"[H]O/C(=N/C([H])([H])C1=NOC(C([H])([H])[H])=C1[H])[C@@]1([H])C([H])=NC(=O)C([H])=C1[H] |(6.455,5.4643,0.3842;5.8093,4.7777,0.145;6.3477,3.5504,0.3615;5.5908,2.551,0.1933;6.062,1.1837,0.3304;6.9297,1.0637,0.9962;5.2441,0.5948,0.761;6.4311,0.6094,-1.0139;7.7008,0.424,-1.3032;7.7168,-0.0679,-2.6142;6.4469,-0.1613,-3.0716;6.2709,-0.6717,-4.46;5.2094,-0.7086,-4.716;6.6909,-1.6786,-4.5635;6.7815,-0.0245,-5.1821;5.5842,0.2565,-2.1023;4.5074,0.3178,-2.1495;7.8506,3.5793,0.7509;8.101,2.5858,1.1442;8.6819,3.7821,-0.5033;8.6248,2.9705,-1.2323;9.4323,4.7813,-0.7795;9.5448,5.8451,0.168;10.1627,6.8517,-0.1198;8.9076,5.6836,1.5004;9.1131,6.4677,2.223;8.1305,4.6284,1.7854;7.6636,4.5155,2.7621)|\",4.6858005056100005\r\n\"[H]O/C(=N/[C@@]([H])(C(=O)OC([H])([H])C([H])([H])[H])C([H])([H])[H])[C@]1([H])C(=O)N=C([H])C([H])=C1[H] 
|(1.515,1.7236,1.3859;1.6985,0.9215,0.8446;3.0023,0.9288,0.5161;3.5352,-0.1727,0.1734;4.9394,-0.3017,-0.1963;5.4137,0.6457,-0.4975;5.7042,-0.7744,1.0514;6.1525,-1.8826,1.2277;5.8107,0.2407,1.9459;6.4248,-0.075,3.2229;7.2458,-0.7744,3.049;6.828,0.8788,3.5719;5.4051,-0.6445,4.1966;5.8772,-0.8109,5.1716;5.0185,-1.6011,3.8342;4.5665,0.0458,4.3331;5.0781,-1.3161,-1.3309;6.1281,-1.4504,-1.6095;4.519,-0.9708,-2.2064;4.6738,-2.2813,-1.0184;3.7043,2.3091,0.5199;4.6824,2.1429,1.0064;3.0471,3.4163,1.3442;2.1161,3.2174,2.111;3.5789,4.7104,1.2179;4.0547,5.0334,0.0597;4.4276,6.0541,-0.0495;4.1391,4.1374,-1.0905;4.3783,4.5438,-2.068;3.9855,2.8197,-0.8783;4.0923,2.095,-1.6798)|\",3.90211261617\r\n\"[H]O/C(=N/[C@@]([H])(C([H])([H])C([H])([H])[H])C([H])(C([H])([H])[H])C([H])([H])[H])[C@@]1([H])C([H])=NC(=O)C([H])=C1[H] |(6.6913,-0.489,2.476;5.7515,-0.7196,2.5642;4.9818,0.2939,2.075;3.7388,0.0983,2.0014;2.7588,1.0577,1.5314;3.2035,2.0526,1.3359;2.1584,0.5612,0.1992;1.7631,-0.45,0.3439;1.3075,1.2088,-0.0501;3.1551,0.5547,-0.9635;2.6766,0.2103,-1.8869;3.9958,-0.1172,-0.758;3.5551,1.5591,-1.156;1.6749,1.2815,2.6252;0.9604,1.9946,2.1878;0.9148,-0.0007,2.9962;0.1937,0.2008,3.7971;1.6074,-0.7731,3.3447;0.3555,-0.4068,2.1468;2.2795,1.9368,3.8752;1.4959,2.193,4.5969;2.8136,2.8634,3.6254;2.9835,1.2603,4.3737;5.7969,1.5603,1.6969;5.092,2.2558,1.2219;6.3152,2.2201,2.9572;5.5475,2.5285,3.6729;7.5335,2.4431,3.2776;8.5696,2.0556,2.3673;9.7333,2.2337,2.6696;8.1871,1.4477,1.0696;9.0064,1.2083,0.3985;6.9049,1.2304,0.7425;6.6305,0.7952,-0.2163)|\",4.64770456654\r\n\"[H]O/C(=N/C1=C([H])C([H])=C([H])C(C([H])([H])[H])=C1[H])N([H])[C@@]1([H])N([H])N([H])C([H])([H])C1([H])[H] 
|(4.602,-1.7617,1.371;4.6189,-2.528,0.7696;3.8398,-2.1544,-0.2805;3.2967,-0.9919,-0.2578;2.4454,-0.5491,-1.285;1.3081,-1.2722,-1.6949;1.0756,-2.2191,-1.2156;0.4646,-0.7519,-2.6752;-0.4165,-1.3136,-2.9759;0.7345,0.4845,-3.2616;0.0681,0.8824,-4.023;1.855,1.2262,-2.863;2.1692,2.5634,-3.4951;1.3167,2.9475,-4.0643;3.0195,2.4879,-4.1856;2.4353,3.3109,-2.739;2.6916,0.7015,-1.873;3.5583,1.2636,-1.5339;3.7593,-3.1329,-1.2342;3.3042,-2.8329,-2.0869;4.6882,-4.2845,-1.3079;5.7187,-3.933,-1.2041;4.4335,-5.3169,-0.3052;4.316,-4.9382,0.6315;3.2247,-5.9922,-0.718;2.4647,-5.2964,-0.7221;3.5421,-6.2631,-2.1303;4.1085,-7.1967,-2.1857;2.6174,-6.3823,-2.7019;4.4243,-5.0674,-2.6139;3.9316,-4.4479,-3.3708;5.3682,-5.4154,-3.0411)|\",5.67629492143\r\n\"[H]O/C(=N/[C@@]1([H])N([H])N([H])C([H])([H])C1([H])[H])C([H])([H])OC1=C([H])C([H])=C(F)C([H])=C1[H] |(-4.7999,-3.2293,-0.6375;-4.3315,-4.0842,-0.6754;-3.4951,-4.13,0.3884;-2.7637,-5.1522,0.5491;-1.9039,-5.302,1.7175;-2.1346,-4.6035,2.5333;-2.0106,-6.6738,2.2537;-2.975,-7.0033,2.2362;-1.2427,-7.5095,1.3527;-1.6722,-7.4226,0.4211;0.011,-6.7464,1.2945;0.6029,-6.9871,2.1825;0.5789,-7.0306,0.4042;-0.4059,-5.2451,1.3066;-0.308,-4.7848,0.3192;0.1806,-4.6596,2.0215;-3.5941,-2.8994,1.3041;-2.6055,-2.5652,1.6305;-4.1815,-3.1551,2.1926;-4.2999,-1.8295,0.6754;-3.5748,-0.9014,-0.0563;-2.3216,-1.1544,-0.6201;-1.8373,-2.1178,-0.5034;-1.6872,-0.1565,-1.3655;-0.7155,-0.3289,-1.816;-2.319,1.0664,-1.5416;-1.708,2.0254,-2.2684;-3.5696,1.3286,-0.9898;-4.0337,2.2972,-1.1431;-4.1969,0.3377,-0.241;-5.1681,0.509,0.2116)|\",4.90893386302\r\n\"[H]C1=NC(=O)C([H])=C([H])[C@@]1([H])C(=O)/N=C1/C([H])=C([H])N([H])N1[H] 
|(-1.188,-0.4175,0.4502;-0.4886,0.2569,-0.0479;-0.9748,1.2256,-0.7282;-0.074,2.1219,-1.3899;-0.5235,3.0127,-2.0882;1.3824,1.9448,-1.1942;2.0156,2.6886,-1.6682;1.8788,0.9347,-0.4695;2.9465,0.8219,-0.3051;0.9728,-0.08,0.1494;1.1182,-1.0582,-0.3447;1.2943,-0.3573,1.637;2.2404,0.2395,2.1798;0.4799,-1.2963,2.1917;0.672,-1.5892,3.473;0.0227,-2.6186,4.2408;-0.7502,-3.2717,3.8669;0.605,-2.6233,5.4721;0.393,-3.2199,6.3486;1.6316,-1.7059,5.5129;1.791,-1.1458,6.3452;1.5601,-0.99,4.3273;2.3327,-0.4413,3.9443)|\",4.69124278262\r\n\"[H]O/C(=N/C1([H])C([H])([H])N([H])N([H])C1([H])[H])N([H])N([H])C1=C([H])C([H])=C([H])C([H])=C1[H] |(6.0658,-0.9979,0.5541;5.3078,-1.4701,0.1654;4.509,-0.4858,-0.3177;4.86,0.7354,-0.2051;4.03,1.8009,-0.7437;3.5724,1.5163,-1.708;4.8815,3.0939,-0.9269;5.015,3.3825,-1.9747;5.8695,2.9054,-0.4938;4.2147,4.1783,-0.1767;3.5622,4.6699,-0.7864;3.4023,3.5508,0.8317;4.0346,3.3416,1.6042;2.9236,2.2889,0.2413;2.7007,1.5546,1.021;1.9977,2.5021,-0.3096;3.351,-1.0042,-0.8787;2.8573,-0.3819,-1.5048;3.2536,-2.3495,-1.2522;4.1229,-2.6945,-1.6493;2.7061,-3.2482,-0.3028;2.9875,-4.6151,-0.4393;3.6621,-4.9504,-1.2247;2.4058,-5.5404,0.4239;2.6373,-6.5958,0.3062;1.5396,-5.1204,1.4357;1.0889,-5.843,2.1097;1.2648,-3.759,1.5702;0.595,-3.414,2.3538;1.8393,-2.8229,0.7103;1.6255,-1.7657,0.823)|\",5.787861600135001\r\n\"[H]O/C(=N/C1(C([H])([H])[H])C([H])([H])C([H])([H])C([H])([H])C1([H])[H])[C@@]1([H])C([H])=NC(=O)C([H])=C1[H] 
|(3.8861,1.9798,3.0409;3.7445,1.087,2.6852;2.9584,1.1526,1.5703;2.7905,0.087,0.9217;1.9752,-0.0916,-0.28;0.4797,-0.0524,0.0995;-0.1457,-0.3404,-0.7526;0.1622,0.949,0.4132;0.2801,-0.7466,0.9224;2.3193,-1.4848,-0.8797;3.3913,-1.6599,-0.7314;1.7829,-2.2923,-0.3693;2.0054,-1.3692,-2.3737;2.4649,-2.1646,-2.97;0.9228,-1.4206,-2.549;2.5449,0.0288,-2.7289;2.0859,0.4501,-3.6294;3.6229,-0.0357,-2.9193;2.2879,0.8976,-1.4691;1.4464,1.5815,-1.6349;3.1646,1.5118,-1.2401;2.3777,2.5759,1.323;1.8123,2.5267,0.3864;1.4059,2.913,2.4353;0.5617,2.2259,2.5441;1.468,3.8901,3.2586;2.5572,4.8144,3.1539;2.6406,5.7346,3.9435;3.5556,4.6195,2.0746;4.3448,5.3631,2.0201;3.4689,3.5987,1.2094;4.1998,3.4691,0.4137)|\",4.416407793614999\r\n\"[H]O/C(=N/C([H])([H])C1=C([H])N(C([H])([H])[H])N=C1[H])[C@]1([H])C(=O)N=C([H])C([H])=C1[H] |(9.0669,-0.5881,0.287;8.2111,-0.1153,0.4065;8.4219,1.1962,0.1913;7.4208,1.9272,-0.0766;7.512,3.3605,-0.2914;8.5334,3.7398,-0.467;6.9565,3.5735,-1.2159;6.8933,4.126,0.8518;5.9342,3.6733,1.7436;5.4618,2.7063,1.8348;5.6393,4.7097,2.5719;4.7091,4.7264,3.6846;4.2046,3.7596,3.7365;3.9668,5.5159,3.5385;5.2415,4.9092,4.6229;6.3506,5.8205,2.2824;7.1081,5.4694,1.2415;7.7822,6.1964,0.8046;9.8805,1.6978,0.3731;10.1006,2.3341,-0.5041;10.9861,0.6386,0.3576;10.7945,-0.517,0.0054;12.2762,1.0431,0.7329;12.361,1.9898,1.6112;13.3675,2.2698,1.9299;11.2293,2.7108,2.1828;11.39,3.3697,3.0301;10.0314,2.5921,1.5855;9.1648,3.1595,1.9132)|\",3.5429223335100004\r\n\"[H]C1=NN(C([H])([H])[H])C([H])=C1C(=O)/C([H])=C(\\[H])C1=C(C([H])([H])[H])C([H])=C(C([H])([H])[H])C([H])=C1[H] 
|(8.3914,1.0658,-3.8797;8.6303,0.0117,-3.8247;9.0856,-0.6021,-4.9113;9.2884,-1.8879,-4.5161;9.7824,-2.8536,-5.4802;9.9376,-3.8088,-4.9752;10.7287,-2.5039,-5.9009;9.0576,-2.9815,-6.2892;8.9718,-2.0868,-3.2212;9.0681,-3.0457,-2.7336;8.5311,-0.8683,-2.7132;8.0906,-0.6575,-1.3227;8.1044,-1.5916,-0.5178;7.6371,0.702,-0.944;7.657,1.4761,-1.7059;7.2304,0.9577,0.3168;7.2843,0.1106,0.9964;6.7445,2.2238,0.864;6.6027,2.3894,2.2676;6.9485,1.2843,3.2414;6.7951,1.6185,4.2717;7.9934,0.9666,3.1472;6.3274,0.3932,3.0875;6.1417,3.6122,2.7593;6.041,3.7342,3.8364;5.8054,4.6858,1.9246;5.3017,5.9851,2.5058;5.1543,6.7409,1.7282;6.0052,6.393,3.2421;4.3432,5.8463,3.0221;5.9423,4.5079,0.543;5.6815,5.3182,-0.1335;6.4003,3.3014,0.0281;6.4776,3.1843,-1.0488)|\",4.144293943115\r\n\"[H]C1=NN(C([H])([H])[H])C([H])=C1C(=O)/C([H])=C(\\[H])C1=C([H])C([H])=C([H])C(C#N)=C1[H] |(3.0981,-0.6759,1.4782;3.8441,-0.7606,0.7004;4.1975,-1.9464,0.2266;5.1296,-1.6721,-0.7271;5.7741,-2.7708,-1.4234;6.3306,-2.3757,-2.2757;5.0111,-3.4686,-1.7749;6.4595,-3.2989,-0.7532;5.3673,-0.349,-0.8512;6.0857,0.0231,-1.5678;4.5435,0.3013,0.0662;4.376,1.7311,0.3613;3.5826,2.105,1.224;5.1942,2.7058,-0.4138;5.8797,2.3183,-1.1629;5.0772,4.0251,-0.1782;4.3647,4.3088,0.5951;5.7902,5.1243,-0.8326;5.5323,6.4437,-0.4159;4.8104,6.6117,0.3789;6.1817,7.5296,-1.0012;5.9649,8.5378,-0.6612;7.1078,7.3267,-2.0197;7.6206,8.1626,-2.4839;7.3785,6.0152,-2.451;8.3309,5.7875,-3.5005;9.1029,5.6,-4.35;6.7249,4.9269,-1.8621;6.9515,3.9271,-2.2156)|\",4.465388286705\r\n\"[H]O/C(=N/[C@@]1([H])N([H])N([H])[C@]([H])(C([H])([H])C([H])([H])[H])C1([H])[H])C1=C([H])C([H])=C(C([H])([H])[H])N=C1[H] 
|(6.0339,-2.5041,-5.8039;5.8326,-1.8959,-5.0752;4.5401,-2.1017,-4.6752;4.2101,-1.5718,-3.5664;2.8523,-1.6513,-3.0748;2.1229,-1.804,-3.8822;2.7493,-2.8159,-2.1272;3.7088,-3.056,-1.873;2.1293,-2.3667,-0.9102;1.1223,-2.4617,-1.0579;2.4523,-0.9266,-0.7856;3.4721,-0.8717,-0.3782;1.4978,-0.1994,0.1616;1.7433,0.8706,0.1462;0.4747,-0.2807,-0.2374;1.5403,-0.72,1.6029;0.8475,-0.1625,2.2433;2.546,-0.6152,2.0271;1.274,-1.7805,1.6502;2.467,-0.4066,-2.2375;3.1811,0.4056,-2.3977;1.4688,-0.0503,-2.5225;3.714,-2.8903,-5.6396;2.8255,-3.8982,-5.2375;2.6989,-4.1278,-4.182;2.1141,-4.5919,-6.2115;1.4245,-5.381,-5.9264;2.2944,-4.2745,-7.5645;1.5489,-4.9951,-8.6582;0.9668,-5.8352,-8.2686;2.2489,-5.3679,-9.4134;0.8657,-4.3068,-9.1698;3.1397,-3.304,-7.9573;3.8197,-2.6448,-7.0195;4.4791,-1.8554,-7.3807)|\",5.023221680230001\r\n\"[H]C1=NC(=O)C([H])=C([H])[C@@]1([H])C(=O)N1C([H])([H])C([H])([H])C([H])([H])[C@@]1([H])C([H])(C([H])([H])[H])C([H])([H])[H] |(5.3997,-2.6256,3.6445;6.1975,-2.3015,2.972;7.1242,-3.1434,2.7202;8.1697,-2.7657,1.8186;9.096,-3.5282,1.6186;8.0928,-1.4472,1.1469;8.8698,-1.2405,0.4175;7.1374,-0.5595,1.4466;7.1004,0.4116,0.9575;6.0785,-0.8812,2.4626;6.2026,-0.2157,3.3335;4.657,-0.6693,1.8819;4.0093,-1.6308,1.4803;4.2004,0.6141,1.8141;4.7865,1.771,2.5122;4.7639,1.6193,3.6017;5.828,1.9361,2.2184;3.878,2.9386,2.1082;3.8593,3.7218,2.8718;4.2394,3.3881,1.1784;2.5072,2.2742,1.8933;1.8582,2.8579,1.2351;1.9876,2.1719,2.8531;2.8214,0.8736,1.3198;2.1667,0.1154,1.7634;2.7018,0.7146,-0.2175;3.0776,-0.2926,-0.4299;1.2272,0.7541,-0.6474;1.1332,0.5332,-1.7164;0.7761,1.7402,-0.4772;0.6313,0.0132,-0.1014;3.5423,1.7136,-1.024;3.4915,1.4748,-2.0923;4.5978,1.6831,-0.7312;3.1823,2.7437,-0.9061)|\",4.79464604581\r\n\"[H]C1=NC(=O)[C@@]([H])(C(=O)N2C([H])([H])C([H])([H])C([H])([H])[C@@]2([H])C([H])(C([H])([H])[H])C([H])([H])[H])C([H])=C1[H] 
|(5.5556,-4.8868,3.766;5.3344,-4.5524,2.7491;4.2479,-5.0162,2.2241;3.9282,-4.6227,0.9084;3.0416,-5.1951,0.3021;4.6965,-3.4555,0.2421;4.9548,-3.8135,-0.7617;3.7572,-2.2186,0.1162;3.8191,-1.3307,0.9615;2.8874,-2.211,-0.928;2.95,-3.092,-2.1145;3.9565,-3.0557,-2.5557;2.7297,-4.1256,-1.8418;1.9251,-2.4784,-3.0811;2.2168,-2.6171,-4.1263;0.9474,-2.9528,-2.9464;1.8684,-0.9998,-2.6598;0.9608,-0.4963,-3.0038;2.7273,-0.4558,-3.0716;1.9704,-1.0612,-1.123;2.4592,-0.179,-0.6989;0.5842,-1.2351,-0.4395;-0.0029,-1.9367,-1.0516;0.6632,-1.8155,0.9792;-0.3432,-1.8944,1.4075;1.2675,-1.1812,1.635;1.1018,-2.8179,0.9777;-0.1557,0.1133,-0.4236;-1.1691,-0.0096,-0.0256;-0.2447,0.5582,-1.4214;0.3682,0.8333,0.2178;5.9516,-3.0587,0.959;6.5765,-2.2956,0.5021;6.2699,-3.6168,2.1353;7.1685,-3.3463,2.6808)|\",4.12524597358\r\n\"[H]OC1=NC(=O)N(C([H])([H])C(=O)N(C([H])([H])[H])C2([H])C([H])([H])C2([H])[H])C([H])([H])C1([H])[H] |(-4.0418,1.7167,-1.8759;-3.7821,1.4927,-0.9685;-2.4512,1.6681,-0.821;-1.9486,1.3716,0.3133;-0.5706,1.5859,0.5464;-0.1134,1.477,1.6759;0.2338,1.9468,-0.5269;1.6691,1.9936,-0.2713;2.1422,2.5496,-1.0888;1.8282,2.5303,0.6593;2.2698,0.5755,-0.2661;2.0179,-0.1703,-1.2116;3.0683,0.201,0.7859;3.4399,-1.2112,0.8812;4.4838,-1.3099,1.1905;3.2998,-1.6678,-0.0973;2.8077,-1.7304,1.6135;3.2021,1.0025,1.9833;2.3078,1.027,2.6041;4.043,2.2592,1.9661;4.5463,2.5077,1.0362;3.6892,3.1135,2.5365;4.5303,1.0353,2.6952;4.5182,1.026,3.7812;5.3633,0.4976,2.2527;-0.2359,1.7373,-1.8935;0.4091,2.3006,-2.5743;-0.1611,0.6797,-2.1756;-1.6795,2.2262,-1.9957;-2.1218,1.9058,-2.9473;-1.7267,3.3234,-1.9583)|\",5.409623347940001\r\n\"[H]O/C(=N/C([H])([H])C([H])([H])N(C([H])([H])C([H])([H])[H])C([H])([H])C([H])([H])[H])[C@]1([H])C(=O)N=C([H])C([H])=C1[H] 
|(8.4863,-2.927,2.7522;8.105,-3.0207,1.8645;6.7569,-2.8202,1.9195;6.0941,-2.9975,0.8644;4.6711,-2.7764,0.7428;4.2571,-3.6075,0.1588;4.1284,-2.7657,1.7023;4.3802,-1.4475,0.0236;4.9787,-1.3941,-0.9021;4.7274,-0.6385,0.6711;2.9458,-1.2909,-0.2315;2.4958,0.0996,-0.0852;3.1676,0.7976,-0.6214;1.517,0.187,-0.5686;2.3553,0.5253,1.3776;2.0066,1.5632,1.4394;3.3049,0.4623,1.9192;1.6288,-0.1161,1.8885;2.543,-1.8871,-1.5108;2.7539,-1.2047,-2.3577;3.165,-2.7742,-1.6756;1.075,-2.3172,-1.5378;0.8354,-2.797,-2.4942;0.393,-1.4685,-1.4164;0.8726,-3.0276,-0.7294;6.2325,-2.3612,3.3128;5.1434,-2.4661,3.2803;6.5088,-0.855,3.5031;5.8191,-0.0272,2.9439;7.5883,-0.4415,4.3203;8.0853,-1.3033,5.1451;8.8976,-0.9498,5.785;7.6769,-2.7011,5.2856;8.1178,-3.297,6.079;6.792,-3.2136,4.4113;6.4865,-4.2569,4.4536)|\",2.5633124717100007\r\n\"[H]C1C([H])=C2N=C(SC([H])([H])[H])NC(=O)C2=C([H])[C@]1([H])N(=O)=O |(5.6315,-2.3369,6.1688;5.1239,-2.5499,5.2341;4.8868,-1.5964,4.3204;5.1946,-0.567,4.4734;4.1892,-1.896,3.0741;3.9808,-0.9415,2.2149;3.2898,-1.3089,1.0526;3.1028,0.0955,0.0161;2.1991,-0.6247,-1.3942;2.0526,0.1884,-2.1094;1.2359,-1.0192,-1.0668;2.7821,-1.4282,-1.8471;2.8211,-2.4674,0.6967;3.0049,-3.5498,1.5591;2.6071,-4.6706,1.288;3.7294,-3.2669,2.8451;3.9409,-4.2328,3.759;3.5669,-5.2344,3.5709;4.725,-3.9783,5.0022;5.6264,-4.6107,5.0068;3.9756,-4.5036,6.2581;4.5572,-4.3525,7.3236;2.8896,-5.0371,6.0926)|\",3.39053857723\r\n\"[H]/N=C(/N([H])OC([H])([H])C1=C([H])C([H])=C([H])C([H])=C1F)C([H])([H])OC([H])(C([H])([H])[H])C([H])([H])[H] 
|(5.1303,-0.4319,-1.0187;4.8785,0.3795,-0.4484;4.0062,1.0595,-1.0774;3.3861,2.1349,-0.4091;3.9806,2.4501,0.3575;3.1586,3.2501,-1.2746;1.7855,3.6662,-1.1728;1.1361,2.7876,-1.246;1.6081,4.1444,-0.2003;1.5238,4.6361,-2.2931;1.4577,6.0186,-2.0889;1.5873,6.4053,-1.0811;1.2279,6.8973,-3.1491;1.1781,7.9665,-2.966;1.0656,6.3983,-4.4426;0.8871,7.0756,-5.2728;1.1315,5.0234,-4.6783;1.0089,4.6024,-5.6708;1.3608,4.1783,-3.6026;1.4184,2.8419,-3.8297;3.4698,0.7882,-2.4708;3.6649,1.659,-3.1069;2.3754,0.6685,-2.4201;4.0905,-0.3772,-2.9674;3.4353,-0.9723,-4.0923;2.3806,-1.1531,-3.8202;3.478,-0.0708,-5.3299;3.0246,-0.5833,-6.1862;4.516,0.1736,-5.5839;2.9293,0.8641,-5.1745;4.1292,-2.3102,-4.3238;3.6422,-2.8619,-5.1349;4.095,-2.9207,-3.4161;5.1795,-2.1525,-4.5932)|\",5.7035063064800005\r\n\"[H]OC(=O)/C(C#N)=C(\\[H])C1=C([H])C(Cl)=C(O[H])C([H])=C1[H] |(-3.7633,-3.7049,-1.9103;-3.2401,-4.4715,-1.6135;-2.0588,-4.0862,-1.0848;-1.2765,-4.902,-0.6567;-1.7804,-2.6019,-1.0609;-2.7781,-1.7434,-1.6052;-3.6485,-1.1205,-2.0702;-0.5984,-2.1923,-0.5205;0.0089,-3.0225,-0.1634;-0.0221,-0.8765,-0.3398;1.2476,-0.8147,0.2766;1.741,-1.7302,0.5872;1.8863,0.3946,0.498;3.455,0.4343,1.2621;1.2735,1.6,0.1075;1.9284,2.7572,0.3408;1.393,3.5025,0.0235;0.0119,1.5475,-0.5066;-0.4668,2.4754,-0.8124;-0.6274,0.3384,-0.7284;-1.5999,0.3438,-1.2051)|\",3.942929693745\r\n\"[H]OC(=O)/C(C#N)=C(\\[H])C1=C([H])C(C([H])([H])[H])=C([H])C(C([H])([H])[H])=C1[H] 
|(5.9433,-5.011,-2.1591;6.8075,-4.9419,-1.7141;6.9343,-3.7639,-1.0666;7.9442,-3.4991,-0.4582;5.7593,-2.8159,-1.1462;4.6311,-3.2511,-1.9006;3.7666,-3.706,-2.5388;5.8657,-1.6267,-0.4921;6.8209,-1.5157,0.0183;4.9529,-0.5053,-0.3547;3.6717,-0.4382,-0.9384;3.3013,-1.2609,-1.5397;2.8589,0.6797,-0.7533;1.4906,0.7458,-1.3925;0.9311,1.6216,-1.0501;0.8983,-0.1462,-1.159;1.5664,0.8048,-2.4853;3.3409,1.7397,0.029;2.7085,2.6126,0.1781;4.6074,1.7116,0.6249;5.1059,2.8669,1.4622;4.3495,3.6526,1.5485;6.0059,3.3163,1.0251;5.3679,2.5409,2.4758;5.4009,0.582,0.4227;6.3888,0.5327,0.8756)|\",4.215043544244999\r\n\"[H]O/C(=N/C([H])([H])[C@]1([H])OC([H])([H])C([H])([H])C([H])([H])C1([H])[H])[C@@]1([H])C([H])=NC(=O)C([H])=C1[H] |(-3.6039,-4.2604,-0.6281;-3.803,-3.3387,-0.8625;-3.0901,-2.4996,-0.0618;-3.1466,-1.2652,-0.3192;-2.427,-0.2724,0.4484;-3.0215,0.6489,0.4406;-2.2742,-0.5406,1.5067;-1.0602,0.0408,-0.1669;-1.1977,0.2157,-1.2473;-0.2442,-1.1244,0.0088;1.0434,-0.994,-0.5931;1.5486,-1.9516,-0.432;0.9289,-0.8477,-1.6807;1.8321,0.1664,0.0147;2.0362,-0.0589,1.0697;2.8008,0.261,-0.4922;1.0233,1.4671,-0.095;0.9535,1.7629,-1.1518;1.5316,2.2859,0.4277;-0.3923,1.2664,0.4684;-1.0093,2.1559,0.2895;-0.3418,1.1163,1.5558;-2.3329,-3.2233,1.081;-1.7083,-2.4679,1.5722;-3.3287,-3.7616,2.0839;-3.9791,-3.0141,2.5489;-3.5021,-4.9792,2.4372;-2.673,-5.9913,1.8511;-2.8416,-7.1558,2.1577;-1.6243,-5.5838,0.8853;-1.0001,-6.3843,0.4998;-1.4475,-4.3014,0.5348;-0.6708,-3.9934,-0.1606)|\",4.70212733664\r\n\"[H]C1=NC([H])=C([H])C(C(=O)N([H])[C@@]2([H])N([H])N([H])[C@]([H])(C([H])(C([H])([H])[H])C([H])([H])[H])C2([H])[H])=C1[H] 
|(8.7383,-2.7716,-3.3244;7.8749,-3.2365,-3.7974;8.1375,-4.0882,-4.7949;7.0881,-4.6866,-5.3755;7.321,-5.3814,-6.1803;5.764,-4.4644,-5.0022;4.9419,-4.9783,-5.4882;5.4979,-3.5564,-3.9733;4.0589,-3.3273,-3.5875;3.1876,-4.1498,-3.8625;3.7816,-2.1712,-2.9109;4.4838,-1.4438,-2.9063;2.3774,-1.7854,-2.6257;1.8259,-1.7493,-3.5702;1.6954,-2.6843,-1.7085;1.8021,-3.6633,-1.9623;2.2616,-2.4332,-0.4046;3.2576,-2.6871,-0.44;2.2138,-0.9513,-0.3332;3.0734,-0.62,0.2635;0.9249,-0.4627,0.3607;0.0818,-0.8068,-0.2548;0.7836,-1.08,1.7582;-0.1595,-0.7717,2.2247;1.6002,-0.7516,2.4162;0.8097,-2.1708,1.7041;0.8836,1.0712,0.4464;-0.0426,1.4049,0.9278;0.9338,1.5541,-0.5361;1.7207,1.4515,1.0475;2.3437,-0.4722,-1.8236;3.2287,0.15,-1.9977;1.4725,0.1126,-2.1291;6.5869,-2.9348,-3.3524;6.456,-2.2549,-2.5152)|\",4.258581760325\r\n\"[H]O/C(=N/C([H])([H])C1=NOC(C([H])([H])[H])=C1[H])[C@]1([H])C(=O)N=C([H])C([H])=C1[H] |(6.9095,5.4418,-0.9528;6.2356,4.7437,-1.0026;6.6769,3.5928,-0.4317;5.8366,2.6534,-0.3251;6.194,1.3373,0.1711;7.1053,1.3104,0.7858;5.3696,0.9836,0.8006;6.3808,0.3768,-0.9776;7.5967,0.019,-1.3295;7.4342,-0.8277,-2.4336;6.1151,-0.9519,-2.709;5.7492,-1.8299,-3.8556;4.6657,-1.8331,-3.9959;6.0815,-2.86,-3.6828;6.2198,-1.4814,-4.7819;5.3953,-0.2093,-1.8214;4.3241,-0.0849,-1.7714;8.178,3.5726,-0.0295;8.2959,2.8315,0.7654;8.6307,4.9099,0.5675;8.6132,5.1148,1.7613;9.0398,5.9229,-0.3391;9.5729,5.5225,-1.4515;9.9282,6.3037,-2.1283;9.6987,4.1351,-1.8885;10.2812,3.9136,-2.7771;9.0054,3.1887,-1.2308;8.9738,2.1491,-1.5487)|\",4.106198004045\r\n\"[H]O/C(=N/C([H])([H])[H])N([H])[C@]1([H])C2=C(C([H])=C([H])C([H])=C2[H])C([H])([H])[C@]1([H])O[H] 
|(1.9827,-3.6778,-2.5774;1.5884,-2.7864,-2.6038;1.4877,-2.4382,-1.2784;1.8245,-3.2968,-0.3979;1.7543,-2.9092,0.9986;2.1574,-3.7171,1.6161;0.7188,-2.731,1.3364;2.3368,-1.9993,1.2192;1.0587,-1.1461,-1.113;0.7058,-0.931,-0.1905;0.5614,-0.3015,-2.2025;1.3036,-0.3504,-3.0043;0.334,1.111,-1.7042;-1.035,1.3696,-1.5476;-1.4607,2.6134,-1.0812;-2.5202,2.8268,-0.9615;-0.5074,3.5863,-0.7686;-0.8303,4.5593,-0.4074;0.8584,3.322,-0.9194;1.5875,4.0903,-0.6764;1.2875,2.0778,-1.3877;2.3475,1.8677,-1.5069;-1.8583,0.1623,-1.9358;-2.757,0.4109,-2.5102;-2.1869,-0.4041,-1.0535;-0.8632,-0.696,-2.7417;-0.8958,-0.3712,-3.7954;-1.1917,-2.0615,-2.6242;-0.4508,-2.5714,-2.9959)|\",5.62187215133\r\n\"[H]O/C(=N/OC([H])([H])[H])[C@]1([H])C(=O)C2=C(/N=C\\1[H])C([H])=C([H])C([H])=C2[H] |(4.6039,-1.3498,-1.4833;3.967,-0.6095,-1.6275;3.521,-0.2256,-0.4136;2.3542,0.297,-0.3729;2.0066,0.7535,0.911;0.6175,1.0628,0.9177;0.414,1.4576,1.9161;0.0109,0.167,0.7399;0.3836,1.8172,0.1583;4.5351,-0.4489,0.7291;5.4392,0.075,0.3737;4.9147,-1.9369,0.7682;5.2653,-2.5052,-0.2694;4.827,-2.6125,2.0616;4.4722,-1.8667,3.2111;4.2266,-0.4828,3.2019;4.2234,0.143,2.0845;4.0582,1.2171,2.1152;4.3817,-2.5285,4.443;4.1139,-1.9446,5.3175;4.6336,-3.8946,4.5284;4.5565,-4.3943,5.4903;4.9894,-4.63,3.3884;5.1873,-5.6948,3.4657;5.0884,-3.9887,2.1609;5.3629,-4.5278,1.2599)|\",3.708911782315\r\n\"[H]O/C(=N/OC([H])([H])C([H])([H])[H])[C@]1([H])C(=O)C2=C(/N=C\\1[H])C([H])=C([H])C([H])=C2[H] 
|(3.2699,-0.8658,-2.5053;4.1905,-0.9504,-2.1602;4.2911,-2.1752,-1.6009;5.1314,-2.303,-0.6457;5.249,-3.629,-0.1952;5.9632,-3.6381,1.0486;6.2311,-4.6905,1.186;6.8824,-3.0526,0.9295;5.126,-3.1227,2.2104;5.6958,-3.1858,3.145;4.2119,-3.7158,2.3224;4.847,-2.0776,2.046;3.3869,-3.2394,-2.2583;3.6495,-3.1721,-3.3281;1.9276,-2.7683,-2.1684;1.6306,-1.626,-2.5283;0.952,-3.7215,-1.6413;1.3728,-5.0313,-1.3066;2.6867,-5.5002,-1.478;3.5932,-4.6859,-1.873;4.5927,-5.0865,-2.0226;0.429,-5.9335,-0.7973;0.7637,-6.9357,-0.55;-0.8953,-5.5432,-0.6216;-1.6151,-6.2534,-0.2236;-1.3099,-4.2459,-0.957;-2.346,-3.9514,-0.8191;-0.3889,-3.3412,-1.468;-0.6777,-2.3307,-1.7389)|\",3.692584951285\r\n\"[H]O/C(=N/[C@@]1([H])C([H])([H])N(C([H])([H])[H])C([H])([H])C1([H])[H])[C@@]1([H])C([H])=NC(=O)C([H])=C1[H] |(7.8768,-2.0228,-0.4855;7.0355,-1.8077,-0.0493;6.1954,-1.2111,-0.9403;5.0203,-0.962,-0.5552;4.0137,-0.309,-1.3604;4.2844,-0.2723,-2.4281;2.6359,-1.022,-1.2355;2.7215,-1.9637,-0.6831;2.2505,-1.2535,-2.2492;1.771,-0.0783,-0.5306;0.3544,-0.3401,-0.6806;-0.2214,0.3936,-0.1058;0.1175,-1.3358,-0.2892;0.0133,-0.2974,-1.7351;2.2062,1.2511,-0.9444;1.8907,1.4972,-1.9808;1.792,2.0185,-0.2807;3.7296,1.1343,-0.8557;4.0488,1.2047,0.1879;4.2602,1.8954,-1.4372;6.8619,-0.8901,-2.3045;6.0611,-0.5613,-2.9814;7.8097,0.2776,-2.1239;7.3525,1.1956,-1.7427;9.0678,0.3048,-2.3519;9.7056,-0.877,-2.8493;10.9038,-0.8748,-3.0515;8.8759,-2.079,-3.1099;9.4014,-2.9378,-3.5166;7.5551,-2.0889,-2.879;6.9513,-2.9708,-3.0843)|\",3.43951907032\r\n\"[H]O/C(=N/[C@@]([H])(C(=O)OC([H])([H])C([H])([H])[H])C([H])([H])[H])[C@@]1([H])N=NC(=O)C([H])=C1[H] 
|(2.5947,-1.9668,3.2191;2.7371,-1.1781,2.655;3.4959,-1.5576,1.6066;4.0434,-0.6516,0.9096;4.9142,-0.9414,-0.2211;5.1349,-2.0123,-0.347;4.1424,-0.5456,-1.4907;3.4105,-1.3139,-2.0889;4.331,0.7332,-1.8331;3.5767,1.2329,-2.9695;2.5861,0.773,-2.9588;3.485,2.3037,-2.7748;4.3051,0.9599,-4.2758;3.7573,1.4186,-5.1068;4.3741,-0.1153,-4.4635;5.3146,1.3831,-4.2552;6.2333,-0.182,-0.0431;6.8875,-0.3247,-0.91;6.7494,-0.5527,0.8482;6.0412,0.8849,0.0857;3.5792,-3.0951,1.4012;4.6501,-3.3542,1.37;3.0832,-3.7339,2.655;2.5278,-4.8458,2.6568;2.3365,-5.5902,1.39;2.155,-6.7837,1.4758;2.3226,-4.7973,0.1622;1.8512,-5.2324,-0.7134;2.9172,-3.5952,0.1507;2.9667,-2.9776,-0.7444)|\",3.5810182725800006\r\n\"[H]O/C(=N/[C@@]([H])(C(=O)OC([H])([H])C([H])([H])[H])C([H])([H])[H])[C@@]1([H])C([H])=NC(=O)C([H])=C1[H] |(3.5467,-3.7217,0.894;3.4476,-2.9849,1.5198;4.3383,-2.0024,1.2203;4.2532,-0.9279,1.8787;5.1376,0.2072,1.6869;6.0877,-0.0658,1.1949;4.4592,1.2207,0.748;4.206,2.3691,1.0287;4.2089,0.6603,-0.4595;3.5585,1.5081,-1.4434;3.9574,2.5205,-1.3474;3.864,1.0877,-2.4049;2.0474,1.4905,-1.2755;1.58,2.0785,-2.0736;1.7613,1.9271,-0.3148;1.6577,0.4685,-1.3279;5.4534,0.8486,3.0381;6.1061,1.7167,2.9105;5.9502,0.1186,3.6841;4.5312,1.175,3.5244;5.3389,-2.3878,0.1029;5.9457,-1.4978,-0.1058;6.2607,-3.4809,0.6024;6.8491,-3.2306,1.4905;6.4192,-4.6514,0.1122;5.6685,-5.0206,-1.0526;5.7938,-6.1364,-1.5171;4.7586,-4.0202,-1.6615;4.2397,-4.3386,-2.5605;4.6126,-2.7916,-1.1451;3.9586,-2.0509,-1.5985)|\",4.696685059630001\r\n\"[H]O/C(=N/C([H])([H])C1=C([H])N(C([H])([H])[H])N=C1[H])[C@]1([H])C([H])=NC(=O)C([H])=C1[H] 
|(9.8408,0.6237,1.59;8.8858,0.6665,1.4126;8.6396,1.6138,0.4686;7.4336,1.8535,0.1828;7.038,2.7954,-0.8508;6.1198,3.2824,-0.4966;7.7654,3.6109,-1.0077;6.7822,2.1084,-2.1682;6.3596,0.8051,-2.3772;6.1495,0.0099,-1.6767;6.2414,0.6361,-3.7205;5.8035,-0.5476,-4.4364;5.7802,-1.3918,-3.7441;6.503,-0.7607,-5.248;4.8045,-0.396,-4.8575;6.5567,1.7525,-4.4105;6.8853,2.6433,-3.4738;7.1901,3.64,-3.7691;9.9196,2.2232,-0.1641;9.6089,3.1382,-0.6864;10.4389,1.2564,-1.2104;9.7457,1.0423,-2.0282;11.5734,0.6658,-1.2276;12.5014,0.9216,-0.1694;13.5575,0.321,-0.1332;12.1483,1.9299,0.8623;12.9155,2.1388,1.6018;10.9594,2.5506,0.8643;10.7102,3.2951,1.6182)|\",4.138851666104999\r\n\"[H]C1C2=C(/N=C(/C([H])([H])C([H])([H])C([H])([H])OC([H])([H])[H])NC2=O)C([H])=C([H])[C@@]1([H])F |(11.2566,2.8941,-1.1313;10.7316,2.3349,-1.9021;9.3862,2.3638,-1.9576;8.6663,1.6657,-3.025;7.3679,1.681,-3.1061;6.6754,2.3946,-2.1088;5.1786,2.3185,-2.2575;4.7262,3.0268,-1.5578;4.929,2.6322,-3.2791;4.6207,0.8959,-2.0421;5.123,0.1951,-2.7179;3.5562,0.8884,-2.3033;4.771,0.4054,-0.6064;4.2628,1.0948,0.0907;5.8337,0.3779,-0.309;4.2051,-0.8895,-0.5215;4.2747,-1.4301,0.7796;3.8152,-2.4216,0.7433;3.7297,-0.8101,1.51;5.3166,-1.5318,1.1254;7.1721,3.0786,-1.1204;8.566,3.1401,-0.9722;9.085,3.7839,-0.0754;9.443,0.9803,-4.0559;8.8821,0.5379,-4.8732;10.783,0.9403,-3.9998;11.3749,0.4587,-4.7732;11.5469,1.4969,-2.835;11.9503,0.6492,-2.2511;12.6557,2.2186,-3.2857)|\",3.3497214996550007\r\n\"[H]C1C([H])=C2/N=C(/C3([H])C([H])([H])C3([H])[H])NC(=O)C2=C([H])[C@]1([H])F 
|(-3.9449,-5.4154,-4.1948;-3.4594,-4.4568,-4.0338;-2.8526,-4.1575,-2.8751;-2.8165,-4.8523,-2.0419;-2.2352,-2.8496,-2.6636;-1.6473,-2.5983,-1.5302;-1.1061,-1.3072,-1.373;-0.422,-1.1221,-0.0803;-0.4319,-2.0005,0.5549;-0.4739,0.246,0.5998;-1.0392,1.009,0.0741;-0.5732,0.244,1.6815;0.8022,-0.2091,-0.0116;1.6129,-0.5348,0.6337;1.0989,0.2459,-0.9513;-1.1541,-0.3093,-2.2125;-1.7878,-0.4823,-3.4418;-1.8767,0.4214,-4.2587;-2.3475,-1.8457,-3.7237;-2.9653,-2.1377,-4.8839;-3.079,-1.3692,-5.6446;-3.4521,-3.5157,-5.2022;-2.7861,-3.9338,-5.9789;-4.7249,-3.4542,-5.778)|\",3.379654023210001\r\n\"[H]C1([H])C2=C(C([H])([H])C1([H])[H])[C@@]1([H])C(NC(C3([H])C([H])([H])C3([H])[H])=NC1=O)S2 |(1.7914,5.464,0.5955;1.8371,4.3713,0.5217;2.3999,4.0157,1.3966;0.4863,3.7169,0.4511;0.3762,2.7983,-0.5176;1.6548,2.654,-1.297;2.1625,1.7095,-1.0486;1.4835,2.6439,-2.3779;2.4736,3.8943,-0.8227;2.3775,4.6935,-1.5645;3.5398,3.675,-0.7161;-0.9059,2.0376,-0.5093;-0.7273,1.0119,-0.1307;-1.8142,2.7098,0.4996;-3.0809,2.5083,0.5715;-3.6107,1.7097,-0.4606;-4.9912,1.2586,-0.2135;-5.3851,1.5515,0.7531;-5.4536,-0.0718,-0.8024;-4.7101,-0.6191,-1.3732;-6.1083,-0.6764,-0.1812;-5.9725,1.1944,-1.3847;-6.9948,1.4948,-1.1741;-5.5755,1.5083,-2.3446;-3.0332,1.4219,-1.5969;-1.7082,1.7976,-1.8033;-1.188,1.8462,-2.8996;-0.9554,3.9081,1.4804)|\",3.738844305869999\r\n\"[H]O/C(=N/C([H])([H])[H])C([H])([H])C([H])([H])/N=C(/O[H])[C@@]1([H])C([H])=NC(=O)C([H])=C1[H] 
|(0.937,-2.605,3.3779;1.8665,-2.9247,3.3588;2.2464,-3.1234,2.0729;3.4421,-3.4967,1.865;3.9107,-3.7493,0.5141;4.8881,-4.2391,0.5682;4.0512,-2.8189,-0.0585;3.2524,-4.4058,-0.0763;1.1562,-2.8963,1.0236;1.6136,-2.8216,0.0336;0.4943,-3.7735,1.004;0.2933,-1.6388,1.2488;-0.3361,-1.4855,0.3597;0.9513,-0.7633,1.3368;-0.4666,-1.7655,2.482;-1.6547,-1.3652,2.6477;-2.2275,-1.5996,3.8518;-3.1002,-1.1724,3.8953;-2.5707,-0.6543,1.6253;-1.9357,-0.34,0.7868;-3.5811,-1.653,1.0881;-3.154,-2.5266,0.587;-4.8544,-1.5882,1.1636;-5.4536,-0.4603,1.8144;-6.6606,-0.405,1.9302;-4.572,0.619,2.3265;-5.0769,1.474,2.7658;-3.2372,0.5479,2.2292;-2.5982,1.3488,2.5958)|\",3.404144269755\r\n\"[H]O/C(=N/C([H])([H])[H])C([H])([H])C([H])([H])/N=C(/O[H])[C@@]1([H])N=NC(=O)C([H])=C1[H] |(1.8196,3.6148,0.2063;1.783,3.0104,0.9734;1.2626,1.8173,0.5621;1.3767,0.8412,1.3661;0.8303,-0.4689,1.0919;1.6354,-1.2135,1.1366;0.3135,-0.594,0.1275;0.1204,-0.7319,1.8862;0.6646,1.8647,-0.836;0.0887,0.9623,-1.0508;-0.0233,2.7162,-0.9002;1.7535,2.0333,-1.9196;1.2528,2.0843,-2.895;2.3885,1.1344,-1.9339;2.4723,3.2829,-1.7334;3.7386,3.3607,-1.7026;4.2968,4.5778,-1.5607;5.2486,4.499,-1.7791;4.7468,2.1905,-1.7504;4.4992,1.57,-2.6254;6.0869,2.7706,-2.0753;7.1412,2.2275,-1.7074;7.0979,0.9728,-0.9171;8.0824,0.2715,-0.9406;5.8856,0.7501,-0.1275;5.9562,0.1015,0.7398;4.7438,1.3362,-0.5129;3.8166,1.1979,0.0373)|\",3.338836945635\r\n\"[H]OC(=N/C([H])([H])[C@@]1([H])N([H])N([H])C([H])([H])C1([H])[H])/C([H])=C(\\[H])C1=C([H])C([H])=C([H])C([H])=C1[H] 
|(5.3739,-2.7476,-1.4566;5.2902,-1.9504,-2.0027;4.5597,-1.0228,-1.304;4.7169,0.1892,-1.6582;3.955,1.2833,-1.0999;3.3019,1.6708,-1.8964;3.2967,1.0143,-0.2583;4.8986,2.4195,-0.6453;5.3953,2.0975,0.28;5.9319,2.7428,-1.6642;5.8677,2.0668,-2.4226;5.6836,4.0599,-2.2136;6.3433,4.6769,-1.7387;4.3282,4.4387,-1.791;4.2317,5.5287,-1.7504;3.603,4.0637,-2.5256;4.1339,3.7428,-0.4325;3.0833,3.584,-0.1628;4.598,4.3334,0.3662;3.6665,-1.558,-0.2566;3.3703,-0.8624,0.5215;3.1922,-2.8205,-0.2433;3.4446,-3.4773,-1.0765;2.3072,-3.4213,0.7596;1.7457,-4.685,0.5033;1.9814,-5.1898,-0.4308;0.8881,-5.2914,1.4195;0.4626,-6.2662,1.1981;0.5787,-4.6472,2.6182;-0.0877,-5.1173,3.336;1.1367,-3.3951,2.8933;0.9081,-2.8908,3.8284;1.9917,-2.7903,1.9784;2.4262,-1.8242,2.2173)|\",3.57013371856\r\n\"[H]O/C(=N/C([H])([H])C([H])([H])OC([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])[C@@]1([H])C([H])=NC(=O)C([H])=C1[H] |(5.8146,2.017,3.5517;6.1462,1.338,2.9405;6.0444,1.7879,1.6597;6.5323,1.061,0.7508;6.4459,1.3689,-0.6567;6.2525,2.4308,-0.8838;7.4092,1.1149,-1.1166;5.3676,0.5343,-1.3463;5.5011,-0.5263,-1.0844;5.4807,0.6377,-2.4362;4.0966,0.9941,-0.9163;3.0026,0.1445,-1.2579;3.2508,-0.9002,-1.0082;2.1847,0.4551,-0.6002;2.5804,0.2549,-2.726;2.3519,1.307,-2.9438;3.4272,-0.0222,-3.3673;1.3744,-0.6299,-3.0875;1.2374,-0.5962,-4.1759;1.6062,-1.6771,-2.8456;0.0596,-0.2255,-2.4083;-0.7695,-0.8491,-2.761;-0.193,0.8191,-2.6288;0.1058,-0.3321,-1.3187;5.3042,3.1445,1.5344;5.4303,3.4808,0.4968;3.8218,2.9171,1.7521;3.3633,2.2254,1.0437;3.0807,3.4333,2.6597;3.6653,4.3459,3.5927;2.9861,4.8162,4.4853;5.0979,4.7042,3.4409;5.4793,5.4373,4.1454;5.8622,4.166,2.4794;6.9098,4.4402,2.3693)|\",4.699406198135001\r\n\"[H]C1=NC([H])=C(C([H])([H])OC(=O)[C@@]2([H])N([H])N([H])[C@]([H])(C([H])([H])[H])C2([H])[H])C([H])=C1[H] 
|(2.858,1.4419,5.5591;3.4856,1.3437,4.6747;4.0833,2.4646,4.2518;4.8606,2.3672,3.1688;5.3432,3.2896,2.8444;5.0826,1.181,2.4581;5.975,1.1753,1.2446;6.5005,0.2243,1.1377;6.6993,1.9912,1.2888;5.2341,1.4347,0.0171;4.6871,0.3721,-0.6041;4.7792,-0.7749,-0.2123;3.9388,0.801,-1.8589;4.5746,1.5184,-2.3904;3.6332,-0.3701,-2.7162;3.8691,-1.2078,-2.1849;2.2119,-0.4238,-2.9457;2.0487,0.091,-3.8132;1.5731,0.3126,-1.8404;1.531,-0.3741,-0.9827;0.1612,0.7637,-2.1946;-0.3135,1.2662,-1.3452;-0.4598,-0.0926,-2.4768;0.1751,1.4706,-3.0348;2.5643,1.4489,-1.5171;2.5072,1.8067,-0.4846;2.3865,2.3067,-2.1769;4.4472,0.0222,2.9198;4.5859,-0.917,2.3931;3.6354,0.1049,4.0485;3.1289,-0.7726,4.4396)|\",5.259960730165001\r\n\"[H]C1=NC(=O)C([H])=C([H])[C@@]1([H])C(=O)N(C([H])([H])[H])C([H])([H])C1=C([H])OC([H])=C1[H] |(1.7433,0.1681,-2.9936;2.0268,-0.8674,-3.2036;2.2624,-1.1602,-4.4236;2.6346,-2.5077,-4.7524;2.739,-2.825,-5.9214;2.8886,-3.4599,-3.6503;3.2713,-4.4325,-3.943;2.6588,-3.1357,-2.3731;2.8612,-3.8309,-1.564;2.0937,-1.7963,-2.0057;1.0572,-1.9446,-1.659;2.9062,-1.1614,-0.8467;4.0075,-1.6225,-0.5631;2.382,-0.0746,-0.2011;1.0159,0.4236,-0.2896;1.0029,1.458,-0.6569;0.5528,0.4055,0.7047;0.4055,-0.1832,-0.9561;3.2113,0.557,0.8446;4.2532,0.3949,0.5639;3.0094,1.6337,0.8138;2.9451,0.0176,2.221;2.3222,0.6559,3.2535;1.9015,1.6442,3.368;2.2608,-0.1452,4.353;2.8566,-1.3262,4.0102;2.8875,-2.0742,4.7878;3.2934,-1.287,2.7246;3.8057,-2.0659,2.1781)|\",4.500763087269999\r\n\"[H]O/C(=N/[C@@]1([H])C([H])([H])N(C([H])([H])[H])C([H])([H])C([H])([H])C1([H])[H])[C@@]1([H])C([H])=NC(=O)C([H])=C1[H] 
|(6.7851,-3.9678,2.2339;5.9855,-3.4831,2.4989;5.7715,-2.4474,1.6399;4.7064,-1.7871,1.779;4.3344,-0.6371,0.9842;4.9918,-0.4705,0.1124;2.9134,-0.8618,0.4368;2.2449,-1.0982,1.2888;2.9145,-1.7381,-0.2225;2.4582,0.2968,-0.3271;1.1863,0.0367,-0.9855;0.9058,0.9007,-1.598;0.359,-0.1631,-0.2765;1.2819,-0.832,-1.6464;2.3995,1.4957,0.5096;2.0399,2.3275,-0.1076;1.6725,1.371,1.34;3.7801,1.8273,1.0824;4.449,2.1,0.2561;3.7051,2.7009,1.7413;4.3605,0.6338,1.8522;3.7701,0.4405,2.7573;5.387,0.8416,2.1807;6.9309,-2.2309,0.6304;6.5817,-1.4838,-0.0956;8.118,-1.6395,1.3611;7.9288,-0.6869,1.8651;9.2949,-2.1295,1.4643;9.5833,-3.3761,0.8203;10.69,-3.8667,0.9277;8.5139,-4.0345,0.0306;8.7925,-4.9588,-0.4663;7.2878,-3.5022,-0.0793;6.5124,-3.9844,-0.6713)|\",3.5919028266\r\n\"[H]O/C(=N/C([H])([H])[C@@]1([H])C([H])([H])OC([H])([H])C([H])([H])C1([H])[H])[C@]1([H])C(=O)N=C([H])C([H])=C1[H] |(3.4835,-4.8458,-4.2219;4.0327,-4.2114,-3.7093;3.3862,-3.028,-3.6832;3.6429,-2.2218,-2.7406;3.0313,-0.9125,-2.601;2.5834,-0.5093,-3.5248;2.2094,-1.0032,-1.8739;4.0372,0.1201,-2.059;3.4916,1.0677,-1.9383;4.5707,-0.2751,-0.6733;5.0628,-1.2575,-0.7324;3.7532,-0.3464,0.0526;5.4689,0.7019,-0.1573;6.6186,0.8495,-0.9788;7.2624,1.5759,-0.4726;7.1656,-0.1084,-1.0378;6.2426,1.3221,-2.386;5.8119,2.3297,-2.3144;7.1436,1.3954,-3.009;5.2202,0.3611,-3.0136;5.7028,-0.6034,-3.2239;4.8616,0.7609,-3.9735;2.4434,-2.7446,-4.8799;1.5516,-2.239,-4.4682;1.8897,-3.964,-5.6206;2.0214,-5.1081,-5.213;1.182,-3.725,-6.8105;1.5753,-2.7241,-7.5284;1.0576,-2.5671,-8.4772;2.6363,-1.791,-7.1558;3.0144,-1.0913,-7.8944;3.0515,-1.7799,-5.8784;3.7876,-1.0681,-5.5175)|\",3.442240208825\r\n\"[H]O/C(=N/[C@@]1([H])N([H])N([H])[C@]([H])(C([H])([H])[H])C1([H])[H])C1=NSN=C1[H] 
|(8.3289,0.822,0.1303;7.4356,0.6961,0.501;6.5564,0.6752,-0.5284;5.3165,0.6269,-0.2437;4.2996,0.6762,-1.2717;4.6091,1.3063,-2.1186;4.0192,-0.7097,-1.7865;4.4613,-1.3555,-1.1313;2.6078,-0.9672,-1.6938;2.205,-0.6525,-2.5786;2.097,-0.101,-0.6084;2.3354,-0.6104,0.3355;0.59,0.0988,-0.7067;0.2221,0.7,0.1313;0.068,-0.8637,-0.6919;0.3216,0.6212,-1.6347;2.9485,1.1796,-0.7114;3.0906,1.6908,0.2441;2.4838,1.8824,-1.4149;7.2192,0.6998,-1.8722;8.4676,1.1437,-1.933;8.9677,1.0155,-3.5068;7.5746,0.3589,-4.1319;6.7172,0.2391,-3.1393;5.7452,-0.2193,-3.2985)|\",4.149736220125\r\n\"[H]O/C(=N/OC([H])([H])C1=C([H])C([H])=C([H])C([H])=C1[H])[C@@]1([H])N([H])N([H])N([H])C1([H])[H] |(4.5469,-3.4253,3.6256;4.5068,-2.703,2.9799;4.6708,-3.2364,1.73;4.8569,-2.3778,0.8003;4.9934,-2.9907,-0.4604;5.0157,-1.9683,-1.4715;5.6892,-1.1688,-1.1453;5.4546,-2.4726,-2.3378;3.6426,-1.4323,-1.8062;2.8998,-1.9965,-2.8506;3.328,-2.8103,-3.4325;1.6217,-1.5257,-3.1533;1.0596,-1.9701,-3.9707;1.0699,-0.4818,-2.4089;0.0761,-0.11,-2.6447;1.8012,0.0871,-1.3635;1.3766,0.9017,-0.7824;3.08,-0.3842,-1.0655;3.6472,0.0509,-0.2475;4.5473,-4.7362,1.5929;4.8059,-5.0044,0.5692;5.4074,-5.4805,2.5526;5.9266,-4.8547,3.1651;4.5574,-6.3014,3.3993;4.5469,-7.2176,2.9517;3.2144,-5.8199,3.2935;3.0991,-5.0964,4.004;3.1057,-5.2416,1.949;2.8123,-6.0269,1.2442;2.3483,-4.4542,1.9186)|\",5.945687633425\r\n\"[H]/N=C(/O[H])N([H])[C@]([H])(/C(=N\\C1=C([H])C(C#N)=C([H])C([H])=C1[H])O[H])C([H])([H])[H] 
|(1.5998,2.7217,0.958;0.9866,3.0331,0.2074;1.1441,2.3875,-0.8768;0.4283,2.7075,-1.9827;-0.1211,3.469,-1.7243;2.0165,1.3373,-1.1814;1.7462,0.846,-2.0268;2.5256,0.4796,-0.106;1.7691,0.3647,0.6783;3.7826,1.1189,0.5201;4.2619,0.8894,1.6745;3.6509,0.0431,2.6083;2.5568,0.4671,3.3773;2.1345,1.4545,3.2207;2.0227,-0.3699,4.3698;0.8983,0.0771,5.1416;-0.0167,0.4341,5.765;2.577,-1.6364,4.6126;2.1561,-2.2729,5.3833;3.6756,-2.0465,3.8595;4.1226,-3.0189,4.0457;4.2143,-1.2192,2.8764;5.0863,-1.5296,2.3083;4.4497,1.9469,-0.3009;3.8764,2.0721,-1.0871;2.8745,-0.903,-0.6764;3.2359,-1.5585,0.1191;3.6555,-0.8242,-1.4406;1.9865,-1.3647,-1.1225)|\",5.015058264715\r\n\"[H]O/C(=N/C([H])([H])C(=O)OC([H])([H])C([H])([H])[H])[C@]1([H])C(=S)N=C([H])C([H])=C1[H] |(4.8766,-1.3947,3.9964;4.6622,-1.1119,3.093;5.7957,-0.9392,2.3682;5.6595,-0.749,1.1266;6.772,-0.5221,0.2399;7.6365,-1.1778,0.4422;6.4576,-0.7369,-0.7846;7.2954,0.913,0.2936;7.3569,1.5902,1.3004;7.729,1.306,-0.9148;8.3198,2.6297,-1.0054;8.9063,2.8143,-0.1021;8.9882,2.5633,-1.8668;7.2526,3.6945,-1.2038;7.7278,4.672,-1.3452;6.5985,3.7533,-0.3295;6.645,3.4762,-2.0877;7.1167,-0.9902,3.1913;7.9327,-0.919,2.4704;7.2615,-2.3187,3.9291;7.9966,-3.6167,3.2333;6.7487,-2.4116,5.2194;6.652,-1.3174,5.9194;6.2664,-1.4294,6.9343;6.9819,0.017,5.445;6.9892,0.8446,6.1471;7.1709,0.1918,4.1238;7.3189,1.1641,3.6654)|\",3.0694442336400005\r\n\"[H]OC(=N/C([H])([H])C([H])([H])[H])/C([H])=C(\\[H])C1=C([H])C(Cl)=C([H])C([H])=C1OC([H])([H])[H] 
|(3.0816,-1.3533,-0.1393;2.8348,-0.8057,0.622;3.5091,0.3866,0.5316;3.0256,1.3606,1.1919;3.7207,2.6407,1.243;4.7024,2.5289,1.7285;3.9182,3.0354,0.2319;2.8836,3.6568,2.0186;3.3935,4.626,2.0701;2.7023,3.3028,3.0386;1.9111,3.8007,1.5363;4.7324,0.3876,-0.295;5.0695,1.3627,-0.634;5.4469,-0.7204,-0.5813;5.1417,-1.6674,-0.1441;6.6595,-0.7899,-1.4;7.0541,0.2543,-2.2498;6.4426,1.1461,-2.3273;8.2109,0.1552,-3.0138;8.6706,1.4855,-4.0693;9.0024,-0.9894,-2.9626;9.9013,-1.0609,-3.5651;8.6245,-2.046,-2.1342;9.2451,-2.9332,-2.1015;7.465,-1.9578,-1.3573;7.0381,-2.9465,-0.5227;7.7891,-4.1503,-0.4436;7.2576,-4.7845,0.2674;8.8057,-3.9638,-0.0752;7.8408,-4.6559,-1.4163)|\",4.190553297699999\r\n\"[H]O/C(=N/C1=C([H])C([H])=C([H])C(F)=C1[H])N([H])C1([H])C([H])([H])N([H])N([H])C1([H])[H] |(3.0063,2.2841,-0.0303;3.527,1.542,-0.3866;2.7344,0.4554,-0.2155;1.5769,0.6123,0.3232;0.6864,-0.4598,0.467;0.1982,-0.7838,1.7467;0.5616,-0.2164,2.5973;-0.7305,-1.8084,1.9128;-1.0928,-2.0466,2.9092;-1.2069,-2.5368,0.8191;-1.9338,-3.3344,0.9262;-0.7267,-2.1978,-0.4391;-1.1792,-2.8758,-1.5183;0.2015,-1.1831,-0.6408;0.5096,-0.9342,-1.6516;3.3395,-0.6984,-0.6134;2.7388,-1.5125,-0.6146;4.5976,-0.7708,-1.353;5.2997,-0.0934,-0.864;4.4624,-0.4444,-2.8776;5.0168,0.4519,-3.1719;3.4044,-0.2744,-3.1118;4.9494,-1.6231,-3.6238;5.9344,-1.5048,-3.8544;4.8883,-2.7509,-2.7358;3.9368,-3.1124,-2.7984;5.1432,-2.2225,-1.3844;4.7,-2.8647,-0.6161;6.2271,-2.2087,-1.2199)|\",5.654525813389999\r\n\"[H]OC1=N[C@]([H])(C2=C([H])N=C([H])C([H])=N2)N([H])[C@@]2([H])C([H])=C([H])S[C@@]12[H] 
|(3.4218,-3.4926,-0.5816;4.1266,-2.8741,-0.8452;3.673,-1.6061,-0.6669;4.2532,-0.6791,-1.3077;3.8147,0.7091,-1.1341;3.3309,0.9743,-2.0867;5.0716,1.566,-0.9951;5.9138,1.7889,-2.0938;5.6712,1.369,-3.0676;7.0352,2.5046,-2.001;7.3225,2.9977,-0.7884;8.2331,3.5858,-0.6957;6.4961,2.7748,0.3101;6.7406,3.1769,1.2911;5.3687,2.0592,0.209;2.8993,1.0126,-0.0482;3.4387,1.2476,0.783;1.9167,-0.0185,0.2213;1.4369,0.2379,1.1775;0.8283,-0.1454,-0.8295;0.4385,0.7334,-1.3329;0.3517,-1.3788,-1.0018;-0.4694,-1.6815,-1.641;1.1466,-2.6286,0.0004;2.5396,-1.4365,0.3405;2.9114,-1.6412,1.3492)|\",4.50076308727\r\n\"[H]C1C([H])=C([H])[C@]2([H])C(=O)N/C(C3=C([H])C([H])=C([H])C(C([H])([H])[H])=N3)=N\\C2=C1[H] |(9.5479,-5.1442,-2.7716;8.7645,-5.0124,-2.0293;8.4117,-6.1447,-1.1955;8.9111,-7.0939,-1.367;7.465,-6.0266,-0.2427;7.1606,-6.8603,0.3815;6.8056,-4.7141,0.0198;7.258,-4.3224,0.9567;5.3118,-4.7786,0.4121;4.8143,-5.8123,0.8159;4.6332,-3.5705,0.3765;5.1762,-2.5501,-0.2475;4.4536,-1.2459,-0.1586;3.2077,-1.184,0.4778;2.7756,-2.0865,0.8924;2.568,0.051,0.5504;1.5972,0.1407,1.0304;3.1916,1.1661,0.0004;2.7198,2.1444,0.04;4.4483,1.0158,-0.6117;5.166,2.1983,-1.2174;5.3384,2.9851,-0.4727;4.5797,2.6448,-2.0302;6.1289,1.8752,-1.6181;5.0599,-0.1704,-0.6901;6.3064,-2.5583,-1.0474;7.0449,-3.6352,-1.0057;8.1179,-3.8164,-1.947;8.3417,-2.9909,-2.6147)|\",3.49394184042\r\n\"[H]OC1=N[C@]([H])(C2=NC([H])=C([H])C([H])=N2)N([H])[C@]2([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[C@@]12[H] 
|(0.8527,-0.7162,-1.313;0.5526,-0.7786,-0.3939;0.1879,0.4648,0.0396;-0.2083,0.5549,1.2403;-0.6424,1.8568,1.7605;0.0909,2.1434,2.5241;-1.9672,1.6333,2.4827;-3.0722,1.6423,1.7233;-4.2291,1.4039,2.3474;-5.126,1.4153,1.7297;-4.2975,1.1531,3.7158;-5.2403,0.958,4.2158;-3.088,1.1785,4.4061;-3.0488,1.0039,5.4804;-1.919,1.42,3.8025;-0.7926,2.9569,0.8112;-1.7001,2.8542,0.3537;0.2711,2.9368,-0.1856;1.2223,2.996,0.367;0.2036,4.1334,-1.1357;-0.7521,4.1084,-1.6818;0.2091,5.0597,-0.5504;1.3733,4.1029,-2.1327;1.2869,4.9321,-2.8456;2.3143,4.2597,-1.5858;1.4467,2.7676,-2.8911;2.3295,2.7492,-3.5421;0.5709,2.6784,-3.5503;1.4765,1.5568,-1.9412;2.4131,1.5518,-1.3664;1.4777,0.6399,-2.5496;0.2768,1.6047,-0.9714;-0.6436,1.5708,-1.5789)|\",4.742944414215\r\n\"[H]C1C([H])=C([H])[C@]2([H])C(=O)N/C(C([H])([H])C([H])([H])C([H])(C([H])([H])[H])C([H])([H])[H])=N\\C2=C1[H] |(8.9273,1.2933,-6.2497;8.3561,0.4478,-5.874;8.3083,-0.7583,-6.6762;8.8082,-0.7632,-7.6404;7.636,-1.8438,-6.242;7.5596,-2.7541,-6.8275;6.9846,-1.8511,-4.9007;7.6423,-2.4568,-4.2398;5.6586,-2.6428,-4.7964;5.3501,-3.4574,-5.643;4.9288,-2.4329,-3.6327;5.2091,-1.3936,-2.8868;4.4284,-1.2262,-1.609;4.3765,-2.2025,-1.1151;4.967,-0.5344,-0.9552;2.9788,-0.7376,-1.8493;2.4671,-0.7226,-0.8763;2.4715,-1.4928,-2.4623;2.8233,0.6429,-2.5166;3.3169,0.604,-3.4987;1.3345,0.9382,-2.754;1.1987,1.8936,-3.2746;0.7894,0.9985,-1.8026;0.862,0.1554,-3.3592;3.4836,1.7693,-1.7077;3.2986,2.7434,-2.1763;4.5678,1.6348,-1.6429;3.0784,1.8112,-0.6875;6.0835,-0.3514,-3.1808;6.8755,-0.5172,-4.2093;7.6748,0.572,-4.7007;7.6616,1.4987,-4.1361)|\",3.5946239651050003\r\n\"[H]O/C(=N/C([H])([H])[C@]1([H])OC([H])([H])C([H])([H])OC1([H])[H])[C@@]1([H])C([H])=NC(=O)C([H])=C1[H] 
|(1.8299,1.3404,2.6936;1.9803,0.5386,2.1646;1.004,-0.3707,2.4262;1.0195,-1.4462,1.7637;0.0962,-2.5356,1.9798;-0.608,-2.3728,2.808;-0.4975,-2.6828,1.0693;0.8788,-3.8463,2.2269;1.4709,-4.0595,1.331;-0.0184,-4.9551,2.3614;-0.7007,-5.0049,3.6139;-1.455,-4.2067,3.6831;-1.2241,-5.9663,3.6338;0.2708,-4.8915,4.7834;0.9204,-5.7805,4.8212;-0.2643,-4.8047,5.7342;1.0752,-3.7208,4.6487;1.8178,-3.7811,3.4307;2.4457,-2.8883,3.3914;2.4669,-4.6704,3.4366;0.024,0.0589,3.5533;-0.8714,-0.5676,3.4481;0.6437,-0.2607,4.9024;0.8426,-1.3207,5.086;0.9484,0.5692,5.8282;0.6879,1.9599,5.6255;1.0496,2.7715,6.4553;-0.0362,2.3802,4.3984;-0.2892,3.4343,4.335;-0.3585,1.5037,3.4361;-0.8885,1.8171,2.5384)|\",4.361985023515\r\n\"[H]OC1=C([H])C([H])=C([H])N=C1/N=C(/O[H])[C@@]1([H])N([H])N([H])[C@]([H])(C([H])([H])[H])C1([H])[H] |(9.0897,1.169,-1.42;8.2492,1.0069,-0.9627;7.7524,2.2018,-0.5305;8.3437,3.4276,-0.8125;9.2648,3.4642,-1.3924;7.7458,4.6002,-0.3427;8.1894,5.5708,-0.5416;6.5668,4.4885,0.3836;6.0629,5.3727,0.7689;5.9833,3.3137,0.6622;6.5526,2.1828,0.2275;6.041,0.9451,0.6094;4.8126,0.6266,0.5191;4.4927,-0.6091,1.0143;3.5301,-0.6784,1.0973;3.7127,1.3632,-0.2451;3.9748,2.4198,-0.2685;3.5913,0.8533,-1.6412;4.4868,0.6431,-2.0778;2.7951,-0.3479,-1.5942;3.3178,-1.0742,-1.0891;1.6668,0.0696,-0.7327;1.3071,-0.8177,-0.1971;0.5309,0.6404,-1.5805;-0.2864,1.0011,-0.9448;0.1377,-0.1236,-2.2576;0.9001,1.4765,-2.1828;2.2668,1.1354,0.2575;2.2218,0.8232,1.3083;1.7065,2.0728,0.1962)|\",4.99056801817\r\n\"[H]O/C(=N/C1=NC(C([H])([H])[H])=C([H])C([H])=C1[H])[C@@]1([H])N([H])N([H])[C@]([H])(C([H])([H])[H])C1([H])[H] 
|(5.1206,-4.7386,-0.1951;5.6295,-3.8884,-0.1572;4.6777,-2.9398,-0.1842;5.0499,-1.7363,-0.4086;4.2099,-0.6281,-0.4665;2.9778,-0.619,0.0758;2.232,0.4974,-0.0092;0.865,0.434,0.6301;0.3088,1.3656,0.4892;0.9514,0.2448,1.7067;0.2768,-0.3876,0.2052;2.6869,1.6551,-0.643;2.0563,2.5379,-0.6889;3.9675,1.6527,-1.2032;4.3526,2.5399,-1.6994;4.7426,0.508,-1.1119;5.7452,0.451,-1.5204;3.2773,-3.5068,0.0778;2.5511,-2.9638,-0.5309;3.2781,-4.9737,-0.2217;2.7422,-5.1606,-1.065;2.6361,-5.7026,0.861;3.4033,-6.1051,1.399;1.9685,-4.6867,1.7098;1.9646,-5.0625,2.7397;0.5255,-4.4605,1.2542;0.032,-3.7165,1.8895;-0.0408,-5.3957,1.3056;0.4788,-4.0992,0.2201;2.8654,-3.4432,1.5651;3.7516,-3.5461,2.2036;2.3873,-2.4904,1.7919)|\",5.123903804915\r\n\"[H]O/C(=N/C1=NC([H])=C([H])C([H])=C1C([H])([H])[H])[C@@]1([H])N([H])N([H])[C@]([H])(C([H])([H])[H])C1([H])[H] |(-0.3868,-2.4902,1.3073;0.5593,-2.6391,1.1624;1.0955,-1.5479,0.5293;2.3682,-1.5065,0.5234;3.1372,-0.4689,-0.007;2.7776,0.8073,0.2105;3.5857,1.7783,-0.2324;3.2581,2.7969,-0.0294;4.7786,1.5369,-0.9058;5.4031,2.3592,-1.241;5.1465,0.2075,-1.1244;6.0753,-0.0271,-1.6399;4.3353,-0.8309,-0.6738;4.6985,-2.2785,-0.8748;5.7227,-2.3786,-1.2486;4.0295,-2.7671,-1.5948;4.6023,-2.8358,0.063;0.1186,-0.6585,-0.2372;0.609,0.2976,-0.4097;-0.2613,-1.2625,-1.5468;0.5069,-1.729,-2.0244;-1.3157,-2.2128,-1.304;-0.9387,-2.9999,-0.7611;-2.2165,-1.4449,-0.4175;-2.7343,-2.159,0.2348;-3.2484,-0.6749,-1.2411;-3.8923,-0.0699,-0.5919;-3.8762,-1.3661,-1.8115;-2.7403,-0.0097,-1.9466;-1.2794,-0.4871,0.4075;-1.2813,-0.7027,1.4832;-1.6061,0.5514,0.3054)|\",5.064038757805\r\n\"[H]O/C(=N/C([H])([H])C([H])([H])C1=C([H])C([H])=C([H])C([H])=C1[H])[C@@]1([H])N([H])N([H])[C@]([H])(C([H])([H])[H])C1([H])[H] 
|(5.3833,2.4522,-1.0754;5.9064,1.798,-0.5447;5.4044,0.5552,-0.7684;5.9726,-0.4202,-0.1871;5.5352,-1.7927,-0.345;4.9539,-2.0863,0.5419;4.8906,-1.9809,-1.2188;6.7653,-2.7207,-0.444;7.3969,-2.5351,0.4321;7.3518,-2.426,-1.3226;6.3846,-4.1817,-0.5326;6.2518,-4.8206,-1.7729;6.4635,-4.2633,-2.6834;5.8619,-6.1583,-1.8566;5.7704,-6.6348,-2.8295;5.5964,-6.8838,-0.6942;5.2964,-7.9266,-0.7559;5.7259,-6.2618,0.5492;5.528,-6.8201,1.4608;6.1164,-4.9245,0.6263;6.2222,-4.4489,1.5992;4.1882,0.5249,-1.7092;3.6502,-0.4149,-1.5853;3.2261,1.6273,-1.4634;3.2266,1.8504,-0.4651;3.9215,2.7677,-2.1118;3.2339,3.5211,-2.1647;4.1995,2.2426,-3.4759;5.0524,2.8055,-3.8695;3.0076,2.3742,-4.4285;3.2414,1.9359,-5.4056;2.7476,3.4273,-4.5936;2.1327,1.8588,-4.0187;4.5877,0.7655,-3.1928;5.6541,0.5847,-3.3494;4.036,0.0908,-3.8537)|\",5.99738926502\r\n\"[H]O/C(=N/C1=C([H])C([H])=C([H])C([H])=C1[H])[C@@]1([H])N([H])N([H])[C@]([H])(C([H])([H])[H])C1([H])[H] |(3.9371,-0.9721,0.4815;4.4897,-0.2975,0.9598;4.2349,0.9173,0.4279;4.8933,1.9137,0.8712;4.6788,3.225,0.4237;5.7174,3.9006,-0.2424;6.6427,3.3666,-0.4382;5.5594,5.2252,-0.6422;6.3708,5.7257,-1.1651;4.3744,5.9145,-0.3686;4.2589,6.9508,-0.6736;3.3475,5.2596,0.3123;2.4244,5.7858,0.5433;3.4941,3.9289,0.7063;2.702,3.4298,1.2584;3.1873,0.9347,-0.6933;3.2609,1.8818,-1.2293;3.3584,-0.1742,-1.6621;4.3552,-0.3778,-1.768;2.8053,-1.3156,-0.8911;2.7072,-2.0843,-1.5563;1.4607,-0.8124,-0.497;1.16,-1.3657,0.3986;0.4068,-0.9881,-1.5939;-0.5528,-0.5602,-1.2821;0.2387,-2.0497,-1.8136;0.7266,-0.4868,-2.5134;1.748,0.6779,-0.1645;1.673,0.8793,0.907;1.0349,1.3307,-0.6758)|\",5.537516857675\r\n\"[H]O/C(=N/C([H])([H])[C@@]1([H])C([H])([H])OC([H])([H])C([H])([H])C1([H])[H])[C@@]1([H])C([H])=NC(=O)C([H])=C1[H] 
|(5.4988,-1.9355,0.6389;4.8183,-1.7454,-0.0288;3.5965,-2.1135,0.4447;2.5904,-1.8518,-0.2689;1.2269,-2.2031,0.0673;0.9601,-3.1124,-0.4908;1.0828,-2.4377,1.1377;0.2401,-1.0899,-0.3283;-0.7624,-1.4372,-0.0376;0.5104,0.2106,0.4439;1.5396,0.5517,0.24;0.4106,0.0533,1.5246;-0.4189,1.232,0.1117;-0.357,1.5781,-1.268;-1.073,2.3944,-1.4051;0.6497,1.9616,-1.5118;-0.6909,0.3819,-2.1612;-1.7369,0.0958,-1.9878;-0.6027,0.6698,-3.2168;0.2366,-0.8004,-1.8399;1.2617,-0.5703,-2.156;-0.0735,-1.6954,-2.3955;3.6493,-2.8565,1.8066;2.6184,-2.893,2.1856;4.0972,-4.2832,1.5652;3.4599,-4.8671,0.8945;5.1303,-4.8677,2.0408;5.992,-4.1386,2.922;6.9995,-4.6652,3.3514;5.6223,-2.7501,3.2935;6.2808,-2.2591,4.0036;4.5296,-2.153,2.7955;4.2593,-1.138,3.0804)|\",4.193274436205\r\n\"[H]O/C(=N/[C@@]1([H])N([H])N([H])[C@]([H])(C([H])([H])[H])C1([H])[H])C([H])([H])C1=C([H])C([H])=C(F)C([H])=C1[H] |(6.3745,2.8569,-0.0367;6.3724,1.8881,0.0492;5.1277,1.4259,-0.2384;4.9287,0.1828,-0.0609;3.6585,-0.415,-0.4081;2.8102,0.2387,-0.1529;3.6089,-0.6727,-1.8882;4.5796,-0.6685,-2.2047;3.1444,-2.0153,-2.1015;2.1242,-1.961,-2.1242;3.5512,-2.7956,-0.9109;4.605,-3.0669,-1.0645;2.7252,-4.0672,-0.7592;3.0621,-4.6502,0.1043;2.8091,-4.6967,-1.6516;1.6639,-3.8292,-0.6059;3.4594,-1.798,0.263;4.2163,-1.9655,1.0337;2.4695,-1.8578,0.7336;4.1527,2.4694,-0.8021;4.2374,2.4365,-1.8951;3.1312,2.1518,-0.5754;4.3874,3.8818,-0.3064;3.9409,4.2801,0.9648;3.4127,3.568,1.5941;4.161,5.5719,1.4353;3.8149,5.8892,2.4135;4.8399,6.4738,0.6212;5.053,7.7258,1.0723;5.3003,6.1206,-0.6404;5.82,6.8535,-1.2485;5.0688,4.8207,-1.0956;5.4124,4.538,-2.0881)|\",5.270845284185\r\n\"[H]C1C([H])=C2N=C(C([H])(C([H])([H])[H])C([H])([H])[H])NC(=O)C2=C([H])[C@]1([H])F 
|(2.7417,6.6937,-1.6029;2.9461,5.665,-1.8869;2.6741,4.6381,-1.0673;2.2397,4.7801,-0.0827;2.9844,3.2625,-1.4531;2.6997,2.2893,-0.6394;3.0594,0.9875,-1.0472;2.6264,-0.0721,-0.0567;2.932,0.3048,0.9303;1.0829,-0.159,-0.0464;0.7535,-0.8516,0.7357;0.7116,-0.5319,-1.0082;0.6323,0.8191,0.1463;3.271,-1.4366,-0.308;2.9526,-2.147,0.4634;4.3631,-1.3719,-0.2909;2.9837,-1.8323,-1.2865;3.6821,0.6372,-2.1318;4.0521,1.6248,-3.0561;4.653,1.3482,-4.0809;3.6607,3.0344,-2.7319;3.9498,4.0629,-3.5516;4.4947,3.8847,-4.4756;3.4874,5.4575,-3.2708;2.6864,5.6944,-3.9946;4.5196,6.3649,-3.5263)|\",3.36332719218\r\n\"[H]C1S[C@]2([H])C(=O)N/C(C3=C([H])C([H])=C([H])O3)=N\\C2=C1[H] |(3.6229,-3.3734,-0.3332;4.6547,-3.2441,-0.642;5.1794,-4.0392,-2.1048;6.8228,-3.24,-1.9458;6.849,-2.4379,-2.6997;8.0751,-4.0874,-2.2124;8.0352,-5.0904,-2.8976;9.2372,-3.5431,-1.6942;9.1508,-2.6272,-0.7502;10.3975,-2.0408,-0.3052;11.6934,-2.2338,-0.7273;11.996,-2.9157,-1.5081;12.5074,-1.3779,0.0607;13.5808,-1.2607,0.0129;11.6522,-0.7221,0.9061;11.7988,0.0173,1.6794;10.3745,-1.109,0.7005;8.0128,-2.2129,-0.0595;6.885,-2.5941,-0.5814;5.5934,-2.536,0.0416;5.4228,-2.0417,0.9903)|\",3.5864605495900004\r\n\"[H]C1S[C@]2([H])C(=O)N/C(C3=C([H])C([H])=C(C([H])([H])[H])O3)=N\\C2=C1[H] |(8.0202,2.4409,-0.3178;8.1652,3.4705,-0.6269;8.9005,3.7648,-2.1837;8.6413,5.5677,-1.961;7.7957,5.8399,-2.6116;9.7801,6.5215,-2.3506;10.6385,6.199,-3.1495;9.6661,7.7819,-1.7942;8.8847,7.9534,-0.7449;8.7448,9.3058,-0.2573;9.2544,10.4971,-0.722;9.8973,10.6013,-1.5836;8.772,11.508,0.1469;8.9678,12.57,0.0936;7.9957,10.8765,1.0898;7.2205,11.3644,2.262;7.3011,12.4516,2.3392;7.5914,10.9213,3.1942;6.16,11.0993,2.1738;7.9724,9.5413,0.854;8.2435,6.9724,0.0135;8.212,5.7925,-0.5298;7.8515,4.5633,0.1189;7.4479,4.528,1.1236)|\",3.4259133777950006\r\n\"[H]C1S[C@]2([H])C(=O)N/C(C3([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C3([H])[H])=N\\C2=C1[H] 
|(-5.111,0.045,6.9267;-4.3022,-0.6728,7.0118;-3.074,-0.6753,5.7712;-2.1439,-1.9542,6.6992;-1.2805,-1.4451,7.1559;-1.5574,-3.1414,5.9301;-1.3606,-3.1119,4.7339;-1.1844,-4.2013,6.7556;-1.723,-4.3192,7.9435;-1.2422,-5.4736,8.7954;-0.2893,-5.7612,8.34;-2.187,-6.7052,8.6809;-2.4465,-6.8679,7.628;-1.6048,-7.5827,8.997;-3.4463,-6.6072,9.5533;-4.088,-5.7949,9.1909;-4.023,-7.5366,9.4599;-3.0814,-6.3518,11.0234;-2.5184,-7.2152,11.4105;-3.9903,-6.2736,11.6336;-2.235,-5.0785,11.1731;-1.9257,-4.9518,12.2191;-2.8443,-4.2084,10.909;-0.982,-5.1172,10.2824;-0.2994,-5.8867,10.6706;-0.4379,-4.1662,10.3501;-2.7709,-3.5475,8.4577;-3.0349,-2.4477,7.8164;-4.1818,-1.6043,7.9956;-4.8985,-1.7515,8.7944)|\",3.77149796793\r\n\"[H]OC1=N[C@]([H])(C2=NC([H])N([H])N2)N([H])[C@@]2([H])C([H])=C([H])S[C@@]12[H] |(3.6795,-3.1763,-2.1601;4.0139,-2.422,-2.6703;3.6732,-1.2586,-2.0458;4.0671,-0.1785,-2.5759;3.704,1.0763,-1.9496;3.4475,1.783,-2.7466;4.9176,1.6571,-1.2494;5.6294,1,-0.278;6.5487,1.8824,0.0639;7.3238,1.7511,0.8071;6.3947,3.0118,-0.6604;6.9434,3.8598,-0.66;5.3465,2.8855,-1.514;2.5113,0.9632,-1.0944;2.2919,1.8672,-0.6808;2.6204,-0.0686,-0.0553;3.4496,0.1217,0.6434;1.3026,-0.2257,0.668;0.9869,0.4788,1.4321;0.5379,-1.2363,0.2458;-0.4612,-1.4756,0.5944;1.2356,-2.2481,-1.0277;2.8945,-1.4136,-0.7603;3.458,-2.0642,-0.0809)|\",5.45860384103\r\n\"[H]C1C([H])=C([H])[C@]2([H])C(=O)NC(C([H])([H])C([H])([H])C([H])([H])C([H])([H])Cl)=NC2=C1[H] 
|(7.3257,-2.2803,-3.7018;6.241,-2.3202,-3.7645;5.631,-2.3513,-5.0793;6.2732,-2.2905,-5.953;4.2916,-2.4317,-5.2115;3.8016,-2.4318,-6.1796;3.4136,-2.5588,-4.0122;3.1093,-3.6272,-3.9633;2.0378,-1.8615,-4.106;1.5805,-1.517,-5.1766;1.334,-1.7616,-2.9078;1.9816,-1.9344,-1.7846;1.2026,-1.8518,-0.499;0.1399,-1.9788,-0.7234;1.5284,-2.6711,0.1551;1.4414,-0.5097,0.224;2.5193,-0.3822,0.3824;1.1137,0.3075,-0.4322;0.6971,-0.4355,1.5659;1.0272,-1.25,2.2224;-0.38,-0.5696,1.4082;0.94,0.8981,2.2599;0.5733,1.7376,1.665;1.9989,1.0552,2.477;0.075,0.9933,3.8591;3.3537,-2.086,-1.6051;4.0683,-2.2933,-2.6803;5.5052,-2.2895,-2.6186;5.9653,-2.2,-1.6398)|\",3.632719904175001\r\n\"[H]C([H])([H])C1=C(C([H])([H])C([H])([H])[H])[C@@]2([H])C(NC(C([H])([H])C([H])([H])C([H])([H])[H])=NC2=O)S1 |(7.7744,4.3496,-0.1589;8.8358,4.0998,-0.0936;9.409,5.0284,-0.2086;9.0319,3.7193,0.9166;9.2218,3.0944,-1.1364;8.4745,2.4829,-2.0755;7.0003,2.6874,-2.3167;6.5037,1.7159,-2.2169;6.5854,3.3466,-1.5477;6.6768,3.2594,-3.7089;5.5975,3.4144,-3.8122;6.9828,2.5697,-4.5031;7.1751,4.2215,-3.8756;9.2805,1.5897,-2.9872;9.3782,2.0858,-3.9728;10.683,1.4719,-2.4461;11.5432,0.6011,-2.8447;11.0411,-0.342,-3.7592;12.0989,-1.1757,-4.4239;11.6512,-1.6892,-5.2797;12.8907,-0.5062,-4.7856;12.7202,-2.2022,-3.4511;13.1243,-1.6674,-2.5838;11.9242,-2.8591,-3.0783;13.8181,-3.0384,-4.1142;14.2396,-3.7627,-3.4083;14.6387,-2.4052,-4.4729;13.4285,-3.5976,-4.9735;9.7869,-0.609,-3.9873;8.8051,0.1744,-3.3682;7.6601,-0.2028,-3.2269;10.9596,2.6055,-1.1436)|\",3.8640166771\r\n\"[H]O/C(=N/[C@@]1([H])N([H])N([H])[C@]([H])(C([H])([H])[H])C1([H])[H])C1=C([H])C([H])=C([H])C(C([H])([H])[H])=N1 
|(2.7936,-2.0942,0.3623;3.4898,-2.2979,1.0251;4.4309,-1.3413,0.8674;5.5297,-1.4922,1.4978;6.6502,-0.5948,1.3395;6.7098,-0.1896,0.3188;6.5408,0.56,2.3002;5.8113,0.306,2.9674;7.7544,0.652,3.0668;8.3768,1.2624,2.5332;8.3379,-0.7071,3.0871;7.7948,-1.2685,3.8596;9.8212,-0.6816,3.4334;10.2265,-1.698,3.4774;9.9879,-0.2035,4.4043;10.3926,-0.1288,2.6756;7.9844,-1.2921,1.7064;7.8634,-2.3783,1.7075;8.7639,-1.0305,0.9786;3.9747,-0.2407,-0.066;4.4893,1.058,-0.0824;5.252,1.3682,0.6253;3.9426,1.9622,-0.9962;4.3138,2.9828,-1.0353;2.9154,1.5568,-1.839;2.4755,2.2458,-2.5544;2.4271,0.2448,-1.736;1.2914,-0.2446,-2.5993;0.411,0.4015,-2.5028;1.0158,-1.2614,-2.3121;1.5762,-0.249,-3.6586;2.9527,-0.6189,-0.8612)|\",4.60416635046\r\n\"[H]C1=NC(=O)C([H])=C([H])[C@@]1([H])C(=O)/N=C1\\C([H])=C(C([H])([H])[H])N([H])N1[H] |(4.2959,2.7928,2.4445;4.1287,3.7529,1.9523;3.9269,4.7762,2.6948;3.7072,6.0461,2.0699;3.4505,7.0156,2.7611;3.8125,6.1443,0.5971;3.7052,7.1419,0.1819;4.0363,5.0681,-0.167;4.1412,5.1461,-1.2452;4.1624,3.71,0.4423;3.2985,3.0861,0.1461;5.3893,2.9202,-0.0904;6.1168,3.4195,-0.9434;5.4866,1.6755,0.4874;6.4821,0.8953,0.1258;7.5629,0.9693,-0.8275;7.7809,1.8334,-1.4315;8.1867,-0.2441,-0.837;9.3908,-0.7064,-1.5906;9.7812,0.1052,-2.2079;9.1482,-1.5545,-2.241;10.1891,-1.0273,-0.9091;7.5225,-1.1219,0.0082;8.0907,-1.7168,0.6078;6.5967,-0.3487,0.7242;5.7537,-0.8235,1.0277)|\",4.5715126884\r\n\"[H]O/C(=N/[C@@]1([H])N([H])N([H])[C@]([H])(C([H])([H])[H])C1([H])[H])C([H])([H])C1=C([H])C([H])=C([H])C(F)=C1[H] 
|(7.1911,0.3989,0.799;6.4552,-0.1666,0.5097;5.5319,0.6035,-0.1289;4.6296,-0.0125,-0.7776;3.6308,0.7224,-1.5208;3.3245,1.6499,-1.0158;4.2064,1.1001,-2.8569;4.9492,0.4227,-3.0391;3.2084,0.8567,-3.8604;2.6252,1.6953,-3.8877;2.3918,-0.2779,-3.3703;2.9472,-1.1898,-3.6297;1.0248,-0.3221,-4.0428;0.4542,-1.1948,-3.7079;1.1273,-0.3776,-5.1318;0.4375,0.5731,-3.7978;2.3726,-0.1283,-1.8314;2.4198,-1.0881,-1.31;1.4638,0.3934,-1.5065;5.7272,2.1073,0.0695;4.9474,2.6523,-0.464;5.6114,2.3242,1.1397;7.0945,2.5702,-0.4007;8.1287,2.8144,0.5132;7.9372,2.7299,1.5809;9.3951,3.2037,0.065;10.1878,3.3963,0.7823;9.6447,3.3574,-1.2966;10.6148,3.6627,-1.674;8.6032,3.1148,-2.1891;8.8347,3.267,-3.5084;7.3389,2.7227,-1.7728;6.5546,2.5401,-2.5029)|\",5.575612796745\r\n\"[H]C1=C([H])[C@@]([H])(C(=O)/N=C2\\C([H])=C(C([H])([H])[H])N([H])N2[H])N=NC1=O |(5.4321,6.5305,-1.187;5.6422,5.5412,-1.5821;6.2021,4.5532,-0.8692;6.4807,4.6702,0.1761;6.3956,3.2285,-1.5176;7.2412,2.6738,-1.109;5.1073,2.3547,-1.3079;4.0648,2.7209,-1.8372;5.3442,1.2614,-0.5203;4.3287,0.452,-0.2825;2.9259,0.4271,-0.6116;2.4374,1.1472,-1.2459;2.3656,-0.623,0.0579;0.9677,-1.1493,0.0335;0.3235,-0.4734,-0.5328;0.5636,-1.2503,1.047;0.9233,-2.1364,-0.445;3.3214,-1.2321,0.8573;3.3376,-2.249,0.8907;4.5516,-0.6705,0.4941;5.2673,-0.6404,1.2118;6.5996,3.2906,-2.9937;6.0807,4.208,-3.6544;5.3099,5.2811,-2.9864;4.545,5.9306,-3.6655)|\",3.62999876567\r\n\"[H]C1=NC(=O)[C@@]([H])(C(=O)N2C([H])([H])C([H])([H])SC([H])([H])C2([H])[H])C([H])=C1[H] 
|(-0.1546,-0.5949,3.3423;-0.1114,-0.521,2.2528;0.965,-0.0033,1.7583;1.0409,0.1248,0.3592;1.828,0.9041,-0.1453;0.1257,-0.7459,-0.5321;-0.1551,-0.1141,-1.3812;0.8765,-2.0087,-1.0567;0.6672,-3.0847,-0.5069;1.7226,-1.8716,-2.1249;2.0823,-0.6309,-2.8203;1.8206,-0.7406,-3.8812;1.5254,0.2047,-2.4013;3.5734,-0.3042,-2.685;3.8155,-0.1188,-1.6346;3.8008,0.6037,-3.2516;4.6574,-1.6183,-3.3662;3.9045,-3.0237,-2.4566;4.3608,-3.9313,-2.8637;4.1605,-2.9589,-1.3937;2.3858,-3.0819,-2.6302;1.9737,-3.9275,-2.0782;2.1393,-3.1943,-3.6938;-1.1221,-1.196,0.1728;-1.8952,-1.6736,-0.423;-1.2481,-1.0382,1.4978;-2.1305,-1.3653,2.0385)|\",3.54836461052\r\n\"[H]O/C(=N/C([H])([H])C1=NN([H])C([H])=C1[H])[C@]1([H])C(=O)N=C([H])C([H])=C1[H] |(-0.4971,-3.9716,-1.1445;0.0211,-3.5774,-1.8827;0.6011,-2.4518,-1.4211;1.6333,-2.0223,-2.0235;2.3329,-0.8172,-1.5725;2.9135,-0.4469,-2.424;1.6695,0.0021,-1.2641;3.2663,-1.106,-0.4245;2.8655,-0.8742,0.8309;3.8939,-1.2948,1.5996;3.8265,-1.2013,2.6018;4.9347,-1.7765,0.873;5.8361,-2.1415,1.3444;4.5627,-1.6735,-0.4518;5.1333,-1.9717,-1.319;-0.1114,-1.7548,-0.2362;0.6919,-1.3933,0.4382;-1.0008,-2.6311,0.641;-1.0513,-3.8491,0.5424;-1.7751,-1.9832,1.6185;-2.2159,-0.805,1.3187;-2.857,-0.3191,2.0578;-1.9084,-0.0843,0.0879;-2.462,0.8167,-0.1558;-0.8855,-0.5258,-0.664;-0.5648,0.0035,-1.5565)|\",3.90755489318\r\n\"[H]O/C(=N/C([H])([H])C1=NN([H])C([H])=C1[H])[C@@]1([H])C([H])=NC(=O)C([H])=C1[H] 
|(3.4741,0.9089,0.6264;3.2643,0.3229,1.3723;2.1608,-0.4173,1.0777;1.8464,-1.334,1.8869;0.6838,-2.186,1.6778;0.3431,-2.5212,2.6624;-0.164,-1.6504,1.2225;1.0017,-3.4025,0.8418;1.2251,-3.2778,-0.4704;1.4812,-4.5345,-0.8908;1.6718,-4.6944,-1.8686;1.4222,-5.4522,0.1072;1.6015,-6.5025,-0.0732;1.1133,-4.7504,1.2557;0.9907,-5.1475,2.2527;1.4528,0.0134,-0.2343;0.6099,-0.6728,-0.3807;0.921,1.421,-0.0816;0.1827,1.5623,0.714;1.2469,2.4559,-0.76;2.2132,2.3155,-1.8106;2.5644,3.2981,-2.4348;2.7503,0.9669,-2.1097;3.4346,0.9064,-2.9506;2.3869,-0.111,-1.3993;2.7546,-1.1087,-1.6235)|\",4.61505090448\r\n\"[H]O/C(=N/C([H])([H])C1=C([H])OC([H])=C1[H])[C@]1([H])C(=O)N=C([H])C([H])=C1[H] |(-0.4663,-3.8259,-0.9737;0.1018,-3.3702,-1.6368;0.5513,-2.226,-1.0916;1.574,-1.6802,-1.6056;2.1463,-0.4338,-1.1304;1.7948,-0.1039,-0.1381;3.2257,-0.6108,-1.0211;1.936,0.6821,-2.1236;1.6536,1.9904,-1.8593;1.5043,2.5477,-0.946;1.5751,2.7141,-3.0139;1.8078,1.8401,-4.0365;1.7708,2.2637,-5.0287;2.0325,0.5887,-3.557;2.2288,-0.3084,-4.126;-0.2858,-1.6455,0.0818;0.4377,-1.3693,0.8715;-1.2611,-2.5987,0.7784;-1.2554,-3.8076,0.5949;-2.1733,-2.0423,1.6877;-2.5791,-0.8376,1.4477;-3.322,-0.4272,2.1353;-2.1102,0.0046,0.3529;-2.6273,0.9338,0.1357;-0.996,-0.364,-0.3017;-0.5499,0.256,-1.0742)|\",3.39598085424\r\n\"[H]C1=NC([H])=C([H])C(=O)/C1=N\\C(=O)[C@]1([H])C([H])([H])[C@@]1([H])N(=O)=O 
|(2.0014,-0.7494,-1.3209;2.2969,-1.3591,-0.4668;3.4136,-1.1117,0.1227;3.78,-1.9186,1.2153;4.7378,-1.6486,1.6516;3.054,-2.94,1.727;3.3955,-3.514,2.5823;1.7597,-3.2795,1.1443;0.9979,-4.1337,1.5778;1.3782,-2.4472,-0.0739;0.3183,-2.6359,-0.7618;-0.5921,-3.6984,-0.5824;-0.4545,-4.7399,-1.1938;-1.7982,-3.3615,0.2333;-1.8719,-2.3627,0.6502;-2.4423,-4.4808,1.0078;-2.9056,-4.2103,1.9503;-1.9735,-5.4573,0.9471;-3.1028,-3.9853,-0.2389;-3.1112,-4.5795,-1.1429;-4.3181,-3.1657,-0.099;-5.1662,-3.2898,-0.9772;-4.4041,-2.4225,0.8792)|\",3.1211458652350004\r\n\"[H]O/C(=N/[C@@]([H])(C([H])([H])[H])[C@@]1([H])OC([H])([H])C([H])([H])C1([H])[H])[C@]1([H])C(=O)N=C([H])C([H])=C1[H] |(4.1966,4.043,1.0787;3.3707,3.5501,0.9452;3.661,2.2654,0.5919;2.6969,1.5085,0.3022;2.8597,0.1071,-0.0604;3.7793,-0.3266,0.3595;1.6625,-0.6657,0.509;1.7539,-1.7407,0.3206;0.7256,-0.3047,0.0705;1.6085,-0.511,1.5904;2.9372,-0.0144,-1.5897;2.0664,0.5029,-2.0209;4.1363,0.6456,-2.057;4.7371,-0.1016,-3.125;4.9305,0.5745,-3.9666;5.7017,-0.5037,-2.7827;3.7563,-1.2281,-3.474;3.0463,-0.8953,-4.24;4.2596,-2.1249,-3.8482;3.0326,-1.4437,-2.138;3.6396,-2.0663,-1.4678;2.0537,-1.9181,-2.2485;5.1747,1.9126,0.6176;5.2746,1.0168,-0.0001;5.6096,1.5399,2.0471;5.2809,0.4735,2.5249;6.3806,2.4545,2.8089;6.9401,3.4373,2.1838;7.5464,4.1148,2.7907;6.8312,3.734,0.7544;7.4261,4.5442,0.3435;5.9883,3.0067,-0.0007;5.8489,3.1937,-1.0631)|\",3.8640166771\r\n\"[H]C1=C([H])C2=C(C([H])=C1[H])N([H])/C(=N/C([H])([H])[C@@]1([H])N([H])N([H])C([H])([H])C1([H])[H])O2 
|(2.3636,1.057,-0.9108;1.3044,1.1358,-1.1354;0.3929,0.3288,-0.4338;0.7131,-0.373,0.3287;-0.9389,0.4712,-0.7617;-1.3881,1.3649,-1.742;-0.4909,2.1654,-2.4365;-0.8234,2.8641,-3.1979;0.8685,2.0332,-2.1128;1.5954,2.6454,-2.6385;-2.767,1.2085,-1.7813;-3.4323,1.7111,-2.347;-3.172,0.261,-0.8514;-4.3574,-0.1071,-0.6233;-4.6168,-1.1095,0.4003;-4.4055,-2.1125,-0.0042;-3.967,-0.9815,1.2791;-6.0884,-1.0464,0.8463;-6.2293,-0.1286,1.4311;-7.0306,-1.0338,-0.3068;-6.4916,-0.9544,-1.1668;-7.7612,-2.2862,-0.3568;-8.6795,-2.0739,0.0342;-7.0791,-3.2044,0.566;-7.7802,-3.9562,0.9436;-6.2799,-3.7305,0.0267;-6.4902,-2.295,1.6578;-5.6429,-2.7378,2.1939;-7.2601,-2.0433,2.3969;-2.0185,-0.2026,-0.2212)|\",5.047711926775\r\n\"[H]C1=C([H])C2=C(N/C([H])=N\\C2N([H])C([H])([H])[C@@]2([H])N([H])N([H])C([H])([H])C2([H])[H])S1 |(-6.2457,-3.806,2.6433;-5.5014,-3.188,3.1279;-4.6235,-2.3348,2.5343;-4.5821,-2.1878,1.46;-3.7751,-1.6746,3.4872;-4.043,-2.0606,4.8113;-3.4087,-1.6096,5.9023;-2.458,-0.721,5.6251;-1.9108,-0.3224,6.4773;-2.0884,-0.2452,4.4259;-2.733,-0.7127,3.3457;-2.3485,-0.2542,2.1249;-2.919,-0.4728,1.319;-1.3792,0.8162,1.9409;-1.7894,1.7854,2.2714;-0.506,0.6185,2.5681;-0.9901,0.8898,0.4701;-0.5491,-0.0733,0.1764;-2.186,1.0782,-0.3856;-2.7526,1.832,0.0117;-1.7031,1.5825,-1.6429;-1.4374,0.7595,-2.1808;-0.4881,2.3966,-1.3573;0.278,2.1979,-2.1141;-0.7464,3.4592,-1.4245;-0.029,2.0316,0.0824;-0.15,2.8864,0.7588;1.0199,1.7229,0.1279;-5.347,-3.2389,4.8712)|\",4.9960102951800005\r\n\"[H]C1=C([H])C(OC(=O)[C@@]2([H])N([H])N([H])C([H])([H])C2([H])[H])=C(N(=O)=O)C([H])=C1[H] 
|(-0.7987,1.2753,1.8263;0.0075,1.4732,1.1262;0.9065,0.4575,0.8093;0.8267,-0.5271,1.2578;1.942,0.6851,-0.0949;2.8579,-0.3313,-0.2997;2.6499,-1.1806,-1.3666;1.6911,-1.1065,-2.0913;3.7568,-2.219,-1.4385;3.8805,-2.6225,-0.4258;3.3971,-3.2927,-2.3962;2.524,-3.0259,-2.8494;4.3933,-3.3492,-3.4423;5.0561,-4.0669,-3.1484;5.0762,-2.0471,-3.4281;6.0703,-2.1318,-3.8783;4.4908,-1.3321,-4.019;5.1023,-1.6273,-1.9498;5.1782,-0.5497,-1.794;5.9327,-2.1124,-1.4249;2.0756,1.9583,-0.6705;3.1457,2.2824,-1.6233;3.4267,3.4709,-1.7707;3.6934,1.3601,-2.2274;1.1843,2.9818,-0.3399;1.3274,3.9549,-0.7933;0.1447,2.7385,0.5503;-0.5519,3.5339,0.7946)|\",3.45040362434\r\n\"[H]O/C(=N/C([H])([H])C([H])([H])[H])[C@@]1([H])C(=O)/N=C2C(=C/1[H])\\C(=O)C([H])([H])C([H])([H])C\\2([H])[H] |(3.2548,0.977,1.727;3.0496,0.0767,2.0677;3.8571,-0.8019,1.4432;3.5209,-2.0217,1.4126;4.3059,-3.0653,0.7837;5.048,-2.6949,0.054;4.875,-3.5979,1.5608;3.3777,-4.0614,0.0837;3.9526,-4.8847,-0.3557;2.8135,-3.5673,-0.7145;2.6585,-4.474,0.7975;5.1907,-0.2257,0.8831;5.2852,-0.6283,-0.1432;5.2813,1.2956,0.7113;4.2936,2.0168,0.7531;6.5378,1.8528,0.4585;7.5888,1.2454,0.9162;7.5328,-0.0197,1.6719;6.3855,-0.7299,1.6387;6.3385,-1.6932,2.1386;8.7275,-0.5561,2.4075;8.6443,-1.5644,3.0868;10.042,0.1939,2.2377;10.6361,0.038,3.1431;10.5839,-0.3053,1.4189;9.8625,1.6801,1.8986;9.4371,2.2126,2.7591;10.8371,2.137,1.6977;8.9443,1.8598,0.6783;9.3978,1.358,-0.1916;8.8087,2.9102,0.4086)|\",3.616393073145\r\n\"[H]/N=C(O[H])\\C([H])=C(/[H])C1=C([H])C(Cl)=C([H])C([H])=C1OC([H])([H])[H] 
|(2.0568,6.2223,1.5971;1.2209,5.73,1.9184;1.0792,4.6478,1.2583;0.0423,3.8342,1.6141;-0.1385,3.2237,0.883;1.9591,4.1777,0.1666;2.5759,4.9501,-0.287;2.0647,2.8909,-0.22;1.4965,2.1361,0.3172;2.9196,2.3597,-1.2839;3.5605,3.1856,-2.2195;3.4167,4.2593,-2.1796;4.3658,2.6458,-3.2153;5.1523,3.7148,-4.3699;4.5499,1.2691,-3.3141;5.1766,0.8546,-4.0962;3.9158,0.4265,-2.4013;4.0626,-0.6434,-2.487;3.1054,0.9566,-1.3921;2.4586,0.1961,-0.4652;2.5934,-1.2178,-0.5205;1.9902,-1.6048,0.3021;3.6373,-1.5253,-0.3815;2.2169,-1.6198,-1.4696)|\",4.266745175840001\r\n\"[H]C1=NC(=O)C([H])=C([H])[C@@]1([H])C(=O)N1C([H])([H])[C@]2([H])C([H])([H])C([H])([H])[C@@]1([H])C2([H])[H] |(1.437,1.3725,-4.2853;2.3807,1.1116,-3.8001;3.456,1.4179,-4.4169;4.7119,1.1227,-3.797;5.7453,1.3426,-4.3996;4.7176,0.5549,-2.4287;5.6949,0.4191,-1.9758;3.5853,0.2035,-1.8084;3.6038,-0.2327,-0.8119;2.251,0.404,-2.4686;1.8,-0.5825,-2.669;1.2874,1.209,-1.5607;1.1372,2.4142,-1.7305;0.6698,0.5174,-0.5664;0.6174,-0.9413,-0.3338;0.2965,-1.4878,-1.2295;1.5911,-1.3389,-0.0185;-0.431,-1.0226,0.7977;-0.406,-1.9749,1.3329;-1.8169,-0.6475,0.209;-2.0411,-1.213,-0.7019;-2.611,-0.8681,0.9298;-1.6906,0.8921,-0.0504;-2.4007,1.4651,0.5563;-1.8485,1.1691,-1.096;-0.24,1.1784,0.3969;0.0256,2.228,0.5069;-0.0831,0.2453,1.6095;0.9302,0.2433,2.0242;-0.8028,0.4576,2.4065)|\",4.791924907305\r\n\"[H]O/C(=N/[C@]([H])(/C(=N\\C([H])([H])[H])O[H])C([H])([H])[H])[C@@]1([H])C([H])=NC(=O)C([H])=C1[H] 
|(5.8119,-1.1551,2.4111;4.8984,-0.8216,2.4144;4.506,-0.539,1.148;3.3049,-0.1812,0.9786;2.7225,0.1921,-0.3007;3.4795,0.4565,-1.0498;1.8251,1.4302,-0.0693;1.4403,2.2851,-0.9209;1.826,2.2216,-2.3129;2.1292,3.2228,-2.6419;0.9565,1.9428,-2.9239;2.6432,1.5267,-2.5679;1.3751,1.538,1.2029;1.8803,0.8826,1.7293;1.8725,-0.975,-0.8378;1.3895,-0.6993,-1.7799;1.0951,-1.2391,-0.1144;2.4965,-1.8597,-1.009;5.6287,-0.6645,0.0902;5.1496,-0.6265,-0.8968;6.5452,0.5419,0.2029;6.0642,1.5119,0.0472;7.7954,0.5454,0.466;8.4622,-0.7046,0.6807;9.6394,-0.716,0.9777;7.6884,-1.9625,0.5264;8.2532,-2.8821,0.6459;6.3799,-1.956,0.2367;5.8206,-2.8816,0.1141)|\",3.72795975185\r\n\"[H]/N=C(/O[H])C([H])([H])[C@]1([H])C([H])C2=C(NC1=O)/C([H])=C([H])\\C([H])=C/2[H] |(2.0663,5.6021,1.2668;2.9369,5.266,1.6806;2.8324,4.03,1.9743;3.8902,3.3971,2.5078;3.6399,2.4741,2.7628;1.5847,3.1697,1.7698;1.164,2.9,2.7463;0.8317,3.7719,1.2525;1.8376,1.8503,0.9627;2.7318,2.0218,0.335;0.7281,1.5442,0.0175;0.3434,2.3595,-0.5943;0.2266,0.2911,-0.12;0.8033,-0.7784,0.7255;1.7269,-0.597,1.6472;2.2117,0.6725,1.8809;2.9724,0.8772,2.8289;0.2961,-2.1146,0.5269;0.7263,-2.8983,1.141;-0.6833,-2.3629,-0.3856;-1.0505,-3.3776,-0.5156;-1.2604,-1.3137,-1.1937;-2.0435,-1.5627,-1.903;-0.8225,-0.0369,-1.0643;-1.239,0.7665,-1.6668)|\",3.0884922031749995\r\n\"[H]C1=C(C(=O)/C([H])=C(\\[H])C2=C([H])C([H])=C(C([H])([H])[H])C(C([H])([H])[H])=C2[H])OC(C([H])([H])[H])=C1[H] 
|(11.4157,0.5639,0.7322;11.2926,-0.2997,0.095;10.0874,-0.8198,-0.2994;8.7137,-0.3949,-0.001;8.5311,0.5869,0.7219;7.6102,-1.182,-0.5941;7.8753,-2.0291,-1.2187;6.3287,-0.839,-0.351;6.1807,0.0284,0.2918;5.1165,-1.4877,-0.8431;5.1245,-2.6111,-1.688;6.066,-3.0442,-2.0122;3.929,-3.1741,-2.1163;3.9535,-4.0427,-2.7705;2.6864,-2.6531,-1.7285;1.4081,-3.2912,-2.2142;1.6148,-4.1489,-2.8612;0.7922,-2.5829,-2.784;0.788,-3.6435,-1.3795;2.6587,-1.5259,-0.8797;1.347,-0.9254,-0.4317;1.5102,-0.0564,0.2125;0.7405,-1.6481,0.1296;0.7381,-0.5991,-1.2848;3.8669,-0.9695,-0.4572;3.845,-0.1006,0.1974;10.3066,-1.9175,-1.1044;11.6531,-2.085,-1.2123;12.1244,-3.2245,-2.0457;13.2171,-3.2538,-2.054;11.7744,-3.1336,-3.0813;11.757,-4.1823,-1.6573;12.3005,-1.1142,-0.4923;13.3722,-1.0037,-0.3991)|\",4.038169541419999\r\n\"[H]C1=C(/C([H])=C(\\[H])C(=O)C2=C([H])C(F)=C([H])C([H])=C2[H])N=NS1 |(-1.3444,-7.4026,0.6561;-0.3347,-7.2413,0.3032;0.2661,-6.023,0.0557;-0.3385,-4.7144,0.2187;-1.3676,-4.6845,0.5692;0.2778,-3.5434,-0.0299;1.3041,-3.5419,-0.3767;-0.4517,-2.2665,0.1806;-1.6068,-2.2669,0.6002;0.2368,-0.9655,-0.12;-0.4808,0.2091,0.156;-1.4819,0.1441,0.5661;0.105,1.436,-0.1028;-0.5855,2.5632,0.1653;1.3883,1.5486,-0.6328;1.8073,2.5321,-0.8193;2.097,0.3804,-0.9095;3.098,0.4458,-1.3256;1.5311,-0.8701,-0.6556;2.1058,-1.7588,-0.8884;1.5717,-6.1608,-0.388;2.0067,-7.3402,-0.4975;0.7501,-8.5068,-0.0258)|\",4.31572566893\r\n\"[H]C(=C(\\C#N)C(=O)OC([H])([H])[H])/C1=C2\\OC([H])([H])O\\C2=C([H])\\C([H])=C\\1[H] 
|(4.6918,-0.8494,2.6934;4.8458,-1.5247,1.8546;3.7368,-1.6202,1.0546;2.6073,-0.8322,1.468;1.7172,-0.1806,1.839;3.5487,-2.427,-0.1845;4.3586,-3.1758,-0.6974;2.3189,-2.2216,-0.6949;2.0147,-2.9472,-1.8966;2.0616,-4.0242,-1.7147;2.7208,-2.6872,-2.6894;1.0028,-2.6455,-2.166;6.1822,-2.0895,1.874;7.0176,-1.6605,2.9171;6.7181,-0.7549,3.898;7.8471,-0.743,4.7871;8.1721,0.2897,4.9404;7.5619,-1.2144,5.7367;8.8951,-1.4913,4.1687;8.3254,-2.1068,3.0788;8.8947,-3.0132,2.2108;9.916,-3.3547,2.3396;8.0803,-3.468,1.1521;8.4898,-4.1861,0.4484;6.7754,-3.0297,0.9819;6.1793,-3.4008,0.1616)|\",3.6136719346400006\r\n\"[H]C1=C(/C([H])=C(\\[H])C(=O)C2=C([H])C([H])=C([H])C([H])=C2[H])N=NS1 |(6.6876,1.4079,-1.4156;6.3093,2.4093,-1.5702;5.0101,2.8371,-1.3818;3.8917,2.0256,-0.9389;4.0922,0.9768,-0.7328;2.6335,2.4713,-0.7647;2.402,3.5116,-0.9588;1.5763,1.5323,-0.3025;1.8513,0.3586,-0.0597;0.1713,2.028,-0.1329;-0.2332,3.3436,-0.4109;0.4756,4.0793,-0.7749;-1.5616,3.728,-0.2278;-1.8616,4.7488,-0.4473;-2.5008,2.8056,0.2349;-3.535,3.1078,0.3776;-2.109,1.4924,0.5136;-2.8378,0.7715,0.8738;-0.7853,1.1084,0.3303;-0.459,0.0952,0.5392;4.8537,4.1841,-1.6673;5.8865,4.8028,-2.0453;7.2816,3.7008,-2.0941)|\",4.3973598240800005\r\n\"[H]O/C(=N/C([H])([H])C1=C([H])N=C([H])C([H])=C1[H])N([H])C1([H])C([H])([H])N([H])N([H])C1([H])[H] 
|(-4.4669,-1.618,0.1925;-4.62,-2.4502,-0.2904;-3.3717,-2.8907,-0.5983;-2.3602,-2.2267,-0.1719;-1.0308,-2.6513,-0.5812;-0.957,-2.7532,-1.678;-0.3378,-1.8448,-0.3078;-0.5317,-3.9447,0.056;-0.8684,-4.2864,1.3751;-1.5258,-3.6314,1.9431;-0.4397,-5.3869,2.0028;0.3642,-6.2129,1.3203;0.7001,-7.102,1.8516;0.764,-5.9803,0.0046;1.4142,-6.6849,-0.5063;0.306,-4.8275,-0.6335;0.5991,-4.6111,-1.6595;-3.3935,-4.0127,-1.3905;-2.5401,-4.5565,-1.3439;-4.6126,-4.7779,-1.6582;-5.3792,-4.0555,-1.9417;-5.0919,-5.6708,-0.4681;-6.0027,-5.2983,0.0097;-4.3029,-5.6991,0.2946;-5.2991,-7.0299,-0.9997;-6.2539,-7.1282,-1.3423;-4.4524,-7.1624,-2.154;-3.5334,-7.425,-1.7992;-4.3912,-5.8278,-2.7817;-3.447,-5.6853,-3.3175;-5.1997,-5.7576,-3.5188)|\",5.3279891927900005\r\n\"[H]O/C(=N/C([H])([H])C([H])([H])C1=C([H])C([H])=C([H])C([H])=C1[H])N([H])C1([H])C([H])([H])N([H])N([H])C1([H])[H] |(7.0146,-1.1849,-0.8222;6.5651,-1.971,-1.1817;5.2947,-1.8716,-0.6983;5.0237,-0.9135,0.1078;3.6634,-0.6856,0.5533;3.5022,0.4003,0.5788;2.9062,-1.0932,-0.1388;3.3876,-1.2239,1.9793;2.4116,-0.8393,2.3035;4.1427,-0.8014,2.6517;3.3973,-2.733,2.0765;2.2642,-3.4801,1.7151;1.3599,-2.9605,1.4034;2.2743,-4.8757,1.7649;1.3835,-5.4339,1.4887;3.423,-5.5526,2.1848;3.4301,-6.638,2.2386;4.5569,-4.8217,2.55;5.4523,-5.3395,2.8839;4.542,-3.4265,2.4946;5.4277,-2.8635,2.777;4.4704,-2.8423,-1.2157;3.6721,-3.0617,-0.6337;4.9499,-3.9322,-2.0659;5.5702,-3.4843,-2.8443;3.7409,-4.7111,-2.664;3.649,-4.5896,-3.7478;2.8083,-4.341,-2.2174;3.9137,-6.1359,-2.314;4.4079,-6.6116,-3.0676;4.7972,-6.1924,-1.1777;4.2113,-6.0402,-0.3563;5.7278,-5.0608,-1.3295;6.1185,-4.7407,-0.3603;6.5746,-5.3881,-1.947)|\",5.42595017897\r\n\"[H]O/C(=N/C([H])([H])C([H])([H])C(=O)N(C([H])([H])[H])C([H])([H])[H])[C@@]1([H])C([H])=NC(=O)C([H])=C1[H] 
|(-1.825,-3.8608,-4.0661;-1.0494,-3.969,-3.4918;-0.7071,-2.7657,-2.95;0.2316,-2.7674,-2.1061;0.7298,-1.5864,-1.4372;0.1497,-0.6713,-1.6203;0.6928,-1.7871,-0.3589;2.1958,-1.3351,-1.8133;2.7385,-2.2822,-1.7291;2.6436,-0.6383,-1.0914;2.3347,-0.746,-3.2202;1.3397,-0.3423,-3.8282;3.589,-0.6599,-3.7613;3.7527,-0.1079,-5.1003;4.1649,-0.8652,-5.7799;4.4409,0.7467,-5.0764;2.78,0.2179,-5.464;4.8211,-1.0774,-3.1116;5.3013,-1.888,-3.6764;4.6392,-1.4274,-2.0975;5.5251,-0.2361,-3.0632;-1.5553,-1.5838,-3.4817;-1.1524,-0.6726,-3.0244;-1.3462,-1.4471,-4.9763;-0.3193,-1.2309,-5.2736;-2.2312,-1.5553,-5.8973;-3.5891,-1.8112,-5.5266;-4.4322,-1.9619,-6.3909;-3.935,-1.8839,-4.0861;-4.9867,-2.0292,-3.8585;-3.0016,-1.7683,-3.1308;-3.2632,-1.825,-2.0758)|\",4.704848475145001\r\n\"[H]C1=NC(=O)C([H])=C([H])[C@@]1([H])C(=O)N1C([H])([H])C([H])([H])C([H])(C([H])([H])C([H])([H])[H])C([H])([H])C1([H])[H] |(7.6648,-4.1428,1.8272;7.1436,-4.6534,2.6396;7.7841,-5.5665,3.262;7.1448,-6.2327,4.3546;7.7126,-7.1509,4.915;5.7988,-5.7801,4.7783;5.3902,-6.2659,5.659;5.121,-4.851,4.0943;4.1244,-4.5446,4.4045;5.7157,-4.2047,2.875;5.1449,-4.5158,1.9841;5.6949,-2.6574,2.9702;6.7362,-2.0647,3.2333;4.5021,-2.0145,2.7755;4.4547,-0.5518,2.8974;3.7723,-0.2974,3.7226;5.4551,-0.214,3.1655;3.963,0.0963,1.5978;4.7111,-0.0705,0.8098;3.8947,1.1793,1.7523;2.6062,-0.479,1.1489;1.864,-0.2287,1.9259;2.1084,0.1051,-0.1851;2.8461,-0.1175,-0.9698;1.1895,-0.4246,-0.4732;1.8251,1.6116,-0.1613;1.4054,1.9426,-1.1177;2.7325,2.1983,0.0174;1.1026,1.8666,0.6239;2.7032,-2.0141,1.0812;1.7211,-2.4489,0.8534;3.3808,-2.3022,0.2649;3.2246,-2.6167,2.3946;3.3266,-3.6981,2.3036;2.4962,-2.4296,3.199)|\",4.783761491789999\r\n\"[H]O/C(=N/C([H])([H])C([H])([H])[H])C([H])([H])N(C(=O)[C@@]1([H])C([H])=NC(=O)C([H])=C1[H])C([H])([H])[H] 
|(1.3023,-1.0936,1.8011;1.7272,-0.5535,1.1078;3.0725,-0.6856,1.329;3.4606,-1.3947,2.3012;4.8583,-1.6399,2.6206;5.5605,-1.1007,1.9711;5.0186,-1.2827,3.6464;5.1625,-3.1388,2.5528;6.1861,-3.3356,2.8913;5.0686,-3.4978,1.5234;4.4694,-3.7023,3.1862;3.8419,0.1136,0.2872;4.9097,0.0748,0.4914;3.5089,1.1558,0.3343;3.6202,-0.3641,-1.0838;4.414,-1.4026,-1.5019;5.2947,-1.8785,-0.7948;4.2031,-1.9436,-2.9367;3.1203,-1.9819,-3.1397;4.8123,-0.9975,-3.9573;4.5828,0.0643,-3.826;5.5593,-1.305,-4.9451;5.8934,-2.6855,-5.1563;6.4692,-3.0069,-6.1775;5.5326,-3.6653,-4.1095;5.9247,-4.6688,-4.2418;4.76,-3.3302,-3.0705;4.5012,-4.0497,-2.299;2.4146,0.0982,-1.7736;1.5342,-0.4957,-1.5041;2.5466,0.0854,-2.8559;2.2224,1.1333,-1.4805)|\",4.77287693777\r\n\"[H]C([H])([H])SC([H])([H])[C@@]1([H])N(C([H])([H])[H])C(=O)N(C([H])([H])[H])C(=O)C1([H])[H] |(6.2659,-3.0279,2.8748;5.7927,-2.0923,2.566;5.2735,-1.6587,3.426;6.5673,-1.4042,2.2148;4.623,-2.5081,1.2263;3.9544,-0.8304,0.8841;3.4267,-0.4716,1.7741;4.7826,-0.1419,0.6794;2.9777,-0.8614,-0.3129;2.2611,-1.6739,-0.1462;2.1965,0.3742,-0.3896;1.0079,0.4961,0.4444;0.3267,-0.3368,0.2381;0.5186,1.4401,0.2093;1.256,0.4879,1.5145;2.6995,1.5129,-0.9741;2.1785,2.6149,-0.8708;3.8873,1.3623,-1.7316;4.4765,2.6028,-2.2482;5.4315,2.3442,-2.7014;4.6139,3.3117,-1.4302;3.8186,3.0583,-2.9933;4.4209,0.1411,-2.141;5.4001,0.067,-2.8626;3.6689,-1.0854,-1.6579;2.9161,-1.3291,-2.4191;4.3783,-1.9142,-1.6142)|\",6.010994957545001\r\n\"[H]C1C([H])=C([H])[C@@]2([H])C(=O)NC(C([H])([H])SC([H])([H])[H])=NC2=C1[H] 
|(9.6941,-1.8174,-0.7875;9.0142,-1.0737,-1.1957;9.2099,-0.6442,-2.5658;10.0003,-1.1072,-3.1493;8.4116,0.2966,-3.1104;8.5182,0.6261,-4.1386;7.3546,0.9629,-2.2962;7.7472,1.972,-2.0426;6.0578,1.3377,-3.0531;6.0401,1.404,-4.2658;4.9746,1.69,-2.2537;5.0244,1.4083,-0.9738;3.9096,1.9198,-0.1074;3.7544,1.256,0.7448;2.9862,2.0335,-0.6792;4.3964,3.5535,0.6036;4.0766,4.6319,-0.8388;4.4543,5.6246,-0.5799;3.0046,4.6973,-1.0467;4.5973,4.2616,-1.7252;5.9688,0.6499,-0.297;7.0506,0.3343,-0.9618;7.9879,-0.6144,-0.4261;7.8091,-0.9844,0.5782)|\",3.2680873445050005\r\n\"[H]O/C(=N/C([H])([H])C(=O)/N=C1/C(=O)C([H])=C([H])N=C1[H])C1([H])C([H])([H])C1([H])[H] |(-2.7272,-1.7799,-2.2458;-2.0431,-2.2308,-2.7649;-0.8357,-1.9489,-2.2029;0.1692,-2.5502,-2.6998;1.4974,-2.2918,-2.187;1.5369,-1.6547,-1.2893;1.993,-3.239,-1.9496;2.3778,-1.6115,-3.2424;3.5564,-1.8502,-3.4077;1.7191,-0.5571,-3.8883;1.2286,-0.5868,-5.0658;0.5276,0.6617,-5.5908;0.2746,1.6101,-4.862;0.1721,0.5994,-7.0096;-0.2811,1.4773,-7.4587;0.3695,-0.5397,-7.7104;0.0873,-0.6037,-8.7579;0.915,-1.7331,-7.2025;1.303,-1.7544,-5.9768;1.7238,-2.6789,-5.5821;-0.8623,-0.9757,-1.051;-0.6534,-1.441,-0.0889;-1.8362,0.1871,-1.0234;-2.477,0.3539,-1.8867;-2.2978,0.4398,-0.0733;-0.3667,0.4475,-1.2152;0.195,0.8771,-0.3904;-0.0324,0.7622,-2.2001)|\",2.974204385964999\r\n\"[H]O/C(=N/C([H])([H])[H])N([H])C1=C([H])C([H])=C2NC(=O)NC2=C1[H] 
|(3.6166,-1.3044,-4.3287;2.7957,-1.4527,-3.824;3.2001,-2.024,-2.6701;4.4351,-2.2506,-2.4621;4.8677,-2.865,-1.221;5.9576,-2.9444,-1.2256;4.4696,-3.8851,-1.0921;4.5901,-2.2745,-0.3319;2.1544,-2.3167,-1.798;2.4717,-2.7559,-0.9451;0.7816,-2.1156,-1.8786;0.0718,-2.6078,-0.6752;0.6786,-3.0566,0.109;-1.2646,-2.5243,-0.5143;-1.7664,-2.8903,0.3751;-2.0287,-1.9209,-1.5857;-3.2964,-1.7271,-1.675;-3.4293,-1.0769,-3.0018;-4.4912,-0.7366,-3.4495;-2.1601,-0.9144,-3.6567;-1.3082,-1.4188,-2.8139;0.1097,-1.5342,-2.9237;0.6148,-1.1712,-3.8054)|\",2.6585523193850005\r\n\"[H]C1=C([H])C([H])=C(/C([H])=C(\\[H])C(=O)C2=C([H])N(C([H])([H])[H])N=C2C([H])([H])[H])C([H])=C1[H] |(-2.2666,7.609,-0.5295;-1.5872,6.7878,-0.3178;-1.6933,5.5875,-1.0284;-2.4567,5.4751,-1.7935;-0.8259,4.534,-0.7607;-0.9239,3.6087,-1.3208;0.1721,4.6534,0.2263;1.1124,3.5819,0.554;1.8249,3.7976,1.3496;1.2109,2.3592,-0.0022;0.5462,2.0465,-0.8027;2.2419,1.3993,0.4761;3.0106,1.6946,1.3936;2.3183,0.0856,-0.177;1.5601,-0.4253,-1.2279;0.7522,-0.0006,-1.8066;2.01,-1.6717,-1.4723;1.5402,-2.621,-2.4623;0.7342,-2.1625,-3.0388;1.168,-3.5236,-1.9689;2.3588,-2.8955,-3.1335;3.0284,-2.0275,-0.6415;3.2261,-0.9745,0.1462;4.286,-1.0035,1.2015;3.8556,-0.8407,2.1948;5.014,-0.2009,1.0477;4.7979,-1.9694,1.1804;0.2635,5.8679,0.9313;1.0283,5.976,1.6964;-0.6048,6.9244,0.6637;-0.5145,7.8529,1.2208)|\",4.394638685575\r\n\"[H]OC(=N/C([H])([H])C(=O)OC([H])([H])C([H])([H])[H])/C([H])=C(\\[H])C1=C([H])N(C([H])([H])[H])N=C1[H] 
|(4.9202,0.6261,4.1382;4.4342,0.2889,3.3694;5.254,0.3778,2.275;4.6708,0.3595,1.1424;5.4467,0.3048,-0.0783;6.4595,-0.1158,0.0452;4.9246,-0.3278,-0.8015;5.6134,1.6615,-0.762;5.4726,1.8501,-1.9501;5.988,2.6175,0.1186;6.1747,3.9503,-0.4156;5.4117,4.1316,-1.1761;6.0008,4.6104,0.4378;7.5728,4.1312,-0.9879;7.7062,5.1651,-1.3265;7.7252,3.4673,-1.8432;8.3359,3.9183,-0.2312;6.7022,0.4637,2.5566;7.2992,0.9943,1.8216;7.2824,-0.0759,3.65;6.6565,-0.6354,4.3457;8.6858,-0.0274,4.0072;9.7526,0.6149,3.3789;9.7959,1.2233,2.4868;10.8603,0.3554,4.1097;12.2194,0.8038,3.8668;12.2365,1.4053,2.956;12.8794,-0.0595,3.7471;12.5693,1.4063,4.7094;10.6026,-0.4206,5.1907;9.296,-0.6506,5.1296;8.8207,-1.259,5.8891)|\",4.639541151025\r\n\"[H]O/C(=N/C([H])([H])C(=O)OC([H])([H])C([H])([H])[H])[C@]1([H])C(=O)N=C([H])C([H])=C1[H] |(4.658,1.0668,3.908;4.2426,0.815,3.05;5.2159,0.7232,2.1301;4.9693,0.0543,1.0755;5.9523,-0.1282,0.022;5.4282,-0.4021,-0.8963;6.5673,0.7578,-0.2007;6.9389,-1.2408,0.3809;7.7053,-1.184,1.3257;6.8586,-2.277,-0.465;7.7416,-3.4021,-0.2103;7.8364,-3.5335,0.8698;7.212,-4.2572,-0.6364;9.0968,-3.1957,-0.8683;9.717,-4.088,-0.725;9.6177,-2.3427,-0.4239;8.9867,-3.0242,-1.9441;6.5161,1.5072,2.4256;7.3374,0.8186,2.1486;6.7846,1.874,3.8823;6.1017,1.4758,4.8148;7.8927,2.7,4.1357;8.1651,3.5932,3.2421;8.998,4.2657,3.4588;7.4526,3.753,1.9766;7.6196,4.6417,1.3765;6.66,2.7478,1.5681;6.1416,2.7829,0.6145)|\",4.035448402915\r\n\"[H]O/C(=N/C([H])([H])C(=O)OC([H])([H])C([H])([H])[H])[C@@]1([H])N=NC(=O)C([H])=C1[H] 
|(4.883,3.7044,1.1643;4.5684,2.7859,1.037;5.6533,1.9917,0.9356;5.4978,0.828,0.4606;6.5926,-0.096,0.2757;6.5029,-0.5436,-0.7178;7.6005,0.3502,0.3453;6.5476,-1.2176,1.3164;6.5532,-1.0332,2.5178;6.5245,-2.4246,0.7336;6.5102,-3.5827,1.6116;5.8766,-3.3598,2.4729;6.0433,-4.3645,1.0086;7.918,-3.9688,2.038;7.885,-4.89,2.6305;8.366,-3.1829,2.6525;8.5548,-4.1446,1.1649;6.9663,2.6335,1.4479;7.724,2.499,0.6591;6.7527,4.1105,1.5035;7.3631,4.8312,2.3115;8.3352,4.2363,3.2597;9.1728,4.9748,3.7255;8.1374,2.8258,3.5913;8.5559,2.4624,4.5246;7.4764,2.0404,2.7285;7.2992,0.9818,2.9087)|\",3.572854857065\r\n\"[H]O/C(=N/C([H])([H])C(=O)OC([H])([H])C([H])([H])[H])[C@@]1([H])C([H])=NC(=O)C([H])=C1[H] |(4.3126,-3.9191,-0.9089;4.1249,-2.9699,-0.818;5.247,-2.3129,-0.4272;5.1937,-1.0515,-0.3858;6.3145,-0.2342,0.0138;5.9408,0.6659,0.507;7.0084,-0.7223,0.7207;7.1603,0.2467,-1.1675;7.5506,1.3833,-1.3018;7.4613,-0.7715,-2.0049;8.2628,-0.434,-3.169;9.0028,0.314,-2.8758;8.7703,-1.3674,-3.4253;7.3919,0.0646,-4.3113;8.0108,0.2396,-5.1986;6.9051,1.0057,-4.0414;6.6216,-0.6703,-4.5666;6.4298,-3.2477,-0.0712;7.2919,-2.6055,0.1482;6.0946,-4.0318,1.1809;5.8829,-3.43,2.0701;6.0212,-5.302,1.3086;6.2911,-6.1285,0.1684;6.2015,-7.3359,0.2689;6.6783,-5.479,-1.1076;6.9059,-6.1505,-1.9299;6.7636,-4.1462,-1.2247;7.0592,-3.6678,-2.1551)|\",4.699406198135\r\n\"[H]O/C(=N/C([H])([H])C1=C([H])N=C(C([H])([H])[H])C([H])=C1[H])N([H])C1([H])C([H])([H])N([H])N([H])C1([H])[H] 
|(8.3085,-3.0697,-0.6787;8.8381,-2.3745,-1.1085;8.0367,-1.2756,-1.0859;6.8758,-1.3576,-0.5546;6.0295,-0.1821,-0.5508;6.5214,0.6683,-0.0394;5.8291,0.1675,-1.5801;4.6999,-0.4351,0.1375;3.7447,0.5853,0.2253;3.9591,1.5659,-0.2044;2.5523,0.4627,0.8106;2.2354,-0.7252,1.3588;0.8737,-0.8258,2.0013;0.7436,-1.7759,2.5287;0.0834,-0.7417,1.2453;0.724,-0.0054,2.7113;3.1205,-1.8071,1.3238;2.8386,-2.7548,1.775;4.363,-1.6611,0.7072;5.0748,-2.4781,0.6611;8.6265,-0.1963,-1.6923;8.1696,0.6885,-1.5257;10.0409,-0.148,-2.086;10.6625,-0.5681,-1.2886;10.4401,1.309,-2.4308;11.4234,1.5566,-2.0167;9.7248,2.0252,-2.0037;10.4462,1.4115,-3.9162;11.406,1.5093,-4.2391;9.9815,0.159,-4.4598;8.9689,0.2501,-4.5446;10.2772,-0.8528,-3.4391;9.6586,-1.7432,-3.5679;11.3328,-1.1445,-3.5231)|\",5.880380309305\r\n\"[H]OC1=NC2=C(C([H])=C(/C(=N/C(=O)C([H])([H])C([H])([H])[H])C([H])([H])[H])C([H])=C2[H])N1[H] |(-1.556,-1.4935,-9.9878;-0.7859,-1.435,-9.3956;-1.2753,-1.2656,-8.1609;-2.5393,-1.1952,-7.8337;-2.5177,-1.0172,-6.4555;-1.1814,-0.9865,-5.9739;-0.8631,-0.823,-4.6391;0.1569,-0.8021,-4.2716;-1.9275,-0.6789,-3.7292;-1.6181,-0.491,-2.2881;-0.3825,-0.4826,-1.9311;0.0767,-0.2161,-0.6398;-0.1481,0.8275,-0.0462;0.9966,-1.3002,-0.0912;0.4668,-2.2613,-0.1552;1.8433,-1.3943,-0.7846;1.4763,-1.0197,1.3312;2.1456,-1.8154,1.676;0.6336,-0.9535,2.0265;2.0131,-0.068,1.3798;-2.7731,-0.3464,-1.317;-3.4237,-1.2277,-1.3562;-3.3849,0.5233,-1.5815;-2.4175,-0.2035,-0.2973;-3.2609,-0.7093,-4.2007;-4.0781,-0.5953,-3.4974;-3.5695,-0.8766,-5.5478;-4.5984,-0.895,-5.8922;-0.3995,-1.1512,-7.1173;0.6067,-1.1836,-7.1846)|\",4.508926502785\r\n\"[H]C1=NC(=O)[C@@]([H])(C(=O)/N=C2\\C([H])=C(C([H])([H])[H])N([H])N2[H])C([H])=C1[H] 
|(6.7212,6.5082,-2.6476;5.9674,5.7327,-2.4831;5.6986,4.9729,-3.492;4.7008,3.9816,-3.3145;4.1824,3.4615,-4.2805;4.2733,3.5818,-1.898;3.1959,3.3781,-1.9313;4.9294,2.2367,-1.4566;5.8561,1.7532,-2.0921;4.3322,1.7468,-0.3134;4.7771,0.6091,0.1711;5.8277,-0.3201,-0.1713;6.4572,-0.2295,-1.0398;5.886,-1.2577,0.8166;6.7425,-2.4752,0.9429;7.4922,-2.4894,0.1488;6.1457,-3.3927,0.8585;7.257,-2.4987,1.9101;4.9757,-0.9469,1.8185;4.4259,-1.7131,2.2025;4.1785,0.0927,1.3118;3.8089,0.7425,1.997;4.5618,4.6226,-0.8589;4.1364,4.4768,0.1302;5.375,5.6518,-1.1493;5.6375,6.4081,-0.4153)|\",4.285793145375\r\n\"[H]O/C(=N/[C@@]1([H])N([H])N([H])[C@]([H])(C([H])([H])[H])C1([H])[H])C1=C([H])C(F)=C([H])C(F)=C1[H] |(7.287,-0.0631,0.4037;6.4148,0.0699,0.808;5.4856,0.201,-0.1875;4.2687,0.1166,0.1684;3.186,0.3044,-0.7727;3.4789,0.9206,-1.6334;2.7621,-1.0451,-1.2852;3.1637,-1.7281,-0.641;1.3382,-1.1701,-1.1386;0.9314,-0.7861,-1.9938;0.9595,-0.3037,-0.0023;1.1861,-0.8719,0.9103;-0.5237,0.0442,-0.0221;-0.7895,0.6638,0.8411;-1.1379,-0.8621,0.0056;-0.7832,0.608,-0.9281;1.9273,0.8964,-0.0912;2.1763,1.3277,0.8821;1.4912,1.6859,-0.7167;6.0723,0.4595,-1.5441;7.1483,1.3572,-1.6564;7.5253,1.9034,-0.7982;7.7079,1.5814,-2.9076;8.7293,2.4506,-3.0172;7.2543,0.9397,-4.0539;7.7081,1.1225,-5.0207;6.1911,0.0554,-3.9069;5.7359,-0.5797,-5.001;5.5877,-0.2035,-2.6813;4.7546,-0.8977,-2.6193)|\",4.94158752508\r\n\"[H]O/C(=N/C([H])([H])C([H])([H])C([H])([H])C([H])([H])OC([H])([H])[H])[C@@]1([H])C([H])=NC(=O)C([H])=C1[H] 
|(7.0387,-6.2884,2.0057;7.1809,-5.4814,2.5269;6.8707,-4.3861,1.7785;6.8819,-3.2631,2.3534;6.5655,-2.0254,1.6554;6.9764,-1.2057,2.2561;7.0632,-1.9625,0.6733;5.0545,-1.8058,1.4559;4.907,-0.7994,1.0401;4.6791,-2.5073,0.7017;4.2147,-1.9704,2.7291;4.376,-1.1214,3.4066;4.5285,-2.8688,3.2713;2.7244,-2.0805,2.4328;2.3652,-1.1933,1.8822;2.1496,-2.1373,3.3725;2.4947,-3.2527,1.6587;1.1323,-3.4291,1.3281;1.0559,-4.3443,0.7353;0.745,-2.5849,0.7358;0.5074,-3.5339,2.2291;6.5317,-4.7105,0.3008;6.4318,-3.7462,-0.2153;7.6784,-5.4566,-0.345;8.6403,-4.9343,-0.3428;7.6521,-6.6203,-0.876;6.407,-7.3353,-0.9082;6.3702,-8.4446,-1.404;5.2024,-6.695,-0.3326;4.2864,-7.2748,-0.3952;5.2396,-5.4724,0.2163;4.3513,-5.0037,0.6368)|\",4.636820012519999\r\n\"[H]O/C(=N/C([H])([H])C1=NN([H])C([H])=C1[H])[C@@]1([H])N=NC(=O)C([H])=C1[H] |(2.9754,1.3021,1.6951;3.0162,0.3549,1.9418;1.9887,-0.2726,1.3271;1.6719,-1.4308,1.7213;0.5701,-2.1873,1.1456;-0.0444,-2.5382,1.9842;-0.0958,-1.6088,0.4876;1.0698,-3.3829,0.3758;1.0339,-3.3804,-0.9595;1.5815,-4.5628,-1.3153;1.6404,-4.8003,-2.2939;1.9568,-5.315,-0.2481;2.4025,-6.2921,-0.3696;1.6428,-4.5772,0.8744;1.8088,-4.8474,1.9067;1.3315,0.5498,0.1779;0.2578,0.6205,0.4181;1.8276,1.9483,0.3008;2.0017,2.6881,-0.684;1.7105,2.2018,-2.0503;1.5728,3.0398,-2.9138;1.6737,0.7546,-2.2436;1.7625,0.3827,-3.2595;1.4889,-0.0503,-1.1865;1.4094,-1.1304,-1.2979)|\",3.572854857065\r\n\"[H]O/C(=N/C([H])([H])[C@@]1([H])N([H])N([H])C([H])([H])C1([H])[H])C([H])([H])C([H])([H])C1=C([H])C([H])=C([H])C([H])=C1[H] 
|(4.6351,-2.834,-1.4363;5.0087,-2.0078,-1.7782;4.3063,-0.9481,-1.2624;4.7511,0.2009,-1.5516;4.1354,1.4346,-1.1175;3.7296,1.9361,-2.0092;3.2988,1.3129,-0.4101;5.1877,2.3637,-0.4732;5.4509,1.946,0.5077;6.4219,2.4815,-1.2943;6.3586,1.835,-2.0781;6.5285,3.8248,-1.8282;7.2009,4.2993,-1.2248;5.2177,4.4571,-1.6263;5.3238,5.5452,-1.5642;4.5731,4.2291,-2.4858;4.6641,3.8094,-0.3451;3.572,3.855,-0.263;5.0877,4.2971,0.5406;3.0689,-1.3483,-0.4776;3.3452,-2.1101,0.2656;2.6815,-0.4981,0.0881;1.9474,-1.9081,-1.392;2.3259,-2.7779,-1.9436;1.6998,-1.1509,-2.1455;0.7078,-2.2927,-0.6132;-0.2927,-1.3464,-0.354;-0.1886,-0.3378,-0.7488;-1.4214,-1.6833,0.3938;-2.1885,-0.9359,0.5786;-1.5678,-2.9778,0.8957;-2.4476,-3.2433,1.4755;-0.5802,-3.9309,0.6421;-0.6887,-4.9432,1.0227;0.5467,-3.589,-0.1069;1.3073,-4.3417,-0.3056)|\",5.0966924198650005\r\n\"[H]/C1=C([H])/C([H])=C(/[H])C2=C1S/C(=N\\C([H])([H])[C@@]1([H])N([H])N([H])C([H])([H])C1([H])[H])N2[H] |(0.6218,1.4116,-1.18;-0.1658,2.057,-1.5573;0.1435,3.2715,-2.1783;1.1819,3.5707,-2.2828;-0.8751,4.0976,-2.6616;-0.6247,5.0396,-3.141;-2.2176,3.7331,-2.5378;-3.0077,4.3768,-2.9143;-2.5254,2.5212,-1.9184;-1.5007,1.6893,-1.4326;-2.1653,0.2165,-0.6986;-3.8653,0.7763,-1.0378;-4.9613,0.2096,-0.7489;-4.9698,-1.0634,-0.0551;-4.85,-1.8785,-0.7878;-4.1338,-1.1514,0.6609;-6.2941,-1.2626,0.7037;-6.3085,-0.5684,1.5533;-7.4821,-1.0003,-0.1549;-7.165,-0.6395,-1.0525;-8.1898,-2.2409,-0.4047;-8.9853,-2.226,0.2343;-7.292,-3.3249,0.0181;-7.8701,-4.2107,0.3012;-6.6412,-3.6015,-0.8221;-6.4668,-2.7238,1.169;-5.5078,-3.2281,1.334;-7.0335,-2.7668,2.1064;-3.7863,1.9878,-1.7058;-4.6477,2.4578,-1.9481)|\",5.066759896310001\r\n\"[H]O/C(=N/C([H])([H])[C@@]1([H])N([H])N([H])C([H])([H])C1([H])[H])C([H])([H])OC1=C([H])C([H])=C([H])C([H])=C1[H] 
|(5.3028,-2.4444,-1.0939;5.7452,-1.576,-1.0766;4.8427,-0.6697,-0.6399;5.1976,0.5413,-0.5366;4.2684,1.5612,-0.0938;3.2549,1.1969,0.147;4.6687,2.0084,0.8291;4.1351,2.6715,-1.1497;3.8138,2.2046,-2.0891;3.0992,3.6662,-0.7328;2.9021,3.5419,0.2604;3.6522,4.9936,-0.8446;3.4363,5.3004,-1.7935;5.112,4.8366,-0.7187;5.6275,5.6884,-1.1745;5.3666,4.8244,0.3495;5.4376,3.4831,-1.3751;6.3081,2.9786,-0.9462;5.6274,3.6129,-2.4465;3.4589,-1.247,-0.31;2.6931,-0.765,-0.9314;3.2138,-1.057,0.743;3.4936,-2.6455,-0.5634;2.3616,-3.3982,-0.3453;2.4825,-4.7663,-0.6163;3.434,-5.1494,-0.9723;1.3901,-5.6058,-0.4256;1.4907,-6.6666,-0.6382;0.1727,-5.0936,0.0349;-0.6781,-5.7516,0.1835;0.0629,-3.7309,0.301;-0.8769,-3.3188,0.6583;1.1515,-2.8725,0.1145;1.0421,-1.8151,0.3258)|\",5.662689228905\r\n\"[H]C1=C(/C([H])=C(\\[H])C(=O)C2=C([H])C([H])=C(C([H])([H])[H])O2)N=NS1 |(6.5819,6.4135,0.8723;5.8503,6.3496,0.0782;5.1726,5.2174,-0.3272;5.3085,3.8861,0.234;6.0111,3.7669,1.0559;4.6367,2.7966,-0.1804;3.9225,2.8531,-0.9948;4.8714,1.4912,0.487;5.6652,1.3556,1.4196;4.1154,0.3375,-0.0017;4.1061,-0.9727,0.4051;4.7024,-1.377,1.2101;3.1785,-1.6515,-0.4306;2.9106,-2.6987,-0.4008;2.6741,-0.7178,-1.3006;1.6824,-0.7849,-2.4078;1.3016,-1.8047,-2.5068;2.1323,-0.4894,-3.3634;0.834,-0.1151,-2.2231;3.2348,0.4952,-1.0505;4.3012,5.473,-1.3743;4.2557,6.6635,-1.7912;5.3751,7.6845,-0.8598)|\",4.016400433380001\r\n\"[H]OC(=N/C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])/C(C#N)=C(/[H])C1=C([H])SN=N1 
|(5.8416,0.868,-4.4159;6.7584,1.1944,-4.3232;6.9505,1.3184,-2.9752;5.9975,1.0581,-2.1859;6.1147,1.1837,-0.7382;6.2795,0.1793,-0.324;6.9738,1.7987,-0.4324;4.8248,1.7686,-0.151;4.6596,2.7675,-0.5782;3.9794,1.1485,-0.4765;4.8568,1.8567,1.3793;5.0305,0.8551,1.7974;5.7137,2.4696,1.6928;3.5699,2.4415,1.9703;3.6199,2.4898,3.0638;2.6981,1.8321,1.7028;3.3897,3.4578,1.5993;8.3567,1.7286,-2.6403;9.0837,0.7862,-1.8433;9.6536,0.004,-1.1958;8.971,2.8887,-2.9826;9.9769,3.0459,-2.602;8.4837,4.0187,-3.7532;7.3651,4.1705,-4.551;6.6357,3.4083,-4.7813;7.3256,5.7353,-5.1998;8.8404,6.1563,-4.3746;9.2607,5.1693,-3.7111)|\",4.416407793615001\r\n\"[H]C1=C(/C([H])=C(\\[H])C(=O)C2=C([H])C([H])=C(F)C([H])=C2[H])N=NS1 |(4.6493,5.4238,0.61;5.2762,4.7576,0.0329;4.9494,3.4875,-0.3987;3.6985,2.7918,-0.1614;2.9385,3.3203,0.4095;3.411,1.5495,-0.5929;4.1479,0.9983,-1.1647;2.0876,0.9483,-0.2801;1.2512,1.5828,0.3609;1.7786,-0.4391,-0.7512;2.6715,-1.2379,-1.4849;3.658,-0.8745,-1.7495;2.3145,-2.5225,-1.8898;2.9939,-3.1504,-2.4563;1.0531,-3.0012,-1.5546;0.7028,-4.2409,-1.9434;0.1406,-2.2391,-0.8287;-0.8331,-2.6532,-0.5886;0.5122,-0.9613,-0.4319;-0.1687,-0.3358,0.1351;5.9755,2.8964,-1.1186;7.024,3.5822,-1.2697;6.8458,5.1633,-0.4756)|\",4.394638685575\r\n\"[H]C1=C([H])C([H])=C(C(=O)/C([H])=C(\\[H])C2=C([H])SN=N2)S1 |(-1.5055,-3.8994,-1.3995;-0.5395,-3.4324,-1.2569;0.6852,-3.9044,-1.6618;0.8245,-4.8399,-2.1921;1.7436,-3.0285,-1.3051;2.7836,-3.2269,-1.5375;1.3117,-1.9009,-0.6339;2.0567,-0.7543,-0.0917;1.4661,0.1633,0.4793;3.5326,-0.7478,-0.2578;4.0238,-1.5709,-0.7653;4.2549,0.28,0.2243;3.721,1.0835,0.7265;5.6976,0.403,0.1269;6.4671,1.4415,0.6121;6.1122,2.3185,1.1365;8.1077,1.1613,0.2704;7.7074,-0.3712,-0.5302;6.4606,-0.5703,-0.4981;-0.4252,-1.9177,-0.438)|\",4.25586062182\r\n\"[H]C1=C(/C([H])=C(\\[H])C2=NC3=C(C([H])=C2[H])C([H])=C([H])C([H])=C3[H])N=NS1 
|(-4.5606,-5.0647,-0.0671;-5.2998,-4.2759,-0.0972;-5.0544,-2.9174,-0.1107;-3.7543,-2.2719,-0.0872;-2.8827,-2.92,-0.0558;-3.5644,-0.9382,-0.1015;-4.4272,-0.2776,-0.1327;-2.243,-0.3018,-0.077;-1.163,-1.0755,-0.0384;0.0618,-0.4891,-0.0154;0.2404,0.9338,-0.0309;-0.9317,1.7302,-0.0721;-0.8432,2.8142,-0.0851;-2.1619,1.1232,-0.095;-3.0757,1.7096,-0.1265;1.553,1.4709,-0.0049;1.6793,2.5511,-0.0169;2.648,0.6368,0.035;3.6508,1.0542,0.0547;2.4744,-0.7706,0.0504;3.3485,-1.4154,0.0819;1.2139,-1.3222,0.0259;1.0584,-2.3965,0.0372;-6.2252,-2.1754,-0.1509;-7.3067,-2.8276,-0.1687;-6.975,-4.571,-0.1351)|\",3.89122806215\r\n\"[H]/N=C(/O[H])C([H])([H])C([H])([H])C([H])([H])/N=C(/O[H])[C@@]1([H])C([H])=NC(=O)C([H])=C1[H] |(4.5787,-1.0522,2.7594;4.8106,-1.8231,2.132;3.7779,-2.1783,1.4715;3.8765,-3.2039,0.5892;4.7954,-3.5259,0.6532;2.3871,-1.5912,1.5081;1.6981,-2.3487,1.9026;2.3899,-0.7477,2.2034;1.8781,-1.116,0.1324;0.9279,-0.5927,0.2882;1.6766,-1.9782,-0.513;2.857,-0.184,-0.599;2.36,0.2089,-1.4963;3.7145,-0.7824,-0.9492;3.231,0.9289,0.2478;4.4128,1.3555,0.3721;4.6151,2.4021,1.2186;5.568,2.579,1.2952;5.7011,0.8813,-0.3497;5.4461,-0.0446,-0.8812;6.0784,1.9201,-1.3853;5.3268,2.1055,-2.1587;7.1563,2.6055,-1.4434;8.162,2.3965,-0.4429;9.1688,3.0774,-0.4553;7.9451,1.3402,0.5752;8.766,1.1745,1.2666;6.8173,0.6145,0.6156;6.6666,-0.186,1.3379)|\",4.726617583185\r\n\"[H]OC(=N/OC([H])([H])[H])/C([H])=C(\\[H])C1=C([H])C(C#N)=C([H])C([H])=C1[H] 
|(2.7752,-0.7259,1.5645;2.339,-0.7519,0.6978;3.3046,-0.6506,-0.2663;3.0532,-1.3124,-1.3441;4.0045,-1.0479,-2.3375;3.7412,-1.8974,-3.4493;4.4829,-1.6241,-4.204;2.7307,-1.7289,-3.8374;3.8558,-2.9536,-3.1783;4.4776,0.1811,-0.0165;5.297,0.0335,-0.7109;4.5472,1.0833,0.9842;3.6621,1.2454,1.5986;5.6935,1.9305,1.3252;5.5316,2.9186,2.3082;4.57,3.0414,2.7978;6.5944,3.7597,2.6667;6.4012,4.7623,3.6751;6.246,5.5738,4.4938;7.8442,3.6205,2.0434;8.6637,4.2734,2.3247;8.0129,2.6364,1.0699;8.978,2.519,0.5862;6.9575,1.8015,0.716;7.1172,1.0361,-0.0368)|\",3.86945895411\r\n\"[H]/N=C(/O[H])C([H])([H])N1C(=O)N=C2C([H])=C([H])C([H])=C([H])[C@]2([H])C1=O |(0.187,-4.4727,-4.6894;0.1391,-5.0499,-3.8491;0.8147,-4.5189,-2.9161;0.848,-5.1243,-1.7045;1.3609,-4.5748,-1.0793;1.6354,-3.2305,-3.0151;1.7183,-2.9208,-4.0542;2.6347,-3.3903,-2.5988;1.0269,-2.1003,-2.2816;0.2744,-1.1389,-3.0452;-0.0064,-1.3928,-4.1967;-0.0201,0.0874,-2.4525;0.1308,0.2431,-1.1743;-0.0294,1.5548,-0.5863;-0.3864,2.3472,-1.2357;0.3528,1.785,0.6963;0.2726,2.7891,1.1046;0.9222,0.7447,1.5379;1.29,1.0153,2.5231;1.005,-0.5242,1.1012;1.4304,-1.325,1.6962;0.4326,-0.9049,-0.2339;-0.5607,-1.348,-0.0112;1.1727,-2.0456,-0.9212;1.788,-2.8932,-0.28)|\",3.844968707564999\r\n\"[H]O/C(=N/C([H])([H])[C@@]1([H])C([H])([H])OC([H])([H])C1([H])[H])[C@@]1([H])C([H])=NC(=O)C([H])=C1[H] 
|(3.2393,0.6004,2.6973;3.2093,-0.3714,2.6825;2.0538,-0.7807,2.095;1.8923,-2.0227,1.9301;0.688,-2.6075,1.388;-0.0874,-1.871,1.1224;0.952,-3.1391,0.4639;0.0678,-3.6318,2.363;0.7522,-4.4783,2.4773;-0.2823,-3.0091,3.7179;0.4731,-2.3229,4.1066;-0.4637,-3.793,4.4716;-1.472,-2.2625,3.4621;-2.2871,-3.0605,2.5922;-3.0529,-3.5879,3.1778;-2.7967,-2.3788,1.9039;-1.3356,-4.0602,1.8763;-1.4294,-4.0174,0.7857;-1.5537,-5.0886,2.1815;1.078,0.3797,1.738;0.3923,-0.0114,0.9756;0.2467,0.7338,2.9588;-0.4252,-0.0537,3.3125;0.2494,1.8432,3.598;1.0976,2.9002,3.1463;1.2068,3.9172,3.8037;1.8206,2.733,1.8587;2.3545,3.6064,1.4963;1.8084,1.5694,1.1924;2.3435,1.448,0.252)|\",4.628656597005\r\n\"[H]O/C(=N/C([H])([H])C1=C([H])C([H])=C([H])N=C1[H])[C@@]1([H])N([H])N([H])C([H])([H])C1([H])[H] |(-5.0571,-4.1472,-0.1193;-4.5387,-3.5893,-0.7429;-3.2495,-3.7724,-0.4053;-2.36,-3.097,-1.0113;-0.9569,-3.2165,-0.6702;-0.5927,-2.2016,-0.4478;-0.7431,-3.811,0.2331;-0.1289,-3.774,-1.8186;-0.5704,-3.7326,-3.144;-1.5489,-3.3178,-3.3643;0.2568,-4.2341,-4.1478;-0.0553,-4.2183,-5.1884;1.4986,-4.7616,-3.7932;2.1684,-5.1617,-4.553;1.9408,-4.8178,-2.53;1.1332,-4.3324,-1.5808;1.514,-4.3935,-0.5598;-3.0412,-4.7991,0.7313;-2.6054,-4.2773,1.5886;-4.3581,-5.3454,1.1488;-4.2993,-5.5378,2.1576;-4.4324,-6.5916,0.4153;-5.2113,-7.1335,0.7835;-3.1215,-7.2256,0.5958;-2.967,-7.616,1.6191;-2.9978,-8.0492,-0.1133;-2.1799,-6.0409,0.3187;-1.9164,-5.9919,-0.7402;-1.2532,-6.1083,0.8951)|\",6.005552680535\r\n\"[H]O/C(=N/C1=NC2=C(C([H])=C([H])C([H])=C2[H])N1[H])[C@@]1([H])N([H])N([H])C([H])([H])C1([H])[H] 
|(-5.5484,-2.5048,-0.3834;-5.1782,-1.5793,-0.2948;-3.8673,-1.68,-0.0281;-3.199,-0.5817,0.0037;-1.8415,-0.5414,0.1962;-0.9174,-1.4798,0.0494;0.2857,-0.8483,0.3221;0.072,0.5192,0.6371;1.1166,1.3887,0.9498;0.9402,2.4337,1.1895;2.405,0.8558,0.9406;3.2474,1.4998,1.1777;2.6368,-0.4979,0.6302;3.6561,-0.8745,0.6346;1.5889,-1.3604,0.3188;1.7641,-2.4051,0.0795;-1.296,0.6805,0.5423;-1.8401,1.5207,0.6702;-3.3476,-3.0981,0.2333;-2.4316,-3.038,0.8182;-4.334,-3.959,0.9233;-4.8598,-3.3995,1.5995;-5.2854,-4.2424,-0.1798;-5.8958,-4.9893,0.1532;-4.3774,-4.756,-1.2345;-4.1251,-5.8093,-1.0617;-4.8721,-4.6643,-2.2049;-3.1153,-3.8692,-1.0967;-2.9849,-3.1851,-1.9378;-2.2114,-4.4765,-1.0234)|\",4.70756961365\r\n\"[H]O/C(=N/C([H])([H])C([H])([H])C1=C([H])C([H])=C([H])C([H])=C1[H])[C@@]1([H])N([H])N([H])C([H])([H])C1([H])[H] |(7.1769,-2.8477,0.4159;6.7853,-1.9742,0.1581;5.4354,-2.108,0.0964;4.7438,-1.0707,-0.144;3.2929,-1.1441,-0.2345;2.9681,-1.7822,-1.0731;2.837,-1.5714,0.673;2.7234,0.2717,-0.4486;3.1958,0.6921,-1.3441;3.0404,0.896,0.3948;1.2172,0.2854,-0.5822;0.6036,0.0713,-1.8248;1.2247,-0.0694,-2.7073;-0.7861,0.0462,-1.9468;-1.2402,-0.115,-2.9214;-1.5917,0.2363,-0.8222;-2.6744,0.2213,-0.9155;-0.9959,0.4518,0.4216;-1.6139,0.608,1.3024;0.3949,0.4752,0.5371;0.8519,0.6504,1.5091;4.9188,-3.5356,0.3289;3.9135,-3.6285,-0.0854;5.767,-4.5823,-0.2951;6.168,-4.2032,-1.1567;6.8992,-4.6394,0.6613;7.4334,-5.4737,0.4163;6.2022,-4.8449,1.9487;5.8954,-5.8904,2.0817;6.8729,-4.5665,2.7657;4.9583,-3.9306,1.8345;5.0339,-3.0473,2.4739;4.0526,-4.4722,2.1177)|\",6.087186835685\r\n\"[H]C1=C([H])C([H])=C(N(C(=O)[C@@]2([H])N([H])N([H])C([H])([H])C2([H])[H])C([H])([H])[H])C([H])=C1[H] 
|(5.3515,2.5606,4.6409;4.7889,2.2066,3.7816;5.3284,1.2337,2.9372;6.3171,0.8287,3.1357;4.6172,0.7757,1.8304;5.0415,0.0204,1.1826;3.3401,1.2871,1.5596;2.6005,0.8638,0.4046;2.5071,-0.4786,0.0823;3.0929,-1.3491,0.7208;1.6364,-0.8847,-1.1302;1.9042,-0.277,-2.0018;1.8289,-2.3056,-1.4804;2.7705,-2.6072,-1.2347;0.9175,-3.0716,-0.6511;1.2188,-2.9543,0.3249;-0.3164,-2.3051,-0.8223;-0.7557,-2.5708,-1.7887;-1.0287,-2.5543,-0.0306;0.1092,-0.8082,-0.8174;-0.072,-0.3503,0.161;-0.4314,-0.2185,-1.5639;1.8365,1.8953,-0.303;2.3972,2.8321,-0.2678;0.8463,2.0643,0.1387;1.6984,1.6221,-1.3486;2.7979,2.2611,2.4081;1.8054,2.6578,2.2188;3.5215,2.7193,3.51;3.085,3.4748,4.1577)|\",5.238191622125001\r\n\"[H]C1=NC([H])=C([H])C(=O)/C1=N/C(=O)C1=C([H])C([H])=C([H])C([H])=C1[H] |(4.4245,0.5787,1.7028;4.531,1.6606,1.6419;5.4733,2.2138,2.3214;5.632,3.6095,2.2413;6.4605,3.9871,2.8345;4.8549,4.4474,1.5216;5.0192,5.5202,1.514;3.7305,3.9268,0.7376;2.9573,4.631,0.1085;3.5854,2.4056,0.7734;2.6616,1.8696,0.0685;2.485,0.475,-0.0684;3.3595,-0.2296,-0.5505;1.1367,-0.0207,0.3176;0.9232,-1.4074,0.3262;1.7393,-2.0618,0.0377;-0.3162,-1.918,0.6992;-0.4804,-2.9918,0.7068;-1.3497,-1.0482,1.0618;-2.318,-1.4483,1.3503;-1.1431,0.3332,1.0486;-1.9502,1.0076,1.3204;0.0968,0.8502,0.677;0.2591,1.9227,0.6443)|\",3.0558385411149995\r\n\"[H]C1=NC([H])=C([H])N=C1C(=O)/N=C1\\C(=O)C([H])=C([H])N=C1[H] |(-2.9632,-3.3726,2.3627;-3.0431,-3.5125,1.2888;-3.9788,-4.3511,0.8332;-4.0404,-4.4972,-0.4952;-4.7977,-5.1759,-0.8823;-3.1787,-3.8163,-1.3608;-3.2444,-3.9513,-2.4384;-2.2455,-2.977,-0.9078;-2.1821,-2.8284,0.4215;-1.1442,-1.91,0.9953;-0.9984,-1.7349,2.1895;-0.4702,-1.0934,0.0645;0.6211,-1.394,-0.5244;1.3193,-2.739,-0.3776;0.8373,-3.6181,0.3242;2.5587,-2.8629,-1.1394;3.1105,-3.7946,-1.0686;2.9875,-1.8342,-1.9073;3.909,-1.9182,-2.4773;2.3409,-0.5951,-2.0655;1.2453,-0.3987,-1.4191;0.731,0.5566,-1.5258)|\",2.9088970618450003\r\n\"[H]O/C(=N/C([H])([H])C1=NN=C([H])N1C([H])([H])[H])[C@@]1([H])C([H])=NC(=O)C([H])=C1[H] 
|(4.1488,6.4321,0.2911;4.2204,5.4648,0.3675;2.9798,4.9214,0.4106;2.9027,3.6559,0.4357;1.5973,2.9958,0.4872;1.7577,1.9907,0.893;0.8718,3.488,1.1465;0.972,2.9206,-0.8731;-0.0381,3.6637,-1.2751;-0.2703,3.3727,-2.6053;0.6126,2.4687,-2.9533;0.7073,2.0221,-3.9338;1.4315,2.1455,-1.905;2.5472,1.2069,-1.8837;2.8813,1.0297,-2.9079;2.2485,0.2529,-1.4369;3.3685,1.6401,-1.3082;1.8241,5.9564,0.4559;0.905,5.4309,0.1592;1.6723,6.4463,1.8822;1.4166,5.6791,2.6193;1.8185,7.6398,2.3165;2.1613,8.675,1.3879;2.3817,9.7989,1.7939;2.2282,8.3422,-0.0571;2.403,9.1777,-0.7282;2.0602,7.0876,-0.5;2.0895,6.8495,-1.5612)|\",4.721175306175001\r\n\"[H]O/C(=N/C([H])([H])C1=C([H])C([H])=NN1C([H])([H])[H])[C@@]1([H])C([H])=NC(=O)C([H])=C1[H] |(4.2475,-6.7417,1.1053;4.3363,-5.9152,0.6007;3.1068,-5.4074,0.3341;3.0521,-4.2791,-0.2339;1.7776,-3.6478,-0.5768;1.9822,-2.9667,-1.4119;1.0243,-4.3584,-0.945;1.1687,-2.8922,0.5738;-0.0704,-3.0359,1.1818;-0.8481,-3.7382,0.914;-0.0875,-2.0591,2.1987;-0.8713,-1.8379,2.9116;1.0467,-1.3617,2.2305;1.8032,-1.8819,1.2364;3.1019,-1.2924,0.9489;3.0026,-0.4523,0.2518;3.7556,-2.0546,0.5212;3.5189,-0.9254,1.8877;1.931,-6.3295,0.752;1.0257,-5.7094,0.7342;1.7797,-7.4304,-0.2789;1.5652,-7.094,-1.2978;1.8877,-8.6907,-0.0945;2.1718,-9.1738,1.2241;2.3341,-10.3639,1.4063;2.2513,-8.1979,2.3395;2.4003,-8.6206,3.3285;2.126,-6.8787,2.134;2.1706,-6.1672,2.956)|\",4.206880128730001\r\n\"[H]OC1=NC(=O)N(C([H])([H])C(=O)N(OC([H])([H])[H])C([H])([H])[H])C([H])([H])C1([H])[H] 
|(7.1638,4.9927,-0.0447;6.7754,4.5653,0.7346;6.3288,3.3338,0.4007;5.7623,2.6514,1.3155;5.3273,1.3323,1.0439;4.999,0.5916,1.9548;5.3271,0.901,-0.2829;4.6784,-0.361,-0.5629;5.1567,-0.838,-1.427;4.8183,-1.0145,0.2992;3.1736,-0.1915,-0.8244;2.6094,0.8915,-0.7806;2.493,-1.342,-1.181;3.148,-2.5501,-0.8504;3.4178,-3.305,-2.0334;2.4935,-3.5618,-2.5651;3.9026,-4.2195,-1.6832;4.088,-2.7622,-2.7109;1.0463,-1.425,-1.0373;0.6415,-0.4331,-1.238;0.7717,-1.737,-0.0226;0.6437,-2.1397,-1.7605;5.4599,1.8648,-1.3683;5.717,1.3245,-2.286;4.5114,2.3892,-1.5432;6.5645,2.8587,-1.0158;6.5637,3.6965,-1.7239;7.5516,2.3785,-1.0662)|\",5.6817371984400005\r\n\"[H]O/C(=N/[C@@]1([H])C([H])([H])C([H])([H])C([H])([H])[C@@]([H])(C([H])([H])[H])C1([H])[H])[C@]1([H])C(=O)N=C([H])C([H])=C1[H] |(5.2429,5.2903,1.5888;4.9458,4.5026,2.0726;4.7944,3.4622,1.2017;4.3604,2.3737,1.6592;4.1819,1.1551,0.8954;4.3352,1.3103,-0.1882;5.2135,0.1073,1.3548;6.2207,0.4834,1.1465;5.1248,-0.006,2.4437;4.9842,-1.2431,0.6622;5.1877,-1.1399,-0.4137;5.7028,-1.9801,1.0418;3.5478,-1.7486,0.8622;3.3837,-1.9674,1.9286;3.398,-2.6938,0.3234;2.5052,-0.7139,0.402;2.6457,-0.5618,-0.6809;1.0716,-1.2098,0.626;0.8921,-2.1564,0.1019;0.3353,-0.4813,0.2652;0.8775,-1.378,1.6931;2.7469,0.635,1.1025;2.5712,0.531,2.1824;2.0304,1.3854,0.7414;5.2208,3.781,-0.2632;4.7321,3.0305,-0.8927;6.7371,3.546,-0.4281;7.1755,2.418,-0.5091;7.6163,4.6569,-0.4655;7.1077,5.8207,-0.7004;7.8102,6.6562,-0.7526;5.6877,6.1225,-0.8871;5.4008,7.1315,-1.1674;4.7799,5.1514,-0.6793;3.7124,5.3382,-0.7774)|\",3.864016677100001\r\n\"[H]C1=C([H])C([H])=C([C@@]2([H])N([H])N([H])C([H])([H])[C@@]2([H])C([H])([H])N([H])C([H])([H])C2([H])C([H])([H])C2([H])[H])C([H])=C1[H] 
|(-2.6258,0.6945,-2.3432;-1.6284,0.6111,-1.9197;-0.54,0.306,-2.7391;-0.6863,0.154,-3.8054;0.7397,0.1934,-2.1951;1.5734,-0.0472,-2.8503;0.9609,0.3871,-0.8222;2.3446,0.2756,-0.201;2.2067,0.2531,0.8873;3.0618,-0.9705,-0.613;2.5244,-1.4473,-1.3322;4.3362,-0.6383,-1.2047;5.0187,-0.7295,-0.4521;4.261,0.7765,-1.5928;5.2601,1.218,-1.6397;3.8161,0.8587,-2.5939;3.3495,1.4328,-0.5424;3.9526,1.5984,0.364;2.7358,2.7722,-0.9476;1.9924,3.0918,-0.1908;2.1811,2.6555,-1.8868;3.7836,3.768,-1.1675;4.3319,3.8743,-0.3137;3.2675,5.0814,-1.5487;2.7434,4.9708,-2.5084;2.5182,5.4735,-0.8304;4.3912,6.0833,-1.6886;5.1469,5.7962,-2.4173;4.8918,6.836,-0.4747;4.3911,6.6639,0.4759;5.9577,7.0271,-0.3856;4.1169,7.5591,-1.5499;4.6507,8.246,-2.2006;3.1004,7.8687,-1.319;-0.1433,0.6862,-0.0128;0.0046,0.8227,1.0563;-1.4255,0.8006,-0.5524;-2.2662,1.0298,0.0974)|\",5.57833393525\r\n\"[H]C1=NC(=O)C([H])=C([H])[C@@]1([H])C(=O)N(C([H])([H])[H])[C@@]([H])(C([H])([H])[H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] |(5.9205,-2.5281,2.4127;5.7125,-2.1203,3.4049;6.6468,-2.195,4.2718;6.4224,-1.6432,5.5728;7.2705,-1.7703,6.4352;5.1525,-0.9258,5.8349;5.0654,-0.4585,6.8109;4.165,-0.8902,4.9329;3.2292,-0.3789,5.1494;4.3208,-1.5517,3.5923;3.6207,-2.3997,3.5245;4.0365,-0.5568,2.4369;4.9836,0.0202,1.9127;2.7284,-0.3513,2.0777;1.6399,-1.1693,2.6151;0.6871,-0.7425,2.3085;1.6469,-1.1843,3.7083;1.6738,-2.205,2.2502;2.4678,0.5617,0.9309;3.461,0.7315,0.5098;1.6337,-0.149,-0.1471;1.6387,0.433,-1.0723;0.5873,-0.3015,0.1372;2.0694,-1.1279,-0.374;1.943,1.9755,1.3639;1.8576,2.867,0.1051;1.6504,3.9038,0.3937;1.0583,2.5526,-0.5745;2.8019,2.8621,-0.4528;0.5564,1.9412,2.0374;0.2133,2.9642,2.232;0.5809,1.4248,3.0035;-0.203,1.4617,1.4081;2.9511,2.6157,2.3394;2.6246,3.6266,2.6104;3.95,2.6845,1.897;3.0383,2.0367,3.2647)|\",4.79464604581\r\n\"[H]C([H])([H])C1=C(C([H])([H])[H])[C@@]2([H])C(NC(C3([H])C([H])([H])C3([H])[H])=NC2=O)S1 
|(2.7148,1.8404,2.6459;2.9599,0.8683,3.0796;3.9087,0.9687,3.6216;2.1866,0.6243,3.8189;3.0486,-0.1879,2.02;2.8922,-0.0937,0.6863;2.6136,1.1571,-0.0972;2.4726,2.0281,0.5471;1.7231,1.0181,-0.7173;3.4442,1.378,-0.7819;3.1314,-1.3935,-0.0383;4.0935,-1.3216,-0.5831;3.3011,-2.496,0.9747;3.2813,-3.7527,0.6929;2.9356,-4.0448,-0.6372;3.1801,-5.4451,-1.0257;3.6634,-6.0372,-0.2566;3.4901,-5.7724,-2.483;3.5269,-4.9263,-3.162;4.2303,-6.548,-2.6579;2.1642,-6.1512,-1.9244;1.9615,-7.1944,-1.6996;1.3056,-5.5555,-2.2167;2.3622,-3.2338,-1.4868;2.1643,-1.9028,-1.1317;1.3644,-1.1793,-1.6902;3.4149,-1.8571,2.5987)|\",3.834084153545001\r\n\"[H]C1=C([H])C([H])=C([H])C(N([H])C([H])([H])C([H])([H])C([H])([H])[C@]2([H])C([H])([H])N([H])N([H])[C@]2([H])C([H])([H])[H])=N1 |(-0.9179,8.6818,-0.6603;0.1659,8.5846,-0.6044;0.9426,9.6462,-0.1557;0.4856,10.5832,0.145;2.3298,9.4534,-0.1035;2.9859,10.2484,0.2421;2.8655,8.2395,-0.4974;3.9382,8.064,-0.4746;1.9853,7.2237,-0.9414;2.4863,6.0162,-1.3832;3.4338,5.8107,-1.0946;1.6333,4.8597,-1.6198;1.266,4.4285,-0.6728;0.7506,5.2103,-2.1606;2.3741,3.7945,-2.4316;2.7091,4.2425,-3.3755;3.2844,3.4843,-1.8941;1.5084,2.5544,-2.6948;0.5793,2.8638,-3.1973;1.2084,2.1293,-1.7282;2.2172,1.4792,-3.538;3.1764,1.2481,-3.0558;2.4393,1.9317,-5.0142;3.4903,1.8927,-5.3214;2.1028,2.9669,-5.1522;1.6228,1.0401,-5.882;2.2265,0.3408,-6.3098;0.7244,0.3038,-5.0378;-0.1028,0.8922,-4.9184;1.4069,0.1594,-3.7382;2.1387,-0.6509,-3.8739;0.4405,-0.2615,-2.6357;0.9642,-0.3986,-1.6832;-0.0368,-1.2106,-2.9003;-0.3503,0.4811,-2.4786;0.6547,7.3964,-0.985)|\",5.238191622125001\r\n\"[H]OC1=NC([H])([H])C([H])([H])N(C(=O)/C([H])=C(\\[H])C2=C([H])C([H])=C([H])N=C2[H])C1([H])[H] 
|(-2.4852,-8.6914,-0.5264;-1.528,-8.5971,-0.6544;-1.1849,-7.2788,-0.5923;0.0267,-6.9754,-0.7955;0.396,-5.571,-0.6975;1.4077,-5.5115,-0.2773;0.4498,-5.1468,-1.7101;-0.5533,-4.7342,0.1649;-0.294,-3.6789,0.0935;-0.4478,-5.0287,1.221;-1.9313,-4.9538,-0.2677;-2.9455,-4.0235,-0.389;-4.1119,-4.3991,-0.5241;-2.5885,-2.5793,-0.3909;-1.5472,-2.2778,-0.3842;-3.5723,-1.6622,-0.4216;-4.5907,-2.0469,-0.433;-3.4327,-0.2074,-0.4451;-2.2057,0.479,-0.4482;-1.2667,-0.0671,-0.4329;-2.2024,1.8676,-0.4718;-1.2702,2.4247,-0.4754;-3.4271,2.5421,-0.4915;-3.4577,3.6301,-0.5102;-4.611,1.9185,-0.4892;-4.5968,0.5839,-0.4669;-5.5713,0.096,-0.4661;-2.3425,-6.3506,-0.2738;-2.7871,-6.6399,0.6942;-3.1347,-6.4677,-1.0207)|\",4.5633492728850005\r\n\"[H]C1=C(C(=O)N([H])[C@@]2([H])N([H])N([H])[C@]([H])(C([H])([H])[H])C2([H])[H])C([H])=C(C([H])([H])[H])C(C([H])([H])[H])=C1[H] |(4.3439,-1.434,-0.3098;3.9529,-0.4515,-0.552;4.8005,0.6541,-0.4379;6.2117,0.4059,0.013;6.5253,-0.6256,0.6031;7.1143,1.4084,-0.2501;6.8877,2.0924,-0.9608;8.5348,1.2182,-0.0288;8.6447,0.259,0.4772;9.2381,1.1887,-1.3236;10.1629,0.7751,-1.1367;9.3865,2.609,-1.5735;9.9834,2.7333,-2.3891;9.9402,3.2135,-0.3345;9.6517,4.2709,-0.3196;11.4672,3.1122,-0.2071;11.8107,3.529,0.7469;11.9686,3.6645,-1.0114;11.8063,2.0687,-0.2479;9.1776,2.4076,0.7522;8.4086,3.0042,1.249;9.8624,2.039,1.5231;4.2889,1.9267,-0.7252;4.9163,2.8057,-0.5976;2.9666,2.1145,-1.1368;2.4537,3.5057,-1.4265;3.2352,4.2553,-1.2692;1.6049,3.7695,-0.7821;2.1037,3.6019,-2.4625;2.1239,0.9898,-1.2692;0.6901,1.1439,-1.7167;0.1846,0.1747,-1.76;0.6223,1.6015,-2.7124;0.1195,1.7887,-1.0355;2.637,-0.2781,-0.9694;1.9876,-1.1453,-1.0655)|\",5.314383500265\r\n\"[H]O/C(=N/[C@@]1([H])N([H])N([H])[C@]([H])(C([H])([H])[H])C1([H])[H])C([H])([H])OC1=C([H])C([H])=C([H])C([H])=C1[H] 
|(7.3232,0.9283,0.215;6.6805,0.1966,0.1557;5.5875,0.6731,-0.4727;4.6133,-0.1144,-0.6886;3.4489,0.3924,-1.39;3.1727,1.4026,-1.0494;3.7336,0.4754,-2.8654;4.5031,-0.1738,-3.0376;2.6093,-0.0698,-3.5728;1.9466,0.6995,-3.6881;2.0076,-1.0887,-2.6823;2.6085,-2,-2.8079;0.5624,-1.3885,-3.0613;0.1469,-2.175,-2.4225;0.4926,-1.7206,-4.1026;-0.0673,-0.4965,-2.9419;2.2221,-0.5444,-1.2528;2.4197,-1.3294,-0.5182;1.3403,0.0201,-0.9243;5.6558,2.1523,-0.8657;4.8545,2.7077,-0.3587;5.5057,2.2491,-1.9467;6.9286,2.6522,-0.4731;7.2505,3.9628,-0.7438;6.3967,4.8559,-1.3962;5.4131,4.5486,-1.732;6.8271,6.1676,-1.6216;6.1615,6.8593,-2.1306;8.0875,6.5889,-1.2045;8.4126,7.609,-1.3855;8.9309,5.6842,-0.5514;9.9171,5.9982,-0.2206;8.5191,4.376,-0.3196;9.1624,3.6624,0.1861)|\",5.831399816215001\r\n\"[H]O/C(=N/[C@@]1([H])N([H])N([H])[C@]([H])(C([H])([H])[H])C1([H])[H])C([H])([H])C([H])([H])C1=C([H])C([H])=C([H])C([H])=C1[H] |(7.7761,2.4137,0.1773;7.2344,1.6429,0.4053;5.9459,1.8661,0.0034;5.1086,0.9576,0.3031;3.7359,1.0499,-0.1439;3.3738,2.0874,-0.1795;3.6295,0.4767,-1.5328;4.474,-0.0818,-1.6654;2.5439,-0.4662,-1.5574;1.705,0.0768,-1.7719;2.4226,-0.9965,-0.1811;3.1993,-1.7667,-0.0773;1.0568,-1.6253,0.0659;0.999,-2.0481,1.0745;0.8608,-2.4274,-0.6536;0.2577,-0.8776,-0.0273;2.7821,0.1967,0.7284;3.2697,-0.1001,1.6608;1.877,0.7659,0.9767;5.7461,3.1386,-0.8001;4.6998,3.4504,-0.7597;6.3324,3.9528,-0.3514;6.1566,2.9555,-2.2865;5.5437,2.151,-2.7064;7.2027,2.6271,-2.3367;5.9679,4.2242,-3.0893;4.74,4.4929,-3.7094;3.9373,3.7622,-3.6366;4.5426,5.6775,-4.4199;3.5853,5.8664,-4.8987;5.5734,6.6137,-4.5228;5.422,7.5346,-5.0794;6.8023,6.3559,-3.9129;7.613,7.0753,-3.9938;6.9952,5.1699,-3.2031;7.9588,4.9734,-2.7362)|\",5.99194698801\r\n\"[H]O/C(=N/[C@@]1([H])N([H])N([H])[C@]([H])(C([H])([H])[H])C1([H])[H])C1=C([H])C(C([H])([H])[H])=C([H])C(C([H])([H])[H])=C1[H] 
|(3.7063,-2.8743,-0.4351;4.6615,-3.0235,-0.5183;5.3022,-1.8401,-0.2638;6.5625,-1.9153,-0.1048;7.3655,-0.7279,0.092;6.8705,0.179,-0.2791;7.6188,-0.5565,1.5661;7.3571,-1.4439,1.9979;9.0358,-0.4275,1.7834;9.2405,0.5709,1.7055;9.6943,-1.1288,0.6597;9.6701,-2.1996,0.9049;11.142,-0.6862,0.4853;11.6219,-1.2362,-0.3312;11.7168,-0.8616,1.4009;11.1994,0.3838,0.2442;8.7686,-0.8756,-0.5461;8.7805,-1.6837,-1.2826;9.0549,0.0563,-1.0514;4.3983,-0.6454,-0.2483;3.4135,-0.5204,-1.2407;3.3536,-1.2642,-2.033;2.5384,0.5708,-1.2541;1.5033,0.7142,-2.3464;0.8457,1.5692,-2.1624;0.8751,-0.1813,-2.4243;1.9748,0.8619,-3.326;2.6557,1.5266,-0.239;1.9775,2.378,-0.2359;3.6212,1.4226,0.7702;3.7045,2.4528,1.8729;3.2688,3.4083,1.563;4.7422,2.6312,2.1735;3.1616,2.1198,2.7673;4.4946,0.3301,0.7548;5.258,0.2365,1.5233)|\",5.2844509767100005\r\n\"[H]C1=C([H])[C@@]([H])(C(=O)N2C([H])([H])C([H])([H])C([H])(OC([H])([H])[H])C([H])([H])C2([H])[H])N=NC1=O |(4.1716,7.6071,-4.9822;4.2168,6.6208,-4.5311;4.3312,6.4243,-3.2118;4.3707,7.2506,-2.5083;4.4016,5.0423,-2.6603;5.4128,4.8539,-2.2654;3.3934,4.8829,-1.4818;2.5891,5.7884,-1.2835;3.4623,3.7574,-0.7184;2.4979,3.5891,0.3757;3.0515,3.5818,1.3269;1.8417,4.4586,0.3734;1.708,2.2832,0.2152;1.0859,2.3363,-0.6866;1.0369,2.1694,1.0748;2.6539,1.0784,0.0886;3.2027,0.9579,1.0421;1.9754,-0.1315,-0.2144;1.2477,-0.6913,0.8606;0.882,-1.6635,0.5202;0.3838,-0.0772,1.1544;1.8856,-0.8416,1.7467;3.6607,1.3152,-1.0379;4.3804,0.4894,-1.0681;3.1298,1.3271,-1.9965;4.4067,2.6441,-0.862;5.0543,2.8078,-1.7195;5.034,2.6091,0.0422;4.1872,3.9244,-3.6235;4.0479,4.1155,-4.8458;4.1137,5.4748,-5.4315;4.0711,5.5433,-6.6408)|\",3.59734510361\r\n\"[H]OC(=N/OC([H])([H])[H])/C([H])=C(\\[H])C1=C([H])C([H])=C([C@]2([H])C([H])([H])[C@@]2([H])C([H])([H])[H])O1 
|(10.2204,1.9933,-0.2118;10.4078,1.5119,-1.0331;9.4701,0.5187,-1.155;9.9088,-0.5519,-1.7265;8.8652,-1.4654,-1.9528;9.4284,-2.6693,-2.4547;8.5788,-3.3264,-2.6578;10.0931,-3.1381,-1.7184;9.9884,-2.4847,-3.3789;8.1207,0.758,-0.6675;7.4883,-0.1156,-0.5666;7.6483,1.9908,-0.3632;8.269,2.8681,-0.537;6.345,2.2876,0.1509;5.7264,3.4729,0.4722;6.1627,4.4584,0.3807;4.4204,3.1482,0.9365;3.6547,3.8322,1.2747;4.3116,1.7824,0.8755;3.2253,0.8557,1.2066;2.3452,1.3601,1.5978;3.5141,-0.5327,1.7623;4.5572,-0.8111,1.8794;2.8543,-0.898,2.5456;2.9647,-0.3932,0.3725;3.6831,-0.556,-0.4287;1.5439,-0.801,0.0485;1.485,-1.874,-0.1715;1.1619,-0.2579,-0.8242;0.8712,-0.5979,0.8904;5.4748,1.2485,0.4006)|\",3.741565444375\r\n\"[H]O/C(=N/C([H])(C1([H])C([H])([H])C1([H])[H])C1([H])C([H])([H])C1([H])[H])[C@@]1([H])C([H])=NC(=O)C([H])=C1[H] |(-5.1228,-3.5021,-1.2621;-4.3099,-3.464,-0.7312;-3.4839,-2.496,-1.223;-2.4365,-2.2393,-0.5725;-1.4298,-1.2606,-0.9438;-1.6662,-0.7271,-1.8847;-1.3451,-0.223,0.1686;-1.0806,-0.654,1.132;-2.3041,0.9401,0.2051;-3.0178,1.0421,-0.6104;-2.6896,1.2635,1.1679;-0.845,1.1714,-0.1102;-0.2209,1.6542,0.6367;-0.5837,1.4302,-1.1332;-0.1058,-1.9882,-1.1426;0.2053,-2.5458,-0.2616;0.991,-1.3864,-1.9833;0.7978,-0.44,-2.4819;2.0186,-1.5046,-1.6509;0.2448,-2.6051,-2.4733;0.7556,-3.5637,-2.4817;-0.446,-2.4686,-3.3035;-3.9804,-1.8865,-2.5607;-3.307,-1.0518,-2.799;-3.8363,-2.9192,-3.6586;-2.8238,-3.3077,-3.8031;-4.7552,-3.3927,-4.4118;-6.0955,-2.9132,-4.2508;-6.9808,-3.3543,-4.9571;-6.3646,-1.8807,-3.2201;-7.3932,-1.54,-3.1501;-5.3884,-1.3835,-2.4469;-5.5898,-0.615,-1.7032)|\",4.672194813085\r\n\"[H]O/C(=N/C([H])([H])C([H])([H])N1C([H])=NC([H])=C1[H])[C@@]1([H])C([H])=NC(=O)C([H])=C1[H] 
|(3.3061,0.9214,1.0867;3.0326,0.3247,1.8039;1.9382,-0.3824,1.4209;1.4955,-1.2501,2.2245;0.3318,-2.0664,1.9748;-0.5453,-1.6155,2.4592;0.0769,-2.1905,0.9086;0.5616,-3.4569,2.5945;0.7685,-3.3451,3.6621;1.44,-3.9262,2.1404;-0.5856,-4.3351,2.4307;-1.6059,-4.5256,3.3246;-1.595,-4.0645,4.3043;-2.5442,-5.3201,2.8572;-2.1218,-5.662,1.5919;-2.7029,-6.3258,0.9657;-0.9199,-5.0652,1.304;-0.274,-5.1009,0.4387;1.3631,0.0364,0.0432;0.6714,-0.7589,-0.2676;0.5463,1.3013,0.2273;-0.2883,1.2214,0.9301;0.7315,2.436,-0.3307;1.8149,2.5835,-1.2556;2.0298,3.6653,-1.7643;2.6484,1.3966,-1.5732;3.4235,1.5486,-2.3182;2.4376,0.2083,-0.9889;3.0445,-0.6602,-1.2375)|\",3.605508519125\r\n\"[H]C1=C(C([H])([H])C([H])(C([H])([H])[H])C([H])([H])[H])[C@@]2([H])C(N/C(C([H])([H])C([H])([H])[H])=N\\C2=O)S1 |(3.7369,-3.1394,-6.0975;3.6194,-2.8608,-5.0584;3.1757,-3.638,-4.0582;2.7193,-5.0724,-4.1448;3.2038,-5.6422,-3.3362;1.6523,-5.0837,-3.8867;2.9528,-5.8029,-5.4806;2.595,-5.154,-6.2938;4.4389,-6.1134,-5.7218;4.5899,-6.5675,-6.708;4.8089,-6.8236,-4.971;5.0646,-5.2167,-5.6652;2.1204,-7.0934,-5.5233;2.2702,-7.6275,-6.4687;1.0497,-6.8829,-5.4199;2.4075,-7.7716,-4.7094;3.2228,-2.9315,-2.7265;4.0426,-3.3698,-2.1241;3.5894,-1.4828,-2.9497;3.4868,-0.5653,-2.0539;2.8803,-0.9828,-0.8549;3.0244,-0.0072,0.2793;2.7419,-0.5125,1.2066;4.0796,0.2876,0.3441;2.1612,1.2507,0.0643;2.3057,1.954,0.891;1.0992,0.9875,0.0204;2.4331,1.7525,-0.8691;2.1526,-2.0491,-0.6884;2.0292,-2.9551,-1.7498;1.1063,-3.7359,-1.8443;4.0632,-1.1902,-4.6139)|\",3.8640166771\r\n\"[H]OC(=N/C([H])([H])C([H])([H])[H])/C([H])=C([H])/C1=C([H])/C([H])=C(/[H])C2=C1N=C([H])C([H])=C2[H] 
|(3.9119,0.0056,1.7577;3.7955,-0.9232,1.5052;4.9208,-1.3398,0.8432;5.0819,-2.5983,0.7412;6.205,-3.1401,-0.0129;6.1841,-2.7938,-1.0581;7.1672,-2.8013,0.4078;6.1626,-4.6675,0.0155;7.0061,-5.0938,-0.5404;5.2308,-5.0316,-0.4294;6.2044,-5.0323,1.0472;5.8034,-0.2862,0.3026;6.8212,-0.5987,0.0888;5.4048,0.9763,0.0463;4.3766,1.2679,0.2425;6.2238,2.0582,-0.5049;7.499,1.8581,-1.0178;7.9188,0.8568,-1.0308;8.2678,2.9159,-1.5438;9.2602,2.7091,-1.9339;7.7665,4.1988,-1.57;8.3537,5.0186,-1.9764;6.4681,4.4631,-1.0676;5.6832,3.3932,-0.5288;4.4306,3.6008,-0.0321;3.9381,4.8228,-0.0572;2.9334,4.9462,0.3459;4.6316,5.9473,-0.5678;4.1629,6.9265,-0.5546;5.8962,5.7613,-1.0712;6.4695,6.593,-1.4741)|\",3.888506923645001\r\n\"[H]OC(=N/C([H])([H])[H])/C1=C(\\[H])C(=O)[C@@]2([H])C([H])=C([H])C([H])=C([H])\\C2=N\\1 |(0.3361,4.5041,-0.7482;1.1215,5.0679,-0.9166;2.043,4.7107,0.0139;3.0728,5.4469,0.1194;4.146,5.2482,1.0655;4.7407,6.1665,1.1075;4.8272,4.4425,0.7516;3.8073,5.0252,2.0883;1.6422,3.4505,0.7534;2.4874,2.6876,1.5139;3.5121,2.9682,1.7033;2.0127,1.4906,2.1943;2.6676,0.8463,3.0004;0.6259,1.015,1.7334;0.9213,0.4117,0.8428;-0.0742,0.0604,2.6407;0.5535,-0.5781,3.2537;-1.421,-0.0143,2.6458;-1.9347,-0.7323,3.2783;-2.2107,0.8805,1.8248;-3.2918,0.7679,1.8317;-1.6501,1.8853,1.0942;-2.2541,2.5973,0.5407;-0.2277,2.0952,1.1181;0.289,3.1687,0.5821)|\",3.3633271921800003\r\n\"[H]/N=C(O[H])\\C([H])=C(/[H])C1=C([H])C([H])=C(/N=C(/O[H])C([H])([H])[H])C([H])=C1[H] 
|(-0.393,1.1605,9.2508;-1.0838,1.8971,9.0938;-1.0756,2.2347,7.8635;-1.8938,3.2675,7.4993;-2.0393,3.2219,6.5419;-0.2422,1.6342,6.8018;0.1518,0.6472,7.0324;0.088,2.2526,5.6492;-0.2703,3.2715,5.4955;0.9196,1.7409,4.5612;1.2518,2.5971,3.4935;0.8867,3.6218,3.504;2.045,2.1722,2.4362;2.3073,2.8515,1.6311;2.5212,0.8507,2.3894;3.3679,0.465,1.3486;3.0929,-0.4623,0.5272;4.06,-0.767,-0.3812;3.734,-1.4564,-0.9802;1.8139,-1.2599,0.3946;1.0683,-0.9399,1.1232;2.0081,-2.3297,0.5414;1.3974,-1.1323,-0.6127;2.2029,-0.0121,3.4574;2.5974,-1.0243,3.4539;1.4192,0.4255,4.5167;1.1889,-0.2698,5.3189)|\",4.19599557471\r\n\"[H]OC(=N/C([H])([H])C([H])([H])C([H])([H])OC([H])([H])[H])/C([H])=C(\\[H])C1=C([H])SC(C([H])([H])[H])=N1 |(9.4414,4.3577,2.682;8.7947,3.6352,2.6473;9.201,2.7322,1.6927;8.3392,1.8735,1.3352;8.6404,0.7863,0.4225;9.7116,0.6861,0.1939;8.3279,-0.1435,0.9192;7.8551,0.9204,-0.8918;8.0927,0.0635,-1.5348;8.1744,1.8222,-1.4302;6.3461,0.9797,-0.6894;6.0734,1.8608,-0.0896;5.9999,0.0889,-0.1345;5.731,1.0318,-1.9668;4.3266,1.1185,-1.8887;3.9454,1.155,-2.9133;4.0001,2.0258,-1.354;3.889,0.2447,-1.3773;10.6237,2.8957,1.2975;11.3256,3.0145,2.1247;11.0647,2.9379,0.0274;10.3522,2.8733,-0.7911;12.4495,3.1016,-0.4098;13.5863,3.0805,0.3623;13.6733,2.9266,1.4288;14.9877,3.321,-0.618;13.9375,3.4331,-2.0375;14.4911,3.6594,-3.4111;13.6593,3.692,-4.1183;15.1743,2.8552,-3.7073;15.0422,4.6048,-3.4723;12.6759,3.2985,-1.765)|\",4.54974358036\r\n\"[H]OC(=N/C([H])([H])C([H])(C([H])([H])[H])C([H])([H])[H])/C([H])=C(\\[H])C1=C([H])SC(C([H])([H])[H])=N1 
|(7.6396,-0.2404,1.2139;6.808,-0.362,0.7298;5.9005,-0.9755,1.5627;4.706,-1.0297,1.1391;3.6465,-1.7331,1.8351;4.0266,-2.5432,2.4795;3.1196,-1.0249,2.4947;2.6196,-2.3053,0.8349;1.8632,-2.8284,1.4403;1.92,-1.1838,0.055;1.1851,-1.5905,-0.6499;2.6528,-0.5971,-0.5078;1.3913,-0.5015,0.7329;3.2665,-3.328,-0.1078;2.5239,-3.753,-0.793;3.7224,-4.1563,0.45;4.0525,-2.855,-0.7068;6.5067,-1.502,2.8075;7.4993,-1.9434,2.7212;5.9386,-1.4429,4.026;4.9608,-0.9843,4.148;6.5642,-1.9329,5.2463;5.9937,-1.8687,6.4938;5.028,-1.4637,6.7641;7.0515,-2.5508,7.68;8.2199,-2.8852,6.3901;9.5314,-3.5514,6.6742;10.1171,-3.574,5.7526;9.3879,-4.5805,7.0237;10.0981,-3.0147,7.443;7.825,-2.5111,5.213)|\",4.644983428035001\r\n\"[H]OC(=N/C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])/C([H])=C(\\[H])C1=C([H])SC(C([H])([H])[H])=N1 |(9.258,0.8681,0.1216;9.0757,1.7906,0.3601;7.7358,2.0319,0.1364;7.4167,3.2536,0.0197;6.0563,3.7964,-0.1208;6.2192,5.3238,0.025;5.2537,5.8365,-0.0609;6.8928,5.7091,-0.7473;6.6569,5.5652,0.9993;5.4931,3.4982,-1.5272;4.5496,4.0353,-1.6787;5.2968,2.4327,-1.6839;6.2015,3.8285,-2.2944;5.0923,3.304,0.9802;4.1414,3.846,0.9169;5.5228,3.4898,1.9704;4.8731,2.2354,0.9019;6.8924,0.8134,0.1038;5.9887,0.835,-0.4933;7.1767,-0.3044,0.8019;8.0515,-0.3345,1.4493;6.355,-1.5056,0.8039;6.6146,-2.6259,1.5555;7.4351,-2.7921,2.2402;5.4065,-3.8321,1.2827;4.6125,-2.7272,0.148;3.3556,-3.1122,-0.5704;3.0926,-2.3112,-1.2651;2.5237,-3.2585,0.1282;3.4841,-4.0422,-1.1352;5.2168,-1.5877,0.0148)|\",4.50620536428\r\n\"[H]OC(=N/[C@]([H])(C([H])([H])[H])C([H])([H])C([H])([H])[H])/C([H])=C(\\[H])C1=C([H])SC(C([H])([H])[H])=N1 
|(7.3859,-0.8733,-1.5628;6.6703,-0.2187,-1.571;5.6103,-0.7543,-2.2659;4.4696,-0.2492,-2.0188;3.2761,-0.6077,-2.7709;3.3972,-1.5389,-3.3502;2.9689,0.523,-3.7669;2.0461,0.3101,-4.3191;3.7852,0.638,-4.4882;2.848,1.4734,-3.236;2.1116,-0.8151,-1.7867;1.9692,0.1143,-1.2204;1.1954,-0.9801,-2.3703;2.3286,-1.9825,-0.8209;1.4788,-2.0959,-0.138;3.2284,-1.8244,-0.217;2.4473,-2.9301,-1.3625;5.9501,-1.8276,-3.2224;5.1718,-2.5345,-3.4884;7.1625,-1.9662,-3.7959;7.9503,-1.2426,-3.5936;7.5091,-3.0246,-4.733;8.7263,-3.1504,-5.357;9.5922,-2.5089,-5.2642;8.7376,-4.5361,-6.3923;7.067,-4.8658,-5.9046;6.3134,-6.0443,-6.4409;5.2952,-6.0154,-6.0467;6.7805,-6.9889,-6.1388;6.2703,-6.0297,-7.5356;6.5865,-4.0088,-5.0589)|\",4.5170899183\r\n\"[H]O/C(=N/C([H])([H])[H])[C@@]1([H])C(=O)N=C2C(=C1[H])C(=O)C([H])([H])C([H])([H])C2([H])[H] |(0.9662,-1.4312,0.02;1.7434,-0.9843,-0.3956;2.8146,-1.1597,0.4023;3.9691,-1.0336,-0.0822;5.2162,-1.1764,0.6151;5.696,-0.1966,0.7373;5.8905,-1.7889,0.0054;5.1504,-1.6478,1.6084;2.3467,-1.425,1.8971;1.7132,-0.5397,2.0709;1.3959,-2.6344,1.9356;0.4054,-2.6309,1.2094;1.6507,-3.7201,2.7662;2.6072,-3.6435,3.647;3.4826,-2.4811,3.8265;3.3656,-1.4361,2.9713;3.9869,-0.5593,3.1318;4.5222,-2.4282,4.9064;5.2546,-1.459,5.0193;4.6171,-3.6296,5.8318;5.6306,-3.6613,6.2418;3.9373,-3.4454,6.6793;4.2168,-4.9338,5.1284;4.924,-5.152,4.3173;4.2755,-5.7721,5.8305;2.7945,-4.8294,4.5642;2.0826,-4.708,5.3972;2.4839,-5.7307,4.0283)|\",3.627277627165\r\n\"[H]OC(=N/C([H])(C([H])([H])[H])C([H])([H])[H])/C([H])=C(\\[H])C1=C([H])SC(C([H])([H])[H])=N1 
|(5.2542,0.021,-2.2722;4.8846,-0.8357,-2.0071;4.3941,-0.7108,-0.7269;3.5059,-1.554,-0.3864;2.977,-1.6298,0.9674;3.3969,-0.8635,1.6389;1.4558,-1.4333,0.9047;1.0107,-1.5302,1.9019;1.005,-2.1821,0.2444;1.2058,-0.4422,0.5104;3.3382,-3.0046,1.5508;2.9268,-3.1158,2.5611;4.4249,-3.1305,1.6045;2.936,-3.8019,0.9166;4.9929,0.373,0.0807;4.4124,0.7716,0.9055;6.2226,0.8791,-0.1404;6.8495,0.4662,-0.9292;6.8231,1.9501,0.6411;8.0931,2.4381,0.4519;8.8285,2.1197,-0.2744;8.4306,3.7045,1.5808;6.8032,3.4911,2.2481;6.306,4.3332,3.383;5.2991,4.0024,3.6474;6.2678,5.3934,3.1075;6.9515,4.2419,4.2638;6.1125,2.5626,1.6635)|\",4.52797447232\r\n\"[H]O/C(=N/[C@@]([H])(/C(=N\\C([H])([H])C([H])([H])[H])O[H])C([H])([H])[H])[C@]1([H])C(=O)N=C([H])C([H])=C1[H] |(2.1854,-2.1541,2.1902;2.1504,-1.4348,1.5379;3.362,-1.2953,0.9422;3.4944,-0.365,0.0991;4.7126,-0.1107,-0.6633;5.3449,-0.9998,-0.7314;4.2851,0.2879,-2.0957;4.9554,0.1944,-3.1659;6.281,-0.3984,-3.201;6.9344,0.2977,-3.7445;6.7469,-0.5473,-2.2128;6.246,-1.7379,-3.9449;7.2588,-2.1375,-4.0753;5.7897,-1.6094,-4.9317;5.6553,-2.473,-3.3875;3.0579,0.8626,-2.1471;2.6383,0.6823,-1.2795;5.4903,1.0486,-0.0112;6.3795,1.2898,-0.6005;5.8047,0.7835,1.0054;4.8583,1.9401,0.0452;4.4376,-2.3307,1.3683;5.4104,-1.8714,1.1732;4.3632,-3.5742,0.4491;4.8818,-3.5578,-0.6456;3.6689,-4.7231,0.898;3.4871,-4.8544,2.1685;2.9784,-5.7636,2.4975;3.8815,-3.8887,3.2;3.7723,-4.1681,4.2436;4.33,-2.6775,2.8266;4.5971,-1.9172,3.5575)|\",2.7864458291200003\r\n\"[H]C1=NC(=O)/C(=N/C(=O)C([H])([H])[H])C([H])=C1C(F)(F)F 
|(0.6815,3.731,4.3863;1.2158,3.1032,3.6707;2.4413,2.8218,3.942;3.1682,2.0152,3.0251;4.3441,1.7983,3.2072;2.441,1.4365,1.8032;3.1,0.6288,1.0579;2.5382,-0.0355,-0.0531;1.5985,-0.7996,0.0618;3.2981,0.2072,-1.337;3.3724,1.2807,-1.5467;2.7918,-0.3017,-2.1589;4.3218,-0.1695,-1.2354;1.0548,1.8537,1.5774;0.5193,1.483,0.7107;0.47,2.6576,2.4846;-0.9499,3.1392,2.345;-0.9874,4.4897,2.3622;-1.5258,2.7223,1.2048;-1.7021,2.707,3.3785)|\",3.072165372145\r\n\"[H]OC(=N/C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])/C([H])=C(\\[H])C1=C(C([H])([H])[H])C([H])=C(C([H])([H])[H])C([H])=C1[H] |(6.5909,-0.1322,-2.6775;7.2498,-0.7969,-2.4228;8.0954,-0.2237,-1.495;9.2249,-0.789,-1.373;10.2735,-0.4547,-0.3974;10.9546,0.883,-0.76;11.8432,1.0379,-0.1369;10.2956,1.7452,-0.614;11.2699,0.873,-1.8088;9.7591,-0.4425,1.0582;10.5999,-0.347,1.7553;9.2376,-1.3795,1.2822;9.0677,0.3825,1.2538;11.3193,-1.5814,-0.5319;12.1642,-1.4214,0.1488;11.6976,-1.6257,-1.5583;10.8636,-2.5502,-0.3024;7.5225,0.9411,-0.7778;8.2314,1.6747,-0.4125;6.2002,1.0995,-0.5553;5.5375,0.3199,-0.9236;5.5426,2.1972,0.163;4.1315,2.3422,0.1093;3.2568,1.3895,-0.6766;2.2055,1.681,-0.5996;3.3374,0.3582,-0.3105;3.5164,1.3752,-1.7423;3.5343,3.4031,0.7939;2.4522,3.5103,0.7422;4.269,4.3322,1.5407;3.5751,5.4507,2.2805;4.2902,6.1987,2.6368;3.0299,5.0717,3.1549;2.8431,5.9594,1.6424;5.6586,4.1747,1.5909;6.2605,4.8741,2.1662;6.2773,3.1279,0.9181;7.3557,3.0237,0.9933)|\",4.35926388501\r\n\"[H]C1=NC(=O)[C@@]([H])(C(=O)N2C([H])([H])C([H])([H])C([H])(OC([H])([H])[H])C([H])([H])C2([H])[H])C([H])=C1[H] 
|(5.7223,-9.2342,0.6312;5.1013,-8.3749,0.3656;4.303,-7.9396,1.2836;3.4715,-6.8512,0.9539;2.4448,-6.6524,1.5752;3.8899,-5.9129,-0.199;2.9574,-5.5799,-0.6663;4.6737,-4.6688,0.327;5.8941,-4.6454,0.2032;3.963,-3.6483,0.892;4.689,-2.4538,1.3395;4.5488,-2.349,2.4256;5.7484,-2.6193,1.1479;4.172,-1.2018,0.6166;4.4008,-1.2749,-0.4541;4.6981,-0.3225,1.0069;2.6512,-1.0524,0.7835;2.4302,-0.8616,1.8504;2.1089,0.001,-0.0015;2.3634,1.3008,0.4927;1.8139,1.9949,-0.1488;3.4302,1.5669,0.4606;2.0073,1.4166,1.5291;1.9401,-2.3387,0.359;0.8674,-2.2506,0.5657;2.0564,-2.461,-0.7255;2.5124,-3.5617,1.0884;2.0277,-4.4844,0.7798;2.3234,-3.4759,2.1685;4.7376,-6.6051,-1.2281;4.9256,-6.0851,-2.1636;5.2714,-7.8071,-0.9691;5.9029,-8.3267,-1.6827)|\",4.31572566893\r\n\"[H]OC(=N/OC([H])([H])[H])/C([H])=C(\\[H])C1=C(C([H])([H])[H])C([H])=C(C([H])([H])[H])C([H])=C1[H] |(8.7569,-0.0446,0.4232;9.2488,-0.5011,-0.278;8.5952,-0.2707,-1.4595;9.3679,-0.2285,-2.4913;8.6218,-0.1352,-3.6777;9.5265,0.0407,-4.7599;8.9039,0.0673,-5.658;10.2341,-0.7943,-4.8186;10.0846,0.9803,-4.6653;7.1447,-0.1158,-1.4489;6.7184,0.2904,-2.359;6.3681,-0.4724,-0.4021;6.8478,-0.976,0.4352;4.9179,-0.2917,-0.2937;4.161,-1.0686,0.6218;4.8085,-2.1158,1.5011;4.0579,-2.6238,2.1136;5.547,-1.6793,2.1862;5.3294,-2.8797,0.9117;2.7809,-0.8719,0.6941;2.2059,-1.4772,1.3928;2.1078,0.0681,-0.0975;0.6129,0.2462,0.0208;0.2469,1.0169,-0.6646;0.3236,0.5384,1.0381;0.0803,-0.6853,-0.2082;2.8669,0.8334,-0.9893;2.3783,1.581,-1.6096;4.2433,0.6584,-1.0797;4.8127,1.2852,-1.7598)|\",4.013679294875001\r\n\"[H]OC(=N/C([H])([H])C([H])([H])C([H])([H])[H])/C([H])=C(\\[H])C1=C([H])C([H])=C([H])C([H])=C1C#N 
|(4.3203,1.2671,-2.786;4.74,1.0797,-1.9319;5.2831,2.2347,-1.4353;6.1954,2.105,-0.5594;6.7748,3.2702,0.0982;7.3276,3.8878,-0.6294;5.9947,3.9155,0.5336;7.7368,2.829,1.207;8.5123,2.1959,0.7578;8.2423,3.7195,1.6054;7.0372,2.069,2.3362;7.7551,1.7278,3.0905;6.5145,1.1938,1.9378;6.2974,2.7036,2.8414;4.7273,3.5063,-1.9455;5.3705,4.374,-1.8323;3.5039,3.6295,-2.4968;2.8781,2.7428,-2.5797;2.896,4.8649,-3.0027;3.4092,6.1399,-2.7104;4.2781,6.2279,-2.0663;2.8147,7.2943,-3.209;3.2363,8.2641,-2.9606;1.6767,7.213,-4.0175;1.2106,8.1143,-4.4036;1.138,5.9677,-4.3203;0.2546,5.8825,-4.9446;1.7364,4.8005,-3.8195;1.1614,3.5325,-4.1639;0.6985,2.5026,-4.4459)|\",4.258581760325001\r\n\"[H]C1=C([H])C([H])=C(/C([H])=C(\\[H])C(=O)N(C([H])([H])[H])C([H])([H])[H])C(C#N)=C1[H] |(-1.1775,6.6457,-3.7621;-0.6311,5.8297,-3.2991;-0.0731,5.982,-2.0261;-0.1875,6.9209,-1.4917;0.6192,4.9332,-1.4286;1.02,5.0604,-0.428;0.7875,3.7009,-2.0811;1.521,2.58,-1.478;1.3155,1.6023,-1.9083;2.4068,2.6689,-0.4714;2.6315,3.6221,0.0002;2.9891,1.5039,0.2686;2.9438,1.5366,1.4983;3.4943,0.4349,-0.4284;3.8687,0.4437,-1.8362;3.2543,-0.254,-2.4206;4.9186,0.1379,-1.9369;3.7607,1.4419,-2.257;3.9107,-0.7423,0.3221;3.5373,-1.6468,-0.1727;3.5061,-0.6765,1.331;5.0062,-0.8056,0.3802;0.2078,3.5592,-3.369;0.3439,2.3282,-4.0924;0.4571,1.329,-4.6787;-0.492,4.619,-3.9682;-0.9205,4.4786,-4.9553)|\",4.5170899183\r\n\"[H]C1=C([H])C(C#N)=C(/C([H])=C(\\[H])C(=O)N2C([H])([H])C([H])([H])C([H])([H])C([H])([H])C2([H])[H])C([H])=C1[H] 
|(-1.8822,-9.0471,3.375;-1.7758,-8.059,2.9379;-2.3854,-6.9633,3.5403;-2.9651,-7.0868,4.4496;-2.2575,-5.6774,2.9886;-2.8942,-4.6073,3.6985;-3.4236,-3.766,4.3048;-1.5053,-5.4685,1.7998;-1.3023,-4.1872,1.1176;-0.5387,-4.196,0.3416;-1.9468,-3.0197,1.2879;-2.7516,-2.9219,2.0007;-1.5663,-1.8562,0.429;-0.8199,-2.0098,-0.5411;-2.1043,-0.6278,0.7426;-1.8031,0.5103,-0.1312;-1.2885,0.1205,-1.0086;-2.7579,0.9541,-0.4527;-0.9626,1.5679,0.5969;0.027,1.1446,0.8136;-0.8086,2.4282,-0.0665;-1.6434,2.0041,1.9027;-2.571,2.5457,1.6641;-1.0058,2.7028,2.4576;-1.9816,0.7839,2.7714;-2.5521,1.0827,3.6597;-1.056,0.3102,3.125;-2.7936,-0.2567,1.9813;-3.7752,0.1659,1.7178;-2.9746,-1.1326,2.6023;-0.8897,-6.5986,1.2305;-0.3013,-6.4616,0.3276;-1.0218,-7.8711,1.7781;-0.5339,-8.7159,1.3003)|\",4.176947605175\r\n\"[H]OC(=N/C1=C([H])C(Cl)=C([H])C([H])=C1F)/C([H])=C([H])/C([H])=C(\\[H])C([H])([H])[H] |(8.1632,-0.2918,3.2049;8.0108,-0.3704,2.2502;6.7473,-0.8338,2.0459;6.2674,-0.6332,0.8761;5.0525,-1.1758,0.4592;4.0904,-0.3313,-0.123;4.2934,0.7324,-0.1671;2.905,-0.8556,-0.6275;1.7127,0.2391,-1.3232;2.6421,-2.2253,-0.6073;1.717,-2.6162,-1.015;3.599,-3.0782,-0.0572;3.4455,-4.1521,-0.0269;4.7734,-2.5544,0.4634;5.6874,-3.4028,0.9861;6.0675,-1.4707,3.1828;4.9849,-1.5286,3.1083;6.6942,-1.9909,4.2602;7.7849,-1.9927,4.3027;6.0191,-2.608,5.3837;4.9297,-2.6304,5.3536;6.6632,-3.1437,6.4369;7.7541,-3.1147,6.4496;6.0033,-3.7903,7.6145;6.2741,-3.2788,8.5482;6.3304,-4.8331,7.7252;4.9125,-3.7817,7.5237)|\",4.0218427103900005\r\n\"[H]OC(=N/OC([H])([H])[H])/C([H])=C(\\[H])C1=C([H])C(OC([H])([H])[H])=C([H])C([H])=C1[H] 
|(4.1254,4.1498,-4.9198;5.0236,4.3576,-5.2231;5.5686,3.2092,-5.7313;6.4084,3.4013,-6.6909;7.0303,2.1994,-7.0671;7.842,2.4553,-8.2059;8.329,1.5039,-8.435;8.5967,3.2189,-7.9853;7.238,2.7802,-9.0619;5.1802,1.9236,-5.1601;5.4478,1.0591,-5.7564;4.5501,1.7987,-3.9731;4.3753,2.6984,-3.3829;4.0904,0.5587,-3.342;3.5239,0.6342,-2.0636;3.4319,1.5895,-1.5546;3.0677,-0.5127,-1.4013;2.5373,-0.3025,-0.1614;2.062,-1.4252,0.5645;1.6911,-1.0324,1.5129;2.8645,-2.1484,0.7612;1.2425,-1.9304,0.0364;3.1739,-1.761,-2.0227;2.8286,-2.6638,-1.5326;3.7373,-1.8378,-3.3029;3.8167,-2.8083,-3.7854;4.1904,-0.7047,-3.9625;4.6179,-0.7947,-4.9555)|\",4.057217510955\r\n\"[H]OC(=N/OC([H])([H])[H])/C1=N/C(=O)[C@@]2([H])C([H])=C([H])C([H])=C([H])\\C2=C\\1[H] |(2.9524,0.0887,-0.2332;2.9699,1.0712,-0.1246;4.2336,1.3429,0.2486;4.4787,2.5889,0.4863;5.785,2.8547,0.8993;5.9034,4.2595,1.1121;6.9254,4.414,1.467;5.1867,4.5999,1.867;5.741,4.8105,0.1793;5.1165,0.1284,0.3299;4.4262,-0.9775,0.1136;5.0561,-2.2162,0.0943;4.4842,-3.2277,-0.2654;6.489,-2.2798,0.6583;6.2712,-2.395,1.7444;7.2597,-3.5061,0.2839;6.6849,-4.4109,0.118;8.6056,-3.4731,0.2125;9.1702,-4.3717,-0.0194;9.3257,-2.2331,0.4192;10.412,-2.2535,0.3885;8.682,-1.0435,0.5865;9.2414,-0.1149,0.6624;7.2478,-0.9814,0.5747;6.5358,0.1878,0.5228;7.0264,1.1488,0.5551)|\",3.0095791865299995\r\n\"[H]OC(=N/OC([H])([H])[H])/C([H])=C(\\[H])C1=C(OC([H])([H])[H])C([H])=C([H])C(C([H])([H])[H])=C1[H] 
|(6.6434,-3.9725,1.1094;6.5369,-4.7601,0.5531;5.2169,-4.8457,0.2025;4.8092,-6.0483,-0.0253;3.488,-6.0521,-0.5056;3.0705,-7.4031,-0.643;2.0479,-7.3549,-1.0269;3.082,-7.9246,0.3219;3.7106,-7.9426,-1.3511;4.4243,-3.6245,0.1082;3.3524,-3.7786,0.0582;4.9725,-2.3908,0.0587;6.0546,-2.3021,0.0099;4.2547,-1.1172,0.0108;4.9684,0.0732,-0.2893;6.3074,-0.0703,-0.5208;7.0692,1.0837,-0.8369;8.0921,0.7343,-0.9878;7.0518,1.8149,-0.018;6.7096,1.5621,-1.757;4.3008,1.2978,-0.3393;4.8373,2.2111,-0.5689;2.9277,1.3558,-0.0922;2.4259,2.3196,-0.1344;2.1924,0.2068,0.2131;0.7068,0.2683,0.4863;0.3385,1.2986,0.4528;0.4613,-0.1403,1.4745;0.1374,-0.3109,-0.2515;2.8792,-1.0095,0.2622;2.3276,-1.9105,0.5179)|\",3.932045139725001\r\n\"[H]OC(=N/OC([H])([H])[H])/C([H])=C(\\[H])C1=C(OC([H])([H])C([H])([H])[H])C([H])=C([H])C([H])=C1[H] |(2.3315,-4.4722,-2.2747;2.5179,-5.3462,-1.8972;3.852,-5.5967,-2.069;4.1446,-6.8494,-2.1663;5.5373,-7.0431,-2.2043;5.7875,-8.4168,-2.4672;6.8753,-8.5241,-2.4476;5.4022,-8.7123,-3.451;5.3347,-9.0527,-1.6976;4.7871,-4.478,-2.1173;5.7756,-4.7298,-2.4837;4.4625,-3.2235,-1.7364;3.4688,-3.0366,-1.3387;5.3293,-2.0455,-1.774;4.7721,-0.7631,-1.5223;3.4245,-0.7399,-1.2958;2.765,0.4743,-0.9213;3.0822,1.2946,-1.5771;1.7098,0.2777,-1.1311;2.9574,0.8183,0.5517;2.3748,1.7113,0.8053;4.0072,1.0146,0.7893;2.6139,-0.0087,1.1812;5.5848,0.3765,-1.5411;5.1669,1.3565,-1.3444;6.9477,0.2596,-1.8156;7.5651,1.1537,-1.8256;7.514,-0.9903,-2.0665;8.5761,-1.083,-2.2719;6.7061,-2.1221,-2.0415;7.1517,-3.0959,-2.2195)|\",3.95109310926\r\n\"[H]OC(=N/OC([H])([H])[H])/C([H])=C(\\[H])C1=C([H])C([H])=C(OC([H])([H])[H])C(F)=C1[H] 
|(4.178,-0.6092,-7.4611;4.8269,-0.0099,-7.8628;4.6536,1.2276,-7.3026;4.9238,2.209,-8.095;4.8592,3.4394,-7.4207;5.0612,4.4778,-8.3702;5.0201,5.408,-7.7976;4.2752,4.4759,-9.1351;6.0387,4.3802,-8.8567;4.2139,1.3244,-5.9157;3.8712,2.3073,-5.6125;4.2455,0.2855,-5.0527;4.6774,-0.6555,-5.394;3.8003,0.2733,-3.6618;4.0154,-0.8742,-2.8753;4.5092,-1.7367,-3.3151;3.6147,-0.9279,-1.5479;3.7832,-1.8148,-0.9455;2.9772,0.1595,-0.9318;2.6452,-0.0107,0.3721;1.8512,0.948,1.0756;1.6905,0.5081,2.0615;0.8867,1.1074,0.5834;2.3699,1.9048,1.1789;2.7656,1.303,-1.7237;2.1582,2.3986,-1.1993;3.1579,1.3627,-3.0488;2.9506,2.2799,-3.5896)|\",3.90755489318\r\n\"[H]OC(=N/OC([H])([H])[H])/C([H])=C(\\[H])C1=C([H])C([H])=C(C([H])(C([H])([H])[H])C([H])([H])[H])C([H])=C1[H] |(4.7444,0.6025,7.6504;5.6403,0.9203,7.8458;5.9384,1.9064,6.9436;6.7387,2.8105,7.3966;7.1135,3.7079,6.3822;7.8667,4.7575,6.9757;8.1589,5.4102,6.1489;8.7592,4.3646,7.4763;7.2651,5.3205,7.6998;5.3617,1.8442,5.6054;5.4447,2.7591,5.0293;4.7827,0.7356,5.0966;4.8013,-0.1769,5.6933;4.1488,0.5871,3.7865;3.6877,-0.6797,3.3907;3.8113,-1.5244,4.065;3.0848,-0.8737,2.1504;2.7424,-1.8684,1.8735;2.9127,0.1859,1.2526;2.2529,-0.0384,-0.0998;2.0222,-1.1101,-0.1704;0.9217,0.7286,-0.2234;0.436,0.5018,-1.1796;1.0811,1.8124,-0.1786;0.2315,0.4591,0.5834;3.1998,0.3077,-1.2645;2.7307,0.0681,-2.2259;4.1379,-0.253,-1.1919;3.4483,1.3754,-1.2732;3.3677,1.4555,1.6501;3.2478,2.3044,0.9816;3.9698,1.6549,2.8853;4.303,2.6524,3.1564)|\",4.01095815637\r\n\"[H]OC(=N/OC([H])([H])[H])/C([H])=C(\\[H])C1=C([H])C([H])=C(C#N)C([H])=C1[H] 
|(3.3322,1.007,1.7755;2.6118,0.5849,1.2803;3.1675,-0.2769,0.374;2.4738,-1.3428,0.1583;3.02,-2.1045,-0.8808;2.2835,-3.3191,-0.9846;2.7233,-3.8521,-1.8311;1.2243,-3.1165,-1.1772;2.3801,-3.9209,-0.0734;4.4311,0.0752,-0.264;4.9224,-0.7402,-0.7828;4.9545,1.3194,-0.2378;4.3743,2.115,0.2287;6.2261,1.7491,-0.8205;6.5379,3.1222,-0.8318;5.8299,3.8305,-0.4095;7.7263,3.5909,-1.3767;7.9495,4.6528,-1.3805;8.6469,2.6845,-1.9268;9.8773,3.1555,-2.4893;10.8777,3.5372,-2.9448;8.354,1.3079,-1.918;9.0683,0.6068,-2.3373;7.164,0.8524,-1.3715;6.962,-0.2138,-1.3672)|\",3.687142674275\r\n\"[H]OC(=N/OC([H])([H])[H])/C([H])=C(\\[H])C1=C([H])C([H])=C(N(C([H])([H])[H])C([H])([H])[H])C([H])=C1[H] |(5.9978,-1.8784,-6.4519;5.5509,-2.4725,-7.0754;4.9537,-3.4636,-6.3411;4.8605,-4.5921,-6.9592;4.1206,-5.5199,-6.2013;4.1236,-6.7593,-6.8937;3.5252,-7.4382,-6.28;5.1409,-7.1555,-7.0027;3.6701,-6.6569,-7.8869;4.4871,-3.1618,-4.994;4.2555,-4.0274,-4.3841;4.3194,-1.902,-4.5299;4.4851,-1.0755,-5.2228;3.8905,-1.4923,-3.1994;3.6516,-0.1312,-2.9355;3.7975,0.5946,-3.7331;3.2309,0.3213,-1.6936;3.065,1.3828,-1.5547;3.0203,-0.5828,-0.6261;2.5823,-0.1491,0.6107;2.4404,1.2737,0.8663;2.0687,1.421,1.8818;1.7191,1.7322,0.1776;3.3942,1.8145,0.7692;2.4939,-1.0828,1.7202;2.1087,-0.5597,2.5972;3.4703,-1.5177,1.9831;1.8049,-1.9071,1.4958;3.2769,-1.9542,-0.8805;3.151,-2.6878,-0.0932;3.6951,-2.3862,-2.1281;3.8825,-3.4466,-2.2707)|\",3.76605569092\r\n\"[H]OC(=N/OC([H])([H])[H])/C([H])=C(\\[H])C1=C([H])C([H])=C(OC([H])([H])C([H])([H])[H])C([H])=C1[H] 
|(11.9198,-2.7156,-0.4554;12.348,-2.612,-1.3201;11.8088,-1.5045,-1.9195;12.6252,-0.8752,-2.6949;11.9703,0.1637,-3.3796;12.9446,0.8953,-4.1103;12.39,1.6794,-4.6328;13.4557,0.2519,-4.8361;13.6885,1.3475,-3.4429;10.4184,-1.1565,-1.6507;10.1307,-0.1574,-1.9577;9.5351,-2.0079,-1.0844;9.8661,-3.0261,-0.8771;8.1405,-1.7449,-0.7394;7.5344,-0.4826,-0.864;8.1137,0.3609,-1.2276;6.2005,-0.2748,-0.5273;5.7745,0.7154,-0.6393;5.4259,-1.3417,-0.0458;4.117,-1.2465,0.3118;3.4606,0.017,0.1907;3.9807,0.7656,0.8048;3.4948,0.3531,-0.8552;2.0263,-0.1645,0.6562;1.4836,0.7839,0.5795;1.9975,-0.4979,1.6982;1.5126,-0.9098,0.0411;6.0121,-2.6094,0.0911;5.4023,-3.4257,0.4651;7.3421,-2.7991,-0.249;7.7804,-3.7885,-0.1392)|\",3.967419940290001\r\n\"[H]OC(=N/OC([H])([H])[H])/C([H])=C(\\[H])C1=C([H])C(Cl)=C(C([H])([H])[H])C([H])=C1[H] |(8.3545,-0.087,-0.0074;8.7976,-0.1994,-0.8635;8.0631,-1.085,-1.604;8.7613,-1.8082,-2.412;7.9339,-2.5781,-3.2438;8.7488,-3.4923,-3.9671;8.0697,-4.0089,-4.6501;9.2265,-4.2183,-3.2975;9.5213,-2.9619,-4.535;6.6147,-1.1273,-1.434;6.1309,-1.9978,-1.8623;5.9109,-0.1622,-0.8053;6.441,0.7275,-0.4651;4.4697,-0.1472,-0.5492;3.9029,0.9938,0.0434;4.5306,1.8424,0.2973;2.5387,1.0528,0.3069;1.9026,2.518,1.0532;1.6698,-0.0047,0.0023;0.1923,0.0538,0.2874;-0.2965,-0.8779,-0.0112;-0.2857,0.8803,-0.2516;-0.0029,0.2184,1.3535;2.2529,-1.1381,-0.5871;1.611,-1.9799,-0.8342;3.6124,-1.2187,-0.8579;4.0099,-2.1228,-1.3079)|\",3.96741994029\r\n\"[H]OC(=N/OC([H])([H])[H])/C([H])=C(\\[H])C1=C([H])C([H])=C([H])C([H])=C1Cl 
|(3.3697,-2.0559,0.9661;2.6697,-1.3843,0.9984;3.2486,-0.1582,0.8212;2.6499,0.8056,1.4342;3.2078,2.0478,1.0952;2.5988,3.0465,1.9049;3.0322,3.9934,1.5729;2.8193,2.8872,2.9673;1.5128,3.0586,1.759;4.4344,-0.0412,-0.0229;4.9743,0.893,0.0796;4.8432,-1.0145,-0.8629;4.2473,-1.9195,-0.9414;6.0298,-0.9704,-1.7216;6.6849,0.2416,-2.0225;6.28,1.1652,-1.6214;7.8095,0.2884,-2.8366;8.2833,1.2427,-3.0469;8.32,-0.8886,-3.3897;9.1962,-0.8631,-4.0308;7.6959,-2.105,-3.1225;8.0751,-3.0301,-3.543;6.5695,-2.1369,-2.3023;5.8434,-3.7161,-1.9932)|\",3.9837467713200008\r\n\"[H]OC(=N/OC([H])([H])[H])/C([H])=C(\\[H])C1=C([H])C([H])=C(N(=O)=O)C([H])=C1[H] |(2.6369,0.1145,-1.8176;2.1236,-0.394,-1.1697;2.8168,-0.4037,0.0094;2.0736,-0.3887,1.0642;2.8311,-0.5454,2.2287;1.969,-0.3934,3.3527;2.6038,-0.5548,4.2273;1.1651,-1.137,3.3285;1.5348,0.6125,3.386;4.274,-0.4517,-0.0192;4.7555,-0.2384,0.9283;4.9833,-0.7544,-1.1275;4.4399,-1.0366,-2.0287;6.4405,-0.7932,-1.2555;7.3172,-0.3924,-0.2256;6.9192,-0.0249,0.7143;8.6929,-0.4514,-0.3932;9.3737,-0.1453,0.3914;9.2076,-0.9126,-1.6063;10.6636,-0.9738,-1.7891;11.0875,-1.3856,-2.8695;11.3738,-0.6097,-0.8513;8.376,-1.3131,-2.6495;8.8113,-1.6648,-3.5765;7.0002,-1.2486,-2.4657;6.3409,-1.5606,-3.2713)|\",3.3415580841399994\r\n\"[H]OC(=N/C([H])([H])C([H])([H])[H])/C([H])=C(\\[H])C1=C([H])C([H])=C([C@]2([H])C([H])([H])[C@@]2([H])C([H])([H])[H])O1 
|(3.1593,-0.5959,0.1657;2.9975,-0.6412,-0.7895;3.2936,0.5891,-1.3311;2.7353,0.8531,-2.4439;3.037,2.0577,-3.1924;3.5294,2.8483,-2.6031;3.7425,1.7912,-3.9935;1.7574,2.6135,-3.8223;1.9773,3.49,-4.4429;1.042,2.9097,-3.0473;1.2792,1.8516,-4.446;4.2417,1.4142,-0.5627;4.2237,2.4849,-0.7347;5.1488,0.9063,0.3034;5.2238,-0.171,0.4415;6.1004,1.6614,1.0652;7.093,1.2847,1.9378;7.3371,0.2684,2.2162;7.7253,2.478,2.3889;8.5489,2.5625,3.084;7.0846,3.5194,1.7683;7.2726,4.9722,1.8179;8.0697,5.269,2.4951;6.0907,5.9261,1.7046;5.1087,5.4838,1.5648;6.1072,6.8071,2.3415;7.082,5.8404,0.5802;6.7406,5.2964,-0.2985;8.0423,6.9753,0.297;7.5714,7.7367,-0.3366;8.9418,6.6182,-0.2184;8.3597,7.4668,1.2244;6.0942,3.0359,0.962)|\",3.989189048330001\r\n\"[H]NC(O[H])/C([H])=C(\\[H])C1=C(C([H])([H])[H])N(C2=C([H])C([H])=C([H])C([H])=C2[H])C(C([H])([H])[H])=C1[H] |(6.3036,5.2197,-0.4663;7.2328,4.8552,-0.2458;7.1395,3.6181,0.0559;8.3122,2.9539,0.3015;8.1073,2.1652,0.8268;5.8983,2.8272,0.1395;4.9838,3.4064,0.2442;5.8384,1.4806,0.0376;6.7706,0.9418,-0.1328;4.6461,0.664,0.0962;4.6023,-0.7169,-0.0909;5.6866,-1.6925,-0.4222;6.6372,-1.1723,-0.5612;5.4716,-2.2385,-1.3494;5.8313,-2.4432,0.3649;3.2829,-1.1147,0.0391;2.8106,-2.4598,-0.0842;2.1216,-2.8552,-1.2354;1.9633,-2.1364,-2.0339;1.656,-4.1656,-1.3498;1.1201,-4.4697,-2.2446;1.8848,-5.0838,-0.3231;1.5249,-6.1046,-0.4161;2.5765,-4.6883,0.8235;2.7542,-5.3986,1.626;3.0354,-3.3765,0.9478;3.5598,-3.0524,1.8417;2.4792,-0.0032,0.3086;1.0062,-0.1174,0.5387;0.5981,0.8727,0.7599;0.7699,-0.7749,1.3852;0.4714,-0.5124,-0.3338;3.2995,1.0925,0.3489;2.9735,2.1036,0.5526)|\",4.41912893212\r\n\"[H]OC1=NC(=O)N(C([H])([H])C(=O)N2C([H])([H])C([H])([H])C([H])([H])C2([H])[H])C([H])([H])C1([H])[H] 
|(-6.8785,1.7768,2.1026;-6.2071,2.477,2.0915;-5.2035,2.13,1.2575;-4.2222,2.939,1.1603;-3.1681,2.6571,0.2607;-2.344,3.5232,-0.0034;-3.115,1.3986,-0.3197;-1.9304,1.0673,-1.1062;-2.2001,0.2682,-1.8057;-1.6484,1.9512,-1.6757;-0.8012,0.517,-0.2194;-0.9968,-0.51,0.4372;0.3835,1.1732,-0.1948;1.4762,0.6603,0.6491;1.1832,0.7129,1.7057;1.6706,-0.3913,0.4192;2.6498,1.5954,0.3283;3.3509,1.6847,1.1636;3.2075,1.2176,-0.5379;1.9569,2.9223,-0.0257;2.5879,3.6023,-0.6055;1.6501,3.4441,0.8886;0.711,2.4768,-0.806;0.9442,2.3534,-1.8738;-0.1256,3.1699,-0.7006;-3.9314,0.3071,0.2036;-3.963,-0.4916,-0.5425;-3.4827,-0.1205,1.1093;-5.3363,0.8348,0.4887;-5.9146,0.0928,1.0528;-5.8746,1.0357,-0.4482)|\",5.36064285485\r\n\"[H]C#CC([H])([H])/N=C(/O[H])C([H])([H])N1C(=O)N=C(O[H])C([H])([H])C1([H])[H] |(9.6535,1.5669,-2.8614;8.5926,1.6313,-2.774;7.3894,1.7063,-2.6937;5.9245,1.7726,-2.5814;5.5317,2.3018,-3.4582;5.6634,2.4067,-1.716;5.2621,0.4692,-2.5349;5.3739,-0.2679,-1.5026;4.7367,-1.4527,-1.5022;4.9279,-1.9276,-0.657;6.1762,0.0048,-0.2244;6.7231,0.9455,-0.293;6.9115,-0.7982,-0.1056;5.3393,0.0403,0.9831;5.0433,-1.1427,1.6209;5.2525,-2.2267,1.0707;4.5204,-1.0975,2.9226;3.9275,-0.0357,3.3107;3.4113,-0.0312,4.5528;2.9812,0.8199,4.7309;3.7671,1.1963,2.4474;2.8827,1.0713,1.8088;3.6236,2.1008,3.0493;5.0271,1.329,1.5892;5.8687,1.6863,2.2033;4.8588,2.0628,0.7955)|\",5.091250142854999\r\n\"[H]C1=C([H])/C2=C([H])/C(C(=O)N(C([H])([H])[H])C([H])([H])[H])=N\\C(=O)[C@@]2([H])C([H])=C1[H] 
|(3.1383,7.8413,-1.1953;2.4537,7.0002,-1.2671;2.8962,5.7471,-0.9706;3.9165,5.5815,-0.6352;1.9953,4.6266,-1.0054;2.2839,3.3864,-0.5155;3.2467,3.1453,-0.081;1.2525,2.381,-0.4879;1.669,1.0483,0.1253;2.5032,1.0912,1.0322;1.1045,-0.1123,-0.3135;0.1858,-0.2657,-1.4363;0.4736,-1.1625,-1.9983;0.2241,0.5952,-2.098;-0.8484,-0.383,-1.0912;1.4114,-1.3419,0.4082;1.9028,-2.063,-0.2574;0.4868,-1.7949,0.7883;2.0725,-1.1052,1.24;-0.01,2.5596,-0.7895;-0.4306,3.83,-1.2055;-1.6113,4.1081,-1.2974;0.6497,4.8283,-1.6596;0.8076,4.498,-2.7118;0.1915,6.2507,-1.7599;-0.8551,6.4108,-1.9955;1.0681,7.2631,-1.6102;0.7464,8.2947,-1.7222)|\",3.213664574405\r\n\"[H]C1=C([H])/C2=N/C(C(=O)N(C([H])([H])[H])C([H])([H])[H])=C(/[H])C(=O)[C@@]2([H])C([H])=C1[H] |(-2.978,3.2176,0.4244;-1.9427,2.886,0.4331;-1.2181,2.9255,-0.72;-1.6357,3.3026,-1.6482;0.1714,2.5494,-0.7266;0.9007,2.7754,-1.7895;2.2614,2.5233,-1.7145;3.0676,2.637,-2.9983;3.9126,1.7709,-3.2219;2.8149,3.6634,-3.8685;2.0742,4.8796,-3.5657;2.7583,5.7398,-3.5345;1.5552,4.7936,-2.6154;1.3267,5.0667,-4.3467;3.5562,3.6931,-5.1225;2.8708,3.9199,-5.948;4.0221,2.7219,-5.2802;4.3382,4.465,-5.0993;2.9381,2.1477,-0.5861;4.0181,2.0576,-0.5964;2.237,1.9628,0.6734;2.7695,1.8077,1.7632;0.7108,1.8593,0.5044;0.6337,0.7809,0.2278;-0.0964,2.0113,1.7493;0.3789,1.7095,2.677;-1.3648,2.4666,1.6938;-1.9669,2.543,2.5947)|\",3.44496134733\r\n\"[H]/N=C(O[H])\\C([H])=C([H])\\C1=C(/C([H])([H])C([H])([H])[H])OC2=C([H])C([H])=C([H])C([H])=C21 
|(4.8085,4.0382,-1.3327;4.3904,4.7233,-1.9638;3.8702,4.1234,-2.9608;3.3616,4.9095,-3.958;2.8359,4.3589,-4.558;3.7535,2.6654,-3.1992;3.7742,2.3379,-4.2363;3.5875,1.7741,-2.2015;3.5091,2.1751,-1.1914;3.4717,0.3348,-2.3123;3.0612,-0.491,-1.2896;2.6621,-0.245,0.1267;2.6083,0.8344,0.2988;1.6458,-0.6344,0.2766;3.6134,-0.8993,1.1466;3.2618,-0.714,2.1668;4.6265,-0.4941,1.0541;3.665,-1.9812,0.9924;3.0275,-1.8036,-1.6821;3.4338,-1.8396,-2.9928;3.5507,-2.9863,-3.765;3.3081,-3.9637,-3.3613;4.0003,-2.8105,-5.0733;4.1102,-3.6753,-5.721;4.3212,-1.5324,-5.561;4.6813,-1.4281,-6.5806;4.1937,-0.3963,-4.764;4.4712,0.5741,-5.1608;3.7288,-0.5438,-3.4481)|\",4.478993979229999\r\n\"[H]OC(=N/C([H])([H])C(F)(F)F)/C([H])=C(\\[H])C1=C([H])N(C([H])([H])[H])N=C1[H] |(6.3834,3.3863,0.7662;7.1378,3.522,0.1719;7.6293,2.2991,-0.1908;8.8342,2.3008,-0.6144;9.4297,1.1079,-1.1575;10.4902,1.0805,-0.8842;8.9814,0.1545,-0.8421;9.379,1.1055,-2.6802;10.0049,2.1641,-3.2195;9.9795,-0.0128,-3.1578;8.1065,1.1035,-3.134;6.7046,1.16,-0.0471;7.1518,0.1734,0.027;5.3595,1.282,-0.0405;4.9256,2.2718,-0.1836;4.3863,0.2195,0.1;4.5655,-1.1517,0.2867;5.4598,-1.7534,0.366;3.3356,-1.706,0.3669;2.9927,-3.1053,0.5474;3.9107,-3.6825,0.6724;2.4465,-3.4717,-0.3261;2.3648,-3.2189,1.4347;2.3415,-0.7924,0.2459;2.9722,0.3648,0.0859;2.401,1.2766,-0.0377)|\",4.655867982055\r\n\"[H]OC(=N/C([H])([H])/C(=N\\C([H])([H])C([H])([H])C([H])([H])[H])O[H])/C([H])=C(\\[H])C1=C([H])C([H])=C([H])O1 
|(0.8514,5.202,2.5653;1.0927,4.2768,2.4019;2.4549,4.1763,2.389;2.9371,3.1581,1.7915;4.3607,2.8737,1.8024;4.9369,3.6503,1.275;4.7697,2.8236,2.8218;4.6189,1.5309,1.102;5.7485,0.9706,0.9535;6.9402,1.6188,1.4824;7.0438,2.6557,1.1122;6.8976,1.6891,2.5834;8.1942,0.8311,1.0855;9.0657,1.2786,1.5838;8.0934,-0.1909,1.4725;8.4152,0.7874,-0.4282;9.286,0.1743,-0.6878;8.583,1.7942,-0.8334;7.5354,0.3667,-0.9248;3.5097,0.9375,0.6203;2.7671,1.5396,0.8578;3.2143,5.2248,3.0837;4.2499,5.3577,2.7882;2.7057,5.9792,4.0846;1.6876,5.8097,4.4312;3.4072,7.0054,4.8009;3.0308,7.8228,5.8415;2.0676,7.8209,6.3331;4.1498,8.6543,6.1377;4.2185,9.4189,6.8984;5.1301,8.2901,5.261;6.1429,8.6263,5.0979;4.701,7.2941,4.4432)|\",3.804151629989999\r\n\"[H]/N=C(O[H])\\C([H])=C(/[H])C1=C(N(=O)=O)C([H])=C2OC([H])([H])OC2=C1[H] |(-1.1052,2.3023,-1.4526;-1.7518,1.5502,-1.2064;-1.0994,0.4819,-0.9653;-1.8063,-0.6081,-0.5551;-1.2767,-1.4073,-0.7069;0.3719,0.3291,-1.0538;0.8783,1.0298,-1.7143;1.0681,-0.5978,-0.3724;0.5371,-1.2927,0.2674;2.5337,-0.7327,-0.4095;3.2119,-1.97,-0.3304;2.4849,-3.2439,-0.3021;3.1179,-4.2546,0.0048;1.2854,-3.2513,-0.5982;4.6172,-2.0803,-0.3117;5.0875,-3.0515,-0.2449;5.3294,-0.9098,-0.3936;6.6848,-0.7324,-0.3972;6.9037,0.6811,-0.4933;7.4501,1.0285,0.3896;7.4521,0.9064,-1.4139;5.6172,1.324,-0.5383;4.6902,0.329,-0.4815;3.3173,0.4444,-0.4784;2.8417,1.4174,-0.4906)|\",3.779661383445001\r\n\"[H]OC(=N/C([H])([H])C([H])([H])[H])/C([H])=C(\\[H])C1=C(OC([H])([H])[H])C([H])=C([H])C(C([H])([H])[H])=C1[H] 
|(3.2329,0.3707,-0.0542;3.3372,0.7487,-0.941;4.6411,1.1533,-1.0833;4.873,1.9937,-2.0102;6.2408,2.4063,-2.2991;6.6632,2.9719,-1.4512;6.9005,1.5378,-2.4517;6.2665,3.2887,-3.5466;7.2872,3.6198,-3.7719;5.6361,4.1725,-3.4027;5.8789,2.7399,-4.4112;5.6217,0.5493,-0.1595;6.5493,1.1,-0.0337;5.4404,-0.6347,0.4622;4.5445,-1.2084,0.2383;6.3609,-1.2832,1.3981;6.1789,-2.6548,1.7145;5.1382,-3.2818,1.0916;4.8991,-4.6515,1.3765;4.0262,-4.931,0.7842;4.6827,-4.81,2.441;5.7519,-5.277,1.0832;7.045,-3.2882,2.608;6.9201,-4.3374,2.8492;8.0889,-2.5719,3.196;8.7525,-3.0836,3.889;8.2932,-1.2172,2.9179;9.4022,-0.4347,3.5836;10.1037,-1.1006,4.0965;9.0086,0.2663,4.331;9.9722,0.1557,2.8568;7.4168,-0.6012,2.0203;7.5466,0.457,1.8071)|\",4.231370375275\r\n\"[H]OC(=N/C([H])([H])[H])/C([H])=C(\\[H])C1=C(OC([H])([H])[H])C([H])=C([H])C(C([H])([H])[H])=C1[H] |(6.2745,-3.6994,1.3403;6.2437,-4.5206,0.8254;4.9932,-4.6308,0.2701;4.6457,-5.7951,-0.1089;3.3762,-5.9779,-0.7926;3.3258,-7.0038,-1.1702;3.238,-5.3004,-1.6479;2.5178,-5.8404,-0.116;4.2105,-3.3839,0.1602;3.1367,-3.5112,0.0586;4.7587,-2.1509,0.1305;5.8423,-2.064,0.1382;4.0464,-0.8749,0.0385;4.7713,0.3041,-0.2756;6.1119,0.1471,-0.4818;6.8962,1.2927,-0.7769;7.9214,0.9328,-0.8794;6.8491,2.0328,0.0325;6.5815,1.7629,-1.7173;4.1086,1.5298,-0.3698;4.6516,2.434,-0.6183;2.7319,1.5984,-0.1495;2.2346,2.5622,-0.2292;1.9862,0.4608,0.1744;0.5018,0.5421,0.4474;0.0856,1.4941,0.1025;0.2843,0.4588,1.5204;-0.0452,-0.2636,-0.0556;2.6671,-0.7564,0.2623;2.1067,-1.6481,0.5319)|\",4.225928098264999\r\n\"[H]O/C(=N/C([H])([H])C([H])([H])[H])C1=C([H])C(=O)[C@]2([H])C(=N1)C([H])=C([H])C([H])=C2[H] 
|(2.9869,1.3193,-0.4224;2.812,0.6331,-1.1001;4.0119,0.0549,-1.3635;4.0146,-1.0051,-2.0632;5.1927,-1.7821,-2.3928;5.6382,-1.393,-3.3221;5.9775,-1.7249,-1.6214;4.7985,-3.2445,-2.6117;5.6717,-3.8432,-2.8965;4.0449,-3.3226,-3.4015;4.3715,-3.6682,-1.6965;5.1627,0.8048,-0.7378;6.4404,0.7863,-1.2305;6.6838,0.3089,-2.1702;7.4987,1.5503,-0.5871;8.624,1.7031,-1.0376;7.1168,2.0816,0.8091;7.3229,1.1709,1.4181;5.6344,2.288,0.9985;4.7489,1.6042,0.3224;5.2102,3.2465,1.9823;4.1479,3.3095,2.1957;6.1112,4.0932,2.5553;5.7633,4.8397,3.2647;7.5181,4.0724,2.2127;8.1703,4.8288,2.6395;8.0032,3.1467,1.36;9.0461,3.1144,1.0619)|\",3.44496134733\r\n\"[H]C1=C([H])C(C([H])([H])[H])=C([H])C(/C([H])=C(\\C#N)C(=O)OC([H])([H])[H])=C1OC([H])([H])[H] |(3.2423,2.648,-1.1232;3.1839,1.7613,-0.5037;1.9912,1.4635,0.154;1.1426,2.1319,0.0274;1.8621,0.3331,0.9704;0.5553,0.0029,1.6541;-0.0586,0.8987,1.7955;0.7219,-0.4522,2.6358;-0.0358,-0.7092,1.0633;2.9759,-0.4931,1.1163;2.8912,-1.3676,1.7501;4.2033,-0.2347,0.4741;5.3847,-1.0656,0.5973;6.2423,-0.7307,0.0219;5.6308,-2.204,1.3078;4.6948,-2.8712,2.1549;3.9282,-3.4129,2.8449;6.999,-2.7961,1.184;7.8905,-2.3377,0.4976;7.1246,-3.9086,1.9333;8.4089,-4.5491,1.8772;8.3284,-5.4167,2.5319;9.1904,-3.8704,2.2289;8.6383,-4.8569,0.8536;4.2917,0.9236,-0.3543;5.4885,1.1473,-0.9584;5.636,2.2793,-1.8048;6.6609,2.2374,-2.1762;5.4866,3.2147,-1.2512;4.9388,2.2399,-2.651)|\",3.8504109845750008\r\n\"[H]C1=C([H])C([H])=C([C@@]2([H])N([H])N([H])C([H])([H])[C@@]2([H])C([H])([H])N([H])C([H])(C([H])([H])[H])C([H])([H])[H])C([H])=C1[H] 
|(-0.5837,-3.4582,-2.872;0.1425,-3.6111,-2.0782;1.3238,-4.3112,-2.3314;1.5228,-4.7024,-3.3258;2.2542,-4.5118,-1.3114;3.1688,-5.0581,-1.5288;2.025,-4.017,-0.0173;3.0208,-4.2107,1.1156;2.5109,-3.9253,2.044;3.4759,-5.6279,1.244;3.148,-6.1653,0.4457;4.9154,-5.6892,1.2213;5.2075,-5.6881,2.1987;5.3736,-4.4395,0.5999;6.3992,-4.2215,0.9179;5.3763,-4.56,-0.4931;4.3474,-3.3715,1.0259;4.6,-3.0482,2.044;4.2958,-2.1109,0.1594;3.4125,-1.5231,0.4334;4.1617,-2.3899,-0.9028;5.4864,-1.2871,0.3798;6.3106,-1.8574,0.1936;5.5726,-0.0815,-0.4588;5.3528,-0.3192,-1.5192;4.5625,0.9755,0.0052;4.6674,1.8903,-0.5893;4.7337,1.2217,1.0594;3.5283,0.6321,-0.101;7.0012,0.4651,-0.3798;7.1098,1.3675,-0.9909;7.7307,-0.2714,-0.741;7.256,0.7155,0.6565;0.8321,-3.3226,0.2212;0.6298,-2.9447,1.2209;-0.1007,-3.1166,-0.7967;-1.0206,-2.578,-0.5847)|\",5.670852644420001\r\n\"[H]C1=C([H])C([H])=C([C@@]2([H])N([H])N([H])C([H])([H])[C@@]2([H])C([H])([H])N([H])C([H])([H])C([H])([H])OC([H])([H])[H])C([H])=C1[H] |(6.9479,-4.4379,-1.4541;6.9171,-4.1644,-0.4028;8.081,-3.7589,0.2529;9.0229,-3.7119,-0.2876;8.043,-3.4135,1.6043;8.9604,-3.1026,2.0976;6.8412,-3.4623,2.3287;6.7627,-3.0819,3.7998;5.7883,-3.4269,4.1684;7.835,-3.7192,4.6218;8.4678,-4.2337,4.0152;8.6269,-2.7065,5.2835;8.2472,-2.6413,6.2281;8.353,-1.4402,4.5905;8.5297,-0.59,5.254;9.03,-1.3369,3.7314;6.8945,-1.5515,4.1165;6.2472,-1.3545,4.9848;6.4878,-0.5878,3.0013;5.4751,-0.8486,2.6426;7.1588,-0.7132,2.1423;6.6134,0.7946,3.4669;5.9492,0.955,4.2236;6.3817,1.8159,2.4492;6.6437,2.7881,2.885;7.0685,1.6409,1.6109;4.9524,1.9072,1.9076;4.66,0.9774,1.3911;4.8799,2.7271,1.1717;4.0911,2.1466,3.0083;2.7304,2.2164,2.6421;2.1566,2.3921,3.5562;2.3841,1.2792,2.1769;2.5394,3.0418,1.9373;5.6836,-3.8781,1.657;4.7462,-3.9393,2.2057;5.7163,-4.2239,0.3053;4.8051,-4.5478,-0.1911)|\",5.483094087575\r\n\"[H]OC(=N/C([H])([H])C([H])([H])[H])/C1=N/C(=O)[C@@]2([H])C([H])=C([H])C([H])=C([H])\\C2=C\\1[H] 
|(9.5328,1.5827,-0.6802;9.1676,2.2467,-0.054;7.9817,1.7463,0.3581;7.2182,2.4996,1.0391;5.8749,2.171,1.477;5.368,1.4372,0.8323;5.92,1.7329,2.4871;5.0345,3.4496,1.5378;4.0248,3.2326,1.9061;4.9517,3.9014,0.5438;5.5035,4.1821,2.2017;7.7798,0.3126,-0.1034;8.5389,-0.0094,-1.1311;8.5167,-1.3039,-1.645;9.3214,-1.6864,-2.4715;7.3479,-2.2116,-1.2027;6.556,-1.8717,-1.9078;7.5388,-3.6656,-1.4931;8.1198,-3.9136,-2.3748;6.9686,-4.6044,-0.7107;7.077,-5.6597,-0.9436;6.223,-4.2255,0.4719;5.7537,-5.0084,1.0618;6.1451,-2.9311,0.8931;5.6413,-2.679,1.8224;6.7994,-1.8847,0.1615;6.9402,-0.5997,0.6169;6.5159,-0.305,1.5678)|\",3.13475155776\r\n\"[H]C1=C([H])C([H])=C2OC(/C([H])=C(\\[H])C(=O)N(OC([H])([H])[H])C([H])([H])[H])=N\\C2=C1[H] |(12.4706,0.9697,-0.2074;11.4772,0.7383,-0.5805;11.324,0.3767,-1.9334;12.2007,0.3367,-2.5733;10.0741,0.0672,-2.4745;9.9462,-0.2126,-3.5146;9.0048,0.1399,-1.5941;7.6794,-0.1047,-1.837;7.0667,0.1164,-0.6164;5.6377,-0.0582,-0.5219;5.2069,0.1289,0.4569;4.8438,-0.4279,-1.5426;5.2479,-0.6344,-2.5251;3.3843,-0.5759,-1.305;2.8637,-0.4071,-0.206;2.6027,-0.8564,-2.4172;3.2596,-1.3787,-3.5506;3.1902,-0.4447,-4.6328;2.1509,-0.2108,-4.8929;3.6739,-0.9471,-5.4742;3.7235,0.4821,-4.3913;1.2465,-1.3585,-2.27;0.8446,-0.9482,-1.3441;1.2383,-2.4545,-2.2223;0.6363,-1.0287,-3.1157;7.8707,0.4709,0.3461;9.129,0.4975,-0.2418;10.388,0.8051,0.2854;10.4997,1.0832,1.3283)|\",4.14157280461\r\n\"[H]OC(=N/C([H])([H])C([H])([H])[H])/C([H])=C([H])/C1=N/C2=C([H])C([H])=C([H])C([H])=C2O1 
|(4.2199,1.3781,1.8044;3.3508,0.9478,1.8252;3.4199,-0.2131,1.0939;2.5532,-1.1015,1.3682;2.4314,-2.3355,0.6164;1.4625,-2.3032,0.0987;3.1936,-2.4735,-0.1663;2.4443,-3.5341,1.5698;2.2738,-4.4657,1.0185;1.6636,-3.426,2.3291;3.4081,-3.6107,2.0853;4.4718,-0.2572,0.0577;4.8138,-1.2307,-0.2782;5.0172,0.8397,-0.5046;4.6788,1.8401,-0.2476;6.0515,0.7731,-1.5145;6.6451,-0.2751,-2.0079;7.5468,0.2387,-2.9315;8.4639,-0.3944,-3.7776;8.5582,-1.4753,-3.7851;9.2392,0.4182,-4.6015;9.9624,-0.0375,-5.2716;9.1114,1.8213,-4.5891;9.7368,2.4162,-5.2483;8.2009,2.4679,-3.7507;8.0934,3.5471,-3.7339;7.4393,1.6391,-2.9401;6.4768,1.9836,-2.0272)|\",4.040890679925\r\n\"[H]OC(=N/[C@]([H])(C([H])([H])[H])C([H])([H])C([H])([H])[H])/C([H])=C(\\[H])C1=C([H])N(C([H])([H])[H])N=C1[H] |(7.1767,2.0476,1.0906;6.5019,1.5818,0.5724;5.268,1.9177,1.0822;4.3266,1.0949,0.8501;2.948,1.3668,1.2336;2.7818,2.424,1.5029;2.5912,0.492,2.4451;1.5409,0.6288,2.7283;3.2161,0.7424,3.3101;2.7549,-0.5651,2.209;2.0303,1.0525,0.0361;2.1621,-0.0048,-0.2275;0.9877,1.1764,0.3604;2.2997,1.929,-1.189;1.6263,1.6718,-2.0146;3.3295,1.8004,-1.5376;2.152,2.9921,-0.9583;5.2081,3.2001,1.8145;4.4058,3.3073,2.5379;6.0674,4.2218,1.6125;6.8292,4.1147,0.84;6.0651,5.4958,2.3035;6.9081,6.5748,2.0472;7.6994,6.6973,1.3205;6.5762,7.5618,2.9113;7.1631,8.8826,3.038;7.9584,8.9885,2.2975;7.579,9.0124,4.041;6.4012,9.6479,2.8664;5.5598,7.2044,3.7343;5.2505,5.9653,3.3721;4.4556,5.4437,3.8891)|\",4.71845416767\r\n\"[H]C([H])([H])/C1=N/C(=O)[C@@]2([H])C(N1)SC1=C2C([H])([H])C([H])([H])C([H])([H])S1 
|(0.7199,-0.4564,-2.1366;1.0815,-0.3097,-1.1121;0.6307,0.5852,-0.682;0.7783,-1.1975,-0.5438;2.5754,-0.198,-1.1165;3.1365,0.903,-0.7119;4.5372,0.9744,-0.6921;5.1438,2.0241,-0.7156;5.227,-0.3935,-0.5366;4.9827,-0.6799,0.5062;4.4925,-1.3689,-1.424;3.23,-1.3218,-1.6601;5.5902,-2.5031,-2.193;7.0062,-1.6035,-1.5432;6.7049,-0.5508,-0.759;7.7104,0.3384,-0.0738;7.6877,1.327,-0.55;7.39,0.5142,0.9622;9.1346,-0.2371,-0.0717;9.2074,-1.0645,0.6438;9.8343,0.5391,0.2624;9.5771,-0.731,-1.4511;10.6143,-1.0781,-1.4385;9.4882,0.0594,-2.2032;8.602,-2.187,-2.0218)|\",3.4993841174300004\r\n\"[H]C([H])([H])C1=NC(=O)[C@]2([H])C(N1)SC(C([H])([H])[H])=C2C([H])([H])C([H])([H])[H] |(8.0509,1.5661,-1.2382;8.0257,0.6325,-0.675;8.4587,-0.1762,-1.2758;8.6422,0.7158,0.228;6.6184,0.2811,-0.2986;5.6377,1.007,-0.7506;4.3286,0.6859,-0.3671;3.3565,1.0411,-1.0008;4.2362,-0.0599,0.977;4.5093,0.724,1.7104;5.3584,-1.0668,1.0176;6.5097,-0.8932,0.4681;4.8636,-2.5375,1.8234;3.1557,-1.9612,1.9302;2.1843,-2.9238,2.544;1.1778,-2.5009,2.5799;2.4786,-3.183,3.5688;2.1373,-3.8592,1.9724;2.9603,-0.7251,1.4323;1.6524,0.0238,1.3877;1.4423,0.2765,0.3426;0.8414,-0.6231,1.7365;1.6594,1.3245,2.2114;0.6766,1.8056,2.1664;2.3894,2.0403,1.8176;1.8959,1.1336,3.2646)|\",3.883064646635\r\n\"[H]C1=C([H])C(/C([H])=C(\\[H])C(=O)N(OC([H])([H])[H])C([H])([H])[H])=C([H])C2=C1OC([H])([H])O2 
|(7.0347,-4.0374,-1.4413;7.3177,-2.9929,-1.3682;6.3597,-1.9754,-1.2762;5.3107,-2.2527,-1.2765;6.7153,-0.6172,-1.1788;5.7305,0.4587,-1.0824;6.1335,1.4647,-0.9732;4.3873,0.3612,-1.1139;3.8723,-0.5861,-1.2099;3.5756,1.593,-0.9856;4.0567,2.7123,-0.8224;2.1954,1.4509,-1.1275;1.6674,0.1517,-0.9492;1.1006,-0.3138,-2.1758;0.3026,0.3519,-2.5268;0.6793,-1.2955,-1.9431;1.8648,-0.4092,-2.9563;1.296,2.4806,-0.6328;1.7999,3.4392,-0.7507;1.0557,2.319,0.426;0.371,2.4744,-1.2168;8.0885,-0.2587,-1.1695;8.393,0.7798,-1.093;9.0188,-1.2695,-1.2616;8.6467,-2.6078,-1.3595;9.7627,-3.3985,-1.4369;10.8862,-2.5072,-1.3754;11.4813,-2.6105,-2.2901;11.488,-2.7432,-0.4897;10.3896,-1.1684,-1.2753)|\",4.095313450025\r\n\"[H]OC(=N/C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])/C([H])=C(\\[H])C1=C(C([H])([H])[H])N(C([H])([H])[H])N=C1C([H])([H])[H] |(4.2133,-2.5445,-0.1309;4.1414,-2.7391,-1.0786;2.8596,-3.1969,-1.3177;2.7273,-3.8939,-2.3707;1.467,-4.4132,-2.9221;0.9325,-5.5778,-2.0606;0.0787,-6.0574,-2.5536;0.6009,-5.2493,-1.0702;1.7145,-6.3318,-1.9204;0.3921,-3.3215,-3.1125;-0.4668,-3.7284,-3.659;0.8003,-2.4885,-3.6957;0.0259,-2.9197,-2.1636;1.8382,-4.9713,-4.3124;0.9656,-5.4112,-4.8104;2.6134,-5.7384,-4.2173;2.2367,-4.1724,-4.9466;1.8593,-2.7701,-0.3118;0.9935,-3.4065,-0.1895;1.9758,-1.6279,0.4048;2.8381,-0.9969,0.1891;1.0771,-1.1003,1.409;1.213,0.1478,2.0312;2.2154,1.2424,1.8478;2.9775,0.9531,1.1207;2.7283,1.4868,2.7868;1.7465,2.1647,1.4813;0.189,0.2487,2.915;-0.134,1.3623,3.7857;0.7246,1.6313,4.4098;-0.9581,1.0416,4.4227;-0.4436,2.2412,3.2084;-0.6133,-0.8504,2.9152;-0.09,-1.672,2.0131;-0.7372,-3.0002,1.7694;-1.1268,-3.0817,0.7468;-1.5728,-3.1286,2.4624;-0.0323,-3.8274,1.9183)|\",4.6803582286\r\n\"[H]C1=C([H])C(F)=C(/C([H])=C(\\C#N)C(=C(C#N)C#N)N([H])[H])C([H])=C1[H] 
|(-3.512,-1.0873,0.6931;-2.7871,-0.3023,0.499;-1.9062,0.0836,1.5084;-1.909,-0.387,2.4856;-0.9864,1.0913,1.253;-0.1052,1.4052,2.2195;-0.9108,1.7457,0.0124;0.0287,2.8085,-0.3272;0.3434,2.79,-1.3649;0.5127,3.8326,0.4258;0.1359,4.0612,1.7904;-0.1215,4.3416,2.8903;1.471,4.8353,-0.1176;1.4916,5.2616,-1.4466;2.4998,6.1903,-1.8347;3.3521,6.9519,-2.0659;0.5334,4.8776,-2.4231;-0.2403,4.5598,-3.2355;2.3505,5.3417,0.7709;2.3025,5.0885,1.7474;2.9682,6.0985,0.5054;-1.7866,1.2991,-1.0019;-1.7313,1.7738,-1.9779;-2.7252,0.3036,-0.7602;-3.4018,-0.0052,-1.5505)|\",3.681700397265\r\n\"[H]C1=C([H])C(C([H])([H])[H])=C([H])C([H])=C1/C([H])=C(\\C#N)C(=C(C#N)C#N)N([H])[H] |(4.0174,-0.0385,-0.886;3.9003,-1.1047,-0.7101;2.6265,-1.6394,-0.5622;1.7608,-0.9847,-0.6165;2.4446,-3.0113,-0.347;1.0682,-3.6055,-0.1863;0.2903,-2.8381,-0.2333;0.9725,-4.1269,0.774;0.863,-4.3429,-0.9722;3.5875,-3.8288,-0.2879;3.4695,-4.8972,-0.1238;4.8634,-3.3068,-0.4357;5.7117,-3.9774,-0.3839;5.0502,-1.9217,-0.6501;6.3243,-1.2518,-0.8235;6.2242,-0.1823,-0.9707;7.6094,-1.7202,-0.8321;7.9426,-3.0877,-0.5709;8.2903,-4.1743,-0.3336;8.7823,-0.8338,-1.0358;8.7601,0.3469,-1.7839;9.9509,1.127,-1.8315;10.9657,1.702,-1.8214;7.6535,0.792,-2.5554;6.7556,1.1835,-3.1881;9.9321,-1.2327,-0.451;9.9774,-2.0918,0.0775;10.7925,-0.7265,-0.6154)|\",3.4884995634100004\r\n\"[H]NC(O[H])/C([H])=C(\\[H])C1=C([H])C(OC([H])([H])[H])=C(OC([H])([H])[H])C([H])=C1OC([H])([H])[H] 
|(-0.8201,4.3657,-0.233;-1.1773,4.1182,-1.1571;-0.5813,3.0707,-1.574;-1.0176,2.548,-2.7622;-0.3947,1.8617,-3.0456;0.5205,2.3108,-0.9379;0.5591,1.2429,-1.1236;1.458,2.9085,-0.1732;1.3715,3.991,-0.0664;2.6016,2.3537,0.5417;3.4391,3.2662,1.2263;3.1803,4.3177,1.1737;4.5558,2.8751,1.9482;5.3965,3.7098,2.6261;5.114,5.0979,2.5944;4.1337,5.3225,3.0366;5.1454,5.4947,1.5704;5.895,5.5747,3.1895;4.8763,1.4977,2.005;5.9808,1.1774,2.7267;6.3551,-0.1872,2.8231;7.2555,-0.2048,3.4393;6.5821,-0.6141,1.8371;5.5728,-0.7871,3.3069;4.0696,0.5729,1.3406;4.3214,-0.4768,1.3891;2.9441,0.9866,0.6146;2.1349,0.104,-0.0457;2.4197,-1.2838,0.0297;1.6398,-1.7797,-0.5509;2.3846,-1.6465,1.0651;3.3994,-1.5195,-0.406)|\",4.046332956935\r\n\"[H]/N=C(O[H])\\C([H])=C(/[H])C1=C([H])C([H])=C(OC([H])([H])[H])C(OC([H])([H])C([H])([H])[H])=C1[H] |(5.1376,3.9226,-1.0758;5.7218,4.4225,-0.4027;6.4125,3.586,0.2684;7.3041,4.1005,1.1675;7.5266,3.4099,1.8101;6.3602,2.1152,0.1417;5.466,1.7291,-0.3418;7.3462,1.2792,0.5255;8.253,1.7136,0.9485;7.3807,-0.178,0.414;8.5321,-0.867,0.8182;9.381,-0.3094,1.2061;8.6215,-2.2563,0.7253;9.534,-2.7508,1.0377;7.5508,-2.999,0.2215;7.5474,-4.3507,0.0651;8.7284,-5.0706,0.3909;8.5157,-6.1137,0.1519;8.9719,-4.9834,1.4577;9.5831,-4.7269,-0.2053;6.3676,-2.3222,-0.1751;5.3198,-3.0019,-0.74;4.5565,-3.8703,0.122;4.0377,-4.5469,-0.5634;5.2313,-4.4697,0.7409;3.5609,-3.0934,0.9738;4.0742,-2.4081,1.6574;2.9581,-3.7839,1.5757;2.8856,-2.5075,0.3413;6.3009,-0.9412,-0.0762;5.3811,-0.4683,-0.4042)|\",4.285793145375\r\n\"[H]/N=C(O[H])\\C([H])=C(/[H])C1=C([H])C(OC([H])([H])C([H])([H])[H])=C(OC([H])([H])C([H])([H])[H])C([H])=C1[H] 
|(10.1275,-8.413,-0.2353;11.0206,-8.4328,-0.7295;11.1914,-7.3323,-1.3483;12.394,-7.1573,-1.9774;12.3332,-6.3902,-2.5667;10.2594,-6.1849,-1.4625;10.7167,-5.1989,-1.5457;8.9201,-6.3222,-1.4873;8.5218,-7.3366,-1.456;7.9059,-5.2715,-1.5753;8.2198,-3.8997,-1.4707;9.2476,-3.6076,-1.2938;7.2311,-2.9234,-1.5614;7.4337,-1.5809,-1.4203;8.7418,-1.0776,-1.1296;8.5576,-0.0971,-0.6815;9.2236,-1.7022,-0.3669;9.608,-0.9342,-2.3766;10.5765,-0.495,-2.1106;9.1211,-0.2765,-3.1037;9.7908,-1.8987,-2.86;5.8849,-3.3149,-1.7837;4.8682,-2.4001,-1.8225;4.8398,-1.5058,-2.9515;5.7699,-0.9303,-2.9933;4.7562,-2.0988,-3.8736;3.6385,-0.5946,-2.7782;3.5559,0.0867,-3.6322;3.737,0.0013,-1.8653;2.7158,-1.1797,-2.7095;5.575,-4.6682,-1.8778;4.5349,-4.9404,-2.0278;6.5675,-5.6405,-1.767;6.3052,-6.6922,-1.8438)|\",4.258581760325001\r\n\"[H]OC(=N/C([H])([H])C([H])([H])C1=C([H])C([H])=C([H])O1)/C([H])=C(\\[H])C1=C([H])C([H])=C([H])O1 |(2.4503,-5.6265,-0.1406;3.2313,-5.199,0.2441;3.212,-3.8707,-0.1111;4.3387,-3.2787,-0.1055;4.4575,-1.8518,-0.3199;5.0858,-1.6921,-1.2063;3.5082,-1.3242,-0.4905;5.1666,-1.2055,0.8938;6.1148,-1.7251,1.0633;4.5482,-1.3693,1.786;5.4335,0.25,0.7141;6.5663,0.9871,0.5247;7.5759,0.6012,0.4881;6.1554,2.3559,0.3927;6.7876,3.2189,0.2359;4.8008,2.3535,0.5082;4.0464,3.1252,0.4816;4.3412,1.0825,0.7031;1.8832,-3.3128,-0.4287;1.8495,-2.4291,-1.0564;0.7252,-3.8296,0.0388;0.7425,-4.6828,0.7145;-0.5926,-3.3282,-0.2422;-1.8378,-3.7277,0.1821;-2.0453,-4.5505,0.8525;-2.783,-2.8501,-0.4276;-3.8585,-2.8655,-0.3213;-2.0536,-1.9781,-1.1809;-2.3143,-1.1449,-1.8157;-0.7252,-2.2508,-1.0826)|\",4.008237017865\r\n\"[H]OC(=N/C([H])([H])C([H])([H])C1=C([H])C([H])=C([H])C([H])=N1)/C([H])=C([H])/C([H])=C(\\[H])C([H])([H])[H] 
|(7.3934,-2.4282,3.2431;8.0036,-1.7362,2.9444;7.3194,-0.8976,2.0983;8.0324,-0.2018,1.305;7.4163,0.781,0.4336;6.4092,1.0968,0.7392;8.0459,1.6803,0.4499;7.3514,0.2523,-1.0169;6.7603,-0.6711,-1.0246;8.3634,-0.0007,-1.3507;6.7045,1.2494,-1.9481;7.4529,2.0058,-2.8594;8.5282,1.8659,-2.9261;6.8017,2.9305,-3.6738;7.3629,3.5263,-4.389;5.4207,3.0744,-3.5541;4.868,3.7787,-4.1687;4.7585,2.2801,-2.6169;3.6791,2.3613,-2.4934;5.3708,1.3896,-1.8297;5.8474,-0.9058,2.2327;5.2765,-0.5816,1.3666;5.1879,-1.269,3.3542;5.7549,-1.5466,4.2441;3.7441,-1.288,3.4883;3.1637,-1.0039,2.6099;3.1029,-1.6311,4.6201;3.6981,-1.912,5.4908;1.6162,-1.6589,4.7966;1.3014,-0.9784,5.5996;1.2704,-2.661,5.0854;1.0936,-1.3687,3.8794)|\",4.612329765975\r\n\"[H]OC(=N/C1=C([H])C(F)=C([H])C([H])=C1F)/C([H])=C([H])/C([H])=C(\\[H])C([H])([H])[H] |(8.0923,3.3093,1.8087;8.0215,2.6738,1.08;6.8862,1.9388,1.2513;6.82,0.8684,0.5573;5.7509,-0.0256,0.591;5.0702,-0.3416,-0.5963;5.3272,0.1725,-1.5154;4.0763,-1.3102,-0.5883;3.4269,-1.58,-1.7426;3.7313,-2.0198,0.5566;2.9554,-2.7759,0.5163;4.4145,-1.7293,1.7388;4.1922,-2.2533,2.6627;5.398,-0.7518,1.7425;6.0412,-0.4742,2.9052;5.8573,2.4919,2.1477;5.1158,1.7916,2.5158;5.755,3.7989,2.4736;6.4648,4.5194,2.0638;4.7214,4.3505,3.3265;3.9909,3.6502,3.732;4.6275,5.6573,3.6327;5.3673,6.3437,3.2174;3.5783,6.2663,4.5101;4.0311,6.7651,5.3778;2.8701,5.5167,4.8773;3.0125,7.0366,3.9687)|\",3.975583355805\r\n\"[H]OC(=N/[C@]([H])(C1=NC([H])=C([H])C([H])=C1[H])C([H])([H])[H])/C([H])=C([H])/C([H])=C(\\[H])C([H])([H])[H] 
|(9.3294,1.8717,-2.4874;8.6659,1.1732,-2.5995;8.143,0.878,-1.3652;7.5693,-0.2548,-1.2585;6.8894,-0.6151,-0.0118;6.5489,0.2591,0.5572;7.8511,-1.368,0.9012;8.0031,-0.8776,2.1415;8.8339,-1.5215,2.9745;8.9311,-1.0939,3.9715;9.5484,-2.6646,2.6244;10.2052,-3.1455,3.3433;9.3983,-3.161,1.3277;9.9439,-4.0442,1.0054;8.541,-2.5034,0.4519;8.4093,-2.8419,-0.5706;5.654,-1.4601,-0.3604;5.1245,-1.7658,0.5489;5.9416,-2.3555,-0.9209;4.97,-0.8772,-0.9861;8.3017,1.9162,-0.3254;8.2785,1.5746,0.7062;8.4524,3.2331,-0.5865;8.4269,3.5827,-1.62;8.6153,4.2561,0.4281;8.631,3.9215,1.4658;8.7423,5.5671,0.1548;8.7202,5.8864,-0.8887;8.9119,6.6494,1.1757;9.8542,7.1934,1.0224;8.9104,6.2493,2.1946;8.108,7.3943,1.0987)|\",4.606887488965\r\n\"[H]/C(C(=O)N(C([H])([H])[H])C([H])([H])[H])=C(/[H])C1=C(C([H])([H])[H])N(C([H])([H])[H])N=C1C([H])([H])[H] |(0.9573,-1.6515,0.7941;1.6637,-2.4162,0.5009;1.2397,-3.8399,0.4391;1.9875,-4.7149,-0.0078;-0.0292,-4.1502,0.8934;-0.9716,-3.1966,1.4568;-1.6539,-3.733,2.1244;-1.5792,-2.6937,0.6891;-0.4637,-2.4387,2.0559;-0.5289,-5.5025,0.6971;-0.8159,-5.9526,1.6567;0.2628,-6.0947,0.241;-1.4106,-5.502,0.0407;2.9285,-2.1004,0.1484;3.5444,-2.9495,-0.1438;3.5584,-0.8017,0.1046;4.888,-0.5703,-0.2739;5.9638,-1.5087,-0.7191;5.5973,-2.5373,-0.7252;6.8361,-1.4717,-0.0539;6.3135,-1.2748,-1.7327;5.0884,0.7673,-0.1716;6.2803,1.5222,-0.5042;7.1685,1.0407,-0.0845;6.1665,2.5158,-0.0705;6.4051,1.6157,-1.5898;3.9751,1.4351,0.2402;3.0475,0.5015,0.413;1.684,0.9004,0.8857;0.9071,0.6136,0.1661;1.6477,1.9848,1.0189;1.4343,0.4258,1.843)|\",4.81097287684\r\n\"[H]OC(=N/C([H])([H])C([H])([H])[H])/C([H])=C(\\[H])C1=C([H])C([H])=C(C#N)C([H])=C1[H] 
|(2.7895,0.8893,-0.1354;2.4811,0.0957,-0.6003;3.5716,-0.5468,-1.1344;3.3408,-1.3128,-2.122;4.3644,-2.1459,-2.7219;5.363,-2.0476,-2.2663;4.0608,-3.1906,-2.5667;4.4508,-1.8742,-4.2269;5.1666,-2.555,-4.7015;4.7742,-0.845,-4.4193;3.4713,-2.0109,-4.6952;4.8646,-0.2784,-0.4662;5.7556,-0.4384,-1.0646;4.9736,0.1217,0.8168;4.0637,0.203,1.411;6.2079,0.4205,1.5485;7.4702,0.5115,0.929;7.5606,0.3731,-0.1437;8.6107,0.7928,1.6661;9.5768,0.8615,1.1765;8.5213,0.9981,3.0553;9.6987,1.2903,3.8175;10.6555,1.5267,4.436;7.2699,0.9193,3.6871;7.1969,1.0801,4.7577;6.135,0.6374,2.9373;5.1703,0.5765,3.4342)|\",4.166063051155\r\n\"[H]OC(=N/C([H])([H])C([H])([H])[H])/C([H])=C(\\[H])C1=C(C([H])([H])[H])N(C([H])([H])[H])N=C1C([H])([H])[H] |(4.1653,-0.79,0.3503;3.3483,-0.9982,-0.1292;2.8543,0.1767,-0.6498;1.6083,0.1917,-0.9079;0.9979,1.3413,-1.5597;1.3185,2.298,-1.1135;1.3031,1.3789,-2.617;-0.5249,1.2388,-1.4754;-1.0032,2.0881,-1.9777;-0.8517,1.2254,-0.4302;-0.8712,0.3131,-1.9461;3.8384,1.2511,-0.8748;3.4404,2.2568,-0.9358;5.1596,1.0246,-1.0598;5.4874,-0.0154,-1.0596;6.2117,1.9881,-1.2994;7.547,1.6681,-1.5792;8.2406,0.3519,-1.7298;7.5412,-0.4748,-1.5861;9.0508,0.2321,-0.999;8.6817,0.238,-2.7283;8.2014,2.8476,-1.722;9.6009,3.0605,-2.0367;10.2471,2.5761,-1.2968;9.7718,4.1365,-2.0177;9.8458,2.6719,-3.0318;7.3854,3.9236,-1.5534;6.1835,3.4212,-1.2981;5.0362,4.3517,-1.0526;4.5665,4.1667,-0.0789;4.2573,4.2467,-1.8182;5.3963,5.3837,-1.0729)|\",4.64770456654\r\n\"[H]OC(=N/C([H])([H])[H])/C1=C(\\[H])C(=O)/N=C2/C([H])=C([H])C([H])=C([H])[C@@]21[H] 
|(-5.2406,3.4259,3.5254;-5.645,2.9106,4.2421;-4.8269,2.9678,5.3435;-5.2924,2.5238,6.4306;-4.5025,2.4837,7.6441;-5.0291,3.0439,8.4257;-3.4848,2.8919,7.5584;-4.4381,1.4453,7.9914;-3.4569,3.5099,5.06;-3.0807,4.7407,5.44;-3.7551,5.4147,5.9603;-1.6973,5.2404,5.2184;-1.2937,6.2351,5.7978;-0.8786,4.5597,4.2941;-1.2418,3.3919,3.8645;-0.4558,2.767,2.8141;0.471,3.263,2.5444;-0.9208,1.6884,2.1383;-0.3398,1.2744,1.3182;-2.2099,1.0891,2.4436;-2.568,0.2753,1.8197;-2.9511,1.5313,3.475;-3.906,1.0747,3.7162;-2.4289,2.5987,4.4071;-1.9958,2.0172,5.2503)|\",3.964698801785\r\n\"[H]OC(=N/[C@]([H])(/C(=N\\C([H])([H])[H])O[H])C([H])([H])[H])/C([H])=C(\\[H])C1=C([H])C([H])=C([H])C([H])=C1[H] |(5.3745,0.4327,3.6202;4.6356,0.9051,3.2056;4.6501,0.6467,1.8645;3.559,0.8646,1.2422;3.4489,0.7504,-0.2061;4.4095,0.91,-0.7099;2.4636,1.8432,-0.6863;2.3267,2.3136,-1.8563;3.1497,1.8711,-2.9602;3.4845,2.7489,-3.5266;2.5486,1.2644,-3.6523;4.0459,1.2839,-2.7013;1.6328,2.277,0.288;1.9932,1.904,1.124;2.8967,-0.64,-0.5787;2.7451,-0.7231,-1.659;1.9368,-0.8058,-0.0798;3.5876,-1.4292,-0.2599;5.9227,0.1648,1.2932;5.847,-0.3951,0.367;7.129,0.4181,1.842;7.1666,1.0458,2.7331;8.4405,-0.0137,1.3536;8.6055,-0.9297,0.2968;7.7353,-1.3627,-0.1876;9.8761,-1.2999,-0.1299;9.9838,-2.0098,-0.9452;11.0121,-0.7687,0.4897;12.0029,-1.0626,0.155;10.8664,0.135,1.5428;11.7428,0.551,2.0316;9.5933,0.5047,1.9711;9.4837,1.2106,2.7911)|\",3.8041516299900007\r\n\"[H]O/C(=N/[C@@]1([H])N([H])N([H])C([H])([H])C1([H])[H])C([H])([H])C1=C([H])C([H])=C(F)C([H])=C1[H] 
|(1.174,4.8422,-0.025;0.2367,4.8789,-0.2802;-0.188,3.6191,-0.5762;-1.3336,3.5236,-1.1138;-1.9463,2.2533,-1.4316;-1.23,1.4198,-1.4616;-2.9901,1.9387,-0.3991;-3.0607,2.7665,0.1929;-4.2734,1.815,-1.047;-4.3794,0.8228,-1.2605;-4.1647,2.5625,-2.3107;-4.921,2.2203,-3.0246;-4.3408,3.6245,-2.1028;-2.7166,2.3391,-2.7735;-2.3202,3.1412,-3.4018;-2.6314,1.3917,-3.3203;0.7815,2.5151,-0.1422;0.6836,2.4054,0.9458;0.4733,1.5622,-0.5779;2.2273,2.8106,-0.4959;3.1437,3.2274,0.48;2.8248,3.3104,1.5166;4.471,3.5155,0.1517;5.1872,3.832,0.9026;4.873,3.3786,-1.1697;6.1518,3.6488,-1.4971;3.9928,2.967,-2.167;4.3477,2.8711,-3.1878;2.6733,2.6884,-1.8217;1.9774,2.37,-2.5937)|\",5.14295177445\r\n\"[H]OC1=C([H])C([H])=C([H])C([H])=C1C([H])([H])C([H])([H])/C(=N\\[C@@]1([H])N([H])N([H])C([H])([H])C1([H])[H])O[H] |(-1.1542,1.9853,-3.1186;-0.5535,1.9311,-2.3589;0.684,1.5317,-2.7935;0.9524,1.2821,-4.1415;0.1628,1.4086,-4.8803;2.2255,0.8685,-4.5352;2.4245,0.6768,-5.5858;3.2313,0.7048,-3.5842;4.2241,0.3822,-3.882;2.9514,0.9571,-2.2391;3.7323,0.828,-1.4936;1.6842,1.3764,-1.8122;1.3709,1.6299,-0.351;0.4358,1.1161,-0.1004;2.1565,1.1821,0.2682;1.1895,3.1208,0.0514;0.6931,3.1571,1.0236;0.5069,3.6003,-0.6612;2.4753,3.9303,0.1117;2.9634,4.5879,1.0807;2.378,4.6264,2.409;1.7277,3.7698,2.6317;3.4448,4.6669,3.4303;4.1989,4.0229,3.1928;3.9868,6.0115,3.3636;4.3385,6.1373,2.404;2.7475,6.8005,3.4394;2.4662,6.9055,4.4917;2.9209,7.7968,3.0227;1.6692,5.9843,2.6631;1.392,6.4509,1.7141;0.7604,5.8443,3.2571;3.1844,3.9764,-1.0564;2.7666,3.3972,-1.7155)|\",4.810972876839999\r\n\"[H]O/C(=N/[C@@]1([H])N([H])N([H])C([H])([H])C1([H])[H])C([H])([H])OC([H])([H])C1=C([H])C([H])=C([H])C([H])=C1[H] 
|(3.6129,-2.2675,-1.822;4.3471,-2.9057,-1.7271;5.3562,-2.2559,-1.1149;6.4359,-2.8849,-0.877;7.5047,-2.1957,-0.1813;7.5864,-1.1396,-0.4813;7.2453,-2.2253,1.2998;6.606,-3.0075,1.4523;8.4647,-2.5932,1.9675;8.9797,-1.7235,2.1064;9.2046,-3.4463,1.0167;10.2723,-3.4484,1.2594;8.8324,-4.4726,1.122;8.8782,-2.8917,-0.3837;8.8018,-3.6666,-1.1505;9.6349,-2.1651,-0.7015;5.0497,-0.7958,-0.7647;5.7807,-0.1329,-1.2546;5.1494,-0.6533,0.3204;3.7359,-0.5044,-1.208;3.3268,0.8449,-0.9716;3.3229,1.0475,0.1094;4.0613,1.5236,-1.4344;1.956,1.0549,-1.5615;1.7642,0.9467,-2.9457;2.6099,0.7067,-3.5855;0.5009,1.1374,-3.5019;0.3638,1.05,-4.5763;-0.5865,1.446,-2.6797;-1.571,1.5989,-3.1136;-0.4043,1.557,-1.3013;-1.2462,1.7938,-0.6565;0.8615,1.3573,-0.7459;0.9995,1.4369,0.3299)|\",5.597381904785\r\n\"[H]/C1=C(\\C([H])([H])C([H])([H])[H])SC2NC(C3([H])C([H])([H])C3([H])[H])=NC(=O)[C@]21[H] |(3.757,0.3896,3.5391;3.7475,-0.425,2.8232;3.8342,-0.2448,1.4982;3.9363,1.0444,0.7347;4.1377,1.8433,1.4583;4.8045,1.0041,0.063;2.6768,1.3872,-0.0806;2.8134,2.3368,-0.609;1.8023,1.482,0.5719;2.4567,0.6168,-0.8281;3.8369,-1.7974,0.5727;3.8069,-2.7291,2.063;4.0288,-3.9941,2.1309;4.2053,-4.489,3.4379;4.198,-5.9605,3.5213;4.0109,-6.4553,2.5749;3.6795,-6.6262,4.7942;3.3323,-5.9522,5.5709;3.0978,-7.5347,4.6648;5.1389,-6.6516,4.5079;5.598,-7.5773,4.1729;5.7757,-5.9893,5.0854;4.4437,-3.7865,4.5138;4.3931,-2.3958,4.4497;4.8598,-1.6648,5.299;3.5831,-1.8436,3.2608;2.5342,-2.0121,3.576)|\",4.01095815637\r\n\"[H]C([H])([H])C1=C(C([H])([H])C([H])([H])[H])[C@@]2([H])C(NC(C3([H])C([H])([H])C3([H])[H])=NC2=O)S1 
|(5.0958,-2.8741,-0.6666;4.5754,-2.79,0.2901;4.2751,-3.7987,0.6007;5.2927,-2.4208,1.0336;3.3878,-1.8815,0.1835;2.9043,-1.1991,-0.8725;3.4692,-1.2023,-2.2706;3.6854,-0.1656,-2.55;4.4163,-1.7509,-2.2881;2.5125,-1.7995,-3.3186;2.9876,-1.804,-4.3055;1.5958,-1.2057,-3.4023;2.2355,-2.8308,-3.0706;1.6242,-0.4599,-0.5664;0.7845,-1.0017,-1.0441;1.3605,-0.5274,0.9145;0.4914,0.1946,1.5337;-0.1114,1.1858,0.7426;-1.2813,1.8226,1.3726;-1.552,1.396,2.3318;-2.4013,2.3621,0.4868;-2.2643,2.2295,-0.5818;-3.4156,2.2263,0.8514;-1.5035,3.3232,1.1815;-1.8771,3.8696,2.0427;-0.7546,3.834,0.585;0.3106,1.6223,-0.4158;1.3872,0.9934,-1.0328;2.0268,1.5063,-1.9304;2.4461,-1.646,1.7053)|\",3.8476898460700006\r\n\"[H]C1S[C@]2([H])C(=O)N/C(C3=C([H])N(C([H])([H])[H])N=C3[H])=N\\C2=C1[H] |(7.9953,2.49,-0.2695;8.1603,3.5198,-0.568;9.0323,3.8148,-2.0526;8.7392,5.6165,-1.8613;7.949,5.8776,-2.5823;9.897,6.5793,-2.1573;10.8255,6.2623,-2.8744;9.7222,7.8448,-1.6213;8.8624,8.018,-0.6404;8.6466,9.3598,-0.1451;7.8394,9.7457,0.9236;7.235,9.1529,1.5935;7.9338,11.0858,1.0187;7.2737,11.9705,1.9606;6.7207,11.3681,2.6838;6.583,12.6342,1.4325;8.0206,12.5749,2.4814;8.7595,11.6219,0.0727;9.1898,10.5824,-0.6263;9.8754,10.7166,-1.4512;8.1664,7.0239,0.0657;8.186,5.8427,-0.4734;7.7731,4.612,0.1426;7.2794,4.5755,1.1062)|\",3.76061341391\r\n\"[H]C1S[C@]2([H])C(=O)N/C(C3=C([H])SC([H])=C3[H])=N\\C2=C1[H] |(-0.7818,4.9856,-0.2481;-1.6817,5.5114,-0.5488;-1.6343,6.4412,-2.0266;-3.4218,6.8124,-1.8445;-3.9455,6.1748,-2.574;-3.9045,8.2392,-2.1287;-3.2731,9.0073,-2.826;-5.159,8.5182,-1.6033;-5.6327,7.7713,-0.6313;-6.9834,8.052,-0.1484;-7.5689,7.3808,0.9046;-7.1217,6.6016,1.5041;-9.1695,7.9371,1.205;-9.0651,9.0941,-0.0975;-9.9059,9.7387,-0.315;-7.8556,9.0413,-0.7192;-7.5603,9.6678,-1.5513;-4.956,6.7666,0.0709;-3.8416,6.3664,-0.4625;-2.8448,5.5371,0.1548;-2.9943,5.0558,1.1137)|\",3.687142674275\r\n\"[H]OC1=N[C@]([H])(C2=NC([H])N([H])N2)N([H])[C@]2([H])C([H])([H])C([H])([H])[C@@]([H])(F)C([H])([H])[C@@]12[H] 
|(0.8912,-0.7022,-1.4672;0.5924,-0.8113,-0.5522;0.2073,0.4014,-0.0569;-0.1998,0.4251,1.1423;-0.6481,1.689,1.7374;0.0722,1.9231,2.5318;-1.9845,1.4612,2.3968;-3.1504,1.4301,1.6796;-4.0599,1.1899,2.6048;-5.1262,1.0896,2.4529;-3.4657,1.0838,3.8125;-3.8723,0.9038,4.7193;-2.124,1.2612,3.6981;-0.7685,2.8542,0.8562;-1.6756,2.8089,0.3892;0.2974,2.8813,-0.135;1.2477,2.8956,0.4212;0.2448,4.1344,-1.0116;-0.716,4.1633,-1.5476;0.2759,5.0248,-0.3752;1.4086,4.1458,-2.017;2.3644,4.2301,-1.4827;1.3405,5.0078,-2.6902;1.4417,2.8633,-2.8417;0.5631,2.8208,-3.5018;2.5699,2.8759,-3.6681;1.4872,1.6046,-1.9747;2.434,1.5913,-1.4195;1.4971,0.7372,-2.6494;0.2912,1.5981,-1.0008;-0.635,1.6038,-1.5999)|\",6.040927481100001\r\n\"[H]C1C2=C(/N=C(/C([H])([H])C([H])([H])C([H])([H])[H])NC2=O)C([H])=C([H])[C@@]1([H])F |(9.2107,-2.1281,1.0264;8.6434,-1.208,0.9082;7.2972,-1.242,0.9199;6.5162,-0.0073,0.8221;5.2155,-0.0105,0.8347;4.5816,-1.2657,0.935;3.0818,-1.1775,0.8766;2.6651,-2.1502,1.1517;2.753,-0.4331,1.6145;2.577,-0.7484,-0.5177;3.0501,0.2029,-0.7888;2.9049,-1.49,-1.2578;1.0527,-0.6093,-0.5637;0.7145,-0.3149,-1.5635;0.7022,0.1507,0.1453;0.5594,-1.5549,-0.309;5.136,-2.436,1.0533;6.5356,-2.5233,1.0794;7.1071,-3.5925,1.2147;7.2325,1.2655,0.7641;6.6254,2.1654,0.7693;8.5736,1.3045,0.7415;9.1189,2.2441,0.7258;9.4048,0.0576,0.6703;9.8462,-0.0063,-0.341;10.4814,0.1438,1.5573)|\",3.344279222645\r\n\"[H]OC1=N[C@]([H])(C2=NC([H])=C([H])C([H])=C2[H])N([H])[C@@]2([H])C([H])=C([H])S[C@@]12[H] 
|(0.7006,6.6688,1.6022;0.326,6.1399,0.88;0.9389,4.9227,0.8396;0.5406,4.0983,-0.0354;1.2061,2.8164,-0.1485;1.3876,2.6582,-1.2218;0.2697,1.677,0.3003;0.8852,0.5137,0.5631;0.1315,-0.5346,0.9152;0.6676,-1.4596,1.1227;-1.2581,-0.4831,1.019;-1.8209,-1.3652,1.3088;-1.8922,0.7277,0.7411;-2.9736,0.8147,0.8096;-1.1193,1.828,0.375;-1.565,2.7899,0.149;2.5281,2.7747,0.485;2.8492,1.8077,0.4494;2.5051,3.2242,1.8796;1.8475,2.6057,2.5166;3.9109,3.2833,2.434;4.4278,2.3882,2.7665;4.4836,4.4895,2.394;5.5035,4.7277,2.6762;3.4645,5.8041,1.7818;1.9774,4.6723,1.9051;1.5283,4.8712,2.8865)|\",5.053154203785001\r\n\"[H]C1S[C@]2([H])C(=O)N/C(C3=C([H])C([H])=C([H])S3)=N\\C2=C1[H] |(3.5875,-3.4614,-0.356;4.6205,-3.3218,-0.6563;5.1767,-4.1457,-2.0914;6.8003,-3.304,-1.9386;6.8072,-2.5134,-2.7051;8.0735,-4.1241,-2.1882;8.0603,-5.1347,-2.8629;9.2221,-3.5501,-1.6681;9.1142,-2.6263,-0.737;10.3283,-1.9932,-0.2581;11.6184,-2.1813,-0.7151;11.8437,-2.8771,-1.5142;12.5672,-1.383,-0.032;13.6305,-1.3879,-0.2443;11.9898,-0.5964,0.9383;12.4838,0.1014,1.6022;10.2817,-0.8181,1.0378;7.9549,-2.2209,-0.0675;6.8368,-2.631,-0.5874;5.5373,-2.5774,0.0187;5.3458,-2.0628,0.9524)|\",3.621835350155\r\n\"[H]O/C(=N/C([H])([H])C([H])([H])NC1=C([H])N=C([H])C([H])N1[H])[C@@]1([H])N([H])N([H])C([H])([H])C1([H])[H] 
|(0.5353,-0.0767,6.8019;0.402,0.8577,6.4866;1.0738,1.0176,5.3233;0.9931,2.1465,4.7414;1.6999,2.4326,3.4926;2.7817,2.2523,3.5831;1.325,1.788,2.6823;1.4866,3.8979,3.069;0.4042,4.1074,3.1472;1.7472,3.9665,2.005;2.2848,4.8943,3.7641;1.9398,5.3017,4.941;2.6291,6.4369,5.5597;3.4623,6.8493,4.9929;2.3188,6.9761,6.6986;1.265,6.4298,7.4082;1.0101,6.9009,8.3508;0.5977,5.3356,6.955;-0.2002,4.8497,7.5072;0.9206,4.7798,5.7475;0.5806,3.8391,5.4997;1.8546,-0.2111,4.8475;2.5954,0.0904,4.1062;2.558,-0.9145,5.9482;2.8687,-0.228,6.6401;1.4323,-1.6384,6.5868;1.8535,-2.2921,7.2477;0.8675,-2.3699,5.4327;1.4879,-3.2341,5.1644;-0.1401,-2.7137,5.6791;0.8993,-1.3161,4.3008;-0.0927,-0.9094,4.0886;1.2891,-1.7471,3.3759)|\",4.15789963564\r\n\"[H]OC(=N/C([H])([H])[H])/C1=C(\\C([H])([H])[H])[C@@]2([H])C(NC(C([H])([H])[H])=NC2=O)S1 |(0.5898,3.3049,-1.2651;0.1319,2.4863,-1.5135;1.0633,1.572,-1.9298;0.6357,0.4954,-2.4347;1.535,-0.5716,-2.8218;2.5862,-0.4316,-2.5306;1.491,-0.7069,-3.9096;1.1776,-1.506,-2.3733;2.4815,1.9993,-1.7038;3.083,2.2122,-0.5148;2.461,1.9978,0.8345;2.9479,1.1607,1.3531;1.3939,1.7767,0.7572;2.6101,2.8861,1.4565;4.5349,2.5826,-0.6486;5.1523,1.7269,-0.3129;4.8524,2.8009,-2.109;5.9267,3.36,-2.5392;6.7628,3.8754,-1.5288;8.1425,4.2203,-1.9969;8.7509,4.5822,-1.1675;8.0845,4.9871,-2.7787;8.609,3.3406,-2.4563;6.4172,4.1357,-0.3025;5.1358,3.773,0.1295;4.5848,4.2965,1.0738;3.5215,2.3093,-3.141)|\",3.891228062150001\r\n\"[H]OC(=N/C([H])([H])[H])/C([H])=C(\\[H])C1=C([H])C([H])=C([H])C([H])=C1Br 
|(2.6817,1.8361,-0.8132;2.4064,1.3676,-0.0097;2.5083,0.0172,-0.2327;2.6055,-0.7122,0.8037;2.6037,-2.1554,0.7366;1.7542,-2.5342,1.3195;3.5105,-2.5344,1.2244;2.5395,-2.6025,-0.2672;2.4699,-0.4101,-1.6495;2.9336,-1.3652,-1.8735;1.9063,0.318,-2.6331;1.4472,1.2691,-2.3774;1.8279,-0.0538,-4.0513;1.9865,-1.3911,-4.4703;2.1466,-2.1561,-3.7172;1.914,-1.7588,-5.8079;2.037,-2.8009,-6.0876;1.6698,-0.7904,-6.7841;1.606,-1.0652,-7.8328;1.4982,0.5408,-6.4098;1.3085,1.3037,-7.1565;1.5757,0.8939,-5.0637;1.3438,2.7623,-4.6463)|\",4.49532081026\r\n\"[H]OC(=N/C([H])([H])[H])/C([H])=C(\\[H])C1=C([H])C(Cl)=C(C([H])([H])[H])C([H])=C1[H] |(8.633,-0.3932,-1.75;9.3443,-0.3052,-1.0964;8.9465,0.6072,-0.1484;9.8785,1.1758,0.5024;9.6115,2.061,1.6132;10.1149,3.0189,1.4329;10.0632,1.6363,2.5191;8.5533,2.2629,1.8372;7.485,0.7834,-0.0023;7.1542,1.7263,0.4213;6.5825,-0.1605,-0.3373;6.951,-1.1232,-0.6915;5.1228,-0.0722,-0.2365;4.4466,1.1267,0.0506;4.9936,2.0516,0.1943;3.0612,1.1452,0.1407;2.2783,2.6835,0.503;2.2733,-0.0033,-0.05;0.7709,0.0236,0.047;0.3528,-0.9672,-0.1516;0.3367,0.7302,-0.6697;0.4411,0.3429,1.0427;2.9626,-1.187,-0.3425;2.3896,-2.0971,-0.4999;4.3502,-1.2274,-0.438;4.8455,-2.1681,-0.6647)|\",4.432734624645\r\n\"[H]O/C(=N/C1([H])C([H])([H])C1([H])[H])C([H])([H])O/N=C(\\[H])C1=C([H])C([H])=C(F)C([H])=C1[H] 
|(2.9282,5.5543,-1.1279;3.0069,6.4804,-0.8364;1.7586,6.8909,-0.491;1.5908,8.0997,-0.1545;0.3245,8.6007,0.2568;-0.5393,7.9323,0.26;0.035,10.0505,-0.0716;0.8052,10.577,-0.626;-0.9854,10.3192,-0.3315;0.3386,9.6539,1.3476;-0.4708,9.6464,2.0729;1.3132,9.9133,1.7485;0.7181,5.7668,-0.4963;-0.2452,6.1141,-0.8739;0.5856,5.3732,0.5175;1.1539,4.6417,-1.2688;0.8466,4.864,-2.6324;1.1908,3.8506,-3.3382;1.6494,2.9794,-2.8626;0.9841,3.8165,-4.7869;0.4259,4.902,-5.4884;0.1425,5.7952,-4.9418;0.2407,4.8352,-6.8632;-0.1866,5.6622,-7.4205;0.6176,3.6747,-7.5373;0.4377,3.6096,-8.8701;1.1739,2.5852,-6.8783;1.4558,1.7021,-7.4415;1.3547,2.6648,-5.4992;1.7881,1.8212,-4.968)|\",4.49532081026\r\n\"[H]OC(NC1=C([H])C([H])=C([H])N=C1[H])N1C([H])([H])C([H])([H])C([H])([H])[C@@]1([H])C([H])([H])O[H] |(-1.459,-0.2529,0.3918;-2.395,0.0107,0.4465;-2.9746,-0.6073,-0.6223;-2.1829,-1.2418,-1.4174;-2.5089,-2.1847,-2.3849;-3.2683,-3.3415,-2.1322;-3.6906,-3.503,-1.1434;-3.4494,-4.2738,-3.1513;-4.032,-5.1752,-2.981;-2.8658,-4.0371,-4.396;-2.9914,-4.7452,-5.2131;-2.1213,-2.9538,-4.657;-1.9477,-2.0692,-3.6743;-1.3229,-1.204,-3.893;-4.3169,-0.3762,-0.6488;-5.0132,0.3021,0.4676;-5.099,-0.3822,1.3223;-4.4524,1.1776,0.8005;-6.388,0.6544,-0.1157;-7.1694,0.6543,0.65;-6.3653,1.6552,-0.5616;-6.6076,-0.4122,-1.1997;-7.3276,-0.1219,-1.9681;-6.9559,-1.3471,-0.7447;-5.207,-0.624,-1.7995;-5.0913,-1.643,-2.1742;-4.9011,0.3322,-2.9609;-3.8634,0.1824,-3.2869;-5.0008,1.3772,-2.6236;-5.824,0.0292,-3.9995;-5.5015,0.431,-4.8192)|\",5.3334314698\r\n\"[H]C1C2=C(/N=C(/C([H])([H])OC([H])([H])C([H])([H])C([H])([H])[H])NC2=O)C([H])=C([H])[C@@]1([H])F 
|(11.6119,0.7938,3.2168;11.0033,1.5727,2.7637;9.6593,1.4942,2.8133;8.8263,2.569,2.2689;7.5257,2.5236,2.306;6.9564,1.3742,2.8703;5.4345,1.3711,2.8716;5.1025,2.0089,3.7015;5.0976,0.3447,3.0737;4.8468,1.9072,1.7118;4.9915,1.0837,0.5593;6.0584,0.9404,0.3206;4.5595,0.0885,0.7565;4.2801,1.7596,-0.6063;4.3084,1.0747,-1.4644;3.2235,1.8873,-0.3385;4.8926,3.1114,-0.9857;4.3443,3.5745,-1.8136;5.9379,2.9953,-1.2986;4.874,3.7955,-0.1326;7.5513,0.3571,3.4177;8.9544,0.3493,3.4754;9.568,-0.5521,4.0207;9.4871,3.7529,1.7238;8.8412,4.5645,1.4039;10.8251,3.8285,1.6563;11.3309,4.71,1.2721;11.7065,2.6772,2.0409;12.134,2.2477,1.1162;12.7918,3.1304,2.7954)|\",3.229991405435\r\n\"[H]C1C2=C(/N=C(/SC([H])([H])[H])NC2=O)C([H])=C([H])[C@@]1([H])F |(4.9595,1.5891,5.8842;4.5177,2.3167,5.208;4.3273,1.999,3.9137;3.6829,2.946,3.0002;3.4854,2.6705,1.7434;3.9377,1.4152,1.3179;3.6333,1.2363,-0.4034;4.2984,-0.4344,-0.7062;4.1282,-0.6436,-1.7651;3.7793,-1.1661,-0.0853;5.3656,-0.4655,-0.4804;4.4986,0.4595,1.9963;4.7244,0.6611,3.3601;5.2288,-0.1965,4.0665;3.2074,4.2194,3.5358;2.6694,4.8651,2.8489;3.4034,4.5427,4.823;3.031,5.4738,5.2414;4.2006,3.674,5.751;5.1583,4.1827,5.9635;3.5449,3.5568,6.9804)|\",3.469451593875001\r\n\"[H]C1S[C@]2([H])C(=O)N/C(C([H])([H])OC([H])([H])C([H])([H])C([H])([H])[H])=N\\C2=C1[H] 
|(10.0294,2.404,-4.1279;10.6144,2.6767,-3.2561;11.5108,4.1747,-3.3064;12.0091,3.9315,-1.5594;11.409,4.633,-0.9581;13.464,4.1964,-1.1685;14.2167,4.8696,-1.8377;13.8001,3.6668,0.0831;13.093,2.677,0.5549;13.4694,2.1391,1.9159;14.1964,2.8198,2.3797;13.9599,1.1592,1.7724;12.3006,2.0031,2.6927;12.5395,1.372,3.9415;12.9698,0.3681,3.7793;13.2756,1.9504,4.5269;11.2212,1.2709,4.6975;10.8075,2.2803,4.8129;10.5094,0.7084,4.0811;11.3848,0.6035,6.0665;10.4253,0.5442,6.591;12.0776,1.1641,6.706;11.7736,-0.4174,5.9693;12.0851,1.9638,-0.0898;11.6088,2.5259,-1.16;10.7335,1.928,-2.1273;10.2769,0.9567,-1.9812)|\",3.744286582880001\r\n\"[H]C([H])([H])/C1=C(\\C([H])([H])C([H])([H])[H])[C@@]2([H])C(NC(C([H])([H])C([H])([H])[H])=NC2=O)S1 |(0.9415,-3.2141,2.1753;1.7355,-3.8204,1.7331;2.0819,-4.5295,2.4958;1.2963,-4.4092,0.9178;2.8604,-2.963,1.2362;2.9984,-1.6229,1.2352;2.0091,-0.6212,1.7747;1.7423,0.0612,0.96;1.0929,-1.1326,2.0866;2.5527,0.213,2.9493;1.7791,0.8964,3.3154;3.4075,0.8256,2.6414;2.8666,-0.4236,3.7847;4.3319,-1.1641,0.6972;4.9558,-0.8149,1.5433;5.0594,-2.3475,0.1101;6.0993,-2.259,-0.6436;6.4571,-0.9462,-0.9987;7.8242,-0.8259,-1.6162;7.8272,-1.4276,-2.5356;7.9813,0.2187,-1.8959;8.9409,-1.3319,-0.6871;9.9078,-1.2833,-1.1987;8.7594,-2.3698,-0.3923;9.009,-0.7216,0.2205;5.7049,0.1125,-0.9088;4.4452,0,-0.306;3.555,0.803,-0.4969;4.2506,-3.8475,0.5006)|\",3.85857440009\r\n\"[H]O/C(=N/C([H])([H])C([H])([H])[H])[C@]1([H])C(=O)N=C(C(F)(F)F)C([H])=C1[H] 
|(3.9055,0.9329,2.6399;3.4628,0.0559,2.6265;4.01,-0.6847,1.6422;3.888,-1.9428,1.6956;4.4072,-2.8586,0.7001;3.5773,-3.1779,0.0514;5.1732,-2.4226,0.0337;4.9965,-4.0924,1.3894;5.3343,-4.8269,0.6492;4.2454,-4.5589,2.0337;5.8511,-3.8139,2.0151;4.6666,0.1015,0.4716;5.628,-0.4046,0.2638;5.0408,1.5588,0.7409;5.0494,2.0555,1.8552;5.4181,2.3443,-0.3647;4.8602,2.0676,-1.4921;5.2246,2.9341,-2.6936;5.8484,2.1752,-3.6219;4.0956,3.4171,-3.2581;6.016,3.955,-2.3781;3.9353,0.9641,-1.7345;3.3905,0.916,-2.6705;3.8561,0.0023,-0.8004;3.2452,-0.8829,-0.9479)|\",3.60278738062\r\n\"[H]OC1=NC([H])([H])C([H])([H])C([H])([H])[C@@]1([H])/C(=N\\C([H])([H])C1=C([H])C([H])=NC([H])=C1[H])O[H] |(1.5306,1.1809,1.3855;2.1407,1.6082,2.0088;2.9295,0.667,2.6067;3.7068,1.0623,3.5241;4.631,0.1258,4.1565;5.5766,0.6584,4.3127;4.2475,-0.1004,5.1629;4.8737,-1.1738,3.3807;5.494,-0.9665,2.498;5.4294,-1.8848,4.003;3.5383,-1.7747,2.9407;2.9372,-2.0225,3.8256;3.6721,-2.7075,2.3828;2.7707,-0.759,2.0622;3.2052,-0.7608,1.0559;1.2889,-1.1225,1.9566;0.631,-1.563,0.9673;1.2213,-1.8734,-0.3148;0.7548,-1.2141,-1.0626;2.3064,-1.6917,-0.3834;0.9525,-3.3121,-0.7334;-0.0182,-4.1015,-0.1138;-0.5966,-3.7028,0.7121;-0.2207,-5.4039,-0.5738;-0.9723,-6.0352,-0.1018;0.4596,-5.9596,-1.5839;1.3875,-5.1933,-2.1725;1.9385,-5.6536,-2.9916;1.67,-3.8832,-1.7899;2.4423,-3.3203,-2.3097;0.5834,-0.9528,3.1142;1.1295,-0.4901,3.7711)|\",6.20963806841\r\n\"[H]OC([H])([H])C([H])([H])OC(=O)[C@]1([H])N=NC(=O)C2=C1C([H])=C([H])C([H])=C2[H] 
|(-0.5039,2.1732,3.6336;-0.0377,2.9971,3.4148;-0.9911,4.0323,3.2872;-1.3153,4.428,4.2656;-0.4852,4.8453,2.7556;-2.2465,3.6202,2.5261;-2.8781,4.4881,2.3212;-2.8235,2.8717,3.0761;-1.9114,3.079,1.2232;-1.6669,1.7633,1.1727;-1.724,0.9937,2.1077;-1.3167,1.3173,-0.2554;-1.401,2.1782,-0.9203;-2.4305,0.3799,-0.6626;-2.332,-0.8346,-0.4281;-1.1195,-1.4051,0.2094;-1.2336,-2.479,0.753;0.1366,-0.6654,0.0053;0.0513,0.7018,-0.2842;1.2153,1.4302,-0.539;1.1564,2.493,-0.7575;2.4513,0.7851,-0.5029;3.3568,1.3506,-0.7041;2.5356,-0.5818,-0.2072;3.5052,-1.0698,-0.1731;1.3788,-1.3115,0.0447;1.4127,-2.3718,0.2746)|\",3.578297134075\r\n\"[H]O/C1=N/C(=O)N(C([H])([H])[H])[C@]2([H])N([H])C([H])([H])[C@@]([H])(N([H])[H])C([H])([H])[C@@]12[H] |(6.2411,-0.7555,0.8133;5.3067,-0.8097,1.0688;4.7682,0.4273,1.0893;3.5374,0.523,1.4088;2.9124,1.7917,1.4472;1.6986,1.8727,1.5164;3.7133,2.9421,1.4268;3.0792,4.1699,1.9149;3.6495,5.0337,1.5756;3.0322,4.1901,3.0138;2.063,4.2026,1.5256;5.1553,2.8119,1.5682;5.4346,2.6244,2.6231;5.8293,4.0485,1.1898;5.5846,4.2703,0.2231;7.2837,3.926,1.3246;7.7412,4.8676,0.9982;7.515,3.8127,2.3942;7.9043,2.7379,0.5512;7.7711,2.9293,-0.524;9.3389,2.5408,0.7685;9.5286,2.4521,1.7675;9.8562,3.3607,0.4537;7.1547,1.4365,0.8888;7.3837,1.1474,1.9258;7.5525,0.6469,0.237;5.6354,1.6136,0.7235;5.4151,1.8574,-0.3283)|\",5.684458336944999\r\n\"[H]OC(=O)[C@]1([H])N([H])C([H])([H])C([H])([H])[C@@]([H])(C(F)(F)F)C1([H])[H] 
|(-1.8414,-0.6545,2.5413;-0.994,-0.6902,3.0149;0.0335,-0.8594,2.1409;1.163,-0.9544,2.5529;-0.3372,-0.9966,0.6411;-0.8563,-1.9647,0.5636;0.8316,-1.0561,-0.2252;1.4664,-1.7633,0.1389;1.5588,0.2187,-0.33;1.9619,0.5543,0.6389;2.4094,0.0567,-1.0006;0.6449,1.3004,-0.9066;0.3436,1.0253,-1.9241;1.1801,2.2539,-0.9606;-0.611,1.4471,-0.0303;-0.315,1.817,0.961;-1.5656,2.4894,-0.5813;-0.9701,3.6971,-0.6892;-2.0308,2.1584,-1.809;-2.6481,2.6475,0.2199;-1.3087,0.0856,0.1238;-1.6747,-0.2403,-0.8551;-2.1894,0.1872,0.7695)|\",6.41644459479\r\n\"[H]OC(=O)[C@@]1([H])N([H])[C@@]2([H])[C@@]([H])(OC1([H])[H])[C@]2([H])C(F)(F)F |(1.1342,2.9726,-0.8222;0.3996,3.6185,-0.7306;-0.699,2.9259,-0.3667;-1.762,3.4544,-0.1554;-0.4706,1.4146,-0.2814;-0.5969,1.0262,-1.3023;0.9305,1.1597,0.1201;1.0833,1.656,1.0012;1.174,-0.2442,0.4154;2.0689,-0.3887,1.0132;0.0196,-1.1853,0.6685;0.1079,-1.9436,1.4437;-1.2917,-0.728,0.4855;-1.4658,0.6862,0.6188;-2.4939,0.8947,0.321;-1.3326,1.0044,1.6645;0.8729,-1.3307,-0.5845;0.3899,-1.0661,-1.5198;1.828,-2.4729,-0.7228;2.3697,-2.81,0.4757;1.2209,-3.5728,-1.2143;2.8486,-2.1716,-1.555)|\",6.8001251239950005\r\n\"[H]C([H])([H])OC(=O)[C@]1([H])N=NC(=O)C2=NOC(C3([H])C([H])([H])C3([H])[H])=C21 |(5.2126,2.3845,1.1659;4.8736,1.5243,0.5842;5.6931,1.101,0.0042;4.4462,0.778,1.2579;3.8917,1.9398,-0.3905;2.7802,2.491,0.116;2.5465,2.6514,1.2903;1.8019,2.8987,-1.0018;2.2762,2.7393,-1.9712;1.6712,4.4099,-0.8674;0.7695,4.9243,-0.189;-0.2823,4.1323,0.5181;-0.8635,4.6604,1.429;-0.5136,2.8013,-0.0595;-1.6141,2.0827,0.0034;-1.3165,0.9524,-0.7659;-0.0558,1.0224,-1.2615;0.4265,-0.0896,-2.0846;1.4496,0.0441,-2.4241;-0.5213,-0.8254,-3.0225;-1.5552,-0.4945,-3.0348;-0.1241,-1.1193,-3.9896;-0.0195,-1.5162,-1.7935;0.7296,-2.2954,-1.8978;-0.7116,-1.6548,-0.9687;0.501,2.2024,-0.8444)|\",3.5184320869650003\r\n\"[H]C1S[C@@]2([H])C([H])([H])C([H])([H])[C@@]([H])(Cl)C([H])([H])[C@]2([H])C(=O)C1C([H])([H])[H] 
|(2.4445,1.6292,1.6526;3.1241,0.8338,1.3441;4.4503,1.4371,0.2992;3.1582,1.7358,-1.0621;2.586,2.6062,-0.7124;3.8599,2.1025,-2.3782;4.5528,2.9353,-2.213;4.4541,1.2507,-2.7284;2.8307,2.5015,-3.45;3.332,2.7143,-4.3996;2.3243,3.428,-3.1413;1.74,1.45,-3.6555;0.9743,1.8235,-4.3385;2.4341,-0.0127,-4.5419;1.0978,1.0061,-2.3385;0.3806,0.1998,-2.5157;0.5216,1.8522,-1.9413;2.1396,0.5685,-1.2972;2.7045,-0.2781,-1.7012;1.4243,0.1065,0.0395;0.26,0.325,0.2786;2.5874,-0.334,0.8844;3.3673,-1.5703,0.5495;4.4467,-1.403,0.6101;3.1178,-1.9888,-0.431;3.0997,-2.3382,1.2901)|\",4.2749085913550005\r\n\"[H]C1=C([H])C2=C(C([H])=C1[H])C([H])=C(C1([H])N([H])N([H])N([H])N1[H])O2 |(2.7696,-0.9153,1.3328;1.9121,-0.4862,0.8224;0.6572,-1.0725,0.9853;0.5037,-1.9475,1.6083;-0.3961,-0.4759,0.3042;-0.256,0.6598,-0.5164;1.0172,1.2314,-0.6638;1.1639,2.1087,-1.2877;2.0879,0.6495,0.0094;3.0809,1.0782,-0.0934;-1.5801,0.9383,-1.0174;-1.8954,1.7289,-1.6817;-2.3943,-0.0081,-0.4852;-3.8586,-0.2723,-0.6223;-4.0241,-1.258,-1.0721;-4.5186,0.7507,-1.4796;-5.2154,0.2261,-2.0202;-5.2344,1.5478,-0.513;-6.0401,1.9284,-1.0183;-5.778,0.4449,0.3626;-6.0879,0.8998,1.2263;-4.5481,-0.2219,0.6991;-3.9871,0.4277,1.2647;-1.7037,-0.888,0.3238)|\",5.25723959166\r\n\"[H]C1=C([H])C(C(F)(F)F)=C([C@@]2([H])N([H])N([H])[C@]([H])(N([H])[H])C2([H])[H])C([H])=C1[H] |(-1.2564,-2.2443,0.5704;-0.7033,-1.3343,0.3575;0.6863,-1.3684,0.2711;1.2165,-2.3016,0.4204;1.4128,-0.2061,-0.0074;2.9179,-0.3047,-0.0738;3.3614,-1.5503,0.1965;3.4045,0.0274,-1.293;3.5203,0.5238,0.8203;0.7525,1.0246,-0.2039;1.4828,2.3196,-0.5094;2.544,2.1188,-0.6674;0.9914,2.9781,-1.7428;-0.0263,2.9204,-1.7608;1.3046,4.3816,-1.6135;2.0916,4.5834,-2.2277;1.7372,4.6888,-0.2258;1.1908,5.5637,0.1471;3.1534,5.053,-0.2256;3.7393,4.2196,-0.2863;3.3998,5.5272,0.6414;1.3514,3.419,0.567;2.0004,3.2592,1.4339;0.3158,3.4784,0.9233;-0.645,1.0342,-0.0961;-1.1829,1.9706,-0.2197;-1.3702,-0.1247,0.1749;-2.4533,-0.0784,0.2466)|\",4.94158752508\r\n\"[H]C1=C([H])N2C(=S)N=NC(=O)C2=C1[H] 
|(-3.5132,-0.0985,-0.0674;-2.4339,-0.0544,-0.0281;-1.5956,-1.1401,0.0239;-1.7892,-2.2012,0.0355;-0.2901,-0.6719,0.0625;0.8897,-1.3825,0.1198;1.0124,-3.0159,0.1568;2.1331,-0.6218,0.1494;2.1678,0.6306,0.1283;0.933,1.4454,0.0694;1.0658,2.6511,0.0539;-0.3161,0.7299,0.0355;-1.6344,1.1262,-0.0211;-1.9726,2.1524,-0.054)|\",2.533379948155\r\n\"[H]OC1=NC(=O)N(C([H])([H])C([H])([H])[H])[C@@]([H])(N([H])[H])[C@@]1([H])N([H])C([H])([H])C([H])([H])[H] |(3.586,-1.1138,-1.0503;4.4477,-0.7821,-1.4221;5.066,-0.2676,-0.3637;6.1804,0.3503,-0.4822;6.7762,0.8829,0.6822;7.9651,1.1596,0.7124;5.9588,1.1265,1.7879;6.5352,1.8384,2.932;7.2842,2.5341,2.5474;5.7356,2.4305,3.3963;7.1781,0.904,3.9593;7.5841,1.4791,4.7999;7.9973,0.3501,3.4924;6.4534,0.1848,4.3603;4.5335,0.8207,1.7858;4.2282,0.6297,2.8199;3.6419,1.8666,1.2735;4.0255,2.2507,0.4095;3.5849,2.6389,1.9339;4.3306,-0.4679,0.964;4.8211,-1.2948,1.4921;2.9378,-0.8255,0.6975;2.4122,0.0528,0.674;2.3201,-1.7446,1.6666;2.9076,-2.6706,1.6687;2.3522,-1.343,2.6936;0.8761,-2.0454,1.2762;0.4264,-2.7516,1.9819;0.8284,-2.4791,0.272;0.266,-1.1341,1.2823)|\",5.864053478275\r\n\"[H]C([H])([H])OC(=O)[C@]1([H])N=NC(=O)C2=NOC(C([H])([H])[H])=C21 |(0.8031,-4.8397,2.5541;1.7472,-4.2985,2.6052;1.6872,-3.4747,3.3203;2.5635,-4.9657,2.8908;1.9676,-3.7876,1.2719;3.0879,-3.0706,1.1132;3.892,-2.821,1.9798;3.2351,-2.5785,-0.3387;2.4322,-2.999,-0.9456;4.5086,-3.2454,-0.8451;5.6106,-2.6845,-0.7557;5.7926,-1.309,-0.1969;6.8872,-1.0044,0.1964;4.5861,-0.474,-0.2752;4.5204,0.8392,-0.3334;3.1586,1.1005,-0.4788;2.4458,-0.0534,-0.5035;0.969,0.0649,-0.6431;0.5128,-0.9276,-0.683;0.7052,0.6094,-1.5563;0.5391,0.6098,0.205;3.3185,-1.0976,-0.3862)|\",3.534758917995\r\n\"[H]C1=C([H])C(/C([H])=C(\\[H])C2=C(N([H])[H])C(C([H])([H])[H])=NO2)=C([H])C([H])=C1F 
|(9.184,-6.3204,0.3974;8.8869,-5.2782,0.4501;7.564,-4.8856,0.2629;6.8099,-5.6412,0.0586;7.1756,-3.5333,0.3324;5.7695,-3.1944,0.1234;5.1181,-4.0332,-0.1137;5.2044,-1.9676,0.1923;5.8116,-1.0956,0.4281;3.8112,-1.6915,-0.0289;3.081,-0.5237,0.0143;3.5028,0.7926,0.2246;2.8252,1.3548,0.7286;4.409,0.8667,0.6731;1.7487,-0.9297,-0.3081;0.5314,-0.0661,-0.4125;0.2658,0.373,0.5582;-0.318,-0.6586,-0.7604;0.697,0.7589,-1.1154;1.6791,-2.2263,-0.5206;2.9692,-2.7169,-0.3544;8.1786,-2.5781,0.5971;7.9247,-1.5237,0.6497;9.5036,-2.9524,0.7865;10.2776,-2.2189,0.9878;9.8402,-4.3015,0.7102;11.1261,-4.6657,0.8932)|\",3.736123167365\r\n\"[H]OC1=N[C@]([H])(C2=NC([H])=C([H])C([H])=N2)N([H])O1 |(2.8592,3.4722,-3.1411;2.8667,3.9,-2.2677;2.8041,2.9184,-1.3703;2.8573,1.6635,-1.5879;2.6009,1.0676,-0.2836;3.3077,0.2686,-0.0416;1.1966,0.4819,-0.1928;1.1009,-0.7619,0.291;-0.1373,-1.2639,0.3874;-0.2188,-2.2777,0.7764;-1.2656,-0.5397,0.0152;-2.2651,-0.9534,0.0983;-1.0405,0.7482,-0.4715;-1.8671,1.3857,-0.7819;0.1847,1.2654,-0.5852;2.7595,2.1754,0.7225;3.7465,2.1928,0.9964;2.6685,3.3786,-0.1034)|\",5.439555871495\r\n\"[H]OC(=O)C([H])([H])[C@@]1([H])C(C([H])([H])[H])N=C(N2C([H])([H])C([H])([H])C([H])([H])C2([H])[H])NC1=O |(5.3725,2.3804,-1.0174;4.7796,3.1623,-1.2186;3.5384,2.75,-1.4822;2.6547,3.5136,-1.8048;3.2754,1.2404,-1.3603;2.1974,1.1236,-1.4707;3.751,0.7289,-2.2062;3.7727,0.592,-0.0273;3.7519,1.375,0.7487;2.8524,-0.4996,0.4787;1.3836,-0.2074,0.6202;1.2136,0.811,0.987;0.9273,-0.9311,1.2982;0.8787,-0.2848,-0.3508;3.2697,-1.6631,0.8379;4.6494,-1.9197,0.7169;5.0015,-3.1459,1.1215;6.4014,-3.6049,1.1034;6.7242,-3.7536,0.0655;7.0501,-2.841,1.5375;6.3532,-4.9194,1.8942;7.1409,-5.6142,1.5904;6.48,-4.7176,2.9645;4.9358,-5.4499,1.6204;4.5943,-6.1784,2.3611;4.8962,-5.9296,0.6351;4.0726,-4.1796,1.6187;3.7293,-3.9197,2.6278;3.1938,-4.2429,0.9756;5.5961,-1.1127,0.2631;5.2349,0.1314,-0.1412;6.0642,0.9202,-0.6153)|\",4.438176901655\r\n\"[H]OC(=O)C([H])([H])C([H])([H])C1=C(C([H])([H])[H])C2=C(N=NC2=O)N=C1O[H] 
|(2.2015,-3.0012,0.2231;1.3591,-3.4934,0.2701;0.3502,-2.6643,0.6394;-0.7705,-3.0812,0.7908;0.7593,-1.2062,0.835;-0.1156,-0.6628,1.1957;1.5165,-1.1727,1.6281;1.3233,-0.5557,-0.4697;0.5437,0.0529,-0.9322;1.5627,-1.3414,-1.1952;2.5656,0.2729,-0.2418;2.5896,1.6818,-0.2766;1.3632,2.5132,-0.5508;0.9775,2.3213,-1.5594;1.5934,3.5769,-0.471;0.5587,2.2759,0.1544;3.8354,2.2784,-0.0454;4.9465,1.4905,0.2002;6.1369,2.3188,0.415;5.8354,3.5277,0.3216;4.3406,3.6601,0.0186;3.8126,4.7314,-0.116;4.9724,0.1731,0.2477;3.7814,-0.394,0.028;3.7595,-1.754,0.0755;4.6738,-2.0423,0.2579)|\",3.159241804305\r\n\"[H]OC(=O)C([H])([H])C1=C(C([H])([H])[H])C2=C(N=NC2=O)N=C1O[H] |(4.1407,-3.0884,1.4104;3.3347,-3.273,1.9309;2.2352,-2.8947,1.2445;1.132,-2.9963,1.7194;2.497,-2.312,-0.1559;3.1493,-2.9924,-0.7151;1.5345,-2.2709,-0.6653;3.1303,-0.9384,-0.0895;2.3699,0.2484,-0.1468;0.8694,0.2397,-0.2696;0.5625,-0.1781,-1.2372;0.4684,1.2517,-0.1948;0.4201,-0.3873,0.5081;3.1008,1.4394,-0.0791;4.4791,1.4022,0.0495;5.0353,2.7572,0.0963;4.1144,3.5977,0.0112;2.7709,2.8746,-0.1075;1.7333,3.4731,-0.1989;5.2257,0.3187,0.1341;4.5258,-0.8189,0.0726;5.2445,-1.9696,0.1926;6.1784,-1.7084,0.3011)|\",3.1565206658\r\n\"[H]OC(=O)/C1=C(\\[H])C(=C([H])[H])N([H])C2=C1C(=O)N=N2 |(6.4686,2.2885,0.6112;5.9269,3.0978,0.4651;4.6319,2.8776,0.1984;3.8715,3.8005,0.0046;4.1468,1.4459,0.1456;2.8303,1.2323,-0.1202;2.191,2.0931,-0.2813;2.2126,-0.0885,-0.2059;0.9098,-0.319,-0.468;0.2333,0.5096,-0.6341;0.499,-1.3221,-0.5213;3.1079,-1.1766,0.0138;2.7645,-2.1307,-0.0284;4.3913,-0.9565,0.2717;4.9776,0.2725,0.3537;6.3523,-0.016,0.649;7.3278,0.6874,0.8288;6.4611,-1.5558,0.7203;5.3391,-2.0506,0.5028)|\",2.0082002166900006\r\n\"[H]OC1=NC(=S)N(C2([H])C([H])([H])C2([H])[H])C([H])([H])C1([H])[H] 
|(-3.8123,-1.3742,-5.0998;-3.425,-2.1416,-4.6501;-2.8003,-1.7402,-3.5226;-2.2056,-2.6398,-2.8368;-1.5991,-2.3243,-1.6135;-1.0343,-3.5679,-0.6594;-1.5057,-0.9927,-1.2539;-0.7803,-0.6157,-0.0635;0.2948,-0.7855,-0.123;-1.4132,-0.811,1.2829;-2.4121,-1.234,1.2989;-0.7705,-1.1426,2.0922;-1.2544,0.5819,0.7273;-0.499,1.2408,1.1464;-2.157,1.0911,0.3989;-1.6313,0.0219,-2.304;-1.7112,1.0018,-1.8317;-0.7329,0.0306,-2.9418;-2.8626,-0.2789,-3.1525;-2.8809,0.3618,-4.0431;-3.7834,-0.0876,-2.5838)|\",3.708911782315001\r\n\"[H]C([H])=C([H])/C1=N/C(=O)[C@@]2([H])C(N1)SC1=C2C([H])([H])C([H])([H])C([H])([H])C1([H])[H] |(3.1214,2.5451,0.965;3.7001,1.7263,0.5479;4.7347,1.9203,0.2847;3.1577,0.5147,0.3693;2.1281,0.3019,0.6415;3.8838,-0.6231,-0.2016;3.2596,-1.7714,-0.2571;3.9078,-2.8625,-0.8418;3.5814,-4.0135,-0.6353;5.0029,-2.4753,-1.854;4.4226,-2.091,-2.7161;5.7553,-1.2932,-1.2905;5.2212,-0.3779,-0.5635;7.4712,-1.3889,-1.6521;7.2696,-3.043,-2.3301;6.0003,-3.483,-2.3475;5.6101,-4.829,-2.9016;5.3025,-5.4769,-2.0704;4.7153,-4.7268,-3.5305;6.7583,-5.4616,-3.7066;6.5462,-6.5222,-3.8838;6.8186,-4.9822,-4.6937;8.106,-5.2991,-2.9891;8.9036,-5.8042,-3.5458;8.052,-5.7781,-2.0026;8.4691,-3.8128,-2.8071;8.828,-3.387,-3.7563;9.298,-3.709,-2.0948)|\",3.57013371856\r\n\"[H]OC1=NC(=O)N(C([H])([H])[H])C([H])([H])[C@@]1([H])C1=NC(C([H])([H])[H])=C([H])S1 
|(3.4745,-2.7888,0.3016;4.02,-3.5199,-0.0918;5.1365,-3.0128,-0.6048;5.9291,-3.7942,-1.237;7.1507,-3.336,-1.7671;8.0071,-4.1236,-2.1348;7.3465,-1.9561,-1.851;8.4713,-1.4553,-2.6258;8.901,-0.5729,-2.1368;8.1738,-1.1785,-3.6487;9.2179,-2.2463,-2.6796;6.246,-1.0499,-1.6024;6.6563,-0.0489,-1.4242;5.5633,-0.9728,-2.4668;5.4645,-1.5266,-0.3691;6.1571,-1.505,0.4877;4.2903,-0.6387,-0.0369;3.1858,-1.0689,0.5118;2.2749,-0.0587,0.7629;0.9531,-0.3982,1.384;0.3427,0.4976,1.5263;1.0955,-0.8787,2.3586;0.3961,-1.0997,0.7522;2.7143,1.1812,0.3967;2.1963,2.1262,0.4819;4.3059,1.096,-0.2878)|\",4.868116785445\r\n\"[H]O/C(=N/N([H])[H])C([H])([H])N1C(=O)N=C2C([H])=C([H])C([H])=C([H])[C@]2([H])C1=O |(2.6429,-4.3445,-1.313;2.3724,-4.8839,-2.0802;1.5406,-4.0888,-2.824;0.4672,-4.6062,-3.2782;-0.4193,-3.8259,-3.9975;-0.0325,-3.0113,-4.4793;-0.9241,-4.4224,-4.643;2.0267,-2.6662,-3.0569;1.919,-2.372,-4.102;3.0829,-2.5986,-2.7865;1.2753,-1.6639,-2.2718;0.5031,-0.6754,-2.9607;0.3427,-0.7719,-4.1631;0.0271,0.4177,-2.2389;0.0588,0.4024,-0.9424;-0.3046,1.591,-0.2032;-0.7008,2.425,-0.7729;-0.0627,1.6737,1.1306;-0.2978,2.5939,1.6593;0.5555,0.5905,1.8783;0.8034,0.7537,2.9228;0.8295,-0.5848,1.2848;1.3006,-1.4109,1.8063;0.4241,-0.8285,-0.1393;-0.5199,-1.4105,-0.0835;1.356,-1.7602,-0.9037;2.0641,-2.5843,-0.3384)|\",2.555149056195\r\n\"[H]OC1=NN([H])[C@]([H])(C2=C([H])C([H])=C([H])C([H])=C2[H])C([H])([H])[C@@]1([H])/N=C(/O[H])C([H])([H])[H] 
|(2.9985,2.2066,0.3691;2.7205,3.0925,0.0576;2.9059,3.0578,-1.2889;3.068,4.1606,-1.9104;3.3596,4.1302,-3.2801;2.9901,4.9797,-3.6928;3.0572,2.9364,-4.071;3.6765,3.0093,-4.9754;1.6024,2.7851,-4.5259;1.3097,1.9512,-5.6155;2.1206,1.4361,-6.1282;-0.0009,1.7864,-6.0626;-0.2049,1.1431,-6.9149;-1.0465,2.4578,-5.4243;-2.0686,2.3351,-5.7726;-0.7672,3.2937,-4.343;-1.5729,3.8237,-3.8416;0.547,3.4589,-3.8979;0.7533,4.1123,-3.055;3.5571,1.7166,-3.2688;4.6341,1.8186,-3.097;3.3884,0.7952,-3.836;2.8345,1.6711,-1.9163;1.7666,1.4604,-2.0923;3.397,0.7188,-0.9656;3.1324,-0.5204,-1.0283;3.7103,-1.3055,-0.0743;3.4425,-2.2278,-0.2081;2.2689,-1.2671,-2.0222;1.7091,-0.5927,-2.6715;1.5511,-1.9082,-1.4953;2.8913,-1.9118,-2.6552)|\",4.821857430860001\r\n\"[H]C([H])=C1N(C([H])([H])C([H])([H])[H])C(=O)/N=C2/C([H])=C(C([H])([H])[H])C(C([H])([H])[H])=C([H])[C@]12[H] |(6.9153,3.2395,0.2376;7.6088,2.4628,-0.0541;8.6385,2.5908,0.2544;7.1879,1.402,-0.7611;8.0118,0.3434,-1.1786;9.3513,0.1587,-0.6024;9.8884,-0.4994,-1.2849;9.8634,1.1268,-0.5972;9.3149,-0.4547,0.7982;10.3323,-0.5503,1.1943;8.8671,-1.4526,0.7598;8.7357,0.1633,1.4922;7.6101,-0.5689,-2.1754;8.2284,-1.604,-2.3653;6.5428,-0.1894,-3.0069;5.6762,0.6594,-2.5588;4.6274,1.1441,-3.4257;4.5966,0.7192,-4.4247;3.7524,2.1102,-3.0429;2.6972,2.6015,-3.9988;2.7976,3.6777,-4.1886;1.6882,2.4465,-3.5955;2.7626,2.0806,-4.9574;3.8311,2.7021,-1.6899;2.8512,3.782,-1.2973;3.0345,4.1177,-0.2726;1.8134,3.4304,-1.3539;2.9263,4.6573,-1.9544;4.7699,2.2729,-0.8247;4.8045,2.6883,0.1791;5.7279,1.1597,-1.1262;5.412,0.2971,-0.5065)|\",3.703469505305\r\n\"[H]OC(=O)C1=C([H])C(=O)N=C2/C([H])=C(F)\\C([H])=C(\\[H])[C@@]21[H] 
|(2.5043,-1.6085,5.3751;3.2766,-1.2,4.9489;2.8826,-0.4233,3.9139;3.6983,0.0746,3.1716;1.4082,-0.1942,3.7598;0.5912,-0.112,4.8228;0.9339,-0.2661,5.8431;-0.8302,0.3198,4.697;-1.4634,0.6242,5.6948;-1.411,0.3311,3.4236;-0.6578,0.1854,2.3747;-1.303,0.0533,1.0888;-2.3756,0.1965,1.0331;-0.5899,-0.3784,0.0233;-1.194,-0.5733,-1.1557;0.8206,-0.6965,0.0655;1.2796,-1.1058,-0.8285;1.5138,-0.481,1.1953;2.5782,-0.6756,1.2479;0.8733,0.1836,2.3917;1.1522,1.2509,2.2638)|\",3.95653538627\r\n\"[H]C1C([H])=C([H])[C@]2([H])C(=O)NC(C([H])([H])N(C([H])([H])[H])C([H])([H])C([H])([H])C([H])([H])[H])=NC2=C1[H] |(4.7813,7.3885,2.3321;5.4896,6.8525,1.7051;6.7782,7.4657,1.4466;6.9745,8.4545,1.8506;7.7036,6.8299,0.7004;8.6674,7.2718,0.469;7.4495,5.4543,0.1809;8.0299,4.7634,0.831;8.0504,5.1356,-1.2053;8.9259,5.8254,-1.6862;7.5955,3.9576,-1.8;6.503,3.4062,-1.3408;6.0725,2.0787,-1.9288;6.6182,1.9016,-2.8693;5.0047,2.148,-2.1514;6.2847,1.0204,-0.9388;7.6784,0.5893,-0.9151;7.8434,-0.0797,-0.0638;8.3354,1.457,-0.8034;7.982,0.0591,-1.8383;5.3623,-0.1109,-1.0799;5.3802,-0.5207,-2.11;5.7397,-0.9061,-0.4245;3.916,0.1996,-0.6718;3.475,0.9353,-1.3576;3.3344,-0.7217,-0.8143;3.7777,0.698,0.7691;2.7249,0.8588,1.0289;4.3157,1.6409,0.9004;4.1898,-0.03,1.4795;5.6261,3.912,-0.3878;6.028,4.9588,0.283;5.124,5.663,1.1528;4.1357,5.2386,1.2977)|\",2.98781007849\r\n\"[H]O/C(=N/[C@@]1([H])N([H])N([H])[C@]([H])(C([H])([H])[H])C1([H])[H])C([H])([H])C1=C([H])C([H])=C(C([H])([H])[H])C([H])=C1[H] 
|(8.9756,-1.9905,0.0841;8.2111,-2.4358,0.4883;7.085,-1.7383,0.1753;5.9766,-2.3174,0.4038;4.7297,-1.6615,0.0764;4.7664,-0.576,0.2455;4.4299,-1.8945,-1.38;4.9928,-2.7028,-1.6492;3.0631,-2.3219,-1.5021;2.5056,-1.4672,-1.559;2.7324,-3.0185,-0.2378;3.1395,-4.0347,-0.3313;1.2274,-3.1008,-0.0134;0.9997,-3.6576,0.9017;0.7346,-3.6042,-0.852;0.7907,-2.0984,0.0902;3.5273,-2.2616,0.8477;3.8718,-2.9063,1.6606;2.9106,-1.4629,1.2792;7.351,-0.3392,-0.3827;6.4054,0.133,-0.6535;7.8042,0.2604,0.4184;8.2766,-0.3705,-1.5854;9.6361,-0.0582,-1.4678;10.0338,0.2737,-0.5098;10.4854,-0.1294,-2.5754;11.5361,0.1273,-2.4606;10.0025,-0.5121,-3.8306;10.9225,-0.616,-5.0245;11.8092,0.0148,-4.9038;10.4162,-0.3153,-5.9483;11.2719,-1.6472,-5.1683;8.6361,-0.8133,-3.9444;8.2329,-1.1015,-4.9128;7.7861,-0.749,-2.8441;6.7296,-0.9853,-2.9527)|\",5.711669721995\r\n\"[H]O/C(=N/[C@@]1([H])N([H])N([H])[C@]([H])(C([H])([H])[H])C1([H])[H])C1=C([H])C([H])=C(C([H])([H])[H])C(F)=C1[H] |(6.2324,-1.7251,0.5555;6.7991,-0.945,0.6665;6.2113,0.1068,0.011;6.9095,1.158,-0.0929;6.3941,2.3922,-0.6582;6.3763,2.3186,-1.7529;5.0285,2.7437,-0.2107;4.6624,3.3944,-0.9213;5.294,3.4682,1.0133;4.4275,3.9131,1.311;6.3779,4.4348,0.72;6.881,4.67,1.665;5.9045,5.7445,0.0697;6.7527,6.4011,-0.1577;5.2281,6.2945,0.7361;5.3743,5.5595,-0.8742;7.2918,3.5865,-0.1991;8.1586,3.1942,0.3355;7.65,4.1718,-1.0528;4.8271,-0.2038,-0.477;3.8654,-0.7113,0.4065;4.1035,-0.8083,1.4619;2.5926,-1.0511,-0.054;1.851,-1.4262,0.647;2.2421,-0.9227,-1.4018;0.8816,-1.2855,-1.9381;0.2353,-1.6606,-1.1395;0.9512,-2.0572,-2.7137;0.3904,-0.4195,-2.3972;3.2327,-0.4319,-2.257;2.9322,-0.308,-3.5718;4.5023,-0.0753,-1.8331;5.2212,0.2904,-2.5584)|\",4.78376149179\r\n\"[H]C1=NC([H])=C([H])C(=O)/C1=N/C(=O)C1=NC([H])=C([H])C([H])=C1[H] 
|(1.7707,2.9684,2.0184;2.8261,2.6934,2.0435;3.4652,2.8463,3.148;4.8287,2.4987,3.1842;5.2934,2.663,4.1527;5.5401,2.0057,2.1473;6.5924,1.757,2.2413;4.9067,1.7933,0.8407;5.4946,1.3607,-0.1366;3.4261,2.1625,0.796;2.7771,1.9861,-0.2876;1.4409,2.3212,-0.5189;1.1254,3.4187,-0.9469;0.4544,1.1997,-0.3481;0.8441,0.1377,0.3726;-0.0466,-0.8417,0.5501;0.2879,-1.6918,1.1416;-1.3405,-0.8105,0.0216;-2.0193,-1.6406,0.1921;-1.7325,0.3024,-0.722;-2.7303,0.3631,-1.1472;-0.8187,1.3366,-0.9096;-1.0605,2.2335,-1.4687)|\",3.170126358325\r\n\"[H]C1=NC([H])=C([H])C(C(=O)/N=C2/C(=O)C([H])=C([H])N=C2[H])=C1[H] |(-3.1108,0.5581,-0.3353;-2.0739,0.3352,-0.5792;-1.8702,-0.74,-1.3537;-0.6041,-1.0408,-1.6663;-0.4614,-1.9221,-2.2888;0.4986,-0.2998,-1.2378;1.5066,-0.5979,-1.5056;0.2694,0.8224,-0.4344;1.3722,1.6983,0.0663;1.2098,2.5674,0.9073;2.6476,1.3708,-0.4289;3.4093,2.1843,-1.0575;4.8177,1.7167,-1.4269;5.194,0.5789,-1.1986;5.6458,2.7385,-2.0747;6.6639,2.469,-2.3367;5.1338,3.9571,-2.352;5.7342,4.7173,-2.844;3.8213,4.3761,-2.0645;3.0278,3.5587,-1.4674;2.018,3.8989,-1.2425;-1.0488,1.1439,-0.094;-1.2522,2.0051,0.5334)|\",3.03134829457\r\n\"[H]C1=NC([H])=C([H])C(=O)/C1=N/C(=O)C([H])([H])[C@@]1([H])C([H])=C([H])C([H])([H])C1([H])[H] |(-2.7281,-3.9376,1.7006;-3.0379,-4.6605,0.9455;-3.3969,-5.83,1.3424;-3.8184,-6.7651,0.3789;-4.1042,-7.7248,0.801;-3.8822,-6.5445,-0.952;-4.2149,-7.3109,-1.6448;-3.4874,-5.2473,-1.5124;-3.5094,-4.988,-2.7043;-3.031,-4.2253,-0.4726;-2.6623,-3.0671,-0.8711;-2.2807,-2.0113,-0.0211;-3.1026,-1.3999,0.6357;-0.8049,-1.6608,-0.1189;-0.2388,-2.51,0.2951;-0.5286,-1.6136,-1.1801;-0.4408,-0.3573,0.6064;-0.8249,-0.4374,1.6345;1.0504,-0.0859,0.6403;1.7763,-0.8489,0.9122;1.3402,1.1856,0.3552;2.3393,1.6137,0.3618;0.0993,2.0031,0.0793;0.1917,2.6357,-0.8125;-0.1058,2.6841,0.9192;-1.0058,0.9248,-0.0697;-1.1735,0.7233,-1.1353;-1.9654,1.2256,0.3564)|\",2.67215801191\r\n\"[H]OC(=N/C1=C([H])C([H])=C([H])C(SC([H])([H])[H])=C1[H])/C(=C(\\[H])C([H])([H])C([H])([H])[H])C([H])([H])[H] 
|(5.1155,-1.1356,2.9226;4.3569,-0.8729,2.3779;4.4596,-1.4861,1.1595;3.4106,-1.468,0.442;3.3757,-1.9034,-0.8907;4.1348,-1.2844,-1.9005;4.8043,-0.4693,-1.6425;3.9972,-1.7005,-3.2207;4.5837,-1.2169,-3.9984;3.1209,-2.731,-3.5695;3.0415,-3.0365,-4.6065;2.3512,-3.3417,-2.571;1.1926,-4.67,-2.8667;1.169,-4.8202,-4.6828;0.4249,-5.589,-4.9063;2.1368,-5.1418,-5.0774;0.8634,-3.8838,-5.1574;2.4719,-2.9143,-1.2421;1.8603,-3.3583,-0.4617;5.803,-2.089,0.866;5.8554,-3.3884,0.5274;4.9085,-3.925,0.4716;7.0612,-4.2227,0.1992;6.9976,-4.5083,-0.8616;7.9886,-3.6506,0.3063;7.1362,-5.5027,1.05;7.9915,-6.1189,0.7522;6.2296,-6.1077,0.9333;7.2434,-5.2641,2.1141;6.9919,-1.1565,0.9709;7.9418,-1.6952,0.9804;6.9497,-0.534,1.8727;7.0094,-0.4615,0.1208)|\",4.840905400395\r\n\"[H]C1=C([H])C(N([H])C(=O)/C(=C(\\[H])C([H])([H])C([H])([H])[H])C([H])([H])[H])=C([H])C([H])=C1SC([H])([H])[H] |(-0.0562,4.0492,3.9794;0.5617,3.2422,3.6014;1.6607,3.5596,2.8061;1.8709,4.6053,2.5879;2.4942,2.5601,2.2875;3.5891,2.9669,1.4976;3.6652,3.9631,1.3464;4.5117,2.1921,0.8265;4.4687,0.9662,0.7842;5.5963,2.9619,0.1122;6.1182,4.0759,0.6578;5.7353,4.3963,1.6287;7.2376,4.9334,0.1335;7.5332,4.6244,-0.8737;6.8723,5.9673,0.043;8.4651,4.9247,1.0627;9.2428,5.5991,0.6874;8.8918,3.9189,1.1387;8.1983,5.2504,2.0751;6.0617,2.2967,-1.158;6.977,2.7397,-1.5552;5.2888,2.3527,-1.9359;6.2338,1.2338,-0.964;2.2044,1.2177,2.5844;2.8351,0.4363,2.1852;1.1076,0.9086,3.381;0.8996,-0.1355,3.6011;0.2695,1.9069,3.9014;-1.1064,1.3663,4.9018;-1.8978,2.9265,5.412;-2.7417,2.6372,6.0432;-2.2763,3.4881,4.5532;-1.2173,3.5504,5.9983)|\",4.468109425210001\r\n\"[H]OC(=N/C1=C([H])C(C(=O)C([H])([H])[H])=C([H])C([H])=C1[H])/C(=C(\\[H])C([H])([H])C([H])([H])[H])C([H])([H])[H] 
|(3.4677,2.8671,-2.3669;3.1968,2.5207,-1.5026;4.2956,2.0327,-0.8544;4.0538,1.3544,0.1951;5.0299,0.9149,1.0966;5.9509,1.7777,1.7055;5.9727,2.8314,1.45;6.8544,1.3095,2.6686;7.8163,2.2951,3.2604;7.8507,3.4524,2.8672;8.7566,1.8327,4.3631;8.1955,1.4559,5.2265;9.3733,2.6781,4.6729;9.4054,1.0197,4.016;6.8373,-0.0447,3.0363;7.5286,-0.426,3.7807;5.914,-0.9074,2.4453;5.89,-1.9551,2.7335;5.0106,-0.4347,1.4962;4.2747,-1.097,1.0496;5.6092,2.3828,-1.4871;6.5105,1.4066,-1.6883;6.2406,0.4049,-1.3549;7.883,1.5139,-2.2898;7.9361,0.8464,-3.1625;8.0787,2.5238,-2.6639;8.9803,1.1086,-1.2876;9.967,1.1349,-1.763;8.8145,0.0934,-0.9094;8.9939,1.7857,-0.4272;5.7923,3.8401,-1.8646;5.4565,4.4969,-1.0542;5.2206,4.122,-2.7611;6.8384,4.0742,-2.0715)|\",4.53885902634\r\n\"[H]C1=C([H])C([C@@]2([H])N([H])C([H])([H])C([H])([H])C([H])([H])[C@@]2([H])N(=O)=O)=C([H])C([H])=C1OC([H])([H])[H] |(1.8894,-2.6504,-0.6799;2.5653,-3.4412,-0.3701;3.2899,-3.3493,0.8085;3.1692,-2.4651,1.4295;4.1734,-4.3628,1.2214;4.8882,-4.1719,2.5563;4.7694,-3.1143,2.8061;4.2398,-4.9207,3.6472;4.4807,-4.4781,4.5337;4.5969,-6.3418,3.7128;4.0848,-6.7743,4.5796;4.1808,-6.8358,2.8265;6.1116,-6.5914,3.7989;6.483,-6.2138,4.7631;6.3266,-7.6672,3.7805;6.8579,-5.8928,2.6482;6.6569,-6.3937,1.699;7.9418,-5.9501,2.7985;6.4358,-4.4263,2.5715;6.8233,-3.8875,3.4423;7.0737,-3.6999,1.406;6.9975,-2.4728,1.4314;7.5961,-4.3579,0.5112;4.299,-5.4768,0.3844;4.9678,-6.2897,0.6391;3.5788,-5.5887,-0.8091;3.7165,-6.4693,-1.426;2.7047,-4.5679,-1.1929;1.9553,-4.5691,-2.3326;2.0693,-5.6786,-3.2093;1.3986,-5.47,-4.0448;1.7601,-6.6125,-2.7213;3.0942,-5.7914,-3.5866)|\",4.34837933099\r\n\"[H]C([H])([H])C1=NC2=C(C(C([H])([H])[H])=C(C([H])([H])[H])S2)[C@]2([H])N=NC(=O)N12 
|(8.1476,-3.1616,-0.4179;7.2141,-3.7259,-0.4194;7.1434,-4.3231,-1.3365;7.1978,-4.4301,0.4181;6.0741,-2.7617,-0.3273;6.2655,-1.4896,-0.2584;5.1456,-0.6905,-0.1216;3.8273,-1.062,-0.2485;2.8927,0.013,-0.091;1.4011,-0.1672,-0.2183;0.8661,0.7788,-0.1099;1.0169,-0.8642,0.5336;1.1357,-0.5836,-1.1982;3.5373,1.2022,0.1813;2.9564,2.5634,0.4342;1.8644,2.526,0.4459;3.2556,3.2824,-0.3388;3.2823,2.9675,1.4002;5.2801,1.0124,0.2189;3.5934,-2.4899,-0.6357;3.329,-2.5801,-1.7048;2.4727,-3.2093,0.0442;2.8718,-4.2516,0.5883;4.3447,-4.3965,0.4375;4.9841,-5.3027,0.9046;4.7585,-3.3008,-0.3145)|\",3.2653662060000004\r\n\"[H]C1=C([H])/C(=C2\\NC3/C(C([H])([H])[H])=C([H])\\C([H])=C(\\[H])[C@@]3([H])N2[H])C(=O)C(C([H])([H])[H])=C1[H] |(6.7569,-4.1726,3.6604;6.5849,-4.1668,2.5882;5.711,-3.2774,2.0267;5.1672,-2.5578,2.6305;5.499,-3.2763,0.6152;4.5957,-2.3802,0.0244;3.8071,-1.4603,0.7303;3.0551,-0.8769,-0.1586;2.2102,0.2823,0.0416;1.797,0.6777,1.4308;1.231,1.6136,1.4242;2.6775,0.8022,2.0715;1.18,-0.1005,1.8959;1.967,1.0272,-1.0746;1.3991,1.9496,-0.9758;2.5103,0.7095,-2.3929;2.365,1.4382,-3.1858;3.1818,-0.4326,-2.637;3.5749,-0.6804,-3.6185;3.2191,-1.4637,-1.5439;2.3328,-2.1114,-1.7269;4.3685,-2.3071,-1.3078;4.8116,-3.0458,-1.8778;6.2015,-4.2282,-0.2529;6.0272,-4.2652,-1.5059;7.119,-5.1527,0.398;7.8481,-6.1324,-0.4795;8.5014,-6.7843,0.1096;8.4559,-5.6122,-1.2298;7.143,-6.7565,-1.0418;7.282,-5.0978,1.7592;7.9723,-5.7953,2.2318)|\",2.2857563442\r\n\"[H]OC([H])([H])C([H])([H])C([H])([H])N(C([H])([H])[H])C([H])([H])[C@]1([H])C([H])([H])C([H])([H])C([H])([H])N2C([H])([H])C([H])([H])C([H])([H])C([H])([H])[C@@]21[H] 
|(2.8125,3.859,4.2241;2.928,3.1494,3.5747;1.6765,2.9406,2.9214;0.9133,2.595,3.6373;1.3059,3.8783,2.4729;1.8809,1.8922,1.8344;2.2818,0.9875,2.3074;0.9005,1.6289,1.4169;2.829,2.3866,0.7207;3.7514,2.7425,1.188;2.3718,3.2602,0.234;3.1807,1.4358,-0.337;2.041,1.0358,-1.1484;1.5072,1.9284,-1.496;1.3106,0.3874,-0.6266;2.3934,0.4885,-2.0303;3.9686,0.2803,0.1023;3.4197,-0.3198,0.849;4.1044,-0.3713,-0.7672;5.3605,0.6473,0.6656;5.2221,1.132,1.6419;6.1227,1.6198,-0.254;5.4996,2.487,-0.4902;7.0103,1.9868,0.2819;6.5686,0.9168,-1.5392;7.1868,1.5873,-2.1498;5.695,0.6507,-2.1461;7.3714,-0.3397,-1.2097;8.337,-0.0373,-0.7493;7.6195,-0.8859,-2.1285;6.6469,-1.2593,-0.324;7.4808,-2.4449,-0.0954;7.7073,-2.8866,-1.074;8.4553,-2.1565,0.3549;6.813,-3.48,0.8087;5.9111,-3.865,0.3145;7.4943,-4.3283,0.9515;6.4345,-2.8405,2.1472;7.3478,-2.5421,2.6826;5.9075,-3.5557,2.7911;5.5602,-1.6118,1.8816;5.3081,-1.0977,2.8181;4.6176,-1.9514,1.4369;6.2412,-0.6043,0.9374;7.1507,-0.2337,1.4616)|\",7.21645931526\r\n\"[H]C([H])=C([H])C([H])([H])N1N=C(C([H])([H])[H])C(C([H])([H])N2C([H])([H])[C@]3([H])C([H])([H])C([H])([H])[C@@]2([H])C3([H])[H])=C1[H] 
|(-0.1624,0.0022,3.7644;0.7722,0.5459,3.6586;1.3295,0.7373,4.5736;1.2095,0.9664,2.4719;0.6376,0.7576,1.5683;2.4796,1.7491,2.2709;2.9965,1.8892,3.229;2.2708,2.7393,1.8539;3.3755,1.1188,1.3083;3.8223,1.8245,0.2412;4.6231,0.9849,-0.4206;5.3068,1.4321,-1.6782;5.079,0.768,-2.5216;6.3984,1.4487,-1.5665;4.9766,2.4416,-1.9378;4.6977,-0.2781,0.2289;5.498,-1.4882,-0.1668;6.4885,-1.4545,0.3087;5.6831,-1.456,-1.26;4.8535,-2.7295,0.2606;3.6141,-3.0569,-0.5036;2.7163,-2.9084,0.1083;3.519,-2.4149,-1.3923;3.8146,-4.5322,-0.9132;3.1359,-4.8593,-1.706;3.7788,-5.404,0.3658;2.8887,-5.204,0.9722;3.7685,-6.4696,0.1101;5.1052,-5.0028,1.092;5.7804,-5.86,1.1978;4.9362,-4.5747,2.0836;5.6976,-3.9468,0.1342;6.7494,-3.712,0.3142;5.3217,-4.5226,-1.2478;5.5809,-3.8601,-2.0813;5.734,-5.5195,-1.4408;3.8799,-0.1431,1.3357;3.619,-0.8494,2.11)|\",5.657246951895\r\n\"[H]O[C@]1(C([H])([H])N2NC([H])=NC2[H])C([H])([H])C([H])([H])C([H])([H])N2C([H])([H])C([H])([H])C([H])([H])C([H])([H])[C@@]21[H] |(0.6202,0.6993,-1.814;1.0208,1.5204,-1.4633;1.1392,1.2782,-0.0589;1.139,2.6996,0.5439;0.1972,3.2053,0.3274;1.9375,3.2626,0.052;1.3834,2.7663,1.9772;0.4234,2.4225,2.8785;1.0427,2.6103,4.0363;0.5452,2.4269,4.9792;2.3275,3.0515,3.9469;2.5034,3.1367,2.6385;3.403,3.4598,2.1321;2.4692,0.5589,0.2378;3.3037,1.1921,-0.0887;2.5715,0.4088,1.3206;2.5119,-0.7931,-0.4792;3.4001,-1.3611,-0.1784;2.5836,-0.6306,-1.561;1.2556,-1.6051,-0.1672;1.2748,-1.926,0.8938;1.2371,-2.517,-0.7752;0.0253,-0.8481,-0.4507;-1.1352,-1.7414,-0.3139;-0.9914,-2.5731,-1.0142;-1.1692,-2.1826,0.7027;-2.4581,-1.0341,-0.5909;-2.4753,-0.689,-1.6335;-3.2816,-1.7485,-0.4696;-2.6109,0.157,0.3562;-2.6775,-0.2066,1.3918;-3.5392,0.7052,0.1543;-1.4086,1.0938,0.2121;-1.4698,1.889,0.9611;-1.4308,1.5639,-0.7792;-0.0575,0.3777,0.3877;0.0685,0.1214,1.4548)|\",6.402838902265\r\n\"[H]OC([H])([H])[C@@]([H])(N([H])C1=C2\\N=C(C([H])([H])[H])O\\C2=N/C([H])=N\\1)C([H])(C([H])([H])[H])C([H])([H])[H] 
|(3.3119,2.4774,-2.2452;2.5298,2.1538,-2.7232;1.7316,1.4869,-1.7663;1.1978,2.2038,-1.1137;0.9774,0.9343,-2.3305;2.5538,0.5226,-0.8921;3.0474,-0.2003,-1.5476;3.636,1.2999,-0.2709;3.3706,1.9688,0.4438;4.9537,0.9479,-0.2595;5.8923,1.6378,0.5352;5.7991,2.6926,1.4397;7.022,2.8603,1.8404;7.545,3.8499,2.8171;6.7195,4.4625,3.1831;8.2968,4.4978,2.3521;8.024,3.345,3.6637;7.947,1.9997,1.2795;7.2078,1.2119,0.4377;7.6917,0.2335,-0.3124;6.7098,-0.3491,-1.0174;7.0134,-1.1699,-1.6631;5.4014,-0.0587,-1.0337;1.7088,-0.2272,0.1686;1.2662,0.5302,0.8365;0.5555,-1.0151,-0.478;0.0175,-1.5918,0.2827;0.9356,-1.7242,-1.2237;-0.1734,-0.3633,-0.9698;2.577,-1.1636,1.0233;1.9635,-1.6735,1.775;3.3683,-0.6224,1.5508;3.0551,-1.9302,0.4018)|\",5.25723959166\r\n\"[H]C1=C(C#N)C([H])=C2C(=C1[H])/N=C(/[C@]1([H])C([H])([H])[C@@]3([H])C([H])=C([H])[C@]1([H])C3([H])[H])N2[H] |(6.6201,1.4646,4.7579;5.9941,1.5657,3.8777;6.2936,2.5915,2.9456;7.4051,3.4617,3.1909;8.3074,4.1698,3.3898;5.523,2.7745,1.7866;5.7639,3.5632,1.0813;4.4562,1.9032,1.6058;4.141,0.8727,2.5287;4.9243,0.705,3.6777;4.6916,-0.081,4.3887;3.0347,0.1617,2.0984;2.6765,0.721,0.963;1.511,0.3345,0.1042;1.8038,0.4784,-0.9424;0.2053,1.1607,0.4204;0.3458,1.7893,1.3062;-0.0841,1.8094,-0.4117;-0.8506,0.0424,0.7096;-1.7552,0.4097,1.2003;-1.0507,-0.7204,-0.5937;-1.9074,-0.6105,-1.2515;0.0637,-1.4301,-0.8204;0.3063,-2.0145,-1.7026;1.0258,-1.1494,0.3259;1.8419,-1.8572,0.475;0.0241,-0.9652,1.4907;-0.5082,-1.8908,1.7295;0.4865,-0.5546,2.3928;3.4993,1.7765,0.6152;3.415,2.342,-0.217)|\",5.08308672734\r\n\"[H]C1=C([H])C(C([H])([H])[H])=N/C(=C2\\NC3/C(C([H])([H])[H])=C([H])\\C([H])=C(\\[H])[C@@]3([H])N2[H])C1=O 
|(7.9377,-5.4361,-0.9624;7.5308,-4.635,-0.3518;8.0752,-4.2811,0.8469;8.9505,-4.8029,1.2302;7.5027,-3.2177,1.6287;8.1238,-2.8344,2.9496;8.2246,-3.7025,3.6141;7.503,-2.0829,3.4421;9.1309,-2.4176,2.8126;6.4464,-2.5534,1.2253;5.8788,-2.8764,0.0237;4.7351,-2.1513,-0.3788;4.0975,-1.1546,0.3668;3.0536,-0.8058,-0.3309;2.1888,0.3303,-0.0854;2.1736,0.9796,1.2692;1.5971,1.909,1.2558;3.1952,1.1975,1.5989;1.7386,0.3103,2.0216;1.5375,0.816,-1.1806;0.931,1.7128,-1.0762;1.681,0.2546,-2.5215;1.2172,0.7922,-3.3441;2.3678,-0.88,-2.7579;2.4632,-1.3078,-3.7514;2.858,-1.6548,-1.5678;2.0444,-2.3774,-1.3355;4.1154,-2.3681,-1.5611;4.4507,-3.1777,-2.1097;6.3664,-3.9378,-0.8572;5.8321,-4.2413,-1.9609)|\",2.549706779184999\r\n\"[H]C1=C(C([H])([H])C([H])([H])C([H])([H])[H])N([H])N=C1C(=O)N([H])C([H])([H])[C@@]1([H])OC([H])([H])C([H])([H])C1([H])[H] |(4.6459,3.1656,-1.09;4.9897,2.1753,-0.8335;4.276,0.9914,-0.8696;2.8522,0.6766,-1.2212;2.4556,1.5073,-1.8164;2.8225,-0.2084,-1.8723;1.9315,0.4375,-0.0019;0.9584,0.0917,-0.3742;2.3387,-0.3835,0.6041;1.7379,1.6775,0.8746;1.0868,1.4573,1.7277;1.2769,2.4937,0.3046;2.6936,2.0435,1.2645;5.1579,0.0299,-0.4557;4.9884,-0.9622,-0.3724;6.3824,0.504,-0.1626;6.2834,1.8179,-0.3895;7.4328,2.7442,-0.189;7.3245,3.9501,-0.4073;8.577,2.1432,0.2419;8.5601,1.1463,0.4185;9.7955,2.872,0.5224;10.577,2.6044,-0.2043;9.5696,3.9353,0.402;10.3139,2.5832,1.9302;9.5419,2.8695,2.658;10.527,1.1621,2.0702;11.8811,0.9016,2.4463;11.887,0.0375,3.119;12.4838,0.6486,1.5591;12.3812,2.1933,3.0971;12.0516,2.2423,4.1415;13.4714,2.2893,3.0778;11.6602,3.2564,2.254;12.219,3.45,1.3289;11.5325,4.2114,2.7718)|\",6.0137160960500005\r\n\"[H]C1=C([H])C(=C2NC3/C(C([H])([H])[H])=C([H])\\C([H])=C(\\[H])[C@]3([H])N2[H])C([H])=C([H])C1=O 
|(8.0802,-5.4212,0.721;7.4072,-4.8443,0.0932;6.5213,-3.9632,0.6222;6.4929,-3.8186,1.701;5.6376,-3.1889,-0.2163;4.7022,-2.3114,0.3125;3.8924,-1.4965,-0.4719;3.0992,-0.8514,0.3355;2.2347,0.2581,-0.0038;1.8674,0.511,-1.4389;1.253,1.4106,-1.5364;1.3154,-0.3372,-1.8611;2.7677,0.6338,-2.0523;1.9184,1.0929,1.0279;1.3266,1.9824,0.8241;2.4231,0.9174,2.3879;2.2312,1.7143,3.1015;3.1175,-0.1752,2.7605;3.4863,-0.3116,3.773;3.2294,-1.3045,1.7731;2.3729,-1.9824,1.9855;4.4585,-2.0819,1.6657;4.6226,-2.8326,2.3224;5.7412,-3.3655,-1.646;5.0844,-2.7727,-2.2744;6.6216,-4.2448,-2.1877;6.6979,-4.3808,-3.2626;7.5188,-5.0604,-1.3594;8.3215,-5.8702,-1.8394)|\",2.4435823774899994\r\n\"[H]/N=C(/O[H])[C@@]([H])(N([H])C(=O)C1=NN([H])C(C([H])([H])C([H])([H])C([H])([H])[H])=C1[H])C([H])([H])[H] |(11.6638,-0.5937,-3.0119;11.0977,-0.7189,-3.8507;9.9713,-0.141,-3.7257;9.0994,-0.1978,-4.7503;8.2557,0.2507,-4.5042;9.4625,0.675,-2.5162;9.1539,1.6523,-2.9114;8.2621,0.0674,-1.9273;8.3335,-0.4054,-1.0343;7.0231,0.2109,-2.4553;6.8157,0.7377,-3.5596;5.9092,-0.3034,-1.6264;6.1313,-0.8245,-0.4141;4.9092,-1.1804,0.0117;4.818,-1.6173,0.9182;3.9081,-0.9108,-0.8815;2.4666,-1.1955,-0.5851;1.9313,-1.2932,-1.5364;2.377,-2.1688,-0.0816;1.7836,-0.1107,0.2756;2.3205,-0.0112,1.2287;1.8805,0.8568,-0.2325;0.3077,-0.4192,0.541;-0.1536,0.3617,1.1549;0.1884,-1.3733,1.0692;-0.2588,-0.4852,-0.3955;4.5387,-0.3321,-1.9681;4.091,0.0197,-2.8851;10.5063,0.8833,-1.4221;11.3874,1.3917,-1.8233;10.8294,-0.0717,-0.9902;10.0934,1.501,-0.6183)|\",5.5157477496350005\r\n\"[H]OC(=O)[C@@]12C([H])([H])N([H])C([H])([H])[C@]1([H])C([H])([H])N(C([H])([H])/C(=C(\\[H])C([H])([H])[H])C([H])([H])[H])C2([H])[H] 
|(4.7504,5.6625,-5.1061;4.3047,5.1485,-5.7993;4.2489,3.829,-5.4516;3.7287,3.043,-6.2047;4.8947,3.4627,-4.1207;6.442,3.7577,-4.1224;6.6728,4.5725,-4.8226;7.0256,2.8881,-4.436;6.8117,4.1806,-2.7723;6.822,3.3445,-2.1874;5.6684,4.9901,-2.3401;5.6635,5.1012,-1.2514;5.7709,5.9986,-2.7684;4.3807,4.3075,-2.8849;3.6162,5.0449,-3.1521;3.7949,3.2407,-1.9437;2.715,3.105,-2.1534;3.8958,3.502,-0.8834;4.558,2.0304,-2.2579;4.0006,0.8183,-1.65;3.8718,1.0308,-0.5784;2.9947,0.604,-2.0592;4.8931,-0.3961,-1.8184;4.4385,-1.466,-2.4884;3.4319,-1.3995,-2.9049;5.1288,-2.7761,-2.7423;5.1865,-2.9806,-3.8199;4.5599,-3.6056,-2.3002;6.1439,-2.8164,-2.339;6.2471,-0.286,-1.1626;6.882,-1.1574,-1.3373;6.1406,-0.1615,-0.0756;6.7763,0.6033,-1.5248;4.5786,2.0044,-3.726;5.323,1.293,-4.0946;3.6018,1.7073,-4.1454)|\",6.2150803454200005\r\n\"[H]C1=C([H])C(C([H])([H])N(C(=O)[C@]([H])(N([H])[H])C([H])([H])[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])=C([H])S1 |(5.0768,5.6763,-3.9543;4.5585,4.7309,-3.8647;5.0756,3.4709,-3.9444;6.1227,3.2484,-4.1151;4.0822,2.4454,-3.7857;4.3893,0.9671,-3.8354;4.9461,0.729,-4.7441;3.4565,0.3933,-3.846;5.198,0.4919,-2.7016;6.5559,0.451,-2.856;7.109,0.836,-3.8884;7.406,-0.0897,-1.6853;6.8511,-0.881,-1.1683;7.5696,1.0134,-0.7187;8.0591,1.7815,-1.179;8.1774,0.7054,0.0403;8.7173,-0.6791,-2.2153;8.5342,-1.5332,-2.8773;9.3371,-1.0228,-1.3782;9.2693,0.0705,-2.7886;4.4975,0.2608,-1.4338;5.1858,0.468,-0.6114;3.707,1.0192,-1.3629;3.8837,-1.1414,-1.3118;3.221,-1.331,-2.168;4.6819,-1.8944,-1.3735;3.0993,-1.3267,-0.0065;3.7651,-1.1357,0.8471;2.3067,-0.5673,0.0529;2.4812,-2.7218,0.1309;1.9304,-2.8227,1.0728;3.2527,-3.5012,0.1104;1.7815,-2.9283,-0.6882;2.8313,2.9694,-3.5902;1.9009,2.4338,-3.4513;2.8406,4.7056,-3.5973)|\",6.081744558675\r\n\"[H]C1=C([H])C([C@@]2([H])N([H])N([H])C([H])([H])[C@@]2([H])C([H])([H])N(C([H])([H])[H])C([H])([H])[H])=C([H])C([H])=C1F 
|(5.0936,5.8023,2.9005;4.5345,5.5355,2.0095;4.9985,4.5608,1.124;5.942,4.0671,1.3396;4.2763,4.2165,-0.0292;4.7609,3.154,-1.0101;4.282,3.3509,-1.9755;6.238,3.1476,-1.2019;6.6466,3.9566,-0.7424;6.821,1.973,-0.566;7.0758,1.3584,-1.3393;5.7365,1.3184,0.1806;5.9209,0.2408,0.2443;5.6981,1.7121,1.2052;4.4571,1.6802,-0.5908;4.4581,1.0811,-1.5108;3.151,1.4381,0.1655;3.1422,2.068,1.0612;3.1259,0.3858,0.5169;1.9515,1.7612,-0.6147;1.686,0.7891,-1.6684;0.8068,1.103,-2.2412;1.494,-0.2306,-1.2782;2.5286,0.7318,-2.3636;0.7863,1.9108,0.249;-0.0846,2.1872,-0.3556;0.9637,2.7108,0.9753;0.5317,0.9868,0.8053;3.0697,4.889,-0.275;2.4942,4.6287,-1.1577;2.5894,5.8651,0.5966;1.6587,6.3896,0.4048;3.3334,6.174,1.7301;2.8779,7.1218,2.5774)|\",5.292614392225\r\n\"[H]OC(=O)[C@@]12C([H])([H])N([H])C([H])([H])[C@]1([H])C([H])([H])N(/C(=N\\C([H])([H])C([H])=C([H])[H])O[H])C2([H])[H] |(4.3787,-6.8233,4.7296;4.3604,-6.3842,5.5957;4.5784,-5.0442,5.4593;4.5907,-4.3442,6.4412;4.8194,-4.5499,4.0357;6.1099,-5.2008,3.4053;6.3131,-6.1712,3.8785;6.9967,-4.579,3.5519;5.848,-5.4213,1.9844;5.9442,-4.5289,1.5009;4.4367,-5.8105,1.9468;4.0269,-5.6994,0.9383;4.3629,-6.8774,2.2072;3.687,-4.9555,3.0092;2.8685,-5.5157,3.4729;3.1619,-3.6114,2.4721;2.2905,-3.2765,3.0577;2.8584,-3.6721,1.4258;4.296,-2.6925,2.6598;4.192,-1.3629,2.2564;3.8448,-0.8749,1.1308;3.6329,-1.7126,-0.0371;4.2464,-2.6286,-0.0189;2.5781,-2.0372,-0.0815;3.9416,-0.9313,-1.2898;3.4693,0.0488,-1.3482;4.7182,-1.364,-2.2817;4.8922,-0.7701,-3.1749;5.213,-2.3329,-2.2408;4.5719,-0.5073,3.2476;4.576,0.3705,2.8247;4.8532,-3.0083,3.9866;5.8606,-2.5997,4.0861;4.2443,-2.5909,4.7973)|\",6.185147821865\r\n\"[H]SC1NC([H])([H])C([H])([H])[C@@]([H])(N2C([H])([H])C([H])([H])N(C([H])([H])[H])C([H])([H])C2([H])[H])N1[H] 
|(9.5475,-1.9487,2.1822;8.8907,-0.775,2.122;7.9185,-1.2563,0.6781;8.1576,-2.3609,0.0875;7.2655,-2.6604,-1.029;7.7603,-3.3902,-1.6796;6.3634,-3.1541,-0.6394;6.8914,-1.4142,-1.8498;6.1594,-1.6499,-2.6286;7.7992,-1.0646,-2.3547;6.3479,-0.25,-0.9985;6.6963,0.6893,-1.4478;4.8961,-0.1393,-0.9998;4.4177,1.1169,-0.4187;4.5856,1.1627,0.6737;4.9644,1.9477,-0.8827;2.9224,1.2878,-0.6799;2.7556,1.3854,-1.7717;2.5735,2.2131,-0.2055;2.1824,0.1638,-0.1154;0.7449,0.3248,-0.2637;0.4258,1.2539,0.2217;0.2284,-0.5091,0.2245;0.415,0.3597,-1.321;2.6558,-1.0914,-0.6913;2.1114,-1.9194,-0.2213;2.4615,-1.1387,-1.782;4.1539,-1.2744,-0.4452;4.4767,-2.192,-0.945;4.3367,-1.3946,0.6362;6.9439,-0.3321,0.3588;6.9606,0.5363,0.877)|\",6.17698440635\r\n\"[H]OC(=O)[C@@]([H])(O[C@]1([H])C([H])([H])C([H])([H])C([H])([H])O[C@]1([H])C([H])=C([H])[H])C([H])([H])[H] |(0.1138,-0.841,-0.3231;-0.6522,-0.6455,-0.9049;-0.3249,-1.0598,-2.1387;-1.0723,-0.9635,-3.0827;1.0976,-1.6363,-2.2749;1.6946,-0.8588,-2.7746;1.6122,-1.8711,-0.9583;3.0093,-1.6283,-0.7628;3.5377,-1.748,-1.719;3.5559,-2.655,0.2277;2.9744,-2.6084,1.1578;3.4148,-3.662,-0.1806;5.0393,-2.3655,0.5054;5.6279,-2.5584,-0.401;5.4245,-3.0287,1.2899;5.2294,-0.9053,0.9181;6.2899,-0.6418,0.9674;4.7998,-0.7314,1.9171;4.6672,-0.0045,-0.0364;3.2633,-0.176,-0.2781;3.0482,0.5113,-1.1044;2.407,0.2287,0.8986;2.5329,-0.3254,1.827;1.5155,1.2205,0.8532;0.9121,1.49,1.716;1.3529,1.8054,-0.0498;1.1009,-2.9104,-3.1154;0.6224,-2.719,-4.0788;0.5482,-3.7017,-2.5995;2.1257,-3.2554,-3.2881)|\",6.718490968845001\r\n\"[H]C1=C([H])[C@]([H])([C@]([H])(C([H])([H])C(=O)C([H])([H])[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])OC1=O 
|(10.2479,5.6761,5.4608;9.6077,4.8315,5.2431;9.7912,3.842,4.3659;10.6371,3.6784,3.7091;8.6335,2.8829,4.4472;8.9883,1.9158,4.8244;7.8214,2.6481,3.1472;7.0195,1.9496,3.4258;8.6948,1.9798,2.0488;8.0546,1.7531,1.1899;9.4626,2.6879,1.7175;9.4079,0.7138,2.5133;10.5036,0.7739,3.0443;8.6937,-0.6072,2.301;9.2434,-1.4144,2.7888;8.6169,-0.8174,1.226;7.6678,-0.5657,2.688;7.1664,3.944,2.6279;6.7205,4.4659,3.4818;7.9464,4.6079,2.2283;6.0771,3.7339,1.5666;6.4861,3.2174,0.6871;5.2973,3.0737,1.9749;5.4364,5.0507,1.1067;5.0243,5.5783,1.9789;6.2176,5.708,0.6959;4.3317,4.8613,0.0593;4.7378,4.3123,-0.8025;3.541,4.2235,0.4795;3.7217,6.1826,-0.4197;2.9303,6.013,-1.1585;4.4807,6.8234,-0.8852;3.2841,6.7427,0.4156;7.7761,3.4427,5.4568;8.3277,4.6061,5.9571;7.7955,5.2545,6.8215)|\",5.798746154154999\r\n\"[H]C1=C([H])C([H])=C([C@]([H])(C([H])([H])C(=O)C([H])([H])[H])[C@]2([H])OC(=O)C([H])=C2[H])O1 |(0.5543,-0.3488,5.7081;0.538,0.1728,4.7632;-0.4161,0.3836,3.8186;-1.4366,0.0277,3.8437;0.2076,1.1629,2.7882;-0.2379,1.5072,1.8656;1.497,1.3736,3.1773;2.6492,2.0968,2.5526;2.278,2.5006,1.6056;3.8265,1.1644,2.2197;4.7219,1.7644,2.0032;4.0856,0.523,3.0704;3.5704,0.3014,0.9841;2.6731,0.5478,0.2014;4.5108,-0.8721,0.7817;4.2983,-1.6445,1.5327;5.5565,-0.572,0.9178;4.369,-1.2951,-0.2148;3.0789,3.3212,3.4051;2.184,3.9341,3.5806;4.0132,4.1202,2.6621;5.1569,4.3352,3.4082;6.0819,4.9894,3.0006;4.9804,3.6268,4.6987;5.7407,3.6335,5.4685;3.7846,3.033,4.7039;3.3386,2.4319,5.4864;1.7128,0.7672,4.3945)|\",4.9742411871400005\r\n\"[H]C1=C([H])C([H])=C(/C(=C(\\[H])C([H])([H])OC([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])C([H])([H])[H])C([H])=C1[H] 
|(3.6529,7.4323,-5.9082;4.5038,6.7823,-5.723;5.7576,7.0943,-6.2498;5.8908,7.9952,-6.8433;6.8475,6.2554,-6.0188;7.8143,6.5242,-6.4332;6.7181,5.0895,-5.2433;7.8876,4.2016,-4.9806;7.9904,3.5584,-3.803;7.2319,3.734,-3.0425;9.0436,2.5857,-3.3614;9.9464,2.6251,-3.9897;8.655,1.551,-3.4284;9.3746,2.8771,-2.0114;10.3084,1.9648,-1.4591;9.9198,0.9328,-1.5344;11.2534,1.9974,-2.0288;10.5559,2.3335,-0.0008;11.3524,1.682,0.3862;10.9425,3.3608,0.037;9.3083,2.2165,0.8836;8.5134,2.838,0.4559;8.9413,1.1802,0.853;9.5662,2.625,2.3371;8.6599,2.529,2.9457;9.9011,3.6677,2.4015;10.3432,2.0005,2.796;8.9079,4.0857,-6.0932;9.6666,3.3282,-5.8887;9.4322,5.0363,-6.2545;8.4216,3.8282,-7.0419;5.4426,4.7833,-4.7346;5.3022,3.8647,-4.1727;4.3521,5.6194,-4.9655;3.3777,5.3535,-4.5638)|\",5.417786763454999\r\n\"[H]C1=C([H])C([H])=C(C([H])([H])/C(=C(\\[H])C([H])([H])OC([H])(C([H])([H])[H])C([H])([H])[H])C([H])(C([H])([H])[H])C([H])([H])[H])C([H])=C1[H] |(6.7198,4.821,2.6725;6.434,3.8589,2.2554;7.4001,3.0159,1.7068;8.4443,3.3178,1.6933;7.0282,1.7807,1.1702;7.788,1.1267,0.7472;5.6912,1.3665,1.1744;5.2847,0.0089,0.6166;4.9237,-0.6255,1.4362;6.1817,-0.4841,0.2147;4.221,0.0783,-0.4703;3.0237,-0.4822,-0.237;2.8606,-0.9538,0.7325;1.8236,-0.5869,-1.1337;1.6807,-1.6431,-1.4216;1.9375,-0.0148,-2.0634;0.6959,-0.1144,-0.4043;-0.5533,-0.2256,-1.091;-0.4059,0.0582,-2.1476;-1.1056,-1.6539,-1.0308;-2.0632,-1.7206,-1.5603;-1.2648,-1.9515,0.0118;-0.4221,-2.3752,-1.4912;-1.4931,0.7811,-0.434;-2.4767,0.7697,-0.9165;-1.0794,1.7919,-0.5032;-1.6247,0.5376,0.6263;4.6661,0.7842,-1.7555;5.7013,1.106,-1.5798;4.7096,-0.1639,-2.9702;5.1665,0.3391,-3.8312;3.7087,-0.4913,-3.2709;5.2992,-1.0618,-2.7515;3.8746,2.0728,-2.0573;4.3025,2.5812,-2.9302;3.918,2.762,-1.2082;2.8197,1.8751,-2.2723;4.7285,2.2248,1.7264;3.6847,1.9201,1.7281;5.0952,3.4583,2.2632;4.3351,4.1082,2.6894)|\",6.291272223560001\r\n\"[H]C1=C([H])C([H])=C(C([H])([H])C([H])([H])OC([H])([H])/C([H])=C(\\C([H])([H])[H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C([H])=C1[H] 
|(1.4659,7.1363,8.2349;1.8813,6.5424,7.4251;1.5778,6.8527,6.0982;0.9246,7.6913,5.8704;2.1165,6.0914,5.0601;1.8804,6.3429,4.0281;2.967,5.0092,5.3256;3.5193,4.1583,4.2025;3.6632,4.7602,3.2976;4.4995,3.7504,4.4753;2.5951,2.9875,3.8565;1.6037,3.3639,3.5501;2.4373,2.3522,4.7457;3.1913,2.2428,2.8128;2.4217,1.1123,2.4263;2.2421,0.4713,3.3106;1.4308,1.435,2.068;3.1989,0.3637,1.3797;4.2711,0.4526,1.5268;2.7175,-0.3805,0.3723;1.2276,-0.5704,0.1628;0.6416,0.2311,0.621;0.8802,-1.5141,0.6044;0.9628,-0.6049,-0.899;3.6454,-1.114,-0.6156;3.5038,-0.4833,-2.0233;4.145,-1.01,-2.7407;3.803,0.5708,-2.01;2.4781,-0.5369,-2.4033;5.1294,-1.0344,-0.2118;5.7395,-1.5884,-0.9345;5.3031,-1.4717,0.7776;5.4949,-0.0017,-0.1979;3.2502,-2.609,-0.69;3.9094,-3.1414,-1.3861;2.2223,-2.7469,-1.0408;3.3399,-3.0879,0.2922;3.2637,4.7103,6.6625;3.9277,3.878,6.8872;2.7276,5.4678,7.7043;2.9749,5.2217,8.7339)|\",6.326647024125\r\n\"[H]C1=C([H])C([H])=C(C([H])([H])/C(=C(\\[H])C([H])([H])OC([H])([H])C([H])([H])[H])C([H])([H])[H])C([H])=C1[H] |(7.7787,5.272,6.2258;7.5261,4.2833,5.852;6.3163,4.071,5.1858;5.6233,4.8957,5.04;5.9925,2.8027,4.7054;5.0533,2.6441,4.1809;6.8691,1.7224,4.8828;6.4971,0.3347,4.377;7.3321,-0.3485,4.5886;5.6371,-0.0411,4.9462;6.169,0.2835,2.8934;4.9327,-0.0474,2.4921;4.1815,-0.2742,3.2486;4.4181,-0.184,1.0895;4.3286,-1.2542,0.821;5.0943,0.2714,0.3494;3.1359,0.4257,1.026;2.5185,0.2794,-0.2404;3.1499,0.7282,-1.0273;2.4088,-0.7912,-0.4887;1.1598,0.9595,-0.1954;0.652,0.867,-1.1618;1.2714,2.0231,0.039;0.5287,0.5041,0.5746;7.3221,0.6062,1.975;7.0298,0.6749,0.9246;8.1034,-0.1632,2.0527;7.7882,1.5563,2.2624;8.0796,1.9487,5.548;8.7719,1.1218,5.6923;8.4078,3.2175,6.0315;9.3534,3.3716,6.5452)|\",6.419165733294999\r\n\"[H]C([H])([H])C([H])([H])O[C@]1([H])ON=C(C(F)(F)F)[C@]2([H])O[C@]12[H] 
|(0.5906,0.3145,2.4244;1.4264,-0.1611,1.8985;1.973,-0.7728,2.6254;1.0188,-0.8246,1.1291;2.3226,0.8997,1.2725;2.7362,1.5794,2.0252;1.7666,1.5022,0.5493;3.3952,0.3038,0.5121;4.5486,0.0445,1.2153;4.3449,-0.3887,2.2096;5.2002,1.3327,1.4535;6.5332,1.3934,1.7975;7.2933,0.4362,1.4092;8.7543,0.5319,1.7749;9.126,-0.5926,2.4395;9.5085,0.5884,0.6555;9.0469,1.5845,2.5394;6.8684,-0.7195,0.5816;7.5538,-1.5586,0.4941;6.172,-0.3608,-0.635;5.4203,-0.9074,0.4377;5.0017,-1.9012,0.2763)|\",5.9320819409\r\n\"[H]C([H])([H])C([H])([H])OC(=O)C1=NO[C@@]([H])(OC([H])([H])C([H])([H])[H])[C@@]2([H])O[C@@]12[H] |(4.0829,-4.0162,4.0744;4.6827,-3.3717,3.4216;4.5647,-2.3382,3.7591;5.7333,-3.6622,3.5224;4.2154,-3.5243,1.9822;3.1802,-3.1962,1.8624;4.3134,-4.5551,1.6347;5.0543,-2.7683,1.0725;4.7414,-1.4782,0.9046;3.8237,-0.9023,1.4611;5.6457,-0.762,-0.0503;6.6059,-1.4293,-0.5941;7.4362,-0.7722,-1.4703;7.6242,0.6785,-1.4004;8.3462,0.8538,-0.5846;8.1135,1.0774,-2.6239;9.4876,0.7271,-2.8877;9.7105,-0.2509,-2.4478;9.5472,0.6254,-3.9747;10.4466,1.8028,-2.3928;11.4781,1.5351,-2.6494;10.3952,1.9223,-1.3044;10.214,2.7674,-2.8552;6.3462,1.4115,-1.0916;6.4464,2.4906,-0.9676;5.1774,0.9592,-1.7547;5.3338,0.6592,-0.3429;4.6277,1.1531,0.3162)|\",5.287172115214999\r\n\"[H]C([H])=C([H])/C([H])=C(\\[H])[Ge](C([H])([H])[H])(C([H])([H])[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H] 
|(6.2737,8.5242,-0.9207;6.6423,7.5083,-1.0261;7.2238,7.1117,-0.1965;6.3943,6.7757,-2.1218;5.8067,7.2062,-2.9333;6.8609,5.4083,-2.321;7.4449,4.9892,-1.4981;6.6157,4.6691,-3.4207;6.0252,5.1351,-4.2139;7.2265,2.8261,-3.7194;9.2026,2.7421,-3.6652;9.6383,3.4223,-4.4034;9.5465,1.7254,-3.8837;9.5735,3.024,-2.6747;6.5801,2.2573,-5.4982;7.0458,2.8571,-6.2868;5.4942,2.3739,-5.5728;6.8243,1.2053,-5.679;6.5035,1.6227,-2.3105;6.8438,0.6035,-2.5407;6.9768,1.8967,-1.3575;4.9747,1.6555,-2.1693;4.6474,2.687,-1.9786;4.5094,1.3624,-3.1219;4.4374,0.7448,-1.055;4.7658,-0.2887,-1.2414;4.8913,1.039,-0.0969;2.9089,0.7778,-0.9243;2.5828,1.8134,-0.7456;2.4561,0.4794,-1.8819;2.3646,-0.1223,0.1923;2.6907,-1.157,0.014;2.814,0.1764,1.1501;0.8374,-0.0819,0.3111;0.4803,-0.734,1.1166;0.3587,-0.4096,-0.62;0.4835,0.9344,0.524)|\",5.295335530729999\r\n\"[H]C([H])=C([H])/C([H])=C(\\[H])[Si](C([H])([H])[H])(C([H])([H])[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H] |(5.5769,-6.1843,6.6031;6.0779,-5.7718,5.7325;6.7897,-4.9694,5.9145;5.8346,-6.2259,4.494;5.1141,-7.0319,4.3503;6.4756,-5.7129,3.2894;7.191,-4.9028,3.4533;6.2416,-6.1648,2.0392;5.5139,-6.9768,1.946;7.064,-5.5176,0.4674;6.3662,-6.493,-1.001;6.5698,-7.5652,-0.8944;6.8143,-6.1635,-1.9461;5.2804,-6.3695,-1.0883;8.9425,-5.7711,0.5633;9.1942,-6.8339,0.6564;9.3731,-5.2523,1.428;9.439,-5.3866,-0.3362;6.7142,-3.6556,0.26;7.1659,-3.1267,1.1134;7.2617,-3.3018,-0.6272;5.2294,-3.2681,0.1442;4.7852,-3.7657,-0.7298;4.6854,-3.651,1.019;4.9918,-1.7555,0.0267;5.4269,-1.2544,0.9042;5.5393,-1.3684,-0.846;3.51,-1.3779,-0.0947;3.0783,-1.8757,-0.976;2.9627,-1.7733,0.7744;3.2633,0.1327,-0.2007;3.6913,0.6311,0.6807;3.8092,0.5299,-1.0685;1.7794,0.4936,-0.3239;1.6332,1.5776,-0.3953;1.2108,0.1384,0.5446;1.3351,0.0377,-1.2174)|\",5.254518453155001\r\n\"[H]C([H])=C([H])/C([H])=C(\\[H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H] 
|(6.3778,-8.8756,2.3029;6.9418,-7.9525,2.3966;8.0131,-8.0155,2.2175;6.35,-6.7919,2.7147;5.2729,-6.7737,2.8859;7.0459,-5.5195,2.8572;8.1214,-5.5408,2.688;6.4312,-4.3674,3.1756;5.3528,-4.4035,3.3406;7.0529,-2.9961,3.3502;8.5898,-3.0415,3.2744;9.007,-3.7096,4.0363;9.0069,-2.0415,3.4406;8.9362,-3.3859,2.2932;6.6449,-2.4527,4.7406;7.0733,-3.0717,5.537;5.5579,-2.4479,4.8765;7.0064,-1.4256,4.877;6.5504,-2.0363,2.2252;7.0212,-1.0559,2.3927;6.9433,-2.4059,1.2677;5.0344,-1.8353,2.0851;4.5486,-2.7937,1.8578;4.6061,-1.489,3.0357;4.678,-0.8254,0.9846;5.1549,0.1404,1.209;5.1094,-1.1611,0.0299;3.1682,-0.6166,0.8112;2.6923,-1.5813,0.5795;2.7337,-0.2867,1.7669;2.8114,0.398,-0.2829;3.2851,1.3624,-0.05;3.2473,0.0692,-1.2372;1.3021,0.5987,-0.4533;1.0811,1.3298,-1.2394;0.8429,0.9592,0.4755;0.8053,-0.3413,-0.7235)|\",5.488536364585\r\n\"[H]OC([H])([H])[C@@]1([H])N(C2=C([H])C([H])=C([H])C([H])=C2[H])[C@]([H])(C([H])([H])N(=O)=O)C([H])([H])C1([H])[H] |(2.2373,2.9144,-3.931;2.6952,2.8955,-3.0777;2.3863,1.6634,-2.4357;1.3074,1.5627,-2.2533;2.7024,0.7991,-3.0417;3.1447,1.6551,-1.0998;2.9145,2.5935,-0.5803;2.782,0.4923,-0.2776;1.5599,0.3513,0.3797;0.6257,1.4098,0.4214;0.851,2.3564,-0.0566;-0.593,1.259,1.0781;-1.2885,2.0945,1.0896;-0.9257,0.0633,1.7165;-1.8774,-0.0465,2.2275;-0.008,-0.9863,1.6846;-0.2405,-1.9293,2.1729;1.2153,-0.8555,1.0295;1.8965,-1.6993,1.0238;3.8621,-0.4793,-0.1636;3.5069,-1.4972,-0.3611;4.4616,-0.468,1.2701;3.671,-0.5405,2.018;5.088,0.4055,1.4472;5.3284,-1.6738,1.4564;6.546,-1.5115,1.5046;4.7526,-2.7614,1.4986;4.8807,-0.0222,-1.2218;5.9049,-0.3011,-0.9567;4.6476,-0.4942,-2.1821;4.6623,1.4977,-1.2943;4.9911,1.9433,-2.2362;5.2,2.0036,-0.4843)|\",3.5510857490250003\r\n\"[H]C([H])=C([H])/C(C1=C(C([H])([H])[H])C([H])=C([H])C([H])=C1[H])=C(/[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H] 
|(10.1,3.2019,-0.5713;9.1463,3.3995,-1.0519;9.0574,4.3318,-1.6022;8.1303,2.5285,-0.9562;8.2887,1.6194,-0.3804;6.7993,2.6984,-1.5553;6.5744,3.8776,-2.4586;7.0379,3.886,-3.7937;7.779,2.7049,-4.3772;7.9561,2.844,-5.4483;7.2145,1.7748,-4.2436;8.7491,2.5577,-3.8884;6.7788,5.0115,-4.5852;7.1263,5.0191,-5.616;6.0821,6.1132,-4.0865;5.8973,6.9726,-4.7259;5.6233,6.1008,-2.771;5.0756,6.9492,-2.3693;5.8712,4.9862,-1.9687;5.5189,4.9677,-0.9407;5.7626,1.865,-1.3091;4.8214,2.0924,-1.8113;5.7204,0.6347,-0.445;6.6656,0.4765,0.0851;4.9557,0.7789,0.3345;5.3644,-0.6355,-1.244;4.4209,-0.4714,-1.7846;6.1317,-0.8018,-2.0129;5.2363,-1.886,-0.3651;6.1792,-2.0463,0.177;4.4699,-1.7113,0.4035;4.8847,-3.1468,-1.1614;4.7964,-4.0216,-0.5072;5.6531,-3.3682,-1.9122;3.931,-3.0283,-1.6903)|\",5.36608513186\r\n\"[H]C1=C([H])C([H])=C(C2=NO[C@]([H])(OC([H])([H])[H])[C@@]([H])(OC([H])([H])[H])C2([H])[H])C([H])=C1[H] |(-1.574,-7.6236,-0.9197;-0.8197,-6.8689,-0.7136;0.5261,-7.2308,-0.5944;0.8216,-8.2703,-0.7101;1.4941,-6.2681,-0.3313;2.5376,-6.5458,-0.2375;1.1389,-4.9142,-0.1837;2.1727,-3.8835,0.1087;3.3459,-4.3358,0.3851;4.3811,-3.4335,0.674;4.0497,-2.0882,1.0024;5.0254,-1.5863,0.9447;3.4971,-1.959,2.2821;4.343,-2.416,3.3326;3.8358,-2.1697,4.268;5.3171,-1.9052,3.3032;4.5063,-3.497,3.2733;3.0217,-1.5197,0.0222;2.7624,-0.4952,0.3285;3.5049,-1.5342,-1.3091;4.4331,-0.5121,-1.6199;4.6262,-0.5815,-2.6932;5.3877,-0.6277,-1.0869;4.0241,0.4855,-1.3934;1.7932,-2.4196,0.0768;1.2037,-2.1798,0.9697;1.1725,-2.2128,-0.8003;-0.2143,-4.5627,-0.3063;-0.5252,-3.5292,-0.1932;-1.1847,-5.5321,-0.567;-2.2269,-5.237,-0.6548)|\",5.137509497440001\r\n\"[H]OC(=N/C([H])([H])C([H])([H])C([H])=C([H])[H])/C([H])=C(\\[H])[C@]([H])(N([H])[H])C([H])(C([H])([H])[H])C([H])([H])[H] 
|(4.8708,-0.4169,-1.066;4.9556,0.2788,-1.7371;6.2917,0.543,-1.907;6.6165,1.1178,-2.9926;7.9761,1.5446,-3.2538;7.957,2.6343,-3.4011;8.6878,1.358,-2.4346;8.5011,0.8856,-4.5457;8.5645,-0.2001,-4.3999;7.7522,1.057,-5.3312;9.8364,1.4319,-4.9681;9.8648,2.5025,-5.1817;10.9593,0.7216,-5.0838;11.897,1.1759,-5.3931;10.9814,-0.3479,-4.8824;7.1659,0.1346,-0.7777;8.1965,-0.1112,-1.0198;6.7569,0.0643,0.4964;5.7323,0.3497,0.7463;7.599,-0.3758,1.6692;8.555,-0.7584,1.2856;6.9038,-1.5229,2.2854;6.0961,-1.1908,2.8141;7.515,-1.978,2.9605;7.9111,0.8083,2.6349;6.9419,1.2122,2.9713;8.6829,0.3325,3.8751;8.887,1.172,4.5488;9.6497,-0.1052,3.5918;8.1267,-0.418,4.4468;8.6799,1.9358,1.9295;8.8737,2.7638,2.6207;8.1282,2.3329,1.0724;9.6514,1.5765,1.5647)|\",5.58377621226\r\n\"[H]OC(=O)C([H])([H])C(=O)O[C@]([H])(C([H])([H])[H])C([H])([H])C([H])([H])C([H])=C([H])[H] |(10.8527,1.5219,0.8964;10.4756,0.9976,0.1706;9.2253,1.4328,-0.135;8.5978,0.9099,-1.0178;8.73,2.5986,0.725;9.4474,3.4245,0.6773;8.6559,2.2588,1.7655;7.3805,3.1184,0.2537;7.2396,4.127,-0.4004;6.3948,2.3088,0.6689;5.0317,2.6207,0.2345;5.1109,3.1418,-0.7239;4.3882,3.5362,1.2701;3.3662,3.7881,0.9667;4.9548,4.4674,1.364;4.3462,3.047,2.2497;4.3189,1.2813,0.0649;4.3797,0.73,1.0133;3.2546,1.4814,-0.1135;4.8808,0.4232,-1.0849;4.7441,0.9529,-2.037;5.9632,0.3063,-0.9426;4.227,-0.9294,-1.1607;4.3522,-1.5656,-0.282;3.5214,-1.389,-2.1946;3.0738,-2.3794,-2.1892;3.3712,-0.793,-3.093)|\",6.598760874625\r\n\"[H]O[C@@]([H])(C(=O)OC([H])([H])C([H])([H])[H])[C@@]1([H])C(=O)C([H])=C([H])C([H])([H])C1([H])[H] 
|(4.0523,2.9652,1.6271;4.242,2.109,2.0598;4.9217,1.3427,1.09;4.3789,0.3995,0.9224;4.9183,2.1131,-0.2332;4.4727,3.2422,-0.309;5.4056,1.3942,-1.2475;5.5552,2.0672,-2.5292;5.3392,1.2939,-3.2704;4.8032,2.8557,-2.5952;6.9656,2.6153,-2.6719;7.0693,3.1157,-3.6417;7.7044,1.8095,-2.618;7.1834,3.3328,-1.8768;6.3611,0.9723,1.5407;6.8265,0.4697,0.6851;7.1731,2.2527,1.7817;7.5421,2.9375,0.8339;7.4436,2.6547,3.1715;8.025,3.565,3.2926;6.9366,1.9883,4.2237;7.1035,2.3749,5.2287;6.117,0.7375,4.0913;5.0562,1.0092,4.1766;6.334,0.061,4.9281;6.3838,0.0266,2.7562;5.6665,-0.7904,2.6087;7.3784,-0.4356,2.8067)|\",5.10485583538\r\n\"[H]O[C@@]([H])(C(=O)OC([H])([H])C([H])([H])[H])[C@]1([H])C(=O)C2=C(C([H])=C([H])C([H])=C2[H])C1([H])[H] |(6.5296,-0.4321,1.2126;5.965,-0.5816,0.4283;5.3781,0.6562,0.1199;4.8591,0.5405,-0.8366;4.3361,1.0357,1.1848;4.4398,0.77,2.3608;3.2956,1.7101,0.6407;2.2468,2.1277,1.554;1.3593,2.2099,0.9221;2.0962,1.3365,2.2921;2.5826,3.4505,2.2256;1.751,3.7593,2.8695;2.7506,4.2378,1.4831;3.4773,3.3518,2.8462;6.4683,1.7553,-0.0337;7.1781,1.3409,-0.7602;7.2455,1.9605,1.2818;7.7173,1.0622,1.9641;7.3083,3.4093,1.549;6.6071,4.1074,0.5587;6.5377,5.4998,0.6181;6.0023,6.066,-0.1401;7.1719,6.1603,1.6729;7.1245,7.2449,1.7283;7.8695,5.4514,2.6648;8.3521,5.9931,3.4734;7.9425,4.0633,2.6101;8.4732,3.4856,3.3612;5.9995,3.1681,-0.4657;6.3365,3.4107,-1.4804;4.9065,3.2471,-0.4715)|\",5.053154203785\r\n\"[H]C([H])=C(/C(C1=C([H])C([H])=C([H])C([H])=C1C([H])([H])[H])=C(/[H])C1=C([H])C([H])=C([H])C([H])=C1[H])C([H])([H])[H] 
|(1.2328,2.5826,0.106;1.5307,1.7059,0.6764;1.1043,1.5982,1.6691;2.3806,0.8016,0.172;2.7992,-0.4057,0.9498;4.2733,-0.6613,1.076;4.79,-1.8596,0.5565;4.1057,-2.5524,0.0745;6.1484,-2.1648,0.6331;6.5199,-3.0991,0.2205;7.019,-1.2617,1.239;8.0812,-1.4811,1.308;6.517,-0.0724,1.7681;7.1952,0.6225,2.2585;5.1565,0.2525,1.7001;4.6696,1.5345,2.3372;5.4682,1.9999,2.9235;3.8204,1.3508,3.0049;4.3289,2.2632,1.5935;1.9508,-1.3037,1.5055;2.4199,-2.0805,2.1098;0.4808,-1.4234,1.4416;-0.2872,-1.0761,0.315;0.2035,-0.665,-0.5601;-1.6669,-1.2644,0.3049;-2.2376,-0.9945,-0.5801;-2.3173,-1.8048,1.4176;-3.3943,-1.9492,1.4063;-1.5695,-2.1698,2.5376;-2.0606,-2.6014,3.406;-0.1871,-1.9906,2.5424;0.3913,-2.2867,3.4148;2.9967,0.9692,-1.2003;2.6401,1.883,-1.6854;2.7527,0.117,-1.8487;4.0909,1.0119,-1.1429)|\",4.65314684355\r\n\"[H]C([H])=C([H])C(=C(/[H])C1=C([H])C([H])=C([H])C([H])=C1[H])/C([H])=C(\\[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H] |(3.0157,-0.4506,-5.2176;4.0543,-0.4601,-4.8986;4.3353,0.2518,-4.1284;4.9388,-1.2947,-5.4607;4.5913,-1.9497,-6.2588;6.3747,-1.4139,-5.1326;7.309,-1.6841,-6.0896;8.3257,-1.8524,-5.7342;7.1611,-1.817,-7.5409;6.172,-1.1582,-8.3;5.4814,-0.4821,-7.8084;6.0981,-1.3286,-9.6816;5.33,-0.8041,-10.2441;7.0099,-2.1516,-10.345;6.9489,-2.2813,-11.4221;8.0123,-2.7938,-9.6132;8.7364,-3.4271,-10.1192;8.0905,-2.6203,-8.235;8.8752,-3.1213,-7.6725;6.8447,-1.3142,-3.7417;7.929,-1.2428,-3.6401;6.1169,-1.3552,-2.6122;5.037,-1.4633,-2.6773;6.7072,-1.2763,-1.2311;6.4086,-2.1683,-0.6579;7.8023,-1.3018,-1.3004;6.2851,-0.0237,-0.4293;6.8545,-0.0093,0.5111;6.5856,0.8743,-0.987;4.7879,0.0539,-0.1023;4.2043,0.0799,-1.0322;4.4847,-0.8632,0.4234;4.429,1.2744,0.752;3.3566,1.3055,0.9757;4.6896,2.2079,0.2382;4.9688,1.2618,1.7069)|\",4.106198004045\r\n\"[H]C([H])=C([H])C(=C(\\[H])C1=C([H])C([H])=C([H])C([H])=C1[H])/C([H])=C(\\[H])C1=C([H])C([H])=C([H])C([H])=C1[H] 
|(1.2089,1.5937,0.2524;1.4216,0.5605,0.5114;0.6752,0.0426,1.104;2.5713,-0.0137,0.1284;3.279,0.5953,-0.4341;3.0477,-1.3786,0.4179;4.3833,-1.6121,0.2481;4.9888,-0.7522,-0.0382;5.1545,-2.8441,0.4224;4.6255,-4.1409,0.2544;3.5931,-4.2634,-0.0536;5.4261,-5.2684,0.4295;4.9959,-6.2563,0.2864;6.7738,-5.1361,0.769;7.3942,-6.0179,0.9042;7.3223,-3.8592,0.9154;8.3735,-3.7421,1.1657;6.5261,-2.7327,0.7352;6.9607,-1.7421,0.8484;2.1383,-2.4096,0.9371;2.5935,-3.1155,1.629;0.84,-2.5696,0.6042;0.4115,-1.9032,-0.1418;-0.0767,-3.597,1.1109;-1.3399,-3.7337,0.5056;-1.6031,-3.075,-0.319;-2.2493,-4.6974,0.9371;-3.2164,-4.7834,0.4485;-1.9185,-5.5481,1.9925;-2.6254,-6.2989,2.3348;-0.6713,-5.4211,2.6122;-0.4082,-6.0732,3.4411;0.2361,-4.4584,2.1812;1.1924,-4.3667,2.6877)|\",3.76061341391\r\n\"[H]C([H])=C([H])/C(C1=C([H])C([H])=C([H])S1)=C(/[H])C1=C([H])C([H])=C([H])C([H])=C1[H] |(5.7013,-0.5958,0.6674;5.0631,0.1126,0.147;5.5494,0.7852,-0.5525;3.7396,0.1306,0.364;3.3192,-0.5951,1.0569;2.7651,1.0519,-0.2437;3.2062,2.37,-0.732;2.7458,3.0511,-1.8373;2.0429,2.6112,-2.5363;3.3007,4.3547,-1.9851;3.0602,5.0233,-2.8048;4.187,4.6659,-0.9906;4.7583,5.576,-0.8635;4.3634,3.3667,0.1425;1.436,0.7545,-0.334;0.7778,1.5667,-0.6362;0.7481,-0.5095,-0.0543;1.3351,-1.7791,-0.2296;2.3459,-1.8524,-0.6166;0.6189,-2.9426,0.0462;1.0916,-3.91,-0.1022;-0.7019,-2.8717,0.493;-1.258,-3.7807,0.7053;-1.3088,-1.6226,0.6488;-2.3407,-1.5555,0.9834;-0.5961,-0.4605,0.3698;-1.0742,0.5087,0.4916)|\",4.002794740855\r\n\"[H]C([H])=C([H])/C(C1=C([H])C([H])=C(OC([H])([H])[H])C([H])=C1[H])=C(/[H])C1=C([H])C([H])=C([H])C([H])=C1[H] 
|(4.8962,-2.1104,-0.4836;4.6599,-1.0511,-0.5282;5.4995,-0.3646,-0.5665;3.3831,-0.6366,-0.5198;2.6046,-1.3916,-0.4551;2.914,0.7551,-0.5162;3.9202,1.8613,-0.4954;3.9471,2.7924,0.558;3.2522,2.6756,1.3851;4.8502,3.8474,0.5695;4.8731,4.5608,1.3875;5.7634,4.0043,-0.4825;6.6115,5.0701,-0.3792;7.5485,5.2847,-1.4216;8.1052,6.1818,-1.1444;7.0502,5.4515,-2.3859;8.2459,4.4418,-1.5187;5.7561,3.0891,-1.5412;6.447,3.1881,-2.3709;4.842,2.0323,-1.536;4.8381,1.3383,-2.3717;1.5971,1.0898,-0.4557;1.3765,2.1516,-0.3495;0.4008,0.2302,-0.4903;0.2679,-0.8818,-1.3434;1.072,-1.1282,-2.0294;-0.9004,-1.6432,-1.3522;-0.9804,-2.4937,-2.0243;-1.9688,-1.3078,-0.5189;-2.8793,-1.9008,-0.5297;-1.8651,-0.1913,0.3141;-2.6952,0.0881,0.9577;-0.7,0.5716,0.3192;-0.6276,1.4415,0.9679)|\",4.378311854545001\r\n\"[H]C([H])=C([H])/C(C1=C([H])C([H])=C([H])C([H])=C1OC([H])([H])[H])=C(/[H])C1=C([H])C([H])=C([H])C([H])=C1[H] |(6.2634,-1.7207,-0.0273;5.6168,-1.0217,-0.5498;6.1026,-0.2732,-1.1694;4.2837,-1.0868,-0.4243;3.8561,-1.8512,0.2206;3.3198,-0.2063,-1.0982;3.7723,0.5691,-2.2941;3.7822,1.9671,-2.2627;3.4736,2.4605,-1.3452;4.1908,2.7261,-3.3628;4.1934,3.8108,-3.3074;4.5971,2.0749,-4.5231;4.9214,2.6465,-5.3889;4.5891,0.6787,-4.5918;4.8943,0.1887,-5.5093;4.1749,-0.0752,-3.4883;4.1006,-1.4373,-3.4855;4.5154,-2.1425,-4.6425;4.3886,-3.2008,-4.4073;5.5698,-1.9477,-4.8797;3.8979,-1.8893,-5.5148;2.0325,-0.0548,-0.6894;1.3853,0.5349,-1.3373;1.3608,-0.6019,0.4989;-0.0115,-0.9109,0.4154;-0.5294,-0.756,-0.5282;-0.7055,-1.4215,1.5089;-1.7613,-1.6614,1.4136;-0.0487,-1.6196,2.7261;-0.5895,-2.0126,3.5828;1.3036,-1.2918,2.837;1.8179,-1.418,3.7863;2.0001,-0.7848,1.7407;3.0397,-0.4963,1.854)|\",4.41368665511\r\n\"[H]C([H])=C([H])/C(C1=C([H])C([H])=C([H])C(C([H])([H])[H])=C1[H])=C(/[H])C1=C([H])C([H])=C([H])C([H])=C1[H] 
|(3.7916,1.929,-1.5667;3.967,1.059,-0.9405;4.9869,0.692,-0.8874;2.9558,0.4877,-0.2676;1.9675,0.9293,-0.3605;3.0541,-0.6503,0.655;4.4022,-1.2375,0.9371;4.9405,-1.2053,2.2324;4.3852,-0.7166,3.0281;6.181,-1.7849,2.493;6.5909,-1.7552,3.4993;6.899,-2.4021,1.468;7.8653,-2.8538,1.6798;6.3867,-2.4478,0.1662;7.1592,-3.106,-0.9549;7.9891,-3.7065,-0.5687;7.583,-2.3602,-1.6402;6.5169,-3.7628,-1.553;5.1393,-1.86,-0.0804;4.7222,-1.9015,-1.084;1.985,-1.1567,1.3249;2.2015,-1.9265,2.0653;0.5572,-0.8103,1.2149;-0.0846,-0.5543,-0.0115;0.4812,-0.6253,-0.9351;-1.4455,-0.2551,-0.0602;-1.9188,-0.0664,-1.0203;-2.202,-0.2139,1.1122;-3.2629,0.0179,1.0721;-1.5874,-0.4914,2.3355;-2.1688,-0.4749,3.2538;-0.2291,-0.795,2.3833;0.2419,-1.0131,3.339)|\",4.50620536428\r\n\"[H]C1=C([H])C(C([H])([H])[H])=C(/C(=C(\\[H])C#N)C([H])([H])[H])C([H])=C1C([H])([H])[H] |(2.5509,2.0281,-0.0336;3.0921,1.0873,-0.1023;4.4638,1.0979,-0.3444;4.9727,2.051,-0.4694;5.2119,-0.0816,-0.4412;6.696,0.0138,-0.7254;6.9275,0.9645,-1.2161;7.0479,-0.7964,-1.3705;7.2956,-0.0278,0.1928;4.5249,-1.3103,-0.2969;5.2143,-2.6305,-0.3897;6.2554,-2.915,0.425;6.5882,-2.1938,1.1658;6.9709,-4.149,0.3986;7.5669,-5.1501,0.3932;4.698,-3.6106,-1.4115;5.2338,-4.5621,-1.3747;4.7964,-3.189,-2.4199;3.6299,-3.8075,-1.2614;3.1386,-1.3028,-0.0755;2.62,-2.2526,0.0345;2.4024,-0.1205,0.0401;0.918,-0.1561,0.3203;0.4628,0.8295,0.1814;0.7135,-0.4714,1.3517;0.4001,-0.8615,-0.3401)|\",5.134788358935\r\n\"[H]C1=C([H])C([H])=C(C2=NO[C@]([H])(OC([H])([H])C([H])([H])[H])[C@@]([H])(OC([H])([H])[H])C2([H])[H])C([H])=C1[H] 
|(10.8824,1.0953,4.0192;9.9274,0.9866,3.5121;9.8359,0.2282,2.3452;10.7207,-0.2498,1.9337;8.6085,0.0806,1.6982;8.5594,-0.4925,0.7775;7.4548,0.6962,2.2079;6.1456,0.563,1.5248;5.2843,1.5043,1.7155;4.0546,1.2448,1.0373;4.2841,1.0293,-0.3708;4.8944,1.8736,-0.7357;3.055,0.9646,-1.0037;2.3287,2.2031,-1.0247;2.1076,2.5159,0.0014;2.9567,2.9809,-1.4866;1.0554,1.9854,-1.8222;0.4757,2.9138,-1.8668;0.436,1.213,-1.3552;1.2848,1.6696,-2.8452;5.0572,-0.2818,-0.5982;4.3373,-1.069,-0.8683;6.0402,-0.1501,-1.6139;5.5088,-0.1045,-2.9282;6.3621,-0.0677,-3.6102;4.8771,0.7788,-3.0935;4.9088,-1.0014,-3.147;5.7785,-0.653,0.706;5.0857,-1.2388,1.3243;6.6346,-1.2885,0.4797;7.56,1.4658,3.3803;6.6689,1.9436,3.773;8.7844,1.6064,4.0259;8.8477,2.1976,4.9355)|\",5.07764445033\r\n\"[H]C1=NC2=C(C([H])=C1[H])C([H])=C([H])C([H])=C2OC([H])([H])C1([H])N([H])N([H])N([H])N1[H] |(-3.5534,2.6983,0.6303;-2.565,3.1513,0.5577;-1.5394,2.3437,0.7402;-0.2839,2.8699,0.6557;-0.0487,4.255,0.3684;-1.1885,5.0795,0.1852;-1.0491,6.136,-0.0323;-2.4468,4.5345,0.2783;-3.3383,5.1389,0.1405;1.2786,4.7442,0.2609;1.4385,5.7961,0.0381;2.3424,3.8873,0.4293;3.3606,4.2567,0.3446;2.1269,2.522,0.7233;2.9614,1.8423,0.863;0.8489,2.0121,0.8402;0.7183,0.6672,1.0535;0.0159,0.2347,2.2227;0.3854,0.779,3.1042;-1.0565,0.4167,2.105;0.3057,-1.2511,2.3888;0.1582,-1.7442,1.421;-0.5889,-1.8453,3.4151;-0.9831,-1.0994,3.9896;0.2531,-2.6061,4.2992;0.3235,-3.5345,3.8806;1.5758,-2.0457,4.1932;1.5726,-1.2388,4.8181;1.6926,-1.5123,2.8608;2.0988,-2.2588,2.3)|\",4.536137887835\r\n\"[H]/C1=C(\\[H])[C@]([H])(C([H])(C(=O)OC([H])([H])[H])C(=O)OC([H])([H])[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C1([H])[H] 
|(0.7822,-3.9205,5.6835;1.6601,-3.8849,5.0377;1.6018,-2.993,4.0377;0.699,-2.3824,4.0087;2.596,-2.6117,2.9545;2.8413,-1.5538,3.1251;1.912,-2.6317,1.5526;1.6715,-3.649,1.2407;2.815,-2.0054,0.4856;3.6151,-1.1178,0.6726;2.5831,-2.5673,-0.7208;3.343,-2.015,-1.8112;3.1247,-0.9505,-1.9288;3.0318,-2.5727,-2.6947;4.414,-2.1407,-1.6327;0.6102,-1.8261,1.5438;0.498,-0.688,1.9417;-0.405,-2.547,1.0264;-1.6704,-1.8651,0.9607;-2.3646,-2.5818,0.5218;-1.591,-0.9713,0.3365;-1.9991,-1.5717,1.9611;3.9388,-3.3769,2.9363;4.6275,-2.8123,2.2981;4.37,-3.3168,3.9385;3.9343,-4.8397,2.4336;3.8526,-4.8304,1.3373;4.9248,-5.2676,2.645;2.8473,-5.792,2.9644;1.8577,-5.4034,2.696;2.9484,-6.7455,2.4278;2.8618,-6.0963,4.4684;2.0284,-6.7796,4.6812;3.7795,-6.6428,4.7283;2.7535,-4.8662,5.4024;3.7231,-4.3622,5.4461;2.5693,-5.2284,6.4211)|\",6.038206342595\r\n\"[H]O[C@]([H])(C([H])([H])[H])[C@]([H])(C([H])([H])[H])[C@]([H])(C(=O)OC([H])([H])C([H])([H])[H])N([H])[H] |(8.1174,-2.1818,1.5153;7.2603,-2.0288,1.9436;6.2447,-2.4996,1.0504;5.3337,-2.4795,1.6597;6.497,-3.9529,0.6423;5.6763,-4.3392,0.0274;6.5853,-4.5793,1.5357;7.4247,-4.0584,0.0652;6.0465,-1.5354,-0.1492;5.2969,-1.9882,-0.8134;7.3437,-1.3254,-0.9429;7.7599,-2.2745,-1.2974;8.1079,-0.8244,-0.3345;7.1703,-0.7063,-1.8284;5.4594,-0.1797,0.3407;6.0976,0.2122,1.1373;4.0536,-0.395,0.9016;3.1444,-0.8709,0.25;3.9377,0.0244,2.1746;2.6226,-0.0751,2.7764;2.6243,0.6967,3.5498;1.8721,0.1682,2.0204;2.3846,-1.4561,3.3685;1.4231,-1.4742,3.8947;3.1727,-1.7132,4.0838;2.3577,-2.2141,2.5806;5.2964,0.8726,-0.6709;4.7407,0.5137,-1.4463;6.2004,1.1532,-1.0422)|\",6.7484234924\r\n\"[H]/N=C(/N([H])N([H])[H])C([H])([H])OC1=C2N=C([H])C([H])=C([H])C2=C([H])C([H])=C1[H] 
|(0.8021,-2.7027,0.4293;-0.0042,-3.3253,0.3568;-1.0675,-2.6191,0.2932;-2.3072,-3.2222,0.1379;-3.0852,-2.7493,0.5821;-2.4051,-4.6301,0.2925;-1.6203,-4.9123,0.8878;-2.1897,-5.0336,-0.6198;-1.1539,-1.0985,0.3057;-1.9082,-0.7458,1.0154;-1.4402,-0.7419,-0.6931;0.1314,-0.5887,0.6612;0.4505,0.6989,0.3223;-0.3578,1.8192,0.7024;-1.5138,1.6067,1.3939;-2.2333,2.6549,1.7411;-3.1479,2.4489,2.2963;-1.8841,3.9931,1.4373;-2.5297,4.8072,1.7522;-0.7211,4.2254,0.7432;-0.4109,5.2365,0.4898;0.0969,3.1339,0.3527;1.3259,3.3042,-0.334;1.6562,4.3083,-0.5868;2.086,2.2051,-0.6634;3.0306,2.3292,-1.1853;1.646,0.9029,-0.3384;2.2418,0.0349,-0.6025)|\",4.468109425209999\r\n\"[H]C([H])=C1C(=O)C([H])([H])[C@]2(C([H])(C([H])([H])[H])C([H])([H])[H])O[C@@]1(C([H])([H])[H])C([H])=C2[H] |(2.1649,3.3933,5.1643;2.5257,3.7466,4.2029;2.9974,4.7235,4.1689;2.3782,2.9797,3.1175;1.7126,1.6282,3.2632;1.1643,1.2946,4.2995;1.7821,0.6452,2.0852;0.8029,0.1622,1.9968;2.4954,-0.1349,2.3818;2.2478,1.3108,0.7697;2.6944,0.3406,-0.3439;2.9424,0.9856,-1.1984;3.9654,-0.4383,0.0296;4.3256,-1.0051,-0.8365;3.7805,-1.1599,0.8349;4.7601,0.239,0.3529;1.5704,-0.615,-0.7755;1.8959,-1.2144,-1.6329;0.6606,-0.0825,-1.0744;1.3037,-1.312,0.028;3.3755,2.1404,1.1258;2.8123,3.339,1.6963;3.8368,4.4569,1.5837;4.1182,4.584,0.5344;4.7345,4.215,2.1609;3.4302,5.4051,1.9502;1.5407,3.523,0.8706;0.9822,4.4522,0.8302;1.2128,2.3476,0.3391;0.3254,2.116,-0.2396)|\",5.221864791095\r\n\"[H]C1=C([H])C2=C(C([H])=C1[H])[C@@](C#N)(C([H])(C([H])([H])[H])C([H])([H])[H])[C@@]([H])(C([H])([H])[H])C2([H])[H] 
|(-1.3456,3.2776,3.3811;-0.4709,2.8328,2.9142;0.16,3.492,1.8556;-0.2212,4.4455,1.4981;1.2882,2.9176,1.2714;1.7739,1.6928,1.7416;1.1509,1.0324,2.7989;1.5461,0.0928,3.1767;0.0224,1.6119,3.385;-0.4693,1.1139,4.2161;2.9868,1.2266,0.9232;4.0547,0.7556,1.8226;4.8959,0.3813,2.5316;2.5712,0.0095,-0.0022;2.1364,-0.7113,0.7043;1.4697,0.3526,-1.0177;1.1193,-0.5696,-1.4949;1.8344,1.0098,-1.8142;0.6067,0.8303,-0.545;3.7452,-0.7078,-0.6912;3.3806,-1.6318,-1.1547;4.5309,-0.981,0.0199;4.1973,-0.1037,-1.4826;3.438,2.5972,0.2582;4.0317,3.0892,1.0393;4.3007,2.5589,-1.002;4.5927,3.5808,-1.2722;3.7728,2.1327,-1.8606;5.2205,1.9864,-0.8447;2.1364,3.4273,0.1314;1.6451,3.2565,-0.8363;2.342,4.5025,0.1953)|\",6.34025271665\r\n\"[H]C([H])=C([H])C([H])([H])[C@@]([H])(C([H])=C([H])[H])C1=C([H])C([H])=C(Cl)C([H])=C1[H] |(0.9185,-1.2194,-0.48;1.347,-0.2932,-0.8533;0.9128,0.108,-1.7671;2.3485,0.3214,-0.224;2.7525,-0.1228,0.6877;2.9858,1.6126,-0.6586;2.789,2.3924,0.0915;2.5255,1.9564,-1.5936;4.5298,1.513,-0.8364;4.9463,1.1459,0.1112;4.907,0.524,-1.9176;4.5156,0.7374,-2.9134;5.6686,-0.554,-1.735;5.9104,-1.2283,-2.552;6.0802,-0.8063,-0.7597;5.1415,2.8852,-1.1045;6.0703,3.4359,-0.2137;6.3582,2.8766,0.6733;6.6425,4.689,-0.4365;7.3607,5.1031,0.2634;6.2795,5.4056,-1.574;6.9921,6.9879,-1.8701;5.3582,4.8861,-2.4823;5.0849,5.4544,-3.3652;4.7981,3.6327,-2.2406;4.0823,3.2376,-2.9569)|\",6.08446569718\r\n\"[H]O[C@]1([H])C([H])([H])/C(=C(/[H])C2=C([H])C([H])=C([H])C([H])=C2[H])C1(C([H])([H])C([H])([H])[H])C([H])([H])C([H])([H])[H] 
|(1.6261,-2.7816,2.3279;1.0449,-2.3672,1.6706;1.8542,-1.8133,0.6531;1.1457,-1.4172,-0.0796;2.943,-2.7386,0.0355;3.0193,-3.7168,0.5312;2.837,-2.9258,-1.041;4.0034,-1.73,0.4418;5.3444,-1.6895,0.4054;5.8224,-0.8206,0.8585;6.2904,-2.6708,-0.1433;5.9062,-3.8399,-0.8299;4.8559,-4.0545,-0.9887;6.8602,-4.7285,-1.3206;6.5367,-5.6231,-1.8468;8.223,-4.4777,-1.1437;8.9634,-5.1733,-1.5293;8.6232,-3.3224,-0.47;9.6799,-3.1121,-0.3265;7.6692,-2.4342,0.0204;7.9903,-1.5364,0.5441;2.9744,-0.7587,1.0402;2.7898,0.5518,0.2378;2.6502,0.2869,-0.8191;1.8472,1.0163,0.5577;3.9274,1.5729,0.3397;3.7106,2.449,-0.2821;4.8748,1.1482,-0.0089;4.0731,1.9277,1.3661;3.161,-0.5239,2.5541;3.2508,-1.499,3.0539;4.1327,-0.0386,2.712;2.0554,0.2923,3.2371;2.2334,0.3476,4.3174;1.0768,-0.1698,3.0767;2.0142,1.3199,2.8599)|\",4.99601029518\r\n\"[H]O[C@]1([H])C([H])([H])/C(=C(/[H])C2=C([H])C([H])=C([H])C([H])=C2[H])[C@@]1([H])C([H])([H])C([H])([H])[H] |(4.6051,-0.2366,2.4437;4.6491,-0.5927,1.5419;4.2739,0.4357,0.6559;4.3018,-0.0116,-0.3428;5.0315,1.7975,0.6678;5.4577,2.0067,1.6608;5.8154,1.9545,-0.0811;3.6831,2.4635,0.4489;3.2539,3.6088,-0.0975;2.1768,3.7183,-0.2315;4.0501,4.7459,-0.5795;3.4378,5.6939,-1.4212;2.3917,5.5626,-1.6889;4.1455,6.7854,-1.9189;3.6475,7.4997,-2.5696;5.4885,6.9627,-1.5807;6.0435,7.814,-1.9655;6.1091,6.041,-0.7343;7.1502,6.1773,-0.4527;5.4009,4.9492,-0.2366;5.8935,4.2617,0.4425;2.933,1.2062,0.8728;2.7311,1.2439,1.9587;1.6724,0.7552,0.1381;0.8969,1.5277,0.2354;1.8978,0.6817,-0.9345;1.1344,-0.5847,0.655;0.2595,-0.9091,0.0806;0.8333,-0.5091,1.7072;1.8963,-1.3693,0.5881)|\",5.069481034815\r\n\"[H]O[C@]1([H])C([H])([H])/C(=C(/[H])C2=C([H])C([H])=C([H])C([H])=C2[H])[C@@]1([H])C([H])([H])[H] 
|(3.6979,-2.6667,-0.4801;2.7872,-2.5456,-0.7932;2.6094,-1.1779,-1.0747;1.5736,-1.0788,-1.4141;3.5817,-0.4449,-2.0476;4.5723,-0.9246,-2.058;3.2587,-0.3046,-3.0849;3.5073,0.7456,-1.1058;3.6593,2.0718,-1.218;3.3828,2.6745,-0.3517;4.1267,2.8439,-2.3775;4.8015,2.2697,-3.4724;5.0268,1.2087,-3.4698;5.2118,3.0515,-4.5502;5.7331,2.5849,-5.3823;4.9676,4.4267,-4.5623;5.291,5.0336,-5.4037;4.3133,5.0155,-3.4785;4.1241,6.0858,-3.471;3.9028,4.2335,-2.4014;3.3933,4.6993,-1.5607;2.9351,-0.1108,0.0176;3.7597,-0.5031,0.6385;1.8258,0.4204,0.9163;2.1864,1.2261,1.567;1.4338,-0.3769,1.5584;0.9972,0.818,0.3191)|\",5.069481034815\r\n\"[H]O[C@]1([H])C([H])([H])/C(=C(\\[H])C2=C([H])C([H])=C([H])C([H])=C2[H])[C@@]1([H])C1=C([H])C([H])=C([H])C([H])=C1[H] |(6.5597,-2.884,1.0458;6.7813,-2.2671,0.3299;5.626,-1.529,0.0253;5.924,-0.8129,-0.7451;4.8211,-0.843,1.1621;4.9793,-1.3518,2.1256;4.9729,0.2309,1.3194;3.5489,-1.303,0.4699;2.2867,-0.8738,0.6158;2.1444,-0.0596,1.3289;1.0459,-1.3236,-0.0314;0.913,-2.559,-0.6927;1.7585,-3.2347,-0.7494;-0.2978,-2.9324,-1.271;-0.3754,-3.8913,-1.777;-1.4085,-2.0882,-1.1998;-2.3515,-2.3844,-1.6515;-1.2988,-0.8652,-0.5357;-2.157,-0.2017,-0.4664;-0.0882,-0.493,0.0442;-0.0103,0.4596,0.5638;4.3175,-2.2878,-0.4192;4.3275,-3.287,0.0458;4.064,-2.4294,-1.8996;3.6479,-1.3389,-2.6769;3.4313,-0.3909,-2.1921;3.4924,-1.4648,-4.0568;3.1635,-0.6093,-4.6411;3.7509,-2.6842,-4.6864;3.6277,-2.782,-5.7617;4.1642,-3.7773,-3.9239;4.3665,-4.7321,-4.4025;4.3174,-3.6486,-2.5428;4.6439,-4.5041,-1.9551)|\",5.023221680230001\r\n\"[H]O[C@]1([H])C([H])([H])/C(=C(\\[H])C2=C([H])C(C([H])([H])[H])=C([H])C([H])=C2[H])C1(C([H])([H])[H])C([H])([H])[H] 
|(2.9859,-6.8875,0.8598;3.5237,-6.9679,0.0566;4.0925,-5.7072,-0.2111;4.7244,-5.86,-1.0916;4.8583,-4.9473,0.9026;4.5352,-5.2583,1.9072;5.9531,-4.9841,0.8763;4.1728,-3.6683,0.4363;4.4161,-2.4021,0.8;5.1995,-2.2671,1.5495;3.7364,-1.1498,0.4064;3.4716,-0.1902,1.4002;3.7654,-0.4053,2.4262;2.8347,1.021,1.1166;2.526,2.0098,2.2178;2.5801,3.0424,1.856;3.2237,1.9076,3.0556;1.5138,1.8593,2.6167;2.4728,1.2874,-0.2111;1.9875,2.2292,-0.4577;2.7501,0.363,-1.2173;2.4897,0.5915,-2.2478;3.3799,-0.8445,-0.9165;3.6323,-1.5352,-1.712;3.1612,-4.455,-0.4234;1.8146,-4.597,0.3089;1.2653,-3.6504,0.2764;1.9411,-4.858,1.3672;1.1982,-5.3703,-0.1653;2.9292,-4.1049,-1.8958;2.2712,-3.2387,-2.0165;2.4561,-4.9578,-2.3994;3.8732,-3.8942,-2.4114)|\",5.333431469800001\r\n\"[H]O[C@]1([H])C([H])([H])/C(=C(\\[H])C2=C([H])C([H])=C(C([H])([H])[H])C([H])=C2[H])C1(C([H])([H])[H])C([H])([H])[H] |(9.5115,-4.0638,0.0325;9.7282,-3.7023,-0.8412;8.9137,-2.5731,-1.0557;9.2206,-2.1799,-2.0299;8.8493,-1.4461,0.0068;9.1162,-1.8096,1.0103;9.4355,-0.5408,-0.1872;7.3425,-1.3849,-0.2162;6.4648,-0.4853,0.25;6.8895,0.2956,0.8853;4.9977,-0.4048,0.1118;4.2435,0.0558,1.207;4.7577,0.3311,2.1252;2.8572,0.1577,1.1418;2.306,0.5111,2.0109;2.1597,-0.1809,-0.026;0.653,-0.0933,-0.089;0.3023,0.0538,-1.116;0.2727,0.7343,0.5199;0.1831,-1.0128,0.2862;2.9101,-0.6078,-1.1283;2.4008,-0.8476,-2.0594;4.2986,-0.7153,-1.0656;4.8489,-1.0042,-1.9527;7.3468,-2.7151,-0.9998;6.8961,-3.8802,-0.1001;5.8131,-3.8403,0.056;7.3649,-3.8453,0.8915;7.1429,-4.8431,-0.5637;6.6965,-2.8303,-2.3815;5.6143,-2.9829,-2.3199;7.1238,-3.6932,-2.9085;6.8856,-1.939,-2.9906)|\",5.178326575015001\r\n\"[H]O[C@]1([H])C([H])([H])/C(=C(/[H])C2=C([H])C([H])=C([H])C([H])=C2[H])C1(C([H])([H])[H])C([H])([H])[H] 
|(3.7209,-2.6049,-0.1953;2.8299,-2.5604,-0.576;2.5842,-1.226,-0.9518;1.5826,-1.2299,-1.392;3.5915,-0.4775,-1.8722;4.6068,-0.8934,-1.788;3.3452,-0.4058,-2.9374;3.3704,0.7508,-1.0059;3.5133,2.0742,-1.1587;3.1355,2.7063,-0.3534;4.0907,2.8116,-2.2909;4.8746,2.209,-3.2941;5.1032,1.1504,-3.236;5.3861,2.9599,-4.3502;5.9906,2.4721,-5.1108;5.1363,4.3319,-4.4316;5.5382,4.9144,-5.2561;4.3732,4.9488,-3.4384;4.1773,6.017,-3.485;3.8619,4.1977,-2.3827;3.2686,4.6856,-1.6125;2.7099,-0.0667,0.1081;1.3981,0.4628,0.689;1.5621,1.3685,1.2864;0.9346,-0.2867,1.3425;0.6856,0.7111,-0.1053;3.6995,-0.4201,1.2332;3.8895,0.4562,1.8624;4.6713,-0.7516,0.846;3.2931,-1.216,1.869)|\",5.058596480795\r\n\"[H]O[C@]1([H])C([H])([H])/C(=C(/[H])C2=C([H])C([H])=C(Cl)C([H])=C2[H])C1(C([H])([H])[H])C([H])([H])[H] |(3.3339,-2.5478,-0.1795;2.4459,-2.4478,-0.5568;2.2742,-1.098,-0.9158;1.2769,-1.0422,-1.3622;3.3252,-0.3923,-1.8216;4.3178,-0.8593,-1.7353;3.0901,-0.2954,-2.8873;3.1622,0.835,-0.9412;3.3719,2.1517,-1.0744;3.0226,2.7907,-0.2622;3.9929,2.8768,-2.1899;4.728,2.2547,-3.2177;4.8828,1.1818,-3.2;5.2888,2.9929,-4.2565;5.8551,2.5006,-5.0402;5.1275,4.3781,-4.2791;5.8378,5.3136,-5.5892;4.4184,5.0302,-3.2717;4.3044,6.1088,-3.2951;3.8636,4.2775,-2.2408;3.3098,4.786,-1.4552;2.4532,0.04,0.159;1.1645,0.6275,0.7369;1.3693,1.5117,1.353;0.6569,-0.1099,1.3708;0.4755,0.9254,-0.0608;3.4172,-0.3724,1.2853;3.6537,0.4909,1.9166;4.3712,-0.7564,0.9017;2.9671,-1.1464,1.9188)|\",4.90349158601\r\n\"[H]OC(C([H])([H])[H])(C([H])([H])[H])[C@@]1([H])/C(=C(\\[H])C2=C([H])C([H])=C(Cl)C([H])=C2[H])C1([H])[H] 
|(2.3881,-1.8354,0.7602;2.8801,-1.0086,0.8898;2.394,-0.0659,-0.0815;0.896,0.1808,0.1417;0.5159,0.9827,-0.4998;0.7015,0.4406,1.1866;0.3284,-0.7296,-0.0878;2.6496,-0.6069,-1.4982;2.3131,0.1027,-2.2622;2.1068,-1.5485,-1.6543;3.7171,-0.8004,-1.6461;3.2579,1.1785,0.1393;4.3201,0.9307,0.0938;2.8736,2.5173,-0.3224;2.6409,3.3232,-1.3578;2.7798,2.9182,-2.3609;2.2094,4.7265,-1.3032;1.9654,5.4061,-0.0948;2.1002,4.8856,0.8475;1.5517,6.7337,-0.084;1.3663,7.2471,0.8538;1.3744,7.4067,-1.2944;0.8505,9.0859,-1.284;1.608,6.7653,-2.5089;1.4683,7.2988,-3.4432;2.0219,5.4352,-2.5025;2.2032,4.9337,-3.4502;2.9023,2.2975,1.1313;1.9594,2.2384,1.6734;3.7159,2.7213,1.7199)|\",4.889885893484999\r\n\"[H]C1=C([H])C([H])=C([C@@]([H])(N([H])[H])[C@@]([H])(C(=O)C([H])([H])[H])C([H])([H])Cl)C([H])=C1[H] |(0.2983,6.5703,-4.1826;0.6672,5.6099,-3.833;1.7617,5.0082,-4.4592;2.2478,5.4998,-5.2979;2.2316,3.7735,-4.0125;3.0808,3.3174,-4.5184;1.6205,3.1164,-2.9337;2.1262,1.7587,-2.4434;1.4085,1.3812,-1.7067;2.2421,0.7285,-3.4884;2.8594,1.0547,-4.232;1.3307,0.594,-3.9246;3.4883,1.8926,-1.6989;4.2408,2.2741,-2.4005;3.3239,2.8708,-0.5171;2.5359,2.606,0.3717;4.1622,4.1265,-0.5202;5.2257,3.8564,-0.5247;3.9359,4.7333,0.3588;3.9699,4.7034,-1.4334;3.9388,0.5269,-1.1782;3.9447,-0.2142,-1.9739;3.2945,0.1991,-0.361;5.6339,0.5972,-0.5037;0.5269,3.7329,-2.3129;0.0487,3.2428,-1.4685;0.0517,4.9685,-2.7581;-0.7996,5.4279,-2.2629)|\",5.981062433989999\r\n\"[H]O[C@@]([H])(C([H])=C([H])[H])C([H])([H])C([H])([H])[C@@]1([H])OC(C([H])([H])[H])(C([H])([H])[H])OC1([H])[H] 
|(-1.7561,-2.0381,-2.5557;-0.7931,-2.0853,-2.6634;-0.2532,-0.844,-2.1825;-0.5888,-0.0132,-2.8231;-0.7139,-0.5857,-0.772;-0.4627,-1.3678,-0.054;-1.3973,0.4912,-0.3829;-1.7108,0.6326,0.6481;-1.6621,1.2821,-1.0826;1.2727,-0.9576,-2.2772;1.6996,0.0153,-2.011;1.6186,-1.6826,-1.5289;1.794,-1.3926,-3.657;1.4593,-2.4156,-3.8639;2.8914,-1.4018,-3.6275;1.3455,-0.5094,-4.8137;0.2537,-0.5775,-4.936;1.7234,0.8524,-4.5707;1.9157,1.5041,-5.8326;0.7517,2.4446,-6.1356;0.8644,2.8816,-7.1331;0.7141,3.2529,-5.3985;-0.1917,1.8918,-6.1044;3.2662,2.216,-5.7971;3.4524,2.7274,-6.7468;4.0659,1.4899,-5.6242;3.2869,2.9539,-4.9884;1.9178,0.4702,-6.8275;2.0228,-0.7793,-6.1569;1.52,-1.5384,-6.7631;3.0756,-1.0716,-6.0158)|\",6.87359586363\r\n\"[H]C#C/C([H])=C(\\[H])C([H])=C(C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] |(0.2336,0.0712,0.9532;1.277,-0.1178,1.0616;2.4622,-0.3348,1.1954;3.8536,-0.5774,1.3184;4.4615,-0.3836,0.4338;4.4507,-1.0272,2.4506;3.8125,-1.1982,3.3082;5.8867,-1.2357,2.5284;6.3692,-0.9186,1.6112;6.6885,-1.72,3.5166;6.0634,-2.2258,4.8606;5.4158,-1.0229,5.6004;4.8953,-1.367,6.5031;4.6974,-0.4759,4.9883;6.1855,-0.3082,5.9141;5.013,-3.3354,4.5713;4.5684,-3.6771,5.5142;5.4996,-4.1983,4.102;4.1999,-3.0271,3.915;7.0123,-2.8845,5.8958;6.4301,-3.0782,6.8043;7.8573,-2.2579,6.1862;7.3924,-3.8499,5.5544;8.2277,-1.6672,3.2712;8.871,-0.5993,4.1966;9.9585,-0.5847,4.0511;8.6839,-0.7757,5.2576;8.4866,0.3978,3.9529;8.9211,-3.0416,3.4814;9.9225,-3.0175,3.0364;8.3619,-3.8457,2.9889;9.0479,-3.3059,4.5301;8.5949,-1.2424,1.8273;9.6857,-1.2399,1.7277;8.2504,-0.2327,1.5814;8.1976,-1.9377,1.0794)|\",4.345658192485001\r\n\"[H]C(=O)/C([H])=C([H])/C([H])=C(\\[H])C([H])=C(C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] 
|(-2.9933,0.2879,-0.8565;-2.4842,1.2284,-1.172;-3.0953,2.0972,-1.7726;-1.0724,1.3056,-0.8043;-0.5458,2.2101,-1.0996;-0.4738,0.2892,-0.135;-1.1002,-0.5738,0.0974;0.8997,0.1893,0.3143;1.1735,-0.7507,0.7925;1.8539,1.1518,0.2;1.5729,2.0841,-0.2739;3.2147,0.9445,0.6518;3.3541,-0.0707,1.0037;4.305,1.7651,0.6724;4.1774,3.2605,0.2267;3.0549,3.9655,1.0407;2.984,5.0165,0.7351;3.2955,3.946,2.1098;2.0671,3.5223,0.9232;3.8756,3.3087,-1.2972;3.7086,4.3451,-1.6158;2.9996,2.7239,-1.584;4.7266,2.9201,-1.868;5.4083,4.1807,0.4368;5.179,5.1502,-0.0205;6.324,3.819,-0.0339;5.6088,4.3702,1.4939;5.6659,1.1187,1.0745;6.6007,1.0673,-0.1643;7.5685,0.6328,0.116;6.7977,2.0482,-0.6002;6.1646,0.4364,-0.9472;6.3666,1.8584,2.247;7.182,1.2356,2.6333;5.6684,2.0377,3.0727;6.8078,2.8112,1.9589;5.5241,-0.3497,1.5476;6.5196,-0.7355,1.7921;5.1092,-1.0048,0.775;4.9076,-0.4391,2.4494)|\",3.5592491645400006\r\n\"[H]C(=O)/C([H])=C(\\[H])C([H])=C(C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] |(-1.4976,-0.5175,0.1839;-1.2109,-1.5913,0.0932;-2.0641,-2.4602,0.0265;0.231,-1.8342,0.0644;0.5535,-2.87,-0.0236;1.1164,-0.8119,0.1435;0.6962,0.1856,0.2219;2.5533,-1.0048,0.0889;2.7934,-2.0471,-0.0795;3.5976,-0.1332,0.1774;3.3415,1.387,0.4551;4.577,2.2948,0.6886;4.2236,3.3309,0.7409;5.3199,2.2532,-0.1091;5.0705,2.081,1.6394;2.5102,1.5562,1.7585;2.3507,2.6228,1.9569;3.0548,1.1364,2.6118;1.533,1.0758,1.7308;2.5944,1.9999,-0.7628;2.3273,3.0429,-0.5531;1.6802,1.4633,-1.0208;3.237,1.9935,-1.6506;5.0214,-0.7143,-0.0846;6.0164,-0.4217,1.0711;6.9166,-1.0327,0.9383;5.5795,-0.6864,2.041;6.3427,0.6158,1.1144;5.0268,-2.2559,-0.2451;6.056,-2.5842,-0.4242;4.4305,-2.5932,-1.0986;4.6707,-2.7688,0.6553;5.5696,-0.1562,-1.4258;6.573,-0.5582,-1.6125;5.6458,0.9323,-1.4473;4.9257,-0.4611,-2.2584)|\",4.348379330990001\r\n\"[H]C([H])=C([H])/C(C1=C([H])SC([H])=C1[H])=C(/[H])C1=C([H])C([H])=C([H])C([H])=C1[H] 
|(4.2315,2.5647,0.8463;4.2203,1.6405,0.2758;5.0607,1.4666,-0.3899;3.2032,0.7728,0.3896;2.3817,1.0218,1.0578;3.0855,-0.5186,-0.3037;4.3168,-1.1614,-0.8253;4.3939,-1.7742,-2.0545;3.6245,-1.8199,-2.813;5.9506,-2.4759,-2.3474;6.5248,-1.9457,-0.7957;7.5326,-2.1805,-0.4806;5.5628,-1.2633,-0.1105;5.7114,-0.8547,0.8822;1.8995,-1.1744,-0.4437;1.9525,-2.1921,-0.8271;0.5421,-0.7196,-0.1195;0.1208,0.6207,-0.2271;0.8149,1.3721,-0.5888;-1.1884,0.9867,0.0823;-1.4908,2.0262,-0.0144;-2.1124,0.0257,0.4965;-3.1328,0.3135,0.7348;-1.7196,-1.3125,0.5843;-2.4338,-2.0713,0.8933;-0.414,-1.6791,0.2714;-0.1152,-2.7227,0.3389)|\",4.2096012672350005\r\n\"[H]OC1=N[C@@]([H])(C([H])([H])C([H])([H])[H])C2=C(C([H])=C(OC([H])([H])[H])C(OC([H])([H])[H])=C2[H])C1([H])[H] |(2.3026,5.4018,-1.3176;1.7977,4.8427,-0.7074;2.4402,3.6428,-0.5658;1.9369,2.8118,0.2427;2.5303,1.4928,0.4136;2.5911,1.3296,1.4999;1.5343,0.4253,-0.1163;1.9129,-0.5669,0.162;0.5893,0.57,0.4206;1.2844,0.4791,-1.6247;0.5716,-0.2967,-1.9261;2.2094,0.3199,-2.1912;0.864,1.4469,-1.9181;3.9325,1.3169,-0.1533;4.4747,2.2499,-1.0349;5.7692,2.0537,-1.5364;6.2159,2.7706,-2.2208;6.5314,0.9539,-1.1693;7.8179,0.8505,-1.6401;8.0741,-0.2458,-2.5206;9.1143,-0.1402,-2.8375;7.4207,-0.1994,-3.4025;7.9405,-1.2071,-2.0154;5.9938,0.0149,-0.2565;6.8138,-1.0213,0.0918;6.3342,-1.9642,1.0373;7.1428,-2.6838,1.1776;5.4415,-2.4877,0.6699;6.1024,-1.4869,1.9982;4.7035,0.2078,0.2332;4.2822,-0.4982,0.9415;3.6694,3.4638,-1.4329;3.3419,3.3962,-2.4828;4.2928,4.369,-1.3722)|\",5.6817371984400005\r\n\"[H]OC1=N[C@@]([H])(C([H])([H])[H])C2=C([H])C(OC([H])([H])[H])=C(OC([H])([H])[H])C([H])=C2C1([H])[H] 
|(4.9103,2.0813,-3.2084;4.3869,2.3947,-2.4551;3.7893,1.3223,-1.8511;3.1052,1.5516,-0.8132;2.3814,0.4655,-0.1633;2.5201,0.6093,0.918;0.8761,0.6483,-0.4513;0.2757,-0.0597,0.1304;0.6662,0.4822,-1.514;0.5731,1.6679,-0.1948;2.8519,-0.9392,-0.511;2.5187,-2.0156,0.3271;1.9466,-1.8163,1.2275;2.9252,-3.3155,0.031;2.6655,-4.411,0.8052;1.9591,-4.2182,2.0208;1.883,-5.2049,2.4809;0.9509,-3.8204,1.8441;2.4998,-3.5426,2.6961;3.6819,-3.5499,-1.1419;4.1632,-4.7976,-1.4568;3.185,-5.7748,-1.8177;3.7437,-6.6769,-2.0774;2.6054,-5.4445,-2.6907;2.5051,-5.9909,-0.9881;4.0191,-2.4785,-1.9571;4.6216,-2.6864,-2.8376;3.6157,-1.1714,-1.6528;4.0076,-0.0114,-2.5371;3.4328,-0.0135,-3.4773;5.0636,-0.1016,-2.8334)|\",5.69806402947\r\n\"[H]OC([H])([H])[C@]([H])(/C([H])=C(\\[H])C([H])([H])[H])[Si](C1=C([H])C([H])=C([H])C([H])=C1[H])(C([H])([H])[H])C([H])([H])[H] |(4.8212,2.4127,1.0855;5.7515,2.155,0.9968;5.7938,0.9309,0.2576;5.3341,1.0695,-0.734;6.8573,0.7281,0.1061;5.1319,-0.2551,0.9873;5.3876,-1.1614,0.4162;3.6326,-0.1283,1.0286;3.2402,0.7998,1.4551;2.7462,-1.0438,0.6192;3.1175,-1.9805,0.1982;1.2526,-0.8977,0.69;0.7955,-0.976,-0.3058;0.963,0.0676,1.12;0.8012,-1.6891,1.304;5.8404,-0.5517,2.7688;7.7352,-0.4423,2.7406;8.3874,0.796,2.5721;7.7989,1.6998,2.4369;9.7797,0.8881,2.5611;10.256,1.8566,2.4293;10.5595,-0.2595,2.7187;11.6444,-0.1891,2.7107;9.938,-1.4973,2.8859;10.5376,-2.396,3.0102;8.5439,-1.5833,2.8967;8.0823,-2.5583,3.0342;5.1449,0.6961,4.0129;5.4987,0.4619,5.0237;4.0489,0.6702,4.0301;5.4593,1.7152,3.7681;5.3061,-2.2955,3.2864;5.665,-2.5431,4.2923;5.6754,-3.0662,2.5994;4.2123,-2.362,3.2962)|\",5.95929332595\r\n\"[H]C1=C([H])C([H])=C(C([H])([H])C([H])([H])/C(=C(\\[H])C(=O)SC([H])([H])C([H])([H])[H])C([H])([H])[H])C([H])=C1[H] 
|(3.1189,-6.3671,-1.6836;3.5504,-5.4104,-1.9652;4.8773,-5.3366,-2.3937;5.4831,-6.2374,-2.4497;5.4269,-4.1077,-2.759;6.4598,-4.0607,-3.0993;4.6668,-2.931,-2.7012;5.2835,-1.5996,-3.0738;5.9481,-1.7307,-3.9367;4.4999,-0.9003,-3.3894;6.1228,-0.9541,-1.9347;6.6346,-0.0754,-2.3537;6.8985,-1.6615,-1.6211;5.3076,-0.5101,-0.7427;5.4542,-1.1436,0.4417;6.1608,-1.9667,0.5182;4.7215,-0.8077,1.6722;3.8583,0.0459,1.7886;5.2501,-1.8247,3.0812;4.1461,-1.1327,4.3773;4.1944,-0.044,4.2978;3.12,-1.4332,4.1469;4.5732,-1.6235,5.759;3.903,-1.209,6.5207;5.5943,-1.3083,5.9976;4.5288,-2.7155,5.8322;4.3572,0.6416,-0.9671;4.5246,1.1052,-1.9447;4.4607,1.3996,-0.1866;3.314,0.3079,-0.9174;3.3363,-3.0208,-2.273;2.7269,-2.1208,-2.2298;2.7807,-4.2479,-1.9069;1.7456,-4.2947,-1.5788)|\",5.12662494342\r\n\"[H]C([H])=C([H])[C@@]1([H])/C(=C(/[H])C([H])([H])[H])C2=C([H])C(Cl)=C([H])C([H])=C2OC1([H])[H] |(0.973,-0.6842,1.6464;1.1668,-0.0631,0.7763;0.3382,0.5533,0.4383;2.3561,-0.0799,0.1708;3.1363,-0.7246,0.5746;2.7905,0.7723,-0.9962;3.7071,1.2896,-0.6646;3.1504,0.0857,-2.3171;3.0537,-1.2132,-2.6585;3.3119,-1.4462,-3.6915;2.6469,-2.4269,-1.8741;1.8812,-2.9864,-2.4277;2.25,-2.2035,-0.8857;3.5006,-3.1105,-1.7579;3.6014,1.0927,-3.3191;4.4777,0.7807,-4.3735;4.8731,-0.2247,-4.4631;4.8735,1.7501,-5.2854;5.9739,1.3133,-6.5913;4.431,3.0693,-5.1721;4.751,3.8196,-5.8868;3.5833,3.4027,-4.1236;3.2201,4.4178,-3.999;3.1741,2.4317,-3.2027;2.3562,2.8701,-2.2019;1.7883,1.887,-1.3347;1.4928,2.4351,-0.437;0.8892,1.4656,-1.8051)|\",4.91437614003\r\n\"[H]C([H])=C([H])[C@@]1([H])/C(=C(/[H])C([H])([H])[H])C([H])([H])C(C(=O)OC([H])([H])[H])(C(=O)OC([H])([H])[H])C1([H])[H] 
|(0.6743,-1.0861,-0.7981;1.1902,-0.6152,0.0344;0.6915,-0.6539,1.0011;2.374,-0.0194,-0.1184;2.8414,-0.001,-1.1046;3.1392,0.6835,0.9747;2.6458,0.4881,1.9341;4.6213,0.2873,1.0788;5.1175,-0.9167,1.3842;6.2028,-1.0078,1.4596;4.3477,-2.1765,1.6625;4.5574,-2.5419,2.6774;4.6465,-2.9794,0.9746;3.2692,-2.0382,1.5628;5.4762,1.5202,0.8561;5.722,1.6598,-0.2048;6.4202,1.5051,1.4073;4.5549,2.6938,1.2668;4.9664,4.0114,0.612;5.7145,4.1226,-0.3334;4.3285,5.0528,1.1881;4.6073,6.3435,0.6186;4.0216,7.0527,1.2037;4.3076,6.3703,-0.4323;5.674,6.5702,0.6913;4.5256,2.8489,2.7909;3.5785,2.6152,3.5083;5.7278,3.2576,3.2509;5.8188,3.4298,4.6751;6.8398,3.762,4.8644;5.6194,2.4854,5.1881;5.0997,4.1794,5.0156;3.1723,2.2238,0.7455;2.3464,2.7344,1.2434;3.114,2.443,-0.3277)|\",6.14433074429\r\n\"[H]C1=C([H])C([H])=C(N2O[C@]([H])(C(=O)C([H])([H])[H])C([H])([H])[C@@]2([H])N(C([H])([H])[H])C([H])([H])[H])C([H])=C1[H] |(-1.7106,1.8214,-6.466;-0.808,1.6741,-5.8803;0.3068,2.4923,-6.079;0.2793,3.2805,-6.827;1.469,2.3017,-5.3369;2.3397,2.9213,-5.5287;1.5279,1.2946,-4.3566;2.7532,1.1027,-3.653;2.6887,0.0067,-2.7298;2.6641,0.5351,-1.3685;3.5928,0.1963,-0.8904;1.4609,-0.0746,-0.6328;0.6146,0.6267,-0.1131;1.4144,-1.5876,-0.5822;1.5475,-2.0117,-1.5832;0.4682,-1.9173,-0.1486;2.2439,-1.9605,0.0334;2.6357,2.0491,-1.5125;3.1535,2.5514,-0.695;1.6021,2.4057,-1.5286;3.2857,2.2445,-2.8939;2.9319,3.1663,-3.3844;4.7491,2.1835,-2.8346;5.3607,1.9987,-4.1504;6.442,1.8842,-4.0224;5.1883,2.8566,-4.8303;4.9623,1.0993,-4.6198;5.3091,3.3661,-2.187;6.3971,3.2608,-2.1321;4.9362,3.4741,-1.1655;5.0865,4.3045,-2.7334;0.4179,0.4603,-4.1722;0.4663,-0.3401,-3.4461;-0.7378,0.6572,-4.9293;-1.5911,0.0038,-4.7667)|\",4.79464604581\r\n\"[H]O/C(=N/[C@@]1([H])C([H])([H])C([H])([H])C([H])([H])[C@@]1([H])/N=C(/O[H])C([H])([H])[H])OC([H])([H])C([H])=C([H])[H] 
|(-0.196,0.6966,3.0806;0.5216,1.0555,2.5293;1.6295,0.3812,2.936;1.564,-0.472,3.8682;2.7512,-1.2135,4.2635;3.6742,-0.6673,4.0283;2.6501,-1.5798,5.7543;3.6382,-1.894,6.1144;2.3353,-0.7251,6.3606;1.6512,-2.7661,5.8061;1.9338,-3.4968,6.5719;0.649,-2.4091,6.0609;1.6551,-3.3806,4.378;0.7023,-3.1952,3.8742;1.8273,-4.4626,4.3698;2.7651,-2.6261,3.6069;3.7379,-3.0709,3.873;2.5356,-2.5965,2.183;3.278,-3.193,1.3437;2.9612,-3.0974,0.0193;2.1631,-2.5366,-0.0053;4.5098,-4.0446,1.5144;4.8025,-4.1517,2.5592;5.3405,-3.6031,0.9536;4.3256,-5.0387,1.0927;2.7413,0.7387,2.2651;2.6629,1.7876,1.272;1.9885,1.4866,0.4642;2.2505,2.6892,1.7415;4.0511,2.0302,0.7658;4.7993,2.2501,1.5262;4.3826,2.0189,-0.5244;5.393,2.2391,-0.8571;3.6543,1.794,-1.3008)|\",6.160657575319999\r\n\"[H]O/C(=N/[C@@]1([H])C([H])([H])C([H])([H])C([H])([H])[C@@]1([H])N([H])[H])OC([H])([H])C([H])=C([H])[H] |(6.6227,-4.6193,-2.8754;6.6074,-3.708,-2.5337;5.3235,-3.5079,-2.1385;4.4492,-4.4169,-2.2531;3.0773,-4.164,-1.8132;3.0688,-3.5555,-0.8982;2.2196,-3.5132,-2.925;1.4123,-2.9335,-2.4618;2.809,-2.8125,-3.5238;1.6319,-4.6902,-3.7639;0.5405,-4.6074,-3.821;1.9995,-4.6779,-4.795;2.05,-5.9838,-3.028;2.9608,-6.3993,-3.4759;1.2873,-6.768,-3.0496;2.3791,-5.5363,-1.5934;1.4382,-5.3443,-1.0579;3.1137,-6.5527,-0.8451;3.2274,-6.2449,0.1211;4.0577,-6.5872,-1.2333;5.1074,-2.2764,-1.6397;6.2204,-1.357,-1.5072;7.0038,-1.8381,-0.9094;6.6349,-1.1325,-2.4949;5.7046,-0.1235,-0.8337;5.2016,-0.2815,0.1192;5.8637,1.1063,-1.3206;5.5115,1.9829,-0.7845;6.358,1.285,-2.2732)|\",5.79058273864\r\n\"[H]C([H])=C([H])C([H])([H])[C@]1(C(=O)OC([H])([H])C([H])([H])[H])C([H])([H])[C@@]1([H])C1=C([H])C([H])=C([H])C([H])=C1[H] 
|(0.8042,1.0385,-1.4687;1.4665,0.19,-1.6185;1.4158,-0.2986,-2.5901;2.2906,-0.2357,-0.6607;2.3055,0.2788,0.2999;3.2353,-1.4024,-0.7952;3.1017,-1.8631,-1.7816;2.9918,-2.1664,-0.0462;4.7069,-1.0092,-0.6389;5.1537,-0.5143,0.7029;6.1231,0.1924,0.9007;4.3474,-0.9647,1.6945;4.7125,-0.5863,3.0442;5.0749,0.4445,3.0341;3.7732,-0.6368,3.6005;5.7551,-1.5324,3.6204;5.955,-1.2715,4.6663;6.6917,-1.4595,3.0612;5.402,-2.5682,3.5857;5.4437,-0.4294,-1.8213;4.8672,-0.2539,-2.7258;6.1816,0.3336,-1.5965;5.7467,-1.8478,-1.4425;5.2393,-2.5964,-2.0491;7.045,-2.3548,-0.9023;7.1834,-3.7392,-0.708;6.3475,-4.3945,-0.944;8.3706,-4.2858,-0.2243;8.4526,-5.3608,-0.0856;9.4509,-3.4536,0.0755;10.3798,-3.8748,0.4505;9.3286,-2.077,-0.1164;10.1642,-1.4195,0.1093;8.1397,-1.5312,-0.6025;8.0634,-0.4591,-0.7391)|\",6.043648619605\r\n\"[H]OC(=O)[C@@]([H])(N([H])[H])C([H])([H])C([H])([H])SC([H])([H])C([H])([H])C([H])([H])F |(3.462,3.0548,1.0475;3.7439,3.1773,1.9794;5.021,3.6191,2.0228;5.5762,3.8111,3.0795;5.7209,3.8356,0.6658;4.9829,4.2208,-0.0481;6.8152,4.7996,0.7471;7.3557,4.5786,1.5851;6.4285,5.7258,0.9274;6.287,2.5049,0.1174;6.766,2.7415,-0.8385;7.0931,2.1753,0.7877;5.3147,1.3278,-0.0402;4.9755,0.9622,0.9344;5.8255,0.4939,-0.531;3.7365,1.6751,-0.9319;4.3086,2.3596,-2.5359;3.3927,2.6853,-3.0372;4.9185,3.2532,-2.3615;5.0466,1.3563,-3.4269;5.9897,1.0411,-2.9617;4.4323,0.4579,-3.5553;5.3656,1.9456,-4.7939;5.8684,1.2085,-5.4314;6.0073,2.8323,-4.7084;4.1835,2.3312,-5.4242)|\",6.936182049245\r\n\"[H]C1=NN([H])C2=C3C(=O)C([H])=C([H])C([H])=C3OC(C([H])([H])[H])(C([H])([H])[H])[C@]12[H] 
|(2.8412,-3.6171,-0.3393;3.6571,-2.9071,-0.276;4.7372,-3.2239,0.355;5.6151,-2.1501,0.2185;6.4858,-2.0742,0.7382;5.0706,-1.1077,-0.4319;5.5024,0.1907,-0.5277;6.8155,0.5897,0.0038;7.5529,-0.2506,0.5664;7.1539,1.9837,-0.1999;8.1247,2.323,0.1463;6.263,2.836,-0.7977;6.5417,3.8796,-0.9286;4.9564,2.4456,-1.2376;4.2647,3.1737,-1.646;4.5814,1.138,-1.0966;3.3275,0.7374,-1.4977;2.7144,-0.4338,-0.8744;2.329,-0.1101,0.5726;1.8202,-0.9636,1.0339;1.6496,0.7473,0.5857;3.204,0.1369,1.1799;1.4811,-0.7191,-1.7264;0.9492,-1.5994,-1.3494;1.7623,-0.896,-2.7693;0.799,0.1357,-1.6954;3.7585,-1.5732,-0.9833;3.8834,-1.7737,-2.0618)|\",3.126588142245\r\n\"[H]O[C@@]([H])(C([H])([H])S[H])[C@]([H])(O[H])C1([H])SS1 |(-0.5195,-0.7609,-0.8594;-0.4912,0.1922,-1.0529;0.8187,0.6157,-0.7414;0.8955,1.6448,-1.1122;1.1222,0.5962,0.7685;2.1169,1.0116,0.96;1.1224,-0.4354,1.1344;-0.1031,1.4696,1.8215;-0.0386,2.6611,1.1875;1.8072,-0.2582,-1.5506;1.5272,-0.1582,-2.6068;1.6007,-1.5959,-1.0938;1.9422,-2.1996,-1.7721;3.2731,0.1197,-1.4056;3.6761,0.081,-0.3957;3.9041,1.4922,-2.4109;4.4275,-0.5501,-2.648)|\",3.8803435081300006\r\n\"[H]C1=C([H])C([H])=C2C(=N1)[C@@]1([H])C(=O)[C@@]([H])(C2([H])[H])C([H])([H])C([H])([H])C1([H])[H] |(3.2231,2.1661,-0.4609;2.5716,1.2978,-0.3747;3.0699,0.0789,0.0877;4.1136,-0.0213,0.3701;2.189,-0.9939,0.1793;2.5335,-1.9607,0.541;0.8513,-0.8324,-0.1965;0.4548,0.4413,-0.6483;1.2982,1.4816,-0.7316;-0.9848,0.7393,-1.0641;-0.9573,1.3315,-1.9818;-1.7264,-0.5624,-1.3117;-2.3323,-0.8301,-2.3304;-1.5978,-1.5018,-0.1191;-2.2685,-2.3505,-0.2813;-0.1297,-1.9886,-0.1295;0.0672,-2.5952,0.7642;0.0217,-2.6516,-0.993;-1.9513,-0.7476,1.2028;-2.5979,-1.3801,1.8211;-1.0385,-0.5718,1.7866;-2.6373,0.5981,0.9184;-2.9,1.0997,1.8572;-3.5839,0.4049,0.3948;-1.7582,1.5266,0.0552;-1.025,2.0568,0.6723;-2.3859,2.2897,-0.4165)|\",5.42595017897\r\n\"[H]C(=O)C([H])([H])[C@@]([H])(/C([H])=C(\\[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])C([H])([H])C([H])=C([H])[H] 
|(8.3567,0.589,2.6261;7.2452,0.5847,2.7298;6.7101,1.1988,3.6258;6.5247,-0.2199,1.6718;6.6914,0.3032,0.7182;5.4516,-0.2109,1.8892;7.0627,-1.672,1.5506;8.1525,-1.5958,1.4005;6.8318,-2.415,2.8513;7.2958,-1.9482,3.7206;6.1238,-3.5341,3.0287;5.6503,-4.0067,2.1681;5.892,-4.2544,4.3349;4.8048,-4.3597,4.4816;6.2591,-5.2861,4.2234;6.5049,-3.6126,5.5849;7.5938,-3.5392,5.4641;6.1395,-2.5806,5.669;6.1724,-4.351,6.8938;6.5544,-3.7536,7.7338;5.08,-4.3801,7.0152;6.7238,-5.7826,7.0234;6.351,-6.2076,7.965;6.3123,-6.4179,6.2274;8.2542,-5.8732,7.0082;8.5892,-6.9037,7.1733;8.6942,-5.25,7.797;8.6752,-5.5413,6.0522;6.4924,-2.3976,0.3086;6.8701,-3.4269,0.3108;5.3989,-2.461,0.4081;6.8425,-1.7506,-1.0054;6.3486,-0.806,-1.2357;7.6993,-2.2557,-1.8944;7.9204,-1.7487,-2.8299;8.2105,-3.2011,-1.7223)|\",6.040927481100001\r\n\"[H]C(=O)C([H])([H])[C@]([H])(C([H])([H])C([H])=C([H])[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H] |(9.9563,0.2799,1.6034;9.8426,-0.8256,1.5245;10.7064,-1.485,0.987;8.5758,-1.399,2.1163;8.6391,-1.3053,3.2113;8.5596,-2.4666,1.8728;7.2814,-0.6969,1.6295;7.37,-0.5397,0.5435;7.1073,0.7121,2.2646;7.9902,1.3235,2.041;6.2626,1.2075,1.7666;6.8715,0.7136,3.7522;5.9478,0.2476,4.0975;7.6953,1.2444,4.6582;7.4688,1.2249,5.721;8.6258,1.7308,4.3705;6.0715,-1.629,1.8539;6.2816,-2.5791,1.3432;5.9923,-1.8768,2.9222;4.7255,-1.0897,1.3502;4.4423,-0.1893,1.9126;4.8291,-0.7736,0.3013;3.591,-2.1176,1.4596;3.8565,-3.0159,0.8828;3.4982,-2.4463,2.5056;2.2354,-1.5872,0.9751;1.9697,-0.6911,1.554;2.3283,-1.2574,-0.0694;1.1086,-2.6193,1.0862;0.155,-2.21,0.7338;0.9686,-2.9433,2.1249;1.3287,-3.5125,0.4887)|\",6.141609605785001\r\n\"[H]/C(C(=O)OC([H])([H])[H])=C(/OC([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])C(=O)OC([H])([H])[H] 
|(8.2248,1.7485,-0.6363;7.1456,1.7586,-0.7305;6.4188,0.5067,-0.4343;5.2562,0.2556,-0.6869;7.2554,-0.3759,0.1662;6.6689,-1.642,0.5014;7.469,-2.2216,0.9628;6.2977,-2.1441,-0.3963;5.8385,-1.5087,1.2002;6.5518,2.9158,-1.0812;5.2147,3.1109,-1.1199;4.5539,2.8724,-2.3839;4.9419,3.5838,-3.1231;4.7781,1.8506,-2.7096;3.0616,3.0469,-2.1487;2.7755,2.3605,-1.3434;2.8765,4.0661,-1.7845;2.2063,2.7676,-3.3952;2.4016,1.7455,-3.749;1.151,2.7849,-3.0943;2.4103,3.7588,-4.5484;1.7219,3.5464,-5.3742;3.428,3.7173,-4.9536;2.2273,4.7894,-4.2195;7.4354,4.1053,-1.3757;8.6457,4.1012,-1.2799;6.7135,5.1704,-1.7706;7.4824,6.3456,-2.0772;8.1821,6.1439,-2.8927;8.0453,6.6723,-1.199;6.7539,7.1013,-2.3711)|\",5.107576973885\r\n\"[H]C(=O)[C@@]1([H])[C@]([H])(C(=O)OC(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])[C@]1([H])C([H])([H])[H] |(6.6598,0.6306,-1.0501;5.8106,0.1843,-0.4891;5.9865,-0.3163,0.6053;4.4904,0.2638,-1.1629;4.47,0.6757,-2.1681;3.4984,-0.8757,-0.9008;3.8401,-1.6405,-0.2115;2.6984,-1.3654,-2.0642;2.4226,-0.6969,-3.0449;2.3341,-2.6436,-1.8581;1.5197,-3.3919,-2.8389;2.2829,-3.5172,-4.1616;1.7296,-4.1752,-4.841;3.2705,-3.9595,-3.9911;2.4084,-2.5448,-4.6403;1.3659,-4.7582,-2.1666;0.7695,-5.4234,-2.7999;0.8649,-4.6591,-1.1984;2.3443,-5.2207,-2.0018;0.1569,-2.7135,-3.0128;-0.483,-3.3387,-3.6456;0.2585,-1.7323,-3.4787;-0.3373,-2.5968,-2.042;3.2581,0.4917,-0.3069;3.4872,0.5206,0.7571;2.1511,1.4205,-0.7546;2.4163,2.4601,-0.5301;1.2216,1.1904,-0.22;1.9599,1.3308,-1.8258)|\",6.010994957545\r\n\"[H]O[C@@]1([H])C([H])=C(C([H])([H])[H])[C@]2([H])O[C@]2([H])[C@]1([H])Br 
|(4.6916,-2.9726,0.823;4.941,-2.8068,-0.101;4.5542,-1.4798,-0.4238;4.7688,-1.377,-1.4923;3.0737,-1.2645,-0.1872;2.4615,-2.1607,-0.2726;2.5145,-0.0684,0.04;1.0372,0.1417,0.2371;0.4889,-0.8039,0.1957;0.6282,0.8073,-0.5347;0.8269,0.6154,1.2055;3.3925,1.1389,0.0876;2.9432,2.0604,0.4581;4.291,1.3464,-1.0317;4.8547,0.9692,0.221;5.4367,1.7433,0.7211;5.3897,-0.4321,0.3405;5.4418,-0.703,1.3995;7.294,-0.4882,-0.2548)|\",6.05725431213\r\n\"[H]O[C@@]([H])(C([H])=C([H])[H])C([H])([H])C([H])([H])N([H])C([H])([H])C([H])([H])[C@]([H])(O[H])C([H])=C([H])[H] |(5.0613,-1.4277,-0.7891;4.7418,-1.5142,0.124;3.3131,-1.389,0.0895;3.001,-1.5475,1.1284;2.9057,-0.0011,-0.339;3.2051,0.2818,-1.3515;2.2308,0.8671,0.414;1.9663,1.8575,0.0533;1.9177,0.6202,1.4265;2.702,-2.4863,-0.7978;2.9934,-2.3184,-1.845;3.1521,-3.4403,-0.4912;1.1762,-2.5805,-0.7245;0.7321,-1.6237,-1.0263;0.8675,-2.7445,0.3279;0.6745,-3.6124,-1.6293;1.0943,-4.5068,-1.3742;-0.7802,-3.7522,-1.6007;-1.1717,-3.9507,-0.5824;-1.2136,-2.7943,-1.9123;-1.2361,-4.8717,-2.5422;-0.8179,-5.8269,-2.1983;-0.8244,-4.6917,-3.5439;-2.7562,-5.0653,-2.6104;-3.1453,-5.1168,-1.5847;-3.0862,-6.3533,-3.1588;-2.8697,-6.3141,-4.1049;-3.5353,-4.0092,-3.3633;-4.6159,-4.1378,-3.2986;-3.0419,-3.0292,-4.1228;-3.6983,-2.3531,-4.6644;-1.9755,-2.8524,-4.2353)|\",6.223243760935\r\n\"[H]C(COC([H])([H])[H])C([H])C1=C([H])C([H])=C(Cl)C([H])=C1[H] |(4.6918,-0.3335,-2.25;4.231,-0.8528,-1.4093;2.9541,-0.2893,-1.01;1.9593,-1.0623,-0.6772;1.9273,-2.5227,-0.8346;0.9176,-2.756,-1.1751;2.1033,-2.9818,0.1407;2.6784,-2.8479,-1.5557;4.9986,-1.6465,-0.6123;4.5305,-2.1143,0.2537;6.4304,-1.9034,-0.7429;7.0226,-2.8945,0.0645;6.4093,-3.4256,0.7885;8.3711,-3.2195,-0.0461;8.8099,-3.9876,0.5817;9.1579,-2.5397,-0.9747;10.8635,-2.9352,-1.1238;8.6097,-1.5398,-1.7809;9.238,-1.0079,-2.4875;7.2609,-1.2266,-1.659;6.8534,-0.4301,-2.2739)|\",3.229991405435\r\n\"[H]O[C@@]([H])([C@@]([H])(O[H])C([H])=C([H])[H])[C@]([H])(O[H])C([H])([H])N([H])[H] 
|(0.8752,2.9306,-0.7566;1.3675,2.2386,-1.2247;2.7516,2.3133,-0.8059;2.808,2.5672,0.2577;3.305,0.8781,-0.926;4.3229,0.9007,-0.516;2.5634,0.0457,-0.0428;1.6346,0.1746,-0.3038;3.347,0.3304,-2.3383;3.7853,0.985,-3.0879;2.8873,-0.8725,-2.6809;2.9484,-1.2311,-3.7047;2.4467,-1.5388,-1.9452;3.4312,3.436,-1.6151;3.0096,4.3857,-1.2268;3.1372,3.3265,-2.9993;2.2199,2.9994,-3.0376;4.9517,3.5087,-1.4665;5.3142,4.299,-2.1446;5.3922,2.5698,-1.8181;5.3177,3.692,-0.0572;5.1164,4.6483,0.2323;6.3212,3.5687,0.0584)|\",7.27088208536\r\n\"[H]N([H])C1=NC(=O)N=C2C1=NC(=O)N2C([H])([H])[H] |(2.0797,-0.9466,-5.1455;1.0962,-0.7211,-5.1789;0.5947,-0.7581,-6.0551;0.4376,-0.3725,-4.0575;-0.8284,-0.0776,-4.0539;-1.486,0.2895,-2.8567;-2.6644,0.5739,-2.8878;-0.8181,0.3445,-1.5763;0.4269,0.0489,-1.6103;1.2028,-0.3344,-2.8044;2.441,-0.58,-2.5667;2.593,-0.3688,-1.1296;3.6194,-0.4962,-0.5134;1.3469,0.0127,-0.5949;1.1014,0.2994,0.8095;2.0532,0.564,1.2723;0.6803,-0.5742,1.3169;0.3968,1.1302,0.884)|\",3.2544816519799995\r\n\"[H]O[C@@]1([H])C(=C([H])[H])C([H])([H])C([H])([H])[C@@]2(C([H])([H])[H])[C@@]([H])(C([H])(C([H])([H])[H])C([H])([H])[H])C([H])([H])C(=O)[C@@]2([H])C1([H])[H] 
|(-0.98,5.1372,0.3106;-0.8455,5.3149,1.2561;-0.3001,4.1299,1.8308;-0.1017,4.4058,2.8771;-1.2808,2.967,1.8189;-2.5055,3.0966,1.299;-3.2058,2.2652,1.2879;-2.8596,4.0415,0.9004;-0.827,1.661,2.4368;-0.5584,1.8289,3.4889;-1.6876,0.9832,2.4571;0.3344,0.9427,1.7023;0.1677,1.0556,0.6281;0.2507,-0.1316,1.9097;1.767,1.3767,2.0821;2.0275,0.9674,3.5549;1.8827,-0.1109,3.6921;1.3566,1.4861,4.2478;3.0511,1.2062,3.866;2.9154,0.6715,1.227;3.1861,-0.2488,1.7596;2.7084,0.2163,-0.253;3.7223,-0.0904,-0.5531;1.8434,-1.0469,-0.4145;1.9667,-1.4576,-1.4239;0.7768,-0.854,-0.2716;2.1415,-1.8264,0.2972;2.2894,1.3058,-1.2554;2.3885,0.9265,-2.2794;2.9121,2.2043,-1.1816;1.245,1.6092,-1.1274;4.0869,1.6756,1.3182;4.8059,1.6207,0.4955;4.6593,1.535,2.2458;3.4449,3.0513,1.3849;3.9567,4.093,1.0329;2.0216,2.9194,1.9538;2.073,3.3579,2.9627;1.0507,3.8086,1.1562;0.8656,3.3933,0.1574;1.5603,4.7671,1.011)|\",5.934803079405\r\n\"[H]ONC([H])C1=C(C#CC2=C([H])C([H])=C([H])C([H])=C2[H])C([H])=C(F)C([H])=C1[H] |(3.3844,3.2743,2.852;4.3426,3.211,2.7161;4.4402,2.236,1.709;5.6725,2.0394,1.4117;6.4412,2.6141,1.9359;6.1483,1.0924,0.403;5.329,0.2158,-0.3676;3.9153,0.1489,-0.2445;2.7058,0.0257,-0.2219;1.2872,-0.058,-0.1412;0.5672,-0.9559,-0.9541;1.1085,-1.5901,-1.6493;-0.8203,-1.0269,-0.8646;-1.3653,-1.7231,-1.4962;-1.5111,-0.2088,0.0328;-2.594,-0.2672,0.1;-0.806,0.6848,0.8442;-1.3414,1.3215,1.5433;0.5806,0.7631,0.761;1.1406,1.4518,1.3854;5.9386,-0.6418,-1.3045;5.3315,-1.3155,-1.8981;7.3132,-0.6289,-1.4705;7.8661,-1.4614,-2.3749;8.1367,0.2139,-0.7319;9.2101,0.1971,-0.8855;7.5387,1.0604,0.1942;8.1665,1.7255,0.7811)|\",4.046332956935\r\n\"[H]C1=NC(=O)N=C2O/C(C([H])([H])C([H])([H])C([H])([H])[H])=C(/[H])[C@@]12[H] 
|(8.8008,-0.1181,0.4226;8.5015,0.7503,1.0136;9.2783,1.7578,1.074;8.8572,2.8653,1.9155;9.6862,3.5195,2.4975;7.4736,3.1695,1.9475;6.7284,2.1435,1.8728;5.3821,2.2179,1.7012;4.9415,0.9413,1.2978;3.4829,0.8583,1.011;3.2357,1.5862,0.2256;3.271,-0.1375,0.6042;2.5985,1.1283,2.2464;2.841,0.3947,3.026;2.8483,2.1143,2.6549;1.1056,1.0632,1.9135;0.4971,1.2587,2.8029;0.8249,0.0758,1.5272;0.8354,1.8073,1.1545;5.9338,0.0434,1.2633;5.8283,-1.0028,1.0105;7.1826,0.7078,1.7764;7.4313,0.3267,2.7835)|\",4.810972876839999\r\n\"[H]C1=C([H])C([H])=C([C@@]([H])(C([H])([H])N(=O)=O)[C@@]2([H])N([H])C([H])([H])C([H])([H])C2([H])[H])C([H])=C1[H] |(-2.9465,-4.7469,1.8345;-1.9967,-4.2885,1.5725;-1.9684,-3.0621,0.9078;-2.896,-2.5592,0.6476;-0.7459,-2.4769,0.5751;-0.732,-1.52,0.0577;0.4668,-3.1022,0.897;1.7805,-2.424,0.5344;1.5396,-1.4812,0.0269;2.5172,-2.0516,1.8405;2.853,-2.9421,2.3698;1.8932,-1.4212,2.4747;3.754,-1.2456,1.5444;3.6026,-0.0503,1.3097;4.8324,-1.841,1.5141;2.7129,-3.2274,-0.4158;3.5122,-2.5349,-0.7196;3.2984,-4.4222,0.2349;4.246,-4.2165,0.5347;3.2613,-5.5571,-0.714;4.2113,-6.1026,-0.7072;2.4779,-6.2651,-0.4093;2.9203,-4.9501,-2.0854;3.8318,-4.6006,-2.5871;2.4194,-5.6571,-2.7552;2.0353,-3.7562,-1.6958;1.0169,-4.0965,-1.4779;1.973,-2.987,-2.4724;0.4265,-4.3346,1.5691;1.3586,-4.836,1.8082;-0.7958,-4.9214,1.8993;-0.8086,-5.8771,2.417)|\",4.193274436205\r\n\"[H]O[C@@]1(C([H])([H])[H])C([H])([H])C([H])([H])O[C@@]([H])(/C([H])=C(\\[H])C2=C([H])C([H])=C([H])C([H])=C2[H])C1([H])[H] 
|(2.1095,0.9071,1.5926;2.5109,0.101,1.2295;2.1881,0.0553,-0.1748;0.6643,0.04,-0.3363;0.3686,-0.0204,-1.3897;0.2237,0.9569,0.0789;0.2354,-0.8153,0.1968;2.826,1.2581,-0.8977;2.5008,1.267,-1.9457;2.4785,2.1969,-0.4409;4.3519,1.1754,-0.8483;4.701,1.2673,0.1899;4.807,1.9832,-1.4299;4.8194,-0.0308,-1.4382;4.3555,-1.2296,-0.7897;4.6692,-2.0248,-1.4755;5.0329,-1.4708,0.5391;4.6899,-0.8699,1.3762;5.9985,-2.3891,0.6956;6.303,-2.9662,-0.1797;6.728,-2.7232,1.9288;7.786,-3.6472,1.8631;8.0393,-4.0918,0.9031;8.5143,-3.9969,2.9997;9.3292,-4.7118,2.92;8.1966,-3.4307,4.2348;8.7599,-3.7009,5.1239;7.1437,-2.514,4.3197;6.8851,-2.0715,5.2783;6.4183,-2.1653,3.184;5.598,-1.4587,3.2712;2.8111,-1.2429,-0.7183;2.4736,-2.0887,-0.1078;2.4286,-1.3952,-1.7351)|\",5.069481034815\r\n\"[H]OC(=O)[C@@]([H])(O[H])[C@]([H])(OC(=O)C([H])([H])[H])C(=O)OC([H])([H])[H] |(8.5974,2.5299,-2.6714;8.025,2.8157,-3.4144;6.991,1.9538,-3.4547;6.056,2.0817,-4.2044;7.1469,0.7805,-2.4644;7.695,-0.0164,-2.979;7.9642,1.1703,-1.3618;7.403,1.7413,-0.8033;5.8055,0.1832,-2.0049;6.0048,-0.457,-1.1375;5.2637,-0.6003,-3.058;4.5346,-1.6988,-2.6847;4.4407,-2.0766,-1.5416;3.8911,-2.3271,-3.8916;3.0811,-1.6763,-4.2389;3.4856,-3.3028,-3.6222;4.6106,-2.4216,-4.7097;4.7746,1.2307,-1.5781;3.5947,1.1952,-1.8029;5.3893,2.2019,-0.8479;4.5288,3.2618,-0.3742;3.7461,2.8505,0.2669;4.0741,3.7782,-1.222;5.1762,3.9349,0.1874)|\",7.05863328197\r\n\"[H]OC(=O)[C@@]([H])(OC(=O)C([H])([H])[H])[C@]([H])(O[H])C(=O)OC([H])([H])[H] 
|(4.661,1.5607,3.3174;5.344,0.8699,3.4214;5.9178,0.6755,2.2216;6.7962,-0.1328,2.0435;5.4013,1.5538,1.0799;5.0364,0.9048,0.2793;4.3196,2.3696,1.5616;3.2335,2.5308,0.7266;3.1023,1.932,-0.3098;2.2772,3.5384,1.3069;2.757,4.5228,1.327;2.0145,3.2771,2.3369;1.3788,3.5801,0.6909;6.4898,2.5006,0.4993;6.8732,3.1466,1.293;5.9214,3.3335,-0.4848;5.8524,2.7912,-1.2927;7.618,1.6774,-0.1234;7.4884,1.1566,-1.2129;8.7102,1.6342,0.6376;9.7626,0.7652,0.1705;10.0775,1.0553,-0.8344;9.4071,-0.2675,0.1641;10.5766,0.8878,0.8842)|\",7.004210511869999\r\n\"[H]O[C@@]([H])(C([H])([H])C([H])([H])C([H])([H])[C@@]([H])(O[H])C([H])(C([H])([H])[H])C([H])([H])[H])C([H])(C([H])([H])[H])C([H])([H])[H] |(3.2757,-3.2653,-1.7866;2.9283,-2.3723,-1.6328;2.9571,-2.1432,-0.217;2.4185,-1.195,-0.0997;2.1651,-3.2519,0.4992;1.1767,-3.2955,0.022;2.645,-4.2193,0.2877;1.9969,-3.1112,2.0211;1.4648,-4.0008,2.3877;2.9722,-3.1165,2.52;1.223,-1.8597,2.4604;0.2358,-1.8445,1.9772;1.7512,-0.9561,2.1344;1.0288,-1.7831,3.976;0.6022,-2.7449,4.3137;2.3367,-1.6237,4.5427;2.264,-1.7521,5.5006;0.0601,-0.6656,4.4336;-0.9029,-0.875,3.9437;-0.1702,-0.733,5.9524;-0.9409,-0.0199,6.265;0.743,-0.477,6.5065;-0.4934,-1.7328,6.2697;0.528,0.7365,4.0177;-0.1387,1.505,4.4258;0.5439,0.8574,2.9296;1.5395,0.9277,4.3925;4.4014,-1.9196,0.2956;4.3305,-1.7114,1.3726;5.3013,-3.1526,0.1129;6.3103,-2.9501,0.4892;5.4028,-3.4171,-0.9478;4.9226,-4.0305,0.6484;5.0203,-0.689,-0.3837;6.0399,-0.5093,-0.0232;4.43,0.2133,-0.1823;5.056,-0.8268,-1.4694)|\",8.5987976758\r\n\"[H]O[C@](/C([H])=C(\\[H])C([H])([H])[H])(C([H])([H])[H])C([H])([H])C1=NC([H])=C([H])C([H])=C1[H] 
|(4.673,-1.8224,-2.5059;5.0387,-1.8314,-1.6063;4.513,-0.6632,-0.9357;3.0019,-0.774,-0.9379;2.639,-1.7387,-0.5766;2.1171,0.1397,-1.3472;2.4713,1.1043,-1.7102;0.6266,-0.0491,-1.3444;0.1333,0.7029,-0.7137;0.3456,-1.0405,-0.9733;0.209,0.0676,-2.3537;5.0493,0.597,-1.6218;4.7585,1.5107,-1.0967;4.6818,0.6568,-2.6536;6.1418,0.5568,-1.6462;5.0737,-0.8192,0.5054;4.7416,-1.7957,0.8756;6.1649,-0.8503,0.4251;4.6831,0.2653,1.482;5.4564,1.3657,1.5093;5.1386,2.3415,2.3679;5.7923,3.2129,2.3583;4.0532,2.2849,3.2419;3.849,3.1064,3.9222;3.2521,1.1441,3.215;2.3968,1.0496,3.8791;3.5702,0.1228,2.3235;2.9706,-0.7814,2.2783)|\",6.059975450635\r\n\"[H]OC([H])([H])C([H])([H])[C@]1([H])O[C@@]([H])(C([H])([H])C([H])([H])[H])C(C([H])([H])[H])=C([H])C1([H])[H] |(5.8554,0.542,-0.1388;6.64,0.276,0.3742;6.5717,1.0165,1.5826;5.927,0.5115,2.3246;7.5852,1.0322,2;6.0731,2.4517,1.3759;6.6652,2.9244,0.5814;6.2354,3.0318,2.2947;4.5892,2.5359,1.019;4.0042,2.0536,1.8187;4.4,1.7991,-0.2038;3.0311,1.605,-0.5925;3.0982,1.3612,-1.6632;2.4102,0.3878,0.1261;2.3648,0.5917,1.2039;1.3701,0.2855,-0.2072;3.1603,-0.9227,-0.1353;2.6786,-1.7561,0.3879;4.1997,-0.8714,0.2026;3.1693,-1.1644,-1.2057;2.2149,2.8785,-0.4558;0.8525,2.889,-1.1026;0.405,3.8866,-1.0533;0.1591,2.1893,-0.6179;0.9084,2.5945,-2.1598;2.7168,3.948,0.1727;2.1313,4.866,0.2144;4.0714,3.9616,0.8292;4.019,4.4668,1.8037;4.7838,4.5385,0.2198)|\",7.102171498050001\r\n\"[H]O[C@@]([H])(C([H])([H])/C([H])=C(\\[H])[C@@]([H])(O[H])C([H])(C([H])([H])[H])C([H])([H])[H])C([H])(C([H])([H])[H])C([H])([H])[H] 
|(1.0976,-2.3644,5.0613;1.0993,-3.2843,4.7515;2.1036,-3.389,3.7357;2.2099,-4.4693,3.5706;1.5862,-2.7657,2.4151;2.2923,-2.9721,1.6026;0.6528,-3.2914,2.1712;1.3112,-1.289,2.4977;0.558,-0.9821,3.2268;1.8844,-0.3439,1.7462;2.6296,-0.6386,1.0034;1.5644,1.1274,1.8304;0.8402,1.2775,2.6401;2.7097,1.8927,2.2354;3.43,1.6927,1.6168;0.9297,1.6932,0.5302;0.0797,1.0366,0.2947;0.4018,3.115,0.7674;-0.0523,3.5229,-0.1434;1.2162,3.7803,1.0723;-0.3591,3.1316,1.5571;1.8957,1.6634,-0.6652;1.4007,2.0408,-1.5667;2.2529,0.6518,-0.889;2.7677,2.3058,-0.4865;3.4605,-2.8397,4.2256;3.3233,-1.7644,4.4245;3.8844,-3.5233,5.5352;4.7988,-3.067,5.9315;4.0885,-4.5892,5.3688;3.1025,-3.4517,6.2958;4.5657,-2.9779,3.1666;5.5307,-2.6632,3.5802;4.3722,-2.3653,2.2808;4.6735,-4.0214,2.8419)|\",6.870874725125001\r\n\"[H]OC([H])([H])/C([H])=C(\\[H])C([H])([H])[C@@]([H])(O[H])C([H])([H])C([H])([H])C1=C([H])C([H])=C([H])C([H])=C1[H] |(6.7662,-3.2015,3.5693;7.3565,-2.6584,4.1151;6.8495,-1.3245,4.0825;5.8521,-1.2589,4.547;7.5394,-0.749,4.711;6.8122,-0.7556,2.6909;7.761,-0.7682,2.1523;5.7216,-0.2365,2.1117;4.7869,-0.2361,2.6763;5.6713,0.3241,0.7149;6.684,0.5714,0.377;5.0895,1.2548,0.7053;5.0744,-0.6678,-0.3164;5.1267,-0.19,-1.303;5.9058,-1.82,-0.428;6.0521,-2.1445,0.477;3.6166,-1.0806,-0.053;3.358,-1.8409,-0.7995;3.5407,-1.5753,0.9254;2.5953,0.0733,-0.1281;2.7176,0.5937,-1.0878;2.8077,0.8122,0.655;1.1649,-0.4045,0.0123;0.4367,-0.831,-1.1068;0.8975,-0.7869,-2.0917;-0.8689,-1.3052,-0.9761;-1.4171,-1.6271,-1.8579;-1.4719,-1.3608,0.2819;-2.4897,-1.7271,0.3854;-0.7595,-0.9371,1.4049;-1.2216,-0.9705,2.3883;0.5461,-0.4632,1.268;1.0919,-0.1287,2.1483)|\",6.372906378710001\r\n\"[H]OC([H])([H])C([H])([H])/C([H])=C(\\[H])[C@@]([H])(O[H])C([H])([H])C([H])([H])C1=C([H])C([H])=C([H])C([H])=C1[H] 
|(8.7578,-1.4494,-0.228;9.4203,-2.1252,-0.4492;9.3052,-2.3653,-1.8415;9.6649,-1.503,-2.4287;9.9671,-3.2097,-2.0587;7.8622,-2.7093,-2.2576;7.8569,-2.9663,-3.3266;7.5437,-3.5985,-1.7006;6.9245,-1.5617,-1.9957;7.0242,-0.6923,-2.647;6.0457,-1.5031,-0.9888;5.9453,-2.3546,-0.3135;5.1177,-0.3396,-0.7122;5.4531,0.176,0.1998;5.1847,0.6661,-1.7215;4.9063,0.2477,-2.5535;3.6726,-0.8226,-0.4831;3.3081,-1.2958,-1.4072;3.6728,-1.6103,0.2823;2.7143,0.3084,-0.0655;2.7674,1.1048,-0.8157;3.0714,0.7441,0.8777;1.2861,-0.1665,0.0962;0.3699,-0.0627,-0.9588;0.6846,0.3964,-1.894;-0.938,-0.5311,-0.8245;-1.6335,-0.4366,-1.6546;-1.3537,-1.1133,0.3741;-2.3725,-1.4755,0.4825;-0.4525,-1.2212,1.4351;-0.7679,-1.6675,2.3748;0.8537,-0.7515,1.2946;1.5477,-0.8336,2.1291)|\",6.4246080103050005\r\n\"[H]/C(=C(/[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])C(F)(F)C(=O)N(C([H])([H])C([H])([H])[H])C([H])([H])C([H])([H])[H] |(6.9035,2.1423,-2.6751;6.032,2.6219,-3.1091;5.3133,3.524,-2.4395;4.4333,3.9529,-2.9166;5.6058,3.9918,-1.0428;6.534,3.5321,-0.68;5.7769,5.0793,-1.0559;4.4543,3.6888,-0.064;3.524,4.1313,-0.4488;4.2882,2.6034,-0.033;4.7163,4.2117,1.354;5.6485,3.7709,1.7344;4.8862,5.2973,1.3152;3.5726,3.908,2.327;3.7899,4.2927,3.33;3.4025,2.8279,2.4136;2.6334,4.3646,1.9913;5.7097,2.1903,-4.5058;4.5659,2.8177,-4.9509;6.7281,2.6013,-5.3572;5.5764,0.6497,-4.6535;5.8682,-0.0525,-3.6882;5.157,0.1396,-5.844;5.0613,-1.3251,-5.9403;4.6957,-1.7025,-4.9829;4.3105,-1.5535,-6.7038;6.4008,-1.9813,-6.285;6.2782,-3.0664,-6.3798;7.1303,-1.7844,-5.4947;6.7981,-1.6004,-7.2328;4.8735,0.9089,-7.0646;5.3837,1.8675,-7.0066;5.3282,0.3637,-7.9013;3.378,1.1132,-7.319;3.2296,1.6537,-8.2612;2.9324,1.6977,-6.5097;2.8425,0.1601,-7.3937)|\",6.413723456285\r\n\"[H]OC(=O)/C([H])=N/N([H])C1=C([H])C([H])=C(C#N)C([H])=C1[H] 
|(1.6145,-6.1533,-0.6065;1.2771,-5.2455,-0.492;-0.0443,-5.3836,-0.2392;-0.6052,-6.4613,-0.1845;-0.7579,-4.1108,-0.0334;-1.8282,-4.2063,0.1714;-0.1427,-2.9805,-0.0923;-0.8382,-1.8638,0.1037;-1.8382,-1.9177,0.2958;-0.2391,-0.6055,0.0551;-1.0392,0.5281,0.2704;-2.1018,0.4144,0.471;-0.4769,1.7958,0.228;-1.097,2.6704,0.3947;0.8956,1.9537,-0.0293;1.4779,3.2603,-0.0735;1.9508,4.3233,-0.109;1.6896,0.8117,-0.243;2.7503,0.929,-0.44;1.1338,-0.4582,-0.2021;1.7409,-1.34,-0.3663)|\",4.19599557471\r\n\"[H]C1NN2C([H])([H])C([H])([H])C([H])([H])[C@]2([H])[C@]1([H])C1=C([H])C([H])=C([H])C([H])=C1[H] |(3.0508,-1.8277,1.7527;3.432,-0.9059,1.3248;4.5689,-0.4403,1.6935;4.8783,0.6885,0.9314;5.4853,1.7875,1.7171;5.5542,1.4762,2.7636;6.4983,2.001,1.3592;4.539,2.9994,1.534;4.4809,3.6174,2.4353;4.8964,3.6415,0.7198;3.1887,2.3666,1.1528;2.5156,3.0573,0.6358;2.6715,2.0057,2.0491;3.6241,1.1801,0.2774;3.8521,1.529,-0.7353;2.7455,-0.1067,0.235;2.936,-0.6015,-0.7295;1.2463,0.0747,0.3735;0.4494,0.2236,-0.7704;0.9157,0.1881,-1.7529;-0.9287,0.4116,-0.6669;-1.5273,0.5255,-1.567;-1.5379,0.446,0.5891;-2.612,0.5878,0.6725;-0.7582,0.2899,1.7359;-1.2234,0.3069,2.7181;0.6213,0.107,1.628;1.2193,-0.0219,2.5265)|\",5.412344486445\r\n\"[H]OC(=O)[C@@]([H])(/C([H])=C(\\C([H])([H])[H])[C@]1([H])O[C@@]([H])(C([H])([H])C([H])([H])[H])C(C([H])([H])[H])=C([H])C1([H])[H])C([H])([H])[H] 
|(5.6187,-3.89,7.1123;4.9396,-3.4908,7.6804;4.1263,-2.6879,6.9384;3.2038,-2.1187,7.4651;4.5241,-2.5254,5.4638;4.9915,-3.4591,5.1237;3.3091,-2.2076,4.6287;2.5885,-1.5818,5.1459;3.0652,-2.5969,3.3699;3.9988,-3.4405,2.5335;4.8506,-3.8308,3.0975;4.3917,-2.8617,1.6893;3.4646,-4.2955,2.0982;1.7943,-2.1987,2.6185;1.3638,-3.1222,2.2099;2.1227,-1.4461,1.4389;2.6596,-0.1402,1.6663;3.7147,-0.2337,1.9875;2.6564,0.5455,0.2888;3.212,1.4884,0.3634;3.2256,-0.0999,-0.3912;1.2581,0.7924,-0.2862;1.3224,1.2498,-1.2799;0.6703,1.4606,0.3535;0.7099,-0.1496,-0.3767;1.921,0.6298,2.7478;2.3001,2.0755,2.9441;1.7757,2.5036,3.8045;2.0649,2.6905,2.0662;3.3793,2.1818,3.1241;1.0265,0.0112,3.5268;0.5066,0.5759,4.3002;0.7071,-1.4551,3.4053;-0.2491,-1.589,2.879;0.5697,-1.9072,4.3947;5.5871,-1.3977,5.377;5.9376,-1.2962,4.3457;6.4499,-1.6072,6.0204;5.1516,-0.4434,5.6891)|\",6.288551085055\r\n\"[H]C1([H])C2=C(C([H])([H])C([H])([H])C1([H])[H])C([H])([H])[C@@]1(C#N)C([H])([H])C([H])([H])C([H])([H])[C@]1([H])C2([H])[H] |(-1.7748,-1.0432,-1.7705;-1.7195,-1.0494,-0.6696;-1.8015,-2.1075,-0.3826;-0.3611,-0.5237,-0.2449;-0.202,0.7,0.2965;-1.3672,1.6424,0.5354;-1.5733,1.6916,1.6178;-1.0702,2.662,0.2517;-2.6421,1.2453,-0.2189;-3.4961,1.8236,0.1543;-2.5282,1.4989,-1.282;-2.896,-0.2576,-0.0861;-3.8282,-0.5434,-0.5884;-3.0172,-0.5129,0.9761;1.1509,1.2424,0.7357;1.2419,1.1802,1.8315;1.2198,2.3098,0.4889;2.3252,0.4688,0.1116;2.4512,0.837,-1.3133;2.5741,1.1288,-2.4312;3.714,0.6027,0.7885;4.2372,1.5226,0.512;3.5547,0.6279,1.8732;4.4831,-0.6853,0.3771;5.204,-0.4701,-0.4173;5.0539,-1.0803,1.2234;3.4056,-1.6991,-0.1189;3.5215,-2.6906,0.3307;3.474,-1.8274,-1.2054;2.0598,-1.0526,0.2466;1.88,-1.2085,1.3211;0.8012,-1.4697,-0.5054;1.0029,-1.5112,-1.5876;0.5127,-2.4905,-0.2179)|\",6.759308046419999\r\n\"[H]/C(=N\\C(=O)OC(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] 
|(5.5702,-1.7874,0.743;6.3524,-2.2475,0.1205;6.2313,-2.2976,-1.1491;5.0644,-1.6841,-1.6804;4.5014,-0.7144,-1.207;4.7271,-2.33,-2.8043;3.5579,-1.9184,-3.6026;2.2779,-2.0514,-2.7706;1.4066,-1.8672,-3.4091;2.2643,-1.3362,-1.9466;2.1918,-3.0652,-2.3641;3.7641,-0.4953,-4.1323;2.9501,-0.2398,-4.82;4.7085,-0.4281,-4.6832;3.7763,0.2312,-3.3183;3.5688,-2.933,-4.7488;2.7391,-2.734,-5.4351;3.4649,-3.9524,-4.3635;4.5071,-2.8713,-5.309;7.5496,-2.7907,0.863;8.1616,-1.6179,1.6664;8.9954,-1.9798,2.279;7.4249,-1.1607,2.3377;8.5432,-0.8376,0.9988;8.5908,-3.3891,-0.0925;9.4459,-3.7727,0.4762;8.9539,-2.6382,-0.8008;8.1637,-4.2101,-0.6761;7.0378,-3.8693,1.8475;7.8684,-4.2437,2.4567;6.6004,-4.7185,1.3107;6.2774,-3.467,2.5278)|\",5.711669721995\r\n\"[H]OC([H])([H])/C([H])=C(\\[H])[C@@]1([H])O[C@@]([H])(C([H])([H])[H])OC1([H])[H] |(2.5886,2.8007,-5.7842;3.0444,2.2249,-6.4186;2.8459,0.8824,-5.9788;1.7869,0.5851,-6.0446;3.4033,0.2661,-6.6942;3.3595,0.6653,-4.5833;4.4006,0.9464,-4.4182;2.626,0.1912,-3.5733;1.5792,-0.0743,-3.7283;3.1463,-0.0579,-2.1938;4.1521,0.3767,-2.079;2.259,0.5124,-1.2265;2.5217,-0.1844,-0.0257;3.4264,0.2356,0.4513;1.3292,-0.1106,0.901;1.5262,-0.69,1.8079;0.4489,-0.5215,0.3988;1.1326,0.9286,1.1821;2.7888,-1.5362,-0.395;3.1666,-1.5478,-1.7684;4.16,-1.9949,-1.8936;2.4401,-2.14,-2.3401)|\",6.819173093530001\r\n\"[H]C1=C([H])C([H])=C(/C([H])=C2\\C(=O)SC(=O)C2([H])[H])C([H])=C1[H] 
|(-2.8001,-0.2034,0.132;-1.7184,-0.2834,0.0663;-1.1174,-1.5244,-0.1567;-1.7333,-2.4128,-0.2664;0.267,-1.6431,-0.2436;0.7201,-2.6084,-0.4173;1.0902,-0.5047,-0.1068;2.5498,-0.4686,-0.1756;2.912,0.5571,-0.0828;3.5698,-1.361,-0.3149;3.485,-2.8345,-0.4389;2.5086,-3.5463,-0.5161;5.1576,-3.5839,-0.4558;6.0005,-1.9906,-0.3269;7.1955,-1.865,-0.2599;4.996,-0.8477,-0.3335;5.2258,-0.2023,0.523;5.2055,-0.2537,-1.2342;0.4614,0.7429,0.1151;1.0759,1.6339,0.2208;-0.921,0.8551,0.2027;-1.3756,1.8265,0.3754)|\",4.17422646667\r\n\"[H]OC([H])([H])C#C[C@@](O[H])(/C([H])=C(\\[H])C1=C([H])C([H])=C([H])C([H])=C1[H])C([H])([H])[H] |(-1.8708,-1.8482,-0.2034;-2.2956,-1.2555,-0.8435;-1.2727,-0.7558,-1.6988;-0.902,-1.5373,-2.3822;-1.7575,0.0057,-2.3198;-0.1383,-0.1744,-0.9698;0.7848,0.2467,-0.3076;1.9101,0.8067,0.475;1.4141,1.516,1.6156;0.7135,2.1086,1.3009;2.7831,-0.3301,0.9748;3.2291,-0.9249,0.1807;3.0014,-0.6019,2.2674;2.5251,0.0415,3.0031;3.8281,-1.688,2.8167;4.417,-2.6917,2.0253;4.2632,-2.6921,0.9499;5.1879,-3.6972,2.601;5.629,-4.4636,1.9689;5.3915,-3.7286,3.9841;5.9929,-4.5156,4.431;4.8126,-2.7437,4.7845;4.962,-2.7571,5.861;4.0394,-1.738,4.2057;3.5906,-0.9735,4.8357;2.7346,1.7647,-0.4145;3.1876,1.2355,-1.2594;3.5239,2.2188,0.1917;2.0899,2.5552,-0.8151)|\",5.074923311825\r\n\"[H]OC1=NC(=O)N([C@@]2([H])OC([H])([H])[C@]([H])(O[H])C2([H])[H])C([H])([H])[C@@]1([H])C([H])([H])[H] 
|(1.5054,-2.7079,0.198;2.1373,-2.7899,-0.5337;2.6715,-1.5794,-0.8078;3.5479,-1.5253,-1.7331;4.0728,-0.2668,-2.1084;4.5763,-0.1109,-3.2082;4.0073,0.7608,-1.1684;4.4478,2.0867,-1.5414;4.7647,1.9952,-2.5818;3.3385,3.0118,-1.4701;3.7018,4.1698,-0.7067;3.2715,5.0547,-1.186;3.3052,4.098,0.3185;5.2343,4.2017,-0.7068;5.6393,4.7522,0.1467;5.7702,4.8402,-1.858;5.3504,4.4483,-2.6412;5.5503,2.6999,-0.6571;5.4705,2.3295,0.3721;6.5532,2.4729,-1.0267;3.3808,0.5547,0.1311;4.0962,0.1327,0.8528;3.0453,1.5234,0.5102;2.1767,-0.3873,-0.0091;1.8927,-0.7257,0.9974;0.9604,0.2799,-0.6803;0.5638,1.0716,-0.0358;0.1601,-0.4448,-0.8643;1.2444,0.7326,-1.6342)|\",5.82867867771\r\n\"[H]OC(NNC([H])C([H])([H])C([H])(C([H])([H])[H])C([H])([H])[H])C([H])([H])C([H])([H])C([H])([H])[H] |(1.107,-0.6557,-1.6106;1.3597,-0.9952,-0.7388;2.5859,-0.5005,-0.4164;3.0294,-0.8588,0.7336;4.3017,-0.3328,1.0114;4.781,-0.7762,2.1142;4.204,-1.4921,2.7096;6.1289,-0.3354,2.6009;6.5966,0.2665,1.8137;6.7553,-1.2285,2.7488;6.1097,0.4694,3.9289;7.1607,0.7071,4.1489;5.351,1.7946,3.7737;5.4085,2.3891,4.6934;4.2914,1.6214,3.552;5.7648,2.3971,2.9566;5.5674,-0.3514,5.1089;5.6683,0.2069,6.0471;6.1096,-1.2981,5.2239;4.504,-0.5873,4.9827;3.2745,0.3616,-1.4498;2.5183,0.9591,-1.9807;3.9366,1.0493,-0.92;4.1052,-0.4523,-2.4678;4.6278,0.2598,-3.1193;4.8793,-1.0003,-1.9187;3.2859,-1.4231,-3.325;3.9256,-1.9438,-4.046;2.5141,-0.8931,-3.9001;2.7903,-2.1856,-2.7139)|\",5.64908353638\r\n\"[H]C1=C([H])C(C(=O)/C(=C(\\[H])C([H])([H])[H])C([H])([H])[H])=C([H])C([H])=C1OC([H])([H])[H] 
|(7.0546,-1.6188,-3.0169;6.6551,-1.5769,-2.0085;6.0283,-0.4364,-1.5271;5.9267,0.4265,-2.1767;5.4969,-0.3977,-0.2244;4.7401,0.7677,0.3336;3.891,0.5803,1.2025;4.9992,2.1699,-0.1441;6.2455,2.5414,-0.4966;7.0205,1.7771,-0.4815;6.732,3.9074,-0.8821;7.1712,3.8871,-1.8891;7.5314,4.2354,-0.2041;5.9474,4.6672,-0.8732;3.8095,3.09,-0.0275;4.0566,4.131,-0.2449;3.3946,3.0258,0.983;3.0096,2.7792,-0.7118;5.6026,-1.55,0.567;5.1654,-1.5333,1.5603;6.2452,-2.695,0.1057;6.3206,-3.5625,0.7513;6.7778,-2.712,-1.1923;7.4195,-3.7738,-1.751;7.5603,-4.9613,-0.9836;8.094,-5.6667,-1.6228;6.5841,-5.3833,-0.7128;8.1449,-4.7843,-0.0717)|\",4.81641515385\r\n\"[H]OC(NNC([H])C([H])([H])[H])C1=C([H])C(C([H])=O)=C([H])C([H])=C1[H] |(4.9066,-0.6766,3.2844;4.9455,-0.4791,2.3353;4.5876,-1.5982,1.6436;4.8996,-1.5807,0.3981;4.4267,-2.6811,-0.3308;4.7932,-2.633,-1.5577;5.404,-1.7972,-1.9167;4.3949,-3.7017,-2.5233;5.281,-4.1731,-2.9691;3.8147,-3.2758,-3.3528;3.7937,-4.4667,-2.0255;3.8604,-2.6336,2.4308;4.0996,-4.0048,2.2614;4.8158,-4.3358,1.5183;3.4194,-4.9421,3.0425;3.6951,-6.387,2.8556;4.4522,-6.6172,2.0741;3.158,-7.2771,3.4856;2.4804,-4.5219,3.998;1.9637,-5.2745,4.5855;2.2321,-3.1654,4.1679;1.5013,-2.8286,4.8976;2.9215,-2.2273,3.3938;2.7053,-1.169,3.5169)|\",4.27762972986\r\n\"[H]OC1=NC([H])=C([H])C2=NN[C@@]([H])(C3=C([H])OC([H])=C3[H])C12 |(2.2054,-2.7921,0.509;1.2628,-2.6228,0.3257;1.1389,-1.2767,0.2454;2.2385,-0.5474,0.4308;2.1481,0.7932,0.3823;3.0806,1.33,0.5403;0.9614,1.4876,0.1562;0.9169,2.5701,0.1346;-0.164,0.6892,-0.0355;-1.513,1.1147,-0.2679;-2.2678,0.1179,-0.3841;-1.5107,-1.1725,-0.2408;-1.9198,-1.6489,0.6649;-1.774,-2.0463,-1.4415;-2.6908,-1.8149,-2.4241;-3.3991,-1.0212,-2.5986;-2.6736,-2.8241,-3.3386;-1.7271,-3.7154,-2.9268;-1.592,-4.5803,-3.5581;-1.1427,-3.2968,-1.7732;-0.356,-3.7897,-1.2215;-0.1056,-0.6984,-0.0099)|\",4.008237017865\r\n\"[H]C1=C(\\[H])C2=NN([H])C(C([H])([H])[H])=C2/C(N([H])[C@@]([H])(C([H])([H])[H])C2([H])C([H])([H])C2([H])[H])=N\\1 
|(7.6196,1.1781,2.3634;6.9927,0.2916,2.2886;7.5587,-0.954,2.3789;8.6237,-1.0967,2.522;6.6641,-2.0547,2.2793;6.9307,-3.3704,2.3519;5.7077,-3.9363,2.2172;5.6442,-4.9446,2.2499;4.6721,-3.0775,2.0663;3.2594,-3.547,1.9155;2.592,-3.0165,2.6042;3.1723,-4.6165,2.1355;2.8815,-3.3928,0.896;5.2643,-1.808,2.0957;4.8116,-0.4476,1.9997;3.4734,-0.1678,1.8496;2.8875,-0.9225,1.5222;2.966,1.1882,1.6095;3.5187,1.8365,2.2968;3.2346,1.665,0.1737;2.8753,2.6915,0.0374;2.7169,1.0265,-0.5537;4.3069,1.6396,-0.0368;1.4892,1.239,1.9557;0.8485,0.6858,1.2664;0.8936,2.503,2.5272;1.5627,3.3407,2.7086;-0.1034,2.7947,2.2085;1.0442,1.2779,3.396;0.1485,0.7312,3.6787;1.819,1.2853,4.1582;5.6643,0.5598,2.0952)|\",4.503484225775\r\n\"[H]OC([H])([H])C([H])([H])[C@@]([H])(O[H])C([H])([H])OC([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H] |(7.9389,2.7412,4.4919;8.8602,2.8549,4.2136;8.8755,2.9826,2.7949;8.4084,2.1063,2.318;8.3074,3.871,2.4702;10.3334,3.1184,2.3668;10.8859,2.2124,2.6455;10.7686,3.9389,2.9533;10.5503,3.4233,0.8757;9.8794,4.2416,0.5653;11.91,3.7807,0.6294;12.0848,4.6132,1.0946;10.2584,2.2291,-0.0257;9.3335,1.7273,0.2991;11.0866,1.5087,0.074;10.1478,2.6855,-1.3603;10.2683,1.6729,-2.3495;10.4278,2.2119,-3.2908;11.1683,1.0664,-2.1549;9.0388,0.7658,-2.474;9.2403,0.0331,-3.2691;8.9116,0.1815,-1.5514;7.7469,1.5325,-2.784;7.5945,2.2883,-2.0051;7.8758,2.0878,-3.7254;6.5198,0.6177,-2.8942;6.7466,-0.1864,-3.6099;6.3549,0.1191,-1.9268;5.2074,1.2933,-3.3331;5.3518,1.7634,-4.316;4.4692,0.4965,-3.4941;4.6116,2.3163,-2.3383;3.5151,2.2816,-2.4077;4.8524,1.9925,-1.3154;5.0442,3.7844,-2.5172;4.7054,4.3549,-1.6399;6.139,3.8629,-2.5221;4.485,4.4608,-3.7772;4.8213,3.9222,-4.6733;3.388,4.3832,-3.77;4.893,5.9333,-3.8941;4.4835,6.3924,-4.8012;5.9842,6.0403,-3.9301;4.5334,6.5136,-3.0352)|\",8.574307429255\r\n\"[H]C1=C([H])C(Cl)=C([H])C([C@@]([H])(C(=O)ON([H])[H])C([H])([H])C([H])([H])[H])=C1[H] 
|(0.8432,0.4983,-5.2349;1.5099,0.8718,-4.4626;2.0107,2.1695,-4.5556;1.7473,2.816,-5.3857;2.8674,2.6259,-3.554;3.5144,4.2598,-3.6565;3.225,1.8219,-2.4749;3.9079,2.1967,-1.7204;2.7153,0.5186,-2.3847;3.0698,-0.3745,-1.1975;2.6772,-1.3767,-1.3985;4.582,-0.4954,-1.078;5.3245,0.3417,-0.603;5.0125,-1.6785,-1.5704;6.4448,-1.8951,-1.4515;6.7782,-1.0249,-1.0185;6.7441,-1.8554,-2.4289;2.4856,0.1343,0.1446;2.852,-0.5185,0.9472;2.8919,1.1307,0.349;0.9557,0.1641,0.1645;0.5925,0.51,1.1381;0.5347,-0.8334,-0.012;0.5557,0.8367,-0.6013;1.8595,0.0517,-3.3889;1.4632,-0.9585,-3.3295)|\",6.0953502512\r\n\"[H]C1=C([H])C(C([H])([H])[H])=C([C@@]([H])(C(=O)ON([H])[H])C([H])([H])C([H])([H])[H])C([H])=C1[H] |(5.1693,6.2433,0.1417;4.6955,5.2711,0.0346;3.8884,5.0069,-1.071;3.7417,5.7782,-1.8235;3.2611,3.7664,-1.2467;2.4182,3.5293,-2.4815;2.461,4.395,-3.1493;1.3621,3.3611,-2.232;2.7616,2.6526,-3.0416;3.4484,2.7701,-0.263;2.7945,1.3899,-0.4104;2.0089,1.4584,-1.1668;3.8193,0.444,-1.027;3.9375,0.218,-2.2155;4.6194,-0.1119,-0.086;5.6232,-1.0234,-0.6079;6.4821,-0.4792,-0.4911;5.4315,-1.0241,-1.6175;2.1529,0.8287,0.8795;1.8036,-0.19,0.6689;2.9061,0.7293,1.6669;0.9804,1.6812,1.3735;0.5419,1.2474,2.2789;0.1874,1.7438,0.6181;1.2992,2.7019,1.6096;4.2754,3.0388,0.8355;4.4443,2.2655,1.5789;4.8941,4.2785,0.9923;5.5283,4.4625,1.8553)|\",6.070860004655\r\n\"[H]C1=C([H])C([C@@]([H])(C(=O)ON([H])[H])C([H])([H])C([H])([H])[H])=C([H])C([H])=C1C([H])([H])[H] 
|(0.9606,-2.0062,4.933;1.6158,-2.0938,4.0692;1.9,-0.9592,3.307;1.4632,-0.0033,3.5871;2.7323,-1.0362,2.1861;3.0114,0.2063,1.3434;2.5857,1.0725,1.8612;4.5118,0.4325,1.2436;5.275,-0.1668,0.5109;4.9109,1.4129,2.0874;6.3306,1.7169,2.0285;6.6794,1.0625,1.3171;6.6603,1.354,2.9266;2.4083,0.1285,-0.0818;2.722,1.0211,-0.6387;2.8483,-0.7291,-0.6021;0.8811,0.0313,-0.0857;0.5019,-0.0066,-1.1127;0.4248,0.8998,0.4053;0.5346,-0.8674,0.4349;3.2808,-2.2836,1.8522;3.9437,-2.3655,0.9957;2.9965,-3.4119,2.6173;3.4334,-4.3683,2.3375;2.1593,-3.3399,3.7404;1.8745,-4.5667,4.5751;1.0374,-4.399,5.2602;2.7454,-4.8483,5.1816;1.6276,-5.4309,3.9474)|\",6.122561636250001\r\n\"[H]C1=C([H])C([H])=C([C@]([H])(C(=O)OC([H])([H])C([H])([H])[H])N([H])[H])C(C#N)=C1[H] |(7.9105,-4.966,-0.8581;7.4528,-4.0127,-0.6114;6.1425,-3.7323,-1.0094;5.5721,-4.4724,-1.5638;5.5605,-2.5059,-0.6955;4.5387,-2.298,-1.0002;6.2621,-1.5319,0.0251;5.6132,-0.1717,0.3144;6.291,0.4111,0.9438;4.322,-0.406,1.1012;3.2382,-0.5645,0.5763;4.5546,-0.4551,2.4202;3.417,-0.7388,3.2777;3.6891,-0.2884,4.2346;2.5393,-0.2307,2.8718;3.1856,-2.2361,3.4073;2.3739,-2.4246,4.119;4.0872,-2.7389,3.7718;2.9039,-2.6689,2.4432;5.2706,0.6163,-0.8681;4.4426,0.2197,-1.3086;6.0323,0.5914,-1.542;7.582,-1.8273,0.4294;8.3601,-0.8766,1.1727;9.0105,-0.119,1.7705;8.1713,-3.0637,0.1059;9.1902,-3.2624,0.4228)|\",5.091250142854999\r\n\"[H]C1=C([H])C([C@@]([H])(C(=O)ON([H])[H])C([H])([H])C([H])([H])[H])=C([H])C([H])=C1C#N 
|(0.9242,-0.7237,5.3277;1.5578,-1.0477,4.5085;1.8941,-0.167,3.4847;1.5188,0.8524,3.5146;2.7067,-0.5747,2.419;3.0398,0.3978,1.291;2.669,1.3882,1.5753;4.5506,0.5037,1.136;5.264,-0.3087,0.5801;5.017,1.625,1.7269;6.4654,1.7429,1.7346;6.6123,2.5541,1.1294;6.7719,0.9245,1.1944;2.4082,-0.0074,-0.065;2.7605,0.6961,-0.8303;2.793,-0.9918,-0.3523;0.8781,-0.016,-0.0412;0.4824,-0.2803,-1.0276;0.4765,0.9697,0.2241;0.4895,-0.7414,0.681;3.1916,-1.8917,2.4043;3.844,-2.2133,1.5988;2.8659,-2.7807,3.4225;3.2472,-3.7966,3.4069;2.0419,-2.3649,4.4822;1.6994,-3.2795,5.5321;1.4191,-4.0215,6.3832)|\",5.589218489270001\r\n\"[H]C1=C([H])C([C@@]([H])(C(=O)ON([H])[H])C([H])([H])C([H])([H])[H])=C([H])C([H])=C1Cl |(0.6567,3.8327,3.9074;1.5577,3.279,3.666;1.6959,2.6454,2.4308;0.8845,2.7107,1.7101;2.854,1.9291,2.1084;2.9568,1.2022,0.7699;2.1274,1.5359,0.137;4.249,1.5917,0.0662;5.3522,1.1385,0.306;4.0255,2.5504,-0.8592;5.2278,3.0746,-1.4845;5.1177,2.7551,-2.4497;5.9747,2.4984,-1.0772;2.8781,-0.3333,0.9382;3.7369,-0.6637,1.5328;1.9795,-0.5552,1.5258;2.8376,-1.0955,-0.3899;2.7448,-2.1725,-0.2134;3.753,-0.9347,-0.9701;1.9844,-0.7844,-1.0052;3.8884,1.8678,3.0526;4.8036,1.3349,2.8137;3.7679,2.4984,4.2899;4.5733,2.4508,5.0151;2.5993,3.1982,4.5875;2.4388,3.9923,6.1493)|\",6.005552680535\r\n\"[H]C1=C([H])C([C@@]([H])(C(=O)ON([H])[H])C([H])([H])C([H])([H])[H])=C([H])C([H])=C1F 
|(2.5705,-4.4994,-3.2447;2.3063,-4.1913,-2.2383;2.8884,-3.0739,-1.6414;3.6324,-2.5027,-2.1882;2.5397,-2.6884,-0.3383;3.1378,-1.4408,0.3076;2.8576,-1.4374,1.3664;4.657,-1.5064,0.2438;5.3398,-1.2278,-0.7235;5.1732,-1.9319,1.4183;6.6245,-1.9982,1.4503;6.8926,-1.7159,0.4994;6.7864,-3.007,1.5012;2.6262,-0.1383,-0.3514;2.9606,-0.1196,-1.3946;1.5308,-0.1861,-0.369;3.0852,1.1337,0.368;2.6615,2.0226,-0.1118;4.1759,1.2347,0.3455;2.7664,1.1374,1.4175;1.5938,-3.4526,0.3564;1.3139,-3.1709,1.3684;0.9988,-4.5717,-0.2269;0.2657,-5.1691,0.3051;1.3671,-4.9222,-1.5196;0.8001,-6.0048,-2.0928)|\",6.163378713825\r\n\"[H]C1=C([H])C(F)=C([C@@]([H])(C(=O)ON([H])[H])C([H])([H])C([H])([H])[H])C([H])=C1[H] |(1.6299,0.9244,6.1287;1.9117,0.9434,5.0798;1.3079,1.859,4.2192;0.5552,2.5619,4.561;1.6822,1.8662,2.8811;1.0772,2.756,2.0565;2.6396,0.9969,2.3535;2.9849,1.0413,0.8691;2.5725,1.9636,0.4537;4.4944,1.0971,0.6961;5.2573,0.1535,0.7817;4.9021,2.356,0.4199;6.3313,2.4881,0.1924;6.6809,1.5344,0.347;6.6231,3.0323,1.0079;2.4167,-0.1649,0.0791;2.7477,-0.0741,-0.9636;2.8685,-1.084,0.4681;0.8897,-0.255,0.1271;0.5372,-1.0928,-0.4842;0.4228,0.6599,-0.2561;0.5283,-0.4114,1.149;3.2296,0.0858,3.2413;3.9863,-0.5946,2.8625;2.8728,0.0563,4.5898;3.3469,-0.6573,5.2572)|\",6.171542129340001\r\n\"[H]O[C@@]([H])(C(=O)OC([H])([H])C([H])([H])[H])C1=C([H])C([H])=C([H])C([H])=C1C#N |(5.2292,-0.2333,1.5985;5.9342,0.4076,1.7909;6.0344,1.2875,0.696;7.0888,1.5716,0.6055;5.6667,0.6508,-0.6582;5.9169,1.1644,-1.7248;5.0229,-0.5165,-0.4853;4.5264,-1.1891,-1.6756;5.2485,-1.038,-2.4811;4.5043,-2.2442,-1.3953;3.1464,-0.6726,-2.0474;2.7495,-1.2532,-2.8881;3.1964,0.3773,-2.3507;2.458,-0.7661,-1.2021;5.2307,2.5781,0.8757;5.8174,3.8041,0.559;6.8377,3.8175,0.1864;5.1204,5.0032,0.7133;5.6041,5.9431,0.4632;3.8118,4.9954,1.1972;3.2669,5.9255,1.3274;3.2027,3.7852,1.5147;2.1834,3.759,1.8866;3.9023,2.5771,1.3533;3.2125,1.3553,1.6572;2.6138,0.3822,1.8793)|\",5.692621752460001\r\n\"[H]O[C@]1([H])N2C(=NC1([H])[H])O[C@]([H])(N(=O)=O)C([H])=C2[H] 
|(1.8931,0.5986,2.0427;2.1242,1.307,1.4178;1.8194,0.8639,0.117;2.2227,1.6352,-0.5419;0.3787,0.6981,-0.1047;0.1242,-0.6723,-0.2036;1.1244,-1.4554,-0.2523;2.3031,-0.5777,-0.2328;3.0369,-0.9178,0.507;2.7928,-0.5995,-1.2136;-1.1601,-1.0976,-0.1502;-2.197,-0.1735,-0.3735;-3.1048,-0.6056,0.0454;-2.4925,-0.1307,-1.9063;-1.5659,-0.3489,-2.6717;-3.6401,0.1778,-2.2044;-1.9122,1.2177,0.084;-2.7382,1.8844,0.2892;-0.6266,1.597,0.1821;-0.3086,2.5832,0.5017)|\",4.394638685575\r\n\"[H]C#CC([H])([H])[C@]([H])(/N=C(\\O[H])OC([H])(C([H])([H])[H])C([H])([H])[H])C([H])([H])C(=O)O[H] |(3.6094,0.1383,4.2935;3.5577,0.307,3.2419;3.4989,0.4978,2.0508;3.4413,0.7194,0.6071;4.3423,1.254,0.2857;2.5853,1.3613,0.357;3.3222,-0.5909,-0.2216;4.1999,-1.213,-0.0003;3.273,-0.2807,-1.6581;4.3368,0.0769,-2.2592;5.5349,0.2257,-1.6551;6.1806,0.4949,-2.3318;4.4289,0.3655,-3.5644;3.2549,0.1833,-4.4237;2.3759,0.3392,-3.7929;3.3476,1.2643,-5.4902;2.4867,1.1931,-6.1629;4.2591,1.147,-6.0864;3.3508,2.2614,-5.0393;3.269,-1.2337,-4.9851;2.4011,-1.384,-5.6361;3.224,-1.9827,-4.1893;4.1758,-1.4026,-5.5764;2.0476,-1.3639,0.1724;1.1813,-0.7055,0.0121;2.0844,-1.6145,1.2333;1.7881,-2.6593,-0.599;1.5212,-3.7113,-0.0691;1.8236,-2.5361,-1.9455;2.1614,-1.6288,-2.1498)|\",7.074960113\r\n\"[H]O/C1=C(\\[H])[C@]2([H])C(NC([H])=NC2=O)S1 |(1.0933,0.4232,0.9602;1.437,-0.3277,0.4474;2.6654,-0.0152,-0.0168;3.423,1.0776,0.149;3.1373,1.9422,0.7371;4.7065,1.0224,-0.6141;4.6545,1.6825,-1.5044;4.8741,-0.3773,-1.159;5.9834,-0.8455,-1.6123;7.0785,0.0076,-1.4238;7.9699,-0.3018,-1.9684;7.1806,1.0384,-0.6466;6.0244,1.4499,0.0492;6.0638,2.1517,1.034;3.3976,-1.3212,-1.0119)|\",3.801430491485\r\n\"[H]OC([H])([H])[C@@]([H])(C1=C([H])C([H])=C(OC([H])([H])[H])C([H])=C1[H])C([H])([H])N([H])[H] 
|(3.1863,1.8305,-4.756;3.5552,2.4291,-5.4255;4.7941,2.9039,-4.9143;5.5346,2.0969,-4.8284;5.1717,3.6097,-5.6622;4.6376,3.6331,-3.5667;3.9493,4.4704,-3.746;4.0005,2.7517,-2.4994;2.9287,3.2222,-1.7329;2.54,4.2207,-1.9216;2.3326,2.4511,-0.7313;1.5033,2.8617,-0.1663;2.8096,1.1602,-0.4791;2.3081,0.3139,0.4678;1.2199,0.764,1.2585;0.9815,-0.058,1.9361;0.3396,0.9924,0.6429;1.4872,1.652,1.8468;3.8781,0.6636,-1.2392;4.235,-0.3417,-1.0366;4.4598,1.4472,-2.2295;5.2896,1.0412,-2.8011;5.9727,4.2605,-3.1021;5.759,4.9438,-2.2614;6.3604,4.8801,-3.9212;6.9919,3.249,-2.7944;6.7665,2.7999,-1.9082;7.8929,3.7038,-2.6598)|\",5.733438830035\r\n\"[H]OC(=O)[C@@]([H])(C1=C([H])C([H])=C(N(=O)=O)C([H])=C1[H])C([H])([H])N([H])[H] |(3.0723,4.099,-0.2089;3.3632,3.5579,-1.0005;2.5562,2.4975,-1.1323;2.7541,1.6415,-1.963;1.3315,2.4799,-0.1779;0.6296,3.2247,-0.5836;0.6252,1.1392,-0.1462;1.3358,-0.0538,0.0596;2.419,-0.0356,0.1112;0.6718,-1.2712,0.1633;1.2086,-2.1991,0.317;-0.7179,-1.2898,0.057;-1.4299,-2.5722,0.1703;-2.6568,-2.5549,0.0693;-0.7583,-3.5853,0.3635;-1.4537,-0.1275,-0.155;-2.5318,-0.1832,-0.2405;-0.7708,1.0818,-0.2538;-1.3354,1.9955,-0.4223;1.6873,2.9358,1.2632;2.4938,2.3071,1.6557;0.8141,2.7855,1.912;2.165,4.3325,1.2585;2.6404,4.5617,2.1286;1.3842,4.9801,1.1625)|\",4.759271245245\r\n\"[H]OC([H])([H])[C@@]([H])(C1=C([H])C([H])=C(F)C([H])=C1[H])C([H])([H])N([H])[H] 
|(-1.9463,3.2843,-0.6352;-1.3635,3.5795,0.0834;-0.0945,3.8419,-0.4924;-0.1322,4.7216,-1.1598;0.5605,4.0889,0.3485;0.4803,2.6321,-1.2557;0.5634,1.815,-0.5295;-0.457,2.1881,-2.3695;-0.7154,3.0014,-3.4865;-0.2404,3.9752,-3.5688;-1.575,2.5889,-4.5044;-1.7743,3.2132,-5.3694;-2.1897,1.3466,-4.3972;-3.0232,0.9406,-5.3785;-1.972,0.5151,-3.3063;-2.4738,-0.4456,-3.2536;-1.1065,0.9463,-2.2998;-0.9284,0.3021,-1.4426;1.9051,2.9092,-1.7817;1.8719,3.7275,-2.5238;2.2459,2.0184,-2.321;2.845,3.1506,-0.6802;2.7565,4.1064,-0.3407;3.8002,3.0663,-1.0222)|\",6.010994957545\r\n\"[H]N([H])C([H])([H])C([H])([H])N([H])[C@]([H])(C([H])([H])[H])C([H])([H])SC([H])([H])C([H])([H])[H] |(9.3537,-3.3631,1.8391;9.4697,-2.4088,1.4989;10.4737,-2.2378,1.4933;8.8154,-1.4775,2.4261;9.1055,-1.5975,3.4837;9.1054,-0.4597,2.1319;7.2943,-1.609,2.3228;7.0104,-2.6316,2.6039;7.0067,-1.4734,1.267;6.6476,-0.6781,3.249;6.8893,0.2714,2.9686;5.1862,-0.7638,3.3746;4.9037,0.0728,4.0273;4.7686,-2.0594,4.0852;3.7072,-2.0265,4.3543;5.3569,-2.1856,4.9992;4.9214,-2.9429,3.4547;4.392,-0.606,2.0559;3.3174,-0.681,2.2606;4.6604,-1.4051,1.3568;4.7339,1.03,1.2875;3.7043,0.9303,-0.2325;2.6837,0.6548,0.0573;3.6638,1.9612,-0.6004;4.2428,-0.0021,-1.3172;3.5877,0.0231,-2.1975;4.2919,-1.0399,-0.9714;5.2484,0.298,-1.6275)|\",6.865432448115\r\n\"[H]O[C@]12N=C(C([H])([H])[H])S[C@]([H])(C([H])([H])C([H])([H])C1([H])[H])C2([H])[H] 
|(5.5461,3.1551,0.0173;4.5837,3.0374,-0.0267;4.3099,1.6941,0.3707;2.8613,1.6213,0.3613;2.2128,0.5622,0.0961;0.7029,0.582,0.0882;0.3624,1.598,0.2982;0.2993,-0.1014,0.8444;0.3157,0.2612,-0.8859;2.8705,-1.0712,-0.3077;4.6891,-0.7388,-0.1808;5.1368,-1.4382,-0.8939;5.2325,-1.0043,1.2344;5.0098,-2.033,1.5387;6.3291,-0.9169,1.1868;4.6776,-0.0089,2.2651;5.1834,-0.1508,3.2278;3.6169,-0.2239,2.4406;4.835,1.4485,1.8048;5.9011,1.7239,1.8162;4.3169,2.137,2.4802;4.9668,0.7065,-0.6089;4.605,0.9048,-1.6226;6.0541,0.8691,-0.6037)|\",6.43277142582\r\n\"[H]O[C@@]1(C([H])([H])[H])N=C(C([H])([H])[H])S[C@]([H])(C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])C1([H])[H] |(8.8598,-4.0284,-4.0899;9.5242,-3.9733,-3.3845;8.8402,-4.2031,-2.147;8.1501,-5.576,-2.2019;7.7218,-5.8563,-1.2355;8.8949,-6.3256,-2.4824;7.347,-5.579,-2.9498;9.9026,-4.2639,-1.1662;9.705,-3.994,0.0576;10.8299,-4.1315,1.0557;11.7175,-4.4985,0.5362;10.561,-4.8353,1.8525;11.0518,-3.1681,1.528;8.1773,-3.4478,0.8484;7.1041,-2.9546,-0.5916;6.8787,-1.9001,-0.3961;5.7674,-3.7162,-0.5863;5.2242,-3.4224,-1.4978;5.941,-4.7953,-0.66;4.8825,-3.4209,0.6325;5.4202,-3.6897,1.552;4.6949,-2.3382,0.6898;3.5432,-4.1679,0.5971;3.001,-3.9038,-0.3235;3.7323,-5.2501,0.5384;2.6499,-3.8769,1.8101;3.1922,-4.1399,2.7294;2.4602,-2.7957,1.8684;1.3156,-4.6284,1.7719;0.7024,-4.3994,2.6507;1.4722,-5.714,1.749;0.7341,-4.3603,0.8811;7.8896,-3.0076,-1.9133;8.5174,-2.1146,-1.9974;7.1644,-2.9667,-2.7377)|\",6.430050287315001\r\n\"[H]OC(NNC([H])C(=O)OC([H])(C([H])([H])[H])C([H])([H])[H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] 
|(5.7745,5.1601,2.3279;6.035,4.2372,2.1953;5.0873,3.5895,1.484;5.3882,2.3484,1.2771;4.4387,1.6476,0.5524;4.7664,0.4235,0.3561;5.6995,-0.0042,0.7271;3.8838,-0.5052,-0.3984;4.1913,-1.6689,-0.5821;2.7495,0.0707,-0.8409;1.8342,-0.7745,-1.5986;2.4416,-1.4577,-2.1994;0.9642,-1.5757,-0.6349;0.2579,-2.1993,-1.1948;0.3918,-0.9048,0.0156;1.5811,-2.2311,-0.0144;1.0422,0.1692,-2.4924;0.3474,-0.4013,-3.1186;1.7104,0.7379,-3.1466;0.462,0.878,-1.8913;3.8395,4.3882,1.0403;2.572,3.7493,1.657;1.6916,4.333,1.3618;2.6227,3.7477,2.7523;2.4464,2.7234,1.3097;3.9166,5.8577,1.5129;3.0206,6.3888,1.1768;4.7751,6.3939,1.0857;3.9399,5.948,2.6075;3.7442,4.3827,-0.5043;2.885,4.9883,-0.8166;3.6153,3.3675,-0.8809;4.6437,4.8148,-0.9585)|\",4.5715126884\r\n\"[H]O/C(=N/NC([H])C([H])([H])C([H])([H])C1=C([H])C([H])=C([H])C([H])=C1[H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] |(4.8599,0.2688,-2.2539;4.2832,1.0045,-2.0055;2.9826,0.6318,-2.1396;2.1631,1.5756,-1.8372;0.8041,1.2542,-1.9481;0.0499,2.244,-1.6442;0.4956,3.1975,-1.3363;-1.4427,2.1245,-1.6754;-1.8559,2.8956,-2.3414;-1.7186,1.1494,-2.09;-2.0735,2.303,-0.2695;-1.7448,3.262,0.1508;-1.6813,1.5197,0.3905;-3.5859,2.248,-0.3001;-4.3466,3.4191,-0.4142;-3.8411,4.3821,-0.4505;-5.74,3.3674,-0.4765;-6.3113,4.2883,-0.5605;-6.3978,2.1374,-0.4246;-7.4828,2.0948,-0.4689;-5.6523,0.9625,-0.3091;-6.1556,0.0002,-0.2621;-4.2597,1.02,-0.247;-3.6857,0.1004,-0.1521;2.6934,-0.812,-2.6154;1.8821,-1.5566,-1.5293;1.6916,-2.5868,-1.8542;0.9284,-1.0587,-1.3519;2.4354,-1.5979,-0.5835;3.9982,-1.6031,-2.863;3.7466,-2.6161,-3.1935;4.603,-1.713,-1.9526;4.6153,-1.1596,-3.6561;1.9014,-0.767,-3.9429;1.6996,-1.7893,-4.2854;2.4738,-0.2557,-4.7261;0.9534,-0.2452,-3.8096)|\",5.44771928701\r\n\"[H]C([H])=C([H])C([H])([H])C([H])([H])C([H])([H])[C@@]([H])(C([H])([H])[H])C([H])([H])C(=O)N(OC([H])([H])[H])C([H])([H])[H] 
|(1.9165,-4.3487,-0.8026;2.0713,-3.4437,-0.2209;1.6992,-3.4543,0.8019;2.6768,-2.3716,-0.7343;3.0323,-2.4109,-1.7664;2.9272,-1.0751,-0.012;2.5019,-1.1276,0.9994;2.3926,-0.2684,-0.5356;4.4201,-0.6881,0.0699;4.4889,0.3372,0.453;4.8418,-0.6646,-0.9461;5.2509,-1.6363,0.9444;5.0798,-2.6662,0.602;4.8723,-1.5955,1.978;6.7675,-1.3566,0.9449;7.1287,-1.4098,-0.0908;7.1114,0.038,1.4918;6.664,0.8313,0.884;8.1935,0.1942,1.4924;6.7422,0.1567,2.52;7.4946,-2.4525,1.7437;7.0754,-3.4398,1.5031;7.3321,-2.323,2.8208;8.9941,-2.5056,1.4731;9.5615,-1.815,0.6361;9.7115,-3.4565,2.1885;9.0901,-3.9594,3.3523;8.821,-5.3548,3.2042;8.3761,-5.6629,4.1538;9.7421,-5.9245,3.0297;8.1176,-5.542,2.3844;11.1559,-3.3554,2.3264;11.5392,-2.896,1.4156;11.5876,-4.3536,2.4447;11.4266,-2.7388,3.1927)|\",7.055912143465\r\n\"[H]C([H])=C([H])C([H])([H])C([H])([H])C([H])([H])[C@@]([H])(C([H])([H])[H])C([H])([H])C(=O)OC([H])([H])C([H])([H])C([H])([H])C([H])([H])[H] |(2.0243,3.07,-5.5591;2.319,2.5099,-4.6755;2.1447,1.4356,-4.7016;2.8607,3.1049,-3.6119;3.0182,4.185,-3.632;3.3055,2.4055,-2.3565;3.0459,1.3398,-2.4148;2.7549,2.8168,-1.4962;4.8153,2.5613,-2.0869;5.061,3.6301,-2.0545;5.3692,2.1326,-2.9322;5.2404,1.8969,-0.7685;4.8984,0.8518,-0.7642;4.707,2.3906,0.0558;6.7548,1.8975,-0.4568;6.8672,1.4877,0.5561;7.5528,0.9974,-1.4123;7.1551,-0.0243,-1.4082;8.6075,0.9469,-1.117;7.5163,1.3655,-2.4442;7.3729,3.319,-0.4344;7.3397,3.7702,-1.4299;8.4226,3.2414,-0.1286;6.6583,4.2596,0.5124;5.9013,5.1481,0.1783;6.9455,3.9594,1.8021;6.3024,4.7651,2.8181;5.2896,5.01,2.4872;6.2503,4.1115,3.6937;7.1022,6.0284,3.1174;8.1331,5.7451,3.3688;7.1473,6.6392,2.2076;6.4901,6.8469,4.2626;5.4539,7.11,4.0086;6.4362,6.2264,5.1687;7.28,8.124,4.5662;6.8224,8.6891,5.3858;7.3212,8.7808,3.6891;8.3122,7.8921,4.8556)|\",7.04774872795\r\n\"[H]C([H])=C([H])C([H])([H])C([H])([H])C([H])([H])/C(=C(\\[H])C(=O)OC([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])C([H])([H])[H] 
|(3.7437,6.3414,0.0968;3.5604,5.3409,-0.2862;2.6489,5.2033,-0.8653;4.4089,4.3356,-0.0657;5.3122,4.5296,0.5137;4.2185,2.9243,-0.5552;5.1114,2.6106,-1.1153;3.3788,2.8973,-1.2611;3.9465,1.8989,0.5674;3.7211,0.928,0.1084;3.0434,2.2007,1.1126;5.0898,1.7289,1.5943;4.7545,1.0027,2.3494;5.2529,2.6754,2.1217;6.3967,1.2364,1.0073;7.489,2.0264,1.0741;7.409,2.9991,1.5516;8.8407,1.7188,0.562;9.2105,0.7166,-0.026;9.6734,2.7572,0.8454;11.039,2.6333,0.3946;11.6129,3.2614,1.0823;11.3579,1.593,0.5035;11.1991,3.1058,-1.0474;10.5876,2.4671,-1.6961;10.8036,4.1269,-1.1322;12.6614,3.0713,-1.5121;13.2693,3.7125,-0.8579;13.0567,2.0525,-1.396;12.8324,3.5208,-2.9668;13.8843,3.4901,-3.2721;12.4731,4.5474,-3.1087;12.267,2.8747,-3.6489;6.3695,-0.1375,0.3896;5.8882,-0.8469,1.0758;7.3679,-0.4892,0.1364;5.7644,-0.1338,-0.5272)|\",5.8831014478100006\r\n\"[H]C([H])=C([H])/C([H])=C(\\[H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])O[Si](C([H])([H])[H])(C([H])([H])[H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] |(3.8126,3.1208,-4.7949;4.7312,2.9136,-4.2545;5.4154,2.2005,-4.7093;5.011,3.509,-3.0856;4.2959,4.2183,-2.6666;6.2204,3.2827,-2.3055;6.9428,2.5825,-2.7194;6.4668,3.9048,-1.1408;5.719,4.6169,-0.7809;7.654,3.7587,-0.2111;8.333,5.1362,-0.0499;7.6163,5.8842,0.3099;8.7397,5.4791,-1.0047;9.1546,5.0835,0.676;7.1259,3.2828,1.1639;6.3664,3.9736,1.5469;7.9366,3.2354,1.9018;6.6673,2.2898,1.0916;8.6881,2.7203,-0.6844;8.1915,1.7481,-0.8257;9.4296,2.5978,0.1199;9.3274,3.1222,-1.8867;10.6419,2.318,-2.5787;12.1059,2.3218,-1.3776;12.9773,1.8204,-1.815;11.8581,1.7909,-0.4506;12.4048,3.3407,-1.1073;10.1709,0.5222,-2.9549;10.9982,-0.0157,-3.4332;9.3032,0.4612,-3.6213;9.9219,-0.0196,-2.0343;11.0124,3.3072,-4.1738;9.7656,3.3063,-5.0841;9.9593,3.8822,-6.001;8.9021,3.7571,-4.5832;9.4841,2.2919,-5.393;11.3703,4.7659,-3.8181;11.5644,5.3447,-4.7329;12.2718,4.828,-3.1967;10.5544,5.2592,-3.2783;12.1964,2.667,-4.9317;12.4065,3.2328,-5.8507;11.9863,1.6321,-5.2296;13.1167,2.6643,-4.3345)|\",5.488536364585\r\n\"[H]C([H])=C([
H])[C@]([H])(OC([H])([H])OC([H])([H])[H])[C@]([H])(OC([H])([H])OC([H])([H])[H])C([H])=C([H])[H] |(0.5846,1.312,3.0017;1.0271,1.0167,2.0542;2.1065,1.1308,1.9682;0.29,0.5371,1.0521;-0.789,0.4352,1.1593;0.8469,0.1076,-0.2825;1.9317,0.3075,-0.2994;0.2123,0.8004,-1.3524;0.666,2.1161,-1.6184;0.0099,2.8551,-1.1438;1.6962,2.2327,-1.2315;0.6043,2.3687,-2.9825;1.5067,1.5742,-3.7449;1.3338,1.8255,-4.7938;1.3307,0.5056,-3.5833;2.5518,1.8101,-3.4868;0.6487,-1.402,-0.5526;1.0464,-1.9605,0.3029;1.376,-1.7611,-1.7446;2.6442,-2.3258,-1.5096;3.1139,-2.3765,-2.5023;3.2487,-1.7019,-0.8374;2.6064,-3.585,-0.8978;1.9781,-4.5873,-1.6889;2.4753,-4.6915,-2.6657;0.9177,-4.3653,-1.8527;2.0748,-5.5252,-1.137;-0.7922,-1.7755,-0.7738;-1.249,-1.3398,-1.6591;-1.4913,-2.568,0.0386;-2.5343,-2.8049,-0.1546;-1.05,-3.0106,0.9296)|\",7.069517835990001\r\n\"[H]C([H])([H])C(=O)N1C([H])([H])C([H])([H])C([H])([H])[C@]2([H])C(=O)OC(=O)[C@@]12[H] |(2.3578,0.6026,-2.5684;2.1129,0.0732,-1.6395;1.1482,0.442,-1.2784;2.0386,-0.9912,-1.8682;3.2191,0.2449,-0.613;4.0866,-0.5943,-0.454;3.1855,1.4186,0.1668;2.9416,2.6814,-0.5868;3.7448,2.8217,-1.33;2.0059,2.5705,-1.1379;2.8667,3.9408,0.2884;2.7816,4.8073,-0.3771;1.9521,3.9096,0.8949;4.0839,4.0787,1.2208;3.9826,4.9444,1.8828;5.008,4.212,0.6432;4.13,2.7847,2.0105;3.1726,2.6819,2.548;5.1996,2.5074,3.046;5.8852,3.2534,3.6824;5.2521,1.1241,3.1984;4.439,0.4948,2.2265;4.0104,-0.6019,2.3938;4.2756,1.5382,1.1208;5.2498,1.5817,0.6013)|\",6.17698440635\r\n\"[H]C1=C([H])C([H])=C(/C([H])=C(\\[H])C(=O)C2=C([H])C([H])=C([H])C(Cl)=C2[H])O1 
|(2.5601,-7.8313,-0.5389;1.5709,-7.5296,-0.2286;0.4417,-8.2205,0.1066;0.3252,-9.2948,0.1222;-0.5362,-7.2366,0.4291;-1.5578,-7.4036,0.7417;0.0631,-6.0074,0.2681;-0.4293,-4.6757,0.4438;-1.4634,-4.5892,0.7702;0.2652,-3.5325,0.2447;1.3009,-3.5779,-0.071;-0.3843,-2.2277,0.4817;-1.5348,-2.1552,0.9125;0.3812,-0.9608,0.2031;-0.1897,0.2483,0.6293;-1.1471,0.2095,1.1369;0.4613,1.4561,0.401;0.0157,2.3877,0.7382;1.69,1.4845,-0.2636;2.2061,2.4207,-0.4484;2.2483,0.282,-0.6925;3.7905,0.3011,-1.5421;1.6134,-0.9371,-0.4666;2.0802,-1.8444,-0.8298;1.3646,-6.1918,-0.1377)|\",3.823199599525\r\n\"[H]O[C@@]([H])(C([H])([H])C([H])=C([H])[H])[C@@]([H])(C([H])([H])[H])C([H])([H])[C@]([H])(C(=O)OC([H])([H])[H])C([H])([H])[H] |(6.6133,-2.6152,1.242;6.3065,-1.6984,1.3458;5.0865,-1.5824,0.6174;5.2991,-1.5931,-0.4668;4.1612,-2.7847,0.9183;3.9641,-2.8307,1.9948;3.2015,-2.6295,0.4061;4.7886,-4.0732,0.4596;4.8348,-4.2258,-0.6204;5.3375,-4.9875,1.265;5.8101,-5.8855,0.8755;5.311,-4.8826,2.3482;4.4671,-0.2086,0.9426;3.5311,-0.1578,0.3681;4.1431,-0.0486,2.4367;3.7034,0.9321,2.6461;5.0496,-0.1555,3.0411;3.4207,-0.7983,2.7769;5.4026,0.903,0.4182;5.6552,0.6816,-0.6264;6.3402,0.8732,0.9853;4.8457,2.3355,0.4698;4.5653,2.6063,1.4929;3.6047,2.4794,-0.3995;3.4074,1.9038,-1.4499;2.7389,3.3815,0.1209;1.5648,3.6321,-0.668;0.9878,2.713,-0.8014;1.8371,4.0229,-1.6522;0.9874,4.3694,-0.1091;5.8968,3.3566,-0.018;6.7997,3.2893,0.5984;5.515,4.3811,0.0448;6.1772,3.1557,-1.058)|\",7.134825160110001\r\n\"[H]C1([H])C(=O)[C@@]2([H])C([H])([H])C([H])([H])[C@@](C([H])([H])F)(C1=O)C2([H])[H] 
|(1.8731,-1.541,-0.3923;0.8982,-1.4975,0.1214;0.7151,-2.495,0.5278;-0.1424,-1.1419,-0.9447;-0.6856,-1.9978,-1.6121;-0.456,0.3439,-1.0932;-0.773,0.5212,-2.1242;-1.5678,0.7362,-0.0673;-2.1648,1.5555,-0.4795;-2.257,-0.0926,0.1217;-0.811,1.1931,1.2141;-1.1166,0.6473,2.1115;-0.9923,2.2569,1.4096;0.7041,0.9874,0.8974;1.6063,1.9305,1.6736;1.6014,1.6883,2.7403;1.2935,2.9717,1.5258;2.9131,1.8007,1.2021;1.0412,-0.4769,1.2457;1.3674,-0.8079,2.3674;0.745,1.1883,-0.6324;0.5841,2.2465,-0.8735;1.6989,0.8935,-1.0808)|\",5.651804674885\r\n\"[H]OC([H])([H])[C@@]12C(=O)C([H])=C(OC([H])([H])[H])[C@@]([H])(C([H])([H])C1([H])[H])C2([H])[H] |(3.9947,5.0162,-0.8179;4.55,5.3608,-0.0956;5.3101,4.2425,0.3248;5.8631,3.7986,-0.517;6.0473,4.6141,1.0471;4.4566,3.1578,0.9913;3.3087,2.7332,0.0557;3.3026,3.0514,-1.136;2.2499,1.9317,0.6542;1.4578,1.5605,0.0124;2.2548,1.6615,1.9852;1.2741,0.8642,2.4664;1.1416,0.6237,3.8663;0.2945,-0.0576,3.9602;0.9224,1.5457,4.417;2.0342,0.1467,4.286;3.3672,2.1807,2.8715;3.0603,2.2136,3.9193;4.6454,1.3114,2.6594;5.3093,1.434,3.5225;4.4154,0.2448,2.575;5.297,1.8929,1.3738;5.3314,1.1688,0.5543;6.3321,2.1967,1.5723;3.8229,3.553,2.3413;4.5796,3.9847,3.0093;3.0079,4.2745,2.2368)|\",5.270845284185\r\n\"[H]OC([H])([H])[C@]12C(=O)C([H])([H])C(OC([H])([H])[H])(OC([H])([H])[H])[C@]([H])(C([H])([H])C1([H])[H])C2([H])[H] 
|(3.7082,-1.4262,-4.3271;3.7771,-2.3073,-3.9189;2.7841,-2.3086,-2.9084;1.7865,-2.1062,-3.3298;2.7652,-3.3212,-2.4888;3.0713,-1.3095,-1.7826;3.1158,0.1201,-2.3211;3.0315,0.3831,-3.5113;3.3402,1.2294,-1.2969;3.0127,2.1747,-1.7351;4.424,1.2916,-1.1316;2.6575,0.9842,0.0663;1.3225,1.4388,-0.1285;0.44,1.3085,0.9819;0.9154,1.637,1.9126;0.0851,0.2765,1.1036;-0.4177,1.9503,0.766;3.2834,1.6812,1.1278;3.4171,3.09,0.9672;2.4679,3.5548,0.6776;4.1823,3.3461,0.2221;3.7307,3.4784,1.9391;2.7194,-0.5113,0.454;2.2599,-0.6252,1.4398;4.1694,-1.0592,0.4374;4.8901,-0.2958,0.7418;4.2578,-1.8792,1.1577;4.4012,-1.5846,-1.0101;4.5833,-2.6647,-1.0148;5.2659,-1.1271,-1.5007;2.0244,-1.3367,-0.6403;1.0527,-0.931,-0.9381;1.8756,-2.3727,-0.3093)|\",6.000110403525\r\n\"[H]C1=C([H])C2=C(N=C1[C@@]1([H])N([H])N([H])[C@]([H])(N([H])[H])[C@]1([H])F)C([H])=C([H])C([H])=C2[H] |(-3.3685,1.1433,-0.2975;-2.389,0.6841,-0.1991;-1.237,1.4304,-0.1919;-1.2732,2.5141,-0.2806;0.02,0.7831,-0.0697;0.0111,-0.6433,0.0428;-1.1435,-1.3735,0.0375;-2.2929,-0.7352,-0.0839;-3.5452,-1.5953,-0.0875;-3.2427,-2.6123,0.1747;-4.6094,-1.1349,0.8427;-4.1976,-0.4878,1.516;-5.4832,-0.2887,-0.0199;-6.3717,-0.2256,0.4823;-5.7061,-1.1634,-1.1682;-6.1166,-0.5763,-1.9967;-6.645,-2.2369,-0.8607;-6.2717,-2.7767,-0.0795;-6.7006,-2.8786,-1.6506;-4.2584,-1.6326,-1.4661;-3.7614,-0.9861,-2.1953;-4.2675,-2.9166,-2.0011;1.2463,-1.3311,0.1678;1.2155,-2.4129,0.2514;2.4344,-0.6341,0.1782;3.3761,-1.1681,0.2717;2.4427,0.7784,0.0673;3.3889,1.3124,0.0791;1.2594,1.4731,-0.0541;1.2595,2.5574,-0.1395)|\",4.133409389095\r\n\"[H]C1([H])OC([H])([H])[C@]([H])(C([H])([H])N2/C(=N/N(=O)=O)SC([H])([H])C2([H])[H])C1([H])[H] 
|(0.6127,-1.7159,-2.6474;0.2026,-1.5969,-1.6413;0.407,-2.5166,-1.0703;0.8615,-0.4869,-1.0339;0.0332,-0.1002,0.047;0.1714,-0.7809,0.9071;0.3256,0.9076,0.3556;-1.4195,-0.1954,-0.4832;-1.6898,0.7724,-0.9221;-2.4082,-0.5306,0.642;-2.3213,0.2151,1.4443;-2.1868,-1.5113,1.0663;-3.8155,-0.5523,0.2291;-4.5077,-1.6963,0.0138;-3.9071,-2.8485,0.2039;-4.6613,-3.9965,-0.0041;-5.7853,-3.9245,-0.5291;-4.121,-5.0401,0.336;-6.1856,-1.3857,-0.494;-5.9745,0.414,-0.1235;-6.539,0.9983,-0.8526;-6.3625,0.6192,0.8782;-4.4743,0.6773,-0.2108;-4.166,1.5031,0.4393;-4.1754,0.9226,-1.24;-1.3053,-1.2617,-1.6119;-1.9113,-2.148,-1.4056;-1.6334,-0.8459,-2.5693)|\",5.246355037640001\r\n\"[H]C(=O)C1=C([H])OC(/C([H])=C(\\C([H])([H])[H])[C@]([H])(OC([H])([H])[H])C([H])([H])[H])=N1 |(-2.1722,0.6712,3.5338;-1.4491,0.3629,4.3166;-1.7724,0.2472,5.485;-0.0929,0.1132,3.8158;1.0123,-0.2714,4.521;1.2045,-0.4848,5.5602;2.0446,-0.3798,3.6538;1.5184,-0.0501,2.4153;2.4584,-0.1033,1.3169;3.4765,-0.3398,1.6036;2.1879,0.0966,0.0117;0.8256,0.4179,-0.5448;0.0308,0.0781,0.1203;0.6919,-0.0339,-1.5355;0.6974,1.5023,-0.6661;3.3071,0.0113,-1.0129;3.0117,-0.7516,-1.7563;4.5113,-0.3981,-0.3797;5.4255,-1.0669,-1.2245;6.2752,-1.3569,-0.601;5.7932,-0.427,-2.0401;4.9807,-1.973,-1.665;3.5107,1.3518,-1.7423;2.6036,1.6652,-2.2683;4.313,1.2695,-2.4825;3.7833,2.1271,-1.0192;0.251,0.2531,2.4773)|\",4.723896444679999\r\n\"[H]C1=C(C(=O)OC([H])([H])C([H])([H])[H])N=C(/C([H])=C(\\C([H])([H])[H])[C@]([H])(OC([H])([H])[H])C([H])([H])[H])O1 
|(3.9747,0.8024,1.5895;4.441,0.2516,0.7891;5.0323,0.5991,-0.3889;5.1748,1.9843,-0.8811;4.764,2.948,-0.2597;5.8023,2.0433,-2.0682;5.9868,3.3631,-2.6336;6.1866,4.0695,-1.8244;6.8765,3.2631,-3.26;4.7739,3.7852,-3.4493;4.9695,4.748,-3.9356;3.8972,3.8981,-2.8052;4.5515,3.0455,-4.2251;5.4677,-0.5457,-1.0362;5.1308,-1.5282,-0.2458;5.3841,-2.9325,-0.4945;5.9525,-3.0664,-1.4111;5.0318,-4.0201,0.2208;4.2266,-4.0262,1.4935;3.6544,-3.1092,1.6284;4.8816,-4.1397,2.3687;3.544,-4.881,1.4872;5.5059,-5.3792,-0.2725;5.9231,-5.2598,-1.2847;4.3545,-6.2243,-0.3412;4.4958,-7.3438,-1.1916;3.5248,-7.8458,-1.2142;5.2504,-8.0588,-0.8317;4.7634,-7.042,-2.2169;6.5877,-5.9791,0.6406;7.4483,-5.3049,0.7088;6.9433,-6.9387,0.2505;6.1932,-6.147,1.6476;4.4965,-1.1006,0.9015)|\",5.0068948492\r\n\"[H]OC1=C([H])C(N(C([H])([H])C([H])([H])[H])C([H])([H])C([H])([H])[H])=C([H])C([H])=C1/C([H])=N/C([H])([H])[H] |(8.3676,-0.1045,-1.2419;7.4306,-0.4424,-1.1403;6.6412,0.633,-0.9746;5.2709,0.4235,-0.8201;4.9313,-0.6031,-0.864;4.3747,1.4987,-0.6443;3.0187,1.2815,-0.4853;2.4654,-0.0642,-0.3529;3.1628,-0.6845,0.2189;1.5599,0.0124,0.2605;2.1248,-0.733,-1.6912;1.7031,-1.7312,-1.5241;3.015,-0.8362,-2.3192;1.3873,-0.1427,-2.247;2.0582,2.3815,-0.451;2.384,3.168,-1.1391;1.1119,2.0042,-0.8562;1.8219,2.9592,0.9509;1.083,3.7687,0.9133;2.7491,3.3575,1.3746;1.4456,2.1887,1.6333;4.9145,2.8169,-0.634;4.2755,3.6763,-0.4775;6.2724,3.0141,-0.7928;6.6622,4.0303,-0.7754;7.1773,1.952,-0.9663;8.593,2.1965,-1.1283;8.9136,3.2506,-1.1046;9.4527,1.2516,-1.2948;10.8518,1.5895,-1.4486;11.2271,1.1953,-2.4014;11.4389,1.1165,-0.6509;11.0381,2.6752,-1.4219)|\",4.34293705398\r\n\"[H]C1=NC([H])=C([C@@]2([H])N([H])N([H])[C@]([H])(N([H])[H])[C@]2([H])F)C([H])=C1[H] 
|(2.0614,1.5166,-0.0705;1.3409,0.7002,-0.0826;0.2802,0.8404,0.72;-0.6179,-0.1512,0.7235;-1.476,-0.0106,1.3815;-0.5189,-1.3165,-0.0468;-1.5955,-2.3805,0.026;-2.4098,-2.0211,0.6642;-2.166,-2.7721,-1.2925;-1.9346,-2.0458,-1.9707;-1.3287,-3.9418,-1.6865;-1.8655,-4.4205,-2.4134;-1.4003,-4.7751,-0.4887;-0.6219,-5.5439,-0.5372;-2.6923,-5.4392,-0.3644;-3.4202,-4.7254,-0.3286;-2.7379,-5.9431,0.5201;-1.096,-3.7287,0.6148;-0.0315,-3.6878,0.8606;-1.7718,-4.0537,1.7884;0.5992,-1.4359,-0.8859;0.7122,-2.3181,-1.5112;1.5425,-0.412,-0.9033;2.4211,-0.4706,-1.5394)|\",4.889885893485\r\n\"[H]C1=C([H])C([H])=C([C@@]2([H])N([H])N([H])[C@]([H])(N([H])[H])[C@]2([H])F)C([H])=C1[H] |(-2.6648,-0.047,-0.2943;-1.5802,-0.0021,-0.3437;-0.9317,-0.0491,-1.5794;-1.5098,-0.1306,-2.4961;0.4601,0.0124,-1.6422;0.9466,-0.0177,-2.6151;1.2285,0.1167,-0.4728;2.7387,0.1654,-0.5261;3.1242,0.2882,0.4967;3.3311,-1.0602,-1.0984;2.7657,-1.3529,-1.8946;4.6322,-0.7117,-1.624;5.3375,-1.0783,-0.9874;4.7907,0.757,-1.6688;5.1235,1.0688,-2.6659;5.8101,1.1966,-0.7257;5.4268,1.2281,0.2193;6.0925,2.1508,-0.9436;3.3591,1.2865,-1.3769;2.7904,1.4497,-2.2991;3.406,2.4981,-0.6967;0.5657,0.1585,0.759;1.1481,0.2348,1.6739;-0.827,0.1036,0.8259;-1.3219,0.14,1.7927)|\",5.6599680904000005\r\n\"[H]OC(=N/N([H])[H])/C(C#N)=C(\\[H])C1=C([H])C2=C(OC([H])([H])O2)C([H])=C1[H] 
|(5.3057,-3.5518,0.7919;5.9919,-2.966,1.1471;5.4738,-1.7123,1.3434;6.3328,-0.814,1.6382;5.9275,0.4958,1.8931;6.6593,0.9159,2.4594;5.0466,0.5453,2.415;3.9906,-1.5812,1.2152;3.2522,-2.7443,1.6059;2.713,-3.7319,1.9112;3.4103,-0.4597,0.6904;4.1167,0.3119,0.3948;2.0226,-0.1116,0.4392;0.9075,-0.9258,0.7893;1.0199,-1.8822,1.2814;-0.3413,-0.4467,0.4776;-0.5413,0.7834,-0.1562;-1.8741,1.0014,-0.3394;-2.5482,-0.1475,0.2032;-3.1157,-0.6412,-0.5934;-3.2089,0.1736,1.016;-1.5551,-1.0409,0.7129;0.5177,1.5993,-0.5075;0.3605,2.5536,-0.9973;1.7987,1.1294,-0.196;2.6567,1.7424,-0.456)|\",3.3252312531099992\r\n\"[H]C1=C([H])C2=C([H])C([H])([H])C([H])([H])C([H])([H])[C@]23N(C1=O)C([H])([H])C([H])([H])C3([H])[H] |(-0.7629,1.1469,-4.0039;-0.5079,0.779,-3.0161;0.5293,1.2583,-2.3087;1.1761,2.0281,-2.7269;0.8864,0.7078,-1.0114;2.1113,0.9261,-0.4912;2.7876,1.5874,-1.0334;2.678,0.2405,0.7183;3.5443,-0.3522,0.3828;3.0947,0.9798,1.4171;1.6635,-0.6676,1.4367;1.1335,-0.0902,2.2037;2.1917,-1.466,1.9705;0.6446,-1.2792,0.4649;-0.0499,-1.919,1.0194;1.1523,-1.9194,-0.2672;-0.1281,-0.1851,-0.3145;-1.0867,-0.8212,-1.2586;-1.4083,-0.2707,-2.4932;-2.4005,-0.6245,-3.1276;-2.2107,-1.3465,-0.4643;-3.0982,-1.3399,-1.097;-2.0165,-2.3816,-0.165;-2.3071,-0.4006,0.7699;-3.2617,0.1334,0.7887;-2.238,-0.9664,1.7048;-1.1178,0.5774,0.6157;-0.6738,0.8693,1.5715;-1.434,1.4945,0.1065)|\",4.713011890660001\r\n\"[H]C1NC([H])([H])[C@]2([H])[C@]1([H])[C@]1([H])C([H])=C([H])[C@@]2([H])C1([H])[H] 
|(-0.3611,-3.8086,-0.0332;0.7156,-3.6365,-0.0271;1.5194,-4.5242,-0.4947;2.9263,-3.992,-0.3808;3.6093,-4.8238,-0.1879;3.2007,-3.5315,-1.3413;2.7656,-2.9783,0.7559;2.533,-3.632,1.6085;1.4314,-2.3465,0.323;1.5735,-1.9081,-0.6789;1.3249,-1.0982,1.2895;0.3284,-0.7651,1.5859;2.1912,-0.0823,0.5257;1.7977,0.7442,-0.0583;3.4605,-0.514,0.5687;4.3126,-0.0867,0.05;3.5085,-1.7761,1.4278;4.473,-2.016,1.8817;2.3402,-1.4307,2.4197;2.0496,-2.2653,3.0671;2.5598,-0.5548,3.0367)|\",6.315762470105\r\n\"[H]/N=C(O[H])\\C(C#N)=C(/C([H])([H])[H])C([H])([H])C(=O)OC([H])([H])C([H])([H])[H] |(3.5582,4.3437,1.3832;2.6595,4.0026,1.7277;2.4981,2.7828,1.4241;1.3586,2.1631,1.8404;1.2653,1.3214,1.3659;3.4551,1.9209,0.6362;3.9042,2.5033,-0.5957;4.2681,2.994,-1.5867;3.916,0.7093,1.0411;3.5561,0.08,2.3592;3.0601,-0.8823,2.1841;2.9099,0.7041,2.9765;4.4745,-0.1328,2.9222;4.8489,-0.0991,0.1722;5.6808,-0.5,0.7654;5.2955,0.5017,-0.6262;4.1293,-1.2805,-0.4791;2.9514,-1.5374,-0.3437;4.9924,-1.9918,-1.222;4.457,-3.1398,-1.9354;5.3137,-3.809,-2.0414;3.6972,-3.6149,-1.3107;3.892,-2.7311,-3.2864;3.5676,-3.6225,-3.8352;4.6482,-2.2141,-3.8856;3.0276,-2.0725,-3.1624)|\",5.371527408870001\r\n\"[H]C1=C([H])C([H])([H])[C@@]([H])(/C([H])=C(\\[H])C2=C([H])C([H])=C([H])C([H])=C2[H])O1 |(8.0736,-0.3638,0.7585;7.2438,-0.4258,0.0639;7.2264,-0.7543,-1.2281;8.0832,-1.0364,-1.8254;5.7904,-0.7497,-1.7022;5.4265,-1.7643,-1.9223;5.6215,-0.1507,-2.6048;5.0503,-0.1295,-0.4772;4.808,0.9226,-0.6728;3.8245,-0.8661,-0.0424;3.9872,-1.9061,0.2378;2.6011,-0.3177,-0.0133;2.5113,0.7332,-0.2923;1.3324,-0.9627,0.3551;0.1579,-0.1901,0.364;0.2163,0.8629,0.0977;-1.0723,-0.7477,0.7091;-1.9648,-0.1276,0.7087;-1.1555,-2.097,1.0533;-2.1121,-2.5364,1.3226;0.0024,-2.8808,1.0476;-0.053,-3.9337,1.3118;1.2294,-2.3229,0.7021;2.116,-2.9503,0.6993;6.028,-0.1293,0.6165)|\",4.639541151025\r\n\"[H]C([H])([H])C([H])(C([H])([H])[H])C([H])([H])[C@@]([H])(C#N)C([H])([H])N(=O)=O 
|(0.3105,-0.9429,-0.5429;0.857,-0.1455,-0.0277;0.6013,0.8036,-0.5154;0.4918,-0.1001,1.0048;2.3721,-0.3981,-0.0793;2.5532,-1.3769,0.3897;2.8628,-0.4685,-1.5335;2.2849,-1.208,-2.0989;3.9177,-0.7561,-1.61;2.7428,0.5003,-2.0353;3.0962,0.6895,0.7414;2.6925,0.7099,1.7618;2.8702,1.67,0.3038;4.6432,0.5671,0.8202;5.064,0.5302,-0.1893;5.2007,1.7579,1.4809;5.6124,2.7012,2.0171;5.0854,-0.7039,1.5556;4.6123,-1.5882,1.121;4.8765,-0.6762,2.6249;6.572,-0.9683,1.4318;7.0871,-1.6004,2.3448;7.1384,-0.5788,0.4153)|\",6.163378713825001\r\n\"[H]C1=C(C(=O)C(F)(F)F)C([H])([H])C([H])([H])[C@]2([H])N1C([H])([H])[C@@]([H])(C([H])([H])[H])N2[H] |(4.444,2.3242,-1.8337;4.3254,1.302,-2.1759;5.0498,0.797,-3.2267;5.9563,1.6004,-4.0159;6.6469,1.1595,-4.9284;6.0664,3.1238,-3.7201;6.4846,3.3535,-2.4466;6.9296,3.7175,-4.5444;4.8667,3.7433,-3.8539;4.9461,-0.6751,-3.5728;4.3003,-0.819,-4.4511;5.9334,-1.0401,-3.8726;4.4133,-1.4864,-2.3804;5.1554,-1.5234,-1.575;4.1921,-2.5177,-2.678;3.1517,-0.828,-1.823;2.3633,-0.8579,-2.5991;3.4109,0.5803,-1.4929;2.8853,0.9293,-0.1712;3.6989,1.0264,0.5588;2.3282,1.8726,-0.2031;1.9866,-0.2773,0.1601;2.0075,-0.478,1.2373;0.5295,-0.0758,-0.2841;0.0523,0.7246,0.2937;-0.0529,-0.9911,-0.1238;0.4626,0.1904,-1.3453;2.7074,-1.3611,-0.5355;2.1235,-2.1856,-0.6567)|\",4.563349272885\r\n\"[H]C1=C([H])[C@]2([H])C3=C(C(C(F)(F)F)=C(C#N)S3)[C@@]1([H])C2([H])[H] |(-2.5711,1.106,0.7149;-1.9979,0.3011,0.2695;-1.9373,-0.9759,0.6587;-2.4434,-1.4472,1.4933;-0.9154,-1.6952,-0.253;-0.9592,-2.7843,-0.2754;0.4398,-1.0579,0.0425;0.3913,0.2629,-0.3449;1.5686,0.9751,-0.031;1.8121,2.4298,-0.3189;2.9378,2.6165,-1.036;0.7864,2.96,-1.0233;1.9372,3.1478,0.8158;2.5108,0.1669,0.6007;3.8109,0.5129,1.0335;4.8845,0.7729,1.4049;1.9125,-1.4718,0.8118;-1.018,0.4835,-0.9115;-1.16,1.3674,-1.5304;-1.208,-0.9181,-1.5815;-0.4808,-1.1233,-2.3737;-2.2269,-1.0814,-1.9439)|\",4.476272840725\r\n\"[H]C([H])=C([H])[C@]([H])(OC([H])([H])Cl)[C@]([H])(OC([H])([H])Cl)C([H])=C([H])[H] 
|(0.6954,-3.4098,1.6325;1.2968,-2.6927,1.0804;2.3761,-2.825,1.1314;0.7412,-1.7118,0.368;-0.34,-1.5977,0.3247;1.5226,-0.6994,-0.428;2.6016,-0.8997,-0.3368;1.2404,0.6423,0.0207;1.768,0.9382,1.281;1.4586,0.2138,2.0403;2.8615,1.0201,1.2548;1.1363,2.5453,1.7879;1.1772,-0.7044,-1.9336;1.7177,0.1241,-2.4048;-0.243,-0.513,-2.1006;-0.6607,0.7692,-2.3679;-1.7429,0.7965,-2.254;-0.1627,1.5261,-1.763;-0.3408,1.2762,-4.1275;1.547,-2.0035,-2.5943;0.9423,-2.8625,-2.3107;2.5512,-2.1341,-3.4594;2.7982,-3.0942,-3.9037;3.1603,-1.2863,-3.7658)|\",7.123940606090001\r\n\"[H]C([H])([H])C(=O)C([H])([H])[C@@]([H])(B1OC(C([H])([H])[H])(C([H])([H])[H])C(C([H])([H])[H])(C([H])([H])[H])O1)C([H])([H])C([H])([H])C([H])([H])[H] |(6.5491,-2.4462,0.4407;7.0348,-2.0493,-0.4602;8.1179,-2.073,-0.3237;6.7476,-2.7042,-1.2916;6.572,-0.6251,-0.7094;7.3334,0.3198,-0.5864;5.1256,-0.4325,-1.1329;4.4847,-1.1069,-0.5451;5.0551,-0.7964,-2.1704;4.638,1.0256,-1.0497;4.6888,1.3473,-0.0002;5.5718,1.9779,-1.9025;5.9521,3.2428,-1.5299;6.925,3.7118,-2.5064;8.312,3.4353,-1.9095;9.1122,3.789,-2.5687;8.4532,2.3697,-1.7115;8.3953,3.965,-0.9551;6.7275,5.2144,-2.6998;7.4117,5.6039,-3.4626;6.9401,5.7341,-1.7601;5.7032,5.4566,-2.9932;6.5914,2.8226,-3.7694;5.5327,3.4407,-4.6938;5.9224,4.3102,-5.2339;4.6452,3.7507,-4.1332;5.2225,2.6909,-5.4287;7.8047,2.3827,-4.5869;8.3246,3.249,-5.0121;7.4801,1.7449,-5.4159;8.5121,1.8137,-3.9797;5.9835,1.6429,-3.1738;3.1627,1.1135,-1.5119;3.1063,0.858,-2.5805;2.5699,0.3484,-0.9867;2.5083,2.4834,-1.2853;2.5543,2.7336,-0.2162;3.0903,3.2611,-1.7975;1.0535,2.5318,-1.7629;0.6065,3.5174,-1.5889;0.4382,1.79,-1.2385;0.9826,2.3184,-2.8367)|\",6.35113727067\r\n\"[H]C(=O)[C@@]1(C([H])([H])[H])O[C@]1([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])OC(=O)C([H])([H])[H] 
|(11.3872,0.5647,2.203;10.8843,-0.4244,2.2824;11.2755,-1.3953,1.673;9.6949,-0.4283,3.2014;9.1835,-1.784,3.6199;8.9379,-2.3825,2.7374;8.303,-1.702,4.2593;9.9612,-2.3192,4.1748;9.7987,0.6042,4.212;8.8659,0.8102,3.1553;9.1807,1.5843,2.4472;7.387,0.8509,3.4728;7.1404,0.0984,4.2287;7.1651,1.8281,3.9238;6.5294,0.6581,2.2127;6.8237,1.4009,1.4573;6.744,-0.3259,1.7723;5.0243,0.7864,2.4835;4.8031,1.7802,2.8901;4.7104,0.0488,3.2337;4.2075,0.5823,1.211;4.3811,-0.4101,0.7852;4.4534,1.343,0.4646;2.787,0.6239,1.4705;2.2123,1.8516,1.4831;2.8193,2.8847,1.3034;0.7276,1.7425,1.7419;0.2808,2.7367,1.7094;0.5482,1.2869,2.7214;0.2601,1.0975,0.991)|\",5.953851048940001\r\n\"[H]O[C@@]([H])(/C(=C(\\[H])C([H])([H])[H])C([H])([H])[H])C1([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C1([H])[H] |(3.598,1.6977,1.0142;4.5,1.5644,1.3499;5.0924,0.5455,0.516;6.0352,0.3254,1.0291;4.2162,-0.7054,0.606;3.3534,-1.0542,-0.3617;3.3396,-0.4584,-1.2735;2.3763,-2.1975,-0.3729;1.3486,-1.8297,-0.4995;2.4037,-2.8017,0.5371;2.5725,-2.8652,-1.2229;4.3601,-1.4224,1.9279;3.6183,-2.2093,2.0805;4.2716,-0.7026,2.7502;5.3554,-1.8809,2.013;5.4169,1.0944,-0.8936;4.4581,1.2883,-1.398;6.2246,0.0988,-1.761;5.7373,-0.8818,-1.7741;6.2148,0.4673,-2.7971;7.689,-0.0469,-1.3112;7.7322,-0.5245,-0.322;8.2169,-0.7219,-1.9973;8.4008,1.313,-1.2498;8.4787,1.7228,-2.2683;9.4284,1.1904,-0.8836;7.6294,2.3056,-0.3673;8.1163,3.2892,-0.3881;7.6636,1.9707,0.6787;6.1672,2.4461,-0.8205;6.1516,2.8968,-1.8236;5.6297,3.1265,-0.1536)|\",7.015095065890001\r\n\"[H]O[C@@]1(C([H])([H])[H])N=C(C([H])([H])[H])S[C@]([H])(C2=C([H])C([H])=C([H])C([H])=C2[H])C1([H])[H] 
|(5.982,2.2179,-0.5203;5.0807,2.1466,-0.8731;4.386,1.198,-0.053;4.4894,1.6333,1.4164;3.9151,0.9835,2.0802;4.1098,2.655,1.5017;5.5341,1.6127,1.7519;3.0057,1.325,-0.4733;2.1874,0.357,-0.4626;0.7541,0.5713,-0.8887;0.6189,1.6244,-1.1432;0.0605,0.3003,-0.0841;0.5113,-0.0496,-1.7584;2.4754,-1.3628,0.0082;4.305,-1.4243,0.2808;4.6168,-2.2848,-0.3206;4.7249,-1.726,1.72;6.0431,-2.1647,1.9256;6.708,-2.2861,1.0724;6.5143,-2.4572,3.2039;7.5382,-2.7971,3.3364;5.6697,-2.3242,4.3081;6.0318,-2.555,5.3063;4.3547,-1.9034,4.1156;3.6843,-1.8046,4.9653;3.8855,-1.6101,2.8323;2.8553,-1.298,2.6939;4.9865,-0.1914,-0.3548;4.9666,-0.2998,-1.444;6.0393,-0.1926,-0.0463)|\",6.06269658914\r\n\"[H]O[C@]1([H])C([H])([H])C([H])([H])[C@@]([H])(C([H])([H])[H])O[C@]1([H])C([H])([H])C([H])=C([H])[H] |(3.7928,-2.5163,1.2452;4.7123,-2.4979,0.9368;5.0589,-1.1361,0.6966;6.1515,-1.1368,0.607;4.672,-0.2112,1.854;3.5861,-0.2716,2.0285;5.1588,-0.5586,2.7728;5.0608,1.2382,1.5298;6.1553,1.3272,1.4916;4.7076,1.9198,2.314;4.489,1.6655,0.1729;3.3884,1.6886,0.247;4.9799,3.0351,-0.2772;4.6896,3.8057,0.4454;4.5552,3.2937,-1.2523;6.0715,3.0358,-0.3696;4.8793,0.7441,-0.8534;4.5013,-0.6233,-0.6532;5.0142,-1.1625,-1.4567;2.9854,-0.858,-0.8393;2.7863,-1.9281,-0.7068;2.4237,-0.3274,-0.0561;2.4949,-0.4186,-2.1924;2.7099,0.6153,-2.4593;1.8534,-1.2015,-3.0604;1.5254,-0.8332,-4.0289;1.6296,-2.2432,-2.8368)|\",7.0885658055250005\r\n\"[H]O[C@]1([H])C([H])([H])C([H])([H])[C@@]([H])(C([H])([H])[H])O[C@]1([H])C([H])([H])C([H])=O 
|(6.7265,0.8717,-1.9313;6.3977,0.0526,-1.5299;4.9746,0.0809,-1.5351;4.5941,-0.0886,-2.5557;4.3871,1.3952,-1.0148;4.8207,1.6249,-0.033;4.666,2.2198,-1.6862;2.8589,1.2743,-0.9157;2.427,1.1775,-1.9212;2.4283,2.1757,-0.4624;2.4629,0.0412,-0.0935;2.792,0.1918,0.9479;0.9647,-0.2297,-0.1029;0.4169,0.6195,0.3202;0.7346,-1.1219,0.4874;0.6136,-0.397,-1.127;3.0861,-1.141,-0.6195;4.5114,-1.1133,-0.6814;4.7778,-2.0324,-1.217;5.1619,-1.1795,0.7142;5.0447,-0.2622,1.2973;4.642,-1.9823,1.2596;6.6366,-1.5362,0.7056;6.9353,-2.2966,-0.0492;7.4426,-1.0917,1.4919)|\",6.299435639075\r\n\"[H]OC(=O)[C@@]([H])(N([H])[H])C([H])([H])SC([H])([H])C([H])([H])C([H])([H])F |(7.5103,-4.6351,4.7469;8.0954,-4.3158,4.0128;7.252,-3.933,3.0456;7.6108,-3.4476,1.9985;5.7641,-4.2093,3.381;5.5206,-5.1544,2.8782;5.6349,-4.4441,4.8296;5.4314,-3.5716,5.3169;4.8641,-5.0753,5.0328;4.8831,-3.1016,2.7943;5.1145,-2.994,1.7319;5.1025,-2.1456,3.2867;3.1091,-3.5163,3.02;2.3531,-2.1,2.1172;2.779,-1.1678,2.5012;2.6076,-2.1877,1.0564;0.8313,-2.0737,2.2974;0.4176,-1.3539,1.5799;0.3959,-3.052,2.0544;0.3863,-1.6662,3.6934;0.7703,-2.3524,4.4566;-0.708,-1.6362,3.7591;0.8733,-0.39,3.981)|\",6.4463771183450005\r\n\"[H]C([H])=C([H])/C([H])=C1C(=C(/Cl)C([H])([H])[H])/C([H])([H])C([H])([H])[C@]2(C([H])([H])[H])C([H])([H])N([H])C([H])([H])[C@@]/12[H] 
|(0.6676,0.0359,1.1371;1.234,0.2154,0.2284;0.6923,0.6567,-0.6056;2.5386,-0.0911,0.1378;3.026,-0.5344,0.9994;3.3142,0.1178,-1.0765;2.7251,0.4843,-1.9171;4.6302,-0.095,-1.318;5.1737,0.1141,-2.7014;4.626,-0.5149,-3.76;5.2526,-0.2164,-5.4058;3.5142,-1.5247,-3.7953;3.2371,-1.8405,-2.7885;3.8272,-2.4022,-4.3725;2.6228,-1.1179,-4.289;6.3937,1.0183,-2.793;6.8127,0.9973,-3.8001;6.0774,2.0545,-2.6096;7.4822,0.6289,-1.7703;8.2961,1.3649,-1.8096;7.9135,-0.339,-2.0612;6.9086,0.5242,-0.3525;6.4742,1.9092,0.1741;6.221,1.8696,1.2376;5.606,2.3041,-0.3625;7.297,2.6251,0.0596;7.83,-0.1499,0.6806;8.5446,0.5573,1.1214;8.4133,-0.9493,0.1953;6.9061,-0.6707,1.7156;7.217,-1.5747,2.0538;5.5393,-0.7554,1.148;5.0786,-1.7273,1.3602;4.9074,0.0139,1.6067;5.7289,-0.5031,-0.3565;6.1597,-1.4301,-0.772)|\",4.819136292354999\r\n\"[H]O/C1=N\\[C@]([H])(C([H])([H])[H])/C(O[H])=N\\C([H])([H])C([H])([H])C([H])([H])N([H])[C@@]1([H])C([H])([H])[H] |(7.0204,0.8556,2.1188;6.2028,0.8696,1.5901;6.0022,-0.3152,0.9415;6.8021,-1.3019,0.8601;8.0203,-1.2999,1.6615;8.6761,-0.4654,1.351;8.7962,-2.6032,1.43;9.7244,-2.6198,2.013;9.0458,-2.6974,0.3687;8.1903,-3.4666,1.7164;7.6775,-1.0851,3.1598;8.4202,-0.0717,3.753;9.2387,0.0564,3.2477;6.7787,-1.6036,3.8751;5.8442,-2.6184,3.3927;5.9624,-2.8715,2.3335;6.0389,-3.5229,3.9847;4.3839,-2.2026,3.6438;3.7534,-3.0853,3.4733;4.2721,-1.9191,4.6988;3.8622,-1.0467,2.7696;4.4616,-0.1514,2.964;2.8421,-0.8008,3.0986;3.8809,-1.3735,1.331;2.9415,-1.5392,0.9822;4.58,-0.455,0.4125;4.6652,-0.991,-0.5399;3.8727,0.8877,0.1678;4.4341,1.5159,-0.5338;3.7475,1.4507,1.0969;2.881,0.7055,-0.2631)|\",5.69806402947\r\n\"[H]OC1=C([H])C([H])=C(/C([H])=C([H])/C(=N\\C2=C([H])C([H])=C([H])C([H])=C2[H])O[H])C([H])=C1[H] 
|(-4.1588,-4.1025,-0.0722;-3.6403,-3.9952,-0.8851;-2.3641,-3.643,-0.5615;-1.9299,-3.482,0.7582;-2.6216,-3.6375,1.584;-0.6096,-3.1229,1.0102;-0.2836,-2.9982,2.0402;0.31,-2.9116,-0.032;1.6813,-2.5301,0.2975;1.8893,-2.495,1.3681;2.6691,-2.184,-0.5564;2.4858,-2.1192,-1.6242;4.0388,-1.8427,-0.1386;4.91,-1.1887,-0.8083;4.5931,-0.5177,-1.9957;5.439,-0.6986,-3.1058;6.2768,-1.3828,-3.0104;5.1995,-0.0188,-4.2964;5.8566,-0.1816,-5.1471;4.1324,0.8791,-4.3991;3.9563,1.4182,-5.3258;3.3075,1.0895,-3.2935;2.4855,1.7987,-3.3539;3.53,0.3992,-2.1016;2.9035,0.5863,-1.2344;4.4454,-2.2961,1.0847;3.8677,-3.0308,1.3453;-0.1526,-3.0851,-1.3537;0.5276,-2.9421,-2.1877;-1.4646,-3.4432,-1.6193;-1.816,-3.5774,-2.6372)|\",3.839526430555\r\n\"[H]C1=C2/C(C([H])([H])[H])=C(/[H])C(=O)/N=C\\2[C@]([H])(OC(=O)C([H])([H])[H])C([H])=C1[H] |(0.8783,-2.0113,0.9224;1.8802,-2.2359,0.567;2.7781,-1.2148,0.4044;2.4679,0.1901,0.6399;1.1436,0.5906,1.2368;1.0901,1.6749,1.3635;0.9897,0.1249,2.2181;0.3055,0.2856,0.5974;3.4064,1.103,0.2955;3.2444,2.1672,0.4394;4.6944,0.7292,-0.3139;5.4983,1.5745,-0.681;4.9995,-0.6479,-0.47;4.1151,-1.519,-0.1226;4.5224,-2.9875,-0.2131;5.1396,-3.1824,0.684;5.366,-3.1976,-1.3505;6.7103,-3.225,-1.0993;7.1806,-3.2953,0.0118;7.4837,-3.1549,-2.3881;8.5268,-3.4149,-2.2055;7.424,-2.1286,-2.7674;7.0472,-3.8155,-3.1424;3.3989,-3.9697,-0.2054;3.6365,-4.9914,-0.4868;2.1733,-3.6099,0.2231;1.3753,-4.3425,0.3005)|\",3.366048330685\r\n\"[H]OC([H])([H])[C@]1([H])N([H])[C@]([H])(C([H])([H])C([H])([H])C([H])=C([H])[H])[C@@]([H])(C([H])([H])[H])[C@@]([H])(O[H])[C@]1([H])O[H] 
|(5.7975,1.9732,3.663;5.1201,1.4543,3.2011;5.2513,0.0946,3.6312;5.2818,0.0241,4.7248;4.3422,-0.4062,3.2931;6.5265,-0.5427,3.0378;7.362,0.0374,3.4619;6.68,-0.4523,1.5815;6.5677,0.5161,1.2965;5.9016,-1.3422,0.7003;6.3449,-1.2017,-0.2969;4.3887,-1.0463,0.5235;4.01,-1.7389,-0.2429;3.8219,-1.2698,1.4343;4.0852,0.4002,0.0873;4.4121,1.0996,0.864;4.6708,0.6207,-0.8202;2.6232,0.6213,-0.1945;2.1998,0.0406,-1.0169;1.8317,1.4447,0.4943;0.7771,1.5586,0.2568;2.2131,2.0352,1.3252;6.1643,-2.8168,1.1085;5.433,-3.4519,0.5919;7.5689,-3.2684,0.6724;7.6348,-3.2722,-0.4212;7.799,-4.2905,1.0007;8.3441,-2.5944,1.0478;5.9013,-2.9724,2.6193;4.8433,-2.7745,2.8205;6.0873,-4.3074,3.1124;6.9958,-4.5738,2.8954;6.7398,-2.0053,3.4603;7.8037,-2.2448,3.2921;6.4275,-2.1564,4.8383;6.2942,-3.1133,4.9606)|\",6.636856813695\r\n\"[H]O[C@@]1([H])C([H])([H])[C@]2(C([H])([H])[H])C(=O)C(C([H])([H])[H])(C([H])([H])[H])[C@@]3([H])C([H])([H])C([H])([H])[C@@]([H])(C([H])([H])[H])[C@@]31C2([H])[H] |(5.5193,-2.7324,-1.7115;5.0458,-2.0564,-1.2029;3.6508,-2.3672,-1.2609;3.3113,-2.3385,-2.3073;3.3422,-3.721,-0.5784;4.2388,-4.351,-0.5535;2.5761,-4.288,-1.1209;2.8888,-3.3731,0.889;3.5391,-4.272,1.941;4.6306,-4.1975,1.8761;3.2469,-5.3165,1.8006;3.2343,-3.9819,2.9525;1.3612,-3.4985,0.9378;0.8151,-4.4468,1.4729;0.5098,-2.3625,0.3066;-0.0865,-1.5632,1.4966;-0.8623,-0.876,1.1433;0.6605,-0.9728,2.0363;-0.548,-2.2573,2.2055;-0.6703,-3.0139,-0.4478;-1.257,-3.6412,0.2279;-0.3179,-3.6466,-1.271;-1.3243,-2.2428,-0.8706;1.346,-1.4917,-0.6853;1.2741,-2.0047,-1.6554;0.8459,-0.0528,-0.9373;-0.0496,-0.0224,-1.5684;0.593,0.4468,0.0042;2.0592,0.637,-1.5729;1.9731,1.7297,-1.5911;2.1732,0.3048,-2.6138;3.2542,0.1466,-0.7233;4.1792,0.1602,-1.3077;3.4782,1.0418,0.506;4.3061,0.6786,1.1238;2.5894,1.1067,1.1458;3.7273,2.0611,0.1873;2.8937,-1.3364,-0.3854;3.2492,-1.874,1.0131;4.3232,-1.7648,1.1969;2.7239,-1.3796,1.8345)|\",5.967456741465\r\n\"[H]/C(C(=O)OC([H])([H])C([H])([H])[H])=C(/[H])N1N=NN=C1SC([H])([H])[H] 
|(5.9347,1.3641,0.3858;5.4953,0.4,0.1616;4.2912,0.3364,-0.6962;3.7114,-0.6863,-1.013;3.9255,1.5761,-1.0855;2.756,1.6698,-1.9357;2.9075,2.594,-2.4983;2.7538,0.8205,-2.6232;1.4777,1.7185,-1.1125;0.6173,1.873,-1.7737;1.5098,2.5421,-0.3918;1.3309,0.7791,-0.5723;6.0024,-0.7463,0.6309;5.5398,-1.6976,0.3887;7.1318,-0.8342,1.4482;7.8475,0.2669,1.8536;8.8075,-0.1718,2.5804;8.7819,-1.5356,2.6916;7.7403,-1.9332,1.9863;7.1892,-3.5771,1.7529;8.4643,-4.4476,2.7375;8.2052,-5.5068,2.6758;8.4372,-4.11,3.7739;9.4539,-4.274,2.3136)|\",4.8082517383350005\r\n\"[H]C1=C([H])C2=C(C([H])=C1[H])N(/C([H])=C(\\[H])C(=O)OC([H])([H])C([H])([H])[H])/N=C\\2[H] |(6.3103,-4.0963,-5.7748;6.2218,-3.4566,-4.9017;7.3366,-2.7847,-4.4228;8.3028,-2.8876,-4.9083;7.1843,-1.9657,-3.2921;5.92,-1.8445,-2.6736;4.7893,-2.5189,-3.152;3.8175,-2.425,-2.6782;4.965,-3.3227,-4.2709;4.1119,-3.8628,-4.6717;6.0962,-0.9791,-1.6029;5.1383,-0.5585,-0.6994;4.1481,-0.9663,-0.8728;5.3425,0.281,0.3301;6.3133,0.7108,0.5404;4.2009,0.6163,1.1939;3.0627,0.196,1.0636;4.5819,1.4665,2.1799;3.5528,1.8816,3.105;2.89,1.0349,3.3003;4.0968,2.136,4.0184;2.7743,3.0758,2.5715;2.0532,3.417,3.3235;2.224,2.8016,1.6671;3.4481,3.907,2.3386;7.4,-0.5512,-1.5154;8.0421,-1.1283,-2.5058;9.0994,-0.9368,-2.6395)|\",4.503484225775001\r\n\"[H]C1N=C([H])NN1/C([H])=C(\\[H])C(=O)OC([H])([H])C([H])([H])[H] |(6.5953,3.503,0.2598;6.9667,2.8752,-0.5392;7.8968,3.1835,-1.4144;7.971,2.0477,-2.177;8.651,1.9459,-3.0119;7.1545,1.0704,-1.8303;6.4982,1.6111,-0.76;5.514,0.9296,-0.0474;5.0878,1.4934,0.7763;5.0974,-0.3143,-0.3159;5.5081,-0.8946,-1.1327;4.0444,-0.9037,0.5356;3.5101,-0.3472,1.4786;3.7533,-2.1592,0.1275;2.7325,-2.8619,0.8759;2.9728,-3.9179,0.7302;2.8348,-2.6082,1.9338;1.3402,-2.5298,0.3605;0.5939,-3.1259,0.8982;1.2557,-2.7548,-0.7076;1.113,-1.472,0.5197)|\",5.208259098569999\r\n\"[H]C1=C([H])N(/C([H])=C(\\[H])C(=O)OC([H])([H])C([H])([H])[H])N=C1C(F)(F)F 
|(8.5676,3.708,3.0218;8.1089,2.8484,2.5587;6.9336,2.7937,1.8509;6.2101,3.553,1.5915;6.7637,1.4884,1.4593;5.6952,0.9842,0.7172;4.9535,1.7273,0.4435;5.5507,-0.2954,0.3501;6.2767,-1.0548,0.6121;4.3584,-0.6709,-0.4361;3.4798,0.0954,-0.7908;4.369,-1.995,-0.708;3.2553,-2.5061,-1.4788;3.1922,-3.5587,-1.1922;2.3447,-1.9884,-1.1677;3.4976,-2.3488,-2.9726;2.6762,-2.8111,-3.5322;4.4325,-2.8356,-3.2689;3.5457,-1.291,-3.246;7.7751,0.6932,1.8815;8.5824,1.5129,2.5416;9.8392,0.9867,3.1684;10.5082,1.9925,3.7757;9.5829,0.045,4.0972;10.6646,0.4366,2.2569)|\",5.053154203785001\r\n\"[H]C1=NN(/C([H])=C(\\[H])C(=O)OC([H])([H])C([H])([H])[H])C([H])=C1[H] |(9.5136,0.5882,-2.708;8.477,0.2766,-2.7206;7.7487,0.3959,-1.6199;6.5195,-0.0755,-1.9728;5.4635,-0.118,-1.0703;4.5416,-0.5175,-1.481;5.5185,0.287,0.207;6.4247,0.6873,0.6436;4.3034,0.1768,1.0325;3.2264,-0.2521,0.6543;4.5394,0.6271,2.2885;3.4271,0.5767,3.2106;3.6456,1.3609,3.94;2.51,0.8274,2.6718;3.3182,-0.7876,3.8759;2.5119,-0.7761,4.6185;4.2517,-1.0484,4.3852;3.091,-1.5589,3.1347;6.4842,-0.4858,-3.2835;5.5795,-0.8868,-3.7175;7.7371,-0.2721,-3.8017;8.0755,-0.4772,-4.8066)|\",5.015058264715\r\n\"[H]C1=NC([H])=C(O/C([H])=C(\\[H])C(=O)OC([H])([H])C([H])([H])[H])C([H])=C1[H] |(5.1648,-2.2864,6.4979;5.3892,-1.9616,5.484;4.4124,-2.1204,4.5847;4.6498,-1.7356,3.3267;3.8519,-1.9,2.6053;5.863,-1.1645,2.9299;6.131,-0.7883,1.6288;5.1312,-0.2727,0.8691;4.2481,0.082,1.3952;5.2501,-0.1624,-0.461;6.1271,-0.5197,-0.9883;4.158,0.4654,-1.2212;3.1269,0.9091,-0.7454;4.4515,0.4881,-2.5448;3.4543,1.0711,-3.4145;3.005,1.9305,-2.9108;4.0223,1.4142,-4.283;2.3964,0.0509,-3.8093;1.6907,0.5015,-4.5169;1.8361,-0.28,-2.9304;2.8544,-0.821,-4.288;6.885,-1.0116,3.8652;7.8332,-0.5802,3.5602;6.6399,-1.4221,5.1717;7.4032,-1.3217,5.9375)|\",5.412344486445\r\n\"[H]C1=C([H])C(O/C([H])=C(\\[H])C(=O)OC([H])([H])C([H])([H])[H])=C(C([H])(C([H])([H])[H])C([H])([H])[H])C([H])=C1[H] 
|(3.6027,4.1619,4.8661;4.1674,4.1947,3.9388;4.565,3.0051,3.331;4.3355,2.0472,3.7874;5.2898,3.0504,2.1381;5.7295,1.8753,1.5331;4.9026,0.8065,1.4961;3.8533,0.9907,1.7151;5.3455,-0.417,1.1676;6.3908,-0.6081,0.954;4.3827,-1.5237,1.0916;3.1871,-1.4454,1.3203;5.0078,-2.6757,0.7357;4.175,-3.8511,0.6263;4.8561,-4.6847,0.8171;3.412,-3.8186,1.4078;3.5421,-3.956,-0.7539;2.9848,-4.8963,-0.8387;4.3086,-3.9384,-1.5357;2.8469,-3.1284,-0.9207;5.6556,4.2623,1.5314;6.4651,4.302,0.2409;6.5766,3.2707,-0.1064;5.7418,5.0886,-0.868;6.3155,5.0435,-1.8012;5.6232,6.1463,-0.6047;4.7448,4.6782,-1.0618;7.8796,4.8617,0.4883;8.4676,4.8401,-0.4369;8.4117,4.2741,1.2441;7.8426,5.901,0.8366;5.2492,5.4381,2.1764;5.5145,6.3955,1.7357;4.5107,5.4159,3.3596;4.2098,6.3486,3.8281)|\",5.35520057784\r\n\"[H]C1=C([H])C(O/C([H])=C(\\[H])C(=O)OC([H])([H])C([H])([H])[H])=C(C([H])([H])[H])C([H])=C1[H] |(5.737,0.8457,5.8276;6.3404,0.912,4.9269;5.7199,0.8548,3.6772;4.6456,0.719,3.6007;6.4988,0.9443,2.5245;5.9413,0.8669,1.2533;4.7234,1.4094,1.0287;4.3581,2.1177,1.7685;4.0166,1.1226,-0.0751;4.3723,0.4131,-0.8132;2.7258,1.7923,-0.2849;2.2161,2.6073,0.466;2.1527,1.3716,-1.4417;0.8706,1.9502,-1.7711;0.3873,1.1987,-2.4009;0.2935,2.0788,-0.852;1.0304,3.2729,-2.5071;0.0485,3.6476,-2.8189;1.6507,3.148,-3.4009;1.4927,4.0201,-1.8561;7.8965,1.0645,2.5754;8.71,1.1429,1.3079;9.7734,1.2569,1.5376;8.4003,1.9899,0.6847;8.5863,0.2416,0.6962;8.4872,1.1093,3.8415;9.5683,1.2049,3.9079;7.7252,1.0403,5.0104;8.2143,1.0824,5.9793)|\",5.287172115215\r\n\"[H]C1=C([H])C(C(=O)N([H])[H])=C([H])C([H])=C1O/C([H])=C(\\[H])C(=O)OC([H])([H])C([H])([H])[H] 
|(6.6579,5.0968,-3.9214;6.0502,4.4899,-4.5844;5.4452,5.0594,-5.7012;5.5461,6.1204,-5.9039;4.704,4.2804,-6.5977;4.0954,4.9774,-7.7792;3.9567,6.1936,-7.8132;3.672,4.1714,-8.8131;4.0991,3.2678,-8.9579;3.3904,4.6677,-9.6488;4.5575,2.9082,-6.3425;3.9493,2.2888,-6.9953;5.1433,2.3273,-5.2229;5.025,1.2711,-5.0046;5.8865,3.122,-4.3475;6.472,2.4682,-3.2811;6.6644,3.1292,-2.1129;6.0936,4.0423,-1.9641;7.483,2.6502,-1.166;8.0561,1.7418,-1.3113;7.6014,3.381,0.1056;7.0169,4.4132,0.3871;8.4697,2.7526,0.9364;8.6814,3.3656,2.2285;9.6791,3.0323,2.5254;8.6815,4.4517,2.1086;7.6254,2.9243,3.2316;7.8536,3.3399,4.22;7.5993,1.8327,3.3146;6.6368,3.2806,2.929)|\",5.113019250895\r\n\"[H]C1=C([H])C(N(=O)=O)=C([H])C(O/C([H])=C(\\[H])C(=O)OC([H])([H])C([H])([H])[H])=C1[H] |(3.2068,-4.5097,-2.6182;4.2311,-4.259,-2.3607;5.2378,-5.2056,-2.5334;5.0357,-6.1981,-2.9148;6.543,-4.8414,-2.2067;7.6258,-5.8276,-2.3872;7.3185,-6.9336,-2.8286;8.7671,-5.4834,-2.0858;6.8705,-3.5853,-1.7098;7.8929,-3.3344,-1.4587;5.8422,-2.6594,-1.5403;6.2169,-1.4148,-1.0806;5.3535,-0.6972,-0.3159;4.5307,-1.2422,0.14;5.5373,0.6104,-0.0935;6.3571,1.1589,-0.5426;4.5975,1.3139,0.7962;3.6274,0.8114,1.3373;4.9505,2.6144,0.9281;4.1089,3.4279,1.7788;4.2504,4.4443,1.4029;3.0679,3.1281,1.6358;4.5201,3.3119,3.2393;3.9214,3.998,3.8496;5.5764,3.5696,3.3679;4.3556,2.2946,3.605;4.5229,-2.9826,-1.8742;3.7357,-2.2427,-1.7749)|\",4.166063051155001\r\n\"[H]C1=C([H])C(O/C([H])=C(\\[H])C(=O)OC([H])([H])C([H])([H])[H])=C([H])C(F)=C1[H] 
|(7.7463,0.0822,6.0141;7.8938,0.3949,4.9847;7.1589,-0.2158,3.9707;6.4378,-0.9981,4.1806;7.3512,0.1947,2.6496;6.6187,-0.4751,1.6872;6.2443,0.1745,0.5585;6.2915,1.2606,0.5769;5.7901,-0.4896,-0.5136;5.7429,-1.5721,-0.5382;5.3389,0.2781,-1.6843;5.3501,1.4934,-1.7834;4.9001,-0.5558,-2.6598;4.4357,0.0679,-3.8785;3.7282,-0.6524,-4.2972;3.9084,0.9907,-3.6248;5.5864,0.3383,-4.8371;5.1994,0.7504,-5.7763;6.1299,-0.5848,-5.0641;6.2827,1.063,-4.406;8.2725,1.1928,2.3275;8.4681,1.496,1.3056;8.9783,1.7833,3.3712;9.8693,2.7485,3.0697;8.8113,1.4085,4.6981;9.3899,1.8984,5.4733)|\",5.344316023820001\r\n\"[H]C1=C([H])C(O/C([H])=C(\\[H])C(=O)OC([H])([H])C([H])([H])[H])=C([H])C([H])=C1F |(8.5923,-3.2429,-4.4211;7.6855,-2.7568,-4.0773;7.6743,-1.9738,-2.9256;8.5745,-1.8276,-2.338;6.4867,-1.3666,-2.5137;6.559,-0.5656,-1.384;5.5097,-0.5392,-0.5291;4.7778,-1.3379,-0.624;5.399,0.4049,0.4169;6.1231,1.2058,0.5115;4.2649,0.3483,1.3511;3.3842,-0.4949,1.3518;4.3193,1.3857,2.224;3.2613,1.4598,3.2058;3.1996,2.522,3.4563;2.3253,1.1404,2.7407;3.5803,0.6168,4.4323;2.8025,0.7551,5.1924;4.5419,0.9094,4.8668;3.6186,-0.444,4.1691;5.3091,-1.5228,-3.2473;4.3987,-1.0176,-2.9413;5.3162,-2.3162,-4.3959;4.4176,-2.4572,-4.9869;6.5028,-2.9205,-4.792;6.5122,-3.6831,-5.9049)|\",5.325268054285001\r\n\"[H]/C(SC1([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C1([H])[H])=C(/[H])C(=O)OC([H])([H])C([H])([H])[H] 
|(5.8873,3.0334,-1.8143;4.9429,3.5259,-1.5877;3.5977,3.146,-2.6365;4.494,2.2173,-3.9856;5.1485,1.5178,-3.454;5.3347,3.1483,-4.878;6.0035,3.7619,-4.2633;5.9805,2.5089,-5.4987;4.4721,4.035,-5.7886;3.9077,4.7472,-5.1726;5.1221,4.6303,-6.4416;3.4932,3.1977,-6.6258;4.0636,2.5658,-7.3232;2.8629,3.8513,-7.2415;2.6171,2.3036,-5.7341;1.9667,1.6698,-6.3493;1.9525,2.9318,-5.1252;3.4753,1.4178,-4.8167;4.0553,0.7202,-5.4383;2.8472,0.8053,-4.1597;4.9015,4.3491,-0.5246;4.0114,4.8968,-0.2324;6.1241,4.5412,0.275;7.1953,3.9923,0.079;5.908,5.4357,1.2737;7.046,5.7531,2.1062;6.6071,6.0797,3.0526;7.6282,4.843,2.2705;7.9022,6.845,1.4813;8.7127,7.1215,2.1656;7.3039,7.7385,1.2747;8.3461,6.4935,0.5457)|\",5.020500541725\r\n\"[H]C1=C([H])C([H])=C(S/C([H])=C(\\[H])C(=O)OC([H])([H])C([H])([H])[H])S1 |(11.2509,-2.2321,3.404;10.3435,-1.6429,3.3754;9.4985,-1.3446,4.4112;9.6612,-1.6788,5.43;8.3833,-0.5574,4.0033;7.5999,-0.2147,4.6699;8.3913,-0.2689,2.66;7.2412,0.7479,1.7864;6.2152,-0.444,0.989;6.3876,-1.4841,1.2538;5.2496,-0.1266,0.1125;5.0399,0.8941,-0.192;4.4282,-1.2006,-0.4792;4.5345,-2.39,-0.2398;3.5179,-0.6792,-1.3397;2.6457,-1.6231,-2.0021;2.3756,-2.411,-1.2946;1.7567,-1.0382,-2.2521;3.2983,-2.2035,-3.2483;2.5904,-2.8595,-3.7681;4.1795,-2.7929,-2.9803;3.5994,-1.4073,-3.9371;9.8015,-0.9586,1.8812)|\",5.200095683055\r\n\"[H]OC1=C([H])C([H])=C(S/C([H])=C(\\[H])C(=O)OC([H])([H])C([H])([H])[H])C([H])=C1[H] 
|(6.6313,-7.3765,-2.3395;6.0068,-7.2669,-1.6055;6.1548,-6.0222,-1.0704;7.093,-5.1,-1.5487;7.7435,-5.3641,-2.3803;7.1967,-3.8433,-0.9558;7.9283,-3.1318,-1.3256;6.3627,-3.4925,0.1116;6.5339,-1.8948,0.909;5.4352,-0.8974,-0.0336;4.9264,-1.3866,-0.8605;5.2011,0.4035,0.209;5.6721,0.9465,1.0224;4.263,1.1424,-0.6545;3.6623,0.6844,-1.6105;4.1472,2.4297,-0.2359;3.258,3.2756,-0.9969;2.3977,2.6851,-1.3219;2.9284,4.0359,-0.2836;3.9729,3.9019,-2.1858;3.3038,4.6033,-2.6979;4.2752,3.1306,-2.9;4.8627,4.4508,-1.8599;5.4314,-4.426,0.5898;4.7887,-4.164,1.4246;5.323,-5.6819,0.0037;4.6038,-6.411,0.3627)|\",5.14839405146\r\n\"[H]C1=C(OC([H])([H])[H])C(OC([H])([H])[H])=C([H])C(/C([H])=C(\\[H])C([H])([H])[H])=C1Cl |(6.8673,1.4217,1.0582;6.4995,0.6542,0.3897;7.3636,-0.2833,-0.1734;8.7087,-0.323,0.0397;9.2861,0.6764,0.868;10.357,0.4675,0.8834;8.8909,0.6293,1.8911;9.1168,1.6807,0.4592;6.8302,-1.2674,-1.0391;7.637,-2.1806,-1.6718;8.2346,-3.1714,-0.8315;8.7998,-3.8279,-1.497;7.4648,-3.7574,-0.3118;8.9113,-2.7219,-0.0986;5.4746,-1.2671,-1.3193;5.1194,-2.0311,-2.0024;4.5702,-0.3283,-0.7769;3.1359,-0.349,-1.0945;2.5528,0.4578,-0.6584;2.4891,-1.2483,-1.8522;3.0318,-2.0793,-2.3021;1.0186,-1.2016,-2.1447;0.5196,-2.1281,-1.8288;0.831,-1.1001,-3.2229;0.5324,-0.3625,-1.6361;5.1352,0.6228,0.0872;4.1392,1.8631,0.8596)|\",4.80553059983\r\n\"[H]C1=C([H])C([H])=C([H])[C@@]2([H])C1NN(C([H])([H])C([H])([H])C([H])([H])SC([H])([H])C([H])([H])[H])C2=O 
|(9.9832,-6.7886,2.2815;10.2336,-6.2562,3.1938;10.2156,-6.8552,4.4144;9.989,-7.9152,4.4901;10.4064,-6.105,5.6542;10.2483,-6.6284,6.5934;10.7557,-4.8023,5.6572;10.9043,-4.238,6.5732;11.0621,-4.1446,4.3494;12.1655,-4.1626,4.2271;10.4709,-4.8373,3.1563;10.1108,-4.0103,2.2192;10.3443,-2.7216,2.7031;9.8638,-1.5706,1.952;8.8757,-1.8163,1.5456;9.7455,-0.7699,2.6875;10.8166,-1.1133,0.8355;11.8056,-0.9335,1.274;10.4596,-0.1471,0.4584;10.9193,-2.0997,-0.3323;9.9403,-2.2249,-0.8085;11.2243,-3.0925,0.0097;12.0296,-1.545,-1.6889;13.6935,-1.8526,-0.9647;14.3781,-1.3177,-1.6315;13.7568,-1.3602,0.012;14.0848,-3.3266,-0.8661;15.0933,-3.4251,-0.4442;14.0757,-3.8006,-1.8525;13.3992,-3.8844,-0.2193;10.7286,-2.6817,4.0385;10.8433,-1.6915,4.7357)|\",3.850410984575\r\n\"[H]C1=C([H])C([H])=C(N2[C@]3([H])C([H])([H])C4=C(C([H])=C([H])C([H])=C4[H])[C@@]2([H])C([H])([H])C3([H])[H])C([H])=C1[H] |(-2.5038,-0.9379,0.6582;-1.4958,-0.8007,0.2782;-1.2705,-0.1018,-0.9098;-2.1079,0.3167,-1.463;0.019,0.0751,-1.4025;0.1624,0.6418,-2.3168;1.1362,-0.4496,-0.718;2.434,-0.1994,-1.1891;2.7407,-0.2148,-2.6352;2.0904,0.4739,-3.1793;2.5622,-1.649,-3.1773;1.4879,-1.8715,-3.2431;2.9555,-1.7169,-4.2004;3.2296,-2.6818,-2.2801;3.7025,-2.3129,-1.0066;4.2847,-3.275,-0.1763;4.6471,-2.9817,0.8076;4.4025,-4.6009,-0.5929;4.852,-5.3408,0.0641;3.9365,-4.9704,-1.856;4.0236,-6.0006,-2.1911;3.3543,-4.0144,-2.6882;2.9897,-4.3054,-3.6714;3.6113,-0.8469,-0.5942;3.6054,-0.7434,0.4943;4.7515,-0.0261,-1.2391;4.8892,0.9062,-0.6822;5.7,-0.5703,-1.2362;4.2157,0.2594,-2.6678;4.7814,-0.2744,-3.4376;4.2733,1.3266,-2.9015;0.8973,-1.1621,0.4739;1.7219,-1.6088,1.017;-0.3991,-1.3249,0.9612;-0.5471,-1.8827,1.8828)|\",5.025942818735\r\n\"[H]C1=C([H])C(N(=O)=O)=C([H])C([H])=C1OC([H])([H])/C([H])=C(\\[H])C([H])([H])C([H])([H])C([H])([H])[H] 
|(7.1792,1.8112,5.7041;7.2043,2.779,5.2147;7.4687,3.936,5.9271;7.6554,3.9088,6.9936;7.497,5.1592,5.2505;7.7752,6.386,5.9917;7.9785,6.2926,7.2044;7.7912,7.448,5.3646;7.2612,5.2313,3.8792;7.2866,6.1972,3.3902;6.9929,4.069,3.1623;6.7925,4.1347,2.1013;6.9672,2.8319,3.8279;6.7442,1.6312,3.2398;6.5042,1.5459,1.8219;7.2515,2.1304,1.2705;6.6835,0.487,1.6061;5.1027,1.9325,1.4469;4.3177,1.4673,2.0437;4.7856,2.7383,0.4298;5.5919,3.1947,-0.1516;3.3907,3.0849,-0.013;3.2507,2.7311,-1.0454;2.6611,2.5446,0.6043;3.0954,4.5996,0.0359;3.8606,5.1348,-0.5439;2.142,4.787,-0.4738;3.037,5.161,1.4601;2.8409,6.2389,1.4549;3.9784,4.9922,1.9942;2.2384,4.6805,2.0385)|\",4.56062813438\r\n\"[H]C(=O)/C([H])=C(\\[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C#N |(3.8317,-4.3915,-1.264;4.5815,-3.845,-0.6462;5.5885,-4.4003,-0.249;4.2418,-2.4412,-0.371;4.9545,-1.8741,0.225;3.1117,-1.8913,-0.843;2.4426,-2.5213,-1.4356;2.6652,-0.4741,-0.6412;3.4089,0.0724,-0.0479;2.6216,0.0178,-1.6244;1.2737,-0.3548,0.0172;0.9689,0.6996,-0.0078;0.5371,-0.9019,-0.5884;1.2291,-0.8638,1.464;1.9696,-0.3143,2.0637;1.5347,-1.9188,1.4942;-0.1569,-0.7128,2.1012;-0.9026,-1.262,1.5131;-0.4637,0.3404,2.0895;-0.1834,-1.2288,3.5556;0.5482,-0.6833,4.1653;0.1073,-2.2865,3.5889;-1.5021,-1.0907,4.1805;-2.5542,-0.9725,4.6568)|\",5.232749345115\r\n\"[H]C1=C([H])C(=O)C(OC([H])([H])[H])=C([H])C1C1=NNC(=S)S1 |(4.6184,-3.7317,3.1367;4.2853,-2.907,2.5182;5.0017,-1.768,2.4081;5.9403,-1.6224,2.9335;4.5626,-0.6401,1.5782;5.2092,0.4047,1.5075;3.2796,-0.8375,0.8224;2.7612,0.0876,0.0106;3.3988,1.3402,-0.3063;2.7231,1.8041,-1.0263;4.3818,1.1806,-0.7525;3.5048,1.9653,0.5812;2.571,-2.0037,0.9643;1.6422,-2.1072,0.4118;3.0305,-3.0633,1.7961;2.32,-4.2569,1.935;2.8015,-5.2716,2.7375;2.0823,-6.3248,2.7878;0.9101,-6.2739,2.0134;-0.1795,-7.4966,1.9042;0.7988,-4.6952,1.1815)|\",1.967383139115\r\n\"[H]OC([H])([H])C([H])([H])/C([H])=C(\\[H])[C@@]([H])(C1=C([H])C([H])=C([H])C([H])=C1[H])[Si](C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] 
|(3.1531,-4.6393,4.2958;2.666,-5.4517,4.0895;2.9712,-5.7948,2.7425;4.0599,-5.8375,2.5817;2.5748,-6.8055,2.5981;2.3371,-4.8374,1.7174;2.5899,-5.1906,0.7084;1.2452,-4.9052,1.8228;2.7784,-3.408,1.8904;2.4331,-2.9048,2.7961;3.5485,-2.7297,1.0303;3.8787,-3.2414,0.1251;3.9801,-1.2941,1.1812;3.7187,-0.9639,2.197;5.4707,-1.0668,0.9857;6.1738,-1.5941,-0.1103;5.6551,-2.2056,-0.8435;7.5377,-1.3498,-0.2772;8.0568,-1.7734,-1.1333;8.2341,-0.5688,0.6462;9.2961,-0.3798,0.5161;7.5503,-0.0362,1.7399;8.0785,0.5703,2.4714;6.1877,-0.2857,1.9057;5.6686,0.129,2.7672;2.9152,-0.143,0.0337;3.0083,-0.7185,-1.7699;2.3838,-0.0776,-2.4042;2.6487,-1.7473,-1.8861;4.0315,-0.6694,-2.1584;3.5537,1.6355,0.1706;2.9531,2.3079,-0.4543;4.596,1.7136,-0.1557;3.4976,2.0051,1.2015;1.12,-0.2176,0.6367;0.4746,0.4138,0.0141;1.0323,0.139,1.6701;0.7277,-1.2395,0.6011)|\",5.989225849505\r\n\"[H]C1=C([H])C([H])=C([C@]([H])(/C([H])=C(\\[H])C([H])(C([H])([H])[H])C([H])([H])[H])[Si](C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C([H])=C1[H] 
|(4.4578,7.1957,3.0207;4.5926,6.3919,2.3021;4.3745,6.6155,0.942;4.0653,7.5977,0.5929;4.5434,5.5788,0.0239;4.3611,5.7643,-1.0325;4.9375,4.2954,0.4341;5.1363,3.1945,-0.5952;4.7145,3.5445,-1.548;4.4624,1.8925,-0.2394;4.656,1.5009,0.7605;3.6732,1.1785,-1.0506;3.4617,1.5576,-2.0545;3.018,-0.1347,-0.6973;3.3102,-0.3934,0.3304;1.4829,-0.0103,-0.741;1.0042,-0.9594,-0.4701;1.1409,0.2599,-1.7482;1.1293,0.7622,-0.0496;3.4986,-1.2644,-1.6283;3.0229,-2.2177,-1.367;4.5844,-1.3956,-1.5646;3.2488,-1.0441,-2.6742;7.0132,2.9023,-0.9974;7.8816,2.0287,0.4435;8.9359,1.849,0.1996;7.424,1.057,0.6618;7.8526,2.6321,1.3576;7.1156,1.8233,-2.5517;8.1592,1.6011,-2.8053;6.666,2.3263,-3.4167;6.5919,0.8716,-2.4106;7.8472,4.5704,-1.3321;8.9031,4.4287,-1.5933;7.8014,5.2247,-0.4548;7.369,5.0983,-2.1659;5.1544,4.0879,1.8068;5.4572,3.1079,2.1656;4.9824,5.1215,2.7286;5.1537,4.9312,3.7853)|\",5.9919469880100005\r\n\"[H]OC([H])([H])[C@@]([H])(ON([H])C1=C([H])C([H])=C([H])C([H])=C1[H])C([H])([H])C([H])([H])C([H])([H])C([H])=C([H])[H] |(7.2302,0.7273,-1.7705;7.8913,1.0388,-1.1315;7.1685,1.6722,-0.0892;7.9091,2.0251,0.6351;6.6085,2.5432,-0.462;6.1895,0.7191,0.6073;5.6642,1.2894,1.3942;5.2418,0.432,-0.4497;4.04,-0.2018,-0.0556;3.7863,0.106,0.8811;3.9794,-1.5967,-0.2402;4.7586,-2.2522,-1.2041;5.4847,-1.6857,-1.7745;4.5876,-3.6191,-1.4213;5.2014,-4.1151,-2.1689;3.6419,-4.3494,-0.6991;3.5138,-5.4132,-0.8762;2.8622,-3.6917,0.2546;2.1227,-4.2431,0.8295;3.0264,-2.3286,0.4883;2.4114,-1.8254,1.2315;6.8904,-0.5027,1.2274;7.1126,-1.2231,0.4334;7.8653,-0.1559,1.5984;6.1599,-1.1819,2.3948;5.9003,-0.4244,3.1506;5.22,-1.6351,2.0633;7.0152,-2.2802,3.0586;7.2589,-3.052,2.3171;7.9719,-1.8362,3.3758;6.3395,-2.911,4.2454;6.1076,-2.2371,5.0729;6.0017,-4.197,4.3457;5.5081,-4.593,5.2294;6.2088,-4.9055,3.5461)|\",5.532074580664999\r\n\"[H]C1=C([H])C2=C(C([H])=C1F)C([H])([H])[C@@]([H])(C([H])([H])[H])O2 
|(5.8483,0.5027,4.755;5.2809,0.099,3.9229;4.6162,0.9412,3.0249;4.6471,2.0199,3.135;3.9088,0.3516,1.9814;3.851,-1.0339,1.8152;4.5078,-1.8731,2.7077;4.4859,-2.9547,2.617;5.2169,-1.2801,3.7516;5.8672,-2.0792,4.6314;3.004,-1.3306,0.6019;3.5575,-1.88,-0.1688;2.1189,-1.9308,0.8497;2.6014,0.0934,0.1119;3.0356,0.3031,-0.8722;1.1017,0.3463,0.0812;0.622,-0.3001,-0.6632;0.8907,1.3883,-0.1782;0.6574,0.1363,1.0605;3.2134,1.0459,1.0343)|\",5.3334314698\r\n\"[H]OC1=C(C([H])([H])[C@@]([H])(O[H])C([H])([H])[H])C([H])=C(F)C([H])=C1[H] |(1.2624,-0.3465,-5.3523;1.6534,-0.1953,-4.4779;2.0192,-1.4073,-3.9234;2.6301,-1.3758,-2.6561;2.9168,-0.0775,-1.9242;3.6234,-0.3006,-1.1162;3.4157,0.6294,-2.5984;1.7036,0.6694,-1.3027;2.1257,1.4017,-0.6018;0.9944,1.4523,-2.255;0.9223,0.9317,-3.0729;0.7576,-0.2488,-0.5234;1.2864,-0.7948,0.2682;0.2889,-0.9857,-1.1855;-0.035,0.3488,-0.063;2.9928,-2.6026,-2.0899;3.4729,-2.6354,-1.1167;2.754,-3.7968,-2.7604;3.1203,-4.9596,-2.1765;2.1571,-3.824,-4.0115;1.9857,-4.7692,-4.5149;1.7889,-2.6096,-4.5921;1.3167,-2.6017,-5.5728)|\",5.662689228905\r\n\"[H]C1=C([H])C([H])=C(/C([H])=N/N([H])C2=C([H])C([H])=C(C([H])([H])N([H])[H])C([H])=C2[H])C([H])=C1[H] |(6.537,5.7473,-1.4127;6.2365,4.7217,-1.2163;7.153,3.6759,-1.379;8.1684,3.8904,-1.7027;6.7754,2.3611,-1.1291;7.483,1.548,-1.2538;5.4648,2.0635,-0.7083;5.0268,0.6961,-0.4351;3.9871,0.5664,-0.1015;5.8079,-0.3204,-0.5736;5.3254,-1.5406,-0.2992;4.3595,-1.6355,0.0091;6.0947,-2.6968,-0.4328;5.5111,-3.9379,-0.1275;4.4755,-3.9796,0.2055;6.2493,-5.1106,-0.2495;5.7964,-6.0658,-0.0021;7.5845,-5.0841,-0.6748;8.3791,-6.3649,-0.8371;9.4476,-6.1034,-0.9402;8.0954,-6.8608,-1.7767;8.0942,-7.3147,0.2467;8.4089,-6.9217,1.133;8.6241,-8.1726,0.1028;8.1539,-3.8388,-0.9673;9.1924,-3.7913,-1.2894;7.4308,-2.6535,-0.8562;7.8848,-1.6963,-1.0823;4.553,3.1209,-0.5477;3.5377,2.9042,-0.2225;4.935,4.4381,-0.7996;4.2157,5.2424,-0.6697)|\",3.910276031685\r\n\"[H]/N=C1/N=C(O[H])[C@]([H])(NO)/C(=N\\C([H])([H])C(=O)O[H])N1[H] 
|(6.8969,-0.7053,0.8611;6.8476,-1.6511,1.2493;5.647,-2.0824,1.2813;5.4137,-3.3649,1.7736;4.2414,-3.8572,1.7925;4.0672,-5.0988,2.2652;3.1611,-5.4056,2.0667;2.971,-3.1434,1.3408;2.2389,-3.1751,2.1575;2.3219,-3.9288,0.1806;1.9566,-5.0255,0.5249;3.2082,-1.7218,0.8883;2.2984,-0.8922,0.5391;0.8863,-1.2331,0.5671;0.6063,-1.8536,-0.2948;0.5699,-1.7763,1.4689;0.0265,0.0418,0.4925;-1.1736,0.0099,0.6074;0.7218,1.168,0.275;1.6689,0.8997,0.2442;4.5276,-1.3393,0.8417;4.6867,-0.3922,0.5181)|\",2.90073364633\r\n\"[H]/C(C(=O)OC([H])([H])C([H])([H])[H])=C(/[H])[C@]([H])(OC(=O)OC([H])([H])[H])C([H])([H])[H] |(5.5703,-0.4495,-1.1843;5.3086,0.1335,-0.3085;4.3289,1.2136,-0.5657;3.8553,1.4347,-1.6639;4.021,1.9283,0.5459;3.0679,3.0024,0.3765;3.3054,3.7042,1.1802;3.2463,3.4825,-0.5887;1.6363,2.4978,0.4826;0.9391,3.3419,0.4271;1.4748,1.9802,1.4339;1.41,1.811,-0.3377;5.8425,-0.1258,0.8898;5.5527,0.4794,1.7459;6.8305,-1.2111,1.1985;7.7717,-0.7636,1.54;7.1033,-1.9405,-0.0205;8.3167,-2.5102,-0.1091;9.1952,-2.4653,0.7252;8.3849,-3.1399,-1.2894;9.6338,-3.8034,-1.5432;10.4582,-3.0857,-1.5411;9.5224,-4.2564,-2.5284;9.8231,-4.5697,-0.7869;6.32,-2.1748,2.2744;6.0909,-1.6282,3.1955;7.0879,-2.9202,2.4987;5.4112,-2.6799,1.9329)|\",6.1497730213\r\n\"[H]C1=C([H])C([H])=C([C@@]2([H])N(C([H])([H])[H])O[C@]3([H])C(=O)N(C([H])([H])[H])C(=O)[C@]23[H])O1 
|(7.5288,-4.7758,3.3657;7.3082,-3.7894,2.987;8.0712,-2.7048,2.6813;9.1448,-2.6232,2.7792;7.1642,-1.7027,2.2054;7.4097,-0.7,1.8859;5.9132,-2.251,2.2526;4.5298,-1.7586,1.9559;3.8469,-2.6128,1.9808;3.9828,-0.7186,2.8735;4.2752,-0.8771,4.2902;3.7906,-1.7994,4.6251;3.8353,-0.0325,4.8259;5.3508,-0.9279,4.5077;4.7194,0.4817,2.4471;4.5421,0.4825,1.0394;5.3826,1.0319,0.6087;3.2118,1.1343,0.6132;2.884,2.2914,0.7541;2.427,0.1507,0.0155;1.082,0.4068,-0.4764;0.8194,-0.3872,-1.1766;1.0645,1.3819,-0.9668;0.3654,0.4136,0.351;3.0111,-1.1126,0.0127;2.4821,-2.1217,-0.4046;4.4124,-0.9973,0.6144;5.1496,-1.3375,-0.1154;5.9937,-3.5327,2.7369)|\",5.270845284185\r\n\"[H]OC([H])([H])[C@]1(C([H])([H])[H])C([H])([H])C([H])=C2C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])C([H])([H])C(=O)C([H])([H])[C@@]21[H] |(2.436,-2.6219,-2.4459;3.197,-2.6737,-3.0462;4.1985,-3.4583,-2.409;4.0379,-4.5299,-2.6161;5.1433,-3.178,-2.8879;4.3087,-3.2844,-0.875;5.542,-4.0649,-0.4006;5.4861,-5.1169,-0.7069;5.6268,-4.0398,0.6914;6.4673,-3.6492,-0.8175;2.9955,-3.7868,-0.2071;2.5029,-4.5721,-0.7986;3.1986,-4.2338,0.7787;2.169,-2.5354,-0.0581;1.1164,-2.5738,0.2038;2.8888,-1.409,-0.1984;2.3688,0.0152,-0.0078;0.8271,0.0278,0.0418;0.4675,1.0519,0.1931;0.4421,-0.5786,0.8682;0.3909,-0.3473,-0.8912;2.8935,0.578,1.3374;2.5499,1.6099,1.4812;3.9856,0.5811,1.4001;2.5141,-0.0232,2.1709;2.8017,0.9215,-1.2043;2.663,0.3526,-2.1322;2.1055,1.767,-1.2634;4.2237,1.5117,-1.158;4.3077,2.3165,-0.4214;4.4385,1.9651,-2.138;5.3448,0.5209,-0.878;6.3222,0.8372,-0.2223;5.1786,-0.8828,-1.4266;6.1811,-1.3107,-1.53;4.6915,-0.8668,-2.4059;4.3612,-1.7705,-0.453;4.885,-1.7171,0.5157)|\",6.05181203512\r\n\"[H]OC([H])([H])[C@]1(C([H])([H])[H])C([H])([H])[C@]2([H])C(=C(C([H])([H])[H])C([H])([H])[H])[C@@]1([H])C([H])([H])[C@@]2(O[H])C([H])=C([H])[H] 
|(8.0218,2.1091,-1.9499;7.7182,2.7254,-2.6332;6.4224,3.1833,-2.2536;6.4349,3.5879,-1.2301;6.1982,4.014,-2.9322;5.3078,2.1227,-2.3836;5.2811,1.605,-3.8286;5.1891,2.4487,-4.5242;4.4297,0.945,-4.0123;6.2071,1.0771,-4.0786;3.942,2.7613,-1.9416;4.0726,3.8166,-1.669;3.1845,2.7093,-2.7259;3.5288,1.9563,-0.6835;2.7743,2.4493,-0.0691;4.89,1.6768,-0.0628;5.3783,1.9256,1.1588;4.571,2.5955,2.2466;5.0774,3.5014,2.6083;4.4655,1.9333,3.1176;3.5689,2.879,1.9172;6.7858,1.5588,1.572;6.7739,0.8784,2.4353;7.3456,2.4494,1.8906;7.3519,1.072,0.7738;5.498,1.0094,-1.2912;6.5293,0.6529,-1.2114;4.482,-0.1434,-1.4824;4.723,-0.9653,-0.7992;4.4589,-0.553,-2.4958;3.0948,0.5128,-1.1139;2.2233,0.6119,-2.2386;1.9679,-0.2938,-2.4777;2.4272,-0.21,0.0343;3.0744,-0.4166,0.8861;1.1489,-0.5877,0.0682;0.7344,-1.1076,0.9272;0.4684,-0.3735,-0.7513)|\",6.64502022921\r\n\"[H]C1=C([H])C([H])=C([C@@]2([H])OC(C([H])([H])C([H])([H])[H])=NN(C([H])([H])[H])[C@]2([H])C([H])([H])[H])C([H])=C1[H] |(9.4673,2.414,-2.4441;8.423,2.1904,-2.646;8.047,1.6468,-3.8744;8.796,1.4422,-4.6347;6.7051,1.3556,-4.1244;6.4183,0.9211,-5.08;5.7219,1.6119,-3.1612;4.2725,1.3094,-3.4857;4.2424,0.7422,-4.4218;3.6858,0.4159,-2.5123;2.9502,0.9707,-1.4994;2.508,-0.0509,-0.4915;1.7932,0.4383,0.1759;1.9795,-0.8572,-1.0175;3.6761,-0.6493,0.3113;3.3098,-1.4016,1.0181;4.1952,0.1293,0.8809;4.4029,-1.1267,-0.3529;2.6526,2.2095,-1.3636;3.1679,3.1056,-2.3035;2.4932,4.3906,-2.2227;3.0277,5.1099,-2.8521;2.5286,4.7335,-1.186;1.4377,4.3569,-2.5373;3.3737,2.5563,-3.6446;3.9394,3.3041,-4.213;2.077,2.2198,-4.4056;2.3066,1.8599,-5.4151;1.441,3.105,-4.5048;1.5035,1.4441,-3.8891;6.1089,2.158,-1.9286;5.3593,2.3639,-1.1735;7.4507,2.4413,-1.6752;7.738,2.8619,-0.715)|\",5.074923311825\r\n\"[H]OC(NN(C([H])([H])[H])[C@@]([H])(C([H])([H])[H])[C@]([H])(O[H])C1=C([H])C([H])=C([H])C([H])=C1[H])C([H])([H])C([H])([H])[H] 
|(0.3343,1.9697,2.4425;-0.1621,1.4173,1.8199;0.6268,0.3728,1.4325;0.0596,-0.4775,0.6666;0.9189,-1.5537,0.2268;1.4577,-1.2253,-1.0967;0.7011,-0.8799,-1.8128;1.9545,-2.1114,-1.5039;2.2117,-0.4369,-0.9866;0.1793,-2.834,0.3282;0.7679,-3.5555,-0.2561;0.1864,-3.2804,1.7961;-0.3009,-4.2505,1.9131;-0.3531,-2.5552,2.4159;1.217,-3.3438,2.16;-1.2806,-2.8288,-0.2249;-1.8375,-2.0789,0.3393;-1.9317,-4.0729,0.0823;-1.5853,-4.7453,-0.5253;-1.4081,-2.516,-1.7072;-0.8891,-3.3874,-2.6762;-0.3483,-4.2832,-2.3725;-1.0426,-3.1216,-4.0369;-0.6335,-3.8097,-4.7723;-1.7184,-1.9723,-4.4523;-1.8385,-1.7624,-5.5118;-2.239,-1.0966,-3.498;-2.7671,-0.1999,-3.8121;-2.0864,-1.3695,-2.1374;-2.483,-0.6819,-1.3956;2.0447,0.3783,1.9555;2.0005,0.4077,3.0545;2.4998,-0.5717,1.6689;2.8822,1.5649,1.4444;3.8938,1.5233,1.8608;2.4495,2.5328,1.7265;2.9639,1.549,0.3528)|\",5.042269649765\r\n\"[H]C(C1=C([H])C([H])=C(C([H])([H])[H])O1)N(O)C([H])([H])[H] |(6.6241,0.6732,0.2539;5.674,1.1734,0.3945;4.4906,0.3939,0.2063;4.3701,-0.9338,-0.1386;5.1875,-1.6155,-0.3281;2.9742,-1.217,-0.1944;2.5094,-2.1627,-0.4372;2.3184,-0.0601,0.1159;0.8841,0.3148,0.2329;0.2557,-0.5527,0.0142;0.6171,1.116,-0.468;0.6397,0.6683,1.2428;3.234,0.9331,0.3635;5.7908,2.4481,0.7246;6.9257,3.0174,0.8577;4.6094,3.3162,0.9653;4.0098,2.9151,1.7844;3.9931,3.3682,0.0659;5.0203,4.2906,1.218)|\",3.787824798960001\r\n\"[H]C(C1=C([H])C([H])=C(C([H])([H])[H])S1)N(O)C([H])([H])[H] |(6.7017,0.6836,0.0536;5.8538,1.3258,0.2645;4.551,0.7343,0.16;4.3551,-0.5973,-0.1657;5.1829,-1.2734,-0.3515;2.9912,-0.9821,-0.231;2.6631,-1.9884,-0.4716;2.1123,0.0349,0.0373;0.6142,0.0001,0.0598;0.267,-1.0142,-0.1603;0.1753,0.6737,-0.6869;0.2097,0.2876,1.0382;2.9821,1.5133,0.3828;6.2173,2.5563,0.5838;7.4419,2.919,0.6164;5.2343,3.6125,0.9229;4.6015,3.2904,1.7539;4.6128,3.8427,0.0531;5.8283,4.4782,1.2063)|\",3.659931289225\r\n\"[H]C1=C(F)C2=C3C(=N1)/C([H])=C(/[H])C(=O)N3C([H])([H])[C@]21OC1([H])[H] 
|(-3.1432,-2.0068,-0.0024;-2.2468,-1.3913,-0.0062;-2.391,0.0114,-0.0734;-3.6256,0.5379,-0.143;-1.2689,0.8136,-0.0807;-0.0522,0.1368,-0.0194;0.0487,-1.2561,0.0435;-1.071,-2.0124,0.0502;1.3909,-1.7807,0.0914;1.5258,-2.8573,0.1464;2.4562,-0.9341,0.066;3.4739,-1.3076,0.0987;2.3448,0.5355,-0.0153;3.2846,1.3205,-0.0597;1.0212,0.984,-0.0401;0.5775,2.3838,-0.135;0.9248,2.9614,0.7265;0.9784,2.8417,-1.0444;-0.9704,2.2772,-0.133;-1.6916,3.1701,0.7219;-1.8656,3.3007,-0.6955;-2.8433,3.0058,-1.075;-1.4354,4.1986,-1.1402)|\",4.27762972986\r\n\"[H]C1=N[C@@]2([H])/C(=N\\C(=O)/C([H])=C\\2[H])C([H])=C1F |(2.5271,1.8745,2.4993;2.1505,0.8711,2.2942;2.5631,-0.1028,3.013;1.9906,-1.4243,2.7513;1.1646,-1.494,3.4888;1.3252,-1.6579,1.397;1.1823,-2.811,0.8346;1.8154,-3.942,1.4194;1.5371,-5.0629,1.0329;2.8457,-3.7123,2.4661;3.4787,-4.56,2.7088;2.9466,-2.5337,3.0891;3.6663,-2.3378,3.8794;0.7967,-0.4749,0.737;0.1775,-0.5898,-0.1456;1.2066,0.7265,1.1849;0.8224,1.8698,0.6038)|\",3.872180092615\r\n\"[H]/C(=C(/[H])C([H])([H])C([H])([H])[C@@]([H])(C([H])([H])[H])C(C(=O)OC([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C([H])([H])[H] 
|(3.6363,1.4266,0.3447;2.9014,0.668,0.6224;3.3068,-0.5975,0.757;2.5724,-1.357,1.0371;4.7178,-1.086,0.5683;5.3494,-0.2516,0.2408;4.7426,-1.834,-0.24;5.294,-1.7337,1.846;4.5845,-2.5004,2.18;5.3235,-0.9771,2.6411;6.6945,-2.3573,1.6548;6.6598,-2.9674,0.7422;7.7562,-1.2633,1.4377;7.5229,-0.665,0.5512;8.7558,-1.6849,1.2869;7.8093,-0.5788,2.292;7.0889,-3.3493,2.8116;8.4856,-3.9116,2.5113;9.4598,-3.8147,3.2266;8.5163,-4.5674,1.324;9.7835,-5.1426,0.9702;9.6279,-5.6222,0.0032;10.0952,-5.8769,1.7179;10.5518,-4.3682,0.8957;7.1322,-2.6684,4.1906;6.1458,-2.272,4.4511;7.8573,-1.8513,4.2207;7.4253,-3.3865,4.9622;6.1191,-4.5564,2.8646;5.1369,-4.2537,3.2399;6.5012,-5.326,3.5455;5.9903,-5.0108,1.8778;1.4934,1.1544,0.8176;1.4403,1.918,1.6056;0.8202,0.3359,1.0944;1.1014,1.6207,-0.0967)|\",6.887201556155\r\n\"[H]C(=O)[C@@]1([H])C([H])([H])C2=C(SC([H])=C2[H])[C@@](C([H])=C(C([H])([H])[H])C([H])([H])[H])(C([H])([H])[H])C1([H])[H] |(5.3715,0.9923,-4.8878;6.0778,0.2715,-4.4129;7.0579,-0.1117,-5.0137;5.696,-0.1412,-3.0076;4.6852,-0.5759,-3.0838;6.6643,-1.1664,-2.4089;7.6954,-0.8453,-2.6104;6.5536,-2.1309,-2.9199;6.4391,-1.3257,-0.926;5.6866,-0.4448,-0.1893;5.6257,-0.9321,1.4942;6.6259,-2.3169,1.1816;6.896,-2.9835,1.9896;6.9787,-2.3926,-0.1334;7.6024,-3.1822,-0.541;4.9456,0.7774,-0.7173;3.4506,0.4286,-0.8326;3.2161,-0.5826,-0.5045;2.4069,1.1625,-1.2566;1.0188,0.5598,-1.25;0.5804,0.5659,-2.258;0.3399,1.145,-0.6139;1.0202,-0.4716,-0.8855;2.4335,2.5881,-1.7563;1.8017,3.2243,-1.1209;2.0071,2.6463,-2.7674;3.4267,3.0362,-1.787;5.1735,1.9771,0.2368;4.7358,2.8961,-0.1626;6.245,2.1464,0.3891;4.716,1.7929,1.2143;5.5745,1.1056,-2.105;4.9758,1.8731,-2.6056;6.5755,1.53,-1.9488)|\",5.379690824385\r\n\"[H]C([H])=C([H])C([H])([H])[C@@]12C(=O)C([H])([H])C([H])([H])[C@@]1([H])C([H])([H])[C@]([H])(C([H])([H])[H])C([H])([H])C2=O 
|(0.6297,1.4889,2.7697;1.4362,0.761,2.7585;1.2581,-0.1675,3.299;2.5842,0.9997,2.1229;2.725,1.9366,1.5869;3.7416,0.0353,2.077;3.4541,-0.9088,2.5503;4.5731,0.4341,2.6765;4.2614,-0.2126,0.6166;5.0837,1.0366,0.2529;4.6538,2.1189,-0.0745;6.5745,0.7068,0.4727;7.0691,0.7376,-0.5074;7.0454,1.4808,1.0866;6.6109,-0.7163,1.0697;6.586,-0.6857,2.1659;7.5067,-1.2757,0.781;5.3264,-1.345,0.5014;5.5126,-1.4534,-0.5796;4.8177,-2.6976,0.9944;5.6016,-3.4585,0.8856;4.5798,-2.6542,2.0647;3.5649,-3.1336,0.1843;3.0879,-3.9599,0.7277;3.9335,-3.6633,-1.2123;4.6699,-4.4725,-1.1432;4.3515,-2.8824,-1.8582;3.047,-4.0592,-1.7201;2.5129,-1.987,0.0795;2.0086,-1.863,1.0468;1.7439,-2.2344,-0.6584;3.1144,-0.6313,-0.3069;2.7615,-0.01,-1.2867)|\",5.632756705349999\r\n\"[H]OC(=O)[C@]1([H])N([H])C([H])([H])C2=C(C([H])=C(Cl)C([H])=C2[H])C1([H])[H] |(-4.3455,-0.2023,0.507;-4.7339,0.7042,0.4733;-3.7155,1.524,0.1547;-3.8323,2.7233,0.081;-2.4056,0.7784,-0.141;-2.4284,0.5437,-1.2161;-2.4072,-0.5193,0.5578;-2.2635,-0.34,1.5542;-1.3017,-1.3604,0.0797;-1.2372,-2.2444,0.7239;-1.5728,-1.7252,-0.921;0.0397,-0.6408,0.0258;0.1031,0.7636,0.0456;1.352,1.3948,-0.0234;1.4136,2.4789,-0.0098;2.5151,0.6376,-0.1119;4.0723,1.4513,-0.1963;2.4676,-0.7563,-0.1357;3.3823,-1.3352,-0.2029;1.2251,-1.3798,-0.0663;1.1776,-2.4668,-0.0815;-1.1561,1.6088,0.159;-1.1087,2.4698,-0.5154;-1.2374,2.0295,1.1719)|\",6.05181203512\r\n\"[H]O[C@]([H])(C1=C([H])C([H])=C([H])C([H])=C1[H])[C@]([H])(C(=O)OC(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])N([H])[H] 
|(8.4192,-0.1623,0.1158;8.0793,0.7467,0.1211;6.7865,0.7258,-0.4879;6.827,0.2268,-1.4654;6.3317,2.1556,-0.6818;5.5057,2.4827,-1.7655;5.2262,1.7099,-2.4769;5.0429,3.7877,-1.9398;4.4098,4.0252,-2.7908;5.4047,4.7863,-1.0336;5.0499,5.8043,-1.1711;6.2355,4.4711,0.0428;6.5322,5.2448,0.7463;6.6963,3.1651,0.2188;7.3658,2.9302,1.0391;5.825,-0.1202,0.4268;6.2531,-1.134,0.4303;4.4712,-0.2794,-0.2637;4.3632,-0.8609,-1.3275;3.4829,0.3143,0.415;2.0737,0.2664,-0.0347;1.5964,-1.1889,-0.058;0.5229,-1.2173,-0.2762;1.7578,-1.6591,0.9181;2.1228,-1.7652,-0.8208;1.9295,0.9505,-1.3972;0.8659,1.0402,-1.6461;2.4254,0.3773,-2.1819;2.3608,1.9564,-1.3659;1.3535,1.0626,1.0554;1.7281,2.0906,1.0943;1.5072,0.6011,2.036;0.2783,1.0933,0.8494;5.7292,0.33,1.805;5.1768,1.1835,1.8505;6.6693,0.5624,2.1238)|\",6.087186835684999\r\n\"[H]OC(=O)[C@@]([H])(/N=C(/O[H])C([H])([H])[H])C([H])([H])OC([H])([H])C([H])([H])/N=C(/O[H])C([H])([H])[H] |(6.526,0.8456,-8.4244;6.7591,1.0166,-7.4764;5.6486,0.7325,-6.7869;5.571,0.8152,-5.5823;4.4616,0.2565,-7.6575;3.6061,0.8909,-7.3883;4.8483,0.3563,-9.0502;4.0229,0.5377,-9.9982;4.5687,0.6297,-11.2427;3.8626,0.7534,-11.8957;2.5224,0.7008,-9.9513;2.1224,0.5407,-8.9526;2.0491,-0.0337,-10.6139;2.2468,1.7029,-10.3048;4.122,-1.1843,-7.2462;4.0431,-1.2139,-6.1522;4.9372,-1.8559,-7.559;2.9082,-1.5532,-7.8766;2.6002,-2.9408,-7.8152;1.8274,-3.1171,-8.5693;3.4786,-3.5456,-8.0861;2.0682,-3.3824,-6.4475;1.2165,-2.7346,-6.1842;2.8375,-3.2186,-5.6757;1.675,-4.7775,-6.5426;1.3153,-5.3939,-5.4958;0.96,-6.7038,-5.6487;0.7084,-7.0721,-4.7895;1.2463,-4.8296,-4.0889;2.2499,-4.5626,-3.7372;0.8105,-5.5293,-3.3684;0.646,-3.9135,-4.0764)|\",6.617808844160001\r\n\"[H]OC(=O)[C@@]([H])(/N=C(/O[H])C([H])([H])[H])C([H])([H])OC([H])([H])C([H])([H])C1([H])C([H])([H])C1([H])[H] 
|(5.3663,-3.219,1.263;5.192,-3.8104,2.0341;3.9143,-3.5823,2.372;3.3577,-4.1542,3.2795;3.1935,-2.5429,1.4845;2.7493,-1.8035,2.1645;4.1486,-1.9673,0.5507;4.0699,-0.7457,0.1869;4.976,-0.2695,-0.6952;5.5576,-1.0172,-0.9292;3.0851,0.3099,0.6131;2.0708,-0.0168,0.373;3.3052,1.2532,0.1112;3.1403,0.4641,1.6968;2.0436,-3.2583,0.761;1.4465,-3.8005,1.5078;2.4582,-3.9905,0.0488;1.2617,-2.2949,0.0818;0.1816,-2.8665,-0.6448;0.5621,-3.5892,-1.3848;-0.4819,-3.4199,0.041;-0.5845,-1.7473,-1.3422;0.1045,-1.2164,-2.0125;-0.913,-1.0217,-0.5859;-1.775,-2.2624,-2.127;-2.5301,-2.7605,-1.5192;-1.5917,-2.8128,-3.5233;-0.5919,-2.7897,-3.951;-2.1807,-3.6747,-3.8246;-2.2991,-1.4953,-3.3158;-3.3734,-1.4564,-3.4732;-1.7774,-0.5856,-3.6033)|\",6.85182675559\r\n\"[H]OC(=O)[C@@]([H])(/N=C(/O[H])C([H])([H])[H])C([H])([H])OC([H])([H])C#CC([H])([H])[H] |(7.1927,2.5676,2.6301;6.2094,2.6224,2.7048;5.7762,2.8672,1.4604;4.6066,2.9711,1.1735;6.8967,2.9829,0.4034;6.7497,3.9474,-0.1012;8.187,2.8655,1.0596;9.2469,3.4121,0.6023;10.4089,3.2199,1.2681;10.2076,2.6242,2.0136;9.4594,4.2694,-0.6129;8.5248,4.688,-0.9859;9.8873,3.6556,-1.4112;10.1621,5.0717,-0.3721;6.6833,1.877,-0.6402;5.65,1.9364,-1.0104;6.8267,0.8917,-0.1662;7.6203,2.0737,-1.6815;7.5584,1.051,-2.671;7.732,0.0658,-2.2052;6.5508,1.0206,-3.1187;8.5601,1.2989,-3.7049;9.3796,1.4805,-4.5747;10.3696,1.6999,-5.6259;10.9791,0.8035,-5.7916;11.0494,2.5185,-5.3605;9.8908,1.9602,-6.5774)|\",6.881759279144999\r\n\"[H]O/C(=N/[C@@]([H])(C(=O)OC([H])([H])C([H])([H])[H])C([H])([H])OC([H])([H])C([H])([H])OC([H])([H])[H])C([H])([H])[H] 
|(-0.0873,1.4786,0.1661;0.5872,1.1745,-0.46;1.715,0.8197,0.2304;2.6771,0.3937,-0.4745;3.9479,0.0244,0.1077;3.9,-0.2412,1.1725;4.4646,-1.2429,-0.5813;4.63,-2.3015,-0.0128;4.7287,-1.0259,-1.8832;5.1971,-2.1606,-2.6517;4.6749,-3.0563,-2.307;4.894,-1.9341,-3.6766;6.7055,-2.3228,-2.5367;7.0427,-3.1386,-3.1865;6.989,-2.5641,-1.5084;7.2203,-1.4058,-2.8422;4.9554,1.1839,-0.0755;4.891,1.5364,-1.1074;5.9806,0.8317,0.1154;4.6473,2.2881,0.7596;5.2279,2.2313,2.0515;4.8364,1.3907,2.6453;6.3199,2.1087,1.9855;4.902,3.5416,2.7536;3.8082,3.6754,2.8077;5.3081,4.3848,2.1714;5.4727,3.4861,4.0442;5.2565,4.6648,4.7899;5.7,5.5454,4.298;5.7339,4.523,5.7629;4.1825,4.8591,4.9431;1.603,1.0174,1.7273;0.7081,0.5117,2.1113;1.5128,2.0865,1.9548;2.4669,0.632,2.268)|\",6.680395029775\r\n\"[H]C([H])([H])OC(=O)C([H])([H])C(=O)C([H])([H])[C@@]1([H])C(C([H])([H])[H])N=C(C([H])([H])[H])NC1=O |(1.952,-3.2724,-5.8438;2.4704,-2.4691,-6.3733;3.3974,-2.8349,-6.8149;1.8092,-2.0624,-7.1429;2.8648,-1.4382,-5.4515;1.8583,-0.8459,-4.7858;0.6848,-1.0928,-4.9541;2.383,0.1596,-3.7793;3.3563,0.5606,-4.0878;1.677,0.9935,-3.7057;2.5918,-0.4673,-2.4043;2.6582,-1.6694,-2.2387;2.7044,0.5031,-1.2331;1.6877,0.6961,-0.8637;3.1026,1.4593,-1.5825;3.6053,-0.0436,-0.1204;3.697,0.7444,0.6494;3.1255,-1.2686,0.6436;1.6569,-1.4977,0.8478;1.1749,-0.6034,1.2647;1.5107,-2.3375,1.5292;1.1736,-1.7199,-0.1094;3.9597,-2.0783,1.2007;5.3379,-1.817,1.0297;6.2102,-2.6214,1.944;6.0657,-3.6893,1.7396;5.9057,-2.4615,2.9854;7.2594,-2.3558,1.8101;5.8848,-1.0135,0.1724;5.026,-0.2781,-0.6485;5.3899,0.1907,-1.7111)|\",4.481715117735\r\n\"[H]C1=C([H])C([H])=C([H])C(NNN([H])C2=C([H])C([H])=C(C([H])([H])[H])C([H])=C2[H])=N1 
|(10.6775,-2.7526,2.0923;9.6478,-3.0063,1.8438;9.2179,-4.3322,1.8935;9.9024,-5.1254,2.1781;7.8879,-4.6,1.5598;7.5062,-5.6176,1.5772;7.0557,-3.5487,1.1982;6.0196,-3.7124,0.9269;7.5889,-2.246,1.187;6.8578,-1.087,0.8292;5.6108,-1.2823,0.7157;4.9645,-0.1797,0.3518;5.5376,0.6531,0.2132;3.5753,-0.125,0.1919;2.7526,-1.2337,0.4263;3.1927,-2.1711,0.7446;1.376,-1.1154,0.2488;0.749,-1.9841,0.4357;0.7795,0.0847,-0.1578;-0.7152,0.2,-0.3461;-1.2206,-0.7393,-0.1003;-0.9725,0.4538,-1.3823;-1.1424,0.9841,0.2915;1.6223,1.1816,-0.3861;1.1952,2.1304,-0.7029;2.9994,1.0868,-0.2166;3.6322,1.9517,-0.4026;8.8654,-1.9801,1.4974)|\",3.90211261617\r\n\"[H]C1=C([H])C(=C2N([H])[C@]3([H])N([H])ON([H])[C@@]3([H])N([H])C2([H])[H])C([H])=C([H])C1=O |(-0.8721,2.3609,-0.792;-0.5288,1.3513,-0.5859;0.605,1.1223,0.1202;1.1591,1.9857,0.4838;1.0754,-0.2156,0.4246;2.2581,-0.4479,1.1163;3.1374,0.5569,1.4405;2.851,1.511,1.2652;4.0349,0.3381,2.562;3.463,0.2825,3.4977;5.1584,1.2712,2.7081;5.0957,1.6775,3.6435;6.3215,0.4267,2.7883;5.8013,-0.8728,3.3164;6.5822,-1.484,3.06;4.7135,-1.0073,2.3415;5.1264,-1.0204,1.3145;3.6834,-2.0019,2.5467;4.0667,-2.9457,2.5237;2.7229,-1.8779,1.4256;3.1506,-2.2711,0.4842;1.8559,-2.4916,1.6715;0.252,-1.3032,-0.065;0.5577,-2.3275,0.1252;-0.8818,-1.0922,-0.7784;-1.4812,-1.922,-1.1414;-1.3634,0.2561,-1.1008;-2.3962,0.4588,-1.7479)|\",3.5538068875299995\r\n\"[H]O/C(=N/C1=C([H])C([H])=C([H])C([H])=C1[H])[C@]1([H])[C@@]([H])(N([H])[H])N([H])N([H])[C@]1([H])N([H])C([H])([H])[H] 
|(4.7317,-1.2114,4.2804;3.9373,-0.6369,4.1404;3.4206,-0.8685,2.9154;2.372,-0.2227,2.591;1.729,-0.3748,1.3569;1.8483,0.6323,0.3806;2.4632,1.5006,0.6021;1.1642,0.5315,-0.8308;1.2725,1.3195,-1.5721;0.3356,-0.5625,-1.0898;-0.2011,-0.6358,-2.0312;0.1928,-1.5538,-0.1163;-0.4602,-2.4039,-0.2989;0.8769,-1.464,1.0959;0.7467,-2.2243,1.8614;4.1548,-1.8988,2.0524;3.4698,-2.277,1.2911;4.7455,-3.0459,2.9109;4.0696,-3.3226,3.7276;5.102,-4.2826,2.2312;5.5352,-4.1056,1.3262;4.275,-4.8538,2.0682;5.9301,-2.3711,3.4906;6.6468,-3.0703,3.6792;6.4663,-1.4218,2.5073;6.4129,-0.5067,2.9601;5.4626,-1.3517,1.4183;5.7744,-2.0661,0.6315;5.3734,-0.0037,0.9098;4.4612,0.1459,0.4861;6.438,0.3534,-0.0276;6.2814,1.3801,-0.3712;6.4937,-0.3074,-0.9111;7.4047,0.3109,0.4828)|\",5.602824181795\r\n\"[H]C1=NN=C2C([H])=C([H])C(/C([H])=C(\\[H])C3=C([H])C([H])=C([H])C([H])=C3[H])=C([H])N12 |(8.9835,0.716,-0.3995;8.3454,1.581,-0.5069;8.742,2.826,-0.6595;7.6396,3.6304,-0.7617;6.5746,2.8454,-0.6675;5.1822,3.1228,-0.7073;4.8667,4.1518,-0.8364;4.2914,2.0961,-0.5846;3.2288,2.309,-0.6188;4.7268,0.7296,-0.4127;3.8039,-0.396,-0.2764;4.2864,-1.3639,-0.1509;2.4568,-0.3335,-0.2854;1.9783,0.6383,-0.3925;1.517,-1.4505,-0.1541;1.9082,-2.8039,-0.1347;2.9568,-3.0674,-0.2373;0.9649,-3.8176,-0.0007;1.2902,-4.8545,0.0089;-0.3948,-3.5103,0.1143;-1.1282,-4.3052,0.2173;-0.801,-2.176,0.0899;-1.8546,-1.9241,0.1755;0.144,-1.1614,-0.0451;-0.1808,-0.1235,-0.0633;6.0782,0.4793,-0.3768;6.4995,-0.5119,-0.2536;6.9731,1.5157,-0.5023)|\",4.04361181843\r\n\"[H]C1=C([H])C([H])=C(C([H])([H])N2/C(=C(/C#N)N(=O)=O)N([H])C([H])([H])C2([H])[H])O1 
|(3.7845,-0.7426,4.3552;3.1299,-0.4958,3.5332;2.1263,0.4072,3.3684;1.7653,1.1104,4.1056;1.6481,0.2325,2.0282;0.8469,0.7682,1.538;2.3996,-0.7608,1.4739;2.3933,-1.409,0.1327;3.4078,-1.4629,-0.2755;1.7768,-0.8219,-0.5419;1.9015,-2.8069,0.1315;0.7076,-3.2541,0.6094;-0.5433,-2.625,0.7812;-1.5335,-3.3315,1.5095;-2.27,-4.014,2.1032;-0.9539,-1.3892,0.2175;-0.2494,-0.8553,-0.6602;-2.0281,-0.919,0.6071;0.845,-4.5553,1.0102;0.0135,-5.1244,1.1087;2.0998,-5.1223,0.527;2.5339,-5.8125,1.2546;1.9572,-5.6524,-0.4251;2.9295,-3.8495,0.3447;3.6058,-3.8957,-0.513;3.5092,-3.6112,1.245;3.317,-1.2189,2.3899)|\",4.715733029165\r\n\"[H]C([H])([H])[C@]1([H])C([H])([H])C(=O)[C@@]2([H])C([H])([H])C([H])([H])S(=O)(=O)[C@@]12[H] |(0.4186,-0.1377,1.0041;0.8728,-0.037,0.0122;0.4805,-0.8411,-0.6205;0.5349,0.9182,-0.4098;2.404,-0.1072,0.1138;2.691,-1.0741,0.5371;2.99,1.0489,0.9488;3.9047,0.7302,1.4674;2.3123,1.4469,1.7107;3.3863,2.1348,-0.0485;3.7095,3.2718,0.2125;3.3539,1.557,-1.4817;2.5444,2.0623,-2.0225;4.6948,1.7681,-2.2236;4.5576,1.6059,-3.2975;5.0428,2.7941,-2.0714;5.6903,0.7501,-1.6636;6.0791,1.0349,-0.6818;6.5178,0.5074,-2.333;4.7071,-0.7822,-1.434;4.7113,-1.5386,-2.6987;5.0741,-1.4386,-0.1657;3.0525,0.0623,-1.2836;2.459,-0.3858,-2.0839)|\",5.90487055585\r\n\"[H]C1=C([H])C2=C(C([H])=C1Cl)C([H])=C(C1=N[C@]([H])(C([H])([H])[H])N([H])O1)N2[H] |(7.3885,3.9593,6.9787;6.6554,3.8237,6.1913;6.4024,2.5591,5.6809;6.9331,1.6918,6.0627;5.446,2.4432,4.6646;4.743,3.5766,4.1573;5.0147,4.8515,4.6893;4.4991,5.7344,4.3268;5.9628,4.9477,5.6924;6.3296,6.5282,6.3837;3.8536,3.0993,3.1452;3.1656,3.6775,2.5454;4.0356,1.735,3.0659;3.4076,0.7361,2.2274;3.619,-0.5247,2.2833;2.8667,-1.083,1.151;2.2778,-1.9506,1.4713;3.8053,-1.4744,0.0115;4.5367,-2.2087,0.3629;4.3413,-0.5932,-0.3555;3.2351,-1.9046,-0.8179;1.9202,-0.0093,0.7236;1.0786,-0.0901,1.3029;2.5332,1.1981,1.2794;4.9953,1.3412,3.9816;5.2867,0.3807,4.0913)|\",4.5633492728850005\r\n\"[H]C1=C(F)C([H])=C2C(=C1[H])N([H])/C(C1=N[C@]([H])(C([H])([H])[H])N([H])O1)=C\\2[H] 
|(7.3918,3.914,6.9069;6.6744,3.7697,6.106;6.0189,4.8962,5.5759;6.3256,6.1053,6.1023;5.0906,4.8239,4.5583;4.6147,5.7256,4.1873;4.7959,3.5492,4.035;5.4609,2.4044,4.5701;6.3994,2.5056,5.605;6.8982,1.6281,6.0058;4.9956,1.3041,3.8903;5.2585,0.3379,4.0191;4.0636,1.71,2.954;3.4255,0.7174,2.1162;3.6118,-0.547,2.1844;2.8582,-1.1011,1.0521;2.2468,-1.9516,1.3765;3.7965,-1.5294,-0.0743;4.5075,-2.2761,0.2923;4.3561,-0.6664,-0.4492;3.2222,-1.9573,-0.9023;1.9403,-0.0115,0.6017;1.0875,-0.0702,1.1669;2.5687,1.1879,1.1558;3.9147,3.0802,3.013;3.2514,3.667,2.394)|\",4.541580164845\r\n\"[H]C1=C([H])C2=C(C([H])=C1OC([H])([H])[H])C([H])=C(C1=N[C@]([H])(C([H])([H])[H])N([H])O1)N2[H] |(7.6274,3.5607,6.4629;6.7838,3.4728,5.7859;6.5466,2.2982,5.097;7.2009,1.4398,5.2208;5.4388,2.2565,4.2379;4.5797,3.3796,4.0743;4.8408,4.5703,4.7883;4.1886,5.4273,4.6662;5.9395,4.6044,5.6357;6.313,5.687,6.3883;5.5267,6.8609,6.295;5.9927,7.5902,6.9605;4.4926,6.6821,6.6204;5.5149,7.2596,5.2712;3.5668,2.9954,3.1439;2.7428,3.5932,2.7814;3.8305,1.69,2.7768;3.1447,0.7979,1.8692;3.451,-0.4252,1.6448;2.5639,-0.8414,0.551;2.1029,-1.8107,0.7758;3.3241,-0.9162,-0.7719;4.1553,-1.6229,-0.6869;3.7263,0.0682,-1.032;2.6564,-1.2428,-1.5756;1.4817,0.1861,0.4916;0.761,-0.087,1.1673;2.0904,1.3301,1.171;4.9611,1.2467,3.4363;5.3357,0.3168,3.3192)|\",4.329331361455\r\n\"[H]C([H])=C([H])C([H])([H])[C@@]([H])(C([H])=C([H])[H])N([H])[S@](=O)C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] 
|(0.5299,2.9675,-2.7276;1.4886,2.458,-2.7722;2.3254,3.0332,-3.1645;1.6294,1.194,-2.3713;0.7653,0.6586,-1.9795;2.9201,0.4227,-2.4088;3.7096,1.0563,-2.8353;2.8245,-0.4537,-3.0667;3.3808,-0.0793,-1.0157;3.4462,0.7888,-0.343;2.4159,-1.0828,-0.4384;2.3545,-2.0271,-0.9809;1.6533,-0.8716,0.6344;0.9523,-1.6205,0.993;1.7059,0.0562,1.1998;4.7082,-0.72,-1.205;5.3214,-0.0782,-1.7134;5.5001,-1.1472,0.2818;6.1267,0.0764,0.9079;6.9269,-2.1228,-0.4942;7.735,-1.2094,-1.4191;8.6853,-1.6911,-1.6798;7.9601,-0.2614,-0.9183;7.2077,-1.0059,-2.3579;7.7743,-2.5447,0.7161;8.6348,-3.1332,0.3756;7.1986,-3.1651,1.4134;8.1442,-1.6699,1.2588;6.3458,-3.3397,-1.2183;7.1649,-3.9716,-1.5845;5.7267,-3.0427,-2.0683;5.7327,-3.9503,-0.5444),wD:17.17|\",6.125282774755\r\n\"[H]C([H])=C([H])C([H])([H])[C@@]([H])(/C([H])=C(\\[H])C([H])([H])[H])N([H])[S@](=O)C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] |(0.5207,0.8544,4.0597;1.1935,1.1138,3.2461;1.8633,1.9541,3.4203;1.1853,0.4476,2.0892;0.4845,-0.3744,1.9457;2.0735,0.7487,0.9105;2.522,1.7442,1.0149;1.4574,0.7583,0.0023;3.2358,-0.2704,0.6986;3.6933,-0.0616,-0.2754;4.2937,-0.1377,1.7649;3.9505,-0.2608,2.7933;5.577,0.1426,1.5268;5.904,0.2727,0.4933;6.6376,0.3015,2.5779;7.0927,1.3004,2.5357;7.4525,-0.4199,2.43;6.2326,0.1552,3.5848;2.8072,-1.6943,0.6505;2.3776,-1.9705,1.536;1.7621,-2.1095,-0.6571;0.311,-1.849,-0.311;1.9757,-3.9881,-0.5447;1.5204,-4.4687,0.8354;1.432,-5.5618,0.8361;0.5399,-4.0467,1.0807;2.2369,-4.199,1.619;1.0437,-4.5325,-1.6377;1.095,-5.6277,-1.6497;1.3357,-4.1719,-2.6314;0.0075,-4.2345,-1.4539;3.4393,-4.3239,-0.839;3.567,-5.4136,-0.8583;4.1045,-3.9072,-0.0786;3.7495,-3.937,-1.8171),wU:20.20|\",6.147051882795001\r\n\"[H]C1=C([H])C([H])=C2C(=C1[H])/C([H])=C(/C1=N[C@]([H])(C([H])([H])[H])N([H])O1)N2[H] 
|(7.0859,4.226,6.9896;6.4607,3.9774,6.1367;5.7757,5.0119,5.46;5.8869,6.0363,5.8045;4.964,4.7485,4.3655;4.4387,5.5464,3.8478;4.8483,3.4142,3.9544;5.5315,2.353,4.6218;6.3462,2.6575,5.7296;6.8749,1.8665,6.2549;5.1911,1.1433,3.94;5.5237,0.141,4.1693;4.3379,1.4879,2.9132;3.6848,0.6827,1.9049;2.9508,1.1248,0.9536;2.4162,-0.0765,0.2992;2.5261,-0.0027,-0.7893;0.9496,-0.2933,0.6668;0.3559,0.5769,0.3704;0.8486,-0.4348,1.7476;0.5593,-1.1827,0.1618;3.2633,-1.2107,0.7764;4.0839,-1.2608,0.1645;3.8551,-0.6745,2.0027;4.1299,2.8567,2.9254;3.5462,3.3283,2.2499)|\",4.536137887835\r\n\"[H]C1=C([H])C([H])=C2C(=C1[H])C([H])=C(C1=N[C@]([H])(N([H])[H])N([H])O1)N2[H] |(3.0013,-1.0471,0.4145;2.0322,-0.593,0.2284;1.9539,0.8038,0.0256;2.8637,1.3968,0.06;0.7418,1.4343,-0.2166;0.6852,2.5082,-0.372;-0.4065,0.6322,-0.2522;-0.3518,-0.7803,-0.0503;0.8965,-1.3862,0.1929;0.9634,-2.4596,0.3494;-1.6907,-1.2693,-0.1569;-2.0289,-2.2912,-0.0609;-2.4964,-0.1793,-0.4106;-3.9244,-0.0733,-0.6125;-4.5558,1.0065,-0.8931;-5.9722,0.6524,-0.8569;-6.5,0.9924,-1.7559;-6.6016,1.2642,0.286;-6.1206,0.9769,1.1375;-7.5633,0.9348,0.3583;-6.0076,-0.8556,-0.8573;-6.0262,-1.1651,-1.8346;-4.6489,-1.2243,-0.4661;-1.721,0.9661,-0.4656;-2.1086,1.882,-0.6413)|\",4.5198110568050005\r\n\"[H]C1=C([H])C([H])=C(O/N=C(/C([H])([H])[H])C([H])([H])C(=O)C([H])([H])[H])C([H])=C1[H] 
|(2.5914,0.5217,6.1516;2.4128,0.2876,5.1062;2.4385,1.2964,4.1427;2.6341,2.3241,4.4376;2.2139,1.0119,2.7939;2.2314,1.7902,2.0429;1.9599,-0.3083,2.4189;1.7279,-0.7219,1.1189;1.7117,0.3377,0.1851;1.6074,-0.094,-1.0191;1.5297,0.9446,-2.1019;2.3941,0.8622,-2.7734;1.504,1.9495,-1.675;0.6328,0.7776,-2.7101;1.5136,-1.55,-1.4281;1.9915,-2.1831,-0.6786;2.0074,-1.6772,-2.3964;0.0304,-1.9303,-1.6168;-0.5557,-1.6314,-2.6384;-0.6379,-2.6496,-0.466;-0.4674,-2.1048,0.4694;-0.1862,-3.6425,-0.3398;-1.7067,-2.758,-0.6602;1.9299,-1.3314,3.3712;1.7305,-2.3489,3.0488;2.1569,-1.0282,4.7116;2.1314,-1.8254,5.4497)|\",4.971520048635\r\n\"[H]C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])O/N=C1\\C([H])([H])[C@]([H])(C([H])([H])[H])C([H])([H])C([H])([H])[C@@]1([H])C([H])(C([H])([H])[H])C([H])([H])[H] |(13.6412,-1.4819,-0.9796;13.0802,-0.5407,-1.018;12.3571,-0.5613,-0.1947;13.7868,0.2754,-0.8219;12.394,-0.3693,-2.3785;11.6716,-1.1862,-2.5212;13.1462,-0.4916,-3.17;11.676,0.9737,-2.602;11.3164,0.9872,-3.6391;12.3962,1.7996,-2.5236;10.4855,1.263,-1.6634;9.9977,0.3232,-1.3651;9.7262,1.8464,-2.1971;10.8374,2.0479,-0.3975;11.6357,1.5651,0.1732;9.957,2.1512,0.2484;11.3628,3.3501,-0.6819;10.3072,4.1956,-1.0776;10.6999,5.3764,-1.3854;12.1331,5.8529,-1.4061;12.221,6.7369,-0.7561;12.7978,5.083,-1.0113;12.5439,6.254,-2.8425;12.4977,5.3442,-3.4602;13.9784,6.7905,-2.8842;14.277,7.0423,-3.9088;14.6913,6.0527,-2.4977;14.0755,7.6991,-2.2756;11.5327,7.2627,-3.4096;11.623,8.2073,-2.8507;11.785,7.4965,-4.4525;10.0792,6.77,-3.3277;9.9424,5.8992,-3.9852;9.4163,7.557,-3.7013;9.6657,6.3801,-1.8826;9.7919,7.2894,-1.2692;8.1915,5.9213,-1.7344;8.1193,4.905,-2.1417;7.7769,5.8675,-0.2537;6.7499,5.4964,-0.154;7.8091,6.8693,0.1961;8.4308,5.2098,0.3231;7.2092,6.8176,-2.5072;6.1781,6.5064,-2.3027;7.3558,6.7687,-3.5907;7.2999,7.8689,-2.2016)|\",6.522568996485\r\n\"[H]C(=C(C([H])([H])[H])C([H])([H])[H])C([H])([H])O/N=C1\\C([H])([H])[C@]([H])(C([H])([H])[H])C([H])([H])C([H])([H])[C@@]1([H])C([H])(C([H])([H])[H])C([H])([H]
)[H] |(2.4664,0.7093,4.3959;2.5915,0.8885,3.3279;1.4952,1.1292,2.5929;0.1283,1.134,3.2353;-0.3707,2.1021,3.0887;0.1809,0.9367,4.3104;-0.5252,0.3759,2.7813;1.4871,1.3986,1.108;0.9505,2.3323,0.8918;0.9516,0.6022,0.5724;2.4865,1.4772,0.6746;4.0134,0.8063,2.8649;4.1395,1.0764,1.8104;4.4073,-0.2122,2.9951;4.7809,1.6992,3.6866;6.1478,1.488,3.4237;6.9121,2.2728,4.0898;6.4532,3.327,5.0727;6.6914,4.3162,4.6499;5.3697,3.2803,5.1946;7.1683,3.1947,6.4363;6.8571,2.2364,6.8799;6.751,4.3203,7.3887;7.2303,4.2079,8.3686;5.6657,4.3294,7.5437;7.0385,5.3007,6.987;8.6906,3.1417,6.2329;9.0351,4.1119,5.8422;9.1899,3.0018,7.201;9.1031,2.023,5.2666;8.8399,1.0527,5.7073;10.1922,2.0248,5.1378;8.4172,2.1611,3.8843;8.7243,3.1453,3.4886;8.8638,1.1056,2.8369;8.1685,1.2002,1.9942;10.2757,1.4123,2.3105;10.5551,0.7027,1.5227;11.0336,1.3348,3.0999;10.3372,2.422,1.8859;8.7761,-0.3465,3.3342;8.9823,-1.039,2.5092;7.7794,-0.574,3.7216;9.5121,-0.5528,4.1209)|\",6.479030780405\r\n\"[H]C#CC([H])([H])O/N=C1\\C([H])([H])[C@]([H])(C([H])([H])[H])C([H])([H])C([H])([H])[C@@]1([H])C([H])(C([H])([H])[H])C([H])([H])[H] 
|(0.0686,-1.1248,-0.0856;1.1187,-0.969,0.0159;2.3063,-0.7972,0.1383;3.749,-0.5991,0.2494;4.0818,0.1984,-0.4297;4.2873,-1.5156,-0.0281;4.0592,-0.2489,1.6014;5.4644,-0.1237,1.6926;5.8523,0.2287,2.8622;4.9615,0.452,4.0599;5.1826,1.445,4.4793;3.9103,0.4353,3.7681;5.2567,-0.6174,5.1386;5.0003,-1.5957,4.705;4.3968,-0.4027,6.3877;4.586,-1.1812,7.1364;3.3281,-0.4227,6.1449;4.6161,0.567,6.8532;6.7593,-0.6211,5.4644;7.013,0.3212,5.9743;6.9828,-1.4256,6.1775;7.6489,-0.7722,4.2194;7.5067,-1.77,3.7799;8.6987,-0.7057,4.5225;7.3494,0.3,3.1359;7.5401,1.2808,3.6037;8.2444,0.2058,1.8723;7.8966,-0.6539,1.2864;8.0972,1.4635,0.9984;8.7223,1.3839,0.1009;8.4224,2.3589,1.5452;7.0639,1.6134,0.6773;9.7281,-0.0097,2.2127;10.331,0.0159,1.2975;9.9144,-0.972,2.6996;10.1065,0.7813,2.8745)|\",6.55250152004\r\n\"[H]C([H])=C([H])C([H])([H])O/N=C1\\C([H])([H])[C@]([H])(C([H])([H])[H])C([H])([H])C([H])([H])[C@@]1([H])C([H])(C([H])([H])[H])C([H])([H])[H] |(5.2581,3.6669,2.5384;5.0893,2.9433,1.7456;4.3649,3.2189,0.9819;5.7297,1.7752,1.7129;6.4573,1.5249,2.4844;5.5007,0.7181,0.6751;4.8119,1.0707,-0.105;5.0684,-0.187,1.1246;6.7711,0.384,0.1067;6.6364,-0.8068,-0.637;7.703,-1.1139,-1.2785;9.0013,-0.3431,-1.2515;9.2926,-0.1064,-2.2862;8.8762,0.5994,-0.7164;10.1166,-1.2013,-0.6091;9.824,-1.386,0.4357;11.459,-0.4628,-0.6085;12.2406,-1.0625,-0.1267;11.3886,0.4908,-0.0723;11.7889,-0.246,-1.6329;10.2023,-2.559,-1.3239;10.5717,-2.3972,-2.3487;10.9459,-3.1954,-0.8258;8.8532,-3.293,-1.3864;8.5343,-3.5739,-0.3721;8.9798,-4.2241,-1.9481;7.7377,-2.4325,-2.0413;8.0864,-2.1975,-3.0622;6.3666,-3.1458,-2.1716;5.9201,-3.1885,-1.1702;5.4141,-2.3497,-3.0809;4.4352,-2.8406,-3.1368;5.8108,-2.2908,-4.1036;5.2611,-1.3334,-2.7115;6.497,-4.5829,-2.7032;5.5025,-5.0122,-2.8718;7.0235,-5.2444,-2.0081;7.0307,-4.6078,-3.6631)|\",6.517126719475\r\n\"[H]C([H])([H])C([H])([H])O/N=C1\\C([H])([H])[C@]([H])(C([H])([H])[H])C([H])([H])C([H])([H])[C@@]1([H])C([H])(C([H])([H])[H])C([H])([H])[H] 
|(0.3514,1.4005,0.3786;0.7538,1.1435,-0.6073;0.2838,0.2121,-0.94;0.4791,1.9368,-1.3104;2.2636,0.9897,-0.5303;2.5476,0.1951,0.1729;2.7435,1.9198,-0.1983;2.722,0.6567,-1.8421;4.1247,0.5381,-1.8178;4.6164,0.2017,-2.9531;3.8317,-0.0144,-4.2253;4.0619,-1.0175,-4.6157;2.7599,0.0274,-4.0255;4.2371,1.034,-5.288;3.9513,2.0217,-4.8955;3.4941,0.8076,-6.6087;3.761,1.5711,-7.3492;2.4077,0.8426,-6.466;3.7434,-0.1725,-7.0358;5.7626,1.0218,-5.4737;6.0547,0.0693,-5.943;6.0578,1.8137,-6.175;6.5327,1.1873,-4.1538;6.351,2.189,-3.738;7.6061,1.119,-4.3582;6.1315,0.1241,-3.0948;6.3568,-0.8595,-3.543;6.916,0.2149,-1.7598;6.5325,1.0835,-1.2098;6.6786,-1.0348,-0.8941;7.2066,-0.9465,0.063;7.0588,-1.9346,-1.3969;5.6174,-1.1809,-0.6822;8.4264,0.4079,-1.9744;8.9497,0.3706,-1.0119;8.6677,1.369,-2.4393;8.8463,-0.3873,-2.6055)|\",6.53617468901\r\n\"[H]OC1=C(C(=O)/C([H])=C(\\[H])C2=C([H])C([H])=C(F)C([H])=C2[H])OC([H])=C1[H] |(5.5567,4.4695,2.6014;4.7522,4.085,2.2204;4.3259,4.8781,1.2152;3.2,4.6454,0.4458;2.1981,3.5831,0.4672;2.2854,2.6668,1.286;1.0991,3.6706,-0.5279;1.1012,4.5126,-1.2126;0.1445,2.7206,-0.5561;0.2446,1.923,0.1793;-1.0079,2.6286,-1.4504;-1.2841,3.5754,-2.4577;-0.6223,4.4245,-2.5969;-2.3933,3.4457,-3.2844;-2.6121,4.1702,-4.062;-3.2419,2.3557,-3.1039;-4.318,2.2277,-3.9056;-3.0096,1.3979,-2.1245;-3.693,0.5624,-2.0159;-1.8931,1.5435,-1.3052;-1.6978,0.8026,-0.5344;3.0912,5.6724,-0.4801;4.1249,6.5129,-0.2825;4.1723,7.3698,-0.9383;4.9215,6.0877,0.7405;5.8168,6.5697,1.1102)|\",4.179668743680001\r\n\"[H]OC1=C(C(=O)/C([H])=C(\\[H])C2=C([H])C([H])=C([H])O2)OC([H])=C1[H] 
|(6.6052,-5.7323,0.2967;5.9663,-5.016,0.1602;4.7521,-5.434,0.5772;3.5945,-4.678,0.5318;3.3243,-3.3204,0.0655;4.2369,-2.6302,-0.3944;1.929,-2.8347,0.1741;1.173,-3.493,0.5878;1.6224,-1.5877,-0.2415;2.4275,-0.9783,-0.6469;0.3278,-0.9734,-0.2065;-0.1069,0.2728,-0.5961;0.5109,1.0493,-1.0255;-1.505,0.3289,-0.324;-2.1785,1.1555,-0.5016;-1.8304,-0.8829,0.2129;-2.7538,-1.3109,0.5732;-0.7379,-1.6851,0.2928;2.5594,-5.4442,1.0453;3.0697,-6.6398,1.3967;2.3719,-7.349,1.817;4.4081,-6.7031,1.1377;5.0663,-7.5421,1.321)|\",3.94565083225\r\n\"[H]OC1=C(C(=O)/C([H])=C(\\[H])C2=C([H])C([H])=C([H])C(C#N)=C2[H])OC([H])=C1[H] |(-4.9364,-7.6355,-0.8499;-4.3151,-6.891,-0.8417;-4.9973,-5.7555,-0.5917;-4.4313,-4.4949,-0.5108;-3.0589,-4.0253,-0.6566;-2.1481,-4.8191,-0.8931;-2.8249,-2.5624,-0.5053;-3.6839,-1.9308,-0.3025;-1.5782,-2.0706,-0.6217;-0.7917,-2.7963,-0.8245;-1.1543,-0.6733,-0.5067;-2.046,0.389,-0.2581;-3.1063,0.1855,-0.1452;-1.5906,1.6997,-0.1557;-2.2958,2.5029,0.0367;-0.2343,1.9919,-0.2978;0.1277,3.0116,-0.2185;0.6701,0.9476,-0.5463;2.0688,1.2316,-0.697;3.2014,1.4663,-0.8192;0.209,-0.3726,-0.649;0.9187,-1.1713,-0.8419;-5.4425,-3.5845,-0.2414;-6.5983,-4.2708,-0.1582;-7.4831,-3.6876,0.0508;-6.3999,-5.6054,-0.3628;-7.1524,-6.3826,-0.3522)|\",4.068102064975\r\n\"[H]OC1=C(C(=O)/C([H])=C(\\[H])C2=C([H])C([H])=C([H])C([H])=N2)OC([H])=C1[H] |(5.0408,6.8742,0.6968;4.3465,6.2003,0.7586;3.7039,6.13,-0.4261;2.6526,5.2748,-0.7076;1.9639,4.2688,0.0939;2.3021,4.0663,1.2612;0.8583,3.5218,-0.5623;0.6063,3.7378,-1.5948;0.1767,2.5875,0.1228;0.4643,2.4051,1.1566;-0.9297,1.7898,-0.4137;-1.567,0.8403,0.4031;-1.2358,0.7021,1.4283;-2.6176,0.0887,-0.1174;-3.1234,-0.651,0.4973;-3.0043,0.3049,-1.438;-3.8174,-0.2547,-1.8903;-2.3124,1.2716,-2.1767;-2.5879,1.4681,-3.2122;-1.3058,1.9977,-1.6944;2.2583,5.4943,-2.0182;3.0463,6.4596,-2.5268;2.8503,6.7413,-3.5509;3.9555,6.8951,-1.6063;4.7065,7.6613,-1.7459)|\",4.1089191425500005\r\n\"[H]OC1=C(C(=O)/C([H])=C(\\[H])C2=C([H])C([H])=C([H])C([H])=C2[H])OC([H])=C1[H] 
|(6.6276,4.0177,-0.6763;5.9136,3.3628,-0.6403;6.4291,2.1811,-0.241;5.7029,1.0152,-0.0761;4.2881,0.7044,-0.2642;3.5066,1.5825,-0.633;3.8605,-0.692,0.0069;4.6184,-1.4012,0.3242;2.5676,-1.0382,-0.1429;1.8954,-0.2432,-0.4642;1.9553,-2.349,0.0737;0.5702,-2.4882,-0.1319;-0.0054,-1.6202,-0.444;-0.0678,-3.7121,0.0591;-1.1389,-3.7947,-0.1044;0.6688,-4.8274,0.4599;0.1756,-5.7841,0.6092;2.0471,-4.7079,0.6678;2.6263,-5.5733,0.979;2.683,-3.4859,0.4774;3.7536,-3.4101,0.6424;6.5745,0.0204,0.3413;7.8037,0.5636,0.4303;8.5944,-0.0998,0.7485;7.7881,1.8844,0.0876;8.6326,2.5606,0.0708)|\",4.206880128730001\r\n\"[H]O/C1=N/C2=C(C([H])=C(F)C([H])=C2[H])N2C([H])([H])C([H])([H])C([H])([H])[C@@]12[H] |(2.3677,4.4236,-0.891;3.004,4.3561,-1.6194;3.1444,3.0499,-1.9763;3.9277,2.7744,-2.9388;4.1092,1.4297,-3.2989;3.5939,0.3718,-2.5;3.9008,-0.9563,-2.8266;3.5356,-1.7898,-2.2375;4.6756,-1.2052,-3.9547;4.9526,-2.4936,-4.2609;5.1641,-0.1965,-4.7739;5.7628,-0.4438,-5.6431;4.8757,1.1243,-4.4236;5.2612,1.9518,-5.011;2.8337,0.7088,-1.4012;2.5538,-0.1716,-0.2741;1.8864,-1.0008,-0.5503;3.4894,-0.6146,0.0958;1.8891,0.7581,0.7684;0.8002,0.6407,0.7375;2.2145,0.5327,1.7875;2.2728,2.183,0.3246;3.2719,2.4491,0.6881;1.5588,2.9325,0.6854;2.2975,2.0495,-1.2129;1.2542,2.1289,-1.588)|\",4.666752536075001\r\n\"[H]C([H])([H])C1(C([H])([H])[H])C([H])([H])[C@@]2([H])C([H])([H])C(=O)OC([H])([H])[C@]2(C([H])([H])[H])C1([H])[H] 
|(0.4825,-0.5568,0.2598;1.051,0.3311,-0.0439;0.7213,0.6088,-1.0517;0.7763,1.1487,0.6339;2.5633,0.0559,0.0034;2.9674,-0.267,1.4541;2.3825,-1.1063,1.8498;2.7993,0.5962,2.1102;4.0286,-0.5358,1.5253;2.957,-1.1271,-0.9476;3.0691,-2.0705,-0.4011;2.1839,-1.2926,-1.7088;4.2625,-0.6628,-1.5996;5.045,-0.719,-0.825;4.8025,-1.3613,-2.8399;4.0176,-1.4855,-3.5982;5.1866,-2.3648,-2.6314;5.9357,-0.5844,-3.5151;6.6668,-1.092,-4.3309;6.0897,0.7429,-3.2401;5.4112,1.431,-2.1589;6.0552,1.3736,-1.2715;5.368,2.4753,-2.4837;4.0357,0.8422,-1.8645;3.0913,1.1347,-3.05;2.8783,2.2094,-3.0925;2.133,0.6152,-2.9584;3.5341,0.8549,-4.0109;3.398,1.2852,-0.5273;4.1849,1.5344,0.1967;2.7776,2.1816,-0.6396)|\",7.322583716955\r\n\"[H]C1=C2C([H])([H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[C@]2(C([H])([H])[H])C([H])([H])OC1=O |(4.3982,1.698,-3.4037;4.4369,0.6457,-3.1393;3.5863,0.071,-2.2786;2.5322,0.6916,-1.4115;1.5341,0.4444,-1.8043;2.5993,1.7829,-1.3508;2.7424,-0.0225,-0.0418;1.4656,-0.0289,0.8089;1.1511,0.9918,1.0587;1.6245,-0.5652,1.7524;0.6374,-0.5169,0.281;3.8772,0.6728,0.7343;3.5964,1.6987,1.0025;4.8002,0.7263,0.1454;4.1011,0.1337,1.6625;3.1732,-1.4697,-0.4605;2.315,-2.1494,-0.4115;3.9251,-1.8696,0.2313;3.7067,-1.4031,-1.927;2.8516,-2.2708,-2.8771;1.807,-1.9405,-2.8826;2.8708,-3.3179,-2.5509;3.2306,-2.2252,-3.9023;5.186,-1.7716,-2.0606;5.3491,-2.8501,-1.9774;5.7784,-1.2693,-1.2842;5.7143,-1.4117,-3.3517;5.4739,-0.1507,-3.8329;6.0928,0.2404,-4.7968)|\",5.850447785749999\r\n\"[H]C([H])([H])C(=O)OC([H])([H])[C@]1(C([H])([H])[H])C(=O)C([H])([H])C(C([H])([H])[H])(C([H])([H])[H])C1([H])[H] 
|(7.0514,-1.3538,-2.367;7.1786,-1.2344,-1.2906;7.6888,-2.1127,-0.8805;7.8067,-0.3625,-1.0821;5.8244,-1.072,-0.6411;4.7564,-1.1361,-1.2062;5.952,-0.8473,0.6912;4.7185,-0.6576,1.4206;4.225,0.2456,1.0493;4.0599,-1.5133,1.2393;5.0505,-0.5155,2.9109;3.7348,-0.1936,3.6463;2.9786,-0.9596,3.4378;3.3484,0.7784,3.3239;3.8808,-0.1571,4.7298;6.0508,0.6577,3.0801;5.7873,1.8154,2.8379;7.3799,0.1121,3.5806;7.9167,0.8459,4.1899;7.9889,-0.0897,2.6867;7.0314,-1.21,4.3015;8.1927,-2.2126,4.2853;9.0557,-1.8306,4.8445;8.5216,-2.4223,3.2604;7.8958,-3.1643,4.7436;6.6322,-0.9174,5.7616;7.4899,-0.5341,6.3277;6.28,-1.8277,6.2617;5.8357,-0.1679,5.8319;5.8124,-1.7411,3.479;5.1659,-2.3811,4.0907;6.1854,-2.3547,2.6516)|\",5.97562015698\r\n\"[H]C1N=C([H])NN1[C@]([H])(C1=C([H])C([H])=C(F)C([H])=C1[H])C([H])([H])C(=O)C([H])([H])[H] |(2.4764,5.4023,2.1054;3.4087,5.2268,2.6242;4.2725,6.1466,3.0225;5.2542,5.3954,3.59;6.1454,5.822,4.0306;5.0537,4.0842,3.5651;3.8486,3.9888,2.9382;3.2025,2.6921,2.6973;2.4183,2.9014,1.9648;2.5665,2.1069,3.9507;3.2779,1.9794,5.1525;4.2953,2.3524,5.2133;2.6831,1.4034,6.2737;3.2193,1.3026,7.2116;1.372,0.9487,6.1804;0.7947,0.3883,7.2633;0.6384,1.0594,5.0062;-0.3856,0.7023,4.9748;1.2462,1.6444,3.8949;0.6789,1.7409,2.9719;4.2056,1.7371,2.032;5.1156,1.6566,2.6351;3.767,0.7304,1.991;4.5451,2.1481,0.5979;3.8199,2.8931,-0.0344;5.8155,1.5675,0.0124;5.8728,0.4858,0.1842;6.6828,2.017,0.5135;5.8669,1.7808,-1.0571)|\",5.981062433990001\r\n\"[H]C1=C([H])C([H])=C([C@]2([H])O[C@@]([H])(C([H])([H])C(=O)OC([H])([H])[H])C([H])([H])C([H])=C2[H])C([H])=C1[H] 
|(-4.2212,-1.5863,4.0026;-3.5624,-1.2838,3.1929;-3.6812,-0.0138,2.6295;-4.4322,0.679,3.0002;-2.8375,0.3731,1.5847;-2.9377,1.3617,1.1458;-1.8653,-0.5042,1.091;-0.8835,-0.1119,-0.0142;-0.732,-0.9962,-0.6482;0.4235,0.1335,0.5284;0.5962,1.4547,1.0512;-0.1899,1.6683,1.7894;1.9477,1.434,1.76;2.7598,1.2757,1.0404;1.9791,0.5757,2.4427;2.2298,2.6868,2.5624;1.4498,3.5934,2.7682;3.4896,2.667,3.0516;3.855,3.7978,3.8598;4.887,3.6187,4.1628;3.2047,3.8712,4.7356;3.7789,4.7236,3.2833;0.4975,2.4804,-0.0826;0.4371,3.4884,0.3435;1.4109,2.451,-0.6974;-0.7073,2.1931,-0.9411;-1.0497,2.9758,-1.6159;-1.3488,1.0236,-0.8904;-2.2341,0.8394,-1.4959;-1.7541,-1.7778,1.6672;-0.9966,-2.4642,1.2952;-2.5934,-2.1673,2.7087;-2.4955,-3.1596,3.1415)|\",6.37290637871\r\n\"[H]C([H])=C(C([H])([H])Cl)[C@]1([H])C([H])([H])C([H])([H])[C@]2(C([H])([H])[H])O[C@]2([H])C1([H])[H] |(1.811,1.33,2.1948;2.4638,0.635,1.6712;3.4442,0.4596,2.099;2.0506,0.0038,0.5655;0.6468,0.2346,0.0806;0.0647,0.8279,0.7855;0.1157,-0.6961,-0.1323;0.5927,1.1641,-1.5114;2.8787,-0.9778,-0.2566;2.5621,-0.8272,-1.2963;2.559,-2.4533,0.0908;1.4767,-2.629,0.0534;2.9996,-3.0861,-0.6925;3.1075,-2.8945,1.4558;2.9294,-3.969,1.5845;2.5708,-2.3846,2.2644;4.5995,-2.6014,1.6152;5.4612,-3.7199,2.158;5.437,-4.5906,1.4914;5.0972,-4.0413,3.1417;6.4994,-3.3926,2.2687;4.8985,-1.3016,2.1838;5.1994,-1.5258,0.7971;6.2727,-1.5801,0.5955;4.414,-0.7556,-0.2453;4.6442,0.3135,-0.1605;4.8055,-1.0758,-1.2208)|\",6.576991766585\r\n\"[H]O[C@@]([H])(/C(=C(\\[H])C1=C([H])C([H])=C([H])C([H])=C1[H])C([H])([H])[H])C([H])([H])N(=O)=O 
|(2.4077,-2.51,0.934;3.0366,-2.0064,0.3934;2.7834,-0.6162,0.5881;3.2939,-0.1111,-0.2353;1.31,-0.2475,0.5689;0.8625,0.5263,-0.4391;1.586,0.7928,-1.21;-0.4947,1.0473,-0.6836;-0.9435,1.1588,-2.0129;-0.286,0.8469,-2.821;-2.215,1.6455,-2.3048;-2.5426,1.7136,-3.3388;-3.0624,2.0545,-1.2721;-4.0522,2.4419,-1.4977;-2.6218,1.9794,0.05;-3.2642,2.3183,0.8584;-1.3515,1.4833,0.3429;-1.0074,1.4729,1.3717;0.4551,-0.7978,1.6874;-0.6001,-0.8323,1.4078;0.5462,-0.1946,2.5998;0.7508,-1.8217,1.9584;3.4968,-0.1981,1.8986;4.5608,-0.423,1.8174;3.0634,-0.6806,2.7743;3.3766,1.2813,2.1132;2.6071,1.6738,2.9878;4.0319,2.0009,1.3641)|\",4.43001348614\r\n\"[H]O[C@@]([H])(C([H])([H])C(=O)OC([H])([H])[H])C([H])([H])/C([H])=C(\\[H])[Si](C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] |(1.4241,-0.1494,-0.2188;0.5162,0.0263,-0.5164;-0.3549,-0.3918,0.5284;-1.3602,-0.1561,0.1635;-0.2556,-1.9278,0.7121;0.7239,-2.182,1.1285;-0.3614,-2.386,-0.2747;-1.3093,-2.4739,1.6494;-1.1815,-2.6181,2.8487;-2.458,-2.7478,0.99;-3.5395,-3.2267,1.8069;-3.8027,-2.4877,2.5685;-3.2613,-4.1607,2.3023;-4.3733,-3.3882,1.1232;-0.1081,0.3991,1.8361;-0.3883,1.4421,1.6421;-0.7761,0.0147,2.6144;1.3245,0.3453,2.3143;2.0068,1.0408,1.8198;1.7973,-0.4733,3.2695;1.0733,-1.1524,3.7311;3.5793,-0.541,3.8769;3.6154,-0.1315,5.7286;4.6322,-0.2142,6.1319;2.9753,-0.8142,6.2998;3.2632,0.8894,5.9162;4.2649,-2.2908,3.6148;5.2934,-2.3682,3.9882;4.2733,-2.5628,2.5528;3.6628,-3.0387,4.1443;4.6445,0.7073,2.9272;5.6822,0.6696,3.2798;4.2873,1.7342,3.0666;4.6568,0.4999,1.8506)|\",6.6695104757550006\r\n\"[H]O[C@]([H])(C([H])([H])Cl)C([H])([H])/C([H])=C(\\[H])[Si](C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] 
|(5.7084,3.8856,3.022;6.1743,3.4094,2.313;5.507,3.7178,1.0988;6.1261,3.2642,0.3128;5.5301,5.2221,0.82;5.0533,5.4676,-0.1309;6.5553,5.5949,0.8313;4.6448,6.1695,2.1056;4.0976,3.0972,1.0292;3.4989,3.5246,1.8478;3.6109,3.3956,0.0912;4.1253,1.5956,1.1433;4.6668,1.219,2.0116;3.5625,0.7391,0.277;3.0365,1.1758,-0.5792;3.6035,-1.1378,0.4128;4.4583,-1.8536,-1.1232;4.4647,-2.9502,-1.0966;5.4986,-1.5147,-1.192;3.9478,-1.5481,-2.0444;1.8263,-1.7986,0.4933;1.8144,-2.8949,0.53;1.2429,-1.4906,-0.3828;1.3066,-1.429,1.3849;4.5486,-1.6584,1.9701;4.575,-2.7514,2.0552;4.0765,-1.2688,2.8795;5.5856,-1.3035,1.9528)|\",6.876317002135\r\n\"[H]C([H])=C([H])C([H])([H])C([H])([H])/C(=N/OC([H])([H])[H])C1=C([H])C([H])=C(OC([H])([H])[H])C([H])=C1[H] |(0.8323,-1.7185,-2.4122;1.0238,-1.1618,-3.326;0.2746,-1.2484,-4.1117;2.1201,-0.4177,-3.4801;2.8485,-0.3569,-2.6736;2.4461,0.3804,-4.7146;1.6989,0.1681,-5.4895;3.4221,0.0692,-5.1076;2.4715,1.9056,-4.4868;1.5319,2.2075,-4.0064;2.4764,2.4177,-5.4602;3.6416,2.4567,-3.6867;4.5432,1.5976,-3.3592;5.6143,2.142,-2.6368;6.4938,1.0846,-2.2751;5.9903,0.3534,-1.631;7.311,1.5589,-1.7258;6.8876,0.573,-3.1616;3.6645,3.9202,-3.4134;4.8364,4.6875,-3.5741;5.7567,4.1964,-3.8646;4.831,6.0575,-3.3666;5.7321,6.6485,-3.4989;3.6534,6.7122,-2.9733;3.757,8.0584,-2.786;2.6016,8.7712,-2.3733;2.9121,9.8129,-2.2737;2.2276,8.4106,-1.406;1.7981,8.7025,-3.1185;2.48,5.9702,-2.8029;1.5553,6.4455,-2.4964;2.4957,4.5932,-3.0333;1.573,4.0391,-2.8894)|\",5.210980237075\r\n\"[H]C1=C(C([H])([H])[H])[C@]([H])(C([H])([H])C#N)OC(=O)[C@@]1([H])C([H])([H])C([H])([H])[H] 
|(3.0631,-2.4313,-0.4699;2.6312,-1.8978,-1.3158;2.0886,-2.5835,-2.3233;2.0201,-4.0878,-2.3736;2.4476,-4.5271,-1.4679;2.5704,-4.4987,-3.2309;0.9829,-4.4379,-2.4642;1.5017,-1.8397,-3.4975;0.5413,-2.2935,-3.7737;2.3854,-1.8997,-4.7725;2.4834,-2.9393,-5.106;1.8827,-1.343,-5.571;3.7216,-1.336,-4.5655;4.7744,-0.8842,-4.3802;1.1862,-0.4636,-3.2458;1.73,0.277,-2.235;1.4099,1.4363,-2.1492;2.7105,-0.395,-1.2771;3.707,-0.0959,-1.6405;2.5592,0.183,0.1495;3.3732,-0.2291,0.7604;2.7151,1.264,0.0937;1.2096,-0.1129,0.8106;1.1816,0.2949,1.8268;1.0182,-1.1906,0.8783;0.3889,0.3438,0.2473)|\",6.8354999245600006\r\n\"[H]O[C@@]([H])(/C(=C(\\[H])C([H])([H])C([H])([H])[C@@]([H])(C([H])=C([H])[H])C([H])([H])[H])C([H])([H])[H])C([H])([H])C#N |(8.8233,-1.3799,2.6244;9.043,-2.2771,2.9303;8.6647,-3.1771,1.9067;8.9302,-4.1706,2.2931;7.1763,-3.1651,1.5674;6.3306,-2.4056,2.2782;6.7551,-1.8242,3.0956;4.8483,-2.2307,2.0962;4.4516,-2.9101,1.3333;4.3363,-2.4869,3.0343;4.4834,-0.7814,1.7161;4.9785,-0.5234,0.7708;4.8903,-0.0923,2.4713;2.9647,-0.531,1.573;2.564,-1.2616,0.8547;2.2439,-0.7189,2.8851;2.5568,-0.0467,3.6883;1.2759,-1.6048,3.1239;0.7893,-1.6762,4.0932;0.928,-2.2925,2.3553;2.7039,0.8825,1.0168;1.6321,1.0585,0.8757;3.2053,1.0228,0.052;3.0818,1.6507,1.7034;6.7716,-4.0836,0.4367;5.6951,-4.2652,0.4193;7.2657,-5.0609,0.5278;7.0487,-3.6753,-0.5447;9.5737,-2.9529,0.6588;10.6178,-3.0523,0.9744;9.3859,-3.7064,-0.1132;9.3816,-1.6174,0.0874;9.2073,-0.5434,-0.3182)|\",6.859990171105\r\n\"[H]O[C@@]([H])(C([H])([H])O[Si](C([H])([H])[H])(C([H])([H])[H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])[C@@]1([H])OC1([H])[H] 
|(3.3875,4.7247,-0.3507;4.004,3.9722,-0.275;3.3007,2.8651,-0.8128;4.0461,2.1173,-1.1104;2.3991,2.2479,0.2668;1.6371,2.9861,0.5609;3.0204,2.0315,1.1465;1.7953,1.0698,-0.2448;0.7048,0.1051,0.6154;-0.7989,1.1397,1.1147;-1.5424,0.5287,1.6404;-1.2887,1.5888,0.2435;-0.5138,1.955,1.7903;1.5469,-0.5589,2.1745;0.8858,-1.2373,2.7268;1.8119,0.2582,2.8562;2.4669,-1.1049,1.9379;0.2378,-1.2941,-0.602;-0.3836,-0.6983,-1.8823;-0.6466,-1.5005,-2.5869;0.3124,-0.0226,-2.3918;-1.3017,-0.1365,-1.6706;-0.7852,-2.2407,0.0628;-1.0502,-3.0593,-0.622;-1.7164,-1.7228,0.323;-0.3899,-2.7016,0.9768;1.5001,-2.0972,-0.9807;1.2487,-2.8921,-1.6983;1.9541,-2.58,-0.1067;2.2622,-1.4608,-1.4443;2.4953,3.2855,-2.0323;1.7849,2.5499,-2.4062;1.9672,4.6281,-1.9374;2.973,4.3404,-2.9314;2.6209,4.3729,-3.9617;3.942,4.799,-2.7446)|\",8.699479800485001\r\n\"[H]O/C(=N/C1=NC([H])=C(/C([H])=C(\\[H])C([H])([H])C([H])([H])C([H])([H])O[H])N=C1[H])C([H])([H])[H] |(3.7757,0.2853,-1.6448;3.8192,-0.319,-0.8875;3.002,0.1362,0.0982;2.9379,-0.6019,1.1357;2.2609,-0.2533,2.2934;2.1836,1.0172,2.7237;1.5636,1.2208,3.8895;1.5124,2.2524,4.237;1.0032,0.1888,4.6565;0.3372,0.4713,5.9302;0.3197,1.5187,6.2322;-0.226,-0.4562,6.7194;-0.193,-1.4938,6.3878;-0.9099,-0.1791,8.0245;-0.8185,0.8844,8.2833;-0.4184,-0.7404,8.8315;-2.3983,-0.5786,8.0154;-2.4972,-1.624,7.6841;-2.9443,0.0271,7.2794;-3.0679,-0.4222,9.3835;-4.1398,-0.6661,9.3037;-2.9958,0.6175,9.7222;-2.4453,-1.1912,10.4082;-2.543,-2.1268,10.1706;1.0877,-1.0897,4.219;1.712,-1.3,3.072;1.8003,-2.3257,2.7203;2.291,1.4342,-0.208;1.4418,1.5884,0.4564;1.9468,1.4409,-1.2501;2.9722,2.2807,-0.0629)|\",4.269466314345\r\n\"[H]C1=C([H])C(=O)C([H])=C(C([H])([H])[H])C1C1N([H])C([H])([H])N([H])[C@]([H])(N([H])C([H])([H])[H])C1([H])[H] 
|(5.1145,1.9134,0.4633;4.5313,1.0338,0.1987;5.185,-0.1528,0.1613;6.2534,-0.2166,0.3469;4.4678,-1.403,-0.0815;5.0329,-2.5034,-0.1429;3.0202,-1.2504,-0.2153;2.473,-2.1767,-0.3711;2.3464,-0.0633,-0.1646;0.8442,-0.1314,-0.3613;0.5401,-1.1758,-0.4694;0.5175,0.393,-1.2683;0.2831,0.2842,0.4825;3.0978,1.1825,0.0161;2.563,2.4805,0.0128;3.3842,3.5548,-0.1462;4.3344,3.3715,-0.4315;3.073,4.9504,0.1358;3.0493,5.5029,-0.8258;3.8792,5.3788,0.7451;1.8278,5.0313,0.8662;1.5533,6.0052,0.9843;0.7706,4.3335,0.1421;0.7233,4.646,-0.9197;-0.4961,4.714,0.7397;-0.5153,4.3884,1.7055;-1.6882,4.2687,0.0232;-2.5745,4.6497,0.5398;-1.7956,3.1746,-0.0733;-1.6831,4.6956,-0.9861;1.0954,2.8266,0.206;0.7842,2.4522,1.1892;0.4907,2.3058,-0.5375)|\",3.5755759955700004\r\n\"[H]C1=C([H])[C@@]2([H])C(C(=O)OC([H])([H])[H])NN(C([H])([H])[H])C2=C([H])C1=O |(6.4697,-2.0606,-2.5216;5.716,-1.8707,-1.7626;5.7991,-0.8163,-0.9381;6.626,-0.1136,-0.9603;4.7941,-0.6657,0.1683;5.2951,-1.0859,1.0654;4.1911,0.6388,0.6482;4.9746,1.8566,0.9149;6.1759,1.9097,0.7101;4.238,2.8774,1.3841;4.9688,4.0878,1.6448;5.4454,4.4528,0.7312;5.7374,3.9166,2.4031;4.2265,4.8008,2.0035;2.9337,0.5062,0.9456;2.5451,-0.7692,0.627;1.1297,-1.0793,0.5879;0.9659,-2.099,0.9486;0.7318,-0.9918,-0.4317;0.6152,-0.3752,1.242;3.5361,-1.4571,-0.0552;3.4538,-2.5397,-0.859;2.5473,-3.1211,-0.9903;4.5991,-2.8632,-1.7248;4.6139,-3.8533,-2.4532)|\",4.00007360235\r\n\"[H]OC([H])([H])[C@@]1([H])O[C@]([H])(OC([H])([H])[H])[C@](OC([H])([H])[H])(C([H])([H])[H])[C@]1([H])O[H] 
|(5.9673,0.5314,-2.7814;6.2485,1.4579,-2.699;5.406,2.0203,-1.7067;5.6074,1.5888,-0.7157;5.6296,3.0924,-1.6732;3.9349,1.8106,-2.0531;3.708,2.3172,-2.9983;3.7452,0.3793,-2.2397;2.6262,-0.0472,-1.5015;2.7855,-1.0987,-1.2345;1.4253,0.1034,-2.2415;1.3719,-0.7097,-3.4104;0.3976,-0.5331,-3.8712;1.4604,-1.7735,-3.1471;2.1692,-0.4483,-4.1145;2.5473,0.8963,-0.2837;3.638,0.4578,0.5376;3.7672,1.0576,1.8161;4.7189,0.7022,2.2194;2.9615,0.7579,2.499;3.7956,2.1555,1.7695;1.2106,0.8864,0.4505;0.3868,1.0641,-0.2397;1.1764,1.665,1.2183;1.0581,-0.0866,0.9309;2.9158,2.2446,-0.9748;3.3544,2.95,-0.2633;1.7851,2.9011,-1.5254;1.3477,2.2458,-2.0993)|\",8.6532204459\r\n\"[H]C1=N[C@@]([H])(Cl)C([H])=N/C1=C1\\C(=O)C([H])=C([H])C([H])=C1[H] |(2.6536,-2.592,0.6535;1.81,-3.2172,0.3652;2.058,-4.4512,0.1297;0.9273,-5.2782,-0.2441;0.8924,-5.3432,-1.3379;1.2122,-6.992,0.3143;-0.3729,-4.772,0.3298;-1.13,-5.4893,0.6437;-0.5828,-3.5229,0.5123;0.4538,-2.6352,0.261;0.2233,-1.3051,-0.0349;-1.186,-0.8193,-0.258;-2.1183,-1.5851,-0.4943;-1.3833,0.6358,-0.2314;-2.411,0.9786,-0.2969;-0.3289,1.4866,-0.2431;-0.4909,2.5606,-0.2933;1.0294,0.9941,-0.2103;1.8475,1.7075,-0.2437;1.293,-0.3362,-0.1182;2.3244,-0.6698,-0.0829)|\",2.86807998427\r\n\"[H]OC(=O)C1=C([H])C([H])=C([H])C([H])=C1[C@]1([H])N([H])[C@@]([H])(N([H])[H])N=C(O[H])C1([H])[H] 
|(1.3546,-1.6992,-2.005;0.5988,-1.3544,-1.4947;0.887,-0.0459,-1.2685;1.9333,0.4316,-1.6779;-0.1777,0.6599,-0.5069;-1.3949,-0.0144,-0.292;-1.5163,-1.0134,-0.6932;-2.4304,0.5799,0.4178;-3.3649,0.0465,0.5659;-2.2484,1.8584,0.9457;-3.0432,2.3343,1.5143;-1.0416,2.5277,0.7547;-0.8796,3.5015,1.2025;0.0087,1.9635,0.0216;1.2836,2.7895,-0.1795;2.1357,2.1082,-0.1487;1.4562,3.8156,0.87;2.3542,3.662,1.3262;1.4805,5.2164,0.3754;0.4567,5.4408,0.0361;1.823,6.1453,1.4278;2.8352,6.1763,1.5308;1.409,5.843,2.307;2.4201,5.4187,-0.7215;2.3065,4.5533,-1.6454;3.138,4.638,-2.7171;2.9716,3.8837,-3.3028;1.2925,3.4352,-1.6021;0.3022,3.8608,-1.8124;1.4853,2.6559,-2.3431)|\",4.487157394745\r\n\"[H]C(=O)C1=C([H])C(=O)N=C2C(Cl)=C([H])C([H])=C(OC([H])([H])[H])[C@@]21[H] |(0.8965,4.9408,2.1895;1.4994,4.0835,1.8216;1.7504,3.9437,0.6419;1.8734,3.1034,2.8719;1.1543,3.0691,4.0074;0.4146,3.8329,4.2353;1.243,1.9443,4.9748;0.595,1.9645,6.0074;2.0671,0.8449,4.6604;2.8522,0.9044,3.6369;3.8383,-0.1404,3.3901;3.6673,-1.6733,4.2061;4.9559,0.1431,2.6661;5.727,-0.616,2.5764;5.2166,1.4442,2.1012;6.2045,1.6449,1.7056;4.249,2.3964,2.1319;4.3604,3.6726,1.7435;5.5962,4.1014,1.1698;5.4558,5.1525,0.9184;5.8191,3.5273,0.2636;6.4158,3.9934,1.8895;2.8464,1.9987,2.5662;2.4513,1.4886,1.665)|\",3.300741006565\r\n\"[H]C1=C(Cl)C2=NC(=O)C([H])=C(C([H])([H])[H])[C@@]2([H])C(OC([H])([H])[H])=C1[H] 
|(4.8299,0.2135,-0.1749;4.7055,-0.7833,-0.5861;5.409,-1.1356,-1.6914;6.5355,-0.0183,-2.418;5.3495,-2.5046,-2.2035;6.2875,-2.9415,-2.9684;6.2861,-4.2972,-3.3736;7.2117,-4.7338,-4.0376;5.1442,-5.1497,-2.9929;5.1747,-6.1625,-3.385;4.1049,-4.7288,-2.2531;2.9195,-5.6296,-2.0101;2.9603,-6.488,-2.6868;1.9737,-5.1006,-2.1768;2.8936,-5.9977,-0.9807;4.0796,-3.2776,-1.8072;3.3061,-2.7966,-2.4442;3.5996,-2.9358,-0.4011;2.8481,-3.8915,0.1802;2.2702,-3.6122,1.4543;1.5649,-2.7753,1.3868;3.0463,-3.3797,2.1925;1.7413,-4.5197,1.7466;3.8508,-1.7084,0.1164;3.4382,-1.3992,1.0688)|\",3.687142674275\r\n\"[H]C([H])([H])C([H])([H])OC(=O)/C(C#N)=C(/Cl)C([H])([H])C([H])([H])C([H])([H])[H] |(4.6202,-3.4986,3.9791;4.9052,-2.5966,3.4257;4.1583,-1.8228,3.6243;5.8773,-2.2572,3.7968;4.9749,-2.911,1.9397;3.9999,-3.205,1.5443;5.7071,-3.6923,1.726;5.4569,-1.7691,1.1835;4.5484,-0.8485,0.8247;3.3599,-0.9233,1.0724;5.2135,0.2762,0.0858;6.6456,0.2856,0.0712;7.8085,0.2908,0.0662;4.4955,1.2487,-0.5417;5.3642,2.5428,-1.333;3.0033,1.3781,-0.6207;2.7486,1.8046,-1.5972;2.5562,0.3861,-0.5407;2.4301,2.2768,0.5008;2.6835,1.8267,1.4668;2.9126,3.2611,0.4636;0.912,2.4315,0.3784;0.5211,3.0673,1.1801;0.4089,1.4601,0.4459;0.6327,2.8901,-0.5779)|\",5.2055379600650005\r\n\"[H]C1C([H])=C(OC([H])([H])[H])[C@]([H])(OC([H])([H])[H])C2=C1C(=O)NC([H])=N2 |(2.1156,3.7177,0.0546;2.8837,2.9661,0.2207;2.5788,1.8234,1.0046;1.5867,1.6887,1.4216;3.5346,0.8765,1.2475;3.1841,-0.1843,1.9786;4.1146,-1.2107,2.3509;4.8079,-0.8398,3.1069;4.674,-1.5781,1.4845;3.4982,-2.0156,2.7525;4.9512,1.0115,0.712;5.1897,0.1113,0.1189;5.7927,1.0668,1.8613;7.1392,0.6388,1.6496;7.6221,0.6876,2.6285;7.6581,1.2833,0.9367;7.1673,-0.3979,1.2796;5.154,2.2183,-0.2087;4.1125,3.1823,-0.3686;4.3779,4.3709,-1.2117;3.5468,5.2505,-1.3951;5.6538,4.4317,-1.8128;6.4809,3.4619,-1.5805;7.4617,3.5235,-2.0538;6.3138,2.318,-0.8064)|\",2.99869463251\r\n\"[H]C1N[C@]2([H])C(=NC3=C([H])N=C([H])C([H])=C32)C(=O)N1[H] 
|(-4.3959,-0.9023,-1.0689;-3.4849,-1.4682,-1.2613;-2.4195,-0.8462,-1.5879;-1.2879,-1.7113,-1.9024;-1.3256,-1.8926,-2.9941;-1.2818,-3.0673,-1.211;-0.1502,-3.435,-0.7139;0.7397,-2.3551,-0.9499;2.0843,-2.2782,-0.598;2.5906,-3.1143,-0.1208;2.8023,-1.1707,-0.8324;2.1948,-0.1323,-1.423;2.8139,0.7498,-1.5763;0.8529,-0.1195,-1.8368;0.4226,0.7558,-2.3137;0.1176,-1.2623,-1.5809;-2.5777,-3.7702,-1.0223;-2.7615,-4.9427,-0.7715;-3.6365,-2.8509,-1.1214;-4.5529,-3.2269,-0.9048)|\",4.438176901655001\r\n\"[H]C1=C([C@@]2([H])N([H])N([H])[C@]([H])(N([H])[H])C2([H])[H])C([H])=C([H])C(C(F)(F)F)=N1 |(-0.9132,-0.1748,0.563;0.0321,-0.5168,0.1435;0.1275,-1.8118,-0.3881;-1.0707,-2.7301,-0.3895;-1.8931,-2.216,0.129;-1.54,-3.042,-1.7594;-0.7212,-3.22,-2.3423;-2.2557,-4.2921,-1.6819;-3.2501,-4.0896,-1.7694;-2.0461,-4.9254,-0.3553;-1.7679,-5.9778,-0.4873;-3.3009,-4.9206,0.3942;-3.4774,-4.0036,0.8048;-3.2647,-5.5898,1.161;-0.8726,-4.1202,0.251;-0.8982,-4.1129,1.3454;0.0895,-4.545,-0.061;1.3656,-2.1841,-0.9212;1.5119,-3.1739,-1.3469;2.4264,-1.2808,-0.9017;3.3954,-1.5421,-1.3099;2.2068,-0.0258,-0.3365;3.3094,1.0118,-0.2649;4.4557,0.5458,-0.8139;2.9705,2.1354,-0.9235;3.5799,1.3539,1.0093;1.0361,0.3588,0.1801)|\",4.677637090095\r\n\"[H]C1=C(\\[H])N2\\C(=C/1C#N)C(=O)N([H])C([H])([H])N2[H] |(-3.4691,0.0337,0.1305;-2.3921,0.0062,0.0539;-1.5517,1.0997,-0.0618;-1.7659,2.157,-0.1238;-0.2691,0.6333,-0.0917;-0.2475,-0.7385,-0.0332;-1.5805,-1.1586,0.0711;-2.0467,-2.4949,0.193;-2.4901,-3.5652,0.3013;1.0224,-1.4753,-0.1333;1.1042,-2.6889,-0.2307;2.1238,-0.6335,-0.1607;3.0218,-1.0998,-0.1416;2.025,0.7256,0.328;2.9317,1.2754,0.0665;1.8879,0.7548,1.4247;0.9047,1.3808,-0.3686;0.7585,2.3034,0.0421)|\",5.175605436510001\r\n\"[H]C1=C(C([H])([H])[H])ON=C1/C([H])=C(/N(=O)=O)C([H])([H])[H] 
|(3.7889,2.9567,-2.6478;2.7357,2.8068,-2.4624;1.6871,3.5023,-2.9808;1.5655,4.6463,-3.9262;2.5553,4.9588,-4.2673;1.076,5.5008,-3.4457;0.966,4.369,-4.8004;0.533,2.9918,-2.4876;0.8105,1.9395,-1.6262;2.1301,1.8294,-1.6123;2.8595,0.837,-0.8304;3.94,0.8854,-0.9156;2.3684,-0.1221,-0.0275;3.385,-0.9812,0.6382;2.9512,-1.8541,1.3904;4.5811,-0.7877,0.4152;0.9552,-0.461,0.3124;0.2756,0.1913,-0.2352;0.7803,-0.3474,1.3878;0.7405,-1.5064,0.0674)|\",4.585118380925001\r\n\"[H]OC(=O)/C([H])=C(\\[H])C([H])(F)F |(-1.4933,2.3203,-0.2599;-1.8864,1.4324,-0.2397;-0.9261,0.4872,-0.061;-1.2249,-0.6822,-0.0185;0.4702,1.0018,0.0753;0.6556,2.0734,0.0651;1.4858,0.1482,0.2065;1.2939,-0.9221,0.2184;2.9138,0.5687,0.3638;3.3209,0.3142,1.3505;3.6789,-0.0663,-0.5769;3.0607,1.9172,0.1898)|\",5.853168924255\r\n\"[H]OC([H])([H])[C@]1([H])N(C([H])([H])C2=C([H])C(Cl)=C([H])C([H])=C2[H])C([H])([H])C([H])([H])N([H])C1([H])[H] |(-6.9341,-5.2459,0.7478;-6.2157,-5.3993,1.3793;-5.6897,-4.1218,1.7366;-6.4952,-3.3888,1.8604;-5.2069,-4.2595,2.7098;-4.6491,-3.6152,0.7102;-3.8431,-4.3605,0.6729;-4.0314,-2.3213,1.0532;-3.2536,-2.317,2.2867;-3.8782,-2.203,3.1923;-2.7702,-3.3001,2.371;-2.1743,-1.2438,2.3015;-1.8837,-0.5593,3.486;-2.4555,-0.7544,4.3884;-0.8534,0.3798,3.5086;-0.5016,1.2318,5.0108;-0.1039,0.6654,2.3701;0.6898,1.4039,2.4081;-0.4046,-0.0148,1.1879;0.1689,0.2001,0.29;-1.4276,-0.9606,1.1508;-1.6719,-1.4807,0.23;-4.9497,-1.1839,0.9427;-5.7499,-1.2126,1.7041;-4.3765,-0.2658,1.1109;-5.5783,-1.1389,-0.4498;-4.7878,-0.8962,-1.1846;-6.3302,-0.3422,-0.4849;-6.2258,-2.4223,-0.7318;-6.6738,-2.3889,-1.6452;-5.2368,-3.5018,-0.7024;-5.7191,-4.4464,-0.9742;-4.4024,-3.3358,-1.4086)|\",5.401459932425\r\n\"[H]OC1=NC(=O)O[C@]1(C1=C([H])C(C([H])([H])[H])=C(Cl)C([H])=C1[H])C([H])([H])[H] 
|(3.4064,3.7686,-0.9241;3.7629,4.2645,-0.1682;4.8593,3.6586,0.2959;5.5146,4.1101,1.2957;6.623,3.2409,1.4603;7.4817,3.3054,2.2907;6.6189,2.248,0.4815;5.4618,2.4163,-0.3522;4.5893,1.1629,-0.2719;3.2755,1.1819,0.2062;2.832,2.109,0.5574;2.4832,0.0258,0.2807;1.0721,0.0935,0.8055;0.815,1.1151,1.0991;0.9412,-0.5561,1.6783;0.35,-0.2412,0.0517;3.0684,-1.1731,-0.1443;2.1427,-2.6679,-0.0912;4.3828,-1.2267,-0.6086;4.807,-2.1778,-0.9121;5.1392,-0.0619,-0.6722;6.169,-0.115,-1.0109;5.9225,2.7248,-1.7877;5.0665,2.9089,-2.4472;6.4787,1.8764,-2.1935;6.5728,3.6046,-1.7955)|\",5.877659170799999\r\n\"[H]C([H])([H])OC(=O)C([H])([H])[C@@]1([H])OC(C([H])([H])[H])(C([H])([H])[H])OC1(C([H])([H])[H])C([H])([H])[H] |(0.1223,3.0631,2.6581;-0.0231,2.8604,1.5941;-0.7947,2.094,1.4809;-0.3047,3.77,1.0633;1.209,2.4285,0.9905;1.7559,1.3112,1.5184;1.2703,0.7059,2.4511;3.0096,0.913,0.7659;3.5607,1.8128,0.4769;3.6382,0.3124,1.4252;2.6861,0.1402,-0.5191;1.9942,0.7446,-1.127;3.9073,-0.0618,-1.2235;3.665,-1.1023,-2.1742;4.9169,-1.969,-2.267;5.7673,-1.3778,-2.6215;4.7473,-2.7939,-2.9659;5.1596,-2.3849,-1.2858;3.258,-0.5156,-3.5287;4.0859,0.0539,-3.9636;2.3972,0.1499,-3.4128;2.9806,-1.3185,-4.2185;2.5666,-1.8628,-1.6569;2.1179,-1.2991,-0.4041;0.5916,-1.3568,-0.3909;0.1968,-0.9472,0.5445;0.2586,-2.3964,-0.4751;0.1717,-0.7979,-1.234;2.7122,-2.1025,0.7595;2.3234,-1.7529,1.7215;3.8041,-2.0248,0.7728;2.4482,-3.1578,0.6376)|\",6.72665438436\r\n\"[H]N([H])N([H])[C@]1([H])C([H])([H])C([H])([H])N(C(=O)OC([H])([H])[H])C([H])([H])[C@@]1([H])C([H])([H])C([H])([H])[H] 
|(1.3559,1.4541,3.9847;2.2418,0.9713,4.1573;2.8484,1.616,4.658;2.9004,0.6318,2.9296;2.6938,1.3306,2.217;2.4661,-0.6882,2.4527;1.3583,-0.7486,2.4026;2.9571,-1.7637,3.4324;4.0465,-1.6746,3.5273;2.5239,-1.5725,4.4184;2.596,-3.1699,2.9579;1.5021,-3.3052,2.9667;3.0321,-3.9371,3.5975;3.0925,-3.385,1.6008;3.816,-4.5096,1.322;4.1356,-5.3609,2.1371;4.1619,-4.5792,0.0027;4.9451,-5.727,-0.3414;5.1266,-5.6426,-1.4142;4.4039,-6.65,-0.1167;5.8917,-5.732,0.2062;2.6658,-2.4046,0.6099;3.1381,-2.6495,-0.3381;1.5741,-2.4853,0.468;3.0101,-0.9585,1.029;4.1056,-0.873,1.0755;2.485,0.0409,-0.025;1.3862,-0.0078,-0.0528;2.7374,1.0634,0.2832;3.0432,-0.163,-1.4407;2.7144,0.6443,-2.1046;2.7116,-1.1062,-1.8868;4.14,-0.1632,-1.4357)|\",7.834157755895001\r\n\"[H]C(=O)/C(=C(\\[H])C([H])([H])C([H])([H])SC([H])([H])[H])C([H])(C([H])([H])[H])C([H])([H])[H] |(4.3532,2.0989,2.0658;4.3544,1.2615,1.3308;5.4093,0.7745,0.9647;3.001,0.8461,0.8893;1.9854,1.5296,1.4577;2.2569,2.3085,2.1747;0.5064,1.3607,1.267;0.1019,0.8199,2.139;0.2814,0.7377,0.3964;-0.2545,2.6948,1.1731;-0.0526,3.3075,2.0591;-1.3342,2.5085,1.1497;0.1943,3.7724,-0.2436;-0.468,2.795,-1.6387;-0.3316,3.4025,-2.5371;-1.5367,2.5962,-1.5094;0.0715,1.8531,-1.7729;2.8593,-0.2775,-0.1251;1.7885,-0.4224,-0.3129;3.5159,0.086,-1.4724;3.3403,-0.7121,-2.2035;4.5951,0.2154,-1.3582;3.099,1.0153,-1.8775;3.4076,-1.6107,0.4233;3.2319,-2.4155,-0.3004;2.9128,-1.8868,1.3617;4.4826,-1.5421,0.6079)|\",4.76199238375\r\n\"[H]C1=NC([H])=C(Cl)C(/C([H])=C(\\[H])C(=O)OC([H])([H])C([H])([H])[H])=C1[H] 
|(7.1217,-0.9854,4.6514;7.5685,-0.8385,3.6704;8.9047,-0.9327,3.6095;9.4737,-0.7476,2.4184;10.5574,-0.8222,2.3684;8.7475,-0.4676,1.2573;9.6413,-0.2588,-0.2381;7.3441,-0.3737,1.3004;6.545,-0.0795,0.1077;7.0709,0.3222,-0.7538;5.2223,-0.2777,-0.0195;4.606,-0.7075,0.7638;4.5362,0.0752,-1.2853;5.0706,0.5587,-2.2649;3.2182,-0.2221,-1.2033;2.4167,0.0704,-2.3734;3.0059,-0.1523,-3.2662;1.5785,-0.6274,-2.3061;1.9424,1.5161,-2.372;1.2725,1.6875,-3.2224;2.7913,2.1998,-2.461;1.3957,1.746,-1.4517;6.7668,-0.5644,2.57;5.6923,-0.4784,2.695)|\",4.715733029165\r\n\"[H]OB(O[H])C1([H])C([H])([H])C([H])([H])N(C(=O)[C@@]2([H])N([H])C([H])([H])C([H])([H])C2([H])[H])C([H])([H])C1([H])[H] |(4.9602,5.0666,-3.3909;4.3865,5.7983,-3.6537;3.0913,5.397,-3.8323;2.2267,6.3968,-4.1954;1.3258,6.07,-4.3184;2.5968,3.8893,-3.6316;2.1609,3.5646,-4.5948;1.4703,3.7895,-2.5738;1.832,4.1613,-1.6052;0.6141,4.4202,-2.8507;0.9726,2.3496,-2.3802;0.4719,2.0094,-3.3024;0.2658,2.3157,-1.5544;2.0883,1.4497,-2.0836;2.1497,0.5217,-1.0714;3.0984,-0.2546,-0.9622;1.0134,0.4061,-0.0388;1.4365,-0.2849,0.7057;0.5717,1.6649,0.5815;1.2824,2.0328,1.2115;-0.6672,1.322,1.2913;-0.4817,0.7128,2.1947;-1.2001,2.2278,1.5992;-1.4302,0.4905,0.2454;-2.1188,-0.2218,0.7095;-2.0239,1.1488,-0.3971;-0.3114,-0.2113,-0.5749;-0.4355,-0.0343,-1.6472;-0.3048,-1.2948,-0.4291;3.1618,1.4658,-3.0795;3.9325,0.7737,-2.7414;2.7656,1.0915,-4.0376;3.7095,2.8848,-3.2749;4.4789,2.8574,-4.0594;4.2071,3.1916,-2.3422)|\",6.813730816520001\r\n\"[H]O[C@](C([H])([H])[H])(C([H])([H])C([H])([H])C1=C([H])C([H])=C([H])C([H])=C1[H])C([H])([H])C([H])([H])C([H])([H])[H] 
|(4.3657,1.8635,1.0558;4.7853,1.0564,0.7172;4.3234,0.8671,-0.6352;2.7923,0.7291,-0.6297;2.399,0.4564,-1.6151;2.3232,1.6778,-0.3363;2.4784,-0.0336,0.0885;5.029,-0.4157,-1.1159;4.7476,-0.6158,-2.1567;6.1093,-0.2222,-1.1128;4.7542,-1.6726,-0.2636;3.6917,-1.9375,-0.324;4.9645,-1.4329,0.7836;5.5887,-2.8559,-0.7064;5.1072,-3.7643,-1.6589;4.1043,-3.633,-2.0609;5.8887,-4.8371,-2.0905;5.4931,-5.532,-2.8271;7.1726,-5.0206,-1.5743;7.7818,-5.8572,-1.9063;7.6657,-4.1242,-0.6241;8.6624,-4.2607,-0.2118;6.8797,-3.0535,-0.1966;7.2684,-2.3624,0.5485;4.7785,2.0917,-1.4647;4.3442,2.989,-0.9941;5.8665,2.1818,-1.3458;4.4238,2.1082,-2.9591;4.8705,1.2386,-3.4576;3.3376,2.0145,-3.0879;4.904,3.3866,-3.6564;4.6477,3.3796,-4.7218;5.9923,3.4971,-3.5757;4.4476,4.2782,-3.209)|\",6.51440558097\r\n\"[H]O[C@@](C([H])([H])[H])(C([H])([H])C([H])([H])C([H])([H])[H])C1([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C1([H])[H] |(3.3371,1.1518,1.1764;4.0975,0.595,0.9418;4.2377,0.6477,-0.4935;2.9477,0.0898,-1.126;3.036,-0.0321,-2.2111;2.109,0.7733,-0.938;2.6894,-0.8786,-0.691;4.412,2.1384,-0.8855;3.4964,2.6625,-0.5662;5.221,2.5621,-0.2791;4.6597,2.4623,-2.366;5.5977,1.9985,-2.6968;3.8655,2.0254,-2.9855;4.728,3.9717,-2.6275;4.9069,4.1848,-3.6874;5.5374,4.4378,-2.0523;3.7919,4.4668,-2.3409;5.4747,-0.2194,-0.8796;5.5087,-0.1561,-1.9762;5.36,-1.7286,-0.539;4.3858,-2.1262,-0.8431;6.1019,-2.2529,-1.1603;5.6365,-2.072,0.9346;4.8458,-1.6478,1.5619;5.6053,-3.1624,1.064;6.9991,-1.5291,1.3902;7.7991,-2.0622,0.8529;7.1519,-1.7345,2.4578;7.1239,-0.0232,1.115;8.1334,0.3239,1.3744;6.4191,0.5238,1.7496;6.8413,0.2984,-0.3624;7.6186,-0.1845,-0.9738;6.9498,1.374,-0.5414)|\",8.753902570585002\r\n\"[H]/N=C(O[H])\\C([H])=C([H])\\C(=N/C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])O[H] 
|(6.6336,2.223,2.3581;7.0844,1.682,3.097;6.7531,0.4571,3.0057;7.3426,-0.4177,3.87;6.8911,-1.2746,3.827;5.8097,-0.1609,2.0362;6.0476,-1.1703,1.6982;4.6832,0.4367,1.6118;4.4232,1.4119,2.0102;3.734,-0.178,0.6466;2.4798,-0.0387,0.5431;1.5994,0.7625,1.4104;0.1662,0.3951,0.9738;-0.5783,0.9569,1.5501;-0.0123,-0.675,1.1213;0.0262,0.6135,-0.0895;1.816,2.2699,1.1572;1.055,2.8544,1.6864;1.7321,2.4898,0.0878;2.7955,2.6206,1.4993;1.7561,0.4173,2.9064;0.9813,0.925,3.4917;2.727,0.7185,3.3108;1.6441,-0.6616,3.0605;4.3143,-1.0648,-0.2334;5.257,-0.8497,-0.3139)|\",4.634098874015\r\n\"[H]C1=C([H])C([H])=C(/C([H])=C(\\[H])C([H])([H])O/C([H])=C(\\[H])C([H])([H])[H])O1 |(10.4242,0.4344,-5.9422;10.414,0.4108,-4.8631;11.2621,-0.0937,-3.9246;12.1924,-0.6123,-4.1098;10.6698,0.2034,-2.6575;11.0576,-0.0443,-1.679;9.4977,0.8703,-2.9121;8.4868,1.4165,-2.0372;8.6709,1.2822,-0.9754;7.3769,2.0547,-2.4371;7.1811,2.1957,-3.4981;6.3349,2.6195,-1.5223;5.3514,2.1671,-1.7356;6.2219,3.7036,-1.6937;6.6866,2.3833,-0.1682;5.8318,2.8398,0.7897;6.2208,2.5879,1.7734;4.6782,3.5001,0.6358;4.3071,3.7411,-0.3569;3.8384,3.9445,1.8014;2.8314,3.507,1.7648;3.7092,5.0354,1.8139;4.2914,3.654,2.756;9.3374,0.9996,-4.2699)|\",4.8953281704950005\r\n\"[H]O[C@]1([H])C([H])([H])[C@]2(OC([H])([H])C([H])=C([H])[H])C([H])([H])C(=O)[C@@]1([H])C([H])([H])C2([H])[H] 
|(5.258,-3.6587,4.7333;4.7867,-2.9866,5.254;5.0286,-1.7152,4.669;4.8047,-0.9963,5.4643;4.1181,-1.431,3.4312;3.3267,-0.7167,3.684;3.6296,-2.3723,3.1544;4.9513,-0.8769,2.2527;4.1601,-0.4122,1.1614;3.1817,-1.298,0.6201;3.6247,-2.2876,0.4206;2.3467,-1.4461,1.3197;2.6858,-0.7033,-0.6654;3.4606,-0.4407,-1.3848;1.4005,-0.5091,-0.9562;1.0856,-0.1046,-1.9142;0.6102,-0.7516,-0.2486;5.9752,-1.956,1.8281;5.511,-2.8237,1.3472;6.6911,-1.5348,1.109;6.7233,-2.4428,3.0691;7.3186,-3.5037,3.12;6.5171,-1.5275,4.2639;7.1668,-1.8341,5.0877;6.739,-0.0555,3.8527;6.618,0.5786,4.7379;7.7724,0.0786,3.5148;5.7331,0.3554,2.7405;6.2468,0.8029,1.8833;5.0166,1.1004,3.1024)|\",5.90487055585\r\n\"[H]O[C@]1([H])C([H])([H])C([H])([H])C([H])([H])N(C([H])([H])C([H])=C([H])[H])[C@]1([H])C([H])=C([H])[H] |(7.581,-2.1918,2.6336;7.5452,-2.2653,1.6644;6.7627,-1.1781,1.1951;6.5314,-1.4152,0.1505;7.5095,0.1591,1.2393;7.8036,0.3796,2.2762;8.4362,0.0732,0.6605;6.6136,1.2811,0.6963;6.433,1.117,-0.3741;7.1076,2.2552,0.7967;5.2654,1.3138,1.4217;4.5985,2.0388,0.9441;5.411,1.6681,2.4624;4.6073,0.0053,1.3678;3.2225,0.0357,1.8473;2.8847,-1.0079,1.9154;3.1272,0.463,2.8633;2.3159,0.7768,0.8993;2.3648,0.4465,-0.1385;1.4838,1.7572,1.2515;0.828,2.2399,0.5318;1.4212,2.1106,2.2791;5.4016,-1.0937,1.9421;4.8722,-2.0283,1.7171;5.5903,-1.0239,3.4471;5.8496,-0.0542,3.8709;5.4794,-2.0737,4.2667;5.6364,-1.9873,5.3391;5.2149,-3.0617,3.8936)|\",5.83412095472\r\n\"[H]O[C@]([H])(C([H])=C([H])[H])[C@@]1([H])N(C([H])([H])C([H])=C([H])[H])C([H])([H])C([H])([H])C1([H])[H] 
|(5.9741,4.2643,-0.3;5.4199,4.3228,-1.0966;4.4226,3.3097,-0.978;3.757,3.501,-1.8299;3.6391,3.4136,0.3127;3.0644,2.5295,0.5834;3.6539,4.4799,1.1131;3.0895,4.5009,2.0411;4.2195,5.3721,0.8577;5.07,1.9227,-1.1446;5.7377,1.7894,-0.2768;4.1484,0.7626,-1.1026;2.9724,0.8695,-1.9878;2.439,1.791,-1.7243;3.2256,0.9461,-3.0609;2.0293,-0.2851,-1.7762;1.7686,-0.4822,-0.7363;1.5072,-1.0301,-2.7508;0.8026,-1.8314,-2.5449;1.7585,-0.8616,-3.7964;5.0367,-0.346,-1.5013;5.6171,-0.6493,-0.621;4.4474,-1.2082,-1.8223;5.99,0.2059,-2.6047;7.0069,-0.1794,-2.4748;5.6618,-0.0964,-3.6047;5.9186,1.7449,-2.4293;5.418,2.2221,-3.2803;6.9002,2.2164,-2.3387)|\",6.28582994655\r\n\"[H]OC([H])([H])P(C([H])([H])[H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] |(6.0692,-4.5188,-2.9597;6.5389,-4.3655,-2.1235;7.3766,-3.2294,-2.3055;8.2007,-3.4436,-3.0067;7.8284,-3.0325,-1.3284;6.4701,-1.7235,-2.9933;7.9503,-0.5846,-3.0822;8.4632,-0.4351,-2.1251;7.6525,0.3909,-3.4766;8.6668,-1.0165,-3.7902;5.4527,-1.061,-1.4953;5.0551,0.3817,-1.8653;4.4515,0.8358,-1.0722;4.4764,0.4224,-2.7952;5.9391,1.0179,-1.9897;6.2751,-1.0479,-0.1938;5.7465,-0.5188,0.6054;7.2352,-0.5353,-0.3261;6.479,-2.0611,0.1679;4.2389,-2.0452,-1.4213;3.8954,-2.2206,-2.4513;4.6504,-3.0063,-1.0885;2.9371,-1.78,-0.5913;2.1956,-3.1358,-0.5348;1.231,-3.0315,-0.023;2.7834,-3.8889,0.0038;1.9993,-3.5231,-1.5423;1.9993,-0.7705,-1.2879;1.0429,-0.7091,-0.7533;1.7823,-1.0809,-2.3175;2.4173,0.2388,-1.3247;3.1941,-1.3209,0.8567;2.2524,-1.3077,1.4202;3.6114,-0.3096,0.903;3.8814,-1.9968,1.3784)|\",7.012373927385\r\n\"[H]OC([H])([H])/C([H])=C(\\[H])C1=C(Cl)C([H])=C(Cl)C([H])=C1Cl 
|(2.8099,0.9065,-1.3611;3.1265,1.7275,-0.9519;2.5172,1.8168,0.3343;2.8734,1.0234,1.0106;2.8591,2.7748,0.7434;1.0172,1.7926,0.2466;0.5746,2.539,-0.4087;0.2559,0.9138,0.9117;0.7466,0.2121,1.5844;-1.2176,0.8282,0.8956;-1.9253,0.6315,2.1021;-1.0449,0.5399,3.6217;-3.3104,0.5208,2.1748;-3.8018,0.3807,3.1295;-4.0428,0.6004,0.9948;-5.7882,0.4662,1.0522;-3.4099,0.7783,-0.2307;-3.982,0.8231,-1.1489;-2.0203,0.8846,-0.2644;-1.3051,1.043,-1.863)|\",5.18648999053\r\n\"[H]C#C[C@]1([H])C([H])([H])N(C(=O)OC(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C([H])([H])[C@]1([H])O[H] |(7.3457,-4.0871,2.5213;7.2687,-3.0887,2.1546;7.1856,-1.9606,1.7333;7.0651,-0.5987,1.2283;6.6538,0.047,2.017;6.1571,-0.4588,-0.0169;5.0939,-0.4074,0.2266;6.292,-1.3082,-0.7024;6.6171,0.7932,-0.6145;5.7891,1.514,-1.427;4.6783,1.1302,-1.7655;6.3848,2.6787,-1.7868;5.6725,3.6645,-2.6163;5.3382,3.062,-3.9851;4.8922,3.8314,-4.6258;4.6351,2.2342,-3.8849;6.249,2.6985,-4.4737;4.4255,4.1609,-1.8761;3.9607,4.975,-2.4433;4.7019,4.5488,-0.8893;3.6977,3.3573,-1.7517;6.7038,4.7876,-2.7559;6.29,5.6058,-3.3548;7.6095,4.4211,-3.2497;6.9809,5.184,-1.7736;7.9001,1.2561,-0.0718;8.6109,1.5063,-0.8642;7.7619,2.1488,0.5531;8.3998,0.057,0.755;9.009,0.3768,1.6036;9.227,-0.8203,0.0146;8.6962,-1.233,-0.6857)|\",7.616466675495001\r\n\"[H]O[C@]([H])(C1=C([H])C([H])=C(N(=O)=O)O1)C([H])([H])C([H])=C([H])[H] |(5.8577,1.2128,2.2456;5.6068,0.4396,1.7126;4.9189,0.9192,0.5588;4.7263,0.0309,-0.0508;5.7922,1.8354,-0.2454;6.2816,1.8128,-1.5256;6.1025,1.041,-2.261;7.0668,2.9894,-1.6885;7.6169,3.3219,-2.5555;6.9959,3.636,-0.4878;7.5898,4.8569,-0.049;8.2769,5.453,-0.8844;7.3776,5.2201,1.1078;6.2263,2.9591,0.4001;3.5695,1.5799,0.9284;3.7886,2.4562,1.5578;3.0905,1.9596,0.0172;2.6494,0.6319,1.6476;3.0656,0.1522,2.532;1.4048,0.3485,1.2628;0.7801,-0.3454,1.8186;0.9571,0.8046,0.3817)|\",4.48443625624\r\n\"[H]O[C@]1([H])C([H])([H])/C([H])=C([H])/C([H])=C(/[H])C([H])([H])C([H])([H])C(=O)OC([H])([H])C1([H])[H] 
|(10.3183,4.0609,-1.8411;9.9966,4.0248,-2.7567;8.644,3.5741,-2.7202;8.3544,3.5366,-3.7786;7.767,4.635,-2.0057;8.1861,4.8015,-1.0022;7.8986,5.573,-2.5626;6.3073,4.2936,-1.8768;5.7706,4.0678,-2.7991;5.6389,4.2568,-0.714;6.1686,4.5686,0.1908;4.2285,3.8692,-0.5174;3.6145,4.6064,0.0033;3.6411,2.7036,-0.8375;2.5832,2.6076,-0.5969;4.2811,1.4721,-1.4523;3.5489,0.6583,-1.4747;4.5355,1.6627,-2.5024;5.527,0.9825,-0.6686;5.2035,0.6332,0.3149;6.2127,1.817,-0.5165;6.2489,-0.1963,-1.3107;6.0228,-1.3431,-1.007;7.2048,0.0451,-2.2565;7.4189,1.35,-2.817;7.701,1.1597,-3.8576;6.4869,1.915,-2.8307;8.5301,2.1488,-2.1301;9.4845,1.6351,-2.2873;8.3547,2.1941,-1.0472)|\",5.8776591708\r\n\"[H]C([H])([H])C([H])([H])SC(=O)C([H])([H])[C@@]([H])(C([H])([H])[H])C([H])([H])[C@@]([H])(C([H])([H])[H])C([H])([H])C([H])(C([H])([H])[H])C([H])([H])[H] |(5.2764,0.6259,5.2993;5.9521,0.0231,4.6841;6.8194,-0.2525,5.2969;5.4344,-0.8965,4.3972;6.4134,0.7879,3.4456;7.0663,0.1694,2.8241;6.9501,1.7035,3.7115;5.0339,1.3342,2.3605;4.7971,-0.1654,1.3789;5.4868,-1.1525,1.5275;3.6537,-0.0511,0.38;2.8358,-0.6724,0.7716;3.29,0.9829,0.37;3.9966,-0.499,-1.0657;3.1243,-0.1993,-1.6621;5.2142,0.2671,-1.6076;5.3558,0.087,-2.678;6.1334,-0.0378,-1.0936;5.0942,1.3483,-1.4691;4.1476,-2.0312,-1.1735;5.008,-2.3359,-0.5668;3.2756,-2.5065,-0.7027;4.3325,-2.6047,-2.5949;5.2526,-2.1606,-3.0027;4.5888,-4.1197,-2.5113;4.7408,-4.5536,-3.5072;3.753,-4.6508,-2.0404;5.4838,-4.3292,-1.914;3.2301,-2.2645,-3.6324;3.1352,-1.1729,-3.7286;3.617,-2.6064,-4.6029;1.8123,-2.8654,-3.4589;1.9093,-3.8892,-3.0715;0.9206,-2.0744,-2.4868;-0.0735,-2.5317,-2.4124;0.7821,-1.0435,-2.8395;1.3361,-2.0282,-1.4761;1.1159,-2.9613,-4.8278;0.109,-3.3857,-4.734;1.6834,-3.592,-5.5222;1.0154,-1.9688,-5.2872)|\",6.304877916084999\r\n\"[H]/C(C(=O)SC([H])([H])C([H])([H])[H])=C(/[H])C([H])([H])[C@@]([H])(C([H])([H])[H])C([H])([H])C([H])(C([H])([H])[H])C([H])([H])[H] 
|(6.767,0.2453,-0.4032;6.294,0.0057,0.5471;6.0187,1.1238,1.4768;5.4935,1.0131,2.569;6.5495,2.7234,0.8222;6.0495,3.7978,2.2279;4.9883,3.6196,2.4188;6.6022,3.4703,3.1122;6.328,5.2631,1.9001;6.0272,5.8924,2.7453;5.7676,5.5916,1.0185;7.3925,5.4392,1.712;5.9738,-1.2559,0.8712;5.5003,-1.4184,1.8403;6.2139,-2.4656,0.0209;6.9174,-3.1187,0.5574;6.7038,-2.1777,-0.9194;4.9257,-3.27,-0.2925;4.4365,-3.5002,0.6649;3.9446,-2.4335,-1.1279;3.6896,-1.4928,-0.6288;3.0143,-2.9845,-1.3074;4.3775,-2.1839,-2.1057;5.2424,-4.5957,-1.016;4.2886,-5.0568,-1.3139;5.7645,-4.3573,-1.9554;6.0642,-5.654,-0.2508;7.0278,-5.2096,0.0392;5.362,-6.1349,1.0282;5.9581,-6.9022,1.5361;4.3842,-6.5763,0.7935;5.1965,-5.3217,1.7434;6.3731,-6.8438,-1.1732;6.9915,-7.5914,-0.6626;6.9088,-6.5231,-2.0747;5.4488,-7.3413,-1.4947)|\",5.26812414568\r\n\"[H]/C(=C(/[H])[C@@]([H])(C([H])([H])[H])C([H])([H])C([H])(C([H])([H])[H])C([H])([H])[H])C([H])([H])C(=O)SC([H])([H])C([H])([H])[H] |(5.5228,-2.3297,-1.0841;5.9413,-1.3369,-1.2436;5.4729,-0.5593,-2.2228;5.8991,0.4392,-2.3561;4.3683,-0.9265,-3.1827;4.0523,-1.9551,-2.9639;3.1504,-0.0051,-2.9695;2.7713,-0.0838,-1.9451;2.3377,-0.2663,-3.6579;3.4146,1.0448,-3.1493;4.8424,-0.8518,-4.6533;3.9819,-1.0635,-5.3059;5.1339,0.1881,-4.8665;6.0065,-1.7818,-5.0477;6.8466,-1.5764,-4.3686;5.6441,-3.2682,-4.9089;6.4806,-3.9047,-5.221;4.7812,-3.5223,-5.5391;5.3953,-3.5366,-3.8767;6.4706,-1.4722,-6.4792;7.3223,-2.1008,-6.7655;6.7769,-0.4243,-6.5837;5.665,-1.6565,-7.2023;7.0361,-0.9365,-0.2878;7.3761,0.0866,-0.4791;7.9066,-1.6001,-0.4082;6.5958,-1.0999,1.1666;6.0773,-2.1094,1.5917;6.9447,0.3359,2.197;6.2973,-0.2708,3.8084;6.5815,-1.3238,3.8798;6.8489,0.2913,4.5679;4.788,-0.0985,3.9619;4.4702,-0.4621,4.947;4.257,-0.6749,3.1996;4.4934,0.9521,3.8735)|\",6.103513666715\r\n\"[H]/C(C(=O)SC([H])([H])C([H])([H])[H])=C([H])\\C([H])=C(/[H])C([H])([H])C([H])(C([H])([H])[H])C([H])([H])[H] 
|(5.9345,-1.7474,-0.1893;5.2189,-0.9314,-0.1178;4.4791,-0.5586,-1.3345;3.6527,0.3341,-1.4076;4.9307,-1.5781,-2.764;3.8196,-0.8359,-4.0249;3.8172,0.241,-3.8373;4.3095,-1.0222,-4.9852;2.4,-1.3978,-3.9956;1.7974,-0.9297,-4.7841;1.9254,-1.1856,-3.0339;2.3938,-2.4806,-4.1573;5.0127,-0.2784,1.0467;4.2813,0.5291,1.049;5.6916,-0.5716,2.2891;6.4256,-1.3783,2.2782;5.4628,0.0999,3.4347;4.7274,0.9051,3.4201;6.1383,-0.1793,4.7438;6.5985,0.7488,5.1189;6.9558,-0.8943,4.5849;5.2005,-0.7296,5.852;4.7351,-1.6453,5.4597;6.0217,-1.104,7.0938;5.3818,-1.5316,7.8741;6.5175,-0.2208,7.5172;6.7984,-1.8407,6.8558;4.0817,0.2573,6.2158;3.4455,-0.1527,7.0086;3.4359,0.4823,5.3602;4.4976,1.2053,6.5824)|\",4.449061455675\r\n\"[H]OC(=O)C([H])([H])C([H])([H])C([H])([H])[C@@]1([H])C([H])([H])C(=O)C([H])([H])[C@@]1([H])C([H])([H])/C([H])=C(\\[H])C([H])([H])C([H])([H])[H] |(5.1604,-6.3237,3.1179;4.2422,-6.0233,3.0154;4.0597,-5.4918,1.7708;2.9752,-5.0751,1.4518;5.3014,-5.4498,0.8894;5.7122,-6.4651,0.7972;4.9762,-5.1344,-0.1052;6.389,-4.4943,1.4268;6.7313,-4.8325,2.4168;7.2607,-4.5659,0.7653;5.9206,-3.0345,1.5175;5.0285,-2.9796,2.1567;5.6081,-2.7003,0.5203;6.9887,-2.0831,2.0686;7.3309,-2.4904,3.0333;8.2394,-1.8981,1.1863;7.9715,-1.8278,0.1235;8.9942,-2.6851,1.2821;8.8236,-0.5449,1.6037;9.904,-0.1065,1.275;7.8152,0.1586,2.5189;8.2058,0.0827,3.5439;7.7491,1.2263,2.2859;6.4964,-0.6293,2.3773;5.9518,-0.6418,3.3293;5.5248,0.0017,1.3449;4.6097,-0.6003,1.2922;5.2209,0.9732,1.7654;6.0513,0.2376,-0.0485;6.929,0.882,-0.1347;5.4942,-0.2271,-1.1721;4.6038,-0.8569,-1.0989;5.9885,0.0454,-2.5676;5.1888,0.5317,-3.1457;6.8232,0.7571,-2.5321;6.4269,-1.2292,-3.3094;6.7433,-0.9988,-4.333;5.6089,-1.9575,-3.3692;7.2666,-1.7104,-2.7953)|\",5.9484087719300005\r\n\"[H]OC1=NC(=S)S/C1=C(\\[H])C1=C([H])C([H])=C([H])C(N([H])[H])=C1[H] 
|(0.7077,-2.5573,-0.2719;0.7477,-3.5304,-0.178;-0.2985,-3.9726,0.5158;-0.3884,-5.2486,0.7439;-1.5372,-5.6002,1.4044;-1.9648,-7.1139,1.8773;-2.5967,-4.1673,1.728;-1.4029,-3.0969,0.9809;-1.5841,-1.762,0.8194;-2.5653,-1.3683,1.0816;-0.6316,-0.756,0.3212;-1.0819,0.2202,-0.5844;-2.1139,0.2106,-0.9212;-0.1885,1.1763,-1.0654;-0.5314,1.9207,-1.7787;1.1395,1.1957,-0.6496;1.8185,1.9572,-1.0249;1.6052,0.2513,0.2848;2.8975,0.3212,0.7816;3.5734,0.81,0.2098;3.2716,-0.5234,1.1937;0.7067,-0.7225,0.7604;1.0329,-1.4046,1.5423)|\",3.006858048025\r\n\"[H]/C1=C(\\O[Si](C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])O[C@]([H])(C([H])([H])[H])[C@]2([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[C@@]12[H] |(5.0379,1.2641,-2.1897;4.2006,1.3543,-2.8739;3.2335,0.4251,-2.8145;3.2385,-0.5833,-1.923;2.1085,-1.8483,-1.7081;2.8509,-2.8005,-0.2643;2.229,-3.6649,-0.0018;2.9354,-2.1649,0.6243;3.8535,-3.1691,-0.5082;0.4161,-1.1579,-1.249;-0.2859,-1.9702,-1.0223;-0.0011,-0.5623,-2.0663;0.4835,-0.5181,-0.3615;2.0235,-2.9349,-3.2461;1.3839,-3.8083,-3.0671;3.019,-3.3017,-3.5214;1.6166,-2.3827,-4.0985;2.1284,0.3858,-3.6242;2.1277,1.2771,-4.7723;1.0636,1.3817,-5.0119;2.837,0.5903,-5.9388;2.7908,1.2091,-6.8414;2.3527,-0.3668,-6.1573;3.8877,0.3948,-5.7052;2.6848,2.6463,-4.3553;2.0693,2.9629,-3.4987;2.5796,3.7408,-5.4275;3.1884,3.4709,-6.3016;1.5436,3.8344,-5.7819;3.0699,5.0874,-4.8653;3.0428,5.8548,-5.6493;2.3754,5.4188,-4.0798;4.4864,4.9805,-4.2749;4.7778,5.9374,-3.8235;5.2019,4.79,-5.0886;4.5928,3.8487,-3.2395;3.9729,4.0821,-2.362;5.6258,3.7593,-2.8778;4.1295,2.5072,-3.8392;4.7842,2.3153,-4.7091)|\",6.90624952569\r\n\"[H]C1=C([H])C([H])=C(C([H])([H])/C([H])=C(\\[H])C2=C(OC([H])([H])[H])C([H])=C([H])C([H])=C2[H])C([H])=C1[H] 
|(-4.3658,-6.5808,-0.0236;-3.4099,-6.1413,-0.2961;-3.0016,-6.1159,-1.6299;-3.6391,-6.5362,-2.4038;-1.7715,-5.55,-1.9751;-1.4585,-5.5355,-3.017;-0.9315,-5.0035,-0.9982;0.4119,-4.3906,-1.3711;1.2255,-4.9638,-0.9062;0.5473,-4.4856,-2.4583;0.5358,-2.9424,-0.9711;-0.2183,-2.2756,-1.3895;1.4951,-2.4606,-0.165;2.2595,-3.1457,0.1947;1.642,-1.0632,0.2713;2.9021,-0.5964,0.7233;3.9178,-1.513,0.7007;5.197,-1.1222,1.1677;5.1662,-0.8065,2.2191;5.6181,-0.3096,0.5607;5.8319,-2.0056,1.075;3.0638,0.726,1.1492;4.0296,1.0808,1.4898;1.9767,1.6032,1.1311;2.1146,2.6281,1.465;0.7271,1.1641,0.6973;-0.1237,1.8392,0.6943;0.5714,-0.1575,0.281;-0.409,-0.5098,-0.0263;-1.3553,-5.0303,0.3387;-0.718,-4.5973,1.1061;-2.5809,-5.5962,0.6884;-2.8912,-5.6101,1.7302)|\",4.87083792395\r\n\"[H]/C(=C(/SC([H])([H])[H])[S@](=O)C([H])([H])[H])C(F)(F)F |(3.2455,-3.3264,1.9816;2.6901,-2.5216,1.5004;3.3793,-1.5676,0.8738;2.7493,-0.1434,0.0124;2.6046,-0.8335,-1.686;2.2756,-0.0072,-2.3214;1.858,-1.6289,-1.6983;3.5688,-1.2017,-2.0428;5.2094,-1.8045,0.835;5.5195,-3.0664,1.6067;5.615,-0.3846,1.9306;5.2446,0.5425,1.4877;6.7046,-0.358,2.0099;5.1719,-0.5635,2.9135;1.1919,-2.5965,1.6036;0.8307,-3.74,2.2206;0.6715,-1.5739,2.3109;0.5897,-2.591,0.3899),wU:8.8|\",4.88172247797\r\n\"[H]OC(=O)/C1=C(/[H])[C@]2([H])C(C([H])([H])[H])(C([H])([H])[H])[C@]2([H])C([H])([H])C([H])([H])C(C([H])([H])[H])C([H])C([H])([H])C1([H])[H] 
|(1.841,-4.6579,-1.2541;0.9047,-4.46,-1.4234;0.7945,-3.177,-1.8712;-0.2969,-2.7323,-2.1518;2.0657,-2.4187,-2.0574;3.1951,-2.5789,-1.3331;4.039,-1.9837,-1.6782;3.5994,-3.3771,-0.143;4.396,-4.0725,-0.4243;2.8704,-3.8087,1.1213;3.2039,-5.1941,1.6594;3.0175,-5.2439,2.7396;4.2554,-5.4524,1.4898;2.5851,-5.966,1.1832;1.432,-3.4006,1.3997;1.2918,-3.2381,2.4763;0.7375,-4.1845,1.0863;1.1431,-2.4852,0.8796;3.9522,-2.7366,1.2209;4.9069,-3.1155,1.5831;3.6666,-1.3315,1.7885;2.6584,-1.3088,2.2134;4.335,-1.2381,2.6535;3.8583,-0.0347,0.941;4.7956,-0.0926,0.3743;3.9777,0.7851,1.6657;2.7003,0.2678,0.0106;1.3621,0.5099,0.6713;0.5979,0.8279,-0.0407;1.4626,1.3039,1.4245;0.974,-0.368,1.2003;2.8827,0.2921,-1.3204;3.9109,0.2121,-1.6805;1.8453,0.1249,-2.3902;1.9414,0.8804,-3.1818;0.8298,0.1955,-1.9911;2.0159,-1.284,-3.0651;2.9512,-1.277,-3.6386;1.1948,-1.4347,-3.7703)|\",5.044990788269999\r\n\"[H]O[C@@]1([H])C(C([H])([H])[H])C([H])C([H])([H])C([H])([H])/C(C([H])=O)=C(\\[H])[C@]2([H])C(C([H])([H])[H])(C([H])([H])[H])[C@]2([H])C1([H])[H] 
|(5.148,-0.8945,-0.1769;5.0303,-0.0648,0.3125;3.7082,-0.0765,0.832;3.6929,0.7422,1.5652;2.6219,0.1911,-0.2036;1.2135,0.2885,0.3478;0.6057,0.988,-0.2326;1.2212,0.6466,1.3845;0.6825,-0.6727,0.3444;2.8984,0.2834,-1.5131;3.9475,0.2873,-1.806;1.9001,0.1277,-2.6225;2.0681,0.8542,-3.4282;0.8771,0.2743,-2.2637;2.0053,-1.3085,-3.2587;2.9281,-1.3508,-3.8505;1.1651,-1.4508,-3.9437;2.0239,-2.4188,-2.2273;0.8043,-3.2143,-2.032;0.9079,-4.0854,-1.3563;-0.2553,-2.9999,-2.6019;3.1549,-2.6011,-1.5035;4.0081,-2.0292,-1.8713;3.5763,-3.3905,-0.3178;4.4282,-4.0139,-0.6024;2.8878,-3.9124,0.9247;3.3606,-5.2711,1.426;2.8305,-6.0857,0.9166;3.1748,-5.3738,2.5026;4.4342,-5.4137,1.2582;1.4163,-3.6717,1.2159;0.8102,-4.5096,0.8528;1.0268,-2.7617,0.7565;1.2539,-3.599,2.299;3.8642,-2.7443,1.067;4.8442,-3.0366,1.4415;3.4289,-1.3846,1.6401;3.9717,-1.2867,2.5875;2.3694,-1.402,1.908)|\",4.6422622895300005\r\n\"[H]C1=NC([H])=C([H])C(/C(=C(\\[H])C([H])([H])C([H])([H])C([H])([H])[H])C([H])([H])C([H])([H])C([H])([H])[H])=C1OC([H])([H])[H] |(9.2691,1.4763,-2.4944;8.7177,0.5571,-2.6709;9.4585,-0.5028,-3.0167;8.8137,-1.6475,-3.2517;9.4284,-2.5024,-3.5282;7.428,-1.7747,-3.1571;6.9656,-2.7369,-3.3572;6.6326,-0.6838,-2.7873;5.1455,-0.8113,-2.6948;4.5082,-0.3953,-1.5848;5.1218,0.0331,-0.7952;3.0296,-0.4098,-1.2975;2.5056,-1.1257,-1.9398;2.8751,-0.7509,-0.2631;2.3631,0.978,-1.456;2.4526,1.2947,-2.5034;1.289,0.8707,-1.2512;2.9504,2.0623,-0.5461;2.408,3.0078,-0.6621;4.0038,2.2485,-0.7818;2.888,1.7719,0.5106;4.4564,-1.4132,-3.9057;4.9415,-2.3615,-4.1772;3.4173,-1.6632,-3.667;4.4772,-0.4871,-5.139;3.9852,0.46,-4.8809;5.5169,-0.2356,-5.3851;3.796,-1.1136,-6.3595;3.8244,-0.4372,-7.2214;2.7438,-1.3462,-6.153;4.2898,-2.0484,-6.6528;7.3221,0.5281,-2.5386;6.5791,1.6203,-2.193;7.2436,2.8621,-2.0136;6.462,3.5834,-1.7683;7.7586,3.1812,-2.9292;7.9677,2.8188,-1.1897)|\",5.232749345115\r\n\"[H]OC(=O)[C@@]1([H])N([H])[C@]([H])(C(=O)[C@]([H])(N([H])[H])C([H])([H])O[H])C([H])([H])C1([H])[H] 
|(0.6829,1.9811,-1.7792;0.9749,1.7928,-2.71;1.5568,0.5887,-2.6321;2.0066,-0.0026,-3.5864;1.6175,0.0207,-1.1954;2.6703,0.0492,-0.8943;0.7927,0.8595,-0.2875;1.3279,1.1118,0.5417;-0.3949,0.0927,0.139;-1.3078,0.6934,0.0542;-0.2875,-0.334,1.6029;0.7614,-0.2003,2.2197;-1.5589,-0.8505,2.2919;-2.1384,-1.4366,1.5678;-2.3265,0.3623,2.6124;-1.9664,0.7162,3.4995;-3.3036,0.1213,2.7695;-1.2234,-1.7228,3.5157;-0.5881,-2.5715,3.2144;-2.151,-2.1338,3.9297;-0.6239,-0.9571,4.5468;0.2213,-0.6471,4.1744;-0.4427,-1.1353,-0.8143;-0.9588,-1.9974,-0.3815;-0.9806,-0.8575,-1.7276;1.0383,-1.4006,-1.1214;1.201,-1.952,-2.0499;1.5083,-1.9541,-0.3009)|\",5.23547048362\r\n\"[H]C1=NC([H])=C([H])C([C@@]2([H])N([H])N([H])[C@]([H])(C3([H])C([H])([H])C([H])([H])N([H])C([H])([H])C3([H])[H])C2([H])[H])=C1[H] |(-1.8315,-1.5452,0.1695;-1.5477,-0.4939,0.1516;-2.4883,0.3753,0.5384;-2.1509,1.6718,0.517;-2.9232,2.3713,0.8335;-0.9002,2.1444,0.1234;-0.7004,3.2129,0.138;0.085,1.2303,-0.2767;1.4551,1.6803,-0.7385;2.0607,0.7837,-0.9155;1.3753,2.4364,-2.0323;0.4312,2.8023,-2.145;2.2407,3.5751,-1.9513;3.1616,3.2521,-2.2549;2.3222,3.9595,-0.5255;1.4289,4.5671,-0.3134;3.5635,4.8088,-0.2253;4.4494,4.2009,-0.4823;3.6708,5.1735,1.2689;2.76,5.7023,1.5808;3.7533,4.2684,1.8832;4.8775,6.0811,1.5334;5.8063,5.5055,1.3394;4.8979,6.3804,2.5879;4.7801,7.2847,0.7088;5.5426,7.9195,0.9358;4.8085,6.9661,-0.7194;4.775,7.9037,-1.2863;5.7376,6.4378,-1.0183;3.602,6.0934,-1.0779;3.6202,5.8435,-2.144;2.6852,6.67,-0.897;2.2136,2.6229,0.2463;3.2137,2.2215,0.4514;1.6986,2.7246,1.2058;-0.2669,-0.1222,-0.2607;0.4482,-0.8809,-0.5678)|\",5.287172115215001\r\n\"[H]O[C@@]([H])(C([H])([H])/C([H])=C(\\[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])=C([H])[H])[C@]([H])(N([H])[H])C([H])([H])[H] 
|(14.5148,2.656,-3.6867;13.6076,2.7343,-4.0473;13.1984,4.0297,-3.6313;12.4229,4.3569,-4.3358;12.5759,3.9798,-2.2164;13.3577,3.6531,-1.5119;12.2564,4.9823,-1.9034;11.4122,3.0292,-2.1467;11.6202,2.0141,-2.4837;10.1836,3.3471,-1.729;9.983,4.3716,-1.4044;9.0139,2.4013,-1.6784;9.3488,1.3879,-1.9365;8.6247,2.3466,-0.6494;7.8629,2.8159,-2.6157;7.5365,3.8351,-2.3613;8.2446,2.8647,-3.6448;6.659,1.8666,-2.557;6.9977,0.8475,-2.7892;6.2717,1.8307,-1.5274;5.5258,2.2736,-3.5096;5.2422,3.3136,-3.2941;5.9022,2.2699,-4.5435;4.2646,1.3978,-3.4269;3.8783,1.4037,-2.3979;3.4792,1.8496,-4.0491;4.4697,-0.0628,-3.8759;4.8976,-0.058,-4.8911;5.2029,-0.5606,-3.2298;3.189,-0.8527,-3.8783;2.4058,-0.4811,-4.5425;2.9455,-1.9317,-3.1337;1.9935,-2.4547,-3.1733;3.6935,-2.3378,-2.4549;14.4132,4.9817,-3.7633;14.6871,4.9464,-4.8252;15.5438,4.3643,-3.0388;15.4967,4.5949,-2.047;16.4297,4.7318,-3.3789;14.1317,6.438,-3.3786;13.2544,6.8301,-3.9082;14.9868,7.0755,-3.632;13.9456,6.5406,-2.3029)|\",6.914412941205001\r\n\"[H]/C1=C(/[H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])/C([H])=C(\\C([H])([H])[H])C([H])([H])C([H])([H])[C@@]([H])(N([H])[H])[C@@]([H])(C([H])([H])[H])C1=O 
|(3.8082,-3.6288,-2.8752;3.0956,-3.3587,-2.0985;2.3195,-2.272,-2.2178;1.5839,-2.0779,-1.4398;2.3675,-1.1878,-3.2789;3.647,-1.2606,-4.1289;3.6518,-0.4562,-4.8734;3.7135,-2.2106,-4.6706;4.5498,-1.1622,-3.5162;1.1346,-1.3037,-4.2037;1.0902,-0.4529,-4.895;0.2023,-1.3174,-3.6268;1.1769,-2.2239,-4.7967;2.2923,0.1995,-2.5237;2.6399,0.9762,-3.2198;1.2395,0.4182,-2.3183;3.0908,0.2175,-1.2443;4.1522,-0.0035,-1.3682;2.6366,0.3457,0.0167;1.2176,0.7149,0.3834;1.215,1.6302,0.9914;0.7405,-0.0613,0.9967;0.5772,0.8959,-0.4824;3.5317,0.0249,1.2024;4.5882,0.114,0.9153;3.3636,0.745,2.0147;3.2721,-1.4162,1.7373;2.2002,-1.5226,1.9328;3.7709,-1.542,2.7081;3.7902,-2.4878,0.7383;3.8356,-1.9829,-0.2247;5.168,-2.9457,0.9862;5.2364,-3.3542,1.919;5.8063,-2.1506,0.9771;2.9176,-3.7632,0.5102;3.3219,-4.5687,1.1345;1.4155,-3.6601,0.8496;1.2698,-3.5478,1.9292;0.903,-4.5787,0.5436;0.9141,-2.8187,0.3608;3.115,-4.3005,-0.9306;3.3426,-5.4848,-1.1172)|\",5.107576973884999\r\n\"[H]OON1C([H])=C(C([H])([H])[C@]([H])(N([H])[H])C([H])([H])O[H])C2=C1/C([H])=C([H])\\C([H])=C/2[H] |(1.8265,3.9182,-0.9377;2.3965,3.1437,-0.7736;1.8612,2.7399,0.5938;1.0223,1.7067,0.4489;1.4452,0.3952,0.2296;2.4708,0.1303,0.4463;0.386,-0.3645,-0.1797;0.4087,-1.84,-0.4605;-0.2986,-2.3714,0.1871;1.4026,-2.2319,-0.2081;0.0957,-2.248,-1.9213;-0.9644,-2.0467,-2.1295;0.2931,-3.7003,-2.0132;1.2996,-3.8744,-2.0329;-0.0579,-4.0339,-2.9111;0.9194,-1.4507,-2.9487;0.6496,-0.3856,-2.9273;0.6955,-1.8286,-3.9536;2.324,-1.6283,-2.7638;2.599,-1.0013,-2.0771;-0.7572,0.5344,-0.2732;-0.3168,1.8311,0.0855;-1.1506,2.9462,0.0894;-0.7812,3.9256,0.378;-2.4795,2.7439,-0.2766;-3.1629,3.5881,-0.2892;-2.9529,1.4631,-0.6156;-3.9985,1.3361,-0.8817;-2.1073,0.3572,-0.6115;-2.4894,-0.6277,-0.8645)|\",4.693963921125\r\n\"[H]OC(=O)C(=O)/C([H])=C(\\[H])C1=C(OC([H])([H])[H])C([H])=C([H])C([H])=C1[H] 
|(3.3355,-5.8678,2.1084;3.504,-5.6361,1.1755;3.335,-4.3057,1.0886;3.0305,-3.6031,2.0299;3.5738,-3.7806,-0.3454;3.8765,-4.5628,-1.2337;3.4045,-2.3313,-0.4933;3.1336,-1.7652,0.3845;3.5958,-1.7663,-1.7097;3.8675,-2.4601,-2.506;3.5023,-0.3797,-2.137;3.1672,0.7201,-1.2966;2.9123,0.4358,0.0063;2.592,1.4948,0.8999;2.4475,1.024,1.8732;1.6682,2.005,0.6001;3.409,2.2239,0.9636;3.1117,2.0159,-1.8215;2.857,2.8537,-1.1835;3.3842,2.2429,-3.1714;3.3355,3.2573,-3.5578;3.7142,1.1835,-4.0162;3.9252,1.3581,-5.0665;3.7684,-0.103,-3.4929;4.0245,-0.9375,-4.1407)|\",3.752449998395\r\n\"[H]O/C(C(=O)C([H])([H])C([H])([H])C([H])([H])[H])=C(\\[H])OP(=O)(O[H])O[H] |(5.04,-1.3357,1.1995;4.5429,-0.8808,0.5001;5.3083,0.1305,-0.0271;4.5764,0.9242,-1.0578;5.1648,1.5943,-1.9049;3.0647,0.8591,-1.0228;2.7807,-0.193,-1.1732;2.7355,1.0854,0.0018;2.3801,1.7675,-2.0458;2.6956,2.804,-1.8776;2.7322,1.5077,-3.0506;0.8534,1.6643,-1.9795;0.3841,2.3215,-2.7194;0.4746,1.9522,-0.9911;0.5125,0.6411,-2.1794;6.5607,0.3842,0.4024;7.0314,-0.2355,1.1656;7.2897,1.477,0.013;8.5378,1.2692,-1.054;9.409,0.1186,-0.7454;7.7939,1.3064,-2.4512;6.8193,1.4789,-2.339;9.2416,2.7133,-0.9487;10.0509,2.6458,-0.4156)|\",4.781040353285001\r\n\"[H]N([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[C@]([H])(B(OC([H])([H])C([H])([H])[H])OC([H])([H])C([H])([H])[H])N([H])[H] 
|(10.5899,-1.5972,4.3026;10.0388,-2.455,4.33;10.7106,-3.2123,4.4469;9.3263,-2.6157,3.0561;9.9886,-2.6257,2.1695;8.8258,-3.5928,3.0785;8.2759,-1.5169,2.8826;7.6087,-1.5336,3.7538;8.7759,-0.5358,2.8968;7.4635,-1.6524,1.5884;6.9729,-2.6367,1.5818;8.144,-1.6437,0.7233;6.4094,-0.5511,1.4113;6.917,0.4267,1.4371;5.7212,-0.5588,2.269;5.5766,-0.6599,0.1017;6.2883,-0.651,-0.7377;4.591,0.5896,-0.0108;4.8142,1.7293,-0.7342;5.9586,1.9287,-1.5647;6.8757,1.8402,-0.9669;5.9941,1.1544,-2.3417;5.8668,3.3102,-2.1927;6.7312,3.4944,-2.8404;5.8411,4.0842,-1.4187;4.9565,3.3982,-2.7948;3.4412,0.5137,0.7232;2.4673,1.5629,0.7158;2.9208,2.4866,1.0965;2.1473,1.7592,-0.3147;1.2907,1.1382,1.5792;0.5261,1.9231,1.5961;1.6145,0.9482,2.6081;0.837,0.2212,1.1887;4.7729,-1.8889,-0.0394;4.0117,-1.8684,0.637;5.3327,-2.7149,0.1675)|\",7.25999753134\r\n\"[H]OC1=NC(=O)N([C@@]2([H])O[C@]([H])(C([H])([H])[H])[C@]([H])(O[H])C2([H])[H])C([H])([H])C1([H])[H] |(8.6269,-0.0724,3.1917;8.6965,-0.5531,2.352;7.654,-0.2324,1.5559;7.5903,-0.8038,0.4184;6.5578,-0.4617,-0.486;6.6449,-0.7759,-1.6605;5.4671,0.2644,0.0046;4.2845,0.3749,-0.8352;4.5625,-0.1011,-1.7762;3.179,-0.3524,-0.2649;2.0747,0.5388,-0.0053;2.1931,0.9864,0.9968;0.7779,-0.2485,-0.0543;-0.0663,0.4006,0.2034;0.8095,-1.0774,0.6594;0.6049,-0.654,-1.0544;2.2465,1.6501,-1.0574;1.7144,2.5652,-0.7821;1.7416,1.2889,-2.3361;2.1284,0.4306,-2.5765;3.7728,1.8159,-1.0476;4.0811,2.465,-0.2193;4.1484,2.2536,-1.9756;5.3024,0.4579,1.4404;4.6028,1.281,1.6117;4.8736,-0.4373,1.9118;6.6575,0.7844,2.0638;6.5869,0.7634,3.1582;6.9886,1.7896,1.7691)|\",5.7688136306\r\n\"[H]OC1=C([H])C([H])=C(C([H])([H])[C@]([H])(/C(=N\\C([H])(C([H])([H])[H])C([H])([H])[H])O[H])N([H])[H])C([H])=C1O[H] 
|(11.4611,1.7886,1.5049;10.8258,2.5205,1.5402;9.7242,2.1749,0.8108;9.5803,0.9529,0.1557;10.3876,0.2246,0.214;8.4287,0.6592,-0.5785;8.3508,-0.2988,-1.086;7.3892,1.5885,-0.6734;6.1229,1.2732,-1.441;6.3676,0.7787,-2.3873;5.5911,2.1977,-1.6945;5.1407,0.3541,-0.6678;5.6427,-0.6006,-0.469;3.8821,0.1076,-1.5253;3.6714,-0.7832,-2.4031;4.6572,-1.7979,-2.7466;5.6164,-1.6743,-2.2148;4.0911,-3.1765,-2.3758;4.7816,-3.9761,-2.6698;3.1322,-3.3369,-2.8806;3.9175,-3.2498,-1.2962;4.9457,-1.7057,-4.2523;5.6579,-2.4795,-4.5628;5.3646,-0.7267,-4.5114;4.0178,-1.832,-4.8205;2.9187,1.0333,-1.294;3.2316,1.5257,-0.4989;4.7142,1.0244,0.5749;5.5167,1.4593,1.0276;4.3327,0.3448,1.2311;7.5401,2.8182,-0.0111;6.7513,3.5666,-0.0834;8.6863,3.1229,0.7228;8.8691,4.3087,1.3739;8.0973,4.8732,1.2114)|\",5.624593289835\r\n\"[H]OC1=C(O[H])C([H])=C(C([H])([H])[C@]([H])(/C(=N\\C([H])([H])C([H])([H])[H])O[H])N([H])[H])C([H])=C1[H] |(1.9908,2.2091,-6.5746;1.7413,2.1988,-5.6377;2.7889,1.6941,-4.9188;2.6256,1.594,-3.5239;1.432,2.0121,-3.0044;1.459,1.8966,-2.0421;3.6673,1.0931,-2.7456;3.5288,1.0383,-1.6662;4.8837,0.6783,-3.3124;5.9887,0.1199,-2.4454;5.9897,0.6305,-1.4721;6.9621,0.3318,-2.905;5.9087,-1.4158,-2.1385;6.7778,-1.6446,-1.5122;6.0468,-2.2312,-3.4186;7.0874,-2.6305,-4.0272;8.4331,-2.3898,-3.532;8.4852,-1.6331,-2.7342;8.8086,-3.3284,-3.0993;9.3536,-1.9731,-4.6825;10.3855,-1.8564,-4.332;9.0285,-1.021,-5.1174;9.3362,-2.727,-5.4761;4.8322,-2.5083,-3.9635;5.0248,-3.0016,-4.7825;4.7276,-1.8564,-1.3985;3.8975,-1.7392,-1.9753;4.609,-1.2754,-0.57;5.0319,0.7887,-4.6967;5.9609,0.4802,-5.1682;3.9952,1.2906,-5.487;4.1264,1.3763,-6.5647)|\",5.59466076628\r\n\"[H]OC1=C(O[H])C([H])=C(C([H])([H])[C@]([H])(/C(=N\\C([H])([H])[H])O[H])N([H])[H])C([H])=C1[H] 
|(7.3025,2.2143,-8.3138;6.5516,2.8153,-8.1896;5.8908,2.4541,-7.0504;4.7485,3.2015,-6.7033;4.396,4.2233,-7.537;3.5845,4.6328,-7.1985;4.0384,2.8722,-5.5489;3.1483,3.4496,-5.3002;4.4274,1.8081,-4.7181;3.6576,1.4913,-3.4533;3.5607,0.4068,-3.333;2.64,1.8946,-3.5154;4.3162,2.0522,-2.1655;5.3031,1.5872,-2.0514;3.4352,1.7022,-0.9505;3.4301,0.6499,-0.2429;4.3411,-0.4463,-0.4888;4.8058,-0.7417,0.4601;5.1521,-0.262,-1.212;3.776,-1.3192,-0.8433;2.5239,2.6694,-0.6824;2.7856,3.4255,-1.2586;4.4029,3.5215,-2.2526;4.727,3.7967,-3.1784;5.0815,3.8762,-1.5805;5.5598,1.0729,-5.0787;5.8832,0.2358,-4.4656;6.2804,1.3959,-6.2311;7.1574,0.8117,-6.5053)|\",5.624593289835\r\n\"[H]C1=C([H])C([H])=C(C([H])([H])C([H])([H])[C@@]2([H])OC([H])([H])[C@@]3([H])C(=O)O[C@@]23[H])C([H])=C1[H] |(6.0569,1.861,4.5124;5.4635,1.6322,3.6313;5.7223,2.2777,2.4225;6.5177,3.0156,2.3567;4.9548,1.9828,1.2931;5.1555,2.5013,0.3575;3.9175,1.0432,1.3478;3.0826,0.7286,0.1202;2.0224,0.9025,0.3461;3.3378,1.4297,-0.684;3.2159,-0.7168,-0.4026;2.5739,-0.8455,-1.2801;2.8697,-1.4263,0.3579;4.6337,-1.1161,-0.7954;5.3129,-1.0243,0.0692;4.6127,-2.48,-1.2425;5.8198,-2.7102,-1.9586;6.6594,-2.9047,-1.2719;5.6741,-3.5833,-2.5997;6.0722,-1.4101,-2.736;7.1125,-1.2169,-3.004;5.0445,-1.1144,-3.8342;4.806,-1.487,-4.9433;4.3681,-0.1678,-3.0971;5.2755,-0.3457,-1.9511;5.7487,0.5999,-1.6851;3.6663,0.405,2.5724;2.8592,-0.3213,2.6427;4.4307,0.6938,3.7022;4.2166,0.1901,4.6413)|\",6.462703949375\r\n\"[H]C1=C([H])C([H])=C(/C([H])=C([H])/C2=C(\\[H])C3=C([H])C([H])=C([H])C([H])=C3OC2([H])[H])C([H])=C1[H] 
|(-2.1326,-2.6208,0.7051;-1.0668,-2.4858,0.5429;-0.2371,-3.5912,0.3508;-0.653,-4.5952,0.3622;1.1285,-3.4114,0.1424;1.7687,-4.2782,-0.0066;1.7042,-2.1263,0.1203;3.146,-2.0059,-0.1003;3.6567,-2.958,-0.2347;3.8742,-0.8646,-0.1449;3.3723,0.0916,-0.0028;5.3032,-0.7669,-0.3478;5.9509,0.4245,-0.3133;5.3954,1.3458,-0.1489;7.385,0.51,-0.5024;8.1287,1.6849,-0.3088;7.6144,2.5789,0.0369;9.4985,1.7153,-0.5576;10.0608,2.6308,-0.3997;10.1432,0.5631,-1.0193;11.2111,0.5801,-1.2193;9.4267,-0.6175,-1.2219;9.9084,-1.5206,-1.5833;8.0591,-0.6442,-0.9554;7.3743,-1.7957,-1.2184;6.1439,-2.0164,-0.5148;5.6233,-2.7724,-1.1079;6.3673,-2.4609,0.47;0.8496,-1.0223,0.316;1.2541,-0.0146,0.3049;-0.5139,-1.2013,0.5238;-1.1516,-0.3337,0.6727)|\",3.499384117430001\r\n\"[H]OC(=N/C1([H])C([H])([H])C([H])([H])C([H])([H])C1([H])[H])/C([H])=C([H])/C([H])=C(\\[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H] |(10.7524,3.9779,0.4911;11.4525,4.1268,-0.1634;11.6881,5.4807,-0.2391;12.8254,5.8221,-0.6942;13.1832,7.205,-0.9508;12.3738,7.9208,-0.7435;14.466,7.5539,-0.1647;14.5962,8.6443,-0.1826;14.4103,7.2427,0.8838;15.6023,6.8577,-0.9456;16.5498,7.3992,-0.8512;15.762,5.8517,-0.5474;15.1064,6.7735,-2.42;15.1077,5.7344,-2.761;15.7489,7.3376,-3.105;13.6644,7.3382,-2.4145;12.9976,6.8228,-3.1125;13.6656,8.4031,-2.6817;10.577,6.3474,0.2037;10.8367,7.3565,0.5103;9.2798,5.971,0.2364;9.0004,4.9768,-0.1149;8.1918,6.8193,0.683;8.458,7.8201,1.0254;6.9005,6.4414,0.701;6.6448,5.4384,0.3524;5.7527,7.2993,1.1449;5.2398,6.8119,1.9895;6.1347,8.2562,1.5195;4.7186,7.5351,0.0227;5.2077,8.0627,-0.8077;4.4067,6.5601,-0.3767;3.4694,8.3132,0.4667;2.747,8.3109,-0.3617;2.9818,7.7771,1.295;3.7286,9.7673,0.8863;4.411,9.7943,1.7463;4.2455,10.2905,0.069;2.443,10.5209,1.2434;2.654,11.5555,1.5369;1.9194,10.0381,2.0778;1.7515,10.5498,0.3923)|\",4.57695496541\r\n\"[H]C1=C([H])[C@@]2([H])OC(=O)C([H])([H])[C@]2([H])C([H])([H])C1=O 
|(3.0411,1.3155,-0.482;2.1297,0.7932,-0.2048;0.9552,1.4347,-0.0726;0.8584,2.5058,-0.2334;-0.2375,0.655,0.3826;-0.2369,0.6156,1.4855;-1.5153,1.1818,-0.0188;-2.4056,0.1319,-0.1172;-3.5712,0.3037,-0.3417;-1.6517,-1.1915,0.0944;-1.794,-1.5153,1.135;-2.0557,-1.9663,-0.5603;-0.2122,-0.7639,-0.1861;-0.106,-0.6574,-1.275;1.019,-1.4929,0.3404;1.1532,-2.4888,-0.0925;0.9573,-1.6227,1.4319;2.2869,-0.6711,0.0355;3.3859,-1.1976,-0.0045)|\",5.004173710695\r\n\"[H]OC(=O)C([H])([H])[C@@]1([H])C([H])=C([H])C([H])([H])[C@]([H])(O[H])C1([H])[H] |(-1.3505,2.7079,-3.7023;-1.1293,3.4379,-3.1005;-0.8474,2.9518,-1.8594;-0.547,3.71,-0.9725;-0.9858,1.4396,-1.7071;-2.0611,1.2097,-1.759;-0.5247,0.9366,-2.5694;-0.3921,0.8969,-0.3957;-0.7884,1.5405,0.4041;-0.8539,-0.5246,-0.1529;-1.9116,-0.7296,-0.3226;-0.0548,-1.5138,0.2646;-0.4725,-2.5085,0.415;1.4141,-1.3257,0.5631;2.0197,-1.7551,-0.2498;1.693,-1.8744,1.4718;1.7621,0.1593,0.7536;2.8499,0.2852,0.7483;1.3533,0.625,2.0406;0.4197,0.3823,2.15;1.1465,1.0016,-0.3743;1.4418,2.0482,-0.2601;1.55,0.6429,-1.3331)|\",6.92801863373\r\n\"[H]O/N=C(\\[H])C1=C([H])C([H])=C([H])C([H])=C1/N=C(/O[H])OC(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] 
|(7.1248,5.04,1.9965;7.1006,4.2031,1.507;6.0402,3.5231,2.136;5.8599,2.3734,1.5959;6.4777,2.0507,0.7588;4.8248,1.4565,2.0813;4.0701,1.755,3.229;4.2705,2.6882,3.7456;3.0989,0.8808,3.6989;2.5286,1.1273,4.5902;2.8696,-0.3221,3.0222;2.1213,-1.0209,3.3875;3.5995,-0.6345,1.8803;3.4229,-1.5694,1.3609;4.5775,0.2462,1.3845;5.3858,-0.0555,0.2768;4.9678,-0.6004,-0.798;5.8585,-0.875,-1.7817;6.7128,-0.5607,-1.4337;3.7059,-0.9409,-1.0586;3.1756,-1.3651,-2.3761;1.6774,-1.4865,-2.0846;1.1436,-1.7944,-2.9896;1.2716,-0.528,-1.7474;1.4952,-2.2316,-1.3039;3.7614,-2.7248,-2.7679;3.2469,-3.098,-3.6604;3.6094,-3.45,-1.9615;4.8285,-2.659,-2.9856;3.4354,-0.2802,-3.4248;2.9197,-0.5444,-4.3547;4.4998,-0.1739,-3.6418;3.044,0.6833,-3.0816)|\",4.718454167670001\r\n\"[H]C(=O)[C@@]1([H])C([H])([H])C(=C([H])[H])[C@]([H])(C([H])([H])/C([H])=C(\\[H])C(=O)C([H])([H])[H])C([H])([H])C1([H])[H] |(5.9323,2.9697,1.8934;5.2661,2.8389,1.0088;5.3827,3.5573,0.0403;4.2637,1.7115,1.141;3.6665,1.935,2.0417;3.3328,1.6007,-0.0764;2.7915,2.5397,-0.2277;3.9619,1.4625,-0.9683;2.3664,0.44,0.0493;1.0532,0.6209,-0.1265;0.6502,1.6051,-0.3526;0.3297,-0.1865,-0.0769;3.036,-0.8976,0.3418;3.672,-1.1128,-0.5334;2.0904,-2.1149,0.4902;1.4144,-2.1354,-0.3769;2.6923,-3.0299,0.4336;1.2737,-2.1458,1.7531;0.666,-1.2655,1.9553;1.2499,-3.1725,2.6169;1.8543,-4.0612,2.4408;0.4387,-3.241,3.8582;0.5047,-4.2339,4.5693;-0.4634,-2.0725,4.2294;-1.1937,-1.868,3.4378;0.1214,-1.1569,4.3788;-0.9901,-2.3171,5.1534;3.9903,-0.7729,1.5549;3.3985,-0.6146,2.4663;4.526,-1.7209,1.6922;4.9948,0.3747,1.3968;5.6648,0.1662,0.5503;5.6278,0.4476,2.2905)|\",5.205537960065\r\n\"[H]C([H])=C1C([H])([H])[C@]([H])(C(=O)OC([H])([H])[H])C([H])([H])C([H])([H])[C@@]1([H])C([H])([H])/C([H])=C(\\[H])C(=O)C([H])([H])[H] 
|(0.7142,-2.4832,-0.3292;0.8402,-1.405,-0.3906;-0.0652,-0.8284,-0.5501;2.0477,-0.8456,-0.2649;3.2872,-1.6889,-0.0587;3.0228,-2.7507,-0.004;3.771,-1.4221,0.8921;4.3177,-1.455,-1.197;3.8893,-1.8333,-2.1317;5.5801,-2.2428,-0.8935;6.4127,-1.9271,-0.0691;5.6552,-3.3726,-1.6323;6.7989,-4.204,-1.3698;6.6995,-5.0583,-2.0399;6.8047,-4.5308,-0.3265;7.7241,-3.659,-1.5746;4.6286,0.0458,-1.3305;5.1704,0.3672,-0.4324;5.3027,0.2141,-2.1796;3.3536,0.8823,-1.5041;2.8728,0.6377,-2.4624;3.6297,1.9425,-1.5483;2.3334,0.6458,-0.3633;2.8292,0.9424,0.5757;1.0636,1.5175,-0.505;0.5811,1.2812,-1.4654;0.355,1.2454,0.2852;1.3283,2.9947,-0.4405;1.9099,3.4201,-1.2574;0.8997,3.8015,0.5423;0.3109,3.4048,1.368;1.1473,5.2621,0.635;0.6991,5.8921,1.5826;1.9573,5.9565,-0.4511;1.4833,5.8404,-1.4331;2.9662,5.5331,-0.5224;2.0285,7.0182,-0.2086)|\",5.197374544550001\r\n\"[H]O[C@@]1([H])C(=O)[C@@]2([H])C([H])([H])C(=C([H])[H])[C@]([H])(C([H])([H])C2([H])[H])C1([H])[H] |(1.7212,-1.8228,0.5235;2.2851,-2.5853,0.7369;3.5233,-2.0919,1.261;3.8776,-2.8519,1.9638;4.5612,-2.0459,0.1268;5.4572,-2.8679,0.0786;4.4971,-0.9173,-0.9083;5.0031,-1.3143,-1.7941;3.0864,-0.434,-1.2933;3.1993,0.2978,-2.1062;2.494,-1.2585,-1.7041;2.3489,0.2349,-0.1356;1.0497,0.5497,-0.203;0.4513,0.3332,-1.0857;0.5407,1.0503,0.6174;3.1963,0.5198,1.096;2.6767,1.2651,1.7096;4.5475,1.1378,0.6638;4.3386,2.1252,0.2356;5.1586,1.3145,1.5569;5.3283,0.28,-0.3689;6.2635,-0.1001,0.0555;5.6112,0.9038,-1.225;3.3658,-0.7473,1.9894;2.4836,-0.8489,2.6343;4.2265,-0.5971,2.6544)|\",5.695342890965\r\n\"[H]C([H])=C1C([H])([H])[C@@]2([H])C([H])([H])C([H])([H])[C@]1([H])C([H])([H])C([H])([H])C21OC([H])([H])C([H])([H])O1 
|(1.2122,-1.1754,-1.9468;1.8145,-0.3247,-1.6339;1.9222,0.4833,-2.3538;2.3859,-0.2695,-0.4278;2.2445,-1.3833,0.5966;2.3814,-2.3516,0.0985;1.2306,-1.3892,1.0129;3.2579,-1.2757,1.7588;3.3236,-2.2666,2.2268;4.6612,-0.9117,1.227;5.3455,-0.7916,2.0744;5.0319,-1.7693,0.6528;4.6742,0.3571,0.3207;5.1648,0.1189,-0.6297;5.2705,1.1549,0.7801;3.2531,0.8909,0.0081;3.3266,1.5957,-0.8289;2.6289,1.6626,1.2109;2.9297,2.7156,1.1385;1.5364,1.649,1.1166;3.0428,1.1699,2.6038;2.5147,1.7377,3.3769;4.1119,1.3516,2.7571;2.8029,-0.3109,2.8929;3.5113,-0.6131,4.0978;2.7835,-1.6202,4.7883;3.2172,-2.6152,4.6153;2.8393,-1.4019,5.8592;1.3458,-1.5197,4.2196;0.6118,-1.1962,4.9646;1.0096,-2.4755,3.7945;1.427,-0.5121,3.2166)|\",7.03142189692\r\n\"[H]O[C@]1([H])[C@@]([H])(O[H])C(=O)OC(C([H])([H])[H])(C([H])([H])[H])[C@]1([H])N(C([H])([H])[H])C([H])([H])[H] |(4.4099,2.3813,-0.8193;5.09,1.7956,-0.423;4.4341,1.2084,0.6846;4.5985,1.8034,1.5936;5.0927,-0.1516,0.9791;5.6242,-0.4471,0.0594;5.9874,-0.0492,2.0613;5.9106,-0.8982,2.5378;4.0853,-1.2549,1.263;4.2976,-2.1044,2.1034;2.9699,-1.2992,0.52;2.5211,-0.1865,-0.3456;1.0157,-0.4944,-0.404;0.5019,0.0475,-1.1976;0.5313,-0.2683,0.5511;0.8985,-1.5649,-0.5948;3.1695,-0.3453,-1.7245;2.7617,0.3882,-2.4269;2.9552,-1.3447,-2.1163;4.2491,-0.1913,-1.6941;2.8907,1.1248,0.4178;2.388,1.0239,1.3966;2.5268,2.4189,-0.2119;1.2893,2.5186,-0.985;1.2197,3.5394,-1.3741;0.3797,2.3248,-0.3915;1.3011,1.8464,-1.8426;2.5478,3.4971,0.7884;2.3849,4.4546,0.285;3.515,3.5451,1.2921;1.7635,3.3695,1.5552)|\",6.729375522865\r\n\"[H]OC1=N[C@]([H])(C(O[Si](C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C([H])([H])C1([H])[H] 
|(2.6506,-3.3251,4.0233;1.8408,-2.9416,3.6541;2.1192,-1.6967,3.1938;1.2095,-0.9687,2.6901;1.8147,0.2887,2.2368;1.2186,1.1189,2.6382;1.7414,0.4116,0.6863;2.6519,-0.5676,0.1654;3.2761,-0.8967,-1.3579;3.9551,-2.649,-1.1993;4.4184,-2.9854,-2.1349;3.159,-3.3562,-0.9414;4.7166,-2.7036,-0.4123;1.9904,-0.8416,-2.7451;2.461,-1.1595,-3.6847;1.5809,0.1612,-2.9102;1.1555,-1.5227,-2.5474;4.6946,0.2873,-1.7742;5.2041,-0.0287,-2.6936;5.4409,0.3023,-0.971;4.3491,1.316,-1.9264;0.3122,0.1076,0.2117;-0.404,0.7868,0.6889;0.0392,-0.9135,0.4825;0.2274,0.2349,-0.8719;2.1528,1.8281,0.255;1.4985,2.5761,0.7173;2.065,1.9433,-0.8313;3.1856,2.0562,0.5368;3.2687,0.3249,2.8168;4.006,0.6704,2.0898;3.3094,0.9963,3.6804;3.5243,-1.1296,3.2522;3.9632,-1.2192,4.2546;4.176,-1.6708,2.5549)|\",7.47768861174\r\n\"[H]C1=C([H])C([H])=C([C@]23O[C@](C([H])([H])[H])(C([H])=C2[H])C2=C3C([H])=C([H])C([H])=C2[H])C([H])=C1[H] |(4.3746,-3.2053,6.8548;4.2382,-2.6829,5.9117;5.3409,-2.155,5.2376;6.339,-2.2645,5.6537;5.1661,-1.4888,4.0242;6.0313,-1.0842,3.5038;3.8866,-1.3398,3.473;3.7038,-0.6037,2.1761;2.366,-0.7359,1.6369;2.5755,0.0431,0.4288;1.3354,0.0789,-0.4329;1.51,0.6735,-1.3364;0.501,0.5233,0.1182;1.0548,-0.9345,-0.7356;3.8376,-0.6633,-0.1217;4.1028,-0.7286,-1.1707;4.5302,-1.0599,0.9447;5.5013,-1.5373,0.9943;3.0761,1.3541,1.0672;3.8042,0.9442,2.1952;4.4121,1.8576,3.0312;4.9642,1.5422,3.9125;4.2806,3.2257,2.7173;4.7421,3.9684,3.3629;3.5622,3.6351,1.5979;3.4686,4.6948,1.3758;2.9464,2.6917,0.75;2.3819,3.0209,-0.1191;2.7844,-1.8673,4.155;1.7956,-1.751,3.7252;2.9615,-2.5371,5.366;2.0984,-2.9469,5.8844)|\",5.396017655415\r\n\"[H]O[C@@]1([H])C([H])([H])C([H])=C([H])[C@]([H])(C([H])([H])C(=O)OC(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C1([H])[H] 
|(2.2248,4.2908,-1.8607;2.5687,3.4176,-1.6169;3.7001,3.1446,-2.448;3.3745,3.0433,-3.4962;4.7606,4.256,-2.3627;5.4172,4.1993,-3.2453;4.2666,5.2377,-2.4365;5.5862,4.1855,-1.1031;6.206,5.0502,-0.8695;5.5841,3.1331,-0.2805;6.1925,3.1585,0.6238;4.7624,1.8862,-0.5115;3.8832,1.9301,0.1461;5.5706,0.6256,-0.1137;5.9204,0.7362,0.9197;6.4418,0.5106,-0.7654;4.752,-0.6493,-0.2172;4.9065,-1.4948,-1.0753;3.8216,-0.6854,0.7602;2.8452,-1.7859,0.87;3.5752,-3.1096,1.1221;2.8429,-3.898,1.3299;4.2336,-3.021,1.9934;4.1716,-3.4029,0.2568;1.9674,-1.8345,-0.385;1.5017,-0.8586,-0.5615;1.1683,-2.5713,-0.2446;2.5503,-2.1137,-1.2641;2.0202,-1.3787,2.0936;1.2406,-2.1233,2.2867;1.5409,-0.408,1.9306;2.6563,-1.304,2.9816;4.2716,1.8099,-1.9695;3.5068,1.033,-2.0749;5.1072,1.5317,-2.6254)|\",6.887201556155\r\n\"[H]O[C@@]([H])(C([H])([H])C([H])=C([H])[H])C([H])([H])[C@@]([H])(C([H])=C([H])[H])C([H])([H])C(=O)OC(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] |(4.6596,-1.9256,4.4976;5.4054,-1.7229,3.8995;5.0252,-2.0871,2.5793;5.6919,-1.513,1.9202;5.3272,-3.5888,2.3244;5.2063,-3.7955,1.251;6.3842,-3.742,2.57;4.4786,-4.5346,3.1313;3.4605,-4.7029,2.778;4.878,-5.1442,4.2491;4.2195,-5.8022,4.8105;5.883,-5.0088,4.6431;3.5702,-1.7164,2.2367;3.3676,-2.0579,1.2132;2.8896,-2.2779,2.8885;3.2161,-0.2099,2.2956;3.9218,0.3279,1.6472;1.8251,0.0145,1.7506;1.0112,-0.4383,2.3196;1.5374,0.7045,0.6467;0.5155,0.8268,0.2973;2.3142,1.1755,0.047;3.3674,0.4389,3.7081;3.0471,1.4822,3.6531;4.4196,0.3902,3.9964;2.5832,-0.2706,4.795;2.9226,-1.3356,5.2928;1.4733,0.4048,5.128;0.5435,-0.0647,6.1804;-0.0661,-1.4118,5.7795;-0.8396,-1.6911,6.5038;0.6894,-2.1984,5.7552;-0.5387,-1.3395,4.7935;1.2721,-0.1304,7.5263;0.5508,-0.3558,8.3198;1.7368,0.8347,7.7555;2.0423,-0.9031,7.5243;-0.5216,1.0343,6.1898;-1.2882,0.8082,6.9383;-1.0049,1.1136,5.2108;-0.0749,2.0037,6.4325)|\",6.457261672365\r\n\"[H]C(=O)C([H])([H])[C@]([H])(C([H])=C([H])[H])C([H])([H])C(=O)OC(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] 
|(1.5903,-1.338,2.0468;1.7676,-1.1757,0.9588;1.0324,-0.4474,0.3245;2.9475,-1.9339,0.4028;2.7746,-2.9959,0.6373;3.8402,-1.6537,0.9776;3.1892,-1.7594,-1.111;2.2293,-1.9331,-1.6136;4.1766,-2.7863,-1.6146;5.1788,-2.7269,-1.1922;3.8917,-3.7284,-2.5143;4.6368,-4.4475,-2.8449;2.9013,-3.8188,-2.9575;3.6065,-0.321,-1.4962;2.8294,0.382,-1.1747;3.6595,-0.2325,-2.5871;4.932,0.1634,-0.9296;5.5833,-0.4058,-0.0721;5.2686,1.3311,-1.5068;6.4879,2.0743,-1.1282;6.4141,2.4824,0.347;7.2565,3.1413,0.5862;6.4551,1.611,1.0024;5.4875,3.0326,0.544;7.7317,1.2355,-1.4397;8.6303,1.8422,-1.2799;7.7218,0.9139,-2.487;7.7861,0.3538,-0.7994;6.421,3.3053,-2.0354;7.2835,3.9546,-1.8519;5.5081,3.8782,-1.8435;6.4267,3.01,-3.0896)|\",6.185147821865\r\n\"[H]C1C2=C(/N=C(/C([H])([H])[H])NC2=O)C([H])=C([H])[C@@]1([H])C([H])([H])N([H])[H] |(6.9835,-1.127,0.1149;6.4803,-0.1627,0.1029;5.1292,-0.1446,0.0929;4.3998,1.118,0.0512;3.0948,1.1689,0.036;2.4229,-0.0607,0.0567;0.9285,0.0835,0.0229;0.5942,0.6837,0.8777;0.4473,-0.8949,0.0416;0.6292,0.6323,-0.8783;2.9288,-1.2609,0.0987;4.3219,-1.4043,0.1289;4.852,-2.504,0.189;5.1703,2.3541,0.0192;4.6037,3.2794,-0.0241;6.5151,2.3354,0.0351;7.0734,3.2695,0.0105;7.3173,1.0706,0.1022;7.9855,1.0158,-0.7744;8.295,1.0918,1.3256;7.6955,1.1971,2.2438;8.9095,1.9943,1.2365;9.1812,-0.0691,1.3058;8.733,-0.8726,1.7409;10.0193,0.121,1.8495)|\",3.5374800565\r\n\"[H]C1=C([H])C([H])=C(C([H])([H])O[C@@]2([H])C([H])([H])[C@@]3([H])C(=O)C([H])=C([H])[C@]2([H])C([H])=C3[H])C([H])=C1[H] 
|(2.6435,-4.0985,6.1613;3.0621,-3.6677,5.2557;3.7551,-4.4732,4.3495;3.8774,-5.5347,4.5486;4.2954,-3.9218,3.1875;4.8279,-4.5461,2.4774;4.1418,-2.5576,2.914;4.7688,-1.9381,1.6857;5.7984,-1.616,1.9159;4.2113,-1.0339,1.3885;4.7765,-2.8805,0.6241;5.3707,-2.4039,-0.5761;4.9538,-1.4122,-0.8093;6.916,-2.3071,-0.495;7.2199,-1.2596,-0.4021;7.2543,-2.8227,0.4104;7.6452,-2.9078,-1.7391;8.6589,-2.5042,-1.7987;7.8137,-4.4275,-1.5684;8.9229,-4.9408,-1.5655;6.5968,-5.2471,-1.3631;6.7944,-6.2963,-1.1562;5.3338,-4.7892,-1.4094;4.5253,-5.4889,-1.2068;4.9233,-3.3612,-1.7182;3.8324,-3.3166,-1.7745;5.5492,-2.8418,-3.0027;4.9194,-2.6615,-3.8696;6.8617,-2.6066,-3.0023;7.3898,-2.2433,-3.8798;3.4386,-1.7579,3.8215;3.303,-0.6985,3.6132;2.9051,-2.3069,4.9885;2.3603,-1.6731,5.6834)|\",4.810972876839999\r\n\"[H]/C(C(=O)OC([H])([H])[H])=C(\\C([H])([H])[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])Cl |(2.9856,3.152,2.8027;3.5437,2.2784,2.478;4.7704,2.0297,3.2615;5.6074,1.1544,3.1021;4.8713,2.951,4.251;6.0331,2.8342,5.0854;6.9456,2.9173,4.489;6.0378,1.8735,5.6076;5.9641,3.6566,5.7981;3.0659,1.5376,1.4557;1.7748,1.9514,0.7924;1.0385,1.138,0.8435;1.9265,2.1656,-0.2733;1.3399,2.8407,1.2568;3.7111,0.28,0.9162;4.521,-0.0323,1.5751;2.95,-0.5144,0.9315;4.2195,0.3998,-0.5429;4.5922,-0.5871,-0.8489;3.376,0.6196,-1.2092;5.3128,1.4563,-0.7869;4.9779,2.4374,-0.4295;5.478,1.5525,-1.8665;6.636,1.104,-0.1169;7.0255,0.1478,-0.4755;6.5587,1.0782,0.97;7.9183,2.3432,-0.5047)|\",5.970177879969999\r\n\"[H]O[C@@]1(/C([H])=C(\\[H])C(=O)OC([H])([H])C([H])([H])[H])C([H])([H])[C@]2([H])O[C@@]1([H])C([H])=C2[H] 
|(8.4587,2.2473,-1.8647;7.8465,2.9242,-1.5339;7.2787,2.437,-0.3275;6.5545,1.1348,-0.5134;6.1519,0.6766,0.3876;6.3595,0.5213,-1.6876;6.7072,0.9445,-2.6244;5.6296,-0.7645,-1.7555;5.1826,-1.3799,-0.8058;5.5199,-1.1772,-3.0436;4.8237,-2.425,-3.2631;3.9964,-2.4975,-2.5529;4.4262,-2.3394,-4.2777;5.7648,-3.6139,-3.1323;5.2328,-4.5399,-3.3798;6.1385,-3.6963,-2.1078;6.6159,-3.5136,-3.814;8.2873,2.3667,0.876;8.3113,1.3723,1.3309;9.2975,2.6347,0.5551;7.653,3.4039,1.8611;7.885,3.2605,2.9168;6.2498,3.1784,1.6172;6.2737,3.5345,0.2319;5.2755,3.4981,-0.2029;7.0126,4.8635,0.2365;6.92,5.6368,-0.5151;7.8629,4.7874,1.2639;8.6341,5.4898,1.5581)|\",5.553843688705\r\n\"[H]C1=C([H])C([H])=C(/C2=C(\\[H])C([H])([H])[C@]3([H])[C@]2([H])[C@]2([H])C([H])=C([H])[C@@]3([H])C2([H])[H])C([H])=C1[H] |(5.7531,-7.0839,6.1276;5.8422,-6.6659,5.1285;6.8949,-7.0526,4.2942;7.6316,-7.7709,4.6451;7.0136,-6.5131,3.0166;7.8494,-6.8003,2.3848;6.0784,-5.5771,2.535;6.1867,-5.0211,1.1803;6.7565,-5.6051,0.0966;7.188,-6.6027,0.086;6.6648,-4.7362,-1.1647;5.7896,-5.0203,-1.7721;7.5522,-4.8119,-1.8041;6.4842,-3.3796,-0.4666;7.4576,-3.2419,0.0216;5.5001,-3.7548,0.6558;4.5856,-4.1497,0.1891;5.0751,-2.3212,1.1933;4.8188,-2.2274,2.2501;3.9727,-1.9619,0.1846;2.908,-2.0115,0.3932;4.5567,-1.7741,-1.0084;4.0598,-1.6057,-1.9586;6.0659,-1.9156,-0.8281;6.6914,-1.4208,-1.5754;6.1804,-1.377,0.6421;5.8942,-0.3248,0.7275;7.1702,-1.5209,1.0886;5.0325,-5.1892,3.3914;4.2988,-4.4679,3.0438;4.9129,-5.731,4.6707;4.0917,-5.4201,5.3115)|\",4.98512574116\r\n\"[H]OC(=O)C1=C(O[H])C(O/C(C(=O)O[H])=C(\\[H])C([H])([H])[H])=C([H])C([H])=C1[H] 
|(5.6691,5.1857,1.6408;6.1704,6.0167,1.7724;6.7728,6.0127,2.9763;7.4452,6.9568,3.3313;6.584,4.8019,3.8572;5.8419,3.6596,3.4998;5.2185,3.6109,2.2933;4.6971,2.7796,2.1638;5.7549,2.585,4.4028;4.9632,1.4975,4.0366;5.5209,0.4606,3.2969;4.7584,0.2895,2.031;4.1341,1.2002,1.5058;4.7936,-0.9436,1.4987;4.2867,-0.897,0.6649;6.5987,-0.2261,3.7064;7.0311,0.1115,4.6472;7.3227,-1.3536,3.0379;8.3854,-1.0994,2.9354;6.92,-1.6052,2.0581;7.2764,-2.2508,3.6696;6.3491,2.6494,5.6561;6.2161,1.8188,6.343;7.0826,3.7837,6.0158;7.5516,3.8403,6.993;7.1992,4.8377,5.1177;7.7633,5.7303,5.3649)|\",4.500763087269999\r\n\"[H]O[C@@]([H])([C@@]([H])(O[H])C([H])=C([H])[H])[C@]([H])(O[H])C([H])([H])N([H])C([H])([H])[H] |(2.3506,2.8829,0.8905;1.6866,3.4561,0.4678;2.4254,4.3328,-0.3844;1.675,4.8195,-1.0189;3.0408,5.466,0.4817;3.4782,6.2179,-0.1909;1.9605,6.1191,1.1346;1.4139,5.3913,1.4843;4.1037,4.996,1.4527;4.9407,4.4589,1.0136;4.0458,5.1904,2.77;4.8325,4.8357,3.4303;3.2167,5.7256,3.2231;3.3456,3.5083,-1.2914;2.674,2.8817,-1.9032;4.1565,2.6658,-0.4816;4.8875,2.4,-1.077;4.2469,4.3264,-2.2304;4.8455,5.0272,-1.6395;3.636,4.9162,-2.9354;5.1772,3.4093,-2.904;6.0355,3.9036,-3.1309;4.643,2.7868,-4.1165;5.4192,2.1666,-4.5755;4.2858,3.5146,-4.8653;3.805,2.131,-3.86)|\",7.32530485546\r\n\"[H]/C(OC([H])([H])C([H])([H])[H])=C(/[H])C(=O)P(=O)(OC([H])([H])C([H])([H])[H])OC([H])([H])C([H])([H])[H] 
|(4.3354,-2.1824,0.5289;5.4015,-1.9975,0.3903;5.8061,-1.6554,-0.8339;4.7826,-1.5417,-1.8461;4.2422,-2.4943,-1.9141;4.073,-0.7603,-1.5459;5.4626,-1.1967,-3.1564;4.7134,-1.0937,-3.9484;6.1682,-1.9817,-3.4447;6.0098,-0.253,-3.0707;6.2529,-2.1195,1.4334;7.3174,-1.9443,1.3258;5.7248,-2.5003,2.7359;4.5479,-2.7658,2.981;6.9693,-2.5545,4.1429;8.3777,-2.5809,3.6796;6.6033,-1.2881,5.0647;5.364,-1.188,5.8167;4.5212,-1.3724,5.1431;5.3663,-1.9641,6.5875;5.3058,0.2021,6.4211;4.39,0.3118,7.0129;5.3074,0.9674,5.6385;6.1659,0.3762,7.0753;6.4694,-3.7735,5.0832;6.6098,-5.139,4.6253;5.8859,-5.313,3.8201;7.6212,-5.2816,4.2298;6.3443,-6.0571,5.8037;6.4301,-7.1033,5.4891;5.3365,-5.8961,6.1995;7.0668,-5.8745,6.6052)|\",4.53885902634\r\n\"[H]O[C@@]([H])(C([H])=C([H])[H])[C@]([H])(O[H])C([H])([H])N([H])C([H])([H])[H] |(2.4095,-1.2871,-3.1641;2.5286,-0.3285,-3.0827;3.1989,-0.0728,-1.8324;2.6078,-0.4931,-1.0049;4.5735,-0.6812,-1.8265;5.2037,-0.437,-2.6807;5.0278,-1.4884,-0.8666;6.031,-1.9048,-0.8937;4.4143,-1.7569,-0.0086;3.1583,1.4589,-1.6662;3.5007,1.7098,-0.657;1.797,1.884,-1.74;1.4271,1.405,-2.5034;4.0147,2.2319,-2.6763;3.7125,1.9452,-3.7028;5.0698,1.9461,-2.5627;3.8906,3.6594,-2.4161;2.8965,3.8718,-2.3557;4.5179,4.4868,-3.436;4.3242,5.543,-3.2203;4.177,4.2768,-4.4686;5.6049,4.3383,-3.4142)|\",5.62187215133\r\n\"[H]O[C@]1([H])C2=C(C(OC([H])([H])[H])=C([H])C([H])=C2[H])[C@@]2([H])C(=O)O[C@]1(C([H])([H])[H])C2([H])[H] 
|(3.3298,1.0429,2.327;2.5577,1.5336,2.0036;1.6582,0.5844,1.4216;0.6722,1.0622,1.4678;1.6396,-0.7095,2.2278;1.9428,-1.9357,1.6195;1.9267,-3.116,2.3911;2.2255,-4.2663,1.7206;2.181,-5.4961,2.4281;1.1823,-5.6836,2.8424;2.9213,-5.5225,3.2386;2.4194,-6.269,1.6957;1.6331,-3.0564,3.7564;1.624,-3.9567,4.3597;1.3449,-1.8238,4.3505;1.1135,-1.7879,5.4116;1.3472,-0.6555,3.5984;1.127,0.3026,4.0598;2.2517,-1.9612,0.1329;2.6683,-2.9167,-0.1787;0.9352,-1.6636,-0.6032;0.123,-2.4252,-1.0539;0.7902,-0.3046,-0.6546;1.9698,0.3531,-0.0809;2.1906,1.6488,-0.8384;3.078,2.1599,-0.4573;1.3333,2.319,-0.7168;2.3203,1.4411,-1.9051;3.064,-0.7137,-0.2384;3.418,-0.7559,-1.2745;3.9228,-0.54,0.4179)|\",5.575612796745\r\n\"[H]O/C(OC([H])([H])C([H])([H])[H])=C(\\C([H])([H])[H])P(=O)(OC([H])([H])C([H])([H])[H])OC([H])([H])C([H])([H])[H] |(7.5176,-2.3505,0.2432;6.6625,-2.0358,0.5825;6.2544,-1.0571,-0.265;7.2262,-0.7655,-1.154;6.9617,-0.7027,-2.5833;7.1021,-1.7175,-2.9785;5.9312,-0.3967,-2.7622;7.937,0.2819,-3.1947;7.8146,0.2917,-4.2837;8.9706,0.0009,-2.9665;7.738,1.2825,-2.804;5.0362,-0.4841,-0.0923;4.1127,-1.0081,0.9958;4.3624,-2.0427,1.2433;3.0634,-0.9735,0.6881;4.1935,-0.4119,1.9126;4.6001,1.0141,-0.9594;5.684,1.9585,-1.334;3.7592,0.6223,-2.3123;2.6282,-0.2681,-2.2698;2.9681,-1.2685,-1.9714;1.9117,0.0873,-1.5192;1.9992,-0.2976,-3.6517;1.1398,-0.9774,-3.6603;2.7224,-0.6421,-4.3979;1.6575,0.701,-3.9412;3.4625,1.6123,0.0333;3.0619,2.9993,-0.1103;3.1315,3.3005,-1.1609;2.0076,3.0197,0.1819;3.9019,3.9055,0.7746;3.5317,4.936,0.7126;4.9459,3.8858,0.4517;3.8471,3.5805,1.8188)|\",6.318483608609999\r\n\"[H]O/N=C(\\[H])C(=O)[C@@]([H])(C1=C([H])C([H])=C(C([H])([H])C([H])(C([H])([H])[H])C([H])([H])[H])C([H])=C1[H])C([H])([H])[H] 
|(9.2458,-1.2268,-4.4065;9.2953,-2.0925,-3.9688;9.1643,-1.7717,-2.6264;9.2055,-2.8157,-1.8852;9.3369,-3.8194,-2.2927;9.0999,-2.691,-0.4039;9.2432,-3.6982,0.2727;8.7844,-1.3245,0.208;9.1749,-0.5585,-0.4689;7.2671,-1.1483,0.275;6.4578,-2.0788,0.943;6.9104,-2.9596,1.3906;5.0794,-1.8946,1.0228;4.4732,-2.6366,1.5384;4.4571,-0.7792,0.4422;2.9595,-0.5862,0.5529;2.6251,0.1422,-0.1987;2.4565,-1.5327,0.3115;2.4694,-0.1205,1.9473;2.8496,-0.8413,2.6864;3.0145,1.2655,2.3168;2.6852,1.5612,3.3199;2.6548,2.0273,1.612;4.1091,1.2859,2.3017;0.9357,-0.1436,2.009;0.5753,0.1472,3.0027;0.54,-1.1422,1.7874;0.5023,0.5554,1.2817;5.2696,0.1414,-0.2304;4.8154,1.0106,-0.7012;6.6506,-0.041,-0.3166;7.256,0.6868,-0.8519;9.4498,-1.1803,1.5874;9.2036,-0.206,2.0212;10.5397,-1.2584,1.5048;9.109,-1.9627,2.27)|\",4.37559071604\r\n\"[H]C1=NC([H])=C(C([H])([H])[C@]([H])(C(=O)OC([H])([H])[H])N([H])C([H])([H])C2([H])C([H])([H])C2([H])[H])N1[H] |(-0.194,3.7691,5.2357;0.2619,2.8038,5.0601;-0.3689,1.7029,4.7171;0.6157,0.7449,4.6083;0.3722,-0.2746,4.3387;1.8606,1.2681,4.8853;3.2407,0.6817,4.9437;3.4075,0.1735,5.902;3.9959,1.4737,4.8869;3.525,-0.3165,3.8076;2.8139,-1.1606,3.8731;4.9379,-0.8827,3.9948;5.8172,-0.3688,4.6508;5.0974,-2.0415,3.3182;6.4147,-2.6166,3.3833;6.6725,-2.8628,4.4167;7.1577,-1.9189,2.9886;6.3707,-3.5191,2.7732;3.4565,0.3568,2.5062;2.7195,1.059,2.5612;3.1577,-0.5108,1.3623;3.998,-1.1968,1.2121;2.2657,-1.1435,1.542;2.9361,0.3218,0.12;3.7987,0.9173,-0.1731;1.5791,0.9403,-0.1397;0.7815,0.7348,0.5712;1.5365,1.934,-0.577;2.0685,-0.2023,-0.9961;2.3653,0.0087,-2.0198;1.5978,-1.1736,-0.8643;1.6089,2.5991,5.1726;2.3055,3.2883,5.4192)|\",6.163378713825001\r\n\"[H]C1=C([H])[C@]2([H])N(C(=O)OC(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])[C@@]([H])(O[C@@]([H])(C([H])([H])[H])C2([H])[H])C1([H])[H] 
|(5.7121,-0.3494,2.5556;5.2605,0.0114,1.6329;5.0976,-0.8066,0.591;5.3755,-1.8567,0.6492;4.469,-0.3239,-0.6964;4.9215,-0.8119,-1.5605;4.7101,1.122,-0.8305;4.8613,1.6381,-2.0996;5.0014,0.9423,-3.0947;4.8837,2.9912,-2.0777;5.0589,3.7615,-3.3196;3.8992,3.4804,-4.2811;3.9721,4.1463,-5.1486;2.9406,3.6709,-3.7859;3.9189,2.4465,-4.6283;5.0094,5.2084,-2.8209;5.1284,5.9002,-3.6616;5.8127,5.397,-2.1012;4.0517,5.4177,-2.3331;6.4234,3.45,-3.9428;6.601,4.1188,-4.7929;6.4692,2.4171,-4.2905;7.2214,3.6137,-3.2101;4.1309,1.8627,0.2806;4.2556,2.9268,0.097;2.7171,1.6733,0.3266;2.2612,0.3084,0.3804;2.5165,-0.1103,1.3649;0.7466,0.3615,0.2391;0.3185,-0.6446,0.31;0.469,0.7935,-0.7284;0.312,0.9818,1.0295;2.9346,-0.5436,-0.7074;2.5556,-0.2567,-1.6961;2.6911,-1.6017,-0.549;4.8563,1.4631,1.5836;5.7527,2.0926,1.6858;4.2084,1.7183,2.4327)|\",6.870874725125\r\n\"[H]O/C(=N\\[C@@]([H])(C([H])=C([H])[H])C([H])([H])[C@]([H])(O[H])C([H])([H])[H])OC(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] |(5.1596,2.1134,0.0662;4.3719,1.6299,-0.2352;3.6882,1.2443,0.8798;2.6367,0.5411,0.8731;2.1102,0.0707,-0.4146;2.9036,-0.0368,-1.1632;1.4655,-1.2758,-0.189;0.7917,-1.3245,0.6677;1.6607,-2.3516,-0.9515;1.1562,-3.2938,-0.7527;2.3307,-2.3275,-1.8083;1.0945,1.1038,-0.9607;0.2413,1.1757,-0.2729;1.5795,2.0917,-0.959;0.5924,0.8029,-2.3786;0.1208,-0.1857,-2.3885;1.68,0.6751,-3.3031;2.187,1.503,-3.2658;-0.4168,1.8454,-2.866;-1.3014,1.8786,-2.2189;-0.7393,1.6103,-3.8853;0.0299,2.8491,-2.8722;4.3135,1.743,1.9679;3.8665,1.4752,3.3475;2.4702,2.0651,3.5672;2.1896,1.9585,4.6211;1.7316,1.5532,2.949;2.4641,3.1324,3.3196;3.9181,-0.028,3.6348;3.677,-0.206,4.6888;4.9241,-0.4183,3.4451;3.2047,-0.5686,3.0115;4.911,2.2248,4.1788;4.6918,2.1122,5.2456;4.9068,3.2923,3.9354;5.9142,1.8296,3.9882)|\",7.142988575625\r\n\"[H]OC([H])([H])[C@@]1([H])[C@@]2([H])C([H])=C([H])C([H])=C([H])[C@@]2([H])[C@]1([H])C([H])([H])C([H])([H])[H] 
|(-0.4104,-1.1174,-2.0503;0.245,-1.0082,-2.7572;1.1187,-2.1359,-2.7074;0.5563,-3.0675,-2.8891;1.8098,-2.0085,-3.5468;1.8921,-2.2505,-1.4004;1.1749,-2.2632,-0.5645;2.8963,-3.4421,-1.2259;2.4946,-4.2642,-0.6155;3.4787,-4.0158,-2.4915;2.815,-4.5912,-3.1355;4.7793,-3.8942,-2.8079;5.1612,-4.3548,-3.7164;5.7323,-3.1923,-1.9406;6.7862,-3.2242,-2.2072;5.325,-2.5459,-0.8358;6.0429,-2.0465,-0.1879;3.8672,-2.4571,-0.4923;3.7195,-2.5021,0.5943;3.054,-1.2546,-1.0801;3.5213,-0.9279,-2.0187;2.807,-0.0491,-0.1745;2.2084,-0.3722,0.6907;3.7717,0.2842,0.2354;2.1205,1.1297,-0.8744;1.897,1.9363,-0.1665;1.1901,0.8155,-1.3571;2.7655,1.544,-1.659)|\",4.879001339465001\r\n\"[H]C([H])=C([H])[C@@]1([H])N(C(=O)OC(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])O[C@]([H])(C([H])([H])[H])C1([H])[H] |(0.2171,0.6499,1.0011;0.9024,0.2793,0.244;0.5575,0.316,-0.787;2.1103,-0.1846,0.5597;2.4457,-0.1983,1.5971;3.0943,-0.733,-0.438;2.7057,-0.5855,-1.4496;4.4048,-0.0463,-0.3488;4.7659,0.8661,-1.3299;3.9198,1.4632,-1.9782;6.0972,1.025,-1.3877;6.6933,2.072,-2.236;6.214,3.4516,-1.7726;6.7458,4.2322,-2.3282;6.4234,3.5877,-0.7061;5.1422,3.5721,-1.9395;8.1898,1.8957,-1.9637;8.766,2.6317,-2.5343;8.5178,0.893,-2.2557;8.4059,2.0322,-0.8995;6.3731,1.8075,-3.7108;6.9329,2.5098,-4.3388;5.3071,1.9274,-3.9095;6.6756,0.7917,-3.9886;5.3965,-0.945,0.1263;4.9928,-2.2462,-0.3523;5.474,-2.9489,0.335;5.457,-2.5063,-1.7844;5.2158,-3.5347,-2.0774;6.539,-2.3657,-1.8637;4.9702,-1.8311,-2.4971;3.4676,-2.2179,-0.1733;3.2016,-2.4918,0.8524;2.9412,-2.8936,-0.8531)|\",6.528011273495\r\n\"[H]OC(NO[C@]([H])(C([H])([H])[H])C([H])([H])C([H])=C=C([H])[H])OC(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] 
|(6.7491,3.392,1.859;7.669,3.6947,2.0235;8.3759,2.5587,2.0545;7.8925,1.3805,1.8138;6.5083,1.5576,1.4393;5.7021,0.5191,2.0135;6.201,-0.4375,1.8088;4.374,0.576,1.265;3.6958,-0.2002,1.6359;4.5363,0.4144,0.1953;3.8892,1.5493,1.3929;5.5766,0.6711,3.5445;4.9675,-0.1654,3.9162;6.5739,0.5679,3.9809;4.9692,1.9849,3.9896;3.9196,2.1643,3.7516;5.6316,2.9153,4.6316;6.307,3.8346,5.2703;6.8626,4.6007,4.733;6.3464,3.8632,6.3579;9.6457,2.8091,2.355;10.6803,1.7655,2.4686;11.9123,2.5819,2.868;12.7755,1.9205,2.9936;11.7339,3.1072,3.8114;12.1505,3.3235,2.099;10.3122,0.7685,3.5722;11.1501,0.081,3.7322;9.4288,0.1885,3.3016;10.1182,1.296,4.5123;10.891,1.0867,1.1113;11.7367,0.3935,1.1799;11.1248,1.8346,0.3461;10.0033,0.5309,0.8045)|\",6.544338104525\r\n\"[H]O/C(=N/C1=C([H])C([H])=C(/C(=N\\OC([H])([H])[H])C([H])([H])[H])C([H])=C1[H])C([H])([H])[H] |(4.6583,-5.7,5.3692;5.2011,-5.2259,4.7208;4.4308,-4.3019,4.0788;5.0242,-3.5641,3.2353;4.3625,-2.6182,2.4455;3.4375,-2.9851,1.4538;3.1966,-4.0353,1.313;2.8518,-2.0207,0.6405;2.1398,-2.3481,-0.1096;3.169,-0.6537,0.7607;2.4886,0.3266,-0.1204;2.8347,1.5345,-0.4249;4.0447,1.9604,0.1346;4.3183,3.2732,-0.3409;5.2679,3.5591,0.1184;3.5313,3.9748,-0.0402;4.4145,3.2845,-1.4329;1.1949,-0.0803,-0.7971;0.4657,-0.4571,-0.0705;1.3596,-0.8706,-1.5398;0.7725,0.7845,-1.3124;4.1072,-0.2988,1.7547;4.3739,0.7396,1.8858;4.6921,-1.2582,2.571;5.4166,-0.9655,3.3251;2.977,-4.2793,4.4985;2.4955,-5.2399,4.2769;2.4286,-3.4901,3.9831;2.9024,-4.1107,5.5805)|\",4.753828968235\r\n\"[H]C1=C([H])/C([H])=C([H])\\C(C([H])([H])C([H])([H])[H])=C(C(=O)OC([H])([H])C([H])([H])[H])/C([H])=C\\1[H] 
|(7.1052,2.3131,-2.1274;6.168,1.7973,-2.341;6.1494,0.4564,-2.2416;7.0702,-0.0514,-1.9523;5.0332,-0.4155,-2.6307;5.3042,-1.247,-3.2844;3.7471,-0.3331,-2.2494;3.0673,-1.1027,-2.6187;3.166,0.5447,-1.205;2.4828,-0.2471,-0.103;1.6779,0.3312,0.3514;2.0416,-1.1443,-0.5578;3.4672,-0.6815,0.9992;2.9594,-1.3211,1.7299;3.8551,0.196,1.5233;4.3106,-1.2429,0.5821;3.2313,1.9,-1.271;2.6329,2.7507,-0.196;2.3426,2.3958,0.9333;2.446,4.0249,-0.6281;1.8886,4.9578,0.3246;1.4073,5.7188,-0.2952;1.1317,4.4446,0.9226;2.9699,5.5632,1.2081;2.5288,6.3151,1.8728;3.741,6.0494,0.6016;3.4381,4.7916,1.8254;3.7992,2.6477,-2.4269;3.1167,3.3512,-2.8994;5.0714,2.6335,-2.8486;5.3465,3.3301,-3.6428)|\",4.313004530424999\r\n\"[H]C1=C([H])C2=C(C([H])=C1[H])[C@]([H])(C([H])([H])[H])C([H])([H])C([H])([H])/C2=N\\OC([H])([H])[H] |(4.5693,2.5749,4.1987;4.1235,2.0857,3.3367;4.0492,0.6998,3.2895;4.4295,0.0985,4.1081;3.4794,0.043,2.1824;2.9847,0.8088,1.105;3.0746,2.2052,1.1682;2.7005,2.7936,0.3327;3.6344,2.8464,2.2696;3.6989,3.9311,2.2936;2.3515,0.139,-0.1037;2.539,0.7793,-0.9764;0.8209,0.0204,0.0533;0.3733,-0.4316,-0.8403;0.3687,1.0073,0.1987;0.5488,-0.5938,0.9186;3.0276,-1.2194,-0.3574;2.5624,-1.7209,-1.215;4.0773,-1.0421,-0.6255;2.9687,-2.1345,0.8704;3.601,-3.0176,0.7315;1.9536,-2.5295,1.0155;3.3905,-1.436,2.1445;3.6643,-2.0673,3.2357;3.5354,-3.4524,3.0774;3.8746,-4.0865,4.3048;3.7481,-5.1571,4.1253;3.2092,-3.7625,5.1135;4.9135,-3.876,4.5853)|\",4.884443616475\r\n\"[H]O[C@]([H])(C1=C([H])C([H])=C([H])O1)[C@]([H])(OC([H])([H])[H])C(=O)OC([H])([H])[H] 
|(3.615,2.4896,2.4793;3.7089,3.2392,1.8646;2.9296,2.9031,0.7316;3.2526,3.5708,-0.0749;1.4667,3.1034,1.0159;0.7999,3.6762,2.0557;1.2529,4.1081,2.9356;-0.5968,3.5997,1.7316;-1.4274,3.9534,2.3267;-0.6734,2.9892,0.5196;-1.4888,2.7099,-0.1299;0.579,2.6786,0.064;3.2346,1.4479,0.298;2.5828,1.179,-0.5468;2.9761,0.65,1.4269;2.9078,-0.7483,1.1823;2.634,-1.2141,2.1318;3.8725,-1.145,0.8485;2.1333,-0.9771,0.4352;4.6959,1.3254,-0.1539;5.5517,0.6767,0.4016;4.8974,2.0453,-1.2778;6.2441,2.0363,-1.7849;6.2232,2.6564,-2.6812;6.5533,1.0166,-2.0283;6.9324,2.4516,-1.0445)|\",5.96473560296\r\n\"[H]O[C@]([H])(/C([H])=C(\\[H])C1=C([H])C([H])=C([H])C([H])=C1[H])[C@]([H])(OC([H])([H])[H])C(=O)OC([H])([H])[H] |(0.3912,2.6008,-0.5974;0.7011,2.7177,0.3175;2.1061,2.4698,0.2931;2.406,2.302,1.3317;2.8747,3.6281,-0.2852;2.6411,3.8738,-1.3194;3.7841,4.3267,0.4098;3.9709,4.0312,1.4434;4.5815,5.4741,-0.0495;4.4976,6.0019,-1.3517;3.8173,5.557,-2.0722;5.2752,7.0911,-1.7326;5.1925,7.4826,-2.7433;6.1596,7.682,-0.8248;6.7655,8.5322,-1.1259;6.2568,7.1708,0.4695;6.94,7.6207,1.185;5.4761,6.0801,0.8497;5.5557,5.687,1.8608;2.3629,1.1601,-0.4935;3.4453,0.9703,-0.5405;1.8214,1.3636,-1.7822;2.2228,0.4158,-2.7594;1.7872,0.7402,-3.7074;1.8558,-0.5899,-2.5206;3.3181,0.3895,-2.8572;1.7399,-0.004,0.288;2.2065,-0.4049,1.3322;0.6256,-0.4924,-0.2836;-0.0324,-1.5361,0.458;-0.8914,-1.8263,-0.1472;-0.3556,-1.1606,1.432;0.6411,-2.3839,0.6076)|\",5.064038757804999\r\n\"[H]C1=C([H])C([H])=C(/C([H])=C(\\[H])C(=O)[C@]([H])(OC([H])([H])[H])C(=O)OC([H])([H])[H])C([H])=C1[H] 
|(5.2699,-3.3095,-8.7399;4.8206,-3.2272,-7.754;3.6446,-3.9183,-7.4592;3.1728,-4.5413,-8.214;3.0735,-3.8094,-6.1932;2.1574,-4.348,-5.9633;3.6629,-3.0089,-5.1967;3.0192,-2.9338,-3.8879;2.1112,-3.5247,-3.7691;3.4136,-2.223,-2.8104;4.2932,-1.5884,-2.8124;2.6364,-2.2819,-1.5585;1.6393,-2.9664,-1.3943;3.1766,-1.4173,-0.3913;2.3627,-1.3238,0.3406;3.5728,-0.177,-0.917;3.8591,0.8096,0.0648;4.0757,1.7318,-0.4791;4.7307,0.5351,0.6709;2.9928,0.9763,0.7228;4.3528,-2.1579,0.2697;5.5202,-1.8808,0.1091;3.9037,-3.1733,1.0287;4.9201,-3.9883,1.6403;4.3818,-4.7606,2.1896;5.5332,-3.3886,2.318;5.5621,-4.4347,0.8766;4.8504,-2.3185,-5.5116;5.3274,-1.6954,-4.7612;5.4205,-2.4275,-6.7749;6.3366,-1.8884,-7.0005)|\",4.27762972986\r\n\"[H]C([H])([H])C([H])([H])C([H])([H])C(=O)O[C@]1([H])C([H])([H])[C@]2(C([H])(C([H])([H])[H])C([H])([H])[H])O[C@@]1(C([H])([H])[H])C([H])([H])C2([H])[H] |(1.372,2.4307,-4.5469;2.2435,2.5642,-3.8961;3.1366,2.5634,-4.5287;2.1576,3.5514,-3.4242;2.3195,1.4517,-2.845;2.3882,0.4789,-3.3447;1.3967,1.4404,-2.2515;3.5056,1.6005,-1.8866;3.4198,0.8951,-1.0481;3.5316,2.5983,-1.4312;4.8501,1.337,-2.5413;5.0191,0.8283,-3.6299;5.8616,1.7254,-1.7305;7.2029,1.5017,-2.2212;7.2222,1.7758,-3.2779;7.6794,0.0534,-1.9551;7.9369,-0.4802,-2.8736;6.8867,-0.503,-1.4455;8.8809,0.2745,-0.994;9.1715,-0.8835,-0.0333;8.2141,-1.1083,0.4587;9.6237,-2.1443,-0.7878;9.7641,-2.9765,-0.089;10.5799,-1.9804,-1.2998;8.8902,-2.4613,-1.5377;10.1736,-0.5067,1.0688;10.2435,-1.3141,1.807;9.8644,0.4068,1.5839;11.1807,-0.3473,0.665;8.441,1.4352,-0.2499;8.197,2.321,-1.3603;7.7289,3.6785,-0.8802;7.565,4.3503,-1.7305;8.4875,4.1255,-0.2298;6.793,3.592,-0.3239;9.5627,2.2972,-2.101;9.4693,2.5332,-3.1663;10.2313,3.0389,-1.6537;10.0596,0.8513,-1.8255;10.9831,0.8586,-1.2408;10.2467,0.2788,-2.739)|\",7.306256885924999\r\n\"[H]C([H])([H])C(=O)O[C@]1([H])C([H])([H])[C@]2(C([H])(C([H])([H])[H])C([H])([H])[H])O[C@@]1(C([H])([H])[H])C([H])([H])C2([H])[H] 
|(0.1305,-4.4644,3.719;0.2068,-3.3734,3.7846;-0.4762,-2.9529,3.0403;-0.0792,-3.0485,4.7855;1.6275,-2.9419,3.502;2.3859,-2.4506,4.3097;1.956,-3.1916,2.2123;3.3069,-2.8601,1.8181;3.9755,-3.1673,2.6244;3.4469,-1.3652,1.444;4.1713,-0.84,2.0716;2.4762,-0.8716,1.5519;3.8438,-1.4374,-0.057;3.4413,-0.2226,-0.9002;2.3702,-0.0677,-0.7055;4.1912,1.0453,-0.4608;3.8316,1.9145,-1.0229;5.2683,0.9553,-0.6485;4.0518,1.2587,0.6049;3.6006,-0.467,-2.4088;3.1742,0.3702,-2.9734;3.0899,-1.3847,-2.7126;4.6552,-0.5507,-2.6978;3.1069,-2.6057,-0.4893;3.6243,-3.5508,0.4676;3.0321,-4.9261,0.2441;3.4484,-5.6439,0.9599;3.2678,-5.2727,-0.7672;1.9471,-4.9056,0.3673;5.1577,-3.4273,0.248;5.7357,-3.7106,1.134;5.4614,-4.0861,-0.5712;5.3142,-1.9316,-0.1415;5.7045,-1.8292,-1.1574;5.9789,-1.3796,0.5301)|\",7.29809347041\r\n\"[H]C([H])([H])C([H])([H])C([H])([H])C(=O)O[C@]1([H])C([H])([H])[C@]2(C([H])([H])[H])OC(C([H])([H])[H])(C([H])([H])[H])[C@@]1([H])C([H])([H])C2([H])[H] |(2.7676,3.335,-2.3021;3.0893,2.4538,-1.736;4.1637,2.5493,-1.5455;2.5826,2.4747,-0.7639;2.7614,1.1675,-2.5005;3.2488,1.1773,-3.4834;1.682,1.117,-2.6906;3.1699,-0.1221,-1.7532;2.8412,-0.9957,-2.3261;2.6976,-0.1499,-0.7664;4.6673,-0.2086,-1.5392;5.2482,0.0975,-0.5188;5.2949,-0.6445,-2.6601;6.7401,-0.752,-2.5864;6.9642,-1.134,-1.5876;7.4435,0.6057,-2.8045;7.7755,1.0274,-1.8502;6.7431,1.3173,-3.2546;8.6264,0.4098,-3.7769;9.3922,1.7061,-4.0127;9.8493,2.0636,-3.0835;8.7201,2.4824,-4.3928;10.1832,1.5482,-4.7531;8.1009,0.0321,-5.0646;7.2732,-1.1626,-5.0692;7.8952,-2.132,-6.0888;7.3233,-3.066,-6.1477;8.9335,-2.373,-5.8482;7.8878,-1.6653,-7.0792;5.8821,-0.7565,-5.5816;5.2146,-1.6246,-5.6376;5.9823,-0.3337,-6.5871;5.4127,-0.0103,-4.9406;7.2627,-1.754,-3.6303;6.657,-2.6668,-3.6071;8.7159,-2.0466,-3.1863;8.7081,-2.4393,-2.163;9.1581,-2.8277,-3.809;9.5382,-0.7286,-3.2723;10.3756,-0.8337,-3.9705;9.9635,-0.461,-2.2974)|\",6.729375522865\r\n\"[H]C([H])([H])C(OC(=O)[C@]1([H])C([H])([H])C([H])([H])C([H])([H])[C@@]2([H])O[C@]21[H])(C([H])([H])[H])C([H])([H
])[H] |(3.1295,2.1685,4.1806;3.7033,1.5411,3.4894;4.4637,1.0065,4.0692;3.0292,0.8139,3.0342;4.3667,2.4244,2.4276;5.2763,1.6089,1.5962;4.8273,0.5991,0.8258;3.6701,0.2357,0.7401;5.9805,-0.0249,0.0455;6.8241,-0.1344,0.739;6.4239,0.8969,-1.1192;6.5941,1.906,-0.7301;7.382,0.525,-1.4989;5.3839,0.9111,-2.2472;5.6747,1.6369,-3.0159;4.4205,1.2489,-1.8439;5.2039,-0.4783,-2.8863;6.0111,-0.6757,-3.6042;4.2659,-0.5075,-3.4541;5.203,-1.6103,-1.872;4.5527,-2.4566,-2.0996;6.4957,-2.0129,-1.3695;5.5728,-1.3977,-0.4622;5.16,-2.0718,0.2895;3.3419,3.1375,1.5395;3.8492,3.7065,0.7525;2.761,3.8424,2.145;2.6564,2.4261,1.0763;5.3213,3.4309,3.0743;4.7661,4.1071,3.7329;5.8281,4.0295,2.3105;6.0824,2.9154,3.6687)|\",7.246391838815\r\n\"[H]OC1=C(O[H])C([H])=C(C([H])([H])[C@]([H])(C(=O)N(C([H])([H])[H])C([H])([H])[H])N([H])[H])C([H])=C1[H] |(10.1247,0.8229,4.0882;9.6408,0.1332,4.568;8.3056,0.2911,4.3194;7.4212,-0.6086,4.944;7.9709,-1.5649,5.7511;7.2533,-2.0936,6.1331;6.0497,-0.4891,4.7214;5.3777,-1.1824,5.2269;5.5147,0.5117,3.8935;4.0266,0.5943,3.6347;3.7117,1.6425,3.5734;3.4584,0.1557,4.4614;3.6125,-0.121,2.3247;4.1705,0.3423,1.5005;2.1002,0.0756,2.0772;1.3632,0.4101,3.0039;1.629,-0.1361,0.8081;0.2252,0.1281,0.526;0.1289,0.8846,-0.2644;-0.2488,0.4898,1.4373;-0.2799,-0.7869,0.1895;2.4433,-0.5602,-0.3233;2.7568,0.2925,-0.9438;1.8471,-1.2314,-0.9519;3.3129,-1.1168,0.0247;3.9124,-1.5606,2.2761;3.3715,-2.04,2.9964;4.8945,-1.7012,2.5119;6.4065,1.3999,3.2855;6.0293,2.1944,2.6467;7.7831,1.2881,3.4976;8.4652,1.9902,3.0203)|\",5.62731442834\r\n\"[H]C1=C([H])C([H])=C([C@]([H])(C([H])([H])C(=O)OC([H])([H])C([H])([H])C([H])([H])[H])C([H])([H])C([H])([H])C([H])([H])[H])C([H])=C1[H] 
|(7.7656,-5.8037,7.2897;7.3354,-5.2402,6.466;7.8607,-5.3608,5.1797;8.7041,-6.0204,4.992;7.3022,-4.6326,4.1278;7.7179,-4.7354,3.1272;6.2151,-3.7714,4.3327;5.6422,-2.9956,3.1513;6.1383,-3.391,2.2561;6.0344,-1.4818,3.1545;5.8747,-1.0844,2.1477;7.0963,-1.3911,3.402;5.2648,-0.6155,4.1325;5.5041,-0.5059,5.3172;4.2452,0.0285,3.5111;3.4193,0.8842,4.3343;2.451,0.9043,3.8257;3.3016,0.425,5.3197;4.0085,2.2863,4.4538;4.9871,2.2137,4.9413;4.176,2.687,3.4461;3.0926,3.222,5.2501;3.5282,4.2238,5.3248;2.9345,2.8519,6.2701;2.1091,3.322,4.7745;4.1205,-3.189,2.9623;3.5937,-2.8109,3.8482;3.7922,-2.555,2.1269;3.674,-4.6379,2.7045;3.9842,-5.2704,3.5457;2.576,-4.6569,2.6959;4.1894,-5.2406,1.3922;3.7707,-6.2405,1.2311;5.2805,-5.342,1.3879;3.9082,-4.6198,0.5319;5.6975,-3.6615,5.6323;4.8675,-2.9926,5.8319;6.2523,-4.3871,6.6864;5.836,-4.2833,7.6851)|\",6.462703949374999\r\n\"[H]OC(=O)C([H])([H])[C@@]([H])(C1=C([H])C([H])=C(C#N)C([H])=C1[H])C([H])([H])C([H])([H])[H] |(4.6811,-0.0994,-3.632;4.1741,-0.709,-4.1955;2.923,-0.888,-3.6948;2.1703,-1.6723,-4.2165;2.5597,-0.0649,-2.4617;2.8187,0.9898,-2.6277;1.4736,-0.1264,-2.3606;3.219,-0.5697,-1.1477;2.9788,-1.6372,-1.056;4.7374,-0.4469,-1.1722;5.5493,-1.5879,-1.0953;5.0858,-2.5679,-1.0221;6.9376,-1.4921,-1.1121;7.5509,-2.3852,-1.0511;7.551,-0.233,-1.2094;8.9806,-0.1237,-1.2248;10.1404,-0.0342,-1.2352;6.7535,0.9207,-1.289;7.2248,1.8956,-1.3608;5.366,0.8078,-1.2696;4.766,1.713,-1.3189;2.5899,0.1507,0.0691;2.7953,1.2279,0.0042;1.5001,0.0426,-0.0044;3.0615,-0.3871,1.4241;2.5681,0.1463,2.2434;4.1428,-0.268,1.5522;2.8269,-1.453,1.5295)|\",5.534795719169999\r\n\"[H]OC(=O)C([H])([H])[C@@]([H])(C1=C([H])C([H])=C(N(=O)=O)C([H])=C1[H])C([H])([H])C([H])([H])[H] 
|(2.115,3.4237,0.9033;1.2272,3.0636,1.0631;1.3067,1.9186,1.8007;0.3,1.3468,2.1325;2.7177,1.435,2.1116;3.3035,2.2597,2.5399;2.6296,0.6607,2.8774;3.4491,0.8763,0.8558;3.4655,1.6708,0.0953;4.8974,0.5552,1.1922;5.2276,-0.4367,2.1304;4.4416,-1.0027,2.6222;6.5531,-0.7171,2.4459;6.8181,-1.4803,3.1671;7.5615,0.0078,1.8125;8.9657,-0.2814,2.141;9.1884,-1.1643,2.9691;9.8336,0.3772,1.5685;7.2743,0.9966,0.8761;8.0837,1.5377,0.4016;5.9407,1.2608,0.5746;5.7083,2.0286,-0.159;2.6895,-0.327,0.2494;2.6659,-1.144,0.9829;1.6443,-0.036,0.0932;3.283,-0.8278,-1.0704;2.6919,-1.6592,-1.4689;4.3126,-1.1804,-0.9462;3.2907,-0.0348,-1.8284)|\",4.889885893485\r\n\"[H]OC(=O)C([H])([H])[C@@]([H])(C1=C(Cl)C([H])=C(Cl)C([H])=C1[H])C([H])([H])[H] |(4.8432,0.6828,-1.0055;4.4293,0.8552,-1.8693;3.24,1.4986,-1.6998;2.5848,1.8252,-2.6572;2.8256,1.7069,-0.2493;3.6626,2.1483,0.3041;1.998,2.4206,-0.2319;2.3862,0.3675,0.4203;3.127,-0.3951,0.1528;2.4269,0.5037,1.9353;3.6314,0.4075,2.6522;5.1597,0.0954,1.8107;3.7047,0.5454,4.0364;4.6545,0.4571,4.5499;2.5307,0.7963,4.7426;2.6015,0.9719,6.4863;1.3107,0.9064,4.0811;0.4015,1.101,4.6392;1.2767,0.7585,2.6958;0.3196,0.8392,2.192;1.0357,-0.0993,-0.1413;1.1147,-0.2556,-1.2213;0.2471,0.6447,0.0163;0.7192,-1.0406,0.3198)|\",5.97562015698\r\n\"[H]OC(=O)C([H])([H])[C@@]([H])(C1=C([H])C([H])=C(C#N)C([H])=C1[H])C([H])([H])[H] 
|(0.2401,-2.3368,-1.3893;-0.2308,-2.3889,-0.54;0.5398,-1.8896,0.4645;0.1007,-1.8263,1.5852;1.941,-1.4261,0.0764;2.4632,-1.2169,1.0131;2.4709,-2.2448,-0.4279;1.9836,-0.1492,-0.8208;3.044,0.1338,-0.8533;1.5837,-0.4386,-2.2656;2.5308,-1.0061,-3.1361;3.5345,-1.2106,-2.7705;2.2204,-1.2986,-4.4588;2.9672,-1.7283,-5.1185;0.9322,-1.0255,-4.95;0.6041,-1.3167,-6.3152;0.3409,-1.5534,-7.4232;-0.0269,-0.4628,-4.0946;-1.0225,-0.2495,-4.4701;0.3003,-0.1751,-2.7706;-0.457,0.2641,-2.1299;1.218,1.0136,-0.1717;1.644,1.2389,0.8113;0.1597,0.7803,-0.0146;1.2818,1.9169,-0.787)|\",5.542959134685001\r\n\"[H]OC(=O)C([H])([H])[C@@]([H])(C1=C([H])C(F)=C([H])C([H])=C1[H])C([H])([H])C([H])([H])C([H])([H])[H] |(5.8822,2.4797,-1.0497;5.5214,1.7479,-1.5775;5.0732,0.7569,-0.7576;4.5874,-0.2389,-1.2321;5.2786,1.0079,0.734;6.3607,0.9665,0.9237;4.9659,2.0338,0.9762;4.546,-0.0092,1.635;4.7526,-0.9999,1.2135;5.1129,0.0271,3.0479;4.9817,1.1699,3.8508;4.4588,2.0556,3.5024;5.5246,1.1767,5.1286;5.3873,2.2884,5.8845;6.2035,0.0846,5.657;6.6104,0.1357,6.6614;6.3328,-1.0515,4.859;6.8565,-1.9213,5.2457;5.7951,-1.0801,3.5701;5.9039,-1.9738,2.9613;3.0163,0.2051,1.5907;2.7714,1.1835,2.0306;2.7076,0.2469,0.5394;2.2102,-0.8882,2.3027;2.4446,-1.8599,1.8461;2.5256,-0.9565,3.3518;0.6989,-0.6452,2.2365;0.145,-1.444,2.7424;0.3484,-0.6021,1.1981;0.4278,0.303,2.7174)|\",6.27494539253\r\n\"[H]OC(=O)C([H])([H])[C@@]([H])(C1=C([H])C([H])=C(N(=O)=O)C([H])=C1[H])C([H])([H])C([H])([H])C([H])([H])[H] 
|(5.6466,3.2478,-1.3209;5.2958,2.3506,-1.4485;4.9991,1.7854,-0.2471;4.5483,0.6686,-0.2029;5.3178,2.648,0.9726;6.4123,2.7336,1.0317;4.9399,3.6669,0.8066;4.7584,2.0666,2.2895;4.9959,0.9966,2.2796;5.4595,2.6942,3.485;5.3117,4.0588,3.7877;4.6701,4.6839,3.1727;5.9706,4.6348,4.8687;5.861,5.6852,5.1091;6.7896,3.8307,5.6598;7.4905,4.4308,6.8041;7.3179,5.6316,7.0147;8.2092,3.6979,7.4835;6.96,2.4745,5.3941;7.6032,1.8809,6.0321;6.292,1.9187,4.3059;6.4171,0.861,4.0903;3.2196,2.1978,2.3534;2.9417,3.262,2.3806;2.8075,1.7892,1.4234;2.579,1.4709,3.5426;2.8599,0.4093,3.5053;2.9866,1.8647,4.4829;1.052,1.5944,3.5562;0.6185,1.0618,4.4099;0.6116,1.1762,2.6431;0.7389,2.6436,3.6252)|\",5.020500541724999\r\n\"[H]/N=C(/O[H])[C@@]([H])(N([H])C([H])([H])P(=O)(OC([H])([H])C([H])([H])[H])OC([H])([H])C([H])([H])[H])C([H])([H])[H] |(2.8334,-7.4673,-2.0741;3.64,-7.5388,-1.4492;4.2323,-6.4161,-1.3871;5.3346,-6.2773,-0.6208;5.5487,-5.3168,-0.6424;3.8688,-5.1388,-2.166;2.781,-5.1478,-2.347;4.2811,-3.9918,-1.3359;3.7073,-3.9682,-0.4934;4.2352,-2.6729,-1.9604;3.2549,-2.42,-2.4012;4.9783,-2.6031,-2.7594;4.6155,-1.3311,-0.7795;4.3989,0.0419,-1.3045;3.7143,-1.6897,0.529;2.6703,-0.7931,0.9956;2.3233,-0.1739,0.1641;1.8541,-1.4453,1.3203;3.1728,0.0691,2.1408;2.352,0.6776,2.5385;3.9615,0.7451,1.7959;3.5651,-0.5524,2.9522;6.1067,-1.7267,-0.3214;6.83,-0.891,0.6184;6.7923,0.1466,0.2706;6.3311,-0.9545,1.5921;8.2561,-1.4021,0.6937;8.8305,-0.7979,1.4048;8.7396,-1.3389,-0.2859;8.2796,-2.4444,1.0266;4.5912,-5.1529,-3.5221;4.3442,-6.0762,-4.0534;5.6767,-5.119,-3.38;4.2912,-4.3108,-4.1533)|\",7.63551464503\r\n\"[H]OC(=O)[C@@]([H])(N([H])C([H])([H])P(=O)(OC([H])([H])C([H])([H])[H])OC([H])([H])C([H])([H])[H])C([H])([H])[H] 
|(1.378,-0.7787,4.4962;2.3522,-0.7099,4.4952;2.6905,-0.0585,3.3576;1.866,0.3148,2.5496;4.2046,0.171,3.2506;4.7002,-0.7442,3.5963;4.6403,0.4482,1.8924;4.1209,1.2464,1.5301;4.4846,-0.6731,0.9717;3.4674,-1.1005,0.9406;5.1695,-1.4822,1.2533;4.8625,-0.2246,-0.7582;4.4668,-1.2551,-1.7536;4.1596,1.2287,-0.976;2.9097,1.3591,-1.7032;3.1238,1.99,-2.5725;2.5985,0.3759,-2.0672;1.8585,1.995,-0.809;0.9207,2.1131,-1.3648;2.1823,2.9861,-0.4732;1.6629,1.3792,0.0751;6.4212,0.1706,-0.6781;7.1056,0.6199,-1.873;6.921,-0.0974,-2.68;6.6909,1.5913,-2.1655;8.5848,0.7254,-1.5521;9.1325,1.0805,-2.4324;8.9889,-0.2494,-1.2617;8.7534,1.4289,-0.7307;4.6043,1.3282,4.1826;4.3205,1.1172,5.2178;4.1133,2.2583,3.8727;5.6857,1.4762,4.1264)|\",6.359300686185\r\n\"[H]/N=C(/O[H])[C@]1([H])N(C([H])([H])P(=O)(O[H])O[H])C([H])([H])C([H])([H])[C@@]1([H])O[H] |(-2.1428,-0.8519,-1.7611;-1.8955,-0.2048,-2.5086;-1.1731,0.7444,-2.0587;-0.757,1.6827,-2.9424;-0.2865,2.4185,-2.4907;-0.638,0.9348,-0.6295;-0.6368,2.0033,-0.3947;0.7269,0.3822,-0.4148;1.7959,0.7523,-1.3258;1.6128,0.4625,-2.3765;2.7172,0.2491,-1.0086;2.1529,2.5372,-1.3465;1.0298,3.4463,-1.7371;2.7772,2.7955,0.1187;2.8063,3.7443,0.3329;3.4239,2.6597,-2.3508;3.1836,3.1874,-3.1316;0.5675,-1.0795,-0.2833;0.5334,-1.5709,-1.2697;1.4224,-1.4878,0.2666;-0.7763,-1.2454,0.4592;-0.6515,-1.5887,1.4894;-1.4211,-1.969,-0.0494;-1.3948,0.1792,0.4818;-2.4758,0.1807,0.3258;-1.182,0.8217,1.7308;-0.219,0.9476,1.8003)|\",6.729375522865\r\n\"[H]OP(=O)(O[H])C([H])([H])N([H])[C@@]([H])([S@@](=O)N([H])[H])C([H])([H])S[H] 
|(-0.7006,-1.1222,-3.1185;-0.136,-1.1553,-2.327;-0.7568,-2.2703,-1.2934;-1.7718,-3.1392,-1.9399;0.5848,-2.9774,-0.7128;1.0348,-3.5244,-1.3787;-1.28,-1.3638,0.2136;-2.1793,-0.7883,-0.0192;-1.5726,-2.1358,0.9348;-0.2943,-0.4786,0.8331;0.5964,-0.9492,0.9815;-0.0873,0.8179,0.2363;-0.3007,0.8352,-0.84;-1.3976,2.0396,0.8966;-2.7163,1.4692,0.4311;-1.0336,1.66,2.5307;-1.0695,0.6415,2.6487;-1.7067,2.1162,3.1442;1.2956,1.3978,0.5154;1.4528,1.49,1.5934;1.3695,2.398,0.0811;2.6945,0.3557,-0.0858;2.2612,0.2239,-1.3587),wD:13.13|\",6.34569499366\r\n\"[H]/N=C(/O[H])[C@@]([H])(N([H])C([H])([H])P(=O)(O[H])O[H])C([H])([H])O[H] |(0.6632,-0.4715,3.252;1.6398,-0.674,3.0321;1.7736,-0.93,1.7973;3.0314,-1.1991,1.3461;2.9998,-1.4441,0.4082;0.6963,-0.9312,0.7028;0.8505,-1.8176,0.0702;-0.684,-0.9906,1.1715;-0.88,-0.228,1.8191;-1.0956,-2.263,1.76;-0.5351,-2.5505,2.6634;-0.9571,-3.0627,1.0238;-2.8708,-2.2208,2.173;-3.9008,-2.2094,1.1046;-2.9667,-3.5014,3.1714;-3.8163,-3.9549,3.0376;-2.894,-0.9082,3.1426;-3.7559,-0.462,3.0823;0.818,0.3063,-0.2151;1.8249,0.3744,-0.6397;0.659,1.2127,0.3952;-0.09,0.2207,-1.2869;-0.9291,-0.0745,-0.8866)|\",7.322583716955\r\n\"[H]/N=C(/O[H])[C@@]([H])(N([H])C([H])([H])P(=O)(O[H])O[H])C([H])([H])C([H])(C([H])([H])[H])C([H])([H])[H] 
|(6.21,1.1089,0.8388;5.8101,1.2667,-0.0892;4.8962,0.4126,-0.3095;4.217,0.4283,-1.485;4.5707,1.1909,-1.9814;4.4087,-0.6985,0.6124;4.0367,-1.5191,-0.0231;5.5668,-1.1136,1.4222;6.3493,-1.2921,0.7954;5.3598,-2.2896,2.2565;4.9976,-3.1831,1.7149;4.6277,-2.0782,3.0427;6.8816,-2.7925,3.1258;6.7726,-4.0112,3.9657;7.896,-2.8341,1.8569;8.8023,-3.0729,2.1148;7.4056,-1.4899,3.9535;7.4107,-1.7062,4.9012;3.2577,-0.2033,1.52;2.9375,-1.0408,2.1551;3.6789,0.5587,2.1883;2.0045,0.3592,0.8185;2.3074,1.2142,0.2;1.3248,-0.6691,-0.1003;0.3929,-0.26,-0.5079;1.0695,-1.5828,0.4532;1.9626,-0.9431,-0.9454;1.0159,0.8733,1.8766;0.137,1.3256,1.4026;1.4735,1.6299,2.5254;0.6617,0.0544,2.5167)|\",7.235507284795\r\n\"[H]/N=C(/O[H])[C@@]([H])(N([H])C([H])([H])P(=O)(O[H])O[H])[C@]([H])(C([H])([H])[H])C([H])([H])C([H])([H])[H] |(2.5542,-1.4271,1.8953;3.2034,-2.2018,1.7369;3.7075,-2.1052,0.5742;4.6009,-3.0247,0.1512;4.8922,-2.7032,-0.7333;3.3697,-1.0367,-0.4896;3.2186,-0.0847,0.0464;4.502,-0.9438,-1.4235;4.2073,-0.6263,-2.3422;5.6076,-0.1123,-0.9473;6.0482,-0.5592,-0.0498;5.3109,0.9197,-0.6965;6.8644,-0.0323,-2.2642;6.3192,0.2155,-3.6247;7.8548,1.0783,-1.6296;8.567,1.3259,-2.2437;7.7325,-1.4062,-2.1582;7.6055,-1.9316,-2.9661;2.033,-1.3935,-1.2045;1.3612,-1.6856,-0.387;2.1604,-2.6109,-2.1357;1.1783,-2.8759,-2.5429;2.5473,-3.4812,-1.5966;2.827,-2.4224,-2.9837;1.3449,-0.1813,-1.8821;0.3405,-0.5131,-2.1776;1.1938,0.5978,-1.1223;2.0249,0.4466,-3.1079;1.3623,1.1879,-3.5682;2.2568,-0.2973,-3.8788;2.9528,0.9719,-2.8544)|\",7.513063412305\r\n\"[H]/N=C(/O[H])[C@@]([H])(N([H])C([H])([H])P(=O)(O[H])O[H])C([H])([H])S[H] 
|(2.0169,-1.4657,-3.0101;2.6154,-2.2655,-2.7955;2.5144,-2.5551,-1.5624;3.2351,-3.5799,-1.0543;2.9133,-3.7112,-0.1359;1.6931,-1.8056,-0.5021;0.8666,-1.2757,-0.9998;1.2212,-2.8213,0.4602;0.6476,-3.4954,-0.0462;0.4529,-2.3404,1.6073;-0.417,-1.7174,1.3387;1.0865,-1.7446,2.2709;-0.1776,-3.7138,2.6306;-1.0311,-3.3193,3.777;-0.8485,-4.611,1.4557;-1.2715,-5.4183,1.7944;1.1207,-4.595,3.0635;1.1983,-4.5946,4.0328;2.5861,-0.7615,0.2046;3.441,-1.2577,0.6735;2.0214,-0.2522,0.99;3.159,0.6136,-0.8672;3.8853,-0.1362,-1.7242)|\",7.055912143465\r\n\"[H]OC(=O)[C@@]([H])(N([H])[H])C([H])([H])C1=C([H])C([H])=C(N2NN=C([H])N2)C([H])=C1[H] |(0.1381,0.7648,2.2543;0.8103,1.1559,2.8395;1.1998,2.365,2.3647;2.0879,2.9759,2.9079;0.46,2.8926,1.1165;-0.62,2.791,1.2913;0.7397,4.2979,0.8533;1.7524,4.427,0.876;0.4002,4.8528,1.6387;0.8222,2.0582,-0.1472;0.4218,2.6251,-0.9942;1.9134,2.0458,-0.2589;0.2817,0.6439,-0.169;1.1151,-0.4665,0.0307;2.1758,-0.3149,0.2111;0.6144,-1.7668,0.0018;1.264,-2.6202,0.1539;-0.7479,-1.9644,-0.227;-1.2687,-3.2887,-0.2607;-2.5639,-3.5595,-0.4664;-2.6749,-4.8628,-0.4312;-1.4262,-5.3396,-0.2024;-1.191,-6.3899,-0.1198;-0.5179,-4.3803,-0.0898;-1.6082,-0.8805,-0.4256;-2.6618,-1.0538,-0.6085;-1.0853,0.4088,-0.3958;-1.7526,1.2502,-0.5669)|\",5.015058264715\r\n\"[H]OC(=O)[C@@]([H])(N([H])C([H])([H])C([H])([H])C([H])([H])[C@@]([H])(C(=O)O[H])N([H])[H])C([H])([H])[H] 
|(7.1275,1.5196,-3.3026;7.8411,0.8386,-3.2175;7.1783,-0.328,-3.1167;7.7277,-1.3949,-2.9832;5.6466,-0.1775,-3.24;5.2041,-0.877,-2.5153;5.271,1.2169,-2.9624;4.4407,1.4705,-3.4953;5.0557,1.5114,-1.5383;4.3033,0.8367,-1.0953;6.0005,1.3158,-1.014;4.6263,2.964,-1.3118;3.6666,3.1388,-1.8181;4.428,3.0922,-0.2403;5.6618,3.9919,-1.7943;5.8354,3.8628,-2.8666;6.624,3.8141,-1.2919;5.2619,5.4643,-1.5772;6.053,6.0906,-2.009;3.9791,5.7586,-2.3841;3.966,5.8083,-3.5917;2.8804,5.8999,-1.6254;3.2073,5.8193,-0.6932;5.0153,5.7648,-0.153;5.5859,5.1792,0.453;5.2548,6.7313,0.0591;5.2191,-0.5915,-4.6543;5.5775,-1.6015,-4.8683;5.6378,0.0915,-5.4019;4.1269,-0.5828,-4.7466)|\",7.104892636555\r\n\"[H]O[C@]([H])(C([H])([H])OC(=O)C1=C([H])C([H])=C([H])C([H])=C1[H])C([H])(C([H])([H])[H])C([H])([H])[H] |(2.4694,2.8855,0.6467;2.817,2.2657,-0.0195;3.4186,1.1732,0.6621;3.2917,0.3041,0.0001;2.7002,0.8796,1.9842;2.8708,1.6835,2.7078;3.0335,-0.0676,2.4104;1.2764,0.7121,1.8029;0.5161,1.8243,1.8663;0.9765,2.9459,2.0218;-0.9377,1.5354,1.7452;-1.4336,0.2327,1.5921;-0.7432,-0.6022,1.5563;-2.8067,0.0217,1.484;-3.19,-0.9878,1.3643;-3.6884,1.1046,1.5287;-4.7587,0.9361,1.4445;-3.1964,2.4037,1.6816;-3.8816,3.246,1.7154;-1.8253,2.6196,1.7892;-1.4206,3.6193,1.9072;4.9349,1.4063,0.8751;5.0324,2.2899,1.5255;5.6307,0.2188,1.5619;6.7142,0.3792,1.5967;5.4559,-0.7135,1.0085;5.2933,0.0663,2.5931;5.6084,1.7273,-0.4666;6.6591,2.0016,-0.3172;5.0975,2.5516,-0.9693;5.5822,0.8549,-1.1328)|\",5.559285965715\r\n\"[H]O[C@@]([H])(C([H])([H])OC(=O)C([H])([H])[H])C([H])([H])C1=C([H])C([H])=C([H])C([H])=C1[H] 
|(4.3004,-0.1196,2.5553;3.5399,0.0127,3.1448;2.4991,-0.8603,2.72;1.6425,-0.6004,3.3482;2.1158,-0.6166,1.2579;1.2312,-1.2,0.9918;2.9435,-0.8905,0.5933;1.8839,0.7776,0.9823;0.6725,1.2716,1.3353;-0.208,0.6069,1.8362;0.5886,2.7463,1.0201;0.8339,2.9271,-0.0311;1.3181,3.2943,1.626;-0.4167,3.1077,1.2389;2.8857,-2.3361,2.9693;3.7452,-2.586,2.3303;3.2411,-2.388,4.0053;1.7725,-3.3382,2.7438;1.7807,-4.1951,1.6353;2.6108,-4.1518,0.933;0.7464,-5.1089,1.4247;0.7746,-5.7659,0.5592;-0.3172,-5.1804,2.3244;-1.1236,-5.8907,2.1635;-0.3371,-4.3342,3.4354;-1.1594,-4.3851,4.1442;0.6986,-3.424,3.6422;0.6757,-2.7731,4.5134)|\",6.43277142582\r\n\"[H]OC([H])([H])C1=C([H])C(=O)C([H])=C(C([H])([H])[H])[C@@]12C([H])([H])C([H])([H])[C@@]([H])(C(=C([H])[H])C([H])([H])[H])C2([H])[H] |(3.6383,4.0198,2.5074;4.1247,3.1941,2.3501;5.0861,3.4631,1.3224;5.8464,4.1674,1.6883;5.5838,2.5065,1.1459;4.4696,4.0249,0.0503;4.5187,5.3577,-0.1315;5.0097,6.0085,0.59;3.9408,6.034,-1.306;3.9952,7.255,-1.444;3.2971,5.1553,-2.29;2.8555,5.6567,-3.1478;3.2309,3.8157,-2.1579;2.5107,3.0067,-3.2105;2.1597,3.6543,-4.0184;1.6377,2.49,-2.7933;3.1494,2.2336,-3.6538;3.8435,3.0786,-0.9684;4.9592,2.0607,-1.47;4.9399,2.0048,-2.5619;5.9625,2.4069,-1.2031;4.6165,0.6901,-0.862;5.0637,0.5907,0.1339;4.9949,-0.1409,-1.4678;3.0782,0.6894,-0.751;2.6697,0.5498,-1.7599;2.4693,-0.3781,0.1405;1.4596,-1.136,-0.3023;0.9927,-1.8917,0.3246;1.0616,-1.0305,-1.3091;3.0274,-0.5523,1.5352;2.4025,-1.2301,2.1244;3.1024,0.399,2.0761;4.0398,-0.9772,1.5086;2.7686,2.1362,-0.2915;2.8644,2.2114,0.7925;1.7478,2.4403,-0.5413)|\",4.7293387216900005\r\n\"[H]OC([H])([H])C(=C([H])[H])[C@]1([H])C([H])([H])C(=O)C2=C([H])C([H])([H])C([H])([H])[C@@]([H])(C([H])([H])[H])[C@]2(C([H])([H])[H])C1([H])[H] 
|(0.0556,0.1333,-0.0038;0.5634,-0.3873,-0.6463;1.6335,-1.0051,0.0762;1.2456,-1.6699,0.8625;2.1416,-1.6326,-0.6657;2.606,-0.0081,0.671;2.7803,0.0372,1.9966;2.2312,-0.6282,2.659;3.4746,0.7189,2.48;3.3139,0.9105,-0.3211;3.3197,0.3811,-1.2826;4.7715,1.2164,0.0595;5.3936,0.3165,0.0825;4.7998,1.6446,1.0732;5.4599,2.2538,-0.8225;6.6687,2.2054,-0.9976;4.6204,3.3794,-1.3522;5.234,4.5631,-1.5232;6.2977,4.6166,-1.2984;4.5505,5.8231,-1.9603;4.5411,6.5249,-1.1104;5.1551,6.3207,-2.7319;3.1237,5.5763,-2.4647;3.1552,5.2525,-3.5129;2.5566,6.5155,-2.4539;2.382,4.5205,-1.6233;2.4032,4.8685,-0.5784;0.9094,4.442,-2.0553;0.4205,5.4101,-1.8951;0.3455,3.693,-1.4903;0.8123,4.2006,-3.12;3.1323,3.1461,-1.6313;2.9984,2.4409,-3.0067;1.9743,2.0906,-3.1761;3.6612,1.5725,-3.079;3.2744,3.1125,-3.8254;2.5402,2.2333,-0.523;2.5324,2.7795,0.4314;1.5022,1.9922,-0.7676)|\",5.259960730165\r\n\"[H]C([H])([H])C(=O)OC([H])([H])[C@@]1([H])O[C@]1([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H] |(12.2355,-2.1565,-3.7418;12.1673,-2.0445,-4.8289;13.1528,-1.7283,-5.1858;11.8928,-2.9966,-5.284;11.1289,-1.0038,-5.1744;10.1321,-1.1951,-5.8373;11.4584,0.1964,-4.6385;10.5485,1.2872,-4.906;11.1423,2.1895,-4.7468;10.2173,1.2298,-5.9472;9.3624,1.2696,-3.9781;8.7608,0.3634,-4.0184;9.5837,1.776,-2.65;8.7343,2.5255,-3.5334;9.1969,3.4539,-3.8818;7.2723,2.6742,-3.1592;7.2132,3.4328,-2.3651;6.7446,3.1013,-4.0245;6.5695,1.3948,-2.6899;6.518,0.6752,-3.5184;7.1754,0.9258,-1.9043;5.1516,1.654,-2.1643;5.1991,2.3616,-1.3234;4.5561,2.1499,-2.9455;4.4305,0.3773,-1.7132;4.3701,-0.3232,-2.5593;5.0346,-0.1275,-0.9448;3.0206,0.6302,-1.1643;3.0809,1.3196,-0.3089;2.4194,1.1467,-1.9277;2.2936,-0.6497,-0.7314;2.237,-1.3389,-1.586;2.8921,-1.1643,0.0338;0.8844,-0.3912,-0.1886;0.3916,-1.3244,0.1073;0.2521,0.0939,-0.9425;0.9119,0.2645,0.6905)|\",7.436871534165\r\n\"[H]N1N=C([C@@]([H])(N([H])[H])C([H])(C([H])([H])[H])C([H])([H])[H])S/C1=N/C([H])(C([H])([H])[H])C([H])([H])[H] 
|(3.1905,4.5529,-1.1373;2.5855,3.772,-1.3458;3.1353,2.5247,-1.1959;2.3837,1.6188,-1.72;2.7528,0.1416,-1.6858;2.0228,-0.3984,-2.3028;4.0795,-0.1128,-2.2642;4.7568,0.4869,-1.7927;4.0859,0.1764,-3.2417;2.6786,-0.4355,-0.2466;3.4463,0.091,0.34;1.3153,-0.1721,0.4096;1.2926,-0.5974,1.4192;0.504,-0.6407,-0.1629;1.0936,0.8965,0.4946;3.0025,-1.9358,-0.2439;3.0148,-2.322,0.782;3.9714,-2.134,-0.7074;2.2413,-2.4984,-0.8008;0.9229,2.2331,-2.5209;1.4367,3.9147,-2.127;0.8076,4.947,-2.5103;1.2878,6.2778,-2.1306;2.3939,6.3055,-2.0829;0.8525,7.274,-3.2093;1.1978,8.2867,-2.9701;-0.2399,7.2856,-3.2899;1.2592,6.9888,-4.1849;0.7381,6.6668,-0.7485;1.071,7.6722,-0.4652;1.069,5.9649,0.0253;-0.3571,6.6566,-0.766)|\",5.36064285485\r\n\"[H]C([H])=C([H])C([H])([H])N1C([H])([H])[C@@]([H])(OC([H])([H])[H])[C@@]([H])(Cl)[C@@]1([H])C([H])([H])C([H])(C([H])([H])[H])C([H])([H])[H] |(2.3839,2.4914,-3.716;2.3965,1.6153,-3.0732;2.065,0.6804,-3.521;2.796,1.6813,-1.8031;3.1149,2.6399,-1.397;2.7962,0.5092,-0.8572;2.4167,-0.3663,-1.3958;2.0728,0.7092,-0.0372;4.1184,0.149,-0.3272;4.1106,-1.198,0.2913;3.0814,-1.5227,0.512;4.5521,-1.9459,-0.3796;4.898,-1.0975,1.6133;4.4845,-1.7644,2.384;6.2826,-1.3428,1.4687;6.6255,-2.7141,1.4842;7.715,-2.7672,1.4207;6.1951,-3.2635,0.6331;6.2961,-3.1996,2.4162;4.6459,0.3741,1.9584;3.6144,0.4513,2.3128;5.65,1.006,3.3239;4.7491,1.1205,0.6069;4.1352,2.0272,0.6604;6.1789,1.4922,0.1723;6.6728,2.03,0.992;6.7346,0.5603,0.0325;6.2581,2.3447,-1.1088;5.5738,1.9062,-1.8477;5.8454,3.8044,-0.8611;5.8554,4.383,-1.793;6.539,4.2895,-0.1616;4.8394,3.8927,-0.4331;7.6765,2.2901,-1.6963;7.7539,2.8934,-2.6093;7.9624,1.262,-1.9476;8.4152,2.6758,-0.9809)|\",6.065417727645\r\n\"[H]C([H])=C([H])C([H])([H])N1C([H])([H])[C@@]([H])(OC([H])([H])[H])[C@@]1([H])[C@]([H])(Cl)C([H])([H])C([H])(C([H])([H])[H])C([H])([H])[H] 
|(1.9855,4.269,0.5599;2.1443,3.3191,1.0629;2.0086,3.3127,2.1427;2.4827,2.2162,0.3944;2.6112,2.2511,-0.6872;2.6581,0.8615,1.022;1.8642,0.2025,0.6401;2.5046,0.9376,2.1169;3.9179,0.1905,0.6979;3.9739,-1.2289,1.1275;3.2343,-1.4797,1.9041;3.8898,-1.95,0.3055;5.3935,-1.0075,1.6991;5.5436,-1.3231,2.7398;6.4402,-1.4816,0.8861;6.7045,-2.8593,1.0618;6.9818,-3.0853,2.1033;7.5433,-3.1082,0.4073;5.8412,-3.4847,0.7866;5.1372,0.5119,1.4978;4.8908,1.0097,2.4485;6.2493,1.29,0.7869;6.684,0.6552,0.0166;7.6024,1.4869,2.0297;5.8407,2.6459,0.2263;5.0255,2.4454,-0.4814;5.4112,3.2562,1.0303;6.9395,3.4489,-0.4988;7.7464,3.6404,0.2203;7.5369,2.6905,-1.6936;8.2801,3.3088,-2.2106;6.759,2.4275,-2.4231;8.0382,1.766,-1.3877;6.3766,4.8068,-0.9459;7.1543,5.4184,-1.4184;5.9744,5.3727,-0.0971;5.5661,4.6794,-1.6758)|\",6.38379093273\r\n\"[H]OC(=O)C([H])([H])C([H])([H])C([H])([H])/C([H])=C([H])/C([H])=C(\\[H])C(=O)C([H])([H])[H] |(4.6097,2.1932,2.6586;5.5678,2.1281,2.4983;6.2413,2.9271,3.3726;7.446,2.9644,3.3546;5.383,3.6937,4.3727;5.9113,4.6228,4.6025;4.4151,3.9608,3.9301;5.1671,2.8982,5.6786;6.1402,2.7288,6.1532;4.5833,3.5251,6.3643;4.459,1.5361,5.5017;5.0959,0.8466,4.9349;4.3364,1.0945,6.502;3.1104,1.639,4.8496;2.4088,2.3457,5.2983;2.7054,0.931,3.7742;3.3879,0.2075,3.3248;1.3863,1.0513,3.1865;0.6923,1.7623,3.6347;0.9471,0.3486,2.1222;1.5917,-0.3788,1.6309;-0.4261,0.5415,1.5867;-1.2061,1.3434,2.0821;-0.8101,-0.3067,0.3873;-0.691,-1.3738,0.6144;-0.1505,-0.0864,-0.4623;-1.845,-0.1039,0.1063)|\",4.653146843550001\r\n\"[H]C([H])([H])C([H])([H])[C@]([H])(C(=O)OC([H])([H])[C@@]1([H])C([H])([H])C([H])([H])N2C([H])([H])C([H])([H])C([H])([H])[C@@]21[H])C([H])([H])[H] 
|(10.4755,-2.9156,-3.5885;9.7516,-2.8045,-2.7733;8.9032,-3.4654,-2.9899;10.2324,-3.1652,-1.858;9.3006,-1.3455,-2.6513;8.9087,-1.0091,-3.6184;10.1648,-0.7074,-2.4214;8.2107,-1.1087,-1.5731;7.3527,-1.7467,-1.8234;7.7325,0.3348,-1.665;8.0536,1.2287,-0.9102;6.9072,0.5087,-2.7275;6.4136,1.8503,-2.9553;5.4489,1.7134,-3.45;6.2628,2.3361,-1.9882;7.379,2.6684,-3.8173;8.3265,2.7455,-3.2739;7.5636,2.0835,-5.2314;8.4998,2.4571,-5.6633;7.6112,0.99,-5.236;6.3556,2.6395,-6.0101;5.5245,1.9206,-6.0069;6.6004,2.8329,-7.0623;5.9387,3.8813,-5.315;6.0607,5.1188,-6.0913;5.761,4.9544,-7.131;5.3788,5.8726,-5.6725;7.5242,5.5795,-5.9271;8.163,5.0407,-6.638;7.6629,6.6506,-6.1098;7.8564,5.1649,-4.4771;7.764,6.0159,-3.7931;8.8818,4.7918,-4.3812;6.7996,4.0648,-4.1174;6.1872,4.4221,-3.2788;8.6904,-1.4159,-0.1505;8.9854,-2.465,-0.0555;7.9006,-1.2187,0.5814;9.5451,-0.7853,0.1133)|\",5.967456741465\r\n\"[H]C([H])=C([H])[C@]1([H])C([H])([H])C([H])([H])C([H])([H])[C@]1([H])C(=O)C1=C([H])C([H])=C(OC([H])([H])[H])C([H])=C1[H] |(-0.1272,-1.1941,0.2007;0.6897,-0.7045,0.7248;0.8883,-1.0477,1.739;1.4069,0.2671,0.1574;1.1756,0.5878,-0.8565;2.5385,0.9926,0.8305;2.6421,0.5819,1.845;2.3388,2.5311,0.9527;1.7573,2.9001,0.1042;1.7795,2.772,1.8632;3.7626,3.1547,0.9448;3.9523,3.783,1.8215;3.888,3.7895,0.0621;4.7465,1.9641,0.8855;5.0041,1.6269,1.8974;5.6856,2.2089,0.377;3.9464,0.8304,0.1849;4.3576,-0.1544,0.4264;3.9826,1.0411,-1.3275;3.1394,1.7372,-1.8861;5.0935,0.4335,-2.1241;6.1512,-0.3111,-1.5675;6.2042,-0.4788,-0.497;7.1537,-0.8375,-2.368;7.9726,-1.4101,-1.9443;7.1244,-0.6391,-3.7576;8.1469,-1.2011,-4.4543;8.1811,-1.0339,-5.8654;9.0773,-1.5561,-6.2046;8.2505,0.0254,-6.1431;7.2989,-1.4778,-6.3439;6.0791,0.1009,-4.3315;6.0353,0.2675,-5.4017;5.0854,0.6275,-3.5137;4.2727,1.2054,-3.9416)|\",4.960635494615\r\n\"[H]O[C@]1([H])C([H])([H])N(C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])[C@]([H])(C([H])([H])C([H])(C([H])([H])[H])C([H])([H])[H])[C@]1([H])Cl 
|(2.9415,0.7406,6.6827;3.4151,0.1765,6.0479;2.8441,0.3997,4.7709;1.8338,-0.0318,4.727;3.7878,-0.2064,3.703;3.2553,-0.8966,3.0423;4.5655,-0.7741,4.2264;4.3769,0.9023,2.9128;3.9277,0.9825,1.5163;3.7752,-0.0432,1.1613;2.9497,1.494,1.4133;4.9485,1.6611,0.5967;5.1445,2.6845,0.9447;5.9008,1.1209,0.6782;4.4902,1.7069,-0.867;4.299,0.6837,-1.2212;3.5282,2.2351,-0.9328;5.5068,2.3836,-1.7922;5.1538,2.3999,-2.8296;6.469,1.8576,-1.7752;5.692,3.4206,-1.4865;4.194,2.1432,3.6967;4.1258,2.9919,3.009;5.342,2.3814,4.6933;5.0722,3.2234,5.3443;5.4223,1.5043,5.3442;6.7058,2.6658,4.0344;6.8515,1.9151,3.2459;6.7682,4.0626,3.3962;7.7362,4.2282,2.9083;6.6418,4.8446,4.1566;5.9917,4.2124,2.637;7.8365,2.502,5.0616;8.8174,2.6801,4.6044;7.8428,1.4929,5.4899;7.7225,3.2145,5.8893;2.8111,1.881,4.339;2.0383,2.0094,3.5777;2.3009,2.9968,5.6891)|\",6.35657954768\r\n\"[H]OC(=O)C1=C([H])C([H])([H])[C@@]2([H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])C([H])([H])C([H])([H])[C@]2(C([H])([H])[H])[C@]1([H])O[H] |(1.4419,-1.7244,1.0132;2.3142,-1.6231,1.4377;3.1892,-1.3977,0.4283;2.8297,-1.395,-0.7405;4.5815,-1.1404,0.8517;4.8859,-0.7831,2.1091;4.1029,-0.7683,2.8646;6.2646,-0.3722,2.5311;6.6761,-1.1636,3.1758;6.1756,0.508,3.1807;7.1957,-0.0737,1.3381;6.7785,0.8161,0.844;8.646,0.3523,1.762;8.5737,1.7052,2.5059;9.5838,2.0922,2.6877;8.0813,1.6173,3.481;8.03,2.4554,1.9189;9.3629,-0.6459,2.6995;10.3183,-0.2183,3.0278;9.5853,-1.6033,2.2224;8.7779,-0.8538,3.602;9.4954,0.5651,0.4827;9.1202,1.4553,-0.0438;10.529,0.7933,0.7774;9.4619,-0.6177,-0.4907;9.923,-1.5035,-0.0345;10.0707,-0.3807,-1.3727;8.0242,-0.9089,-0.9326;7.9984,-1.7615,-1.6255;7.6444,-0.0467,-1.4889;7.0673,-1.195,0.2507;7.3259,-2.6242,0.7839;8.3638,-2.7676,1.0906;7.1194,-3.357,-0.0056;6.6907,-2.8794,1.6377;5.6036,-1.1649,-0.2737;5.4405,-2.0793,-0.8651;5.3971,-0.0312,-1.1232;4.5,-0.1478,-1.4825)|\",5.67085264442\r\n\"[H]C1=C([H])C2=C(C([H])=C1[H])C([H])([H])N1C(=O)[C@@]([H])(C([H])([H])C([H])([H])[H])C([H])([H])[C@]1(C([H])([H])[H])N2[H] 
|(8.4697,-5.2227,1.3904;7.686,-4.618,0.9418;7.3905,-3.3713,1.4818;7.9431,-3.0033,2.3441;6.3836,-2.5663,0.9211;5.6685,-3.0384,-0.1977;5.9783,-4.2962,-0.7217;5.421,-4.6533,-1.5858;6.9789,-5.0915,-0.1669;7.2047,-6.0648,-0.592;4.5701,-2.1999,-0.8212;3.6003,-2.7154,-0.7363;4.7426,-2.0356,-1.8902;4.4987,-0.8837,-0.2109;4.2916,0.2605,-0.9357;4.0794,0.2916,-2.1401;4.329,1.4518,0.0254;5.0246,2.1912,-0.3883;2.9423,2.1222,0.1607;2.2023,1.3629,0.4522;2.9947,2.8337,0.9961;2.4731,2.8475,-1.1041;1.4821,3.2898,-0.9514;2.4263,2.1639,-1.9555;3.1641,3.6579,-1.3675;4.8707,0.8545,1.3448;4.298,1.1924,2.2145;5.9151,1.1399,1.4967;4.8058,-0.6938,1.1982;3.7298,-1.3332,2.0968;3.9662,-1.1588,3.1533;3.6766,-2.415,1.9366;2.7448,-0.9006,1.891;6.1319,-1.2864,1.4385;6.4436,-1.1745,2.398)|\",5.41506562495\r\n\"[H]C1=NC(C(=O)/C([H])=C(\\[H])C(=O)OC([H])([H])C([H])([H])[H])=C([H])C([H])=C1[H] |(4.4075,-4.2684,-3.6414;5.3826,-3.9232,-3.3027;5.3789,-2.8968,-2.4486;6.5655,-2.4456,-2.0093;6.5628,-1.2879,-1.0466;7.6205,-0.8401,-0.619;5.2428,-0.7252,-0.6474;4.3451,-1.1636,-1.0679;5.1719,0.301,0.2116;6.0728,0.739,0.6307;3.8628,0.8722,0.6186;2.7761,0.4825,0.2381;4.0469,1.8947,1.485;2.8527,2.5515,1.9728;2.0763,1.7983,2.1281;3.1522,2.9712,2.9362;2.3854,3.6357,1.0128;1.5264,4.1646,1.4415;2.0776,3.1988,0.0587;3.1821,4.3641,0.83;7.7902,-2.9982,-2.4044;8.7087,-2.5812,-2.007;7.7791,-4.0666,-3.296;8.709,-4.5207,-3.6268;6.5507,-4.5418,-3.7564;6.4931,-5.3734,-4.4523)|\",4.321167945940001\r\n\"[H]C1=C([H])C2=C(C([H])=C1[H])N1C(=O)[C@]([H])(C([H])([H])C([H])([H])[H])C([H])([H])[C@@]1(C([H])([H])[H])N2[H] 
|(1.0981,2.4115,7.207;1.4291,2.0601,6.2337;1.643,2.9955,5.2086;1.4635,4.0543,5.3735;2.0714,2.524,3.9741;2.2907,1.1483,3.7797;2.0645,0.2162,4.7779;2.2166,-0.8399,4.5888;1.6262,0.6949,6.0231;1.443,-0.0085,6.8299;2.7069,0.9797,2.4395;2.2448,0.0475,1.5261;1.8395,-1.0658,1.8126;2.3506,0.6741,0.1272;1.4136,0.4462,-0.3939;3.5102,0.075,-0.7022;3.6245,0.6822,-1.6104;4.452,0.1758,-0.1456;3.2941,-1.3922,-1.087;4.1506,-1.7741,-1.654;2.4007,-1.504,-1.7133;3.1552,-2.0186,-0.2015;2.4709,2.1931,0.4155;1.4839,2.6634,0.3966;3.1084,2.7104,-0.3093;3.0257,2.299,1.859;4.5341,2.5863,1.9098;4.7397,3.5972,1.5359;5.0918,1.8781,1.2891;4.9017,2.5095,2.9384;2.2856,3.2155,2.7617;2.7043,4.1352,2.8526)|\",5.10485583538\r\n\"[H]/N=C(\\NN/C(=C(\\[H])C(=O)OC([H])([H])C([H])([H])[H])C([H])([H])C([H])([H])C([H])([H])[H])O[H] |(8.0123,-0.9793,-3.7286;7.7534,-1.9243,-4.0183;6.6488,-2.2418,-3.4718;5.9707,-1.3331,-2.5946;4.8982,-1.782,-2.1252;4.2338,-0.9374,-1.1954;4.9557,-0.2914,-0.2525;6.0378,-0.3553,-0.287;4.377,0.4994,0.8503;3.196,0.5921,1.1392;5.367,1.1281,1.5293;4.963,1.9572,2.6441;4.1306,1.4726,3.1601;5.8382,1.9712,3.2984;4.5874,3.3589,2.186;4.3616,3.9856,3.0564;3.7018,3.3304,1.5451;5.4114,3.8217,1.6333;2.7357,-1.0243,-1.3159;2.292,-0.0961,-0.9506;2.4843,-1.1404,-2.378;2.1517,-2.2157,-0.5253;2.6517,-3.1383,-0.8453;2.3845,-2.0714,0.5358;0.6383,-2.3456,-0.7155;0.2421,-3.1903,-0.1406;0.381,-2.5092,-1.7692;0.1189,-1.44,-0.3803;6.0611,-3.432,-3.6796;6.6485,-3.9104,-4.2924)|\",3.956535386270001\r\n\"[H]/C(C(=O)OC([H])([H])[H])=C(\\NNC(=O)OC([H])([H])[H])C([H])([H])C([H])([H])[H] 
|(2.6439,3.0744,-0.751;2.1783,2.1424,-0.4489;0.7039,2.1199,-0.4076;-0.0052,1.2241,0.0162;0.2111,3.2725,-0.9206;-1.2207,3.37,-0.9461;-1.439,4.3443,-1.384;-1.6289,3.3025,0.066;-1.6504,2.5696,-1.5547;2.9662,1.0939,-0.131;4.3806,1.2763,-0.0841;4.8654,1.9462,-1.0223;6.2742,2.2037,-0.8619;6.6959,3.3179,-0.6742;6.993,1.0949,-1.0562;8.4225,1.2846,-1.0277;8.847,0.3068,-1.2533;8.7389,1.6249,-0.0388;8.7222,2.0211,-1.7773;2.5529,-0.2708,0.3548;1.602,-0.539,-0.1081;3.3148,-0.9909,0.0319;2.4141,-0.3245,1.8882;2.1892,-1.347,2.2099;1.5983,0.3257,2.2165;3.3391,-0.0053,2.3796)|\",4.21232240574\r\n\"[H]C#CC(=O)[C@]([H])(OC([H])([H])C1=C([H])C([H])=C([H])C([H])=C1[H])C([H])([H])[H] |(-2.5834,4.7868,-0.2517;-1.5488,4.5372,-0.1692;-0.3733,4.2627,-0.0873;1.0461,3.9383,0.0134;1.8264,4.2269,-0.8748;1.4658,3.226,1.3122;1.0876,3.8343,2.151;2.8631,3.0667,1.3939;3.6069,4.2608,1.657;4.6504,3.9549,1.525;3.3927,5.0265,0.9021;3.3914,4.8127,3.0529;3.1223,6.1711,3.2512;3.041,6.8309,2.39;2.9618,6.6867,4.5397;2.7539,7.7446,4.6768;3.0592,5.8422,5.6455;2.9324,6.2393,6.6492;3.3188,4.4813,5.4575;3.3947,3.8192,6.3162;3.485,3.9716,4.171;3.6838,2.9135,4.0218;0.8415,1.8286,1.3832;-0.2485,1.8784,1.3072;1.1089,1.3607,2.3348;1.2274,1.2063,0.5696)|\",4.99056801817\r\n\"[H]C#C[C@]([H])(O[H])[C@]([H])(OC([H])([H])C1=C([H])C([H])=C([H])C([H])=C1[H])C([H])([H])[H] 
|(2.5753,4.9091,-1.2286;2.4722,4.0314,-0.6317;2.3461,3.0361,0.0405;2.1878,1.8132,0.8496;2.0773,0.9557,0.1741;1.0073,1.846,1.6375;1.222,2.4542,2.3682;3.4062,1.5398,1.7624;3.2189,0.5603,2.231;3.3436,2.5514,2.7721;3.9839,2.2194,4.0034;3.5972,1.2594,4.3808;5.0665,2.1031,3.8484;3.7201,3.322,4.9988;4.4675,4.5057,4.9554;5.247,4.6174,4.2056;4.2189,5.5358,5.8614;4.8075,6.4484,5.8195;3.2159,5.3933,6.8235;3.0232,6.1946,7.532;2.463,4.2193,6.8725;1.6812,4.1029,7.6183;2.7138,3.1913,5.9618;2.1261,2.2768,6.0013;4.7451,1.5284,1.0312;4.7304,0.7989,0.2129;5.5609,1.2521,1.7074;4.9541,2.5165,0.6119)|\",6.517126719475001\r\n\"[H]/N=C(/S[H])N([H])/N=C(\\[H])C([H])([H])C(=O)C1=C([H])C([H])=C(F)C([H])=C1[H] |(6.3716,2.9249,1.4635;7.0216,2.2254,1.8339;6.64,1.0658,1.4897;7.5478,-0.4071,1.9915;8.3193,0.2722,2.8603;5.557,0.7185,0.6804;5.2097,-0.2385,0.6986;4.7373,1.7103,0.237;3.6536,1.3747,-0.3519;3.3578,0.3282,-0.4824;2.7377,2.4315,-0.889;2.7587,2.431,-1.9908;3.1001,3.4161,-0.5731;1.2854,2.2201,-0.4468;0.9773,1.2315,0.2026;0.2618,3.2332,-0.8428;-1.0664,3.022,-0.4337;-1.2892,2.1343,0.1487;-2.068,3.925,-0.7661;-3.0979,3.7787,-0.4578;-1.7276,5.0485,-1.5161;-2.6907,5.9283,-1.8427;-0.4249,5.2918,-1.9395;-0.2076,6.1812,-2.5212;0.5682,4.3776,-1.597;1.585,4.5667,-1.9256)|\",4.508926502785\r\n\"[H]O[C@]([H])(C1=C([H])C([H])=C([H])C([H])=C1[H])[C@@]([H])(N(N=O)C([H])([H])[H])C([H])([H])C([H])([H])C([H])([H])[H] 
|(5.3658,3.0323,-1.9156;4.7227,2.8885,-1.2036;5.0204,3.8306,-0.169;4.9114,4.8539,-0.5443;6.4271,3.6453,0.3773;7.0106,2.3721,0.4238;6.4532,1.5223,0.0401;8.2972,2.2,0.9365;8.7392,1.2073,0.9628;9.0161,3.2994,1.41;10.0193,3.1664,1.8066;8.4426,4.5717,1.3655;8.9987,5.4331,1.7264;7.1571,4.7439,0.85;6.7139,5.7358,0.814;3.9134,3.5951,0.8965;3.9988,2.544,1.195;4.1779,4.3512,2.1397;4.1331,5.6795,2.2235;3.8941,6.2685,1.166;4.4967,3.6522,3.375;3.686,2.9679,3.6527;4.6216,4.4031,4.1565;5.4244,3.0805,3.2647;2.4799,3.8409,0.3552;1.8822,4.3128,1.1444;2.5203,4.5618,-0.4679;1.7783,2.556,-0.1097;2.3997,2.0622,-0.8642;1.6985,1.8593,0.7384;0.382,2.8259,-0.6773;-0.1007,1.8982,-1.0051;0.4297,3.4991,-1.5423;-0.2696,3.2966,0.0696)|\",5.15383632847\r\n\"[H]O[C@]([H])(C1=C([H])C([H])=C([H])C([H])=C1[H])[C@@]([H])(N(N=O)C([H])([H])[H])C([H])([H])C([H])([H])[H] |(4.0652,-1.0999,3.7614;3.1158,-0.8677,3.715;3.022,0.0028,2.613;3.6229,0.9128,2.7816;1.5671,0.4304,2.4579;1.2466,1.581,1.7254;2.0428,2.1837,1.2911;-0.0805,1.9752,1.5579;-0.31,2.8736,0.9909;-1.1095,1.2226,2.1283;-2.1446,1.529,2.0026;-0.7982,0.0822,2.8694;-1.5924,-0.5034,3.3255;0.5315,-0.3117,3.0357;0.7804,-1.1847,3.6282;3.5294,-0.6151,1.2647;3.0974,0.0255,0.4887;4.9955,-0.4551,1.0173;5.9999,-0.6808,1.8501;5.7204,-1.0367,3.0018;5.4176,-0.0641,-0.3245;5.0429,-0.7756,-1.0686;6.5078,-0.0579,-0.3442;5.0395,0.935,-0.5695;3.0468,-2.0522,0.9607;1.966,-2.0625,1.1455;3.1694,-2.2159,-0.1187;3.715,-3.2095,1.7121;3.3253,-4.1621,1.3352;3.5152,-3.1575,2.7833;4.8,-3.2157,1.5655)|\",5.232749345115\r\n\"[H]C1=C([H])C(C(=O)C([H])([H])/C([H])=N/OC([H])([H])C([H])([H])[H])=C([H])C([H])=C1Cl 
|(5.2625,-1.9464,6.4598;5.1012,-2.7953,5.8042;5.611,-2.7997,4.5071;6.1548,-1.9288,4.157;5.3932,-3.8964,3.6582;5.9198,-3.9711,2.2604;5.6265,-4.9023,1.5277;6.8428,-2.8576,1.7347;7.5684,-2.5468,2.492;7.3802,-3.2945,0.8863;6.064,-1.6575,1.2708;5.3865,-1.7606,0.4196;6.1974,-0.5422,1.8795;5.3961,0.4482,1.3109;5.587,1.6849,2.0105;5.3365,1.5472,3.0702;6.6415,1.9823,1.9456;4.6795,2.7118,1.3558;4.7892,3.6799,1.856;3.6311,2.404,1.4235;4.9339,2.8381,0.2985;4.6484,-4.9884,4.1337;4.4816,-5.8278,3.4667;4.1402,-4.9989,5.4265;3.5696,-5.8443,5.796;4.3742,-3.8973,6.2534;3.7363,-3.8993,7.8879)|\",4.71301189066\r\n\"[H]C1=C([H])C(/C(=N\\OC([H])([H])[H])C([H])([H])/C([H])=N/OC([H])([H])[H])=C([H])C([H])=C1F |(10.1089,-2.5808,1.5584;9.0336,-2.5165,1.6889;8.2506,-1.7202,0.8583;8.724,-1.1549,0.0685;6.8517,-1.6488,1.024;5.9791,-0.8527,0.1256;6.2711,0.1827,-0.589;7.5819,0.6419,-0.4556;7.7347,1.8108,-1.2562;7.5636,1.5881,-2.3156;8.7671,2.1345,-1.1044;7.0451,2.5993,-0.9344;4.507,-1.2388,-0.0264;4.1909,-0.9066,-1.0215;4.375,-2.3229,0.0285;3.6205,-0.5731,0.993;3.5952,0.5189,1.0327;2.9111,-1.2817,1.7834;2.1426,-0.4776,2.6342;1.3547,-1.3107,3.4775;0.773,-0.6262,4.0993;1.9858,-1.9437,4.113;0.6794,-1.9445,2.8908;6.2703,-2.4112,2.0579;5.1987,-2.3835,2.2199;7.0467,-3.1984,2.9039;6.6006,-3.7773,3.7058;8.4209,-3.2402,2.7049;9.1794,-4.0051,3.5154)|\",4.9960102951800005\r\n\"[H]O[C@@]1([H])/C([H])=C2/C(=O)O[C@@]([H])(C([H])([H])[H])C([H])([H])[C@@]2([H])O[C@]1([H])/C([H])=C(\\[H])C([H])([H])[H] 
|(6.7731,-1.6637,-0.862;6.1322,-2.1954,-1.3649;4.8724,-1.9929,-0.7156;4.1504,-2.5374,-1.3314;4.9458,-2.5724,0.6752;5.013,-3.6517,0.7811;5.0609,-1.7893,1.7572;5.2869,-2.4197,3.0962;5.1857,-3.6094,3.297;5.6755,-1.613,4.1206;5.7467,-0.1648,4.0137;4.7514,0.215,4.2856;6.7603,0.2825,5.0566;6.7941,1.3764,5.1069;7.7593,-0.0882,4.8046;6.4875,-0.104,6.0428;6.0843,0.2913,2.5972;7.0865,-0.0567,2.3169;6.0823,1.3856,2.5506;5.0539,-0.2838,1.6347;4.0561,0.0941,1.9173;5.3559,0.1442,0.3102;4.5198,-0.4978,-0.6841;4.8707,-0.0673,-1.6287;3.0665,-0.137,-0.4919;2.8711,0.9361,-0.5314;2.0453,-0.9748,-0.2898;2.2306,-2.0479,-0.2398;0.6123,-0.5605,-0.1213;-0.0215,-1.012,-0.8961;0.4934,0.5268,-0.1746;0.2167,-0.9031,0.844)|\",5.649083536380001\r\n\"[H]OC1=N[C@@]2([H])[C@@]([H])(O1)/C([H])=C(/[H])[C@@]2([H])/N=C(/O[H])OC(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] |(7.4375,-0.2751,2.8741;7.0064,0.4043,3.4202;5.772,0.5427,2.9246;5.2793,-0.1095,1.9437;3.9082,0.3811,1.7837;3.2065,-0.4427,1.9476;3.777,1.5149,2.8544;2.9901,1.3656,3.5996;5.0438,1.4587,3.5895;3.6403,2.7834,2.066;3.6111,3.7597,2.5398;3.5566,2.5419,0.7567;3.4541,3.298,-0.0165;3.6113,1.0735,0.3977;2.6136,0.7408,0.0679;4.5779,0.8514,-0.664;4.7307,-0.326,-1.1164;5.6597,-0.5615,-2.079;6.0959,0.3001,-2.2065;3.997,-1.3864,-0.7599;4.4337,-2.7943,-0.8827;3.3832,-3.5281,-0.0445;3.5659,-4.6074,-0.0713;2.3786,-3.337,-0.4355;3.42,-3.1941,0.9971;5.8292,-2.9695,-0.2757;6.0506,-4.038,-0.1736;5.8687,-2.5006,0.7122;6.598,-2.5194,-0.9078;4.3655,-3.2421,-2.3456;4.5865,-4.3139,-2.4078;5.0868,-2.7038,-2.9623;3.3615,-3.0774,-2.7515)|\",6.549780381535\r\n\"[H]C1=C([H])C([H])=C(C2=NC([H])([H])[C@@]([H])(C([H])([H])C3=C([H])C([H])=C([H])C([H])=C3[H])O2)C([H])=C1[H] 
|(3.8771,-8.1093,2.5554;3.9179,-7.2104,1.9458;4.1423,-7.308,0.5689;4.2762,-8.2823,0.1067;4.1957,-6.1589,-0.2142;4.3692,-6.2147,-1.2837;4.0237,-4.8976,0.3764;4.0822,-3.6917,-0.466;4.293,-3.668,-1.7262;4.2651,-2.2599,-2.1295;5.2291,-1.9769,-2.5696;3.5001,-2.1061,-2.9023;3.949,-1.4615,-0.8327;4.7617,-0.7865,-0.5451;2.621,-0.6948,-0.8725;2.6712,-0.003,-1.724;1.8225,-1.4141,-1.0959;2.3064,0.0725,0.394;2.651,1.4252,0.5134;3.126,1.9323,-0.3242;2.3871,2.1324,1.6875;2.6594,3.1823,1.7583;1.7693,1.4942,2.7636;1.5588,2.0431,3.6776;1.4182,0.1468,2.6561;0.9328,-0.3573,3.4878;1.685,-0.5566,1.4816;1.4121,-1.6056,1.4045;3.8922,-2.5001,0.1821;3.7997,-4.8019,1.7569;3.6712,-3.8249,2.2101;3.7475,-5.9573,2.5363;3.5742,-5.8777,3.606)|\",5.33887374681\r\n\"[H]C1=C([H])C(C(=O)C([H])([H])/C([H])=N/OC([H])([H])C([H])([H])[H])=C([H])C([H])=C1F |(2.9177,5.2472,0.8814;3.8624,4.8233,0.5572;4.1566,4.6231,-0.7853;3.4399,4.886,-1.5561;5.3909,4.078,-1.1817;5.6374,3.9066,-2.6443;4.7652,4.1448,-3.4654;7.0141,3.4157,-3.1277;7.0781,3.7063,-4.1818;7.8279,3.8974,-2.5777;7.1525,1.923,-3.0106;6.4866,1.2798,-3.5911;8.039,1.4286,-2.2347;8.0163,0.0329,-2.2602;9.0352,-0.4806,-1.3925;10.0152,-0.1125,-1.7223;8.8612,-0.1178,-0.3712;8.9586,-1.9958,-1.4631;9.7201,-2.4389,-0.8124;9.1308,-2.3467,-2.4856;7.9758,-2.3514,-1.1376;6.3303,3.7268,-0.1973;7.2781,3.2781,-0.4747;6.0458,3.9169,1.1531;6.7547,3.6467,1.9285;4.8175,4.4651,1.5062;4.5408,4.655,2.8096)|\",4.846347677405\r\n\"[H]C#CC(=O)C([H])([H])[C@@]1([H])N(C(=O)OC(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C([H])([H])C([H])([H])C1([H])[H] 
|(8.0925,-3.5447,1.1624;7.1545,-3.1939,0.7931;6.0955,-2.7843,0.3764;4.8199,-2.3011,-0.1408;4.1334,-1.525,0.504;4.4313,-2.8254,-1.5106;5.2128,-2.5266,-2.221;4.4488,-3.9245,-1.483;3.0775,-2.2933,-2.0062;3.1013,-1.2025,-1.938;2.8364,-2.6908,-3.4058;3.4895,-2.0622,-4.4239;4.3828,-1.2404,-4.253;3.0152,-2.4863,-5.6235;3.5565,-1.9535,-6.8818;3.2843,-0.4473,-6.9729;3.5925,-0.0763,-7.9569;3.8336,0.0973,-6.2034;2.2136,-0.2472,-6.8539;5.0492,-2.2794,-7.0024;5.4118,-1.9743,-7.9905;5.2143,-3.3578,-6.8985;5.626,-1.7573,-6.2381;2.7498,-2.7126,-7.939;3.0573,-2.4003,-8.9426;1.6791,-2.5134,-7.8261;2.9107,-3.7919,-7.8477;1.6904,-3.5941,-3.569;0.8248,-3.0519,-3.975;1.9277,-4.4037,-4.2644;1.4282,-4.078,-2.1368;0.3854,-4.3699,-1.9812;2.053,-4.9519,-1.9157;1.8526,-2.878,-1.2718;2.08,-3.1441,-0.2373;1.0516,-2.1301,-1.2466)|\",4.8082517383350005\r\n\"[H]OC1=C(C(=O)/C([H])=C(\\[H])C(=O)OC([H])([H])C([H])([H])[H])N=C([H])C([H])=C1[H] |(10.5212,-1.0957,-1.5814;9.6412,-0.9449,-1.1973;9.6092,-1.4854,0.0414;8.4272,-1.4203,0.8223;7.1734,-0.7506,0.3337;7.1274,-0.1745,-0.7449;5.9797,-0.8201,1.2323;6.0584,-1.358,2.1691;4.8341,-0.2226,0.8782;4.7566,0.3212,-0.0585;3.6377,-0.28,1.7543;3.5696,-0.8642,2.8182;2.6183,0.422,1.2043;1.3816,0.4611,1.9534;1.6168,0.5107,3.0194;0.9056,1.3939,1.6411;0.508,-0.7447,1.6391;-0.4503,-0.6556,2.1637;0.9952,-1.6671,1.9678;0.308,-0.812,0.5648;8.3765,-1.9579,2.0533;9.4446,-2.5603,2.5653;9.3385,-2.9729,3.5661;10.651,-2.6694,1.8665;11.5066,-3.1688,2.3111;10.7283,-2.1264,0.5923;11.6486,-2.1905,0.0138)|\",4.25586062182\r\n\"[H]OC(=O)C([H])([H])C([H])([H])C([H])([H])C([H])([H])/C([H])=C(\\[H])C([H])([H])C#CC([H])([H])[H] 
|(13.2635,-3.8031,3.3265;12.9404,-4.5818,3.8096;11.5913,-4.703,3.6475;11.0031,-5.6135,4.1726;10.9421,-3.6297,2.7808;11.1811,-2.6458,3.212;11.421,-3.6487,1.7902;9.4275,-3.8029,2.6456;9.2202,-4.8003,2.2399;8.978,-3.7868,3.6457;8.7853,-2.7286,1.7602;8.9947,-1.7326,2.1777;9.2413,-2.7431,0.7607;7.2614,-2.9064,1.6126;7.0484,-3.8886,1.171;6.8113,-2.9098,2.617;6.63,-1.8242,0.7796;6.7183,-0.8079,1.1713;6.0115,-2.0109,-0.3879;5.927,-3.0159,-0.8022;5.3902,-0.9051,-1.2157;4.3006,-1.0566,-1.2709;5.5365,0.0582,-0.7083;5.9321,-0.8384,-2.5792;6.3778,-0.7908,-3.7027;6.9117,-0.728,-5.0615;6.2032,-0.2487,-5.7483;7.124,-1.7302,-5.4539;7.8461,-0.1544,-5.0983)|\",6.838221063065\r\n\"[H]O[C@]([H])(C([H])([H])C([H])([H])[H])[C@]1([H])C([H])([H])[C@@]2([H])O[C@@]2([H])[C@]1([H])C#CC([H])([H])[H] |(5.7111,2.6911,2.2506;4.8673,2.2224,2.1475;5.1063,0.8342,2.4086;4.1029,0.3959,2.4278;5.7528,0.6162,3.7848;5.9009,-0.4621,3.934;6.7592,1.0625,3.802;4.9192,1.1911,4.9342;4.7584,2.2651,4.8001;5.4126,1.032,5.8993;3.9323,0.7148,4.9789;5.8858,0.1642,1.2562;6.1015,-0.8648,1.5671;7.2037,0.9244,0.9135;8.0328,0.2345,0.7217;7.5254,1.5744,1.7366;6.9061,1.7387,-0.3364;7.2939,2.7509,-0.4469;6.8936,0.9542,-1.5424;5.6398,1.2922,-0.926;4.9748,1.9302,-1.5034;5.0729,0.1265,-0.1014;5.3658,-0.7897,-0.6325;3.6162,0.134,0.0208;2.4079,0.1195,0.0827;0.9494,0.1124,0.1683;0.5846,-0.7845,0.6838;0.4919,0.1286,-0.8285;0.5785,0.986,0.718)|\",7.95660898862\r\n\"[H]O[C@]([H])(C([H])([H])C([H])([H])[H])[C@]1([H])C([H])([H])C([H])=C([H])[C@]1([H])C#CC([H])([H])[H] 
|(3.3432,1.8953,-1.424;3.9834,2.4441,-1.9099;5.0581,1.5798,-2.2779;5.7644,2.2392,-2.8023;4.6029,0.4985,-3.2751;5.4581,-0.1364,-3.5378;3.8705,-0.1592,-2.7906;3.9929,1.1017,-4.5446;3.6214,0.3185,-5.2154;4.7333,1.692,-5.0994;3.1612,1.767,-4.2947;5.8117,1.0747,-1.0226;6.0235,1.9745,-0.4369;7.1291,0.3304,-1.3908;7.4477,0.5407,-2.4208;7.961,0.6479,-0.7447;6.8078,-1.1275,-1.1661;7.4565,-1.933,-1.4996;5.6709,-1.2927,-0.4902;5.238,-2.2384,-0.1789;5.0467,0.0422,-0.1056;5.3347,0.2602,0.9367;3.577,0.0575,-0.1316;2.3661,0.0251,-0.0863;0.9059,-0.0144,-0.0269;0.516,0.684,0.7234;0.5482,-1.0166,0.2383;0.4595,0.2513,-0.9928)|\",6.94978774177\r\n\"[H]C1=C([H])[C@]([H])(C#CC([H])([H])[H])[C@]([H])(C(=O)N(OC([H])([H])[H])C([H])([H])[H])C1([H])[H] |(7.2935,-2.5868,-0.0789;6.8412,-1.6291,-0.322;6.0534,-0.9287,0.4937;5.7412,-1.2241,1.491;5.5275,0.3446,-0.1566;5.7089,1.2254,0.4722;4.0824,0.2634,-0.4095;2.8916,0.1988,-0.614;1.4542,0.1216,-0.863;1.202,0.5278,-1.8502;0.8868,0.6899,-0.1155;1.0973,-0.9152,-0.8338;6.4317,0.4442,-1.4543;7.2479,1.1264,-1.2027;5.6896,0.9749,-2.6752;5.2702,0.2594,-3.5734;5.547,2.3542,-2.7614;5.7298,3.0707,-1.5556;6.844,3.9556,-1.68;6.8846,4.5014,-0.7336;6.7046,4.6651,-2.5051;7.7791,3.403,-1.8325;4.4989,2.9322,-3.5891;4.398,2.3054,-4.4746;4.7818,3.946,-3.8861;3.5458,2.9624,-3.047;7.0048,-0.9759,-1.6715;6.4369,-1.5053,-2.4466;8.0484,-0.9414,-2.0091)|\",6.7783560159550005\r\n\"[H]OC([H])([H])[C@@]1([H])C([H])=C([H])C([H])([H])[C@@]1([H])C(=O)N(OC([H])([H])[H])C([H])([H])[H] 
|(6.9852,-3.151,-2.1789;6.0535,-3.2564,-1.9327;5.6333,-2.0299,-1.3444;5.8682,-1.1734,-1.9927;4.5433,-2.1006,-1.2672;6.2441,-1.8202,0.0578;5.8642,-2.6296,0.6915;7.7582,-1.8045,0.0141;8.3388,-2.6942,-0.2167;8.269,-0.5931,0.2457;9.327,-0.3451,0.2247;7.1998,0.4362,0.516;7.0795,1.1288,-0.3263;7.4127,1.0499,1.4008;5.9218,-0.4223,0.7094;5.7737,-0.6012,1.7773;4.6776,0.2589,0.148;4.7156,1.1042,-0.738;3.4635,-0.0874,0.7247;3.4378,-1.3229,1.4103;3.1225,-1.1185,2.7892;2.1462,-0.6336,2.9092;3.891,-0.5178,3.2899;3.0882,-2.1198,3.2257;2.2101,0.1938,0.04;2.3368,1.1286,-0.5051;1.4063,0.3008,0.7738;1.9548,-0.61,-0.6618)|\",6.6341356751900005\r\n\"[H]C1=C([H])C([H])=C(/C([H])=C(\\[H])[C@@]2([H])C([H])([H])OC(=O)C2([H])[H])C([H])=C1[H] |(-0.8592,-4.181,1.9974;-0.1502,-3.4887,1.5521;-0.6023,-2.354,0.8785;-1.6676,-2.1558,0.7954;0.3124,-1.47,0.3083;-0.0469,-0.5866,-0.2149;1.6974,-1.6952,0.397;2.6108,-0.7229,-0.221;2.1139,0.1073,-0.7251;3.9526,-0.7446,-0.2323;4.4947,-1.5537,0.259;4.8088,0.3012,-0.8794;4.1704,1.0301,-1.3941;5.7106,1.0681,0.1168;6.0863,0.4107,0.9117;5.2169,1.9281,0.5744;6.8283,1.5517,-0.6501;6.9756,0.8263,-1.7998;7.8665,1.0301,-2.5805;5.8656,-0.2179,-1.8718;5.5112,-0.3398,-2.8972;6.2807,-1.182,-1.5469;2.1357,-2.8457,1.0794;3.1983,-3.0534,1.1659;1.2239,-3.7292,1.6488;1.5856,-4.6112,2.1707)|\",5.0395485112600005\r\n\"[H]O[C@@]([H])(/C([H])=C([H])/C([H])=C(\\[H])C([H])([H])[H])[C@]1([H])C(=O)O[C@]([H])(C([H])([H])[H])C([H])([H])[C@@]1([H])O[H] 
|(6.0742,-3.4967,-1.8199;6.2183,-2.5678,-2.0903;6.1007,-1.8158,-0.8978;6.4896,-0.8154,-1.1297;4.6727,-1.6739,-0.4182;4.5233,-1.1105,0.5044;3.6017,-2.154,-1.0692;3.7629,-2.6874,-2.0043;2.2281,-2.0027,-0.6169;2.0736,-1.4623,0.3192;1.1592,-2.4837,-1.2735;1.323,-3.0236,-2.2074;-0.2641,-2.3416,-0.8258;-0.8683,-1.8141,-1.577;-0.3368,-1.7879,0.1166;-0.7375,-3.3227,-0.6804;7.0561,-2.3862,0.1991;8.0203,-2.539,-0.3135;6.6369,-3.8017,0.6094;6.1963,-4.5911,-0.2039;6.8242,-4.2333,1.873;7.2242,-3.3676,2.9726;7.8652,-4.014,3.5799;5.9805,-3.0127,3.7842;6.2663,-2.4684,4.6912;5.4548,-3.9255,4.0784;5.2817,-2.3919,3.2145;8.039,-2.1678,2.4864;9.0132,-2.5176,2.1147;8.2334,-1.4935,3.3272;7.3177,-1.4206,1.3661;6.3704,-1.0276,1.7468;8.0248,-0.2592,0.9469;8.8712,-0.547,0.5665)|\",5.281729838205\r\n\"[H]C1([H])C2=C(C([H])([H])C([H])([H])C1([H])[H])C([H])([H])[C@]1([H])OC(=O)C([H])([H])[C@]1([H])C2([H])[H] |(-1.2829,-1.8372,-1.2696;-1.0269,-1.5023,-0.2509;-0.7266,-2.415,0.2847;0.1466,-0.5506,-0.3197;0.0319,0.7681,-0.089;-1.289,1.4398,0.2167;-1.3059,1.7422,1.2768;-1.3608,2.3783,-0.3519;-2.4961,0.5441,-0.0942;-3.406,0.9711,0.3446;-2.6538,0.513,-1.1814;-2.2602,-0.8806,0.4198;-3.1419,-1.5086,0.2444;-2.1049,-0.8498,1.5074;1.2501,1.6626,-0.2033;1.0818,2.6061,0.3291;1.4029,1.938,-1.2569;2.5629,1.0158,0.2836;2.8228,1.3963,1.2781;3.6361,1.4188,-0.6075;4.4938,0.3913,-0.8645;5.4607,0.5183,-1.5698;4.0222,-0.8521,-0.1195;4.1316,-1.7362,-0.7537;4.678,-0.9915,0.7491;2.5789,-0.5335,0.2898;2.3574,-0.8913,1.3001;1.5123,-1.1098,-0.6598;1.7818,-0.8665,-1.7002;1.5156,-2.2045,-0.591)|\",6.642299090705\r\n\"[H]C#CC1([C@]([H])(OC([H])([H])C2=C([H])C([H])=C([H])C([H])=C2[H])C([H])([H])[H])OC([H])([H])C([H])([H])O1 
|(2.7568,-4.2823,-0.8589;2.9807,-3.2498,-0.7116;3.2346,-2.0819,-0.5396;3.5357,-0.643,-0.357;3.7869,-0.2627,1.1267;4.1564,0.7739,1.0972;4.7338,-1.1137,1.7441;6.108,-0.873,1.4325;6.2889,-1.0528,0.3658;6.3623,0.1795,1.6346;6.9622,-1.7905,2.2746;6.6574,-3.1555,2.3629;5.7745,-3.5329,1.8555;7.4634,-4.0125,3.1105;7.2141,-5.0687,3.1749;8.5886,-3.5186,3.7765;9.2165,-4.1882,4.3586;8.8981,-2.1611,3.6942;9.7643,-1.7651,4.218;8.0845,-1.3029,2.9509;8.3215,-0.2422,2.9016;2.5096,-0.3521,1.9549;1.7318,0.2935,1.5404;2.7265,-0.0435,2.9815;2.1437,-1.3834,1.9743;4.6644,-0.271,-1.1412;4.1806,-0.0183,-2.4617;4.7448,0.8286,-2.8611;4.3363,-0.8915,-3.1068;2.6725,0.2808,-2.2602;2.0442,-0.4395,-2.7981;2.3844,1.2948,-2.5502;2.4761,0.163,-0.8501)|\",6.49263647293\r\n\"[H]C1=C([H])C([C@]([H])(/C([H])=C(\\[H])C(=O)OC([H])([H])[H])C([H])([H])C([H])([H])[H])=C([H])C([H])=C1F |(8.405,-0.1185,0.2465;7.4827,-0.5564,0.6143;6.2376,0.0005,0.3184;6.1927,0.8946,-0.299;5.0517,-0.5699,0.7969;3.6888,0.0341,0.4513;3.8898,1.0037,-0.0319;2.9344,0.3478,1.7219;3.438,1.0196,2.4165;1.7332,-0.1184,2.0881;1.1492,-0.7957,1.4741;1.1348,0.2877,3.3803;1.6286,1.0429,4.1953;-0.0746,-0.3035,3.5512;-0.7444,0.0244,4.7768;-1.688,-0.5221,4.7516;-0.9253,1.1009,4.8409;-0.1456,-0.2838,5.6384;2.8954,-0.8266,-0.5567;2.7084,-1.8154,-0.1179;1.9132,-0.3614,-0.7087;3.5898,-0.9898,-1.9119;2.9719,-1.5849,-2.593;4.5585,-1.4902,-1.8125;3.7645,-0.0166,-2.3877;5.1424,-1.7195,1.5958;4.2364,-2.1692,1.9932;6.3755,-2.2925,1.9018;6.4534,-3.1805,2.5207;7.5295,-1.6986,1.4029;8.7283,-2.2455,1.6967)|\",5.43683473299\r\n\"[H]C1=C([H])C(S(=O)(=O)C([H])([H])[H])=C([H])C([H])=C1[C@@]1([H])O[C@]1(C([H])([H])[H])C([H])([H])C([H])([H])[H] 
|(2.4727,-1.9997,-3.8152;2.6126,-3.0348,-3.5127;2.663,-4.0336,-4.4809;2.5842,-3.7971,-5.537;2.8365,-5.3596,-4.0777;2.88,-6.6565,-5.3236;3.2471,-6.0393,-6.6085;3.6456,-7.7907,-4.7821;1.1525,-7.1864,-5.4526;0.5375,-6.3421,-5.7701;0.8218,-7.5728,-4.4865;1.1311,-7.9777,-6.2061;2.975,-5.6938,-2.7298;3.1323,-6.729,-2.4439;2.9242,-4.6845,-1.7692;3.0141,-4.9192,-0.7138;2.7469,-3.3486,-2.152;2.6777,-2.2513,-1.1423;1.8953,-1.5156,-1.3495;2.7915,-2.5979,0.2423;3.8253,-1.7624,-0.3275;5.1914,-2.405,-0.422;5.8445,-1.838,-1.0944;5.1282,-3.4295,-0.7934;5.6632,-2.4257,0.5676;3.7796,-0.3206,0.1591;2.7295,-0.0566,0.33;4.2755,-0.2655,1.1383;4.4207,0.6861,-0.8056;4.3128,1.7069,-0.4242;3.9455,0.6465,-1.7931;5.491,0.4979,-0.9431)|\",5.85044778575\r\n\"[H]OC([H])([H])/C(=C(\\[H])C1=C([H])C([H])=C(SC([H])([H])[H])C([H])=C1[H])C([H])([H])C([H])([H])[H] |(4.7416,-2.257,-1.8948;5.2769,-1.91,-1.163;4.4231,-1.8728,-0.0126;4.154,-2.8928,0.3023;5.0459,-1.4351,0.7764;3.1722,-1.0445,-0.2211;1.9813,-1.6827,-0.222;2.0122,-2.7552,-0.0202;0.6134,-1.1669,-0.3908;-0.4404,-1.8504,0.2407;-0.2147,-2.73,0.8396;-1.764,-1.4292,0.1388;-2.5347,-1.9893,0.6569;-2.0859,-0.3014,-0.6272;-3.7324,0.3449,-0.8412;-4.7744,-0.8143,0.1031;-5.7991,-0.4538,-0.0167;-4.5215,-0.8096,1.1669;-4.7085,-1.8306,-0.2948;-1.0499,0.3764,-1.2902;-1.2812,1.2377,-1.912;0.2679,-0.0485,-1.1752;1.0328,0.4777,-1.7344;3.3767,0.4491,-0.3699;2.454,0.978,-0.1091;4.1311,0.76,0.367;3.8593,0.9028,-1.7624;4.0088,1.9885,-1.7818;3.1293,0.6501,-2.5411;4.803,0.4152,-2.0188)|\",4.655867982055001\r\n\"[H]O/C(=N/[C@]([H])(C1=C([H])C([H])=C(C#N)C([H])=C1[H])C([H])([H])F)C([H])([H])OC([H])([H])[H] 
|(5.2025,1.8402,2.3842;5.5567,1.8218,1.4732;4.838,0.9072,0.7944;5.1514,0.6685,-0.4128;4.3921,-0.2662,-1.2159;3.8595,-1.0312,-0.6259;5.2976,-0.9989,-2.202;6.5475,-0.4859,-2.5674;6.8931,0.4275,-2.0959;7.3388,-1.1493,-3.5011;8.3087,-0.7501,-3.7807;6.8871,-2.3437,-4.0867;7.7011,-3.0311,-5.0464;8.3601,-3.5896,-5.826;5.6351,-2.8658,-3.7218;5.2874,-3.793,-4.166;4.8552,-2.1931,-2.788;3.8913,-2.6096,-2.5033;3.3151,0.5045,-1.9975;2.7784,-0.1546,-2.6888;3.7755,1.3281,-2.5524;2.3863,1.0411,-1.1042;3.6966,0.2858,1.6036;3.7223,-0.8112,1.5161;2.7401,0.633,1.1912;3.8513,0.6814,2.955;2.7552,0.3326,3.784;2.982,0.7079,4.7841;2.6227,-0.7578,3.8291;1.822,0.7901,3.425)|\",5.591939627775\r\n\"[H]C1=C([H])C([H])=C([C@@]([H])(N2C([H])([H])[C@@]2([H])C2=C([H])C([H])=C([H])C([H])=C2[H])C([H])([H])[H])C([H])=C1[H] |(-3.4376,-0.9926,1.8092;-2.4074,-1.1724,1.5133;-2.1123,-2.1179,0.5271;-2.9136,-2.678,0.0518;-0.7912,-2.346,0.1519;-0.5643,-3.0728,-0.6236;0.2642,-1.6411,0.7509;1.691,-1.9416,0.2899;1.812,-3.0339,0.3132;1.7728,-1.6059,-1.1509;3.0439,-1.4868,-1.841;3.9765,-1.5436,-1.2794;3.0812,-1.939,-2.831;2.1722,-0.2806,-1.6053;2.5667,0.4122,-0.8605;1.3813,0.3828,-2.6867;0.4314,-0.3206,-3.4386;0.2423,-1.3617,-3.1958;-0.2722,0.3155,-4.4608;-1.0101,-0.2414,-5.0329;-0.0385,1.6626,-4.7469;-0.5886,2.1571,-5.5431;0.903,2.3718,-3.9989;1.088,3.4223,-4.2084;1.6054,1.7349,-2.9758;2.3358,2.2935,-2.3936;2.78,-1.3602,1.2049;2.6369,-1.7261,2.2269;2.7671,-0.266,1.2402;3.7762,-1.6775,0.8819;-0.0439,-0.6972,1.7353;0.7469,-0.135,2.2219;-1.3697,-0.4634,2.1143;-1.5858,0.2755,2.8818)|\",6.05181203512\r\n\"[H]C([H])=C(/C([H])=C(\\[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])[S@](=O)C1=C([H])C([H])=C([H])C([H])=C1[H] 
|(6.2804,-3.3307,3.1762;6.7766,-4.0996,2.5906;7.4994,-4.7457,3.0817;6.4966,-4.274,1.2975;5.56,-3.5229,0.4573;4.984,-4.1041,-0.2653;5.3827,-2.1935,0.5083;5.9902,-1.6123,1.204;4.4091,-1.4156,-0.3307;4.9712,-0.7128,-0.9643;3.8812,-2.0925,-1.0135;3.3974,-0.5955,0.5016;3.9555,0.0946,1.1498;2.815,0.0342,-0.1854;2.436,-1.4282,1.3654;3.0154,-2.0561,2.0553;1.8538,-0.7407,1.9933;1.4719,-2.31,0.5635;0.7733,-2.8317,1.2271;2.0031,-3.0738,-0.0156;0.8794,-1.7117,-0.1402;7.2924,-5.7332,0.4924;8.1478,-6.408,1.5441;8.4339,-4.8124,-0.6008;8.0011,-4.3304,-1.8364;6.9712,-4.468,-2.1549;8.9116,-3.6704,-2.664;8.5851,-3.2869,-3.6268;10.2397,-3.5144,-2.2604;10.9471,-3.007,-2.9106;10.6631,-4.0215,-1.0291;11.6995,-3.9091,-0.7219;9.7594,-4.6783,-0.1932;10.0601,-5.1007,0.7612),wU:21.21|\",5.16472088249\r\n\"[H]C([H])([H])OC(=O)[C@]12C(=O)C([H])([H])C([H])([H])[C@@]1([H])[C@]([H])(C([H])([H])[H])C([H])([H])C([H])([H])C2([H])[H] |(6.8893,1.0039,-4.0249;5.882,0.59,-4.0743;5.1599,1.3609,-4.3553;5.8359,-0.2249,-4.8011;5.6055,0.097,-2.752;4.3917,-0.4609,-2.5844;3.5599,-0.5195,-3.4681;4.2179,-1.067,-1.1892;4.6795,-2.555,-1.3119;5.8176,-2.9016,-1.5274;3.4765,-3.4595,-1.0835;3.0475,-3.6783,-2.0728;3.776,-4.4095,-0.6326;2.5149,-2.6,-0.2552;2.7898,-2.657,0.8069;1.4745,-2.9282,-0.3373;2.7152,-1.1542,-0.7796;2.1276,-1.0389,-1.6981;2.2849,-0.0508,0.217;2.5703,-0.3715,1.2313;0.7648,0.1548,0.2035;0.4588,0.8807,0.9661;0.2277,-0.7806,0.4003;0.4309,0.5322,-0.7712;3.0434,1.2526,-0.0905;2.8281,1.538,-1.1302;2.6601,2.07,0.5328;4.5699,1.0993,0.1152;4.8434,1.4059,1.1323;5.1034,1.7715,-0.5656;5.0515,-0.3514,-0.1055;4.9382,-0.9215,0.8257;6.1111,-0.3876,-0.3667)|\",5.776977046115\r\n\"[H]C1=C(C([H])([H])[H])[C@]2(C(=O)OC([H])([H])[H])C(=O)C([H])([H])C([H])([H])[C@@]([H])(C([H])([H])[H])[C@]2([H])C1([H])[H] 
|(4.2738,2.6019,-2.2833;4.3926,1.7823,-1.5788;5.1073,0.6809,-1.837;5.8205,0.3881,-3.1258;5.7024,1.2294,-3.8173;5.4321,-0.517,-3.6008;6.8889,0.2205,-2.9593;5.1079,-0.2255,-0.5962;6.4206,-0.1405,0.2157;6.5033,-0.3743,1.405;7.4833,0.1801,-0.5423;8.7456,0.2151,0.1478;9.4823,0.4842,-0.6093;8.9729,-0.7642,0.5767;8.7227,0.9597,0.9472;4.9433,-1.747,-0.8789;5.3532,-2.2713,-1.8945;4.2736,-2.5095,0.2425;4.2638,-3.5728,-0.0098;4.9046,-2.3717,1.1298;2.8544,-1.9609,0.5369;2.6674,-2.0213,1.6163;2.1047,-2.5998,0.0563;2.6411,-0.5052,0.063;2.4461,-0.5237,-1.0208;1.4009,0.0728,0.76;0.5557,-0.6198,0.6691;1.083,1.0293,0.3347;1.5905,0.2259,1.8303;3.9213,0.3367,0.2802;4.1916,0.2881,1.3397;3.7898,1.806,-0.2;2.7533,2.1597,-0.1986;4.3428,2.4902,0.4622)|\",5.200095683054999\r\n\"[H]C([H])=C1C([H])([H])C([H])([H])[C@]2([H])[C@]1(C(=O)OC([H])([H])[H])C(=O)C([H])([H])C([H])([H])[C@@]2([H])C([H])([H])[H] |(5.2074,4.3679,2.8814;5.1513,3.2978,2.6931;5.7988,2.657,3.2807;4.2972,2.8025,1.7983;3.3194,3.6035,0.938;2.5701,4.0817,1.5831;3.8213,4.4083,0.3897;2.641,2.5796,-0.0041;3.197,2.4831,-0.9397;1.6096,2.8576,-0.2427;2.7218,1.2692,0.7971;2.0235,1.3831,1.6455;4.14,1.3199,1.4492;5.2715,1.0044,0.4564;5.2399,1.1477,-0.7473;6.3723,0.5845,1.1191;7.5421,0.3648,0.3109;8.3017,-0.0128,0.9954;7.8677,1.3032,-0.1456;7.3342,-0.3624,-0.478;4.1444,0.2787,2.5846;4.3541,0.5368,3.7498;3.7804,-1.1232,2.0953;4.6183,-1.4851,1.4834;3.7078,-1.7774,2.9692;2.4781,-1.1626,1.2597;1.6233,-1.0235,1.9365;2.3654,-2.1626,0.8223;2.4027,-0.0868,0.1526;3.1627,-0.3008,-0.6096;1.0271,-0.1024,-0.5266;0.8019,-1.0966,-0.9301;0.9826,0.6117,-1.3555;0.2307,0.1558,0.1839)|\",5.937524217910001\r\n\"[H]C([H])=C([H])C([H])([H])C([H])([H])[C@]1([H])[C@]([H])(C(=O)OC([H])([H])[H])C(=O)C([H])([H])C([H])([H])[C@@]1([H])C([H])([H])[H] 
|(7.573,-2.436,-0.0701;6.528,-2.1439,-0.0071;5.8599,-2.5674,-0.755;6.0831,-1.3179,0.9409;6.7955,-0.9307,1.671;4.6534,-0.868,1.0889;4.2955,-1.1007,2.1004;4.0239,-1.4446,0.3994;4.4257,0.6346,0.7923;3.3524,0.8522,0.8925;4.6793,0.8054,-0.2593;5.2242,1.6366,1.6532;6.2948,1.4448,1.4824;5.0054,1.3906,3.1859;5.3939,0.4095,3.4636;3.5346,1.4378,3.58;2.8041,2.4055,3.4928;3.1201,0.2485,4.0656;1.7445,0.1994,4.4857;1.5861,-0.8181,4.8432;1.0796,0.4229,3.6476;1.5654,0.9218,5.286;5.8002,2.412,4.0263;6.5488,2.0371,4.9065;5.613,3.8608,3.6411;4.5806,4.1435,3.8835;6.295,4.4727,4.2384;5.8376,4.0525,2.128;6.898,3.8784,1.8928;5.6294,5.0964,1.8643;4.9686,3.1173,1.2645;3.9153,3.3446,1.4727;5.2411,3.3941,-0.2232;5.1595,4.4677,-0.4292;4.5314,2.8823,-0.8798;6.2531,3.0778,-0.5082)|\",5.80146729266\r\n\"[H]OC(=O)[C@@]([H])(O[H])C([H])([H])C([H])([H])C1=C([H])C([H])=C2OC([H])([H])OC2=C1[H] |(2.8477,-2.3015,-3.0765;2.27,-1.6287,-3.5037;1.2028,-1.5178,-2.6944;0.2725,-0.7806,-2.9228;1.2675,-2.4276,-1.4489;0.5246,-3.2186,-1.6077;2.537,-3.0947,-1.3917;3.1113,-2.5903,-0.789;0.8908,-1.6622,-0.1726;-0.1554,-1.3584,-0.2804;0.9413,-2.3619,0.6706;1.7295,-0.399,0.1258;1.3043,0.0742,1.0198;1.5864,0.3136,-0.6938;3.2164,-0.6281,0.3495;3.6729,-1.2377,1.5269;2.9506,-1.5408,2.2806;5.0392,-1.458,1.7785;5.3819,-1.9261,2.6949;5.9281,-1.0427,0.8072;7.2973,-1.1362,0.8033;7.7318,-0.5472,-0.43;8.2808,-1.2929,-1.0164;8.3627,0.3239,-0.2176;6.5702,-0.1278,-1.1586;5.492,-0.4379,-0.3692;4.1533,-0.2198,-0.6308;3.8335,0.2526,-1.5543)|\",5.510305472624999\r\n\"[H]C([H])([H])/C1=C(\\C([H])([H])[H])C([H])([H])[C@@]2(Cl)C(=O)C([H])([H])C([H])([H])C([H])([H])[C@]2([H])C1([H])[H] 
|(-0.0475,-0.2267,2.1065;0.271,0.7692,1.7669;-0.6166,1.2438,1.3257;0.5477,1.3498,2.6486;1.371,0.6624,0.7384;2.6249,1.1258,0.8929;3.1637,1.7869,2.1389;4.0949,1.2985,2.4582;2.4733,1.7616,2.9838;3.4178,2.838,1.9423;3.6715,1.0166,-0.2005;4.4104,0.2463,0.0663;4.2493,1.9455,-0.2653;3.1297,0.6652,-1.5807;2.3757,2.1997,-2.3295;4.2469,0.265,-2.5721;5.4171,0.3474,-2.2574;3.7883,-0.2844,-3.9122;3.343,0.536,-4.489;4.6729,-0.636,-4.4503;2.7414,-1.4033,-3.7265;2.3752,-1.7259,-4.7081;3.2255,-2.2781,-3.2704;1.5744,-0.9441,-2.8419;0.873,-1.7736,-2.6866;1.0149,-0.146,-3.3462;2.0611,-0.4388,-1.4759;2.5941,-1.2701,-0.9837;0.9218,-0.01,-0.5448;0.3232,-0.899,-0.2976;0.2385,0.6625,-1.0855)|\",5.07764445033\r\n\"[H]OC(=O)[C@]1([H])[C@]([H])(C2([H])C([H])([H])C2([H])[H])[C@@]1([H])[C@@]([H])(C(=O)O[H])N([H])[H] |(1.6446,-0.2589,-1.8955;1.92,-0.8663,-2.6067;0.8253,-1.3703,-3.2061;0.9213,-2.1804,-4.1056;-0.5227,-0.8983,-2.7435;-1.0144,-0.2975,-3.5078;-0.96,-0.6613,-1.3123;-1.7108,0.1186,-1.1949;0.0085,-0.7623,-0.1676;0.6322,-1.655,-0.1778;-0.3773,-0.249,1.1939;-1.3587,0.2053,1.3006;-0.0495,-0.81,2.0641;0.6613,0.4965,0.3922;1.6964,0.4462,0.7204;0.3736,1.4535,-0.037;-1.41,-1.9055,-2.0394;-0.9012,-2.8175,-1.7214;-2.8188,-2.1275,-2.5447;-3.1599,-1.1963,-3.0236;-2.8783,-3.2056,-3.6671;-3.8751,-3.8753,-3.8168;-1.8318,-3.3488,-4.5006;-1.0307,-2.8343,-4.263;-3.745,-2.4089,-1.4503;-3.4079,-3.2266,-0.9398;-4.6225,-2.7165,-1.8722)|\",6.182426683359999\r\n\"[H]C1=C(C([H])([H])[H])[C@]([H])(/C([H])=C(\\[H])[C@]([H])(N(C([H])([H])[H])C([H])([H])[H])C([H])([H])[H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])C1=O 
|(2.3978,2.0558,-0.8433;3.0793,1.2525,-0.5717;2.6477,-0.0196,-0.4571;1.2117,-0.3963,-0.6937;0.5848,0.4806,-0.8786;0.8063,-0.9434,0.1664;1.1291,-1.0677,-1.56;3.5873,-1.1495,-0.0427;3.2716,-2.061,-0.5706;3.3848,-1.3997,1.4411;3.6998,-0.5974,2.1102;2.8133,-2.4896,1.9611;2.4864,-3.3,1.3065;2.5161,-2.704,3.4288;3.0139,-1.898,4.008;3.0161,-4.0261,3.8506;4.4649,-4.1271,3.6986;4.7859,-5.1445,3.9457;5.0198,-3.4253,4.3533;4.7536,-3.9208,2.6649;2.6342,-4.3916,5.211;3.0801,-5.3617,5.4536;1.5506,-4.4984,5.2998;2.9762,-3.6639,5.9742;0.997,-2.5698,3.6472;0.6502,-1.6222,3.2242;0.7284,-2.5766,4.7078;0.4637,-3.3878,3.1491;5.0717,-0.8601,-0.4462;5.2064,-0.9505,-1.9815;6.2356,-0.7306,-2.2884;4.5464,-0.2477,-2.4992;4.9652,-1.9614,-2.3324;6.0172,-1.899,0.1806;7.0494,-1.7175,-0.1421;5.7449,-2.916,-0.1285;5.9946,-1.8657,1.2746;5.4627,0.5556,0.0326;5.5222,0.5757,1.1311;6.457,0.8374,-0.3312;4.4846,1.6509,-0.3797;4.8469,2.8116,-0.5209)|\",4.598724073450001\r\n\"[H]OC(=O)[C@](O[H])(S[H])C([H])([H])C1=C([H])C([H])=C(O[H])C([H])=C1[H] |(-3.7447,3.5766,-0.7424;-3.2954,3.7282,-1.6025;-1.9898,3.5413,-1.3781;-1.1469,3.6418,-2.2406;-1.6447,3.2037,0.0943;-2.8387,2.9063,0.8102;-2.9258,1.9337,0.8279;-1.032,4.7708,0.8897;-0.1589,5.0685,-0.0957;-0.607,2.064,0.1823;0.2262,2.3316,-0.4731;-0.2315,2.0379,1.2097;-1.187,0.7162,-0.1993;-1.6696,-0.1592,0.7891;-1.5718,0.1128,1.8387;-2.2405,-1.3873,0.461;-2.6027,-2.0644,1.2279;-2.3354,-1.7674,-0.8811;-2.8974,-2.9806,-1.1534;-2.8969,-3.1244,-2.1127;-1.8557,-0.9117,-1.8807;-1.923,-1.2061,-2.9266;-1.2912,0.3147,-1.5401;-0.928,0.9716,-2.3251)|\",5.5728916582400005\r\n\"[H]OC(=O)/C([H])=C([H])/C(=N\\C([H])([H])[C@@]([H])(C(=O)O[H])N([H])[H])O[H] 
|(1.5384,2.225,0.7244;1.6526,1.8928,1.6305;2.5839,0.9102,1.6773;2.8309,0.3439,2.715;3.2622,0.5912,0.3802;3.0777,1.2242,-0.4854;4.0918,-0.4573,0.3092;4.2514,-1.0373,1.219;4.8508,-0.872,-0.898;4.4327,-1.0636,-2.0838;3.0532,-0.8732,-2.4964;2.7151,-1.8101,-2.9603;2.3631,-0.641,-1.679;2.9766,0.2627,-3.5412;3.5117,1.128,-3.1202;3.712,-0.0868,-4.8616;3.2507,0.2206,-5.9382;4.8962,-0.7059,-4.747;5.0692,-0.93,-3.7958;1.5869,0.6576,-3.7382;1.0755,-0.1408,-4.1189;1.57,1.3389,-4.4979;6.1687,-1.1211,-0.6744;6.4074,-0.8517,0.2274)|\",4.100755727035001\r\n\"[H]OC(=O)[C@@]([H])(O[H])C([H])([H])[C@@]([H])(O[H])[C@@]([H])(O[H])[C@@]([H])(O[H])C([H])([H])C([H])([H])O[H] |(5.4629,-2.5642,-2.3184;6.011,-2.1704,-3.0426;6.8594,-1.2777,-2.5265;7.613,-0.6257,-3.2222;6.8652,-1.1095,-0.9979;6.9356,-2.1133,-0.5562;8.0047,-0.3667,-0.6278;8.3635,-0.0282,-1.4727;5.5972,-0.4109,-0.443;5.7045,-0.4094,0.6457;5.6359,0.6388,-0.7541;4.2501,-1.027,-0.8617;3.9767,-0.7047,-1.8687;4.3448,-2.4668,-0.9529;4.2956,-2.7902,-0.0333;3.0592,-0.7504,0.0817;2.187,-1.202,-0.3963;3.2673,-1.4692,1.3029;3.7349,-0.8486,1.8908;2.7106,0.7109,0.411;1.7409,0.6795,0.9334;3.7217,1.1541,1.3313;3.4098,1.963,1.7644;2.599,1.678,-0.7787;2.4748,2.6911,-0.3705;3.5345,1.6848,-1.348;1.431,1.4197,-1.7265;1.356,2.2556,-2.4385;0.4866,1.3857,-1.1576;1.6618,0.1913,-2.4089;0.9266,0.0336,-3.0202)|\",7.325304855460001\r\n\"[H]OC(=O)[C@@]1([H])OC(=O)[C@]([H])(O[H])[C@]([H])(O[H])[C@@]1([H])O[H] 
|(0.1152,-3.2402,-0.2313;0.9378,-3.4578,0.2505;1.6705,-2.3454,0.3707;2.6987,-2.3086,1.0086;1.1487,-1.1113,-0.3724;1.6154,-1.0747,-1.3638;-0.2707,-1.3308,-0.5941;-1.1607,-0.3105,-0.6863;-2.2381,-0.4971,-1.1992;-0.8001,1.03,-0.0481;-1.0621,0.9555,1.0187;-1.5258,2.0568,-0.6959;-2.3428,1.6402,-1.0287;0.6886,1.3402,-0.1821;0.9222,2.2522,0.384;1.0204,1.4759,-1.5535;0.3255,2.0454,-1.9303;1.5006,0.1781,0.3836;2.5676,0.3658,0.209;1.2058,0.0921,1.7651;1.8502,-0.528,2.1477)|\",7.099450359544999\r\n\"[H]OC(=O)[C@]([H])(C([H])([H])[H])C([H])([H])C([H])([H])[C@]1([H])C([H])=C([H])[C@@]([H])(C([H])([H])[H])C([H])=C1[H] |(7.4726,0.9404,5.2778;6.6798,0.4112,5.4907;6.5267,-0.4574,4.4582;7.2868,-0.4735,3.5131;5.3269,-1.3731,4.6353;4.8428,-1.1142,5.5837;5.8158,-2.8319,4.6924;4.9634,-3.5114,4.8008;6.4906,-2.9967,5.5397;6.3518,-3.0896,3.7738;4.3302,-1.1623,3.4766;4.838,-1.3947,2.5332;3.5193,-1.8936,3.5873;3.7385,0.2507,3.4113;4.5424,0.9826,3.2509;3.2792,0.5045,4.3774;2.6739,0.4463,2.2954;2.3739,1.5071,2.3655;1.4446,-0.3971,2.5326;0.9702,-0.2912,3.5086;0.9292,-1.2425,1.6367;0.035,-1.8135,1.8869;1.4895,-1.4547,0.2518;1.7587,-2.5208,0.1488;0.4235,-1.1681,-0.8309;-0.4605,-1.8014,-0.6893;0.8213,-1.3648,-1.8336;0.1042,-0.1211,-0.7892;2.7328,-0.6303,0.0232;3.2092,-0.7393,-0.951;3.2478,0.216,0.9185;4.1454,0.7823,0.6701)|\",6.57971290509\r\n\"[H]OC1=N[C@]([H])(N([H])[H])N([H])[C@@]([H])(O[H])N1[H] |(-2.7404,-1.5167,0.4011;-2.6611,-0.5736,0.1715;-1.3306,-0.3525,0.0475;-0.4808,-1.2816,0.2601;0.8954,-0.8969,0.0501;1.1482,-1.0071,-1.0228;1.7571,-1.8083,0.7675;1.5694,-1.7107,1.7648;2.7356,-1.5698,0.6126;1.0788,0.5117,0.5005;2.0529,0.7904,0.403;0.2688,1.4906,-0.1892;0.2234,2.401,0.417;0.7849,1.9543,-1.4317;0.6763,1.2325,-2.0741;-1.0601,0.9314,-0.3557;-1.8375,1.5748,-0.4066)|\",7.65184147606\r\n\"[H]OC([H])([H])C1=NC([H])([H])[C@]2([H])[C@]([H])(C([H])([H])O[H])[C@@]2([H])C([H])([H])N1[H] 
|(-2.8471,-2.5311,-0.8141;-2.502,-3.3773,-0.4681;-1.5689,-2.9787,0.5151;-2.0768,-2.6883,1.4553;-0.9159,-3.8242,0.7406;-0.7262,-1.7887,0.063;0.4928,-1.7526,0.4294;1.4671,-0.6788,0.1979;1.874,-0.4224,1.1854;2.3005,-1.1201,-0.3652;0.8581,0.5257,-0.4875;0.6988,0.3586,-1.5529;0.7097,2.001,-0.1303;0.5554,2.6381,-1.0007;1.5011,2.6781,0.9677;1.5899,2.0104,1.8408;0.981,3.5812,1.3044;2.7766,3.1177,0.519;3.236,2.3405,0.1632;-0.3695,1.0098,0.2283;-0.3466,0.7556,1.2891;-1.6567,0.5572,-0.4256;-1.8781,1.0988,-1.3517;-2.5149,0.6942,0.2413;-1.5501,-0.8892,-0.7774;-1.2011,-0.9726,-1.7296)|\",6.087186835684999\r\n\"[H]OC([H])([H])[C@@]1([H])[C@@]2([H])C([H])([H])N=C([H])N([H])C([H])([H])[C@@]21[H] |(0.5249,4.5896,0.5965;0.7702,3.9821,1.3116;1.5046,2.9087,0.7294;2.4303,3.2727,0.255;1.8012,2.2724,1.5688;0.7009,2.1117,-0.2766;0.462,2.6665,-1.1863;0.8868,0.6227,-0.5066;0.6653,0.345,-1.5384;1.6112,-0.4927,0.2074;1.8824,-0.2021,1.2331;2.5484,-0.7776,-0.2889;0.7691,-1.6853,0.2706;-0.5028,-1.7346,0.1071;-0.9239,-2.7429,0.1894;-1.5622,-0.8552,-0.1333;-2.4457,-1.3312,-0.2532;-1.6006,0.6003,-0.338;-1.6759,0.8465,-1.4093;-2.5001,0.9944,0.1547;-0.3183,1.1239,0.2352;-0.2575,0.9767,1.3139)|\",7.045027589445\r\n\"[H]C1=NC([H])([H])[C@]2([H])[C@]([H])(C([H])([H])OC([H])([H])C3=C([H])C([H])=C([H])C([H])=C3[H])[C@@]2([H])C([H])([H])N1[H] 
|(9.4832,-4.5655,1.3795;8.674,-4.0281,0.8722;9.0738,-3.0455,0.1495;8.2373,-2.1539,-0.6504;8.1502,-1.1908,-0.1258;8.778,-1.9461,-1.5835;6.8822,-2.7763,-0.8873;6.9745,-3.7019,-1.4576;5.4443,-2.2935,-0.8441;4.7551,-2.8874,-1.443;5.0726,-0.8349,-0.777;5.0277,-0.4093,-1.7945;5.8355,-0.2651,-0.2208;3.8093,-0.7137,-0.1422;3.3499,0.6248,-0.085;4.0311,1.242,0.5253;3.3553,1.0616,-1.1009;1.9523,0.6715,0.489;1.0475,-0.372,0.2618;1.3786,-1.2452,-0.2906;-0.255,-0.2967,0.756;-0.9464,-1.1164,0.5778;-0.672,0.8248,1.476;-1.6872,0.8821,1.8597;0.2253,1.8688,1.7063;-0.0867,2.7424,2.2728;1.5308,1.7879,1.2195;2.2294,2.5997,1.4122;6.0729,-2.9388,0.3666;6.3306,-2.2201,1.1451;6.0757,-4.3498,0.8711;5.7229,-5.042,0.0902;5.4462,-4.5043,1.758;7.4707,-4.6361,1.2386;7.6063,-5.4589,1.8098)|\",5.84500550874\r\n\"[H]C([H])([H])C([H])([H])OC(=O)[C@@]1([H])OC(=O)[C@]([H])(C([H])([H])[H])C([H])([H])[C@@]1([H])C([H])([H])[H] |(2.9234,-5.032,0.3665;3.524,-4.1551,0.6337;2.9038,-3.4891,1.2403;4.3771,-4.4932,1.2311;3.9937,-3.4496,-0.6288;3.1506,-3.065,-1.2075;4.5953,-4.1091,-1.2584;4.8829,-2.3453,-0.3181;4.2998,-1.1624,-0.0192;3.1109,-0.9583,0.0061;5.3907,-0.1479,0.3211;6.1969,-0.2651,-0.4144;4.8113,1.1513,0.1469;5.3746,2.279,0.6727;5.0169,3.3482,0.243;6.3429,2.1419,1.8526;5.6976,2.2899,2.7319;7.3757,3.276,1.837;7.9827,3.2475,2.7486;6.881,4.247,1.7708;8.049,3.18,0.9774;6.9901,0.7527,1.9659;7.8,0.6684,1.2272;7.4564,0.6461,2.9532;5.9732,-0.3737,1.7293;6.5084,-1.3304,1.6931;4.8951,-0.4556,2.8216;5.3613,-0.4878,3.8128;4.2849,-1.3578,2.7122;4.2146,0.4006,2.7906)|\",7.015095065890001\r\n\"[H]O[C@@]1([H])OC(C(=O)OC([H])([H])C([H])([H])[H])=C(C([H])([H])[H])C([H])([H])[C@]1([H])C([H])([H])[H] 
|(5.2154,0.4664,3.1037;5.3086,-0.4736,3.3281;6.112,-1.0814,2.3505;6.108,-2.1459,2.5924;5.5126,-1.0098,1.0556;5.6007,0.1934,0.3918;4.5749,0.2005,-0.7044;3.5732,-0.4807,-0.6821;4.9073,1.0199,-1.7304;3.9618,1.0824,-2.8275;2.9478,1.0425,-2.4227;4.1371,2.0636,-3.276;4.2013,-0.0385,-3.8274;3.5282,0.0797,-4.6845;4.0058,-1.0112,-3.3683;5.2327,-0.0188,-4.1945;6.4677,1.177,0.7218;6.5562,2.5369,0.0784;7.4354,2.5992,-0.5781;6.6897,3.3026,0.8546;5.6815,2.7916,-0.517;7.446,0.9542,1.8596;7.1664,1.5977,2.7101;8.438,1.3133,1.5517;7.5367,-0.516,2.3021;7.9359,-0.5629,3.3221;8.4345,-1.3505,1.3784;9.4656,-0.9805,1.4079;8.0885,-1.3119,0.3407;8.4484,-2.4021,1.6882)|\",5.352479439335001\r\n\"[H]C1=C([H])C([C@]2([H])C(=O)C([H])([H])[C@@]2([H])OC([H])([H])C([H])([H])[H])=C([H])C([H])=C1Cl |(7.0985,1.101,-3.2782;7.3225,0.7071,-2.2926;6.3098,0.1793,-1.4903;5.2937,0.1701,-1.8728;6.5852,-0.3304,-0.2155;5.5003,-0.8854,0.6773;5.193,-0.1413,1.4295;5.7704,-2.2162,1.4161;6.739,-2.6733,1.9668;4.3826,-2.7611,1.0813;3.6938,-2.6915,1.9317;4.3431,-3.7729,0.6678;4.1926,-1.5639,0.1166;4.2994,-1.8469,-0.941;2.9724,-0.9099,0.3494;2.7113,0.2012,-0.4999;3.4778,0.9798,-0.3575;2.7559,-0.1157,-1.5556;1.3314,0.7414,-0.1629;1.0928,1.5998,-0.7999;1.2894,1.0618,0.883;0.57,-0.0298,-0.3168;7.913,-0.3089,0.2388;8.1499,-0.7218,1.2137;8.9353,0.2134,-0.5516;9.9593,0.2238,-0.1936;8.6302,0.7213,-1.8134;9.9137,1.3871,-2.8162)|\",5.564728242725\r\n\"[H]S[C@@]1([H])C(=O)O[C@]([H])(C([H])([H])OC([H])([H])C([H])=C([H])[H])C1([H])[H] 
|(10.5788,0.1636,-0.5058;10.0354,1.3904,-0.6612;8.5386,0.8442,-1.5593;8.1044,1.783,-1.9242;8.8104,-0.0296,-2.7965;9.7359,0.0094,-3.5588;7.7853,-0.9221,-2.9346;6.7999,-0.7753,-1.8839;6.5294,-1.7822,-1.5562;5.5635,-0.0999,-2.4751;5.2203,-0.6726,-3.3509;5.8107,0.9187,-2.8222;4.5806,-0.0648,-1.4635;3.3689,0.5369,-1.8788;3.5576,1.5679,-2.231;2.949,-0.0108,-2.7424;2.3713,0.5586,-0.7577;1.4204,1.025,-1.0147;2.5717,0.0644,0.4621;1.8,0.1157,1.2246;3.5111,-0.4054,0.7344;7.4993,0.0316,-0.7767;8.0009,-0.6438,-0.0762;6.7951,0.6455,-0.2118)|\",6.51440558097\r\n\"[H]/C1=C(\\[H])[C@]2([H])/C([H])=C3C(=C([H])/C2=N/C1=O)\\C([H])([H])C([H])([H])N([H])C([H])([H])C\\3([H])[H] |(1.7265,-2.1155,-2.6127;2.1289,-1.2548,-2.0874;2.6738,-1.3405,-0.8698;2.7516,-2.2873,-0.3379;3.2667,-0.1173,-0.2252;4.3506,-0.1433,-0.4763;3.2365,-0.0769,1.2775;3.3386,-1.03,1.7939;3.1726,1.0714,1.9797;3.0462,2.3527,1.2559;2.8715,2.368,-0.0933;2.6977,3.3001,-0.6232;2.7744,1.1586,-0.8861;2.3147,1.2324,-2.0966;2.0894,0.0397,-2.8206;1.8348,0.0776,-4.014;3.0734,3.6536,2.024;2.9673,4.4767,1.3094;2.2161,3.7106,2.7086;4.3471,3.8886,2.8649;4.4176,4.9622,3.072;5.2341,3.617,2.2605;4.2993,3.2079,4.1573;5.0232,3.5952,4.7584;4.4545,1.7557,4.1098;5.3631,1.4312,3.5659;4.559,1.4119,5.1454;3.2294,1.0503,3.493;2.3301,1.5108,3.9228;3.2362,0.0034,3.8172)|\",4.02184271039\r\n\"[H]C1=N[C@@]2([H])C(=NC1=O)C([H])=C1C(=C2[H])C([H])([H])C([H])([H])N([H])C([H])([H])C1([H])[H] 
|(-5.8524,-1.1091,-0.6051;-4.878,-0.6344,-0.7352;-3.8324,-1.3636,-0.7672;-2.5707,-0.6426,-0.9758;-2.4068,-0.679,-2.0727;-2.6096,0.8321,-0.6354;-3.6967,1.5482,-0.6925;-4.892,0.861,-0.9232;-5.9278,1.4219,-1.239;-1.3604,1.4501,-0.2591;-1.3691,2.5322,-0.1654;-0.2546,0.7256,0.0713;-0.2894,-0.7512,0.0508;-1.3958,-1.3759,-0.3977;-1.4643,-2.4605,-0.4217;0.9011,-1.5511,0.5372;1.1292,-1.3121,1.5843;0.6302,-2.6123,0.5051;2.1951,-1.3558,-0.2787;1.9493,-1.4167,-1.3568;2.8679,-2.191,-0.0536;2.9034,-0.1293,0.0815;3.8393,-0.162,-0.3162;2.2588,1.1138,-0.3356;2.9841,1.9195,-0.1769;1.9887,1.1283,-1.4091;1.0034,1.4408,0.5031;0.8203,2.5193,0.4546;1.2378,1.2001,1.5489)|\",3.65721015072\r\n\"[H]C1=NC([H])=C([H])C([H])=C1[C@@]1([H])N([H])N([H])[C@@]2([H])C([H])([H])C([H])([H])[C@@]([H])(OC([H])([H])[H])C([H])([H])[C@]12[H] |(-1.4986,-0.8829,-4.8905;-1.6529,0.0737,-4.3912;-2.9269,0.4333,-4.2008;-3.1469,1.6072,-3.598;-4.1898,1.8853,-3.4541;-2.1226,2.4518,-3.1663;-2.356,3.395,-2.6809;-0.8031,2.0554,-3.3687;0.0138,2.6913,-3.0347;-0.542,0.8319,-3.9989;0.8592,0.3328,-4.2594;0.7887,-0.6687,-4.7082;1.5935,1.2123,-5.2137;1.3334,2.1752,-4.9987;3.023,1.1293,-4.8707;3.4733,0.7452,-5.6992;3.1542,0.1671,-3.757;3.2513,-0.8665,-4.1416;4.3015,0.4064,-2.7788;5.275,0.3226,-3.2784;4.2298,1.4211,-2.3721;4.2017,-0.6398,-1.6475;4.9599,-0.4489,-0.8805;4.4033,-1.639,-2.0581;2.8233,-0.6762,-0.964;2.794,-1.5296,-0.266;2.7129,0.5343,-0.2106;1.738,0.5129,0.8118;1.848,1.445,1.3728;1.8913,-0.3355,1.498;0.7106,0.4599,0.4217;1.6683,-0.8188,-1.9865;0.6996,-0.7608,-1.4759;1.7219,-1.8158,-2.4467;1.806,0.269,-3.0466;1.7709,1.2415,-2.5363)|\",4.843626538900001\r\n\"[H]C([H])([H])C(OC(=O)N1O[C@]2([H])C([H])([H])[C@@]1([H])[C@@]1([H])C([H])([H])[C@]12[H])(C([H])([H])[H])C([H])([H])[H] 
|(2.804,-2.8849,-1.2207;3.504,-2.0426,-1.1839;4.4245,-2.386,-0.6995;3.7354,-1.7334,-2.2045;2.8806,-0.8936,-0.3845;3.8725,0.1787,-0.1863;4.4086,0.8189,-1.2322;4.1462,0.6545,-2.4139;5.4365,1.6748,-0.8351;5.3645,2.1199,0.5551;5.6057,3.5558,0.4715;5.2508,3.9792,1.4126;4.8669,3.9252,-0.8181;3.8061,3.6676,-0.7736;4.9753,4.9626,-1.1344;5.6857,2.9036,-1.6335;5.4061,2.7245,-2.6695;7.1425,3.2558,-1.3257;7.905,2.5638,-1.6692;7.5216,4.6655,-0.9511;6.9118,5.5177,-1.2364;8.5872,4.8767,-0.9793;7.0875,3.7076,0.1318;7.8077,3.352,0.8618;1.6239,-0.3297,-1.0562;0.8455,-1.1007,-1.0832;1.8339,-0.0056,-2.0765;1.2381,0.5215,-0.4842;2.5794,-1.3296,1.0514;1.8406,-2.138,1.0511;2.1802,-0.4923,1.6328;3.4886,-1.687,1.5449)|\",6.892643833165001\r\n\"[H]C(/C(C(=O)OC([H])([H])C([H])([H])[H])=C(/[H])C([H])([H])C([H])([H])[H])=C(/[H])C([H])([H])[H] |(2.3098,1.7943,-0.6661;2.6915,0.7754,-0.6989;4.1494,0.6298,-0.5299;5.042,1.6533,-1.1698;6.2584,1.6397,-1.1442;4.3263,2.6187,-1.7977;5.083,3.6648,-2.4477;5.9718,3.2237,-2.9054;4.4157,4.0338,-3.2309;5.46,4.7695,-1.4712;5.9635,5.5832,-2.006;6.1408,4.3882,-0.7052;4.5694,5.1787,-0.9827;4.783,-0.3309,0.1787;5.8712,-0.2772,0.1766;4.182,-1.4277,1.0052;3.1154,-1.2397,1.1699;4.2429,-2.375,0.4463;4.9098,-1.6036,2.3485;4.4721,-2.4276,2.9226;4.8408,-0.6942,2.9555;5.973,-1.825,2.1985;1.8266,-0.2227,-0.933;2.2027,-1.2397,-1.0411;0.346,-0.0442,-1.1023;0.0151,-0.3901,-2.0912;0.0517,1.0049,-0.993;-0.2146,-0.6333,-0.3634)|\",5.15383632847\r\n\"[H]O[C@@]([H])(C([H])([H])C([H])([H])/C([H])=C(\\[H])C([H])([H])C([H])([H])[H])[C@]([H])(O[H])C([H])(OC([H])([H])[H])OC([H])([H])[H] 
|(8.2601,-3.1661,0.0546;8.3196,-3.2444,1.0229;7.0217,-2.9042,1.5085;6.9962,-3.2525,2.548;6.7829,-1.3863,1.4901;6.7411,-1.035,0.4513;5.8024,-1.1781,1.9395;7.8667,-0.609,2.2592;7.902,-0.9929,3.2914;8.8453,-0.8214,1.8164;7.6086,0.8738,2.2845;6.6709,1.1808,2.7509;8.4231,1.8021,1.775;9.3547,1.4712,1.31;8.21,3.2953,1.7712;9.0684,3.7713,2.2707;8.2583,3.6534,0.7309;6.9128,3.7909,2.4149;6.8503,4.8834,2.3657;6.8536,3.4992,3.4698;6.033,3.3815,1.9054;5.995,-3.7159,0.7024;6.2253,-4.7815,0.8512;6.1932,-3.3738,-0.6706;5.5225,-3.8706,-1.1675;4.5247,-3.4779,1.0647;4.2509,-2.4358,0.8585;3.7917,-4.3387,0.2103;2.4001,-4.0447,0.1482;2.2271,-3.0332,-0.2473;1.9311,-4.1228,1.1357;1.953,-4.7774,-0.5276;4.2101,-3.6601,2.4251;4.4396,-4.9645,2.9585;3.9011,-5.0015,3.9084;5.5049,-5.1474,3.1497;4.0541,-5.7432,2.2912)|\",7.328025993965\r\n\"[H]C([H])=C1C(=O)O[C@]23[C@]([H])(C([H])([H])[H])C([H])([H])C([H])([H])[C@@]2([H])C(=C([H])[H])C([H])([H])C([H])([H])C([H])([H])[C@@]13[H] |(6.7627,-3.843,2.3859;6.7197,-2.9824,3.0475;7.3475,-2.9956,3.934;5.9344,-1.9321,2.81;5.8869,-0.7501,3.7256;6.3599,-0.6236,4.8284;5.1703,0.2388,3.1129;4.8906,-0.0969,1.7109;5.9066,0.6921,0.8392;5.8037,0.3152,-0.1882;7.3816,0.6078,1.2371;7.9873,1.2072,0.5477;7.7623,-0.4186,1.2049;7.544,0.9984,2.2466;5.3186,2.1143,0.8671;5.5224,2.5697,1.8436;5.7556,2.7627,0.0996;3.8101,1.8889,0.6763;3.5716,1.8345,-0.3926;3.1973,2.6928,1.0946;3.4991,0.5196,1.3639;2.993,0.693,2.3182;2.6061,-0.3689,0.514;1.3308,-0.5782,0.8618;0.6589,-1.1695,0.2434;0.9081,-0.1609,1.7725;3.1778,-1.0026,-0.7457;2.3618,-1.1672,-1.4591;3.8786,-0.3155,-1.2384;3.8765,-2.3631,-0.5111;4.1229,-2.7997,-1.4878;3.1493,-3.0392,-0.0419;5.1514,-2.3343,0.35;5.976,-1.8589,-0.1945;5.4636,-3.3723,0.5218;4.9467,-1.6503,1.7062;3.9655,-1.9765,2.09)|\",5.679016059935\r\n\"[H]C1=C([H])N(C(=O)C([H])([H])[C@](C#N)(C([H])([H])[H])C([H])(C([H])([H])[H])C([H])([H])[H])C([H])=C1[H] 
|(2.2752,6.4848,3.4934;2.2315,5.8191,2.6422;2.8412,4.6014,2.5512;3.4616,4.0455,3.2356;2.5472,4.0608,1.2989;3.0228,2.8054,0.8769;3.7219,2.1354,1.6109;2.6067,2.3867,-0.5277;2.9793,3.1447,-1.2282;1.5142,2.4203,-0.6009;3.0959,0.9902,-1.0034;4.5748,0.9992,-1.0352;5.7327,0.9981,-1.1212;2.6145,0.8141,-2.4681;2.9714,1.6395,-3.0923;1.5207,0.8015,-2.5131;2.9901,-0.1162,-2.8994;2.6193,-0.1874,-0.0704;3.0988,-0.007,0.8951;1.0989,-0.1975,0.1573;0.8379,-1.0347,0.8138;0.5385,-0.3325,-0.7755;0.7371,0.7136,0.6456;3.0921,-1.5602,-0.5769;2.8963,-2.3205,0.187;4.167,-1.5691,-0.7853;2.5627,-1.8688,-1.486;1.74,4.9674,0.6041;1.3924,4.7596,-0.3952;1.5334,6.0519,1.4081;0.9484,6.9246,1.1503)|\",5.080365588835001\r\n\"[H]C1=C([H])N(C(=O)/C([H])=C(\\C([H])([H])[H])C([H])(C([H])([H])[H])C([H])([H])[H])C([H])=C1[H] |(5.7396,5.7265,-0.1106;4.7473,5.3169,-0.246;4.4058,3.9996,-0.1135;5.0065,3.1505,0.1688;3.0359,3.8779,-0.3586;2.2091,2.723,-0.3128;0.9981,2.868,-0.3931;2.924,1.4348,-0.2014;3.9846,1.4684,-0.4168;2.3782,0.2396,0.1131;0.9311,0.024,0.4711;0.8624,-0.3161,1.5141;0.5023,-0.7795,-0.1418;0.3325,0.9228,0.3426;3.227,-1.0293,0.1244;2.8105,-1.6716,0.9143;3.0605,-1.7877,-1.2125;3.5816,-2.7511,-1.1709;3.4842,-1.2061,-2.0392;2.0089,-1.9853,-1.4447;4.7161,-0.8219,0.439;5.2099,-1.7929,0.5559;4.8582,-0.2546,1.3651;5.2359,-0.2937,-0.369;2.5282,5.1422,-0.6411;1.4784,5.2536,-0.8585;3.5564,6.0412,-0.5866;3.476,7.1043,-0.7693)|\",4.39191754707\r\n\"[H]C1=C([H])C([H])=C(C(=O)C([H])([H])[C@](C#N)(C([H])([H])[H])C([H])(C([H])([H])[H])C([H])([H])[H])C([H])=C1[H] 
|(1.0779,-0.8948,7.4674;1.4178,-0.7244,6.4494;1.0685,0.4531,5.7857;0.4576,1.2,6.2849;1.5039,0.6739,4.4795;1.2249,1.5964,3.9802;2.2943,-0.2829,3.8233;2.7966,-0.1112,2.4202;3.4748,-0.9812,1.8951;2.4438,1.1814,1.6847;2.8287,2.0187,2.2822;1.3539,1.3023,1.6819;2.9664,1.3348,0.2292;4.4453,1.3089,0.2601;5.6058,1.3505,0.2632;2.5523,2.7465,-0.2644;2.9281,3.5176,0.4157;1.4616,2.8316,-0.3059;2.9545,2.9543,-1.2585;2.4636,0.1944,-0.7361;2.8932,-0.7299,-0.342;0.9346,0.0364,-0.7366;0.6547,-0.7705,-1.4228;0.4217,0.9431,-1.0796;0.5363,-0.2305,0.248;2.9813,0.3731,-2.173;2.7609,-0.5271,-2.7573;4.0651,0.527,-2.1985;2.5022,1.2168,-2.6835;2.6389,-1.464,4.5009;3.2516,-2.1933,3.9815;2.2046,-1.6831,5.8041;2.4788,-2.5991,6.32)|\",5.2245859296\r\n\"[H]C1=C([H])C([H])=C(C(=O)/C([H])=C(\\C([H])([H])[H])C([H])(C([H])([H])[H])C([H])([H])[H])C([H])=C1[H] |(9.0912,-3.1094,-3.9609;8.0887,-2.9614,-3.568;7.0014,-3.6162,-4.154;7.1562,-4.2751,-5.0043;5.7192,-3.4274,-3.6473;4.8636,-3.9357,-4.0795;5.4985,-2.5701,-2.5576;4.085,-2.4257,-2.0577;3.2353,-3.2443,-2.4116;3.7685,-1.2823,-1.1761;4.5074,-0.4867,-1.1314;2.6403,-1.1301,-0.4443;1.5226,-2.1414,-0.4051;1.8941,-3.1607,-0.5163;0.8367,-1.9791,-1.2467;0.9392,-2.0572,0.517;2.4699,0.1309,0.3925;3.3076,0.7986,0.1523;1.1673,0.8782,0.0452;1.1084,1.8148,0.6115;0.2805,0.2848,0.2943;1.1219,1.1235,-1.0216;2.546,-0.1727,1.9032;2.4644,0.7567,2.4788;3.4959,-0.6523,2.1629;1.7341,-0.8335,2.2273;6.5973,-1.921,-1.9737;6.4595,-1.2778,-1.1105;7.8847,-2.1198,-2.4735;8.728,-1.6183,-2.0062)|\",4.786482630295\r\n\"[H]OC([H])([H])[C@]1([H])C([H])([H])C([H])([H])[C@]([H])(C([H])([H])[H])[C@@]2([H])C(=C([H])C(C([H])([H])[H])(C([H])([H])[H])C2([H])[H])[C@@]1(O[H])C([H])([H])[H] 
|(0.4824,-4.2968,1.2656;-0.2868,-3.7318,1.4567;0.1513,-2.4156,1.1686;0.3859,-2.3066,0.0982;-0.7027,-1.7648,1.3845;1.3553,-1.9387,2.0194;1.0983,-2.2212,3.0518;1.4353,-0.3718,1.9884;0.5387,-0.0143,1.4665;1.3214,-0.0109,3.0179;2.6368,0.399,1.3965;3.5101,0.3135,2.055;2.3525,1.4601,1.4359;3.0781,0.0824,-0.0477;3.7829,0.8821,-0.3239;1.909,0.168,-1.0397;1.3674,1.1132,-0.9087;1.1906,-0.646,-0.9082;2.2567,0.1364,-2.0768;3.9121,-1.2276,-0.1158;4.758,-1.0557,0.5671;3.2023,-2.5091,0.3043;3.2068,-3.4203,-0.6757;2.7866,-4.4172,-0.5846;3.904,-2.9722,-1.9431;5.0345,-3.9532,-2.3164;5.5808,-3.6002,-3.2004;4.6357,-4.9488,-2.547;5.7516,-4.0609,-1.4944;2.9127,-2.8819,-3.123;3.4214,-2.5425,-4.0345;2.0976,-2.184,-2.9072;2.4661,-3.8609,-3.3368;4.4737,-1.5752,-1.5295;5.5678,-1.6169,-1.4899;4.2236,-0.8063,-2.2678;2.6636,-2.7383,1.7045;2.354,-4.1587,1.7752;2.065,-4.3324,2.6867;3.7508,-2.493,2.7693;4.1089,-1.4626,2.7931;4.6019,-3.1533,2.5757;3.3522,-2.7255,3.7657)|\",6.87359586363\r\n\"[H]C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C1=C(C([H])(C([H])([H])[H])C([H])([H])[H])C(=O)C([H])([H])[C@@]([H])(C([H])([H])[H])C1([H])[H] 
|(5.9322,0.316,-7.1396;6.1494,0.1945,-6.0723;7.2398,0.1748,-5.955;5.777,1.0863,-5.553;5.5096,-1.08,-5.5129;5.8773,-1.9504,-6.0749;4.4231,-1.0489,-5.6783;5.7879,-1.2913,-4.0188;5.4298,-0.4157,-3.4561;6.8748,-1.3292,-3.863;5.1294,-2.5571,-3.4528;5.493,-3.4355,-4.0054;4.0481,-2.5049,-3.6432;5.3452,-2.7719,-1.9455;4.6675,-3.5602,-1.5945;5.052,-1.861,-1.406;6.795,-3.1234,-1.5267;6.8157,-3.1796,-0.4278;7.4571,-2.3,-1.7979;7.2954,-4.4538,-2.0566;8.3814,-4.6182,-2.8635;9.277,-3.477,-3.3495;8.8446,-2.5395,-2.9894;10.6982,-3.557,-2.7534;11.283,-2.6812,-3.0607;11.2172,-4.4548,-3.0954;10.6655,-3.5699,-1.6573;9.321,-3.3674,-4.8878;9.8866,-2.4746,-5.1813;8.3116,-3.2757,-5.3073;9.8012,-4.2411,-5.3324;8.7661,-5.9974,-3.2981;9.7736,-6.2122,-3.9635;7.8512,-7.1599,-2.9342;7.0772,-7.2184,-3.7157;8.4416,-8.0793,-3.0002;7.1796,-6.981,-1.5691;6.4265,-7.7707,-1.4441;8.1771,-7.1011,-0.4069;7.6641,-7.0154,0.5584;8.945,-6.3205,-0.4479;8.6866,-8.0712,-0.4304;6.4518,-5.6291,-1.5843;6.0569,-5.4031,-0.5844;5.5673,-5.7077,-2.2359)|\",5.047711926775\r\n\"[H]O[C@@]1(C([H])([H])[H])C([H])=C([H])[C@]2([H])C([H])([H])[C@]([H])(C([H])([H])[H])C([H])([H])C([H])([H])[C@@]2([H])[C@@]1(C([H])([H])[H])[C@]([H])(O[H])C([H])([H])[H] 
|(7.6838,1.9196,-0.0152;7.8991,1.3937,-0.8018;6.8938,0.354,-0.9355;6.871,-0.4231,0.3995;6.2003,-1.2865,0.3617;6.5001,0.2356,1.1963;7.8723,-0.7609,0.6789;5.5283,0.9904,-1.1141;5.3185,1.8263,-0.4434;4.5881,0.5692,-1.9606;3.6241,1.0785,-1.9903;4.7946,-0.5615,-2.928;4.8949,-0.1203,-3.9332;3.5818,-1.5136,-3.0096;3.4312,-1.9999,-2.0335;2.6689,-0.9362,-3.2133;3.7758,-2.5833,-4.0986;3.8391,-2.0562,-5.0644;2.59,-3.553,-4.1676;2.7264,-4.2862,-4.9713;1.649,-3.0213,-4.3538;2.4788,-4.1067,-3.2264;5.1058,-3.3286,-3.889;5.0273,-3.9468,-2.9809;5.274,-4.025,-4.7213;6.3112,-2.3839,-3.7447;6.4765,-1.859,-4.6959;7.2098,-2.9807,-3.5578;6.0763,-1.3639,-2.6111;5.8325,-1.9616,-1.7183;7.3227,-0.5102,-2.1807;8.4515,-1.4898,-1.7698;8.9302,-1.9232,-2.6518;8.0639,-2.3122,-1.1574;9.2364,-0.9787,-1.2107;7.9243,0.3704,-3.3512;8.0251,-0.3132,-4.2027;9.2689,0.7549,-3.0596;9.2055,1.2894,-2.2488;7.162,1.6125,-3.8354;6.1682,1.3887,-4.2304;7.7525,2.0646,-4.6395;7.0555,2.3501,-3.0362)|\",6.9116918027\r\n\"[H]C1C([H])[C@]2([H])C([H])([H])[C@]([H])(C([H])([H])[H])C([H])([H])C([H])([H])[C@@]2([H])[C@](C([H])=O)(C([H])([H])[H])C1=O |(6.0886,1.7966,-2.7448;5.9059,1.4919,-1.715;4.9164,0.5664,-1.5671;4.7553,0.1129,-0.5854;4.8234,-0.3696,-2.7347;4.9306,0.182,-3.6758;3.6396,-1.3372,-2.818;3.5216,-1.8599,-1.856;2.7153,-0.767,-2.9777;3.8193,-2.3711,-3.9448;3.8713,-1.8165,-4.8952;2.6287,-3.3337,-4.0212;2.7445,-4.0449,-4.8476;1.6878,-2.7922,-4.1749;2.5318,-3.9138,-3.0944;5.15,-3.1248,-3.7751;5.092,-3.7522,-2.8722;5.2969,-3.8115,-4.619;6.3629,-2.1859,-3.6648;6.4978,-1.6369,-4.603;7.269,-2.7852,-3.5101;6.1899,-1.1816,-2.5053;6.0236,-1.7745,-1.5919;7.4835,-0.3271,-2.1785;8.0897,0.3628,-3.3861;9.17,0.6067,-3.2737;7.4992,0.6758,-4.4008;8.5623,-1.1909,-1.505;8.8816,-2.0104,-2.1583;8.1857,-1.6136,-0.5704;9.4414,-0.5892,-1.256;7.0804,0.8512,-1.0799;7.5599,0.8652,0.0236)|\",4.699406198135001\r\n\"[H]OC([H])([H])[C@@]([H])(O[H])[C@@]([H])(O[H])[C@@]([H])(O[H])[C@]([H])(C([H])=C([H])[H])N([H])[H] 
|(5.4812,-5.3061,-3.1496;5.3355,-5.5847,-2.2311;3.924,-5.7781,-2.0511;3.5192,-6.4672,-2.8031;3.8104,-6.249,-1.0703;3.1226,-4.4703,-2.0891;3.328,-3.9596,-3.0509;1.7668,-4.8688,-1.9989;1.257,-4.0698,-1.76;3.5329,-3.5004,-0.9546;3.3323,-4.0093,0.005;4.9021,-3.1336,-1.0608;5.3934,-3.9464,-1.2964;2.7286,-2.1892,-0.9865;2.7909,-1.7712,-1.9987;1.3212,-2.448,-0.7881;1.2267,-2.7778,0.1234;3.2171,-1.0948,0.0019;4.288,-0.9638,-0.1741;2.9977,-1.5,1.4481;2.0907,-1.1074,1.9099;3.8255,-2.2623,2.1654;3.6134,-2.5163,3.2007;4.7515,-2.6459,1.743;2.5534,0.1977,-0.2247;1.5505,0.0319,-0.3218;2.8588,0.5854,-1.1167)|\",6.484473057414999\r\n\"[H]C1=C([H])C([H])=C(C(=O)C(=C(\\[H])C#N)/C([H])=C(\\[H])C#N)C([H])=C1[H] |(-0.6925,-2.0906,-3.3632;-0.526,-1.0791,-3.0031;0.3649,-0.239,-3.673;0.8842,-0.5896,-4.5601;0.5884,1.0563,-3.2068;1.2682,1.7096,-3.7441;-0.0906,1.5242,-2.0694;0.0488,2.9175,-1.5552;-0.8356,3.4446,-0.8954;1.312,3.7238,-1.8138;2.5075,3.0826,-1.7238;2.5248,2.0062,-1.5792;3.7869,3.7019,-1.7629;4.8506,4.1755,-1.8085;1.0805,5.1589,-1.9694;0.0812,5.4764,-1.6867;1.9487,6.0747,-2.4506;2.9566,5.8212,-2.7642;1.5836,7.4466,-2.5855;1.3055,8.5704,-2.7087;-1.0041,0.6788,-1.4164;-1.539,1.0601,-0.5528;-1.2107,-0.618,-1.8738;-1.9087,-1.2701,-1.3565)|\",4.08715003451\r\n\"[H]OC(=O)C([H])([H])C([H])([H])C([H])([H])C([H])([H])C#CC([H])([H])/C([H])=C([H])/C([H])=C(\\[H])C([H])=O 
|(4.4133,-5.3662,2.134;3.5285,-5.2611,1.7463;3.2177,-3.9405,1.6408;2.1524,-3.6153,1.1774;4.2726,-2.9753,2.1778;4.2724,-3.0859,3.2723;5.2683,-3.2981,1.8404;4.0105,-1.5164,1.7907;2.955,-1.2951,1.9778;4.6,-0.8634,2.447;4.3473,-1.2079,0.3255;5.4355,-1.2435,0.1777;3.905,-1.9734,-0.3222;3.8335,0.1749,-0.1379;4.2445,0.9607,0.5124;4.2158,0.3827,-1.1464;2.3708,0.2326,-0.138;1.1626,0.1805,-0.111;-0.2899,-0.0073,-0.031;-0.6825,-0.2223,-1.0381;-0.789,0.9122,0.3022;-0.6328,-1.1529,0.8935;-0.0571,-2.0665,0.7493;-1.5576,-1.0915,1.8685;-2.1179,-0.17,2.0284;-1.8593,-2.207,2.7402;-1.2781,-3.1171,2.58;-2.7886,-2.1976,3.7221;-3.3954,-1.3179,3.9281;-3.0239,-3.3729,4.5603;-2.386,-4.2533,4.3142;-3.8405,-3.4275,5.4645)|\",4.598724073450001\r\n\"[H]OC(=O)C([H])([H])C([H])([H])C([H])([H])C([H])([H])C#CC([H])([H])[C@]1([H])OC(=O)C([H])=C1[H] |(5.7169,-2.2773,3.213;5.026,-2.2109,3.8928;3.914,-1.6277,3.3733;2.9433,-1.4661,4.0721;4.0171,-1.1815,1.9162;4.6387,-0.273,1.9035;4.5657,-1.9339,1.3323;2.6442,-0.8898,1.3075;2.111,-1.8355,1.1424;2.0609,-0.3356,2.0468;2.6877,-0.0769,0.0108;3.261,0.845,0.176;3.2006,-0.6331,-0.7852;1.2726,0.301,-0.4896;1.3591,0.9112,-1.3984;0.7358,-0.6119,-0.785;0.4897,1.0183,0.521;-0.1092,1.5856,1.4068;-0.8333,2.2378,2.4972;-1.7722,2.6715,2.131;-0.2454,3.0648,2.9158;-1.1893,1.264,3.6445;-1.789,0.4397,3.2367;-1.9943,1.9694,4.5997;-1.4058,1.9129,5.8498;-1.8941,2.4392,6.8166;-0.1583,1.1243,5.7129;0.4933,0.9228,6.5527;-0.0215,0.7423,4.4407;0.7744,0.1374,4.0196)|\",5.983783572495\r\n\"[H]OC([H])([H])[C@@]12C([H])([H])C([H])=C([H])[C@]1([H])/C(=C(/[H])C(=O)OC([H])([H])[H])C([H])([H])C([H])([H])C2([H])[H] 
|(-4.4127,2.5319,3.9971;-3.8684,1.734,4.0728;-2.6919,1.9271,3.2906;-2.9485,2.1238,2.2363;-2.1258,2.7985,3.6546;-1.8483,0.6517,3.3642;-2.6298,-0.5715,2.7953;-3.1927,-1.0937,3.578;-3.3719,-0.2579,2.0449;-1.548,-1.4106,2.1649;-1.6997,-2.4429,1.861;-0.4194,-0.7235,1.9923;0.4831,-1.0885,1.5216;-0.565,0.7231,2.457;-0.7757,1.3295,1.5596;0.6623,1.3447,3.1162;1.8902,1.4101,2.5565;2.7013,1.8577,3.1247;2.2542,1.033,1.1815;1.5079,0.681,0.2835;3.5959,1.1724,1.0001;4.0684,0.8734,-0.3191;3.5952,1.5261,-1.0583;3.8533,-0.1664,-0.5819;5.1453,1.0448,-0.2911;0.4432,2.0587,4.4334;1.408,2.3137,4.8845;-0.0176,3.0236,4.1708;-0.4716,1.3432,5.4541;-1.0467,2.1021,5.997;0.1436,0.8332,6.2046;-1.4219,0.3167,4.8103;-2.3167,0.1987,5.4293;-0.9264,-0.6612,4.7773)|\",5.276287561195\r\n\"[H]OC([H])([H])[C@@]12C([H])([H])C([H])([H])C([H])([H])/C3=C(\\C(=O)OC([H])([H])[H])[C@@]([H])(C([H])([H])C1([H])[H])[C@]32[H] |(-3.9495,0.4135,1.3449;-2.9905,0.2762,1.3374;-2.6974,-0.7436,0.3883;-3.2169,-1.6813,0.652;-3.0346,-0.4522,-0.6206;-1.1869,-1.004,0.3564;-0.9182,-2.196,-0.6345;-1.8061,-2.3711,-1.257;-0.1195,-1.9129,-1.3317;-0.5193,-3.5237,0.0465;-1.3428,-3.8641,0.6889;-0.3791,-4.3003,-0.7144;0.7638,-3.3836,0.9037;1.6727,-3.3784,0.2897;0.8576,-4.2399,1.5866;0.6394,-2.0995,1.6485;1.4004,-1.0686,2.0794;2.8576,-0.9178,2.1211;3.6621,-1.7835,1.8235;3.2187,0.3258,2.5327;4.6316,0.5624,2.5885;4.7448,1.5853,2.95;5.0823,0.4546,1.5976;5.115,-0.1416,3.2717;0.2093,-0.1332,2.3041;0.0411,0.1871,3.3374;0.0037,0.9939,1.2788;-0.8241,1.6242,1.6162;0.891,1.6271,1.166;-0.3837,0.2814,-0.034;0.5236,-0.0143,-0.5733;-0.9546,0.9308,-0.7076;-0.6557,-1.3378,1.7828;-1.4208,-1.742,2.4549)|\",5.567449381229999\r\n\"[H]OC(C([H])([H])[H])(C([H])([H])[H])[C@@]1([H])O[C@@]([H])(OC([H])([H])[H])C([H])([H])C([H])([H])C1([H])[H] 
|(2.7641,-0.6897,-1.938;3.0119,-0.9792,-1.0459;2.3046,-0.1401,-0.1163;0.7889,-0.3088,-0.2926;0.2423,0.209,0.501;0.5115,-1.3666,-0.2779;0.4693,0.1095,-1.2561;2.7157,1.3233,-0.33;2.2486,1.9771,0.4115;2.4075,1.6615,-1.3278;3.8036,1.4266,-0.2548;2.7943,-0.6321,1.2662;3.8823,-0.4428,1.3013;2.1493,0.1757,2.2548;2.6511,-0.0435,3.5722;3.7354,0.1936,3.5648;1.98,0.799,4.4512;2.2301,2.182,4.2455;1.7403,2.712,5.066;1.8202,2.5291,3.2906;3.3091,2.4003,4.27;2.4259,-1.488,4.0054;2.8514,-1.6361,5.0043;1.3425,-1.6431,4.078;3.0438,-2.4509,2.9795;4.14,-2.372,3.0185;2.7979,-3.4878,3.2365;2.56,-2.1178,1.5587;1.4888,-2.3385,1.478;3.0806,-2.7216,0.8096)|\",8.65866272291\r\n\"[H]OC(C([H])([H])[H])(C([H])([H])[H])[C@]([H])(O[H])C([H])([H])C([H])([H])C([H])([H])C1([H])OC([H])([H])C([H])([H])C([H])([H])O1 |(4.8122,-5.6315,0.0044;4.8037,-4.7015,-0.2696;4.7554,-3.8845,0.9243;6.1319,-3.8759,1.5994;6.1699,-3.145,2.4159;6.9214,-3.6416,0.8812;6.3487,-4.8622,2.0286;3.6884,-4.4355,1.8802;3.6531,-3.8517,2.8068;3.917,-5.4743,2.1547;2.7021,-4.3989,1.4114;4.3187,-2.4899,0.3927;4.0737,-1.8595,1.2578;3.101,-2.6418,-0.3331;3.2533,-3.4165,-0.9044;5.382,-1.7729,-0.4563;6.2828,-1.6091,0.1502;5.6761,-2.4316,-1.2846;4.8951,-0.4243,-1.0007;4.5779,0.2129,-0.1613;4.0054,-0.5781,-1.6185;5.9735,0.3077,-1.8086;6.3023,-0.3275,-2.6404;6.8616,0.4883,-1.1888;5.504,1.6782,-2.3221;5.139,2.272,-1.4786;4.3948,1.5945,-3.2035;4.7189,1.1231,-4.5109;5.027,0.0655,-4.4832;3.7885,1.1826,-5.0828;5.8252,1.9794,-5.1275;5.4499,2.9963,-5.2899;6.1381,1.5686,-6.0959;7.0036,2.0248,-4.1542;7.7594,2.7531,-4.4627;7.4923,1.0386,-4.1001;6.5673,2.4483,-2.8638)|\",8.177021207525\r\n\"[H]/C1=C(/[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C(=O)C([H])([H])C([H])([H])C(=O)C([H])([H])C1([H])[H] 
|(-0.9111,2.5129,0.7313;-0.9028,1.825,-0.1169;0.2634,1.466,-0.6637;0.256,0.7671,-1.4986;1.6209,1.9337,-0.2087;1.5191,2.8876,0.3255;2.2417,2.1347,-1.0934;2.3918,0.9614,0.717;1.8501,0.8785,1.6687;3.3472,1.4481,0.9539;2.6911,-0.4585,0.1764;3.6532,-0.7756,0.5946;2.8217,-0.4359,-0.9122;1.6457,-1.5335,0.5311;1.0674,-1.2642,1.4235;2.1441,-2.4808,0.7964;0.6582,-1.9095,-0.5614;0.7959,-1.5841,-1.7285;-0.5558,-2.745,-0.1413;-0.6088,-3.6093,-0.8125;-0.457,-3.1007,0.8877;-1.851,-1.9262,-0.2825;-2.7241,-2.5893,-0.187;-1.9095,-1.4894,-1.2854;-2.0078,-0.8443,0.7872;-1.437,-0.9303,1.8604;-2.9045,0.3401,0.4592;-3.8589,-0.0158,0.0467;-3.1076,0.8676,1.3964;-2.2423,1.3024,-0.5677;-2.13,0.7909,-1.5312;-2.9418,2.1327,-0.7353)|\",5.798746154155\r\n\"[H]/C1=C(/[H])C([H])([H])C([H])([H])C(=O)C([H])([H])C([H])([H])C([H])([H])C([H])([H])C(=O)C([H])([H])C1([H])[H] |(-0.9444,2.2705,1.2166;-0.8327,1.8734,0.2049;0.3904,1.7144,-0.3117;0.4788,1.3158,-1.3232;1.7014,1.9808,0.3839;1.5461,2.606,1.2711;2.3676,2.5264,-0.2943;2.4157,0.6778,0.8186;1.8645,0.1886,1.6279;3.4086,0.9232,1.2254;2.6414,-0.2954,-0.3414;2.9587,0.1198,-1.4427;2.4198,-1.7916,-0.1192;3.1719,-2.3222,-0.7123;2.5522,-2.0634,0.9355;1.006,-2.2063,-0.6068;0.8703,-1.7906,-1.6135;0.9716,-3.2973,-0.7168;-0.122,-1.7317,0.3262;0.1403,-0.7586,0.7464;-0.2291,-2.4112,1.1792;-1.4968,-1.5739,-0.3585;-1.9263,-2.5577,-0.5859;-1.3859,-1.0325,-1.3038;-2.4491,-0.8478,0.5913;-2.7587,-1.3638,1.6505;-3.0228,0.5214,0.2133;-3.8981,0.3253,-0.4263;-3.4112,0.9548,1.1413;-2.1067,1.5266,-0.5243;-1.8606,1.1524,-1.5253;-2.7022,2.4384,-0.6792)|\",5.842284370234999\r\n\"[H]C(=O)C([H])([H])C([H])([H])/C([H])=C(\\[H])C([H])([H])C([H])([H])/C([H])=C(\\[H])C([H])([H])C([H])=C([H])[H] 
|(11.9889,7.4065,-1.1573;11.5889,7.1396,-2.1642;12.1707,7.4874,-3.1677;10.3135,6.3286,-2.1376;9.5271,6.9456,-1.6781;10.0173,6.111,-3.169;10.4728,5.0315,-1.3072;11.2614,4.4193,-1.7687;10.8193,5.2819,-0.296;9.1948,4.2412,-1.2244;8.7927,3.8846,-2.1753;8.5337,3.9699,-0.0956;8.9394,4.3245,0.8533;7.2579,3.1786,-0.011;6.4587,3.8127,0.4033;6.9342,2.8874,-1.0181;7.376,1.9088,0.8716;6.4456,1.3331,0.7562;8.1892,1.2778,0.4915;7.5955,2.2101,2.3299;6.8099,2.7872,2.8212;8.6655,1.8391,3.0387;9.454,1.2724,2.5399;8.9132,2.1294,4.5039;9.2644,1.199,4.9793;9.7497,2.838,4.5995;7.7216,2.6454,5.2646;6.8564,1.9817,5.2914;7.661,3.8201,5.8922;6.7758,4.1321,6.4402;8.4983,4.5155,5.8869)|\",5.8776591708\r\n\"[H]C(=O)[C@@]([H])(N([H])[H])C([H])([H])C([H])([H])C([H])([H])N([H])/C(=N/C([H])([H])[H])N([H])C([H])([H])[H] |(8.7349,1.0284,-1.1442;8.9497,0.9172,-0.059;9.2664,1.88,0.608;8.8148,-0.492,0.5142;9.4333,-1.152,-0.1146;9.2832,-0.6078,1.8931;8.7809,0.0791,2.4576;10.2552,-0.3008,1.9358;7.353,-0.9889,0.4104;7.3138,-1.9366,0.9624;6.7092,-0.2815,0.9544;6.8267,-1.1873,-1.0177;7.507,-1.857,-1.5635;6.792,-0.2431,-1.57;5.4176,-1.804,-1.0664;5.4141,-2.7742,-0.5522;5.1381,-1.9723,-2.1091;4.3573,-0.9958,-0.4726;4.4024,-0.9035,0.5351;3.8657,0.1272,-1.1392;4.3448,0.4584,-2.2816;3.6364,1.4123,-3.1118;3.7878,1.1438,-4.1643;4.0352,2.4315,-2.9907;2.5534,1.4471,-2.9189;2.7809,0.7265,-0.4711;2.4256,0.1458,0.2803;2.8333,2.1446,-0.1191;1.8992,2.4117,0.3837;2.9142,2.76,-1.017;3.6731,2.4013,0.5458)|\",5.085807865844999\r\n\"[H]O[C@@]([H])(C(=O)C([H])([H])C([H])(C([H])([H])[H])C([H])([H])[H])C1=C([H])C([H])=C([H])C([H])=C1[H] 
|(6.0376,-0.1692,-2.2633;6.7776,-0.5675,-1.7598;6.3706,-0.5121,-0.4166;7.0046,0.1937,0.15;4.9325,0.0568,-0.3505;4.3924,0.3657,-1.4011;4.2566,0.2045,0.9965;3.933,-0.8052,1.2903;5.0069,0.4927,1.7464;3.0612,1.1734,1.0118;2.4178,0.9077,0.1638;2.2553,1.0073,2.3075;1.3851,1.6737,2.3147;2.8645,1.2495,3.1883;1.8911,-0.0203,2.4265;3.5146,2.6287,0.8233;2.6534,3.3065,0.8125;4.0528,2.761,-0.1207;4.1745,2.9444,1.6426;6.454,-1.877,0.2548;6.0485,-3.0202,-0.4448;5.7286,-2.9198,-1.4774;6.0855,-4.2714,0.1688;5.7734,-5.1529,-0.3853;6.5281,-4.3952,1.4884;6.5594,-5.3716,1.9644;6.9423,-3.2611,2.1879;7.3029,-3.3503,3.2093;6.908,-2.0088,1.5713;7.2502,-1.13,2.114)|\",5.597381904785\r\n\"[H]O[C@]([H])(C1=C([H])C([H])=C([H])C([H])=C1[H])[C@](C#N)(OC(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C([H])([H])[H] |(3.2759,2.3445,-2.3445;4.0515,2.2743,-2.929;4.4163,0.9076,-2.8916;5.4498,0.8538,-3.2498;3.5478,0.0294,-3.7826;4.1174,-1.0625,-4.4485;5.1758,-1.277,-4.3221;3.3429,-1.8825,-5.2696;3.8036,-2.7253,-5.7777;1.985,-1.6141,-5.4425;1.3796,-2.2478,-6.0851;1.4103,-0.5199,-4.7926;0.3551,-0.2979,-4.9293;2.1865,0.2959,-3.9703;1.7314,1.1541,-3.4871;4.4366,0.4673,-1.3702;4.7344,-0.9843,-1.281;5.0224,-2.1072,-1.2169;3.1093,0.7671,-0.9489;2.595,0.7235,0.4228;1.0825,0.6012,0.1933;0.5485,0.5889,1.1496;0.8517,-0.3197,-0.3503;0.7151,1.4486,-0.3949;2.9051,2.0459,1.1415;2.3554,2.0942,2.0885;2.5915,2.8958,0.5256;3.9676,2.1589,1.37;3.0976,-0.4833,1.2228;2.5995,-0.4916,2.1987;4.1752,-0.4497,1.4076;2.8664,-1.4226,0.7142;5.5492,1.2452,-0.642;6.5173,1.0012,-1.0913;5.6002,0.9894,0.4179;5.3726,2.3164,-0.7569)|\",6.49807874994\r\n\"[H]C([H])=C([H])C([H])([H])OC([H])([H])/C([H])=C(\\[H])C(=O)OC([H])([H])C([H])([H])[H] 
|(7.3465,2.6116,-8.417;6.669,2.3274,-7.6167;5.7788,2.9409,-7.493;6.9128,1.2803,-6.8304;7.8149,0.6849,-6.9657;5.9951,0.8115,-5.7411;5.5491,-0.1628,-6.0137;5.164,1.522,-5.5989;6.7449,0.6672,-4.5418;5.9876,0.1382,-3.4788;5.1118,0.7793,-3.2615;5.5745,-0.8537,-3.7461;6.8163,0.0082,-2.2419;6.3107,-0.3995,-1.3678;8.1038,0.3515,-2.1362;8.6545,0.7642,-2.9735;8.8178,0.1747,-0.8516;8.3376,-0.2754,0.1726;10.1061,0.5826,-0.9652;10.9244,0.4713,0.2207;10.6612,-0.4497,0.7471;11.9454,0.3926,-0.1615;10.7617,1.6852,1.1244;11.4502,1.6116,1.9743;9.741,1.7398,1.5129;10.9846,2.6085,0.5796)|\",6.119840497745\r\n\"[H]O[C@]12C([H])([H])[C@]3([H])O[C@]3([H])C([H])([H])[C@@]1(F)C([H])([H])[C@@]1([H])O[C@@]1([H])C2([H])[H] |(-0.9268,-2.3386,1.8691;-1.0118,-1.372,1.8527;-0.2757,-0.8947,0.7212;1.164,-1.4416,0.7651;1.5293,-1.3431,1.7918;1.1443,-2.5126,0.5182;2.1256,-0.7212,-0.1681;2.7876,-1.3502,-0.7658;2.7723,0.4379,0.3749;1.8057,0.6341,-0.6623;2.2373,0.9555,-1.6134;0.5085,1.3185,-0.2916;-0.1286,1.3688,-1.1799;0.7121,2.3507,0.0205;-0.2773,0.6452,0.8445;0.3644,0.975,2.0499;-1.7119,1.1815,0.9676;-2.0602,0.9381,1.976;-1.7036,2.2752,0.8763;-2.6862,0.5844,-0.0232;-3.7366,0.8089,0.1707;-2.3377,0.6342,-1.419;-2.3579,-0.6477,-0.7589;-3.1771,-1.2868,-1.0906;-1.0113,-1.3293,-0.5694;-0.3917,-1.1514,-1.4567;-1.1608,-2.4155,-0.5091)|\",8.579749706265\r\n\"[H]O/C(C1=C([H])C([H])=C([H])C([H])=N1)=C(\\[H])C(=O)/C([H])=C(/OC([H])([H])C([H])([H])[H])C([H])([H])[H] 
|(8.1215,3.956,-0.5497;7.4687,4.2695,-1.1965;7.073,3.2036,-1.9662;5.9868,3.608,-2.8922;5.1329,4.6704,-2.5572;5.2539,5.1832,-1.6104;4.1424,5.0428,-3.4624;3.4653,5.8601,-3.2289;4.0362,4.3508,-4.6673;3.2821,4.6091,-5.4048;4.9374,3.3113,-4.9111;4.8934,2.7505,-5.8433;5.8951,2.9454,-4.058;7.6784,2,-1.8644;8.5734,1.9521,-1.2427;7.3981,0.6851,-2.5122;8.3776,-0.0231,-2.7738;5.9999,0.2699,-2.662;5.24,0.9588,-2.3118;5.5536,-0.9046,-3.1744;4.1966,-1.0486,-3.1942;3.5931,-2.2911,-3.5694;3.9085,-2.5809,-4.5795;3.9037,-3.0854,-2.8776;2.0881,-2.0909,-3.5147;1.5751,-3.0168,-3.7965;1.7799,-1.2982,-4.2036;1.7723,-1.8109,-2.5049;6.4074,-2.0078,-3.7231;7.4545,-1.7252,-3.6226;6.1779,-2.1896,-4.781;6.2349,-2.949,-3.186)|\",4.26130289883\r\n\"[H]C(C1=C([H])C([H])=C([H])C([H])=C1[H])N(O)C([H])([H])C(=O)OC(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] |(2.1928,-0.674,-2.5851;2.7279,0.1409,-2.1082;2.3095,1.5189,-2.3623;1.8507,1.8358,-3.6569;1.8517,1.0649,-4.4234;1.4155,3.121,-3.9663;1.0755,3.3462,-4.9736;1.4183,4.1181,-2.9872;1.0783,5.1215,-3.2282;1.8408,3.8104,-1.6929;1.814,4.5708,-0.9168;2.2716,2.5231,-1.3742;2.5296,2.2751,-0.3493;3.7227,-0.2892,-1.3598;3.9612,-1.5253,-1.1475;4.7249,0.6075,-0.7368;5.6932,0.1862,-1.0029;4.6312,1.6249,-1.1221;4.5385,0.6171,0.7801;3.4673,0.8269,1.311;5.7089,0.4112,1.388;5.8275,0.3779,2.8671;4.9774,-0.7688,3.4212;5.1696,-0.8761,4.4947;5.2426,-1.7108,2.9303;3.9125,-0.5826,3.2733;5.4347,1.7384,3.45;5.6496,1.748,4.5243;4.3723,1.9418,3.3055;6.0182,2.5389,2.982;7.3188,0.1029,3.0693;7.5475,0.0521,4.1389;7.9247,0.898,2.6227;7.6021,-0.8482,2.6077)|\",4.264024037335\r\n\"[H]O[C@]12C([H])([H])[C@]([H])(O[H])[C@@]([H])(O[H])C([H])([H])[C@]1(F)C([H])([H])[C@]([H])(O[H])[C@]([H])(O[H])C2([H])[H] 
|(-0.0967,-1.5381,-2.097;0.0254,-0.6353,-1.7548;0.0137,-0.7407,-0.2922;-1.2621,-1.4738,0.1542;-1.2683,-2.4874,-0.2698;-1.2385,-1.5867,1.2434;-2.5538,-0.7407,-0.2483;-3.409,-1.2569,0.1973;-2.778,-0.7727,-1.6573;-1.9771,-0.4152,-2.076;-2.5587,0.7082,0.2874;-3.4355,1.2073,-0.1378;-2.7389,0.7243,1.701;-1.8609,0.6444,2.1058;-1.2911,1.4678,-0.1449;-1.2768,2.4556,0.3301;-1.3094,1.6343,-1.2277;0.0032,0.7329,0.206;0.0386,0.61,1.6401;1.2706,1.4869,-0.1843;1.2885,1.6089,-1.2733;1.2449,2.4922,0.2494;2.5556,0.7659,0.27;2.5974,0.7747,1.3644;3.7095,1.4447,-0.1905;3.8505,1.1053,-1.0941;2.5795,-0.6941,-0.22;3.447,-1.1914,0.2263;2.8275,-0.7118,-1.6324;1.9733,-0.5186,-2.0605;1.2992,-1.4494,0.167;1.267,-1.5593,1.2564;1.3272,-2.4619,-0.2576)|\",7.586534151940001\r\n\"[H]C1=C([H])C([H])=C(/C([H])=C2\\N=C(C3([H])C([H])([H])C3([H])[H])N([H])C2=O)O1 |(-0.8488,1.8851,0.1522;-0.72,0.8185,0.26;-1.5435,-0.1898,0.6788;-2.5682,-0.0866,1.0076;-0.7801,-1.3863,0.5931;-1.0784,-2.3947,0.8361;0.4678,-1.0296,0.1245;1.668,-1.7382,-0.1764;2.5068,-1.15,-0.5396;1.8681,-3.0771,-0.0543;0.9512,-4.037,0.3866;1.5781,-5.1813,0.3412;0.9966,-6.48,0.7159;1.668,-7.3297,0.6277;-0.4775,-6.7309,0.4242;-1.0067,-5.9195,-0.0652;-0.7526,-7.7305,0.0998;-0.0351,-6.5239,1.8349;0.0008,-7.3767,2.5068;-0.263,-5.5714,2.3025;2.8862,-5.0774,-0.1059;3.5514,-5.8279,-0.2229;3.1644,-3.7297,-0.3917;4.2177,-3.2792,-0.8065;0.4986,0.3353,-0.0793)|\",3.4041442697550006\r\n\"[H]O/N=C(\\[H])C1=C(OC([H])([H])/C([H])=C(\\[H])C([H])([H])[H])C([H])=C([H])C(Cl)=C1[H] 
|(6.9168,6.6385,3.1593;6.3463,5.8551,3.1155;7.2756,4.8107,3.0228;6.681,3.676,2.9568;5.594,3.6145,2.977;7.4503,2.4322,2.8548;6.7703,1.1925,2.7794;5.4094,1.2528,2.811;4.6498,0.0424,2.6747;4.9491,-0.46,1.7424;4.8658,-0.6345,3.5124;3.1992,0.4095,2.642;2.8986,1.0832,1.8397;2.2952,-0.0501,3.5099;2.6265,-0.7156,4.3095;0.8296,0.27,3.4836;0.2274,-0.6415,3.3723;0.5787,0.9475,2.6611;0.5147,0.7412,4.4241;7.5,0.0012,2.6851;6.9938,-0.955,2.6319;8.8945,0.0265,2.6607;9.4559,-0.8986,2.5866;9.5597,1.2484,2.7328;11.3199,1.2818,2.7014;8.852,2.4396,2.83;9.3709,3.3892,2.8876)|\",4.590560657935001\r\n\"[H]N([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[C@@]1([H])OC(C([H])([H])[H])(C([H])([H])[H])OC1([H])[H] |(2.6867,2.6639,-7.8327;2.1039,3.226,-7.2125;1.1526,2.8818,-7.3421;2.5176,3.009,-5.8225;3.5359,3.406,-5.7139;1.8783,3.6334,-5.1844;2.4786,1.5565,-5.3191;1.456,1.1652,-5.4329;3.1181,0.936,-5.9662;2.9239,1.4078,-3.8591;2.2838,2.0198,-3.2115;3.9443,1.8038,-3.7476;2.8876,-0.0428,-3.3601;3.5478,-0.6651,-3.9821;1.8726,-0.4484,-3.4663;3.3177,-0.1972,-1.9078;4.3343,0.209,-1.7719;2.4027,0.4967,-1.0537;2.3986,-0.1458,0.227;0.9461,-0.3842,0.6323;0.8994,-0.8782,1.6079;0.4524,-1.0193,-0.1092;0.4056,0.566,0.6905;3.1776,0.6815,1.247;3.2387,0.1488,2.2014;2.6854,1.6453,1.4117;4.1937,0.8612,0.8837;3.0844,-1.3959,0.0562;3.2529,-1.6145,-1.3377;4.1672,-2.196,-1.4892;2.4019,-2.1701,-1.7635)|\",8.41920253447\r\n\"[H]C(/C(C(=O)OC([H])([H])C([H])([H])[H])=C(/[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])=C(/[H])C([H])([H])C([H])([H])OC([H])([H])[H] 
|(8.2339,-2.3562,-0.6931;7.3851,-1.8444,-1.1439;6.2133,-2.6895,-1.4464;6.4583,-4.0539,-2.0224;5.5933,-4.8435,-2.3517;7.7835,-4.3149,-2.1405;8.1423,-5.6108,-2.6721;7.4346,-5.875,-3.4615;9.1331,-5.4585,-3.1079;8.1706,-6.6715,-1.5814;8.5187,-7.624,-1.9976;7.1701,-6.8227,-1.1669;8.8497,-6.3811,-0.7728;4.9185,-2.378,-1.2166;4.1937,-3.1344,-1.5149;4.3569,-1.1447,-0.5791;3.661,-1.4596,0.2136;5.1495,-0.5588,-0.1021;3.5756,-0.2537,-1.5698;4.255,0.088,-2.3634;2.803,-0.8544,-2.0703;2.9225,0.9582,-0.8934;2.2436,0.6079,-0.1031;3.6966,1.5526,-0.3878;2.1513,1.8488,-1.8727;1.6949,2.7032,-1.3601;1.3484,1.2902,-2.3691;2.8114,2.2435,-2.6548;7.4996,-0.5357,-1.4122;6.6872,-0.0244,-1.9299;8.7063,0.2991,-1.0853;9.4433,-0.2909,-0.5286;8.4211,1.1445,-0.4427;9.3723,0.8776,-2.3357;8.6374,1.446,-2.9338;9.751,0.0633,-2.9766;10.4308,1.722,-1.925;11.131,2.2876,-3.0111;11.921,2.9178,-2.5938;10.4748,2.9092,-3.6424;11.5902,1.5148,-3.6491)|\",5.186489990529999\r\n\"[H]C1=C([H])C([H])=C([C@]([H])(OC(=O)C([H])(C([H])([H])[H])C([H])([H])[H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C([H])=C1[H] 
|(4.0208,7.8413,-0.1089;4.2436,6.8148,-0.3879;4.1474,6.4121,-1.7204;3.8433,7.1218,-2.4853;4.4279,5.0915,-2.0737;4.3321,4.7802,-3.111;4.8201,4.1561,-1.1079;5.132,2.7291,-1.5257;4.6095,2.5239,-2.4641;4.598,1.8155,-0.5256;3.3535,1.3146,-0.738;2.6933,1.5337,-1.7303;2.891,0.4803,0.4487;3.7857,0.1052,0.9587;2.1252,1.3981,1.4244;1.8118,0.8306,2.3076;1.2295,1.807,0.944;2.7469,2.2355,1.758;2.0249,-0.6931,-0.0233;1.6682,-1.2717,0.8359;2.5864,-1.3672,-0.6792;1.159,-0.3272,-0.5831;6.6366,2.3718,-1.7269;7.4502,2.5362,-0.4302;8.4877,2.2235,-0.5982;7.0402,1.9195,0.3764;7.4681,3.5775,-0.0924;6.7257,0.9065,-2.2013;7.7666,0.6427,-2.4216;6.1399,0.7476,-3.1153;6.357,0.2155,-1.4374;7.2214,3.2854,-2.8215;8.2648,3.0121,-3.0168;7.1996,4.3396,-2.5279;6.671,3.1845,-3.7654;4.9026,4.5687,0.2297;5.1755,3.8497,0.9954;4.6169,5.887,0.5864;4.6837,6.1886,1.6285)|\",6.500799888445\r\n\"[H]C1=C([H])C([H])=C([C@@]([H])(OC(=O)C([H])(C([H])([H])[H])C([H])([H])[H])C([H])([H])[H])C(C([H])([H])[H])=C1[H] |(8.722,0.0451,-8.1412;8.1132,-0.251,-7.291;7.027,0.5293,-6.895;6.7804,1.4422,-7.4303;6.2587,0.1291,-5.8035;5.4193,0.7402,-5.4827;6.5589,-1.0421,-5.0951;5.6708,-1.4424,-3.9294;6.1226,-2.2655,-3.3765;5.5337,-0.3274,-2.995;6.512,-0.1919,-2.0671;7.4576,-0.9471,-1.9695;6.2453,1.0028,-1.16;6.0001,1.8461,-1.8193;5.0138,0.7265,-0.2731;4.7964,1.6013,0.3493;5.1977,-0.1233,0.3952;4.13,0.5045,-0.8778;7.4878,1.34,-0.3317;7.2957,2.2145,0.2995;8.3474,1.5587,-0.9728;7.7636,0.4996,0.313;4.2526,-1.8162,-4.3515;4.2816,-2.678,-5.0269;3.7657,-0.9887,-4.8771;3.6474,-2.0782,-3.4773;7.6627,-1.828,-5.4831;8.0612,-3.082,-4.7351;8.9695,-3.5127,-5.1669;7.2833,-3.8554,-4.7732;8.2568,-2.8618,-3.6795;8.4212,-1.4135,-6.5864;9.2751,-2.0151,-6.8889)|\",6.3130413316\r\n\"[H]/C(=C(/[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C#N)C([H])([H])Cl 
|(2.1151,-2.2191,-0.5161;2.8288,-1.5297,-0.0662;2.42,-0.3391,0.3844;3.1565,0.3409,0.8186;1.0038,0.1653,0.3469;0.348,-0.5989,-0.091;0.6538,0.3266,1.377;0.8509,1.4942,-0.423;1.5001,2.2542,0.0321;-0.18,1.8538,-0.3014;1.1774,1.3724,-1.9158;0.5092,0.6381,-2.384;2.198,0.9963,-2.0487;1.0212,2.6999,-2.6895;-0.0098,3.0686,-2.6107;1.2202,2.5413,-3.7563;1.9216,3.7531,-2.2064;2.6396,4.5736,-1.8073;4.2403,-2.0004,0.0111;4.3295,-2.9795,0.4867;4.8856,-1.2866,0.5255;4.964,-2.2297,-1.6653)|\",6.702164137815\r\n\"[H]C1=C2C(/C([H])=C(/N(=O)=O)C([H])([H])[H])=C([H])N([H])C2=C([H])C(F)=C1F |(0.8837,-0.0038,2.9588;0.6188,-1.0433,3.1213;1.2773,-2.1028,2.4742;2.3686,-2.1655,1.518;3.0268,-1.0102,0.9607;2.5173,-0.0608,1.0971;4.1955,-0.9457,0.2857;4.5769,0.3791,-0.229;3.8663,1.356,0.0268;5.6057,0.4387,-0.9103;5.1633,-2.0488,-0.0039;5.1105,-2.8045,0.786;6.181,-1.6595,-0.0552;4.9573,-2.5391,-0.9649;2.5673,-3.5152,1.2594;3.2451,-3.9962,0.5711;1.6842,-4.2646,1.9906;1.6148,-5.2704,1.9555;0.8788,-3.4335,2.7496;-0.1517,-3.7477,3.642;-0.4642,-4.7629,3.862;-0.7828,-2.6815,4.2551;-1.7849,-2.9,5.1238;-0.4033,-1.3496,3.9978;-1.0684,-0.3723,4.64)|\",3.812315045505\r\n\"[H]C1=C(Cl)C([H])=C2C(=C1[H])C(/C([H])=C(/N(=O)=O)C([H])([H])[H])=C(/[H])N2[H] |(3.9611,6.9944,-0.7042;3.3604,6.2746,-1.2487;2.4159,6.7352,-2.1856;2.2481,8.4687,-2.4352;1.6138,5.8724,-2.9201;0.895,6.2489,-3.6398;1.7882,4.5075,-2.678;2.7259,4.0028,-1.7443;3.5199,4.9126,-1.0297;4.2561,4.5651,-0.3106;2.6298,2.5535,-1.7722;3.4035,1.6725,-0.9334;3.909,2.1394,-0.0932;3.5883,0.3391,-1.0495;4.3849,-0.3015,0.0091;4.8864,0.3963,0.8959;4.504,-1.53,-0.0459;3.0886,-0.5775,-2.121;3.0061,-0.0328,-3.0667;2.1027,-0.9969,-1.8787;3.7706,-1.4185,-2.2522;1.6442,2.2583,-2.7041;1.2267,1.3064,-2.9943;1.1478,3.4174,-3.2407;0.4038,3.465,-3.9206)|\",3.76061341391\r\n\"[H]C1=C([H])C2=C(C([H])=C1F)C(/C([H])=C(/N(=O)=O)C([H])([H])[H])=C(/[H])N2[H] 
|(3.23,7.4967,-1.6959;3.0612,6.4279,-1.6209;2.1023,5.7915,-2.3985;1.4955,6.3536,-3.1026;1.9523,4.4118,-2.2392;2.7304,3.6586,-1.3253;3.6922,4.3162,-0.5417;4.313,3.7972,0.1808;3.8316,5.6816,-0.7151;4.7512,6.3431,0.0217;2.298,2.2755,-1.4241;2.8723,1.1883,-0.6731;3.8112,1.4014,-0.1708;2.3992,-0.0672,-0.508;3.2452,-0.9871,0.2672;2.8554,-2.1553,0.3718;4.2944,-0.5695,0.7675;1.1158,-0.6441,-1.0162;0.3437,0.1318,-1.0409;0.7817,-1.4585,-0.3727;1.2203,-1.0545,-2.03;1.2971,2.2604,-2.3888;0.7431,1.4297,-2.7981;1.0934,3.5236,-2.8691;0.4327,3.7654,-3.5921)|\",3.795988214475\r\n\"[H]OC(=O)[C@@]([H])(N(O)C([H])C1=C([H])C([H])=C([H])C([H])=C1[H])C([H])([H])[H] |(3.9418,-1.3636,0.5933;3.4711,-1.9736,-0.0453;2.4534,-1.2992,-0.5806;1.6279,-1.795,-1.3131;2.416,0.2095,-0.2504;1.4118,0.5679,-0.4569;2.7108,0.4055,1.1993;3.9281,0.0901,1.5344;1.8549,0.874,2.0736;2.3105,1.0402,3.0453;0.4352,1.1835,1.8992;-0.4577,0.4229,1.1168;-0.1153,-0.4594,0.5851;-1.8055,0.7707,1.0494;-2.479,0.1712,0.4439;-2.2928,1.8653,1.7659;-3.3452,2.1287,1.7111;-1.4241,2.6086,2.5686;-1.7962,3.4546,3.1394;-0.0775,2.2656,2.6421;0.5953,2.8452,3.2691;3.4355,0.9886,-1.0921;3.2157,0.8223,-2.1507;4.4531,0.66,-0.8773;3.358,2.06,-0.8822)|\",4.34837933099\r\n\"[H]ON([H])[C@@]([H])(C(=O)OC(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C([H])([H])C1=C([H])C([H])=C([H])C([H])=C1[H] 
|(5.5192,0.0142,-2.987;4.8904,-0.1212,-2.261;5.6988,-0.8341,-1.2908;5.1189,-1.6522,-1.0842;5.723,-0.0269,-0.0661;6.1439,0.9494,-0.3164;4.307,0.1486,0.5008;3.5661,-0.8029,0.6678;4.0356,1.4282,0.7731;2.7256,1.8562,1.3138;2.4841,1.2081,2.6802;1.5792,1.6337,3.1291;3.3264,1.407,3.3509;2.3536,0.1287,2.5874;1.6227,1.5247,0.3041;0.6733,1.9516,0.6471;1.5019,0.4462,0.1906;1.8593,1.9576,-0.6738;2.9056,3.3697,1.4517;3.1248,3.8223,0.4793;3.7301,3.5995,2.134;1.9901,3.8225,1.8468;6.6235,-0.7571,0.9582;6.1706,-1.7323,1.1734;7.5824,-0.9469,0.4629;6.8318,0.0078,2.2484;7.7428,1.0717,2.3093;8.3171,1.3353,1.4235;7.9278,1.7897,3.4905;8.6421,2.6085,3.5184;7.2026,1.4535,4.6358;7.3486,2.0093,5.5582;6.2952,0.3942,4.5897;5.7322,0.1189,5.4778;6.113,-0.3218,3.4045;5.4051,-1.1466,3.3744)|\",6.465425087880001\r\n\"[H]C([H])=C([H])[C@@]1(C([H])([H])[H])OC(C([H])([H])[H])(C([H])([H])[H])O[C@]1([H])C([H])([H])C(=O)C([H])([H])[H] |(0.9808,-1.0307,1.3981;1.3206,-0.8293,0.3861;0.7447,-1.2519,-0.4307;2.4078,-0.096,0.1549;2.9819,0.3071,0.9899;2.9613,0.2388,-1.2097;4.409,-0.2616,-1.3411;5.0415,0.1055,-0.5248;4.8452,0.0592,-2.2925;4.4103,-1.3549,-1.3099;2.1657,-0.3773,-2.2351;1.9083,0.5564,-3.2963;2.4384,0.0016,-4.6131;2.2723,0.7188,-5.4233;1.9246,-0.932,-4.8613;3.51,-0.197,-4.5295;0.4083,0.8559,-3.3425;0.1855,1.5637,-4.1473;0.0729,1.2843,-2.3937;-0.154,-0.0665,-3.5187;2.6538,1.7297,-2.9711;2.7859,1.7493,-1.55;1.8553,2.1067,-1.0866;3.9091,2.7123,-1.169;4.0402,2.6865,-0.0814;4.8436,2.4084,-1.6498;3.5186,4.1431,-1.5579;2.6992,4.7537,-0.8987;4.1629,4.7169,-2.8009;5.2534,4.7594,-2.6827;3.7743,5.7176,-3;3.9566,4.0533,-3.6495)|\",6.223243760935\r\n\"[H]C(=O)C([H])([H])[C@@]1([H])OC(C([H])([H])[H])(C([H])([H])[H])O[C@@]1(C([H])=C([H])[H])C([H])([H])[H] 
|(3.4198,4.9509,1.7347;2.5661,4.4516,2.2376;1.6663,5.097,2.7308;2.6222,2.9366,2.2467;1.7186,2.5678,2.7422;3.4955,2.6292,2.8374;2.744,2.368,0.8184;1.8712,2.6956,0.2418;3.8957,2.8862,0.159;4.937,1.9003,0.1366;6.0613,2.2718,1.1052;6.495,3.2376,0.8283;6.8498,1.5129,1.0764;5.6826,2.334,2.1289;5.4417,1.7821,-1.3008;5.9049,2.7237,-1.6131;4.6111,1.5599,-1.9732;6.1884,0.985,-1.3743;4.3434,0.6801,0.5914;2.9221,0.8209,0.6984;2.199,0.291,-0.5313;1.1431,0.5588,-0.5898;2.7338,-0.502,-1.4573;2.1443,-0.88,-2.2878;3.7767,-0.7999,-1.4098;2.4712,0.0069,1.9187;1.3862,0.0709,2.0564;2.9646,0.3497,2.832;2.7302,-1.0434,1.7589)|\",6.043648619605\r\n\"[H]C(C#N)N(O)[C@]([H])(C(=O)OC(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C([H])([H])[H] |(4.2126,2.1195,-1.1465;3.8585,1.5236,-0.3152;4.7619,0.8595,0.541;5.5316,0.3267,1.2376;2.5431,1.4478,-0.1744;1.7415,2.0308,-0.9436;1.9289,0.62,0.9198;2.7092,0.424,1.6608;1.5401,-0.7381,0.2994;1.947,-1.1231,-0.7722;0.7529,-1.4004,1.1525;0.2752,-2.7799,0.8655;-0.5948,-2.7687,-0.3938;-1.0402,-3.7598,-0.5338;-0.009,-2.5217,-1.2806;-1.4085,-2.0429,-0.2907;1.4757,-3.7223,0.7448;1.1183,-4.7547,0.6632;2.109,-3.6531,1.6357;2.0768,-3.4919,-0.1363;-0.5565,-3.1038,2.1073;-0.9761,-4.1114,2.0213;-1.3826,-2.394,2.2179;0.0606,-3.0621,3.0105;0.7732,1.3869,1.5592;0.3479,0.782,2.3616;0.0013,1.6052,0.8197;1.1336,2.3307,1.9791)|\",4.617772042985001\r\n\"[H]N(C([H])([H])C#N)[C@]([H])(C(=O)OC(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C([H])([H])[H] 
|(3.4507,1.0212,0.9172;3.3888,0.0108,0.7941;4.7174,-0.4923,0.4369;4.6459,-1.5747,0.2702;5.1375,-0.047,-0.48;5.6734,-0.2651,1.5313;6.4383,-0.0698,2.3814;2.3727,-0.268,-0.2144;2.3952,-1.3417,-0.4368;2.6229,0.4909,-1.5315;3.3297,1.4784,-1.6002;1.9503,-0.0742,-2.5445;2.015,0.4578,-3.9274;3.4544,0.3692,-4.4423;3.4808,0.6561,-5.4994;3.8299,-0.6568,-4.3607;4.1159,1.0354,-3.8857;1.0981,-0.4972,-4.6939;1.068,-0.2184,-5.7524;0.0785,-0.4581,-4.2974;1.46,-1.5274,-4.617;1.4707,1.8886,-3.9663;1.4156,2.2254,-5.0076;2.1139,2.5736,-3.4122;0.46,1.928,-3.5455;0.9821,0.0902,0.3339;0.2028,-0.1596,-0.3916;0.9179,1.1627,0.5528;0.8043,-0.4606,1.2611)|\",6.6912795837950005\r\n\"[H]C([H])([H])C1(C([H])([H])[H])[C@]2([H])C(S=O)=C(S=O)[C@@]1(C([H])([H])[H])C([H])([H])C2([H])[H] |(0.4983,-0.9455,0.4097;0.8706,-0.0279,-0.0614;0.4409,0.0268,-1.0636;0.4732,0.8148,0.5152;2.4085,-0.0161,-0.0351;2.8466,-0.0475,1.4391;2.464,-0.9507,1.9295;2.4375,0.814,1.9784;3.9323,-0.0335,1.5661;3.0577,-1.1383,-0.9091;2.8529,-2.1705,-0.6115;4.5158,-0.7297,-0.8448;5.7977,-1.782,-0.8216;7.099,-0.995,-0.8074;4.5414,0.704,-0.8509;5.9479,1.6179,-0.91;5.6392,3.0924,-1.0009;3.0699,1.1516,-0.8817;2.7439,2.5765,-0.4514;3.1833,3.3099,-1.1291;3.1262,2.7947,0.5506;1.6565,2.7152,-0.4418;2.6021,0.7875,-2.3339;1.6073,1.2057,-2.5201;3.2747,1.2161,-3.0818;2.5888,-0.77,-2.3512;3.2574,-1.192,-3.1065;1.5886,-1.1705,-2.5419)|\",2.8980125078250003\r\n\"[H]C(NN)C(=O)C(=O)OC([H])([H])C1=C([H])C([H])=C([H])C([H])=C1[H] 
|(3.6758,-5.5962,1.1859;3.3004,-4.6276,1.491;2.7433,-4.5932,2.6804;2.2769,-4.6355,3.7138;3.4375,-3.5014,0.5866;3.964,-3.6156,-0.5078;2.8849,-2.1479,1.0799;2.3701,-2.0089,2.1728;3.0623,-1.1954,0.1695;2.5732,0.1398,0.5112;2.718,0.2958,1.5818;3.2337,0.8012,-0.0522;1.1324,0.3255,0.1105;0.8089,0.6699,-1.2083;1.6043,0.8048,-1.9377;-0.5214,0.8385,-1.5903;-0.7606,1.1083,-2.6154;-1.5431,0.6664,-0.6537;-2.5802,0.802,-0.9489;-1.2296,0.3223,0.6625;-2.0217,0.1876,1.3941;0.1018,0.1508,1.0432;0.3484,-0.1287,2.0633)|\",4.710290752155\r\n\"[H]OC1=NSN([H])[C@@]1([H])N1C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C1([H])[H] |(1.6114,-1.5961,-2.2886;2.5243,-1.9164,-2.4366;3.0035,-2.1813,-1.2052;4.2475,-2.2814,-0.9512;4.4769,-2.5208,0.7315;2.7436,-2.665,1.1412;2.591,-3.6451,1.3693;1.9708,-2.3448,-0.0816;1.3236,-3.1978,-0.3242;1.1065,-1.1765,-0.0597;-0.1893,-1.378,0.6136;-0.3431,-2.4564,0.7326;-0.1825,-0.9597,1.6314;-1.3568,-0.8145,-0.2109;-1.2731,-1.2327,-1.2234;-2.2967,-1.199,0.2095;-1.4457,0.7166,-0.2899;-2.2399,0.9803,-1.0008;-1.7717,1.1065,0.6856;-0.1481,1.4443,-0.6808;-0.3838,2.5075,-0.8162;0.2172,1.0846,-1.6524;0.9548,1.3169,0.3861;0.4888,1.3853,1.3781;1.6369,2.1736,0.3227;1.8316,0.0575,0.3055;2.3553,-0.0755,1.2642;2.6018,0.2094,-0.46)|\",5.175605436510001\r\n\"[H]OC([H])([H])[C@]1([H])C([H])([H])C([H])([H])N([H])[C@@]1([H])C([H])([H])C([H])(C([H])([H])[H])C([H])([H])[H] 
|(6.5811,2.6348,-1.299;7.2137,1.9164,-1.1401;6.5275,0.688,-1.359;6.1792,0.6118,-2.4022;7.278,-0.0955,-1.2125;5.3386,0.4752,-0.4078;4.5977,1.2628,-0.6077;5.7667,0.4694,1.0711;6.552,1.204,1.2723;4.9086,0.7063,1.7115;6.2207,-0.9835,1.2934;6.0419,-1.3253,2.3201;7.301,-1.0802,1.1137;5.4737,-1.7998,0.3127;4.8727,-2.4759,0.7704;4.7054,-0.9271,-0.6021;4.8713,-1.2589,-1.6383;3.1842,-0.9156,-0.3452;2.9954,-0.4925,0.6525;2.7209,-0.2244,-1.0662;2.4571,-2.2726,-0.4525;2.9118,-2.9718,0.2671;0.9817,-2.1127,-0.0537;0.4545,-3.0733,-0.0913;0.4653,-1.4238,-0.7349;0.8814,-1.7125,0.9624;2.5766,-2.8978,-1.8505;2.0289,-3.8462,-1.9006;3.6175,-3.1022,-2.1222;2.1546,-2.2306,-2.6138)|\",7.425986980145\r\n\"[H]OC(=O)[C@@]([H])(N(O)C([H])C1=C([H])C([H])=C([H])C([H])=C1[H])C([H])([H])C([H])(C([H])([H])[H])C([H])([H])[H] |(5.5502,1.2642,-2.4363;6.1039,0.5639,-1.9822;5.8037,0.5895,-0.6848;6.4189,-0.0288,0.1573;4.5789,1.4482,-0.3094;4.6152,1.6207,0.7614;4.7023,2.7661,-1.0044;4.5898,2.6934,-2.2991;4.8772,3.9119,-0.3932;4.8196,4.7541,-1.0765;5.1279,4.1734,1.0247;4.6008,5.3675,1.556;4.0025,6.0136,0.9186;4.8247,5.7192,2.8836;4.3992,6.6385,3.2761;5.5963,4.8945,3.7056;5.7758,5.1694,4.741;6.1513,3.7242,3.1843;6.7749,3.091,3.8088;5.9296,3.3648,1.8563;6.4092,2.4727,1.4657;3.2163,0.8244,-0.6942;2.4598,1.5746,-0.4283;3.1757,0.7105,-1.7806;2.8462,-0.513,-0.0066;1.7685,-0.6222,-0.1969;3.5325,-1.7402,-0.6308;3.0951,-2.6607,-0.2267;4.6044,-1.7666,-0.4153;3.4013,-1.7594,-1.7192;3.0398,-0.4801,1.5177;2.6068,-1.3755,1.9778;2.5524,0.3933,1.9695;4.103,-0.4571,1.7833)|\",4.334773638465\r\n\"[H]OC1=NSN([H])[C@@]1([H])N1C([H])([H])C([H])([H])C([H])([H])C1([H])[H] 
|(0.5609,3.4145,-0.5754;1.0033,3.6525,-1.4157;0.9074,2.5487,-2.1803;1.6786,2.3183,-3.1684;1.3205,0.7799,-3.837;-0.1106,0.457,-2.8151;-0.9176,0.5598,-3.4275;-0.1698,1.5282,-1.7909;-1.1557,2.0081,-1.8468;0.0225,1.1869,-0.3897;-1.0874,0.4449,0.2375;-1.0913,0.6861,1.3085;-2.0473,0.7693,-0.1768;-0.787,-1.0562,0.0259;-1.1744,-1.6658,0.8487;-1.2501,-1.4076,-0.9001;0.7596,-1.1198,-0.0941;1.0513,-1.6161,-1.0225;1.2167,-1.6675,0.7364;1.2196,0.3634,-0.0975;2.0128,0.5698,-0.8205;1.5913,0.651,0.8933)|\",5.20825909857\r\n\"[H]/C(=C(/C([H])([H])[H])C1([H])SC([H])([H])C([H])([H])C([H])([H])S1)C([H])([H])C([H])([H])[H] |(2.5338,1.0513,0.8637;3.2278,0.3949,0.3422;4.5389,0.6512,0.4731;5.6365,-0.2057,-0.1191;5.246,-1.0927,-0.6214;6.3234,-0.5459,0.6687;6.24,0.3589,-0.8376;5.0667,1.8201,1.302;5.7821,1.4247,2.0335;3.8745,2.7343,2.3716;3.0091,3.8362,1.1759;2.3586,3.2382,0.5289;2.3645,4.4512,1.8123;3.9246,4.7225,0.3289;4.5204,5.3752,0.9764;3.2867,5.3673,-0.2935;4.8518,3.9249,-0.5908;5.4145,4.5939,-1.2494;4.2784,3.2395,-1.2264;6.1385,2.9787,0.3112;2.5571,-0.711,-0.4284;3.2844,-1.411,-0.8511;1.9354,-1.2954,0.2661;1.6563,-0.1708,-1.5541;1.1314,-0.987,-2.0636;2.2457,0.3735,-2.3005;0.8998,0.5185,-1.1598)|\",5.953851048940001\r\n\"[H]C1=C([H])C([H])=C(C([H])([H])OC([H])([H])[C@@]2([H])C([H])([H])C([H])([H])N3C([H])([H])C([H])([H])C([H])([H])[C@@]32[H])C([H])=C1[H] 
|(-2.0841,-3.1038,0.0838;-1.0786,-2.6948,0.0316;-0.6863,-1.9367,-1.0716;-1.3842,-1.7548,-1.8848;0.6086,-1.4159,-1.1369;0.9122,-0.8301,-2.0023;1.5212,-1.6382,-0.101;2.9162,-1.0396,-0.1393;2.9984,-0.2326,0.5986;3.1074,-0.5991,-1.1319;3.9453,-1.9547,0.2071;4.2199,-2.918,-0.8014;3.3182,-3.5094,-1.0291;4.5172,-2.4045,-1.7329;5.3461,-3.8278,-0.3196;6.1755,-3.1788,-0.0146;4.9248,-4.7536,0.8476;5.7273,-4.8168,1.5923;4.0304,-4.3899,1.3638;4.7234,-6.1239,0.1674;3.7116,-6.1883,-0.258;4.8396,-6.9762,0.8444;5.7003,-6.187,-0.9218;7.056,-6.6699,-0.5676;7.0919,-7.0307,0.4682;7.3226,-7.5201,-1.2119;8.0189,-5.4912,-0.8069;8.1326,-4.8993,0.1101;9.0186,-5.8117,-1.1184;7.2829,-4.6676,-1.8759;7.4337,-5.1118,-2.8674;7.6085,-3.6232,-1.926;5.8115,-4.8087,-1.4564;5.1451,-4.6975,-2.3237;1.118,-2.4029,1.0025;1.8269,-2.5858,1.8058;-0.1707,-2.9295,1.0682;-0.4701,-3.5208,1.9298)|\",5.249076176145\r\n\"[H]OC1=C([H])C(O[H])=C([C@@]2([H])N([H])C([H])([H])[C@@]([H])(O[H])[C@@]2([H])O[H])C([H])=C1[H] |(-2.4653,-2.286,0.4182;-1.5543,-2.3286,0.7479;-0.9453,-1.1274,0.5114;0.379,-0.9939,0.9396;0.861,-1.8315,1.4374;1.0547,0.2051,0.726;2.3512,0.3818,1.1356;2.6629,-0.4345,1.5565;0.4368,1.2943,0.0814;1.2078,2.5776,-0.1259;2.2392,2.3226,-0.4295;0.5582,3.5061,-1.0679;0.5066,3.121,-2.0081;1.3496,4.7523,-1.0091;2.2052,4.7354,-1.7032;0.7321,5.6231,-1.2486;1.8538,4.8285,0.4666;2.9494,4.8376,0.4821;1.4136,5.9637,1.177;0.6117,5.6589,1.6457;1.3177,3.5105,1.098;1.9517,3.1161,1.893;0.036,3.7622,1.6736;-0.5841,3.7676,0.9194;-0.8861,1.1291,-0.3305;-1.3799,1.9573,-0.8291;-1.5835,-0.064,-0.1289;-2.6135,-0.1625,-0.4649)|\",5.83956323173\r\n\"[H]O[C@@]1([H])[C@@]([H])(C2=C([H])C([H])=C([H])O2)N([H])C([H])([H])[C@@]1([H])O[H] 
|(1.8548,-0.0896,-2.1826;2.675,-0.5543,-2.4223;2.9136,-1.5556,-1.4462;3.9925,-1.5542,-1.2442;2.1953,-1.4338,-0.0643;2.7532,-0.7542,0.593;0.7637,-0.983,-0.1275;-0.3881,-1.4405,0.4333;-0.474,-2.3178,1.0566;-1.4295,-0.5353,0.0356;-2.4771,-0.5905,0.2973;-0.8432,0.4144,-0.7367;-1.1991,1.2899,-1.2572;0.5024,0.165,-0.8454;2.2459,-2.7958,0.4846;3.1614,-2.9356,0.9054;2.0881,-3.7217,-0.6674;2.6774,-4.6261,-0.4911;1.0388,-4.016,-0.7685;2.4961,-2.9667,-1.9525;3.3398,-3.4538,-2.4657;1.3849,-2.8665,-2.8245;1.5749,-2.0828,-3.3735)|\",6.20963806841\r\n\"[H]O[C@@]1([H])[C@@]([H])(C2=C([H])C([H])=C([H])N2[H])N([H])C([H])([H])[C@@]1([H])O[H] |(2.0519,-0.914,-3.0282;2.7567,-0.5591,-2.4487;2.9697,-1.5875,-1.5005;4.0483,-1.6499,-1.305;2.2396,-1.4372,-0.1279;2.7938,-0.7571,0.5316;0.8158,-0.9634,-0.2684;-0.3751,-1.5049,0.1864;-0.4572,-2.4105,0.7707;-1.4262,-0.6227,-0.2029;-2.4811,-0.7427,0.0038;-0.8499,0.4302,-0.8803;-1.2887,1.3014,-1.3455;0.5082,0.2196,-0.9074;1.1908,0.7426,-1.4383;2.2783,-2.7968,0.4398;3.1569,-2.9179,0.935;2.2126,-3.7544,-0.6936;2.9606,-4.5438,-0.5645;1.2321,-4.2444,-0.736;2.4408,-2.9552,-1.9974;3.1386,-3.4443,-2.6815;1.2518,-2.7519,-2.7679;0.548,-2.4875,-2.1441)|\",6.3674641017\r\n\"[H]O[C@@]1([H])[C@@]([H])(C2=C([H])N([H])C3=C2C([H])=C([H])C([H])=C3[H])N([H])C([H])([H])[C@@]1([H])O[H] 
|(-3.6741,-1.1241,-1.9936;-3.084,-1.6615,-2.5557;-2.5683,-2.6703,-1.6919;-1.6279,-3.012,-2.1294;-2.4317,-2.1542,-0.2419;-1.8997,-2.9396,0.3311;-1.7197,-0.8504,-0.0589;-2.2965,0.39,0.0622;-3.3424,0.6609,0.0786;-1.3144,1.3531,0.1929;-1.4808,2.341,0.3028;-0.0741,0.7493,0.1557;-0.292,-0.6473,-0.002;0.8222,-1.5026,-0.0653;0.6904,-2.5748,-0.1867;2.0974,-0.9598,0.0289;2.9645,-1.6127,-0.0191;2.288,0.4286,0.184;3.2975,0.8247,0.2517;1.2077,1.3006,0.2498;1.3557,2.371,0.3677;-3.8608,-2.0739,0.1098;-3.9979,-1.866,1.0957;-4.4549,-3.3633,-0.295;-4.3814,-4.1257,0.4972;-5.5105,-3.2466,-0.5586;-3.6292,-3.8008,-1.5465;-3.1362,-4.7591,-1.3474;-4.3886,-3.9564,-2.723;-4.3121,-3.0931,-3.175)|\",5.25179731465\r\n\"[H]OC(=O)/C([H])=C(\\[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[C@]([H])(N([H])[H])C([H])([H])N([H])[H] |(0.7975,-3.0542,-4.3558;-0.1576,-2.937,-4.2264;-0.3893,-2.1089,-3.1663;-1.5273,-1.8395,-2.8568;0.8231,-1.6007,-2.4701;1.8054,-1.9229,-2.817;0.7097,-0.7604,-1.4329;-0.3005,-0.482,-1.1401;1.8563,-0.1956,-0.6492;2.7964,-0.5144,-1.1162;1.8321,0.903,-0.7148;1.8717,-0.6005,0.8505;1.5795,-1.6555,0.941;2.9136,-0.5452,1.1915;1.022,0.2602,1.8006;1.3779,1.3002,1.7567;1.2218,-0.0817,2.826;-0.497,0.2388,1.5704;-0.832,-0.8034,1.4816;-0.7375,0.7283,0.6161;-1.2646,0.9392,2.6994;-0.84,1.9364,2.8718;-1.1338,0.3683,3.628;-2.7689,1.1238,2.4431;-2.8913,1.6945,1.5106;-3.3143,1.9675,3.5212;-4.318,2.0976,3.3928;-3.1929,1.4932,4.4162;-3.529,-0.206,2.2521;-3.087,-0.766,1.4201;-4.5656,0.033,1.9482;-3.4502,-1.0474,3.4537;-4.0742,-0.6799,4.17;-3.7896,-1.9836,3.2436)|\",4.96335663312\r\n\"[H]OC(=O)/C([H])=C(\\[H])C([H])([H])C([H])([H])C([H])([H])[C@@]([H])(O[H])[C@@]([H])(O[H])C([H])([H])C([H])([H])[H] 
|(7.1259,-6.0926,3.9633;7.4972,-6.2971,4.8368;6.9473,-5.4724,5.7739;7.4014,-5.4421,6.8937;5.7821,-4.6442,5.3587;5.5906,-3.8028,6.0199;4.9511,-4.9165,4.3435;5.1232,-5.8,3.7231;3.7487,-4.1016,3.9659;2.8486,-4.727,4.0678;3.633,-3.2661,4.6677;3.8209,-3.5766,2.5184;3.9387,-4.4133,1.8203;4.7177,-2.951,2.407;2.5786,-2.7711,2.1208;2.4766,-1.9054,2.789;1.6757,-3.3816,2.256;2.6365,-2.2879,0.6617;3.5087,-1.6332,0.535;2.8363,-3.378,-0.2353;1.9503,-3.7684,-0.3444;1.3891,-1.5102,0.2042;1.5866,-1.1933,-0.8322;0.3229,-2.4793,0.2001;-0.3761,-2.1599,-0.3886;1.035,-0.28,1.0435;0.7877,-0.5998,2.0623;1.9249,0.3608,1.1146;-0.1324,0.5277,0.464;-0.3484,1.4067,1.0803;-1.0499,-0.0718,0.4235;0.0892,0.8804,-0.5512)|\",6.016437234555\r\n\"[H]OC1=C([H])C(O[H])=C([C@@]2([H])N([H])[C@]([H])(C([H])([H])O[H])[C@@]([H])(O[H])[C@@]2([H])O[H])C([H])=C1[H] |(-3.0955,-1.8193,-0.4344;-2.2607,-1.9256,0.0477;-1.5505,-0.7603,-0.0435;-0.3154,-0.719,0.6107;0.0224,-1.5945,1.1595;0.4562,0.4391,0.5531;1.671,0.5244,1.1848;1.8532,-0.3172,1.6308;0.0248,1.5776,-0.1546;0.8929,2.8158,-0.1861;1.9414,2.4988,-0.3256;0.4521,3.8093,-1.1794;0.5743,3.5027,-2.1423;1.2028,5.0557,-0.9186;0.557,5.9211,-1.1067;2.4436,5.1941,-1.7971;3.1561,4.3822,-1.5731;2.9429,6.1499,-1.5727;2.0159,5.1323,-3.1565;2.8012,5.1629,-3.7231;1.5602,4.992,0.6014;2.6468,4.8946,0.7241;1.1599,6.1193,1.3455;0.2609,5.8937,1.6557;0.8569,3.6954,1.0804;1.3332,3.2349,1.947;-0.481,4.0168,1.4608;-0.9804,4.0704,0.6239;-1.2126,1.5039,-0.7948;-1.5579,2.3714,-1.3482;-2.0041,0.3541,-0.751;-2.9629,0.3265,-1.2644)|\",5.842284370235\r\n\"[H]OC([H])([H])[C@@]1([H])N([H])[C@]([H])(C2=C([H])C([H])=C([H])S2)[C@]([H])(O[H])[C@]1([H])O[H] 
|(2.3584,-4.7755,1.322;2.8479,-5.4892,0.8784;3.4819,-4.8966,-0.2422;4.451,-4.4404,0.0399;3.6968,-5.6977,-0.9573;2.5971,-3.8401,-0.8907;1.6507,-4.3162,-1.1693;2.2895,-2.7729,0.0948;3.1658,-2.5052,0.5516;1.852,-1.6068,-0.7057;2.0209,-0.7027,-0.1127;0.3858,-1.6715,-1.0642;-0.2585,-2.4155,-2.028;0.268,-3.0907,-2.6956;-1.6746,-2.2268,-2.0354;-2.3448,-2.7314,-2.7227;-2.0967,-1.3417,-1.0828;-3.1059,-1.0148,-0.8708;-0.7687,-0.7228,-0.1587;2.8069,-1.6278,-1.9448;3.719,-1.0883,-1.6671;2.3349,-1.0203,-3.1347;1.3609,-1.0384,-3.1018;3.1556,-3.1444,-2.1519;4.2487,-3.2667,-2.2211;2.5405,-3.6582,-3.3208;2.5323,-2.9079,-3.9463)|\",5.907591694354999\r\n\"[H]OC([H])([H])[C@@]1([H])N(C([H])([H])[H])[C@]([H])(C2=C([H])C([H])=C([H])O2)[C@]([H])(O[H])[C@]1([H])O[H] |(0.3504,-1.3999,-0.5454;0.2201,-2.034,0.1741;0.8805,-1.5297,1.3298;0.5965,-0.4834,1.5245;0.503,-2.1283,2.1668;2.4135,-1.6863,1.26;2.6152,-2.7137,0.9315;3.1465,-0.7517,0.3907;2.8024,-0.6565,-1.0119;2.6966,-1.6601,-1.4357;3.602,-0.1371,-1.5516;1.8682,-0.0908,-1.209;3.3538,0.527,1.097;2.5866,1.2748,0.8288;4.6729,1.1349,0.7523;5.0492,2.3603,0.2894;4.3867,3.1819,0.0536;6.4801,2.3425,0.1834;7.124,3.1443,-0.1501;6.8714,1.1048,0.5863;7.8288,0.6145,0.6753;5.7855,0.3528,0.9358;3.1895,0.1452,2.626;2.2453,0.5671,2.9864;4.1962,0.613,3.4823;4.8893,-0.0742,3.4515;3.1132,-1.399,2.5928;2.6029,-1.8156,3.4647;4.4377,-1.929,2.6139;4.8189,-1.7186,1.7395)|\",6.168820990835\r\n\"[H]OC([H])([H])[C@@]1([H])N([H])[C@]([H])(C2=C([H])C([H])=C([H])O2)[C@]([H])(O[H])[C@]1([H])O[H] 
|(2.2437,-5.0597,1.2461;2.5005,-5.7495,0.6116;3.0594,-5.0751,-0.5044;4.1241,-4.8259,-0.3292;3.0234,-5.7674,-1.3516;2.289,-3.8048,-0.8365;1.2431,-4.0718,-1.0182;2.3231,-2.8799,0.3326;3.2788,-2.8584,0.6952;2.0394,-1.5428,-0.2156;2.4349,-0.7787,0.4657;0.5534,-1.3623,-0.3383;-0.5188,-2.0101,0.1932;-0.4742,-2.8815,0.8292;-1.6893,-1.3096,-0.2546;-2.7189,-1.5472,-0.026;-1.2531,-0.2809,-1.0248;-1.7398,0.5096,-1.5748;0.118,-0.2892,-1.086;2.8353,-1.5243,-1.5634;3.8755,-1.286,-1.3091;2.4461,-0.604,-2.567;1.5208,-0.3518,-2.404;2.7673,-3.0039,-2.0681;3.7676,-3.3306,-2.3986;1.8354,-3.1352,-3.1227;1.9281,-2.3125,-3.6397)|\",6.23140717645\r\n\"[H]OC([H])([H])[C@@]1([H])N([H])[C@]([H])(C2=C([H])C([H])=C([H])N2[H])[C@]([H])(O[H])[C@]1([H])O[H] |(2.4682,-4.8621,1.4448;2.6253,-5.5831,0.8127;3.0214,-4.9797,-0.4074;4.1027,-4.7432,-0.407;2.8495,-5.717,-1.1985;2.2251,-3.7109,-0.6901;1.1631,-3.9853,-0.6779;2.453,-2.7128,0.3892;3.4334,-2.7722,0.6725;2.2744,-1.3878,-0.2393;2.8075,-0.638,0.3593;0.8154,-1.0298,-0.3338;-0.3051,-1.5681,0.276;-0.2982,-2.4066,0.9581;-1.4297,-0.7899,-0.1243;-2.4579,-0.9372,0.1774;-0.9667,0.2024,-0.9623;-1.4876,0.992,-1.4848;0.394,0.0541,-1.0768;1.0112,0.5536,-1.7023;2.9624,-1.5714,-1.63;4.0485,-1.5514,-1.4648;2.6535,-0.6174,-2.6269;2.0286,-1.0833,-3.2201;2.5239,-3.0059,-2.0393;3.2971,-3.5122,-2.6252;1.3903,-2.9357,-2.909;0.62,-2.7029,-2.3586)|\",6.370185240205001\r\n\"[H]C1=NC(=O)[C@@]2([H])C(N1)S[C@@]([H])(SC([H])([H])[H])C2([H])[H] |(2.4447,-7.2919,0.1452;2.3484,-6.2547,0.4655;1.6822,-6.0232,1.5503;1.5694,-4.6803,1.9552;0.6647,-4.2799,2.6556;2.701,-3.7654,1.479;3.578,-4.0056,2.105;3.0691,-4.1521,0.0599;2.937,-5.3414,-0.4189;3.648,-2.7938,-0.8776;3.4397,-1.5997,0.5559;4.4184,-1.5125,1.0373;2.8853,0.0591,0.0594;4.4289,0.7359,-0.6489;4.2156,1.7737,-0.917;4.7283,0.1949,-1.5508;5.2407,0.721,0.0843;2.4224,-2.2628,1.5022;2.5022,-1.8386,2.506;1.4017,-2.0812,1.1479)|\",4.2096012672350005\r\n\"[H]C1=C([H])C2=C(N[C@]([H])(N([H])[H])C(=O)N2)C([H])=C1Br 
|(1.5562,4.4471,-0.0369;1.2821,3.4007,-0.1188;0.6801,2.9289,-1.2364;0.4652,3.5685,-2.086;0.3284,1.5261,-1.3581;0.5651,0.6459,-0.1663;0.1672,-0.5815,-0.1182;-0.6127,-1.0621,-1.2441;-1.6608,-0.7925,-0.9975;-0.5295,-2.5099,-1.323;0.4513,-2.777,-1.2506;-0.8297,-2.7866,-2.2576;-0.3552,-0.3167,-2.5673;-0.4148,-0.8778,-3.642;-0.1242,1.0762,-2.4932;1.2662,1.2043,0.983;1.4691,0.5379,1.8132;1.592,2.5152,0.9939;2.4763,3.2891,2.4997)|\",2.58508157975\r\n\"[H]C1=C2/N[C@]([H])(N([H])[H])C(=O)N/C2=C([H])/C(Br)=C\\1[H] |(-0.4155,1.4328,2.1466;-0.4127,1.141,1.1013;0.8444,1.286,0.3779;1.8941,1.6852,1.0134;3.1397,1.7118,0.2694;3.5559,0.6894,0.3834;4.0686,2.6417,0.8871;3.5832,3.525,1.0398;4.8023,2.8388,0.2066;2.9764,1.8612,-1.2567;3.8001,2.4484,-1.9289;1.8762,1.2178,-1.8605;0.8603,0.9559,-1.0855;-0.3339,0.3737,-1.6633;-0.3013,0.1013,-2.7116;-1.4426,0.2207,-0.8985;-3.0295,-0.5307,-1.6436;-1.5042,0.6211,0.4987;-2.4354,0.4848,1.0367)|\",2.5878027182550003\r\n\"[H]C1=NC(=O)C2=C(C([H])=C([H])C(F)=C2F)C1=C([H])[H] |(2.6055,-2.0807,-0.0062;3.1416,-1.127,-0.0063;4.4233,-1.1833,-0.0053;5.1704,0.0284,-0.0075;6.3841,-0.0339,-0.0103;4.432,1.3334,-0.007;3.0149,1.3683,-0.0071;2.3455,2.5994,-0.0077;1.2617,2.6332,-0.0085;3.0533,3.7957,-0.0077;2.5502,4.7569,-0.0075;4.4409,3.7623,-0.0078;5.1453,4.9031,-0.0084;5.1297,2.5506,-0.0079;6.4601,2.6097,-0.0085;2.3049,0.0788,-0.0067;0.9668,-0.0907,-0.0054;0.5408,-1.0898,-0.0057;0.2595,0.7316,-0.004)|\",4.027284987400001\r\n\"[H]C1=C([H])C(Cl)=C(OC(=O)[C@@]([H])(Cl)C([H])([H])[H])C([H])=C1[H] 
|(4.9484,-2.654,6.9801;4.331,-2.2807,6.1686;3.476,-3.1539,5.4979;3.417,-4.2013,5.7734;2.6779,-2.6781,4.4574;1.598,-3.7729,3.6204;2.742,-1.3312,4.084;1.9837,-0.9277,2.9913;1.1676,0.1736,3.0823;1.0473,0.8637,4.0554;0.501,0.3514,1.7176;0.1238,-0.6208,1.3946;-0.9352,1.4352,1.8659;1.4906,0.9042,0.688;2.3324,0.2126,0.5792;1.8688,1.8817,1.0011;0.9979,1.0143,-0.2813;3.6014,-0.4606,4.753;3.6386,0.581,4.4575;4.3925,-0.937,5.7966;5.0573,-0.2545,6.3171)|\",6.05181203512\r\n\"[H]C#CC([H])([H])[C@@](C#N)(C([H])([H])[H])C([H])([H])[C@]([H])(O[H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] |(6.2653,-3.8983,-4.1908;5.9841,-3.5897,-3.2092;5.694,-3.2512,-2.0858;5.3218,-2.852,-0.7288;6.2149,-2.5324,-0.1772;4.8995,-3.7161,-0.2094;4.2818,-1.678,-0.7029;3.9328,-1.4357,0.7108;3.6612,-1.2412,1.8231;4.9305,-0.3925,-1.2693;5.2277,-0.5617,-2.3089;5.8211,-0.1192,-0.6947;4.2302,0.4477,-1.2358;2.9816,-1.9898,-1.5013;2.3606,-1.0902,-1.4593;3.2734,-2.119,-2.5514;2.1579,-3.2049,-1.0384;2.0424,-3.1548,0.0519;2.8737,-4.4263,-1.2633;3.268,-4.3908,-2.1498;0.7158,-3.2787,-1.6377;0.7547,-3.3989,-3.1737;-0.259,-3.5148,-3.5743;1.3304,-4.2768,-3.4905;1.1929,-2.5124,-3.6478;0.0281,-4.5287,-1.0508;-0.9886,-4.6282,-1.449;-0.0457,-4.4596,0.0418;0.5886,-5.4357,-1.2912;-0.1014,-2.0358,-1.2316;-1.1535,-2.1714,-1.508;0.2453,-1.121,-1.7252;-0.0643,-1.8694,-0.1477)|\",7.78789840131\r\n\"[H]O[C@]1([H])C([H])([H])C([H])([H])[C@]2(C([H])([H])[H])C(=O)N(C3=C([H])C([H])=C([H])C([H])=C3[H])O[C@@]21[H] 
|(5.5254,-0.2188,-2.0522;5.0043,0.4084,-2.5829;3.8986,0.7641,-1.7856;3.1703,1.2242,-2.4656;4.1798,1.7347,-0.6218;5.0767,1.4054,-0.0797;4.3619,2.756,-0.9682;2.9278,1.5914,0.2535;2.0956,2.1637,-0.1765;3.0591,1.9288,1.2857;2.5793,0.0726,0.2005;1.0826,-0.2085,0.3744;0.5023,0.3103,-0.3968;0.7519,0.1439,1.3561;0.8631,-1.2799,0.3078;3.3658,-0.6664,1.29;3.1974,-0.5455,2.4956;4.2903,-1.4726,0.6828;5.2107,-2.4039,1.2045;5.9288,-3.232,0.3294;5.768,-3.1605,-0.7383;6.8406,-4.1509,0.8468;7.3917,-4.7891,0.1614;7.0451,-4.2565,2.2224;7.7573,-4.9743,2.6188;6.3239,-3.4272,3.0838;6.4722,-3.4975,4.158;5.4092,-2.4986,2.5927;4.8473,-1.8584,3.2588;4.2501,-1.4058,-0.731;3.2143,-0.4461,-1.1042;2.5331,-0.9866,-1.7669)|\",5.24635503764\r\n\"[H]O[C@]([H])(C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C([H])([H])[C@](C#N)(C([H])([H])[H])C([H])([H])C([H])=C([H])[H] |(5.8655,-0.0871,2.5349;4.9395,0.0636,2.2865;4.5151,-1.0585,1.4983;3.4431,-0.8825,1.3498;4.6409,-2.4006,2.2879;3.9588,-3.5412,1.5047;3.9138,-4.4458,2.1222;4.4954,-3.8045,0.5868;2.9315,-3.275,1.2283;6.1165,-2.7665,2.5455;6.1825,-3.6959,3.1226;6.638,-1.9972,3.1312;6.6726,-2.9191,1.6132;3.9128,-2.2348,3.6375;3.9765,-3.1598,4.2224;2.8511,-2.0071,3.4836;4.343,-1.421,4.2273;5.2147,-1.0432,0.1239;5.1635,-2.0353,-0.3356;6.281,-0.8223,0.2601;4.65,-0.0323,-0.921;3.2317,-0.3689,-1.1644;2.1168,-0.638,-1.3497;5.4158,-0.2015,-2.2539;6.4683,0.0585,-2.1088;5.0054,0.4564,-3.0267;5.3527,-1.2331,-2.6147;4.6961,1.4649,-0.4538;4.2344,2.0712,-1.2421;4.0798,1.5653,0.4451;6.0771,1.9849,-0.1599;6.5426,1.6086,0.7491;6.7297,2.8793,-0.9044;7.7184,3.2389,-0.6312;6.2947,3.2983,-1.8098)|\",7.273603223865001\r\n\"[H]OC(=O)[C@]12N([H])C([H])([H])C([H])([H])C([H])([H])[C@]1([H])C([H])([H])C([H])([H])C2([H])[H] 
|(0.6112,0.3854,-2.1614;0.2939,-0.4652,-2.5033;-0.1505,-1.2465,-1.4845;-0.6158,-2.333,-1.7365;0.0026,-0.7631,-0.0092;-1.1257,-1.2558,0.7778;-1.2098,-2.2589,0.6154;-2.3907,-0.6201,0.3497;-2.646,-0.8739,-0.6942;-3.1918,-1.0219,0.9798;-2.3365,0.9085,0.4966;-3.269,1.3383,0.1118;-2.2779,1.1617,1.5633;-1.1172,1.5234,-0.2241;-1.0358,2.5924,0.0099;-1.2441,1.4453,-1.3137;0.1234,0.7595,0.2413;0.1222,0.8194,1.3382;1.5527,1.1523,-0.1926;1.6091,1.5517,-1.2177;1.9378,1.9596,0.4385;2.3805,-0.1659,-0.0416;3.227,-0.0397,0.6396;2.7984,-0.4687,-1.0074;1.391,-1.2424,0.481;1.3571,-1.2526,1.5763;1.6418,-2.253,0.144)|\",6.530732412\r\n\"[H]O[C@]1(C([H])([H])[H])[C@@]2([H])C([H])=C([H])[C@]1([H])[C@]1([H])C([H])([H])OC([H])([H])[C@@]21[H] |(2.7734,0.5892,1.7808;2.7578,1.1015,0.9579;2.8617,0.1731,-0.1233;1.9323,-1.0051,0.1817;1.8872,-1.7346,-0.6271;0.9225,-0.6276,0.3766;2.2741,-1.5312,1.0845;4.345,-0.188,-0.4896;4.9062,-0.7623,0.2527;4.0864,-0.8681,-1.8348;4.6809,-1.6807,-2.2411;3.0978,-0.1989,-2.448;2.7147,-0.3735,-3.4483;2.6117,0.8998,-1.5157;1.5922,1.2608,-1.674;3.693,2.0022,-1.3273;3.374,2.5272,-0.423;4.3519,3.092,-2.1542;3.7771,4.0208,-2.233;4.6521,2.7702,-3.1619;5.5378,3.3785,-1.3505;5.7759,2.3201,-0.3733;6.8562,2.1426,-0.3499;5.4382,2.6352,0.623;4.9512,1.1861,-0.9663;5.4851,0.9559,-1.8969)|\",6.628693398179999\r\n\"[H]C([H])([H])C([H])([H])O[C@@]1([H])OC(C([H])(C([H])([H])[H])C([H])([H])[H])(C([H])(C([H])([H])[H])C([H])([H])[H])C([H])([H])C1([H])[H] 
|(0.2304,2.6826,-1.0514;0.7951,1.8803,-0.5635;0.3387,1.679,0.4112;0.711,0.9747,-1.1732;2.2516,2.2815,-0.3992;2.3425,3.1914,0.2094;2.7099,2.4951,-1.3796;2.9442,1.2032,0.2242;4.3304,1.4161,0.3649;4.7354,1.7562,-0.6003;4.6427,2.4283,1.3056;5.0424,1.8845,2.5981;4.0177,2.3597,3.6785;4.2946,1.8405,4.6074;4.0727,3.8767,3.9394;3.2778,4.1673,4.6359;3.9177,4.4296,3.006;5.0208,4.2053,4.3752;2.5605,1.9824,3.3587;1.9261,2.222,4.2202;2.4215,0.922,3.132;2.1936,2.5504,2.5001;6.4767,2.4425,2.8723;6.3835,3.5338,2.8229;7.0221,2.0756,4.2635;8.0287,2.4902,4.3931;7.1019,0.9892,4.3945;6.4009,2.4633,5.0766;7.4975,2.0516,1.7909;8.4435,2.577,1.9666;7.1435,2.3268,0.7937;7.7212,0.9776,1.7996;5.0321,0.3413,2.4086;5.917,-0.1258,2.8493;4.1625,-0.1067,2.8941;4.9694,0.1287,0.8868;4.3793,-0.7426,0.5932;5.9716,0.0217,0.462)|\",8.764787124605\r\n\"[H]O/C(=C([H])/C([H])=C(\\[H])C1=C([H])C([H])=C([H])C([H])=C1[H])[C@]([H])(O[H])C([H])([H])O[H] |(7.3036,-0.0956,-0.3575;7.2357,-0.9711,-0.7713;5.9152,-1.3232,-0.8914;4.905,-0.5135,-0.5059;5.1669,0.4717,-0.1148;3.4935,-0.8189,-0.5684;3.2134,-1.8086,-0.9224;2.5226,0.0533,-0.2122;2.8317,1.0443,0.1225;1.0758,-0.1655,-0.2279;0.4764,-1.3947,-0.5703;1.0974,-2.2454,-0.836;-0.9068,-1.5422,-0.5694;-1.3433,-2.5016,-0.8353;-1.7352,-0.4696,-0.2247;-2.8151,-0.5895,-0.2232;-1.1605,0.7544,0.1199;-1.7916,1.5972,0.3898;0.2249,0.9018,0.118;0.6644,1.8598,0.3869;5.784,-2.7003,-1.5133;4.9275,-3.2128,-1.0555;6.954,-3.4899,-1.2917;7.5976,-2.9281,-0.8301;5.5606,-2.6446,-3.0379;4.7109,-1.9985,-3.2784;5.3277,-3.6673,-3.3786;6.691,-2.1212,-3.7006;7.4415,-2.6317,-3.3503)|\",4.05993864946\r\n\"[H]C1=C(OC([H])([H])C([H])([H])[H])C(OC(=O)C([H])([H])[H])=C([H])C(/C([H])=C(\\[H])C([H])([H])[H])=C1[H] 
|(5.1343,-2.584,1.3245;5.0397,-1.7808,0.6029;6.1763,-1.1353,0.1112;7.4587,-1.4031,0.4719;7.7076,-2.5236,1.3221;7.2045,-2.3766,2.2884;7.2991,-3.4328,0.8597;9.2112,-2.6327,1.5047;9.4508,-3.4887,2.1447;9.6096,-1.7269,1.9728;9.7053,-2.7702,0.5381;5.9963,-0.1075,-0.8359;7.1074,0.5906,-1.3092;7.9615,-0.0626,-2.1645;7.764,-1.1677,-2.6002;9.1544,0.8135,-2.4573;9.745,0.3659,-3.2574;9.7692,0.9045,-1.555;8.8337,1.8207,-2.739;4.7376,0.2745,-1.2562;4.6694,1.0711,-1.9896;3.5822,-0.3574,-0.7541;2.2188,0.0129,-1.1569;1.4386,-0.6391,-0.7605;1.8404,1.0421,-1.9296;2.5918,1.7227,-2.3309;0.4189,1.3531,-2.2954;0.1284,2.356,-1.9526;-0.277,0.6305,-1.856;0.2751,1.3429,-3.3849;3.7696,-1.3933,0.1707;2.9006,-1.9082,0.5729)|\",4.90349158601\r\n\"[H]OC([H])([H])C#CC([H])([H])[C@@]([H])(OC([H])([H])C1=C([H])C([H])=C(OC([H])([H])[H])C([H])=C1[H])C([H])([H])[H] |(1.6181,4.7852,2.7945;1.5679,5.1986,1.9182;0.7188,4.3803,1.1173;-0.3317,4.4582,1.4435;0.7705,4.8056,0.109;1.115,2.9669,1.0944;1.4486,1.8033,1.128;1.8703,0.4034,1.1406;1.0165,-0.2537,0.9248;2.2253,0.1234,2.1405;2.9949,0.0835,0.1338;3.8562,0.7312,0.3626;3.3435,-1.2774,0.3864;4.6746,-1.6262,0.0235;4.853,-1.3846,-1.0372;5.3972,-1.0385,0.6131;4.8856,-3.1006,0.2635;5.989,-3.5684,0.9746;6.7016,-2.8591,1.3902;6.2076,-4.9363,1.1733;7.0783,-5.2595,1.7324;5.2953,-5.8603,0.6555;5.3972,-7.2157,0.7914;6.507,-7.7341,1.5057;6.3885,-8.8192,1.4983;6.5204,-7.3788,2.5448;7.4577,-7.4701,1.0234;4.1759,-5.4025,-0.0583;3.4765,-6.1351,-0.4493;3.9804,-4.0432,-0.2515;3.1071,-3.6975,-0.7971;2.5645,0.2983,-1.3173;2.2702,1.3401,-1.479;1.7114,-0.3472,-1.5559;3.3766,0.0634,-2.0132)|\",5.8831014478100006\r\n\"[H]C(=O)[C@@]1([H])O[C@]([H])(C([H])([H])C(=O)OC([H])([H])C([H])([H])[H])[C@]([H])(C([H])([H])[H])C1([H])[H] 
|(3.6267,1.2591,4.0741;2.9802,0.3557,4.1128;2.6124,-0.1266,5.1604;2.6046,-0.1925,2.7363;1.6481,0.2742,2.4544;3.6137,0.2205,1.8058;4.2627,-0.9276,1.2272;5.3337,-0.7037,1.191;3.7688,-1.1152,-0.2161;2.692,-1.3005,-0.2507;4.2819,-1.9734,-0.6672;4.1181,0.1025,-1.0515;5.24,0.5524,-1.1554;3.0326,0.611,-1.6712;3.2474,1.7908,-2.4855;4.2099,1.6965,-2.9941;2.4413,1.7598,-3.2228;3.1892,3.0568,-1.6441;3.2905,3.9371,-2.2895;4.0027,3.0689,-0.9137;2.2351,3.1271,-1.1118;3.9482,-2.0941,2.1906;4.6403,-1.9905,3.0382;4.086,-3.5084,1.6274;3.9038,-4.2511,2.4117;5.0931,-3.6864,1.2325;3.3676,-3.6967,0.8215;2.5397,-1.7238,2.6905;2.3046,-2.1372,3.6749;1.7729,-2.0618,1.9841)|\",5.831399816215001\r\n\"[H]O[C@@]([H])(C([H])([H])[C@@]([H])(C([H])=C([H])[H])C([H])([H])[H])[C@@]1([H])O[C@]1([H])C1=C([H])C([H])=C([H])C([H])=C1[H] |(6.4572,-2.384,1.8463;5.8589,-1.803,2.3514;4.6437,-1.7839,1.6139;3.8744,-1.4271,2.3094;4.725,-0.8496,0.3931;5.4523,-1.2735,-0.313;3.7516,-0.8542,-0.1207;5.1349,0.6046,0.7221;6.0699,0.5574,1.293;5.3897,1.3681,-0.5545;4.5262,1.4861,-1.214;6.5594,1.893,-0.9205;6.6777,2.4335,-1.8561;7.4464,1.8035,-0.2961;4.0785,1.3293,1.5784;4.3957,2.3554,1.7924;3.919,0.8249,2.5376;3.1138,1.3799,1.0567;4.2809,-3.1992,1.1788;3.4321,-3.2912,0.4989;5.4129,-4.0115,0.8201;4.6393,-4.3817,1.9866;5.154,-4.185,2.9267;3.8702,-5.6572,1.9206;3.6525,-6.3116,0.7016;4.0912,-5.9007,-0.2026;2.905,-7.4881,0.6594;2.7471,-7.9914,-0.2909;2.3665,-8.0229,1.8319;1.7847,-8.94,1.7975;2.5868,-7.3785,3.0507;2.1782,-7.7925,3.9687;3.3395,-6.205,3.0952;3.5159,-5.7097,4.0476)|\",6.212359206915\r\n\"[H]OC([H])([H])[C@@]1([H])OC([H])([H])[C@@]([H])(N([H])[H])C(=O)C1([H])[H] 
|(1.893,-1.9037,1.6488;2.2993,-1.1978,2.1806;1.6235,-0.0082,1.8224;0.6732,0.0998,2.3722;2.2605,0.8327,2.1175;1.3824,0.0382,0.31;2.3588,-0.0062,-0.1829;0.72,-1.1572,-0.13;-0.6516,-1.2713,0.2462;-1.0092,-2.2086,-0.1846;-0.7737,-1.3357,1.3374;-1.5116,-0.0981,-0.2888;-1.4469,-0.135,-1.3842;-2.893,-0.2643,0.1393;-3.5201,0.1222,-0.5623;-3.0359,0.3099,0.9711;-0.8513,1.199,0.1843;-1.45,2.0243,0.8497;0.6231,1.3019,-0.1578;0.7239,1.3744,-1.2484;1.0432,2.2077,0.2907)|\",5.771534769105\r\n\"[H]C1=C([H])C([H])=C(C([H])([H])OC([H])([H])[C@@]2([H])OC([H])([H])[C@@]3([H])O[C@@]3([H])C2([H])[H])C([H])=C1[H] |(-2.3257,0.0832,-1.4744;-1.2569,0.1063,-1.279;-0.5203,1.263,-1.5381;-1.0121,2.1452,-1.9396;0.8535,1.2863,-1.2939;1.4245,2.1871,-1.5097;1.5058,0.1607,-0.7786;2.984,0.2201,-0.4659;3.138,0.5753,0.5689;3.4835,0.9486,-1.1263;3.5666,-1.0632,-0.617;4.9444,-1.0858,-0.2953;5.5158,-0.4051,-0.9459;5.1126,-0.7705,0.7477;5.452,-2.5102,-0.4848;5.3067,-2.7959,-1.5408;6.847,-2.4667,-0.1952;7.5273,-3.6464,-0.5839;7.5648,-3.7298,-1.685;8.553,-3.5416,-0.2148;6.8907,-4.9031,-0.0286;7.2896,-5.8407,-0.4233;6.6122,-4.9016,1.3715;5.4995,-4.847,0.4593;4.9045,-5.7614,0.4453;4.7377,-3.533,0.4101;3.724,-3.6983,0.0312;4.6487,-3.1396,1.4303;0.7607,-0.9952,-0.5196;1.2653,-1.8735,-0.1307;-0.611,-1.023,-0.7723;-1.177,-1.9293,-0.5722)|\",6.5252901349900005\r\n\"[H]C1=C([H])C([H])=C(C([H])([H])OC([H])([H])[C@]2([H])OC([H])([H])C([H])=C([H])C2([H])[H])C([H])=C1[H] 
|(-1.9091,-0.4759,-0.4271;-0.8267,-0.4307,-0.3405;-0.1459,0.7465,-0.6543;-0.6952,1.622,-0.9907;1.2451,0.7987,-0.5499;1.7716,1.7154,-0.8076;1.9703,-0.3183,-0.121;3.4702,-0.2355,0.0453;3.7221,0.0578,1.0809;3.8849,0.5463,-0.6128;4.0572,-1.4916,-0.2462;5.4524,-1.5269,-0.0115;5.9756,-0.7695,-0.6184;5.6826,-1.3234,1.0466;5.9566,-2.9152,-0.3876;5.4473,-3.6585,0.2488;7.3529,-2.9106,-0.0959;7.9494,-4.1936,-0.2181;9.0325,-4.0189,-0.2194;7.7259,-4.8059,0.6759;7.5279,-4.9266,-1.4639;8.0812,-5.8307,-1.7117;6.5188,-4.4985,-2.2252;6.2467,-5.0372,-3.1311;5.7161,-3.2762,-1.8571;4.6452,-3.4458,-2.0198;5.995,-2.4264,-2.4989;1.281,-1.4956,0.1927;1.8439,-2.3674,0.5103;-0.108,-1.5521,0.0796;-0.6309,-2.4742,0.3208)|\",6.51440558097\r\n\"[H]OC([H])([H])[C@@]([H])(O[H])[C@@]([H])(OC([H])([H])C([H])=C([H])[H])C([H])([H])C([H])=C([H])[H] |(5.7787,-0.3561,-5.1882;6.1008,-0.6161,-4.3131;4.981,-0.6912,-3.4223;4.3764,0.2275,-3.4761;4.3405,-1.5511,-3.6496;5.5795,-0.8241,-2.0242;6.0917,-1.7967,-1.9587;6.4983,0.2381,-1.797;7.0152,0.3001,-2.6191;4.5218,-0.7411,-0.9197;4.0509,0.2539,-0.985;3.5512,-1.7483,-1.2106;2.2725,-1.5336,-0.6247;2.3548,-1.5077,0.4748;1.8588,-0.5632,-0.9439;1.3668,-2.6563,-1.0372;1.7568,-3.6574,-0.8566;0.1555,-2.4897,-1.5659;-0.4796,-3.3344,-1.8187;-0.253,-1.5002,-1.7619;5.134,-0.8991,0.4891;5.8815,-0.11,0.6206;4.3423,-0.715,1.2305;5.7467,-2.2517,0.7344;5.087,-3.1056,0.5814;7.0084,-2.4611,1.1127;7.3951,-3.4626,1.2838;7.704,-1.6383,1.2652)|\",6.775634877450001\r\n\"[H]O[C@]([H])(C([H])([H])[H])[C@@]1([H])C(=O)N2[C@@]([H])(C(=O)OC([H])=C([H])[H])C(=O)C([H])([H])[C@]21[H] 
|(3.1169,-4.5746,0.2079;3.9307,-4.4441,0.7202;4.8399,-3.6969,-0.0901;4.4528,-2.6763,-0.2473;5.0699,-4.3414,-1.4568;5.8159,-3.777,-2.0254;4.1384,-4.3604,-2.0366;5.4209,-5.3724,-1.3366;6.1354,-3.6186,0.7109;6.5799,-4.6172,0.7777;7.1186,-2.5181,0.2424;7.5977,-2.1721,-0.8106;7.201,-1.9707,1.5237;7.1138,-0.5906,1.9675;7.9445,-0.2932,2.6143;7.0424,0.4322,0.8346;7.8393,1.315,0.6538;5.9148,0.2285,0.093;5.7128,1.0364,-1.0189;6.5341,1.7087,-1.2341;4.5861,0.9395,-1.713;4.438,1.5656,-2.585;3.7971,0.2476,-1.4371;5.7719,-0.5932,2.7872;5.4147,0.3117,3.495;5.0372,-1.9194,2.5302;4.4975,-2.2383,3.4241;4.2987,-1.7332,1.7417;6.1436,-2.8715,2.0796;6.5115,-3.4785,2.9125)|\",5.608266458805001\r\n\"[H]O[C@@]1([H])[C@@]([H])(O[H])[C@]2([H])N(C([H])([H])C2([H])[H])C([H])([H])[C@@]1([H])O[H] |(1.2522,-2.0377,1.7822;1.9926,-2.0063,1.1452;1.4645,-1.3336,0.0059;1.9439,-1.7844,-0.8694;-0.0703,-1.5681,-0.0841;-0.2645,-2.53,-0.5786;-0.5471,-1.6296,1.2768;-1.3086,-2.2254,1.3098;-0.7245,-0.4215,-0.8542;-0.4146,-0.5082,-1.9078;-0.4468,0.9175,-0.275;-1.8251,1.0328,0.2411;-1.9059,0.8294,1.3194;-2.3111,1.9923,0.0234;-2.2405,-0.1792,-0.6302;-2.7702,0.0916,-1.5464;-2.7959,-0.9723,-0.1213;0.6962,1.0353,0.6259;1.0264,2.079,0.689;0.4642,0.6974,1.6517;1.8389,0.1847,0.0516;2.007,0.524,-0.9797;3.0438,0.3631,0.7638;3.0973,-0.4125,1.3553)|\",6.4763096419\r\n\"[H]C([H])=C([H])C([H])([H])[C@@]1(C([H])([H])[H])N([S@](=O)C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C1([H])[H] 
|(0.2015,2.0961,5.0325;0.1124,2.0825,3.9495;-0.8971,2.1211,3.5448;1.1836,2.03,3.157;2.1762,1.989,3.6087;1.1528,2.0162,1.6509;1.6719,2.9014,1.2592;0.1157,2.0695,1.2936;1.8307,0.7792,1.0571;1.1231,-0.5353,1.3398;0.9921,-0.6525,2.4206;1.6838,-1.391,0.9573;0.124,-0.5419,0.8859;2.4462,1.0454,-0.2695;2.3823,-0.2153,-1.5027;3.1866,-1.4198,-1.0636;3.4464,0.6873,-2.7781;4.8193,0.9955,-2.1799;5.4948,1.3458,-2.9703;5.2534,0.0938,-1.7353;4.7563,1.7749,-1.4157;3.5674,-0.3196,-3.9324;4.1644,0.1217,-4.7392;2.586,-0.5806,-4.3469;4.0578,-1.2393,-3.6004;2.6899,1.9459,-3.212;3.2485,2.447,-4.0121;2.5694,2.645,-2.3803;1.6958,1.7007,-3.605;3.3142,0.7867,0.8941;3.8219,-0.1768,0.9121;3.9131,1.6384,1.2135),wU:14.14|\",6.435492564325\r\n\"[H]OC([H])([H])[C@@]([H])(O[H])[C@@]([H])(O[H])[C@@]1([H])N(C(=O)C([H])([H])[H])C([H])([H])[C@@]([H])(O[H])[C@@]1([H])O[H] |(6.3623,-1.0128,0.948;6.5996,-0.074,0.8809;5.3763,0.6632,0.7196;5.6698,1.6442,0.3384;4.7029,0.1761,0.0118;4.7083,0.8226,2.0906;4.31,-0.1483,2.4188;5.6994,1.2879,3.0171;6.5273,0.8652,2.718;3.6076,1.8891,2.1644;3.2729,1.9464,3.2076;4.2065,3.1436,1.784;5.034,3.1829,2.3003;2.3663,1.6876,1.2681;2.6889,1.5632,0.2309;1.5759,0.5095,1.6675;1.7938,-0.7242,1.1149;2.7326,-0.9441,0.3498;0.8175,-1.8243,1.5024;-0.1807,-1.6315,1.0907;0.7122,-1.9149,2.589;1.1906,-2.763,1.0918;0.3745,0.8597,2.4529;-0.5305,0.5109,1.9426;0.3978,0.4196,3.4554;0.3916,2.3834,2.5474;-0.6199,2.8021,2.4392;0.9622,2.7825,3.7804;1.3195,3.6731,3.5929;1.3066,2.8279,1.3828;0.7233,2.8262,0.4507;1.7761,4.1361,1.6375;2.7563,4.1066,1.6598)|\",7.22734386928\r\n\"[H]C([H])([H])C([S@@](=O)N1C([H])([H])[C@@]1(C([H])([H])[H])C([H])([H])C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] 
|(6.3134,1.4232,-2.87;5.3033,1.0376,-2.6827;4.9829,1.3647,-1.6905;4.6347,1.486,-3.4278;5.3158,-0.4897,-2.7874;3.5364,-1.1,-2.5976;3.6137,-2.6117,-2.6378;3.3157,-0.5332,-0.9452;3.289,-1.5571,0.1144;3.3855,-2.5899,-0.2187;3.8475,-1.3009,1.0136;2.0359,-0.8312,-0.2461;0.929,-1.586,-0.9644;0.2171,-1.9915,-0.2373;1.32,-2.4175,-1.556;0.3739,-0.9153,-1.632;1.5811,0.3278,0.6354;2.4722,0.8815,0.9529;0.986,1.022,0.0256;0.7756,-0.1047,1.8677;0.5271,0.7633,2.4881;1.3465,-0.8061,2.4878;-0.1667,-0.592,1.5941;6.1981,-1.1493,-1.7266;7.2533,-0.9418,-1.9439;6.0547,-2.2347,-1.7314;5.9749,-0.7661,-0.7269;5.717,-0.9517,-4.1967;6.746,-0.6352,-4.4043;5.0702,-0.5122,-4.9658;5.6649,-2.0409,-4.2828),wU:5.5|\",6.8572690326\r\n\"[H]C1=C([H])C([H])=C([C@@]2(C([H])([H])[H])N([S@@](=O)C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C2([H])[H])C([H])=C1[H] |(1.8034,-3.5131,-7.0995;2.1423,-3.5736,-6.0688;3.4082,-4.0753,-5.7721;4.0649,-4.4078,-6.5718;3.8461,-4.1499,-4.4476;4.8372,-4.5413,-4.2448;3.0234,-3.7294,-3.3933;3.4773,-3.8323,-1.954;4.9504,-4.1263,-1.7126;5.1967,-5.1222,-2.0987;5.1823,-4.0929,-0.6482;5.5919,-3.3992,-2.2202;2.7108,-2.9316,-1.0429;3.2458,-2.0561,0.3685;4.5601,-2.5382,0.9396;3.6155,-0.4043,-0.4787;4.7702,-0.5685,-1.4672;5.1134,0.4189,-1.8008;4.4643,-1.1329,-2.3531;5.6122,-1.0779,-0.9881;4.0207,0.5127,0.6859;4.2422,1.5134,0.2968;4.9101,0.1363,1.1999;3.2132,0.6106,1.4215;2.3302,0.093,-1.1474;2.489,1.1118,-1.5227;1.496,0.1282,-0.4362;2.0412,-0.5448,-1.9854;2.5081,-4.37,-0.9341;2.9438,-4.8833,-0.0764;1.5281,-4.7318,-1.243;1.7505,-3.224,-3.7064;1.1118,-2.8741,-2.9019;1.3151,-3.146,-5.0274;0.3278,-2.7465,-5.2442),wD:13.13|\",5.7143908605\r\n\"[H]C1=C([H])C2=C(C([H])=C1F)C([H])([H])N(C([H])([H])[H])C(=O)/C2=C(\\[H])C([H])([H])[H] 
|(-0.0199,0.8928,4.5986;0.8135,0.2772,4.277;1.2988,0.3285,2.9725;0.8471,1.0222,2.2731;2.3816,-0.4702,2.5629;3.0108,-1.2851,3.5227;2.534,-1.3431,4.8322;3.0101,-1.9673,5.5826;1.434,-0.5722,5.1849;0.9758,-0.6298,6.4527;4.2109,-2.0975,3.1066;3.8967,-3.0979,2.7583;4.8618,-2.2635,3.9723;5,-1.4279,2.0777;6.3859,-1.8501,1.9465;6.4577,-2.923,1.716;6.9261,-1.6579,2.881;6.8376,-1.284,1.133;4.4116,-0.7432,1.0378;5.0448,-0.3929,0.0445;2.938,-0.4518,1.1956;2.2691,-0.2222,0.0462;2.8883,-0.2131,-0.8483;0.7997,-0.016,-0.1638;0.4847,-0.4967,-1.0973;0.5568,1.0518,-0.2697;0.1925,-0.4158,0.6534)|\",4.9524720791\r\n\"[H]C1=C([H])C2=C(C([H])=C1Cl)C([H])([H])N(C([H])([H])[H])C(=O)/C2=C(\\[H])C([H])([H])[H] |(-0.0608,-2.8679,3.377;0.7961,-3.0105,2.7274;1.2918,-1.9575,1.9633;0.8236,-0.9843,2.0522;2.403,-2.1247,1.118;3.0482,-3.3747,1.1096;2.5601,-4.4351,1.873;3.061,-5.3985,1.8613;1.4292,-4.2507,2.6642;0.8168,-5.5904,3.6228;4.2788,-3.5565,0.2577;4.0048,-3.95,-0.7377;4.9318,-4.3082,0.7156;5.0479,-2.3246,0.1201;6.4379,-2.4761,-0.2822;6.5183,-3.0093,-1.2406;6.9892,-3.0431,0.4774;6.8701,-1.4826,-0.3915;4.4465,-1.0937,-0.0178;5.073,-0.0974,-0.3706;2.9679,-1.0439,0.2876;2.3016,-0.0058,-0.2607;2.9273,0.7008,-0.8017;0.831,0.2812,-0.2593;0.5364,0.7201,-1.2196;0.5685,1.0212,0.5112;0.2216,-0.6104,-0.0862)|\",4.85995336993\r\n\"[H]OC([H])([H])C([H])([H])[C@@]([H])(OC([H])([H])C1=C([H])C([H])=C(OC([H])([H])[H])C([H])=C1[H])C([H])=C([H])[H] 
|(3.8265,3.2096,-2.2199;3.7513,2.3913,-2.7352;4.104,1.3125,-1.8715;3.4406,1.2655,-0.9966;3.9403,0.4032,-2.4567;5.5639,1.3957,-1.4169;5.7203,2.3359,-0.8695;6.2112,1.4267,-2.3022;6.0123,0.2442,-0.4983;7.0551,0.4489,-0.207;5.1851,0.2746,0.6634;5.7918,-0.2627,1.8392;4.9602,-0.3822,2.5434;6.2041,-1.2624,1.6425;6.86,0.6377,2.427;8.1056,0.1419,2.8105;8.3318,-0.9124,2.6645;9.0793,0.9671,3.3849;10.0364,0.5446,3.6684;8.8057,2.3254,3.5721;9.6754,3.2285,4.113;10.9596,2.7711,4.5052;11.4829,3.646,4.8953;11.5208,2.3639,3.6535;10.894,2.0072,5.2912;7.5577,2.8408,3.185;7.3649,3.8984,3.3376;6.6036,2.0055,2.6238;5.6419,2.4112,2.3217;5.9738,-1.1001,-1.1974;6.7166,-1.2261,-1.9866;5.1195,-2.0861,-0.9243;5.1411,-3.0258,-1.4691;4.3681,-1.9771,-0.1478)|\",5.858611201265\r\n\"[H]C1=C([H])C([H])=C([H])[C@]2([H])C1=NC(=O)C([H])=C2OC([H])([H])C([H])([H])C([H])(C([H])([H])[H])C([H])([H])[H] |(9.3101,-2.523,0.3001;8.9337,-2.999,-0.5996;7.609,-3.0846,-0.8706;6.8861,-2.7035,-0.1535;7.1165,-3.6194,-2.1316;6.0518,-3.5634,-2.3412;7.9599,-4.1616,-3.0268;7.6142,-4.5694,-3.9704;9.424,-4.3245,-2.6997;9.4905,-5.345,-2.2648;9.9273,-3.4214,-1.5763;11.1534,-3.0464,-1.4197;12.1051,-3.3409,-2.4337;13.2878,-3.1064,-2.2377;11.636,-3.8843,-3.7217;12.3768,-3.9273,-4.5102;10.3746,-4.3345,-3.8703;9.8319,-4.893,-4.9713;10.6553,-5.0291,-6.1433;11.014,-4.0361,-6.4376;11.5274,-5.6496,-5.8963;9.8041,-5.6767,-7.2269;8.9578,-5.0151,-7.4578;9.3783,-6.6002,-6.8141;10.576,-6.0054,-8.5204;11.4289,-6.6462,-8.2479;9.6801,-6.8031,-9.4796;10.2272,-7.0837,-10.3867;8.8092,-6.2099,-9.7868;9.31,-7.7228,-9.0117;11.1296,-4.7505,-9.2147;11.6454,-5.0177,-10.144;11.8463,-4.208,-8.5883;10.3188,-4.0568,-9.4736)|\",4.1442939431150005\r\n\"[H]OC1=NC(SC([H])([H])[H])=NC(=O)[C@@]1([H])C([H])([H])C([H])(C([H])([H])[H])C([H])([H])[H] 
|(4.1996,-2.7078,-2.2424;4.8048,-2.2772,-2.8688;5.3374,-1.1845,-2.2992;6.1808,-0.5142,-2.9993;6.7277,0.6009,-2.3643;7.68,1.5357,-3.506;8.2682,2.8984,-2.4447;8.9214,3.5104,-3.0718;8.8224,2.5006,-1.5935;7.4288,3.4938,-2.0822;6.6404,0.9675,-1.119;5.8601,0.1881,-0.2704;5.8961,0.308,0.9417;4.8793,-0.8064,-0.9146;4.8609,-1.7011,-0.2807;3.4351,-0.2098,-0.9875;3.4683,0.6839,-1.6237;2.7916,-0.9332,-1.5105;2.7866,0.146,0.3646;3.4149,0.8987,0.8549;1.4029,0.7641,0.109;0.938,1.0761,1.0508;0.7281,0.0432,-0.3714;1.4682,1.645,-0.5406;2.6871,-1.0612,1.3077;2.141,-0.7915,2.2185;3.6771,-1.4118,1.6156;2.1473,-1.8947,0.8363)|\",4.69124278262\r\n\"[H]C1=C([H])N(C([H])([H])[H])C(C(=O)/C([H])=C(\\[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])=N1 |(9.9242,3.0672,7.1021;9.9614,2.6466,6.106;11.0065,2.694,5.2033;11.9905,3.1372,5.2569;10.5835,2.0251,4.088;11.3711,1.8273,2.8728;12.3447,2.3004,3.0204;10.8666,2.2743,2.015;11.5012,0.763,2.6723;9.2991,1.5969,4.3539;8.4797,0.8282,3.3994;8.9198,0.5271,2.2866;7.1242,0.4563,3.8659;6.8407,0.7725,4.865;6.2915,-0.235,3.0735;6.6558,-0.5143,2.0839;4.9062,-0.6728,3.4382;4.6444,-0.3047,4.4383;4.1917,-0.2171,2.7347;4.7316,-2.2048,3.38;5.0157,-2.5635,2.3803;5.4335,-2.6722,4.084;3.3012,-2.657,3.6985;3.0185,-2.294,4.6979;2.6033,-2.1802,2.9939;3.1161,-4.1788,3.6418;3.4002,-4.5409,2.6434;3.8127,-4.6543,4.3467;1.6849,-4.6247,3.958;1.5858,-5.7151,3.9088;0.9688,-4.1931,3.2478;1.3836,-4.3082,4.9642;8.9111,1.9673,5.574)|\",4.691242782620001\r\n\"[H]C1=C([H])N(C([H])([H])[H])C(C(=O)/C([H])=C(\\[H])C([H])(C([H])([H])[H])C([H])([H])[H])=N1 
|(8.5722,2.1957,3.7295;7.5814,2.5933,3.5545;7.0454,3.775,4.0296;7.4523,4.5538,4.6584;5.7697,3.8437,3.5414;4.8267,4.9314,3.7943;5.3266,5.6729,4.4216;3.9374,4.5545,4.3017;4.5129,5.389,2.8553;5.5783,2.7031,2.7893;4.3211,2.3888,2.0861;3.3569,3.1568,2.1396;4.3051,1.1144,1.3313;5.2093,0.514,1.3547;3.2126,0.7322,0.6538;2.3443,1.3934,0.6817;3.0736,-0.5319,-0.1464;4.0149,-1.0935,-0.0764;1.942,-1.4091,0.4254;1.844,-2.3357,-0.1524;0.9796,-0.884,0.3845;2.1357,-1.6761,1.4698;2.8148,-0.203,-1.6304;2.7226,-1.1238,-2.2183;3.6308,0.3927,-2.0531;1.8846,0.3663,-1.7498;6.6692,1.9374,2.7894)|\",4.691242782620001\r\n\"[H]O[C@]([H])(C([H])([H])C(=O)C1=NC([H])=C([H])N1C([H])([H])[H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] |(3.8394,-0.9249,1.3145;3.8007,-0.8215,0.3452;3.3198,-2.0674,-0.1436;2.2856,-2.2333,0.2105;4.1891,-3.2194,0.4039;3.9364,-4.1875,-0.0331;5.2409,-3.0233,0.1521;4.1166,-3.3283,1.916;3.9359,-2.345,2.6427;4.2859,-4.663,2.4999;4.4269,-5.7884,1.7985;4.5596,-6.7808,2.7143;4.6893,-7.8143,2.4226;4.5003,-6.2607,3.9951;4.5643,-6.7265,4.968;4.3259,-4.9138,3.8585;4.2096,-3.962,4.9623;4.3185,-4.5143,5.8981;3.2391,-3.4641,4.9357;4.9861,-3.1992,4.8898;3.2551,-1.9841,-1.6939;2.3419,-0.7959,-2.0628;2.2705,-0.6935,-3.1524;2.7286,0.1382,-1.6483;1.3266,-0.9441,-1.673;2.6432,-3.2742,-2.2735;2.476,-3.1601,-3.351;1.6735,-3.5004,-1.8115;3.2934,-4.1454,-2.1366;4.6547,-1.7496,-2.2931;4.5847,-1.5995,-3.3772;5.3198,-2.6054,-2.1257;5.1205,-0.8632,-1.8524)|\",5.0177794032200005\r\n\"[H]C1=C([H])N(C([H])([H])[H])C(C(=O)/C([H])=C(\\[H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])=N1 
|(7.5324,0.6083,-5.6621;6.5809,1.0373,-5.377;5.7909,1.9084,-6.1023;5.9129,2.3556,-7.0783;4.6923,2.1671,-5.3297;3.5851,3.0446,-5.7043;3.7912,3.4428,-6.7004;3.4885,3.8624,-4.9887;2.6457,2.4902,-5.7112;4.8553,1.4445,-4.1657;3.882,1.4487,-3.058;2.8503,2.1221,-3.1262;4.231,0.605,-1.8908;5.1669,0.0608,-1.946;3.4101,0.5436,-0.8319;2.4953,1.1369,-0.8855;3.5933,-0.249,0.4412;2.3897,-1.2123,0.5796;2.4529,-1.7684,1.5229;2.3684,-1.936,-0.2428;1.4392,-0.6659,0.5732;3.5786,0.7463,1.6269;3.6501,0.2061,2.5787;2.6537,1.3349,1.6418;4.4214,1.444,1.5664;4.9032,-1.0529,0.4602;4.9929,-1.6074,1.4015;5.7769,-0.3975,0.3731;4.9406,-1.7781,-0.3604;5.9962,0.7558,-4.1825)|\",4.7075696136500005\r\n\"[H]C1=C2C([H])([H])C([H])([H])N(C(=O)OC([H])([H])C([H])([H])[H])C([H])([H])C([H])([H])[C@@]2([H])N=NC1=O |(6.0658,5.5352,-0.046;5.948,4.8885,0.8196;5.7947,3.5562,0.7079;5.7547,2.8576,-0.6263;4.7345,2.5015,-0.8263;6.0006,3.5768,-1.4168;6.6846,1.6315,-0.7398;7.7202,1.9023,-0.508;6.6536,1.265,-1.767;6.2799,0.54,0.1429;5.3609,-0.3496,-0.3551;4.8587,-0.2544,-1.4655;5.0819,-1.3443,0.5262;4.0884,-2.3052,0.0968;4.226,-2.4986,-0.9691;4.3293,-3.2071,0.665;2.6779,-1.8197,0.3923;1.9534,-2.5985,0.1277;2.4499,-0.9265,-0.1961;2.5581,-1.5876,1.4559;6.8285,0.4635,1.4974;6.1824,-0.2002,2.0708;7.8301,0.0094,1.4812;6.9178,1.8298,2.199;7.8174,2.3779,1.8978;7.0057,1.6517,3.2765;5.6825,2.7462,1.9675;4.7937,2.0997,1.9231;5.4944,3.499,3.2409;5.6118,4.735,3.3183;5.9543,5.5401,2.1245;6.2053,6.7123,2.3116)|\",3.594623965105\r\n\"[H]C1N=C([H])NC(=O)[C@@]1([H])C([H])([H])C(=O)OC([H])([H])C([H])([H])[H] 
|(4.1466,-2.5177,-0.8643;3.9633,-1.7126,-1.5788;3.1408,-1.9372,-2.5384;2.9411,-0.856,-3.421;2.4157,-1.1327,-4.3351;3.2615,0.3826,-3.2656;3.9471,0.7153,-2.0778;3.9449,1.843,-1.6326;4.7339,-0.4234,-1.4226;5.647,-0.5463,-2.0344;5.1641,-0.0969,0.0073;5.7315,0.8356,0.033;5.8166,-0.8863,0.404;3.9568,0.0261,0.9197;2.8637,-0.4351,0.6541;4.2629,0.6676,2.0558;3.1865,0.852,3.0154;2.5643,-0.046,3.0197;3.7007,0.9439,3.9749;2.3727,2.0946,2.6913;1.6158,2.2525,3.4681;1.8625,1.9836,1.7306;3.0146,2.9801,2.6492)|\",4.394638685575\r\n\"[H]C1=C([H])C2=C(C(=O)[C@@]3([H])C(OC([H])([H])[H])=C([H])C([H])=C(C([H])([H])[H])/C3=N/2)C([H])=C1[H] |(8.4481,-6.7755,-3.7872;7.8026,-5.9298,-3.5644;8.2455,-4.9388,-2.6968;9.2149,-5.0018,-2.2121;7.4315,-3.8289,-2.4042;6.1518,-3.7518,-3.0109;5.2823,-2.5981,-2.6998;4.0793,-2.5858,-2.8732;6.1028,-1.3758,-2.2119;6.6248,-1.0929,-3.148;5.3195,-0.1653,-1.7934;4.376,0.1729,-2.6927;3.5634,1.308,-2.4173;4.1685,2.2217,-2.3588;3.0105,1.1756,-1.4797;2.863,1.3813,-3.25;5.6406,0.5199,-0.6671;5.0936,1.4074,-0.3721;6.765,0.1091,0.1362;6.9927,0.7147,1.0108;7.5544,-0.9708,-0.1313;8.7168,-1.3632,0.7403;8.8622,-0.6371,1.5465;9.6423,-1.4308,0.1571;8.5618,-2.3536,1.1832;7.217,-1.815,-1.2601;7.8737,-2.9248,-1.4421;5.6964,-4.7839,-3.8416;4.6979,-4.7038,-4.2612;6.5235,-5.8608,-4.1381;6.1831,-6.6441,-4.8091)|\",3.3415580841400003\r\n\"[H]C([H])=C1/C([H])=C(/[H])C(=O)/C2=C\\1N([H])C1=C([H])C([H])=C([H])C([H])=C1C2=O 
|(0.3252,1.1218,0.189;0.8424,0.1676,0.1927;0.205,-0.7008,0.3265;2.1863,0.1241,0.072;2.9364,1.3732,-0.0249;2.3564,2.2916,-0.0747;4.2771,1.396,-0.0193;4.8413,2.3221,-0.0672;5.0962,0.1628,0.0828;6.3155,0.2643,0.1438;4.3641,-1.1316,0.0914;2.9777,-1.1315,0.0621;2.2749,-2.3032,0.0297;1.2712,-2.265,-0.0657;2.8659,-3.5574,0.0363;2.0769,-4.72,-0.0021;0.9925,-4.6411,-0.0384;2.6959,-5.9612,0.0101;2.0877,-6.861,-0.0185;4.0964,-6.0604,0.0603;4.57,-7.0376,0.0701;4.8678,-4.9084,0.0956;5.9518,-4.9426,0.1312;4.2662,-3.6405,0.0835;5.1101,-2.4138,0.1084;6.3323,-2.5068,0.1381)|\",3.197337743375\r\n\"[H]/C1=C([H])/C2=C(C(/[H])=C([H])/C([H])=C/2[H])\\[C@@]2([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[C@]2([H])O1 |(2.0443,3.2363,0.5063;1.6528,2.2299,0.381;0.3285,2.0425,0.249;-0.2431,2.9675,0.235;-0.4704,0.8272,0.1144;-0.0109,-0.4887,0.3893;-0.9221,-1.5424,0.2129;-0.6157,-2.5607,0.4139;-2.2395,-1.3397,-0.198;-2.9056,-2.1911,-0.3084;-2.6895,-0.0466,-0.4514;-3.7132,0.1355,-0.7675;-1.8079,1.0139,-0.2916;-2.1485,2.0278,-0.4882;1.4184,-0.7613,0.8856;1.5154,-0.3429,1.8994;1.7031,-2.2831,0.9837;1.0322,-2.7171,1.7336;1.4465,-2.7564,0.0261;3.1521,-2.6527,1.3064;3.4393,-2.2607,2.2928;3.2425,-3.7447,1.3624;4.0806,-2.0799,0.2357;3.818,-2.5046,-0.7442;5.1256,-2.353,0.4264;3.9448,-0.5594,0.2204;4.5733,-0.1046,-0.5541;4.302,-0.1638,1.1801;2.5089,-0.0344,0.0346;2.2186,-0.0822,-1.0239;2.6732,1.3515,0.3944)|\",4.94702980209\r\n\"[H]C#CC1=C([H])C([H])=C([H])C([H])=C1[C@@]1([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[C@@]1([H])O[H] 
|(6.0609,3.0595,4.0406;5.8104,2.4494,3.2024;5.5228,1.76,2.2477;5.2722,0.8704,1.1505;6.0615,-0.2974,1.1141;6.7912,-0.452,1.9027;5.9232,-1.2313,0.0946;6.5437,-2.1229,0.0869;4.9827,-1.009,-0.9107;4.8585,-1.7277,-1.7162;4.1935,0.1393,-0.8762;3.4605,0.3011,-1.6627;4.3077,1.1036,0.1352;3.3982,2.3259,0.0484;2.6471,2.0794,-0.7159;2.5965,2.6366,1.3359;2.0352,1.7391,1.6248;3.2703,2.8497,2.1718;1.6442,3.8258,1.1388;0.8625,3.5554,0.4126;1.1283,4.0461,2.0821;2.3959,5.065,0.6327;3.1203,5.3919,1.393;1.7031,5.9026,0.4863;3.141,4.7559,-0.6719;3.7101,5.6283,-1.013;2.4117,4.5179,-1.4608;4.1231,3.5797,-0.524;4.5022,3.3149,-1.5192;5.2818,3.9644,0.2099;5.1405,3.805,1.1557)|\",5.44227701\r\n\"[H]OC([H])([H])[C@@]12O[C@]1([H])C([H])([H])C([H])([H])C([H])([H])C2=O |(2.7958,0.4876,1.7834;2.9418,-0.4167,1.453;2.7492,-0.3292,0.0541;3.4301,0.4006,-0.411;2.9757,-1.3109,-0.373;1.3197,0.0525,-0.2937;0.9306,-0.1523,-1.6698;0.2962,-0.9598,-0.6722;0.6314,-1.9988,-0.6773;-1.1961,-0.7288,-0.4946;-1.6389,-0.7103,-1.499;-1.6317,-1.5904,0.0249;-1.5415,0.5656,0.2664;-2.5795,0.8535,0.0663;-1.4716,0.3824,1.3463;-0.5983,1.7363,-0.0924;-0.8648,2.6487,0.4477;-0.6427,1.9359,-1.1702;0.8147,1.3503,0.2874;1.5067,1.9967,1.0617)|\",5.6817371984400005\r\n\"[H]C([H])([H])C([H])([H])C(=O)C(C([H])([H])[H])(C([H])([H])[H])[C@]1([H])OC(C([H])([H])[H])(C([H])([H])[H])OC2(C([H])([H])C2([H])[H])C1([H])[H] 
|(5.5534,-2.2201,-3.038;4.7946,-1.476,-2.7732;5.2852,-0.5043,-2.6619;4.0902,-1.3882,-3.6058;4.0736,-1.8812,-1.4889;4.7848,-1.9938,-0.6589;3.5905,-2.8614,-1.5929;2.9933,-0.8938,-1.0706;2.7997,0.1256,-1.7128;2.1037,-1.2115,0.1618;2.8309,-2.0893,1.2;3.7452,-1.6017,1.5571;3.0902,-3.0708,0.7982;2.1827,-2.2467,2.0706;1.6775,0.1131,0.8252;2.5461,0.6104,1.2706;0.953,-0.0771,1.626;1.2368,0.8023,0.1031;0.8244,-1.9633,-0.3163;0.2501,-2.1995,0.5917;1.2271,-3.1884,-0.9415;0.1634,-4.1271,-1.1209;0.5849,-5.0127,-2.2933;1.5058,-5.5504,-2.0463;0.7683,-4.4035,-3.181;-0.2013,-5.7412,-2.5133;-0.0773,-4.9407,0.1524;0.8186,-5.5158,0.4052;-0.9123,-5.6317,0.0033;-0.3245,-4.2844,0.9917;-1.0694,-3.4465,-1.3804;-0.9412,-2.2103,-2.0577;-2.1829,-1.7756,-2.7712;-3.0322,-2.4518,-2.7351;-2.4397,-0.7193,-2.7482;-0.9728,-2.2178,-3.5705;-0.3909,-1.467,-4.0997;-1.0416,-3.1779,-4.0717;-0.0926,-1.197,-1.2871;0.498,-0.5892,-1.9782;-0.7324,-0.5152,-0.7135)|\",6.12256163625\r\n\"[H]/C(=C(/[H])C([H])([H])[C@]([H])(C(=O)OC([H])([H])[H])C([H])([H])C([H])([H])[H])C([H])([H])C([H])([H])C([H])([H])[H] |(5.133,0.6317,0.9317;4.9982,1.536,0.3325;5.8897,2.5239,0.4515;5.763,3.4247,-0.1523;7.0831,2.5036,1.3681;7.999,2.6864,0.7899;7.1793,1.5125,1.8285;7.027,3.5585,2.5017;6.1179,3.4041,3.0932;6.9717,4.9568,1.9048;7.7433,5.382,1.0694;5.9651,5.6944,2.4264;5.8643,7.0379,1.9252;5.0154,7.4819,2.446;5.6932,7.0328,0.8454;6.7801,7.5971,2.1346;8.2616,3.4431,3.4248;8.3415,2.3972,3.7486;9.1589,3.6588,2.8311;8.2161,4.3558,4.6548;9.1025,4.206,5.2812;7.3321,4.1503,5.27;8.1853,5.4147,4.3742;3.8092,1.5392,-0.5888;2.8861,1.4307,0.0026;3.7378,2.5089,-1.0998;3.8509,0.4082,-1.6348;4.7579,0.5191,-2.2435;3.9464,-0.5574,-1.1184;2.6154,0.386,-2.5397;2.6722,-0.4269,-3.2726;2.5127,1.3271,-3.0939;1.6978,0.2446,-1.9552)|\",6.84094220157\r\n\"[H]OC1(C([H])([H])C([H])([H])C([H])([H])[C@]([H])(C([H])([H])[H])C([H])([H])O[H])C([H])([H])C1([H])[H] 
|(3.0562,5.2677,-0.1954;3.5394,4.497,-0.5365;4.4385,4.0884,0.476;3.7745,3.5067,1.7117;4.549,3.2517,2.4477;3.1564,4.2883,2.1833;2.9065,2.2753,1.4182;2.1851,2.5409,0.6362;3.5419,1.485,0.9939;2.1931,1.7407,2.667;1.5327,2.5225,3.0752;2.9369,1.528,3.4445;1.3647,0.4618,2.4355;2.0134,-0.2815,1.9486;0.1395,0.7008,1.541;0.4296,1.0865,0.559;-0.5442,1.4306,1.9951;-0.423,-0.2266,1.3781;0.9245,-0.1536,3.7658;0.2872,-1.0318,3.5707;0.3152,0.5773,4.3255;2.0842,-0.5262,4.5064;1.7936,-0.8546,5.3701;5.7097,3.5008,-0.0589;6.1608,2.6674,0.4736;5.812,3.4607,-1.1391;5.724,4.8726,0.5941;5.8396,5.7309,-0.062;6.1869,4.9795,1.5723)|\",8.72941232404\r\n\"[H]O[C@@](C1=C([H])C([H])=C([H])C([H])=C1[H])(C([H])([H])[H])C([H])([H])/C(=C(\\[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])C([H])([H])[H] |(8.7352,0.4172,2.8395;8.9926,0.7896,3.6976;8.486,2.1331,3.7181;9.0114,2.7881,4.9956;8.6993,4.124,5.2906;8.0859,4.7042,4.6058;9.1653,4.7284,6.4578;8.9109,5.7648,6.6654;9.9553,4.0069,7.3545;10.3214,4.4762,8.2639;10.2707,2.6783,7.0707;10.8862,2.1056,7.76;9.8032,2.0732,5.902;10.0509,1.0417,5.6834;9.0211,2.8696,2.4773;8.7195,3.922,2.4645;8.6413,2.4013,1.5615;10.1142,2.8236,2.4684;6.919,2.1116,3.748;6.6141,1.8253,4.7612;6.5618,3.1352,3.5865;6.2684,1.1651,2.7494;5.9477,1.589,1.5138;6.1577,2.6341,1.2766;5.2903,0.8254,0.3943;5.9363,0.8653,-0.4967;5.1811,-0.2357,0.6444;3.9109,1.4011,0.0149;3.2483,1.3493,0.8898;4.0183,2.4696,-0.2229;3.2572,0.6796,-1.1703;3.9269,0.7305,-2.0406;3.151,-0.3882,-0.9312;1.8885,1.2571,-1.5457;1.447,0.7218,-2.3942;1.9686,2.315,-1.8242;1.1859,1.1886,-0.7062;5.9905,-0.225,3.2779;5.6553,-0.9224,2.5066;5.2115,-0.184,4.0524;6.8816,-0.6403,3.7621)|\",6.2803876695400005\r\n\"[H]C1=C([H])C(/C(=C(/F)C([H])([H])C([H])(C([H])([H])[H])C([H])([H])[H])C([H])([H])[H])=C([H])C([H])=C1OC([H])([H])[H] 
|(5.4152,-4.7943,0.6198;5.1434,-3.8408,1.0584;5.4065,-2.6645,0.3545;5.8746,-2.7389,-0.6238;5.0684,-1.4029,0.8633;5.3836,-0.1655,0.0947;4.4335,0.7221,-0.2389;4.8165,1.8448,-0.926;2.951,0.6782,-0.0355;2.4749,0.8706,-1.0092;2.6744,-0.3378,0.2614;2.3716,1.6822,0.9967;2.8855,1.5044,1.9525;2.5973,3.1493,0.6018;2.1545,3.8202,1.3476;2.1259,3.3669,-0.3652;3.6589,3.3912,0.5131;0.8765,1.3982,1.2043;0.4458,2.0839,1.9431;0.7076,0.3735,1.556;0.3183,1.526,0.2677;6.8298,0.0512,-0.2946;7.2225,-0.8128,-0.8448;7.4562,0.165,0.5995;6.951,0.9387,-0.9177;4.4648,-1.3614,2.1339;4.22,-0.3986,2.5732;4.1946,-2.5215,2.8487;3.7316,-2.483,3.8299;4.5309,-3.7729,2.3152;4.2282,-4.8526,3.0947;4.5286,-6.1458,2.5961;4.1975,-6.847,3.3643;5.6064,-6.2738,2.4296;3.9937,-6.3553,1.66)|\",5.50214205711\r\n\"[H]C1=C([H])C(/C(=C(\\[H])F)C([H])([H])[H])=C([H])C([H])=C1Cl |(6.4397,2.541,-0.5393;5.3688,2.5207,-0.3671;4.6827,1.3091,-0.3104;5.2412,0.3888,-0.4496;3.2943,1.26,-0.0968;2.5779,-0.0379,-0.0559;1.3213,-0.1114,-0.5088;0.7563,0.7005,-0.9537;0.6087,-1.2583,-0.4694;3.283,-1.2642,0.4728;2.5833,-2.0949,0.5811;3.7443,-1.0622,1.4463;4.0831,-1.5905,-0.2042;2.6199,2.4806,0.0889;1.558,2.4771,0.3157;3.2894,3.7,0.0279;2.7541,4.6319,0.1759;4.6642,3.7116,-0.2035;5.521,5.2459,-0.2719)|\",5.25179731465\r\n\"[H]C([H])=C([H])C(=O)[C@@]1([H])OC(C([H])([H])[H])(C([H])([H])[H])O[C@]1([H])C([H])([H])[H] 
|(2.2355,1.0558,-2.018;2.4332,0.1637,-1.4307;2.9198,-0.6721,-1.9269;2.0955,0.0778,-0.1393;1.6023,0.8924,0.3809;2.3894,-1.1533,0.6454;2.9641,-2.1214,0.1709;2.0091,-1.1495,2.1385;2.8984,-0.7975,2.6803;0.9407,-0.2682,2.4573;-0.2775,-1.0311,2.56;-0.9789,-0.6134,3.8499;-1.8952,-1.1962,3.9863;-1.2405,0.4491,3.8148;-0.3219,-0.7849,4.7065;-1.151,-0.8242,1.3235;-2.0465,-1.4499,1.3927;-0.6031,-1.1056,0.4199;-1.459,0.2231,1.2401;0.1104,-2.4031,2.6127;1.5382,-2.5197,2.659;1.8284,-3.3082,1.9593;2.0304,-2.8539,4.0649;3.1073,-3.0606,4.0549;1.5139,-3.7431,4.4397;1.8434,-2.0276,4.7598)|\",4.71301189066\r\n\"[H]C1C(C([H])([H])[H])[C@]2([H])C([H])([H])C([H])([H])[C@@]3(C([H])([H])[H])O[C@]32C(=O)C([H])([H])C1([H])[H] |(2.7555,2.4478,-0.0522;3.3108,1.6941,0.5135;2.7204,0.4856,0.6166;3.2991,-0.6818,1.3737;4.1121,-0.3694,2.0347;3.6791,-1.4712,0.7111;2.5256,-1.1438,2.002;1.7783,0.2163,-0.5427;1.2277,1.1323,-0.7883;0.8256,-0.9974,-0.5818;1.2316,-1.809,0.0335;-0.1681,-0.7603,-0.1871;0.7911,-1.4556,-2.0633;-0.013,-0.9646,-2.6239;0.6388,-2.5368,-2.1491;2.144,-1.0364,-2.6533;2.8637,-1.9569,-3.6043;2.2239,-2.1719,-4.4692;3.1037,-2.9064,-3.1136;3.7942,-1.5111,-3.9582;2.1796,0.3832,-2.9981;2.7498,-0.0413,-1.7475;4.2827,0.1361,-1.5749;4.9978,-0.8403,-1.7151;4.924,1.495,-1.2101;5.9756,1.4379,-1.5119;4.4368,2.2896,-1.7903;4.7961,1.8344,0.3196;5.3815,1.1186,0.9046;5.1942,2.8382,0.5068)|\",4.85451109292\r\n\"[H]C([H])=C([H])C([H])([H])C([H])([H])C(=O)[C@@]12O[C@]1(C([H])([H])[H])C([H])([H])C([H])([H])[C@@]2([H])C(=C([H])[H])C([H])([H])[H] 
|(4.9754,2.7471,4.6569;4.4409,2.0504,4.0166;3.3546,2.0659,4.0806;5.0818,1.2226,3.1907;6.1734,1.245,3.1633;4.4244,0.2299,2.2721;4.6962,0.4421,1.2295;3.3341,0.325,2.3292;4.8292,-1.2163,2.5919;4.5079,-1.5066,3.6014;5.9253,-1.3232,2.5931;4.2743,-2.2336,1.6101;3.6507,-1.9137,0.6147;4.5242,-3.6916,1.9643;3.7161,-4.1506,3.0747;3.5379,-4.7708,1.7664;2.1733,-4.6369,1.1462;1.4814,-5.3484,1.6128;2.2153,-4.8588,0.074;1.7773,-3.6281,1.2712;4.298,-6.0869,1.6776;3.775,-6.9037,2.1884;4.3955,-6.3648,0.6205;5.6752,-5.7854,2.3071;6.4659,-6.4202,1.8968;5.635,-5.9575,3.3853;5.9431,-4.2678,2.0345;6.4709,-3.8353,2.896;6.7903,-4.0435,0.7911;6.2775,-3.7721,-0.4149;6.923,-3.6498,-1.2815;5.2136,-3.6451,-0.5869;8.2782,-4.189,1.0052;8.8339,-4.0852,0.0685;8.528,-5.1652,1.4426;8.6504,-3.4312,1.7087)|\",5.7688136306\r\n\"[H]C([H])=C([H])C([H])([H])C([H])([H])C(=O)C1=C(C([H])([H])[H])C([H])([H])C([H])([H])[C@@]1([H])C(=C([H])[H])C([H])([H])[H] |(5.3795,-3.2247,-1.6362;5.3849,-2.1381,-1.6522;4.8254,-1.6681,-2.4568;6.0479,-1.4359,-0.7315;6.5882,-1.9817,0.0439;6.1502,0.0628,-0.6219;7.2093,0.3543,-0.6504;5.8131,0.3743,0.3766;5.3846,0.8516,-1.6813;4.3171,0.5855,-1.667;5.7323,0.5949,-2.691;5.5034,2.363,-1.5039;6.1939,2.8288,-0.6034;4.7324,3.2232,-2.4463;4.438,4.5302,-2.2431;4.821,5.419,-1.1007;3.9384,5.9496,-0.7192;5.519,6.1935,-1.4508;5.307,4.8688,-0.2967;3.613,5.0929,-3.3796;2.7514,5.6643,-3.0094;4.2184,5.8068,-3.959;3.1998,3.8577,-4.2066;3.2282,4.035,-5.2859;2.1723,3.5727,-3.9569;4.1629,2.7061,-3.774;3.5869,1.7857,-3.6214;5.2272,2.3943,-4.8234;5.2459,1.2115,-5.4491;5.9796,0.9809,-6.2177;4.5233,0.4312,-5.2208;6.2297,3.4755,-5.1484;6.9255,3.1507,-5.9276;6.8133,3.7583,-4.264;5.7335,4.3894,-5.5018)|\",5.20825909857\r\n\"[H]C1=NC([H])=C(Cl)N=C1/C([H])=C(\\[H])OC([H])([H])C([H])([H])[H] 
|(6.977,-3.6827,-3.3046;7.4578,-3.0278,-2.5808;8.7894,-3.0564,-2.5327;9.3885,-2.2562,-1.648;10.4728,-2.2651,-1.5929;8.6154,-1.4325,-0.8205;9.4339,-0.3853,0.3394;7.303,-1.3899,-0.8534;6.6823,-2.1978,-1.7459;5.2315,-2.1946,-1.8233;4.7603,-2.8525,-2.548;4.4617,-1.4165,-1.0386;4.9047,-0.7477,-0.3023;3.1192,-1.4249,-1.1299;2.4162,-0.5739,-0.2079;2.6151,-0.9119,0.8179;2.7878,0.4545,-0.3087;0.9368,-0.6561,-0.5333;0.3663,-0.0275,0.1586;0.5785,-1.6862,-0.4425;0.7456,-0.3121,-1.5544)|\",4.318446807435\r\n\"[H]O[C@]([H])(C([H])([H])[H])[C@@]([H])(C(=O)OC([H])([H])[H])[C@@]([H])(C(=O)OC([H])([H])[H])C([H])([H])[H] |(4.903,1.3318,1.8159;5.2569,0.8985,1.0214;4.7764,1.6201,-0.1141;5.3107,1.1935,-0.9665;5.1242,3.1055,-0.0034;4.8683,3.6442,-0.9208;6.1976,3.2203,0.1759;4.584,3.5752,0.8307;3.2594,1.3832,-0.3138;2.7158,1.8849,0.4957;2.7866,1.9811,-1.63;3.4357,2.0235,-2.6547;1.5194,2.4401,-1.5334;0.9434,2.9241,-2.7605;-0.0579,3.2658,-2.4979;0.8932,2.1203,-3.4991;1.54,3.7464,-3.1629;2.8859,-0.1366,-0.2401;3.2816,-0.4706,0.7247;1.371,-0.2905,-0.1886;0.6762,-0.7536,-1.0668;0.8778,0.1699,0.9858;-0.5527,0.1326,1.1069;-0.7733,0.5213,2.1018;-0.9218,-0.891,1.0036;-1.0164,0.7574,0.3388;3.4849,-0.9966,-1.357;4.5772,-0.9635,-1.3219;3.1572,-0.6656,-2.3453;3.1733,-2.0382,-1.2342)|\",7.268160946855001\r\n\"[H]OC([H])([H])[C@]1(C([H])([H])[H])C2=C(C([H])=C([H])C([H])=C2[H])N([H])[C@@]1([H])C1=C([H])C([H])=C([H])C([H])=C1[H] 
|(4.9045,0.2492,-0.6211;4.4617,-0.436,-1.1489;3.1035,-0.0607,-1.2724;2.9915,0.8374,-1.9029;2.6136,-0.8925,-1.7928;2.3846,0.1974,0.0779;0.9029,0.474,-0.2207;0.8094,1.309,-0.9251;0.3507,0.7349,0.6853;0.4175,-0.3984,-0.6727;3.0657,1.3415,0.82;3.8184,0.8292,1.8909;4.5552,1.6777,2.7231;5.1339,1.2816,3.5535;4.5258,3.0512,2.4611;5.0898,3.7231,3.1031;3.7884,3.571,1.3942;3.7803,4.6406,1.2067;3.0549,2.708,0.5673;2.4739,3.1113,-0.2591;3.7256,-0.5568,1.938;3.8116,-1.0137,2.8372;2.6489,-1.0368,1.0514;3.0392,-1.8619,0.4449;1.4433,-1.5514,1.8233;0.7967,-2.7206,1.4032;1.1734,-3.2538,0.5327;-0.3137,-3.2148,2.09;-0.798,-4.1262,1.7491;-0.7916,-2.5467,3.2179;-1.6525,-2.9311,3.7581;-0.1512,-1.3837,3.6523;-0.5148,-0.858,4.5314;0.9571,-0.8919,2.9624;1.4476,0.0141,3.3083)|\",4.95791435611\r\n\"[H]O[C@](C1NC([H])([H])C([H])([H])C1([H])[H])(C(=O)OC([H])([H])C([H])([H])[H])C([H])([H])/C([H])=C(\\[H])C([H])([H])[H] |(5.5861,3.8016,1.4386;5.645,2.8317,1.5617;4.3463,2.3443,1.3403;3.4759,3.4374,0.724;3.963,4.612,0.614;2.9681,5.4911,-0.0215;2.9279,6.4473,0.5107;3.3086,5.7084,-1.0429;1.6238,4.7177,0.0022;1.0136,5.0497,0.8493;1.0327,4.864,-0.9063;2.0653,3.2508,0.2069;2.1015,2.6893,-0.7364;1.4382,2.6801,0.8972;3.6492,1.9284,2.6646;2.645,1.2421,2.6987;4.2597,2.437,3.742;3.6759,2.1206,5.0309;3.9886,2.947,5.6735;2.5871,2.1222,4.9364;4.184,0.7846,5.551;3.795,0.606,6.5602;5.2777,0.7763,5.5961;3.8493,-0.0319,4.9049;4.4344,1.0928,0.4251;5.1182,0.3997,0.9357;3.4567,0.6058,0.3754;4.9612,1.4118,-0.9461;5.8842,1.9905,-0.9735;4.3893,1.0246,-2.0903;3.4699,0.4362,-2.0481;4.9242,1.3143,-3.4643;5.1577,0.3862,-4.004;5.8366,1.9183,-3.4203;4.1882,1.8541,-4.0757)|\",6.008273819039999\r\n\"[H]OC1=NC(=C([H])[H])[C@]2([H])C(C([H])([H])[H])=C([H])C([H])([H])[C@@]1(C(=O)OC([H])([H])[H])C2([H])[H] 
|(2.91,4.9519,2.055;2.7307,4.8948,1.0895;3.0465,3.6568,0.6673;3.1036,3.4562,-0.5938;3.3064,2.1702,-1.11;3.5642,2.0152,-2.418;3.7402,1.0339,-2.847;3.6124,2.8775,-3.0745;3.1987,0.9961,-0.1421;3.6544,0.1135,-0.6058;1.7387,0.6914,0.1756;0.9988,-0.169,-0.8127;1.4736,-1.1546,-0.9161;1.01,0.2939,-1.8079;-0.044,-0.3214,-0.5175;1.1549,1.2051,1.2661;0.1106,0.9761,1.4757;1.8362,2.1102,2.2596;1.9513,1.5895,3.2232;1.2181,2.9935,2.4629;3.2447,2.5748,1.7599;3.9668,3.1462,2.9766;3.7213,4.2391,3.4706;4.8624,2.3058,3.5018;5.5343,2.7623,4.695;6.206,1.9509,4.9734;4.8087,2.9574,5.4879;6.094,3.6763,4.4846;3.9639,1.3683,1.1344;5.0028,1.6197,0.8959;3.9772,0.5333,1.8414)|\",5.07764445033\r\n\"[H]C([H])=C1OC(=O)[C@@]2(C(=O)OC([H])([H])[H])C([H])([H])/C([H])=C(/C([H])([H])[H])[C@]1([H])C2([H])[H] |(1.7322,-2.8003,1.3022;2.7531,-2.7839,0.9391;3.2412,-3.7279,0.7242;3.3866,-1.6246,0.7668;4.6949,-1.6807,0.2908;5.5493,-0.6168,0.2596;6.6344,-0.7664,-0.2428;5.0952,0.7351,0.8563;6.3046,1.2499,1.6519;6.5677,0.8771,2.7744;7.0301,2.1472,0.9668;8.2189,2.6166,1.629;8.8934,1.7805,1.8274;7.9637,3.1042,2.5733;8.6753,3.3252,0.9379;4.6911,1.6604,-0.317;4.6903,2.701,0.0357;5.456,1.6131,-1.0999;3.3373,1.3008,-0.874;3.0608,1.7852,-1.8098;2.4786,0.4464,-0.3027;1.1502,0.0894,-0.9143;1.0013,0.5931,-1.874;0.3192,0.3666,-0.2505;1.0744,-0.9931,-1.0809;2.8331,-0.2436,1.0151;1.9261,-0.3584,1.6192;3.8901,0.5511,1.7921;4.1932,0.0239,2.7011;3.4969,1.5313,2.085)|\",6.174263267845\r\n\"[H]OC([H])([H])/C([H])=C(/C(=C=C([H])C([H])([H])[H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C([H])([H])[H] 
|(5.1023,4.509,-3.5163;5.2394,5.4688,-3.4769;5.9192,5.7367,-2.2471;6.9557,5.3733,-2.2839;5.9653,6.8308,-2.1869;5.1679,5.182,-1.0683;4.1244,5.4924,-1.0357;5.616,4.3504,-0.1117;4.6804,3.8804,0.9825;4.1013,2.7124,0.8213;3.5385,1.5453,0.6225;4.0506,0.648,0.9781;2.209,1.3285,-0.0624;2.3247,0.6927,-0.9503;1.7635,2.277,-0.3751;1.5023,0.8187,0.6056;4.4451,4.7572,2.2376;3.56,5.9657,1.8587;3.3888,6.602,2.7362;2.5844,5.633,1.4863;4.0278,6.5817,1.0842;3.7364,3.9465,3.3375;3.5662,4.5764,4.2188;4.3371,3.0836,3.6447;2.7651,3.5732,2.9971;5.791,5.2766,2.7874;5.6194,5.8863,3.6824;6.31,5.9028,2.0534;6.4551,4.4501,3.065;7.0095,3.7651,-0.0521;7.6374,4.0623,-0.8958;6.947,2.6697,-0.037;7.526,4.0605,0.869)|\",6.400117763759999\r\n\"[H]OC([H])([H])/C([H])=C(/C(=C=C([H])C([H])=C(C([H])([H])[H])C([H])([H])[H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C([H])([H])[H] |(9.6809,1.6883,0.0582;10.2916,2.336,-0.3277;9.6258,3.6011,-0.2943;9.5418,3.9756,0.7354;10.2982,4.2828,-0.8296;8.2908,3.5394,-0.9849;8.3452,3.1251,-1.9906;7.0855,3.884,-0.4999;5.8456,3.7062,-1.3536;5.1267,2.6284,-1.1512;4.4261,1.5361,-0.9093;4.664,0.6395,-1.4822;3.3473,1.4606,0.0747;3.1527,2.3887,0.6108;2.5873,0.3876,0.3783;1.5088,0.4882,1.4278;1.6835,-0.2249,2.2466;0.523,0.2405,1.0085;1.4507,1.492,1.86;2.72,-0.9702,-0.2628;2.9694,-1.7318,0.4895;3.4843,-1.0105,-1.0421;1.7667,-1.2804,-0.7132;5.464,4.7819,-2.4025;5.5087,6.184,-1.7585;5.2462,6.9469,-2.5013;6.5085,6.4207,-1.3787;4.7955,6.2645,-0.9301;6.465,4.7407,-3.5792;6.1968,5.4982,-4.3262;6.4535,3.7616,-4.0715;7.487,4.9463,-3.2471;4.048,4.5259,-2.948;3.7861,5.293,-3.6863;3.3003,4.5523,-2.148;3.978,3.5481,-3.4359;6.8284,4.4044,0.8967;7.7379,4.4922,1.4961;6.1341,3.7333,1.418;6.3486,5.39,0.87)|\",5.167442020995001\r\n\"[H]C1=C([H])C([H])=C([C@@]([H])(C([H])(C(=O)C([H])([H])[H])C(=O)C([H])([H])[H])C([H])([H])N(=O)=O)O1 
|(6.3667,-6.6203,3.291;5.6753,-5.7918,3.2805;4.9612,-5.1456,4.2404;4.9614,-5.3629,5.2993;4.2154,-4.1239,3.5631;3.5407,-3.4033,4.0041;4.5289,-4.2239,2.2398;4.094,-3.4797,1.0147;3.4424,-2.664,1.3439;5.3004,-2.8104,0.3039;6.049,-3.5597,0.0347;4.8812,-2.0462,-0.9673;3.7518,-1.6143,-1.0915;5.9377,-1.8678,-2.0355;6.8763,-1.4862,-1.6158;5.5734,-1.1859,-2.8063;6.1536,-2.8459,-2.483;5.9381,-1.7848,1.2801;5.2444,-0.9667,1.8469;7.4338,-1.8771,1.4911;7.972,-1.8764,0.5357;7.6673,-2.83,1.9838;7.7765,-1.0492,2.1147;3.2025,-4.3458,0.094;2.7954,-3.7361,-0.7122;2.4054,-4.8171,0.6674;3.9491,-5.468,-0.5738;3.6732,-6.6162,-0.2526;4.7917,-5.1497,-1.4173;5.4298,-5.2431,2.0542)|\",4.449061455675\r\n\"[H]C1C([H])=C([H])C2=C1[C@]([H])(C1=C([H])C3=C(OC([H])([H])O3)C([H])=C1[H])C([H])([H])C2[H] |(-0.2129,-1.8394,1.0473;0.0142,-0.8916,0.5726;1.3715,-0.4419,0.2077;2.2683,-1.0281,0.383;1.3327,0.7926,-0.3745;2.1705,1.3682,-0.7466;-0.0692,1.174,-0.4098;-0.8479,0.0882,0.2051;-2.311,0.4538,0.2485;-2.591,0.6891,1.285;-3.2972,-0.5872,-0.2624;-4.578,-0.6666,0.3278;-4.8545,-0.0317,1.164;-5.4689,-1.5835,-0.192;-5.1364,-2.4116,-1.2594;-6.2007,-3.2284,-1.5653;-7.2411,-2.8832,-0.6442;-7.504,-3.7627,-0.0436;-8.1145,-2.518,-1.1986;-6.7559,-1.8456,0.2167;-3.8919,-2.3581,-1.8538;-3.633,-3.0101,-2.6812;-2.9749,-1.4271,-1.3335;-1.9814,-1.3701,-1.7669;-2.3417,1.8084,-0.5668;-2.8918,1.6819,-1.5096;-2.8601,2.6069,-0.0182;-0.8951,2.1507,-0.8366;-0.6028,3.071,-1.3369)|\",3.89122806215\r\n\"[H]C(=O)/C([H])=C(\\[H])C([H])([H])OC1=C([H])C([H])=C(OC([H])([H])[H])C([H])=C1[H] 
|(5.9518,6.4209,6.0979;6.3639,5.3923,5.9749;7.282,5.0007,6.6697;5.7101,4.5905,4.9295;6.0697,3.5791,4.7699;4.7059,5.1043,4.2063;4.3754,6.1262,4.3993;3.956,4.4054,3.1176;4.0528,4.9813,2.181;2.8808,4.3904,3.3656;4.4522,3.0948,2.9543;3.8708,2.2892,2.003;4.3892,0.9963,1.9019;5.2004,0.7071,2.5627;3.877,0.0936,0.9718;4.3018,-0.9022,0.9179;2.8295,0.4802,0.1249;2.2526,-0.3228,-0.8217;2.733,-1.6495,-0.9522;2.1325,-2.1105,-1.7388;2.6084,-2.2198,-0.0215;3.7912,-1.6709,-1.2466;2.3124,1.7753,0.2275;1.5003,2.0638,-0.4324;2.8247,2.6784,1.1573;2.3987,3.674,1.2097)|\",4.00551587936\r\n\"[H]OC1=NC(=O)N(C([H])([H])OC([H])([H])C([H])([H])O[H])N([H])[C@@]1([H])C([H])([H])C([H])([H])C([H])([H])[H] |(4.1274,-4.4038,1.4824;5.039,-4.4888,1.1604;5.3136,-3.4815,0.3043;6.4821,-3.4537,-0.2121;6.7988,-2.4424,-1.143;7.7788,-2.5191,-1.8588;5.8943,-1.373,-1.2558;6.268,-0.1875,-1.9632;5.3548,0.303,-2.324;6.9093,-0.4774,-2.8009;6.9637,0.6772,-1.0702;7.3243,1.9254,-1.6529;7.8402,1.7626,-2.6118;6.4299,2.5434,-1.8373;8.2686,2.6021,-0.6716;7.734,2.7741,0.2776;8.5783,3.5761,-1.065;9.4439,1.8384,-0.4769;9.1446,0.9282,-0.3132;4.886,-1.1729,-0.273;5.3388,-0.8187,0.5741;4.2292,-2.4477,0.0272;3.6505,-2.2788,0.9465;3.2639,-2.8665,-1.0997;2.5986,-2.0122,-1.2746;3.8391,-3.014,-2.0211;2.4378,-4.1236,-0.7937;3.1033,-4.9835,-0.6421;1.8863,-3.9807,0.1491;1.4409,-4.4525,-1.91;0.8609,-5.3501,-1.6706;1.9592,-4.6326,-2.859;0.7352,-3.6281,-2.0664)|\",5.404181070929999\r\n\"[H]O/C(=N/C1=C([H])C([H])=C([H])C([H])=C1[H])SC([H])([H])/C([H])=C(\\[H])C([H])([H])C([H])([H])C([H])([H])[H] 
|(8.5856,-3.6986,2.6591;7.9563,-2.9521,2.6951;6.9781,-3.3621,3.5402;7.0563,-4.5116,4.0763;6.1136,-5.0112,4.9955;5.4694,-6.2258,4.7134;5.6797,-6.7219,3.7704;4.5737,-6.7759,5.6279;4.0752,-7.7126,5.3923;4.3219,-6.1364,6.8448;3.628,-6.571,7.559;4.9775,-4.9403,7.1389;4.7976,-4.4385,8.0862;5.8689,-4.3786,6.2251;6.3925,-3.4576,6.463;5.6754,-2.1547,3.7495;6.107,-0.8551,2.4888;6.9787,-0.2988,2.84;6.3696,-1.3696,1.5618;4.9117,0.0321,2.317;4.045,-0.4187,1.833;4.854,1.304,2.7246;5.7256,1.7365,3.2214;3.6662,2.2117,2.5697;3.9534,3.0888,1.969;2.8741,1.696,2.0108;3.1082,2.7062,3.9195;2.8068,1.8376,4.5192;3.9104,3.2039,4.4817;1.9244,3.6641,3.7558;1.5474,3.9998,4.7283;1.0941,3.1825,3.2251;2.2116,4.5547,3.1831)|\",5.59466076628\r\n\"[H]O/C(=N/C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])SC([H])([H])/C([H])=C(\\[H])C([H])([H])C([H])([H])C([H])([H])[H] |(9.9833,1.5685,1.6018;9.0286,1.713,1.7483;8.9528,2.9626,2.2808;10.0032,3.6425,2.4727;9.9315,4.9842,3.0388;9.2556,5.6152,2.4415;9.5088,4.9498,4.0565;11.3304,5.6068,3.0934;11.9856,4.9159,3.6391;11.7288,5.6669,2.0712;11.3795,6.9927,3.7543;10.9736,6.9241,4.7741;12.4318,7.2852,3.8679;10.6384,8.0937,2.9845;10.7737,9.0692,3.4656;9.5607,7.903,2.9296;11.0127,8.1751,1.9562;7.2782,3.4759,2.6459;6.2523,2.0376,2.0598;6.2686,2.0191,0.9677;6.7305,1.1287,2.4318;4.861,2.1949,2.5916;4.742,2.0796,3.6694;3.7931,2.462,1.8329;3.9324,2.5873,0.7566;2.3857,2.617,2.3365;1.7437,1.8598,1.8598;2.353,2.4194,3.4162;1.7924,4.0101,2.0474;2.4156,4.7714,2.534;1.8503,4.2106,0.9685;0.341,4.1457,2.5174;-0.0546,5.1446,2.3019;0.2567,3.98,3.5984;-0.3078,3.4149,2.019)|\",6.29671450057\r\n\"[H]OC([H])([H])[C@]([H])(C([H])([H])[H])[C@]([H])(O[H])C1=C(Cl)C([H])=C(Cl)C([H])=C1[H] 
|(3.3118,-2.1861,3.583;3.0008,-1.5538,2.9183;1.6245,-1.2346,3.1907;0.9933,-2.1197,3.0235;1.5075,-0.9253,4.2381;1.1834,-0.1065,2.2533;0.141,0.1103,2.5189;1.2274,-0.5343,0.7785;0.6231,-1.4347,0.6174;2.2498,-0.7459,0.457;0.8291,0.2592,0.1367;1.9958,1.206,2.4689;1.5874,1.9539,1.7786;3.3577,1.0432,2.106;3.6101,0.1364,2.3654;1.8543,1.7687,3.8858;0.7012,2.4244,4.3402;-0.6758,2.6795,3.2617;0.583,2.9262,5.6362;-0.3226,3.4315,5.9496;1.6553,2.772,6.5106;1.5224,3.3941,8.1487;2.8291,2.1443,6.1005;3.6645,2.046,6.7853;2.9123,1.6585,4.7978;3.833,1.2005,4.4531)|\",6.05181203512\r\n\"[H]OC([H])([H])[C@]([H])(C([H])([H])[H])[C@]([H])(O[H])C1=C([H])C([H])=C(Cl)C([H])=C1[H] |(2.8684,-2.8109,0.6492;2.2662,-2.5881,-0.0776;2.6118,-1.2826,-0.5354;3.6766,-1.2162,-0.7898;2.0412,-1.1395,-1.4611;2.2443,-0.1778,0.4727;2.7947,-0.3925,1.4048;0.7403,-0.1981,0.7757;0.4296,-1.2141,1.034;0.1631,0.112,-0.1057;0.4774,0.4714,1.6002;2.7173,1.197,-0.0518;2.2172,1.3753,-1.0118;4.1158,1.1855,-0.3846;4.6133,1.2792,0.4435;2.4034,2.3679,0.8684;1.7817,3.5179,0.3687;1.4868,3.5543,-0.6769;1.5356,4.6242,1.182;1.0507,5.5092,0.7839;1.9237,4.5806,2.5195;1.6198,5.9671,3.5577;2.5485,3.4518,3.0482;2.8412,3.4308,4.0926;2.7808,2.3551,2.2193;3.2574,1.4751,2.6457)|\",6.12256163625\r\n\"[H]OC([H])([H])[C@]1([H])O[C@@]([H])(C([H])([H])[H])[C@@]2([H])OC3(O[C@@]21[H])C([H])([H])C([H])([H])C([H])([H])C3([H])[H] 
|(5.9447,0.4895,0.0298;6.4476,-0.3033,0.2859;5.4818,-1.2047,0.7974;5.9622,-2.1852,0.883;5.1508,-0.9068,1.8043;4.2956,-1.2946,-0.1588;4.6543,-1.6732,-1.121;3.7816,0.0596,-0.4224;2.4471,0.2909,0.1529;2.5097,1.2507,0.6759;1.378,0.3732,-0.9269;0.4099,0.6174,-0.4751;1.2718,-0.5671,-1.4756;1.636,1.1592,-1.6435;2.369,-0.8842,1.1192;2.9996,-0.6428,1.9877;1.1951,-1.5365,1.5641;1.661,-2.8883,1.8805;2.9505,-3.0637,1.2381;3.0284,-1.9818,0.3251;2.4134,-2.2101,-0.5572;0.5841,-3.8753,1.4273;0.1933,-3.6299,0.4357;1.036,-4.874,1.3869;-0.4553,-3.7853,2.5552;-1.0732,-2.8928,2.4067;-1.1264,-4.6493,2.585;0.3908,-3.64,3.8518;0.5081,-4.611,4.3451;-0.0958,-2.9759,4.5724;1.7745,-3.0971,3.3995;2.572,-3.8232,3.5865;2.0574,-2.1622,3.8921)|\",8.547096044205\r\n\"[H]O/C(=N/C1=C([H])C([H])=C([H])C([H])=C1[H])S[C@]([H])(C([H])=C([H])[H])C([H])([H])C([H])([H])C([H])([H])[H] |(3.966,3.3682,2.4475;3.9582,2.9108,1.5842;4.177,3.89,0.6713;4.348,5.085,1.0658;4.5446,6.1744,0.1918;3.5286,6.6243,-0.6663;2.5816,6.0935,-0.6919;3.7322,7.7492,-1.465;2.9372,8.0858,-2.1256;4.944,8.4409,-1.419;5.0987,9.3173,-2.0421;5.9525,8.0016,-0.5581;6.8977,8.5361,-0.5092;5.7546,6.8824,0.2483;6.5295,6.5404,0.9279;4.2531,3.3037,-1.0141;3.8673,1.4746,-0.8645;4.4557,1.102,-0.0225;4.354,0.853,-2.146;3.8114,1.1395,-3.0482;5.3655,-0.0119,-2.2312;5.6674,-0.4429,-3.1817;5.9306,-0.3209,-1.3545;2.3673,1.2319,-0.6168;2.0549,1.8178,0.2541;1.7983,1.6114,-1.4767;2.0325,-0.2483,-0.3793;2.3478,-0.8427,-1.246;2.6191,-0.6173,0.4739;0.5411,-0.4754,-0.1109;0.3254,-1.5363,0.0589;-0.0699,-0.1427,-0.959;0.2074,0.079,0.7747)|\",5.646362397875\r\n\"[H]O/C(=N\\C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])S[C@@]([H])(C([H])=C([H])[H])C([H])([H])C([H])([H])C([H])([H])[H] 
|(4.2347,0.6704,2.001;4.6636,1.0558,2.7834;5.9624,0.6364,2.8231;6.576,-0.0809,1.9777;5.8717,-0.5413,0.7828;5.0278,-1.1948,1.0621;5.4446,0.3088,0.2199;6.8306,-1.3057,-0.1348;7.6744,-0.6417,-0.3605;7.2459,-2.1526,0.4277;6.1999,-1.8014,-1.4445;5.7754,-0.9476,-1.9927;6.9998,-2.1991,-2.0827;5.1251,-2.8822,-1.2686;4.7645,-3.2394,-2.2398;4.2548,-2.5126,-0.714;5.5221,-3.7468,-0.7224;6.6851,1.2655,4.327;8.4682,0.7733,4.077;8.4141,-0.1238,3.4555;9.207,1.8494,3.3316;9.2951,2.8145,3.8291;9.7422,1.68,2.1212;10.2778,2.4815,1.6189;9.6529,0.7379,1.5864;9.086,0.409,5.4426;8.5285,-0.4423,5.8546;10.1065,0.0519,5.2434;9.1438,1.5275,6.4932;9.7271,2.3732,6.1072;8.1316,1.9096,6.6838;9.7595,1.0502,7.813;9.7967,1.8598,8.5503;10.7838,0.6869,7.664;9.1767,0.2292,8.248)|\",6.32936816263\r\n\"[H]OC([H])([H])[C@]([H])(C([H])([H])[H])[C@]([H])(O[H])C1=C([H])C([H])=C([H])C([H])=C1Cl |(2.4227,-1.0524,2.5254;1.9528,-1.5282,1.8228;2.5122,-1.1075,0.5799;3.5988,-1.2528,0.5659;2.0741,-1.7722,-0.1745;2.1723,0.356,0.2377;2.5831,0.9892,1.0405;0.6545,0.5619,0.1744;0.198,0.1739,1.0889;0.2175,0.0211,-0.6737;0.3897,1.6187,0.0722;2.8943,0.7758,-1.0616;2.5698,0.1053,-1.8686;4.2956,0.5955,-0.8146;4.7685,0.8315,-1.6282;2.6164,2.2177,-1.4715;3.2577,3.2517,-0.7699;3.9397,2.9751,0.0285;3.0551,4.5914,-1.0884;3.5655,5.3661,-0.5232;2.199,4.9326,-2.1372;2.0323,5.9739,-2.3982;1.5561,3.9304,-2.8599;0.8944,4.1744,-3.6841;1.7701,2.5919,-2.5239;0.9309,1.3762,-3.4972)|\",6.206916929905\r\n\"[H]OC([H])([H])[C@]([H])(C([H])([H])[H])[C@]([H])(O[H])C1=C([H])N=C([H])C([H])=C1[H] 
|(3.2771,-0.5992,-2.5723;3.5393,-1.4827,-2.2692;2.339,-2.2443,-2.0385;1.6953,-2.212,-2.9291;2.6799,-3.2751,-1.9106;1.5532,-1.7796,-0.8011;0.6846,-2.4486,-0.7266;1.0418,-0.3395,-0.9546;0.4788,-0.2142,-1.8877;1.8692,0.3778,-0.9423;0.3774,-0.0713,-0.1261;2.382,-1.9373,0.5066;1.7654,-1.5546,1.3318;3.5395,-1.1129,0.488;3.915,-1.1885,-0.4119;2.7158,-3.3957,0.8198;1.7026,-4.3359,1.0582;0.656,-4.0301,1.0258;1.913,-5.6245,1.3449;3.1865,-6.036,1.4106;3.3378,-7.0889,1.6431;4.2752,-5.1894,1.2072;5.2894,-5.5708,1.285;4.0338,-3.8479,0.9123;4.8485,-3.1455,0.7706)|\",6.239570591965001\r\n\"[H]OC([H])([H])[C@]([H])(C([H])([H])[H])[C@]([H])(O[H])C1=C(N(=O)=O)C([H])=C([H])C([H])=C1[H] |(3.5357,2.1206,0.1906;2.6129,1.8342,0.2861;2.3764,0.8028,-0.6875;1.2939,0.6377,-0.67;2.6378,1.1698,-1.6903;3.1134,-0.5182,-0.4042;4.1904,-0.2974,-0.3297;2.8983,-1.4708,-1.59;3.269,-1.0129,-2.5149;1.8407,-1.7132,-1.7262;3.4377,-2.4116,-1.4478;2.695,-1.0877,0.9871;1.605,-1.1806,1.0066;3.1356,-0.2123,2.0241;2.8423,0.6803,1.7529;3.3302,-2.442,1.282;2.6951,-3.6968,1.2395;1.2759,-3.8542,0.8717;0.6838,-4.8431,1.3026;0.7533,-3.0044,0.1455;3.3616,-4.8831,1.5678;2.8106,-5.8146,1.5336;4.7018,-4.8452,1.9276;5.2248,-5.7663,2.1661;5.3585,-3.6138,1.9889;6.4025,-3.5658,2.2862;4.675,-2.4407,1.6842;5.1693,-1.4808,1.7818)|\",4.54430130335\r\n\"[H]C1C([H])C(C([H])([H])[H])(C([H])([H])[H])[C@@]2([H])C([H])=C([H])N(C([H])([H])[H])C(=O)[C@]2([H])C1([H])[H] 
|(1.5005,-4.9318,-4.8606;1.9038,-5.7621,-4.2769;3.2443,-5.8426,-4.1138;3.5858,-6.6666,-3.4757;3.9409,-4.551,-3.737;5.4608,-4.6992,-3.5761;5.9405,-4.7834,-4.5578;5.7113,-5.5989,-3.0015;5.9034,-3.8377,-3.0618;3.6285,-3.336,-4.6219;3.9701,-3.5206,-5.6464;4.1481,-2.4451,-4.2535;2.5617,-3.0996,-4.6624;3.2508,-4.4376,-2.2669;3.7241,-5.2445,-1.6811;3.5147,-3.1377,-1.544;4.3178,-2.4785,-1.8548;2.7696,-2.8087,-0.4815;2.9366,-1.9045,0.0962;1.719,-3.6129,-0.0048;1.2185,-3.3979,1.3495;1.9661,-3.6851,2.0986;0.329,-4.013,1.4797;0.9623,-2.342,1.4919;1.1861,-4.6382,-0.7759;0.3266,-5.3953,-0.3473;1.701,-4.693,-2.2194;1.2329,-3.8277,-2.7173;1.1887,-5.9925,-2.9732;1.5375,-6.8834,-2.44;0.095,-6.0062,-2.9971)|\",4.628656597005\r\n\"[H]C([H])=C([H])C([H])([H])[C@]1([H])C(=O)N(C([H])([H])[H])C([H])=C([H])[C@]1([H])C(C([H])=C([H])[H])(C([H])([H])[H])C([H])([H])[H] |(1.1193,-0.0301,2.0591;1.3789,0.4316,1.1101;0.595,0.4657,0.3571;2.594,0.9316,0.8836;3.3429,0.8687,1.6768;3.0447,1.6299,-0.3712;3.2101,2.6919,-0.1483;2.2601,1.6131,-1.1317;4.3834,1.0922,-0.9129;5.073,1.0853,-0.0506;5.0305,2.1026,-1.8699;4.5482,3.1989,-2.1265;6.2624,1.7065,-2.366;7.0254,2.6151,-3.2142;6.9692,2.3185,-4.2691;8.0761,2.6193,-2.9038;6.604,3.6136,-3.1043;6.7246,0.3936,-2.1708;7.7765,0.2526,-2.4027;5.9437,-0.6036,-1.735;6.3801,-1.587,-1.598;4.4746,-0.3785,-1.4318;4.1928,-1.0172,-0.5857;3.5501,-0.8633,-2.6365;2.0926,-0.9343,-2.203;1.5672,0.0188,-2.1942;1.3965,-2.0253,-1.8772;0.3501,-1.9543,-1.5923;1.8247,-3.0237,-1.8711;3.6357,0.0875,-3.8507;3.0313,-0.3051,-4.6769;4.6667,0.1821,-4.2081;3.2601,1.0897,-3.6168;4.0363,-2.2605,-3.0792;3.3531,-2.6795,-3.8253;4.0876,-2.963,-2.2384;5.0327,-2.2046,-3.5266)|\",5.61098759731\r\n\"[H]C([H])=C([H])C([H])([H])[C@]1([H])C(=S)N(C([H])([H])[H])C([H])=C([H])[C@]1([H])C(C([H])=C([H])[H])(C([H])([H])[H])C([H])([H])[H] 
|(2.7386,4.1858,1.194;2.6783,3.5211,0.3363;1.6783,3.2623,-0.0076;3.773,3.0497,-0.2632;4.7513,3.3326,0.1218;3.7504,2.1364,-1.4598;4.2271,2.6311,-2.3172;2.7111,1.9551,-1.7463;4.443,0.7566,-1.2272;4.0817,0.3557,-0.2798;5.9536,0.9539,-1.0524;6.6699,1.0198,0.4518;6.6558,1.1225,-2.2116;8.0843,1.4415,-2.1715;8.4159,1.6824,-3.1837;8.6602,0.5926,-1.7887;8.2573,2.2911,-1.5079;6.0678,0.8593,-3.4706;6.6429,1.2167,-4.3173;4.9108,0.202,-3.5962;4.5198,0.0175,-4.5904;4.2003,-0.286,-2.3542;4.7374,-1.1917,-2.0203;2.7283,-0.7903,-2.5836;1.8653,0.2177,-3.326;2.3368,0.6672,-4.201;0.5978,0.5448,-3.0655;0.0586,1.2428,-3.7003;0.0477,0.1445,-2.2185;2.8089,-2.0546,-3.484;1.8066,-2.4639,-3.6465;3.4259,-2.8285,-3.0109;3.2391,-1.8383,-4.4679;2.1076,-1.2077,-1.2381;1.1577,-1.7298,-1.3946;1.9141,-0.3497,-0.587;2.7754,-1.8944,-0.7051)|\",4.231370375275\r\n\"[H]OC([H])([H])[C@]([H])(C([H])([H])[H])[C@]([H])(O[H])C1=C([H])C([H])=C(C#N)C([H])=C1[H] |(3.6562,1.7189,1.5748;3.8511,2.0763,0.6943;2.5987,2.3673,0.0459;1.9653,2.9766,0.7059;2.865,2.9805,-0.8191;1.8386,1.1064,-0.399;0.9233,1.4636,-0.8911;1.4342,0.2294,0.7955;0.8958,0.8125,1.5527;2.3107,-0.2319,1.2623;0.7783,-0.5873,0.4754;2.6457,0.2794,-1.4429;2.0555,-0.6182,-1.6756;3.8651,-0.1967,-0.8963;4.2263,0.5293,-0.349;2.8495,1.0369,-2.7563;1.7474,1.3806,-3.5548;0.7438,1.0986,-3.2442;1.9131,2.0683,-4.7511;1.0546,2.3273,-5.3625;3.2039,2.4265,-5.1776;3.3836,3.1381,-6.4091;3.5273,3.7175,-7.4078;4.3134,2.0749,-4.3932;5.3115,2.341,-4.7268;4.131,1.383,-3.1987;4.9878,1.0854,-2.6049)|\",5.640920120865001\r\n\"[H]C1=C([H])[C@@]([H])(C#N)C2=C(C1=O)C(=O)C1=C([H])\\C([H])=C([H])/C([H])=C\\1N2[H] 
|(4.5581,-1.4254,-0.3945;3.6452,-0.8492,-0.2766;3.6508,0.4831,-0.3133;4.561,1.0572,-0.4624;2.3874,1.2811,-0.0843;2.464,1.7404,0.9175;2.2833,2.4069,-1.0313;2.172,3.3103,-1.7517;1.1132,0.4391,-0.0852;1.1229,-0.9358,-0.0092;2.4185,-1.664,-0.0438;2.5312,-2.8729,0.0936;-0.1642,-1.6749,0.0695;-0.256,-2.8947,0.1329;-1.3863,-0.8226,0.0615;-2.6561,-1.4164,0.1326;-2.6961,-2.4988,0.1984;-3.8044,-0.6387,0.1136;-4.7828,-1.1064,0.1691;-3.7009,0.759,0.0208;-4.5983,1.3708,0.0036;-2.4578,1.3714,-0.0488;-2.3737,2.4532,-0.1211;-1.301,0.5762,-0.0259;-0.0403,1.1611,-0.0865;0.0113,2.1616,-0.2364)|\",4.100755727035001\r\n\"[H]C1=C([H])C([H])=C2C(=O)[C@@]3([H])C(OC([H])([H])[H])=C([H])C([H])=C(N([H])[H])/C3=N/C2=C1[H] |(10.021,-1.8191,1.7952;8.9492,-1.8086,1.6147;8.1344,-2.788,2.2037;8.5738,-3.5591,2.8299;6.7627,-2.7643,1.9802;6.1027,-3.5013,2.4283;6.2001,-1.7966,1.1381;4.742,-1.7768,0.887;3.927,-2.3658,1.5689;4.3918,-0.9985,-0.4079;4.8147,-1.6824,-1.1718;2.9405,-0.8481,-0.7542;2.2601,-2.013,-0.6526;0.8749,-2.0002,-0.9654;0.5195,-3.017,-0.7929;0.7074,-1.7231,-2.0149;0.3308,-1.3036,-0.3155;2.4593,0.3337,-1.2143;1.4196,0.4398,-1.5007;3.3195,1.4789,-1.3687;2.8721,2.4003,-1.7342;4.6556,1.4688,-1.0664;5.4847,2.5828,-1.1507;5.287,3.2194,-1.9115;6.4655,2.3396,-1.063;5.2415,0.2598,-0.5146;6.4906,0.2905,-0.1446;7.0141,-0.7866,0.5633;8.3974,-0.8117,0.8196;9.0108,-0.0235,0.394)|\",2.8327051837050004\r\n\"[H]O[C@]1([H])C([H])([H])C([H])([H])C([H])([H])[C@@]2([H])O[C@@]12C(=O)OC([H])([H])[H] 
|(4.1922,4.0235,2.0229;4.3537,3.9482,1.0687;3.135,4.2677,0.3915;3.3751,4.1565,-0.6693;2.6526,5.6973,0.6758;3.4865,6.3805,0.4804;1.8469,5.9459,-0.0249;2.1553,5.8499,2.1204;1.886,6.893,2.3204;2.9791,5.621,2.8125;0.9551,4.9337,2.4293;0.0239,5.4004,2.0831;0.8519,4.8021,3.5131;1.0399,3.5647,1.7741;0.6131,2.7273,2.3239;0.7286,3.526,0.3638;2.0772,3.2396,0.772;2.4444,1.7763,0.5925;2.1578,0.8929,1.3712;3.0859,1.5794,-0.572;3.4503,0.2174,-0.8545;3.9514,0.2444,-1.8222;2.5591,-0.4139,-0.8972;4.1232,-0.1644,-0.0825)|\",6.79196170848\r\n\"[H]C([H])([H])OC(=O)[C@@]12O[C@]1([H])C([H])([H])C([H])([H])C([H])([H])C2=O |(6.0842,1.2209,-0.8475;5.0455,0.9273,-0.697;4.9846,-0.0891,-0.2994;4.5455,1.6141,-0.0096;4.4363,0.9947,-1.9989;3.1368,0.6757,-2.0436;2.4746,0.3138,-1.0963;2.6029,0.808,-3.457;3.0758,-0.1823,-4.3902;1.696,-0.2367,-3.996;1.4695,-1.0626,-3.3217;0.6886,0.0548,-5.0957;0.9977,-0.5286,-5.9727;-0.2914,-0.3309,-4.792;0.5706,1.5472,-5.4612;0.1195,1.6532,-6.4541;-0.1089,2.0426,-4.7558;1.9306,2.2804,-5.4231;1.8255,3.3339,-5.6952;2.6352,1.8031,-6.1153;2.4807,2.2103,-4.0123;2.7354,3.1795,-3.3244)|\",5.771534769105\r\n\"[H]OC1(C(C([H])([H])[H])(C([H])([H])[H])[C@]2([H])OC(C([H])([H])[H])(C([H])([H])[H])OC3(C([H])([H])C3([H])[H])C2([H])[H])C([H])([H])C1([H])[H] 
|(2.4834,-0.5051,-2.2424;1.5889,-0.8745,-2.1214;1.0351,-0.1112,-1.0737;1.854,-0.1664,0.2436;2.1167,-1.644,0.5929;2.6195,-1.7378,1.5612;2.732,-2.1375,-0.1584;1.1622,-2.1792,0.6546;1.0715,0.4728,1.4112;1.6331,0.4008,2.348;0.1263,-0.0524,1.5729;0.8484,1.5329,1.2424;3.1585,0.6817,0.0285;2.8049,1.706,-0.1395;3.8303,0.3449,-1.2133;5.0849,-0.3688,-1.1752;5.065,-1.3462,-2.3469;4.9371,-0.8009,-3.287;4.2393,-2.0525,-2.2292;6.0051,-1.9038,-2.3885;6.2428,0.6304,-1.3057;6.1254,1.2023,-2.2309;7.1979,0.0962,-1.3416;6.2763,1.3362,-0.473;5.1701,-1.164,-0.0096;5.1334,-0.4446,1.2111;6.4191,-0.3873,1.9984;7.2951,-0.8409,1.5441;6.6232,0.4935,2.604;5.2966,-1.3191,2.4162;4.7244,-1.0752,3.308;5.4321,-2.382,2.2384;4.1756,0.7425,1.1881;3.6525,0.8279,2.1443;4.7472,1.6707,1.0698;0.2988,1.1369,-1.519;0.2335,1.9979,-0.8587;0.3652,1.3809,-2.5751;-0.465,-0.1154,-1.1274;-0.8948,-0.7106,-1.9275;-1.0546,-0.0947,-0.2164)|\",8.04368542078\r\n\"[H]N([H])[C@]12C(=O)N(C([H])([H])[H])C(=O)[C@](C([H])([H])[H])(C([H])([H])C1([H])[H])C2(C([H])([H])[H])C([H])([H])[H] |(4.3051,-2.6197,-0.5884;3.7609,-1.8524,-0.1931;3.1107,-2.3183,0.4414;3.0033,-1.2351,-1.2713;2.1406,-2.272,-2.0205;2.3163,-3.4737,-1.8972;1.171,-1.7804,-2.8963;0.3773,-2.757,-3.6506;-0.1481,-2.2258,-4.4418;-0.3485,-3.2504,-2.9966;1.0469,-3.5148,-4.0585;0.8318,-0.4185,-2.9754;-0.1064,-0.0538,-3.6625;1.7392,0.5549,-2.21;1.0764,1.9335,-2.1864;0.1223,1.9099,-1.6505;0.8652,2.2726,-3.2036;1.7312,2.6651,-1.7047;3.1159,0.5592,-2.9588;2.9808,0.4494,-4.0396;3.6024,1.5269,-2.8026;3.9417,-0.5882,-2.3317;4.8288,-0.2145,-1.812;4.2851,-1.3218,-3.0692;2.1029,-0.0468,-0.8071;0.8821,-0.4891,0.0241;1.2066,-0.9182,0.9796;0.2531,-1.2318,-0.4772;0.2481,0.3723,0.2607;2.9102,0.9496,0.0503;2.26,1.7516,0.4145;3.7381,1.4119,-0.4948;3.339,0.4341,0.9132)|\",6.16065757532\r\n\"[H]N([H])[C@]12C([H])([H])OC([H])([H])[C@](C([H])([H])[H])(C([H])([H])C1([H])[H])C2(C([H])([H])[H])C([H])([H])[H] 
|(1.8076,-2.7167,0.3941;1.778,-1.7107,0.2279;1.7129,-1.5963,-0.7842;3.0122,-1.0936,0.7238;4.2452,-1.5641,-0.0844;4.1456,-1.2778,-1.1449;4.3019,-2.6587,-0.0469;5.4767,-1.1013,0.4547;5.4875,0.2938,0.7393;6.4004,0.4596,1.3224;5.581,0.8794,-0.1888;4.2586,0.749,1.5574;4.4662,2.2026,1.9925;4.594,2.8723,1.1324;5.3646,2.2944,2.6156;3.6204,2.5711,2.581;4.1001,-0.2328,2.7566;5.0784,-0.5554,3.1276;3.5898,0.2611,3.5903;3.2642,-1.4245,2.2148;2.3037,-1.5174,2.7316;3.7909,-2.379,2.3257;2.9501,0.4743,0.7267;1.6826,0.9753,1.4563;1.6144,2.0676,1.4044;1.6513,0.6932,2.5123;0.7934,0.5528,0.9811;2.9181,1.1081,-0.6781;1.9895,0.8336,-1.1945;3.7507,0.8203,-1.3236;2.9256,2.2021,-0.6046)|\",7.943003296095\r\n\"[H]C1=C(C([H])([H])C([H])([H])[H])C([H])([H])[C@@]2([H])[C@@](OC(=O)C([H])([H])[H])(C([H])([H])N([H])[H])C([H])([H])[C@@]12[H] |(3.6799,-2.3264,-0.3397;3.2168,-1.7158,-1.1125;3.12,-0.3785,-1.0598;3.616,0.5091,0.0494;4.1487,-0.0985,0.7916;4.3505,1.221,-0.3572;2.4995,1.3064,0.7482;2.9112,1.9265,1.5524;1.7537,0.6337,1.1867;1.9812,1.9709,0.048;2.4178,0.1819,-2.2886;2.9851,0.9835,-2.7762;1.4385,0.6062,-2.0234;2.2558,-1.0458,-3.2076;1.2917,-1.0828,-3.7194;3.4028,-1.4267,-4.2016;2.822,-1.5291,-5.5412;3.4398,-2.2863,-6.4774;4.4886,-2.8719,-6.3125;2.628,-2.3108,-7.755;3.18,-2.851,-8.5251;2.4095,-1.2928,-8.0927;1.6673,-2.8056,-7.5754;4.6479,-0.544,-4.2543;5.4307,-1.1049,-4.7867;4.9956,-0.4009,-3.2251;4.35,0.7744,-4.8298;3.9833,0.6526,-5.7721;5.2142,1.305,-4.9246;3.5861,-2.7858,-3.4714;3.196,-3.6238,-4.053;4.6118,-3.0218,-3.1711;2.6102,-2.3098,-2.3552;1.8012,-3.0167,-2.132)|\",6.288551085055\r\n\"[H]C1=C(C([H])([H])C([H])([H])C([H])([H])[H])C([H])([H])[C@@]2([H])[C@@](OC(=O)C([H])([H])[H])(C([H])([H])N([H])[H])C([H])([H])[C@@]12[H] 
|(5.0926,-3.0679,-0.5338;5.227,-2.1918,-1.1655;4.3131,-1.2203,-1.3133;2.9588,-1.1618,-0.6614;2.8802,-1.9562,0.0928;2.8535,-0.2073,-0.1222;1.7837,-1.2908,-1.6523;1.8454,-0.4911,-2.4021;1.884,-2.2366,-2.2012;0.4192,-1.2356,-0.9584;-0.4001,-1.3331,-1.6796;0.2836,-0.2858,-0.4262;0.3154,-2.0435,-0.2235;4.7985,-0.1352,-2.265;4.7118,0.8727,-1.842;4.2141,-0.1383,-3.1968;6.2673,-0.5183,-2.5348;6.5726,-0.3725,-3.573;7.391,-0.0066,-1.574;8.4026,0.6721,-2.386;9.6708,0.7808,-1.9263;10.0545,0.3881,-0.8453;10.5487,1.4502,-2.9623;10.1175,2.4069,-3.2735;10.6169,0.819,-3.8551;11.5452,1.6052,-2.547;7.0006,0.89,-0.4004;7.8563,0.92,0.2911;6.1725,0.4058,0.1291;6.5496,2.2106,-0.8593;7.2953,2.6509,-1.3953;6.3801,2.81,-0.0538;7.7401,-1.4731,-1.1987;8.6757,-1.8044,-1.6544;7.7836,-1.6855,-0.1258;6.4784,-1.9625,-1.9696;6.6719,-2.7686,-2.6885)|\",6.28582994655\r\n\"[H]C1=C(C2([H])C([H])([H])C2([H])[H])[C@]2([H])C([H])([H])[C@](OC(=O)C([H])([H])[H])(C([H])([H])N([H])[H])[C@]2([H])C1([H])[H] |(8.0461,-0.1634,-4.7049;7.0338,-0.0026,-4.3425;6.3319,1.1253,-4.5407;6.7368,2.3388,-5.2963;6.4895,3.2761,-4.7975;6.5955,2.3837,-6.8121;6.2653,1.4727,-7.3038;6.2356,3.3009,-7.2708;7.9612,2.375,-6.1767;8.5469,3.2903,-6.1931;8.5553,1.4703,-6.2661;4.9591,1.0323,-3.9049;4.1575,1.1706,-4.6413;4.7482,1.8066,-2.5709;3.7263,2.1686,-2.4403;5.4332,2.6335,-2.3553;5.0206,0.499,-1.7782;3.8991,0.0417,-0.9553;3.398,0.8659,-0.0065;3.8338,1.9675,0.2524;2.2065,0.2271,0.6754;2.4603,-0.773,1.0406;1.3861,0.1124,-0.0415;1.8832,0.8593,1.5032;6.2972,0.5033,-0.9386;6.2174,1.331,-0.2181;7.1338,0.7348,-1.607;6.5456,-0.8099,-0.329;5.7682,-1.0479,0.2841;7.3723,-0.7552,0.2634;5.0063,-0.3145,-3.114;4.1013,-0.9235,-3.1708;6.2813,-1.0588,-3.5638;6.8405,-1.453,-2.708;6.033,-1.9206,-4.2016)|\",6.201474652895\r\n\"[H]C1=C(C([H])([H])C([H])([H])C([H])([H])[H])[C@]2([H])C([H])([H])[C@](OC(=O)C([H])([H])[H])(C([H])([H])N([H])[H])[C@]2([H])C1([H])[H] 
|(5.5575,1.858,-0.646;5.6735,0.9386,-0.075;4.7976,0.522,0.8524;3.509,1.2113,1.2189;3.3769,2.0888,0.5714;2.6673,0.5359,0.9944;3.3874,1.6528,2.6908;3.4812,0.7784,3.3469;4.228,2.3154,2.9373;2.0637,2.3667,2.9813;1.9989,2.6754,4.0308;1.2072,1.7136,2.7724;1.9516,3.265,2.3615;5.2214,-0.811,1.4376;4.4443,-1.576,1.3102;5.8921,-0.8232,2.8469;5.6537,-1.7159,3.4289;5.7367,0.0517,3.4839;7.2909,-0.9021,2.1795;8.0403,-2.1221,2.4799;8.314,-2.4339,3.7678;8.0161,-1.7508,4.7243;9.0392,-3.7607,3.845;9.9703,-3.7193,3.2702;8.4229,-4.5523,3.4063;9.2573,-3.9949,4.8876;8.1856,0.3155,2.4034;8.3297,0.4283,3.4887;7.6355,1.1972,2.0552;9.4325,0.2174,1.6345;9.9455,-0.6116,1.9284;10.0268,1.0162,1.8482;6.6169,-1.0853,0.7816;6.7725,-2.1045,0.4209;6.825,-0.0113,-0.3091;7.8117,0.4588,-0.2332;6.7676,-0.4521,-1.3157)|\",6.27494539253\r\n\"[H]OC(=O)C([H])([H])[C@@]1(C([H])([H])N([H])[H])C([H])([H])[C@@]2([H])C(C([H])([H])C([H])([H])[H])=C([H])C([H])([H])[C@@]12[H] |(2.9374,2.8247,-6.3581;3.4497,3.6686,-6.1657;4.236,3.5212,-5.0961;4.8725,4.4544,-4.6502;4.3627,2.1177,-4.4722;4.9825,2.2723,-3.5864;4.9604,1.5051,-5.1649;3.1081,1.3031,-4.0738;2.4551,0.5721,-5.2603;1.6939,-0.1256,-4.878;3.22,-0.0347,-5.7606;1.8916,1.4866,-6.2721;1.0325,1.9122,-5.9256;1.6455,0.9794,-7.1201;3.3902,0.2423,-2.9526;3.2984,-0.8029,-3.271;4.3631,0.368,-2.4677;2.2068,0.8134,-2.1097;1.3657,0.1172,-2.011;2.5857,1.4863,-0.8098;2.8564,0.711,0.4515;3.6239,-0.0526,0.2529;3.2794,1.3842,1.2072;1.6097,0.0163,1.0296;1.8605,-0.5346,1.943;1.1798,-0.6989,0.3189;0.834,0.7496,1.2766;2.6685,2.8135,-0.9718;2.9523,3.5076,-0.1836;2.3519,3.2866,-2.3716;3.1872,3.848,-2.8059;1.4958,3.976,-2.3798;2.0267,1.9746,-3.1392;1.0371,2.0364,-3.6069)|\",6.925297495224999\r\n\"[H]C1=C(C([H])([H])C([H])([H])[H])[C@]2([H])C([H])([H])[C@](OC(=O)C([H])([H])[H])(C([H])([H])N([H])[H])[C@]2([H])C1([H])[H] 
|(6.1287,-0.5319,-1.5549;5.1436,-0.9632,-1.3858;4.3678,-0.6597,-0.3333;4.6935,0.3447,0.7414;5.6533,0.8206,0.504;3.9391,1.1467,0.7156;4.7487,-0.2193,2.1725;5.0357,0.5643,2.8831;5.4827,-1.0296,2.249;3.7778,-0.6144,2.4898;3.0415,-1.3919,-0.4093;2.1949,-0.6927,-0.4088;2.8258,-2.6438,0.497;1.7867,-2.7686,0.8089;3.4629,-2.7462,1.3796;3.1986,-3.5668,-0.6935;2.1297,-4.4691,-1.1242;1.6003,-5.3436,-0.2377;1.9628,-5.4677,0.9128;0.4931,-6.147,-0.8864;0.8973,-6.7563,-1.7022;-0.257,-5.4792,-1.3215;0.0308,-6.7944,-0.1403;4.4927,-4.3589,-0.5179;4.3779,-4.989,0.3772;5.2951,-3.6425,-0.308;4.8418,-5.0984,-1.7377;4.0899,-5.7472,-1.963;5.6695,-5.6659,-1.5643;3.2023,-2.3316,-1.6507;2.3385,-2.3738,-2.3178;4.5107,-1.927,-2.363;5.1333,-2.7986,-2.5928;4.3035,-1.4295,-3.3224)|\",6.274945392529999\r\n\"[H]OC(=O)C([H])([H])[C@@]1(C([H])([H])N([H])[H])C([H])([H])[C@@]2([H])C(C([H])([H])[H])=C([H])C([H])([H])[C@@]12[H] |(6.9878,-1.1856,-1.8517;7.4025,-0.2849,-1.6801;6.79,0.342,-0.6727;7.0876,1.4784,-0.3645;5.7317,-0.4353,0.1336;5.3258,0.3041,0.8273;6.2697,-1.1718,0.7507;4.5613,-1.165,-0.5694;4.9547,-2.5404,-1.1385;4.0436,-3.0734,-1.452;5.4098,-3.1338,-0.3357;5.9384,-2.4601,-2.236;5.4918,-2.1232,-3.0883;6.3175,-3.3808,-2.4495;3.3136,-1.3688,0.3594;3.0889,-2.413,0.6066;3.3654,-0.8002,1.2929;2.3987,-0.7054,-0.7167;1.6724,-1.3976,-1.1612;1.7895,0.6249,-0.3423;0.5502,0.7173,0.4972;0.2703,1.7573,0.6928;0.6849,0.2161,1.466;-0.3,0.223,0.0065;2.5168,1.6362,-0.8338;2.2872,2.6876,-0.6756;3.722,1.1957,-1.632;4.6517,1.6034,-1.2188;3.6739,1.5601,-2.6679;3.6538,-0.3566,-1.5785;3.6615,-0.7781,-2.5904)|\",6.968835711304999\r\n\"[H]OC(=O)C([H])([H])[C@]1(C([H])([H])N([H])[H])C([H])([H])[C@]2([H])C([H])=C([H])C3(C([H])([H])C3([H])[H])[C@]12[H] 
|(0.3748,3.6216,-3.3501;-0.2426,3.1047,-3.947;-0.8529,2.1394,-3.2455;-1.5649,1.3208,-3.7871;-0.6309,2.132,-1.725;-1.2656,1.3274,-1.3563;-1.024,3.0726,-1.3127;0.8251,1.9158,-1.2368;1.5904,3.2391,-1.0716;2.5705,3.0348,-0.6129;1.0379,3.878,-0.371;1.7309,3.9911,-2.3352;2.4631,3.5751,-2.9102;2.0161,4.9513,-2.1526;1.6154,0.8145,-2.028;1.0943,0.3963,-2.894;2.6137,1.1358,-2.3502;1.634,-0.0933,-0.7667;2.6283,-0.451,-0.4707;0.5937,-1.1852,-0.7174;0.6716,-2.096,-1.3046;-0.4082,-0.8917,0.1199;-1.2654,-1.5287,0.3239;-0.2163,0.4285,0.7907;-1.3687,1.2356,1.3558;-1.3191,2.3184,1.2719;-2.3736,0.8333,1.2513;-0.4151,0.5535,2.2955;-0.759,-0.3244,2.8366;0.2935,1.1673,2.8472;0.9914,1.049,0.0869;1.666,1.5388,0.797)|\",6.5905974591100005\r\n\"[H]C1=C([H])[C@]2([H])[C@@]([H])(C([H])([H])[H])[C@](OC(=O)C([H])([H])[H])(C([H])([H])N([H])[H])[C@]2([H])C1([H])[H] |(6.5777,-0.0589,1.3672;5.7092,0.0911,0.7299;4.7834,1.0392,0.9162;4.7905,1.767,1.7243;3.7094,1.0066,-0.1396;3.6397,1.9508,-0.6958;2.3153,0.4136,0.2493;2.1509,0.3342,1.3309;1.1342,1.137,-0.3914;1.0677,2.1563,0.0077;0.1901,0.627,-0.1827;1.2529,1.2158,-1.4785;2.7603,-0.9588,-0.372;1.9669,-1.4239,-1.5093;0.6987,-1.865,-1.3519;0.1085,-1.9399,-0.2948;0.109,-2.2435,-2.6946;-0.8634,-2.714,-2.5443;0.7787,-2.9194,-3.2349;-0.0114,-1.3449,-3.3098;2.9438,-2.0756,0.6574;1.9904,-2.2129,1.1854;3.6754,-1.7198,1.391;3.4744,-3.3011,0.0443;2.8004,-3.663,-0.6281;3.5675,-4.0183,0.7617;4.046,-0.2864,-0.951;3.9603,-0.1914,-2.0355;5.4392,-0.7681,-0.4881;5.4488,-1.8425,-0.2746;6.198,-0.5953,-1.2655)|\",6.326647024125\r\n\"[H]OC(=O)C([H])([H])[C@]1(C([H])([H])N([H])[H])C([H])([H])[C@@]2([H])C([H])([H])C([H])([H])S[C@@]21[H] 
|(-1.498,-3.624,-3.7873;-0.9197,-2.8523,-3.6384;-1.0692,-2.5299,-2.3317;-1.8366,-3.1309,-1.6061;-0.2678,-1.295,-1.9703;-0.9529,-0.4427,-2.0623;0.5082,-1.1468,-2.727;0.3744,-1.3161,-0.5733;1.5517,-2.2978,-0.5304;1.9667,-2.2991,0.494;2.3443,-1.9177,-1.1895;1.1691,-3.6214,-1.0238;0.4781,-4.047,-0.4082;1.9774,-4.2395,-1.0433;-0.6573,-1.3737,0.6381;-1.5981,-1.8934,0.4579;-0.1723,-1.7785,1.5338;-0.6691,0.1797,0.5948;-1.3733,0.4548,-0.2009;-0.6404,1.3433,1.5729;-1.623,1.6445,1.9529;0.0045,1.1259,2.4323;-0.0305,2.5009,0.7126;0.5903,3.171,1.3114;-0.8181,3.0936,0.2401;1.0384,1.7838,-0.6848;0.7168,0.1058,-0.0092;1.4484,-0.0608,0.7916)|\",6.315762470105001\r\n\"[H]OC(=O)C([H])([H])[C@]1(C([H])([H])N([H])[H])C([H])([H])[C@@]2([H])C([H])([H])C([H])([H])O[C@@]21[H] |(-0.555,-4.3522,-3.5096;-0.1004,-3.4973,-3.3887;-0.4751,-3.0535,-2.165;-1.2626,-3.6677,-1.4731;0.114,-1.6893,-1.8663;-0.6447,-0.9538,-2.1617;0.9711,-1.5269,-2.5264;0.5425,-1.4652,-0.4073;1.8104,-2.2649,-0.0848;2.0716,-2.0904,0.9748;2.6373,-1.8582,-0.6834;1.6665,-3.6777,-0.4405;0.9485,-4.1189,0.1319;2.5371,-4.1762,-0.2687;-0.6432,-1.5196,0.6679;-1.4798,-2.1786,0.441;-0.2485,-1.7462,1.6643;-0.8358,0.0052,0.3948;-1.4139,0.0594,-0.5367;-0.9783,1.3675,1.0636;-1.9904,1.7831,1.0925;-0.5686,1.3821,2.0793;-0.0475,2.1896,0.073;0.5944,2.8888,0.6219;-0.6355,2.7569,-0.656;0.7986,1.2632,-0.6826;0.6102,0.043,0.0155;1.2307,0.0585,0.9306)|\",7.06407555898\r\n\"[H]OC(=O)C([H])([H])[C@@]1(C([H])([H])N([H])[H])C([H])([H])[C@@]2([H])C([H])([H])SC([H])([H])[C@@]21[H] 
|(-2.0909,-1.884,-1.9154;-1.8677,-2.6962,-1.3721;-1.81,-2.3772,-0.0715;-1.4976,-3.1914,0.7697;-2.1598,-0.9271,0.2936;-2.0856,-0.8778,1.382;-3.211,-0.7402,0.0316;-1.2549,0.1576,-0.3429;-1.8517,0.7148,-1.6455;-1.2684,1.5746,-2.0042;-2.8578,1.0899,-1.4235;-1.988,-0.3245,-2.6886;-1.1016,-0.4525,-3.1748;-2.6675,-0.0425,-3.3924;-0.8231,1.3437,0.6144;-1.3634,2.2913,0.5172;-0.8401,1.0262,1.6614;0.5932,1.26,-0.024;0.5842,1.8625,-0.941;2.0105,1.2762,0.516;2.0915,0.8497,1.5198;2.5109,2.2474,0.4896;2.8823,0.1384,-0.7178;1.3033,-0.6246,-1.4216;1.1277,-0.2032,-2.4173;1.4442,-1.7046,-1.5009;0.2866,-0.1853,-0.3821;0.4631,-0.7964,0.5139)|\",6.37290637871\r\n\"[H]OC(=O)C([H])([H])[C@@]1(C([H])([H])N([H])[H])C([H])([H])[C@@]2([H])C([H])([H])OC([H])([H])[C@@]21[H] |(-3.8053,-1.3834,-1.7919;-4.5481,-1.0416,-1.2203;-4.0692,-0.7845,0.0123;-4.75,-0.2576,0.8626;-2.6156,-1.2028,0.2499;-2.4674,-2.2183,-0.1382;-2.4671,-1.2369,1.3318;-1.5584,-0.2642,-0.3713;-1.7964,-0.0115,-1.8705;-0.9213,0.4816,-2.3153;-2.6427,0.6747,-1.9861;-2.1626,-1.2562,-2.5807;-1.4216,-1.9506,-2.4897;-2.2655,-1.0701,-3.5771;-1.2789,1.089,0.4089;-1.729,2.0029,0.01;-1.5488,0.9889,1.4643;0.2297,0.8538,0.1193;0.4234,1.2328,-0.8928;1.5708,0.8135,0.8247;1.4849,0.6334,1.9053;2.2228,1.6759,0.6561;2.2025,-0.3351,0.1763;1.2343,-1.1618,-0.533;1.339,-1.0081,-1.6176;1.4617,-2.2072,-0.2994;-0.076,-0.6336,0.0266;-0.1503,-1.0161,1.0549)|\",7.279045500875\r\n\"[H]OC(=O)C([H])([H])[C@@]1(C([H])([H])N([H])[H])C([H])([H])[C@@]2([H])SC([H])([H])C([H])([H])[C@]12[H] 
|(1.5808,-3.528,1.1886;1.2165,-4.2006,0.5406;0.0448,-3.7807,0.0453;-0.5491,-4.4097,-0.8031;-0.5173,-2.4694,0.6176;-0.7596,-2.6293,1.6782;-1.4613,-2.3115,0.0912;0.3776,-1.2132,0.4686;1.2672,-0.9902,1.7033;0.623,-0.9851,2.5902;1.755,-0.0067,1.6614;2.2522,-2.0781,1.8838;2.618,-2.0786,2.834;3.0508,-1.9368,1.2664;1.1398,-1.063,-0.9204;2.1954,-1.3471,-0.9687;0.5941,-1.5859,-1.7109;0.8136,0.4481,-0.8415;1.5928,0.9272,-0.2395;0.1845,1.7195,-2.0006;-0.9834,2.3344,-0.6347;-0.7206,3.3657,-0.3899;-1.9932,2.3262,-1.0524;-0.8887,1.3999,0.6203;-1.8689,1.3277,1.1043;-0.181,1.8137,1.3479;-0.3812,0.0959,0.0207;-1.1635,-0.2669,-0.6589)|\",6.44909825685\r\n\"[H]C1=C([H])[C@]2([H])C([H])([H])[C@](OC(=O)C([H])([H])[H])(C([H])([H])N([H])[H])[C@]2([H])C1([H])[H] |(8.2135,-0.3723,2.5713;7.1989,-0.5392,2.2171;6.841,-1.5023,1.3599;7.5208,-2.2256,0.9156;5.3616,-1.4939,1.0704;4.8786,-2.4413,1.34;4.9046,-0.9061,-0.297;3.9963,-1.3729,-0.6836;5.647,-0.8797,-1.1009;4.6234,0.4586,0.3908;3.2269,0.8916,0.3381;2.6168,1.0375,-0.8612;3.1537,0.8748,-1.936;1.1644,1.4187,-0.6686;0.7089,1.6151,-1.6399;1.0806,2.3019,-0.0275;0.6293,0.6038,-0.169;5.5291,1.6066,-0.0514;5.3883,1.7424,-1.1345;6.5664,1.2896,0.1032;5.2934,2.8177,0.7457;4.3242,3.1098,0.6342;5.8657,3.5778,0.3824;4.8689,-0.1941,1.7915;3.9233,-0.2893,2.3294;6.0311,0.3099,2.6748;6.1987,1.3852,2.5497;5.8209,0.1449,3.742)|\",6.397396625254999\r\n\"[H]OC1=NC(=O)N(C([H])([H])C([H])([H])[H])[C@@]([H])(N([H])[H])[C@@]1([H])C([H])(C([H])([H])[H])C([H])([H])[H] 
|(2.8732,-5.4594,-0.9795;3.5646,-4.9637,-1.4453;3.3057,-3.6398,-1.3523;4.1445,-2.8499,-1.8991;3.9206,-1.4554,-1.8409;4.5558,-0.7024,-2.562;2.9714,-0.9758,-0.9359;3.0016,0.4699,-0.6855;3.1726,0.9444,-1.6522;2.0131,0.7807,-0.3345;4.0873,0.9076,0.3044;4.0628,1.9952,0.4385;5.076,0.6328,-0.0745;3.956,0.4467,1.2917;2.4078,-1.8896,0.086;3.1751,-2.1159,0.8427;1.2765,-1.3606,0.8167;0.5815,-0.9662,0.1866;1.5661,-0.6215,1.4518;2.0495,-3.2215,-0.6167;1.8521,-3.9512,0.1807;0.7825,-3.1451,-1.5419;0.7035,-2.1026,-1.8777;0.879,-4.0098,-2.8111;-0.0242,-3.8746,-3.4162;0.9513,-5.0789,-2.5733;1.7401,-3.743,-3.4311;-0.4863,-3.5167,-0.7541;-1.3821,-3.3373,-1.3599;-0.5728,-2.951,0.1769;-0.4753,-4.5836,-0.4936)|\",5.730717691530001\r\n\"[H]OC(=O)[C@](O[H])(OC([H])([H])[H])C(F)(F)F |(2.4309,-3.7702,2.8795;1.5496,-3.3479,2.9129;1.5011,-2.4115,1.9558;0.549,-1.7007,1.7687;2.7831,-2.3102,1.0673;3.7982,-3.0223,1.7516;4.5524,-3.1254,1.1451;3.1378,-1.013,0.7516;3.3334,-0.1229,1.8575;3.693,0.8096,1.4207;4.0829,-0.5166,2.5527;2.3897,0.0587,2.3815;2.5251,-3.0053,-0.2941;1.998,-4.2282,-0.0904;1.7079,-2.3119,-1.0807;3.7082,-3.1706,-0.9354)|\",7.205574761239999\r\n\"[H]/C(C(=O)OC([H])([H])C([H])([H])[H])=C(/[H])C1(C2([H])C([H])([H])C2([H])[H])C([H])([H])C1([H])[H] 
|(3.5458,-0.6388,-1.8762;3.8058,-0.409,-0.8478;2.7718,0.2779,-0.0448;2.8727,0.6045,1.1235;1.6641,0.5018,-0.7982;0.5655,1.1753,-0.146;-0.3188,0.8528,-0.702;0.4954,0.8206,0.8854;0.7321,2.6875,-0.1965;-0.1484,3.1748,0.2383;0.8414,3.0318,-1.2302;1.6121,2.9955,0.3749;4.9982,-0.7105,-0.3147;5.1643,-0.4097,0.7206;6.1322,-1.4209,-0.9652;7.4769,-0.839,-0.586;7.6406,-0.8411,0.492;8.0568,0.3459,-1.3224;7.5184,0.6982,-2.1986;8.5398,1.1324,-0.7492;8.7267,-1.0031,-1.416;9.6633,-1.1548,-0.8862;8.6724,-1.5336,-2.3626;5.9279,-2.0979,-2.3007;4.9597,-1.999,-2.781;6.758,-2.1208,-2.9995;6.0184,-2.9448,-1.0615;6.9283,-3.5144,-0.8919;5.1105,-3.4097,-0.6883)|\",5.801467292660001\r\n\"[H]C1=C([H])C([C@]([H])(C([H])([H])[H])C([H])([H])/C([H])=C(\\[H])C(=O)C([H])([H])C([H])([H])[H])=C([H])C([H])=C1Cl |(-4.4423,-4.3889,5.3487;-4.2706,-3.3388,5.1368;-3.291,-2.9467,4.2265;-2.7044,-3.7167,3.7318;-3.0546,-1.5936,3.943;-1.9932,-1.1656,2.9387;-1.9895,-0.0667,2.918;-2.3239,-1.656,1.5165;-1.5863,-1.285,0.7952;-3.314,-1.3088,1.2035;-2.3229,-2.7508,1.4624;-0.566,-1.618,3.3502;-0.5089,-2.7131,3.3726;0.1253,-1.285,2.5608;-0.1076,-1.0537,4.6636;-0.1501,0.0329,4.7485;0.3436,-1.7783,5.6977;0.3934,-2.8641,5.6318;0.8191,-1.2433,6.9996;1.2344,-2.0182,7.8503;0.782,0.2629,7.2538;1.3788,0.7601,6.4758;-0.2486,0.6139,7.1035;1.2856,0.644,8.645;1.2481,1.7297,8.7832;2.3163,0.3093,8.7932;0.6779,0.1734,9.4233;-3.8372,-0.6403,4.6047;-3.678,0.4165,4.4029;-4.8248,-1.0106,5.519;-5.4239,-0.2599,6.0234;-5.0329,-2.3627,5.7778;-6.274,-2.8474,6.927)|\",5.205537960065\r\n\"[H]OC([H])([H])/C([H])=C(\\[H])[C@]([H])(C([H])([H])[H])C([H])([H])OC([H])([H])C1=C([H])C([H])=C(OC([H])([H])[H])C([H])=C1[H] 
|(0.6889,-2.8144,-1.521;0.1747,-3.277,-2.2023;1.0609,-3.532,-3.2816;1.8411,-4.2576,-2.9921;0.4518,-4.0306,-4.0475;1.7173,-2.3008,-3.8563;2.5343,-2.494,-4.5504;1.3577,-1.0508,-3.5573;0.5254,-0.9145,-2.8653;1.9691,0.2293,-4.0809;1.1716,0.7718,-4.6162;3.1426,0.0301,-5.0476;2.8464,-0.5757,-5.9104;3.9809,-0.4624,-4.5462;3.4963,0.997,-5.4233;2.3548,1.1514,-2.9141;1.4758,1.3362,-2.2715;2.6873,2.1293,-3.3034;3.3905,0.5483,-2.1618;3.7877,1.3131,-1.0393;4.2138,2.2796,-1.3597;2.8992,1.5512,-0.4254;4.8007,0.5433,-0.2235;4.804,-0.852,-0.2025;4.0969,-1.3889,-0.8266;5.7125,-1.5643,0.5856;5.6927,-2.6482,0.5702;6.6351,-0.8722,1.3779;7.5656,-1.4619,2.1873;7.6099,-2.878,2.2345;8.4116,-3.13,2.9313;7.8378,-3.3089,1.2501;6.6654,-3.3016,2.6015;6.642,0.5301,1.3638;7.371,1.0523,1.9754;5.7391,1.222,0.5665;5.7658,2.3099,0.5576)|\",5.8259575392050005\r\n\"[H]O[C@@]([H])(C([H])([H])[H])[C@@]1([H])N([H])[C@]([H])(C([H])([H])C2=C([H])C([H])=C(OC([H])([H])[H])C([H])=C2[H])C([H])([H])C1([H])[H] |(3.5246,0.8782,-1.5375;2.6746,0.3934,-1.5697;2.5109,-0.0519,-0.23;1.4474,-0.2984,-0.1149;3.3466,-1.3074,0.0388;3.1688,-1.7032,1.0471;4.4173,-1.0908,-0.0623;3.0941,-2.0858,-0.688;2.8523,1.1099,0.7315;2.8337,0.729,1.7647;4.1826,1.6499,0.3588;4.9419,1.2297,0.8889;4.1773,3.1149,0.4724;4.9348,3.5254,-0.2086;4.4813,3.6562,1.8982;4.4454,4.753,1.855;3.6825,3.3426,2.5826;5.8206,3.2084,2.4391;7.0006,3.8767,2.0949;6.949,4.7591,1.4602;8.251,3.451,2.5493;9.1381,4.0032,2.2602;8.3405,2.3219,3.3721;9.5046,1.8156,3.8767;10.7181,2.4731,3.5506;11.5083,1.9072,4.0475;10.7283,3.508,3.918;10.9004,2.4741,2.4676;7.1708,1.6384,3.7309;7.2543,0.7702,4.3774;5.9357,2.0804,3.2694;5.0371,1.5475,3.5743;2.7548,3.4575,0.0022;2.4328,4.4535,0.323;2.7169,3.4245,-1.0914;1.8913,2.3218,0.5898;1.042,2.0851,-0.0566;1.4909,2.6032,1.5697)|\",5.65452581339\r\n\"[H]OC1=N[C@@]2([H])C([H])([H])[C@]1([H])[C@]([H])(C1=C([H])C([H])=C(F)C([H])=C1[H])C2([H])[H] 
|(1.2084,5.4296,-2.1007;1.876,5.5898,-1.4163;1.5363,4.8966,-0.3052;2.1825,5.0017,0.7915;1.5448,3.9782,1.662;1.7551,4.1613,2.7172;0.0655,4.0011,1.2189;-0.422,4.9597,1.4213;-0.5349,3.1826,1.6278;0.4535,3.8248,-0.2664;-0.3266,3.8885,-1.0304;1.2655,2.4779,-0.2476;2.0096,2.5027,-1.0524;0.4541,1.2035,-0.4271;1.1415,-0.0043,-0.6355;2.2286,0.0011,-0.6703;0.4674,-1.2114,-0.8034;0.9988,-2.1434,-0.9659;-0.9228,-1.2083,-0.7661;-1.5877,-2.3724,-0.9304;-1.6422,-0.0387,-0.5687;-2.7267,-0.0719,-0.5493;-0.9457,1.1609,-0.4003;-1.5174,2.0706,-0.2473;1.9974,2.5848,1.1447;1.6719,1.7877,1.8211;3.0849,2.5324,1.0481)|\",6.05181203512\r\n\"[H]OC1=N[C@]2([H])C([H])([H])[C@@]1([H])C([H])([H])[C@@]2([H])C1=C([H])C([H])=C(F)C([H])=C1[H] |(3.3784,5.5866,2.026;3.3322,5.5383,1.0589;2.2344,4.8296,0.7108;1.8746,4.6815,-0.5067;0.7205,3.7474,-0.415;0.1129,3.7799,-1.3205;0.0479,4.1521,0.9188;-0.3501,5.1711,0.8912;-0.7228,3.4618,1.2772;1.379,4.0353,1.6885;1.4042,4.3415,2.7387;1.7388,2.5385,1.4225;1.1394,1.8937,2.0726;2.7944,2.3191,1.6104;1.3353,2.3454,-0.0883;2.2423,2.2414,-0.6929;0.4656,1.1271,-0.3459;1.0729,-0.138,-0.4097;2.1508,-0.2168,-0.2875;0.3326,-1.2972,-0.6322;0.8033,-2.2737,-0.6834;-1.0428,-1.1856,-0.8016;-1.7713,-2.3018,-1.0216;-1.684,0.0441,-0.7553;-2.7584,0.0952,-0.8993;-0.9229,1.1928,-0.5259;-1.4328,2.1505,-0.4983)|\",6.054533173625\r\n\"[H]C1=C(N2C(=O)[C@@]3([H])C([H])=C([H])[C@]2([H])C3([H])[H])C([H])=C([H])C(Cl)=N1 
|(3.4019,-1.2402,-0.7564;3.7181,-0.2231,-0.5419;2.8678,0.6752,0.1211;1.5992,0.2778,0.5675;0.4917,1.1105,0.7477;0.4759,2.3255,0.7844;-0.7076,0.1381,0.8637;-1.5896,0.6015,1.3028;-0.8554,-0.493,-0.5285;-1.6592,-0.2628,-1.218;0.2147,-1.2697,-0.7359;0.4794,-1.8253,-1.628;1.088,-1.1304,0.5212;1.8887,-1.8507,0.6778;-0.0037,-1.0271,1.6072;-0.6253,-1.9226,1.6771;0.3979,-0.76,2.5901;3.3486,1.9821,0.3322;2.722,2.714,0.8236;4.6222,2.3116,-0.1103;5.0253,3.3071,0.0362;5.3761,1.3204,-0.7439;7.0027,1.7114,-1.3047;4.9488,0.0926,-0.9595)|\",5.17560543651\r\n\"[H]C1=NC([H])=C([H])C([H])=C1N1C(=O)[C@@]2([H])C([H])=C([H])[C@]1([H])C2([H])[H] |(1.3577,-1.9486,-0.3675;0.7758,-1.0344,-0.2759;1.4281,0.0934,-0.5671;0.7503,1.2438,-0.492;1.3055,2.1475,-0.7362;-0.5951,1.3061,-0.1272;-1.1078,2.2627,-0.0826;-1.2735,0.1334,0.1863;-2.318,0.1396,0.4675;-0.5742,-1.0835,0.1151;-1.1891,-2.3059,0.4369;-2.5481,-2.5996,0.3094;-3.4665,-1.8184,0.1502;-2.6196,-4.1457,0.3903;-3.6212,-4.5136,0.6076;-1.9395,-4.6513,-0.8895;-2.4685,-5.0962,-1.7244;-0.6386,-4.3487,-0.803;0.1365,-4.4958,-1.546;-0.4648,-3.6133,0.5358;0.5423,-3.4836,0.9275;-1.4755,-4.386,1.4092;-1.2232,-5.4428,1.5235;-1.6356,-3.9202,2.387)|\",5.2790086997\r\n\"[H]C1=C([H])C([H])=C(N2C(=O)[C@@]3([H])C([H])=C([H])[C@]2([H])C3([H])[H])S1 |(-2.686,-0.2143,-1.0276;-1.6285,-0.1744,-0.8037;-0.7468,0.8362,-1.0517;-1.0226,1.7731,-1.5238;0.5836,0.5343,-0.6299;1.4184,1.2153,-0.7468;0.6877,-0.7155,-0.0663;1.8263,-1.307,0.4699;1.9641,-2.647,0.8054;1.0686,-3.4668,0.8926;3.4898,-2.8425,0.9914;3.7432,-3.7435,1.5478;4.0856,-2.6837,-0.4154;4.479,-3.5035,-1.0052;3.9251,-1.4061,-0.7812;4.1645,-0.9435,-1.7314;3.1993,-0.7186,0.3862;3.1695,0.3691,0.4093;3.8431,-1.4514,1.5832;4.92,-1.2836,1.6595;3.3498,-1.2271,2.5343;-0.862,-1.5455,-0.0486)|\",4.68580050561\r\n\"[H]C1=C([H])C(N2C(=O)[C@@]3([H])C([H])=C([H])[C@]2([H])C3([H])[H])=C([H])S1 
|(4.937,3.2893,-0.953;4.3856,2.3658,-0.8401;3.219,2.1692,-0.1625;2.6695,2.9399,0.3587;2.7746,0.8018,-0.2263;1.6153,0.3393,0.4121;0.5338,1.1132,0.8219;0.4774,2.3246,0.9238;-0.5939,0.0819,1.0912;-1.4005,0.4833,1.7028;-0.9614,-0.4805,-0.2892;-1.8799,-0.2487,-0.816;0.0834,-1.1945,-0.7253;0.2094,-1.6848,-1.6835;1.1574,-1.0817,0.3696;2.0012,-1.7693,0.3396;0.2681,-1.0897,1.6308;-0.2996,-2.0153,1.7523;0.8251,-0.8568,2.5441;3.6255,0.0044,-0.9562;3.5378,-1.0481,-1.1815;4.9762,0.9063,-1.5681)|\",5.07764445033\r\n\"[H]C1=C([H])C([H])=C(N2C(=O)[C@@]3([H])C([H])([H])[C@]2([H])[C@]2([H])O[C@@]23[H])C([H])=C1[H] |(-2.4634,-0.2244,-0.3818;-1.3816,-0.1301,-0.3601;-0.7744,1.1154,-0.5311;-1.3852,2.0008,-0.6866;0.6115,1.2523,-0.5001;1.0746,2.2191,-0.6401;1.4215,0.1196,-0.3022;2.829,0.2369,-0.2476;3.613,1.2451,-0.8196;3.2345,2.28,-1.3339;5.0676,0.7369,-0.6817;5.8097,1.5268,-0.7924;4.9545,-0.0089,0.6632;4.6961,0.6487,1.4964;5.8298,-0.6106,0.9051;3.7475,-0.8455,0.2;3.2818,-1.4967,0.9381;4.2782,-1.4844,-1.1044;3.6742,-2.1713,-1.6914;5.7032,-1.6616,-1.1637;5.134,-0.4417,-1.6793;5.2516,-0.2717,-2.7457;0.8112,-1.1355,-0.14;1.413,-2.0282,-0.0054;-0.578,-1.2526,-0.1648;-1.0285,-2.2334,-0.0373)|\",5.493978641595\r\n\"[H]C1=NC(/C(C#N)=N/OC(=O)OC([H])([H])C([H])=C([H])[H])=C([H])C([H])=C1[H] |(12.9736,-2.0871,-4.4598;11.9357,-2.4059,-4.3894;11.2486,-1.9304,-3.3472;9.966,-2.3,-3.2264;9.2552,-1.7404,-2.0482;9.9855,-0.8756,-1.1525;10.5433,-0.1771,-0.4113;8.015,-2.0542,-1.878;7.4928,-1.4555,-0.7376;6.168,-1.7979,-0.5421;5.5074,-2.5223,-1.2358;5.8068,-1.1576,0.5679;4.4281,-1.3778,0.9933;4.2017,-2.4399,0.8599;4.4383,-1.132,2.0564;3.4676,-0.513,0.235;3.395,-0.7066,-0.8333;2.7158,0.4242,0.8112;2.0072,1.0183,0.241;2.7751,0.6307,1.878;9.3172,-3.1526,-4.1308;8.2764,-3.4154,-3.9794;10.0491,-3.6384,-5.2091;9.5828,-4.302,-5.9317;11.386,-3.2606,-5.3464;11.991,-3.6195,-6.1733)|\",4.688521644114999\r\n\"[H]O[C@]([H])(/C([H])=C(\\[H])C1=C([H])C([H])=C([H])C([H])=C1[H])C([H])([H])/C([H])=C(\\[H])OC(=O)C([H])([H])[H] 
|(9.0501,3.2379,1.8373;8.4293,3.8138,1.3593;7.1415,3.2143,1.4713;6.4665,3.9268,0.977;6.7003,3.0235,2.9039;5.7957,2.4302,3.034;7.34,3.5369,3.9644;8.2144,4.1566,3.7693;6.9851,3.3878,5.3839;5.9473,2.5528,5.8389;5.3699,1.971,5.1262;5.6513,2.454,7.195;4.8463,1.8012,7.5226;6.3861,3.183,8.135;6.1539,3.1018,9.1934;7.4217,4.0117,7.7025;8.0016,4.5827,8.4228;7.7167,4.1093,6.3434;8.5241,4.7587,6.0123;7.0834,1.8829,0.6815;7.8054,1.1907,1.1449;6.0936,1.4258,0.8087;7.3946,2.0625,-0.7769;8.3222,2.5695,-1.0303;6.5838,1.6531,-1.7493;5.635,1.1473,-1.61;6.936,1.8609,-3.0756;6.0697,1.4352,-4.0455;5.0201,0.8824,-3.8163;6.621,1.764,-5.4099;5.9457,1.3809,-6.1754;7.616,1.3243,-5.532;6.728,2.8487,-5.5183)|\",5.09397128136\r\n\"[H]/C(C#N)=C(/[H])C1(C([H])([H])[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C1([H])[H] |(3.8041,2.069,-1.2531;3.5346,1.2127,-1.8643;3.9514,1.2669,-3.2304;4.2986,1.3274,-4.34;2.851,0.1689,-1.3655;2.6135,-0.6537,-2.0413;2.3482,-0.0099,0.0487;0.8028,-0.0947,-0.0433;0.3474,-0.2896,0.9317;0.4942,-0.9008,-0.7193;0.3874,0.8437,-0.4272;2.9401,-1.3494,0.5839;2.5841,-2.182,-0.0375;4.0319,-1.3151,0.4621;2.6177,-1.6133,2.0637;1.5407,-1.7868,2.191;3.1155,-2.5385,2.38;3.0596,-0.4412,2.9511;4.1572,-0.3678,2.9298;2.7791,-0.6262,3.9955;2.4584,0.8854,2.4645;2.8429,1.718,3.0666;1.3709,0.8755,2.6169;2.7837,1.1434,0.9834;3.8706,1.2792,0.8899;2.3195,2.083,0.6556)|\",6.174263267845001\r\n\"[H]OC1=N[C@]([H])(C([H])([H])C([H])([H])C([H])=C([H])[H])[C@@]([H])(C([H])([H])[H])O1 
|(7.4037,4.0407,-2.0695;8.0087,3.3001,-2.2466;7.2417,2.2063,-2.3009;5.9767,2.1623,-2.1466;5.61,0.7453,-2.2936;4.9516,0.6391,-3.1693;4.8654,0.2129,-1.0625;4.7009,-0.8675,-1.1826;5.5109,0.3401,-0.1819;3.5165,0.9143,-0.8239;3.7034,1.9951,-0.7688;2.8578,0.7503,-1.6874;2.8337,0.4459,0.4317;3.3844,0.6132,1.3597;1.6433,-0.1528,0.4875;1.2045,-0.4708,1.4296;1.0565,-0.3427,-0.4095;6.9656,0.0087,-2.5602;7.2286,-0.6535,-1.7267;7.0508,-0.7377,-3.8797;6.3311,-1.5649,-3.8859;8.0521,-1.1516,-4.0354;6.8161,-0.0701,-4.7159;7.9496,1.0876,-2.55)|\",7.194690207220001\r\n\"[H]C([H])([H])C(=O)[C@]1([H])C([H])([H])C([H])([H])[C@@]([H])(C([H])([H])[H])[C@]1([H])C(=O)C([H])([H])C(=O)OC([H])([H])C([H])([H])[H] |(2.905,-1.3919,-6.0812;2.1897,-2.1752,-5.7968;1.2164,-1.9035,-6.2181;2.5355,-3.1214,-6.2178;2.1424,-2.2949,-4.2879;2.6819,-3.2237,-3.7078;1.4425,-1.2142,-3.4557;0.9786,-1.7649,-2.6275;0.3783,-0.3228,-4.1107;0.7885,0.1796,-4.9944;-0.5098,-0.8851,-4.4211;0.0923,0.7031,-3.0062;-0.5497,0.2481,-2.2392;-0.4247,1.5965,-3.3719;1.4817,1.0377,-2.4013;1.8751,1.9174,-2.9204;1.4378,1.3255,-0.898;0.7484,2.1498,-0.6799;2.4241,1.6087,-0.5113;1.0945,0.4488,-0.3338;2.4096,-0.1683,-2.7662;2.8539,-0.6342,-1.88;3.535,0.2396,-3.715;3.4003,1.1225,-4.5422;4.8541,-0.5307,-3.5851;4.6472,-1.5995,-3.468;5.358,-0.1858,-2.6739;5.7267,-0.318,-4.8054;5.5145,-0.832,-5.8831;6.7479,0.5189,-4.5423;7.6033,0.8672,-5.6609;8.5635,1.1118,-5.2003;7.7187,-0.0112,-6.3001;7.0317,2.0462,-6.4332;7.7175,2.3299,-7.2399;6.8928,2.9111,-5.7768;6.0664,1.7841,-6.8748)|\",5.77425590761\r\n\"[H]O/C(=N/[C@]([H])(C(=O)OC([H])([H])[H])C([H])([H])OC([H])([H])C([H])([H])[C@@]1([H])OC1([H])[H])C([H])([H])[H] 
|(5.2431,-4.5698,2.0842;5.6649,-3.8904,1.5239;5.1401,-2.708,1.9279;4.2816,-2.721,2.8719;3.6888,-1.4637,3.3224;3.6409,-0.7012,2.5347;4.5835,-0.8627,4.4129;5.4354,-0.0245,4.1987;4.3697,-1.4151,5.6203;5.2284,-0.9461,6.6728;5.1052,0.1305,6.8159;6.2746,-1.1558,6.4348;4.9206,-1.491,7.5656;2.2669,-1.7464,3.8023;2.2908,-2.4647,4.6336;1.6959,-2.2006,2.9766;1.6946,-0.5151,4.1992;0.378,-0.6478,4.7061;-0.2854,-1.0992,3.9524;0.3811,-1.3093,5.5883;-0.1189,0.747,5.0706;0.0374,1.3991,4.2016;0.4992,1.1557,5.8808;-1.5751,0.8301,5.4782;-1.8984,1.8392,5.7455;-2.535,0.1231,4.6777;-2.357,-0.2817,6.0428;-3.1958,-0.0673,6.706;-1.9034,-1.2623,6.182;5.6878,-1.5469,1.1368;5.9963,-0.7407,1.8069;4.9141,-1.1499,0.4683;6.5351,-1.8719,0.5308)|\",6.82461537054\r\n\"[H]O/C(=N/[C@@]([H])(C(=O)OC([H])([H])C([H])([H])[H])C([H])([H])OC([H])([H])C([H])([H])C([H])=O)C([H])([H])[H] |(3.9896,3.71,-2.0319;3.5995,2.846,-1.8294;4.5663,2.0268,-1.3122;4.184,0.8603,-0.998;5.1031,-0.1153,-0.4461;5.9977,0.3209,0.0171;4.3814,-0.8425,0.6949;4.5919,-0.6291,1.8692;3.4821,-1.7297,0.2283;2.6886,-2.4345,1.2147;2.4594,-1.7483,2.0333;1.7678,-2.6901,0.685;3.4076,-3.6765,1.7191;2.756,-4.2304,2.4048;4.3182,-3.4015,2.2588;3.6726,-4.3385,0.888;5.5596,-1.0923,-1.5362;4.6838,-1.4691,-2.0809;6.0815,-1.9514,-1.0827;6.4329,-0.3867,-2.4023;6.8522,-1.1631,-3.5116;7.3848,-2.0668,-3.1773;5.9864,-1.4995,-4.1016;7.7688,-0.2985,-4.3646;7.2422,0.6078,-4.6989;8.6211,0.0631,-3.7701;8.3072,-1.0097,-5.5822;8.9755,-0.395,-6.2265;8.0565,-2.1594,-5.8729;5.93,2.6762,-1.2103;6.6959,1.9884,-0.8557;5.8893,3.5398,-0.5345;6.242,3.0348,-2.199)|\",5.93752421791\r\n\"[H]C1=C([H])C2=C(C(=O)[C@]3([H])C([H])([H])C([H])([H])C([H])([H])[C@]3(C([H])([H])[H])O2)C([H])=C1[H] 
|(7.0263,-3.3075,-0.5429;6.5131,-2.3564,-0.6593;5.1314,-2.3114,-0.5278;4.5506,-3.2014,-0.3079;4.4585,-1.0889,-0.6747;5.1835,0.0837,-0.9597;4.4834,1.3648,-1.1647;5.0791,2.4167,-1.3554;2.9497,1.3134,-1.1877;2.6701,1.2513,-2.2489;2.3308,2.5537,-0.5051;3.0234,3.3951,-0.5795;1.4062,2.8472,-1.0114;2.0446,2.128,0.9696;0.9853,2.265,1.2107;2.6089,2.7365,1.6826;2.4413,0.6364,1.0696;1.7973,0.0554,1.7373;3.4696,0.5343,1.4365;2.3787,0.1215,-0.3852;0.9542,-0.2334,-0.809;0.2764,0.6123,-0.6594;0.5892,-1.0773,-0.2146;0.9274,-0.519,-1.8656;3.1096,-1.1184,-0.5482;6.584,0.0131,-1.0776;7.1139,0.9356,-1.2938;7.251,-1.1924,-0.9306;8.3313,-1.2403,-1.0292)|\",4.696685059630001\r\n\"[H]O/C(=N/[C@]([H])(C(=O)OC([H])([H])[H])C([H])([H])OC([H])([H])C#CC([H])([H])[H])C([H])([H])[H] |(11.7684,-2.6051,2.5829;11.2279,-2.3736,1.8136;9.9131,-2.3568,2.1745;9.0942,-2.0404,1.255;7.6627,-2.0309,1.5313;7.415,-1.7794,2.5719;7.1269,-3.4542,1.3228;7.0965,-4.2912,2.2031;6.7684,-3.6969,0.0521;6.3197,-5.034,-0.22;7.1119,-5.755,-0.0014;6.0713,-5.048,-1.2816;5.4404,-5.2733,0.3837;7.0063,-0.9954,0.6226;7.1632,-1.272,-0.4289;7.4873,-0.0183,0.7915;5.629,-0.9507,0.9506;4.896,-0.0907,0.0897;5.3055,0.9337,0.1377;5.0014,-0.425,-0.9568;3.4852,-0.0809,0.4711;2.3135,-0.0575,0.766;0.8987,-0.0406,1.1287;0.7168,-0.6532,2.0199;0.2735,-0.4372,0.3196;0.5542,0.9769,1.3496;9.6083,-2.7301,3.614;8.9658,-1.9732,4.076;10.5148,-2.8162,4.2217;9.0594,-3.6755,3.6463)|\",6.688558445290001\r\n\"[H]OC(C([H])([H])[H])(C([H])([H])[H])[C@]1([H])OC(=O)C([H])=C([H])C1([H])[H] 
|(2.1177,-1.7516,0.6629;2.4806,-1.3899,-0.1608;2.5265,0.0416,-0.0181;3.4156,0.4178,1.1747;3.5366,1.5013,1.2552;4.4061,-0.0373,1.0696;2.969,0.0528,2.1085;1.1065,0.5906,0.1714;1.0981,1.6831,0.2366;0.6879,0.2052,1.1101;0.4454,0.2833,-0.6435;3.2127,0.4883,-1.3408;4.1966,0.0117,-1.3302;3.5089,1.9023,-1.2816;2.6822,2.8454,-1.8219;2.8717,4.0117,-1.5562;1.626,2.3867,-2.7477;0.9686,3.1665,-3.1167;1.5318,1.1093,-3.1318;0.7772,0.8061,-3.8559;2.4811,0.0659,-2.6256;3.215,-0.1479,-3.4168;1.9665,-0.8808,-2.4323)|\",5.918476248375001\r\n\"[H]C(=O)C1=C([H])C([H])=C(N([H])/N=C(\\[H])C2=C([H])C([H])=C([H])C([H])=C2[H])C([H])=C1[H] |(10.5685,0.3369,-4.4068;10.502,-0.5291,-3.7093;11.4592,-1.2672,-3.5471;9.2002,-0.674,-3.0425;8.9724,-1.7242,-2.1358;9.7801,-2.4229,-1.9405;7.7471,-1.86,-1.5095;7.5765,-2.675,-0.8092;6.7089,-0.9414,-1.7756;5.4978,-1.1117,-1.1285;5.3985,-1.8968,-0.4872;4.452,-0.2824,-1.3153;3.3678,-0.5298,-0.6648;3.3075,-1.3905,0.017;2.1695,0.2975,-0.7828;2.1193,1.4311,-1.6158;2.9994,1.701,-2.1904;0.9579,2.1907,-1.6974;0.9315,3.0635,-2.3444;-0.1741,1.8384,-0.9533;-1.0793,2.4353,-1.021;-0.1346,0.7165,-0.1246;-1.0087,0.4354,0.4564;1.028,-0.0476,-0.0403;1.0554,-0.9215,0.6071;6.9252,0.1138,-2.6812;6.1227,0.8131,-2.8797;8.1612,0.2339,-3.3;8.3297,1.0497,-4.0002)|\",3.823199599525\r\n\"[H]C1=C([H])C(C([H])([H])C([H])([H])[H])=C([H])C([H])=C1C(=O)N([H])[C@@]1([H])C(=O)OC1([H])[H] 
|(2.2534,4.0605,0.4056;2.7278,3.1143,0.6439;2.3391,1.9373,0.0138;1.5528,1.9667,-0.7373;2.9366,0.7098,0.3372;2.5342,-0.562,-0.3773;2.7049,-1.423,0.2807;1.4579,-0.5385,-0.5891;3.3043,-0.7701,-1.6947;2.9893,-1.6972,-2.1868;4.3832,-0.8304,-1.5134;3.1285,0.0594,-2.3885;3.9338,0.7008,1.3214;4.3937,-0.2421,1.6082;4.3342,1.8773,1.9519;5.0793,1.8195,2.7414;3.741,3.1012,1.6122;4.0991,4.4069,2.2512;3.3451,5.3737,2.241;5.343,4.4756,2.8466;5.9826,3.6995,2.7557;5.7659,5.6439,3.5538;4.8736,6.2394,3.7639;6.6616,5.4486,4.7929;6.6601,4.7955,5.7923;7.6223,6.3292,4.3521;6.9531,6.5066,3.0584;6.7522,7.5617,2.863;7.5453,6.0638,2.2533)|\",5.673573782925001\r\n\"[H]O/C(=N/[C@@]1([H])C(=O)OC1([H])[H])C1=C([H])C([H])=C([H])C([H])=C1[H] |(2.7645,-1.4725,-0.751;3.2274,-0.6203,-0.7893;2.386,0.3497,-0.3266;2.9217,1.48,-0.089;2.1355,2.615,0.2886;1.1076,2.6169,-0.1029;2.1436,2.9884,1.7917;1.8137,2.4783,2.8243;2.7013,4.2187,1.5967;2.84,3.9811,0.1494;2.3198,4.7584,-0.4142;3.8916,3.9101,-0.1308;0.9537,-0.0646,-0.2197;0.3581,-0.753,-1.2916;0.9344,-0.9475,-2.1928;-0.9735,-1.1569,-1.2174;-1.4287,-1.6745,-2.0571;-1.7192,-0.8937,-0.0659;-2.7558,-1.2136,-0.0052;-1.1303,-0.2212,1.0063;-1.7045,-0.0235,1.9071;0.1991,0.197,0.9348;0.6566,0.706,1.7785)|\",5.58921848927\r\n\"[H]/C(C(=O)OC([H])([H])[H])=C(\\C#N)C([H])([H])C([H])(C([H])([H])[H])C([H])([H])[H] 
|(6.4488,-0.1504,-0.9328;5.4432,-0.0948,-1.3365;5.2846,-0.6517,-2.7019;4.2576,-0.7568,-3.3458;6.4915,-1.0529,-3.1615;6.4902,-1.6191,-4.4823;5.8523,-2.5062,-4.5176;7.5279,-1.8821,-4.6871;6.1253,-0.8901,-5.2106;4.4587,0.4603,-0.5943;4.8242,0.9335,0.7165;5.0815,1.3279,1.7806;3.0046,0.6488,-0.9675;2.7552,1.7057,-0.8017;2.8797,0.4303,-2.0287;2.0294,-0.2223,-0.132;2.2088,-0.0075,0.9314;0.5832,0.1756,-0.4583;-0.1251,-0.4099,0.1388;0.3559,-0.0042,-1.5167;0.4014,1.2367,-0.2505;2.2586,-1.7215,-0.3683;1.5528,-2.3172,0.2216;3.2708,-2.0287,-0.0828;2.1183,-1.9767,-1.4256)|\",5.330710331295\r\n\"[H]C([H])([H])OC(=O)/C(=C(\\C#N)[Si](C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C([H])([H])C([H])(C([H])([H])[H])C([H])([H])[H] |(7.8458,2.8661,2.8605;6.8935,2.5259,3.266;7.0517,1.77,4.0389;6.3276,3.3604,3.688;6.1864,1.9567,2.1487;4.9742,1.4498,2.4235;4.4337,1.527,3.5088;4.3317,0.7766,1.2366;5.0256,-0.0694,0.4279;6.4207,-0.3173,0.646;7.5414,-0.6259,0.7369;4.3028,-1.1593,-0.9997;3.2579,-2.5345,-0.2218;2.8722,-3.2046,-1.0001;2.4005,-2.147,0.3388;3.8616,-3.1372,0.466;3.2789,-0.128,-2.2146;3.0264,-0.7489,-3.0832;3.8475,0.7341,-2.5812;2.3411,0.2412,-1.7888;5.7595,-1.9122,-1.9346;5.3884,-2.5325,-2.7596;6.3815,-2.5402,-1.2895;6.4081,-1.1395,-2.3621;2.85,1.0435,1.1239;2.3623,0.2263,0.5846;2.4384,1.0501,2.1405;2.4738,2.3826,0.429;2.8891,2.3525,-0.5878;0.9456,2.4845,0.3178;0.652,3.4006,-0.2073;0.4803,2.506,1.3115;0.5215,1.6342,-0.2303;3.0529,3.6168,1.136;2.7204,4.5329,0.6343;4.1482,3.6187,1.1247;2.7253,3.6668,2.1813)|\",5.24091276063\r\n\"[H]C([H])([H])OC(=O)/C(=C(/C#N)C([H])([H])C([H])(C([H])([H])[H])C([H])([H])[H])[Si](C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] 
|(6.7034,4.3602,-5.2942;5.7411,4.3954,-4.7777;5.5583,5.3886,-4.3677;4.9498,4.1139,-5.4767;5.747,3.5072,-3.6422;5.9811,2.2075,-3.9264;6.173,1.7879,-5.0495;6.0439,1.3653,-2.6901;4.8907,0.8494,-2.2016;4.9323,-0.0044,-1.0445;4.9306,-0.6882,-0.1015;3.5092,1.0808,-2.8003;2.7556,1.0125,-2.0055;3.4719,2.1076,-3.1822;3.1298,0.105,-3.9452;3.9068,0.1951,-4.7159;3.0771,-1.3568,-3.481;2.8009,-2.0138,-4.3137;2.3309,-1.4931,-2.6875;4.0415,-1.7022,-3.0943;1.7896,0.5341,-4.5597;1.5196,-0.117,-5.3988;1.8293,1.5641,-4.9339;0.9803,0.4764,-3.8202;7.8149,1.0668,-1.9914;8.2199,-0.774,-2.1175;9.2314,-0.9647,-1.7384;8.1811,-1.1135,-3.1585;7.522,-1.3805,-1.5317;9.0057,2.0724,-3.0628;10.0355,1.9267,-2.7155;8.7889,3.1457,-3.0105;8.9594,1.7682,-4.1139;7.8781,1.68,-0.2048;8.8861,1.5389,0.2047;7.1768,1.1372,0.4361;7.6406,2.7482,-0.146)|\",5.4939786415950005\r\n\"[H]C([H])=C(C(=O)OC([H])([H])C([H])([H])[H])C([H])([H])/C([H])=C(\\[H])C1=C([H])C([H])=C([H])O1 |(7.8053,-1.4387,-1.3481;7.6241,-0.393,-1.1218;8.4111,0.3227,-1.3351;6.462,0.0201,-0.6019;6.315,1.4913,-0.3517;7.1748,2.3254,-0.5589;5.0846,1.7926,0.1313;4.824,3.1878,0.4122;5.2936,3.7956,-0.365;3.7374,3.2789,0.3391;5.3252,3.5794,1.7945;5.0487,4.6181,2.0092;6.4142,3.4959,1.8454;4.8829,2.9396,2.5653;5.2924,-0.8948,-0.2718;4.9283,-0.631,0.7312;4.4569,-0.6677,-0.9479;5.6134,-2.3606,-0.3183;6.3546,-2.7213,0.3934;5.0412,-3.235,-1.1625;4.3018,-2.8856,-1.8818;5.308,-4.6541,-1.2232;4.8031,-5.6518,-2.019;4.0609,-5.5289,-2.7958;5.446,-6.8652,-1.6206;5.2938,-7.8543,-2.0294;6.296,-6.5262,-0.6121;6.9869,-7.0847,0.001;6.2276,-5.1894,-0.3558)|\",4.58783951943\r\n\"[H]C1=C([H])[C@@]2([H])O[C@]1([H])P(Cl)C2([H])[H] 
|(-0.4199,-2.2614,-1.1832;-0.398,-1.4418,-0.4749;-1.2974,-0.4657,-0.3013;-2.2329,-0.3123,-0.8264;-0.6758,0.5073,0.6948;-1.356,1.0418,1.3601;0.1739,-0.3461,1.468;0.7724,-1.0927,0.4183;1.4105,-1.886,0.8044;1.6942,0.2556,-0.6358;3.2227,0.8123,0.7295;0.3103,1.4351,-0.0847;0.7326,2.2034,0.5677;-0.1578,1.9169,-0.9474)|\",5.630035566845001\r\n\"[H]OC([H])([H])C1=C([H])[C@]2(C([H])([H])/C([H])=C(\\C([H])([H])[H])C([H])([H])O[H])C(=O)O[C@]2([H])C1([H])[H] |(3.58,-7.8169,-4.0504;3.4352,-8.3117,-3.2279;2.5877,-7.5152,-2.3997;1.5993,-7.3611,-2.8591;2.4379,-8.1147,-1.4935;3.2129,-6.1937,-2.0553;2.7065,-4.9787,-2.3134;1.7545,-4.7843,-2.7984;3.6206,-3.8558,-1.8832;2.9818,-2.7205,-1.0671;2.5201,-3.1499,-0.1718;3.7836,-2.0513,-0.7199;1.9953,-1.9174,-1.877;2.4071,-1.5101,-2.8006;0.7134,-1.6445,-1.5864;-0.0398,-2.1129,-0.3668;0.4595,-2.9291,0.1609;-1.041,-2.4466,-0.6597;-0.1776,-1.2872,0.3457;-0.0866,-0.7749,-2.5348;0.5579,-0.4107,-3.3481;-0.4808,0.102,-2.0063;-1.2468,-1.4336,-3.0512;-0.9302,-2.1966,-3.5606;4.5687,-3.4285,-3.0232;4.5324,-2.7148,-3.9839;5.6259,-4.183,-2.5745;4.8806,-4.6156,-1.3763;5.3842,-4.2371,-0.4848;4.5696,-6.1115,-1.3799;5.3392,-6.6751,-1.9201;4.5358,-6.5198,-0.3601)|\",6.18242668336\r\n\"[H]O[C@@]1([H])[C@]2([H])OC(=O)C([H])([H])[C@]2([H])O[C@]1([H])C([H])([H])C1=C([H])C([H])=C([H])C([H])=C1[H] 
|(6.0935,-1.5592,0.3122;6.1692,-0.6723,0.701;5.18,0.1505,0.0923;5.0619,1.0271,0.7307;5.6279,0.5192,-1.3295;6.7095,0.6646,-1.3813;4.9779,1.7161,-1.7976;4.3058,1.5137,-2.976;3.6964,2.3934,-3.521;4.4829,0.0646,-3.4122;3.5297,-0.3674,-3.7239;5.1618,0.0437,-4.2733;5.0821,-0.6396,-2.1971;5.85,-1.3704,-2.4697;4.0991,-1.3129,-1.398;3.8729,-0.6214,-0.1478;3.7644,-1.3931,0.6232;2.5703,0.2054,-0.1944;2.6406,0.9789,-0.9638;1.7769,-0.4841,-0.5068;2.2225,0.8383,1.1367;2.5102,2.1873,1.386;2.9656,2.7883,0.6019;2.2108,2.7661,2.6205;2.4357,3.8154,2.7924;1.6191,2.002,3.6275;1.3818,2.452,4.5876;1.327,0.6571,3.3913;0.8608,0.0561,4.1677;1.626,0.0826,2.1557;1.3834,-0.9629,1.9753)|\",6.4763096419\r\n\"[H]/N=C(\\C1=C([H])C([H])=C(/C([H])=N/OC([H])([H])C(=O)O[H])C([H])=C1[H])N([H])[H] |(0.3586,-3.9061,-0.5835;-0.5347,-4.1693,-0.1642;-1.22,-3.1238,0.1304;-0.7553,-1.7072,0.0012;0.5933,-1.3857,0.2328;1.2819,-2.1684,0.5382;1.0477,-0.0822,0.1039;2.0909,0.1511,0.2905;0.1618,0.9474,-0.2661;0.5938,2.3365,-0.4158;-0.1363,3.0772,-0.7498;1.7974,2.7059,-0.168;1.9607,4.0841,-0.3486;3.3508,4.4313,-0.3364;3.8814,3.8882,-1.13;3.3737,5.4975,-0.5615;4.0905,4.2132,0.9942;4.9776,4.9559,1.3381;3.7367,3.1329,1.709;3.043,2.6386,1.2155;-1.1845,0.6267,-0.5014;-1.8787,1.4099,-0.7961;-1.6391,-0.682,-0.3636;-2.6789,-0.9213,-0.5629;-2.5343,-3.2711,0.5546;-2.8484,-2.6331,1.2755;-2.7755,-4.2392,0.7373)|\",4.367427300525\r\n\"[H]OC([H])([H])[C@@]1([H])[C@]2([H])N(C([H])([H])[C@@]1([H])O[H])[C@@]([H])(C([H])([H])O[H])[C@@]([H])(O[H])[C@@]2([H])O[H] 
|(-0.098,1.3831,2.7503;0.2274,2.2027,2.3388;-0.661,2.478,1.2708;-1.6435,2.8087,1.6494;-0.2309,3.3259,0.7258;-0.8943,1.315,0.2921;-1.6039,1.7021,-0.4531;0.3337,0.7597,-0.5145;0.3028,1.1994,-1.5192;0.1338,-0.7256,-0.5984;-1.2304,-1.0134,-0.1477;-1.9314,-0.9424,-0.9929;-1.3045,-2.0305,0.2525;-1.5777,0.0694,0.8851;-2.6611,0.2178,0.9958;-0.9999,-0.217,2.1717;-1.4566,-0.982,2.5538;1.2511,-1.4279,0.0858;0.8626,-2.1571,0.8058;2.0748,-2.1856,-0.9734;1.5988,-3.1502,-1.2022;3.0867,-2.3688,-0.6075;2.1811,-1.4116,-2.1729;1.25,-1.1547,-2.3412;2.048,-0.3168,0.8362;1.6184,-0.2067,1.8353;3.43,-0.567,0.9735;3.8475,-0.0374,0.2657;1.7655,0.9608,0.0219;1.864,1.8565,0.6325;2.7306,1.0892,-1.0232;2.6564,0.2928,-1.5992)|\",7.023258481405\r\n\"[H]C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[C@@]1([H])OC([H])([H])[C@@]2(C(=O)C([H])([H])C([H])([H])C([H])([H])C2([H])[H])C1([H])[H] |(1.2461,6.339,-3.6254;1.3345,5.2542,-3.4967;0.6341,4.9539,-2.7076;1.003,4.7801,-4.4291;2.7696,4.8467,-3.1473;3.0804,5.3524,-2.2221;3.4523,5.2026,-3.9321;2.9456,3.3325,-2.9759;2.6344,2.8253,-3.9021;2.2623,2.9749,-2.1911;4.3815,2.9235,-2.6245;4.6954,3.4228,-1.6992;5.0631,3.2776,-3.4128;4.559,1.4118,-2.446;4.2844,0.8899,-3.3755;3.8728,1.0451,-1.6689;5.9878,1.0115,-2.0663;6.678,1.3963,-2.8302;6.3832,1.6188,-0.8232;6.4449,0.6086,0.1808;7.1131,0.9628,0.9715;5.459,0.4122,0.6228;6.9628,-0.6451,-0.5415;6.6026,-1.9423,0.1898;5.8613,-1.9591,1.1565;7.223,-3.2181,-0.3641;6.9607,-4.0437,0.3039;6.7614,-3.4213,-1.3428;8.7476,-3.0814,-0.5517;9.2274,-3.0055,0.4337;9.1429,-3.9895,-1.0225;9.0932,-1.8414,-1.3863;8.7182,-1.9754,-2.4107;10.1816,-1.7333,-1.4701;8.4979,-0.571,-0.7643;8.7244,0.3085,-1.3783;8.9763,-0.3972,0.2102;6.1752,-0.5178,-1.8803;6.6951,-0.972,-2.7289;5.198,-1.0086,-1.7924)|\",5.869495755285\r\n\"[H]C([H])([H])C([H])(C([H])([H])[H])[C@@]1([H])OC([H])([H])[C@@]2(C(=O)C([H])([H])C([H])([H])C([H])([H])C2([H])[H])C1([H])[H] 
|(1.0928,2.0608,0.4537;1.4866,1.2227,1.0405;2.0863,1.6422,1.8596;0.6466,0.6847,1.4872;2.3372,0.2949,0.1622;1.7096,-0.0742,-0.6622;3.5235,1.0623,-0.4423;3.1727,1.9358,-1.0035;4.1172,0.4493,-1.13;4.197,1.4255,0.3456;2.825,-0.9264,0.9763;3.5373,-0.5739,1.7333;1.7685,-1.5609,1.7011;1.136,-2.478,0.8112;0.6222,-3.2229,1.4265;0.3978,-1.9771,0.1743;2.2595,-3.0974,-0.0583;1.8714,-3.173,-1.5467;0.8998,-2.5959,-2.0012;2.7997,-3.9818,-2.4482;2.3278,-4.0598,-3.432;3.7259,-3.4007,-2.5751;3.1584,-5.3593,-1.8592;2.264,-5.9976,-1.8603;3.8967,-5.8538,-2.5018;3.6818,-5.2229,-0.4242;4.6296,-4.6662,-0.428;3.9043,-6.2118,-0.0046;2.6515,-4.5066,0.4566;3.0146,-4.4052,1.4874;1.7443,-5.1261,0.506;3.4309,-2.0662,0.1227;4.2616,-2.5397,0.6561;3.8276,-1.7009,-0.8292)|\",5.850447785749999\r\n\"[H]C1=C([H])C([H])=C([C@@]2([H])OC([H])([H])[C@@]3(C(=O)C([H])([H])C([H])([H])C([H])([H])C3([H])[H])C2([H])[H])C([H])=C1[H] |(-2.8993,0.6768,-1.1838;-1.8741,0.4633,-0.8928;-0.8621,0.4439,-1.8538;-1.0948,0.6438,-2.8966;0.4547,0.1791,-1.4748;1.2429,0.1788,-2.2254;0.7737,-0.0789,-0.1372;2.1927,-0.4217,0.2509;2.8884,0.0629,-0.4579;2.474,0.0162,1.5745;3.6747,-0.6347,1.9767;4.5468,-0.0366,1.6679;3.6601,-0.7203,3.0642;3.7148,-2.0246,1.2795;3.4825,-3.174,2.2791;3.1768,-2.9827,3.4411;3.666,-4.5806,1.7235;3.535,-5.2911,2.5446;2.8723,-4.7691,0.9856;5.0366,-4.7343,1.0339;5.8316,-4.6489,1.7876;5.1223,-5.7377,0.6;5.2301,-3.6586,-0.0412;4.5043,-3.8199,-0.8505;6.2242,-3.7498,-0.4961;5.0615,-2.2481,0.5406;5.1596,-1.4951,-0.2526;5.8808,-2.0554,1.2485;2.5017,-1.9352,0.2974;2.718,-2.3474,-0.6932;1.6258,-2.4593,0.6955;-0.2459,-0.0575,0.8224;0.0061,-0.2381,1.8625;-1.5609,0.2151,0.4457;-2.3432,0.2369,1.2001)|\",5.913033971365\r\n\"[H]C1=C([H])C([H])=C([C@]2(C([H])([H])[H])OC([H])([H])[C@]3(C(=O)C([H])([H])C([H])([H])C3([H])[H])C2([H])[H])C([H])=C1[H] 
|(4.4215,-4.3972,2.3293;3.9879,-3.4921,1.9125;3.8062,-3.3675,0.5351;4.0986,-4.178,-0.128;3.251,-2.2037,-0.0011;3.1067,-2.1052,-1.0717;2.8762,-1.1433,0.8312;2.2272,0.1182,0.2507;0.7111,0.0844,0.485;0.4843,-0.0053,1.5524;0.2721,-0.7749,-0.0315;0.2457,0.9981,0.0995;2.4121,0.1631,-1.1821;3.4948,1.0345,-1.4652;4.4645,0.5487,-1.2801;3.4284,1.3154,-2.521;3.3022,2.2167,-0.4819;4.608,3.0155,-0.3243;5.6087,2.6043,0.2238;4.4434,4.3867,-0.9821;4.3371,5.1263,-0.1748;5.3386,4.6608,-1.5485;3.1473,4.2737,-1.8013;3.3697,3.8843,-2.8029;2.6376,5.2326,-1.9351;2.2928,3.2556,-1.0124;1.5039,2.7973,-1.6172;1.8054,3.7608,-0.1678;2.8919,1.4332,0.7809;3.7897,1.2074,1.3628;2.2078,2.0085,1.4131;3.0595,-1.2792,2.2147;2.7726,-0.4691,2.8817;3.6097,-2.4424,2.7523;3.7466,-2.5269,3.8273)|\",5.766092492095\r\n\"[H]C1([H])C(=O)[C@@]2(C([H])([H])OC3(C([H])([H])C([H])([H])C([H])([H])C([H])([H])C3([H])[H])C2([H])[H])C([H])([H])C1([H])[H] |(-5.4138,-1.1814,-0.1668;-4.4018,-1.5175,0.0791;-4.3196,-2.5608,-0.2598;-3.3521,-0.7079,-0.6837;-3.426,-0.362,-1.8443;-2.1814,-0.3772,0.2569;-2.1388,1.1396,0.5738;-2.828,1.4462,1.3677;-2.3673,1.7151,-0.3366;-0.8192,1.3754,1.0286;0.0988,0.562,0.2555;0.7955,1.4512,-0.7924;0.056,1.8178,-1.5157;1.1929,2.3311,-0.2691;1.9423,0.7213,-1.5095;1.5396,-0.1163,-2.0979;2.4219,1.3985,-2.2272;2.9733,0.1882,-0.5033;3.4603,1.0371,-0.0015;3.7651,-0.365,-1.024;2.305,-0.7073,0.5509;3.0382,-1.0284,1.3017;1.9429,-1.6252,0.065;1.1358,0.008,1.2461;1.518,0.8556,1.8311;0.6365,-0.6628,1.9564;-0.7928,-0.5418,-0.3923;-0.889,-0.3814,-1.4719;-0.3907,-1.5484,-0.2381;-2.4615,-1.2293,1.5136;-1.9949,-2.2172,1.4026;-2.0517,-0.7764,2.4221;-3.9987,-1.3826,1.5566;-4.3195,-2.2302,2.1695;-4.4548,-0.4823,1.9873)|\",5.836842093225\r\n\"[H]C1([H])C(=O)[C@@]2(C([H])([H])OC3(C([H])([H])C([H])([H])C([H])([H])C3([H])[H])C2([H])[H])C([H])([H])C1([H])[H] 
|(-4.5086,-1.261,2.8589;-3.8273,-0.4046,2.8668;-3.6647,-0.1354,3.9206;-2.4658,-0.7787,2.2763;-1.856,-1.8045,2.499;-1.9875,0.345,1.3433;-2.0683,-0.1301,-0.1493;-2.9639,0.2282,-0.6666;-2.0451,-1.2292,-0.2005;-0.9389,0.429,-0.7931;0.1602,0.2914,0.1316;1.3181,1.2083,-0.3284;0.9389,1.998,-0.9839;1.7646,1.6924,0.55;2.3563,0.2822,-0.9826;2.0638,0.057,-2.0164;3.3613,0.7173,-1.0101;2.2614,-0.99,-0.1238;2.7688,-0.8269,0.8373;2.7236,-1.8676,-0.5891;0.7458,-1.1578,0.0891;0.4845,-1.7483,0.9721;0.3105,-1.6597,-0.7835;-0.4811,0.682,1.4843;-0.0156,0.1564,2.3236;-0.3668,1.7588,1.6532;-2.9606,1.5036,1.649;-2.5814,2.0899,2.4971;-3.0676,2.1908,0.8027;-4.2837,0.8144,2.048;-4.9562,1.4777,2.6001;-4.8215,0.4831,1.1502)|\",5.869495755285\r\n\"[H]C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[C@@]1([H])OC([H])([H])[C@]2(C(=O)C([H])([H])C([H])([H])C2([H])[H])C1([H])[H] |(0.1455,1.3577,0.8422;0.64,1.652,-0.0906;0.2288,1.0304,-0.8957;0.3605,2.6917,-0.3017;2.1608,1.4956,0.0079;2.407,0.454,0.2585;2.5386,2.1048,0.8413;2.8955,1.8904,-1.2797;2.6524,2.9342,-1.5303;2.5128,1.2832,-2.1141;4.4177,1.7265,-1.1895;4.6658,0.6869,-0.9407;4.8018,2.3374,-0.3587;5.15,2.1145,-2.4792;4.9486,3.1682,-2.7219;4.7524,1.5276,-3.3203;6.6714,1.9277,-2.3936;7.056,2.5228,-1.557;7.0359,0.5716,-2.1007;7.1064,-0.1451,-3.3196;7.8309,-0.9536,-3.1756;6.1358,-0.6004,-3.5749;7.5099,0.8812,-4.4417;8.9374,0.5568,-4.9275;9.9604,0.8771,-4.363;8.855,-0.219,-6.2479;9.2842,0.4241,-7.0286;9.4723,-1.1222,-6.2099;7.3538,-0.4621,-6.4763;7.0533,-1.4206,-6.0344;7.0795,-0.4957,-7.5349;6.671,0.702,-5.7227;5.6125,0.5105,-5.5151;6.7236,1.6154,-6.3303;7.4227,2.2348,-3.7032;8.4386,2.5712,-3.4729;6.9291,3.0046,-4.3055)|\",5.953851048940001\r\n\"[H]C([H])([H])C([H])(C([H])([H])[H])[C@@]1([H])OC([H])([H])[C@]2(C(=O)C([H])([H])C([H])([H])C2([H])[H])C1([H])[H] 
|(3.5699,-2.074,-0.1326;3.6622,-1.0287,-0.4504;4.6744,-0.6906,-0.1924;3.5598,-0.9865,-1.5378;2.6031,-0.1571,0.2394;1.6084,-0.5275,-0.0508;2.7321,-0.2427,1.7678;2.6573,-1.2835,2.1023;1.9523,0.3291,2.2831;3.7044,0.1424,2.1013;2.6897,1.2978,-0.2327;3.6618,1.7172,0.0806;2.6244,1.3378,-1.6656;2.2248,2.6516,-2.0022;3.0472,3.3684,-1.8455;1.9355,2.661,-3.0572;1.0566,2.9669,-1.0231;0.8846,4.4852,-0.8618;1.6536,5.2101,-0.2659;-0.3908,4.9293,-1.5818;-1.1359,5.1683,-0.809;-0.2203,5.8469,-2.1534;-0.8207,3.7054,-2.4083;-0.3374,3.7295,-3.3933;-1.8999,3.6611,-2.5826;-0.3008,2.5017,-1.5909;-0.2084,1.5841,-2.1812;-0.9941,2.2902,-0.7656;1.5486,2.239,0.2514;1.9184,2.9624,0.9838;0.73,1.6746,0.7112)|\",5.87221689379\r\n\"[H]OC(=O)C([H])([H])[C@@]([H])(N([H])[H])C([H])([H])C(=O)OC([H])([H])C([H])([H])C([H])([H])C([H])([H])[H] |(4.999,0.3507,-1.3298;5.2472,0.72,-2.211;6.3752,0.1453,-2.6683;6.9153,0.5446,-3.678;6.9737,-0.9682,-1.8129;6.1912,-1.5566,-1.3236;7.5275,-1.6284,-2.4861;7.9748,-0.404,-0.7584;8.3524,-1.2597,-0.1849;9.1371,0.2728,-1.3347;8.8221,0.934,-2.0463;9.708,-0.4071,-1.8352;7.3184,0.5788,0.249;6.9905,1.4767,-0.2871;8.0721,0.8708,0.9818;6.1105,-0.0052,0.9454;5.0103,-0.1411,0.4198;6.3782,-0.4018,2.1932;5.297,-1.025,2.9461;4.6766,-1.5942,2.2509;5.8118,-1.7098,3.6252;4.4778,0.0112,3.7057;4.0259,0.7022,2.983;3.6436,-0.5202,4.1857;5.2732,0.7874,4.764;5.7008,0.077,5.4856;6.1245,1.2871,4.2855;4.4195,1.8205,5.506;5.0106,2.3593,6.2546;3.5788,1.3434,6.0243;4.0034,2.5618,4.8132)|\",6.16065757532\r\n\"[H]OC(=O)/C([H])=C([H])/C([H])=C([H])/C([H])=C(\\[H])[C@@]1([H])C([H])=C([H])C(=O)C([H])([H])C1([H])[H] 
|(-4.0536,-0.319,-3.1581;-4.5592,-0.2382,-2.3333;-3.7119,-0.1807,-1.2642;-4.1662,-0.0782,-0.1466;-2.2664,-0.2509,-1.5815;-1.9513,-0.3441,-2.6205;-1.3478,-0.1978,-0.5929;-1.7231,-0.1041,0.4256;0.0784,-0.2559,-0.7834;0.4482,-0.3547,-1.8044;0.9677,-0.1912,0.2382;0.5895,-0.09,1.2558;2.4031,-0.25,0.0756;2.7784,-0.3627,-0.9427;3.2811,-0.178,1.0948;2.9027,-0.0505,2.1095;4.7839,-0.2338,0.9409;4.9995,-0.4147,-0.121;5.3713,1.1129,1.3144;5.2227,1.9185,0.5964;6.0184,1.358,2.4659;6.429,2.3392,2.6907;6.2075,0.3267,3.512;6.8328,0.5618,4.5345;5.5389,-1.0218,3.2638;6.0926,-1.7855,3.8182;4.5376,-0.9746,3.7172;5.4385,-1.3647,1.7709;4.8841,-2.2991,1.6288;6.449,-1.5304,1.376)|\",3.90755489318\r\n\"[H]N([H])[C@]([H])(C([H])([H])C(=O)OC([H])([H])[H])C([H])([H])C(=O)OC([H])([H])C([H])([H])C([H])([H])[H] |(3.5441,-2.8068,0.3297;3.2207,-3.5509,0.9468;2.3457,-3.8993,0.558;2.9843,-2.9854,2.28;2.5728,-3.7884,2.9026;1.9626,-1.8254,2.2859;1.0523,-2.1482,1.7611;2.346,-0.967,1.7247;1.534,-1.3678,3.6656;1.8527,-1.8801,4.7197;0.7119,-0.2969,3.5845;0.2251,0.206,4.8393;-0.3622,1.0911,4.5925;1.0563,0.4663,5.4995;-0.3994,-0.5414,5.337;4.3352,-2.6131,2.9123;4.9598,-3.5114,2.9702;4.1914,-2.2629,3.94;5.1248,-1.5653,2.1517;4.7647,-0.9924,1.1388;6.3143,-1.3427,2.7477;7.1857,-0.3641,2.1321;8.19,-0.6541,2.4536;7.1131,-0.4589,1.0451;6.8514,1.0545,2.5818;5.8295,1.292,2.2659;6.8694,1.0899,3.6783;7.8328,2.0788,2.0013;7.5817,3.0914,2.3336;7.8117,2.0742,0.905;8.8625,1.871,2.3173)|\",6.481751918910001\r\n\"[H]OB(O[H])C([H])([H])N(C(=S)[C@@]1([H])N([H])C([H])([H])C([H])([H])C1([H])[H])C([H])([H])[H] 
|(0.1298,-2.8968,1.0856;0.6967,-2.1484,0.8402;-0.0378,-1.1233,0.322;-1.3984,-1.2873,0.225;-1.8495,-0.517,-0.1478;0.6694,0.2285,-0.1264;0.32,1.0714,0.4808;0.3659,0.471,-1.1581;2.1423,0.2288,-0.101;2.908,0.8622,0.8127;4.5941,0.8818,0.7795;2.2533,1.5801,2.006;3.0869,2.0897,2.502;1.1854,2.5461,1.675;1.5791,3.3831,1.248;0.5294,2.8361,2.9568;1.1577,3.4543,3.623;-0.415,3.3674,2.7966;0.3477,1.4338,3.5608;0.2813,1.4622,4.6524;-0.5787,0.984,3.1872;1.5821,0.6393,3.0545;1.2975,-0.3217,2.6202;2.2951,0.4253,3.8549;2.7296,-0.5384,-1.199;2.3295,-1.5573,-1.1772;3.8118,-0.5623,-1.0904;2.4666,-0.0712,-2.1571)|\",4.623214319995\r\n\"[H]OB(O[H])C([H])([H])N(C(=S)[C@@]([H])(N([H])[H])C([H])([H])[H])C([H])([H])[H] |(2.9199,-1.5369,-5.6118;3.3582,-1.0022,-4.9311;3.129,-1.5069,-3.6868;2.3847,-2.6416,-3.5413;2.2725,-2.8924,-2.6119;3.7376,-0.7094,-2.4424;3.162,0.2038,-2.2988;4.7607,-0.3794,-2.6614;3.7593,-1.5041,-1.1978;2.9316,-1.3623,-0.1351;3.0316,-2.3869,1.1962;1.8773,-0.2226,-0.0799;1.3921,-0.3955,0.8818;2.4211,1.138,0.0035;2.9335,1.41,-0.8319;3.0737,1.1978,0.7833;0.7652,-0.3119,-1.1366;0.319,-1.3123,-1.1385;1.0974,-0.0871,-2.1544;-0.0128,0.413,-0.8793;4.7992,-2.5401,-1.1618;5.3723,-2.4921,-2.0904;4.3608,-3.5362,-1.0488;5.4619,-2.3811,-0.3058)|\",4.59872407345\r\n\"[H]OB(O[H])C([H])([H])N(C(=S)[C@@]([H])(N([H])[H])C([H])(C([H])([H])[H])C([H])([H])[H])C([H])([H])[H] 
|(6.2341,3.4337,-2.7121;6.3404,2.4691,-2.7181;5.1283,1.8464,-2.7231;3.9869,2.598,-2.7419;3.1673,2.0764,-2.7457;5.118,0.2475,-2.6639;5.2896,-0.0613,-1.6336;5.939,-0.1761,-3.2536;3.8376,-0.3142,-3.1432;2.7689,-0.551,-2.3373;1.1933,-0.3156,-2.8809;2.9363,-1.1933,-0.9232;2.1371,-1.9429,-0.9578;4.1568,-1.9349,-0.6226;4.9146,-1.3534,-0.2813;4.4923,-2.4599,-1.4252;2.539,-0.253,0.2518;1.5857,0.194,-0.0515;2.297,-1.0703,1.5302;1.9336,-0.4205,2.3347;3.2137,-1.5611,1.8721;1.5463,-1.8505,1.3615;3.5288,0.8923,0.5156;3.1595,1.5203,1.3339;3.6558,1.5427,-0.3558;4.5189,0.5288,0.8229;3.634,-0.1157,-4.5818;4.5906,-0.2937,-5.083;3.2915,0.8986,-4.8197;2.8833,-0.8173,-4.9456)|\",4.454503732685\r\n\"[H]C(=O)C([H])([H])C([H])([H])[C@@]([H])(C#N)C1=C(C(F)(F)F)C([H])=C([H])C([H])=C1[H] |(2.2753,0.9817,-5.0026;2.0931,1.632,-4.1178;1.8829,2.8169,-4.2547;2.1296,0.9129,-2.7878;1.3749,0.1133,-2.8255;3.0921,0.386,-2.7176;1.9073,1.8463,-1.5985;0.9728,2.3993,-1.7347;2.7015,2.5998,-1.5717;1.8815,1.1074,-0.2293;2.8265,0.5723,-0.1116;1.8301,2.1012,0.8576;1.7849,2.9002,1.6987;0.7037,0.131,-0.0972;0.86,-1.2655,0.0032;2.2286,-1.9016,0.0248;2.1712,-3.2403,0.152;2.9211,-1.6439,-1.1185;2.988,-1.4357,1.045;-0.2627,-2.0959,0.0954;-0.1237,-3.1676,0.1738;-1.5464,-1.5554,0.0945;-2.4091,-2.2105,0.1694;-1.7113,-0.1743,0.0066;-2.7058,0.2624,0.0168;-0.5946,0.6539,-0.0866;-0.7307,1.7306,-0.1385)|\",5.953851048940001\r\n\"[H]/C(C(=O)OC([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])=C(/[H])C1=C(C([H])([H])[H])N=C(C([H])([H])[H])S1 
|(7.7263,1.8271,-2.24;7.7687,1.4757,-1.2135;6.925,2.173,-0.2266;6.8566,1.9088,0.9614;6.223,3.171,-0.8209;5.3465,3.9389,0.0319;5.822,4.0668,1.0083;5.2664,4.9117,-0.462;3.9779,3.2791,0.1736;3.5574,3.1141,-0.8275;4.109,2.2934,0.6359;3.0118,4.1243,1.0145;3.4488,4.2965,2.0079;2.8971,5.1163,0.5543;1.6337,3.4731,1.1699;0.9668,4.0931,1.7794;1.7129,2.492,1.6533;1.1537,3.3242,0.195;8.5509,0.4482,-0.823;8.5141,0.1864,0.2324;9.4311,-0.3424,-1.6441;10.2442,-1.3932,-1.2528;10.3918,-1.9493,0.1328;9.7972,-1.4052,0.8693;11.4427,-1.9141,0.4419;10.0868,-3.0022,0.1519;10.9838,-1.9776,-2.2569;10.7718,-1.4163,-3.4137;11.4534,-1.8284,-4.6819;11.9856,-2.7655,-4.5036;12.1787,-1.0719,-5.0058;10.7376,-1.9721,-5.4984;9.6182,-0.0918,-3.3743)|\",4.234091513779999\r\n\"[H]C1=C(/C([H])=C(\\[H])C(=O)OC(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])SC(C([H])([H])[H])=N1 |(6.2037,-2.6479,-0.2601;5.3063,-2.2846,0.2284;5.1185,-0.9806,0.6337;5.9873,0.1724,0.5297;5.6109,1.1193,0.9119;7.2267,0.1888,0.0004;7.7065,-0.6946,-0.4084;7.994,1.4533,-0.0377;7.5868,2.523,0.3824;9.2038,1.24,-0.601;10.1813,2.3293,-0.7663;9.6113,3.4098,-1.6924;10.3874,4.1524,-1.9103;8.7607,3.9163,-1.2338;9.2908,2.9672,-2.6421;10.5803,2.8872,0.6046;11.4136,3.5889,0.4865;10.9123,2.0755,1.2612;9.7464,3.4067,1.0789;11.3648,1.6194,-1.429;12.1755,2.3317,-1.6146;11.0649,1.1784,-2.3853;11.7459,0.8191,-0.7863;3.5274,-0.8642,1.3631;3.2749,-2.5739,1.0516;2.0101,-3.2733,1.4438;2.063,-4.3059,1.0912;1.1288,-2.7932,1.0036;1.8756,-3.2807,2.532;4.2838,-3.1609,0.4618)|\",4.25586062182\r\n\"[H]/C(=C(/[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])C([H])([H])C([H])([H])ONO 
|(6.3883,-2.4556,4.0082;6.8698,-2.8664,3.1202;6.2102,-2.8646,1.9585;6.702,-3.274,1.0723;4.822,-2.3251,1.747;4.1702,-3.1297,1.3723;4.3993,-2.005,2.7085;4.7764,-1.1532,0.7472;5.4097,-0.3384,1.1244;5.2234,-1.4717,-0.2061;3.3575,-0.6292,0.4935;2.7286,-1.4496,0.1161;2.9086,-0.3181,1.4487;3.3055,0.5435,-0.4942;3.9341,1.3624,-0.1166;3.7527,0.2327,-1.4492;1.8853,1.0627,-0.741;1.8822,1.8997,-1.4485;1.4236,1.4123,0.1907;1.2413,0.276,-1.1531;8.2653,-3.3957,3.316;8.659,-3.787,2.3685;8.26,-4.2325,4.03;9.2508,-2.3452,3.8384;9.2963,-1.4796,3.1703;10.2555,-2.7681,3.9501;8.8299,-1.8376,5.1199;9.6688,-2.342,6.1536;9.309,-1.9587,7.2112)|\",4.590560657935001\r\n\"[H]/C(=C(/[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])C([H])(OC([H])([H])[H])OC([H])([H])[H] |(9.1623,0.9272,-4.475;9.4408,-0.0547,-4.0917;8.5346,-1.0293,-3.9973;8.8513,-1.9934,-3.6016;7.0854,-0.9013,-4.3779;6.8962,0.092,-4.8075;6.8591,-1.6333,-5.1666;6.1386,-1.1353,-3.183;6.3369,-2.1288,-2.7546;6.3874,-0.4097,-2.3975;4.6429,-1.0169,-3.5203;4.072,-1.0519,-2.5815;4.446,-0.0261,-3.9565;4.1044,-2.1031,-4.4629;4.6285,-2.0527,-5.4257;4.3347,-3.0934,-4.0405;2.5902,-1.9959,-4.6947;2.0826,-2.0162,-3.7197;2.3594,-1.0143,-5.1357;1.996,-3.1009,-5.5843;2.234,-4.0818,-5.1481;0.901,-3.0173,-5.5621;2.466,-3.0651,-7.0436;1.9724,-3.8448,-7.6351;3.5468,-3.2236,-7.13;2.2342,-2.0986,-7.5087;10.89,-0.1587,-3.687;11.0889,0.5001,-2.8257;11.1866,-1.4896,-3.3452;12.4408,-1.6439,-2.6975;12.4722,-1.0788,-1.7534;13.2685,-1.3097,-3.3342;12.5532,-2.7089,-2.4803;11.7667,0.3577,-4.6785;11.7048,-0.3079,-5.936;12.4727,0.1536,-6.5618;10.7248,-0.1845,-6.4157;11.9119,-1.3801,-5.8359)|\",7.21101703825\r\n\"[H]/C1=C(\\[H])[C@]2([H])C([H])([H])C(=O)OC([H])([H])[C@]23C(=O)C([H])([H])C([H])([H])[C@@]3(C([H])([H])[H])C1([H])[H] 
|(-0.9803,1.0795,0.3784;0.0718,1.265,0.1756;0.4415,2.1695,-0.7408;-0.2907,2.7674,-1.2798;1.9117,2.5098,-0.8668;2.0444,3.0926,0.0603;2.4789,3.4365,-1.9331;1.9054,4.3637,-2.0401;2.5505,2.9666,-2.9179;3.8943,3.8363,-1.5053;4.503,4.7476,-2.0031;4.5005,3.1454,-0.4759;3.979,1.9732,0.1988;4.8616,1.3584,0.3825;3.5757,2.2848,1.17;2.887,1.3088,-0.6837;3.7587,0.6388,-1.7711;3.9913,0.9838,-2.9061;4.3151,-0.6408,-1.0847;4.297,-1.4634,-1.8042;5.3683,-0.486,-0.8242;3.4013,-0.8837,0.1582;3.8971,-0.5784,1.0871;3.1272,-1.938,0.2683;2.1819,0.0107,-0.1689;1.4349,-0.7303,-1.3283;0.8656,-1.5583,-0.8886;2.1137,-1.1654,-2.0648;0.733,-0.0944,-1.8657;1.1119,0.3924,0.8766;0.6383,-0.5039,1.294;1.5675,0.9262,1.7255)|\",5.942966494919999\r\n\"[H]OC([H])([H])[C@@]12C([H])=C([H])C(=O)C([H])([H])[C@]1(C([H])([H])[H])C([H])([H])C([H])([H])[C@]2([H])OC([H])([H])OC([H])([H])[H] |(5.5842,-1.8506,1.2982;5.3954,-0.9742,1.664;4.4193,-1.1184,2.6914;4.6372,-1.9987,3.308;4.534,-0.2352,3.3275;2.9579,-1.196,2.1532;2.8232,-2.441,1.3142;3.6587,-2.6732,0.6535;1.7732,-3.2753,1.3457;1.7499,-4.1925,0.7625;0.5712,-2.983,2.157;-0.3506,-3.7809,2.2485;0.535,-1.6077,2.805;-0.2022,-1.6258,3.6147;0.1706,-0.9159,2.0349;1.9148,-1.1319,3.3278;2.2845,-1.9785,4.56;3.2133,-1.645,5.0345;2.3929,-3.0403,4.3103;1.4917,-1.8995,5.313;1.8649,0.3791,3.6751;0.8854,0.6566,4.0801;2.597,0.6046,4.4596;2.2081,1.1555,2.3784;2.9709,1.919,2.5646;1.3337,1.6754,1.9747;2.6984,0.1132,1.3369;3.6278,0.4233,0.8421;1.7124,-0.1404,0.3373;1.6924,0.8061,-0.7183;1.3235,1.7817,-0.3729;2.7217,0.9236,-1.1124;0.8129,0.3903,-1.7025;1.2163,-0.7985,-2.3724;0.4994,-0.9566,-3.1811;1.208,-1.6617,-1.6984;2.2246,-0.6866,-2.8008)|\",5.00145257219\r\n\"[H]O[C@@]([H])(C([H])([H])[H])C([H])([H])C#CC([H])([H])OC([H])([H])C1=C([H])C([H])=C(OC([H])([H])[H])C([H])=C1[H] 
|(1.9002,-2.6737,-2.1015;2.1937,-1.9501,-2.6823;2.965,-1.0661,-1.8805;3.39,-0.3439,-2.587;2.0987,-0.3246,-0.8608;2.6794,0.4338,-0.3216;1.6864,-1.0235,-0.1238;1.2676,0.1714,-1.371;4.1521,-1.8191,-1.224;4.785,-2.2276,-2.0222;4.7704,-1.1121,-0.6543;3.7031,-2.9097,-0.3558;3.2741,-3.8182,0.3191;2.7782,-4.9007,1.1665;2.0655,-4.5018,1.9098;3.6137,-5.3424,1.7381;2.1531,-5.8924,0.3649;1.7012,-6.9975,1.1383;2.5349,-7.3648,1.7645;0.8996,-6.6853,1.829;1.2111,-8.09,0.2222;0.0514,-8.8072,0.5119;-0.5393,-8.5434,1.3865;-0.3749,-9.8654,-0.2978;-1.2811,-10.4021,-0.0417;0.3694,-10.2062,-1.4316;0.0523,-11.2148,-2.2967;-1.1022,-11.9922,-2.0269;-1.1526,-12.7393,-2.8212;-2.0152,-11.382,-2.0451;-1.029,-12.4997,-1.0557;1.5311,-9.482,-1.7421;2.0907,-9.7559,-2.6311;1.9451,-8.4429,-0.9218;2.8388,-7.8795,-1.1712)|\",5.869495755285\r\n\"[H]C1=C(/C([H])=C(\\[H])C(=O)OC([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])SC(C([H])([H])[H])=N1 |(7.003,-2.8377,3.1213;7.8859,-2.2079,3.1194;8.2779,-1.4246,2.0545;7.6803,-1.2335,0.7504;8.1858,-0.5565,0.0645;6.5495,-1.812,0.2978;5.958,-2.5007,0.8924;6.0653,-1.5056,-1.0625;6.6109,-0.7598,-1.8566;4.9088,-2.1667,-1.3158;4.3145,-1.9497,-2.6144;3.2456,-2.1224,-2.4585;4.4746,-0.9083,-2.9073;4.8819,-2.9047,-3.6607;5.9595,-2.7236,-3.7542;4.7556,-3.9369,-3.3079;4.2047,-2.7315,-5.0271;3.1216,-2.8892,-4.9223;4.3322,-1.6944,-5.3671;4.7571,-3.6874,-6.0898;4.248,-3.5506,-7.0505;4.6257,-4.734,-5.7884;5.8288,-3.5216,-6.2529;9.7516,-0.5909,2.5094;9.695,-1.3746,4.08;10.7328,-1.122,5.13;10.538,-1.7847,5.9765;10.7017,-0.0844,5.4837;11.7439,-1.3145,4.7539;8.6735,-2.1766,4.2358)|\",4.247697206305\r\n\"[H]C1=C(/C([H])=C(\\[H])C(=O)N(C([H])([H])[H])C([H])([H])[H])SC(C([H])([H])C([H])(C([H])([H])[H])C([H])([H])[H])=N1 
|(6.8759,-1.9521,1.6761;6.2451,-1.5597,0.8857;6.6934,-0.7426,-0.1302;8.013,-0.2188,-0.4154;8.1246,0.4203,-1.2888;9.1313,-0.4428,0.3033;9.0984,-1.0541,1.1977;10.4039,0.1992,-0.1317;10.418,0.9958,-1.0729;11.5568,-0.1216,0.5562;12.7782,0.6057,0.241;13.5738,-0.0898,-0.0564;13.1267,1.175,1.114;12.5675,1.2903,-0.5787;11.6542,-1.0663,1.6579;12.6388,-1.5453,1.624;10.9087,-1.8574,1.5751;11.5508,-0.579,2.6389;5.3243,-0.3605,-1.1573;4.2885,-1.3346,-0.1224;2.8291,-1.5219,-0.4166;2.3626,-1.884,0.5071;2.3761,-0.5501,-0.6522;2.525,-2.5107,-1.5732;3.0731,-2.1603,-2.4601;2.9985,-3.9327,-1.2458;2.8254,-4.6036,-2.0951;2.4518,-4.3355,-0.3836;4.0653,-3.9602,-1.0023;1.0254,-2.4858,-1.8999;0.7946,-3.1643,-2.7291;0.6912,-1.4809,-2.1842;0.4304,-2.8049,-1.0346;4.9175,-1.8841,0.8856)|\",4.26130289883\r\n\"[H]/C1=C(\\[H])[C@@]2(C(=O)OC([H])([H])[H])C(=O)C([H])([H])C([H])([H])[C@@]2(C([H])([H])[H])C([H])([H])C1=O |(2.2204,-1.9801,-3.669;2.1325,-1.451,-2.7237;3.2097,-1.1862,-1.9696;4.1969,-1.5116,-2.2921;3.1989,-0.3857,-0.6846;3.7116,-1.2142,0.4986;4.2407,-0.7483,1.4826;3.4732,-2.5298,0.3293;3.9206,-3.38,1.4028;5.0024,-3.2924,1.5298;3.6464,-4.392,1.1051;3.4279,-3.1043,2.3386;4.2012,0.7938,-0.9379;5.3901,0.6437,-1.0776;3.4179,2.0932,-1.0606;3.5664,2.6599,-0.1315;3.8168,2.7043,-1.8754;1.9596,1.6516,-1.2317;1.7519,1.4652,-2.2914;1.2418,2.4025,-0.8871;1.8189,0.3288,-0.4144;1.6086,0.6741,1.0727;2.4413,1.2438,1.494;1.4924,-0.2256,1.6856;0.6941,1.2688,1.1786;0.6307,-0.5218,-0.8992;0.5178,-1.3954,-0.2393;-0.3043,0.0452,-0.8351;0.769,-1.0601,-2.3155;-0.194,-1.2095,-3.0523)|\",4.83274198488\r\n\"[H]OC(=O)C([H])([H])[C@@]([H])(O[H])/C(=C(\\[H])C1=C([H])C([H])=C([H])C([H])=C1[H])C([H])([H])[H] 
|(4.6176,-1.6619,-2.3164;5.479,-1.2573,-2.5735;5.7457,-0.2285,-1.7454;6.7381,0.4472,-1.875;4.7401,0.0046,-0.6088;5.0197,-0.6647,0.2177;4.8952,1.0278,-0.2599;3.2698,-0.2238,-0.9874;3.0516,0.3512,-1.8961;3.0772,-1.6024,-1.4231;3.0654,-2.1531,-0.6211;2.221,0.1249,0.0567;2.5599,0.358,1.3407;3.6112,0.2696,1.6079;1.6985,0.7011,2.4874;0.5424,1.4963,2.3863;0.2547,1.9101,1.4261;-0.2171,1.8005,3.5156;-1.1028,2.422,3.4138;0.1603,1.322,4.7712;-0.4332,1.5612,5.6492;1.3166,0.548,4.8928;1.6286,0.1816,5.8671;2.0793,0.2517,3.7661;2.9819,-0.3465,3.8696;0.8116,0.1136,-0.4881;0.0684,-0.0749,0.2902;0.7165,-0.6627,-1.2546;0.5593,1.0697,-0.9683)|\",5.25179731465\r\n\"[H]O[C@@]1([H])C([H])C([H])[C@@]([H])(O[H])[C@@]([H])(O[H])C([H])([H])[C@]([H])(C([H])([H])[H])OC(=O)C1([H])[H] |(0.633,-3.3,4.2066;-0.0644,-3.291,3.5308;0.3514,-2.4177,2.4855;-0.5109,-2.3544,1.8129;0.7355,-1.0445,2.9619;1.4566,-1.0113,3.7814;0.4286,0.0744,2.3063;-0.2883,0.0288,1.4856;1.1206,1.3969,2.4707;0.4168,2.1577,2.832;2.1595,1.3808,3.4523;2.7471,0.6398,3.2258;1.683,1.975,1.1362;0.8489,2.0997,0.4368;2.1491,3.2913,1.4214;2.6482,3.2097,2.2541;2.8363,1.2035,0.4372;3.2552,1.9225,-0.276;3.635,1.0073,1.1655;2.5575,-0.0944,-0.3412;3.4636,-0.3507,-0.9039;1.3866,0.0129,-1.3189;1.5335,0.8973,-1.9497;0.4321,0.1286,-0.7961;1.3231,-0.8693,-1.9551;2.3912,-1.1252,0.6748;1.9118,-2.3592,0.4073;1.7781,-2.8309,-0.7009;1.539,-3.0705,1.7018;1.2641,-4.098,1.4542;2.4175,-3.0927,2.3608)|\",6.77019260044\r\n\"[H]OC([H])([H])[C@@]1([H])S[C@]1([H])[Si](C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] 
|(6.2211,-2.4769,-3.5634;6.7112,-2.5567,-2.7281;5.7431,-2.6148,-1.6974;5.0847,-3.4917,-1.7998;6.3091,-2.7248,-0.7638;4.8951,-1.3609,-1.6172;5.4683,-0.4489,-1.4555;3.5984,-1.1571,-2.9185;3.5157,-1.4195,-1.0714;3.1757,-2.4322,-0.8393;2.7305,-0.0448,-0.012;0.8776,-0.0017,-0.3854;0.3729,0.7684,0.2101;0.401,-0.9625,-0.1572;0.6958,0.2188,-1.4431;3.5375,1.6163,-0.4122;3.0415,2.4233,0.1401;3.4628,1.844,-1.4809;4.5981,1.6339,-0.1356;3.0153,-0.5004,1.8071;2.5537,0.2408,2.4709;4.0835,-0.5423,2.0505;2.5795,-1.4768,2.0503)|\",6.106234805220001\r\n\"[H]C1=C([H])C([H])=C([C@@]2([H])S[C@]2([H])C([H])([H])OC(=O)N(C([H])([H])[H])C([H])([H])[H])C([H])=C1[H] |(12.8741,-2.2036,3.7864;11.9947,-2.2461,3.1495;11.9499,-1.513,1.9641;12.7945,-0.8955,1.6701;10.8195,-1.5735,1.1475;10.7892,-0.9985,0.2247;9.7175,-2.3616,1.5047;8.5168,-2.3585,0.62;8.6457,-1.7653,-0.2842;7.5939,-3.9453,0.2718;7.1295,-2.4035,1.1513;7.0199,-2.514,2.2276;6.048,-1.5613,0.5103;5.0623,-1.9733,0.7356;6.1912,-1.5098,-0.5712;6.1125,-0.1936,0.962;5.5477,0.0415,2.1852;5.047,-0.839,2.869;5.6237,1.3586,2.5348;5.0471,1.7809,3.8007;5.7958,2.318,4.3971;4.7116,0.8998,4.3456;4.1926,2.4514,3.6351;6.172,2.4048,1.6864;6.9426,2.9644,2.2326;5.3858,3.1118,1.3868;6.6158,1.9697,0.7943;9.776,-3.1024,2.6948;8.9442,-3.7449,2.9692;10.9032,-3.0415,3.5103;10.9326,-3.621,4.4291)|\",5.428671317475\r\n\"[H]O[C@@]([H])(C([H])([H])/C(OC([H])([H])C([H])([H])[H])=C(/[H])C(=O)OC([H])([H])C([H])([H])[H])C([H])(C([H])([H])[H])C([H])([H])[H] 
|(4.1761,-0.3513,-2.8108;4.8871,-1.01,-2.7552;6.0275,-0.3454,-2.2179;6.5057,0.2666,-3.0037;5.6441,0.6081,-1.0576;5.0041,0.0616,-0.3545;6.5439,0.9387,-0.5385;4.8927,1.8212,-1.5296;3.6758,1.4534,-2.0238;2.7751,2.442,-2.551;2.554,3.1742,-1.7643;3.2605,2.9682,-3.3828;1.5193,1.7209,-3.0052;0.7983,2.4434,-3.402;1.0534,1.19,-2.1693;1.7462,0.996,-3.794;5.3175,3.1078,-1.5136;4.6851,3.901,-1.8902;6.6168,3.5615,-1.002;7.5227,2.8899,-0.5343;6.6933,4.9173,-1.1186;7.9186,5.5321,-0.6651;7.6286,6.5522,-0.3992;8.2671,5.0101,0.2295;8.9827,5.5247,-1.7533;9.8726,6.0642,-1.408;8.6148,6.0135,-2.6615;9.2757,4.4998,-1.9971;7.0469,-1.4216,-1.7899;7.9236,-0.8781,-1.4082;6.5135,-2.333,-0.6754;7.2485,-3.1074,-0.4264;5.5903,-2.8286,-0.9932;6.3019,-1.7772,0.2451;7.489,-2.2439,-3.0099;8.2518,-2.9794,-2.7282;7.9144,-1.6003,-3.7894;6.638,-2.7792,-3.4428)|\",5.845005508739999\r\n\"[H]ONC([H])C1=C(\\C([H])([H])[H])C2=C(OC(C([H])([H])[H])(C([H])([H])[H])C2([H])[H])/C(C([H])([H])[H])=C\\1C([H])([H])[H] |(2.1834,-5.2469,1.151;1.7396,-4.3985,0.9972;2.8306,-3.5333,0.7569;2.3954,-2.3419,0.5372;1.317,-2.1935,0.5762;3.2356,-1.1774,0.2451;4.6413,-1.2828,0.0284;5.3882,-2.593,0.0719;6.4376,-2.4494,-0.2009;4.9471,-3.3289,-0.608;5.3448,-3.049,1.0665;5.3389,-0.1121,-0.2441;4.684,1.1158,-0.3112;5.515,2.1583,-0.5974;6.9054,1.6771,-0.5689;7.5329,2.219,0.717;8.5865,1.9241,0.7858;7.0077,1.8264,1.5944;7.4752,3.3121,0.7409;7.5885,2.2262,-1.817;8.6345,1.9009,-1.8542;7.5677,3.321,-1.8206;7.0825,1.8688,-2.7197;6.8003,0.1225,-0.5528;7.0831,-0.3039,-1.5251;7.4774,-0.3082,0.1938;3.3109,1.2645,-0.114;2.6939,2.6429,-0.2053;1.9385,2.7042,-0.9986;3.4653,3.3838,-0.4252;2.2062,2.9389,0.7317;2.586,0.0904,0.1681;1.0894,0.2009,0.3997;0.751,1.2359,0.3436;0.7978,-0.1802,1.3854;0.5173,-0.3637,-0.347)|\",4.57695496541\r\n\"[H]C1=C(C([H])([H])C(=O)OC([H])([H])[H])[C@]([H])(C#N)C(=O)N=C1OC([H])([H])[H] 
|(3.9463,1.2255,2.5701;3.8244,0.1616,2.3957;4.4008,-0.7677,3.171;5.268,-0.4196,4.3482;5.4704,0.6555,4.3802;6.2412,-0.9228,4.2691;4.6573,-0.8411,5.6836;3.6689,-1.5291,5.8226;5.3835,-0.3481,6.6993;4.9348,-0.711,8.0203;4.9464,-1.7968,8.1404;3.9214,-0.3405,8.1933;5.6399,-0.2394,8.7044;4.1061,-2.2294,2.874;3.3669,-2.545,3.6293;5.2882,-3.0766,3.0605;6.2533,-3.6949,3.2422;3.4438,-2.505,1.4861;3.4735,-3.6251,1.0268;2.7803,-1.459,0.8524;2.9761,-0.2512,1.2767;2.3835,0.783,0.6789;1.5274,0.4925,-0.4434;1.1455,1.4614,-0.765;0.7134,-0.1695,-0.1402;2.0993,0.011,-1.2399)|\",4.85995336993\r\n\"[H]C(=O)C(=O)N([H])[C@]([H])(C(=O)OC([H])([H])[H])C([H])([H])[H] |(4.0514,0.6162,4.1248;3.6976,1.0981,3.1923;3.5454,2.2932,3.0754;3.4249,0.0959,2.0671;3.6072,-1.1041,2.2509;2.9882,0.6708,0.927;2.9259,1.6836,0.8774;2.7262,-0.0898,-0.2809;3.5412,-0.8089,-0.4274;2.732,0.8932,-1.4467;2.7429,2.1005,-1.3256;2.7008,0.2495,-2.6236;2.6685,1.0916,-3.7925;2.6243,0.4086,-4.6404;3.5689,1.7091,-3.8401;1.7894,1.7402,-3.7707;1.3988,-0.8702,-0.2121;1.2423,-1.4356,-1.1348;0.5569,-0.1859,-0.0647;1.4413,-1.5664,0.6288)|\",4.76743466076\r\n\"[H]O[C@@]([H])(C([H])([H])C#CC(=C([H])[H])C([H])([H])[H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] 
|(6.3189,3.6865,-3.7488;6.6991,2.8519,-3.426;5.8681,2.4029,-2.362;5.9311,3.1077,-1.5149;4.3878,2.3804,-2.8271;4.27,1.6408,-3.6282;3.7366,2.0694,-2.0006;3.969,3.6979,-3.3086;3.6708,4.8034,-3.714;3.3142,6.1226,-4.1456;2.3857,6.3138,-5.0981;2.1103,7.3155,-5.4157;1.8814,5.4815,-5.5782;4.0288,7.2696,-3.4628;3.6989,8.2321,-3.8651;5.1145,7.192,-3.5986;3.8403,7.2604,-2.3826;6.4245,1.0393,-1.8718;5.6056,0.537,-0.6665;6.0704,-0.3636,-0.2494;4.5772,0.2743,-0.9396;5.5628,1.288,0.1327;6.3997,-0.0147,-2.996;6.8863,-0.9375,-2.6586;6.9295,0.3483,-3.8815;5.3773,-0.2773,-3.2925;7.8838,1.2583,-1.4204;8.3094,0.322,-1.0402;7.9419,2.0036,-0.6168;8.5027,1.609,-2.25)|\",5.815072985185\r\n\"[H]C1=C(C(=O)/C([H])=C(\\[H])C2=C([H])C([H])([H])C([H])([H])C([H])([H])C2([H])[H])C([H])([H])C([H])([H])C1([H])[H] |(-2.8169,-5.4583,-0.4568;-2.9029,-5.4617,0.6261;-2.1326,-4.7554,1.4718;-1.0191,-3.8292,1.146;-0.4353,-3.2499,2.0646;-0.6377,-3.6258,-0.2716;-1.1849,-4.1648,-1.0381;0.367,-2.7838,-0.5952;0.8591,-2.2774,0.2346;0.8517,-2.4695,-1.9312;1.8402,-1.5574,-2.0691;2.2632,-1.1103,-1.1697;2.4082,-1.1009,-3.384;2.5345,-0.0093,-3.3687;3.4273,-1.5074,-3.493;1.5408,-1.5282,-4.5773;2.0967,-1.3997,-5.5136;0.6597,-0.8747,-4.6393;1.0807,-2.9809,-4.4119;1.9629,-3.6337,-4.3536;0.5048,-3.3073,-5.2861;0.2366,-3.1517,-3.1395;0.1011,-4.2198,-2.9266;-0.7748,-2.7496,-3.3041;-2.5242,-4.9897,2.9157;-1.6728,-5.3735,3.4904;-2.8031,-4.0453,3.3981;-3.7052,-5.9995,2.8421;-3.4704,-6.9205,3.3842;-4.6069,-5.5861,3.3039;-3.9416,-6.2925,1.33;-4.9549,-6.0194,1.0019;-3.8282,-7.3584,1.0851)|\",4.318446807435\r\n\"[H]OC([H])([H])[C@@]1([H])N([H])C(C2=C([H])C([H])=C([H])C([H])=C2[H])=C([H])C(=O)C1([H])[H] 
|(4.0235,-2.5674,1.4218;4.7403,-2.7272,0.7922;5.4293,-1.4935,0.5834;5.2771,-0.8035,1.4233;6.5003,-1.7218,0.5289;4.9945,-0.839,-0.7434;5.3057,-1.5105,-1.552;3.5336,-0.7318,-0.8171;3.0719,-1.6293,-0.7271;2.8949,0.316,-0.1875;1.4359,0.1455,0.0286;0.6504,-0.5593,-0.8995;1.1103,-0.9541,-1.8009;-0.719,-0.7209,-0.6919;-1.3146,-1.2568,-1.4258;-1.3244,-0.1875,0.4471;-2.3913,-0.3145,0.6086;-0.552,0.5115,1.3774;-1.0144,0.9247,2.2696;0.8158,0.6775,1.171;1.4168,1.2038,1.9061;3.5634,1.453,0.1844;3.0257,2.2849,0.6237;4.9787,1.6363,-0.0866;5.6168,2.6244,0.2641;5.6306,0.5404,-0.9393;5.5202,0.8462,-1.989;6.7036,0.5114,-0.7249)|\",4.3973598240800005\r\n\"[H]C([H])([H])C#CC(=O)[C@@]1([H])OC(C([H])([H])[H])(C([H])([H])[H])OC1([H])[H] |(3.4819,-0.7834,-4.4429;2.8767,-0.89,-3.5345;2.039,-0.1871,-3.6122;2.4709,-1.9077,-3.5154;3.6784,-0.6124,-2.3505;4.3416,-0.382,-1.3614;5.1375,-0.0611,-0.1986;5.5565,1.0653,0.0191;5.4181,-1.1885,0.8135;4.7851,-0.9899,1.6885;5.1278,-2.4841,0.3175;6.3708,-3.1613,0.0185;6.5118,-4.3486,0.9731;7.4431,-4.8874,0.7723;5.6709,-5.0374,0.8453;6.5199,-4.0071,2.0126;6.4004,-3.5669,-1.4484;7.357,-4.0415,-1.6877;6.2759,-2.6816,-2.0766;5.5933,-4.2741,-1.6617;7.4014,-2.2014,0.2359;6.9061,-1.2634,1.1822;7.0419,-1.617,2.2142;7.426,-0.3141,1.0466)|\",5.07764445033\r\n\"[H]OC([H])([H])[C@@]([H])(O[H])C([H])([H])C([H])([H])O[C@]1([H])OC([H])([H])C([H])([H])C([H])([H])C1([H])[H] 
|(-4.501,-3.5844,2.3722;-5.2669,-4.0231,1.9704;-5.1448,-3.9116,0.5484;-4.1395,-4.1992,0.2098;-5.8628,-4.6277,0.139;-5.5282,-2.4993,0.0819;-5.6205,-2.5085,-1.0128;-6.821,-2.1933,0.5899;-6.8191,-2.5387,1.5009;-4.5208,-1.4097,0.4909;-4.9799,-0.4368,0.2869;-4.3517,-1.4463,1.5765;-3.1776,-1.4878,-0.2251;-3.3156,-1.4069,-1.3168;-2.6614,-2.4356,-0.0281;-2.3751,-0.4027,0.2393;-1.1306,-0.3,-0.3743;-1.2469,-0.2883,-1.4787;-0.357,-1.4461,-0.0261;0.9319,-1.4373,-0.6337;1.4051,-2.3816,-0.3482;0.829,-1.4288,-1.7336;1.7503,-0.229,-0.1741;1.9382,-0.3219,0.9034;2.7247,-0.2252,-0.679;0.9778,1.0683,-0.4586;0.9345,1.2334,-1.5452;1.4996,1.9332,-0.0329;-0.4519,0.9806,0.1009;-1.053,1.8443,-0.2052;-0.4375,0.9561,1.1973)|\",8.236886254635\r\n\"[H]OC([H])([H])[C@@]([H])(O[H])C([H])([H])C([H])([H])O[Si](C([H])([H])[H])(C([H])([H])[H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] |(7.4354,-3.2677,3.6236;6.8021,-3.3423,2.8951;5.9402,-2.2007,2.9231;5.6016,-1.9875,3.9476;6.4501,-1.309,2.5278;4.7329,-2.5597,2.0604;5.0966,-2.7115,1.0279;4.1471,-3.7614,2.5485;4.897,-4.3585,2.7164;3.6684,-1.4676,2.0649;3.3138,-1.3088,3.0908;4.1107,-0.5247,1.7178;2.473,-1.8035,1.1766;1.9911,-2.7197,1.5404;2.8165,-1.9928,0.1476;1.5628,-0.709,1.2019;0.0964,-0.6888,0.3683;-1.0509,-2.0372,1.0398;-2.0253,-2.0176,0.5368;-1.225,-1.9192,2.1151;-0.6233,-3.0342,0.8808;0.3915,-1.0007,-1.4764;-0.5429,-0.913,-2.044;0.7823,-2.0101,-1.6521;1.1085,-0.2879,-1.8989;-0.6065,1.0621,0.6892;-0.7633,1.2959,2.2066;-1.1534,2.3067,2.3975;0.1922,1.2015,2.7342;-1.4668,0.5864,2.6601;-1.9847,1.2075,0.0103;-2.3983,2.2107,0.1898;-2.7133,0.4845,0.3977;-1.9256,1.0727,-1.0771;0.3566,2.1241,0.1176;-0.0358,3.1354,0.3009;0.4862,2.0179,-0.9666;1.3471,2.0615,0.5817)|\",8.476346443075\r\n\"[H]C1=C2C(=O)C([H])([H])[C@@]([H])(C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])[C@]2([H])C([H])([H])C([H])([H])C1([H])[H] 
|(10.6535,1.6883,8.4825;10.381,1.2388,7.5279;9.3144,1.7123,6.8703;8.3876,2.7599,7.3788;8.5596,3.4796,8.3456;7.152,2.74,6.4697;6.2393,2.8898,7.0534;7.2276,3.5867,5.7694;7.2291,1.3944,5.7071;6.803,0.6131,6.3555;6.4946,1.3665,4.3603;6.9076,2.1581,3.7155;6.7197,0.4161,3.8585;4.9727,1.5452,4.4669;4.555,0.7411,5.0887;4.7582,2.479,5.0035;4.2462,1.5955,3.1104;3.1921,1.8526,3.2877;4.6652,2.4236,2.5209;4.3024,0.3101,2.2645;3.8376,0.5183,1.2901;5.3477,0.0504,2.0442;3.6014,-0.902,2.8931;4.0703,-1.1498,3.8549;2.5596,-0.6357,3.1232;3.6252,-2.1406,1.991;3.1106,-2.9874,2.4592;4.6544,-2.4542,1.7758;3.1345,-1.942,1.0301;8.7575,1.1256,5.5933;9.1325,1.7123,4.7363;9.223,-0.3252,5.4189;8.7162,-0.9612,6.159;8.9555,-0.7107,4.4274;10.7478,-0.4061,5.6242;11.104,-1.4287,5.4522;11.2373,0.2247,4.8699;11.1757,0.0619,7.0304;12.2486,0.3015,7.0415;11.0595,-0.7583,7.7572)|\",5.036827372755001\r\n\"[H]C([H])=C1C(=O)C([H])([H])[C@@]([H])(C2=C([H])C([H])=C([H])C([H])=C2[H])[C@@]1([H])C([H])([H])[H] |(1.7371,-2.651,-0.3354;2.4197,-2.2423,-1.0754;2.6981,-2.8794,-1.9112;2.9165,-1.0046,-0.999;3.8351,-0.4562,-2.0528;4.2653,-1.0592,-3.0174;4.119,1.0053,-1.7014;5.0977,1.0398,-1.2012;4.189,1.6266,-2.598;2.9989,1.415,-0.715;3.3755,2.1411,0.013;1.7889,2.0401,-1.3922;1.2405,1.5326,-2.5812;1.6891,0.6724,-3.0698;0.1187,2.1246,-3.1635;-0.2851,1.7144,-4.0854;-0.4786,3.2391,-2.5725;-1.3496,3.7009,-3.0295;0.0576,3.7583,-1.3939;-0.392,4.6304,-0.9262;1.1783,3.1623,-0.8142;1.5923,3.5786,0.1018;2.6902,0.0682,0.0464;3.4958,-0.0345,0.7925;1.3568,0.0079,0.7902;1.2709,-0.9252,1.3585;0.5121,0.0652,0.0971;1.2687,0.8373,1.501)|\",4.99056801817\r\n\"[H]OC1=N[C@]([H])(C([H])([H])[C@]([H])(C#CC2=C([H])C([H])=C([H])C([H])=C2[H])O[H])C([H])([H])O1 
|(7.4531,3.0473,3.1994;7.5813,3.8164,2.618;6.8975,3.57,1.5013;6.2225,2.5201,1.2319;5.641,2.741,-0.1091;4.5514,2.8163,-0.0044;5.9542,1.5857,-1.0669;7.0402,1.5085,-1.204;5.5054,1.8033,-2.0455;5.4502,0.2105,-0.5694;5.6915,-0.5296,-1.3435;3.9846,0.2025,-0.3792;2.7807,0.1902,-0.225;1.3635,0.1291,-0.0458;0.6905,-1.1047,-0.1395;1.2635,-2.0033,-0.3465;-0.6891,-1.1707,0.0363;-1.1945,-2.1298,-0.0378;-1.4213,-0.0119,0.3067;-2.4976,-0.0667,0.445;-0.7641,1.2171,0.4003;-1.3285,2.1215,0.6111;0.6158,1.2909,0.2259;1.1283,2.2452,0.301;6.1515,-0.2196,0.5884;6.1575,0.5446,1.2001;6.2477,4.1052,-0.5546;5.5023,4.8591,-0.8174;6.9688,4.0062,-1.3729;6.9657,4.5792,0.6149)|\",5.3497583008300005\r\n\"[H]OC1=N[C@]([H])(C([H])([H])C(=O)C#C[Si](C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C([H])([H])O1 |(2.7026,-1.3328,9.2502;2.8891,-2.2626,9.0346;3.5094,-2.2479,7.8516;3.7717,-1.2151,7.149;4.4795,-1.7184,5.9623;5.5134,-1.3515,5.9722;3.8116,-1.2201,4.6781;3.744,-0.1247,4.6893;2.7724,-1.5769,4.6207;4.5438,-1.6652,3.4238;5.4926,-2.4366,3.4585;4.0449,-1.1301,2.1695;3.6063,-0.6833,1.1216;2.952,-0.0176,-0.4808;1.0655,-0.0394,-0.3862;0.6263,0.3212,-1.3243;0.699,0.6005,0.4241;0.6889,-1.0532,-0.2095;3.5688,-1.124,-1.8802;3.2132,-0.7548,-2.85;3.2126,-2.1534,-1.7622;4.6635,-1.1508,-1.911;3.5984,1.7461,-0.6809;3.2061,2.1915,-1.6033;4.6923,1.7674,-0.737;3.2915,2.3828,0.1561;4.4599,-3.2692,6.1248;5.4514,-3.7229,6.122;3.8328,-3.7765,5.3853;3.8624,-3.4781,7.4332)|\",4.70756961365\r\n\"[H]C1([H])C([H])([H])[C@@]2(C([H])([H])[H])O[C@]3([H])C([H])([H])C(=O)O[C@]3([H])[C@@]2([H])C1([H])[H] 
|(4.8278,2.8174,0.0672;4.3453,2.0227,0.646;3.9358,2.4853,1.554;3.2352,1.2965,-0.1288;3.5732,1.0702,-1.1474;2.3102,1.8766,-0.2148;3.0265,-0.0426,0.629;1.7952,-0.0289,1.5301;1.8206,0.8309,2.2101;1.7469,-0.9441,2.1288;0.8847,0.0386,0.9253;2.8666,-1.1551,-0.2803;4.0887,-1.8659,-0.4354;4.6539,-1.5092,-1.3076;3.7969,-3.3578,-0.4471;4.5871,-3.9294,-0.9496;2.8421,-3.6056,-0.9146;3.7888,-3.727,1.0345;3.3652,-4.7229,1.5559;4.3813,-2.7215,1.7521;4.8428,-1.6556,0.8928;5.9317,-1.7386,0.8057;4.3746,-0.2936,1.395;4.213,-0.3357,2.4761;5.307,0.8834,1.0251;5.9957,1.1431,1.8356;5.9234,0.6211,0.1532)|\",7.387891041075\r\n\"[H]O[C@]1(C([H])([H])[H])C([H])([H])C([H])([H])C([H])([H])[C@@]1([H])[C@]1([H])OC(=O)C([H])=C1[H] |(2.6229,-0.4039,1.9439;2.8366,0.2844,1.2937;2.4994,-0.224,-0.0033;1.0089,-0.5968,-0.0374;0.6938,-0.9486,-1.0258;0.3762,0.2508,0.2447;0.8128,-1.4091,0.6744;3.388,-1.4166,-0.4245;4.3932,-1.2311,-0.0288;3.0382,-2.3687,-0.0071;3.3954,-1.3987,-1.9713;4.3759,-1.6823,-2.3657;2.6773,-2.119,-2.3777;3.0051,0.0546,-2.3763;2.0382,0.0679,-2.8938;3.724,0.5157,-3.0573;2.9118,0.8416,-1.051;3.9161,1.1748,-0.7583;2.0301,2.0934,-1.1334;0.9987,1.8067,-1.382;2.4962,2.9356,-2.2064;2.7885,4.1966,-1.7281;3.1944,5.0805,-2.4399;2.5078,4.1937,-0.2728;2.6678,5.0666,0.3463;2.068,2.9839,0.0805;1.8012,2.6275,1.0671)|\",6.108955943724999\r\n\"[H]OC([H])([H])/C([H])=C(\\[H])[C@@]([H])(O[H])[C@]1([H])C([H])([H])C([H])([H])C([H])([H])[C@]1(O[H])C([H])([H])[H] 
|(3.4786,4.7965,1.9651;2.6937,5.3622,2.0401;1.5617,4.5213,1.8011;1.4436,3.772,2.6;0.6984,5.1951,1.8522;1.628,3.846,0.4605;1.6838,4.5061,-0.4042;1.6367,2.5198,0.2928;1.5941,1.8628,1.1597;1.7377,1.8259,-1.048;0.7397,1.4798,-1.3502;2.111,2.7277,-2.0923;2.9413,3.1487,-1.8129;2.6847,0.6054,-1.0258;3.6681,0.9502,-0.6686;2.8668,-0.0495,-2.4051;1.8831,-0.2377,-2.8562;3.409,0.5933,-3.1024;3.5948,-1.377,-2.1057;4.6792,-1.2306,-2.1631;3.3451,-2.1571,-2.8322;3.1844,-1.7581,-0.6571;4.063,-1.8586,-0.0116;2.6423,-2.7104,-0.6176;2.3098,-0.5901,-0.1043;2.6803,-0.2406,1.2395;2.4306,-0.9831,1.8129;0.8178,-0.9555,-0.138;0.4763,-1.1872,-1.153;0.195,-0.1502,0.2631;0.6415,-1.8512,0.4727)|\",6.99332595785\r\n\"[H]OC([H])([H])C#C[C@@]([H])(O[H])[C@]1([H])C([H])([H])C([H])([H])C([H])([H])[C@]1(O[H])C([H])([H])[H] |(2.2558,2.1606,2.2337;2.0965,2.9812,2.7388;1.277,3.8165,1.9277;0.2276,3.7843,2.265;1.6171,4.8529,2.0547;1.3513,3.4087,0.5214;1.5331,2.8111,-0.516;1.8889,1.9593,-1.6558;0.9867,1.548,-2.1235;2.525,2.6901,-2.7081;3.2613,3.1799,-2.3061;2.7925,0.7969,-1.1708;3.6837,1.2534,-0.7166;3.2592,-0.1311,-2.3049;2.392,-0.4885,-2.8795;3.9257,0.3734,-3.0084;3.9117,-1.2928,-1.5427;4.9011,-0.9838,-1.1828;4.055,-2.1888,-2.1555;2.9614,-1.5265,-0.351;3.4908,-1.8569,0.549;2.2245,-2.3013,-0.5949;2.2281,-0.1677,-0.0783;2.6096,0.3745,1.2092;2.2919,-0.2473,1.8843;0.7064,-0.3513,-0.1152;0.3724,-0.7129,-1.0949;0.1859,0.5834,0.1101;0.4004,-1.103,0.6246)|\",7.42326584164\r\n\"[H]O[C@]1(C([H])([H])[H])C([H])([H])C([H])([H])C([H])([H])[C@@]1([H])C([H])=O 
|(2.4647,0.1751,1.8593;2.8734,0.6527,1.1198;2.4106,0.0559,-0.0937;0.8768,-0.0185,-0.0923;0.4837,-0.3909,-1.0441;0.4359,0.9659,0.0955;0.5329,-0.701,0.6962;3.052,-1.3191,-0.38;4.0924,-1.2734,-0.0364;2.558,-2.1354,0.1617;2.9889,-1.4807,-1.9129;3.8126,-2.0958,-2.2882;2.0629,-1.9866,-2.2085;3.025,-0.0326,-2.4872;2.1718,0.1673,-3.1419;3.9234,0.1506,-3.0842;2.9797,0.9092,-1.2688;3.9945,1.1825,-0.9429;2.2377,2.2096,-1.4731;2.2494,2.8876,-0.5919;1.656,2.528,-2.4893)|\",6.149773021300001\r\n\"[H]OC([H])([H])[C@@]1([H])N([H])C(C([H])([H])C([H])([H])C([H])([H])[H])=C([H])C(=O)C1([H])[H] |(6.5859,-2.4628,1.5355;6.5923,-2.3601,0.5726;7.2319,-1.1223,0.2593;6.7572,-0.2832,0.7878;8.2984,-1.1416,0.5316;7.1023,-0.9295,-1.2567;7.6066,-1.7777,-1.7343;5.6937,-0.9833,-1.6611;5.2492,-1.8616,-1.4211;4.9056,0.1326,-1.5043;3.4204,-0.1293,-1.445;2.8821,0.7878,-1.7065;3.156,-0.8813,-2.2022;2.9585,-0.6185,-0.0544;3.5493,-1.4959,0.2418;3.1822,0.1629,0.6828;1.4662,-0.9593,-0.0178;1.1619,-1.2944,0.9799;1.2226,-1.7592,-0.7277;0.8538,-0.0875,-0.2778;5.4354,1.3878,-1.3857;4.7821,2.2476,-1.2832;6.8669,1.6217,-1.4269;7.3907,2.723,-1.2849;7.7226,0.385,-1.7423;7.8353,0.3439,-2.8345;8.7222,0.5307,-1.3215)|\",5.0395485112600005\r\n\"[H]OC1=N[C@]([H])(C([H])([H])C(=O)C#CC([H])([H])C([H])([H])C([H])([H])[H])C([H])([H])O1 
|(11.2132,-1.6199,-5.8497;11.5446,-0.8885,-5.3012;10.7383,-0.8341,-4.2368;9.7269,-1.581,-4.0177;9.1973,-1.1505,-2.7142;9.325,-1.9603,-1.9852;7.7035,-0.8315,-2.8102;7.1613,-1.6929,-3.2202;7.5308,-0.0104,-3.5221;7.098,-0.4442,-1.4709;7.7851,-0.2674,-0.4734;5.6576,-0.2864,-1.4445;4.4519,-0.1475,-1.4442;3.0036,0.0376,-1.4058;2.7615,0.9953,-1.8893;2.5209,-0.7373,-2.0183;2.4141,0.0167,0.0214;2.6556,-0.9437,0.492;2.9093,0.7899,0.6205;0.8997,0.2377,0.0184;0.4934,0.1899,1.0343;0.3866,-0.524,-0.5814;0.6411,1.2191,-0.3979;10.0825,0.0701,-2.3187;10.5829,-0.0429,-1.3566;9.5423,1.0217,-2.3308;11.0967,0.1224,-3.3598)|\",4.968798910130001\r\n\"[H]C1=C([H])C([H])=C([C@@]([H])(N([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C(=O)OC([H])([H])[H])C([H])([H])[H])C([H])=C1[H] |(-1.388,0.1982,6.2685;-0.6159,0.2107,5.5036;-0.8749,-0.3106,4.2341;-1.8518,-0.7303,4.0074;0.1155,-0.2946,3.2514;-0.0827,-0.7055,2.2658;1.3833,0.2392,3.5238;2.456,0.2919,2.4459;3.4267,0.471,2.949;2.4742,-0.9796,1.7048;2.4124,-1.7376,2.3846;3.6647,-1.1994,0.8828;3.7002,-0.4276,0.1047;4.6013,-1.0983,1.4663;3.6322,-2.5777,0.2128;4.5899,-2.7334,-0.3037;3.5796,-3.3554,0.991;2.4742,-2.7526,-0.7772;2.5606,-2.0142,-1.5831;1.5269,-2.5383,-0.2716;2.4336,-4.1557,-1.3876;2.2947,-4.9274,-0.621;3.3905,-4.393,-1.8762;1.3486,-4.3158,-2.4347;0.7765,-3.4132,-3.0063;1.1029,-5.6251,-2.6806;0.1197,-5.8851,-3.6948;0.05,-6.9713,-3.7641;0.4305,-5.4583,-4.6525;-0.8458,-5.4554,-3.4145;2.2066,1.4741,1.4915;2.1512,2.4103,2.0566;1.2621,1.3332,0.9559;3.0098,1.5754,0.7537;1.6334,0.7541,4.8015;2.6151,1.1648,5.0289;0.6427,0.7441,5.7849;0.8567,1.1473,6.7715)|\",6.08990797419\r\n\"[H]C1=C2C(C([H])([H])[H])(C([H])([H])[H])O[C@@]3([H])C([H])([H])C(=O)O[C@@]23C([H])([H])C([H])([H])C([H])([H])C1([H])[H] 
|(2.3191,2.8236,0.2323;2.2701,2.4211,1.2439;2.1343,1.1009,1.3988;2.0231,0.0778,0.2704;0.9184,0.4188,-0.7397;1.1537,1.3372,-1.2885;-0.0429,0.558,-0.2345;0.8144,-0.3975,-1.4621;3.3652,-0.1661,-0.4322;3.6808,0.7257,-0.9844;3.2731,-0.9995,-1.1371;4.1382,-0.4123,0.301;1.6646,-1.1625,0.9343;1.2957,-0.9447,2.2895;0.2038,-0.8831,2.4016;1.9503,-2.0126,3.1502;2.0294,-2.9762,2.6436;1.4248,-2.1655,4.1007;3.3324,-1.4317,3.4343;4.309,-1.9951,3.8514;3.3254,-0.0935,3.152;2.001,0.3659,2.7145;1.339,1.0997,3.8982;0.8515,0.349,4.534;2.1356,1.5372,4.5102;0.3246,2.1857,3.4969;-0.2741,1.8253,2.6516;-0.3808,2.343,4.3213;0.9785,3.5368,3.1389;0.2732,4.1281,2.5419;1.1635,4.109,4.0576;2.324,3.4081,2.3844;3.1035,3.1027,3.0932;2.623,4.3946,2.014)|\",6.4763096419\r\n\"[H]O[C@@]([H])(C1=C(OC([H])([H])[H])C([H])=C([H])C([H])=C1[H])[C@]1([H])OC(=O)C([H])=C1[H] |(1.1942,0.2026,-0.7307;1.4686,-0.5662,-1.2607;2.882,-0.476,-1.3843;3.1611,-1.1806,-2.1743;3.3775,0.906,-1.7763;2.9364,2.0641,-1.1044;2.0273,1.8532,-0.0853;1.5379,2.9772,0.6376;0.8541,2.5767,1.3879;0.9954,3.6679,-0.0196;2.3542,3.5124,1.1368;3.4025,3.3277,-1.4693;3.0573,4.2174,-0.9556;4.329,3.4472,-2.5092;4.6895,4.4337,-2.787;4.7834,2.3168,-3.1811;5.5047,2.4073,-3.9875;4.3016,1.0595,-2.8097;4.6539,0.1719,-3.3285;3.5359,-0.987,-0.0708;3.2902,-0.2873,0.7386;4.9626,-1.0327,-0.2069;5.4184,-2.317,0.023;6.5853,-2.6098,-0.0384;4.2352,-3.1579,0.326;4.319,-4.2162,0.5357;3.1408,-2.3945,0.2734;2.1083,-2.6857,0.4203)|\",5.270845284185\r\n\"[H]O[C@@]([H])(C1=C([H])C(C([H])([H])[H])=C([H])C([H])=C1[H])[C@]1([H])OC(=O)C([H])=C1[H] 
|(4.4675,-4.1128,-1.1511;4.0112,-3.2561,-1.1145;4.923,-2.3376,-0.5464;5.8174,-2.2406,-1.1825;4.269,-0.9776,-0.4056;2.8781,-0.8477,-0.3748;2.2675,-1.7367,-0.5003;2.2677,0.4042,-0.2121;0.7609,0.5198,-0.1773;0.4371,1.5622,-0.2583;0.3507,0.1189,0.7589;0.2986,-0.0426,-0.9965;3.0839,1.533,-0.0769;2.6278,2.5127,0.0441;4.4747,1.4156,-0.1088;5.096,2.3024,-0.0158;5.0678,0.1662,-0.2769;6.1517,0.0812,-0.3224;5.4305,-2.859,0.825;6.2142,-2.1805,1.1883;6.0199,-4.1561,0.6219;5.4248,-5.0841,1.4647;5.7497,-6.2428,1.4735;4.3946,-4.3668,2.2506;3.7792,-4.869,2.9851;4.3825,-3.0853,1.8778;3.7427,-2.2916,2.2439)|\",5.28989325372\r\n\"[H]O[C@@]([H])(C1=C(C([H])([H])[H])C([H])=C([H])C([H])=C1[H])[C@]1([H])OC(=O)C([H])=C1[H] |(1.3243,2.9068,-0.3518;2.1454,2.3876,-0.3546;1.7587,1.0257,-0.3385;1.1688,0.7902,-1.2367;2.9949,0.1457,-0.2945;2.974,-1.1702,-0.8045;1.7543,-1.7582,-1.4824;1.9519,-2.7842,-1.8063;1.4667,-1.1828,-2.3716;0.8756,-1.7898,-0.8265;4.1381,-1.9428,-0.7008;4.1305,-2.9568,-1.0938;5.3007,-1.4404,-0.1177;6.1886,-2.0638,-0.0543;5.3173,-0.1353,0.3715;6.2198,0.2749,0.8163;4.1679,0.649,0.2813;4.1731,1.6728,0.6394;0.8221,0.7543,0.8707;0.4479,-0.276,0.8091;-0.3083,1.6408,0.7682;-0.4433,2.3741,1.9386;-1.319,3.1859,2.0887;0.6501,1.9581,2.8461;0.7733,2.3852,3.8324;1.3933,1.0349,2.2321;2.2741,0.5304,2.61)|\",5.292614392225\r\n\"[H]C1=C([H])C([H])=C([C@@]([H])(N([H])C2=C(C(=O)OC([H])([H])[H])C([H])([H])C2([H])[H])C([H])([H])[H])C([H])=C1[H] 
|(0.1235,3.7083,-4.6741;0.6877,3.1661,-3.9202;1.8863,2.5373,-4.2571;2.2618,2.5874,-5.2758;2.6107,1.8451,-3.2848;3.548,1.363,-3.5538;2.1485,1.765,-1.9669;2.9436,1.0054,-0.9127;3.8511,0.6175,-1.3886;3.3935,1.9177,0.1403;2.7389,2.5868,0.5357;4.5682,1.8388,0.789;5.0851,2.5133,1.8638;4.4315,3.5283,2.6436;3.2856,3.9233,2.4257;5.1967,4.0164,3.6573;4.5762,5.0317,4.4529;5.3155,5.3079,5.2065;3.6679,4.6531,4.9313;4.3123,5.9001,3.8419;6.416,1.7841,1.809;7.2965,2.3919,1.5679;6.6598,1.1445,2.6659;5.8171,1.0058,0.5747;5.6975,-0.0742,0.7225;6.3365,1.1746,-0.3762;2.1574,-0.1866,-0.3336;1.8595,-0.8764,-1.1311;1.2509,0.1528,0.1789;2.772,-0.7317,0.3908;0.9397,2.3962,-1.6407;0.5545,2.3448,-0.6246;0.2171,3.0951,-2.6072;-0.7151,3.5826,-2.3349)|\",4.944308663585\r\n\"[H]/C(=C(/[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])C([H])([H])C([H])([H])N(=O)=O |(5.2265,-3.9988,2.8871;5.4654,-3.2383,3.633;5.1868,-1.9573,3.3747;5.4229,-1.2079,4.1338;4.5639,-1.4326,2.1097;4.3435,-2.2667,1.4301;3.598,-0.9679,2.3569;5.4382,-0.3855,1.3875;4.8572,0.0506,0.5629;5.6536,0.4426,2.0787;6.7543,-0.9498,0.8403;7.3357,-1.3823,1.6665;6.5323,-1.7843,0.1576;7.6098,0.0899,0.106;7.0307,0.5178,-0.7251;7.8277,0.9266,0.7852;8.9244,-0.4849,-0.4309;9.5148,0.2785,-0.9502;8.7412,-1.3021,-1.1395;9.541,-0.8863,0.3832;6.0689,-3.7343,4.9205;6.404,-2.8958,5.5382;6.9468,-4.3631,4.7163;5.1054,-4.5912,5.7503;5.5872,-4.9588,6.6613;4.679,-5.4244,5.1915;3.9276,-3.7738,6.225;4.18,-2.7369,6.8329;2.8046,-4.2019,5.9802)|\",5.156557466975\r\n\"[H]O[C@@]([H])(C([H])([H])C([H])([H])[H])[C@]([H])(OC([H])([H])OC([H])([H])[H])[C@@]1([H])OC(C([H])([H])[H])(C([H])([H])[H])OC1([H])[H] 
|(2.7967,3.0187,0.518;3.41,2.3192,0.2336;3.0695,1.1711,1.0045;2.0451,0.8459,0.7529;4.0297,0.0462,0.6216;5.0606,0.3951,0.7477;3.884,-0.7981,1.3091;3.8312,-0.4159,-0.8263;4.5451,-1.2042,-1.089;3.9718,0.42,-1.5176;2.8208,-0.8145,-0.9795;3.0059,1.5477,2.4998;2.6739,0.6788,3.0909;2.0036,2.5675,2.5366;1.2758,2.7667,3.7451;1.3401,1.8515,4.3581;1.6915,3.6125,4.2961;-0.0419,3.0978,3.4542;-0.7914,2.0441,2.8694;-1.8186,2.4038,2.7751;-0.7821,1.1468,3.5086;-0.409,1.7771,1.8768;4.3121,2.0846,3.0964;4.6728,2.9102,2.4692;4.0303,2.5503,4.4228;5.2362,2.4959,5.1942;5.8399,3.8898,5.347;6.8027,3.8303,5.8643;5.17,4.5358,5.9231;6.0026,4.336,4.3618;4.9163,1.8304,6.5305;5.8185,1.7714,7.1469;4.5368,0.818,6.3629;4.1564,2.4036,7.0713;6.1538,1.6928,4.4415;5.44,1.0853,3.3726;6.1362,0.9485,2.5428;5.0366,0.1035,3.6686)|\",8.723970047029999\r\n\"[H]N1/C(=C2\\C(=O)C([H])([H])C([H])([H])C2([H])[H])C([H])([H])C([H])([H])C([H])([H])C1([H])[H] |(-1.0245,-1.9485,-0.8132;-0.535,-1.1302,-0.435;-1.1666,0.0371,-0.7034;-2.3938,0.037,-1.3417;-3.0149,-1.1691,-1.8359;-2.5833,-2.3317,-1.7433;-4.3178,-0.7888,-2.5455;-4.1311,-0.7866,-3.6295;-5.1027,-1.5275,-2.356;-4.6295,0.6242,-2.0206;-5.195,1.2357,-2.731;-5.2293,0.5467,-1.1057;-3.2485,1.2439,-1.6728;-3.3333,1.9664,-0.8513;-2.8542,1.8027,-2.5375;-0.5288,1.3278,-0.2261;-0.7449,2.1206,-0.9498;-1.0346,1.6243,0.7053;0.9808,1.2175,0.0299;1.5192,1.1819,-0.9274;1.3325,2.1109,0.5582;1.2882,-0.0518,0.83;0.7616,-0.0199,1.7932;2.3594,-0.1358,1.0459;0.8354,-1.2761,0.0363;0.8899,-2.1828,0.6496;1.5156,-1.4271,-0.8185)|\",4.58783951943\r\n\"[H]/C(C(=O)C([H])([H])C([H])([H])[H])=C1/N([H])C([H])([H])C([H])([H])C1([H])[H] 
|(2.1032,-3.2295,-0.6132;1.4734,-2.3899,-0.3379;2.0717,-1.0991,-0.0967;1.4113,-0.0906,0.218;3.592,-0.997,-0.2414;4.0485,-1.7159,0.455;3.8648,-1.3543,-1.2451;4.1373,0.4088,-0.0002;5.227,0.4265,-0.1122;3.8854,0.7584,1.0054;3.7043,1.1231,-0.7071;0.1171,-2.5862,-0.2321;-0.7776,-1.6406,0.1252;-0.4287,-0.693,0.2645;-2.1667,-2.0739,0.0822;-2.7297,-1.6765,0.9335;-2.6641,-1.7306,-0.8381;-2.0303,-3.6105,0.1167;-2.033,-3.9547,1.1565;-2.8445,-4.1166,-0.4083;-0.6448,-3.8666,-0.5169;-0.7251,-4.0019,-1.604;-0.1379,-4.7493,-0.117)|\",5.069481034815\r\n\"[H]O[C@@](C([H])=O)(C([H])([H])[H])C([H])([H])C([H])([H])/C([H])=C(\\C([H])([H])[H])C([H])([H])C([H])([H])C([H])=C(C([H])([H])[H])C([H])([H])[H] |(9.0803,1.6611,0.2816;8.7074,0.8029,0.5383;7.6967,0.4581,-0.4139;6.4509,1.3178,-0.1503;6.4773,1.867,0.8192;5.5089,1.4063,-0.9055;8.1803,0.677,-1.8523;7.3698,0.4799,-2.5595;9.0253,0.0176,-2.0739;8.5046,1.7154,-2.0019;7.3469,-1.0258,-0.1697;6.5633,-1.3141,-0.8799;8.2413,-1.6134,-0.4125;6.8894,-1.3502,1.2684;7.6334,-0.9457,1.9663;5.9478,-0.8305,1.4767;6.7564,-2.8336,1.4901;7.7013,-3.3771,1.42;5.6543,-3.5566,1.7484;4.2662,-2.9772,1.8865;3.8316,-3.2371,2.8621;3.585,-3.3832,1.1271;4.2465,-1.8893,1.7936;5.7628,-5.0576,1.9338;6.8181,-5.3455,2.008;5.2942,-5.3455,2.8879;5.109,-5.8882,0.7992;4.0472,-5.6133,0.7202;5.5711,-5.6075,-0.153;5.2116,-7.3684,1.0533;4.6825,-7.7053,1.9476;5.875,-8.2977,0.348;5.8718,-9.7456,0.7789;5.4533,-10.3911,-0.0065;5.2883,-9.9015,1.692;6.8938,-10.1059,0.9634;6.6668,-8.0269,-0.9087;6.2677,-8.6091,-1.7512;7.712,-8.3425,-0.7832;6.6685,-6.9746,-1.2017)|\",5.09941355837\r\n\"[H]O[C@@]([H])(C1=C([H])C([H])=C(F)C([H])=C1[H])[C@]1([H])OC(=O)C([H])=C1[H] 
|(-0.4091,3.7848,-0.7973;0.4382,3.6728,-1.2567;1.1287,2.5961,-0.6278;1.471,2.8854,0.3778;0.2865,1.3375,-0.5272;0.4903,0.4258,0.5162;1.2549,0.6244,1.262;-0.2653,-0.7417,0.6036;-0.1177,-1.4575,1.4054;-1.2318,-0.9845,-0.3664;-1.9726,-2.1087,-0.284;-1.4618,-0.101,-1.4144;-2.227,-0.3279,-2.1496;-0.6934,1.0603,-1.4897;-0.8582,1.7644,-2.3;2.3975,2.3968,-1.4888;2.0939,2.0289,-2.4774;3.239,1.4165,-0.8734;4.4797,1.9657,-0.5972;5.3568,1.339,-0.0617;4.4559,3.3701,-1.0719;5.3127,4.0239,-0.9758;3.2539,3.6289,-1.5917;2.8885,4.5544,-2.0184)|\",5.40962334794\r\n\"[H]OC([H])([H])[C@@]([H])(O[H])[C@@]([H])(OC([H])([H])OC([H])([H])[H])[C@@]([H])(OC([H])([H])OC([H])([H])[H])C([H])([H])C([H])([H])[H] |(1.2961,6.3275,-6.618;0.5016,6.0081,-7.0885;-0.5861,6.1058,-6.1709;-0.5789,7.057,-5.6284;-1.5012,6.0491,-6.7692;-0.5706,4.9093,-5.2121;-1.4928,4.8972,-4.622;-0.5456,3.717,-5.9931;0.0301,3.9424,-6.7492;0.6439,4.9026,-4.2524;1.4994,4.4985,-4.8067;0.9471,6.2698,-3.8619;2.2767,6.6337,-3.9534;2.3883,7.5849,-3.415;2.9495,5.8776,-3.5082;2.6273,6.8077,-5.3234;4.0015,7.1028,-5.5179;4.2823,8.0444,-5.0237;4.1572,7.2049,-6.5934;4.6444,6.2963,-5.1345;0.4644,4.0901,-2.949;1.362,4.3213,-2.3483;-0.6845,4.516,-2.2248;-0.5605,5.663,-1.3978;0.489,5.7522,-1.0588;-0.8489,6.5687,-1.9391;-1.4334,5.566,-0.3212;-1.1191,4.5207,0.5867;-1.2195,3.535,0.1187;-1.8253,4.6036,1.4161;-0.0937,4.628,0.9762;0.4013,2.5549,-3.0966;1.142,2.2491,-3.8464;0.738,2.1349,-2.1397;-0.9682,1.9537,-3.435;-0.9063,0.8596,-3.3981;-1.2963,2.2393,-4.4359;-1.7205,2.2747,-2.709)|\",8.39743342643\r\n\"[H]O[C@]([H])(C([H])([H])[H])C([H])([H])[C@@]([H])(OC([H])([H])C1=C([H])C([H])=C([H])C([H])=C1[H])C([H])([H])C([H])([H])C([H])=C([H])[H] 
|(4.2255,2.8917,-4.356;3.3231,2.6038,-4.1414;3.3037,2.2782,-2.7472;2.2698,1.9778,-2.5551;3.6368,3.5089,-1.9013;3.4941,3.322,-0.831;2.9915,4.344,-2.1916;4.6815,3.817,-2.0502;4.2164,1.0695,-2.4822;5.2577,1.3532,-2.7003;3.9385,0.2907,-3.2004;4.1687,0.4975,-1.061;4.4494,1.2886,-0.3442;2.8095,0.126,-0.8127;2.4862,-0.1019,0.5527;2.9677,-1.0155,0.9325;2.8669,0.7378,1.1613;0.9871,-0.2189,0.7037;0.1381,0.733,0.1243;0.5668,1.5377,-0.4654;-1.2428,0.6446,0.2907;-1.8905,1.3874,-0.1676;-1.7949,-0.395,1.0442;-2.8717,-0.4628,1.1741;-0.9571,-1.3476,1.624;-1.3781,-2.1638,2.2051;0.4258,-1.2608,1.4484;1.075,-2.0118,1.8933;5.146,-0.6779,-0.8537;6.1628,-0.3268,-1.0738;5.1511,-0.9518,0.21;4.8317,-1.931,-1.694;3.7707,-2.1803,-1.5519;4.9635,-1.7174,-2.7613;5.6895,-3.106,-1.311;5.605,-3.4374,-0.2738;6.5311,-3.7444,-2.1249;7.1275,-4.588,-1.7874;6.6514,-3.4505,-3.1661)|\",6.4463771183450005\r\n\"[H]OC1=NC2=N/C([H])=C([H])\\C([H])=C\\2C([H])([H])N2C([H])([H])C([H])([H])C([H])([H])[C@]12[H] |(3.7714,1.7303,0.2372;2.9301,2.0484,0.5967;1.9172,1.2711,0.1359;0.7323,1.6379,0.4306;-0.3548,0.8202,0.0963;-1.4173,1.4477,-0.4277;-2.491,0.7122,-0.7311;-3.3306,1.2579,-1.1603;-2.5803,-0.6631,-0.5203;-3.481,-1.2088,-0.7839;-1.4804,-1.306,0.0484;-1.5043,-2.3759,0.2444;-0.3428,-0.5719,0.3731;0.857,-1.1882,1.0513;0.6302,-2.2168,1.3563;1.0618,-0.615,1.9771;2.0512,-1.2135,0.197;3.2955,-1.4505,0.9447;3.2775,-0.9412,1.9254;3.411,-2.5229,1.1411;4.4294,-0.8777,0.0589;5.0343,-0.1675,0.6341;5.1126,-1.6525,-0.2988;3.6952,-0.1942,-1.1224;3.6791,-0.8616,-1.9887;4.1629,0.7372,-1.4651;2.2481,-0.0141,-0.6399;1.5482,-0.0505,-1.4807)|\",5.09941355837\r\n\"[H]C1=C([H])C(F)=C(N([H])C([H])([H])/C([H])=C(\\[H])N2C(=O)OC([H])([H])C2([H])[H])C([H])=C1[H] 
|(-2.5908,-0.3093,0.4467;-1.511,-0.3155,0.3366;-0.8243,0.882,0.1019;-1.3383,1.8349,0.025;0.5489,0.8514,-0.035;1.2328,2.0099,-0.2504;1.3095,-0.3294,0.0518;2.6833,-0.2527,-0.1445;3.0578,0.6749,0.0117;3.5747,-1.3276,0.28;3.4974,-1.5342,1.3615;3.2652,-2.2453,-0.2385;4.9941,-1.0004,-0.0848;5.2003,-0.8609,-1.144;5.9632,-0.8737,0.8321;5.7804,-1.0039,1.8948;7.2934,-0.5739,0.561;8.2707,-0.6112,1.5498;8.1106,-0.8193,2.7261;9.4831,-0.3737,0.9702;9.3073,-0.0121,-0.4103;9.4461,1.07,-0.5054;10.069,-0.526,-0.9994;7.8696,-0.4398,-0.7638;7.8276,-1.394,-1.307;7.3442,0.3169,-1.3557;0.6013,-1.5164,0.3012;1.138,-2.4547,0.3916;-0.7904,-1.5037,0.4351;-1.309,-2.4394,0.6249)|\",5.3334314698\r\n\"[H]O[C@]1([C@]2([H])OC(=O)C([H])=C2[H])C([H])([H])C([H])([H])SC([H])([H])[C@@]1([H])C([H])([H])[H] |(2.7245,2.5287,-0.7486;2.3922,2.3854,0.1521;2.9569,1.151,0.6303;2.5516,1.2176,2.1332;3.1379,2.0486,2.5529;2.8942,0.0332,2.8631;1.8195,-0.3739,3.6421;1.8814,-1.3335,4.3657;0.7053,0.5619,3.3731;-0.2566,0.4656,3.8588;1.1044,1.4599,2.4686;0.5451,2.2834,2.0442;4.4966,1.2467,0.4922;4.7175,1.4819,-0.5586;4.838,2.104,1.0881;5.2906,-0.0033,0.8834;5.1351,-0.26,1.9341;6.3603,0.171,0.7318;4.8613,-1.4438,-0.1666;3.0462,-1.3991,0.0931;2.6383,-2.1515,-0.59;2.8202,-1.7193,1.1129;2.4058,-0.0329,-0.2211;2.7196,0.2229,-1.2456;0.8729,-0.1504,-0.2397;0.5634,-0.8057,-1.0619;0.408,0.8278,-0.3931;0.4784,-0.5756,0.6874)|\",4.57695496541\r\n\"[H]O[C@]1([C@]2([H])OC(=O)C([H])=C2[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[C@]1([H])C([H])([H])C([H])=C([H])[H] 
|(5.2749,-0.0704,2.7558;4.991,-0.8162,2.2063;5.178,-0.4607,0.8363;4.3449,-1.4988,0.0247;4.473,-1.2639,-1.0413;2.9416,-1.431,0.3119;2.4752,-2.6474,0.7669;1.3293,-2.818,1.094;3.612,-3.5935,0.7268;3.5115,-4.6321,1.0119;4.6941,-2.9434,0.2992;5.6833,-3.357,0.1497;6.6655,-0.6698,0.4304;6.9618,-1.6862,0.7065;6.7568,-0.5888,-0.6629;7.6283,0.3271,1.0848;7.6237,0.1754,2.1739;8.652,0.1201,0.7497;7.2231,1.7658,0.7625;7.3284,1.9409,-0.3184;7.8894,2.4801,1.2622;5.772,2.0131,1.1823;5.463,3.032,0.9282;5.7,1.9609,2.2822;4.7588,1.0235,0.5573;4.8084,1.1542,-0.5357;3.3228,1.3895,1.021;2.613,0.6863,0.5796;3.2455,1.2576,2.1075;2.915,2.7892,0.6461;2.951,3.0229,-0.4207;2.4931,3.7281,1.4946;2.179,4.7111,1.1534;2.4315,3.5433,2.5654)|\",5.619151012825\r\n\"[H]C(=O)[C@@]([H])(OC([H])([H])OC([H])([H])[H])[C@@]1([H])OC(C([H])([H])[H])(C([H])([H])[H])OC1([H])[H] |(1.902,2.6789,-2.8958;2.6744,2.339,-3.6165;3.2173,3.1065,-4.3792;2.9714,0.8415,-3.5511;2.0392,0.3066,-3.7957;4.0288,0.4539,-4.4069;3.6885,0.3881,-5.783;3.1703,1.2952,-6.1005;4.6525,0.2946,-6.2996;2.8356,-0.678,-6.092;3.4544,-1.9552,-6.0102;2.7059,-2.6886,-6.3189;3.7877,-2.184,-4.9904;4.3204,-2.0184,-6.6863;3.3868,0.4343,-2.1313;4.3112,0.9646,-1.8596;2.3354,0.7625,-1.2265;2.2697,-0.2505,-0.1998;0.867,-0.8557,-0.2176;0.7727,-1.6174,0.5626;0.6663,-1.32,-1.1879;0.1155,-0.0789,-0.0449;2.649,0.3497,1.1484;2.6704,-0.43,1.9163;1.9233,1.1141,1.4425;3.6405,0.8065,1.0849;3.2545,-1.2264,-0.5437;3.5551,-1.072,-1.9235;4.5702,-1.4319,-2.1002;2.8545,-1.637,-2.5584)|\",5.95929332595\r\n\"[H]O[C@]1([H])C([H])([H])C([H])([H])[C@@]([H])(C([H])=C([H])[H])O[C@]1([H])C([H])([H])C(=O)OC([H])([H])[H] 
|(2.5572,3.6368,-4.8367;1.6694,3.5479,-4.4436;1.5989,2.2946,-3.7866;1.4169,1.4811,-4.5092;0.4163,2.342,-2.8187;0.5217,3.2216,-2.1689;-0.5056,2.4891,-3.3927;0.3539,1.0645,-1.9738;0.1367,0.1932,-2.605;-0.4441,1.1323,-1.224;1.7062,0.8293,-1.2794;1.8839,1.6751,-0.5884;1.7225,-0.442,-0.4748;0.961,-0.4885,0.3042;2.5584,-1.4625,-0.6574;2.5022,-2.355,-0.0403;3.3234,-1.4327,-1.4264;2.7568,0.7763,-2.244;2.9184,1.9406,-3.0536;3.6568,1.636,-3.8025;3.5293,3.1156,-2.2586;4.131,2.7176,-1.4344;2.7652,3.7623,-1.8112;4.4171,3.9963,-3.1131;4.4393,4.0149,-4.3323;5.2046,4.7816,-2.3591;6.0644,5.6828,-3.0813;5.4725,6.3617,-3.7002;6.7527,5.1243,-3.7208;6.6118,6.2358,-2.3179)|\",6.587876320605\r\n\"[H]OC(/C([H])=C(\\[H])C([H])([H])C([H])([H])C([H])([H])[H])(C([H])([H])OC([H])([H])[H])C([H])([H])OC([H])([H])[H] |(7.2537,3.0892,2.8584;7.2792,3.2715,1.9024;6.866,2.0538,1.2883;5.3629,1.8522,1.3611;5.0049,0.8967,0.9789;4.4979,2.7593,1.8183;4.8925,3.7098,2.1742;3.0072,2.5621,1.8867;2.5069,3.3605,1.3159;2.7421,1.6157,1.398;2.436,2.563,3.3222;1.3682,2.3107,3.2678;2.9177,1.7602,3.8959;2.5986,3.8939,4.0645;2.1384,3.849,5.0583;2.1216,4.715,3.5143;3.6537,4.1557,4.2021;7.605,0.8889,1.972;8.6871,0.9811,1.7822;7.2663,-0.0796,1.5787;7.3439,0.9959,3.3636;8.0658,0.0621,4.1391;7.8039,0.2396,5.1855;9.1536,0.184,4.0143;7.8003,-0.9731,3.8743;7.3271,2.1769,-0.1732;8.4129,2.3679,-0.1955;6.8211,3.0419,-0.6294;7.0117,0.9791,-0.8572;7.3665,1.0204,-2.2228;7.0874,0.0575,-2.6588;8.4493,1.1751,-2.3577;6.8341,1.824,-2.756)|\",7.145709714130001\r\n\"[H]C1=C([H])C([H])=C(N([H])C([H])([H])/C([H])=C(\\[H])N2C(=O)OC([H])([H])C2([H])[H])C([H])=C1[H] 
|(6.0233,4.6605,1.5945;5.8149,3.6459,1.2682;6.6124,3.0326,0.297;7.4494,3.5711,-0.1408;6.3489,1.7336,-0.1208;6.981,1.2647,-0.8726;5.2716,1.004,0.4219;5.0701,-0.3135,0.0054;5.4429,-0.5079,-0.9161;3.8208,-1.0223,0.2658;3.6698,-1.0478,1.3534;2.9441,-0.505,-0.1622;3.9,-2.4272,-0.2591;4.6724,-3.0608,0.1726;3.0794,-2.8872,-1.2134;2.3051,-2.2717,-1.6622;3.0968,-4.1747,-1.7397;2.1092,-4.6231,-2.61;1.2058,-3.9794,-3.0808;2.3181,-5.9484,-2.8585;3.5547,-6.3747,-2.2622;3.39,-7.3421,-1.7843;4.3014,-6.4862,-3.0551;3.9354,-5.2598,-1.2672;4.9978,-4.9996,-1.3218;3.6958,-5.5139,-0.2253;4.4654,1.6288,1.3906;3.6169,1.1056,1.8187;4.7454,2.932,1.8062;4.1079,3.3919,2.5574)|\",5.249076176145\r\n\"[H]/C(OC([H])([H])C([H])([H])[H])=C(\\C(=O)C(=O)OC([H])([H])[H])C([H])([H])C([H])([H])[H] |(2.6721,3.3123,0.1226;2.2394,2.3131,0.1053;0.9065,2.2216,0.2227;0.1978,3.4581,0.4357;0.5345,3.9043,1.3803;0.4385,4.1534,-0.379;-1.2854,3.1444,0.4725;-1.8573,4.064,0.6355;-1.5128,2.4458,1.2836;-1.6096,2.6962,-0.4718;3.038,1.2278,-0.028;4.4709,1.5276,-0.1811;4.9577,2.6511,-0.2498;5.4575,0.3462,-0.1708;5.5608,-0.4364,0.7501;6.2324,0.3464,-1.2658;7.2651,-0.6567,-1.2907;7.756,-0.5439,-2.2572;6.8331,-1.6556,-1.189;7.975,-0.4896,-0.4765;2.4981,-0.1854,-0.0747;3.1185,-0.8322,0.5534;1.4928,-0.1886,0.3581;2.4376,-0.76,-1.5025;2.0636,-1.7901,-1.4911;3.4267,-0.7686,-1.9759;1.7734,-0.1624,-2.1368)|\",4.78376149179\r\n\"[H]/C(OC([H])([H])C([H])([H])[H])=C(\\C(=O)C(=O)OC([H])([H])[H])C([H])([H])[H] 
|(4.7868,-0.8034,-0.4723;4.1145,-1.422,-1.0645;2.8825,-0.9432,-1.2911;2.5775,0.3367,-0.7042;2.6901,0.2669,0.3856;3.2931,1.0807,-1.0774;1.156,0.6991,-1.0886;0.8866,1.6661,-0.6509;0.4525,-0.0554,-0.7231;1.055,0.7694,-2.176;4.5346,-2.6184,-1.5402;5.9153,-2.9748,-1.171;6.6515,-2.2994,-0.461;6.5019,-4.2625,-1.783;6.3589,-4.5868,-2.9434;7.2424,-4.9329,-0.8895;7.9102,-6.1008,-1.3984;8.4394,-6.5272,-0.5463;8.6123,-5.8233,-2.1889;7.1851,-6.8138,-1.8002;3.6438,-3.5186,-2.3618;2.6118,-3.1619,-2.3213;3.6624,-4.5503,-1.9928;3.9621,-3.5535,-3.4083)|\",4.61505090448\r\n\"[H]C([H])([H])OC(=O)[C@@]12C(=O)[C@]3([H])C([H])([H])[C@@]4([H])C([H])([H])[C@]3([H])[C@]1([H])[C@]42[H] |(4.9664,0.5937,-1.4021;4.7763,0.1807,-0.4114;4.3307,-0.8147,-0.4876;5.7034,0.1119,0.164;3.8611,1.0971,0.2115;3.4826,0.7757,1.4572;3.8783,-0.2077,2.0561;2.5398,1.7668,2.0564;1.7495,2.7946,1.293;1.776,3.0289,0.1051;0.8666,3.4658,2.3359;0.0352,4.0116,1.8885;1.8698,4.3563,3.1453;2.7049,4.7056,2.5281;1.3776,5.2354,3.5756;2.2826,3.3571,4.2378;2.9685,3.7523,4.9909;0.8809,2.9081,4.7289;0.2563,3.7407,5.0732;0.9179,2.14,5.509;0.4747,2.3305,3.3538;-0.542,1.9505,3.2401;1.5962,1.2763,3.1602;1.4601,0.2131,3.3278;2.7944,2.0465,3.5785;3.6731,1.5411,3.9646)|\",6.15521529831\r\n\"[H]O[C@@]1([H])C(C(=O)C2=C([H])C([H])=C(F)C([H])=C2[H])=C([H])C([H])([H])C([H])([H])C1([H])[H] 
|(1.9097,0.9696,1.7642;2.4155,0.6589,0.9974;1.5109,0.6338,-0.129;1.9743,-0.0647,-0.8325;0.1345,0.1331,0.2777;-0.2334,-1.3327,0.3932;-1.4095,-1.6384,0.5737;0.7647,-2.4426,0.2224;2.1228,-2.3649,0.5741;2.5305,-1.4456,0.9808;2.956,-3.4748,0.4348;4.0041,-3.4339,0.7131;2.4252,-4.6561,-0.0676;3.2313,-5.7258,-0.2145;1.0832,-4.7715,-0.4204;0.7078,-5.7146,-0.8041;0.259,-3.6649,-0.2598;-0.7961,-3.7269,-0.5032;-0.8651,1.0132,0.4939;-1.8344,0.6039,0.7709;-0.7706,2.501,0.322;-1.3983,2.7697,-0.5438;-1.2434,3.0015,1.1785;0.6644,3.0018,0.1093;1.1769,3.0851,1.0776;0.6587,4.0081,-0.3236;1.4401,2.024,-0.7785;2.4631,2.3761,-0.9488;0.9523,1.9381,-1.7596)|\",4.783761491790001\r\n\"[H]C(=O)C([H])([H])C([H])([H])C([H])([H])/C([H])=C(\\[H])C(=O)C1=C([H])C([H])=C(F)C([H])=C1[H] |(4.131,-3.7587,-2.3556;3.8319,-3.1502,-1.4708;4.6807,-2.7045,-0.7267;2.3449,-2.9295,-1.3241;1.8516,-3.9117,-1.3478;2.1532,-2.4661,-0.3526;1.7734,-2.0648,-2.4713;1.9588,-2.5584,-3.4343;0.6844,-2.0156,-2.3511;2.358,-0.6317,-2.5317;1.919,-0.1239,-3.4008;3.4397,-0.686,-2.7061;2.0873,0.1663,-1.2888;1.0773,0.5487,-1.1396;2.9928,0.3862,-0.3248;4.0005,-0.0034,-0.4342;2.6443,1.1265,0.9183;1.641,1.8328,0.9761;3.5448,0.9837,2.1062;3.3853,1.8978,3.1612;2.6306,2.6708,3.0613;4.1725,1.8172,4.303;4.0704,2.5212,5.1225;5.1163,0.7963,4.3874;5.8826,0.7084,5.492;5.2911,-0.1379,3.3728;6.0287,-0.9248,3.4889;4.5032,-0.0378,2.2271;4.623,-0.783,1.4464)|\",4.84906881591\r\n\"[H]OC([H])([H])[C@@]1([H])N2C(=O)C([H])([H])C([H])([H])[C@]2(C([H])([H])C([H])([H])C([H])([H])[H])C([H])([H])C([H])([H])C1([H])[H] 
|(8.14,-1.3528,0.7633;7.8,-2.2249,0.4701;6.5144,-1.9689,-0.0326;6.5308,-1.3764,-0.9651;6.0836,-2.9467,-0.2797;5.5608,-1.2645,0.969;5.8795,-1.5787,1.9733;5.6412,0.2167,0.9108;6.8094,0.8871,1.1288;7.934,0.39,1.2039;6.5099,2.3728,1.2703;7.1166,2.7916,2.0774;6.8064,2.881,0.3455;4.9991,2.4251,1.527;4.5099,3.2899,1.0701;4.8077,2.4763,2.6045;4.4309,1.085,0.9682;3.7682,1.2962,-0.4207;3.3267,0.3548,-0.7663;2.9228,1.9829,-0.2657;4.6637,1.8465,-1.5387;5.048,2.8367,-1.262;5.5386,1.1953,-1.6607;3.9214,1.9571,-2.8754;4.5757,2.352,-3.6604;3.0542,2.6247,-2.7973;3.5557,0.978,-3.2084;3.4391,0.44,1.9531;3.9206,0.4319,2.9388;2.5491,1.0742,2.0469;3.0416,-1.0033,1.5668;2.8511,-1.5681,2.4864;2.0974,-1.0049,1.0109;4.1156,-1.7262,0.7242;3.897,-1.6117,-0.3448;4.0783,-2.8029,0.92)|\",6.843663340075\r\n\"[H]OC([H])([H])C([H])([H])C([H])([H])[C@@]1([H])OC(=O)[C@@]([H])(C([H])([H])[H])[C@]([H])(O[H])[C@]1([H])C([H])([H])[H] |(7.6985,1.0275,0.5704;6.7845,1.1877,0.2919;6.6581,0.7748,-1.0674;7.3226,1.3614,-1.7215;6.9341,-0.2869,-1.184;5.2113,0.9818,-1.4955;4.9685,2.0482,-1.3934;5.1167,0.7325,-2.5584;4.2211,0.1479,-0.6723;4.3329,0.4009,0.3884;4.4744,-0.9169,-0.7707;2.76,0.3787,-1.0749;2.5596,1.4555,-1.0356;2.5984,0.071,-2.4886;2.2815,-1.1701,-2.9425;2.4866,-1.4328,-4.1041;1.5663,-2.1466,-1.9986;0.4976,-1.9698,-2.2085;1.886,-3.5996,-2.3712;1.3151,-4.2802,-1.7338;2.952,-3.8109,-2.2277;1.6449,-3.7904,-3.4187;1.7832,-1.8585,-0.5024;2.7734,-2.2287,-0.2154;0.8949,-2.6195,0.3118;-0.0127,-2.4088,0.0401;1.7227,-0.3488,-0.2042;2.0212,-0.2124,0.8429;0.3201,0.2574,-0.3757;0.3199,1.3063,-0.0601;-0.4172,-0.2635,0.2455;-0.0213,0.2317,-1.4164)|\",7.243670700310001\r\n\"[H]OC1=N[C@]([H])(C([H])([H])[H])[C@]([H])(OC([H])([H])C2=C([H])C([H])=C([H])C([H])=C2[H])C1([H])[H] 
|(6.5857,-2.1302,-0.4977;5.8697,-2.3937,0.0997;4.8498,-1.5083,-0.0193;3.758,-1.6698,0.608;2.8848,-0.5239,0.3038;2.9492,0.174,1.1534;1.4277,-0.9494,0.1432;0.7893,-0.0846,-0.0688;1.0825,-1.4217,1.0679;1.3201,-1.6634,-0.6761;3.5044,0.1908,-0.9399;3.3882,1.2827,-0.8829;2.8736,-0.2917,-2.1167;3.092,0.5089,-3.2656;2.6813,1.5203,-3.1085;4.1749,0.6344,-3.4438;2.4503,-0.1356,-4.4735;1.9985,0.6599,-5.5323;2.0682,1.7436,-5.46;1.4534,0.0775,-6.6772;1.1043,0.7088,-7.4902;1.346,-1.3111,-6.7714;0.9169,-1.7668,-7.6598;1.7868,-2.11,-5.7146;1.7007,-3.1917,-5.7777;2.3394,-1.5272,-4.5736;2.6729,-2.1453,-3.7465;4.9808,-0.2563,-0.8694;5.6137,0.4832,-0.3591;5.4155,-0.4422,-1.8589)|\",6.503521026950001\r\n\"[H]OC([H])([H])[C@@]([H])(O[H])[C@@]1([H])O[C@]([H])(OC([H])([H])[H])C([H])([H])C1([H])[H] |(3.8735,-3.2674,5.4279;4.6389,-2.6742,5.322;4.5708,-2.2454,3.9748;3.81,-1.4582,3.841;5.5478,-1.8173,3.7251;4.2305,-3.4338,3.0661;5.0263,-4.1821,3.1616;3.049,-4.0637,3.5798;2.3079,-3.5194,3.2579;4.0262,-3.1099,1.5829;3.8463,-4.0573,1.0617;2.8179,-2.3121,1.4856;3.0389,-1.194,0.6349;2.3631,-0.4029,0.9899;2.7733,-1.4895,-0.7123;1.4134,-1.8071,-0.972;1.3284,-1.9684,-2.0491;0.7488,-0.9794,-0.679;1.103,-2.714,-0.4397;4.5271,-0.8839,0.7645;4.8994,-0.3233,-0.0956;4.6966,-0.2918,1.6697;5.14,-2.2911,0.8922;6.0772,-2.2986,1.4562;5.3432,-2.7018,-0.1002)|\",8.66954727693\r\n\"[H]OC1=NC([H])([H])[C@]2([H])C([H])([H])[C@]12C(=O)OC([H])([H])C([H])([H])[H] 
|(4.6807,-0.4211,2.3215;5.2111,-1.2442,2.3984;5.7846,-1.4501,1.2041;6.3618,-2.5426,0.8825;6.7824,-2.4506,-0.5212;7.8414,-2.7208,-0.6187;6.2114,-3.1782,-1.1132;6.5189,-1.0172,-1.0102;6.194,-0.829,-2.0284;7.2134,0.0668,-0.2643;7.2874,1.0397,-0.7407;8.0202,-0.1815,0.4211;5.7982,-0.3576,0.1607;4.6956,0.6257,0.1202;3.9524,0.8208,1.0758;4.6203,1.3018,-1.035;3.5979,2.3317,-1.1341;3.5082,2.8285,-0.1656;4.001,3.0327,-1.8685;2.2703,1.7453,-1.5856;1.5411,2.551,-1.7273;1.8763,1.0549,-0.8346;2.3819,1.2119,-2.5351)|\",6.11167708223\r\n\"[H]C1=C([H])C([H])=C(/C([H])=C(\\[H])C2=C(C([H])([H])[H])N=C(C([H])([H])[H])O2)C([H])=C1[H] |(2.6757,-8.575,-0.1477;2.5846,-7.4924,-0.134;1.4165,-6.8769,-0.5855;0.5898,-7.4783,-0.9546;1.3054,-5.4883,-0.5653;0.3919,-5.0152,-0.9185;2.355,-4.6759,-0.0949;2.1748,-3.2242,-0.099;1.2184,-2.8761,-0.4847;3.0587,-2.2912,0.3206;4.0273,-2.5868,0.716;2.8376,-0.8726,0.2977;3.5906,0.2128,0.6732;4.9645,0.2656,1.2556;5.3919,-0.7342,1.3752;4.9524,0.7526,2.2381;5.6352,0.85,0.6146;2.8672,1.3745,0.4266;1.7382,0.9709,-0.0733;0.5788,1.7883,-0.5205;0.8193,2.8449,-0.392;-0.3192,1.5529,0.0628;0.3459,1.5991,-1.5748;1.6372,-0.3814,-0.1851;3.5267,-5.3161,0.3572;4.3595,-4.7248,0.7269;3.638,-6.7024,0.3375;4.5523,-7.1717,0.6916)|\",3.945650832250001\r\n\"[H]/C(C(=O)OC([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])=C(/[H])C1=C(C([H])([H])[H])N=C(C([H])([H])[H])O1 
|(8.6829,-1.7938,1.7569;7.6997,-1.5384,1.3773;6.5487,-2.3248,1.8538;5.3901,-2.1534,1.518;6.9595,-3.2802,2.7292;5.9509,-4.1431,3.303;6.0984,-4.0922,4.3865;4.9658,-3.7415,3.0539;6.1209,-5.5741,2.8011;7.1361,-5.9185,3.0422;5.4321,-6.2059,3.3814;5.8515,-5.7521,1.3029;4.8452,-5.3799,1.0676;6.5502,-5.1238,0.7364;5.9836,-7.2066,0.8421;5.7992,-7.2989,-0.234;5.266,-7.8546,1.3609;6.9876,-7.6022,1.042;7.5071,-0.5337,0.4974;6.4895,-0.3411,0.1652;8.5172,0.3175,-0.0502;8.4908,1.3649,-0.94;7.3335,1.977,-1.6558;6.3932,1.4768,-1.4089;7.4803,1.9219,-2.7407;7.2388,3.038,-1.3967;9.7823,1.8396,-1.118;10.5266,1.0903,-0.3571;11.9985,1.1549,-0.1579;12.4052,1.9601,-0.7715;12.4744,0.2097,-0.4436;12.2454,1.344,0.8931;9.8383,0.1411,0.3262)|\",4.329331361455001\r\n\"[H]C1([H])OC([H])([H])C([H])([H])N([C@]2(C#N)C([H])([H])C([H])([H])C([H])([H])C([H])([H])[C@]2([H])F)C1([H])[H] |(-2.2692,-2.4516,3.7064;-2.5106,-1.3763,3.6985;-3.5253,-1.2377,4.0837;-1.6442,-0.6749,4.5834;-0.301,-0.7976,4.1513;0.3107,-0.2333,4.8626;0.0107,-1.8553,4.1735;-0.0995,-0.2499,2.7345;0.9366,-0.4417,2.4531;-0.2545,0.845,2.7539;-1.0091,-0.9341,1.8198;-0.7676,-0.9867,0.3692;-1.398,-2.2578,-0.0883;-1.8727,-3.2506,-0.4579;0.7499,-1.0944,-0.008;1.2196,-1.8109,0.674;0.7915,-1.5516,-1.0033;1.5363,0.2322,-0.0608;1.7347,0.6136,0.9457;2.5166,0.0243,-0.5067;0.8099,1.3189,-0.8653;0.7996,1.0491,-1.9311;1.3489,2.2699,-0.7894;-0.6352,1.4837,-0.3754;-1.1711,2.2341,-0.9677;-0.658,1.8193,0.6685;-1.371,0.1595,-0.5086;-1.3276,-0.1723,-1.5556;-2.7207,0.3338,-0.2058;-2.4032,-0.834,2.2759;-2.7685,0.2043,2.2592;-3.0444,-1.4213,1.6136)|\",6.805567401005\r\n\"[H]O[C@@]([H])(C([H])=C([H])[H])C([H])([H])C(=O)O[C@@]([H])(C([H])([H])[H])C([H])([H])[C@@]1([H])O[C@]1([H])C([H])=C([H])[H] 
|(-1.071,0.8052,-0.9335;-1.6721,0.7813,-0.1703;-0.8908,0.4251,0.9765;-1.6263,0.3862,1.7885;-0.2951,-0.9497,0.7865;-1.0278,-1.6921,0.4699;0.9807,-1.3076,0.938;1.3021,-2.3301,0.76;1.7543,-0.6104,1.2491;0.1326,1.5249,1.3062;0.9093,1.5686,0.5366;0.6123,1.3096,2.2684;-0.5204,2.8925,1.3648;-0.2652,3.8114,0.6187;-1.4237,2.9518,2.3714;-2.1555,4.2017,2.5338;-1.4692,5.0187,2.2951;-3.3473,4.2229,1.5834;-3.909,5.1563,1.7029;-4.0215,3.3825,1.7833;-3.0067,4.1584,0.5468;-2.5546,4.2763,4.0089;-3.1957,3.4201,4.2595;-3.1484,5.1873,4.1521;-1.3628,4.3141,4.9365;-0.7239,3.4309,4.9306;-0.6281,5.5506,4.9546;-1.321,5.1641,6.1486;-2.2111,5.7643,6.3572;-0.4906,4.8231,7.3337;-1.0457,4.5361,8.2268;0.8423,4.8608,7.343;1.4121,4.5971,8.229;1.3959,5.1656,6.4598)|\",6.881759279145001\r\n\"[H]OC(C1=C([H])[C@]([H])(O[H])[C@]([H])(C([H])([H])[H])C([H])([H])C1=O)(C([H])([H])[H])C([H])([H])[H] |(5.704,1.0208,4.7288;5.2397,1.0725,3.879;4.8407,-0.2671,3.544;4.1327,-0.1686,2.1916;3.982,1.0117,1.5575;4.397,1.9088,2.0085;3.2807,1.2149,0.2371;4.0446,1.2275,-0.5547;2.6922,2.5126,0.1608;2.1116,2.6167,0.9312;2.285,0.0791,-0.0792;2.0221,0.1567,-1.1417;0.9833,0.1806,0.7307;0.3127,-0.6488,0.4806;0.4539,1.1111,0.4997;1.1634,0.1446,1.8116;3.0129,-1.2563,0.1515;3.8529,-1.3382,-0.5561;2.3623,-2.1177,-0.0284;3.5988,-1.4058,1.5526;3.6509,-2.5081,2.0851;3.8787,-0.7877,4.6279;3.5223,-1.7911,4.3879;3.0232,-0.1112,4.7244;4.3963,-0.8256,5.5957;6.0964,-1.1526,3.4437;5.8383,-2.1734,3.1542;6.6029,-1.19,4.4174;6.7932,-0.7331,2.7109)|\",5.12118266641\r\n\"[H]C1=C([H])C(Cl)=NC(=O)C1=O |(2.4212,-0.7688,-0.0058;1.3942,-0.4157,-0.0022;1.0728,0.8898,-0.0031;1.8173,1.6784,-0.0091;-0.35,1.2706,0.006;-0.626,3.0081,0.0161;-1.3652,0.5008,0.0087;-1.1403,-0.9013,0.0007;-2.0656,-1.6761,-0.0097;0.3228,-1.4327,0.0046;0.5476,-2.6261,0.0111)|\",3.1020978957\r\n\"[H]OC(=O)C1=C([H])C2=C(C([H])=C1[H])N([H])C([H])([H])[C@@]1([H])C([H])([H])/C([H])=C(/[H])[C@@]21[H] 
|(8.7517,0.3794,-1.423;8.5892,-0.5762,-1.4704;7.3104,-0.8482,-1.0728;6.985,-1.9966,-0.8704;6.3933,0.3145,-0.9133;5.2154,0.1239,-0.1767;5.0446,-0.8594,0.2533;4.2865,1.1392,0.0225;4.5375,2.4094,-0.5504;5.7158,2.6102,-1.2961;5.9006,3.5837,-1.7451;6.6262,1.5805,-1.4752;7.4994,1.7678,-2.0971;3.6472,3.4494,-0.3467;3.7744,4.2399,-0.9662;2.2599,3.1255,-0.0377;1.7367,4.0571,0.2019;1.7513,2.6665,-0.9021;2.1851,2.163,1.1537;2.5344,2.6904,2.0471;0.7395,1.622,1.3127;0.4836,1.465,2.371;-0.0108,2.3197,0.9155;0.7814,0.3073,0.5656;-0.1135,-0.2504,0.3042;2.0325,-0.0824,0.3144;2.3252,-1.0002,-0.1871;3.0523,0.8973,0.8785;3.4067,0.4978,1.8427)|\",4.879001339465\r\n\"[H]O/N=C(C(=C(/[H])C1=C([H])N=C([H])C2=C1C([H])=C([H])C([H])=C2[H])\\C([H])([H])[H])/C([H])([H])C([H])([H])[H] |(3.071,-1.3997,3.4417;2.7852,-1.8022,2.6072;2.9056,-0.7163,1.707;2.5725,-1.0427,0.5124;2.1562,-2.4231,0.1047;0.8741,-2.6433,-0.2324;0.1921,-1.7919,-0.2274;0.2649,-3.9419,-0.6155;-0.2926,-4.091,-1.8741;-0.2578,-3.2613,-2.5781;-0.8877,-5.2201,-2.3388;-0.9471,-6.2494,-1.52;-1.4226,-7.1543,-1.9017;-0.446,-6.2441,-0.191;0.1763,-5.0496,0.2896;0.6425,-5.0212,1.633;1.0951,-4.113,2.0179;0.5116,-6.1315,2.4381;0.8688,-6.0958,3.4639;-0.0866,-7.3205,1.9519;-0.1758,-8.1846,2.6045;-0.5592,-7.3733,0.6611;-1.0302,-8.2751,0.2769;3.2547,-3.4584,0.078;2.8966,-4.4209,-0.2934;3.6768,-3.5971,1.0788;4.0724,-3.1227,-0.5734;2.6439,0.0314,-0.5516;3.3023,-0.3352,-1.3534;1.6502,0.1067,-1.0172;3.1097,1.4073,-0.073;3.1275,2.1109,-0.9123;4.1143,1.3566,0.3575;2.4431,1.8045,0.6982)|\",4.606887488965\r\n\"[H]O/N=C(C(=C(/[H])C1=C(C([H])([H])[H])C([H])=C([H])N=C1[H])\\C([H])([H])[H])/C([H])([H])C([H])([H])[H] 
|(4.9077,-1.6961,3.2031;4.3968,-1.9816,2.4301;3.278,-1.1249,2.4941;2.4436,-1.3175,1.5342;2.6252,-2.2822,0.4039;1.6011,-3.1118,0.1059;0.6909,-3.0088,0.695;1.5502,-4.1669,-0.9213;0.3707,-4.3939,-1.6694;-0.8626,-3.5452,-1.4832;-1.6421,-3.8237,-2.1983;-0.64,-2.4801,-1.617;-1.281,-3.6589,-0.4745;0.3896,-5.4281,-2.6068;-0.4897,-5.6326,-3.2124;1.5407,-6.1976,-2.7703;1.5655,-7.0015,-3.5044;2.656,-6.0143,-2.0552;2.6386,-5.0254,-1.1564;3.5399,-4.9156,-0.558;3.9231,-2.2077,-0.3712;3.7988,-2.6001,-1.3843;4.723,-2.7653,0.1246;4.2633,-1.1684,-0.4386;1.2151,-0.4302,1.5835;0.3122,-1.0555,1.5875;1.2336,0.1089,2.5357;1.1348,0.5617,0.4117;0.2305,1.1755,0.4891;1.107,0.0397,-0.5509;2,1.2338,0.4088)|\",4.95791435611\r\n\"[H]O/N=C(C(=C([H])\\C1=C(/[H])C2=C([H])C([H])=C([H])C([H])=C2\\N=C/1[H])\\C([H])([H])[H])/C([H])([H])C([H])([H])[H] |(0.4562,-0.9268,0.1105;0.7618,-0.187,-0.4376;2.1079,-0.0849,-0.0351;2.792,0.7917,-0.6874;2.3007,1.6809,-1.7813;1.1412,2.3548,-1.6123;0.6356,2.2293,-0.6585;0.4577,3.2695,-2.5388;-0.2277,4.3694,-2.0492;-0.241,4.5718,-0.9797;-0.9015,5.2492,-2.9272;-1.6,6.4061,-2.4895;-1.6275,6.6333,-1.4262;-2.2281,7.2263,-3.3992;-2.7597,8.1105,-3.0583;-2.1878,6.9251,-4.7842;-2.6893,7.5834,-5.4884;-1.5227,5.8078,-5.2384;-1.4816,5.553,-6.2928;-0.865,4.944,-4.325;-0.2313,3.8346,-4.8081;0.3868,3.0509,-3.952;0.8416,2.1551,-4.3697;3.1989,1.8106,-2.9926;2.9079,2.6501,-3.6268;3.1789,0.899,-3.605;4.2417,1.9676,-2.6977;4.2416,0.8789,-0.2416;4.3932,0.1087,0.5204;4.9049,0.6283,-1.08;4.6195,2.2593,0.3194;5.6705,2.2744,0.6282;4.0052,2.5027,1.1932;4.4744,3.0538,-0.4209)|\",4.269466314345\r\n\"[H]O/N=C(C(=C(/[H])C1=C([H])N=C(C([H])([H])[H])C([H])=C1[H])\\C([H])([H])[H])/C([H])([H])C([H])([H])[H] 
|(5.0652,0.9244,-0.1106;4.457,1.0724,0.6301;3.2086,0.7808,0.0386;2.2338,0.8994,0.8657;2.3665,1.2338,2.32;1.7706,2.3582,2.7711;1.2905,2.9928,2.026;1.723,2.9131,4.1303;1.6122,4.3071,4.2862;1.5767,4.9391,3.3983;1.5528,4.9482,5.4543;1.587,4.2183,6.5823;1.5087,4.9681,7.8892;0.5507,4.7857,8.3933;1.5997,6.0405,7.7039;2.3017,4.6566,8.5794;1.669,2.8193,6.5447;1.6711,2.2497,7.4706;1.7311,2.1645,5.3207;1.7582,1.0808,5.2912;3.1582,0.2563,3.1605;3.5945,0.7404,4.0388;3.9694,-0.1766,2.572;2.5232,-0.5697,3.5108;0.8428,0.641,0.3194;0.3956,-0.1719,0.9108;0.2244,1.5222,0.543;0.7674,0.3095,-1.1717;-0.2741,0.1417,-1.4669;1.3434,-0.5906,-1.406;1.1736,1.1236,-1.7795)|\",4.917097278535\r\n\"[H]OC1=NN([H])[C@]([H])(OC2=C([H])C([H])=C([H])C([H])=C2[H])C([H])([H])C1([H])[H] |(5.0697,-5.0573,1.7129;5.195,-4.1809,2.106;4.8931,-3.2346,1.1621;4.9814,-2.0217,1.5382;4.7353,-1.0131,0.5916;4.43,-0.1933,1.1038;3.9101,-1.3208,-0.5459;3.9479,-0.4657,-1.2288;2.5271,-1.6187,-0.2358;1.6987,-0.6248,0.2142;2.0044,0.7427,0.2044;2.9564,1.1063,-0.1677;1.0596,1.665,0.666;1.3079,2.7231,0.6525;-0.1822,1.245,1.1364;-0.9081,1.969,1.4946;-0.4805,-0.1215,1.1432;-1.444,-0.4676,1.5084;0.4496,-1.0503,0.6891;0.2339,-2.1142,0.6919;4.4495,-2.5811,-1.2139;3.8403,-2.8327,-2.0856;5.4674,-2.3677,-1.5554;4.4585,-3.7313,-0.1966;5.1274,-4.5339,-0.5356;3.4538,-4.1662,-0.1051)|\",5.4966997801\r\n\"[H]C1=NC([H])=C([H])C(C(=O)N([H])N([H])/C(=C(\\[H])C(=O)C([H])([H])[H])C([H])([H])[H])=C1[H] 
|(7.5752,2.8555,-3.3447;6.5694,2.4419,-3.3855;6.0893,2.1826,-4.6094;4.8582,1.6646,-4.6863;4.4918,1.4458,-5.6877;4.0579,1.3975,-3.5742;3.0803,0.9443,-3.7082;4.5652,1.6852,-2.3018;3.81,1.4372,-1.0307;4.3569,1.3401,0.0606;2.4362,1.3356,-1.1613;1.9913,1.856,-1.909;1.6901,1.2962,0.0186;2.2619,1.6156,0.7976;0.8892,0.1869,0.2856;0.5614,-0.7242,-0.6701;1.0069,-0.6268,-1.6553;-0.351,-1.8473,-0.45;-0.9265,-2.0701,0.6146;-0.5651,-2.7697,-1.6468;0.3848,-3.2177,-1.9656;-0.9619,-2.211,-2.5042;-1.2659,-3.5608,-1.3737;0.3829,0.174,1.7046;0.6763,-0.7525,2.2051;-0.7091,0.1948,1.727;0.7698,1.0313,2.2666;5.858,2.2096,-2.2093;6.284,2.4221,-1.2349)|\",4.078986618995\r\n\"[H]OC1=C2C(=O)[C@]([H])(C([H])([H])[H])C([H])([H])[C@@]([H])(O[H])C2=C([H])C([H])=C1[H] |(5.2506,-4.3724,-0.9484;4.6103,-3.6478,-0.8591;5.2854,-2.4733,-0.9263;4.5772,-1.2515,-0.7719;3.0934,-1.2353,-0.5585;2.3918,-2.2175,-0.7348;2.4824,0.092,-0.0907;2.7609,0.1995,0.9719;0.9567,0.0781,-0.1884;0.5415,1.007,0.2179;0.5357,-0.7674,0.361;0.6339,-0.0155,-1.2312;3.1277,1.255,-0.8577;2.6949,2.211,-0.5429;2.9211,1.1411,-1.9314;4.6387,1.3083,-0.6348;5.0803,2.0245,-1.3359;4.9687,1.8477,0.6557;4.8431,1.144,1.3118;5.3116,-0.0429,-0.832;6.6942,-0.0548,-1.031;7.2337,0.8869,-1.0677;7.3708,-1.2629,-1.176;8.4458,-1.2741,-1.3341;6.6718,-2.4642,-1.1219;7.1983,-3.4108,-1.2331)|\",4.88172247797\r\n\"[H]C([H])([H])C1C(C([H])([H])[H])[C@]2(C([H])([H])[H])C([H])([H])C([H])([H])C([H])([H])C(C([H])([H])[H])(C([H])([H])[H])[C@@]2([H])C([H])([H])C1([H])[H] 
|(1.6153,-0.8623,2.5393;2.2668,-0.6204,1.6846;2.243,-1.5066,1.0328;3.276,-0.5055,2.0833;1.7338,0.6483,1.0619;2.4675,1.7402,0.6495;1.77,3.0926,0.6103;0.819,3.047,1.1509;1.5831,3.5323,-0.3789;2.3953,3.8176,1.1528;3.4864,1.4356,-0.4561;4.4472,0.2675,-0.1399;4.8856,0.4111,0.8529;5.277,0.2359,-0.8479;3.9702,-0.7125,-0.1619;4.3347,2.6381,-0.9303;3.7134,3.5398,-0.9763;5.1211,2.8415,-0.1903;4.9582,2.4147,-2.3187;5.4972,3.3217,-2.6222;5.7161,1.6223,-2.2811;3.9006,2.0814,-3.3792;4.3875,1.9044,-4.3483;3.2613,2.9666,-3.5171;2.9773,0.8741,-3.0535;1.8385,0.894,-4.1007;2.2588,0.9611,-5.1122;1.1763,1.7572,-3.9591;1.2258,-0.013,-4.0632;3.7324,-0.4602,-3.2323;4.6599,-0.5133,-2.6596;3.9938,-0.5976,-4.2892;3.1107,-1.3138,-2.9388;2.3791,1.107,-1.6193;1.8487,2.0605,-1.7378;1.267,0.0938,-1.1907;0.5114,0.0403,-1.9825;1.6838,-0.9166,-1.0993;0.5287,0.467,0.1673;-0.1637,-0.334,0.4607;-0.0556,1.3782,0.0126)|\",3.73340202886\r\n\"[H]C1=C([H])C([H])=C([H])C(C#CC2=C([H])/C(=N\\OC([H])([H])[H])C([H])([H])C([H])([H])C2([H])[H])=N1 |(11.8319,-3.3492,2.8358;10.775,-3.5809,2.7118;10.2536,-4.7746,3.2188;10.897,-5.4778,3.7387;8.8948,-5.0299,3.0366;8.4446,-5.9443,3.4133;8.1212,-4.0916,2.3622;7.0602,-4.2458,2.1953;8.7383,-2.9176,1.8882;7.9664,-1.9378,1.1923;7.2697,-1.1313,0.604;6.524,-0.1504,-0.0973;5.1818,-0.2893,-0.2742;4.6618,-1.1383,0.1528;4.4017,0.6722,-1.0366;3.1137,0.6709,-1.1613;2.5096,-0.3789,-0.4597;1.1064,-0.3321,-0.6868;0.6874,-1.1523,-0.0985;0.8714,-0.4778,-1.7479;0.6838,0.622,-0.3508;5.1466,1.7597,-1.7785;5.4909,1.3604,-2.7446;4.4616,2.5839,-1.9979;6.3635,2.2261,-0.9669;6.9173,2.9934,-1.52;6.0211,2.6887,-0.0321;7.2898,1.047,-0.6414;8.056,1.3424,0.0846;7.8358,0.731,-1.5431;10.055,-2.6649,2.0621)|\",3.74428658288\r\n\"[H]C1=C([H])C([H])=C(/C([H])=C(\\[H])C2=C(C#N)C(C([H])([H])[H])=NN2C([H])([H])[H])C([H])=C1[H] 
|(11.421,-3.4525,0.4576;10.3413,-3.3327,0.4383;9.761,-2.3149,-0.3262;10.3907,-1.6432,-0.9036;8.3798,-2.1573,-0.355;7.9518,-1.3628,-0.9598;7.5393,-3.0145,0.3829;6.0803,-2.9067,0.4011;5.5783,-3.6496,1.016;5.325,-1.9968,-0.254;5.8096,-1.244,-0.8704;3.88,-1.9279,-0.2137;2.8942,-2.7224,0.4038;3.0786,-3.8756,1.201;3.2163,-4.8274,1.8606;1.6478,-2.1294,0.0576;0.2748,-2.5773,0.447;0.0713,-3.5927,0.0873;-0.4689,-1.9,0.0202;0.1549,-2.5897,1.5365;1.8364,-1.0606,-0.7074;3.1821,-0.952,-0.863;3.7043,0.146,-1.6583;4.281,-0.2269,-2.5109;4.3354,0.805,-1.0532;2.8428,0.7047,-2.0227;8.1408,-4.0338,1.1456;7.5077,-4.7056,1.7199;9.5246,-4.1922,1.1737;9.9642,-4.9865,1.7707)|\",3.986467909825\r\n\"[H]OC(=O)C#C[C@@]([H])(O/C([H])=C(\\[H])C(=O)OC([H])([H])[H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] |(3.2149,4.4758,1.7207;2.5722,5.1624,1.4715;1.8379,4.7388,0.4127;0.9805,5.4302,-0.0764;2.1737,3.4027,-0.0597;2.4296,2.296,-0.4814;2.7709,0.9632,-0.9989;2.3573,0.8591,-2.0128;4.2045,0.8283,-1.0764;4.8268,1.4895,-2.0803;4.1873,2.0029,-2.7947;6.1643,1.496,-2.1888;6.7868,0.9887,-1.4605;6.8806,2.1775,-3.2746;8.091,2.1861,-3.3916;6.0471,2.8108,-4.1452;6.7018,3.5124,-5.2114;5.9013,3.9561,-5.8048;7.2951,2.8253,-5.8207;7.3599,4.29,-4.8141;2.2192,-0.2074,-0.1313;2.8111,-0.1479,1.2878;2.4326,-0.9839,1.8864;2.5336,0.7812,1.798;3.9029,-0.2139,1.2632;2.611,-1.5331,-0.8115;2.2023,-2.376,-0.2432;3.6965,-1.6517,-0.8652;2.2077,-1.5926,-1.83;0.6843,-0.0907,-0.0775;0.2702,-0.922,0.5033;0.2439,-0.1354,-1.081;0.3617,0.8431,0.3945)|\",5.21914365259\r\n\"[H]/C(OC([H])([H])C#CC(=O)OC([H])([H])C([H])([H])[H])=C(/[H])C(=O)OC([H])([H])[H] 
|(7.6449,4.1452,-1.2548;8.2356,3.291,-0.9377;7.5379,2.1495,-1.0671;8.2033,0.9368,-0.6839;9.1158,0.8133,-1.284;8.5046,1.0013,0.3711;7.3049,-0.1887,-0.8894;6.5788,-1.1418,-1.0532;5.7553,-2.3245,-1.2114;6.0713,-3.4232,-0.8046;4.6158,-2.0256,-1.8623;3.6809,-3.1198,-2.0615;3.1205,-2.8278,-2.9526;4.2468,-4.0302,-2.2724;2.7717,-3.2919,-0.8553;2.0254,-4.0671,-1.0644;2.2449,-2.3593,-0.6297;3.3458,-3.5983,0.0237;9.493,3.4559,-0.4859;10.1281,2.6423,-0.1582;10.0453,4.8181,-0.4288;9.4812,5.8403,-0.7739;11.3052,4.7979,0.0774;11.9416,6.0794,0.1783;11.3781,6.739,0.8444;12.0149,6.5535,-0.8044;12.9342,5.8823,0.585)|\",5.643641259369999\r\n\"[H]/C(O[C@]([H])(C#CC(=O)OC([H])([H])[H])C([H])(C([H])([H])[H])C([H])([H])[H])=C(/[H])C(=O)OC([H])([H])[H] |(4.4782,0.1261,-3.2572;4.9952,0.182,-2.3007;4.2338,0.2044,-1.1876;2.8013,0.2044,-1.3773;2.546,-0.5787,-2.1089;2.3575,1.4968,-1.9087;1.9986,2.5556,-2.3729;1.4952,3.8061,-2.9032;0.3507,4.1878,-2.7759;2.4638,4.4768,-3.5548;2.0584,5.7358,-4.1226;2.938,6.1099,-4.6463;1.755,6.4302,-3.3346;1.2258,5.5943,-4.8162;2.1634,-0.1602,-0.0172;2.4949,0.6067,0.6951;0.6312,-0.1284,-0.1037;0.1977,-0.3806,0.8698;0.2604,-0.863,-0.8303;0.2574,0.8567,-0.3974;2.6708,-1.5287,0.4562;2.2553,-1.7663,1.4416;3.7609,-1.5492,0.531;2.3589,-2.322,-0.2357;6.336,0.2219,-2.248;6.869,0.2925,-1.307;7.0933,0.1747,-3.5044;6.6214,0.0857,-4.625;8.4287,0.2451,-3.2664;9.2556,0.2196,-4.437;10.2829,0.2491,-4.0714;9.0508,1.086,-5.0725;9.0795,-0.6909,-5.0165)|\",5.276287561195001\r\n\"[H]/C(O[C@@]([H])(C#CC(=O)OC([H])([H])[H])C1([H])C([H])([H])C1([H])[H])=C(/[H])C(=O)OC([H])([H])[H] 
|(1.5917,-0.193,-6.1691;1.9181,0.8141,-6.4233;1.4158,1.8273,-5.6854;0.5807,1.4637,-4.5687;-0.0748,0.639,-4.8882;1.3869,1.0083,-3.4298;2.0214,0.6298,-2.4712;2.8398,0.1811,-1.3643;4.0519,0.1644,-1.3707;2.0703,-0.2132,-0.3289;2.7968,-0.6762,0.8238;2.0341,-0.9514,1.5525;3.4374,0.1173,1.2171;3.4143,-1.5408,0.5672;-0.2791,2.6669,-4.2018;-0.9254,2.4554,-3.3531;-0.8209,3.5826,-5.2683;-0.52,3.3944,-6.294;-1.8367,3.9497,-5.1513;0.249,4.0723,-4.3198;-0.0221,4.7798,-3.5417;1.2468,4.2028,-4.7258;2.7688,1.0271,-7.4382;3.1127,2.0214,-7.6983;3.2482,-0.1235,-8.215;2.9321,-1.2884,-8.0458;4.1148,0.2779,-9.1798;4.6491,-0.7731,-9.9949;5.3085,-0.2813,-10.7113;3.8476,-1.306,-10.5143;5.211,-1.4885,-9.3875)|\",5.368806270365\r\n\"[H]/C(O[C@]([H])(C#CC(=O)OC([H])([H])[H])C([H])([H])[H])=C(/[H])C(=O)OC([H])([H])[H] |(4.6586,0.5494,-0.4531;3.9877,1.1099,-1.1026;2.6736,0.8083,-1.0377;2.3054,-0.2819,-0.1671;2.8778,-0.187,0.7675;2.6204,-1.5794,-0.7752;2.8739,-2.6576,-1.2632;3.2198,-3.9184,-1.8869;3.8683,-4.0243,-2.9052;2.7221,-4.9537,-1.18;3.0211,-6.2544,-1.7202;2.5474,-6.9632,-1.0412;4.1019,-6.4148,-1.7551;2.6137,-6.3529,-2.7297;0.8138,-0.1255,0.1278;0.4828,-0.9094,0.8148;0.2363,-0.1999,-0.7976;0.6327,0.8524,0.5824;4.4631,2.065,-1.9164;3.816,2.6223,-2.5837;5.9019,2.3574,-1.8942;6.7362,1.8058,-1.1971;6.2052,3.351,-2.7683;7.5912,3.7134,-2.819;7.6591,4.5089,-3.5622;8.2053,2.8583,-3.1153;7.9339,4.0698,-1.8433)|\",5.268124145680001\r\n\"[H]/C(O[C@]([H])(C#CC(=O)OC([H])([H])[H])C([H])([H])C([H])([H])[H])=C(/[H])C(=O)OC([H])([H])[H] 
|(4.1567,-2.2198,-2.0438;4.736,-1.735,-1.2596;4.0911,-0.8494,-0.4707;2.7033,-0.5971,-0.7749;2.1927,-1.5626,-0.9098;2.5644,0.182,-2.0097;2.4333,0.8313,-3.0229;2.3539,1.599,-4.2486;3.2832,1.7725,-5.007;1.1114,2.0923,-4.4268;0.9354,2.8722,-5.6242;-0.0945,3.2268,-5.5877;1.0985,2.2523,-6.51;1.6349,3.7115,-5.6422;2.1136,0.1181,0.4504;2.3488,-0.5048,1.3208;2.6449,1.0675,0.5789;0.605,0.3528,0.3448;0.2378,0.8588,1.2433;0.057,-0.5921,0.2484;0.3566,0.9772,-0.5196;6.0359,-2.0217,-1.0916;6.6342,-1.5372,-0.3288;6.6615,-3.015,-1.9733;6.1059,-3.646,-2.8559;7.9803,-3.1551,-1.681;8.6854,-4.1001,-2.4958;9.7143,-4.0896,-2.1336;8.6476,-3.8071,-3.5488;8.2532,-5.0995,-2.3934)|\",5.246355037640001\r\n\"[H]OC(=O)[C@@]([H])(N([H])[H])C([H])([H])[C@@]1([H])C(O[H])=NSN1[H] |(-0.1672,-2.4522,0.9373;-1.0206,-2.2422,0.5204;-0.8247,-1.5412,-0.6249;-1.7622,-1.2024,-1.3049;0.6336,-1.1874,-0.9605;1.2541,-2.0815,-0.8147;0.8022,-0.7228,-2.3339;-0.0159,-0.1575,-2.5685;0.7597,-1.5183,-2.9692;1.1494,-0.108,0.016;0.8634,-0.3309,1.0506;0.6877,0.8536,-0.2423;2.6825,0.0051,-0.0613;2.9688,0.0749,-1.1173;3.1756,1.2076,0.7569;3.0326,2.4685,0.2886;2.734,2.4344,-0.6337;3.6476,0.9947,1.9226;3.6338,-0.7074,2.2215;3.3311,-1.1745,0.5365;4.2484,-1.2914,0.1033)|\",5.453161564019999\r\n\"[H]C([H])=C1C(=O)O[C@]2([H])[C@]3([H])/C(C([H])([H])[H])=C(/[H])C([H])([H])[C@]3([H])C([H])([H])C([H])([H])C([H])([H])[C@]12[H] 
|(2.8116,3.8959,-4.3256;1.9904,3.6932,-3.644;1.4091,4.5383,-3.2861;1.6786,2.46,-3.2429;0.5545,2.2008,-2.2934;-0.273,2.9704,-1.8709;0.5875,0.8874,-1.9288;1.7402,0.1882,-2.4818;1.3329,-0.7163,-2.9456;2.6083,-0.3077,-1.3122;1.863,-0.8616,-0.7141;3.3086,0.6208,-0.3108;2.8547,2.0002,0.0656;3.4593,2.393,0.89;1.8038,2.0053,0.3799;2.935,2.7052,-0.7709;4.2929,-0.0696,0.2829;4.8919,0.3037,1.1099;4.4265,-1.4719,-0.2667;3.9104,-2.1956,0.3848;5.4649,-1.8143,-0.358;3.7201,-1.3498,-1.6442;3.2596,-2.2963,-1.9564;4.7953,-0.9498,-2.683;5.4896,-0.2465,-2.2049;5.3798,-1.8564,-2.8904;4.3586,-0.3243,-4.014;3.6306,-0.9646,-4.5337;5.2401,-0.2849,-4.6664;3.794,1.1004,-3.8571;3.984,1.6793,-4.7698;4.3242,1.6188,-3.048;2.2861,1.1279,-3.5984;1.7963,0.7929,-4.5264)|\",5.10485583538\r\n\"[H]C1=C(C([H])([H])[H])C([H])([H])C([H])([H])[C@@]([H])([C@]([H])(C([H])([H])[H])C([H])([H])C([H])([H])N([H])C([H])(C([H])([H])[H])C([H])([H])[H])C1([H])[H] |(8.4312,-2.5429,6.2294;8.4992,-3.3227,5.4698;8.0135,-4.5403,5.7474;7.3491,-4.8662,7.0599;7.2991,-3.9932,7.7192;7.8861,-5.6646,7.5917;6.324,-5.2331,6.9059;8.1007,-5.6655,4.7378;8.4349,-6.584,5.2431;7.0882,-5.896,4.3686;9.0448,-5.3462,3.5683;8.9391,-6.1012,2.7818;10.0812,-5.4213,3.9255;8.843,-3.9212,3.0094;9.5747,-3.756,2.2116;7.4352,-3.6939,2.3824;6.7059,-3.7065,3.2059;7.0409,-4.8196,1.4082;6.1057,-4.5757,0.8887;6.8834,-5.7712,1.9252;7.8125,-4.9822,0.6459;7.2503,-2.3123,1.6986;6.1725,-2.1738,1.5328;7.5387,-1.5062,2.3867;7.9323,-2.0654,0.3431;7.4993,-1.138,-0.0772;7.6688,-2.8653,-0.361;9.3923,-2.0267,0.4339;9.6691,-1.2999,1.0953;10.1091,-1.8186,-0.8329;9.7497,-2.6026,-1.5151;9.8572,-0.452,-1.4981;10.3943,-0.3742,-2.4509;10.2083,0.3626,-0.8503;8.7943,-0.2862,-1.7029;11.6056,-2.0445,-0.5934;12.1715,-1.9691,-1.5289;11.7795,-3.0314,-0.154;12.0071,-1.2906,0.0972;9.1522,-2.9377,4.1622;10.2441,-2.8879,4.3042;8.8568,-1.9137,3.901)|\",6.811009678015001\r\n\"[H]O[C@]1(C([H])([H])[H])C([H])([H])C([H])([H])[C@]2([H])C(=C([H])[H])C([H])([H])C([H])([H])[C@]([H])(C([H])([H])
[H])/C(C([H])([H])[H])=C(/[H])[C@]12[H] |(6.666,0.9055,1.2158;6.9214,0.078,1.6595;6.7569,-0.9856,0.7123;7.1722,-2.2506,1.4582;7.0978,-3.1291,0.808;6.5434,-2.4177,2.3392;8.208,-2.156,1.8014;7.6107,-0.7336,-0.5682;8.3107,0.0819,-0.3591;8.2164,-1.6086,-0.8268;6.6194,-0.3833,-1.701;6.4401,-1.2705,-2.3204;6.9855,0.3955,-2.3756;5.2859,-0.0141,-1.0138;4.4625,-0.248,-1.6968;5.1113,1.4505,-0.6259;6.0795,2.3664,-0.784;5.9124,3.4108,-0.5304;7.062,2.1261,-1.1769;3.7757,1.8925,-0.0319;3.8362,1.7862,1.058;3.6632,2.9666,-0.2238;2.4811,1.1946,-0.4974;2.3606,1.3004,-1.5845;1.651,1.7586,-0.0498;2.3008,-0.3119,-0.122;2.8114,-0.911,-0.8819;0.8148,-0.6963,-0.2062;0.4199,-0.4747,-1.2052;0.2072,-0.1402,0.5166;0.665,-1.7663,-0.0208;2.9558,-0.6322,1.2151;2.1123,-0.5324,2.4653;2.7201,-0.6833,3.3628;1.3042,-1.2754,2.4774;1.6292,0.4516,2.5509;4.2676,-0.9026,1.3052;4.6804,-1.0528,2.3011;5.2809,-1.0301,0.188;5.1521,-2.0183,-0.2861)|\",6.44365597984\r\n\"[H]OC(=O)[C@@]([H])(N([H])[H])C([H])([H])[C@]1([H])C([H])=C([H])C(=O)N[N+]1[O-] |(2.736,2.379,-0.2897;2.3364,2.9936,0.3693;2.3707,2.4405,1.5965;1.9232,3.0171,2.5613;2.9573,1.0134,1.6969;3.8282,0.9314,1.0399;3.3632,0.6739,3.0557;2.7037,1.1211,3.6946;4.2645,1.1044,3.2549;1.9013,-0.0372,1.2631;2.3463,-1.0291,1.3867;1.059,0.0203,1.9641;1.2879,0.0745,-0.1578;0.9065,1.0855,-0.3263;0.2081,-0.9397,-0.3482;-0.7051,-0.7794,0.2191;0.3866,-2.0106,-1.1238;-0.3589,-2.788,-1.249;1.6631,-2.2024,-1.8361;1.953,-3.2297,-2.4135;2.5996,-1.1192,-1.9002;2.3803,-0.0794,-1.199;3.1353,0.9183,-1.3085)|\",3.785103660455001\r\n\"[H]C#CC([H])([H])OC1=C([H])C(=O)N=C2C([H])=C([H])C([H])=C([H])[C@]21[H] 
|(10.7294,-10.9562,-3.3234;10.9226,-10.0015,-2.8882;11.135,-8.922,-2.395;11.4342,-7.6254,-1.8018;11.8107,-6.9266,-2.5613;12.205,-7.721,-1.0251;10.2327,-7.0925,-1.2193;10.3206,-5.8863,-0.6136;11.4209,-5.1234,-0.4727;12.3965,-5.4069,-0.8478;11.3713,-3.8438,0.2629;12.394,-3.2384,0.5423;10.1092,-3.2955,0.6126;9.0403,-3.9958,0.4222;7.7441,-3.3759,0.6535;7.7501,-2.4052,1.1384;6.609,-3.9325,0.1657;5.6591,-3.4197,0.2944;6.6268,-5.1774,-0.588;5.7061,-5.519,-1.0528;7.7594,-5.8904,-0.7154;7.7932,-6.8246,-1.2652;9.015,-5.4651,0.0076;8.9622,-6.0087,0.9756)|\",4.149736220125\r\n\"[H]OC([H])([H])C1C([H])O[C@@]2([H])C([H])([H])[C@]([H])(O[H])C([H])([H])[C@@]([H])(O[H])[C@]2([H])C1=O |(1.5329,-1.3659,2.1654;2.3055,-0.9203,1.7816;1.8527,0.3378,1.276;1.599,1.0256,2.0958;2.7137,0.7412,0.7353;0.6614,0.0875,0.377;-0.559,0.6247,0.6813;-1.4639,0.3548,0.1305;-0.5495,1.9386,1.1248;-0.2803,2.644,-0.1511;-1.1826,2.5361,-0.767;-0.0589,4.1217,0.136;-0.9155,4.5235,0.6856;0.8362,4.2539,0.7555;0.0999,4.8828,-1.1885;0.3509,5.9308,-0.9613;-1.1724,4.8166,-1.8393;-1.0984,5.2734,-2.6919;1.2109,4.2807,-2.0621;2.1787,4.465,-1.57;1.2555,4.7961,-3.0308;1.0689,2.7659,-2.309;0.1871,2.563,-2.9226;2.1615,2.2736,-3.0705;2.9772,2.4636,-2.5786;0.9209,2.0329,-0.9635;1.8331,2.2197,-0.3832;0.6818,0.4659,-1.0958;0.2948,-0.0803,-2.0965)|\",4.049054095440001\r\n\"[H]OC1C([H])O[C@@]2([H])C([H])([H])[C@]([H])(O[H])C([H])([H])[C@@]([H])(O[H])[C@]2([H])C1=O 
|(1.2378,0.597,1.9575;1.55,0.1217,1.1604;0.4747,-0.1234,0.3677;-0.7241,0.4808,0.6544;-1.6091,0.3107,0.045;-0.5685,1.7783,1.156;-0.1837,2.5277,-0.0551;-1.072,2.5219,-0.7065;0.149,3.9653,0.3081;-0.6827,4.4141,0.8595;1.0371,3.9871,0.9531;0.4089,4.7832,-0.9775;0.7623,5.777,-0.6833;-0.8047,5.0368,-1.6877;-1.0217,4.2625,-2.2282;1.475,4.1193,-1.8641;2.4449,4.1868,-1.3477;1.5713,4.6733,-2.8038;1.2157,2.6287,-2.1757;0.335,2.5111,-2.8185;2.2756,2.0817,-2.9427;3.0994,2.2138,-2.4453;0.986,1.8579,-0.8652;1.8987,1.9258,-0.2645;0.6263,0.3275,-1.0762;0.2762,-0.1636,-2.1187)|\",4.15245735863\r\n\"[H]C(=O)[C@]1([H])C(C([H])([H])[H])=C([H])C(=O)[C@@]2([H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])C([H])([H])C([H])([H])[C@]12C([H])([H])[H] |(-0.3891,1.1366,-0.6187;0.6746,1.2841,-0.3314;1.1527,2.3966,-0.3069;1.4389,0.0001,0.0151;0.6767,-0.6989,0.3917;2.4379,0.2722,1.122;1.879,0.4318,2.5088;2.6671,0.6267,3.2413;1.3306,-0.4681,2.8192;1.1662,1.2666,2.5455;3.751,0.3789,0.8608;4.4718,0.5574,1.6554;4.3358,0.2389,-0.4886;5.5539,0.22,-0.6175;3.3451,0.1807,-1.6609;2.9669,1.2147,-1.7194;3.9806,-0.0937,-3.0702;4.8346,-1.3794,-3.1512;5.3371,-1.4162,-4.1257;4.2447,-2.296,-3.0636;5.6032,-1.3888,-2.3763;4.889,1.1001,-3.4469;5.2438,0.987,-4.479;5.757,1.1632,-2.7885;4.3399,2.0486,-3.3885;2.8417,-0.1522,-4.1207;2.4237,0.8583,-4.2439;3.2741,-0.4256,-5.0923;1.6957,-1.1027,-3.7609;2.0474,-2.1408,-3.7244;0.9269,-1.0702,-4.5432;1.0705,-0.6899,-2.4235;0.2285,-1.3473,-2.1646;0.6509,0.3157,-2.563;2.09,-0.6742,-1.2563;2.4335,-2.1305,-0.8647;1.5325,-2.64,-0.5015;3.1862,-2.1787,-0.0732;2.8101,-2.7052,-1.7117)|\",4.95791435611\r\n\"[H]C1=NC2=C(C([H])=C([H])N2[H])C(C2([H])C([H])([H])N([H])N([H])C2([H])[H])=C1[H] 
|(2.9303,1.4528,0.3076;2.0406,0.8331,0.2119;0.8849,1.4894,0.0638;-0.1747,0.6912,-0.05;-0.1942,-0.7346,-0.0275;-1.5667,-1.1302,-0.1884;-1.9652,-2.1346,-0.2203;-2.3044,0.0201,-0.2988;-3.3695,0.1487,-0.4318;-1.4716,1.1197,-0.2155;-1.7437,2.0895,-0.2673;1.0397,-1.3934,0.1304;1.1385,-2.9005,0.181;0.132,-3.3169,0.0656;1.7466,-3.4746,1.4902;1.0069,-3.6119,2.2854;2.5295,-2.8068,1.8706;2.3607,-4.7553,1.1203;1.6353,-5.4682,1.0412;2.8868,-4.576,-0.2076;3.8291,-4.2134,-0.0775;2.0616,-3.5492,-0.9012;2.7057,-2.8132,-1.3929;1.4547,-4.0258,-1.6794;2.1618,-0.5657,0.2491;3.1518,-0.9972,0.3706)|\",5.08852900435\r\n\"[H]C1=C(OC([H])([H])[H])C([H])=C([H])/C(=C(\\[H])N([H])N([H])C(=O)C(=O)N([H])[H])C1=O |(4.8115,-0.3488,-1.0783;4.3096,0.5608,-1.3837;3.0611,0.9105,-0.9236;2.3222,0.1776,-0.0537;2.864,-1.0469,0.4262;2.1127,-1.4611,1.1006;3.7992,-0.8792,0.9741;3.0482,-1.7486,-0.3964;2.4107,2.1232,-1.3409;1.4252,2.3379,-0.9432;3.0418,2.9525,-2.2154;2.5578,3.8706,-2.5427;4.3455,2.6451,-2.7252;4.9527,3.4904,-3.6406;4.453,4.3857,-4.0028;6.1847,3.2655,-4.1204;6.59,2.3484,-3.8339;6.6804,3.9884,-5.1825;7.5493,4.4981,-5.0592;6.1735,3.8962,-6.4441;5.1707,3.2856,-6.7788;7.0323,4.7137,-7.4335;8.0356,5.3155,-7.0678;6.5352,4.6615,-8.6842;5.7061,4.1083,-8.8568;7.014,5.1297,-9.4395;5.0271,1.4092,-2.3001;6.18,1.11,-2.7185)|\",3.77149796793\r\n\"[H]OC([H])([H])[C@@]([H])(O[H])C([H])([H])[C@@]1([H])N([H])[C@]([H])(C([H])([H])O[H])[C@]([H])(O[H])[C@]([H])(O[H])[C@]1([H])O[H] 
|(-1.8968,1.1844,-2.5812;-1.0562,0.8787,-2.2114;-0.004,1.6963,-2.7318;0.1941,1.4678,-3.7899;-0.2602,2.7617,-2.6452;1.2297,1.4108,-1.8735;2.004,2.1374,-2.1467;0.9126,1.675,-0.5082;0.041,1.2628,-0.3698;1.7601,-0.0214,-2.0769;2.0742,-0.1318,-3.1228;0.9282,-0.7223,-1.9293;2.8935,-0.4216,-1.0991;2.5254,-0.1617,-0.097;3.2173,-1.852,-1.0759;2.3961,-2.4113,-0.8583;3.8296,-2.3721,-2.2993;3.2089,-2.1971,-3.1999;3.9954,-3.8798,-2.1613;4.6298,-4.0926,-1.2925;4.489,-4.2806,-3.0603;2.6862,-4.435,-2.0044;2.7843,-5.3501,-1.7031;5.175,-1.6592,-2.5215;5.6076,-1.9793,-3.4767;6.1467,-1.9858,-1.5303;5.7893,-1.6249,-0.6945;4.9667,-0.1312,-2.5653;4.3661,0.1194,-3.448;6.2015,0.5389,-2.7234;6.797,0.117,-2.0787;4.206,0.3598,-1.3167;3.9998,1.4324,-1.4322;5.097,0.1801,-0.2013;4.5911,0.3499,0.6086)|\",7.23278614629\r\n\"[H]O[C@@]1([H])C([H])([H])C([H])([H])[C@]2([H])[C@]([H])(OC(=O)C([H])([H])[H])OC([H])=C([H])[C@]21[H] |(8.8373,0.4683,1.8142;9.1099,-0.3966,2.1596;7.9936,-0.983,2.8234;8.2932,-2.0234,2.9775;7.6443,-0.2789,4.1793;7.6152,-0.9934,5.0085;8.4439,0.4345,4.4044;6.2698,0.4339,3.9938;6.2423,1.4242,4.4601;5.4607,-0.16,4.4333;6.1107,0.4773,2.4702;6.7611,1.2706,2.068;4.7605,0.7172,1.8204;4.3207,1.6912,2.0323;3.8536,-0.3016,2.2673;2.5252,0.0219,2.3247;2.0888,1.1262,2.1067;1.7104,-1.1909,2.7012;2.0967,-1.6394,3.6218;1.7889,-1.9464,1.912;0.6671,-0.902,2.8319;4.8681,0.6568,0.403;5.7986,-0.1991,-0.1472;5.7189,-0.1776,-1.229;6.6839,-0.9417,0.5271;7.3715,-1.5809,-0.0169;6.6777,-0.8807,2.0254;6.0065,-1.6566,2.4283)|\",6.245012868975\r\n\"[H]OC([H])([H])[C@@]([H])(N([H])[H])[C@@]([H])(O[H])C([H])([H])[C@]([H])(O[H])[C@]([H])(N([H])[H])C([H])([H])O[H] 
|(0.0188,-1.5589,2.3889;-0.9201,-1.7391,2.2051;-1.196,-1.1343,0.945;-0.6213,-1.6116,0.1384;-2.257,-1.312,0.7493;-0.9476,0.3841,0.9568;-1.257,0.7775,-0.0234;-1.8253,0.9577,1.9795;-1.8146,0.3822,2.8186;-1.5855,1.9176,2.2157;0.5401,0.7731,1.1385;0.5877,1.8713,1.2004;1.0743,0.2389,2.3771;0.4597,0.5201,3.0772;1.4726,0.3434,0.0041;1.2031,0.9389,-0.8754;1.3094,-0.7084,-0.2614;2.9674,0.5213,0.3582;3.2686,-0.3528,0.9559;3.2029,1.7053,1.1263;2.8092,1.5274,1.9973;3.9042,0.5797,-0.8609;3.7611,-0.3332,-1.4534;5.3266,0.6343,-0.4756;5.4153,1.2926,0.2993;5.6404,-0.2704,-0.1294;3.6685,1.7817,-1.7982;3.7179,2.7127,-1.2104;2.6785,1.7338,-2.2637;4.6205,1.7694,-2.8431;5.4476,1.5052,-2.3928)|\",7.300814608914999\r\n\"[H]O/C(=N/C(C#N)(C([H])([H])[H])C([H])([H])[H])[C@]([H])(N([H])[H])C([H])([H])C1=C([H])C([H])=C([H])S1 |(3.7929,0.8117,0.6676;3.553,1.4227,-0.0722;4.5463,1.2671,-0.9727;4.3618,1.759,-2.1321;5.2495,1.6494,-3.2831;6.7094,1.7132,-2.9953;7.852,1.7864,-2.7948;4.9315,2.8504,-4.2058;5.5343,2.8179,-5.119;5.1249,3.7911,-3.6834;3.8699,2.8123,-4.4649;4.953,0.3238,-4.0241;5.5133,0.2615,-4.9626;3.8817,0.284,-4.2414;5.2136,-0.5399,-3.4042;5.7541,0.532,-0.3574;6.3565,0.0671,-1.1438;5.2135,-0.4445,0.6052;5.9405,-0.7356,1.2569;4.8872,-1.2822,0.126;6.6428,1.5577,0.4028;6.9322,2.3451,-0.2976;6.0276,2.0265,1.1795;7.8878,0.9525,0.9931;9.0673,0.6594,0.3575;9.2277,0.8772,-0.6938;10.043,0.0659,1.2154;11.0361,-0.2233,0.888;9.6017,-0.0914,2.4996;10.132,-0.4991,3.3497;7.9785,0.4947,2.6842)|\",5.801467292660001\r\n\"[H]O/C(=N/C1(C#N)C([H])([H])C1([H])[H])[C@@]([H])(N([H])[H])C([H])([H])C1=C([H])C([H])=C([H])S1 
|(4.1834,-2.1437,-0.8395;3.9247,-2.413,-1.7406;2.619,-2.1141,-1.9541;2.213,-2.2745,-3.1543;0.833,-2.0675,-3.5063;-0.1532,-2.8281,-2.7619;-0.8981,-3.4591,-2.128;0.5513,-1.8733,-4.9887;1.436,-1.9262,-5.615;-0.3474,-2.3199,-5.4025;0.4072,-0.705,-4.0616;-0.5894,-0.3443,-3.824;1.1866,0.0506,-4.0639;1.8216,-1.7071,-0.7074;0.8445,-1.3529,-1.0403;1.5717,-2.8401,0.1967;2.3951,-3.4362,0.2554;0.8196,-3.4129,-0.1849;2.445,-0.5078,0.0627;2.5473,0.3362,-0.63;1.6984,-0.218,0.8109;3.7585,-0.7533,0.7672;4.0332,-1.613,1.8069;3.2763,-2.261,2.2344;5.3816,-1.5372,2.27;5.7765,-2.144,3.0776;6.1252,-0.6123,1.5906;7.166,-0.3528,1.7297;5.1879,0.1809,0.3671)|\",5.48037294907\r\n\"[H]C1=C(C([H])([H])[H])C2=C(O1)/C([H])=C(/[H])[C@@]1([H])/C(C([H])([H])[H])=C(/[H])C(=O)/N=C/21 |(1.7633,3.1936,-6.2084;1.9072,2.7636,-5.2288;2.8261,2.9801,-4.247;3.9789,3.9352,-4.2722;4.0159,4.4786,-5.2219;4.9263,3.4078,-4.1265;3.9047,4.6629,-3.457;2.4305,2.0957,-3.1758;1.3186,1.4093,-3.6205;0.9829,1.8049,-4.8682;0.6381,0.3671,-2.9078;-0.1794,-0.1695,-3.3785;1.0639,0.0887,-1.6603;0.5704,-0.6801,-1.0757;2.1457,0.9008,-0.975;1.5674,1.6606,-0.4045;2.9725,0.1799,0.076;2.3477,-0.9432,0.8558;3.0113,-1.2739,1.6592;2.1355,-1.8076,0.213;1.3924,-0.6398,1.3058;4.2117,0.6304,0.3259;4.8288,0.1943,1.1066;4.8097,1.777,-0.3958;5.8305,2.3143,0.0064;4.1955,2.2149,-1.5891;3.0228,1.7632,-1.8994)|\",3.9184394472\r\n\"[H]O/C(=N/C([H])([H])C#N)[C@@]([H])(N([H])[H])C([H])([H])C1=C([H])C([H])=C([H])S1 
|(3.2889,-3.4076,-0.2976;3.1049,-3.789,-1.1699;1.848,-3.4361,-1.5687;1.5368,-3.7964,-2.7483;0.2092,-3.4963,-3.2729;-0.0383,-2.4248,-3.2387;0.1901,-3.7862,-4.3291;-0.8628,-4.238,-2.5798;-1.6636,-4.824,-1.976;0.9886,-2.7577,-0.4915;0.0868,-2.3683,-0.969;0.5421,-3.6998,0.5441;1.3134,-4.2913,0.8513;-0.1664,-4.3227,0.1577;1.6982,-1.5487,0.1712;1.0147,-1.1802,0.9423;2.5944,-1.8784,0.7186;2.0839,-0.4604,-0.7918;3.2495,-0.3086,-1.4976;4.0863,-0.9939,-1.4069;3.2473,0.8226,-2.3699;4.0852,1.0997,-3.0003;2.0746,1.5223,-2.3253;1.8039,2.4178,-2.8681;0.9529,0.8091,-1.209)|\",5.831399816215001\r\n\"[H]O/C(=N/C1(C#N)C([H])([H])C1([H])[H])[C@@]1([H])N([H])C([H])([H])[C@]([H])(F)C1([H])[H] |(-1.2833,0.4337,0.8766;-1.1725,0.2539,-0.0946;-2.3777,-0.2056,-0.4721;-2.5157,-0.5791,-1.6823;-3.7032,-1.0649,-2.2988;-4.8061,-1.6276,-1.5382;-5.682,-2.0862,-0.9238;-3.4758,-1.7232,-3.6641;-2.4305,-1.8056,-3.9412;-4.1054,-2.5671,-3.928;-4.0903,-0.3635,-3.609;-5.1483,-0.2587,-3.8296;-3.4713,0.4926,-3.8553;-3.4235,-0.1628,0.6668;-3.9176,-1.1334,0.728;-2.7318,0.1772,1.9442;-2.6228,-0.6351,2.5441;-3.4499,1.2741,2.6121;-4.2909,0.9295,3.2323;-2.7718,1.8605,3.2397;-4.022,2.0619,1.4338;-3.2598,2.7093,0.9845;-5.0769,2.869,1.8332;-4.4732,0.9626,0.4695;-4.5475,1.3017,-0.5661;-5.4622,0.6138,0.7832)|\",6.7484234924\r\n\"[H]C1=NC2=C(S1)[C@]1([H])C(=O)C([H])=C([H])N=C1/C([H])=C\\2[H] |(-0.7583,3.5609,-0.8499;-0.3681,2.5839,-0.5867;-1.1158,1.5778,-0.2472;-0.3245,0.4738,0.0034;1.0363,0.6529,-0.1571;1.3613,2.2798,-0.648;2.0099,-0.4279,0.1489;2.3805,-0.2447,1.1829;3.3504,-0.4039,-0.6085;3.7674,0.6276,-1.1184;4.0635,-1.6654,-0.5423;5.0744,-1.7042,-0.9345;3.4236,-2.7939,-0.1148;3.962,-3.7398,-0.1008;2.0989,-2.902,0.254;1.3803,-1.8085,0.2851;-0.0398,-1.8982,0.5138;-0.4345,-2.8839,0.7366;-0.8538,-0.8152,0.3759;-1.9285,-0.8999,0.5072)|\",3.3987019927449995\r\n\"[H]N1N([H])[C@@]2([H])C([H])([H])C([H])([H])[C@]3([H])N([H])C([H])([H])C([H])([H])C(=O)[C@]3([H])[C@@]2([H])C1([H])[H] 
|(1.0477,3.424,-0.7322;0.8891,3.1773,0.2572;1.8652,2.1825,0.6357;2.7955,2.5782,0.5151;1.6087,1.0146,-0.222;1.9022,1.2129,-1.2723;2.3156,-0.2585,0.2524;3.3432,-0.2918,-0.129;2.3813,-0.2415,1.3482;1.5174,-1.4862,-0.2052;2.0878,-2.4093,-0.0407;1.3322,-1.42,-1.2873;0.1776,-1.5799,0.559;0.4086,-1.9738,1.5577;-0.7914,-2.5145,-0.0343;-0.3074,-3.3525,-0.3484;-1.5966,-1.9404,-1.1103;-1.001,-1.416,-1.8809;-2.1225,-2.7517,-1.6245;-2.6054,-0.9838,-0.4633;-3.4542,-1.5388,-0.0505;-3.017,-0.2693,-1.1896;-1.9925,-0.1986,0.6956;-2.6909,0.4217,1.4758;-0.4681,-0.1706,0.787;-0.227,0.1605,1.8011;0.0513,0.9332,-0.1774;-0.3241,0.7412,-1.1887;-0.3457,2.3749,0.2869;-0.7269,2.3512,1.3111;-1.1044,2.8423,-0.3469)|\",5.562007104220001\r\n\"[H]O[C@@]([H])(/C([H])=C([H])/C([H])=C(\\[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])C([H])([H])[H] |(9.4923,-2.1363,-6.6174;10.2898,-1.7089,-6.2626;10.6673,-2.4451,-5.1007;11.5516,-1.9111,-4.7238;9.5991,-2.4365,-4.0318;9.7612,-3.1179,-3.1952;8.4979,-1.6679,-4.0562;8.3653,-0.9779,-4.8891;7.4631,-1.6738,-3.0338;7.6028,-2.364,-2.1996;6.3672,-0.8956,-3.0599;6.238,-0.2126,-3.9008;5.2965,-0.8802,-2.0055;4.3217,-1.095,-2.4715;5.4834,-1.6898,-1.2883;5.1797,0.4547,-1.2341;4.4437,0.3215,-0.4285;6.1403,0.6681,-0.7454;4.7647,1.6563,-2.093;5.5096,1.8257,-2.8829;3.8194,1.4265,-2.6082;4.5965,2.9485,-1.2828;3.8478,2.7867,-0.4942;5.5391,3.1744,-0.7642;4.1854,4.1496,-2.1404;4.0736,5.0556,-1.5339;3.2291,3.9672,-2.6462;4.9339,4.3576,-2.9149;11.0931,-3.8746,-5.4663;11.473,-4.4131,-4.5902;11.8781,-3.8481,-6.2282;10.2407,-4.4414,-5.8626)|\",5.333431469800001\r\n\"[H]OC(=N/C1=C(C([H])([H])[H])C([H])=C([H])C([H])=C1C([H])([H])[H])/C([H])=C(\\[H])C(=O)OC([H])([H])[H] 
|(2.1981,-2.3367,2.9493;2.9271,-1.7519,3.2088;2.9364,-0.6614,2.3906;4.0085,0.0237,2.3575;4.1124,1.2262,1.6229;4.8821,1.2153,0.4391;5.4799,-0.0765,-0.0618;4.7116,-0.7651,-0.4385;6.0036,-0.6044,0.7435;6.1851,0.1091,-0.8779;5.0689,2.4169,-0.2489;5.6534,2.4118,-1.1661;4.5293,3.611,0.227;4.6908,4.5387,-0.3151;3.7975,3.6101,1.4122;3.3897,4.5415,1.7983;3.576,2.4297,2.1302;2.8132,2.4544,3.4346;2.6745,3.4835,3.7799;3.3439,1.8978,4.2155;1.8165,2.0034,3.3435;1.7017,-0.3723,1.6297;1.805,0.2879,0.773;0.4839,-0.8416,1.9428;0.2936,-1.4564,2.8189;-0.6966,-0.4978,1.1099;-0.6749,0.1736,0.0989;-1.8171,-1.0436,1.6328;-3.0271,-0.7731,0.9051;-2.9548,-1.1609,-0.1145;-3.2149,0.3029,0.8632;-3.819,-1.2812,1.4556)|\",3.39053857723\r\n\"[H]C1=C([H])C(C([H])([H])[H])=C(C([H])([H])[H])C([H])=C1N([H])C(=O)/C([H])=C(\\[H])C(=O)OC([H])([H])[H] |(3.6974,-2.1987,-0.27;3.4676,-1.1442,-0.2197;2.1519,-0.6915,-0.1295;1.3519,-1.4282,-0.1107;1.821,0.6651,-0.0659;0.3786,1.1012,0.0286;-0.2923,0.237,0.0428;0.0865,1.7347,-0.8195;0.1868,1.686,0.9379;2.8686,1.6114,-0.0922;2.5788,3.0926,-0.0252;3.502,3.6797,-0.0508;2.04,3.3586,0.8933;1.9518,3.4207,-0.8642;4.1858,1.1643,-0.1823;4.9905,1.8984,-0.2032;4.5021,-0.2017,-0.247;5.8677,-0.5438,-0.3366;6.5035,0.2429,-0.3368;6.445,-1.7904,-0.4319;5.8208,-2.8462,-0.4494;7.9346,-1.7366,-0.513;8.4481,-0.777,-0.4924;8.66,-2.8562,-0.6143;8.1764,-3.8273,-0.6389;10.1394,-2.7925,-0.6933;10.805,-1.775,-0.6777;10.666,-4.0333,-0.7857;12.0988,-4.0871,-0.8676;12.3487,-5.1465,-0.9293;12.5506,-3.6348,0.0193;12.4519,-3.5552,-1.7553)|\",3.757892275405\r\n\"[H]C1=NC([H])=C([H])C(C(=O)N([H])N([H])/C([H])=C(\\[H])C2=C([H])C([H])=C([H])C([H])=C2[H])=C1[H] 
|(8.2888,-1.1303,4.5093;8.3767,-1.2853,3.4354;9.6201,-1.3943,2.9547;9.7464,-1.6006,1.6367;10.7646,-1.6917,1.2633;8.6673,-1.7018,0.7614;8.8086,-1.8808,-0.2986;7.3709,-1.5708,1.2695;6.2356,-1.682,0.2903;6.3742,-2.1988,-0.8135;5.0316,-1.1558,0.6853;4.8384,-0.951,1.6557;3.8963,-1.3904,-0.0972;4.0816,-2.1992,-0.6877;3.3873,-0.2905,-0.8155;2.6274,-0.6059,-1.5254;3.6939,1.0026,-0.619;4.4926,1.2365,0.0815;3.0812,2.1496,-1.2972;1.8979,2.0669,-2.0567;1.3736,1.1188,-2.1413;1.3726,3.1891,-2.6915;0.4579,3.0962,-3.2716;2.0044,4.4308,-2.5773;1.5883,5.3056,-3.0693;3.1694,4.5346,-1.8167;3.6711,5.4935,-1.7135;3.6978,3.41,-1.1858;4.6099,3.5011,-0.6002;7.2281,-1.3641,2.6463;6.2559,-1.2877,3.125)|\",3.616393073145\r\n\"[H]C1N=C(OC([H])([H])[H])C(OC([H])([H])[H])C(=O)[C@@]1([H])C(=O)[C@]([H])(C([H])([H])[H])C([H])([H])C([H])([H])[H] |(6.5537,-0.118,-2.2953;6.8157,0.6854,-1.6069;7.3013,1.7679,-2.0954;7.6331,2.8073,-1.2336;8.1471,3.9113,-1.7917;8.315,3.9548,-3.2152;8.7443,4.9381,-3.4122;7.3566,3.8503,-3.7305;8.9931,3.1694,-3.5596;7.4538,2.7902,0.1338;7.7206,3.9229,0.8439;8.6164,3.789,1.9557;9.5848,3.3897,1.6263;8.1912,3.1457,2.7286;8.7573,4.801,2.3411;6.8642,1.6233,0.7669;6.6344,1.5471,1.9763;6.5697,0.4338,-0.1481;7.271,-0.3614,0.152;5.1708,-0.2675,-0.0105;5.0224,-1.3094,-0.6163;4.0587,0.3395,0.8388;4.2805,1.3951,1.0238;4.0499,-0.3891,2.2021;3.2484,0.0165,2.8291;4.9989,-0.2449,2.724;3.8708,-1.461,2.063;2.701,0.214,0.1202;2.5249,-0.842,-0.1138;1.9202,0.5184,0.829;2.5895,1.056,-1.1554;1.5926,0.9626,-1.5999;3.3151,0.7363,-1.9127;2.7633,2.1194,-0.949)|\",3.6817003972650006\r\n\"[H]O[C@@]1([H])[C@@]2([H])N(C([H])([H])[C@]([H])(O[H])[C@]2([H])O[H])[C@]([H])(O[H])C([H])([H])[C@@]1([H])O[H] 
|(-1.3258,-2.5503,-1.5011;-0.8858,-2.7316,-0.6479;-0.9345,-1.4874,0.0507;-0.9281,-1.7138,1.1234;0.2845,-0.6483,-0.3049;0.3092,-0.529,-1.4086;0.2803,0.6396,0.3824;1.6424,1.2107,0.229;1.6968,1.9453,-0.5862;1.9473,1.7087,1.1546;2.5522,0.0119,-0.1274;3.4577,-0.0204,0.4949;2.8958,0.0924,-1.5029;3.0054,-0.8334,-1.7903;1.6425,-1.2166,0.0975;1.6246,-1.4846,1.1651;2.072,-2.3075,-0.6967;1.2792,-2.8553,-0.852;-0.8369,1.4826,0.0043;-0.7713,1.7928,-1.0575;-0.8074,2.7049,0.7096;-0.7922,2.4756,1.6551;-2.1542,0.7199,0.269;-2.2672,0.6472,1.3593;-2.9959,1.3098,-0.109;-2.2095,-0.701,-0.3086;-3.0868,-1.2304,0.0769;-2.3882,-0.7292,-1.7355;-1.7819,-0.086,-2.1367)|\",7.409660149115001\r\n\"[H]/N=C(/O[H])[C@@]([H])(O[H])[C@@]([H])(C(=O)O[H])N([H])[H] |(-1.4968,1.4101,2.3219;-2.0657,0.6431,1.9563;-1.3189,-0.3473,1.6528;-1.863,-1.4907,1.211;-1.123,-2.1364,1.1801;0.2092,-0.3582,1.5952;0.6217,0.2784,2.3914;0.6461,-1.6928,1.7293;1.4651,-1.725,1.1737;0.69,0.1686,0.1842;-0.0019,-0.2429,-0.5603;0.6859,1.7063,0.0889;1.715,2.3375,0.0282;-0.5088,2.3235,0.0771;-1.2456,1.7002,0.2467;2.0202,-0.3841,-0.0553;2.7106,0.3193,0.2136;2.1623,-0.5595,-1.0468)|\",7.123940606090001\r\n\"[H]O[C@@]([H])(C([H])([H])OC([H])=C([H])[H])C([H])([H])O[P@](=O)(O[H])OC([H])([H])C([H])([H])N([H])[H] 
|(3.8681,-2.6888,2.2031;4.135,-3.4868,1.6843;4.7921,-3.0841,0.5017;5.4654,-2.2304,0.6776;5.64,-4.2778,0.0647;4.9923,-5.1463,-0.1213;6.3306,-4.5327,0.8791;6.3546,-3.9199,-1.1164;7.2571,-4.8377,-1.5678;7.4802,-5.6484,-0.8731;7.8444,-4.7366,-2.7596;8.5798,-5.4712,-3.065;7.6104,-3.9243,-3.4399;3.8082,-2.6846,-0.5983;3.0554,-3.4657,-0.738;4.3283,-2.4884,-1.5392;3.1566,-1.4579,-0.1713;1.6612,-1.0837,-0.6055;0.7384,-2.1985,-0.916;1.9153,-0.0173,-1.7909;1.1282,0.0446,-2.3576;1.2218,-0.1155,0.6084;0.5997,-0.6815,1.7904;0.3914,0.1853,2.4239;-0.3461,-1.1539,1.5043;1.5052,-1.6907,2.4987;1.5968,-2.5882,1.8784;1.0199,-2.0059,3.4295;2.862,-1.2108,2.8058;3.1778,-0.6038,2.0508;2.8713,-0.6589,3.6611),wD:17.17|\",6.963393434295\r\n\"[H]OC1=NC([H])([H])C([H])([H])N([C@@]2([H])O[C@]([H])(C([H])([H])O[H])[C@@]([H])(O[H])[C@@]2([H])O[H])C1([H])[H] |(4.225,-0.8642,-3.311;3.4599,-1.4537,-3.2201;2.6409,-0.9825,-2.2344;1.6854,-1.7222,-1.8588;0.7826,-1.2069,-0.8376;-0.2379,-1.5261,-1.0864;1.0334,-1.695,0.1157;0.8182,0.3162,-0.6723;0.2551,0.6344,0.2114;0.3566,0.7976,-1.5436;2.1976,0.8277,-0.5725;2.8636,0.5684,0.6977;2.0731,0.4745,1.4499;3.6548,-0.6208,0.7089;5.0268,-0.353,1.0946;5.6621,-0.941,0.4252;5.2226,-0.8284,2.5346;4.6598,-0.1725,3.2238;6.2817,-0.7616,2.8074;4.839,-2.1801,2.6794;3.9831,-2.2625,2.2249;5.2516,1.1443,0.895;5.9595,1.5474,1.6339;5.7188,1.4009,-0.419;5.3405,2.2771,-0.6369;3.8337,1.7382,1.0808;3.6787,2.0063,2.1334;3.6446,2.877,0.2685;2.8821,2.6267,-0.3053;2.9536,0.4329,-1.7714;2.7022,1.1235,-2.5907;4.0241,0.5407,-1.5729)|\",6.88992269466\r\n\"[H]C(NNC1C([H])=C([H])C(Cl)=NN1[H])C1([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C1([H])[H] 
|(-0.5766,-0.2681,1.4327;-0.9445,-1.0337,0.7415;-1.6197,-2.0224,1.202;-1.801,-1.9817,2.589;-2.4617,-3.029,3.005;-2.781,-3.218,4.4012;-2.4514,-2.4505,5.0925;-3.4651,-4.3179,4.7997;-3.7229,-4.4942,5.8377;-3.8554,-5.275,3.7966;-4.7369,-6.7041,4.2914;-3.5947,-5.1491,2.5364;-2.9143,-4.0471,2.1755;-2.7102,-3.9465,1.1876;-0.6532,-0.9206,-0.7287;-1.1551,-1.772,-1.2039;0.8669,-1.0229,-1.0251;1.2796,-1.9142,-0.5366;0.9881,-1.1731,-2.1078;1.6511,0.2323,-0.609;1.6412,0.3341,0.4855;2.7045,0.1198,-0.8953;1.0597,1.5,-1.2437;1.1881,1.4498,-2.3355;1.6093,2.3878,-0.9059;-0.4351,1.6423,-0.9193;-0.8517,2.5227,-1.4251;-0.5577,1.8228,0.1581;-1.2238,0.3889,-1.3339;-1.1888,0.2957,-2.429;-2.2816,0.4932,-1.0623)|\",3.654489012215\r\n\"[H]O[C@@]([H])(C(=C([H])[H])C([H])([H])[H])C([H])([H])[C@]([H])(O[H])[C@]([H])(O[H])C([H])(C([H])([H])[H])C([H])([H])[H] |(1.4774,3.9991,-1.3789;1.1975,3.6389,-0.5222;2.351,3.0747,0.1202;2.7198,2.2294,-0.4872;3.4806,4.0828,0.2687;3.2576,5.3969,0.1705;4.065,6.1174,0.2705;2.2607,5.7927,0.0031;4.852,3.5059,0.5194;5.6071,4.2939,0.5887;5.1476,2.8183,-0.285;4.884,2.9278,1.452;1.8689,2.5107,1.4673;1.5005,3.3408,2.0807;2.7314,2.0735,1.9854;0.7746,1.4451,1.3126;1.1029,0.723,0.543;-0.4734,2.0123,0.9129;-0.2786,2.6031,0.1626;0.4995,0.6081,2.5782;1.3786,-0.0363,2.7217;-0.5843,-0.2749,2.299;-1.2228,0.2598,1.795;0.2804,1.3777,3.9064;1.133,2.0618,4.0309;-1.0127,2.2068,3.9386;-1.8857,1.5554,3.8175;-1.0433,2.9548,3.1424;-1.112,2.7178,4.9038;0.3078,0.3799,5.0765;0.1597,0.8932,6.0339;1.2642,-0.1556,5.1264;-0.4875,-0.3633,4.9579)|\",6.83005764755\r\n\"[H]O[C@@]([H])(C([H])(C([H])([H])[H])C([H])([H])[H])[C@]([H])(O[H])C([H])([H])[C@]([H])(O[H])C1=C([H])C([H])=C([H])C([H])=C1[H] 
|(4.2807,0.3885,-0.2534;3.3454,0.1199,-0.2304;3.3415,-1.1533,0.4111;2.3367,-1.2664,0.8421;3.5313,-2.3096,-0.6051;3.5713,-3.2457,-0.0286;2.3023,-2.3803,-1.5267;2.4078,-3.1901,-2.2575;2.1808,-1.4378,-2.0714;1.3813,-2.5584,-0.9571;4.8183,-2.198,-1.437;4.895,-3.0406,-2.1348;5.7134,-2.1794,-0.8106;4.8078,-1.2767,-2.0312;4.3221,-1.0938,1.6002;3.8999,-0.3482,2.2978;5.5642,-0.6018,1.0936;6.1947,-0.6407,1.8359;4.4978,-2.4168,2.3566;3.518,-2.8803,2.5287;5.0882,-3.1145,1.752;5.179,-2.2445,3.7231;4.5645,-1.5585,4.3282;6.4572,-1.63,3.4698;6.9046,-1.4986,4.3206;5.3292,-3.5566,4.4731;6.419,-4.3984,4.2175;7.1732,-4.0847,3.5014;6.5407,-5.62,4.8801;7.3931,-6.2621,4.6738;5.5718,-6.0173,5.8033;5.6681,-6.9686,6.3198;4.4828,-5.1838,6.0644;3.7281,-5.4824,6.7872;4.3661,-3.959,5.4058;3.5188,-3.3102,5.6209)|\",6.23684945346\r\n\"[H]O[C@@]([H])(C([H])([H])C([H])([H])C([H])([H])[H])C([H])([H])[C@]([H])(O[H])[C@]([H])(O[H])C([H])(C([H])([H])[H])C([H])([H])[H] |(7.5437,1.7603,-0.3179;6.6871,1.8454,-0.7646;5.6557,1.3962,0.1433;5.9,0.3735,0.4749;5.5994,2.2947,1.3882;4.8592,1.8746,2.0844;6.5703,2.2154,1.9035;5.2959,3.7757,1.1263;5.9839,4.1496,0.3584;4.2852,3.8819,0.7122;5.4135,4.6293,2.3935;5.1924,5.6821,2.1855;6.4256,4.5788,2.8141;4.7162,4.2906,3.1698;4.3554,1.3284,-0.663;4.0946,2.3255,-1.0337;3.5542,1.0081,0.017;4.4414,0.3505,-1.8429;4.86,-0.6008,-1.4673;5.2823,0.8467,-2.8859;6.09,1.1796,-2.4514;3.0926,-0.0189,-2.4934;2.5622,-0.6535,-1.7686;3.3632,-0.857,-3.6157;4.1445,-0.46,-4.0399;2.1357,1.1411,-2.8655;2.0008,1.7567,-1.9628;2.6658,2.0429,-3.9907;1.9444,2.8373,-4.2179;2.8137,1.4609,-4.9079;3.6229,2.5034,-3.7338;0.7619,0.5618,-3.2413;0.0578,1.3597,-3.504;0.327,-0.0113,-2.4127;0.8548,-0.1112,-4.1004)|\",7.8368788944\r\n\"[H]O[C@@]([H])(C([H])(C([H])([H])[H])C([H])([H])[H])C([H])([H])[C@]([H])(O[H])[C@]([H])(O[H])C([H])(C([H])([H])[H])C([H])([H])[H] 
|(5.0138,-0.9471,2.1203;4.7299,-1.8567,2.304;3.3932,-1.7912,2.8179;3.0961,-2.8382,2.9288;3.3493,-1.1439,4.2222;2.3014,-1.1723,4.5507;4.1732,-1.9789,5.2141;4.1363,-1.5457,6.2206;5.2205,-2.0304,4.8984;3.7928,-3.0053,5.2752;3.8125,0.3225,4.2288;3.7519,0.7403,5.2398;3.2034,0.9589,3.5761;4.8599,0.4123,3.9103;2.4715,-1.1589,1.757;2.6141,-1.738,0.8382;2.7992,-0.1311,1.5404;0.9913,-1.1271,2.1308;0.8603,-0.4764,3.0119;0.5882,-2.45,2.4698;-0.3808,-2.41,2.5487;0.0714,-0.5167,1.0515;0.3218,0.557,0.9952;-1.2414,-0.6688,1.6159;-1.8923,-0.3513,0.9724;0.1609,-1.0848,-0.3843;1.2099,-0.9756,-0.6921;-0.2271,-2.5672,-0.5003;-0.1345,-2.9038,-1.5395;-1.2683,-2.72,-0.1924;0.3991,-3.1987,0.1328;-0.6766,-0.229,-1.3532;-0.5304,-0.5614,-2.3866;-0.4048,0.8325,-1.3005;-1.754,-0.313,-1.1497)|\",8.258655362675\r\n\"[H]C1=C([H])C([H])=C2C(=C1[H])N([H])C([H])([H])[C@]1([H])N2C([H])([H])[C@]([H])(C([H])(C([H])([H])[H])C([H])([H])[H])N([H])C1([H])[H] |(11.5016,-3.4262,1.1592;10.4845,-3.0801,1.3241;9.4099,-3.9645,1.1932;9.578,-5.0043,0.9289;8.1169,-3.5018,1.4339;7.2674,-4.1779,1.3778;7.8739,-2.1689,1.774;8.9606,-1.2678,1.8802;10.2648,-1.75,1.6686;11.1038,-1.062,1.7581;8.7308,0.0671,2.1729;9.5312,0.633,2.4083;7.4085,0.5755,2.4877;7.4888,1.3377,3.2734;6.952,1.0662,1.6118;6.514,-0.5845,2.9703;6.9459,-0.95,3.9116;6.5426,-1.7156,2.0324;5.7248,-1.5307,0.8182;6.1469,-0.752,0.1561;5.752,-2.4706,0.2566;4.279,-1.1745,1.1942;3.9034,-2.0065,1.8075;3.3362,-1.0375,-0.0249;3.5126,-1.9264,-0.649;1.8666,-1.0558,0.4229;1.1919,-0.9628,-0.4363;1.6664,-0.2287,1.1119;1.6221,-1.9901,0.9422;3.6175,0.2013,-0.8941;2.993,0.1819,-1.7945;4.6618,0.257,-1.2213;3.3755,1.1277,-0.3583;4.2249,0.0163,2.057;4.5595,0.8223,1.5295;5.058,-0.1628,3.2467;5.0442,0.7651,3.8318;4.5897,-0.9418,3.8606)|\",5.26812414568\r\n\"[H]C1=C([H])C2=C(C([H])=C1[H])N1C([H])([H])[C@@]([H])(C([H])([H])[H])N([H])C([H])([H])[C@]1([H])C([H])([H])N2[H] 
|(8.0684,0.1703,7.5422;7.5393,0.2005,6.5938;7.4838,1.3894,5.8693;7.976,2.2843,6.2457;6.8135,1.464,4.642;6.1679,0.3092,4.1221;6.2305,-0.8738,4.8734;5.7309,-1.7668,4.5184;6.9162,-0.9373,6.0893;6.9452,-1.8747,6.6373;5.4877,0.4087,2.879;4.6318,-0.7226,2.5218;5.2399,-1.6327,2.5448;3.817,-0.8582,3.2615;4.0188,-0.5809,1.1211;4.8424,-0.5998,0.393;3.0626,-1.7322,0.8139;2.6581,-1.625,-0.1969;2.2199,-1.7446,1.5176;3.5713,-2.6999,0.8869;3.3391,0.705,0.9316;2.5294,0.7456,1.5539;4.2467,1.8042,1.2401;5.0318,1.8064,0.4718;3.7031,2.7525,1.1518;4.9094,1.7371,2.6269;4.1479,1.973,3.3963;6.0284,2.7823,2.7076;5.5908,3.7868,2.6542;6.6751,2.66,1.8258;6.8014,2.6572,3.9263;6.9407,3.495,4.4709)|\",5.194653406045\r\n\"[H]O[C@@]1([H])C(=C([H])[H])C([H])([H])C([H])([H])[C@@]([H])(O[H])[C@]2(C([H])([H])[H])C([H])([H])C([H])([H])[C@@]([H])(C([H])(C([H])([H])[H])C([H])([H])[H])[C@@]21[H] |(1.3808,-3.4636,-1.8666;1.6645,-2.5367,-1.8098;3.0687,-2.5138,-2.0871;3.6128,-3.0257,-1.2736;3.3715,-3.2666,-3.3761;2.4422,-3.4353,-4.3239;2.6517,-4.0032,-5.2271;1.4482,-3.0102,-4.2241;4.76,-3.8434,-3.4928;5.0125,-4.3669,-2.5584;4.7817,-4.597,-4.2887;5.8507,-2.7979,-3.7782;5.7054,-2.3965,-4.7902;6.8253,-3.2999,-3.7783;5.9598,-1.6326,-2.783;6.0464,-2.0383,-1.7663;7.2268,-0.9923,-2.9976;7.2322,-0.6607,-3.9099;4.8082,-0.5727,-2.7843;4.496,-0.098,-4.219;5.4073,0.1825,-4.7654;3.8612,0.795,-4.1873;3.9677,-0.8574,-4.8035;5.2852,0.6468,-1.9232;6.1431,0.3553,-1.3096;5.6266,1.4766,-2.5514;4.1118,1.0198,-1.0181;3.372,1.6271,-1.5588;4.4288,1.5967,-0.1436;3.5033,-0.3472,-0.6591;4.2595,-0.8795,-0.0567;2.1826,-0.3132,0.167;1.3421,-0.3448,-0.5376;2.0615,-1.5326,1.0973;1.1032,-1.52,1.6311;2.8589,-1.5246,1.8533;2.1178,-2.4716,0.5436;2.0357,0.9645,1.0144;1.0996,0.9303,1.5849;2.0153,1.8728,0.4038;2.8549,1.0641,1.7385;3.4632,-1.0185,-2.0697;2.6502,-0.5292,-2.625)|\",6.99332595785\r\n\"[H]OC([H])([H])/C(=C(\\[H])[C@@]([H])(C([H])([H])[H])[C@]([H])(O[H])C([H])([H])C1(C([H])([H])[H])OC([H])([H])C([H])([H])O1)C([H])([H])[H] 
|(0.6055,4.4671,-0.5182;-0.2141,4.0538,-0.2023;0.1603,2.9963,0.6633;0.6407,3.385,1.5799;-0.7826,2.5361,0.988;1.0635,1.9374,0.0535;1.341,1.9472,-1.258;0.8772,2.7248,-1.8627;2.2211,0.9943,-2.0305;2.7646,0.335,-1.3487;1.3561,0.1072,-2.9504;0.6572,-0.4849,-2.3518;1.9617,-0.5905,-3.5412;0.7535,0.7122,-3.6395;3.2982,1.796,-2.8127;3.7587,2.4963,-2.1072;2.7124,2.6539,-3.7996;2.333,2.0872,-4.4901;4.4131,0.9533,-3.4638;5.0206,1.6366,-4.0674;3.9744,0.2298,-4.1647;5.3845,0.1902,-2.5356;6.0597,1.0583,-1.4844;5.3328,1.4453,-0.7652;6.5679,1.9018,-1.9609;6.7951,0.4519,-0.9505;4.7482,-0.9039,-1.8715;4.8222,-2.0386,-2.728;5.0073,-2.9183,-2.1042;3.8771,-2.1899,-3.2673;5.9888,-1.7184,-3.695;5.6661,-1.7532,-4.7445;6.8409,-2.3949,-3.5735;6.4092,-0.4081,-3.3326;1.5719,0.9298,1.0565;2.1882,0.1526,0.6002;0.7369,0.433,1.5705;2.1697,1.4181,1.8388)|\",7.01509506589\r\n\"[H]O[C@]1(C([H])([H])[H])/C(=C(/[H])C([H])=C(C([H])([H])[H])C([H])([H])[H])O[C@]2([H])C([H])=C(C([H])([H])[H])C([H])([H])C([H])([H])[C@]21[H] |(2.9517,-4.1354,3.4748;3.3634,-3.8985,2.6282;4.4377,-2.9994,2.9209;3.9525,-1.9068,3.8872;4.7481,-1.1978,4.1376;3.5969,-2.3533,4.8262;3.1306,-1.3462,3.4335;4.9691,-2.402,1.6056;4.3051,-1.8697,0.562;4.9318,-1.4807,-0.2367;2.8627,-1.7935,0.4191;2.287,-2.3049,1.186;2.1761,-1.1822,-0.5723;0.6681,-1.2085,-0.5873;0.2857,-1.6765,-1.5068;0.2492,-0.1913,-0.5646;0.2609,-1.7613,0.2654;2.8029,-0.4407,-1.7256;2.4486,0.5999,-1.7575;2.5127,-0.8934,-2.6849;3.8943,-0.4172,-1.6851;6.3536,-2.4453,1.5801;6.8056,-2.8306,2.8901;6.847,-1.9125,3.5034;8.1375,-3.5108,2.9272;8.9065,-3.1847,2.2297;8.3878,-4.4518,3.8506;9.7297,-5.1295,3.9559;9.6316,-6.2183,3.8431;10.1866,-4.9584,4.9406;10.4256,-4.7704,3.1915;7.3374,-4.9302,4.8474;7.1463,-5.9953,4.6429;7.7666,-4.9066,5.8592;5.9942,-4.1543,4.8293;6.0666,-3.2651,5.4692;5.2002,-4.7852,5.2484;5.7034,-3.7487,3.3906;5.741,-4.649,2.7621)|\",4.857232231425001\r\n\"[H]OC([H])([H])[C@]1(C([H])([H])[H])O[C@]1([H])C([H])([H])C1(C([H])([H])[H])OC([H])([H])C([H])([H])O1 
|(4.3169,1.3654,-1.1061;4.2472,2.0402,-0.4127;4.3003,1.3674,0.842;5.2272,0.7921,0.9633;4.3075,2.1586,1.5994;3.0901,0.4724,1.0686;1.7661,1.2037,1.1306;1.6204,1.7986,0.2232;1.7541,1.8883,1.9875;0.9325,0.504,1.2394;3.28,-0.5413,2.0771;3.115,-0.9639,0.7082;2.1452,-1.4284,0.5177;4.2645,-1.6918,0.0474;5.2271,-1.2544,0.3282;4.1662,-1.6097,-1.0433;4.3017,-3.1941,0.3834;4.4642,-3.4907,1.8703;3.6109,-3.1108,2.4377;5.3719,-3.0149,2.2529;4.5381,-4.5719,2.0198;3.1033,-3.8367,-0.0766;3.422,-4.5679,-1.2567;2.7547,-5.4322,-1.3171;3.2922,-3.9497,-2.1579;4.8891,-4.9207,-1.0336;5.4629,-5.0606,-1.954;4.9937,-5.8164,-0.4037;5.3817,-3.7616,-0.3693)|\",8.39743342643\r\n\"[H]OC([H])([H])/C(=C(\\[H])C([H])([H])C1(C([H])([H])[H])OC([H])([H])C([H])([H])O1)C([H])([H])[H] |(3.2317,-1.0635,1.5952;4.16,-1.2886,1.4223;4.7579,-0.1308,0.8317;4.7878,0.7066,1.5459;5.7937,-0.4295,0.6281;4.0664,0.296,-0.4457;3.4954,1.5093,-0.5142;3.5747,2.1557,0.3603;2.7645,2.1195,-1.6785;1.8573,2.6203,-1.3157;2.4451,1.3613,-2.4003;3.5907,3.1693,-2.4504;2.7519,3.9145,-3.4883;1.97,4.5016,-2.9964;2.2821,3.2086,-4.1807;3.3868,4.5977,-4.0605;4.1556,4.1389,-1.5546;5.5707,3.9676,-1.5396;6.0393,4.9494,-1.4204;5.8761,3.3149,-0.7104;5.8438,3.3097,-2.8898;6.7239,2.661,-2.9027;5.9364,4.0589,-3.6906;4.6882,2.4999,-3.0723;4.0855,-0.7324,-1.548;3.4357,-0.4769,-2.3876;3.7884,-1.7106,-1.1536;5.104,-0.8452,-1.9444)|\",6.865432448115\r\n\"[H]OC1=C([H])C([H])=C(/C([H])=C(\\OC([H])([H])[H])C(=O)N([H])[H])C([H])=C1[H] 
|(-2.0284,-2.4537,0.6949;-1.1993,-2.8808,0.428;-0.2965,-1.92,0.0749;-0.5884,-0.5525,0.1185;-1.5712,-0.2137,0.4415;0.3804,0.3714,-0.258;0.1422,1.4316,-0.2257;1.6636,-0.0305,-0.6752;2.6204,1.004,-1.052;2.2289,2.016,-1.1247;3.9368,0.9202,-1.3283;4.6596,-0.2656,-1.3172;5.3142,-0.5261,-0.0662;4.5807,-0.6511,0.7387;6.0053,0.2856,0.1909;5.8731,-1.4557,-0.1989;4.691,2.1496,-1.7421;4.2048,3.275,-1.7138;5.9826,1.9047,-2.1392;6.2413,0.9619,-2.3939;6.4656,2.6711,-2.587;1.9299,-1.4172,-0.7184;2.8981,-1.7573,-1.0651;0.968,-2.3473,-0.3513;1.1734,-3.4124,-0.3921)|\",4.427292347635\r\n\"[H]/N=C(\\C1=C([H])C([H])=C([H])C([H])=C1[H])N([H])OC(=O)C([H])=C([H])[H] |(8.3365,2.6252,1.2944;8.0122,1.8372,0.7306;8.7162,0.8045,0.9892;9.9018,0.7112,1.8938;10.8404,1.7537,1.9122;10.7087,2.6022,1.2461;11.9442,1.6935,2.7603;12.6683,2.5036,2.7618;12.123,0.591,3.5996;12.9832,0.5454,4.2618;11.1936,-0.4505,3.5866;11.3237,-1.305,4.245;10.0888,-0.3956,2.7364;9.3518,-1.1925,2.7381;8.368,-0.4505,0.4498;9.1564,-0.9624,0.0642;7.416,-0.3828,-0.593;6.1166,-0.5353,-0.1485;5.8015,-0.7418,0.9948;5.2112,-0.4098,-1.3176;5.669,-0.2082,-2.2811;3.892,-0.5408,-1.1651;3.2094,-0.4533,-2.0048;3.4682,-0.738,-0.1846)|\",5.54023799618\r\n\"[H]C(=O)/C1=C(\\[H])N(/C([H])=C(\\[H])C([H])([H])[H])C2=C([H])C([H])=C(Cl)C([H])=C21 |(8.6924,2.4842,0.1691;7.6538,2.878,0.2063;7.4481,4.0821,0.2126;6.6057,1.8649,0.2456;5.2559,2.1436,0.2973;4.7852,3.1145,0.3184;4.5231,0.9843,0.3257;3.1197,0.861,0.3779;2.7886,-0.17,0.4295;2.2353,1.8631,0.3689;2.5768,2.8948,0.3129;0.7516,1.6519,0.4349;0.2517,2.0746,-0.4466;0.319,2.1513,1.3119;0.4955,0.5888,0.4917;5.4148,-0.0985,0.2932;5.1634,-1.4724,0.303;4.1571,-1.8771,0.3383;6.2535,-2.334,0.2639;6.1024,-3.4076,0.2707;7.5606,-1.819,0.2144;8.9079,-2.952,0.1652;7.8231,-0.4563,0.2026;8.8449,-0.0953,0.1636;6.7316,0.4238,0.2427)|\",4.38103299305\r\n\"[H]O[C@@]([H])(C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])[C@@]([H])(C([H])([H])[H])[C@]([H])(O[H])C([H])=C([H])[H] 
|(7.3886,3.7667,-2.8143;7.8277,3.3907,-2.0347;7.0447,2.2645,-1.6091;7.5428,1.9333,-0.69;5.6212,2.715,-1.2341;5.0886,1.8755,-0.7782;5.7309,3.489,-0.4628;4.7758,3.275,-2.3882;5.3047,4.1106,-2.8738;4.6317,2.5001,-3.1495;3.4002,3.7787,-1.9295;2.8582,2.9511,-1.4489;3.5291,4.5486,-1.1546;2.5479,4.3464,-3.072;3.0904,5.1742,-3.5508;2.4243,3.5761,-3.8462;1.1701,4.8373,-2.6149;0.5878,5.235,-3.4542;1.2618,5.6333,-1.8658;0.5896,4.0238,-2.1627;7.1615,1.1158,-2.6457;6.6739,1.4406,-3.5761;8.6399,0.8333,-2.9656;8.7376,0.0688,-3.7454;9.1852,0.4848,-2.0799;9.1334,1.7452,-3.3068;6.4411,-0.2016,-2.2513;6.7702,-0.9516,-2.9933;5.0347,-0.0106,-2.3887;4.6098,-0.8606,-2.1923;6.8307,-0.7328,-0.8899;7.8808,-1.0035,-0.7832;5.9988,-0.9246,0.1349;6.3457,-1.336,1.0783;4.9452,-0.6654,0.0736)|\",7.15659426815\r\n\"[H]C([H])=C(C(=O)N1C(=O)OC([H])([H])C1([H])[H])C([H])([H])/C([H])=C(\\[H])Cl |(0.1392,-1.6722,-2.0464;0.0452,-1.1174,-1.1178;-0.8317,-1.3201,-0.514;0.97,-0.2245,-0.7503;0.9051,0.4531,0.5803;1.9117,0.6666,1.2411;-0.3427,0.7855,1.1408;-1.4936,1.2501,0.4623;-1.7163,1.2673,-0.7175;-2.3638,1.7281,1.3938;-1.871,1.4704,2.7218;-2.1109,2.3324,3.3463;-2.3775,0.5823,3.1156;-0.3696,1.2449,2.5324;0.0454,0.489,3.2017;0.2165,2.1636,2.6459;2.228,0.0841,-1.5468;2.1208,1.0748,-2.0108;3.0558,0.1739,-0.8321;2.5718,-0.942,-2.5898;2.8067,-1.9459,-2.2411;2.6084,-0.6718,-3.8922;2.3815,0.2997,-4.3179;3.0417,-1.8562,-5.1082)|\",5.417786763455\r\n\"[H]O[C@]1(C([H])([H])[H])C(=O)C2=C(C([H])([H])[H])/C(C([H])([H])[H])=C(C([H])([H])[H])/C([H])=C\\2C1([H])[H] 
|(7.8191,3.0942,-1.7003;7.503,2.1766,-1.7192;6.9096,1.9052,-0.4492;7.849,2.3185,0.6858;7.4157,2.0958,1.6676;8.0473,3.3984,0.647;8.7939,1.7763,0.5888;6.7022,0.359,-0.4085;7.6284,-0.4267,-0.496;5.2602,0.0868,-0.2196;4.5906,-1.1476,-0.0969;5.3417,-2.4557,-0.1717;4.977,-3.0697,-1.0059;5.1922,-3.049,0.7399;6.4094,-2.2873,-0.307;3.1954,-1.1337,0.0952;2.4655,-2.4535,0.2261;2.5935,-3.0702,-0.6729;1.394,-2.327,0.3867;2.8562,-3.0442,1.0645;2.4971,0.0978,0.1626;0.9987,0.147,0.3689;0.6448,1.182,0.3835;0.7009,-0.3184,1.3165;0.458,-0.3763,-0.4291;3.1936,1.3066,0.0376;2.6468,2.2458,0.0881;4.5679,1.303,-0.1575;5.4806,2.4985,-0.3234;5.3871,3.1949,0.52;5.2377,3.0576,-1.2348)|\",4.9742411871400005\r\n\"[H]O[C@@]1([H])C2=C([H])C(C([H])([H])[H])=C(C([H])([H])[H])C(C([H])([H])[H])=C2C(=O)C1(C([H])([H])[H])C([H])([H])[H] |(4.9029,3.1307,-1.6474;5.1302,3.4597,-0.7631;5.6021,2.3542,-0.0032;5.6881,2.7567,1.0138;4.6684,1.1605,0.0021;3.2824,1.1803,0.068;2.7479,2.1264,0.0964;2.5794,-0.0288,0.0998;1.0708,-0.0043,0.1759;0.6956,1.0228,0.2007;0.7016,-0.5188,1.0723;0.6129,-0.5085,-0.6849;3.2764,-1.2642,0.0696;2.475,-2.5473,0.1154;1.7132,-2.5725,-0.6738;1.9392,-2.6497,1.0687;3.1003,-3.4325,-0.0045;4.684,-1.2901,0.0018;5.4559,-2.5914,-0.0308;5.1757,-3.2015,-0.8987;5.2537,-3.1947,0.863;6.5276,-2.3997,-0.0755;5.3617,-0.054,-0.0426;6.814,0.2378,-0.174;7.7513,-0.5433,-0.13;6.9822,1.7565,-0.4268;8.1629,2.3217,0.37;8.3094,3.3835,0.1417;9.08,1.7773,0.1249;7.9964,2.2261,1.4495;7.2257,1.9389,-1.9398;7.3067,3.0028,-2.1849;6.417,1.5044,-2.5419;8.1549,1.4384,-2.229)|\",4.955193217605\r\n\"[H]OC([H])([H])C([H])([H])C1=C(C([H])([H])[H])C([H])=C2C(=O)[C@]([H])(C([H])([H])[H])C([H])([H])C2=C1[H] 
|(2.8474,-1.8977,-2.1263;2.1681,-2.5747,-1.9749;1.0989,-1.9509,-1.2828;0.6626,-1.1334,-1.8788;0.3299,-2.7213,-1.1666;1.511,-1.4284,0.1105;1.8901,-2.2813,0.6849;0.6161,-1.0628,0.6264;2.5758,-0.3544,0.034;2.2352,1.0245,-0.0478;0.7949,1.4855,0.0062;0.7331,2.5708,-0.1159;0.3229,1.2339,0.9648;0.1829,1.0285,-0.7803;3.2572,1.9667,-0.1626;3.0297,3.0283,-0.2212;4.5901,1.5516,-0.1989;5.8095,2.3817,-0.3131;5.8653,3.5891,-0.4634;7.0279,1.439,-0.2019;7.4713,1.6422,0.7836;8.0811,1.7323,-1.2741;8.3476,2.7934,-1.2547;7.6981,1.4993,-2.275;8.9882,1.1399,-1.1121;6.4357,0.0041,-0.2116;6.8128,-0.6097,0.6152;6.7042,-0.5252,-1.1361;4.9342,0.2019,-0.1278;3.9223,-0.7504,-0.0113;4.1626,-1.8093,0.0595)|\",5.009615987705\r\n\"[H]OC([H])([H])C([H])([H])C1=C(C([H])([H])[H])C2=C(C([H])=C1C([H])([H])[H])C([H])([H])[C@]([H])(C([H])([H])[H])C2([H])[H] |(1.4578,1.819,-5.1459;1.7683,2.3932,-5.8653;2.8651,1.725,-6.4762;2.5315,0.7803,-6.9336;3.1977,2.3812,-7.2872;4.0393,1.4585,-5.5115;4.453,2.4222,-5.1999;4.83,0.9522,-6.0787;3.6455,0.6329,-4.2969;3.2706,1.2776,-3.091;3.3004,2.789,-2.9597;2.7885,3.2784,-3.7929;4.3311,3.1697,-2.9433;2.8213,3.1135,-2.0327;2.9006,0.4863,-1.9953;2.8767,-0.9095,-2.0871;3.2451,-1.5385,-3.2694;3.2427,-2.6246,-3.3396;3.6393,-0.7801,-4.3789;4.0807,-1.5057,-5.6334;3.9836,-2.5882,-5.5054;5.1314,-1.2986,-5.8761;3.4912,-1.2254,-6.5138;2.405,-1.523,-0.7876;1.3408,-1.7996,-0.8583;2.948,-2.4344,-0.5098;2.5975,-0.3762,0.2385;3.6389,-0.4257,0.5873;1.6724,-0.4421,1.4528;1.817,-1.373,2.0141;0.6192,-0.3966,1.1473;1.8566,0.3919,2.1405;2.4623,0.9187,-0.6075;3.0549,1.7415,-0.1917;1.4149,1.2614,-0.6195)|\",6.00827381904\r\n\"[H]OC1=C(C([H])([H])O[H])C(/C([H])=C(\\[H])C([H])([H])O[H])=C([H])C([H])=C1[H] 
|(-0.752,3.2303,-0.3925;0.1184,2.8289,-0.2467;-0.0273,1.4629,-0.1813;1.1525,0.7082,-0.0454;2.4768,1.4408,-0.0788;2.3936,2.3087,-0.7433;3.2559,0.7858,-0.4721;2.945,1.8508,1.2121;2.2677,2.4403,1.5802;1.0369,-0.6953,0.088;2.1771,-1.6026,0.3248;2.0432,-2.607,-0.0828;3.2845,-1.3666,1.0475;3.4469,-0.4007,1.5221;4.3513,-2.4022,1.2663;4.0729,-3.3484,0.7734;4.4667,-2.6125,2.3364;5.6397,-1.9535,0.8435;5.5535,-1.6848,-0.0849;-0.2346,-1.2937,-0.0078;-0.3155,-2.3745,0.0699;-1.3814,-0.5252,-0.1686;-2.3554,-1.0029,-0.2285;-1.286,0.863,-0.245;-2.1782,1.4767,-0.3562)|\",5.12662494342\r\n\"[H]OC([H])([H])C([H])([H])C1=C(C([H])([H])[H])C([H])=C2C(=C1[H])C(=O)[C@@]([H])(C([H])([H])O[H])[C@]2([H])O[H] |(2.6684,-0.9197,2.5331;1.9653,-1.5879,2.5698;2.2865,-2.5926,1.6161;3.2922,-3.0011,1.802;1.5659,-3.3998,1.7814;2.1896,-2.109,0.1554;1.1497,-1.8465,-0.0606;2.4469,-2.9553,-0.4947;3.1171,-0.9458,-0.1426;2.673,0.4053,-0.0628;1.234,0.7367,0.2663;1.1015,1.8158,0.3858;0.9111,0.241,1.187;0.557,0.406,-0.5325;3.5732,1.4522,-0.3126;3.2327,2.4821,-0.2518;4.8976,1.1725,-0.628;5.3331,-0.1526,-0.6875;4.4516,-1.2102,-0.4601;4.8132,-2.2339,-0.5243;6.7745,-0.1852,-0.9938;7.4542,-1.1679,-1.2636;7.2905,1.2605,-0.9426;7.7574,1.4022,0.0424;8.3485,1.5287,-2.0235;8.6788,2.5739,-1.9781;7.896,1.3758,-3.0198;9.4971,0.7286,-1.842;9.1697,-0.1882,-1.7685;6.0109,2.1381,-0.9783;5.8519,2.512,-2.0048;6.0049,3.2289,-0.0648;6.615,3.9029,-0.4023)|\",4.993289156675\r\n\"[H]/C(=C1/C(C([H])([H])C([H])([H])C([H])([H])[H])=C(C([H])([H])[H])C(=O)C1([H])[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H] 
|(8.1716,-1.3835,-4.372;9.2529,-1.3565,-4.5279;9.6414,-1.346,-5.8199;10.9786,-1.2866,-6.4478;12.2898,-1.1218,-5.7252;12.3559,-1.8181,-4.8816;13.107,-1.3912,-6.4048;12.5282,0.3191,-5.2225;12.489,1.0024,-6.0808;11.709,0.6178,-4.5578;13.8681,0.4722,-4.4971;14.0197,1.5036,-4.1597;14.7075,0.2087,-5.1521;13.9199,-0.1783,-3.6153;10.8827,-1.3483,-7.8091;11.9747,-1.322,-8.835;12.5034,-0.3603,-8.8483;11.5352,-1.4789,-9.8243;12.7247,-2.1033,-8.6613;9.4669,-1.446,-8.2192;9.0486,-1.5345,-9.363;8.624,-1.406,-6.9499;7.9692,-0.5256,-6.9705;7.9725,-2.286,-6.8956;10.0326,-1.315,-3.2385;11.0694,-1.6238,-3.3807;10.0733,-0.2718,-2.8849;9.3661,-2.1505,-2.1226;9.9035,-1.9737,-1.1808;8.3563,-1.7445,-1.9705;9.2531,-3.6709,-2.3726;8.3554,-4.0323,-1.8539;9.069,-3.8559,-3.4393;10.431,-4.5304,-1.8735;10.5789,-4.3222,-0.8038;10.1312,-5.5868,-1.9351;11.7807,-4.3764,-2.5955;12.139,-3.3417,-2.5083;12.521,-4.989,-2.063;11.7602,-4.7985,-4.0692;12.7571,-4.7154,-4.5177;11.0776,-4.1828,-4.6654;11.4379,-5.8422,-4.173)|\",4.536137887835\r\n\"[H]C1=C([H])/C(=C(/[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])C([H])([H])C1=O |(8.4937,-0.7932,-7.1577;8.2021,-1.5601,-6.4496;7.6735,-1.3728,-5.2183;7.4713,-0.4033,-4.7745;7.398,-2.6366,-4.5353;6.8668,-2.8046,-3.3098;6.7229,-3.8281,-2.9592;6.4183,-1.736,-2.3538;6.9814,-1.8399,-1.4133;6.6521,-0.7372,-2.7405;4.912,-1.8209,-2.0306;4.3391,-1.6925,-2.9591;4.6777,-2.8306,-1.6638;4.4602,-0.7829,-0.9956;5.0411,-0.9159,-0.0707;4.7015,0.2256,-1.3631;2.9631,-0.8581,-0.6696;2.3821,-0.7217,-1.5938;2.7206,-1.8676,-0.3053;2.5104,0.1762,0.3693;3.0913,0.0395,1.2926;2.7525,1.1848,0.0051;1.0146,0.0953,0.6904;0.7239,0.8451,1.4349;0.7473,-0.8911,1.0891;0.4065,0.2638,-0.207;7.8234,-3.7446,-5.4825;6.9968,-4.4087,-5.7623;8.6171,-4.3786,-5.0694;8.3432,-3.0053,-6.7278;8.7826,-3.524,-7.738)|\",4.67491595159\r\n\"[H]OC1=C(/C([H])=C(\\[H])C([H])([H])C([H])([H])C(O[H])(C([H])([H])[H])C([H])([H])[H])C([H])=C(Cl)C([H])=C1[H] 
|(-0.0789,2.9012,-6.8738;0.4384,2.3807,-6.2403;1.7227,2.2683,-6.7037;2.627,1.503,-5.9333;2.166,0.859,-4.6944;1.0862,0.7819,-4.5891;2.9416,0.3878,-3.7058;4.0256,0.4981,-3.767;2.4293,-0.2828,-2.4633;2.8378,-1.3,-2.3957;1.3386,-0.378,-2.5243;2.8349,0.4754,-1.1832;2.3492,1.4593,-1.1692;3.9173,0.6585,-1.2081;2.5235,-0.2537,0.1384;3.244,-1.497,0.0619;3.056,-1.9926,0.8749;3.0552,0.57,1.3217;2.5592,1.5454,1.3888;4.1326,0.7341,1.2167;2.8814,0.0435,2.2695;1.0224,-0.5337,0.3126;0.4378,0.3935,0.2978;0.8341,-1.0273,1.2753;0.6515,-1.1908,-0.4792;3.942,1.3819,-6.4098;4.655,0.7747,-5.8636;4.342,2.0105,-7.5838;6.0035,1.8267,-8.1427;3.4466,2.7693,-8.3341;3.7643,3.2502,-9.2525;2.133,2.8902,-7.8847;1.4203,3.4791,-8.4594)|\",4.80553059983\r\n\"[H]C1=C([H])C([H])=C(/C([H])=C(\\[H])C([H])([H])C2=C([H])C([H])([H])C([H])([H])OC2=O)C([H])=C1[H] |(-2.511,-1.3942,0.9016;-1.4948,-1.1262,0.6255;-1.2514,0.0017,-0.1583;-2.0783,0.621,-0.4964;0.0548,0.3398,-0.5093;0.2367,1.2214,-1.1203;1.1488,-0.4375,-0.0903;2.5035,-0.0344,-0.4991;2.5472,0.9003,-1.0611;3.6547,-0.6791,-0.2606;3.6676,-1.6175,0.2895;5.0068,-0.1843,-0.7137;5.6524,-0.0582,0.1666;4.9057,0.7943,-1.1968;5.6935,-1.1392,-1.669;6.0006,-0.8373,-2.9406;5.7717,0.1511,-3.3362;6.6814,-1.8325,-3.8385;7.7691,-1.6655,-3.841;6.3464,-1.717,-4.8767;6.3556,-3.2382,-3.3548;6.9694,-3.9989,-3.8419;5.2985,-3.4669,-3.5436;6.6133,-3.392,-1.9467;6.1325,-2.4466,-1.0935;6.132,-2.673,0.0979;0.8851,-1.5692,0.7044;1.707,-2.1855,1.0571;-0.4181,-1.9081,1.0554;-0.5961,-2.787,1.6698)|\",4.5715126884\r\n\"[H]O[C@]1([H])[C@@]([H])(O[H])[C@@]([H])(O[H])C(C#N)=C([H])[C@]1([H])O[H] 
|(0.3676,0.5059,-2.0586;0.5592,1.4088,-1.7535;0.4811,1.4357,-0.3244;0.7598,2.4555,-0.0476;1.4588,0.4405,0.3161;2.48,0.68,0.0089;1.4688,0.555,1.7433;0.567,0.4067,2.0744;1.1491,-1.0189,-0.0797;1.4793,-1.1996,-1.1117;1.8922,-1.9104,0.7205;1.9664,-1.481,1.5934;-0.3603,-1.2911,-0.0041;-0.7622,-2.6681,-0.0483;-1.0893,-3.7832,-0.0978;-1.2911,-0.3175,0.0663;-2.3481,-0.5718,0.0926;-0.9719,1.163,0.1161;-1.1125,1.5177,1.148;-1.8785,1.9108,-0.6724;-1.4663,1.9687,-1.5545)|\",6.01915837306\r\n\"[H]OC([H])([H])C([H])([H])C1=C(C([H])([H])[H])C([H])=C2C(=C1[H])C(=O)C(C([H])([H])[H])(C([H])([H])[H])[C@]2([H])O[H] |(2.9063,-1.2045,2.5508;2.2267,-1.8964,2.5907;2.5406,-2.8511,1.5843;3.5582,-3.248,1.726;1.8398,-3.679,1.7318;2.3977,-2.3032,0.1507;1.3475,-2.0534,-0.0283;2.6562,-3.1125,-0.5445;3.2938,-1.1071,-0.113;2.8231,0.2282,0.038;1.3849,0.5137,0.4124;1.2331,1.5838,0.5805;1.0919,-0.0263,1.3182;0.6961,0.2032,-0.3846;3.6961,1.3044,-0.1813;3.3356,2.323,-0.0668;5.0185,1.0682,-0.5405;5.4805,-0.241,-0.6692;4.6275,-1.3264,-0.4686;5.0111,-2.3374,-0.5864;6.9216,-0.2335,-1.001;7.6169,-1.1917,-1.2871;7.4236,1.2291,-0.8858;8.1561,1.3496,0.4672;8.5371,2.3657,0.6128;9.0004,0.653,0.4907;7.4924,1.1229,1.3074;8.3687,1.5731,-2.0428;8.7841,2.5822,-1.9274;7.8537,1.5266,-3.0098;9.1995,0.8616,-2.0743;6.0991,2.0691,-0.8841;5.9213,2.442,-1.9069;6.0462,3.1619,0.0264;6.6174,3.8641,-0.3213)|\",4.971520048635\r\n\"[H]C([H])([H])C(=O)C([H])([H])C([H])([H])[C@@]1(C([H])([H])[H])C(=O)OC([H])([H])C1=O 
|(2.2017,1.518,-2.1278;1.4867,1.0469,-2.8147;0.5101,1.5204,-2.6955;1.8625,1.2151,-3.8312;1.3811,-0.4369,-2.5109;0.3656,-0.9283,-2.0563;2.6179,-1.2783,-2.823;2.712,-1.3247,-3.9177;3.5083,-0.7386,-2.4761;2.5387,-2.6857,-2.2267;1.5264,-3.0743,-2.3858;2.6793,-2.6506,-1.1404;3.5886,-3.6913,-2.7928;5.0354,-3.202,-2.6142;5.2109,-2.3059,-3.215;5.7544,-3.9655,-2.9293;5.2223,-2.9784,-1.5597;3.3755,-5.0299,-2.0816;3.6104,-5.287,-0.9317;2.8435,-5.9609,-2.9352;2.6965,-5.4491,-4.2643;3.2467,-6.0832,-4.9677;1.6371,-5.4496,-4.5468;3.262,-4.03,-4.2395;3.408,-3.3277,-5.2156)|\",5.496699780099999\r\n\"[H]C1=C([H])C([H])=C(/C([H])=C2\\C(=O)C([H])=C([H])C2([H])[H])C([H])=C1[H] |(-2.6105,-0.2581,1.1658;-1.5662,-0.3066,0.8686;-0.9134,0.8396,0.4096;-1.4448,1.7854,0.3468;0.4232,0.7686,0.0317;0.9269,1.6642,-0.3252;1.1444,-0.444,0.1002;2.543,-0.3939,-0.3264;2.8271,0.6123,-0.6438;3.5533,-1.294,-0.4202;3.6314,-2.7563,-0.1046;2.762,-3.504,0.3327;5.0174,-3.178,-0.426;5.3465,-4.202,-0.2903;5.7364,-2.1416,-0.8808;6.7778,-2.1785,-1.1874;4.928,-0.8733,-0.9279;4.8892,-0.464,-1.9483;5.3814,-0.0901,-0.3023;0.4698,-1.5921,0.5661;1.0077,-2.5302,0.6259;-0.8686,-1.5148,0.9435;-1.3718,-2.4092,1.3012)|\",4.160620774145\r\n\"[H]O[C@@]([H])(/C(=C(\\[H])C1=C([H])C([H])=C(N(=O)=O)C([H])=C1[H])C([H])([H])[H])C([H])([H])[H] 
|(4.0877,-0.1633,-2.1265;3.6697,0.7133,-2.094;3.3357,0.9638,-0.7183;2.8578,1.9521,-0.7502;2.2783,-0.0406,-0.2713;2.4595,-0.8229,0.8128;3.357,-0.6611,1.4056;1.574,-1.8588,1.37;1.5767,-2.0615,2.7662;2.214,-1.4395,3.3889;0.7751,-3.0244,3.3639;0.7663,-3.1739,4.4365;-0.0361,-3.8172,2.5521;-0.8852,-4.8419,3.1706;-0.842,-4.9588,4.3961;-1.5925,-5.525,2.4285;-0.0485,-3.6696,1.1666;-0.6749,-4.319,0.5673;0.7568,-2.6968,0.5846;0.7749,-2.6099,-0.4953;1.0486,-0.0444,-1.1493;0.1667,-0.4204,-0.625;1.196,-0.6511,-2.0512;0.8361,0.9716,-1.5006;4.6014,1.0706,0.1298;4.3815,1.3905,1.154;5.2685,1.8079,-0.3258;5.1361,0.1138,0.1775)|\",4.1905532977\r\n\"[H]O[C@@]1(C([H])([H])C([H])([H])[H])C([H])([H])C([H])([H])[C@@]2(C([H])([H])[H])C(=O)O[C@]1([H])C2=O |(3.7248,-1.9833,0.4822;2.8602,-2.05,0.0451;2.6899,-0.8833,-0.7585;2.7351,0.3842,0.1261;3.7191,0.3794,0.6184;2.7212,1.2818,-0.5038;1.6309,0.4797,1.1852;1.8358,1.3047,1.8756;1.5684,-0.4454,1.7673;0.655,0.6714,0.7279;3.7591,-0.8322,-1.882;4.7115,-0.4792,-1.4666;3.9212,-1.8615,-2.2173;3.3677,0.0502,-3.0879;4.0319,-0.1659,-3.9322;3.4794,1.1158,-2.8533;1.8858,-0.1731,-3.5682;1.546,0.585,-4.8409;1.7152,1.6572,-4.7015;0.4964,0.441,-5.1141;2.1645,0.2293,-5.67;1.6677,-1.6852,-3.6452;1.7508,-2.4055,-4.6035;1.3579,-2.1558,-2.3952;1.3098,-1.0634,-1.4414;0.5204,-1.2783,-0.7223;1.055,0.1476,-2.3372;0.3871,1.1261,-2.0986)|\",5.736159968540001\r\n\"[H]O/C(=N/C([H])([H])[H])[C@@]1([H])N([H])N([H])C([H])([H])[C@@]1([H])SC1=C([H])C([H])=C([H])C([H])=C1[H] 
|(3.9396,2.5551,-0.5359;3.9772,1.7013,-0.0543;2.6924,1.2955,0.212;2.4768,0.0495,0.1951;1.2197,-0.5564,0.5649;1.3918,-1.2422,1.4035;0.8623,-1.1653,-0.2745;0.4085,0.1292,0.8544;1.7502,2.4286,0.6069;0.7681,2.0014,0.8278;2.2241,3.0699,1.8558;3.2377,3.1754,1.8183;1.6635,4.3869,1.8904;0.7231,4.2835,2.2696;1.5917,4.8614,0.5002;0.7588,5.5575,0.3648;2.517,5.405,0.2694;1.4748,3.6024,-0.4096;0.4717,3.495,-0.8291;2.6136,3.8551,-1.8552;2.0186,2.6569,-3.0597;0.9494,3.0126,-3.894;0.4852,3.9886,-3.7857;0.4933,2.1192,-4.8631;-0.337,2.4003,-5.5051;1.1109,0.8755,-5.013;0.7592,0.1821,-5.7721;2.1834,0.5263,-4.191;2.6668,-0.4398,-4.3056;2.6414,1.4118,-3.213;3.4682,1.1336,-2.5683)|\",5.58377621226\r\n\"[H]C1=C([H])C(N([H])C(=O)[C@@]2([H])N([H])N([H])C([H])([H])C2([H])[H])=C([H])C([H])=C1F |(0.6485,-5.0957,1.8492;0.2551,-4.0868,1.9176;-0.7318,-3.6454,1.0367;-1.1231,-4.3,0.2708;-1.227,-2.3359,1.145;-2.2248,-1.8138,0.2934;-2.4881,-0.8461,0.4501;-2.8994,-2.4312,-0.7318;-2.7178,-3.5828,-1.1039;-3.9507,-1.5171,-1.3658;-4.8217,-1.5084,-0.6934;-3.4828,-0.109,-1.4577;-2.5419,-0.1076,-1.8579;-4.3161,0.5174,-2.4478;-5.1584,0.8039,-1.9511;-4.6669,-0.5285,-3.4501;-5.7047,-0.3951,-3.7722;-4.0314,-0.3967,-4.3328;-4.4031,-1.9081,-2.7843;-3.6137,-2.4663,-3.2952;-5.2903,-2.5467,-2.7656;-0.7198,-1.4867,2.141;-1.0999,-0.4715,2.2304;0.2658,-1.9274,3.0195;0.6648,-1.2788,3.7922;0.7406,-3.2275,2.8947;1.6969,-3.6627,3.7446)|\",5.532074580665\r\n\"[H]C1=C(C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C(=O)C([H])([H])/C1=C(\\[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H] 
|(9.7162,0.3419,-2.4518;9.8286,0.6761,-3.4773;11.0189,0.7593,-4.1281;12.4054,0.4203,-3.6116;13.0172,-0.6931,-4.497;14.0293,-0.9339,-4.1493;13.0762,-0.3734,-5.5402;12.4161,-1.6088,-4.4465;13.2998,1.6817,-3.6898;14.3115,1.4445,-3.3387;12.9026,2.4843,-3.057;13.3663,2.0498,-4.7165;12.351,-0.0692,-2.1534;13.3613,-0.3047,-1.8001;11.7445,-0.9775,-2.0539;11.9352,0.6946,-1.4852;10.757,1.2594,-5.5088;11.5769,1.4644,-6.39;9.2456,1.4796,-5.6456;9.0453,2.5269,-5.9024;8.8551,0.8691,-6.4688;8.689,1.0824,-4.2939;7.3907,1.1087,-3.9386;6.6694,1.4356,-4.6897;6.7972,0.7107,-2.6168;7.581,0.487,-1.8836;6.2308,1.5626,-2.2094;5.8437,-0.497,-2.7308;5.0721,-0.2793,-3.4833;6.4063,-1.3604,-3.1117;5.172,-0.8617,-1.4009;5.9466,-1.0659,-0.6466;4.608,0.0068,-1.0292;4.2336,-2.0708,-1.5025;3.4609,-1.8695,-2.2595;4.7988,-2.9402,-1.8703;3.5562,-2.4341,-0.1746;4.3279,-2.631,0.5832;2.9872,-1.5671,0.1901;2.626,-3.6465,-0.2826;2.1623,-3.8824,0.682;3.1725,-4.5373,-0.616;1.8195,-3.4654,-1.004)|\",4.579676103915\r\n\"[H]C1=C(C([H])([H])C([H])([H])C([H])([H])[H])C(=O)C([H])([H])/C1=C(/[H])[Si](C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] 
|(5.4814,-0.8531,-0.6682;5.6207,0.2108,-0.8402;4.6253,1.1238,-0.9544;3.1389,0.9551,-0.8827;2.7575,1.6047,-0.0804;2.7023,1.3752,-1.8012;2.6442,-0.4804,-0.6745;3.0104,-1.1145,-1.4935;3.0797,-0.8871,0.2486;1.117,-0.5705,-0.6011;0.7867,-1.6042,-0.4505;0.653,-0.2028,-1.5244;0.724,0.0299,0.2283;5.2348,2.4611,-1.1862;4.6299,3.5082,-1.3346;6.7585,2.2812,-1.2056;7.1617,2.6202,-2.1672;7.2185,2.9031,-0.4287;6.9564,0.7944,-0.9718;8.1148,0.1052,-0.8914;8.0043,-0.9693,-0.7154;9.8875,0.7416,-1.0408;10.1965,1.4961,-2.7545;11.2414,1.8162,-2.8508;9.5651,2.3731,-2.9373;9.9954,0.7691,-3.5496;10.2633,2.0353,0.2958;11.3089,2.3611,0.2319;10.1025,1.6256,1.2995;9.6344,2.9271,0.1949;11.0283,-0.7524,-0.8005;12.0825,-0.4605,-0.8764;10.8433,-1.5233,-1.5582;10.883,-1.2121,0.1845)|\",4.590560657935001\r\n\"[H]C1=C([H])C([H])=C(/C([H])=C2/C([H])=C(C([H])([H])C([H])([H])C([H])([H])[H])C(=O)C2([H])[H])C([H])=C1[H] |(13.0452,1.8268,-1.4835;12.003,1.5221,-1.4437;11.6606,0.1699,-1.5067;12.4354,-0.5867,-1.5968;10.324,-0.2131,-1.453;10.0655,-1.2683,-1.5025;9.2873,0.7368,-1.3347;7.914,0.2382,-1.287;7.8429,-0.848,-1.356;6.7302,0.8883,-1.1732;5.4531,0.1867,-1.1471;5.3991,-0.8959,-1.2232;4.3859,1.0167,-1.0216;2.9212,0.71,-0.9551;2.5121,1.1826,-0.0498;2.4207,1.2324,-1.7846;2.5573,-0.7784,-0.9822;2.9492,-1.2336,-1.9022;3.0559,-1.2916,-0.1485;1.0465,-1.0174,-0.8997;0.8123,-2.0877,-0.9145;0.524,-0.5498,-1.743;0.6282,-0.597,0.0229;4.8856,2.4112,-0.9455;4.2086,3.4166,-0.8177;6.4193,2.3644,-1.0527;6.7445,2.9456,-1.9244;6.867,2.8335,-0.1684;9.655,2.0975,-1.2732;8.896,2.8644,-1.1835;10.9928,2.4799,-1.3273;11.2481,3.5351,-1.2769)|\",3.88034350813\r\n\"[H]/C(=C1/C(C([H])([H])[H])=C(C([H])([H])[H])C(=O)C1([H])[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H] 
|(7.2029,3.5892,-1.0112;7.8258,3.7498,-0.13;8.1882,5.0296,0.0911;8.9957,5.6363,1.1702;9.5919,4.8752,2.3199;10.0547,5.5472,3.0466;10.3625,4.174,1.9766;8.8313,4.2862,2.8446;9.1214,6.9837,0.9897;9.8333,8.0039,1.8215;10.592,7.5681,2.477;9.1246,8.5624,2.4479;10.3115,8.7406,1.1674;8.4066,7.4017,-0.2348;8.3443,8.5371,-0.6789;7.7692,6.1559,-0.8399;6.6804,6.28,-0.8908;8.1215,6.0126,-1.8684;8.1217,2.4944,0.6498;8.3355,1.6895,-0.0684;9.021,2.6054,1.2591;6.958,2.0319,1.5606;7.2962,1.1531,2.1287;6.7542,2.8201,2.2966;5.6698,1.6718,0.8061;5.2857,2.5603,0.2859;5.9146,0.9396,0.0228;4.5565,1.091,1.6944;3.7417,0.7344,1.0482;4.9362,0.2022,2.2207;3.9726,2.0744,2.7187;4.7607,2.4128,3.4041;3.6191,2.9731,2.1929;2.8224,1.4736,3.5338;2.4219,2.1956,4.2545;3.1534,0.5913,4.0955;1.9968,1.1594,2.8833)|\",4.606887488965\r\n\"[H]C1=C([H])C([H])([H])[C@@]2([H])C(=O)C([H])([H])C([H])([H])C([H])([H])[C@@]2(C([H])([H])[H])C([H])([H])C1([H])[H] |(4.0635,3.6678,-1.8998;3.8034,2.6708,-1.5439;4.4843,1.6399,-2.0556;5.2683,1.8633,-2.7789;4.2746,0.1655,-1.783;4.9456,-0.1786,-0.9817;4.5906,-0.3814,-2.6811;2.802,-0.2198,-1.4712;2.1746,0.3585,-2.1583;2.5694,-1.6851,-1.8591;2.0236,-1.9837,-2.906;3.0761,-2.738,-0.8846;2.7676,-3.7235,-1.2452;4.1767,-2.7075,-0.8974;2.5873,-2.4532,0.5484;3.0559,-3.1567,1.247;1.507,-2.6347,0.6071;2.9103,-1.0111,0.9579;4.0027,-0.8992,1.0141;2.5335,-0.8146,1.9705;2.3419,0.0652,-0.004;0.7977,0.0369,0.0275;0.4269,0.1166,1.0568;0.3888,-0.8821,-0.4065;0.3778,0.8714,-0.5453;2.834,1.4477,0.4955;2.2707,1.6892,1.4074;3.8851,1.3705,0.8028;2.7228,2.6271,-0.4869;1.7272,2.654,-0.9563;2.7857,3.559,0.0905)|\",5.858611201265\r\n\"[H]C([H])=C([H])C([H])([H])[C@]1([H])C(=O)C([H])([H])C([H])([H])C([H])([H])[C@]1(C([H])([H])[H])C([H])([H])C([H])([H])/C([H])=C(\\[H])C([H])([H])[H] 
|(8.0625,-3.8345,-3.0475;7.6466,-3.8225,-2.0439;7.6385,-4.7716,-1.5106;7.1667,-2.7054,-1.4963;7.1995,-1.7763,-2.0669;6.5697,-2.6078,-0.1167;6.6341,-3.5836,0.3789;5.5001,-2.3769,-0.2086;7.2231,-1.4939,0.7522;7.0339,-0.535,0.2553;8.7487,-1.6538,0.7513;9.4742,-0.8782,0.1572;9.2889,-2.8283,1.5521;8.9603,-3.7548,1.0592;10.3818,-2.8025,1.5182;8.7511,-2.8025,2.9977;9.2344,-1.9851,3.5463;9.0366,-3.7271,3.5137;7.2245,-2.6343,3.0366;6.8939,-2.5367,4.0791;6.7529,-3.5518,2.658;6.693,-1.4176,2.229;7.2114,-0.1079,2.864;6.8627,-0.017,3.9002;6.8595,0.7691,2.3111;8.3047,-0.0545,2.8736;5.1435,-1.4461,2.3074;4.7826,-2.4149,1.9364;4.8611,-1.4173,3.3689;4.3762,-0.3221,1.5797;4.6332,0.6543,2.0064;4.6826,-0.2901,0.5236;2.8854,-0.5215,1.6572;2.5025,-1.4418,1.2087;2.0225,0.3173,2.2367;2.4061,1.2357,2.6863;0.5371,0.1108,2.3241;0.1957,0.1049,3.3683;0.2351,-0.8365,1.8644;-0.0092,0.9208,1.8217)|\",5.978341295485\r\n\"[H]O[C@@]12C([H])([H])OC(=O)[C@]1(C([H])([H])[H])C([H])([H])C([H])([H])C(=O)[C@@]2([H])C([H])([H])[H] |(1.1985,-0.6383,-1.9393;2.1328,-0.7425,-2.181;2.8565,-1.1249,-1.0126;2.5868,-2.6041,-0.6459;2.3383,-3.1649,-1.5541;1.7982,-2.7545,0.094;3.807,-3.1141,-0.0849;4.8638,-2.3425,-0.4735;5.9958,-2.6013,-0.1582;4.3842,-1.1656,-1.3312;4.6554,-1.5196,-2.8104;4.1066,-2.4097,-3.1325;5.7255,-1.7069,-2.9418;4.3522,-0.6934,-3.4591;5.1379,0.1287,-0.9344;5.4735,0.0634,0.1069;6.0492,0.2172,-1.5343;4.2551,1.3668,-1.1147;3.9516,1.4734,-2.1654;4.7759,2.2905,-0.8424;2.9749,1.2799,-0.2987;2.3203,2.263,-0.0141;2.5516,-0.1334,0.1429;3.209,-0.4052,0.9846;1.1044,-0.1307,0.6485;0.9838,0.6537,1.3986;0.822,-1.0862,1.0996;0.3915,0.1042,-0.1517)|\",6.1579364368150005\r\n\"[H]C1=C([H])C(C([H])([H])[H])=C([H])C([H])=C1N([H])C(=O)[C@@]1([H])N([H])N([H])C([H])([H])C1([H])[H] 
|(3.362,-0.0406,-1.898;3.1407,-0.5969,-0.9887;1.8188,-0.7524,-0.5814;1.0256,-0.3106,-1.18;1.4958,-1.4616,0.5816;0.0603,-1.6563,1.0103;-0.0462,-1.5885,2.0988;-0.3181,-2.6429,0.7105;-0.5976,-0.9051,0.5607;2.5542,-2.0022,1.3227;2.3377,-2.5536,2.2352;3.8849,-1.859,0.9344;4.6861,-2.2811,1.5246;4.1869,-1.1492,-0.237;5.505,-0.9501,-0.7093;5.5742,-0.4092,-1.5616;6.6921,-1.3931,-0.1869;6.7988,-2.0314,0.8547;7.9204,-1.0473,-1.0422;7.7556,-0.0838,-1.5442;9.1253,-0.9661,-0.1834;8.8663,-1.3659,0.7195;10.1423,-1.8366,-0.7184;10.6874,-1.2657,-1.3648;9.4316,-2.8741,-1.4782;10.0952,-3.3336,-2.218;9.1023,-3.6522,-0.7797;8.2168,-2.16,-2.0982;7.3631,-2.8241,-2.2674;8.4816,-1.703,-3.0586)|\",5.50758433412\r\n\"[H]C1=C(C([H])([H])[H])[C@@]2([H])[C@@]3([H])C(=O)[C@]([H])(C(C([H])([H])[H])=C3[H])[C@]2([H])C1=O |(2.9457,4.6137,-3.3327;3.2689,3.6603,-2.9253;4.2512,2.8719,-3.4258;5.1466,3.1469,-4.5884;4.9547,4.1304,-5.0272;5.0192,2.3848,-5.3691;6.2005,3.104,-4.2822;4.1686,1.5365,-2.6959;3.3519,1.0213,-3.2182;5.1564,0.3714,-2.3021;6.0744,0.2286,-2.8715;5.2998,0.5667,-0.7604;6.2576,0.5747,-0.0384;3.7604,0.6976,-0.4495;3.4964,0.8005,0.6035;3.3633,-0.6048,-1.1342;2.2574,-1.4868,-0.6488;1.2884,-0.9742,-0.7089;2.4042,-1.7587,0.405;2.1954,-2.4098,-1.2345;4.2226,-0.8263,-2.1559;4.2383,-1.6956,-2.806;3.6321,1.9782,-1.3187;4.4077,2.6451,-0.8999;2.5971,3.0319,-1.7345;1.5395,3.3421,-1.2273)|\",4.985125741159999\r\n\"[H]C1([H])[C@]2([H])C([H])([H])[C@]3([H])C4(NN4)[C@@]1([H])C([H])([H])[C@]([Si](C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])(C2([H])[H])C3([H])[H] 
|(5.6481,-1.7093,4.7186;5.3187,-1.2468,3.779;6.1065,-0.5464,3.4697;3.9883,-0.4934,3.9889;4.1249,0.2831,4.7533;2.8989,-1.4838,4.4515;1.9525,-0.9533,4.6237;3.186,-1.9497,5.4033;2.6998,-2.5689,3.3694;1.943,-3.2957,3.6892;4.0267,-3.2711,3.1482;4.0532,-4.6909,2.7441;4.3612,-4.4819,3.9243;5.1278,-2.3305,2.694;6.0582,-2.8922,2.5449;4.6867,-1.6693,1.3646;4.5799,-2.4441,0.594;5.4794,-0.9887,1.0252;3.3491,-0.8963,1.5468;2.8103,-0.0684,-0.108;2.57,-1.3828,-1.4573;2.261,-0.9089,-2.397;1.7967,-2.1117,-1.1886;3.4941,-1.937,-1.6581;4.1361,1.1635,-0.6853;3.8287,1.638,-1.6253;5.1013,0.6764,-0.8659;4.2979,1.9634,0.0466;1.1774,0.8724,0.1296;0.8764,1.3519,-0.8099;1.2664,1.6615,0.8852;0.3598,0.209,0.434;3.5554,0.1681,2.6615;4.318,0.8973,2.3541;2.627,0.7336,2.8237;2.2714,-1.9061,2.0365;1.3087,-1.3978,2.1843;2.1063,-2.688,1.2833)|\",4.88172247797\r\n\"[H]O[C@]([H])(C(=O)C1=C([H])C(OC([H])([H])[H])=C(OC([H])([H])[H])C([H])=C1OC([H])([H])[H])C([H])([H])[H] |(0.921,1.1884,1.3796;0.6286,0.6694,0.6032;1.68,-0.2415,0.3668;1.2554,-1.2394,0.2204;2.6129,-0.2577,1.5897;2.4629,0.6521,2.4054;3.6956,-1.259,1.7927;4.6189,-0.9698,2.8182;4.4948,-0.0426,3.3662;5.6729,-1.8013,3.1421;6.5847,-1.4224,4.0975;6.483,-2.1126,5.3446;6.6749,-3.185,5.2271;5.4928,-1.9653,5.7963;7.2452,-1.6771,5.9948;5.8298,-3.0094,2.4238;6.8865,-3.7832,2.7776;7.1228,-4.9867,2.0595;8.025,-5.4197,2.4943;7.2923,-4.7883,0.9938;6.2901,-5.6931,2.172;4.9221,-3.3387,1.4135;5.0435,-4.2652,0.8709;3.8629,-2.4763,1.0929;2.9603,-2.7774,0.1173;3.0644,-4.007,-0.5861;2.2266,-4.0158,-1.2849;2.982,-4.8654,0.0921;4.0052,-4.0718,-1.1473;2.4685,0.1687,-0.8915;3.2511,-0.5532,-1.1426;2.9234,1.154,-0.7411;1.7722,0.2417,-1.7326)|\",4.59328179644\r\n\"[H]O[C@]1([H])C(C([H])([H])[H])=C([H])C(=O)C2=C(C([H])([H])[H])C([H])([H])C([H])([H])[C@@]([H])(C([H])(C([H])([H])[H])C([H])([H])[H])[C@]21[H] 
|(3.4118,0.4533,4.1583;2.8451,0.9683,3.5597;1.8113,0.0945,3.0913;1.2642,0.7197,2.3841;0.8314,-0.2751,4.1892;0.5787,0.7563,5.2532;-0.2103,0.4349,5.9389;0.2869,1.714,4.8025;1.4863,0.9625,5.8321;0.1502,-1.4329,4.1281;-0.6256,-1.6664,4.8539;0.3099,-2.4458,3.0629;-0.52,-3.3535,2.9727;1.5057,-2.332,2.1855;1.8576,-3.3305,1.3337;1.0281,-4.5645,1.0667;1.5296,-5.2088,0.336;0.0326,-4.3102,0.692;0.8543,-5.1365,1.9838;3.1671,-3.2975,0.5711;3.7021,-4.2352,0.7834;2.9289,-3.3591,-0.5019;4.0859,-2.0959,0.856;4.7852,-1.9669,0.0234;4.7022,-2.3085,1.7387;3.2672,-0.8138,1.122;3.9523,-0.002,1.3987;2.5241,-0.3784,-0.1846;2.1057,-1.2808,-0.6484;1.3476,0.6001,-0.0354;0.9572,0.8532,-1.0286;1.6513,1.5404,0.4411;0.5168,0.1695,0.5324;3.5379,0.2259,-1.1763;3.0546,0.4525,-2.1338;4.3779,-0.4443,-1.3847;3.9506,1.164,-0.7832;2.4368,-1.1364,2.3957;3.1864,-1.4659,3.1403)|\",4.721175306175\r\n\"[H]C1([H])[C@]2([H])C([H])([H])[C@]3([H])C4(NN4)[C@@]1([H])C([H])([H])[C@](F)(C2([H])[H])C3([H])[H] |(-1.435,1.7906,-1.407;-1.1595,0.7359,-1.2821;-1.675,0.1712,-2.0701;-1.5971,0.2307,0.1104;-2.6835,0.3402,0.2155;-0.8866,1.0526,1.2084;-1.2069,0.7143,2.2026;-1.158,2.1124,1.1248;0.646,0.8882,1.0672;1.1675,1.4832,1.8255;1.0479,1.3538,-0.3226;2.3796,1.9512,-0.5456;1.4425,2.7581,-0.5456;0.3717,0.5696,-1.4349;0.7032,0.9435,-2.4101;0.7506,-0.923,-1.2753;1.8332,-1.0623,-1.3802;0.2617,-1.5297,-2.0474;0.3042,-1.4042,0.1108;0.6458,-2.7588,0.2458;-1.2143,-1.259,0.2589;-1.7127,-1.8687,-0.5054;-1.5217,-1.6457,1.2387;1.0232,-0.6067,1.206;0.7267,-0.9893,2.1904;2.1066,-0.7449,1.1088)|\",5.161999743985\r\n\"[H]/C1=C([H])/C(F)=C(/[H])C2=C1ON([H])[C@]2([H])C([H])([H])S(=O)(=O)N([H])[H] 
|(-0.2034,2.7523,-1.4523;0.3192,1.8738,-1.09;1.7113,1.7629,-1.1778;2.3098,2.5584,-1.6091;2.3468,0.621,-0.6981;3.6936,0.5441,-0.7888;1.6549,-0.4493,-0.131;2.1863,-1.328,0.2169;0.2726,-0.3294,-0.0398;-0.369,0.8141,-0.5138;-1.7374,0.7552,-0.369;-2.0116,-0.357,0.5358;-1.9403,0.1004,1.4537;-0.8301,-1.265,0.4178;-1.0606,-2.0005,-0.3641;-0.671,-1.998,1.7542;-1.6569,-2.2828,2.1355;-0.1512,-1.3963,2.505;0.2418,-3.5611,1.6043;-0.4748,-4.4033,0.6432;1.6723,-3.2691,1.4594;0.0228,-4.2048,3.1576;-0.5367,-5.054,3.101;0.9337,-4.3953,3.5711)|\",5.45316156402\r\n\"[H]OC1=C([H])C([C@@]([H])(OS(O)(O)C([H])([H])[H])C([H])([H])[H])=C([H])C([H])=C1[H] |(0.862,-3.8151,-4.5859;0.5962,-3.3098,-3.802;1.5159,-2.3235,-3.5758;1.2966,-1.487,-2.4786;0.4157,-1.6486,-1.8657;2.1928,-0.4532,-2.1956;1.9863,0.4314,-0.9854;2.6233,1.3164,-1.0685;0.5882,0.8934,-1.0461;0.2089,2.2933,-0.279;-0.21,2.0159,1.0938;1.2636,3.2769,-0.5317;-1.2391,2.675,-1.2691;-1.6717,3.5886,-0.8558;-1.9447,1.8472,-1.1864;-0.9278,2.826,-2.3032;2.2372,-0.2802,0.3432;3.2736,-0.6328,0.3805;1.5743,-1.1448,0.4462;2.0641,0.3931,1.1876;3.3117,-0.2598,-3.015;4.0063,0.5491,-2.8049;3.5318,-1.106,-4.1019;4.4012,-0.9586,-4.737;2.6395,-2.1376,-4.3896;2.8099,-2.7919,-5.2429)|\",5.79602501565\r\n\"[H]O[C@]([H])(C1=C([H])C(OS(O)(O)C([H])([H])[H])=C([H])C([H])=C1[H])C([H])([H])[H] 
|(1.2259,-0.4852,0.0582;2.0706,-0.6325,-0.3993;2.9671,0.395,0.031;3.8983,0.1593,-0.5002;3.2393,0.3035,1.5338;3.0978,-0.9352,2.1727;2.7829,-1.8063,1.6102;3.359,-1.0456,3.5336;3.1286,-2.2914,4.1472;4.4725,-3.1686,4.6027;5.2999,-3.4223,3.428;5.0556,-2.567,5.7995;3.5633,-4.6492,5.055;4.2987,-5.3628,5.4323;3.0651,-5.0397,4.1671;2.8436,-4.3925,5.8331;3.775,0.0426,4.2977;3.9732,-0.0803,5.3562;3.931,1.2711,3.6569;4.2567,2.1356,4.2285;3.6642,1.4029,2.2928;3.7906,2.3739,1.8249;2.4744,1.7665,-0.4412;3.228,2.5481,-0.3006;1.5696,2.0682,0.1014;2.2388,1.7107,-1.5079)|\",6.33481043964\r\n\"[H]OC(=O)C([H])([H])[C@]1(C([H])([H])N([H])[H])C([H])([H])[C@@]2([H])[C@@]3([H])C([H])([H])[C@@]3([H])C([H])([H])[C@]12[H] |(4.3111,-1.8338,-0.6524;4.0359,-2.3691,-1.4543;2.9942,-1.7826,-2.061;2.4432,-2.2925,-3.0126;2.5475,-0.4287,-1.4939;1.7416,-0.0914,-2.1481;3.3735,0.2863,-1.614;2.089,-0.4133,-0.0094;3.2365,-0.0492,0.939;2.8387,0.1064,1.9534;3.6804,0.9013,0.6182;4.3098,-1.0668,0.9216;4.0371,-1.8695,1.4883;5.1656,-0.6984,1.3321;1.2023,-1.6685,0.4152;1.4062,-2.6252,-0.0732;1.2173,-1.8024,1.5038;-0.0205,-0.8449,-0.0904;-0.0089,-0.9843,-1.181;-1.4615,-0.5028,0.2371;-2.2633,-1.1311,-0.1422;-1.7823,0.4605,1.3522;-2.8201,0.4822,1.6755;-1.0827,0.6262,2.168;-1.5,1.0204,-0.0256;-2.3334,1.4569,-0.5701;-0.0769,1.5479,-0.3611;0.1196,2.5362,0.0714;0.0306,1.6271,-1.4506;0.7661,0.412,0.2305;0.7479,0.5433,1.3204)|\",7.390612179580001\r\n\"[H]OC(=O)C([H])([H])[C@@]1(C([H])([H])N([H])[H])C([H])([H])[C@@]2([H])C([H])([H])[C@@]3(C([H])([H])C([H])([H])[H])C([H])([H])[C@@]3([H])[C@@]21[H] 
|(4.4626,6.8135,-2.4312;3.7312,7.279,-2.9354;3.0992,6.415,-3.7411;2.1368,6.749,-4.3985;3.6633,4.9874,-3.8033;4.6639,5.03,-4.258;3.015,4.4576,-4.5046;3.7231,4.2123,-2.4628;5.084,4.3735,-1.7675;5.8671,4.1001,-2.4844;5.1715,3.6774,-0.9226;5.3398,5.7704,-1.352;6.332,5.917,-1.1767;4.8585,5.9706,-0.4761;2.4657,4.3958,-1.4978;2.5558,5.0977,-0.6617;1.5808,4.6628,-2.0841;2.588,2.878,-1.1897;3.3711,2.7862,-0.424;1.7372,1.6077,-1.0549;0.8367,1.6423,-1.6803;1.4262,1.3739,-0.0308;2.7774,0.5502,-1.5774;3.2148,-0.5299,-0.6016;4.1195,-1.0194,-0.9883;3.5052,-0.0634,0.3516;2.14,-1.5943,-0.3412;2.4915,-2.3442,0.3768;1.8693,-2.1137,-1.2678;1.2252,-1.1475,0.066;2.8378,0.2427,-3.0599;3.2378,-0.7292,-3.3454;2.0374,0.591,-3.711;3.791,1.2702,-2.4963;4.8531,1.045,-2.4304;3.253,2.7041,-2.5395;2.4591,2.7298,-3.3005)|\",7.33618940948\r\n\"[H]OC(=O)C([H])([H])[C@]1(C([H])([H])N([H])[H])C([H])([H])[C@]2([H])C([H])([H])[C@]3([H])C([H])([H])[C@]3([H])[C@]21[H] |(0.5398,-4.2885,2.0619;1.0286,-4.7974,1.3492;1.7295,-3.9595,0.5738;2.3488,-4.356,-0.3899;1.7248,-2.4763,0.9741;2.3345,-1.9775,0.2176;2.2545,-2.3755,1.9328;0.3409,-1.7854,1.0625;-0.2366,-1.841,2.4857;-1.1121,-1.1839,2.5771;0.5218,-1.4543,3.1763;-0.5467,-3.2241,2.9088;-1.4379,-3.5222,2.514;-0.6467,-3.2775,3.9207;-0.7017,-2.1652,-0.0843;-0.166,-2.4883,-0.9828;-1.4752,-2.9064,0.1446;-1.1083,-0.6657,-0.0927;-1.8154,-0.5373,0.7386;-1.401,0.4976,-1.0544;-0.8537,0.4029,-2.0007;-2.4602,0.6409,-1.2917;-0.8552,1.6774,-0.186;-1.5718,2.4342,0.124;0.5944,2.1004,-0.2571;0.8272,3.1264,0.0206;1.2135,1.7173,-1.0674;0.1763,1.1409,0.8298;0.1609,1.465,1.867;0.2742,-0.3366,0.4335;0.9698,-0.4008,-0.416)|\",7.33618940948\r\n\"[H]O/C(=N/C([H])=C(/OC([H])([H])[H])C(=O)C([H])([H])[H])C1=NC([H])=C([H])C([H])=C1[H] 
|(7.8008,1.1024,-0.5146;6.8413,1.1232,-0.7338;6.2703,0.1935,0.0495;4.9947,0.0771,-0.0044;4.2315,-0.6377,0.8845;4.5506,-0.6927,1.9281;3.0087,-1.1861,0.6316;2.354,-1.67,1.7389;1.9139,-3.0306,1.7063;1.5415,-3.2445,2.7112;2.753,-3.706,1.4841;1.1187,-3.1729,0.9732;2.2793,-1.2074,-0.6709;1.1207,-1.6084,-0.7026;2.9682,-0.7384,-1.9365;3.3602,0.2769,-1.8256;2.2429,-0.7885,-2.7517;3.831,-1.3727,-2.1716;7.2685,-0.6018,0.8442;8.5067,-0.0695,0.8226;9.4909,-0.6921,1.4713;10.47,-0.2204,1.4259;9.3011,-1.8864,2.1697;10.1314,-2.3573,2.6862;8.0303,-2.4568,2.1651;7.8444,-3.3983,2.6738;6.9934,-1.8126,1.4907;6.0058,-2.2529,1.4492)|\",3.4068654082599994\r\n\"[H]O[C@@]([H])(C([H])([H])C([H])([H])C([H])([H])[H])C([H])([H])[C@@]1([H])OC(=O)C([H])([H])[C@]([H])(C([H])([H])[H])C1([H])[H] |(2.0849,-0.4465,0.6849;2.4856,-1.333,0.6708;3.7729,-1.1896,0.0801;4.2624,-2.1575,0.2527;3.6833,-0.9699,-1.4405;3.1219,-0.0456,-1.6396;4.6955,-0.817,-1.846;3.0045,-2.1355,-2.1702;3.5895,-3.0519,-2.0057;2.0257,-2.3096,-1.7094;2.8452,-1.8928,-3.674;2.3612,-2.7445,-4.1657;3.8166,-1.7365,-4.1602;2.2316,-1.0045,-3.8697;4.6234,-0.1289,0.8157;5.6769,-0.2484,0.527;4.556,-0.334,1.8914;4.2516,1.332,0.5619;4.3661,1.556,-0.5051;2.8195,1.459,0.8344;2.2087,2.6101,1.2396;1.0035,2.6593,1.2187;3.0802,3.7259,1.7957;2.5528,4.6658,1.6081;3.0795,3.5824,2.8857;4.5269,3.75,1.2768;5.1155,4.391,1.9456;4.6191,4.3422,-0.1383;5.6467,4.3018,-0.5179;4.3035,5.3913,-0.137;3.9768,3.8122,-0.8518;5.082,2.3197,1.3753;5.0973,2.0021,2.4268;6.1163,2.2774,1.012)|\",6.908970664195\r\n\"[H]O[C@]([H])(C([H])([H])[H])C([H])([H])C(=O)C([H])([H])C([H])([H])C([H])([H])C(=C([H])[H])C([H])([H])[H] 
|(8.097,-0.3335,5.0261;7.619,0.3398,4.5178;8.5531,0.9694,3.6315;9.0294,0.2123,2.9903;9.6236,1.733,4.41;10.3354,2.2199,3.7335;10.192,1.0516,5.0558;9.1625,2.4972,5.0453;7.7162,1.8808,2.7255;8.3852,2.4059,2.0349;7.1955,2.6207,3.3444;6.7342,1.05,1.8961;7.143,0.3421,0.9934;5.2669,1.1294,2.2767;5.2078,0.8903,3.349;4.9547,2.1834,2.1995;4.3564,0.2235,1.4485;4.4828,0.4664,0.3875;4.6827,-0.8169,1.561;2.8761,0.3467,1.8593;2.5613,1.395,1.7327;2.7801,0.1216,2.9292;1.941,-0.5508,1.0744;1.2863,-1.5589,1.6602;0.6143,-2.2069,1.1024;1.3984,-1.7751,2.7201;1.7775,-0.2469,-0.3956;1.0586,-0.9231,-0.8682;2.7284,-0.3361,-0.9363;1.4271,0.7838,-0.5472)|\",6.206916929905001\r\n\"[H]C1=C([H])C(/C([H])=C(\\[H])C([H])([H])C([H])([H])C([H])(C([H])([H])[H])C([H])([H])[H])=C([H])C([H])=C1C(=O)C([H])([H])[H] |(1.9668,4.4381,-4.1795;2.7059,4.7955,-3.4688;3.4021,3.869,-2.7028;3.1901,2.8122,-2.8338;4.37,4.2816,-1.7666;5.1396,3.3513,-0.93;5.8428,3.8307,-0.2473;5.0699,2.0103,-0.9173;4.3755,1.502,-1.5858;5.908,1.1272,-0.0356;5.2515,0.4583,0.5389;6.4353,1.7472,0.7003;6.9546,0.2792,-0.7974;7.5372,-0.2887,-0.057;7.6596,0.9643,-1.2872;6.409,-0.7047,-1.8519;5.8489,-0.1264,-2.6016;7.5739,-1.3941,-2.5786;7.2096,-2.0756,-3.3564;8.1789,-1.9839,-1.8774;8.2366,-0.6629,-3.0563;5.4581,-1.75,-1.2487;5.1097,-2.4501,-2.017;4.5711,-1.2949,-0.7938;5.9661,-2.3369,-0.4719;4.6049,5.6648,-1.6367;5.347,6.0077,-0.9194;3.9116,6.5916,-2.4022;4.0972,7.6558,-2.299;2.9484,6.1729,-3.3328;2.2293,7.2121,-4.1327;2.482,8.3995,-3.9863;1.1741,6.7683,-5.1347;0.3678,6.2132,-4.6404;1.6051,6.109,-5.8976;0.7581,7.6542,-5.6175)|\",4.41912893212\r\n\"[H]OC1=C([H])C([H])=C(N([H])C(SC([H])([H])[H])C([H])N(=O)=O)C([H])=C1[H] 
|(6.2715,5.9042,6.1448;6.4333,5.0104,6.4845;5.6936,4.115,5.7664;4.8521,4.492,4.7141;4.7716,5.5389,4.4285;4.1231,3.5345,4.0134;3.4939,3.8413,3.1858;4.2113,2.1829,4.3694;3.5421,1.1549,3.6556;4.0066,0.2558,3.6422;2.325,1.1881,3.02;2.2313,-0.1125,1.8135;0.5484,-0.8012,1.9979;0.3389,-1.0203,3.0471;0.5588,-1.7346,1.4287;-0.1878,-0.1098,1.5892;1.389,2.1308,3.3526;1.5114,2.7413,4.2352;0.2393,2.4977,2.6107;0.0346,2.021,1.4803;-0.5106,3.329,3.1477;5.0555,1.812,5.4262;5.1298,0.7655,5.7121;5.7972,2.7643,6.1151;6.4521,2.4787,6.9317)|\",3.88578578514\r\n\"[H]C([H])([H])O[C@@]1([H])C(=O)N2C([H])([H])[C@]3(C([H])([H])[H])OC([H])([H])[C@@]([H])(O3)[C@@]21[H] |(0.7238,-0.0136,-7.1376;1.4474,0.0584,-6.3225;2.3428,-0.5279,-6.5816;1.7392,1.1057,-6.1794;0.8068,-0.4678,-5.1652;1.6274,-0.4356,-4.0315;2.5716,-0.9771,-4.2048;1.8612,0.9463,-3.3326;2.4562,1.9694,-3.5873;1.1185,0.5446,-2.2421;1.1423,0.9066,-0.8392;1.828,1.7474,-0.7045;0.1423,1.2006,-0.4959;1.6234,-0.3306,-0.0388;1.5223,-0.1607,1.4626;0.4872,0.0336,1.7584;2.1477,0.6759,1.7881;1.8684,-1.0739,1.9529;2.9706,-0.6281,-0.4167;2.9345,-1.6636,-1.4015;3.5477,-1.3713,-2.2599;3.3403,-2.5912,-0.979;1.4375,-1.8247,-1.719;1.1494,-2.8544,-1.9444;0.8582,-1.467,-0.4469;0.8712,-0.8251,-2.7324;-0.1892,-1.0344,-2.9099)|\",6.642299090705\r\n\"[H]C([H])=C([H])C(C([H])([H])[H])(C([H])([H])[H])[C@@]1([H])OC([H])([H])[C@]2(C([H])([H])[H])OC([H])([H])[C@]1([H])O2 
|(1.2804,4.0143,-0.9651;1.3623,3.0576,-1.4744;0.8559,2.9765,-2.4318;2.0462,2.0475,-0.9363;2.5238,2.2051,0.0337;2.236,0.6511,-1.5063;1.6828,-0.3686,-0.4781;0.6042,-0.2283,-0.3499;2.1568,-0.2477,0.504;1.8678,-1.391,-0.8203;1.5159,0.4476,-2.8457;0.433,0.5314,-2.6997;1.7384,-0.5464,-3.2408;1.8238,1.1791,-3.5959;3.767,0.4413,-1.5881;4.1727,0.5668,-0.5735;4.0858,-0.9407,-2.0377;5.2347,-1.173,-2.9177;6.1642,-1.295,-2.3447;4.9927,-2.1324,-3.3837;5.3853,-0.0538,-4.091;5.5543,-0.6217,-5.4797;6.5256,-1.1203,-5.5637;5.5132,0.1848,-6.2157;4.765,-1.3492,-5.6925;6.4167,1.0173,-3.9113;5.7403,2.112,-3.1866;6.452,2.5344,-2.4745;5.3622,2.8666,-3.8797;4.6689,1.1797,-2.5783;5.4996,0.5821,-2.1947;4.2206,0.6497,-3.8729)|\",7.14026743712\r\n\"[H]C1=C2C([H])([H])C([H])([H])C([H])([H])C3(OC([H])([H])C([H])([H])O3)[C@]2(C([H])([H])[H])C([H])([H])C1([H])[H] |(0.9846,-0.2839,-1.4571;2.011,-0.6389,-1.3997;2.5464,-1.261,-0.3444;1.996,-1.4827,1.0298;0.9354,-1.2153,1.0862;2.0755,-2.5461,1.2914;2.8348,-0.6503,2.052;2.9523,-1.226,2.9758;2.2933,0.2656,2.3146;4.2262,-0.256,1.5042;4.9025,0.0071,2.325;4.1449,0.6333,0.8683;4.8587,-1.3775,0.6662;4.9654,-2.5657,1.4554;6.2674,-2.5723,2.0266;6.3001,-1.9834,2.9554;6.5379,-3.6075,2.2513;7.109,-1.93,0.9247;7.4527,-2.6773,0.1955;7.9758,-1.3768,1.3012;6.2036,-1.0162,0.3129;4.0017,-1.6603,-0.6077;4.0994,-3.1403,-1.0363;3.5584,-3.2891,-1.9774;3.6776,-3.8137,-0.2863;5.1447,-3.4293,-1.1958;4.3607,-0.734,-1.8058;4.7649,0.2143,-1.4367;5.1228,-1.1752,-2.4559;3.0106,-0.4707,-2.5222;2.9707,0.5265,-2.9787;2.8311,-1.1884,-3.3378)|\",6.849105617085001\r\n\"[H]C1=C([C@@]2([H])O[C@]2([H])C([H])([H])[H])C([H])=C([H])C(C(=O)OC([H])([H])[H])=N1 
|(5.5939,-0.5911,-0.1846;5.6477,0.4745,-0.4085;4.5104,1.1485,-0.8782;3.2363,0.4148,-1.1073;3.0968,-0.4773,-0.4929;2.0479,1.2036,-1.2772;2.5202,0.4513,-2.4037;2.9628,1.0726,-3.1857;1.6471,-0.6815,-2.8779;2.2044,-1.343,-3.5513;0.7807,-0.293,-3.4255;1.2795,-1.2688,-2.0306;4.6263,2.5201,-1.1232;3.7573,3.0806,-1.454;5.8523,3.1405,-0.913;5.9976,4.2009,-1.0849;6.9269,2.3666,-0.459;8.2438,3.0566,-0.2436;8.4101,4.2434,-0.4543;9.2054,2.232,0.1998;10.4816,2.8509,0.4212;11.1332,2.0523,0.7764;10.4001,3.644,1.1694;10.8683,3.2795,-0.5077;6.8305,1.0536,-0.2042)|\",5.415065624950001\r\n\"[H]C#CC([H])([H])O[C@]([H])(C(C([H])=C([H])[H])(C([H])([H])[H])C([H])([H])[H])[C@@]1([H])OC(C([H])([H])[H])(C([H])([H])[H])OC1([H])[H] |(12.3173,0.9503,0.8679;11.9382,0.2046,0.2065;11.5114,-0.6411,-0.5429;11.0031,-1.6654,-1.4674;11.833,-2.3112,-1.7824;10.6071,-1.1777,-2.364;10.0319,-2.509,-0.8476;8.6495,-2.3643,-1.203;8.134,-2.7159,-0.2975;8.2676,-3.3808,-2.3394;8.5034,-4.7684,-1.7551;7.9624,-4.9546,-0.8243;9.2352,-5.7597,-2.2621;9.2954,-6.7217,-1.7601;9.8055,-5.6618,-3.1811;6.7574,-3.2893,-2.6702;6.4795,-2.2989,-3.0417;6.1367,-3.5158,-1.7943;6.5088,-4.0273,-3.4403;9.0798,-3.1717,-3.6289;8.7722,-3.9035,-4.3844;10.1503,-3.3133,-3.4534;8.9194,-2.1719,-4.035;8.2314,-0.8799,-1.3179;8.8723,-0.3426,-0.6071;8.4154,-0.327,-2.6263;7.4116,0.676,-2.8116;7.8627,2.014,-2.2124;7.0727,2.7625,-2.3282;8.7654,2.3711,-2.7178;8.0876,1.9167,-1.1457;7.0881,0.7787,-4.293;6.2767,1.4955,-4.4495;6.7792,-0.1967,-4.6765;7.9675,1.1189,-4.8481;6.2701,0.1668,-2.1262;6.7446,-0.5464,-0.9893;6.6788,0.0681,-0.0814;6.1119,-1.4257,-0.8573)|\",7.393333318085\r\n\"[H]C1=C([H])C([H])=C(/C([H])=C(/F)C([H])([H])[Si](C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C([H])=C1[H] 
|(6.4172,5.4387,-2.1933;6.1221,4.55,-1.6421;5.8933,3.3478,-2.3136;6.0199,3.2938,-3.3919;5.5213,2.2043,-1.6076;5.3914,1.2671,-2.1397;5.353,2.2381,-0.2112;4.9726,1.0602,0.5851;5.4119,0.9862,1.5781;4.123,0.0761,0.2567;3.9831,-0.9256,1.1819;3.2074,-0.1087,-0.8993;3.5018,-0.9936,-1.4829;3.2796,0.7549,-1.5661;1.3563,-0.3386,-0.4016;1.057,-2.1098,0.1907;0.0086,-2.2579,0.4767;1.685,-2.3418,1.057;1.2907,-2.8364,-0.5969;0.3298,-0.0003,-1.9595;0.4652,1.0286,-2.3137;-0.7396,-0.1448,-1.7635;0.6085,-0.6745,-2.7782;0.9131,0.9053,0.9539;1.1182,1.9342,0.6355;1.4906,0.7187,1.8659;-0.1506,0.8412,1.2124;5.6103,3.4529,0.4517;5.5036,3.4967,1.5331;5.9849,4.5947,-0.2529;6.1718,5.5212,0.284)|\",5.246355037640001\r\n\"[H]C1=C([H])C(/C([H])=C(\\[H])[C@]([H])(C([H])([H])[H])C([H])([H])N(=O)=O)=C([H])C([H])=C1Cl |(5.9185,3.2358,4.8081;5.3365,2.5007,4.2625;4.806,2.7991,3.0097;4.983,3.7834,2.5829;4.0484,1.8632,2.2853;3.5206,2.25,0.9693;3.8194,3.245,0.6372;2.726,1.5356,0.1578;2.3875,0.541,0.451;2.2289,2.0246,-1.1798;2.5946,3.0442,-1.3456;0.6904,2.0221,-1.2456;0.2793,2.6689,-0.4647;0.2894,1.0125,-1.0933;0.3298,2.3889,-2.2125;2.8428,1.1452,-2.2871;2.503,0.1097,-2.2398;3.9342,1.1867,-2.257;2.4559,1.6474,-3.6565;1.8625,0.8744,-4.4024;2.76,2.8072,-3.9257;3.8424,0.5981,2.8673;3.2685,-0.1574,2.3393;4.3647,0.2833,4.1169;4.1985,-0.6956,4.5542;5.1105,1.2396,4.8094;5.7722,0.8443,6.3878)|\",4.421850070624999\r\n\"[H]O[C@@]1([H])[C@@]([H])(F)[C@@]([H])(O[H])[C@@]2([H])N([H])[C@]1(O[H])C([H])([H])C2([H])[H] 
|(2.3441,-0.837,1.3761;2.6813,-0.1315,0.7959;1.7022,0.1012,-0.215;2.2516,0.3562,-1.1291;0.8154,-1.1209,-0.4718;0.2705,-0.9759,-1.4108;1.6061,-2.2559,-0.6292;-0.1989,-1.3385,0.6517;-0.9309,-2.1011,0.347;0.5467,-1.8045,1.7865;-0.0723,-1.9651,2.5164;-0.8975,0.0094,0.983;-1.6574,-0.1702,1.7548;0.0773,1.0026,1.439;0.703,0.6128,2.1443;0.8281,1.3204,0.2085;1.6599,2.438,0.3602;2.4503,2.1183,0.8317;-0.318,1.6007,-0.7798;-0.5634,2.6633,-0.7117;-0.0318,1.3913,-1.8158;-1.487,0.711,-0.2678;-1.848,-0.0036,-1.0158;-2.3397,1.3266,0.0285)|\",7.390612179580001\r\n\"[H]C1=C([H])C([H])=C(N2C(=O)[C@]3([H])N(C([H])([H])C([H])([H])C3([H])[H])C2(C([H])([H])[H])C([H])([H])[H])C([H])=C1[H] |(2.8456,1.1254,6.092;2.8565,1.0917,5.0062;3.854,1.7553,4.289;4.627,2.3088,4.8154;3.8778,1.7092,2.897;4.6517,2.2285,2.3455;2.8858,1.0053,2.196;2.9307,0.9662,0.7745;3.1987,2.0816,0.0021;3.3839,3.2221,0.3933;3.2394,1.6039,-1.457;2.2842,1.8597,-1.9318;3.4035,0.136,-1.3559;4.8603,-0.0919,-1.4594;5.4007,0.1679,-0.5317;5.0707,-1.1398,-1.6922;5.2689,0.872,-2.5751;6.3476,1.0543,-2.5935;4.9766,0.4568,-3.5456;4.4446,2.1395,-2.2604;4.1291,2.6654,-3.1653;5.0043,2.8535,-1.6484;2.7406,-0.2511,-0.0956;1.2497,-0.4859,-0.4114;1.1626,-1.2879,-1.1505;0.788,0.4144,-0.8273;0.685,-0.7741,0.4795;3.3497,-1.5033,0.5361;3.311,-2.3254,-0.1854;2.784,-1.8011,1.4222;4.3877,-1.3452,0.8379;1.875,0.3555,2.9182;1.0809,-0.1644,2.3954;1.8705,0.3903,4.3131;1.0811,-0.1233,4.8555)|\",5.58377621226\r\n\"[H]OC1=N[C@]([H])(N([H])C2=C([H])C([H])=C(C#N)C([H])=C2[H])N([H])N([H])C1([H])[H] 
|(-4.2668,-4.9669,2.8134;-3.6695,-5.0122,2.0499;-3.7881,-3.878,1.3144;-3.0431,-3.7658,0.2898;-3.205,-2.5765,-0.5292;-3.2227,-2.887,-1.5826;-2.0174,-1.7435,-0.2709;-1.3259,-2.2335,0.2849;-1.4817,-0.8787,-1.217;-0.1721,-0.3848,-1.0294;0.3993,-0.7135,-0.1648;0.3899,0.5096,-1.9232;1.3986,0.8762,-1.7616;-0.3369,0.9489,-3.0465;0.2398,1.8756,-3.9705;0.7097,2.6308,-4.7219;-1.6376,0.4591,-3.242;-2.2064,0.7845,-4.1073;-2.2041,-0.4378,-2.3446;-3.2074,-0.8011,-2.5383;-4.4716,-1.8938,-0.2719;-4.5585,-1.0319,-0.8036;-4.6186,-1.5701,1.1152;-3.7375,-1.1587,1.4487;-4.8046,-2.8493,1.7892;-4.7085,-2.6973,2.8729;-5.8191,-3.2163,1.5867)|\",5.08852900435\r\n\"[H]/C(=C(/C#N)C(=O)C([H])([H])C([H])(C([H])([H])[H])C([H])([H])[H])N(C([H])([H])[H])C([H])([H])[H] |(7.0547,-0.4543,-1.8338;7.2804,0.4903,-1.3435;6.1982,0.9936,-0.6457;6.1845,2.2083,0.0899;6.1438,3.2017,0.7024;4.9603,0.1613,-0.7008;4.9334,-0.8908,-1.3322;3.7326,0.6884,0.0328;4.0247,1.0028,1.045;3.4336,1.6189,-0.4744;2.5504,-0.2917,0.0919;2.3977,-0.6832,-0.9218;2.8471,-1.4817,1.0165;2.0015,-2.1795,1.036;3.0208,-1.1431,2.047;3.7304,-2.0318,0.6801;1.2742,0.4406,0.5311;0.4192,-0.2452,0.5585;1.024,1.2611,-0.1525;1.388,0.8679,1.536;8.5349,0.9224,-1.5329;9.1049,2.1561,-0.999;9.9831,1.92,-0.3857;8.3824,2.6922,-0.3895;9.4249,2.8033,-1.8251;9.4558,0.1435,-2.3551;10.3414,-0.1377,-1.7718;9.7858,0.7322,-3.2205;8.9653,-0.7635,-2.7118)|\",4.960635494615\r\n\"[H]C#CC([H])([H])/N=C(/O[H])[C@]([H])(N([H])[H])C([H])([H])C(=O)OC([H])([H])[H] 
|(9.0458,-4.3164,0.3566;8.554,-3.8857,-0.4859;7.9867,-3.4032,-1.4374;7.2914,-2.816,-2.5934;7.4418,-3.4739,-3.458;7.7847,-1.8658,-2.8566;5.8435,-2.6725,-2.4126;5.3989,-1.7758,-1.6277;4.0662,-1.6406,-1.4797;3.9649,-1.0082,-0.7253;6.1843,-0.7315,-0.8085;7.1679,-1.1416,-0.552;5.3773,-0.3992,0.3795;5.7347,0.4651,0.7866;5.4837,-1.1314,1.0807;6.3674,0.5328,-1.6772;5.3766,0.9638,-1.8751;6.8038,0.278,-2.6455;7.2135,1.602,-1.0115;7.1916,1.8754,0.1747;7.9909,2.2397,-1.9024;8.7983,3.3101,-1.3747;9.4734,2.9337,-0.6025;8.1639,4.0902,-0.9463;9.3606,3.6945,-2.2254)|\",6.103513666715\r\n\"[H]C([H])=C=C(C([H])([H])[H])[C@@]1([H])N(C(=O)OC(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C([H])([H])C([H])([H])C1([H])[H] |(2.3465,-1.2369,-3.3757;3.3198,-0.8635,-3.0598;4.1119,-0.8845,-3.8072;3.5251,-0.4206,-1.8465;3.7067,0.0264,-0.6277;3.4735,1.4777,-0.2616;3.1601,2.0622,-1.1301;4.3812,1.938,0.1511;2.6942,1.5565,0.5084;4.1695,-0.8852,0.5127;3.5923,-0.6178,1.4048;4.0098,-2.3149,0.239;2.8434,-3.0168,0.3137;2.768,-4.2189,0.0992;1.8135,-2.2004,0.6615;0.4457,-2.7231,0.7802;-0.0256,-3.2638,-0.5741;-1.0851,-3.5383,-0.5162;0.552,-4.142,-0.8655;0.0853,-2.4941,-1.3461;0.3765,-3.7812,1.8869;-0.664,-4.0925,2.0351;0.7451,-3.3665,2.8321;0.9741,-4.6558,1.6272;-0.3516,-1.4759,1.1731;-1.411,-1.7258,1.2944;-0.262,-0.7026,0.4031;0.0179,-1.065,2.1184;5.2688,-3.0093,-0.0505;5.5012,-3.7209,0.753;5.1899,-3.579,-0.9809;6.2911,-1.8651,-0.1271;7.2932,-2.1768,0.1813;6.3565,-1.4922,-1.155;5.6913,-0.7798,0.7822;6.0887,0.2192,0.5792;5.8907,-1.0094,1.8362)|\",6.90080724868\r\n\"[H]C([H])=C([H])C(C([H])([H])[H])(C([H])([H])[H])[C@@]1([H])N(C(=O)OC(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C([H])([H])C([H])([H])C1([H])[H] 
|(1.715,4.8156,0.1107;2.0964,4.0638,-0.5754;2.1464,4.3517,-1.6219;2.4664,2.8581,-0.1399;2.375,2.6464,0.9281;2.9839,1.6881,-0.9589;1.9607,0.5365,-0.8052;0.9837,0.8581,-1.183;1.8487,0.2295,0.2382;2.2761,-0.3437,-1.3741;3.1044,2.0298,-2.4522;2.116,2.2398,-2.8758;3.5252,1.1826,-3.0037;3.7322,2.908,-2.635;4.36,1.268,-0.3218;4.1619,1.0642,0.7351;4.959,0.0476,-0.9017;4.7598,-1.1647,-0.2916;3.9639,-1.3626,0.6166;5.566,-2.1047,-0.8507;5.5438,-3.4978,-0.381;5.9691,-3.5679,1.0899;6.054,-4.6162,1.398;5.2422,-3.0688,1.7321;6.9479,-3.0942,1.2255;4.1584,-4.1102,-0.6152;4.1817,-5.1783,-0.3704;3.8717,-4.0093,-1.6679;3.4052,-3.6224,0.005;6.5893,-4.1667,-1.2781;6.6755,-5.2292,-1.0274;7.5704,-3.6989,-1.1457;6.3061,-4.0806,-2.3323;6.1735,0.3171,-1.6979;7.0399,-0.152,-1.2186;6.088,-0.1102,-2.7024;6.2936,1.85,-1.7128;7.3395,2.1705,-1.6765;5.861,2.2591,-2.6298;5.4944,2.3066,-0.4785;5.1134,3.3273,-0.5677;6.1336,2.2719,0.4117)|\",6.887201556155\r\n\"[H]C(=C(C([H])([H])[H])C([H])([H])[H])C([H])([H])[C@@]1([H])N(C(=O)OC(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C([H])([H])C([H])([H])C1([H])[H] 
|(2.022,1.1347,-1.6709;2.7996,1.1708,-0.9062;2.5781,1.9646,0.155;1.3027,2.7671,0.2693;0.7454,2.4952,1.1769;0.6426,2.6145,-0.5907;1.5162,3.8427,0.3482;3.5189,2.1313,1.3242;3.7785,3.1897,1.4667;4.4493,1.5697,1.2159;3.0378,1.8045,2.2568;3.9835,0.2867,-1.1868;3.6464,-0.7534,-1.2836;4.7037,0.3069,-0.3609;4.765,0.6135,-2.4837;5.5712,-0.1252,-2.5558;3.9341,0.5133,-3.7008;3.6247,-0.711,-4.2155;3.8713,-1.7698,-3.6503;3.0124,-0.578,-5.4227;2.5418,-1.7581,-6.1606;3.7279,-2.6562,-6.5289;3.3883,-3.4714,-7.1786;4.1854,-3.0828,-5.6353;4.484,-2.0809,-7.0752;1.4795,-2.504,-5.3446;1.0557,-3.3177,-5.9446;0.6639,-1.825,-5.0712;1.9097,-2.9246,-4.4349;1.9209,-1.1403,-7.4169;1.525,-1.9269,-8.0682;2.6685,-0.57,-7.9779;1.1006,-0.4652,-7.1516;3.8119,1.7789,-4.4352;4.4442,1.767,-5.3344;2.7807,1.941,-4.7616;4.2955,2.8227,-3.4198;4.7225,3.7039,-3.9086;3.4611,3.1531,-2.7936;5.3197,2.0528,-2.5658;5.4579,2.495,-1.5747;6.2958,2.0434,-3.0656)|\",6.843663340075\r\n\"[H]OP(O[H])OC([H])([H])[C@@]1([H])OC(C([H])([H])[H])(C([H])([H])[H])OC1([H])[H] |(4.4979,-1.6267,-5.8852;4.3976,-0.8815,-6.5051;3.3017,0.2191,-5.9512;1.8438,-0.378,-6.4522;1.7294,-1.3317,-6.2817;3.3552,-0.3554,-4.3749;2.5692,0.3275,-3.3882;2.6299,1.4149,-3.5246;1.5153,0.0316,-3.4647;3.1136,-0.0548,-2.0209;4.1668,0.2551,-1.9482;2.3267,0.5974,-1.0286;2.2288,-0.2583,0.1254;0.7483,-0.4921,0.421;0.6333,-1.1277,1.3045;0.2659,-0.9824,-0.4302;0.2412,0.4608,0.6028;2.9866,0.3552,1.2973;2.9682,-0.3214,2.1577;2.5316,1.3071,1.5877;4.0279,0.5315,1.0133;2.8776,-1.4836,-0.2286;2.9663,-1.5308,-1.6453;3.8256,-2.1474,-1.9181;2.057,-1.9619,-2.0947)|\",7.507621135294999\r\n\"[H]OC1=N[C@]([H])(C(=C([H])[H])C([H])([H])C([H])([H])[H])[C@@]([H])(C([H])(C([H])([H])[H])C([H])([H])[H])O1 
|(7.914,0.4354,2.4436;8.241,-0.0637,1.6755;7.1519,-0.4946,1.0291;5.9413,-0.2732,1.3597;5.1216,-0.9776,0.3521;4.5111,-0.2267,-0.1689;4.1668,-1.9756,1.0071;4.5386,-3.2163,1.3364;3.8497,-3.9003,1.8261;5.5393,-3.5999,1.155;2.7792,-1.4546,1.3132;2.3435,-1.029,0.3961;2.1333,-2.2909,1.6063;2.7566,-0.3809,2.4168;1.7364,-0.0105,2.5708;3.3993,0.4682,2.1659;3.1187,-0.7959,3.3635;6.1634,-1.5776,-0.656;6.128,-2.6709,-0.6879;6.0791,-1.0235,-2.0826;6.0784,0.0743,-2.0044;7.2915,-1.4489,-2.923;7.232,-1.0185,-3.929;7.3302,-2.5409,-3.0302;8.2275,-1.1217,-2.4624;4.7667,-1.4687,-2.7473;4.6881,-1.0463,-3.7548;3.883,-1.1492,-2.1839;4.7262,-2.5614,-2.841;7.4496,-1.2252,-0.0617)|\",6.751144630905\r\n\"[H]C([H])=C1N2C(=O)O[C@]([H])(C([H])(C([H])([H])[H])C([H])([H])[H])[C@@]12[H] |(4.9044,-4.5471,1.8167;4.1274,-4.1671,2.4718;4.1018,-4.5182,3.4984;3.2356,-3.3022,2.0245;2.0827,-2.5703,2.4383;2.4262,-1.3656,3.1537;2.1836,-1.1411,4.3029;3.0642,-0.4824,2.3303;3.2083,-0.971,0.9777;4.2809,-0.9539,0.7498;2.4504,-0.0651,-0.004;1.3778,-0.1749,0.2143;2.8394,1.4072,0.1929;2.2725,2.0454,-0.4935;3.9066,1.5609,-0.0127;2.6412,1.738,1.2154;2.7091,-0.5122,-1.4518;2.1519,0.1227,-2.1485;2.404,-1.5483,-1.6373;3.7737,-0.4256,-1.7041;2.6574,-2.4011,1.0546;2.0314,-2.7601,0.2415)|\",6.930739772235001\r\n\"[H]C([H])=C([H])C([H])([H])C1([H])O[C@]([H])(C([H])([H])[H])[C@@]([H])(C([H])([H])[H])O1 
|(2.1707,-1.5802,3.2826;2.3759,-1.1447,2.3082;2.1471,-0.0869,2.1917;2.8742,-1.87,1.3066;3.0922,-2.9261,1.4614;3.1825,-1.3409,-0.0671;2.9426,-0.2729,-0.1336;2.5869,-1.8655,-0.8265;4.6482,-1.5286,-0.4516;5.3081,-1.0167,0.2697;5.005,-2.9122,-0.4709;5.6373,-3.2045,-1.725;4.9127,-3.7189,-2.3761;6.852,-4.0911,-1.5007;7.3316,-4.3378,-2.4555;6.5583,-5.0278,-1.0168;7.5839,-3.5889,-0.8587;5.9305,-1.809,-2.3018;6.8949,-1.4483,-1.9028;5.9138,-1.6944,-3.815;6.7146,-2.2989,-4.2555;6.0634,-0.6547,-4.1219;4.9535,-2.0371,-4.2145;4.8665,-1.0281,-1.7533)|\",7.25999753134\r\n\"[H]/N=C(\\O[H])O[C@]1([H])C([H])([H])C([H])([H])[C@@]23O[C@@](C([H])([H])[H])(C([H])=C2[H])C([H])([H])C(=O)[C@]13[H] |(4.1372,6.7813,0.8618;4.0692,5.9663,0.2533;3.7655,4.9058,0.8902;3.5093,4.7326,2.2145;3.5872,5.5926,2.655;3.6402,3.708,0.3085;3.9229,3.6466,-1.1097;4.8132,4.2507,-1.2867;2.709,4.1186,-1.9583;1.8928,4.3678,-1.2752;2.9458,5.0193,-2.5297;2.3071,2.9228,-2.8545;2.8261,2.9617,-3.8204;1.2321,2.8752,-3.0508;2.7873,1.7041,-2.0764;1.9077,1.4832,-0.9607;2.2959,0.1951,-0.4469;1.1465,-0.38,0.3669;1.4113,-1.3646,0.7681;0.2561,-0.4841,-0.2608;0.906,0.281,1.2059;2.638,-0.5663,-1.7276;2.7134,-1.6471,-1.7856;2.8957,0.3202,-2.6896;3.2356,0.122,-3.7005;3.5749,0.4409,0.3982;3.2941,1.0362,1.2773;4.0364,-0.4869,0.7524;4.6349,1.2405,-0.3623;5.8169,1.1395,-0.1009;4.1517,2.1627,-1.4897;4.9267,2.137,-2.2651)|\",6.087186835685\r\n\"[H]OC1=N[C@]([H])(C(=C([H])[H])C([H])=C([H])[H])[C@@]([H])(C([H])(C([H])([H])[H])C([H])([H])[H])O1 
|(5.4992,-1.8125,2.5276;6.2309,-1.7571,1.8891;5.691,-1.2991,0.7545;4.4638,-1.01,0.5676;4.3767,-0.5318,-0.8234;4.1248,0.5336,-0.7999;3.3134,-1.278,-1.6265;3.3928,-2.6108,-1.7796;2.6461,-3.1636,-2.3433;4.1885,-3.1955,-1.3268;2.1876,-0.552,-2.2277;1.5373,-1.173,-2.8445;1.868,0.7436,-2.0924;0.991,1.1504,-2.5869;2.437,1.4394,-1.4834;5.8373,-0.6947,-1.3913;5.887,-1.5008,-2.1323;6.4779,0.5744,-1.9587;6.3605,1.3625,-1.1996;7.9779,0.3772,-2.2221;8.424,1.3016,-2.6065;8.1447,-0.4087,-2.9702;8.508,0.0922,-1.3094;5.7472,1.0108,-3.2387;6.1885,1.9329,-3.6326;4.6806,1.1943,-3.0718;5.8346,0.2428,-4.0178;6.5957,-1.1562,-0.232)|\",5.401459932425\r\n\"[H]OC1=N[C@]([H])(C(=C([H])[H])C2=C([H])C([H])=C([H])C([H])=C2[H])[C@@]([H])(C([H])(C([H])([H])[H])C([H])([H])[H])O1 |(3.7934,-5.1292,0.4344;3.2151,-4.5288,-0.0668;3.5445,-3.2924,0.3208;4.4547,-2.9801,1.157;4.3927,-1.5082,1.2752;5.372,-1.1027,0.9941;4.116,-1.0878,2.7218;2.8726,-0.9404,3.2001;2.6918,-0.7107,4.2459;1.9893,-1.0783,2.584;5.3064,-0.902,3.5956;6.4367,-1.7276,3.4582;6.4253,-2.5245,2.7207;7.546,-1.559,4.2866;8.4045,-2.2162,4.1732;7.5559,-0.5584,5.2598;8.4238,-0.4254,5.9003;6.4442,0.2748,5.3997;6.4455,1.0661,6.1451;5.3337,0.1063,4.5745;4.4845,0.7774,4.6694;3.3313,-1.0724,0.209;2.488,-0.5304,0.6468;3.8977,-0.2659,-0.966;4.7677,-0.8188,-1.3513;2.8702,-0.1279,-2.0984;3.2991,0.4277,-2.9401;1.9818,0.4191,-1.7566;2.5451,-1.1065,-2.4617;4.3686,1.1145,-0.4803;4.8051,1.6811,-1.3099;5.1253,1.0464,0.309;3.5267,1.698,-0.0864;2.7957,-2.3437,-0.2708)|\",5.510305472625\r\n\"[H]OC1=N[C@]([H])(C(=C([H])[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])[C@@]([H])(C([H])(C([H])([H])[H])C([H])([H])[H])O1 
|(11.2004,1.0908,-0.5578;11.4161,0.4809,-1.2843;10.3362,-0.2906,-1.4458;9.2462,-0.2091,-0.7892;8.3849,-1.2799,-1.3075;7.4507,-0.8358,-1.6802;8.01,-2.3118,-0.2463;8.7262,-2.4474,0.8731;8.4512,-3.1735,1.6345;9.5922,-1.8253,1.0736;6.7704,-3.151,-0.4964;6.4421,-3.5809,0.4587;5.9567,-2.4938,-0.835;6.9444,-4.3081,-1.5041;7.1859,-3.913,-2.5004;7.8073,-4.9108,-1.192;5.7102,-5.2188,-1.623;5.9804,-6.0924,-2.2309;5.453,-5.6087,-0.6279;4.4792,-4.5445,-2.2407;3.6514,-5.2553,-2.3422;4.7044,-4.1527,-3.2409;4.1193,-3.7087,-1.6302;9.1958,-1.8717,-2.5254;9.3928,-2.9389,-2.3955;8.6008,-1.6155,-3.9183;7.5896,-2.0504,-3.8896;8.4705,-0.1205,-4.2488;7.9669,0.0106,-5.2128;9.458,0.3475,-4.3223;7.8931,0.4255,-3.495;9.4082,-2.3488,-5.0005;8.9604,-2.201,-5.9896;9.4498,-3.4278,-4.808;10.4371,-1.9737,-5.0357;10.4937,-1.2101,-2.4155)|\",7.02597961991\r\n\"[H]OC1=N[C@]([H])(C(=C([H])[H])C([H])(C([H])([H])[H])C([H])([H])[H])[C@@]([H])(C([H])(C([H])([H])[H])C([H])([H])[H])O1 |(5.7109,4.6007,3.0247;6.2805,3.8538,3.2773;5.7196,2.7673,2.737;4.6769,2.7386,2.004;4.4327,1.3172,1.727;4.335,1.1803,0.6424;3.1729,0.7774,2.4014;2.5526,1.4714,3.3587;1.6714,1.093,3.8673;2.9019,2.4562,3.6488;2.6897,-0.5765,1.8924;3.5807,-1.1815,1.6659;1.8551,-1.3684,2.9091;1.6332,-2.3688,2.5202;0.896,-0.8775,3.1109;2.3831,-1.4807,3.8621;1.9081,-0.4137,0.57;1.6396,-1.3931,0.1562;2.4876,0.1222,-0.19;0.9844,0.151,0.7404;5.7478,0.6007,2.2222;5.5413,-0.2378,2.8953;6.7286,0.1701,1.1198;6.8737,1.0416,0.4639;8.0906,-0.2239,1.7114;8.7939,-0.4864,0.9132;7.9933,-1.0978,2.3693;8.5218,0.5923,2.2962;6.1554,-0.9803,0.2783;6.8623,-1.2594,-0.5107;5.2088,-0.7196,-0.2066;5.9838,-1.8715,0.896;6.3789,1.6335,3.0397)|\",7.175642237685\r\n\"[H]O[C@@]1([H])C([H])=C([H])[C@]([H])(C2=C(F)C([H])=C([H])C([H])=C2[H])C1([H])[H] 
|(3.9686,4.5617,1.5148;3.6974,4.1263,2.3396;2.4922,3.4144,2.0562;2.1655,3.0484,3.0354;1.4222,4.2658,1.4138;1.1338,5.2266,1.8312;0.8958,3.6974,0.3284;0.1003,4.1156,-0.2832;1.4924,2.3307,0.0267;1.8745,2.3108,-1.003;0.4673,1.2086,0.1094;-0.1475,0.834,1.3081;0.2044,1.4738,2.4511;-1.0976,-0.1743,1.3958;-1.5272,-0.4101,2.3641;-1.4711,-0.8526,0.2346;-2.2136,-1.6436,0.2879;-0.884,-0.5122,-0.9842;-1.1662,-1.0357,-1.8931;0.0701,0.5057,-1.036;0.5264,0.7659,-1.9885;2.6731,2.2444,1.0463;3.6239,2.3928,0.5213;2.7298,1.2722,1.5412)|\",6.12800391326\r\n\"[H]O[C@@]1([H])C([H])=C([H])[C@]([H])(C2=C([H])C([H])=C(SC([H])([H])[H])C([H])=C2[H])C1([H])[H] |(5.2626,0.7146,-8.5738;5.639,1.5963,-8.4189;5.0682,2.0847,-7.1976;5.3696,3.137,-7.1733;3.5682,1.8892,-7.1706;2.9035,2.4172,-7.8487;3.206,0.9309,-6.3125;2.1922,0.5665,-6.1661;4.3804,0.35,-5.542;4.5925,-0.659,-5.9275;4.1509,0.2167,-4.0466;3.5878,1.2644,-3.3006;3.2901,2.1803,-3.8057;3.397,1.1531,-1.9286;2.9574,1.9803,-1.3769;3.7658,-0.0205,-1.249;3.4697,-0.0519,0.5105;4.0945,-1.6861,1.0198;3.9389,-1.7376,2.1003;5.1628,-1.7916,0.8119;3.5371,-2.498,0.5445;4.325,-1.0724,-1.9819;4.6199,-1.9956,-1.4951;4.5092,-0.9464,-3.3613;4.9441,-1.7787,-3.9111;5.536,1.3268,-5.9285;6.4901,0.8203,-6.1021;5.6852,2.0405,-5.1111)|\",5.417786763455\r\n\"[H]O[C@@]1([H])C([H])=C([H])[C@]([H])(C2=C([H])C([H])=C(OC([H])([H])[H])C(C([H])([H])[H])=C2[H])C1([H])[H] 
|(7.6785,4.4467,-1.1305;8.1541,3.9671,-1.8283;7.9099,2.57,-1.6127;8.6051,2.0775,-2.3005;8.125,2.1881,-0.1646;9.104,2.2266,0.305;6.9711,1.926,0.4568;6.8631,1.6953,1.5138;5.7672,2.0049,-0.4673;5.2041,2.9217,-0.2346;4.7959,0.8428,-0.3436;5.2437,-0.4793,-0.2979;6.3106,-0.6881,-0.3282;4.3445,-1.5444,-0.2078;4.7252,-2.5591,-0.1723;2.9702,-1.2939,-0.1628;2.0105,-2.2673,-0.0759;2.4271,-3.6211,-0.0309;1.5129,-4.2148,0.0305;3.0503,-3.8237,0.8505;2.9824,-3.9041,-0.9352;2.4842,0.0288,-0.2029;1.0003,0.2894,-0.1474;0.7916,1.363,-0.1843;0.5572,-0.1168,0.7698;0.4786,-0.192,-0.9833;3.4123,1.0654,-0.2912;3.0407,2.0885,-0.3216;6.4448,2.1333,-1.8685;5.9354,2.8362,-2.5345;6.4359,1.1517,-2.3545)|\",5.69262175246\r\n\"[H]O[C@@]1([H])C([H])=C([H])[C@@]([H])(C2=C([H])C(OC([H])([H])[H])=C([H])C(F)=C2[H])C1([H])[H] |(0.5198,5.4452,0.0609;0.6791,6.2742,-0.4195;2.0002,6.1999,-0.949;2.1477,7.1819,-1.4171;3.0752,5.9354,0.0813;3.1093,6.4761,1.0232;3.9592,5.0253,-0.3345;4.8337,4.693,0.2185;3.6601,4.5204,-1.7349;4.3424,5.017,-2.4408;3.8156,3.0218,-1.9296;3.3669,2.1312,-0.9533;2.9269,2.4942,-0.0296;3.4801,0.7457,-1.1378;3.0119,-0.0273,-0.1172;3.1085,-1.4379,-0.2416;2.6867,-1.8485,0.6774;4.1526,-1.7633,-0.3388;2.5337,-1.8072,-1.1011;4.0456,0.2354,-2.3122;4.1581,-0.824,-2.5029;4.4844,1.147,-3.2676;5.034,0.6582,-4.4015;4.3874,2.5202,-3.1086;4.7531,3.179,-3.8895;2.2121,5.0518,-1.9718;1.4947,4.2549,-1.7411;2.0341,5.3583,-3.0066)|\",5.972899018475\r\n\"[H]O[C@@]1([H])C([H])=C([H])[C@]([H])(C2=C([H])C([H])=C(F)C(C([H])([H])[H])=C2[H])C1([H])[H] 
|(6.3904,5.3264,-0.7925;5.6976,5.7665,-0.2733;4.5367,4.9275,-0.3311;3.7446,5.55,0.0976;4.2477,4.4789,-1.747;3.9453,5.1752,-2.5243;4.5213,3.1828,-1.9241;4.4583,2.6486,-2.8689;4.9729,2.4929,-0.6471;6.0551,2.3032,-0.712;4.3033,1.1551,-0.3786;5.0554,0.0431,0.013;6.1349,0.1289,0.111;4.4383,-1.18,0.2834;5.0071,-2.053,0.5868;3.0618,-1.2744,0.1525;2.4581,-2.46,0.4113;2.2611,-0.199,-0.2409;0.7688,-0.3688,-0.3714;0.298,0.5607,-0.7041;0.3154,-0.6587,0.584;0.5182,-1.1568,-1.0913;2.9127,1.0092,-0.5011;2.3167,1.8642,-0.8141;4.6829,3.5848,0.4301;5.4576,3.6505,1.2;3.739,3.3456,0.9318)|\",6.032764065585\r\n\"[H]O[C@@]1([H])C([H])=C([H])[C@]([H])(C2=C([H])C(Br)=C([H])C([H])=C2[H])C1([H])[H] |(-3.9159,-4.6571,1.3471;-3.879,-4.0636,2.1154;-2.6797,-3.2983,1.9911;-2.6297,-2.7463,2.9371;-1.4471,-4.1469,1.7819;-1.2065,-4.9745,2.4437;-0.7357,-3.7696,0.7174;0.1872,-4.2342,0.3796;-1.3135,-2.5565,0.0134;-1.4965,-2.7861,-1.0451;-0.3886,-1.344,0.0429;0.2597,-0.9603,1.2258;0.1334,-1.5474,2.1295;1.0738,0.1687,1.2328;1.9569,0.672,2.8662;1.2715,0.9399,0.088;1.9128,1.8139,0.114;0.628,0.5536,-1.0878;0.7712,1.1383,-1.9924;-0.1907,-0.5762,-1.1106;-0.6837,-0.8653,-2.0355;-2.6696,-2.3369,0.7652;-3.5121,-2.5994,0.1155;-2.8075,-1.2929,1.059)|\",6.1143982207350005\r\n\"[H]O[C@@]1([H])C([H])=C([H])[C@]([H])(C2=C([H])C(C([H])([H])[H])=C([H])C([H])=C2[H])C1([H])[H] 
|(5.9606,-5.873,0.0139;5.0093,-5.9047,-0.1801;4.4855,-4.6068,0.1085;3.4025,-4.7382,-0.0047;4.8375,-4.1121,1.4925;4.6582,-4.7192,2.3758;5.3772,-2.8915,1.4795;5.6984,-2.3365,2.3575;5.4593,-2.2894,0.0894;6.4947,-1.996,-0.1322;4.6019,-1.0381,-0.0721;3.2594,-1.0233,0.326;2.8342,-1.914,0.7851;2.4504,0.1073,0.159;1.0106,0.0953,0.6205;0.4518,0.9433,0.2116;0.942,0.1532,1.7148;0.4983,-0.8244,0.3151;3.0136,1.2499,-0.423;2.4017,2.1381,-0.5633;4.3507,1.2563,-0.8203;4.7779,2.1491,-1.2699;5.1408,0.1206,-0.6446;6.1824,0.1299,-0.9584;5.0272,-3.4842,-0.8247;5.8927,-3.8811,-1.3678;4.2926,-3.1785,-1.5745)|\",6.272224254025001\r\n\"[H]O[C@@]1([H])C([H])=C([H])[C@]([H])(C2=C([H])C([H])=C([H])C(OC([H])([H])[H])=C2[H])C1([H])[H] |(6.0979,4.407,4.8429;5.2181,4.8177,4.8247;4.4672,4.1466,3.8037;3.5695,4.7636,3.6934;5.2661,4.0144,2.5255;5.5491,4.8761,1.9275;5.6557,2.7541,2.3112;6.2927,2.4211,1.4957;5.134,1.7871,3.361;5.9656,1.5006,4.0224;4.545,0.5004,2.8059;4.8554,-0.7368,3.3902;5.5382,-0.778,4.2352;4.2922,-1.9081,2.8907;4.5368,-2.8634,3.3481;3.4148,-1.8797,1.8037;2.9899,-2.804,1.4298;3.1037,-0.6474,1.216;2.2638,-0.4874,0.1494;1.6596,-1.6428,-0.4076;1.0386,-1.2883,-1.2324;1.0265,-2.1618,0.3247;2.4094,-2.3454,-0.7953;3.6692,0.5313,1.7193;3.4126,1.4704,1.2372;4.1124,2.6752,4.1383;4.1222,2.5029,5.2187;3.1036,2.4523,3.7749)|\",5.874938032295\r\n\"[H]O[C@]1([H])C([H])([H])C([H])=C([H])[C@]1([H])C1=C([H])C([H])=C([H])C([H])=C1F 
|(-0.6613,3.9394,0.2856;-0.0614,3.879,1.0472;1.2475,3.6689,0.5484;1.8574,3.4716,1.4342;1.7619,4.889,-0.262;1.2583,5.8051,0.0689;2.841,5.0442,-0.1081;1.4729,4.5042,-1.6944;1.4971,5.2127,-2.5176;1.2272,3.1971,-1.8137;1.0134,2.673,-2.7405;1.3397,2.4783,-0.478;2.3561,2.0614,-0.4012;0.3868,1.3277,-0.2328;0.8419,0.1113,0.2941;1.901,0.0083,0.5187;-0.0204,-0.9597,0.5325;0.3699,-1.8875,0.9406;-1.3791,-0.8371,0.2409;-2.0591,-1.6648,0.4202;-1.8679,0.3592,-0.2854;-2.9171,0.4972,-0.5253;-0.9815,1.4037,-0.5035;-1.4883,2.5599,-1.023)|\",6.21508034542\r\n\"[H]OC([H])([H])C([H])([H])C1=C([H])C([H])=C([H])C([H])=C1C([H])([H])[C@@]([H])(N([H])[H])C([H])(C([H])([H])[H])C([H])([H])[H] |(1.4114,-2.3644,1.227;0.5773,-2.8543,1.1573;-0.19,-2.2516,0.1239;-1.15,-2.7772,0.1214;-0.4025,-1.1931,0.345;0.4742,-2.3801,-1.2596;0.6646,-3.4444,-1.4395;1.4554,-1.8907,-1.2263;-0.3877,-1.8125,-2.3707;-1.3249,-2.657,-2.9825;-1.3692,-3.6995,-2.6735;-2.1928,-2.1975,-3.9717;-2.9063,-2.8774,-4.4294;-2.136,-0.8611,-4.3635;-2.8089,-0.4808,-5.1275;-1.2095,-0.0094,-3.7627;-1.1749,1.0361,-4.0613;-0.3251,-0.4579,-2.7691;0.6908,0.5115,-2.1958;0.8469,0.3118,-1.1301;0.2835,1.5288,-2.2568;2.0598,0.4842,-2.9421;2.4138,-0.5559,-2.9438;1.9761,0.8723,-4.357;1.5532,1.7971,-4.4362;1.3423,0.24,-4.844;3.165,1.323,-2.2482;4.035,1.2292,-2.9112;2.8315,2.82,-2.1395;3.6821,3.3692,-1.7191;1.9715,3.0018,-1.4834;2.6153,3.2644,-3.1178;3.5604,0.755,-0.8762;4.4492,1.267,-0.4895;3.7969,-0.3147,-0.9388;2.7654,0.8841,-0.131)|\",6.081744558675\r\n\"[H]C#CC([H])([H])O[C@@]([H])(C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])[C@]([H])(O[H])C([H])([H])O[H] 
|(9.781,-6.343,4.0078;8.9859,-5.7507,3.6146;8.0905,-5.0792,3.1644;6.9966,-4.2598,2.644;6.8236,-3.4123,3.3246;6.0683,-4.85,2.6263;7.3372,-3.813,1.3334;6.5536,-2.7241,0.8054;6.9066,-2.6637,-0.2307;5.0458,-2.9822,0.7856;4.6687,-3.0817,1.8142;4.58,-2.0751,0.3807;4.6311,-4.2045,-0.0434;4.9835,-4.0745,-1.0773;5.1469,-5.0939,0.3397;3.1128,-4.4311,-0.0517;2.7595,-4.5654,0.9821;2.6225,-3.5201,-0.4233;2.6496,-5.6276,-0.8995;1.5533,-5.6076,-0.9627;3.0152,-5.5051,-1.9289;3.0906,-6.994,-0.3599;2.6836,-7.8083,-0.9701;2.7382,-7.1421,0.6689;4.181,-7.1009,-0.3557;6.9152,-1.363,1.4603;6.3152,-1.1894,2.3628;6.5583,-0.319,0.5572;7.2714,-0.3024,-0.1062;8.402,-1.2668,1.8159;8.6642,-1.9397,2.6424;8.6171,-0.241,2.1317;9.2105,-1.5166,0.6679;9.2028,-2.481,0.546)|\",7.366121933035001\r\n\"[H]C([H])=C1N2C(=O)O[C@]([H])(C([H])([H])C([H])([H])C3=C([H])C([H])=C(C([H])([H])[H])O3)[C@@]12[H] |(11.8195,3.872,-1.7282;11.9686,3.1346,-0.9465;12.8502,3.2199,-0.3193;11.0992,2.1564,-0.7719;10.8596,1.002,0.032;10.2638,1.3377,1.3037;10.7729,1.1864,2.3749;9.0093,1.8489,1.129;8.6105,1.8929,-0.2609;8.3399,2.931,-0.4774;7.4187,0.9681,-0.4881;7.1762,0.9803,-1.5586;7.7128,-0.0577,-0.2318;6.1695,1.3633,0.3297;6.4296,1.3981,1.3922;5.4159,0.5743,0.2086;5.5827,2.6803,-0.0564;5.4815,3.8901,0.5633;5.8399,4.1286,1.5555;4.8062,4.7664,-0.3512;4.5486,5.8045,-0.1916;4.5418,4.0363,-1.471;3.8661,4.3382,-2.7629;3.5469,5.3839,-2.7786;2.9799,3.7082,-2.9116;4.5327,4.1711,-3.6183;5.0134,2.7577,-1.3058;9.8665,1.4431,-1.0137;9.7437,0.7758,-1.8644)|\",5.80690956967\r\n\"[H]C([H])=C1N2C(=O)O[C@]([H])(C([H])([H])C3=C([H])C([H])=C([H])C([H])=C3[H])[C@@]12[H] 
|(-0.6144,0.332,0.9057;0.3449,-0.1326,1.1085;0.3545,-1.1242,1.5494;1.4696,0.5012,0.8316;2.8875,0.3487,0.9184;3.4047,-0.4397,-0.1747;3.9299,-1.5095,-0.0788;3.226,0.2091,-1.3657;2.6289,1.5116,-1.2053;1.7438,1.5431,-1.8481;3.6388,2.5886,-1.6321;4.5037,2.5413,-0.9599;3.9962,2.3198,-2.6329;3.0413,3.9803,-1.6349;2.1426,4.3664,-2.6404;1.8979,3.6684,-3.4386;1.5688,5.6374,-2.6372;0.8782,5.9199,-3.4272;1.8871,6.5475,-1.6265;1.4437,7.5394,-1.6254;2.7836,6.1783,-0.6235;3.0437,6.8822,0.1624;3.3553,4.9042,-0.6296;4.0629,4.6279,0.1492;2.2683,1.5703,0.2812;2.4131,2.5103,0.8078)|\",6.43277142582\r\n\"[H]C([H])([H])C([H])([H])C([H])([H])[C@]1([H])ON2C(=O)C([H])([H])C([H])([H])C([H])([H])[C@]2([H])C1([H])[H] |(0.5335,-0.6085,1.1424;0.9689,-0.2626,0.1983;0.4016,-0.7246,-0.6196;0.8125,0.8215,0.1388;2.4578,-0.6127,0.1144;2.5854,-1.7003,0.2161;2.9966,-0.1566,0.9528;3.1071,-0.1493,-1.1947;2.9726,0.9349,-1.3156;2.6081,-0.6246,-2.053;4.6007,-0.4607,-1.2781;4.7764,-1.5277,-1.1032;5.3289,0.1989,-0.2054;5.8465,1.3657,-0.8075;6.2333,2.406,-0.0029;5.9893,2.4745,1.1902;6.8926,3.5509,-0.7859;6.0991,4.2834,-0.9898;7.5899,4.0345,-0.0956;7.5866,3.1455,-2.097;7.8493,4.0403,-2.673;8.5308,2.6323,-1.8685;6.7004,2.2085,-2.9307;7.1989,1.9185,-3.8629;5.7634,2.7137,-3.2009;6.3861,0.9657,-2.1056;7.3089,0.3847,-1.9458;5.2705,0.0245,-2.5948;5.6652,-0.8005,-3.1943;4.5481,0.5755,-3.2074)|\",6.6395779522\r\n\"[H]O[C@]1([H])C([H])([H])C([H])=C([H])[C@]1([H])C1=C([H])C([H])=C(SC([H])([H])[H])C([H])=C1[H] 
|(5.7374,-4.915,0.7926;6.2743,-5.4754,1.3774;5.4147,-6.0049,2.3626;6.081,-6.4428,3.1134;4.4387,-7.1116,1.886;4.0831,-6.9048,0.8646;4.9181,-8.0972,1.8569;3.3188,-7.0041,2.8929;2.5728,-7.7811,3.0385;3.3555,-5.8441,3.5584;2.6395,-5.5314,4.3139;4.4979,-4.959,3.1071;5.0486,-4.5386,3.9582;4.0797,-3.7956,2.2148;2.9213,-3.8244,1.4225;2.2644,-4.6884,1.469;2.588,-2.7616,0.5883;1.6808,-2.8121,-0.0089;3.4092,-1.6255,0.5131;2.8747,-0.321,-0.5797;4.1966,0.9254,-0.4369;3.9098,1.7371,-1.1101;5.1626,0.5262,-0.758;4.2719,1.3204,0.5799;4.5715,-1.5857,1.2931;5.2316,-0.7255,1.2654;4.8947,-2.659,2.1249;5.8026,-2.6075,2.7212)|\",5.43139245598\r\n\"[H]O[C@]1([H])C([H])([H])C([H])=C([H])[C@]1([H])C1=C([H])C([H])=C(OC([H])([H])[H])C(C([H])([H])[H])=C1[H] |(4.7629,1.8345,1.9627;5.3995,2.5687,1.9833;5.1075,3.4075,0.887;5.9827,4.0583,0.7857;3.8568,4.3117,1.0413;3.0601,3.7849,1.5889;4.0769,5.2239,1.6083;3.4662,4.5612,-0.396;2.7884,5.3548,-0.7;4.0391,3.6836,-1.2277;3.8845,3.6456,-2.3029;4.9032,2.6789,-0.4963;5.8785,2.5507,-0.9829;4.2811,1.2932,-0.3522;5.1034,0.1795,-0.1534;6.1832,0.3062,-0.1365;4.5678,-1.0992,0.0208;5.237,-1.9397,0.1668;3.1826,-1.2807,0.0011;2.5605,-2.4897,0.1563;3.3651,-3.644,0.3321;2.6715,-4.4824,0.42;3.9718,-3.5787,1.2451;4.0272,-3.8097,-0.5281;2.322,-0.1796,-0.1893;0.8284,-0.3862,-0.2093;0.3066,0.5634,-0.3612;0.4735,-0.8289,0.7291;0.5312,-1.075,-1.0091;2.8919,1.0816,-0.3584;2.231,1.93,-0.5162)|\",5.774255907610001\r\n\"[H]O[C@]1([H])C([H])([H])C([H])=C([H])[C@]1([H])C1=C([H])C(F)=C([H])C(OC([H])([H])[H])=C1[H] 
|(6.2729,-1.1954,-0.2637;7.0591,-1.7509,-0.3961;7.2375,-1.9384,-1.7882;7.9186,-2.7901,-1.8593;7.8517,-0.6922,-2.4985;8.0394,0.1028,-1.7643;8.8267,-0.9276,-2.9483;6.8427,-0.3059,-3.5456;6.9669,0.5616,-4.1888;5.8118,-1.1518,-3.6031;4.9729,-1.0714,-4.2892;5.9033,-2.2727,-2.5919;6.041,-3.2331,-3.1075;4.6759,-2.4116,-1.7057;4.2456,-3.6822,-1.295;4.7552,-4.5855,-1.6131;3.1442,-3.7731,-0.4605;2.722,-4.9942,-0.0672;2.4383,-2.6638,-0.0015;1.5841,-2.8133,0.6461;2.8736,-1.4008,-0.4142;2.2797,-0.2301,-0.0503;1.1503,-0.2865,0.8087;0.846,0.7491,0.9698;1.4014,-0.7459,1.7735;0.3222,-0.8411,0.3486;3.9868,-1.2795,-1.2623;4.2787,-0.2862,-1.5897)|\",5.96473560296\r\n\"[H]O[C@]1([H])C([H])([H])C([H])=C([H])[C@]1([H])C1=C([H])C(C([H])([H])[H])=C(F)C([H])=C1[H] |(2.7109,3.1601,-1.3337;2.867,4.0035,-0.8772;4.2472,4.0859,-0.5982;4.3493,4.9511,0.0649;5.174,4.3089,-1.8207;4.8046,3.7534,-2.6969;5.2196,5.364,-2.1145;6.4877,3.7507,-1.3283;7.4405,3.9609,-1.807;6.326,2.9678,-0.2556;7.1224,2.4375,0.2598;4.8738,2.8488,0.1537;4.7432,2.9812,1.2352;4.2147,1.5258,-0.2265;3.0475,1.1243,0.4459;2.6568,1.7543,1.2419;2.3667,-0.0564,0.1341;1.1193,-0.4998,0.8549;0.8423,0.2187,1.6315;1.2593,-1.4791,1.3275;0.2735,-0.6002,0.1644;2.901,-0.8345,-0.8956;2.2725,-1.9894,-1.223;4.0445,-0.478,-1.593;4.4106,-1.1266,-2.3825;4.6989,0.7086,-1.2563;5.6,0.9927,-1.791)|\",6.119840497745\r\n\"[H]O[C@]1([H])C([H])([H])C([H])=C([H])[C@]1([H])C1=C([H])C(Br)=C([H])C([H])=C1[H] 
|(-0.1985,-3.5179,-2.2064;0.3702,-3.9719,-1.5629;-0.3901,-4.1705,-0.3838;0.2904,-4.678,0.3042;-1.6777,-4.9886,-0.6653;-1.539,-5.6428,-1.5345;-1.9296,-5.6399,0.1855;-2.7307,-3.9195,-0.8462;-3.7095,-4.1147,-1.2754;-2.3246,-2.7424,-0.3593;-2.9163,-1.8314,-0.3345;-0.9313,-2.8265,0.2377;-1.0271,-2.9849,1.3235;-0.0311,-1.6265,0.0272;0.9126,-1.2887,1.0066;0.9804,-1.8713,1.9202;1.7645,-0.2057,0.8061;3.0529,0.2444,2.1619;1.7042,0.5675,-0.351;2.3742,1.4093,-0.4851;0.7601,0.2342,-1.3223;0.6963,0.8294,-2.2293;-0.0979,-0.8493,-1.1368;-0.8366,-1.0848,-1.8988)|\",6.1797055448550005\r\n\"[H]C1=C([H])C([C@@]2([H])C([H])=C([H])[C@]([H])(OC(=O)OC([H])([H])C([H])([H])[H])C2([H])[H])=C([H])S1 |(11.5629,-3.109,3.1134;11.0262,-2.7992,2.2267;9.7455,-3.1021,1.8639;9.0906,-3.7298,2.4593;9.3564,-2.5175,0.6105;7.9894,-2.6862,-0.0233;8.0227,-2.1876,-1.0013;7.6045,-4.1419,-0.2052;8.2138,-4.8211,-0.7953;6.4907,-4.4788,0.4462;6.0339,-5.4635,0.4629;5.9034,-3.3203,1.2014;5.858,-3.4921,2.2811;4.5297,-3.1372,0.7497;3.7567,-2.3909,1.553;4.1042,-1.8611,2.5886;2.5309,-2.3395,1.0093;1.5553,-1.551,1.7321;0.5955,-1.971,1.4219;1.6931,-1.7137,2.8038;1.6608,-0.076,1.3779;0.8642,0.4851,1.8799;1.5593,0.0727,0.2979;2.623,0.3289,1.7041;6.8102,-2.1191,0.8364;6.2272,-1.3991,0.2549;7.1683,-1.6017,1.729;10.3745,-1.7867,0.0578;10.3646,-1.2388,-0.8755;11.8027,-1.7915,1.0451)|\",5.9484087719300005\r\n\"[H]C1=C([H])C([H])=C(/C([H])=C(\\[H])C([H])([H])OC([H])([H])OC([H])([H])C([H])([H])[H])C([H])=C1[H] 
|(1.8659,5.6288,5.3504;2.8584,5.4389,4.9508;3.0834,4.3387,4.1177;2.2641,3.6691,3.8692;4.3531,4.0903,3.6054;4.5077,3.2255,2.9667;5.4339,4.9394,3.9087;6.7935,4.7339,3.3869;7.537,5.4341,3.7705;7.2031,3.8047,2.5116;6.5033,3.0945,2.0744;8.6243,3.6445,2.0731;9.033,2.6871,2.4297;9.2559,4.4483,2.4843;8.6687,3.6681,0.6413;9.9412,3.3798,0.1184;9.8695,3.6154,-0.9526;10.7137,4.0007,0.5933;10.3585,2.0551,0.3178;9.6114,1.095,-0.4313;8.5481,1.1545,-0.164;9.6941,1.3248,-1.5065;10.1733,-0.2834,-0.1287;9.6217,-1.0504,-0.6834;10.0933,-0.5044,0.9407;11.2295,-0.3412,-0.4113;5.1907,6.0375,4.7521;6.0134,6.7038,5.002;3.9195,6.2875,5.2669;3.7591,7.1449,5.9153)|\",5.0966924198650005\r\n\"[H]O[C@]([H])(C1=C([H])C([H])=C(N(=O)=O)O1)[C@@]([H])(C([H])=C([H])[H])C([H])([H])C([H])([H])[H] |(3.7651,0.7723,-0.8678;4.5296,0.4009,-0.394;4.1939,-0.9453,-0.0827;4.9955,-1.2953,0.5809;4.2564,-1.7506,-1.3549;4.6622,-1.4083,-2.621;5.0086,-0.4332,-2.9277;4.5616,-2.5832,-3.4167;4.7873,-2.7123,-4.4642;4.1041,-3.5524,-2.5698;3.7983,-4.9255,-2.7974;3.993,-5.3396,-3.9453;3.3668,-5.5877,-1.8528;3.9157,-3.0699,-1.3151;2.8615,-1.013,0.7263;3.0708,-0.3649,1.5875;1.7124,-0.3765,-0.0442;1.3927,0.6081,0.2978;1.0821,-0.9117,-1.0939;0.2747,-0.3847,-1.5955;1.3264,-1.8964,-1.4846;2.5419,-2.4176,1.2858;2.2478,-3.0931,0.4775;3.4612,-2.8398,1.7148;1.4484,-2.3906,2.3592;1.2621,-3.3977,2.7473;0.505,-2.0075,1.9553;1.7348,-1.754,3.2058)|\",4.563349272885\r\n\"[H]O[C@]([H])(C1=C([H])C([H])=C(N(=O)=O)S1)[C@@]([H])(C([H])=C([H])[H])C([H])([H])[H] 
|(4.9454,-0.5778,2.626;5.2225,-0.6009,1.6919;4.2514,0.1311,0.9628;4.3069,-0.2426,-0.0675;4.619,1.6071,0.9592;5.5788,2.2023,1.7493;6.169,1.6414,2.4626;5.7404,3.5885,1.5012;6.4339,4.2457,2.0103;4.9025,4.02,0.5064;4.7945,5.3536,-0.0039;5.5275,6.2131,0.4897;3.9717,5.5485,-0.9059;3.8991,2.7566,-0.1415;2.8244,-0.214,1.505;2.7961,-1.3108,1.4669;2.6939,0.21,2.9467;2.5969,1.2819,3.1246;2.7211,-0.6253,3.9896;2.6357,-0.2676,5.0123;2.8011,-1.703,3.8572;1.6734,0.3231,0.6404;0.7256,-0.1149,0.9698;1.8141,0.0614,-0.4152;1.5745,1.4108,0.7115)|\",4.48443625624\r\n\"[H]O[C@]([H])(C1=C([H])C([H])=C(N(=O)=O)O1)[C@@]([H])(C([H])=C([H])[H])C([H])([H])[H] |(5.4985,0.7986,1.6419;5.0505,0.0348,1.2411;4.3071,0.5161,0.1217;3.8708,-0.3791,-0.3354;5.2237,1.1323,-0.8933;5.6089,0.794,-2.1645;5.2626,-0.0617,-2.7267;6.5403,1.7823,-2.5907;7.0621,1.8598,-3.5322;6.6538,2.6446,-1.5375;7.4433,3.821,-1.3761;8.1416,4.14,-2.3438;7.3729,4.4229,-0.3043;5.8659,2.2717,-0.499;3.1608,1.4752,0.5498;3.6331,2.3558,1.0084;2.3635,1.935,-0.646;1.8282,1.15,-1.185;2.2555,3.198,-1.0592;1.6508,3.4673,-1.9209;2.7664,4.0127,-0.55;2.2573,0.7925,1.5914;1.5058,1.496,1.9635;2.8462,0.4227,2.4348;1.7283,-0.0617,1.1498)|\",4.536137887835\r\n\"[H]C1=C([H])C([C@@]2([H])C([H])=C([H])C([H])([H])[C@@]2([H])OC(=O)OC([H])([H])C([H])([H])[H])=C([H])S1 
|(2.7776,-5.292,-3.4128;3.2722,-5.2882,-2.4508;4.5455,-4.8965,-2.1565;5.2437,-4.5199,-2.8943;4.8664,-5.0164,-0.7614;6.2077,-4.685,-0.1506;6.1946,-5.064,0.8835;7.4224,-5.2679,-0.8569;7.4208,-6.2721,-1.2695;8.4582,-4.4266,-0.8438;9.4447,-4.6378,-1.2458;8.1179,-3.1325,-0.1409;8.4622,-2.2332,-0.6642;8.5588,-3.1,0.8665;6.5761,-3.1726,-0.0551;6.1774,-2.6918,0.838;6.0229,-2.5002,-1.2113;5.6787,-1.2135,-1.0467;5.7821,-0.5673,-0.0248;5.1958,-0.7722,-2.2175;4.7708,0.6114,-2.2405;4.0499,0.6507,-3.0606;4.263,0.8406,-1.3005;5.9419,1.5511,-2.4813;5.5784,2.5812,-2.5725;6.4678,1.2887,-3.4052;6.6466,1.5091,-1.646;3.8082,-5.5034,-0.0391;3.7685,-5.7035,1.0239;2.4194,-5.8143,-1.031)|\",6.032764065585\r\n\"[H]OC([H])([H])C([H])([H])C1=C(C([H])([H])[C@]([H])(N([H])[H])C([H])([H])C([H])([H])C([H])([H])O[H])C([H])=C([H])C([H])=C1[H] |(0.9413,0.8833,-3.8636;0.0719,1.1962,-3.5688;0.0425,1.1104,-2.1499;0.825,1.7376,-1.6926;-0.9217,1.5307,-1.847;0.1563,-0.3374,-1.6389;1.1021,-0.7657,-1.9963;-0.6441,-0.9192,-2.1101;0.0457,-0.438,-0.1288;1.1615,-0.3214,0.7293;2.5728,-0.1594,0.1983;3.1536,0.4673,0.8888;2.567,0.3752,-0.757;3.3354,-1.4975,0.0151;2.6341,-2.23,-0.409;3.711,-2.0118,1.3402;4.4948,-1.4704,1.7074;4.0337,-2.9749,1.2719;4.5071,-1.33,-0.9745;4.1253,-0.9486,-1.93;5.1968,-0.5609,-0.5913;5.2912,-2.6183,-1.2514;4.5929,-3.4212,-1.5404;5.7955,-2.9666,-0.3391;6.3428,-2.4576,-2.3533;6.9272,-3.3862,-2.4546;7.0455,-1.6594,-2.0887;5.7882,-2.0659,-3.6059;5.2264,-2.7939,-3.9147;0.9602,-0.3977,2.1143;1.8234,-0.3302,2.7694;-0.31,-0.5736,2.6592;-0.4358,-0.6292,3.7373;-1.4137,-0.6796,1.8133;-2.4116,-0.8167,2.2219;-1.2273,-0.6116,0.4343;-2.0862,-0.6995,-0.2283)|\",6.31848360861\r\n\"[H]OC([H])([H])C1=C(C([H])([H])[C@]([H])(N([H])[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])O[H])C([H])=C([H])C([H])=C1[H] 
|(3.1314,-0.0888,0.7267;2.79,0.4819,1.4335;1.3999,0.1838,1.5809;1.2639,-0.7912,2.073;1.0237,0.9436,2.2733;0.6197,0.1896,0.2792;0.5563,1.3211,-0.5642;1.2263,2.6475,-0.2461;1.7775,2.9854,-1.1319;1.9675,2.5214,0.547;0.2389,3.7875,0.1058;-0.5749,3.7495,-0.6309;0.9234,5.0729,-0.117;1.7223,5.1503,0.5117;0.2986,5.8413,0.1285;-0.4176,3.6327,1.5004;-1.2573,4.341,1.5614;-0.8651,2.6311,1.5733;0.5034,3.8722,2.7062;1.349,3.1696,2.6885;0.9319,4.8817,2.6305;-0.23,3.746,4.0485;-0.683,2.7469,4.1304;-1.0534,4.4718,4.0839;0.6689,3.9683,5.2682;1.464,3.2046,5.2951;0.0824,3.857,6.1862;1.2238,5.2801,5.3292;1.8971,5.3453,4.6358;-0.1668,1.2175,-1.7621;-0.205,2.0807,-2.4232;-0.8281,0.0458,-2.1257;-1.3789,0.0012,-3.0616;-0.7718,-1.066,-1.2864;-1.2783,-1.9888,-1.5558;-0.0452,-0.9859,-0.0998;0.0115,-1.8552,0.5521)|\",6.032764065585\r\n\"[H]OC([H])([H])C1=C(C([H])([H])[C@]([H])(N([H])[H])C([H])([H])C([H])([H])C([H])([H])O[H])C([H])=C([H])C([H])=C1[H] |(0.6149,1.0031,2.786;1.3288,0.3775,2.5634;1.6073,0.4651,1.1687;1.9402,1.4726,0.8932;2.4632,-0.2027,1.0157;0.4518,0.0041,0.287;-0.1343,0.7806,-0.7366;0.2593,2.2118,-1.0731;0.1059,2.3834,-2.1436;1.3247,2.394,-0.8838;-0.5701,3.2856,-0.3205;-1.6323,3.0562,-0.4865;-0.3246,4.5922,-0.9514;0.6401,4.8765,-0.7743;-0.9097,5.3036,-0.5143;-0.2849,3.2166,1.1877;-0.3638,2.1723,1.4913;0.7591,3.5237,1.3612;-1.2049,4.0112,2.1157;-2.2433,3.6791,1.9834;-1.1779,5.0868,1.8917;-0.8029,3.8216,3.5773;-1.5201,4.3303,4.2386;0.1897,4.2675,3.75;-0.7739,2.4166,3.8592;-0.5138,2.2989,4.7854;-1.1842,0.2158,-1.4823;-1.6332,0.8066,-2.2783;-1.6623,-1.0664,-1.2301;-2.475,-1.4712,-1.8278;-1.0882,-1.8249,-0.208;-1.449,-2.8281,0.0037;-0.043,-1.2864,0.5347;0.4061,-1.8623,1.3396)|\",6.23140717645\r\n\"[H]C1=C2O[C@]([H])(/C([H])=C(\\[H])C([H])([H])[H])[C@]([H])(C([H])([H])[H])C2=C([H])C(C(=O)OC([H])([H])[H])=C1[H] 
|(6.1695,-1.3504,4.8769;6.3626,-2.055,4.0749;5.569,-2.0675,2.9295;4.5174,-1.2352,2.7153;4.1291,-1.3725,1.305;4.6653,-0.5827,0.7626;2.6555,-1.154,1.1873;2.0213,-1.8084,1.784;2.1066,-0.2242,0.401;2.7648,0.4279,-0.1763;0.6341,0.0117,0.2354;0.3687,1.0415,0.5088;0.0431,-0.6696,0.856;0.3263,-0.1241,-0.8101;4.6934,-2.7615,0.849;5.0952,-2.6563,-0.1656;3.6781,-3.9171,0.8545;4.1742,-4.8517,0.5716;2.8654,-3.7333,0.1434;3.2422,-4.0608,1.8492;5.7901,-2.963,1.876;6.8304,-3.8739,1.9509;7.0186,-4.5807,1.1496;7.657,-3.8757,3.092;8.7991,-4.8093,3.2434;9.5376,-4.8565,4.2097;8.9452,-5.6307,2.1713;10.0343,-6.5586,2.2606;10.0052,-7.1346,1.3347;9.9122,-7.2161,3.1258;10.9866,-6.0291,2.3532;7.4091,-2.9723,4.1388;8.0632,-3.0018,5.004)|\",5.134788358934999\r\n\"[H]/C(C(=O)C([H])([H])[H])=C1C(=C(/[H])C(=O)C([H])([H])[H])\\C([H])([H])C([H])([H])C([H])([H])C\\1([H])[H] |(3.8982,0.7027,0.9995;2.8247,0.6691,1.175;2.1606,-0.5795,0.7189;0.965,-0.8162,0.84;3.0867,-1.5952,0.0642;3.8831,-1.8945,0.7577;3.5771,-1.1595,-0.8157;2.5134,-2.4746,-0.2344;2.2517,1.7552,1.7483;3.1026,2.9171,2.133;4.2385,2.7056,2.8417;4.4712,1.6797,3.1203;5.2157,3.7163,3.3226;5.1455,4.917,3.0913;6.3601,3.1554,4.1554;5.9738,2.6429,5.0458;7.024,3.9665,4.4594;6.9279,2.4122,3.581;2.5768,4.2727,1.7065;3.1872,5.0634,2.1399;2.6878,4.3402,0.6124;1.0848,4.4408,2.0536;0.9703,4.494,3.1452;0.7301,5.3978,1.6523;0.25,3.2808,1.5051;0.2824,3.2902,0.4068;-0.803,3.391,1.7909;0.7703,1.9236,2.0173;0.2156,1.0886,1.5922;0.6269,1.8873,3.1091)|\",4.353821608\r\n\"[H]C(=O)C(C([H])([H])[H])(C([H])([H])[H])[C@@]([H])(/C([H])=C(\\[H])C([H])([H])C([H])([H])C([H])([H])[H])C([H])([H])N(=O)=O 
|(7.205,3.4715,-6.9703;7.277,2.8445,-6.0509;7.9922,1.8671,-6.0341;6.4341,3.3716,-4.8912;4.9901,3.5512,-5.413;4.9785,4.2058,-6.2924;4.3578,4.0185,-4.6514;4.5278,2.6019,-5.7042;7.0242,4.7661,-4.5615;7.0484,5.4026,-5.4539;8.0449,4.683,-4.1723;6.4144,5.2661,-3.8036;6.5468,2.4472,-3.639;7.6147,2.3113,-3.4393;5.891,3.0243,-2.4054;4.8411,3.311,-2.4921;6.4908,3.1122,-1.2141;7.5312,2.7963,-1.1331;5.8212,3.5423,0.0621;6.472,4.2492,0.5974;4.8912,4.078,-0.1685;5.496,2.3564,1.0016;4.9909,2.756,1.8913;4.776,1.6991,0.4974;6.7164,1.5331,1.4285;6.4308,0.768,2.1589;7.4836,2.1659,1.8935;7.171,1.0172,0.5757;5.9882,1.0313,-3.9071;6.5212,0.5523,-4.7299;4.912,1.0127,-4.0739;6.2165,0.1535,-2.7026;5.236,-0.3886,-2.202;7.3717,0.0442,-2.2984)|\",5.2409127606300006\r\n\"[H]C(N(O)[C@]([H])(C1=C([H])C([H])=C([H])C([H])=C1[H])C([H])([H])[H])C(F)(F)F |(0.8168,2.4293,1.4719;1.6478,2.087,0.8705;1.5256,0.8912,0.3389;0.5059,0.1718,0.5243;2.6057,0.286,-0.5493;3.4856,0.9169,-0.4439;2.1508,0.3081,-1.9982;2.8765,1.0607,-2.9293;3.7291,1.6478,-2.5988;2.5107,1.0691,-4.2761;3.0869,1.6545,-4.9874;1.4058,0.3326,-4.7031;1.1174,0.3386,-5.7509;0.6691,-0.4097,-3.7772;-0.1972,-0.9796,-4.1025;1.0386,-0.4268,-2.4328;0.4532,-0.989,-1.7139;2.9283,-1.1093,-0.0101;3.7075,-1.5566,-0.6341;2.0455,-1.75,-0.0289;3.2959,-1.049,1.0197;2.8228,2.9901,0.7147;2.5872,4.1261,1.3954;3.9798,2.4602,1.1974;3.0823,3.3324,-0.5738)|\",5.159278605480001\r\n\"[H]O[C@@]1([H])[C@]2([H])O[C@]2([H])C2=C(C(=O)C([H])([H])C(C([H])([H])[H])(C([H])([H])[H])O2)[C@@]1([H])O[H] 
|(8.6988,1.1302,0.2248;8.5891,1.614,1.0653;7.2228,1.4746,1.3927;7.1403,1.4279,2.4855;6.4205,2.7075,0.9618;7.0176,3.5353,0.5797;5.3617,3.0832,1.8558;5.0108,2.5848,0.5513;4.5442,3.3159,-0.1079;4.4181,1.2218,0.5058;5.1678,0.1024,0.6983;4.5346,-1.2105,0.6201;5.1771,-2.2558,0.5574;3.0118,-1.185,0.6498;2.6968,-1.0847,1.6981;2.6256,-2.1377,0.2767;2.4438,-0.016,-0.1701;0.9655,0.2166,0.1262;0.3786,-0.6572,-0.1755;0.5971,1.0882,-0.4241;0.8086,0.3916,1.1949;2.6949,-0.1939,-1.6723;2.1815,-1.0896,-2.0382;3.7621,-0.3019,-1.8897;2.3152,0.6722,-2.2229;3.1029,1.2388,0.2307;6.6698,0.1521,0.8189;6.9977,-0.6683,1.4666;7.2805,-0.019,-0.4783;7.2076,-0.9639,-0.6889)|\",5.00145257219\r\n\"[H]O[C@]1([H])[C@]2([H])O[C@]2([H])C2=C(OC(C([H])([H])[H])(C([H])([H])[H])C([H])([H])C2=O)[C@@]1([H])O[H] |(6.831,1.87,-0.8731;6.4989,2.7407,-0.581;6.3816,2.6474,0.8249;6.7555,3.5752,1.2729;7.1887,1.4465,1.3623;8.0041,1.6554,2.0539;7.5065,0.4793,0.3502;6.5318,0.1265,1.3772;6.896,-0.658,2.0364;5.1076,0.0186,0.9695;4.3586,1.1506,0.8153;3.097,1.1809,0.382;2.5245,-0.0391,-0.2218;1.0158,0.1664,-0.1263;0.4909,-0.6816,-0.5782;0.7198,1.079,-0.6529;0.7023,0.253,0.9187;2.9921,-0.0818,-1.6814;2.5546,-0.9487,-2.1876;4.0809,-0.1598,-1.7542;2.6744,0.8239,-2.2067;2.9862,-1.2689,0.571;2.5016,-1.2737,1.5582;2.6852,-2.1889,0.0617;4.4948,-1.2997,0.8068;5.1098,-2.35,0.9378;4.8963,2.5139,1.2148;4.3262,3.2884,0.6983;4.6833,2.7349,2.6095;5.0917,1.9976,3.0919)|\",5.08308672734\r\n\"[H]C(N(O)C1([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C1([H])[H])C(F)(F)F 
|(0.1094,2.5037,2.4068;-0.1492,1.4686,2.2314;0.257,0.9567,1.0887;0.9115,1.6425,0.2515;-0.0591,-0.4669,0.6959;-0.3864,-0.9626,1.6081;1.2035,-1.1827,0.1776;2.0408,-0.9872,0.8576;0.9816,-2.2561,0.2594;1.5907,-0.8615,-1.2756;2.0036,0.1493,-1.3225;2.382,-1.5545,-1.5892;0.3888,-0.9682,-2.2263;0.0563,-2.0162,-2.2866;0.6889,-0.678,-3.2406;-0.7815,-0.0944,-1.7503;-1.6395,-0.2048,-2.4253;-0.4853,0.96,-1.7705;-1.2118,-0.4801,-0.3286;-1.6003,-1.5083,-0.3352;-2.0261,0.1564,0.0352;-0.9472,0.7559,3.2674;-1.2272,1.6144,4.2649;-2.1295,0.2723,2.8005;-0.3037,-0.3083,3.8177)|\",5.175605436510001\r\n\"[H]O/C(=N/C([H])([H])[H])[C@@]1([H])N(C(=O)C([H])([H])[H])[C@@]([H])(C(F)(F)F)OC1([H])[H] |(1.99,0.257,-2.9014;2.453,0.9614,-3.3877;3.7929,0.8121,-3.214;4.5534,1.6614,-3.7609;5.9959,1.5987,-3.6918;6.3943,1.4835,-4.7076;6.4288,0.7945,-3.0755;6.3765,2.5553,-3.3138;4.1931,-0.4192,-2.393;5.1645,-0.2476,-1.9224;3.2023,-0.7977,-1.3657;3.0816,-0.1813,-0.1138;2.5492,-0.7669,0.8127;3.6033,1.2371,-0.0114;4.6833,1.2828,-0.1935;3.3973,1.6,0.996;3.1178,1.8893,-0.7451;2.9283,-2.2353,-1.4577;3.0239,-2.7186,-0.4824;1.4978,-2.4864,-1.9674;0.5868,-2.0458,-1.0877;1.2805,-1.8284,-3.1415;1.2904,-3.7914,-2.1849;3.8788,-2.7542,-2.3509;4.2497,-1.7063,-3.2496;5.2558,-1.9195,-3.615;3.5594,-1.6506,-4.0986)|\",6.674952752765\r\n\"[H]C([H])([H])OC(=O)[C@@]1([H])N(C(=O)C([H])([H])[H])[C@@]([H])(C(F)(F)F)OC1([H])[H] 
|(6.676,1.6597,-1.2185;6.6712,1.7326,-0.1314;7.5182,1.1915,0.2964;6.7098,2.7761,0.1905;5.4336,1.1223,0.2874;5.2201,1.1063,1.6106;5.9571,1.6016,2.4342;3.8913,0.4423,1.9943;3.1768,1.2572,2.1538;3.3588,-0.5217,1.0409;2.4439,-0.2945,0.0216;1.9241,-1.2286,-0.5678;2.1408,1.1548,-0.3076;1.597,1.6399,0.5128;1.5117,1.1703,-1.1981;3.0572,1.7227,-0.4912;3.5327,-1.8585,1.5749;2.7057,-2.5024,1.2686;4.8407,-2.5004,1.0837;4.8588,-2.5925,-0.2554;5.9198,-1.7691,1.4649;4.9838,-3.7343,1.5953;3.5511,-1.7107,2.973;4.0499,-0.4072,3.2927;3.4482,-0.0169,4.1164;5.0964,-0.4496,3.6041)|\",6.80828853951\r\n\"[H]O/C(=N/C([H])([H])[H])[C@@]1([H])N([H])[C@@]([H])(C(F)(F)F)OC1([H])[H] |(2.2925,3.1175,-1.4507;2.2053,2.1823,-1.6976;1.7753,1.4434,-0.6271;1.4758,0.238,-0.8593;1.0405,-0.6969,0.1519;1.6708,-1.5924,0.0953;0.0172,-1.0191,-0.0787;1.0544,-0.3404,1.193;1.7284,2.2083,0.6936;1.5489,1.5057,1.5102;2.9669,2.9612,0.9764;3.6859,2.8816,0.2652;2.6133,4.3288,1.3279;3.0765,4.657,2.2673;3.0475,5.3402,0.2533;2.4394,5.0988,-0.9473;2.7556,6.596,0.6032;4.3787,5.2518,0.0408;1.2194,4.355,1.5113;0.643,3.3111,0.7179;-0.2787,2.9951,1.2104;0.4014,3.6763,-0.2878)|\",6.88448041765\r\n\"[H]OC(=O)[C@@]1([H])N([H])[C@@]([H])(C(F)(F)F)OC1([H])[H] |(2.641,-1.2345,-1.4157;2.3791,-2.1642,-1.5681;1.1407,-2.352,-1.0594;0.5684,-3.4097,-1.1293;0.5611,-1.1238,-0.3715;0.9656,-1.0928,0.6504;0.9661,0.1287,-1.0467;0.7513,0.0257,-2.0423;0.0105,1.0838,-0.503;0.4101,1.5459,0.4088;-0.2931,2.2024,-1.5003;-0.6986,1.696,-2.6866;-1.2556,3.0084,-1.0351;0.8101,2.9422,-1.7256;-1.1985,0.3889,-0.1914;-0.9679,-1.0228,-0.3191;-1.411,-1.5311,0.5401;-1.4264,-1.4076,-1.2384)|\",7.43415039566\r\n\"[H]OC1=N[C@]([H])(C([H])([H])[C@@]([H])(O[H])C([H])([H])C([H])([H])C([H])([H])[H])C([H])([H])C([H])([H])C1([H])[H] 
|(2.3915,-0.7861,4.9781;2.6877,-0.195,4.2688;3.6587,-0.8195,3.5405;4.1088,-0.1902,2.5358;5.1634,-0.7633,1.6893;5.9096,0.0334,1.5704;4.5888,-1.0625,0.2859;3.8624,-1.8825,0.3557;5.4129,-1.4159,-0.3482;3.878,0.1192,-0.4115;3.5858,-0.2338,-1.4108;2.6554,0.4482,0.2363;2.8865,0.5755,1.179;4.7464,1.378,-0.5978;4.0994,2.1335,-1.0623;5.0214,1.7826,0.3867;6.0058,1.1886,-1.4545;6.6964,0.4898,-0.9622;5.7289,0.7192,-2.4099;6.7414,2.5048,-1.7307;7.6329,2.3449,-2.3483;7.0635,2.9817,-0.7967;6.0931,3.2162,-2.2565;5.8793,-1.9804,2.3134;6.4691,-2.4957,1.5464;6.5889,-1.6285,3.0753;4.8842,-2.9412,2.9706;5.3999,-3.7974,3.419;4.1956,-3.3445,2.2181;4.0852,-2.1866,4.0403;4.6865,-2.0514,4.9515;3.194,-2.7571,4.3375)|\",6.685837306785\r\n\"[H]C1=C([H])[C@]2([H])N(O[C@]([H])(/C([H])=C(\\[H])C([H])([H])[H])C2([H])[H])C(=O)C1([H])[H] |(7.6683,0.5207,-3.7787;7.2075,0.0458,-2.9147;7.1269,0.7028,-1.7573;7.5135,1.7139,-1.6548;6.5289,0.0956,-0.5255;7.2671,0.1043,0.2901;6.1176,-1.2941,-0.7267;5.2803,-1.6306,0.3531;4.5321,-0.4361,0.7335;4.6955,-0.3381,1.8151;3.0589,-0.5698,0.4528;2.4523,0.2493,0.8458;2.4847,-1.5767,-0.2087;3.1175,-2.3754,-0.591;1.012,-1.6827,-0.4805;0.8152,-1.7347,-1.5598;0.455,-0.832,-0.0722;0.5974,-2.6022,-0.0457;5.2169,0.7377,-0.0141;4.6079,1.0591,-0.8645;5.3967,1.6001,0.6341;5.8244,-1.8735,-1.9465;5.0343,-2.7905,-2.0929;6.6853,-1.3522,-3.0973;7.5139,-2.0598,-3.252;6.0634,-1.4284,-3.9967)|\",6.296714500569999\r\n\"[H]N1[C@@]([H])(C(F)(F)F)SC([H])([H])[C@@]1([H])C(=O)OC([H])([H])[H] 
|(1.5708,-2.332,-0.4353;0.9976,-2.2768,0.4078;0.0022,-3.3127,0.4215;-0.8424,-3.035,1.0576;-0.5496,-3.5686,-0.9753;-1.2436,-2.5004,-1.4149;0.4367,-3.8053,-1.865;-1.382,-4.6271,-0.98;0.758,-4.8813,1.1801;2.2566,-3.9503,1.6829;2.5317,-4.2148,2.7066;3.0883,-4.1886,1.0129;1.8712,-2.4441,1.5595;1.3483,-2.133,2.4691;3.1186,-1.5942,1.3646;3.613,-1.3638,0.282;3.6248,-1.1862,2.5387;4.8485,-0.4284,2.4546;5.1126,-0.1894,3.4845;4.6895,0.4832,1.874;5.6332,-1.0237,1.9812)|\",5.874938032295\r\n\"[H]N1[C@](C([H])([H])[H])(C(F)(F)F)OC([H])([H])[C@@]1([H])C(=O)OC([H])([H])[H] |(2.1141,1.2926,-1.5801;2.777,0.887,-0.9197;2.0701,0.0776,0.054;0.6296,0.5223,0.268;0.595,1.5642,0.5963;0.1504,-0.1041,1.0237;0.0832,0.4148,-0.6733;2.8458,0.1132,1.3919;2.3025,-0.7309,2.2911;4.1382,-0.2536,1.2173;2.8399,1.3467,1.933;2.0774,-1.2717,-0.4202;3.2149,-1.4553,-1.2549;4.0189,-1.9705,-0.7189;2.9106,-2.0692,-2.1081;3.6637,-0.0075,-1.6636;4.7133,0.1441,-1.3958;3.5063,0.2242,-3.1579;2.6033,0.8545,-3.6676;4.488,-0.3853,-3.8429;4.4148,-0.2677,-5.2769;5.278,-0.8136,-5.6572;4.4599,0.7827,-5.5747;3.4855,-0.7066,-5.6488)|\",6.838221063065\r\n\"[H]N1[C@@]([H])(C(F)(F)F)OC([H])([H])[C@@]1([H])C(=O)OC([H])([H])[H] |(3.7043,-0.3282,-3.379;2.8859,-0.9053,-3.5666;3.1839,-2.0531,-4.417;2.8038,-1.9246,-5.4377;4.6935,-2.3198,-4.5368;5.323,-1.2113,-4.9927;5.2466,-2.641,-3.347;4.9442,-3.322,-5.3952;2.5225,-3.1643,-3.8578;2.4209,-2.9535,-2.4486;1.5275,-3.4724,-2.0966;3.2993,-3.3551,-1.9303;2.328,-1.4127,-2.3123;1.2793,-1.1082,-2.1998;3.0971,-0.8881,-1.1093;4.0756,-0.1756,-1.1697;2.5409,-1.3231,0.0351;3.1944,-0.8964,1.2467;2.6153,-1.3354,2.0589;3.1917,0.1942,1.3155;4.2263,-1.2553,1.2695)|\",6.571549489574999\r\n\"[H]C1=C(C2([H])C([H])([H])C2([H])[H])N=C(C2([H])C([H])([H])C2([H])[H])[C@]([H])(OC([H])([H])[H])C1=O 
|(1.9336,-4.4954,1.446;2.4465,-3.6467,1.0038;3.7971,-3.4804,1.1177;4.7035,-4.4775,1.7377;5.7343,-4.1397,1.7641;4.4783,-5.972,1.5429;3.6119,-6.2625,0.9549;5.3578,-6.5907,1.3875;4.2509,-5.3658,2.8892;4.9746,-5.555,3.6768;3.2297,-5.2423,3.2385;4.4722,-2.3185,0.7132;3.8389,-1.3923,0.0706;4.539,-0.1404,-0.2759;3.9303,0.5948,-0.7861;6.0234,-0.2032,-0.6376;6.4675,-1.1936,-0.6259;6.3604,0.4549,-1.4337;5.6175,0.3844,0.6678;5.6641,1.4613,0.8024;5.7872,-0.2097,1.5604;2.41,-1.5745,-0.4185;2.5307,-2.0663,-1.41;1.7663,-0.3362,-0.5633;0.7179,-0.3145,-1.5288;0.3778,0.7225,-1.5855;-0.109,-0.9627,-1.2296;1.092,-0.6225,-2.5185;1.6201,-2.6178,0.4017;0.3962,-2.5926,0.4509)|\",4.138851666105\r\n\"[H]O/C(=N/C([H])=C([H])/C(=C(/[H])C([H])=O)C([H])([H])[H])C1=C([H])C([H])=C([H])C([H])=C1[H] |(2.4393,-3.0482,-5.9462;3.0802,-2.634,-5.3459;2.41,-2.1087,-4.2843;3.0823,-1.3146,-3.5368;2.604,-0.8684,-2.3196;1.7521,-1.3813,-1.8688;3.205,0.1655,-1.6801;4.0529,0.6155,-2.1861;2.7932,0.6999,-0.3945;3.4123,1.7586,0.2058;3.0513,2.1044,1.1721;4.5465,2.5256,-0.3044;4.9636,2.2254,-1.2874;5.0391,3.4668,0.3018;1.6261,0.0484,0.3149;1.8308,-1.0091,0.5229;0.7172,0.0838,-0.2984;1.4142,0.5466,1.2641;0.9895,-2.5443,-4.1513;0.6508,-3.8961,-4.3377;1.4336,-4.6249,-4.533;-0.6753,-4.3121,-4.2334;-0.9233,-5.3616,-4.3647;-1.6803,-3.3823,-3.9584;-2.7146,-3.7058,-3.8827;-1.3541,-2.0357,-3.782;-2.1348,-1.3084,-3.5785;-0.0275,-1.6173,-3.873;0.2233,-0.5682,-3.7512)|\",3.7470077213849993\r\n\"[H]O/C(=N/C([H])=C([H])/C([H])=C(\\[H])C([H])=O)C1=C([H])C([H])=C([H])N1C([H])([H])[H] 
|(4.4258,2.4598,-1.0193;3.6424,2.9743,-0.7663;2.6312,2.1089,-0.4504;1.4439,2.5926,-0.5383;0.3497,1.9233,-0.0413;0.5039,1.0805,0.6399;-0.9173,2.2991,-0.357;-1.0579,3.1372,-1.0357;-2.0736,1.6275,0.1642;-1.8881,0.7961,0.8484;-3.3643,1.9365,-0.1182;-3.6224,2.7502,-0.7927;-4.4672,1.1873,0.4729;-4.1509,0.3645,1.1571;-5.6513,1.4033,0.2677;3.0315,0.7513,-0.0713;2.4526,-0.4621,-0.4327;1.5834,-0.5674,-1.067;3.2296,-1.4982,0.1296;3.0615,-2.5621,0.0376;4.2557,-0.8971,0.8332;5.048,-1.3304,1.4278;4.1508,0.4628,0.7088;5.0198,1.4144,1.3934;5.4442,0.9289,2.2746;4.4486,2.2871,1.7133;5.8451,1.7489,0.7533)|\",3.654489012215\r\n\"[H]C1=C2C(=C([H])\\C([H])=C/1OC([H])([H])[H])/[C@]1([H])C([H])([H])C(=O)O[C@]([H])(C/2([H])[H])C1([H])[H] |(2.7351,2.4545,-0.8348;3.2575,2.2824,0.1007;3.7342,3.3896,0.8206;4.4125,3.1991,2.032;4.5999,1.887,2.4971;5.1283,1.728,3.4351;4.1344,0.7902,1.7896;4.2846,-0.2235,2.1472;3.4541,0.9855,0.5774;3.0331,-0.1489,-0.0538;2.3467,-0.0173,-1.2881;2.1096,-1.0335,-1.6084;2.9738,0.4653,-2.0498;1.4158,0.5538,-1.1743;4.9082,4.3811,2.8537;5.8235,4.0846,3.3801;3.8665,4.8198,3.9031;4.3294,5.5153,4.6179;3.4864,3.9769,4.4863;2.6569,5.5478,3.3293;1.625,5.6776,3.9417;2.7735,6.1376,2.1083;3.8921,5.9088,1.2094;3.972,6.8499,0.6578;3.5378,4.7791,0.2387;2.502,4.9158,-0.0934;4.1672,4.8905,-0.6568;5.1821,5.5957,1.9595;5.974,5.3817,1.2317;5.5026,6.4569,2.5577)|\",5.673573782925\r\n\"[H]C1=C([H])C2=C(C([H])=C1[H])[C@]([H])(C([H])([H])[H])[C@]([H])(/C([H])=C(\\[H])C([H])([H])[H])O2 
|(8.0131,-2.9655,3.1865;7.4985,-2.1785,2.6415;6.3375,-2.4926,1.9254;5.9355,-3.5003,1.8998;5.713,-1.4595,1.2351;6.2047,-0.1512,1.2439;7.3563,0.1477,1.9621;7.749,1.1618,1.9829;8.0099,-0.8775,2.6596;8.9169,-0.6601,3.2163;5.2321,0.7214,0.4759;5.7379,1.3587,-0.2595;4.4023,1.6193,1.4098;5.0609,2.3123,1.9446;3.8668,1.0268,2.1597;3.671,2.2099,0.8469;4.4201,-0.3762,-0.293;4.9159,-0.5587,-1.2565;2.9637,-0.124,-0.517;2.3448,-0.0533,0.3767;2.4134,0.0186,-1.7255;3.0533,-0.0706,-2.6054;0.9627,0.297,-1.9903;0.5175,-0.4923,-2.6106;0.8333,1.2385,-2.5409;0.3884,0.3647,-1.0606;4.5742,-1.6014,0.4931)|\",5.67629492143\r\n\"[H]C(/C([H])=C1/N[C@]1([H])C(=O)OC([H])([H])[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H] |(7.1228,-0.4529,-3.3544;6.7828,-1.3684,-2.8683;7.6388,-2.4013,-2.7953;7.3621,-3.3391,-2.3171;8.9672,-2.3238,-3.3415;10.0327,-2.9763,-3.5075;9.967,-1.5275,-4.0504;9.9359,-1.4159,-5.1321;10.7629,-0.5104,-3.3042;10.9372,-0.4934,-2.1042;11.2674,0.4141,-4.156;12.0501,1.4478,-3.5396;12.3626,2.1039,-4.3527;12.9206,1.0205,-3.0346;11.4532,1.9994,-2.8078;5.3934,-1.3538,-2.3103;4.6818,-1.1736,-3.1309;5.1473,-2.3362,-1.8881;5.1956,-0.2567,-1.2424;5.8945,-0.4333,-0.4137;5.4682,0.7147,-1.6752;3.7582,-0.213,-0.7052;3.0654,-0.0193,-1.5379;3.4971,-1.2074,-0.3162;3.5253,0.8282,0.4022;2.5155,0.6823,0.8082;4.2178,0.6325,1.233;3.6705,2.2851,-0.0536;3.4481,2.9763,0.7672;2.9794,2.5121,-0.8752;4.6854,2.508,-0.402)|\",5.542959134685\r\n\"[H]O/C(=N/C(C1=C([H])C([H])=C([H])S1)=C(/OC([H])([H])[H])C(=O)C([H])([H])[H])C([H])([H])[H] 
|(6.6791,-0.9579,0.8696;6.3281,-0.0741,0.6792;5.4624,-0.1467,-0.3643;4.9411,0.9501,-0.7382;4.0348,1.0224,-1.807;4.6491,1.3915,-3.0834;4.1586,1.4088,-4.3765;3.1521,1.0999,-4.6154;5.1142,1.8374,-5.3368;4.9045,1.9109,-6.3987;6.3278,2.1478,-4.7821;7.2187,2.5047,-5.2819;6.3321,1.9172,-3.0687;2.6839,0.837,-1.6306;1.8918,0.8545,-2.7538;0.8487,1.8382,-2.7917;0.4105,1.7674,-3.7901;1.2598,2.8456,-2.6473;0.0867,1.6293,-2.0384;1.9815,0.5477,-0.3485;0.8139,0.1688,-0.3871;2.6408,0.7804,1.0041;1.839,0.963,1.7244;3.3539,1.6079,0.9982;3.1855,-0.1132,1.332;5.2439,-1.5281,-0.9367;4.5105,-1.5104,-1.7432;4.8898,-2.2148,-0.158;6.1895,-1.9219,-1.3297)|\",3.88578578514\r\n\"[H]O/C(=N/C(=C(/OC([H])([H])[H])C(=O)C([H])([H])[H])C([H])([H])[H])C1=NC([H])=C([H])C([H])=C1[H] |(4.6175,-1.0431,0.8273;4.5035,-0.9781,-0.1493;3.3048,-0.401,-0.3347;2.9053,-0.2264,-1.5303;1.6662,0.2059,-1.976;1.4337,1.439,-2.5208;0.133,1.7221,-2.8986;-0.077,1.9406,-4.2969;-1.1558,2.0651,-4.4223;0.2565,1.0734,-4.8841;0.4396,2.8415,-4.6355;2.4041,2.5585,-2.6645;2.0109,3.644,-3.0816;3.8745,2.3713,-2.3268;4.2744,1.426,-2.705;4.025,2.3652,-1.2405;4.4198,3.2188,-2.7488;0.5952,-0.8622,-1.954;-0.3414,-0.4863,-2.3634;0.4248,-1.2168,-0.9294;0.9331,-1.7275,-2.5366;2.6311,-0.0162,0.9616;3.297,-0.4499,2.048;2.8143,-0.1684,3.2592;3.3874,-0.5428,4.1044;1.6435,0.5663,3.453;1.2868,0.7737,4.4568;0.9606,1.0281,2.3296;0.0532,1.6154,2.4354;1.4562,0.7366,1.0585;0.9547,1.1019,0.1713)|\",3.668094704740001\r\n\"[H]O/C(=N/C([H])=C([H])/C([H])=C(\\[H])C([H])=O)C1=C([H])C([H])=C(OC([H])([H])[H])C([H])=C1[H] 
|(6.5487,-1.0514,-3.5266;6.2874,-1.4481,-4.3733;4.9412,-1.3179,-4.5188;4.4892,-1.5363,-5.7014;3.1449,-1.6736,-5.9556;2.4617,-1.8853,-5.1271;2.6411,-1.5958,-7.2164;3.3256,-1.3983,-8.0387;1.2444,-1.7537,-7.4917;0.5816,-1.9577,-6.65;0.6629,-1.6712,-8.7164;1.2567,-1.4714,-9.6066;-0.7805,-1.8444,-8.8896;-1.1388,-1.7615,-9.9395;-1.5782,-2.062,-7.9873;4.1984,-0.9402,-3.2897;3.1511,-0.0091,-3.3256;2.8741,0.4496,-4.2695;2.4741,0.3637,-2.1659;1.6759,1.0937,-2.2301;2.8363,-0.2038,-0.9359;2.2441,0.0798,0.2508;1.158,0.9995,0.2693;0.8388,1.0591,1.3106;0.3236,0.6447,-0.3482;1.4698,1.9942,-0.0732;3.8855,-1.1376,-0.8846;4.1452,-1.5729,0.0749;4.558,-1.4923,-2.0425;5.3495,-2.2357,-1.9896)|\",3.5891816880950005\r\n\"[H]O[C@@]([H])(C([H])([H])C([H])=C([H])[H])[C@]([H])(C1=C([H])C(OC([H])([H])[H])=C(C([H])([H])[H])C([H])=C1[H])C([H])([H])[H] |(4.8557,1.5349,-2.2111;5.363,1.6032,-1.3828;4.6316,0.8712,-0.4034;5.0625,1.1813,0.5571;3.1446,1.3083,-0.4121;2.6215,0.8143,0.4143;3.1287,2.3881,-0.2239;2.4541,1.0118,-1.7165;2.1222,-0.0155,-1.868;2.269,1.9017,-2.6969;1.7875,1.6321,-3.6334;2.5676,2.9427,-2.5858;4.8593,-0.658,-0.5731;4.479,-0.9257,-1.5692;4.0872,-1.4843,0.4451;4.3337,-1.3461,1.8228;5.0885,-0.6445,2.1601;3.624,-2.1048,2.7552;3.8077,-2.021,4.1093;4.7867,-1.1252,4.6074;4.7631,-1.2294,5.6939;4.5587,-0.0852,4.3383;5.7906,-1.3784,4.2413;2.6443,-3.0339,2.3436;1.8874,-3.8427,3.3654;1.1838,-4.5254,2.8784;1.3208,-3.1976,4.0481;2.5645,-4.4363,3.9911;2.4172,-3.1632,0.9762;1.6698,-3.8769,0.6363;3.1208,-2.4043,0.0353;2.9193,-2.5393,-1.0247;6.3648,-0.9796,-0.5501;6.5386,-2.0432,-0.7456;6.8056,-0.7444,0.4265;6.8909,-0.3876,-1.3036)|\",5.86677461678\r\n\"[H]/C(=C(/[H])C([H])([H])C([H])([H])C([H])([H])C(=O)OC([H])([H])[H])C([H])([H])C([H])(OC([H])([H])[H])OC([H])([H])[H] 
|(-3.0561,-1.2743,4.2116;-2.2028,-0.6245,4.4058;-1.0215,-1.1649,4.717;-0.1704,-0.5056,4.9037;-0.7435,-2.6397,4.8257;-1.6732,-3.2034,4.6692;-0.4067,-2.8746,5.8481;0.3321,-3.1182,3.833;1.2544,-2.5418,3.9727;0.0017,-2.9143,2.8077;0.6447,-4.6081,3.9848;-0.2429,-5.2266,3.8001;0.9532,-4.8401,5.0142;1.7513,-5.0801,3.0615;2.4502,-4.3703,2.3713;1.8776,-6.4275,3.1108;2.9175,-6.9778,2.2866;2.8846,-8.055,2.4537;3.8922,-6.5743,2.574;2.7365,-6.7475,1.2331;-2.4705,0.8514,4.2904;-1.5661,1.4276,4.518;-3.2443,1.1443,5.013;-2.9499,1.2546,2.8885;-2.2338,0.9043,2.1313;-4.2033,0.641,2.6884;-4.6581,0.6894,1.343;-5.6218,0.1749,1.3179;-3.9571,0.1742,0.669;-4.7849,1.7215,0.9951;-2.9934,2.6523,2.6751;-3.8399,3.3894,3.5495;-3.9302,4.389,3.1167;-3.4117,3.4815,4.557;-4.8358,2.9367,3.6249)|\",6.827336509045001\r\n\"[H]O[C@@]([H])(C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])C([H])([H])[C@]([H])(OC([H])([H])OC([H])([H])[H])C([H])([H])C([H])=C([H])[H] |(4.5813,-1.9675,-3.378;4.6049,-2.9349,-3.3134;4.1861,-3.2886,-1.9889;3.1617,-2.9236,-1.8184;4.1482,-4.8221,-1.951;3.7673,-5.1463,-0.9722;3.4018,-5.1418,-2.6904;5.4782,-5.5255,-2.257;5.8648,-5.149,-3.2114;6.229,-5.268,-1.497;5.3381,-7.0521,-2.3213;4.9343,-7.426,-1.3677;4.5941,-7.3159,-3.088;6.6557,-7.7753,-2.6284;7.061,-7.3979,-3.5779;7.3981,-7.5189,-1.8591;6.5063,-9.2981,-2.706;7.464,-9.7848,-2.9238;5.7994,-9.587,-3.4936;6.1318,-9.7091,-1.7602;5.0989,-2.6203,-0.9459;5.157,-1.5478,-1.1849;6.1172,-3.0126,-1.0276;4.6215,-2.7197,0.5066;4.4242,-3.7665,0.7791;3.3749,-1.9939,0.5782;2.3914,-2.5359,1.3878;1.461,-1.998,1.1491;2.2487,-3.6154,1.1823;2.7191,-2.3643,2.7542;1.7522,-2.9315,3.6146;2.083,-2.7425,4.6383;1.6571,-4.0193,3.4647;0.7596,-2.4747,3.4714;5.6204,-2.1099,1.5032;5.8775,-1.098,1.1579;5.0932,-1.9694,2.455;6.8896,-2.8813,1.7662;7.6391,-2.3217,2.328;7.1738,-4.1414,1.4304;8.1219,-4.595,1.7059;6.4859,-4.7752,0.8773)|\",7.425986980145\r\n\"[H]C([H])=C([H])C([H])([H])[C@]([H])(OC([H])([H])OC([H])([H])[H])C([H])([H])[C@@]1([H])OC1([H])[H] 
|(1.3554,-5.3074,2.311;1.9526,-4.4009,2.3656;2.9929,-4.5236,2.6617;1.4358,-3.2017,2.0936;0.3896,-3.1244,1.7983;2.1871,-1.8997,2.1604;3.2426,-2.081,2.4014;1.7873,-1.2899,2.9838;2.1138,-1.0633,0.8643;2.6679,-0.1288,1.0138;0.7526,-0.741,0.5234;0.1763,0.3192,1.2559;-0.8646,0.3579,0.9032;0.1986,0.1237,2.3353;0.8126,1.5534,1.1;0.7263,2.1038,-0.2113;1.0322,3.1497,-0.1283;-0.3067,2.0609,-0.5891;1.3911,1.5915,-0.9149;2.7074,-1.7968,-0.3434;2.1946,-2.7581,-0.4751;3.7618,-2.0154,-0.1336;2.5961,-0.9986,-1.6198;1.5779,-0.8329,-1.971;3.4322,0.1709,-1.7167;3.6819,-0.9265,-2.6062;3.455,-0.7395,-3.6564;4.6115,-1.4674,-2.4244)|\",7.333468270975\r\n\"[H]O[C@@]1([H])[C@@]2([H])C([H])([H])C(=O)C([H])([H])[C@]1([H])N(C(=O)OC([H])([H])[H])[C@@]2([H])C([H])([H])C([H])=C([H])[H] |(2.1202,1.5678,-3.2758;2.072,0.6592,-2.941;2.6884,0.6248,-1.666;2.2669,1.3997,-1.0102;2.5295,-0.762,-1.0192;1.5038,-0.9557,-0.6937;2.9765,-1.8434,-2.0259;2.9563,-2.8435,-1.5807;2.2756,-1.8466,-2.8707;4.3726,-1.6136,-2.611;5.0735,-2.5428,-2.9604;4.8262,-0.1618,-2.7702;4.492,0.1846,-3.7575;5.9184,-0.1261,-2.7528;4.2277,0.7435,-1.6807;4.5924,1.7688,-1.7862;4.5423,0.2422,-0.3344;5.8004,0.3922,0.172;6.6855,1.0498,-0.351;5.947,-0.2792,1.3484;7.2513,-0.1709,1.9354;7.4902,0.8722,2.1591;8.0133,-0.5739,1.2632;7.2053,-0.7572,2.8543;3.4896,-0.6605,0.1913;3.9328,-1.6365,0.422;2.8219,-0.1135,1.475;3.6243,0.1251,2.1833;2.2962,0.8234,1.2526;1.8778,-1.1047,2.0977;2.3197,-2.058,2.3941;0.5756,-0.904,2.3072;-0.0528,-1.6598,2.7706;0.0892,0.0307,2.0344)|\",6.062696589140001\r\n\"[H]C([H])([H])OC(=O)N1[C@@]2([H])C([H])([H])C(=O)C([H])([H])[C@]1([H])[C@]1([H])O[C@@]12[H] 
|(4.3594,2.4255,0.0115;4.229,1.3594,0.2012;3.9346,0.8413,-0.7153;5.1563,0.9193,0.5764;3.1899,1.2732,1.1881;2.8981,0.0076,1.5926;3.4734,-0.9914,1.1958;1.8511,0.0037,2.4733;1.5477,-1.1866,3.2895;1.5888,-2.0861,2.674;2.5544,-1.2403,4.4558;2.3005,-2.0049,5.1964;3.536,-1.5014,4.038;2.6991,0.1021,5.1899;3.1612,0.1532,6.3107;2.3439,1.3912,4.4276;3.2757,1.7985,4.013;1.966,2.1186,5.1527;1.3651,1.1549,3.2601;1.2573,2.0371,2.6282;0.0116,0.6122,3.7524;-0.9116,1.1575,3.5751;0.0858,-0.1009,4.9992;0.1266,-0.8508,3.771;-0.7013,-1.5362,3.6106)|\",6.34569499366\r\n\"[H]ON1C([H])([H])C([H])([H])C2=C(C([H])=C([H])C([H])=C2[H])[C@]1([H])C1=C([H])C([H])=C([H])C([H])=C1[H] |(3.7612,1.6294,2.0163;2.9521,1.7222,1.4878;3.4316,1.4253,0.1577;3.2078,2.6084,-0.6743;2.1371,2.8137,-0.8297;3.6355,3.4646,-0.1461;3.9346,2.3738,-2.0011;5.0152,2.4755,-1.831;3.6562,3.1528,-2.7221;3.6602,0.998,-2.5779;3.1098,-0.0184,-1.781;2.9133,-1.294,-2.3279;2.4797,-2.0741,-1.7061;3.2537,-1.5696,-3.6485;3.0961,-2.5649,-4.0553;3.7969,-0.5585,-4.4463;4.0626,-0.7599,-5.4807;3.9952,0.7101,-3.9093;4.4207,1.4979,-4.5279;2.6878,0.241,-0.3396;3.0421,-0.5949,0.276;1.165,0.3153,-0.1668;0.3153,0.7976,-1.1721;0.7289,1.0892,-2.1328;-1.063,0.883,-0.9646;-1.704,1.2586,-1.7582;-1.6166,0.4783,0.25;-2.69,0.5396,0.41;-0.7824,-0.0138,1.2559;-1.2037,-0.3373,2.2043;0.5937,-0.0926,1.0468;1.2392,-0.4646,1.8373)|\",5.95385104894\r\n\"[H]C([H])([H])C1(C([H])([H])[H])O[C@@]2([H])C([H])([H])C(=O)O[C@]23C([H])([H])C([H])([H])C([H])([H])C(=O)C([H])([H])[C@@]13[H] 
|(1.4523,1.0842,-2.0231;2.0981,0.2094,-1.8915;1.4558,-0.6656,-1.7501;2.6698,0.069,-2.8123;3.0038,0.3622,-0.6598;2.1547,0.6116,0.5897;1.6312,1.5724,0.5223;2.7818,0.6174,1.4873;1.4037,-0.1769,0.7031;3.7062,-0.8922,-0.4478;5.0805,-0.8178,-0.8074;5.6924,-1.036,0.0764;5.4037,-1.7304,-1.9876;4.725,-2.5828,-2.0579;6.4296,-2.1168,-1.9474;5.2726,-0.8308,-3.209;5.2242,-1.137,-4.3695;5.2194,0.4771,-2.8003;5.3311,0.6085,-1.3545;6.7128,1.1812,-1.035;6.8369,1.1973,0.0567;7.4705,0.4891,-1.4264;6.9784,2.5817,-1.6054;8.0362,2.8154,-1.4369;6.8384,2.5673,-2.6928;6.1446,3.7179,-0.976;6.6286,4.6841,-1.1447;6.1014,3.5734,0.1155;4.7138,3.8829,-1.4835;4.3011,4.9824,-1.8012;3.7761,2.6765,-1.5986;3.7045,2.4351,-2.6663;2.7865,3.0424,-1.3039;4.1412,1.4161,-0.8078;4.4233,1.7053,0.2142)|\",6.12256163625\r\n\"[H]C1C=C2C(O[C@@]1([H])C([H])([H])C([H])([H])C([H])([H])[H])C1=C(NC2=O)C([H])=C([H])C([H])=C1[H] |(3.6617,2.561,-3.5185;4.3231,1.7075,-3.3965;5.5566,1.6207,-3.8513;6.7265,1.2852,-3.2937;6.5686,1.1882,-1.844;5.3331,1.121,-1.3104;4.1907,0.7656,-2.2151;4.3487,-0.2816,-2.494;2.9161,0.9383,-1.4023;2.8377,1.987,-1.0869;2.077,0.7483,-2.0857;2.8184,0.0102,-0.184;2.9137,-1.0328,-0.5167;3.666,0.2012,0.4834;1.5027,0.1885,0.5802;1.4551,-0.4845,1.443;0.6375,-0.0265,-0.0587;1.3948,1.2146,0.952;7.6976,1.0028,-1.0679;8.9795,0.8801,-1.7827;9.1498,0.857,-3.0947;8.0629,1.0087,-3.9334;8.1599,1.0088,-5.1548;10.15,0.692,-0.9572;11.099,0.618,-1.4775;10.0605,0.5994,0.3995;10.9624,0.4577,0.9897;8.7939,0.6696,1.0755;8.7596,0.5836,2.1574;7.649,0.8461,0.3595;6.6823,0.9064,0.8487)|\",2.0571807097800003\r\n\"[H]C1C=C2C(O[C@@]1([H])C([H])([H])C([H])([H])[H])C1=C([H])C([H])=C([H])C([H])=C1NC2=O 
|(5.3028,0.182,0.5043;5.0793,0.0417,-0.5499;5.8617,-0.5491,-1.4299;5.6697,-1.518,-2.3344;4.3607,-2.1315,-2.1254;3.4483,-1.511,-1.3519;3.6497,-0.0582,-1.0485;3.4971,0.4686,-1.9967;2.5935,0.3279,-0.0227;2.7511,-0.2696,0.8838;2.7859,1.3744,0.247;1.1557,0.1669,-0.5256;0.445,0.4884,0.2427;0.9391,-0.8756,-0.7748;0.9783,0.7739,-1.4212;4.0317,-3.254,-2.8623;2.7522,-3.9009,-2.771;2.0278,-3.5068,-2.0659;2.4798,-5.0039,-3.5215;1.5241,-5.5113,-3.4295;3.4621,-5.5068,-4.4428;3.2248,-6.3893,-5.0315;4.6697,-4.8953,-4.6006;5.409,-5.255,-5.3084;5.0296,-3.7212,-3.8394;6.1797,-3.1353,-4.1295;6.5637,-1.9961,-3.4491;7.6245,-1.4215,-3.6628)|\",2.054459571275\r\n\"[H]C1/C=C2\\C(O[C@@]1([H])C([H])([H])[H])C1=C([H])C([H])=C([H])C([H])=C1NC2=O |(1.5292,0.0927,3.3317;2.0208,-0.4072,2.5023;3.1525,-1.0767,2.5797;4.291,-1.0591,1.8772;4.2853,0.0744,0.9555;3.1305,0.7138,0.6887;1.8343,0.0703,1.0722;1.1451,0.9181,1.0673;1.4479,-0.9489,0.0145;1.4477,-0.4842,-0.9759;0.4433,-1.3307,0.2252;2.1425,-1.7939,0.0093;5.4371,0.3516,0.2417;5.5182,1.4065,-0.7301;4.6436,2.0306,-0.8822;6.6784,1.6362,-1.4057;6.7498,2.4537,-2.1167;7.821,0.7955,-1.1763;8.7381,0.9968,-1.7242;7.7732,-0.25,-0.3029;8.6241,-0.9029,-0.1401;6.5777,-0.5558,0.4473;6.5879,-1.6517,1.1902;5.4647,-2.0077,1.9079;5.416,-3.0013,2.6232)|\",2.032690463235001\r\n\"[H]O/N=C(\\[H])C(C1=C([H])C([H])=C(O[H])C(O[H])=C1[H])(C([H])([H])[H])C([H])([H])[H] 
|(4.7653,-2.8616,-2.3384;4.1766,-2.1121,-2.5183;3.8184,-1.6963,-1.2175;3.0061,-0.7143,-1.261;2.6829,-0.3034,-2.2225;2.4569,-0.0727,-0.0009;2.9352,1.3934,0.0252;2.0784,2.4777,0.2173;1.0109,2.3307,0.3336;2.5804,3.7835,0.2669;1.8979,4.6182,0.4199;3.9394,4.0406,0.1208;4.4862,5.2925,0.1553;3.7724,5.9334,0.2975;4.8165,2.956,-0.0814;6.1434,3.2474,-0.2298;6.6243,2.4167,-0.3689;4.3075,1.6627,-0.1266;4.9984,0.8372,-0.2911;0.9175,-0.1851,-0.0781;0.6258,-1.2389,-0.135;0.5141,0.3276,-0.9584;0.4475,0.2432,0.8135;2.9462,-0.8022,1.2693;2.633,-1.8512,1.2601;2.5246,-0.3162,2.1556;4.0357,-0.7806,1.3518)|\",5.57833393525\r\n\"[H]OC(=O)[C@@]1(O[H])C([H])([H])C(=O)[C@@]([H])(O[H])[C@@]([H])(O[H])[C@@]1([H])C([H])([H])C([H])([H])C([H])([H])[H] |(4.1444,2.5433,3.891;4.5279,1.9942,4.6043;5.4628,1.2127,4.0375;6.2205,0.5315,4.681;5.5013,1.2666,2.4903;4.9825,2.566,2.1554;4.6937,2.5588,1.2286;6.9756,1.1655,2.0412;7.0134,1.3874,0.9643;7.5969,1.8972,2.5617;7.5366,-0.2238,2.2479;8.613,-0.4681,2.7584;6.7031,-1.3665,1.67;6.8171,-1.2933,0.572;7.1925,-2.6021,2.1397;8.0567,-2.3861,2.5498;5.1938,-1.2708,1.9904;4.6785,-1.9182,1.2625;4.936,-1.7228,3.2991;5.6151,-2.3999,3.4777;4.6293,0.1563,1.8184;4.6922,0.3754,0.7365;3.1434,0.2336,2.2322;3.0558,-0.0279,3.2921;2.8065,1.2737,2.1325;2.2052,-0.6642,1.413;2.3538,-0.4763,0.3391;2.4598,-1.7168,1.5839;0.732,-0.4409,1.7713;0.0778,-1.0965,1.1861;0.4265,0.5949,1.5778;0.5495,-0.648,2.8323)|\",6.098071389705\r\n\"[H]OC([H])([H])C([H])([H])C1=C([H])C([H])=C(O/C([H])=C(\\[H])C(=O)OC([H])([H])C([H])([H])[H])C([H])=C1[H] 
|(2.5337,-4.048,-4.5572;2.2961,-4.7657,-5.1661;3.04,-5.9095,-4.7771;2.8149,-6.1974,-3.7374;2.6993,-6.722,-5.4269;4.5611,-5.7209,-4.9445;4.7642,-5.5197,-6.0029;5.0599,-6.6651,-4.6903;5.104,-4.5972,-4.09;5.5957,-4.8383,-2.7974;5.6332,-5.8578,-2.4208;6.0531,-3.8011,-1.9901;6.4458,-3.9878,-0.9957;6.0148,-2.4911,-2.4698;6.4512,-1.5016,-1.6028;7.0835,-0.4149,-2.1038;7.4495,-0.4868,-3.1251;7.2885,0.6837,-1.3623;6.9231,0.7635,-0.3451;8.0328,1.8068,-1.9498;8.4865,1.8451,-3.0813;8.1447,2.823,-1.0572;8.8592,3.9963,-1.5054;8.6221,4.175,-2.5571;8.4527,4.8098,-0.8989;10.3597,3.8455,-1.3;10.8691,4.7763,-1.5756;10.7531,3.0399,-1.9262;10.589,3.6249,-0.2522;5.5141,-2.2144,-3.7418;5.4542,-1.1921,-4.1015;5.0698,-3.2701,-4.5416;4.6914,-3.0559,-5.5378)|\",5.298056669235\r\n\"[H]OC([H])([H])[C@@]1([H])O[C@]([H])(C([H])([H])[C@@]([H])(O[H])C([H])([H])[H])[C@@]([H])(O[H])[C@@]1([H])O[H] |(4.0914,-3.5805,-3.9964;3.6053,-4.3483,-4.3412;2.2499,-4.1468,-3.9789;1.7817,-3.3668,-4.6025;1.7217,-5.0856,-4.1667;2.1331,-3.7841,-2.4976;2.5799,-4.5889,-1.9058;2.9061,-2.6039,-2.206;2.0947,-1.4256,-2.4348;2.2144,-1.1032,-3.4771;2.5426,-0.3167,-1.4835;1.9696,0.5851,-1.7396;2.278,-0.5929,-0.4547;4.0517,0.0055,-1.5031;4.1972,0.8922,-0.873;4.8047,-1.0221,-0.867;4.4959,-1.8618,-1.2497;4.6032,0.3177,-2.8992;4.0616,1.1456,-3.3741;4.5366,-0.5566,-3.5574;5.6589,0.594,-2.8209;0.6308,-1.8728,-2.2596;0.1532,-1.3667,-1.4133;-0.0897,-1.5856,-3.4645;-1.0253,-1.4806,-3.2322;0.7098,-3.4157,-2.0268;0.6223,-3.6301,-0.9577;-0.3354,-4.112,-2.6694;-0.4711,-3.6471,-3.5154)|\",7.733475631210001\r\n\"[H]C([H])([H])C(OC(=O)N1C([H])([H])[C@]([H])(F)C([H])([H])[C@@]1([H])C(=O)F)(C([H])([H])[H])C([H])([H])[H] 
|(2.3168,-2.8479,-3.0356;3.01,-2.1196,-2.6001;3.8601,-2.6545,-2.1738;3.3663,-1.4671,-3.4047;2.2836,-1.2977,-1.5303;3.1601,-0.2091,-1.0527;4.3238,-0.4747,-0.4262;4.8142,-1.573,-0.2086;4.9399,0.6905,-0.05;4.3926,2.0425,-0.1628;4.4746,2.4324,-1.1857;3.3417,2.0702,0.1346;5.3046,2.8595,0.7653;4.8865,2.9161,1.7759;5.4579,4.1573,0.3012;6.6343,2.0977,0.7557;7.2751,2.319,1.6128;7.1772,2.3568,-0.1593;6.1873,0.6203,0.6913;6.9177,-0.0184,0.1891;5.9735,0.067,2.0957;4.9711,0.037,2.7352;7.1586,-0.3478,2.6129;1.108,-0.5287,-2.138;0.6198,0.0937,-1.381;0.3686,-1.2292,-2.5394;1.4498,0.1191,-2.9516;1.8244,-2.1539,-0.3463;1.086,-2.8867,-0.6902;1.3517,-1.5257,0.4165;2.6634,-2.6854,0.1049)|\",6.449098256849999\r\n\"[H]C([H])(Cl)C(=O)N1C([H])([H])[C@]([H])(F)C([H])([H])[C@@]1([H])C(=O)F |(2.2512,2.4527,0.0765;1.3866,2.7682,-0.5132;1.2275,3.8379,-0.3936;1.7949,2.4694,-2.2555;0.1081,2.0663,-0.0658;-0.8375,2.7168,0.3657;0.0562,0.7047,-0.1298;1.1374,-0.2399,-0.4272;1.2542,-0.4007,-1.5052;2.0942,0.1004,-0.0223;0.6439,-1.5373,0.2274;0.9528,-1.5919,1.2764;1.1567,-2.6456,-0.4252;-0.8785,-1.4512,0.0962;-1.4178,-2.1053,0.7855;-1.1535,-1.7272,-0.9272;-1.1731,0.0484,0.325;-2.0391,0.397,-0.2418;-1.4879,0.3301,1.7894;-2.5593,0.3626,2.3011;-0.3456,0.4359,2.5258)|\",6.43277142582\r\n\"[H]OC(=O)N1C([H])([H])[C@@]2(C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])OC(=O)[C@]1([H])C2([H])[H] 
|(5.6765,3.5058,1.3142;5.8475,3.6693,2.2552;4.8063,3.2582,3.0342;4.7822,3.4943,4.2194;3.793,2.6053,2.3582;3.8711,1.9771,1.0275;3.747,2.7121,0.2226;4.8096,1.4303,0.8814;2.6775,0.9851,1.102;2.2217,0.3058,-0.2064;1.6248,1.3551,-1.1675;1.3,0.8623,-2.0903;2.3496,2.127,-1.4511;0.7494,1.8527,-0.7355;1.1452,-0.7553,0.1062;0.8446,-1.2578,-0.8196;0.2454,-0.3074,0.5427;1.5225,-1.5112,0.7998;3.426,-0.3859,-0.8802;3.0946,-0.935,-1.7686;3.9044,-1.0975,-0.2004;4.1819,0.3379,-1.208;3.1232,-0.0425,2.0521;3.107,0.5121,3.3178;3.4059,-0.096,4.3053;2.6821,1.9731,3.0988;2.4252,2.5217,3.9996;1.6688,1.7807,1.9682;1.3464,2.7243,1.5213;0.8033,1.181,2.2586)|\",6.691279583795\r\n\"[H]OC1=NN([H])[C@@]2([H])C([H])([H])N(C([H])([H])C3=C([H])C([H])=C([H])C([H])=C3[H])C([H])([H])C([H])([H])[C@@]12[H] |(4.7607,-6.6198,-0.9524;5.6964,-6.3706,-0.9627;5.8523,-5.2437,-0.2185;7.009,-4.7449,-0.0234;6.8611,-3.5088,0.6616;7.4862,-3.5315,1.4636;5.4308,-3.306,1.0469;5.3163,-3.3065,2.1375;4.9261,-1.9763,0.4542;5.2916,-1.9233,-0.5784;5.3828,-1.1325,0.9778;3.4679,-1.8032,0.4736;2.932,-1.1109,1.6429;1.8373,-1.1346,1.5453;3.1584,-1.6354,2.5931;3.3745,0.3368,1.7556;3.3609,1.1781,0.6341;3.0802,0.7628,-0.3296;3.7184,2.5205,0.7497;3.7044,3.1601,-0.1293;4.0944,3.0451,1.9895;4.3743,4.0914,2.0785;4.1141,2.2157,3.1106;4.4121,2.6117,4.078;3.7595,0.8698,2.9903;3.7812,0.226,3.8674;2.7001,-3.015,0.1511;2.2897,-3.4839,1.0666;1.8364,-2.7464,-0.4733;3.5668,-4.0655,-0.5506;3.9871,-3.6565,-1.4762;2.9367,-4.9178,-0.8353;4.6862,-4.5117,0.4118;4.2267,-5.1519,1.1815)|\",5.232749345115\r\n\"[H]OC(=O)SC([H])([H])[C@@]([H])(C(=O)O[H])N([H])C([H])([H])[H] 
|(5.8999,-5.177,-0.6598;5.4145,-5.466,0.1326;4.4834,-4.5584,0.489;3.7965,-4.6958,1.4626;4.3917,-3.1592,-0.6677;3.0703,-2.1856,0.166;2.1262,-2.3668,-0.3526;2.9919,-2.5897,1.1786;3.3753,-0.683,0.2322;2.5073,-0.2267,0.7357;4.5881,-0.4348,1.156;4.5097,-0.4999,2.3575;5.7323,-0.1572,0.5064;5.52,-0.2128,-0.4543;3.6483,-0.0875,-1.0847;3.7068,0.9225,-0.9579;2.6657,-0.3559,-2.1418;2.9223,0.2526,-3.0134;1.6282,-0.1264,-1.8496;2.7165,-1.4044,-2.4472)|\",6.941624326255001\r\n\"[H]C1=C([H])C([C@@]([H])(N([H])[H])C([H])([H])[H])=C(C([H])([H])[H])S1 |(3.5433,-1.5865,4.2188;3.3929,-0.8537,3.4374;2.9537,-1.052,2.1631;2.6943,-2.0331,1.7767;2.8712,0.1546,1.3832;2.4161,0.1569,-0.0698;2.5039,1.1783,-0.457;3.2347,-0.6857,-0.9632;3.1589,-1.6604,-0.6704;4.2181,-0.4416,-0.8458;0.9466,-0.2635,-0.2099;0.2934,0.3948,0.3718;0.7934,-1.2876,0.1537;0.6478,-0.2285,-1.2623;3.26,1.2631,2.0996;3.2989,2.7098,1.6934;2.4333,3.2651,2.0776;3.2973,2.8115,0.6043;4.1992,3.209,2.0691;3.7257,0.8239,3.7324)|\",5.79602501565\r\n\"[H]OC(=O)C1=C([H])C([H])=C([C@]2([H])C([H])([H])N([H])C([H])([H])[C@]2([H])C(=O)O[H])C([H])=C1[H] |(-5.6489,-0.9332,-0.9834;-4.9271,-0.2812,-1.0515;-3.8526,-0.8593,-0.4564;-3.9049,-1.9707,0.0322;-2.6436,0.0045,-0.474;-1.4773,-0.4813,0.1281;-1.4942,-1.4643,0.5874;-0.3212,0.2931,0.1328;0.5764,-0.0918,0.6121;-0.2942,1.5649,-0.4612;0.97,2.396,-0.4005;1.7049,1.827,0.1818;0.8198,3.78,0.3;0.7619,3.6904,1.3889;-0.0966,4.2786,-0.0399;1.9725,4.5763,-0.1098;2.792,4.2519,0.4031;2.1397,4.2592,-1.5336;1.4858,4.9039,-2.1284;3.1702,4.4407,-1.8508;1.6905,2.7795,-1.7579;2.5642,2.1261,-1.8787;0.8408,2.6253,-3.0202;0.3172,3.5372,-3.6137;0.7044,1.357,-3.4851;1.1014,0.7275,-2.8604;-1.4725,2.0423,-1.0615;-1.4824,3.0161,-1.5415;-2.6323,1.2734,-1.07;-3.5316,1.6536,-1.5415)|\",4.933424109565\r\n\"[H]OC(=O)[C@@]1([H])C([H])([H])N([H])C([H])([H])[C@@]1([H])C1=C([H])C(C([H])([H])C([H])([H])[H])=C([H])C([H])=C1[H] 
|(6.1771,0.4543,-3.3219;6.2662,-0.3849,-2.8395;5.4729,-1.3301,-3.4064;5.3183,-2.3898,-2.848;4.8683,-0.9589,-4.7618;5.6819,-0.523,-5.3559;4.2564,-2.1965,-5.4884;4.0761,-2.9736,-4.7395;4.9212,-2.6142,-6.2495;2.9753,-1.7898,-6.0813;3.1511,-1.3384,-6.979;2.459,-0.7887,-5.1497;1.6591,-0.1958,-5.6033;2.0388,-1.305,-4.2777;3.6739,0.0832,-4.7189;3.8619,0.7938,-5.5332;3.4763,0.8918,-3.4523;3.0879,0.2901,-2.2446;2.9564,-0.7885,-2.1999;2.8953,1.0353,-1.0766;2.4506,0.3613,0.2047;2.8179,-0.672,0.2197;2.9099,0.8695,1.062;0.9209,0.3601,0.3787;0.6358,-0.127,1.3181;0.4326,-0.1746,-0.4438;0.5249,1.382,0.3921;3.0946,2.4223,-1.1273;2.9571,3.017,-0.2269;3.4761,3.0408,-2.3158;3.629,4.1166,-2.3446;3.6651,2.2807,-3.4715;3.9537,2.7707,-4.3994)|\",5.836842093225\r\n\"[H]OC(=O)C1=C([H])C([H])=C([H])C([C@]([H])(C([H])([H])[H])[C@]2([H])N([H])C([H])([H])C([H])([H])N([H])C2([H])[H])=C1[H] |(2.1034,4.4225,4.7767;1.7432,3.7371,5.3623;1.7807,2.5267,4.7387;1.2062,1.5819,5.2261;2.5551,2.4535,3.456;3.5612,3.3708,3.12;3.8475,4.1662,3.8046;4.2451,3.2323,1.9141;5.0314,3.9353,1.6534;3.9297,2.1858,1.0465;4.4662,2.0898,0.1055;2.9353,1.2483,1.3647;2.5972,0.1187,0.402;3.2914,0.1805,-0.4432;1.1675,0.2986,-0.1491;1.0409,1.2972,-0.5811;0.4167,0.1819,0.6421;0.9446,-0.4311,-0.9339;2.7917,-1.2898,1.0589;1.9508,-1.4598,1.7458;4.0174,-1.4626,1.851;4.059,-0.7653,2.5897;5.2516,-1.4426,1.0632;5.4409,-0.48,0.5554;6.0929,-1.627,1.7416;5.1885,-2.5479,0.0103;5.1912,-3.5261,0.5268;6.0716,-2.5003,-0.6372;3.9878,-2.3471,-0.8061;3.9461,-3.0505,-1.5403;2.7779,-2.4241,0.0179;1.9037,-2.3459,-0.6353;2.7017,-3.3776,0.5714;2.2622,1.398,2.5835;1.489,0.6971,2.8838)|\",4.48443625624\r\n\"[H]OC(=O)C1=C([C@]([H])(C([H])([H])[H])[C@]2([H])N([H])C([H])([H])C([H])([H])N([H])C2([H])[H])C([H])=C([H])C([H])=C1[H] 
|(5.6559,2.5509,4.7406;4.8261,2.1023,4.9887;4.1701,1.8929,3.8136;4.6658,2.2247,2.7541;2.8284,1.273,4.0118;2.1829,0.5363,2.9878;2.8224,0.2193,1.6373;3.8937,0.4023,1.7022;2.2643,1.1919,0.5768;2.424,2.2262,0.8954;1.1879,1.0431,0.4209;2.7683,1.0648,-0.3865;2.6301,-1.2773,1.2252;1.5832,-1.4309,0.928;2.9178,-2.2711,2.2738;2.349,-2.0801,3.095;4.3317,-2.3445,2.6556;4.724,-1.4115,3.0972;4.4409,-3.1367,3.4059;5.1639,-2.6868,1.4209;4.8939,-3.7067,1.0868;6.2303,-2.6794,1.674;4.9099,-1.6728,0.3958;5.4879,-1.8495,-0.4227;3.4969,-1.6609,0.0113;3.361,-0.9476,-0.8068;3.1469,-2.6486,-0.3423;0.8779,0.0874,3.25;0.35,-0.4692,2.4808;0.2341,0.3318,4.4616;-0.7795,-0.0293,4.6155;0.8919,1.0355,5.4703;0.4038,1.2281,6.4213;2.1827,1.497,5.2414;2.7048,2.0533,6.0114)|\",4.481715117735\r\n\"[H]OC(=O)C1=C([H])C([H])=C([C@]([H])(C([H])([H])[H])[C@]2([H])N([H])C([H])([H])C([H])([H])N([H])C2([H])[H])C([H])=C1[H] |(4.9496,5.5956,4.7081;4.8261,5.01,3.9386;3.9262,4.0721,4.3382;3.4511,4.0765,5.4571;3.6085,3.0815,3.2799;4.1785,3.1363,2.0015;4.8824,3.9252,1.76;3.8403,2.1792,1.0474;4.2853,2.2323,0.0568;2.9295,1.1513,1.3331;2.5527,0.1329,0.2677;3.1487,0.3462,-0.6262;1.0651,0.2878,-0.1118;0.8473,1.3192,-0.409;0.4084,0.0401,0.7312;0.7949,-0.3616,-0.9504;2.8889,-1.3303,0.7129;2.1419,-1.6339,1.4602;4.2061,-1.5204,1.3379;4.303,-0.8986,2.1371;5.3368,-1.3385,0.423;5.4202,-0.3142,0.019;6.2598,-1.551,0.9752;5.1983,-2.3145,-0.7445;5.3024,-3.3454,-0.3572;5.9973,-2.1407,-1.4743;3.9034,-2.0853,-1.3912;3.8061,-2.6952,-2.2;2.8044,-2.3314,-0.4538;1.8558,-2.2282,-0.9891;2.8372,-3.3475,-0.0196;2.3655,1.1105,2.621;1.653,0.3294,2.8739;2.699,2.0584,3.5819;2.2624,2.0264,4.5749)|\",4.5715126884\r\n\"[H]C1=C(C([H])(C([H])([H])[H])C([H])([H])[H])C([H])([H])C([H])([H])[C@@]1([H])B1OC(C([H])([H])[H])(C([H])([H])[H])C(C([H])([H])[H])(C([H])([H])[H])O1 
|(3.2939,1.0872,-2.2367;3.1796,0.0071,-2.21;2.854,-0.6973,-1.1178;2.5859,-0.1878,0.279;1.5992,-0.5774,0.5802;3.6119,-0.7525,1.2848;3.3846,-0.411,2.3016;4.6252,-0.4177,1.0321;3.6132,-1.8479,1.2927;2.5264,1.3429,0.3686;2.2674,1.6601,1.3852;1.7767,1.7533,-0.3167;3.4951,1.7921,0.1169;2.8162,-2.1836,-1.4401;3.7086,-2.6909,-1.0412;1.9494,-2.6914,-0.9956;2.7879,-2.2154,-2.985;1.7446,-2.2525,-3.3213;3.3026,-3.0829,-3.4113;3.4257,-0.86,-3.4337;4.5116,-0.9932,-3.5718;2.8296,-0.318,-4.7873;1.534,0.1218,-4.9284;1.4032,0.712,-6.251;1.552,2.2298,-6.0701;1.4046,2.7677,-7.0124;2.5388,2.488,-5.673;0.8004,2.5772,-5.3544;0.0124,0.3862,-6.7922;-0.1122,0.7792,-7.8078;-0.7464,0.8514,-6.1548;-0.1758,-0.6896,-6.8089;2.6056,0.0544,-7.0362;2.2411,-1.2733,-7.7132;1.5839,-1.1197,-8.5758;1.7433,-1.9539,-7.0151;3.158,-1.7589,-8.0611;3.3144,0.9715,-8.0322;2.6289,1.2926,-8.825;4.1446,0.4328,-8.5002;3.7227,1.8593,-7.5444;3.5378,-0.2646,-5.9644)|\",6.873595863630001\r\n\"[H]O[C@]([H])(C1=C([H])C([H])=C([H])C([H])=C1[H])[C@]1(C([H])([H])[H])C([H])=C([H])C([H])([H])C1([H])[H] |(4.5681,-0.12,0.711;4.0059,0.2135,1.4313;2.7095,-0.3413,1.2285;2.0278,0.3159,1.783;2.5935,-1.7423,1.8216;3.6686,-2.6399,1.7908;4.6194,-2.3254,1.371;3.5453,-3.9244,2.3218;4.3933,-4.6037,2.2897;2.3437,-4.3331,2.9031;2.2491,-5.3316,3.3214;1.2699,-3.4432,2.956;0.3348,-3.7426,3.4224;1.3986,-2.1601,2.4227;0.5611,-1.4684,2.4831;2.3048,-0.2327,-0.2883;0.8683,-0.7316,-0.517;0.5942,-0.6159,-1.5714;0.151,-0.1536,0.0799;0.7522,-1.7867,-0.2511;3.3144,-0.9679,-1.1546;3.3655,-2.0532,-1.1838;4.1295,-0.1426,-1.821;4.9378,-0.4619,-2.4749;3.8065,1.3129,-1.5729;3.7,1.8749,-2.5099;4.6084,1.8071,-1.0074;2.4813,1.2496,-0.768;1.6427,1.5145,-1.4213;2.4765,1.9502,0.0714)|\",6.37834865572\r\n\"[H]C1=C([H])C([H])([H])[C@]([H])(/C([H])=C(\\[H])C(F)(F)F)OC1=O 
|(0.0364,-2.4933,-0.4173;-0.0844,-1.4439,-0.1717;-1.2642,-0.8146,-0.2181;-2.1675,-1.3484,-0.5055;-1.3707,0.6349,0.1595;-1.6896,0.7202,1.2082;-2.1385,1.1422,-0.4365;-0.0181,1.3436,-0.0314;-0.0233,2.2728,0.5516;0.2547,1.706,-1.4711;-0.5528,2.2288,-1.9802;1.3938,1.4705,-2.1176;2.2357,0.9706,-1.649;1.6161,1.8745,-3.5397;1.8832,0.7979,-4.314;0.5495,2.5047,-4.0798;2.6758,2.7077,-3.6481;1.0672,0.5859,0.5331;1.1403,-0.7669,0.3076;2.1733,-1.3448,0.5528)|\",5.8776591708\r\n\"[H]C1=C([H])C([H])=C(C(=O)N(/N=C([H])/C([H])=C(\\[H])C([H])([H])[H])C([H])([H])[H])C([H])=C1[H] |(4.1884,7.7663,-2.1175;4.7624,6.855,-1.9714;4.1167,5.6174,-1.9448;3.0398,5.5622,-2.0802;4.8448,4.4459,-1.7405;4.3334,3.4923,-1.7124;6.2369,4.5011,-1.5782;7.1369,3.3047,-1.4709;8.2514,3.3203,-1.9823;6.7019,2.1585,-0.7922;5.5845,2.2438,-0.0187;5.2087,1.2319,0.6892;5.7451,0.2798,0.7098;4.0172,1.3406,1.5117;3.4981,2.2971,1.4765;3.5574,0.3407,2.2829;4.1041,-0.6041,2.2949;2.3348,0.4101,3.146;2.5828,0.2252,4.2003;1.6032,-0.3581,2.8599;1.8469,1.3877,3.077;7.5538,0.9747,-0.8005;8.3875,1.1794,-1.4682;6.9906,0.1044,-1.1571;7.9339,0.7647,0.207;6.8817,5.7461,-1.6376;7.9625,5.7764,-1.5484;6.1487,6.9167,-1.8159;6.6598,7.8753,-1.8424)|\",4.666752536075\r\n\"[H]C1C(C#N)=C(C2C(=O)C([H])=C([H])C(C([H])([H])[H])=C2[H])N(C([H])([H])[H])N1[H] |(5.6546,5.4807,0.3592;5.6739,4.4134,0.1921;4.6446,3.5033,0.2134;3.2904,3.8662,0.4362;2.1985,4.2213,0.6261;5.2273,2.1786,0.0685;4.5777,0.9223,0.0624;5.2466,-0.2859,0.5776;6.4305,-0.307,0.9944;4.4131,-1.4743,0.626;4.8938,-2.3721,1.0034;3.1102,-1.4763,0.2209;2.534,-2.3995,0.2728;2.4646,-0.2973,-0.2752;1.0282,-0.3577,-0.7333;0.9002,-1.0421,-1.5832;0.6663,0.628,-1.0425;0.3664,-0.7179,0.0656;3.1989,0.859,-0.3279;2.7257,1.7527,-0.7238;6.588,2.3909,-0.0386;7.549,1.5632,-0.7674;7.5049,0.5652,-0.3254;8.5442,1.9932,-0.6266;7.3041,1.5453,-1.8355;6.8341,3.7635,-0.0525;7.7285,4.0613,0.3234)|\",2.85175315324\r\n\"[H]C1C(C#N)=C(C2C(=O)C([H])=C([H])C([H])=C2[H])N(C([H])([H])[H])N1[H] 
|(5.2585,-1.5433,0.4;4.4538,-0.8498,0.2025;4.4787,0.5215,0.1088;5.6662,1.2902,0.2278;6.6691,1.8705,0.3353;3.1051,0.9764,-0.0301;2.6183,2.3023,-0.1356;1.2947,2.6771,0.4005;0.5049,1.8587,0.9327;0.97,4.0896,0.3201;-0.0047,4.3709,0.708;1.8262,5.0124,-0.2123;1.532,6.0593,-0.2566;3.104,4.6303,-0.7148;3.7658,5.3743,-1.147;3.4841,3.3159,-0.6591;4.4497,3.0304,-1.0645;2.3408,-0.1715,-0.016;1.044,-0.3783,-0.6602;0.3616,0.3633,-0.2376;0.7016,-1.3881,-0.4187;1.1375,-0.271,-1.7466;3.1832,-1.2787,0.0373;2.808,-2.1104,0.481)|\",2.9116182003500004\r\n\"[H]OC1=C(C2=NN(C([H])([H])[H])C([H])=C2C#N)C([H])=C(Cl)C([H])=C1[H] |(3.4609,2.6814,-0.0768;4.1279,3.4058,-0.0755;5.364,2.8634,-0.0447;5.6167,1.4644,-0.0179;4.5105,0.4997,-0.0191;3.2412,0.9302,-0.0555;2.449,-0.164,-0.0488;1.0049,-0.0069,-0.0565;0.5459,-0.9928,-0.1454;0.7096,0.612,-0.9073;0.6738,0.47,0.8704;3.1694,-1.2986,-0.0057;2.7101,-2.276,0.0091;4.5116,-0.9364,0.0137;5.5853,-1.8613,0.0551;6.4488,-2.6419,0.0893;6.9548,1.025,0.0106;7.1828,-0.0334,0.0312;7.9993,1.9361,0.0145;9.66,1.3544,0.0491;7.7516,3.3112,-0.0104;8.5778,4.0142,-0.0071;6.4408,3.762,-0.0393;6.2137,4.8229,-0.0593)|\",4.5633492728850005\r\n\"[H]OC1=C(C2=NN(C([H])([H])[H])C([H])=C2C#N)C([H])=C(C([H])([H])[H])C([H])=C1[H] |(6.8597,0.6294,-0.1903;6.5379,-0.3001,-0.1753;5.1851,-0.2922,-0.1548;4.3971,0.8884,-0.1353;5.0255,2.2134,-0.1344;6.3611,2.3311,-0.1794;6.6465,3.6526,-0.1629;8.031,4.0888,-0.1903;8.0542,5.1793,-0.2194;8.5256,3.6891,-1.0794;8.5526,3.7342,0.703;5.5316,4.4014,-0.1073;5.559,5.481,-0.088;4.4483,3.5298,-0.0872;3.0956,3.9496,-0.0278;1.9939,4.3246,0.0213;2.9934,0.7521,-0.1135;2.3788,1.6455,-0.1012;2.3581,-0.485,-0.1093;0.8508,-0.5941,-0.0955;0.5016,-1.2338,0.7245;0.4698,-1.0308,-1.0279;0.3817,0.3873,0.0247;3.1665,-1.6351,-0.1246;2.7005,-2.6183,-0.1176;4.5489,-1.542,-0.1483;5.1761,-2.428,-0.1593)|\",4.590560657935001\r\n\"[H]OC1=C(C2=NN(C([H])([H])[H])C([H])=C2C#N)C([H])=C([H])C([H])=C1[H] 
|(3.3655,2.702,0.0039;4.0316,3.426,0.0244;5.2692,2.8823,0.0483;5.5181,1.4817,0.062;4.4106,0.5199,0.0538;3.14,0.9473,0.0137;2.3491,-0.1492,0.0176;0.9058,0.0059,-0.0119;0.4476,-0.9839,-0.0439;0.6139,0.5731,-0.8996;0.5689,0.5369,0.8828;3.0711,-1.2824,0.0607;2.6138,-2.2609,0.071;4.4126,-0.9177,0.0846;5.4855,-1.8431,0.1309;6.347,-2.6261,0.1675;6.8583,1.0466,0.085;7.0727,-0.0158,0.0946;7.9166,1.9444,0.0974;8.9376,1.5761,0.1157;7.6543,3.3192,0.0861;8.4745,4.0323,0.0952;6.3459,3.7808,0.0617;6.1161,4.8416,0.0513)|\",4.737502137205\r\n\"[H]C1=C([H])C2=C(C(=O)/C(=C(\\[H])N([H])N([H])[H])C(=O)O2)C([H])=C1[H] |(3.4047,-1.1018,1.0233;2.5114,-0.6104,0.6477;1.3331,-1.3402,0.5254;1.2791,-2.3899,0.7946;0.1874,-0.7048,0.0423;0.2094,0.6488,-0.3186;-1.0175,1.3002,-0.8269;-1.0493,2.4883,-1.1549;-2.1846,0.4145,-0.9063;-3.3706,0.9603,-1.3741;-3.3511,2.0123,-1.6544;-4.5279,0.3221,-1.5137;-4.5617,-0.6641,-1.2575;-5.7191,0.9036,-1.9889;-5.5658,1.2627,-2.9303;-5.9758,1.6842,-1.3862;-2.1386,-0.9818,-0.5161;-3.0827,-1.7584,-0.5637;-0.943,-1.4856,-0.0541;1.4082,1.3658,-0.1865;1.3995,2.4132,-0.4714;2.5542,0.7456,0.2924;3.4791,1.3058,0.3916)|\",4.639541151025\r\n\"[H]SC1=N[C@@]([H])(C2=C([H])C([H])=C([H])O2)[C@@]2([H])C([H])([H])C([H])([H])C([H])([H])O[C@]2([H])N1[H] 
|(-2.4892,1.2355,2.7929;-1.9444,0.1295,2.2391;-1.2209,0.9602,0.8121;-1.0785,0.259,-0.2455;-0.3575,0.8549,-1.3832;0.2868,0.065,-1.7899;-1.3123,1.1857,-2.4925;-1.4523,0.7135,-3.7643;-0.8412,-0.0469,-4.2306;-2.5658,1.4079,-4.3443;-2.9705,1.2856,-5.3394;-3.0241,2.2519,-3.3821;-3.8329,2.9651,-3.3348;-2.273,2.1335,-2.2476;0.5626,2.0163,-0.9394;1.3181,1.5689,-0.2775;1.287,2.8036,-2.0384;1.9054,2.137,-2.6517;0.5513,3.261,-2.7134;2.1514,3.8999,-1.3868;2.586,4.5599,-2.1483;2.9846,3.4374,-0.8423;1.3252,4.7367,-0.4036;0.5752,5.3339,-0.9489;1.9553,5.423,0.1687;0.6655,3.9189,0.5669;-0.2131,3.0013,-0.06;-0.9431,3.5584,-0.6689;-0.9071,2.2927,1.0035;-0.6593,2.598,1.9365)|\",6.1579364368150005\r\n\"[H]OC(NNC([H])C1=C([H])C([H])=C([H])C([H])=C1[H])/C([H])=C([H])/C([H])=C(\\[H])C([H])([H])[H] |(9.3002,-1.3407,-3.964;8.9059,-0.8597,-3.22;7.5652,-1.0972,-3.2164;6.9835,-0.8592,-2.0842;5.6078,-1.0408,-2.12;5.0463,-0.719,-1.006;5.6556,-0.3362,-0.1806;3.605,-0.839,-0.7884;2.7326,-1.3069,-1.7888;3.1461,-1.586,-2.7522;1.368,-1.4059,-1.5415;0.7035,-1.7683,-2.3216;0.8471,-1.0407,-0.2947;-0.2202,-1.1194,-0.1056;1.7027,-0.5746,0.7047;1.3047,-0.289,1.6746;3.0711,-0.4745,0.4589;3.7385,-0.1112,1.2371;6.9348,-1.5684,-4.4476;5.9334,-1.9655,-4.3251;7.501,-1.4961,-5.6734;8.4769,-1.0227,-5.7977;6.8787,-1.9786,-6.8893;5.8968,-2.4417,-6.788;7.4407,-1.8826,-8.1084;8.4226,-1.4136,-8.1937;6.8262,-2.3713,-9.3833;6.7202,-1.5541,-10.1097;5.8369,-2.8087,-9.2144;7.4591,-3.1318,-9.8613)|\",3.78782479896\r\n\"[H]OC([H])([H])[C@]1([H])C([H])([H])[C@@]([H])(O[H])[C@]([H])(O[H])[C@]1([H])N([H])C([H])([H])C1=C([H])C([H])=C([H])C([H])=C1[H] 
|(2.9743,2.1462,-0.4673;3.5489,1.405,-0.2212;4.7643,1.5176,-0.9734;4.5546,1.5743,-2.0511;5.306,2.4322,-0.6866;5.6423,0.3057,-0.6744;5.7385,0.246,0.4146;7.0445,0.4743,-1.3211;7.7748,0.9248,-0.6378;6.9787,1.1186,-2.2058;7.4553,-0.9312,-1.7784;8.2069,-0.9122,-2.5798;7.886,-1.7508,-0.6885;8.6871,-1.3503,-0.3182;6.1363,-1.5306,-2.2682;6.2113,-2.6251,-2.3477;5.8011,-0.9485,-3.5158;4.8411,-0.7547,-3.4245;5.1068,-1.0792,-1.1926;5.1369,-1.7952,-0.3587;3.7697,-1.0097,-1.796;3.2007,-0.3452,-1.2725;3.0812,-2.3075,-1.857;3.6927,-2.984,-2.468;2.9973,-2.7752,-0.8603;1.7052,-2.1661,-2.4714;1.564,-1.7945,-3.8163;2.4543,-1.6169,-4.4146;0.3006,-1.6548,-4.3875;0.2082,-1.3718,-5.433;-0.8456,-1.8835,-3.6205;-1.831,-1.7771,-4.0664;-0.7175,-2.2512,-2.2814;-1.6028,-2.4311,-1.677;0.5514,-2.3897,-1.7129;0.6471,-2.6764,-0.6678)|\",6.174263267845\r\n\"[H]C([H])([H])C([H])(N(C([H])(C([H])([H])[H])C([H])([H])[H])C([H])([H])[C@@]1([H])O[C@]1([H])C1([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C1([H])[H])C([H])([H])[H] 
|(5.0184,1.208,0.7324;4.5652,0.4104,0.1317;4.8465,0.5871,-0.913;5.0156,-0.5387,0.444;3.0341,0.3953,0.3192;2.8358,0.269,1.3867;2.3376,-0.723,-0.3563;2.5877,-2.0737,0.1929;3.5767,-2.4479,-0.14;2.5762,-2.1026,1.7284;2.612,-3.1426,2.0702;1.6554,-1.6517,2.1174;3.4313,-1.587,2.1749;1.5257,-3.0588,-0.32;1.7224,-4.063,0.0726;1.5125,-3.1294,-1.4105;0.5291,-2.744,0.0106;2.4231,-0.6707,-1.8234;3.0492,0.172,-2.1378;2.9171,-1.565,-2.2297;1.0715,-0.4757,-2.4961;0.3971,0.1903,-1.9534;1.1285,-0.2301,-3.9133;0.4703,-1.424,-3.45;1.0398,-2.3309,-3.6674;-1.0265,-1.4907,-3.701;-1.4413,-0.5437,-3.3305;-1.697,-2.65,-2.9287;-1.3977,-2.6209,-1.873;-2.7841,-2.4873,-2.9472;-1.399,-4.0347,-3.531;-0.3361,-4.2819,-3.3974;-1.9603,-4.8023,-2.9829;-1.7462,-4.0868,-5.0268;-2.8342,-3.9739,-5.1477;-1.4869,-5.0687,-5.4434;-1.0349,-2.9693,-5.8051;-1.3319,-2.9928,-6.8615;0.0497,-3.1487,-5.7897;-1.3445,-1.5855,-5.2109;-2.4143,-1.3729,-5.3516;-0.794,-0.8024,-5.745;2.4095,1.7462,-0.0594;2.8422,2.5426,0.5569;1.3279,1.7277,0.1105;2.5868,2.0135,-1.1072)|\",7.54299593586\r\n\"[H]C([H])([H])C([H])([H])C([H])([H])[C@@]1([H])O[C@]1([H])C([H])([H])N(C([H])(C([H])([H])[H])C([H])([H])[H])C([H])(C([H])([H])[H])C([H])([H])[H] 
|(6.4361,-5.6548,-3.8699;6.6998,-4.6677,-4.2659;7.7503,-4.7047,-4.5795;6.0928,-4.4956,-5.1636;6.4728,-3.5702,-3.2222;7.0704,-3.7777,-2.3256;5.4214,-3.58,-2.9011;6.8323,-2.1685,-3.7371;6.2333,-1.9341,-4.6296;7.8856,-2.1469,-4.0464;6.602,-1.0919,-2.7031;5.5648,-0.9814,-2.3726;7.5508,-1.0337,-1.6166;7.4856,0.069,-2.538;8.3498,0.1386,-3.2035;7.0034,1.3883,-1.9592;7.7292,1.7268,-1.1992;6.072,1.1975,-1.4186;6.7871,2.3884,-3.0019;5.4978,3.1003,-2.9892;5.6101,3.9283,-3.698;4.3835,2.1925,-3.536;3.4327,2.7349,-3.6042;4.2213,1.3264,-2.8827;4.6482,1.8192,-4.53;5.0771,3.715,-1.636;4.1633,4.3083,-1.7612;5.8506,4.3707,-1.227;4.8597,2.9431,-0.8885;7.9945,3.1439,-3.3785;8.8225,2.435,-3.2478;7.9805,3.5228,-4.8675;8.9446,3.9542,-5.1611;7.2098,4.2687,-5.0953;7.7897,2.6388,-5.4841;8.3147,4.3697,-2.4978;9.2972,4.7811,-2.7586;8.3358,4.1052,-1.4346;7.5768,5.168,-2.6372)|\",7.333468270975\r\n\"[H]C1([H])N(C([H])([H])[C@@]2([H])O[C@]2([H])C2([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C2([H])[H])C([H])([H])C([H])([H])C([H])([H])C1([H])[H] 
|(2.6294,-3.3102,3.1561;3.1215,-2.5169,3.73;4.1328,-2.4105,3.3101;2.4138,-1.2503,3.4977;0.9539,-1.3502,3.5971;0.6114,-2.0338,4.3938;0.5469,-0.3644,3.847;0.3535,-1.7939,2.2772;0.7597,-2.7227,1.8689;-1.0773,-1.6896,2.1716;-0.2535,-0.8502,1.3325;-0.2729,0.2,1.6314;-0.3174,-1.1228,-0.1584;-0.2507,-2.2131,-0.2757;0.8748,-0.4804,-0.905;1.8145,-0.7402,-0.4006;0.9261,-0.9203,-1.9113;0.7511,1.0476,-1.045;0.8378,1.522,-0.0572;1.5891,1.4311,-1.641;-0.5869,1.4478,-1.6848;-0.6187,1.0726,-2.7189;-0.6689,2.5405,-1.7461;-1.7737,0.8637,-0.9037;-2.7195,1.1223,-1.3966;-1.8148,1.3224,0.0947;-1.6615,-0.6647,-0.77;-1.7562,-1.1145,-1.7687;-2.4868,-1.0606,-0.1662;3.0173,-0.1664,4.2804;4.0307,-0.0018,3.8841;2.4521,0.7536,4.0917;3.116,-0.4533,5.7899;3.625,0.3791,6.2941;2.1064,-0.5123,6.2209;3.8645,-1.7735,6.0331;4.9155,-1.6503,5.7302;3.8751,-2.0245,7.1011;3.2337,-2.9114,5.2154;2.2359,-3.1409,5.6147;3.8246,-3.8323,5.3075)|\",7.52122682782\r\n\"[H]O/C(=N/[C@@]([H])(C1=C([H])C([H])=C([H])C([H])=C1[H])[C@]([H])(F)C(=O)OC([H])([H])[H])C([H])([H])[H] |(3.0576,-1.2468,1.4672;3.5493,-0.5315,1.0382;2.746,0.0872,0.1251;3.2631,1.0545,-0.5114;2.4688,1.7526,-1.5143;1.8043,1.0785,-2.073;1.6235,2.8688,-0.9098;0.3096,3.0772,-1.3486;-0.1055,2.4339,-2.1212;-0.4645,4.1093,-0.8123;-1.4842,4.2552,-1.1594;0.0697,4.9446,0.1686;-0.5309,5.7467,0.5894;1.3808,4.7435,0.6092;1.8034,5.3933,1.3713;2.1541,3.7144,0.0749;3.1706,3.553,0.4165;3.4535,2.3021,-2.5734;4.0976,1.4757,-2.8997;4.2581,3.2834,-2.0319;2.701,2.817,-3.7981;1.9033,2.1288,-4.4031;3.0237,4.0738,-4.1211;2.3353,4.6081,-5.2659;2.7162,5.6222,-5.3852;1.2568,4.6189,-5.0896;2.547,4.008,-6.1547;1.3407,-0.4708,-0.0205;1.2353,-1.0025,-0.9737;0.6076,0.3408,-0.0102;1.0839,-1.1689,0.783)|\",6.45454053386\r\n\"[H]OC([H])([H])[C@@]([H])(C([H])([H])[H])C([H])([H])[C@@]([H])(C([H])([H])[H])[C@@]1([H])OC(C([H])([H])[H])(C([H])([H])[H])OC([H])([H])[C@@]1([H])C([H])([H])[H] 
|(8.4795,0.637,-0.8483;7.7126,0.0453,-0.8618;6.7524,0.5949,-1.7622;6.4675,1.6186,-1.4635;7.1699,0.656,-2.7792;5.5099,-0.2992,-1.7449;5.8577,-1.3206,-1.9531;4.8659,-0.2968,-0.35;4.041,-1.0169,-0.2981;4.458,0.695,-0.1125;5.6003,-0.5545,0.418;4.4909,0.1314,-2.8207;4.3313,1.2131,-2.7422;3.523,-0.3386,-2.5899;4.8553,-0.2125,-4.2834;5.9003,0.0696,-4.4651;4.6968,-1.7168,-4.5566;5.2572,-2.3096,-3.8266;5.0586,-2.0002,-5.5504;3.6433,-2.0188,-4.4853;3.9925,0.6147,-5.2553;2.9357,0.3959,-5.0246;4.2599,1.9996,-4.9789;3.4929,2.9496,-5.7197;2.0297,2.9785,-5.2471;1.4808,3.7512,-5.7937;1.9935,3.2072,-4.1777;1.5181,2.0259,-5.4055;4.176,4.2931,-5.4988;3.657,5.0752,-6.0602;5.2106,4.2354,-5.8468;4.1712,4.5514,-4.4359;3.5812,2.6997,-7.1207;3.299,1.357,-7.5029;2.2405,1.1111,-7.3198;3.4643,1.319,-8.5844;4.206,0.3608,-6.7642;3.8415,-0.6478,-7.0038;5.6668,0.4916,-7.2137;5.7431,0.3676,-8.3004;6.3058,-0.2669,-6.7496;6.0653,1.4776,-6.9593)|\",8.88723835733\r\n\"[H]C(=O)[C@@]([H])(Cl)[C@]([H])(C([H])([H])[H])C([H])([H])C([H])([H])C([H])=C(C([H])([H])[H])C([H])([H])[H] |(10.221,-3.0871,-1.8806;9.4922,-2.2849,-1.6434;9.8468,-1.1748,-1.3231;8.0235,-2.6812,-1.7992;7.8,-2.6359,-2.8729;7.8896,-4.4635,-1.3939;7.0368,-1.7929,-1.0279;7.338,-0.7766,-1.3177;7.1932,-1.9155,0.4949;6.5819,-1.1656,1.0061;8.2311,-1.7538,0.8011;6.8792,-2.9039,0.8467;5.5918,-2.0283,-1.5108;5.5755,-1.9803,-2.6088;5.2831,-3.0452,-1.2352;4.5422,-1.0281,-0.9715;3.566,-1.3268,-1.3677;4.4669,-1.1374,0.1189;4.8367,0.4136,-1.3002;5.629,0.8705,-0.7051;4.2548,1.1889,-2.2294;4.6902,2.6224,-2.423;3.8527,3.3156,-2.2606;5.0399,2.7952,-3.4507;5.4981,2.9009,-1.7392;3.1447,0.7457,-3.1516;2.2624,1.3901,-3.0341;2.8291,-0.2874,-2.9878;3.4554,0.8376,-4.2018)|\",5.07764445033\r\n\"[H]C1=C2N([H])C([H])=C([H])N2N(C([H])([H])C(=O)C2([H])C([H])([H])C2([H])[H])C([H])=C1[H] 
|(1.1524,-2.32,-1.0286;1.0087,-1.4682,-0.3774;-0.2358,-0.963,-0.095;-1.5284,-1.3454,-0.4011;-1.7391,-2.2514,-0.7902;-2.4348,-0.6053,0.3849;-3.4975,-0.7785,0.332;-1.7227,0.2976,1.094;-2.0295,1.0777,1.7721;-0.3785,0.0948,0.7885;0.6796,0.971,1.0951;0.7556,2.1582,0.1692;-0.2067,2.6772,0.24;1.5376,2.7922,0.595;1.1316,1.8823,-1.2864;2.3048,1.9134,-1.6321;0.0326,1.6133,-2.2631;-0.9761,1.5666,-1.8621;0.3504,0.7481,-3.4689;1.3544,0.3357,-3.4901;-0.437,0.0898,-3.8248;0.2004,2.2184,-3.6577;-0.6899,2.6032,-4.1471;1.1068,2.7976,-3.8072;1.9012,0.1971,1.1804;2.6495,0.6037,1.8507;2.0728,-0.911,0.4243;3.0176,-1.4459,0.4832)|\",3.523874363975\r\n\"[H]OC([H])([H])/C([H])=C(/C(=C=C([H])[H])[Si](C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C([H])([H])[H] |(3.6022,1.9379,3.9873;4.1925,1.3274,4.4569;5.1608,0.8771,3.5051;5.8674,1.6809,3.2536;5.7315,0.1043,4.0349;4.5049,0.2945,2.2851;3.7635,-0.4706,2.5118;4.7049,0.6132,0.9924;3.9081,-0.0799,-0.0771;2.5992,-0.1509,0.0156;1.2937,-0.2317,0.0787;0.6506,0.5164,-0.3815;0.7986,-1.048,0.6016;4.72,-0.9435,-1.5789;5.2369,0.3248,-2.8932;5.6432,-0.1879,-3.7743;6.0015,1.0202,-2.5315;4.3753,0.9169,-3.2227;6.233,-1.9119,-0.9726;6.7157,-2.4375,-1.8055;5.9406,-2.661,-0.2279;6.9841,-1.2627,-0.5087;3.4704,-2.1342,-2.3501;3.8953,-2.5979,-3.2489;2.55,-1.6173,-2.6415;3.1945,-2.937,-1.6573;5.7056,1.6362,0.5035;6.2052,2.1635,1.3192;5.2131,2.3801,-0.1351;6.4864,1.1653,-0.1072)|\",5.92119738688\r\n\"[H]O[C@@]1([H])O[C@]([H])(C([H])([H])C2=C([H])C([H])=C([H])C([H])=C2[H])[C@@]([H])(O[H])[C@@]1([H])O[H] 
|(6.3871,-2.5171,-0.2943;6.3312,-2.8687,0.6172;4.9486,-2.8842,0.9367;4.8925,-3.2054,1.9785;4.3342,-1.625,0.8289;3.7314,-1.4532,-0.4792;4.2926,-0.68,-1.0181;2.2811,-0.9652,-0.3132;2.3253,-0.0426,0.2781;1.9114,-0.6847,-1.3075;1.327,-1.947,0.3336;0.4123,-2.6766,-0.4385;0.3739,-2.5176,-1.5143;-0.458,-3.5946,0.1531;-1.161,-4.1473,-0.4646;-0.4287,-3.7956,1.5334;-1.1069,-4.5068,1.9973;0.4725,-3.0699,2.316;0.4963,-3.2142,3.3931;1.3403,-2.1531,1.7221;2.0393,-1.591,2.3344;3.9176,-2.7918,-1.2059;3.0509,-3.068,-1.8188;5.0946,-2.703,-2.0161;5.3439,-3.6341,-2.1743;4.1915,-3.7901,-0.0588;3.2554,-4.1491,0.3776;4.9675,-4.8946,-0.4928;5.8666,-4.7148,-0.1545)|\",6.51984785798\r\n\"[H]C#CC([H])([H])OC(=O)C([H])([H])C([H])([H])[C@@]1([H])N=C(O[H])OC1=O |(10.6427,-4.4992,-1.7019;10.6627,-3.4332,-1.6685;10.6813,-2.2278,-1.6213;10.7289,-0.7691,-1.5695;11.1168,-0.3592,-2.5062;11.3551,-0.4333,-0.7383;9.408,-0.1919,-1.4301;8.9796,0.0269,-0.1617;9.6454,-0.2003,0.8251;7.5739,0.5939,-0.1707;7.5771,1.509,-0.7739;6.9334,-0.1198,-0.7032;7.0164,0.8703,1.2279;7.0966,-0.0221,1.8569;5.9524,1.118,1.1364;7.7381,2.0245,1.9456;8.7911,1.7444,2.0724;7.6332,3.305,1.2406;7.107,4.1034,2.0807;6.8106,5.3882,1.909;7.0823,5.6156,1.0019;6.7784,3.654,3.3298;7.1392,2.2931,3.3277;6.9542,1.5842,4.2698)|\",6.827336509044999\r\n\"[H]C(=O)/C([H])=C(/Cl)C1=C(Cl)C([H])=C(C([H])([H])[H])OC1=O |(7.8267,-0.2974,-2.7909;6.9688,-0.1575,-2.1006;5.8261,-0.225,-2.5156;7.3684,0.1129,-0.7034;8.4342,0.1378,-0.4964;6.4936,0.3208,0.2952;7.1067,0.6313,1.9168;5.0227,0.2655,0.1785;4.2036,1.3586,0.1907;4.8561,2.9712,0.3263;2.7879,1.2417,0.0574;2.1595,2.1218,0.0542;2.2544,0.0023,-0.0872;0.806,-0.3167,-0.2625;0.2016,0.5929,-0.2416;0.4662,-0.9915,0.5314;0.6456,-0.8303,-1.2175;3.0313,-1.1013,-0.0852;4.4317,-1.0683,0.0517;5.0136,-2.122,0.0679)|\",4.27218745285\r\n\"[H]C1=C([H])C([H])=C(C(=O)[C@]2([H])[C@]3([H])C(=O)OC([H])([H])C([H])([H])[C@@]32[H])C([H])=C1[H] 
|(-1.3686,0.4686,3.724;-0.6085,0.3811,2.9522;-0.5999,-0.7265,2.0984;-1.3547,-1.5008,2.2048;0.3753,-0.8387,1.1123;0.404,-1.6926,0.4436;1.3502,0.1617,0.9572;2.379,-0.0235,-0.1164;2.5056,-1.1055,-0.6803;3.2609,1.1296,-0.4749;2.9487,2.0984,-0.1014;3.9561,1.2017,-1.8324;3.9348,2.1854,-2.2928;3.8088,0.1001,-2.8557;2.9601,0.1244,-3.7128;4.7248,-0.9008,-2.8362;5.8527,-0.8277,-1.9424;6.2831,-1.8319,-1.9549;6.5907,-0.1322,-2.3656;5.4573,-0.3981,-0.5319;6.3561,-0.339,0.0918;4.7997,-1.1553,-0.0981;4.7815,0.9668,-0.6057;5.3566,1.8063,-0.2223;1.3335,1.2678,1.8203;2.0879,2.0436,1.7381;0.362,1.3744,2.8144;0.3643,2.2323,3.481)|\",5.058596480795\r\n\"[H]C([H])=C([H])C(=O)O[C@]([H])(/C([H])=C(\\[H])C(F)(F)F)C([H])([H])C([H])=C([H])[H] |(8.3858,3.095,0.7589;7.377,2.7031,0.8459;6.9129,2.7089,1.828;6.7179,2.2243,-0.2114;7.151,2.2017,-1.2068;5.341,1.69,-0.0666;4.7055,1.6359,0.9672;4.8744,1.2582,-1.2678;3.5408,0.6776,-1.2845;3.3569,0.2234,-0.3078;3.5538,-0.362,-2.3678;3.8385,-0.0255,-3.3611;3.2033,-1.6304,-2.1662;2.9109,-2.0052,-1.1882;3.1689,-2.6583,-3.2523;1.9212,-3.1623,-3.4048;3.5573,-2.1686,-4.4472;3.9772,-3.7021,-2.9554;2.4954,1.7898,-1.519;1.5009,1.3275,-1.4984;2.5547,2.4661,-0.6558;2.6843,2.5593,-2.799;3.6418,3.0672,-2.9118;1.7739,2.6562,-3.7685;1.9574,3.2359,-4.6691;0.8049,2.166,-3.6963)|\",5.68717947545\r\n\"[H]O[C@]([H])(/C([H])=C(\\[H])C(F)(F)F)C([H])([H])C([H])=C([H])[H] |(5.1913,-1.9084,2.2043;5.6358,-1.5056,1.4368;4.7781,-0.4844,0.9598;5.1046,-0.2841,-0.0703;4.9047,0.8046,1.7388;4.2485,1.6181,1.4333;5.7609,0.989,2.7413;6.4421,0.2035,3.0534;5.8744,2.2683,3.501;7.1233,2.7833,3.4146;5.0165,3.2181,3.0671;5.6262,2.0763,4.8201;3.3062,-0.9783,0.8918;2.682,-0.1804,0.4674;3.2738,-1.8281,0.2011;2.7877,-1.3906,2.2435;2.5759,-0.587,2.9493;2.6325,-2.6581,2.637;2.2816,-2.9091,3.6345;2.8271,-3.4921,1.9653)|\",7.025979619910001\r\n\"[H]/N=C(O[H])\\C(=C(/[H])[C@@]1([H])C2=C(C([H])([H])[H])C([H])([H])C([H])([H])[C@@]2([H])[C@]([H])(C([H])([H])[H])C([H])([H])C1([H])[H])C([H])([H])[H] 
|(7.7142,-2.7023,-2.9587;8.3261,-2.147,-3.5574;7.6529,-1.2338,-4.134;8.3264,-0.4642,-5.0425;7.7766,0.2914,-5.2966;6.1961,-0.9074,-3.9669;5.645,-1.0213,-2.7449;6.3044,-1.3155,-1.928;4.2437,-0.6817,-2.3128;3.6274,-0.4504,-3.195;3.5129,-1.7626,-1.53;3.7197,-3.0908,-1.4464;4.7509,-3.9722,-2.0913;4.2582,-4.8049,-2.6125;5.4064,-4.429,-1.336;5.3755,-3.4513,-2.818;2.6454,-3.7606,-0.6073;1.9676,-4.3342,-1.2597;3.0628,-4.4847,0.1057;1.9219,-2.5905,0.0857;2.3426,-2.4396,1.0882;0.848,-2.7634,0.2068;2.2408,-1.3519,-0.7838;1.4393,-1.2053,-1.5321;2.3623,-0.0299,-0.0009;3.076,-0.199,0.8217;1.0245,0.407,0.6078;1.1359,1.3378,1.1769;0.6217,-0.3525,1.2872;0.2759,0.5836,-0.1757;2.9567,1.0548,-0.9109;2.2573,1.2624,-1.7357;3.0659,1.9952,-0.3543;4.3154,0.6329,-1.4778;5.023,0.4859,-0.6495;4.7336,1.4259,-2.1095;5.4555,-0.4692,-5.2148;4.3736,-0.5593,-5.0957;5.7503,-1.0745,-6.0793;5.6529,0.5824,-5.4739)|\",5.526632303655001\r\n\"[H]O/C(=N/[C@]1(O[H])C([H])([H])C([H])([H])[C@@]([H])(N([H])[H])C1([H])[H])OC(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] |(6.7866,1.1116,0.5354;6.0507,1.6788,0.2417;5.0697,0.7974,-0.0886;5.3007,-0.45,0.0331;4.2812,-1.444,-0.2481;3.1344,-1.2844,0.6103;3.4617,-1.3699,1.521;4.8652,-2.8594,-0.0735;5.8487,-2.8845,-0.5564;5.0246,-3.1037,0.9845;3.8645,-3.7904,-0.7868;3.1482,-4.2151,-0.0746;4.3709,-4.6336,-1.2702;3.087,-2.8955,-1.8206;3.2336,-3.2799,-2.8367;1.6349,-2.8009,-1.6135;1.4795,-2.4135,-0.682;1.2332,-3.7389,-1.6049;3.7274,-1.4953,-1.6834;3.0077,-0.6996,-1.881;4.5697,-1.3836,-2.3755;3.9412,1.3473,-0.5358;3.6118,2.7882,-0.5318;2.1732,2.7808,-1.0569;1.7789,3.802,-1.0889;1.5327,2.1759,-0.4083;2.1334,2.3605,-2.0666;4.5331,3.5483,-1.4906;5.5664,3.5521,-1.14;4.1918,4.5861,-1.5775;4.4995,3.0955,-2.4871;3.6583,3.3376,0.8975;3.28,4.3658,0.9051;4.6744,3.3426,1.2962;3.0212,2.7362,1.5545)|\",6.859990171105\r\n\"[H]C1/C([H])=C([H])\\N=C2\\NC(=O)C(C(F)(F)F)=C([H])[C@@]12[H] 
|(2.9753,-0.5237,3.3973;2.7555,-0.0708,2.4331;2.6162,-0.7977,1.3157;2.7039,-1.8797,1.313;2.3841,-0.0965,0.0516;2.4671,-0.6678,-0.8747;2.0795,1.1528,-0.0686;1.9469,1.9067,1.1052;1.2263,2.9699,1.0326;0.9759,3.7381,2.1901;0.4387,4.8253,2.104;1.3718,3.1722,3.5194;0.8722,3.8701,4.76;1.3615,5.1177,4.8602;-0.4702,3.945,4.7732;1.2487,3.2015,5.8799;2.1709,2.1041,3.5949;2.4926,1.7097,4.5552;2.684,1.4373,2.3532;3.7372,1.7703,2.2331)|\",4.030006125905\r\n\"[H]O[C@@]1([H])C([H])([H])N([H])C([H])([H])[C@@]1([H])C([H])([H])C1=C([H])C([H])=C([H])C([H])=C1[H] |(5.1657,-0.4187,2.323;4.3383,-0.5298,1.8304;4.6273,-1.3035,0.6644;5.2804,-0.7299,-0.011;5.2362,-2.6919,1.0264;5.5905,-2.6589,2.0665;6.0971,-2.9448,0.3969;4.1722,-3.7108,0.8835;4.3321,-4.2247,0.0199;2.912,-2.961,0.7411;2.1609,-3.5657,0.2259;2.5222,-2.7158,1.738;3.2869,-1.6642,-0.0051;3.5116,-1.9375,-1.0475;2.2383,-0.5367,-0.0064;2.0117,-0.2666,1.0294;2.6824,0.355,-0.468;0.9656,-0.9074,-0.741;0.8934,-0.8286,-2.1395;1.7568,-0.472,-2.6981;-0.2679,-1.1911,-2.8223;-0.3026,-1.1181,-3.9065;-1.3858,-1.6394,-2.1152;-2.2933,-1.9173,-2.6447;-1.3309,-1.7186,-0.7231;-2.1972,-2.0596,-0.1618;-0.1661,-1.3538,-0.0454;-0.1343,-1.4112,1.0405)|\",6.043648619605\r\n\"[H]O[C@@]1([H])C([H])([H])N([H])C([H])([H])[C@@]1([H])C1=C(OC([H])([H])[H])C([H])=C([H])C([H])=C1[H] 
|(3.1005,-0.7834,1.3426;3.9528,-0.9565,1.779;4.867,-1.2787,0.7526;5.8253,-1.3944,1.2694;4.5092,-2.5845,-0.0159;3.4179,-2.7108,0.0198;4.9568,-3.4807,0.4245;4.9208,-2.4223,-1.4148;5.9242,-2.5944,-1.4816;4.668,-1.0017,-1.696;5.2335,-0.6732,-2.5734;3.6036,-0.8768,-1.9165;5.0508,-0.2135,-0.4052;6.1257,-0.0059,-0.4679;4.3723,1.1291,-0.2315;2.9715,1.2889,-0.1382;2.2225,0.1258,-0.1408;0.8045,0.2232,-0.2085;0.4894,0.7837,-1.0968;0.3891,0.6985,0.6888;0.4391,-0.8029,-0.2768;2.3911,2.5551,-0.0404;1.3151,2.665,0.0292;3.199,3.6949,-0.0227;2.7366,4.675,0.0549;4.5814,3.5678,-0.1006;5.2192,4.4468,-0.0849;5.1453,2.2946,-0.2047;6.226,2.1951,-0.274)|\",5.6381989823600005\r\n\"[H]O[C@@]1([H])C([H])([H])N([H])C([H])([H])[C@@]1([H])C1=C([H])C(OC([H])([H])[H])=C([H])C([H])=C1[H] |(6.4368,-1.8853,-2.5282;6.9299,-1.133,-2.8976;6.3346,-0.7901,-4.131;6.6932,0.2244,-4.3404;6.7157,-1.7032,-5.32;6.7075,-2.7527,-4.991;7.7118,-1.4939,-5.7211;5.6801,-1.5088,-6.3383;5.8491,-0.6176,-6.8045;4.4279,-1.4122,-5.5674;3.6766,-0.8473,-6.1272;4.0282,-2.4229,-5.4255;4.751,-0.7668,-4.1773;4.4264,0.2789,-4.1789;4.0961,-1.4488,-2.9933;3.3835,-0.6815,-2.0582;3.3029,0.3883,-2.2165;2.7904,-1.281,-0.9422;2.0747,-0.6106,0.0064;1.9269,0.7944,-0.1252;1.332,1.1134,0.7325;1.3994,1.0606,-1.0507;2.898,1.3063,-0.1037;2.9137,-2.6641,-0.7418;2.4487,-3.1087,0.1322;3.6227,-3.4258,-1.6603;3.7186,-4.4973,-1.5062;4.2113,-2.831,-2.7827;4.7388,-3.454,-3.5003)|\",5.776977046114999\r\n\"[H]O[C@@]1([H])C([H])([H])N([H])C([H])([H])[C@@]1([H])C1=C([H])C(Cl)=C([H])C([H])=C1[H] 
|(-0.0404,-3.2191,-2.7427;-0.7956,-2.611,-2.7797;-1.6887,-2.9439,-1.7261;-2.6076,-2.4056,-1.9726;-1.9154,-4.4869,-1.6051;-1.3553,-4.9755,-2.4152;-2.964,-4.7739,-1.7316;-1.4102,-4.9386,-0.2942;-2.1938,-5.02,0.3505;-0.5318,-3.8643,0.1843;-0.4064,-3.9166,1.2696;0.4657,-3.9855,-0.2623;-1.1938,-2.5494,-0.2962;-2.1024,-2.4266,0.3104;-0.3835,-1.2807,-0.1795;-0.9351,-0.1713,0.4763;-1.9375,-0.2242,0.8898;-0.2037,1.0075,0.6023;-0.9292,2.3815,1.4339;1.0862,1.1214,0.0905;1.6413,2.0466,0.2005;1.6388,0.0171,-0.5594;2.6442,0.0858,-0.9655;0.9173,-1.1678,-0.6938;1.3751,-2.0073,-1.2052)|\",5.945687633425\r\n\"[H]C1C([H])=C(F)C([H])=C2N=C(C([H])([H])[H])NC(=O)[C@@]12[H] |(4.8742,-2.9418,-2.2802;5.2684,-2.3935,-1.4313;6.5408,-2.5369,-1.0152;7.2457,-3.2066,-1.4968;7.0098,-1.7464,0.0995;8.2753,-1.9673,0.4749;6.2772,-0.794,0.735;6.7064,-0.1773,1.516;4.9404,-0.5385,0.2865;4.2671,0.4836,0.7459;3.0509,0.7602,0.1297;2.2518,1.8277,0.818;2.1008,1.5571,1.8697;1.2912,1.9709,0.322;2.8148,2.7688,0.8156;2.5842,0.2625,-0.9873;3.2854,-0.7678,-1.6021;3.0531,-1.1581,-2.7287;4.3237,-1.4909,-0.7115;3.6678,-2.1508,-0.1045)|\",3.755171136900001\r\n\"[H]O[C@]1(C([H])([H])[H])C([H])([H])[C@@]2([H])C([H])([H])[C@]([H])(C2(C([H])([H])[H])C([H])([H])[H])[C@]1([H])O[H] |(6.2549,-0.5129,-1.963;5.5764,0.1415,-1.7364;5.288,0.019,-0.325;6.6268,0.004,0.4256;6.4764,-0.0466,1.5091;7.2015,0.9084,0.1985;7.2235,-0.8725,0.1367;4.4841,-1.3007,-0.0567;4.3345,-1.8048,-1.0184;5.091,-1.975,0.5628;3.1343,-1.0785,0.6383;2.6583,-2.0371,0.8801;3.3238,-0.0412,1.7782;4.2812,-0.0445,2.3134;2.523,-0.0615,2.5204;3.1181,1.0386,0.6813;2.6295,1.9812,0.9533;2.2656,-0.0101,-0.1148;0.8048,-0.0207,0.3701;0.2773,-0.8998,-0.0219;0.2761,0.8696,0.0072;0.7073,-0.0384,1.46;2.2536,0.0019,-1.6467;1.7066,-0.8744,-2.0215;3.2486,-0.0038,-2.0908;1.7375,0.8961,-2.0146;4.4788,1.3405,0.0458;5.0752,1.8654,0.8113;4.3104,2.2134,-1.0574;4.8836,1.8732,-1.7666)|\",8.00558948171\r\n\"[H]C1([H])C(=O)C([H])([H])[C@@]2([H])C(C(F)(F)F)=NC(=O)N=C2C1([H])[H] 
|(2.3335,2.5226,-0.1793;2.0909,1.7684,0.5746;1.0547,1.9666,0.8878;2.1138,0.3949,-0.0885;1.8164,0.2288,-1.252;2.548,-0.7745,0.7983;3.6355,-0.8869,0.6919;2.096,-1.6862,0.4056;2.185,-0.5481,2.2844;1.0853,-0.4787,2.3491;2.5933,-1.6382,3.2643;2.678,-3.0824,2.7758;1.544,-3.4005,2.1064;3.712,-3.2261,1.9147;2.8383,-3.9507,3.7687;2.8602,-1.4209,4.4894;2.699,-0.0686,4.9476;2.3411,0.1441,6.0792;2.9979,0.9823,4.0451;2.7467,0.773,2.8071;3.0316,1.8484,1.7866;2.9858,2.8224,2.2799;4.0719,1.7197,1.45)|\",4.58783951943\r\n\"[H]C1C2=C(NC(=O)[C@]1([H])C(=O)OC([H])([H])[H])C([H])=NC([H])=C2[H] |(4.3537,0.8197,0.5589;3.3166,0.6008,0.8073;2.7812,-0.6217,0.5734;1.3692,-0.847,0.9386;0.5596,0.0163,1.4866;1.0339,1.3056,1.7644;0.3241,2.133,2.3004;2.4869,1.6926,1.403;2.9672,2.0409,2.3274;2.4771,2.8963,0.4408;1.6998,3.0414,-0.4704;3.4873,3.7357,0.7335;3.5997,4.8864,-0.1294;3.7394,4.5733,-1.167;4.4702,5.4338,0.2314;2.6989,5.5004,-0.0566;0.8419,-2.1863,0.6563;-0.2011,-2.3616,0.9174;1.5259,-3.1498,0.1313;2.8626,-2.9177,-0.1992;3.3744,-3.7708,-0.6345;3.4974,-1.7376,-0.007;4.5401,-1.6111,-0.2855)|\",3.0313482945699994\r\n\"[H]C([H])=C([H])C([H])([H])O/N=C(/C1=C([H])C([H])=C(N(=O)=O)C([H])=C1[H])C([H])([H])[H] |(5.0365,2.0822,-2.1174;5.6051,1.5049,-1.3943;6.5287,1.0522,-1.7399;5.1787,1.3633,-0.1406;4.2458,1.8307,0.1744;5.8636,0.5948,0.949;5.2336,-0.2386,1.2909;6.0431,1.2409,1.8214;7.1106,0.0783,0.4765;7.646,-0.7588,1.4428;8.8103,-1.2244,1.138;9.4073,-2.1436,2.1409;10.6459,-2.7635,1.9011;11.1809,-2.5775,0.977;11.2101,-3.6293,2.8343;12.163,-4.1113,2.6557;10.5263,-3.8756,4.0209;11.1169,-4.7873,5.0093;12.2036,-5.2998,4.7384;10.4917,-4.9864,6.0514;9.2941,-3.2774,4.2926;8.792,-3.4927,5.2277;8.7433,-2.4184,3.3543;7.7878,-1.9456,3.546;9.5568,-0.8985,-0.1299;10.5612,-0.5239,0.0972;9.0243,-0.1426,-0.7048;9.6733,-1.7931,-0.7537)|\",4.046332956935\r\n\"[H]OC(=O)C([H])([H])/N=C(O[H])\\C([H])=C(/[H])C([H])([H])[C@]([H])(O[H])C([H])([H])N([H])[H] 
|(0.9035,-3.5505,-3.7794;1.2241,-3.5184,-4.7151;1.9145,-2.3751,-4.8158;2.4506,-2.0035,-5.8331;1.976,-1.5601,-3.5116;1.5781,-0.5611,-3.7346;3.0413,-1.4325,-3.2701;1.237,-2.2482,-2.4697;1.1059,-1.7559,-1.3005;0.3187,-2.4544,-0.4388;0.4806,-2.1371,0.4637;1.7091,-0.501,-0.8078;2.5847,-0.1492,-1.3477;1.2176,0.2215,0.2105;0.3155,-0.0996,0.7305;1.8122,1.4906,0.7378;1.0978,2.3112,0.5704;2.7356,1.7397,0.1989;2.0889,1.3995,2.2606;2.8268,0.6056,2.4306;0.9142,1.0379,2.9643;0.4826,1.8988,3.1495;2.6347,2.7202,2.8375;3.4993,3.0718,2.2538;2.9815,2.5203,3.8562;1.5323,3.6914,2.9243;1.4427,4.2284,2.0652;1.6926,4.3595,3.6728)|\",5.1184615279050005\r\n\"[H]OC(=O)C([H])([H])C([H])([H])C([H])([H])[C@]([H])(O[H])C([H])([H])N([H])[H] |(-0.584,-0.9656,-2.2714;0.1736,-1.4376,-2.6983;1.0409,-1.849,-1.7503;2.0669,-2.4198,-2.0424;0.6555,-1.501,-0.3137;-0.4212,-1.6423,-0.165;1.1831,-2.2044,0.3359;1.066,-0.0633,0.0785;0.8262,0.0752,1.1416;2.1567,0.0172,-0.0059;0.4273,1.0892,-0.7203;0.7408,1.044,-1.7713;0.8265,2.0287,-0.3131;-1.1105,1.146,-0.6585;-1.4362,0.9083,0.3625;-1.7186,0.1742,-1.5278;-1.9292,0.7057,-2.3335;-1.6897,2.5235,-1.0476;-1.1552,3.3259,-0.518;-2.7389,2.5598,-0.7363;-1.6683,2.6405,-2.5133;-0.7463,2.9082,-2.8512;-2.3254,3.3435,-2.8391)|\",7.455919503700001\r\n\"[H]C1=C([H])C([H])=C(C(NN)C(=O)OC([H])([H])C([H])(C([H])([H])[H])C([H])([H])[H])C([H])=C1[H] 
|(3.4467,6.3707,7.1208;3.2834,5.5143,6.4726;4.3639,4.7812,5.982;5.3791,5.0641,6.2479;4.1667,3.6796,5.1503;5.0121,3.1185,4.776;2.8638,3.2903,4.7902;2.6306,2.1264,3.9073;1.3839,1.8006,3.6325;0.2984,1.5383,3.4135;3.6493,1.2653,3.2856;4.8515,1.3977,3.4189;3.0724,0.2948,2.5306;3.9681,-0.6293,1.8677;3.4141,-0.9532,0.981;4.8644,-0.0862,1.5587;4.3297,-1.8233,2.7549;4.8283,-1.4218,3.6471;3.0836,-2.6035,3.194;3.3549,-3.4238,3.8682;2.5689,-3.0399,2.3276;2.371,-1.9574,3.7163;5.3275,-2.723,2.0108;5.6226,-3.5728,2.636;6.2377,-2.1766,1.7374;4.8869,-3.1277,1.0902;1.7785,4.0357,5.2884;0.7593,3.7634,5.0284;1.9881,5.1327,6.1191;1.1321,5.6908,6.4889)|\",3.888506923645\r\n\"[H]NC(O[H])C([H])([H])/N=C(/O[H])C([H])([H])C([H])([H])C([H])([H])[C@]([H])(O[H])C([H])([H])N([H])[H] |(1.3498,-0.6392,-6.2354;1.7122,0.3174,-6.2022;2.1042,0.5791,-5.0195;2.6278,1.7787,-4.7257;2.8025,1.7493,-3.7532;2.0654,-0.391,-3.821;2.7718,-1.2071,-4.0372;1.061,-0.8362,-3.7788;2.4389,0.3234,-2.6168;2.2738,-0.1592,-1.4487;2.6219,0.616,-0.4062;2.53,0.1149,0.439;1.7082,-1.5219,-1.0869;1.6085,-2.1274,-1.9901;2.449,-2.0259,-0.4552;0.3435,-1.4462,-0.3342;0.045,-0.3987,-0.2029;-0.4276,-1.8988,-0.9666;0.326,-2.1451,1.0386;0.8571,-3.1053,0.9642;-0.7116,-2.3833,1.3105;0.9141,-1.3022,2.1818;0.2922,-0.4073,2.309;2.2349,-0.8342,1.8891;2.8201,-1.5405,2.2476;0.9871,-2.0622,3.5215;0.033,-2.5684,3.729;1.1591,-1.3362,4.3222;2.1495,-2.966,3.4806;1.9049,-3.867,3.0771;2.5071,-3.146,4.4144)|\",6.424608010305\r\n\"[H]O/C(=N/C([H])([H])C(=O)OC([H])([H])C([H])([H])[H])C([H])([H])C([H])([H])C([H])([H])[C@]([H])(O[H])C([H])([H])N([H])[H] 
|(5.009,-1.9406,0.633;4.556,-1.3823,-0.0169;5.4577,-0.9662,-0.958;4.9745,-0.2837,-1.9117;5.8213,0.2073,-2.9713;5.3145,1.03,-3.4832;6.7912,0.6082,-2.6276;6.152,-0.8703,-4.0049;6.3235,-2.0434,-3.752;6.2801,-0.3265,-5.2329;6.6483,-1.2254,-6.3091;7.151,-0.5826,-7.036;7.3556,-1.9641,-5.9243;5.4213,-1.8949,-6.9088;5.7155,-2.5098,-7.7673;4.6989,-1.1471,-7.2516;4.9374,-2.5421,-6.172;6.895,-1.3628,-0.6661;6.9331,-2.4572,-0.5658;7.5268,-1.1281,-1.525;7.4577,-0.6931,0.6042;7.4261,0.3965,0.4926;6.8211,-0.9224,1.4705;8.8959,-1.1272,0.9126;9.536,-0.9345,0.0366;8.9315,-2.2134,1.0791;9.5002,-0.4158,2.1297;8.8536,-0.5803,3.0026;9.501,1.0088,1.9643;10.0409,1.2054,1.1799;10.9183,-0.9102,2.4675;11.5618,-0.7442,1.589;10.9028,-1.9944,2.6361;11.5461,-0.2648,3.619;11.4057,0.7418,3.5286;11.0493,-0.5323,4.4686)|\",6.451819395355\r\n\"[H]OC(=O)C([H])([H])N(C(=O)C([H])([H])C([H])([H])C([H])([H])[C@]([H])(O[H])C([H])([H])N([H])[H])C([H])([H])[H] |(3.1406,2.032,0.0608;3.9303,2.6212,0.1829;4.9787,1.8732,0.5554;6.0771,2.3396,0.7493;4.6736,0.3723,0.7225;3.8537,0.2484,1.438;5.5702,-0.1085,1.116;4.2983,-0.2906,-0.5328;3.014,-0.1588,-0.9718;2.228,0.599,-0.3766;2.5646,-0.9628,-2.1834;3.0805,-0.5684,-3.0721;2.8761,-2.0076,-2.0971;1.048,-0.8801,-2.3991;0.5345,-1.2503,-1.5027;0.7627,0.1733,-2.4882;0.5476,-1.6484,-3.6324;1.1204,-1.3377,-4.5194;-0.4963,-1.3555,-3.8159;0.5868,-3.1826,-3.5145;0.0528,-3.4725,-2.5993;1.9113,-3.6935,-3.3934;2.1858,-3.8294,-4.3251;-0.0861,-3.8816,-4.7144;-1.0878,-3.4628,-4.8966;-0.2085,-4.9413,-4.4678;0.8169,-3.7995,-5.876;0.6909,-2.9195,-6.3711;0.6228,-4.5422,-6.5421;5.3661,-0.9932,-1.2377;6.2093,-0.3145,-1.4094;5.0157,-1.3592,-2.201;5.7203,-1.8469,-0.647)|\",6.7130486918350005\r\n\"[H]OC(=O)[C@@]([H])(/N=C(/O[H])C([H])([H])C([H])([H])C([H])([H])[C@]([H])(O[H])C([H])([H])N([H])[H])C([H])([H])[H] 
|(1.629,-3.0463,0.2756;1.6539,-3.9388,-0.151;2.9507,-4.1807,-0.3855;3.3541,-5.1952,-0.9055;3.9102,-3.0639,0.0866;4.5301,-2.8179,-0.7869;3.1096,-1.941,0.5484;3.5231,-0.7406,0.5534;2.641,0.1867,1.0191;3.0389,1.0687,0.9612;4.8732,-0.1987,0.1222;5.3916,0.1779,1.0168;5.485,-1.018,-0.2638;4.7896,0.9161,-0.9411;4.2316,0.558,-1.812;4.2342,1.7842,-0.5543;6.1763,1.3861,-1.3979;6.7359,0.5306,-1.7996;6.7454,1.7731,-0.5402;6.1054,2.4646,-2.478;5.5822,3.348,-2.0587;5.3824,1.9517,-3.5818;5.663,2.5152,-4.3317;7.4951,2.9183,-2.9522;8.0551,2.0339,-3.2783;8.0522,3.3823,-2.1222;7.3202,3.7889,-4.1248;7.0697,4.7332,-3.8366;8.1832,3.8622,-4.6574;4.8137,-3.6267,1.198;5.2845,-4.5516,0.8545;4.2213,-3.8412,2.0935;5.5939,-2.9068,1.4679)|\",6.522568996485001\r\n\"[H]OC(=O)C([H])([H])C([H])([H])/N=C(O[H])\\C([H])=C(/[H])C([H])([H])[C@]([H])(O[H])C([H])([H])N([H])[H] |(-0.5756,-2.1497,3.6741;-0.7203,-2.8867,4.3266;0.4604,-3.2444,4.8572;0.5547,-4.1552,5.6487;1.672,-2.4017,4.435;1.667,-1.4858,5.0418;2.5653,-2.9693,4.7071;1.7229,-2.0087,2.9481;1.7348,-2.9189,2.3316;2.6681,-1.4828,2.753;0.5402,-1.2323,2.6078;0.5269,-0.3235,1.7127;-0.6873,0.2334,1.4525;-0.5559,1.0604,0.9622;1.6724,0.1702,0.9203;2.6595,0.0282,1.3516;1.5527,0.7342,-0.2913;0.5727,0.8415,-0.7549;2.6847,1.2651,-1.1158;2.7523,0.6763,-2.0435;3.6386,1.157,-0.5834;2.4556,2.7475,-1.5066;2.4178,3.346,-0.5878;1.2098,2.9079,-2.1614;1.4231,2.7465,-3.1049;3.5744,3.2963,-2.4134;4.5634,3.1116,-1.9665;3.441,4.3803,-2.4866;3.4099,2.7341,-3.7634;3.9092,1.8533,-3.861;3.784,3.3627,-4.4686)|\",4.95791435611\r\n\"[H]OC(=O)C([H])([H])C([H])([H])C([H])([H])/N=C(/O[H])C([H])([H])C([H])([H])C([H])([H])[C@]([H])(O[H])C([H])([H])N([H])[H] 
|(6.4443,-0.1611,-1.3067;7.2744,0.0233,-1.8406;7.3303,1.3003,-2.2316;8.2379,1.7128,-2.924;6.1877,2.2233,-1.7813;6.5112,3.232,-2.0478;5.3061,2.0001,-2.4005;5.8,2.1614,-0.2844;5.4956,3.1631,0.0432;6.6756,1.8916,0.3191;4.6492,1.2007,0.0431;3.7422,1.5481,-0.4775;4.4373,1.2589,1.1219;4.9859,-0.1552,-0.3699;4.2594,-1.1615,-0.0674;4.6629,-2.3651,-0.5062;4.0034,-3.0545,-0.2402;2.9545,-1.1489,0.709;2.9293,-2.0236,1.3684;2.9202,-0.2643,1.3512;1.7046,-1.1356,-0.2045;0.8207,-1.0888,0.446;1.7054,-0.2062,-0.7875;1.5344,-2.3175,-1.1786;2.3596,-2.3374,-1.902;0.6162,-2.1333,-1.7532;1.4153,-3.6974,-0.5068;0.7735,-3.6051,0.3792;2.6905,-4.1855,-0.0547;2.9459,-4.8196,-0.7708;0.817,-4.7842,-1.4301;-0.0641,-4.3929,-1.959;0.4866,-5.6232,-0.8092;1.8813,-5.2887,-2.31;2.0175,-4.6869,-3.1192;1.668,-6.2193,-2.6568)|\",6.13344619027\r\n\"[H]OC(=O)C([H])([H])C([H])([H])/N=C(/O[H])C([H])([H])C([H])([H])C([H])([H])[C@]([H])(O[H])C([H])([H])N([H])[H] |(3.1849,3.0055,3.3105;3.7607,3.8057,3.4441;3.8421,4.4869,2.2892;4.5266,5.4789,2.1805;2.9797,3.9587,1.1349;1.9527,4.3134,1.2989;3.3473,4.429,0.2196;2.9496,2.4286,0.9722;3.9712,2.0619,0.7966;2.3603,2.183,0.0785;2.4314,1.8176,2.1909;1.7767,0.7291,2.1958;1.3519,0.299,3.4161;0.8778,-0.5401,3.3133;1.3786,-0.1321,1.0117;2.0993,0.0158,0.2028;1.4564,-1.1899,1.3;-0.0483,0.158,0.4912;-0.0924,1.204,0.1631;-0.2252,-0.4625,-0.3932;-1.1633,-0.1267,1.5086;-1.1322,0.5895,2.3396;-1.0079,-1.1273,1.9396;-2.5754,-0.1219,0.9131;-3.2749,-0.3793,1.7316;-2.6456,-1.1042,-0.1061;-3.4234,-0.8356,-0.6374;-3.0176,1.2354,0.3403;-2.2999,1.559,-0.4209;-3.0333,1.9998,1.134;-4.3095,1.0391,-0.3363;-5.0666,1.0055,0.3448;-4.5138,1.8121,-0.9645)|\",6.3130413316\r\n\"[H]OC(=O)C([H])([H])N(/N=C(/O[H])C([H])([H])C([H])([H])C([H])([H])[C@]([H])(O[H])C([H])([H])N([H])[H])C([H])([H])[H] 
|(3.6118,0.497,3.5238;3.5028,0.4589,4.5156;4.5889,-0.1255,5.0436;4.7355,-0.262,6.2374;5.6763,-0.5827,4.0423;6.2596,-1.3621,4.5383;6.3451,0.2682,3.8649;5.2413,-1.0917,2.7311;4.4741,-0.0432,2.092;4.7934,0.2139,0.8728;4.0904,1.1818,0.2717;4.4411,1.3217,-0.6459;5.8666,-0.4669,0.053;6.434,0.3107,-0.4725;6.5389,-0.975,0.7469;5.3314,-1.5039,-0.9635;6.2038,-1.939,-1.4697;4.8598,-2.325,-0.4104;4.3404,-1.0137,-2.0369;3.409,-0.6675,-1.5701;4.0777,-1.8832,-2.6553;4.8809,0.0924,-2.9607;5.9277,-0.1278,-3.2066;4.8624,1.3798,-2.3173;4.0361,1.7886,-2.6766;4.0896,0.2375,-4.2802;3.9185,-0.7497,-4.7334;4.6899,0.8257,-4.9819;2.8642,1.0032,-4.0085;2.1266,0.41,-3.6347;2.504,1.4409,-4.8514;4.3857,-2.2778,2.8727;4.945,-3.0446,3.4181;4.1425,-2.6672,1.88;3.4451,-2.0714,3.4066)|\",5.820515262194999\r\n\"[H]OC(=O)C([H])([H])[C@@]1([H])N=C(O[H])[C@](C([H])([H])[H])(C([H])([H])C([H])=C([H])[H])C1=O |(7.5347,-1.2831,-1.0653;8.3762,-1.118,-1.5586;8.7128,0.1822,-1.4881;9.6946,0.6149,-2.0437;7.8079,1.0759,-0.6265;8.1289,0.9655,0.4185;7.9879,2.1125,-0.9183;6.3054,0.7561,-0.7266;5.9921,0.8723,-1.7749;6.0154,-0.6128,-0.2813;5.1138,-0.5948,0.6239;4.7138,-1.7333,1.2148;3.9802,-1.5333,1.8218;4.5416,0.7556,1.0197;4.7183,1.0729,2.5153;4.5337,2.14,2.6752;5.7338,0.8472,2.8565;4.0139,0.5048,3.1323;3.05,0.9413,0.5721;2.7336,1.9267,0.9315;3.0099,0.9722,-0.5239;2.1177,-0.1369,1.0673;2.0914,-1.0548,0.4788;1.334,-0.0407,2.1456;0.6774,-0.8524,2.4469;1.3006,0.8618,2.7525;5.4355,1.6672,0.1537;5.4406,2.8752,0.1787)|\",5.26268186867\r\n\"[H]OC(=O)C([H])([H])/N=C(/O[H])C([H])([H])C([H])([H])C([H])([H])[C@]([H])(O[H])C([H])([H])N([H])[H] 
|(3.3042,3.4529,3.1136;3.4213,4.4202,2.9439;3.4184,4.5539,1.6114;3.5306,5.6146,1.0439;3.2625,3.2284,0.8446;4.1571,3.1186,0.2157;2.4072,3.351,0.1667;3.1029,2.1329,1.7836;2.8769,0.944,1.3996;2.7159,0.0159,2.382;2.5945,-0.86,1.9843;2.7151,0.435,-0.0175;3.1493,1.1538,-0.7183;3.2893,-0.4957,-0.1311;1.2356,0.1764,-0.3831;0.7995,-0.5308,0.3367;0.6754,1.1124,-0.2618;1.077,-0.3747,-1.8056;1.6877,-1.2791,-1.9205;1.4689,0.3479,-2.5362;-0.3637,-0.7634,-2.1944;-0.761,-1.4367,-1.4222;-0.3597,-1.4895,-3.4114;-0.4129,-0.7929,-4.0974;-1.3259,0.4325,-2.3393;-1.3339,1.0533,-1.43;-2.3354,0.0291,-2.4676;-0.9855,1.171,-3.567;-0.264,1.866,-3.3903;-1.7937,1.6746,-3.9221)|\",6.449098256850001\r\n\"[H]OC([H])([H])/C([H])=C1/C(=O)O[C@]([H])(C([H])=C(C([H])([H])[H])C([H])([H])[H])[C@]1([H])O[H] |(5.574,4.4193,-1.2754;6.5003,4.7302,-1.2863;7.2679,3.7917,-1.9977;7.0282,3.7825,-3.0799;8.3104,4.1325,-1.9369;7.2199,2.3721,-1.5067;7.9968,1.7146,-1.9008;6.3489,1.8053,-0.6609;6.5092,0.3859,-0.2445;7.4008,-0.3859,-0.4965;5.4457,0.0326,0.5485;4.4194,1.0496,0.5349;4.0681,1.1532,1.5637;3.3126,0.6002,-0.3874;3.6356,-0.1,-1.1568;2.0172,0.956,-0.3346;1.0244,0.399,-1.3263;0.5136,1.2046,-1.8706;0.2411,-0.1708,-0.8089;1.5001,-0.262,-2.0568;1.4353,1.904,0.6851;0.5662,1.4478,1.1762;1.0765,2.8196,0.1969;2.1469,2.2057,1.4573;5.1411,2.3606,0.0583;5.4463,2.954,0.9256;4.2881,3.2195,-0.6918;3.7618,2.6531,-1.2859)|\",5.76337135359\r\n\"[H]C([H])([H])OC(=O)[C@]1([H])N2C(=O)C([H])([H])C([H])([H])C([H])([H])[C@]2([H])OC1([H])[H] 
|(5.5541,-2.0245,0.8379;4.8386,-1.2741,0.501;4.3469,-0.7995,1.3537;5.3363,-0.5079,-0.0991;3.8753,-1.9892,-0.2917;2.895,-1.2313,-0.8144;2.8344,-0.0261,-0.7215;1.9096,-2.0463,-1.6542;0.9885,-1.4591,-1.6956;1.6864,-3.4031,-1.174;1.4056,-3.653,0.1436;1.1261,-2.7459,0.9186;1.5197,-5.1049,0.6064;2.3417,-5.1015,1.3344;0.6148,-5.3421,1.1754;1.7907,-6.1429,-0.4945;0.8518,-6.4131,-0.9968;2.179,-7.0654,-0.0498;2.7759,-5.5916,-1.5356;3.0093,-6.3362,-2.3038;3.7153,-5.2806,-1.0634;2.1388,-4.3778,-2.18;1.2727,-4.6868,-2.794;3.0342,-3.6415,-2.9845;2.4756,-2.335,-3.0723;3.2626,-1.6436,-3.3786;1.6629,-2.3057,-3.8136)|\",6.906249525690001\r\n\"[H]C1=C(C([H])([H])[H])C([H])([H])[C@]([H])(C([H])([H])[H])/C(=C(/[H])N2C([H])([H])C([H])([H])C([H])([H])C2([H])[H])C1([H])[H] |(2.8347,-2.2308,-0.3416;3.2031,-1.2067,-0.2726;2.3878,-0.2443,0.1757;0.9776,-0.5264,0.6256;0.7335,-1.5917,0.5576;0.2473,0.031,0.0218;0.8234,-0.2097,1.6672;2.8527,1.1938,0.2805;3.086,1.4173,1.3329;2.0246,1.8673,0.013;4.0853,1.5249,-0.5983;4.5062,2.4632,-0.2228;3.6738,1.7355,-2.0703;2.9597,2.5635,-2.1613;3.1953,0.8388,-2.4797;4.5442,1.9666,-2.6946;5.1179,0.429,-0.4328;6.3847,0.6248,-0.0213;6.9995,-0.2567,0.191;7.0475,1.8436,0.1694;8.2752,1.8393,0.9782;8.0662,2.1894,2.0002;8.6794,0.8205,1.0675;9.2439,2.7876,0.253;9.9298,3.2973,0.9368;9.8486,2.2269,-0.4705;8.2986,3.7439,-0.4901;8.7711,4.2518,-1.3366;7.9192,4.512,0.195;7.1552,2.8201,-0.9279;7.3931,2.3158,-1.88;6.2198,3.3694,-1.0768;4.6276,-0.9759,-0.7142;4.6976,-1.2056,-1.7931;5.292,-1.7071,-0.23)|\",5.8477266472450005\r\n\"[H]/C(C(=O)OC([H])([H])[H])=C([H])\\C(C(=O)C([H])([H])[H])=C(\\[H])C([H])([H])C([H])([H])C([H])([H])[H] 
|(3.5691,4.3671,-0.3685;4.2536,4.7074,0.4009;4.1972,6.1399,0.7663;4.8998,6.6952,1.5898;3.2405,6.7787,0.0493;3.1042,8.1804,0.3253;2.3028,8.5312,-0.3257;4.0365,8.7083,0.1065;2.846,8.3428,1.3755;5.1503,3.8956,0.9901;5.8216,4.3706,1.7042;5.3307,2.4603,0.7444;6.7398,1.9027,0.7174;6.9499,0.7272,0.465;7.8943,2.8475,1.0019;7.8352,3.244,2.0234;7.8776,3.7087,0.3234;8.8323,2.3023,0.8844;4.3297,1.5644,0.5731;4.6606,0.538,0.4097;2.8479,1.7752,0.6195;2.6047,2.743,1.074;2.4719,1.8211,-0.4155;2.1071,0.6399,1.355;1.0277,0.7951,1.2327;2.3384,-0.3169,0.8673;2.4479,0.5567,2.8464;1.893,-0.2528,3.3333;2.1961,1.4921,3.3611;3.5168,0.3706,3.0032)|\",4.74022327571\r\n\"[H]O[C@@]([H])([C@@]([H])(C([H])([H])[H])[C@]([H])(O[H])C([H])([H])[H])[C@@]1(C([H])([H])[H])OC(=O)[C@]([H])(C([H])([H])[H])[C@@]1([H])O[H] |(1.9739,-0.8289,2.0384;2.5542,-0.0677,1.8177;1.9544,0.6226,0.7147;2.7907,1.0574,0.1536;1.2379,-0.37,-0.2426;0.7061,0.2203,-0.9997;2.3049,-1.2115,-0.9683;1.8656,-1.9301,-1.6678;2.93,-1.7576,-0.2532;2.9736,-0.5644,-1.5459;0.163,-1.2371,0.4636;-0.5049,-0.5719,1.0116;0.749,-2.06,1.5042;1.2491,-2.7733,1.0737;-0.6709,-2.1011,-0.4809;-1.165,-1.4873,-1.2427;-1.4433,-2.6269,0.0885;-0.0631,-2.852,-1.0008;1.1531,1.8537,1.2173;0.7326,2.7723,0.0718;1.6177,3.1402,-0.4591;0.1864,3.6352,0.4655;0.0799,2.2655,-0.6453;-0.0625,1.4357,1.9041;0.0002,1.6197,3.2577;-0.9344,1.3656,3.9727;1.3692,2.1521,3.6476;1.9459,1.2793,3.9816;1.3127,3.1855,4.7715;2.324,3.5031,5.0449;0.8219,2.7654,5.6541;0.7469,4.0723,4.4626;1.9702,2.6314,2.3166;1.7705,3.7027,2.1952;3.3708,2.5056,2.2078;3.5627,1.5579,2.34)|\",7.170199960674999\r\n\"[H]O[C@@](C([H])([H])[H])(C([H])([H])C([H])([H])[H])[C@]([H])(O[H])[C@@]1(C([H])([H])[H])OC(=O)[C@]([H])(C([H])([H])[H])[C@@]1([H])O[H] 
|(1.5803,-0.0723,-3.8664;1.9895,0.787,-3.6762;3.0261,0.5831,-2.6717;2.3642,0.0497,-1.3952;3.0768,-0.0169,-0.5685;1.9585,-0.9545,-1.5769;1.5538,0.7075,-1.0769;4.0305,-0.4428,-3.2555;3.4412,-1.3203,-3.5644;4.4653,-0.0236,-4.1657;5.1463,-0.9232,-2.3183;5.7901,-1.6353,-2.8458;4.7549,-1.4291,-1.4296;5.7864,-0.1006,-1.9806;3.688,1.9785,-2.3933;4.6926,1.7486,-2.0248;3.0757,2.7022,-1.336;2.2464,3.0663,-1.7111;3.8685,2.9459,-3.607;4.9492,3.9757,-3.2671;4.6905,4.4913,-2.3383;5.0476,4.7135,-4.071;5.918,3.4819,-3.1393;4.3465,2.1958,-4.7534;3.5722,2.3577,-5.8599;3.8212,1.7997,-6.8982;2.4289,3.3235,-5.5775;1.4944,2.7759,-5.7435;2.4699,4.5193,-6.5392;1.6061,5.1726,-6.3769;2.4553,4.1675,-7.5748;3.3782,5.1171,-6.399;2.545,3.6685,-4.0806;2.6666,4.7452,-3.9365;1.3737,3.3432,-3.3358;1.2224,2.3832,-3.4816)|\",7.317141439945\r\n\"[H]O[C@@]([H])([C@]([H])(C([H])([H])[H])C([H])([H])C([H])([H])[H])[C@@]1(C([H])([H])[H])OC(=O)[C@]([H])(C([H])([H])[H])[C@@]1([H])O[H] |(3.731,0.1254,-0.3064;3.6146,-0.8327,-0.4326;4.8615,-1.426,-0.0889;4.6707,-2.5028,-0.1502;6.0397,-1.0949,-1.0653;6.7268,-0.4029,-0.5632;5.5591,-0.3818,-2.3389;6.4019,-0.2431,-3.027;5.1505,0.6074,-2.1117;4.7806,-0.9475,-2.8581;6.8705,-2.362,-1.3938;7.8071,-2.0398,-1.8697;7.1581,-2.8465,-0.4534;6.1862,-3.3886,-2.3064;6.8244,-4.2704,-2.4327;5.9877,-2.979,-3.3027;5.2294,-3.7348,-1.8974;5.1523,-1.1485,1.4193;4.047,-1.7537,2.2844;3.0859,-1.302,2.0258;4.2527,-1.5894,3.3479;3.9791,-2.8323,2.1123;6.3885,-1.8335,1.7803;7.3611,-0.9965,2.2233;8.416,-1.3945,2.6459;6.9108,0.4549,2.0816;7.3995,0.8389,1.1755;7.3256,1.3124,3.2792;7.0924,2.3698,3.1092;8.4017,1.2191,3.4492;6.8121,0.9918,4.1928;5.4066,0.3525,1.7927;4.8368,0.5834,2.7032;4.9639,1.2141,0.7377;4.9525,2.1239,1.0716)|\",7.077681251504999\r\n\"[H]OC1=C2C(=C([H])C([H])=C1[H])O[C@]([H])(C([H])([H])[H])C([H])([H])[C@@]2([H])OC([H])([H])[H] 
|(8.277,0.7855,-1.642;7.3781,0.9961,-1.3461;6.6916,-0.1802,-1.1686;5.3409,-0.078,-0.7909;4.609,-1.2722,-0.6611;5.2113,-2.5231,-0.8631;4.6074,-3.4169,-0.7495;6.5543,-2.584,-1.2092;7.0271,-3.5501,-1.3639;7.3054,-1.4176,-1.3707;8.3549,-1.4667,-1.6539;3.2777,-1.2996,-0.3793;2.6016,-0.0626,-0.0548;2.7976,0.148,1.004;1.1195,-0.323,-0.2731;0.532,0.5466,0.0409;0.7938,-1.1899,0.3104;0.9124,-0.5223,-1.3303;3.1599,1.0904,-0.884;2.6331,2.0154,-0.628;3.0087,0.8914,-1.9525;4.6504,1.2567,-0.5869;5.1006,2.0069,-1.2516;4.7024,1.7377,0.7641;5.987,2.0815,1.2469;5.8366,2.5061,2.2437;6.477,2.8277,0.6051;6.6475,1.2076,1.3271)|\",5.717111999005\r\n\"[H]O/C1=C2\\C(=O)C([H])([H])[C@@]([H])(O[H])[C@@]([H])(O[H])\\C2=C(/[H])C2=C1C([H])([H])OC2([H])[H] |(-1.2779,3.1016,-0.0668;-0.3564,2.7986,-0.0865;-0.334,1.4437,-0.0274;0.9122,0.76,-0.0056;2.2079,1.512,-0.0617;2.274,2.7068,-0.2951;3.4733,0.7026,0.2011;3.6027,0.6044,1.2903;4.33,1.2679,-0.1726;3.4134,-0.6968,-0.4194;3.3146,-0.6045,-1.5081;4.5915,-1.443,-0.1831;4.4807,-1.81,0.7134;2.1966,-1.4539,0.1249;2.0836,-2.3851,-0.4388;2.4777,-1.8961,1.472;2.2319,-1.1798,2.0793;0.9065,-0.6585,0.0631;-0.2912,-1.3784,0.1128;-0.2646,-2.4631,0.1668;-1.4939,-0.6828,0.0964;-1.517,0.7037,0.028;-2.9506,1.1746,0.0209;-3.2134,1.7236,-0.8985;-3.1904,1.8284,0.8757;-3.7374,-0.0136,0.1048;-2.9131,-1.1834,0.137;-3.1287,-1.7567,1.0505;-3.1525,-1.8264,-0.7229)|\",4.797367184315\r\n\"[H]O[C@@]([H])(C([H])([H])[H])C([H])([H])[C@]([H])(O[H])[C@]([H])(O[H])C([H])([H])OC([H])([H])C1=C([H])C([H])=C([H])C([H])=C1[H] 
|(2.5883,-2.4104,5.5212;2.3964,-1.5471,5.1229;3.6412,-0.974,4.6984;4.2561,-0.7572,5.5881;4.4196,-1.9458,3.8079;5.3573,-1.4975,3.4599;3.8321,-2.2446,2.9339;4.6776,-2.8555,4.3656;3.3321,0.3763,4.0398;4.2857,0.8865,3.8569;2.7816,0.9919,4.7617;2.5788,0.3655,2.6992;3.1208,-0.2571,1.9791;2.5987,1.6766,2.114;2.0234,2.2269,2.6799;1.1182,-0.1433,2.7175;1.1118,-1.2154,2.9382;0.5741,0.0223,1.4148;0.9524,0.8638,1.099;0.2335,0.5208,3.7773;0.495,0.1543,4.7798;-0.8125,0.2551,3.566;0.3931,1.9431,3.7212;-0.496,2.6446,4.5854;-1.5365,2.463,4.2757;-0.3863,2.2551,5.6113;-0.1805,4.1192,4.5421;-1.117,5.0417,4.066;-2.0835,4.6898,3.7129;-0.8211,6.4064,4.0365;-1.5601,7.1118,3.666;0.4222,6.8605,4.4762;0.6555,7.9216,4.4522;1.3677,5.9456,4.9477;2.3382,6.2938,5.2913;1.0656,4.5855,4.9836;1.802,3.8751,5.3514)|\",6.375627517214999\r\n\"[H]O[C@@]([H])([C@@]1([H])O[C@]1([H])C([H])([H])[H])[C@]([H])(O[H])C([H])([H])OC([H])([H])C1=C([H])C([H])=C([H])C([H])=C1[H] |(3.1756,-0.001,-1.201;3.5991,0.5832,-1.8591;4.7496,1.1127,-1.2049;4.4909,2.0273,-0.6533;5.2923,0.0982,-0.2054;5.9675,0.5093,0.5452;4.2866,-0.7922,0.329;5.3517,-1.3402,-0.4893;5.0119,-1.6642,-1.472;6.3175,-2.263,0.2063;7.2187,-2.3982,-0.4036;5.8662,-3.2491,0.3648;6.6094,-1.8595,1.181;5.7349,1.4843,-2.3329;5.3114,2.3597,-2.851;5.8841,0.4001,-3.2354;4.9739,0.0863,-3.3844;7.1321,1.8392,-1.835;7.7427,2.189,-2.6824;7.6184,0.9366,-1.4306;7.0123,2.8431,-0.8432;8.2769,3.2973,-0.3664;8.8421,2.4514,0.0577;8.8654,3.6954,-1.2091;8.059,4.3639,0.6762;7.8425,5.6934,0.2936;7.8556,5.9535,-0.7622;7.6105,6.6796,1.2519;7.4467,7.7084,0.9422;7.5918,6.3453,2.608;7.4148,7.1134,3.3562;7.8047,5.0226,2.9997;7.7928,4.7572,4.0535;8.0358,4.0392,2.0369;8.2026,3.009,2.3433)|\",6.530732412\r\n\"[H]C1=C([H])C([H])=C(/C([H])=C(\\[H])C(=O)N(C([H])([H])[H])C([H])([H])[H])C(N=C=S)=C1[H] 
|(-0.0126,-6.1183,-4.7142;0.3116,-5.4021,-3.9648;0.9743,-4.23,-4.3455;1.1685,-4.0292,-5.3951;1.383,-3.3189,-3.379;1.8702,-2.3992,-3.6876;1.1577,-3.5402,-2.0083;1.5877,-2.5983,-0.9736;1.1276,-2.6977,0.0062;2.5077,-1.6273,-1.1161;3.0216,-1.4909,-2.0605;2.8001,-0.7229,0.035;2.11,-0.761,1.0548;3.8628,0.152,-0.0809;4.0796,1.1204,0.9848;5.1116,1.0558,1.3526;3.3875,0.9003,1.7956;3.9061,2.1438,0.6231;4.7472,0.2875,-1.2264;5.77,0.4577,-0.8696;4.4683,1.1391,-1.865;4.7638,-0.617,-1.8327;0.4754,-4.7281,-1.6468;0.2253,-4.9858,-0.3101;-0.2239,-5.6934,0.5465;-0.7814,-6.5528,1.7573;0.0593,-5.6498,-2.6211;-0.4614,-6.5493,-2.3074)|\",4.31028339192\r\n\"[H]O[C@@]1([H])[C@@]([H])(O[H])C([H])=C(C(=O)C2=C([H])C([H])=C([H])C([H])=C2[H])C([H])([H])[C@]1([H])O[H] |(5.1094,5.0513,1.3712;5.2606,4.2325,0.8767;4.7533,4.4073,-0.4573;4.9846,5.4161,-0.822;5.4998,3.3846,-1.3272;5.387,3.6842,-2.3773;6.8944,3.4225,-1.0595;6.9552,3.508,-0.0906;4.9482,1.9847,-1.164;5.5584,1.2005,-1.605;3.7771,1.6939,-0.5681;3.2111,0.303,-0.5686;1.9944,0.1575,-0.644;4.1065,-0.8974,-0.4947;3.6124,-2.1128,-0.9969;2.6245,-2.1229,-1.4457;4.3766,-3.2727,-0.9168;3.9912,-4.2049,-1.3208;5.6367,-3.2393,-0.311;6.2315,-4.1463,-0.2429;6.1263,-2.0423,0.213;7.0979,-2.0159,0.6987;5.3682,-0.8746,0.1195;5.7474,0.0493,0.5428;2.8918,2.7719,0.0259;3.0056,2.7611,1.1179;1.8431,2.5406,-0.1848;3.2345,4.1669,-0.5031;2.7336,4.9248,0.1122;2.7571,4.3813,-1.8321;2.8295,3.5388,-2.3089)|\",4.982404602655\r\n\"[H]N1C([H])([H])C2=C(SC([H])([H])[C@@]([H])(C([H])([H])[H])/C2=N/OC([H])([H])[H])C([H])([H])C1([H])[H] 
|(4.2878,-2.9916,-4.5427;4.1635,-2.1264,-4.0213;3.8019,-2.4578,-2.6434;2.8404,-2.9801,-2.6532;4.5233,-3.1462,-2.1695;3.6815,-1.2017,-1.7829;4.2836,-0.0546,-2.1927;4.2391,1.5114,-1.3656;3.276,1.1166,0.1462;3.9716,0.7792,0.9208;2.8323,2.0633,0.471;2.1942,0.0571,-0.0955;1.7461,-0.1645,0.8797;1.0797,0.5281,-1.0443;0.5993,1.4315,-0.6501;0.3151,-0.2497,-1.1402;1.4641,0.7552,-2.0428;2.8475,-1.2365,-0.566;2.5462,-2.2428,0.1859;3.1614,-3.4458,-0.1918;2.7241,-4.4702,0.6923;3.2499,-5.3748,0.3762;1.6409,-4.6244,0.6177;2.9836,-4.2319,1.7305;5.0779,0.0407,-3.4805;4.4916,0.6177,-4.2084;6.007,0.5984,-3.3063;5.3902,-1.3394,-4.0584;6.2164,-1.8008,-3.4825;5.7276,-1.2407,-5.0959)|\",4.710290752155\r\n\"[H]N1C([H])([H])C2=C(SC([H])([H])C([H])([H])/C2=N/OC([H])([H])[H])C([H])([H])C1([H])[H] |(6.059,1.1157,2.7041;5.6584,0.1779,2.7341;4.7037,0.0285,1.6386;4.0955,-0.8622,1.8328;4.0101,0.8736,1.6676;5.3304,-0.0715,0.2376;6.6466,-0.3861,0.1196;7.5727,-0.5189,-1.3904;6.2887,-0.2033,-2.6631;6.8231,0.1807,-3.5371;5.8174,-1.1525,-2.9364;5.2452,0.7967,-2.1693;4.5081,0.9822,-2.9564;5.7369,1.7485,-1.9292;4.5219,0.2631,-0.9498;3.2431,0.1835,-1.113;2.5664,-0.3686,-0.0172;1.1723,-0.3685,-0.3;0.6977,-0.8342,0.5673;0.9539,-0.9511,-1.2026;0.7928,0.6522,-0.4284;7.5397,-0.6509,1.3179;8.1471,-1.548,1.1379;8.2538,0.1817,1.4116;6.735,-0.8003,2.6192;7.4006,-0.7147,3.4859;6.2788,-1.7977,2.661)|\",4.67491595159\r\n\"[H]OC([H])([H])C([H])([H])[C@]([H])([C@@]([H])(O[H])N(OC([H])([H])[H])C([H])([H])[H])[C@]([H])(O[H])C([H])([H])O[H] 
|(5.7833,4.9464,3.3919;6.2511,4.0993,3.4421;5.3033,3.0606,3.2131;4.4713,3.1065,3.931;4.8589,3.1413,2.2064;6.055,1.7372,3.3333;6.5283,1.6944,4.3221;6.8838,1.7666,2.6143;5.2111,0.4735,3.0739;4.6342,0.6338,2.154;4.1748,0.1424,4.1763;3.953,-0.9315,4.1397;4.6485,0.3774,5.4858;4.8787,1.3189,5.5525;2.9169,0.9121,3.9921;2.372,0.4397,2.7301;1.8337,1.536,2.0082;1.0401,2.0453,2.5714;1.404,1.1045,1.0989;2.606,2.2684,1.741;1.9185,0.5492,5.0052;1.7045,-0.5301,5.0046;0.9959,1.0986,4.7994;2.2938,0.8399,5.9873;6.1214,-0.7437,2.7795;6.7743,-0.4819,1.9317;5.3511,-1.9068,2.4397;4.8334,-1.7112,1.6434;7.0263,-1.21,3.9211;6.4156,-1.4728,4.7969;7.7094,-0.4085,4.2166;7.8316,-2.2992,3.4991;7.2123,-2.9317,3.0977)|\",7.877695971975\r\n\"[H]OC([H])([H])C([H])([H])[C@]([H])([C@]([H])(O[H])N1C([H])([H])C([H])([H])OC([H])([H])C1([H])[H])[C@]([H])(O[H])C([H])([H])O[H] |(-1.164,-0.1343,1.3829;-1.0379,0.7528,1.0114;0.3599,0.9437,0.8246;0.4594,1.9668,0.4523;0.9008,0.887,1.7851;0.953,-0.0554,-0.1749;0.4464,0.0852,-1.1348;0.7066,-1.0703,0.1759;2.4826,0.0356,-0.3543;2.9268,-0.001,0.6496;2.8854,1.4071,-0.9746;2.5766,2.201,-0.2725;2.1893,1.6838,-2.158;2.3559,0.9356,-2.7748;4.3591,1.5185,-1.1871;5.1043,1.5899,0.0847;4.7809,2.4637,0.6801;4.9183,0.6851,0.6679;6.6044,1.6883,-0.1878;6.9487,0.756,-0.666;7.1512,1.8176,0.7512;6.9247,2.8059,-1.0005;6.2118,2.7456,-2.2286;6.4842,3.6445,-2.79;6.5284,1.8613,-2.8073;4.7047,2.6962,-2.0051;4.1841,2.6309,-2.9611;4.3782,3.6287,-1.5094;3.0462,-1.2387,-1.061;2.7905,-2.0784,-0.3978;4.4585,-1.2242,-1.1467;4.6839,-0.338,-1.5123;2.4342,-1.6313,-2.4119;1.3442,-1.7452,-2.328;2.8679,-2.5989,-2.6935;2.7663,-0.6546,-3.411;2.3956,-0.9404,-4.2594)|\",7.39605445659\r\n\"[H]OC(=O)[C@]1([H])C([H])([H])C(=O)O[C@]1([H])C(=O)OC([H])(C([H])([H])[H])C([H])([H])[H] 
|(5.1117,1.6273,-2.6185;5.9198,1.9616,-3.0674;7.0297,1.5445,-2.4322;8.1189,1.9772,-2.7238;6.8717,0.4525,-1.3618;7.8964,0.2213,-1.0609;6.185,-0.8475,-1.8106;6.8486,-1.5389,-2.3337;5.3128,-0.6709,-2.4488;5.7097,-1.4644,-0.5022;5.3569,-2.5869,-0.2776;5.7417,-0.504,0.4888;6.1089,0.7835,-0.0475;6.7397,1.2797,0.6907;4.7909,1.5516,-0.1978;4.049,1.4363,-1.1649;4.5386,2.2897,0.8712;3.2574,3.0186,0.9297;3.0231,3.3248,-0.0931;2.1827,2.0777,1.4583;1.228,2.6106,1.5237;2.4445,1.7135,2.4573;2.0498,1.2215,0.7914;3.5162,4.2275,1.8146;2.6078,4.8354,1.8814;4.3169,4.8501,1.4041;3.7986,3.9174,2.8262)|\",6.11711935924\r\n\"[H]O/N=C1\\C2=C(SC([H])([H])[C@@]1([H])C([H])([H])[H])C([H])([H])C([H])([H])N([H])C2([H])[H] |(3.6437,-3.838,1.3784;3.8435,-3.3028,0.5952;3.0569,-2.155,0.83;3.1411,-1.2627,-0.0982;3.8827,-1.3165,-1.373;4.2674,-0.1818,-2.013;4.0221,1.4796,-1.4467;3.2202,1.2136,0.1835;4.003,1.1237,0.9431;2.6485,2.1253,0.3847;2.3123,-0.0214,0.2077;1.9607,-0.135,1.2395;1.086,0.1024,-0.7112;0.492,0.9834,-0.44;0.4491,-0.7822,-0.6082;1.3733,0.1964,-1.7622;4.9513,-0.1949,-3.3661;4.2377,0.1662,-4.1189;5.795,0.5072,-3.3652;5.4303,-1.5956,-3.7481;6.3574,-1.8321,-3.1897;5.6733,-1.6268,-4.8158;4.3462,-2.5301,-3.4709;4.5662,-3.4448,-3.8592;4.1405,-2.6709,-2.0296;3.2751,-3.3199,-1.8645;4.9933,-3.1572,-1.5248)|\",4.775598076275001\r\n\"[H]O/N=C1/C2=C(SC([H])([H])C1([H])[H])C([H])([H])C([H])([H])N([H])C2([H])[H] 
|(-2.1606,-4.2241,-0.5864;-2.4179,-3.302,-0.4318;-1.1656,-2.6616,-0.3773;-1.2737,-1.3819,-0.2234;-0.007,-0.6453,-0.1012;0.0626,0.7023,0.0409;-1.3314,1.8056,0.0328;-2.5832,0.717,-0.739;-3.5453,1.221,-0.6108;-2.3711,0.654,-1.8118;-2.6051,-0.672,-0.1098;-3.3737,-1.2817,-0.5926;-2.879,-0.5948,0.9515;1.3779,1.4291,0.2437;1.3781,2.3633,-0.3325;1.4656,1.7253,1.3009;2.5775,0.5542,-0.1537;3.5112,1.0085,0.1973;2.6397,0.4928,-1.2481;2.4829,-0.8089,0.3594;2.4523,-0.7733,1.378;1.2779,-1.4797,-0.1246;1.4592,-1.822,-1.1539;1.1262,-2.3892,0.4627)|\",4.623214319995\r\n\"[H]OC(=O)[C@]1([H])C([H])([H])C(=O)O[C@]1([H])C(=O)OC([H])([H])C([H])([H])C([H])([H])[H] |(6.077,-4.7032,3.3236;5.6047,-5.5269,3.0984;4.3034,-5.4855,3.4639;3.525,-6.3109,3.057;3.873,-4.4346,4.4939;4.0414,-4.9052,5.4724;2.3991,-4.0327,4.3811;1.7094,-4.7306,4.8568;2.0919,-3.922,3.3355;2.3457,-2.6741,5.0614;1.3994,-2.0634,5.4671;3.6322,-2.1859,5.1927;4.5771,-3.0405,4.5582;5.4829,-3.0546,5.1686;4.9089,-2.4757,3.1748;4.1308,-1.8958,2.4599;6.1933,-2.775,2.8593;6.6549,-2.3877,1.5272;7.4404,-3.1109,1.2956;5.8255,-2.5137,0.8273;7.1812,-0.9589,1.5231;6.3674,-0.2817,1.8054;7.9637,-0.8651,2.286;7.7331,-0.5711,0.1462;8.1116,0.4557,0.1588;6.9573,-0.6299,-0.6259;8.5588,-1.2265,-0.1557)|\",6.723933245855\r\n\"[H]OC(=O)[C@]1([H])C([H])([H])C(=O)O[C@]1([H])C(=O)OC([H])([H])C([H])([H])[H] |(5.4543,3.1393,-2.5998;5.324,3.6954,-1.8137;5.4252,2.9569,-0.6839;5.2699,3.4665,0.3986;5.8055,1.493,-0.8668;6.8966,1.4674,-0.996;5.1509,0.7115,-2.0167;5.6824,0.7538,-2.9715;4.1082,1.0074,-2.1697;5.1579,-0.7267,-1.5005;5.0022,-1.7403,-2.1204;5.3985,-0.7219,-0.1507;5.4631,0.6205,0.3661;6.2278,0.6451,1.1412;4.0876,0.9602,0.9528;3.0988,1.0868,0.2595;4.1303,1.04,2.2832;2.8919,1.3971,2.96;2.0576,0.9546,2.4116;2.9786,0.9232,3.9401;2.7603,2.9075,3.069;1.8464,3.1572,3.6202;2.705,3.3657,2.0779;3.6151,3.3356,3.6012)|\",6.979720265325\r\n\"[H]OC(=O)[C@]1([H])C([H])([H])C(=O)O[C@]1([H])C(=O)OC([H])([H])[H] 
|(-0.2084,-3.5448,2.1415;0.568,-3.9175,1.6916;1.6455,-3.1143,1.8523;2.7038,-3.3964,1.3471;1.4535,-1.8965,2.7479;1.5611,-2.2504,3.7828;0.1447,-1.1005,2.6276;-0.6695,-1.4409,3.2733;-0.2102,-1.0534,1.593;0.5619,0.3091,3.0449;-0.1371,1.2269,3.3688;1.9302,0.4017,3.0225;2.5285,-0.8066,2.5168;3.4511,-0.9814,3.0685;2.8229,-0.5887,1.0286;1.9492,-0.4739,0.1946;4.1327,-0.4947,0.7936;4.5023,-0.2962,-0.5863;5.5906,-0.245,-0.5894;4.0677,0.632,-0.9646;4.1512,-1.1363,-1.1898)|\",6.957951157285\r\n\"[H]C([H])([H])OC(=O)[C@@]1([H])OC(=O)C([H])([H])[C@@]1([H])C(=O)N(OC([H])([H])[H])C([H])([H])[H] |(1.8102,4.3627,-1.519;2.5965,4.3381,-0.764;3.0962,5.3076,-0.6949;2.186,4.0746,0.2139;3.52,3.3251,-1.2007;4.6004,3.1562,-0.4027;4.8592,3.8274,0.561;5.483,2.0637,-1.003;5.9385,2.4904,-1.9076;6.532,1.7455,-0.0881;6.8168,0.4127,-0.1013;7.6857,-0.074,0.5688;5.9088,-0.2876,-1.1094;6.499,-0.4958,-2.0099;5.5704,-1.2445,-0.7059;4.7837,0.7308,-1.3902;4.4842,0.7437,-2.4367;3.5739,0.4382,-0.5042;3.4842,0.8678,0.6386;2.6257,-0.4288,-1.0114;2.5971,-0.6032,-2.4103;2.8695,-1.9662,-2.7519;2.1304,-2.6451,-2.3106;3.8752,-2.2652,-2.4358;2.793,-2.005,-3.8412;1.311,-0.541,-0.3957;1.4253,-0.3175,0.6644;0.9295,-1.5587,-0.5169;0.609,0.167,-0.8521)|\",6.821894232035\r\n\"[H]C(=O)[C@@]([H])(C([H])=C([H])[H])/C([H])=C(\\[H])C([H])=C(C([H])([H])[H])C([H])([H])[H] |(1.7668,-1.1698,2.957;2.2667,-0.371,2.3656;2.8998,0.5056,2.9074;2.0617,-0.5146,0.8508;0.9663,-0.438,0.7228;2.7358,0.6034,0.0869;2.8106,1.5471,0.6229;3.2191,0.4987,-1.1506;3.6776,1.3484,-1.6493;3.1719,-0.4308,-1.7128;2.4786,-1.9049,0.4385;3.5526,-2.0919,0.4381;1.631,-2.9001,0.118;0.5629,-2.6866,0.1155;2.0593,-4.2436,-0.2347;3.1378,-4.4034,-0.2137;1.2815,-5.2933,-0.5781;-0.2232,-5.2588,-0.6592;-0.6624,-6.0012,0.0217;-0.5604,-5.5302,-1.6693;-0.6532,-4.2854,-0.4142;1.8997,-6.6266,-0.9141;1.5283,-7.4145,-0.2433;2.9911,-6.6033,-0.8396;1.6339,-6.9395,-1.934)|\",4.81641515385\r\n\"[H]C1=C([H])C([H])=C([H])C(=O)C1C1=C([H])C2ONC(C([H])([H])[H])[C@]2([H])N1[H] 
|(8.2858,-1.908,-1.5768;7.8783,-2.9124,-1.6585;8.7123,-3.9608,-1.9357;9.7769,-3.7992,-2.0734;8.1722,-5.2786,-2.0423;8.8457,-6.1049,-2.2607;6.8361,-5.5231,-1.8818;6.422,-6.523,-1.9687;5.8978,-4.4597,-1.593;4.6613,-4.6904,-1.4754;6.4756,-3.1107,-1.4649;5.6215,-2.0326,-1.179;5.9457,-0.6096,-1.0334;6.9379,-0.1939,-0.9434;4.7643,0.0197,-0.8733;4.3655,1.1916,-0.338;3.1109,0.8993,0.37;2.7513,-0.307,0.0992;1.5151,-0.8931,0.6939;1.7616,-1.8123,1.2385;0.7998,-1.1619,-0.0932;1.0445,-0.1845,1.379;3.6176,-0.9266,-0.9822;3.0608,-0.8122,-1.9318;4.2842,-2.2069,-0.9629;3.9243,-3.1462,-1.2064)|\",3.05855967962\r\n\"[H]C1=C2/N=C([H])\\N=C(\\N([H])C([H])(C([H])([H])[H])C([H])([H])[H])C2=C([H])[C@@]([H])(OC([H])([H])[H])C1=O |(5.5691,4.1829,4.2405;5.4799,3.9033,3.1962;5.817,2.6297,2.816;6.3601,1.7646,3.7471;6.5916,0.5454,3.3464;7.0442,-0.1258,4.076;6.3301,-0.0444,2.1385;5.8164,0.7253,1.1995;5.5356,0.1815,-0.0028;5.0324,0.7433,-0.6744;5.7122,-1.2418,-0.3264;6.595,-1.5628,0.2325;4.5093,-2.0718,0.142;4.6754,-3.134,-0.0702;3.5914,-1.7616,-0.3724;4.3635,-1.9547,1.2194;5.9717,-1.3781,-1.8284;6.136,-2.4284,-2.0888;6.8548,-0.8068,-2.1332;5.1122,-1.0255,-2.4145;5.5658,2.1648,1.4314;5.1184,3.0312,0.4966;4.8937,2.7484,-0.5289;4.9654,4.4909,0.7817;5.95,4.9399,0.5114;3.9526,5.0123,-0.0413;4.0664,6.4034,-0.331;5.0415,6.6282,-0.7927;3.9391,7.0088,0.5693;3.2749,6.6294,-1.0499;4.8413,4.84,2.2871;4.3034,5.8806,2.6472)|\",3.115703588225\r\n\"[H]C1=C([H])C(N([H])[H])=C([H])C([H])=C1/C([H])=N/OC([H])(C([H])([H])[H])C([H])([H])[H] 
|(8.2372,-3.0797,1.2026;7.8234,-3.123,0.1975;8.6408,-3.5162,-0.8558;9.682,-3.7703,-0.6701;8.1324,-3.5847,-2.1637;8.9556,-3.9223,-3.2364;9.7736,-4.4702,-3.0015;8.4788,-4.2871,-4.0514;6.7811,-3.244,-2.376;6.3733,-3.2862,-3.3838;5.9731,-2.854,-1.3203;4.9338,-2.596,-1.4975;6.4763,-2.7851,-0.0071;5.6524,-2.3836,1.1307;6.1251,-2.341,2.1163;4.4106,-2.0834,1.0025;3.858,-1.7148,2.234;2.45,-1.446,2.0822;2.3286,-0.7928,1.208;1.671,-2.7413,1.8662;0.6023,-2.5332,1.7379;1.7934,-3.4095,2.7267;2.0346,-3.2566,0.9725;2.0421,-0.7022,3.3484;0.9815,-0.4325,3.3052;2.6307,0.2129,3.4668;2.2013,-1.3305,4.2321)|\",4.52797447232\r\n\"[H]C1=C([H])C([H])=C([H])/C(=C(/[H])N([H])/N=C(\\[H])C([H])([H])[H])C1=O |(9.256,3.9929,2.3455;8.7903,3.1375,1.8653;9.5238,2.1394,1.3041;10.6107,2.1878,1.3297;8.9037,1.0056,0.6679;9.5273,0.2306,0.2326;7.5461,0.9143,0.6175;7.0925,0.0499,0.1328;6.709,1.9351,1.1906;5.3316,1.9173,1.1815;4.8043,2.7453,1.6462;4.5343,0.9683,0.6509;4.9343,0.1497,0.1936;3.1742,1.0859,0.7202;2.4815,0.1437,0.1983;2.9639,-0.72,-0.2827;0.9879,0.1854,0.2248;0.5824,-0.6951,0.7397;0.5799,0.1757,-0.7944;0.6435,1.0872,0.7363;7.3261,3.1195,1.8564;6.6529,4.0252,2.3664)|\",3.278971898524999\r\n\"[H]ON1C(NC([H])([H])OC([H])([H])C([H])([H])[H])=N[N+]([O-])C2=C1C([H])=C([H])C([H])=C2[H] 
|(7.3856,-1.1074,0.7031;8.0464,-1.7231,1.1293;8.1391,-1.257,2.4363;7.0097,-1.2529,3.2719;5.7738,-1.3403,2.9332;5.2782,-1.4648,1.5971;4.1924,-1.3157,1.6292;5.4814,-2.4508,1.1647;5.816,-0.5274,0.6307;5.4115,0.8342,0.8429;4.3143,0.8819,0.7969;5.7178,1.165,1.8436;6.0373,1.6947,-0.2405;5.7149,2.7353,-0.1267;5.7373,1.3452,-1.2335;7.131,1.6691,-0.181;7.2329,-1.0964,4.6459;8.4225,-0.8725,5.1088;8.5911,-0.7362,6.3353;9.5803,-0.7689,4.2652;9.4092,-0.9978,2.8918;10.5356,-0.9564,2.0514;10.4102,-1.1506,0.9946;11.7785,-0.6727,2.6005;12.6451,-0.6397,1.9462;11.937,-0.4326,3.9755;12.9179,-0.2128,4.3838;10.8325,-0.4861,4.813;10.9001,-0.3231,5.8813)|\",3.643604458195001\r\n\"[H]O/C(=N\\[C@@]1([H])C([H])([H])N([H])C([H])([H])[C@]([H])(C([H])([H])[H])C1([H])[H])OC1=C([H])C([H])=C([H])C([H])=C1[H] |(1.121,-2.9571,2.0668;1.5112,-3.6297,2.6477;2.5846,-3.0759,3.2663;3.0571,-1.905,3.1747;2.4435,-0.9193,2.2986;1.3455,-1.0602,2.2189;2.6647,0.4867,2.8965;3.7414,0.6065,3.0768;2.1678,0.5611,3.8704;2.214,1.5843,2.0426;1.197,1.5891,1.9883;2.7878,1.5159,0.6968;2.3731,2.3389,0.1016;3.8659,1.7062,0.7952;2.5819,0.1668,-0.0324;3.2258,0.1704,-0.9238;1.1339,-0.0143,-0.5161;0.9975,-0.9864,-1.0056;0.8657,0.7629,-1.2418;0.4093,0.0436,0.3055;3.0593,-0.9807,0.8844;4.1495,-0.9263,1.0047;2.8483,-1.9563,0.4241;3.1124,-4.0477,4.0469;4.2027,-3.7669,4.8706;5.3264,-4.5739,4.718;5.3386,-5.3203,3.9302;6.4073,-4.4094,5.5865;7.2866,-5.0374,5.4722;6.3585,-3.4448,6.5936;7.2006,-3.3159,7.2677;5.2211,-2.6452,6.7333;5.1768,-1.8932,7.5165;4.1335,-2.8029,5.875;3.2484,-2.1853,5.9725)|\",5.793303877145\r\n\"[H]O/C(=N/[C@@]1([H])C([H])([H])N([H])C([H])([H])[C@]([H])(C([H])([H])[H])C1([H])[H])C([H])([H])[H] 
|(5.838,-3.35,4.2183;4.8842,-3.3427,4.0448;4.5649,-2.2192,3.3162;3.3416,-2.0907,3.0284;2.7483,-0.9921,2.2798;1.6733,-1.2209,2.2604;2.8568,0.3906,2.9697;3.9077,0.6974,3.0541;2.4619,0.3243,3.9897;2.1494,1.4518,2.2514;1.145,1.2898,2.3068;2.562,1.542,0.8472;1.9695,2.3299,0.3655;3.6072,1.8846,0.8396;2.4605,0.2245,0.0421;2.9855,0.3813,-0.9118;1.0054,-0.1377,-0.2964;0.9533,-1.0821,-0.8504;0.5463,0.6404,-0.9183;0.3837,-0.2522,0.5991;3.2016,-0.8979,0.8059;4.2801,-0.6975,0.7595;3.0437,-1.868,0.3183;5.782,-1.3621,3.0076;5.5385,-0.4758,2.426;6.5201,-1.9481,2.4442;6.2595,-1.0362,3.9409)|\",6.29671450057\r\n\"[H]OC(=O)/C([H])=C(\\[H])C(=O)[C@@]([H])(C(=O)O[H])C([H])([H])[H] |(5.2678,4.3926,-2.8948;6.0939,4.0725,-2.4966;5.8508,3.0183,-1.6741;6.755,2.4965,-1.0667;4.4271,2.5747,-1.5975;3.6765,3.0239,-2.246;4.0504,1.6258,-0.732;4.7897,1.1814,-0.0719;2.6348,1.1588,-0.6699;1.7684,1.6268,-1.3867;2.3216,0.0151,0.3076;3.0211,-0.8064,0.0805;2.6171,0.4856,1.7382;2.9068,1.6202,2.0258;2.5229,-0.4617,2.7046;2.2863,-1.3216,2.3198;0.8755,-0.4845,0.1526;0.694,-0.7945,-0.8791;0.1663,0.3126,0.3898;0.6698,-1.3384,0.8073)|\",4.819136292355\r\n\"[H]OC(=N/[C@@]([H])(C([H])([H])[H])C([H])([H])OC([H])([H])[H])/C([H])=C(\\[H])C1=C([H])C([H])=C(C([H])([H])[H])C([H])=C1[H] 
|(8.141,0.7027,-1.8873;8.6749,-0.0236,-2.2453;8.4981,-1.1226,-1.4404;9.402,-2.0153,-1.5137;9.2948,-3.282,-0.8086;8.694,-3.2227,0.1136;8.6698,-4.3474,-1.7205;8.7005,-5.3258,-1.2323;9.2227,-4.4041,-2.6649;7.6287,-4.096,-1.9504;10.7153,-3.6706,-0.3843;11.1619,-2.8511,0.2032;11.3422,-3.8133,-1.2802;10.6505,-4.8574,0.3855;11.9192,-5.3019,0.8101;11.7662,-6.2157,1.3909;12.4226,-4.5546,1.4455;12.5803,-5.5266,-0.0428;7.2666,-1.1343,-0.6247;7.281,-1.7723,0.2529;6.1508,-0.4454,-0.9418;6.1458,0.126,-1.8708;4.8892,-0.3983,-0.1994;3.7888,0.2674,-0.7695;3.8982,0.732,-1.747;2.5662,0.3395,-0.1099;1.7356,0.8623,-0.5789;2.3896,-0.249,1.1495;1.0553,-0.1919,1.8538;1.1516,-0.4381,2.9159;0.3444,-0.9042,1.4142;0.6026,0.8031,1.7761;3.4884,-0.9041,1.725;3.3819,-1.3602,2.7065;4.7124,-0.9792,1.0696;5.5408,-1.4857,1.5562)|\",4.410965516605\r\n\"[H]OC(=N/[C@]([H])(C([H])([H])[H])C([H])([H])C([H])([H])[H])/C([H])=C(\\[H])C1=C([H])C([H])=C([H])C(OC([H])([H])[H])=C1[H] |(3.0717,-1.608,-1.2927;3.5874,-1.4789,-0.4814;4.6405,-2.3635,-0.5;5.6303,-2.0652,0.2406;6.7617,-2.9561,0.4429;6.764,-3.819,-0.2458;6.6939,-3.5079,1.8761;7.5496,-4.1626,2.0789;5.7744,-4.0834,2.0276;6.7006,-2.6855,2.5994;8.0637,-2.1668,0.2129;8.0774,-1.3182,0.9088;8.9106,-2.8143,0.4785;8.23,-1.6605,-1.2221;9.1668,-1.1041,-1.3394;7.4053,-0.9946,-1.4971;8.2476,-2.4919,-1.9392;4.4586,-3.5518,-1.3623;5.3639,-4.0349,-1.7147;3.2522,-4.0584,-1.6885;2.3665,-3.5952,-1.2524;2.9767,-5.2201,-2.5413;3.9552,-5.8305,-3.3416;4.9663,-5.4373,-3.366;3.625,-6.9356,-4.1248;4.3855,-7.4023,-4.745;2.3313,-7.4476,-4.1306;2.0611,-8.3057,-4.7375;1.3421,-6.8432,-3.3414;0.103,-7.4091,-3.4108;-0.9433,-6.8422,-2.6375;-1.8284,-7.446,-2.8449;-1.1397,-5.8009,-2.9253;-0.7197,-6.8838,-1.5633;1.6643,-5.733,-2.5565;0.9112,-5.2559,-1.9387)|\",4.345658192485001\r\n\"[H]C([H])([H])N1C(=O)N(C([H])([H])C#N)[C@@]2([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[C@]2([H])C1=O 
|(1.0826,-1.037,-0.9307;1.2449,-0.0338,-0.5425;1.4198,0.6692,-1.3595;0.3749,0.3026,0.0257;2.4274,-0.0891,0.3278;2.8078,1.1292,0.9174;2.1193,2.1277,0.7652;3.9768,1.1266,1.6681;4.4476,2.4501,2.0634;3.5743,3.1036,2.1067;4.9007,2.4092,3.0575;5.4289,3.022,1.1183;6.2203,3.4358,0.3768;4.9376,0.0151,1.531;5.4282,0.084,0.5451;6.0361,0.0385,2.6066;6.6299,0.9554,2.5246;5.565,0.0369,3.6;6.9643,-1.1794,2.466;7.5096,-1.1066,1.5145;7.7199,-1.1516,3.2599;6.1851,-2.5009,2.5093;5.7447,-2.6307,3.5087;6.8641,-3.3475,2.3534;5.0711,-2.5231,1.4553;4.4682,-3.4321,1.5294;5.5076,-2.5247,0.4474;4.1531,-1.3012,1.6039;3.6755,-1.3358,2.5972;3.0175,-1.3268,0.5934;2.6198,-2.3535,0.0718)|\",6.658625921735\r\n\"[H]OC(=O)/C([H])=C(\\[H])C1=C([H])N=C([C@@]2([H])OC([H])([H])C([H])([H])C2([H])[H])N=C1[H] |(7.455,-1.2648,-5.4774;7.7879,-0.3965,-5.7566;7.1098,0.601,-5.1236;7.3932,1.7556,-5.3453;6.0432,0.1637,-4.1831;5.861,-0.9018,-4.0545;5.3329,1.0877,-3.5117;5.5985,2.127,-3.7034;4.2542,0.8656,-2.5567;3.7154,-0.3869,-2.2164;4.0831,-1.3002,-2.6825;2.7413,-0.5399,-1.3237;2.269,0.577,-0.7412;1.16,0.3852,0.2823;1.2511,-0.6358,0.6666;1.2894,1.2901,1.3678;0.3542,2.3743,1.2422;-0.3174,2.3477,2.1114;0.9048,3.3208,1.2576;-0.407,2.1589,-0.0756;-1.4558,2.4646,-0.01;0.0651,2.7257,-0.883;-0.2417,0.6496,-0.3119;-0.3295,0.3549,-1.3617;-0.9821,0.0819,0.263;2.6794,1.8262,-1.0054;3.6566,1.9542,-1.9005;3.9906,2.9698,-2.1152)|\",4.35926388501\r\n\"[H]OC(=O)/C([H])=C(\\[H])C1=C([H])N=C(N([H])[C@]([H])(C([H])([H])[H])C([H])([H])C([H])([H])[H])N=C1[H] 
|(8.5365,2.5003,-8.5669;8.7088,3.4531,-8.638;8.2686,4.0954,-7.5155;8.4027,5.2956,-7.4253;7.6484,3.2326,-6.483;7.5816,2.1616,-6.6661;7.1819,3.7856,-5.3444;7.3005,4.8662,-5.2628;6.5448,3.1288,-4.2203;6.2826,1.7452,-4.1252;6.5619,1.0721,-4.9354;5.6979,1.1766,-3.0854;5.3397,2.0085,-2.0703;4.7617,1.4118,-1.0028;4.6033,0.4182,-1.1098;4.2217,2.0931,0.1748;4.8029,3.0136,0.2847;4.4426,1.2006,1.4007;4.1057,1.699,2.3139;5.5035,0.9573,1.5176;3.882,0.2605,1.31;2.7436,2.4795,-0.047;2.6838,3.0305,-0.9931;2.1521,1.5622,-0.1786;2.1435,3.3356,1.0738;1.1254,3.6478,0.8173;2.7364,4.2444,1.2358;2.0915,2.7955,2.0252;5.5259,3.3469,-2.0401;6.1214,3.8714,-3.1037;6.2772,4.9508,-3.0823)|\",4.1878321591950005\r\n\"[H]OC(=O)/C([H])=C(\\[H])C1=C([H])N=C(S[C@]([H])(C([H])([H])[H])C([H])([H])C([H])([H])[H])N=C1[H] |(7.0193,-9.698,1.8241;7.4928,-9.7377,2.6709;7.4512,-8.5279,3.2975;7.9939,-8.3886,4.37;6.7158,-7.4535,2.5816;6.2615,-7.68,1.6187;6.6296,-6.2286,3.134;7.1216,-6.1062,4.0987;5.9624,-5.0512,2.6019;5.2504,-4.9969,1.3859;5.154,-5.8793,0.7546;4.6596,-3.9001,0.935;4.7675,-2.7996,1.7078;3.9737,-1.3817,1.0244;4.2644,-0.0709,2.3096;5.2819,-0.2363,2.6755;3.2743,-0.2135,3.4675;3.4762,0.5551,4.2247;3.3685,-1.1917,3.946;2.2416,-0.0906,3.123;4.1708,1.3021,1.6208;3.2021,1.3923,1.1095;4.1566,2.0554,2.4204;5.3119,1.6198,0.6496;5.201,2.6303,0.2412;5.3361,0.9186,-0.1908;6.2837,1.5684,1.1553;5.4057,-2.7209,2.8836;5.9911,-3.8397,3.3093;6.5117,-3.7768,4.2649)|\",4.01095815637\r\n\"[H]C1=C(NC(=O)OC([H])([H])[H])ONN1C([H])(C([H])([H])[H])C([H])([H])[H] 
|(3.3757,-1.1269,2.2107;3.3361,-0.0534,2.2116;3.7265,0.8658,3.2151;4.2319,0.8353,4.4076;4.4917,-0.4209,4.908;4.3078,-1.5093,4.3656;5.0097,-0.3148,6.1549;5.3297,-1.5578,6.7827;5.7326,-1.2945,7.7621;6.0737,-2.1114,6.2016;4.4384,-2.183,6.8935;3.4461,2.1107,2.6619;2.9177,1.987,1.4024;2.8779,0.6977,1.1953;2.3452,0.2052,-0.1059;2.4203,-0.8828,-0.0254;0.8772,0.6132,-0.2509;0.4711,0.1755,-1.1683;0.7795,1.7006,-0.3104;0.2827,0.2541,0.5948;3.2335,0.7021,-1.249;2.8857,0.2655,-2.1905;4.2756,0.4056,-1.096;3.1881,1.7914,-1.3316)|\",4.12524597358\r\n\"[H]C1=C(NC(=O)C(F)(F)F)ONN1C([H])(C([H])([H])[H])C([H])([H])[H] |(2.1406,-2.5217,-1.2181;2.8166,-1.8242,-1.6779;3.6093,-1.9653,-2.8339;3.845,-2.8765,-3.7371;3.1712,-4.0542,-3.5824;2.3725,-4.376,-2.7074;3.521,-5.0612,-4.7015;3.1931,-4.5669,-5.9129;4.8422,-5.3316,-4.709;2.8625,-6.2167,-4.5342;4.2754,-0.755,-2.9453;3.9312,0.0925,-1.9307;3.0742,-0.5851,-1.217;2.4667,0.0651,-0.0187;1.8746,-0.7336,0.4368;3.5679,0.5075,0.9461;3.1057,0.9115,1.852;4.1951,1.2849,0.5009;4.2046,-0.3359,1.2296;1.5455,1.2027,-0.467;1.0298,1.6126,0.4069;0.7925,0.8448,-1.1755;2.119,2.0048,-0.9407)|\",4.15245735863\r\n\"[H]C1=C(NC(=O)C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])ONN1N(C([H])([H])[H])C([H])([H])[H] |(-0.1963,2.6795,3.39;-0.1864,2.4117,2.3485;0.8104,1.809,1.548;1.9974,1.4199,1.8748;2.8623,0.8324,0.9505;2.5958,0.6262,-0.2273;4.2264,0.4525,1.5625;5.1099,-0.187,0.4813;6.0806,-0.4695,0.9068;5.2829,0.5053,-0.3483;4.6394,-1.0829,0.0648;3.9984,-0.5462,2.7181;4.9587,-0.8255,3.1694;3.5187,-1.4642,2.3574;3.3627,-0.1109,3.4942;4.9028,1.7281,2.1108;5.8743,1.4778,2.5553;4.2827,2.2037,2.876;5.0789,2.457,1.3104;0.2047,1.7226,0.2742;-1.0667,2.2371,0.2943;-1.231,2.6207,1.5364;-2.4439,3.2132,1.9481;-2.6474,4.493,1.2496;-3.5195,4.9712,1.7028;-2.8155,4.3726,0.1711;-1.7756,5.1306,1.4142;-3.5589,2.2675,1.7758;-4.4384,2.7209,2.24;-3.3237,1.3411,2.3049;-3.7752,2.0433,0.7229)|\",3.948371970755\r\n\"[H]C1=C(NC(=O)C(=O)OC([H])([H])C([H])([H])[H])ONN1C([H])([H])[H] 
|(7.56,-2.6906,-1.8211;7.877,-2.2059,-0.9138;7.2178,-1.252,-0.1133;6.0543,-0.7044,-0.2774;5.54,0.1717,0.6535;6.0335,0.5274,1.7124;4.1465,0.7008,0.2711;3.2276,0.7747,1.0548;4.1038,1.111,-1.0124;2.8492,1.6784,-1.4612;2.4203,2.2729,-0.6503;3.132,2.3382,-2.2852;1.8852,0.5942,-1.9196;0.974,1.0517,-2.3226;1.6048,-0.0493,-1.081;2.3384,-0.0221,-2.7028;8.1285,-1.0063,0.9296;9.2623,-1.7506,0.7771;9.0596,-2.4366,-0.3193;10.1169,-3.3493,-0.7546;9.7236,-4.3674,-0.7784;10.9317,-3.274,-0.0354;10.458,-3.0539,-1.7487)|\",4.04361181843\r\n\"[H]C1=C(NC(=O)C([H])([H])[H])ONN1C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H] |(4.8922,3.6953,-1.0942;5.0879,3.1665,-2.0084;5.5307,3.6599,-3.2577;5.8504,4.8009,-3.7856;5.7549,5.9088,-2.9536;5.3928,5.8907,-1.7752;6.1479,7.2002,-3.6436;5.5109,7.3662,-4.5198;7.178,7.1309,-4.011;6.0552,8.0391,-2.9515;5.6099,2.5233,-4.0614;5.2445,1.3998,-3.3717;4.9493,1.8385,-2.1752;4.4637,0.8592,-1.189;4.7491,-0.1207,-1.5779;5.0141,1.0416,-0.2616;2.9497,0.9615,-0.9768;2.6983,1.968,-0.6194;2.4503,0.8326,-1.9452;2.4458,-0.0897,0.0214;2.7018,-1.0926,-0.3469;2.9726,0.032,0.978;0.9341,0.0008,0.2534;0.5996,-0.7621,0.9642;0.3803,-0.1466,-0.6812;0.6526,0.9805,0.657)|\",4.0817077575\r\n\"[H]C1=NC([H])=C(/C([H])=C(\\[H])N(C([H])([H])[H])C([H])([H])[H])C([H])=C1[H] |(5.0485,-6.7288,0.5919;4.8839,-5.6659,0.4237;3.9904,-5.0692,1.2248;3.7679,-3.7665,1.0436;3.0673,-3.3121,1.7429;4.383,-2.9718,0.0518;4.0922,-1.5539,-0.1507;4.8196,-1.0041,-0.7423;2.9883,-0.9184,0.3129;2.2146,-1.4847,0.827;2.7095,0.4295,0.2441;1.3153,0.8438,0.2562;1.2368,1.8686,0.6349;0.745,0.1886,0.9214;0.8488,0.808,-0.7423;3.6307,1.3053,-0.4538;3.3554,2.3457,-0.2588;3.6308,1.1423,-1.5448;4.6474,1.1412,-0.0829;5.3318,-3.6271,-0.754;5.8561,-3.0689,-1.5269;5.5866,-4.9818,-0.5685;6.3145,-5.5023,-1.1849)|\",4.5851183809250005\r\n\"[H]O/C1=C(C(\\[H])=C(/[H])C([H])([H])[H])/C([H])=C2\\C3=C(C(=O)N2C([H])([H])[H])C([H])=C([H])C([H])=C31 
|(3.5146,-2.2955,1.5743;4.464,-2.5143,1.5707;5.1621,-1.3497,1.4314;4.5803,-0.082,1.5462;3.1482,0.0943,1.8525;2.6695,0.9323,1.3415;2.391,-0.6259,2.7016;2.8495,-1.4179,3.295;0.9343,-0.3704,2.9611;0.3229,-1.2538,2.7324;0.5593,0.4646,2.3598;0.7581,-0.1358,4.0194;5.3552,1.1126,1.315;4.8553,2.0739,1.3919;6.6912,1.0008,1.0301;7.2594,-0.289,0.947;8.6213,-0.2053,0.6357;8.9391,1.2534,0.5191;9.9963,1.8057,0.264;7.7185,1.9198,0.7696;7.5917,3.3595,0.7537;7.2621,3.7381,1.7291;6.877,3.6853,-0.0121;8.5774,3.7683,0.5229;9.3517,-1.3673,0.4873;10.4099,-1.3472,0.2439;8.6696,-2.5998,0.6594;9.2277,-3.5248,0.5445;7.3141,-2.6706,0.9661;6.8331,-3.636,1.0859;6.5538,-1.4827,1.1238)|\",3.4504036243400007\r\n\"[H]C1=C(C([H])([H])[H])N([H])C(/N=C(\\[H])OC([H])([H])C([H])([H])[H])=C1C#N |(9.208,-3.9653,2.2437;8.4062,-3.2404,2.2512;8.5468,-1.873,2.2716;9.7674,-1.0112,2.2941;10.6633,-1.6372,2.2667;9.821,-0.3947,3.2014;9.8098,-0.3334,1.4313;7.272,-1.3408,2.274;7.0318,-0.3597,2.2865;6.3132,-2.3212,2.2558;4.9969,-1.9104,2.2535;4.0271,-2.7461,2.2533;4.1049,-3.834,2.2552;2.7479,-2.3666,2.2497;2.4699,-0.9489,2.2431;2.9249,-0.4981,3.1317;2.9467,-0.5016,1.3644;0.9621,-0.7845,2.2247;0.7053,0.2804,2.2212;0.5079,-1.2464,3.1069;0.5296,-1.247,1.3319;7.0051,-3.5434,2.2406;6.4278,-4.8351,2.2173;5.9435,-5.8963,2.1987)|\",4.574233826905\r\n\"[H]C(=O)/C([H])=C1/C([H])=C(C([H])([H])[H])C([H])([H])C(C([H])([H])[H])(C([H])([H])[H])C1([H])[H] 
|(7.4417,1.6088,0.1294;6.8428,2.4566,-0.2648;7.4002,3.4981,-0.5815;5.4017,2.2526,-0.3739;4.8587,3.1107,-0.7671;4.7112,1.1292,-0.0373;3.2649,1.0981,-0.2045;2.7845,2.0267,-0.5087;2.5117,-0.0057,-0.0105;1.0172,0.0122,-0.1672;0.64,1.0143,-0.3924;0.523,-0.3428,0.748;0.6991,-0.6651,-0.9721;3.1375,-1.3239,0.3827;2.6057,-2.1455,-0.119;2.9641,-1.483,1.4601;4.6509,-1.4232,0.0835;4.8866,-1.6239,-1.4268;4.4354,-2.5639,-1.7667;5.9596,-1.6714,-1.6484;4.4582,-0.8102,-2.0204;5.2489,-2.6162,0.847;4.7609,-3.5535,0.553;5.1278,-2.5004,1.9311;6.3205,-2.7203,0.638;5.3304,-0.1175,0.5628;5.2224,-0.0537,1.6566;6.4041,-0.17,0.3613)|\",4.44634031717\r\n\"[H]C([H])([H])OC(=O)C1(C([H])([H])[H])C([H])([H])OC(C([H])([H])[H])(C([H])([H])[H])OC1([H])[H] |(3.8149,2.2654,4.8557;3.8868,1.2134,4.5783;4.6187,0.6978,5.2059;2.9177,0.7195,4.6836;4.306,1.2053,3.2036;4.4293,-0.0172,2.6441;4.2372,-1.047,3.2533;4.9325,0.0539,1.1956;6.4738,0.0524,1.2388;6.8869,0.0679,0.2237;6.847,-0.8432,1.7468;6.8516,0.9305,1.7737;4.4129,1.2965,0.453;4.9375,1.3756,-0.5124;4.6052,2.2096,1.0187;3.0049,1.2407,0.2692;2.5268,0.0731,-0.4006;1.0187,0.0701,-0.1889;0.5738,0.9684,-0.6263;0.808,0.0529,0.8833;0.5746,-0.8151,-0.6526;2.8903,0.0943,-1.8937;2.4744,0.9912,-2.3625;2.4723,-0.7889,-2.386;3.9699,0.0946,-2.067;3.005,-1.1089,0.2367;4.4119,-1.175,0.4308;4.5889,-2.0875,1.0029;4.9444,-1.2524,-0.5298)|\",7.306256885925\r\n\"[H]C([H])=C([H])C([H])([H])[C@]([H])(OC(=O)C([H])(OC([H])([H])[H])OC([H])([H])[H])C([H])([H])[H] 
|(1.326,1.0696,-2.7973;1.7547,0.9022,-1.8128;1.234,1.3596,-0.9733;2.8579,0.1728,-1.6443;3.3549,-0.266,-2.5087;3.5076,-0.1024,-0.3142;3.4722,-1.1799,-0.0942;2.9606,0.408,0.4884;4.9747,0.3394,-0.2548;5.0688,1.393,-0.5314;5.6651,-0.443,-1.2767;6.7051,0.1277,-1.9053;7.1856,1.2048,-1.6369;7.2322,-0.8148,-3.0118;7.5209,-1.7775,-2.5402;6.1843,-0.9995,-3.908;6.45,-1.9809,-4.9048;5.5384,-2.0685,-5.4994;6.678,-2.9547,-4.445;7.284,-1.6817,-5.5477;8.3313,-0.2662,-3.6807;9.5597,-0.35,-2.9732;10.3342,0.0053,-3.6565;9.7858,-1.3893,-2.686;9.5525,0.2813,-2.0775;5.6353,0.0921,1.0962;5.1152,0.6547,1.8795;6.6784,0.4206,1.0764;5.6027,-0.9719,1.3557)|\",6.933460910740001\r\n\"[H]OC(=O)C1=C([H])C([C@@]2([H])N(C([H])([H])[H])C(=O)[C@]([H])(O[H])C2([H])[H])=C([H])N=C1[H] |(1.3186,-1.7993,-3.9094;1.4499,-1.6655,-4.8619;1.5611,-0.3406,-5.1491;1.545,0.0351,-6.2959;1.7063,0.5948,-3.9859;2.1811,0.2108,-2.7299;2.4831,-0.815,-2.5247;2.301,1.1651,-1.7154;2.8585,0.8186,-0.3504;2.6344,1.6491,0.336;2.3087,-0.4251,0.1995;0.8843,-0.5956,0.4287;0.4864,0.2159,1.0534;0.7457,-1.5458,0.9474;0.3336,-0.6075,-0.5171;3.231,-1.1579,0.9066;2.9954,-2.1208,1.6207;4.6028,-0.488,0.7471;5.3762,-1.2286,0.5047;4.899,0.1842,1.9688;4.8062,-0.475,2.6781;4.388,0.5416,-0.3563;4.9709,1.4495,-0.1855;4.6821,0.1205,-1.323;1.9185,2.4743,-2.0239;1.9792,3.249,-1.2597;1.476,2.8714,-3.2241;1.3843,1.948,-4.1812;1.0326,2.274,-5.1564)|\",5.01233712621\r\n\"[H]OC([H])([H])[C@@]1([H])O[C@@](O[H])(C([H])([H])F)[C@]([H])(O[H])[C@]([H])(O[H])[C@]1([H])O[H] 
|(-0.0972,-0.138,1.2071;0.3237,-0.8305,0.6693;1.2312,-0.1823,-0.1925;2.0918,0.2329,0.3612;1.6338,-0.9466,-0.8659;0.5454,0.8987,-1.0478;-0.1487,0.3963,-1.7271;-0.3342,1.7198,-0.2421;0.1628,2.8409,0.4678;-0.9699,3.6388,0.7472;-1.6608,3.0363,1.0722;0.7978,2.4326,1.8039;0.9471,3.3184,2.4277;1.7446,1.8996,1.6878;-0.0982,1.5831,2.4643;1.0875,3.7173,-0.4031;1.5649,4.4857,0.2225;0.2999,4.342,-1.4166;-0.5508,4.5571,-0.99;2.1735,2.8673,-1.088;2.8212,2.4452,-0.3095;3.002,3.6533,-1.9139;2.5156,3.7377,-2.7544;1.5226,1.7262,-1.8969;2.314,1.0598,-2.2591;0.9176,2.2647,-3.068;0.3072,2.9616,-2.7526)|\",8.07089680583\r\n\"[H]OC1=NC2=NC(=O)N(C([H])([H])[H])[C@@]2([H])C(=O)N1C([H])([H])[H] |(6.9974,3.3439,-0.6247;6.1538,3.8258,-0.5311;5.2205,2.8873,-0.3377;5.5603,1.6386,-0.3276;4.5337,0.7514,-0.1268;4.6252,-0.4266,0.3782;3.2789,-0.9163,0.5025;2.9638,-2.0191,0.8893;2.3797,0.0945,0.1268;0.974,-0.1681,-0.12;0.3633,0.6529,0.2625;0.7236,-1.0933,0.4016;0.7684,-0.2997,-1.1927;3.1135,1.1696,-0.4624;2.9934,1.213,-1.5629;2.8415,2.587,0.0294;1.7704,2.9932,0.4224;3.9627,3.4317,-0.1173;3.8065,4.8754,0.1209;2.7434,5.0584,0.2681;4.1744,5.4372,-0.7394;4.359,5.1791,1.0136)|\",4.696685059630001\r\n\"[H]OC(=O)[C@]1(C([H])([H])[H])C([H])([H])O[C@]([H])(C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])OC1([H])[H] 
|(0.7703,1.9206,2.3964;0.2678,2.728,2.641;1.1223,3.7492,2.8374;0.7169,4.8626,3.0791;2.6337,3.4296,2.7207;3.4255,4.4449,3.5534;4.504,4.2916,3.4313;3.1827,4.3509,4.617;3.171,5.4629,3.2507;3.0471,3.5274,1.2308;4.1476,3.5025,1.1635;2.691,4.459,0.7836;2.4934,2.467,0.4563;2.8642,1.1985,0.9408;3.9675,1.1317,0.9772;2.2913,0.0952,0.0338;2.7628,-1.2709,0.5676;2.3833,-2.0742,-0.0737;2.4043,-1.4477,1.586;3.8578,-1.3406,0.5749;2.8416,0.3127,-1.3892;2.4882,-0.4858,-2.0511;3.9388,0.2935,-1.4009;2.5142,1.2717,-1.7995;0.7519,0.1546,0.014;0.3597,-0.577,-0.7014;0.3971,1.1456,-0.2833;0.3287,-0.0852,0.9956;2.3827,1.0262,2.2769;2.9587,1.9909,3.1712;2.5537,1.7693,4.1628;4.0495,1.8491,3.2023)|\",7.494015442769999\r\n\"[H]NC1=NC2=NC([H])C([C@@]([H])(O[H])[C@@]([H])(O[H])C([H])([H])[H])=N[C@@]2([H])C(O[H])=N1 |(8.0797,-3.0627,3.9376;7.408,-3.5149,3.3091;6.6641,-2.602,2.8112;6.8642,-1.2359,3.1669;5.8874,-0.4272,2.9819;6.0635,0.9336,3.2669;5.2079,1.7465,2.7533;5.3111,2.8044,2.98;4.1179,1.3285,1.8298;3.3613,2.3844,1.0323;2.4501,1.9066,0.6588;2.9788,3.452,1.8843;3.68,4.1232,1.7984;4.1528,2.934,-0.1705;3.4587,3.592,-0.7154;5.1928,3.727,0.4233;5.652,4.2182,-0.275;4.7067,1.8736,-1.1153;3.9033,1.241,-1.5087;5.4367,1.2348,-0.6099;5.2037,2.3483,-1.9694;3.7848,0.0976,1.6978;4.5328,-0.8811,2.4875;3.8984,-1.1402,3.3535;4.7225,-2.1611,1.678;3.7782,-2.408,0.7624;3.2246,-1.6006,0.7096;5.6913,-2.9649,1.8596)|\",3.63544104268\r\n\"[H]OC(=O)[C@@]1([H])C(C([H])([H])[H])(C([H])([H])[H])N(O[H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[C@@]1([H])N([H])[H] 
|(6.4477,1.2787,-2.461;7.1325,0.6241,-2.2493;6.6324,-0.3381,-1.4303;7.3833,-1.1744,-0.9916;5.107,-0.2845,-1.2225;4.6716,-0.3756,-2.229;4.5202,-1.5003,-0.4304;4.7256,-2.7677,-1.2916;5.7888,-2.9379,-1.4698;4.2113,-2.6631,-2.2541;4.3195,-3.6424,-0.7775;5.1535,-1.7733,0.9524;6.1962,-2.0668,0.8276;4.6054,-2.5882,1.4323;5.1259,-0.9219,1.6345;3.0392,-1.2728,-0.3999;2.4407,-2.3882,0.2997;1.9439,-2.8346,-0.4036;2.467,-0.029,0.1822;0.9641,-0.0375,-0.1707;0.5139,0.9346,0.0607;0.4328,-0.8024,0.4024;0.8246,-0.2393,-1.2382;2.5817,0.1143,1.7186;1.9095,0.9096,2.0619;3.5848,0.375,2.0578;2.2809,-0.8167,2.206;3.1227,1.1463,-0.5725;2.7042,1.1653,-1.5879;2.8243,2.0893,-0.0954;4.6596,1.113,-0.7169;4.8728,1.8369,-1.5214;5.4831,1.5572,0.4095;5.4546,0.9023,1.1836;5.1492,2.4525,0.7615)|\",6.446377118345\r\n\"[H]NC1=NC([H])[C@@]([H])(C([H])([H])C([H])([H])Cl)C(O[H])=N1 |(4.0942,-0.8543,4.2145;4.0864,0.0867,3.8086;3.8315,-0.0284,2.5628;3.6084,-1.3268,1.9907;3.0501,-1.3823,0.847;2.8625,-2.3683,0.4129;2.5805,-0.18,0.0559;2.8527,-0.3183,-0.9996;1.0276,-0.0939,0.1334;0.6124,-1.0116,-0.2988;0.7255,-0.0662,1.1886;0.3909,1.1194,-0.5347;0.6999,2.0573,-0.072;-0.6972,1.0484,-0.513;0.8488,1.2605,-2.2996;3.278,1.0461,0.607;3.3204,2.1455,-0.1787;2.9303,1.9527,-1.0501;3.8124,1.1222,1.7612)|\",5.29533553073\r\n\"[H]/N=C(/O[H])C([H])([H])/C(=C(/C#N)C(=O)O[H])N([H])[H] |(-0.1855,0.45,-2.0251;0.6106,-0.0902,-2.3644;1.3745,-0.4213,-1.4028;2.479,-1.1378,-1.6512;3.0113,-1.2333,-0.8272;1.1395,-0.0919,0.0858;2.0609,0.2956,0.5236;0.371,0.6837,0.1603;0.6736,-1.3005,0.8674;1.5143,-2.1468,1.5865;0.9231,-3.1873,2.3556;0.4361,-4.0319,2.9981;2.982,-2.0253,1.6019;3.634,-1.3379,0.8241;3.6432,-2.7305,2.5383;3.0285,-3.2193,3.1152;-0.6566,-1.4995,0.8601;-1.251,-0.9439,0.2628;-1.0639,-2.3257,1.281)|\",5.064038757805001\r\n\"[H]C1=C([H])SC(/C([H])=N/N(C([H])([H])[H])C([H])([H])[H])=N1 
|(6.7973,5.1649,0.7963;5.9538,4.54,0.524;4.9296,4.9251,-0.2933;4.7841,5.8751,-0.7886;3.7877,3.6287,-0.4694;4.8065,2.6515,0.5934;4.4748,1.2853,0.9492;5.1712,0.8066,1.6377;3.4139,0.7329,0.4586;3.039,-0.5088,0.8142;2.1305,-1.1321,-0.1373;1.5367,-1.8982,0.371;2.6616,-1.5993,-0.9822;1.4666,-0.3592,-0.5273;3.946,-1.3777,1.546;3.4534,-2.3382,1.7097;4.1857,-0.9437,2.523;4.8893,-1.5471,1.0003;5.8838,3.2629,1.0224)|\",4.21776468275\r\n\"[H]C1=C([H])C2=C(C([H])=C1[H])N([H])C(C1=N[C@@]([H])(N([H])[H])N=N1)=N2 |(3.067,-1.1315,0.1515;2.1077,-0.6272,0.0777;0.9372,-1.3574,0.2192;0.9462,-2.427,0.4025;-0.2837,-0.67,0.1179;-0.2872,0.7319,-0.1232;0.8919,1.4713,-0.2671;0.8803,2.5415,-0.4512;2.0844,0.7667,-0.1617;3.0249,1.3002,-0.2662;-1.6184,1.0718,-0.165;-2.0365,1.9808,-0.3098;-2.3366,-0.0921,0.0463;-3.7863,-0.0343,0.0587;-4.6443,-0.9668,0.2195;-5.9373,-0.2935,0.1149;-6.4126,-0.615,-0.8236;-6.8681,-0.4832,1.1939;-6.4098,-0.2815,2.0816;-7.1515,-1.4601,1.2228;-5.6309,1.1646,-0.0816;-4.3939,1.3002,-0.1386;-1.5721,-1.151,0.2178)|\",3.692584951285\r\n\"[H]OC(=O)[C@]1([H])N([H])OC(=O)[C@@]2([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[C@@]21[H] |(3.1871,1.2313,-1.4856;2.6275,1.4384,-2.268;1.3561,1.469,-1.8438;0.4253,1.6169,-2.5996;1.2094,1.3646,-0.305;1.3051,2.3934,0.0736;2.3962,0.6743,0.2196;2.4468,0.7851,1.2334;2.3549,-0.7696,0.0396;1.1603,-1.4779,0.021;1.254,-2.6731,0.1237;-0.137,-0.7326,-0.2434;-0.2397,-0.728,-1.3406;-1.3475,-1.477,0.3475;-1.3692,-2.5002,-0.0377;-1.2225,-1.5555,1.4368;-2.6514,-0.7328,0.0246;-2.8293,-0.769,-1.0598;-3.4967,-1.2479,0.4967;-2.601,0.7346,0.4802;-2.5429,0.7717,1.5784;-3.5274,1.25,0.1997;-1.3954,1.4723,-0.1236;-1.3459,2.5003,0.2602;-1.497,1.5433,-1.2103;-0.0885,0.7349,0.2151;0.008,0.7377,1.3147)|\",6.854547894095001\r\n\"[H]OC1=NC(=O)N(C([H])([H])[H])[C@@]([H])(C(=O)OC([H])([H])C([H])([H])[H])C1([H])[H] 
|(5.3617,-3.0075,-1.7865;5.1291,-2.2851,-2.3908;5.3589,-1.0996,-1.7888;5.0118,-0.0444,-2.4151;5.309,1.2281,-1.8649;5.2629,2.2272,-2.5608;5.67,1.2879,-0.5211;5.8552,2.5962,0.0939;6.6361,2.5315,0.8603;4.9307,2.9634,0.5553;6.1632,3.294,-0.6837;5.5239,0.124,0.3273;6.1432,0.27,1.2214;4.0901,-0.1422,0.8488;3.7467,-1.2215,1.2868;3.3152,0.9464,0.7878;1.9388,0.804,1.2388;1.9229,0.1307,2.0989;1.6603,1.8113,1.5557;1.0488,0.3015,0.1138;0.0056,0.287,0.4494;1.3286,-0.7143,-0.179;1.1218,0.9543,-0.7613;6.0177,-1.1152,-0.4272;5.7595,-2.0145,0.1391;7.1073,-1.0755,-0.5476)|\",5.68717947545\r\n\"[H]OC1=NC(=O)/C(=C(/N([H])[H])C([H])([H])[H])C1([H])[H] |(6.7687,0.8813,2.5337;6.0879,1.2677,3.1077;4.8821,1.1135,2.5357;3.8192,1.5383,3.1062;2.7169,1.2184,2.2397;1.5668,1.5029,2.523;3.2303,0.5315,1.0261;2.5074,0.0622,-0.0254;3.1058,-0.6089,-1.0715;2.5829,-0.6835,-1.9334;4.1032,-0.5095,-1.2014;1.0097,0.171,-0.1278;0.6022,0.6578,0.7571;0.5675,-0.8276,-0.234;0.7317,0.7512,-1.0182;4.725,0.43,1.1888;5.3087,0.9594,0.4173;5.1049,-0.6033,1.2288)|\",5.11029811239\r\n\"[H]O/N=C1/C([H])([H])C(=O)C([H])([H])[C@]([H])(C2=C([H])C([H])=C([H])C([H])=C2[H])C1([H])[H] |(2.0834,-4.4475,1.3727;2.4145,-3.6912,0.8646;2.9965,-2.8998,1.8746;3.4477,-1.7878,1.4266;4.1067,-0.8799,2.4376;5.1877,-0.8259,2.2236;3.985,-1.2586,3.4544;3.5763,0.5523,2.3546;3.3828,1.2201,3.3509;3.3665,1.1011,0.9478;2.8259,2.0484,1.0246;4.3709,1.3303,0.5587;2.6842,0.1273,-0.0441;2.8716,0.5203,-1.051;1.1706,0.0252,0.1089;0.4057,-0.3173,-1.0169;0.9043,-0.468,-1.9726;-0.9773,-0.4641,-0.9348;-1.5458,-0.7258,-1.8234;-1.6294,-0.2667,0.2844;-2.708,-0.378,0.3534;-0.8847,0.0828,1.4094;-1.3796,0.2473,2.3628;0.5023,0.2297,1.323;1.0497,0.5223,2.2131;3.3652,-1.2742,0.0133;4.3861,-1.1812,-0.3871;2.8263,-1.9836,-0.6155)|\",5.64908353638\r\n\"[H]OC(NN([H])[H])[C@@]1([H])N([H])[C@]([H])(C(NN([H])[H])O[H])C([H])([H])C(=O)C1([H])[H] 
|(-0.6616,1.9348,-3.1308;-1.1021,2.6361,-2.6284;-0.8457,2.4761,-1.3036;-1.2285,3.4237,-0.5386;-0.9787,3.1794,0.8607;-1.7364,2.5819,1.1986;-1.1074,4.0869,1.3052;-0.1363,1.2132,-0.8298;0.6989,1.581,-0.226;-1.0169,0.4702,0.0839;-1.8113,0.0935,-0.4306;-0.3456,-0.6017,0.8294;-1.1221,-1.2429,1.2598;0.3939,-0.0489,2.0522;0.6495,-0.7373,3.0998;0.2041,-2.0974,3.0524;0.8203,-2.6403,2.4436;0.3812,-2.4518,3.9898;0.84,1.2315,2.0514;0.2129,1.8597,1.6087;0.5342,-1.5155,-0.0921;1.2741,-2.0953,0.465;-0.1416,-2.2354,-0.5797;1.269,-0.7921,-1.2112;2.409,-1.0644,-1.5261;0.4736,0.304,-1.9205;-0.3232,-0.162,-2.5219;1.1626,0.8354,-2.5866)|\",4.6803582286\r\n\"[H]C1=C2C(=O)C([H])=C(N([H])[H])N=C2[C@]([H])(OC([H])([H])[H])C([H])=C1[H] |(2.9656,1.3247,0.336;3.9628,1.5201,0.724;4.7114,0.4384,1.0805;4.1455,-0.946,0.9807;3.0403,-1.148,0.464;4.9943,-1.9654,1.5379;4.6443,-2.9927,1.5112;6.2137,-1.6413,2.0831;7.0759,-2.57,2.605;6.716,-3.4808,2.8509;7.8181,-2.2065,3.1861;6.747,-0.3492,2.1371;6.0343,0.6216,1.6563;6.6467,2.0265,1.6423;7.3262,2.0301,0.763;7.383,2.3547,2.8035;8.7517,1.9663,2.7867;9.197,2.3818,3.6945;9.2714,2.3873,1.9113;8.8601,0.878,2.7833;5.6563,3.1266,1.3761;6.0196,4.1356,1.549;4.4223,2.8805,0.8969;3.7443,3.696,0.6618)|\",2.89529136932\r\n\"[H]C1C(=O)C2=C([H])/C([H])=C([H])/C(C([H])[H])=C\\2N=C1N([H])[H] |(4.9956,4.572,-0.0666;4.4924,3.6101,-0.0568;5.3033,2.4232,-0.0247;6.5409,2.4109,0.0005;4.5172,1.1492,-0.0194;5.1532,-0.0594,0.0067;6.2405,-0.0476,0.0206;4.4379,-1.3064,0.0168;4.9967,-2.2373,0.0365;3.0806,-1.3129,0.0029;2.5263,-2.2478,0.0115;2.3171,-0.0754,-0.0249;0.9637,-0.0996,-0.0389;0.4182,-1.0388,-0.0299;0.3952,0.8234,-0.0607;3.0665,1.2115,-0.0389;2.3986,2.3347,-0.068;3.1179,3.5223,-0.0802;2.3,4.6233,-0.1558;2.6824,5.5237,0.0941;1.3313,4.4664,0.0846)|\",2.359227083835\r\n\"[H]/N=C1/N=C(O[H])[C@]([H])(C(=O)O[H])C([H])=C1C(=O)O[H] 
|(-3.0728,1.1852,0.6284;-2.768,0.2377,0.3915;-1.5022,0.1983,0.1632;-0.9429,-1.0608,-0.0703;0.3196,-1.2341,0.0075;0.8202,-2.4596,-0.1693;1.7389,-2.4784,0.1806;1.3604,-0.1346,0.2486;2.0762,-0.1903,-0.5861;2.2155,-0.5124,1.4777;2.8388,-1.5522,1.4858;2.2474,0.3279,2.5166;1.6492,1.0839,2.3714;0.737,1.2234,0.2541;1.3848,2.0981,0.2561;-0.5927,1.3899,0.1712;-1.0967,2.8096,0.0596;-0.5136,3.7482,0.5447;-2.2282,2.9939,-0.6608;-2.5362,2.1504,-1.04)|\",4.783761491789999\r\n\"[H]C1=C(Cl)/C2=N/C(=O)/C([H])=C(/C([H])([H])[H])[C@@]2([H])C(=O)C1=O |(4.4198,-4.6306,-1.9294;4.0454,-3.6689,-1.5945;4.8273,-2.8159,-0.8941;6.465,-3.2318,-0.5193;4.3468,-1.4816,-0.4474;5.2015,-0.5649,-0.1961;4.7434,0.74,0.1855;5.5667,1.6091,0.3966;3.2958,0.9707,0.3181;3.0117,1.9667,0.6444;2.3731,0.0231,0.0803;0.9006,0.2514,0.2685;0.7115,1.2305,0.7162;0.4638,-0.5173,0.9201;0.3768,0.1919,-0.6911;2.8306,-1.3485,-0.3402;2.5176,-2.0783,0.4291;2.194,-1.8952,-1.6403;1.3763,-1.3134,-2.3076;2.6865,-3.3091,-2.0277;1.9744,-4.0497,-2.678)|\",3.67353698175\r\n\"[H]OC1=NC2=C(C(=O)C(O[H])=C([H])[C@@]2([H])Cl)C(C([H])([H])[H])=C1[H] |(4.6329,-3.0654,-0.1936;5.2134,-2.2882,-0.171;4.458,-1.1715,-0.064;5.1489,-0.0405,-0.0296;4.4593,1.098,0.0635;3.0548,1.1769,0.0938;2.3798,2.497,0.0984;1.1669,2.6365,0.1831;3.2559,3.706,-0.0448;2.5428,4.8498,-0.1679;3.1577,5.6002,-0.2102;4.5985,3.6236,-0.0497;5.2116,4.5189,-0.1312;5.3135,2.3406,0.1385;6.1586,2.2293,-0.5423;6.1598,2.3779,1.8098;2.3238,-0.0443,0.0711;0.8194,-0.1203,0.1152;0.4917,-1.1641,0.0837;0.3693,0.4218,-0.7215;0.4252,0.3526,1.0189;3.0556,-1.2226,-0.0012;2.5391,-2.1797,-0.0199)|\",4.37014843903\r\n\"[H]C1=C([H])C(C([H])([H])[C@@]2([H])C(C([H])([H])[H])N=C(C([H])([H])[H])NC2=O)=C([H])C([H])=C1F 
|(3.9731,-2.1652,-4.2891;4.4757,-1.5058,-3.5892;3.7733,-0.5718,-2.8269;2.6946,-0.5013,-2.9462;4.4305,0.2809,-1.9295;3.6577,1.2916,-1.1082;2.7183,1.5418,-1.6134;4.2294,2.2208,-1.0144;3.326,0.8587,0.3647;2.7668,1.698,0.7983;2.4688,-0.3809,0.4515;1.0293,-0.2941,0.0296;0.5096,0.4877,0.5983;0.5345,-1.2532,0.1927;0.9438,-0.0222,-1.0298;2.9119,-1.5041,0.8991;4.2637,-1.5515,1.3238;4.7355,-2.9437,1.6105;4.1043,-3.3963,2.3847;5.7778,-2.9393,1.9311;4.6188,-3.5629,0.7126;5.0711,-0.5565,1.5097;4.6195,0.7276,1.1724;5.2647,1.72,1.4567;5.8263,0.1803,-1.8171;6.3584,0.8353,-1.1327;6.5458,-0.7463,-2.5695;7.6248,-0.8273,-2.4899;5.8554,-1.5773,-3.4446;6.5453,-2.4752,-4.179)|\",4.266745175840001\r\n\"[H]C(NNC([H])C1=C([H])C([H])=C([H])C([H])=C1[H])C1=C([H])C([H])=C([H])O1 |(2.4433,1.9904,1.4683;1.7453,2.1813,0.6484;1.304,1.2236,-0.0977;1.8254,-0.0087,0.286;1.3896,-0.9725,-0.4512;0.6901,-0.7634,-1.2677;1.7967,-2.3603,-0.2416;1.2668,-3.3626,-1.071;0.5634,-3.0893,-1.8542;1.6338,-4.696,-0.8969;1.2158,-5.4618,-1.5445;2.5369,-5.044,0.1088;2.8243,-6.0829,0.2467;3.0715,-4.0526,0.9403;3.7748,-4.3232,1.7233;2.7063,-2.7222,0.77;3.111,-1.9438,1.4082;1.3351,3.5423,0.4295;0.5,4.1603,-0.4706;-0.0523,3.6626,-1.2543;0.5192,5.5487,-0.156;-0.0199,6.3423,-0.6543;1.362,5.6849,0.9112;1.6908,6.5313,1.4951;1.8667,4.4819,1.2812)|\",3.8259207380300007\r\n\"[H]SC1=N[C@]([H])(C([H])([H])[H])[C@@]([H])(N([H])[H])C(S[H])=N1 |(4.2231,-3.3568,-1.1938;4.5508,-2.4959,-2.1794;3.7185,-1.1017,-1.4428;3.0708,-1.2073,-0.3479;2.3984,0.0104,0.118;2.3912,-0.032,1.216;0.9404,-0.0023,-0.3694;0.376,0.8301,0.0678;0.4595,-0.9403,-0.0783;0.8879,0.0788,-1.4609;3.1463,1.3088,-0.2788;2.4438,2.147,-0.2065;4.3231,1.6661,0.5217;4.9767,0.8836,0.5415;4.0388,1.8295,1.4869;3.6094,1.1687,-1.7271;3.8285,2.6066,-2.7342;3.6046,3.502,-1.7507;3.8921,0.0373,-2.2624)|\",4.824578569365\r\n\"[H]SC1=N[C@]([H])(C([H])([H])[H])[C@@]([H])(C([H])([H])[H])C(S[H])=N1 
|(4.5939,1.2096,-4.1424;4.7178,-0.0795,-3.7647;3.9788,0.1633,-2.1603;3.616,1.3166,-1.7528;3.0389,1.3718,-0.4031;2.3053,2.1879,-0.4106;4.1414,1.733,0.604;3.7162,1.907,1.5998;4.6543,2.6438,0.2817;4.8874,0.9341,0.6796;2.2803,0.0726,-0.0308;2.1298,0.0507,1.0551;0.9055,-0.022,-0.7244;0.2688,0.808,-0.3987;0.396,-0.9603,-0.4818;1.0118,0.0371,-1.8119;3.1513,-1.1007,-0.4472;3.1008,-2.6358,0.4436;2.2551,-2.225,1.4122;3.9257,-1.0689,-1.469)|\",4.94702980209\r\n\"[H]C([H])([H])C1=NC(C2([H])C([H])([H])C2([H])[H])=C(C2C([H])([H])C2([H])[H])C(=O)N1 |(1.4849,0.9368,-1.6675;1.594,0.4625,-0.6848;1.2405,1.1916,0.0543;0.9861,-0.4416,-0.6367;3.0394,0.1409,-0.4321;3.8722,1.2625,-0.4875;5.1482,1.0927,-0.2829;6.0266,2.282,-0.3412;7.078,2.1021,-0.1578;5.4717,3.6261,0.1224;4.4487,3.6109,0.4843;6.1537,4.2767,0.6626;5.6884,3.3938,-1.3319;6.5248,3.8797,-1.8265;4.8119,3.2217,-1.9484;5.6938,-0.2399,-0.0039;6.9948,-0.4968,0.21;8.3988,-0.0768,0.334;8.936,0.2112,-0.569;8.6975,0.4809,1.221;7.9684,-1.5402,0.4975;7.9627,-1.9802,1.4933;8.2012,-2.2507,-0.2939;4.7177,-1.3888,0.0376;5.0828,-2.5321,0.2644;3.3688,-1.0938,-0.1955)|\",3.610950796135001\r\n\"[H]/N=C(\\C1=C([H])C(F)=C(C([H])([H])[H])C([H])=C1F)N([H])O[H] |(6.9832,-1.348,1.3726;7.5216,-0.7548,0.7455;6.777,0.0087,0.0228;5.2842,0.0335,0.0022;4.57,-1.1613,0.1847;5.0936,-2.1052,0.2919;3.1859,-1.1502,0.2016;2.5295,-2.3196,0.3703;2.427,0.014,0.044;0.9225,-0.0435,0.0646;0.4908,0.9568,-0.0256;0.5575,-0.4949,0.9939;0.5406,-0.6584,-0.7588;3.1377,1.2026,-0.1387;2.6184,2.1475,-0.2633;4.5264,1.196,-0.1598;5.1559,2.3878,-0.3387;7.4111,0.8141,-0.9038;7.0266,1.7461,-1.0115;8.8052,0.9213,-0.7172;8.9591,0.2474,-0.0109)|\",4.585118380925\r\n\"[H]C1=NC([H])=C(/C([H])=C(/C#N)C2=NC3=C(C([H])=C([H])C([H])=C3[H])N2[H])N1[H] 
|(-2.8326,-5.2509,-1.2952;-3.5702,-4.465,-1.2017;-4.8844,-4.6325,-1.3642;-5.4135,-3.4039,-1.159;-6.4795,-3.2236,-1.2258;-4.4155,-2.464,-0.8654;-4.5737,-1.081,-0.5896;-5.6131,-0.7641,-0.6256;-3.6776,-0.0786,-0.2881;-4.2181,1.2288,-0.0646;-4.5912,2.316,0.129;-2.2234,-0.154,-0.1591;-1.4451,-1.2179,-0.3049;-0.1516,-0.7782,-0.0844;-0.1579,0.6071,0.2062;1.0106,1.322,0.4745;0.9943,2.3848,0.6965;2.2029,0.6045,0.4439;3.1369,1.1204,0.6465;2.2267,-0.776,0.1563;3.1802,-1.2959,0.1434;1.0594,-1.4824,-0.1098;1.0763,-2.5451,-0.3314;-1.4887,0.9709,0.1506;-1.8783,1.8917,0.3035;-3.2389,-3.1944,-0.9041;-2.3232,-2.7482,-0.722)|\",3.5510857490250007\r\n\"[H]O[C@@]([H])(C1=C(C([H])=C([H])[H])C(OC([H])([H])[H])=C([H])C([H])=C1[H])[C@]([H])(O[H])C([H])=C([H])[H] |(4.5396,1.9205,-3.3733;4.3467,0.9758,-3.256;3.8147,0.877,-1.9404;2.9164,1.5088,-1.8775;3.4472,-0.5676,-1.6459;2.6197,-0.9013,-0.5434;2.0114,0.165,0.276;1.9007,1.1176,-0.238;1.5884,0.1307,1.5484;1.1614,1.0236,1.9975;1.6544,-0.751,2.1721;2.3688,-2.2799,-0.3026;1.5705,-2.5751,0.7659;1.1912,-3.9235,0.9869;0.648,-4.3353,0.1263;0.531,-3.9064,1.8562;2.0585,-4.5605,1.2045;2.9053,-3.2683,-1.1313;2.7018,-4.3155,-0.9424;3.7097,-2.9064,-2.2102;4.124,-3.6794,-2.8521;3.9827,-1.5683,-2.4657;4.6032,-1.2809,-3.3054;4.8282,1.4847,-0.9252;4.3206,1.5532,0.0458;5.097,2.8346,-1.3281;5.9451,2.8157,-1.8007;6.0938,0.6776,-0.7639;5.9803,-0.4011,-0.8369;7.298,1.2026,-0.5304;8.1774,0.5746,-0.4217;7.453,2.2732,-0.4138)|\",5.015058264715\r\n\"[H]O[C@]1([H])[C@]([H])(C([H])([H])[H])C([H])([H])C([H])([H])C([H])([H])[C@]1([H])OC(=O)C([H])(Cl)Cl 
|(1.6884,0.696,2.2171;1.572,0.0653,1.488;2.4842,0.4413,0.462;2.2266,1.4424,0.0784;2.3762,-0.5825,-0.6832;3.0513,-0.2262,-1.4774;0.9528,-0.6345,-1.2511;0.9002,-1.3266,-2.0994;0.6296,0.3527,-1.6025;0.2429,-0.9686,-0.4887;2.8738,-1.9664,-0.2229;2.1888,-2.3519,0.5425;2.8349,-2.6656,-1.0676;4.2999,-1.9049,0.3447;5.0051,-1.6404,-0.4569;4.607,-2.8904,0.7144;4.4113,-0.875,1.4818;5.4455,-0.7884,1.8319;3.7911,-1.186,2.33;3.9193,0.4866,1.0003;4.5945,0.8993,0.245;3.8821,1.4313,2.1223;4.9937,2.1091,2.4021;6.0493,2.0413,1.8159;4.8163,3.0443,3.6044;5.7523,3.5726,3.7642;3.5494,4.2793,3.2806;4.4675,2.1031,5.0965)|\",6.17698440635\r\n\"[H]O[C@]([H])(C1=C([H])C([H])=C(C(=O)C([H])([H])[H])C([H])=C1[H])C1=C([H])C([H])=C(C([H])([H])[H])C([H])=C1[H] |(6.4564,-0.1169,-1.5114;6.9879,-0.1238,-0.7006;6.1134,-0.4833,0.369;5.4404,0.3603,0.5979;6.9634,-0.7396,1.6071;8.3598,-0.7259,1.5505;8.8488,-0.5247,0.6048;9.1143,-0.9603,2.7007;10.1978,-0.944,2.6294;8.4905,-1.2098,3.9301;9.2504,-1.4675,5.1938;8.6629,-1.6824,6.2443;10.7711,-1.4594,5.1517;11.1521,-0.4865,4.819;11.1515,-2.2154,4.4544;11.1483,-1.6711,6.1537;7.0862,-1.219,3.9819;6.6086,-1.4118,4.937;6.3359,-0.9871,2.8382;5.25,-1.0063,2.8933;5.2472,-1.6795,-0.0044;3.8539,-1.6158,0.0797;3.3767,-0.7013,0.4266;3.0663,-2.7141,-0.2719;1.9835,-2.6434,-0.1944;3.6482,-3.9013,-0.729;2.7977,-5.0797,-1.142;1.8355,-5.0828,-0.6194;3.3005,-6.0299,-0.9324;2.5835,-5.0578,-2.2191;5.0483,-3.9574,-0.8146;5.5247,-4.8725,-1.16;5.8368,-2.8673,-0.4595;6.9193,-2.9315,-0.5281)|\",5.13206722043\r\n\"[H]C1=C([H])[C@]2([H])N(C(=O)C1([H])[H])C([H])([H])C([H])([H])[C@]2([H])C(=O)N1C([H])([H])C([H])([H])C([H])([H])C1([H])[H] 
|(-2.5146,0.2061,-3.6326;-2.7609,1.2249,-3.3379;-2.6137,1.6052,-2.069;-2.272,0.9031,-1.3116;-2.9832,2.9766,-1.5909;-3.8678,2.8785,-0.9344;-3.2994,3.9071,-2.6732;-3.3806,3.6147,-4.0043;-3.5354,4.4882,-4.8508;-3.2788,2.145,-4.4045;-4.2708,1.8158,-4.7495;-2.6413,2.1178,-5.2964;-3.1183,5.3029,-2.2516;-2.4623,5.7959,-2.9706;-4.0751,5.837,-2.2363;-2.4828,5.2024,-0.8481;-3.2392,5.3469,-0.0683;-1.704,5.9552,-0.7065;-1.9268,3.7622,-0.7484;-1.9317,3.4043,0.2849;-0.5296,3.6705,-1.3626;-0.2312,4.322,-2.3639;0.3792,2.8426,-0.7686;1.7414,2.7525,-1.3208;1.7278,2.1857,-2.2611;2.1177,3.7524,-1.5516;2.5295,2.0277,-0.222;3.3781,1.4629,-0.6188;2.9199,2.7519,0.5032;1.468,1.1362,0.4432;1.7403,0.8059,1.4501;1.2907,0.2429,-0.1679;0.2119,2.0228,0.4402;0.1736,2.6516,1.3422;-0.708,1.431,0.4055)|\",6.386512071235\r\n\"[H]O[C@]1([H])/C([H])=C2/C3=C(N=C(OC([H])([H])[H])C([H])=C3[H])C([H])([H])[C@@]2([H])C([H])([H])[C@@]1([H])C([H])([H])[H] |(5.1259,2.6561,-1.3393;4.4228,2.8106,-0.6899;3.3295,3.4352,-1.3787;3.7292,4.2667,-1.9914;2.4242,4.0347,-0.3349;2.9201,4.6278,0.4315;1.0896,3.927,-0.3804;0.0646,4.5823,0.4359;-1.165,4.529,-0.2345;-2.3069,5.0507,0.2262;-2.2418,5.6554,1.4119;-3.3612,6.1944,1.9449;-4.5675,6.0756,1.1871;-5.3341,6.5659,1.7896;-4.8269,5.0256,1.0208;-4.473,6.5693,0.215;-1.0621,5.7794,2.1765;-1.1026,6.2875,3.1336;0.1097,5.2354,1.6765;1.0353,5.3075,2.2419;-1.0223,3.8434,-1.578;-1.8509,3.1605,-1.7909;-1.0115,4.5967,-2.3799;0.352,3.1344,-1.4577;0.1611,2.1295,-1.0478;1.2115,2.9784,-2.7118;0.738,2.2941,-3.4285;1.3109,3.9499,-3.2188;2.6056,2.4466,-2.3216;2.4639,1.5258,-1.7373;3.4589,2.1134,-3.5509;4.4385,1.7094,-3.2667;2.97,1.3595,-4.1784;3.6277,3.0042,-4.1701)|\",4.781040353285\r\n\"[H]C(=O)[C@]1(C([H])([H])[H])[C@]([H])(C([H])([H])[H])C([H])([H])/C([H])=C2/C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])C([H])([H])C([H])([H])[C@]21[H] 
|(1.3927,0.4198,-2.7166;2.3852,0.0127,-2.4141;2.8961,-0.8746,-3.0619;2.9928,0.638,-1.1606;4.4925,0.2822,-1.1128;4.6248,-0.7796,-0.8833;5.0159,0.8702,-0.3537;4.9737,0.4591,-2.0775;2.2748,0.0077,0.0854;2.5155,-1.0636,0.0767;0.7399,0.1344,0.0853;0.3326,-0.3567,0.9768;0.2803,-0.3456,-0.7851;0.4107,1.1786,0.1113;2.8724,0.632,1.359;2.26,0.34,2.224;3.871,0.2176,1.5604;2.9344,2.1362,1.2901;3.0269,2.6438,2.2465;2.8397,2.8573,0.1629;2.7232,4.3949,0.179;1.2313,4.7736,-0.0132;1.1095,5.8643,-0.0092;0.6247,4.3618,0.801;0.8212,4.3984,-0.9559;3.1975,5.016,1.5063;3.1662,6.1096,1.4342;4.2271,4.7231,1.7413;2.561,4.7298,2.3506;3.5739,5.0201,-0.9591;3.3708,6.0987,-0.9993;4.6369,4.9132,-0.6976;3.3429,4.3813,-2.3305;2.3154,4.563,-2.6736;4.0019,4.8481,-3.0736;3.6148,2.878,-2.2616;4.6724,2.7277,-2.0127;3.454,2.4099,-3.2422;2.7289,2.1781,-1.2037;1.6888,2.3117,-1.5485)|\",5.787861600135001\r\n\"[H]C1=C([H])[C@@]2([H])OC(C([H])([H])[H])(C([H])([H])[H])O[C@]2([H])[C@]([H])(OC(=O)C([H])([H])[H])C1([H])[H] |(6.9853,0.6,1.9642;6.3942,0.4123,1.069;6.4899,1.2477,0.0263;7.132,2.1244,0.0471;5.6022,0.9995,-1.1534;4.6379,1.5091,-0.9985;6.1084,1.3272,-2.4456;5.3568,0.5083,-3.3712;4.2385,1.3187,-4.0165;4.6622,2.1213,-4.6285;3.5998,1.7613,-3.2471;3.6224,0.6768,-4.6543;6.3416,-0.0751,-4.3765;6.8645,0.7301,-4.9024;5.8162,-0.6938,-5.1104;7.084,-0.6902,-3.8601;4.7456,-0.5539,-2.5915;5.3596,-0.4887,-1.3167;6.3407,-0.9918,-1.3437;4.6212,-1.0639,-0.1281;4.3997,-2.1272,-0.239;3.3628,-0.3582,-0.0006;2.3737,-0.991,0.6771;2.5078,-2.0667,1.2183;1.1006,-0.179,0.6412;0.3683,-0.6196,1.3187;0.6987,-0.1748,-0.3781;1.2985,0.8605,0.9192;5.5239,-0.8327,1.113;4.8891,-0.8268,2.0066;6.1768,-1.7098,1.2312)|\",6.9008072486800005\r\n\"[H]OC([H])([H])C#C/C(=N\\OC([H])([H])C([H])=C([H])[H])C1=C([H])C([H])=C([H])C([H])=C1[H] 
|(9.2508,1.6106,1.6138;8.4564,1.8046,2.1354;7.6175,0.6567,2.0783;8.0639,-0.1965,2.6154;6.7051,0.9293,2.6199;7.274,0.2418,0.7133;6.9955,-0.0955,-0.4178;6.6388,-0.4852,-1.7483;5.4588,-0.9341,-2.0517;4.6108,-0.9784,-0.9606;3.3411,-1.5225,-1.3655;3.0431,-1.0337,-2.3023;3.4385,-2.5995,-1.5524;2.3585,-1.2408,-0.2707;2.2682,-0.1949,0.0202;1.6044,-2.1715,0.3125;0.8754,-1.919,1.0774;1.6825,-3.223,0.0436;7.6366,-0.3802,-2.8447;7.2946,-0.7253,-4.1641;6.2873,-1.0669,-4.3744;8.2363,-0.6286,-5.1831;7.957,-0.8988,-6.198;9.5346,-0.1865,-4.9088;10.2672,-0.1117,-5.708;9.8813,0.159,-3.6033;10.8866,0.5052,-3.3783;8.9398,0.0631,-2.5784;9.2125,0.3326,-1.5632)|\",4.48987853325\r\n\"[H]C1=C([H])[C@]([H])(N([H])[H])[C@]([H])(C(=O)N(OC([H])([H])[H])C([H])([H])[H])C1([H])[H] |(8.2901,-0.3144,-2.6463;7.3284,-0.6295,-2.2495;7.0099,-1.882,-1.9221;7.6504,-2.7523,-2.0299;5.588,-2.0274,-1.4333;5.5545,-2.4694,-0.4308;4.8425,-2.9455,-2.2959;4.7141,-2.5378,-3.2211;3.9183,-3.1205,-1.9054;5.1017,-0.51,-1.3379;5.0472,-0.2452,-0.2789;3.7319,-0.3253,-1.9721;3.5639,0.0723,-3.1184;2.6286,-0.612,-1.176;2.8746,-1.4929,-0.0961;2.5767,-0.8518,1.1462;3.2239,0.0173,1.3105;2.765,-1.6089,1.9116;1.5268,-0.538,1.1919;1.327,-0.8433,-1.7863;1.2094,-0.1264,-2.5982;0.5443,-0.6862,-1.0395;1.2464,-1.8621,-2.1871;6.1925,0.3398,-2.0402;5.8264,0.7406,-2.9931;6.4862,1.2008,-1.4253)|\",6.37834865572\r\n\"[H]C1=C([H])C([H])=C([C@@]2(C(=O)C([H])([H])[H])C(=O)O[C@]([H])(C([H])([H])C([H])([H])[H])C2([H])[H])S1 
|(4.9538,-2.8249,-6.5489;4.6182,-3.0065,-5.5367;5.2145,-3.7551,-4.5626;6.1495,-4.286,-4.7047;4.4806,-3.7592,-3.3371;4.7842,-4.3099,-2.4545;3.3353,-3.0055,-3.3835;2.2719,-2.8033,-2.3191;1.0156,-3.6999,-2.5984;-0.0954,-3.214,-2.5488;1.2531,-5.1575,-2.9136;1.7289,-5.2596,-3.8953;0.2953,-5.6811,-2.9176;1.9322,-5.5933,-2.1743;2.8801,-3.1805,-0.9404;3.1061,-4.2878,-0.5165;3.1455,-2.0636,-0.2347;2.8576,-0.8595,-1.0063;3.8117,-0.5412,-1.4462;2.3257,0.2153,-0.0695;2.0359,1.0755,-0.6885;1.4065,-0.1599,0.3984;3.3293,0.6503,1.003;2.8994,1.4259,1.6455;4.2418,1.0579,0.5512;3.6159,-0.1953,1.6358;1.883,-1.331,-2.0894;1.9438,-0.7184,-2.9917;0.8522,-1.2964,-1.7249;3.1439,-2.2849,-4.9729)|\",5.417786763455\r\n\"[H]OC(C([H])([H])[H])(C([H])([H])[H])[C@@]1([H])C([H])=C([H])C(=O)C([H])([H])[C@@]1([H])OC([H])([H])[H] |(0.3654,-0.1366,0.6445;0.8914,0.0215,-0.1557;2.2761,-0.194,0.1923;2.5115,-1.6984,0.3874;3.5457,-1.9116,0.6753;2.2723,-2.2518,-0.5272;1.8604,-2.0743,1.1862;2.599,0.5745,1.4813;3.6489,0.4613,1.7542;1.9802,0.1914,2.3045;2.3854,1.6417,1.3677;3.0246,0.3344,-1.0705;2.4944,-0.1546,-1.9046;2.8377,1.822,-1.256;1.8946,2.2263,-0.8968;3.731,2.6283,-1.8523;3.5616,3.6974,-1.9507;4.9889,2.108,-2.4196;5.8532,2.8368,-2.8836;5.1148,0.5877,-2.4539;6.1675,0.3257,-2.5899;4.5808,0.2281,-3.3462;4.5024,-0.0845,-1.2143;4.5568,-1.1776,-1.3394;5.1986,0.2742,-0.0238;6.4974,-0.2769,0.0972;6.8359,-0.0595,1.1135;7.211,0.1667,-0.6104;6.4874,-1.3685,-0.0507)|\",5.151115189965\r\n\"[H]C1=C(OC([H])([H])[H])[C@]([H])(C(=O)N2C(=O)OC([H])([H])C2([H])[H])C([H])([H])C([H])([H])C1=O 
|(2.8496,2.4334,-0.411;3.3166,2.1204,0.5159;3.5704,0.8211,0.7918;3.2614,-0.2319,0.0047;2.5365,0.0142,-1.1994;2.3593,-0.9654,-1.6453;3.1218,0.6336,-1.8889;1.5818,0.5072,-0.9848;4.2673,0.3506,2.0468;4.9765,-0.4326,1.7727;3.2343,-0.2356,3.0124;2.0938,0.1835,3.1074;3.6235,-1.2977,3.8394;4.8982,-1.8601,4.0201;5.9378,-1.5358,3.5042;4.796,-2.8667,4.9241;3.414,-3.1189,5.2484;3.0934,-4.0153,4.7075;3.3469,-3.3,6.3221;2.6628,-1.8619,4.7887;2.4724,-1.1552,5.6039;1.7141,-2.0775,4.2948;5.0465,1.5114,2.7139;5.4166,1.1952,3.694;5.9315,1.708,2.0977;4.1963,2.7814,2.8167;4.7724,3.621,3.2166;3.3463,2.6178,3.4937;3.6227,3.1981,1.4651;3.3977,4.3716,1.1977)|\",5.259960730165\r\n\"[H]C1=C(/C([H])=C(\\[H])OC([H])([H])C([H])([H])Cl)C2=C([H])/C([H])=C([H])\\C([H])=C\\2C([H])([H])C1([H])[H] |(3.2923,-0.9352,-0.3064;2.3309,-0.4284,-0.273;1.2034,-1.1515,-0.0918;1.2278,-2.6229,0.0213;0.5874,-3.0722,0.7762;1.9924,-3.3965,-0.7669;2.5913,-2.9836,-1.573;2.1447,-4.748,-0.7234;1.4482,-5.4577,0.2942;1.7309,-5.083,1.2865;0.3635,-5.3374,0.1749;1.846,-6.9173,0.1325;2.9196,-7.0499,0.2726;1.5598,-7.2953,-0.85;1.0035,-7.9279,1.3736;-0.1054,-0.4525,-0.0193;-1.3185,-1.136,-0.1886;-1.3035,-2.2031,-0.39;-2.539,-0.463,-0.1288;-3.4667,-1.012,-0.2669;-2.5623,0.9123,0.0974;-3.5079,1.4462,0.1416;-1.3599,1.6057,0.2583;-1.3745,2.68,0.4308;-0.1335,0.9441,0.2027;1.1751,1.6758,0.4124;1.0552,2.7434,0.195;1.4607,1.6009,1.4736;2.2997,1.0699,-0.4384;3.2638,1.5068,-0.1536;2.1452,1.3291,-1.4991)|\",4.72389644468\r\n\"[H]C1=C(/C([H])=C(\\[H])OC([H])([H])/C([H])=C(\\[H])C([H])([H])[H])C([H])([H])C([H])([H])C([H])([H])C1([H])[H] 
|(10.2279,-2.9252,2.1321;10.1492,-1.8404,2.2062;8.9199,-1.2872,2.2791;7.7348,-2.1407,2.3027;7.8974,-3.215,2.2333;6.4653,-1.7154,2.4129;6.1936,-0.6645,2.5069;5.418,-2.5881,2.3837;4.1638,-2.0641,2.8369;4.1843,-1.9221,3.9271;4.0027,-1.0767,2.3729;3.0745,-3.0124,2.4435;3.0076,-3.2399,1.3796;2.2079,-3.5524,3.3033;2.3063,-3.3149,4.3644;1.0811,-4.4726,2.9349;1.1695,-5.4327,3.4605;1.0593,-4.6734,1.8588;0.1104,-4.0467,3.2231;8.746,0.2202,2.3612;7.9297,0.5317,1.6959;8.4215,0.4908,3.3782;10.0218,0.9946,1.9974;10.1634,0.9692,0.9078;9.9075,2.0497,2.2754;11.2514,0.3829,2.6764;12.1503,0.9726,2.4581;11.1115,0.4076,3.7662;11.4438,-1.0703,2.2205;12.169,-1.5798,2.8709;11.895,-1.0883,1.2141)|\",5.123903804915001\r\n\"[H]OC(C([H])([H])[H])(C([H])([H])[H])[C@]1([H])C([H])([H])C([H])([H])C(C([H])=C([H])[H])=C([H])[C@]1([H])O[H] |(5.6401,5.4384,0.3395;4.9785,5.0134,-0.2296;5.5669,3.7883,-0.7147;5.7784,2.8546,0.4922;6.2569,1.9077,0.2195;4.8423,2.624,1.0022;6.4444,3.3477,1.2127;6.9249,4.1251,-1.3566;7.3587,3.255,-1.8552;7.6334,4.4824,-0.5965;6.7954,4.9172,-2.1017;4.5722,3.2883,-1.8162;4.8182,3.8933,-2.698;3.0785,3.5497,-1.5175;2.9341,4.6016,-1.2651;2.5272,3.3664,-2.4506;2.4803,2.6592,-0.4186;1.3846,2.7129,-0.4688;2.747,3.0469,0.5743;2.9142,1.2134,-0.5334;2.2536,0.1915,0.2831;2.6145,-0.8267,0.1342;1.28,0.4024,1.1807;0.8539,-0.4189,1.7492;0.8747,1.3899,1.3821;3.91,0.86,-1.3707;4.1908,-0.1896,-1.4643;4.6868,1.803,-2.2566;4.2537,1.7539,-3.2663;6.0245,1.3264,-2.4607;6.3782,1.067,-1.596)|\",5.32254691578\r\n\"[H]OC(C([H])([H])[H])(C([H])([H])[H])[C@@]1([H])C(OC([H])([H])[H])=C([H])C(=O)C([H])([H])C1([H])[H] 
|(3.7924,0.3188,1.683;3.9148,0.4255,0.7261;2.704,-0.0065,0.0857;1.5201,0.8054,0.6394;0.5814,0.5416,0.1398;1.6825,1.8771,0.5173;1.3973,0.5919,1.7096;2.4703,-1.499,0.3811;1.634,-1.9026,-0.2018;2.2261,-1.6334,1.4425;3.3631,-2.0921,0.1689;2.9044,0.236,-1.4521;1.9274,0.0246,-1.9059;3.2338,1.6764,-1.7899;2.1247,2.4559,-1.8019;2.2709,3.8418,-2.1041;1.2667,4.2649,-2.0526;2.6816,3.9814,-3.1109;2.9225,4.3375,-1.3753;4.4848,2.1117,-2.0706;4.6972,3.1486,-2.3049;5.651,1.2212,-2.0762;6.7803,1.6512,-2.2811;5.3995,-0.2614,-1.8241;6.1157,-0.8281,-2.4283;5.6307,-0.4548,-0.7708;3.955,-0.6708,-2.1357;3.7864,-1.7193,-1.8741;3.7904,-0.5972,-3.2187)|\",5.31710463877\r\n\"[H]OC([H])([H])[C@@]1(C([H])([H])[H])C([H])([H])C([H])([H])C([H])([H])[C@@]([H])(C([H])([H])C([H])([H])[H])[C@@]2([H])[C@]1([H])C([H])=C([H])[C@]2([H])C([H])([H])[H] |(6.2761,3.2594,-2.0102;7.1502,3.5228,-1.6856;7.1113,3.5162,-0.2616;6.7411,2.5518,0.1154;8.1569,3.6005,0.0538;6.3047,4.6843,0.3809;6.3589,4.4371,1.8999;7.4007,4.3407,2.2331;5.8366,3.512,2.1678;5.9118,5.2492,2.4771;7.0473,5.9822,-0.0239;8.1036,5.8591,0.2567;7.0388,6.0384,-1.1202;6.5709,7.321,0.5492;6.7217,7.3431,1.638;7.2302,8.1007,0.1436;5.1161,7.7011,0.2472;4.9243,7.608,-0.8319;4.993,8.7671,0.4813;4.0665,6.8985,1.0529;4.5188,6.6616,2.0254;2.8504,7.8007,1.3899;2.1851,7.281,2.089;3.2391,8.665,1.9464;2.0376,8.3206,0.197;1.25,9.0027,0.5376;1.5507,7.5123,-0.359;2.6686,8.8737,-0.5083;3.6801,5.565,0.3698;3.0916,5.8258,-0.521;4.8425,4.6486,-0.2077;4.9727,4.9115,-1.2673;4.2188,3.271,-0.0882;4.628,2.3884,-0.5745;3.1558,3.2447,0.7137;2.584,2.3539,0.9653;2.7664,4.614,1.2173;3.0118,4.707,2.2886;1.2495,4.8275,1.0569;0.6973,4.0475,1.5958;0.9604,4.7661,0.0006;0.9123,5.7919,1.4451)|\",7.012373927385\r\n\"[H]C1=C([H])C([H])=C(C([H])([H])O[C@]2([H])C([H])([H])C([H])([H])[C@]3([H])OC([H])([H])[C@@]2([H])O3)C([H])=C1[H] 
|(-1.6696,-2.7616,-4.8969;-0.7808,-2.3354,-4.4392;0.4759,-2.8874,-4.6969;0.5678,-3.7461,-5.357;1.618,-2.3417,-4.1109;2.5946,-2.7742,-4.3033;1.5157,-1.2407,-3.2518;2.7516,-0.6164,-2.6441;2.4912,-0.0714,-1.7219;3.1739,0.1264,-3.3422;3.707,-1.6273,-2.3644;4.941,-1.1516,-1.8307;4.7258,-0.3167,-1.1463;5.9241,-0.6591,-2.9114;5.3974,-0.0432,-3.6487;6.6488,-0.0051,-2.4148;6.6876,-1.8066,-3.5964;7.5719,-1.4042,-4.1044;6.0756,-2.308,-4.3521;7.1024,-2.8582,-2.5397;8.1175,-3.2385,-2.681;6.1946,-3.9596,-2.5472;5.2457,-3.7223,-1.5005;4.2295,-3.8095,-1.8855;5.4024,-4.449,-0.693;5.5988,-2.3086,-1.0133;5.4332,-2.1794,0.0578;7.0182,-2.2952,-1.2381;0.2528,-0.6983,-2.9909;0.1605,0.1496,-2.3152;-0.8896,-1.2373,-3.5849;-1.8639,-0.8054,-3.3716)|\",6.451819395355\r\n\"[H]O[C@@]1(C([H])([H])[H])C([H])([H])C([H])([H])C([H])([H])[C@@]([H])(C([H])(C([H])([H])[H])C([H])([H])[H])[C@@]2([H])[C@]1([H])C([H])=C([H])[C@]2([H])C([H])([H])[H] |(1.354,-2.6306,-5.1551;1.8937,-1.8905,-4.8338;3.2556,-2.1566,-5.245;3.7202,-3.4576,-4.575;4.7437,-3.7287,-4.8501;3.0697,-4.2858,-4.8884;3.6583,-3.3725,-3.4877;3.2413,-2.2979,-6.7746;2.5619,-3.1283,-7.0246;2.7832,-1.3903,-7.191;4.5861,-2.5365,-7.4591;5.046,-3.4703,-7.1067;4.3887,-2.6903,-8.5286;5.5859,-1.3814,-7.3244;5.0657,-0.4296,-7.5112;6.3051,-1.4848,-8.145;6.4061,-1.2544,-6.012;6.7918,-2.2579,-5.7747;7.6515,-0.3599,-6.4239;7.2917,0.2649,-7.2538;8.2685,0.6326,-5.4246;8.97,1.2806,-5.9648;8.8342,0.1382,-4.6317;7.5266,1.2865,-4.9568;8.7668,-1.261,-6.9839;9.574,-0.6668,-7.4292;8.3939,-1.944,-7.7553;9.2078,-1.8719,-6.1854;5.5844,-0.768,-4.7773;5.7373,0.3155,-4.6916;4.0061,-0.8682,-4.7649;3.5825,-0.0576,-5.3718;3.724,-0.6511,-3.2895;2.7424,-0.3704,-2.9254;4.7838,-0.9523,-2.5394;4.8158,-0.9335,-1.4522;5.9836,-1.3512,-3.3758;6.0175,-2.4505,-3.4507;7.3073,-0.9213,-2.7349;7.3612,-1.3162,-1.7124;7.3986,0.1681,-2.6717;8.1742,-1.3126,-3.2753)|\",7.02597961991\r\n\"[H]C([H])=C([H])C([H])([H])S[C@]1([H])C2=C(C([H])=C([H])C([H])=C2[H])C([H])([H])C1([H])[H] 
|(0.5405,3.8026,-0.6921;1.5376,3.9789,-0.2983;2.0666,4.8486,-0.6825;2.0838,3.1721,0.6128;1.5329,2.3016,0.9674;3.4487,3.3638,1.2032;3.3985,3.3536,2.2981;3.881,4.3151,0.8774;4.5395,1.9619,0.6849;5.9285,2.039,1.9226;6.4909,1.1379,1.6547;5.4759,2.0149,3.3649;5.7608,3.2304,4.0011;5.4262,3.4162,5.3432;5.6477,4.3564,5.8433;4.8063,2.3772,6.0412;4.5472,2.5096,7.0884;4.517,1.1657,5.4019;4.0362,0.3643,5.9565;4.8461,0.9799,4.0581;4.6125,0.0446,3.5559;6.4183,4.2011,3.0427;5.7075,4.9811,2.7355;7.2763,4.7195,3.486;6.83,3.305,1.8469;7.874,2.9929,1.9716;6.7546,3.8098,0.8808)|\",5.488536364585\r\n\"[H]C(=O)C1=C(F)C(C([H])([H])[H])=C([H])C(C([H])=O)=C1OC([H])([H])[H] |(2.9054,-3.5388,0.7078;3.6782,-2.7505,0.7621;4.8284,-3.0408,1.0368;3.1644,-1.3849,0.4976;1.8295,-1.247,0.0861;1.1148,-2.3706,-0.1398;1.168,-0.0318,-0.1069;-0.2701,-0.0024,-0.5576;-0.6215,1.0282,-0.6529;-0.9218,-0.5228,0.1535;-0.3949,-0.4987,-1.5268;1.9147,1.111,0.1531;1.4715,2.0962,0.0352;3.2551,1.056,0.5672;3.9702,2.3298,0.8275;5.0099,2.2295,1.186;3.4613,3.4256,0.6688;3.8922,-0.1871,0.7306;5.1731,-0.1927,1.1882;6.2079,-0.5279,0.2478;6.1878,0.1674,-0.6005;6.1077,-1.5602,-0.0902;7.1449,-0.4123,0.7947)|\",4.70756961365\r\n\"[H]C(=O)C1=C(OC([H])([H])[H])C([H])=C([H])C(C([H])([H])[H])=C1F |(5.9016,-2.4003,-0.1704;4.8018,-2.5105,-0.1566;4.2894,-3.6147,-0.1781;4.0687,-1.2225,-0.1117;4.7805,0.007,-0.093;6.1372,-0.0911,-0.1196;6.9111,1.1004,-0.0913;6.7241,1.6776,0.8228;6.7093,1.7295,-0.9674;7.9527,0.7762,-0.1096;4.0919,1.2212,-0.0524;4.6246,2.1639,-0.0387;2.6954,1.2215,-0.0305;2.1718,2.1739,0.0002;1.9462,0.0446,-0.0464;0.4388,0.0246,-0.0205;0.039,1.0429,-0.0194;0.06,-0.4909,0.8698;0.03,-0.5023,-0.8902;2.6686,-1.1502,-0.0872;1.9549,-2.2862,-0.1027)|\",4.770155799265\r\n\"[H]C(=O)C1=C([H])C(F)=C(C([H])=O)C([H])=C1OC([H])([H])[H] 
|(1.0475,4.3986,-0.2426;1.9001,5.0955,-0.164;1.7447,6.2971,-0.2865;3.2243,4.4725,0.097;4.3427,5.3044,0.2076;4.2161,6.3767,0.104;5.5869,4.7496,0.4486;6.6614,5.5597,0.5569;5.7561,3.3676,0.5849;7.0904,2.7702,0.8443;7.9288,3.486,0.9228;7.2717,1.5728,0.9661;4.636,2.5311,0.4736;4.804,1.4665,0.584;3.3716,3.0682,0.232;2.2384,2.3283,0.1116;2.3318,0.914,0.2516;1.315,0.5385,0.1304;2.7098,0.6369,1.243;2.9777,0.4797,-0.5211)|\",3.981025632815\r\n\"[H]C(=O)C1=C([H])C(OC([H])([H])[H])=C([H])C(C([H])=O)=C1F |(5.0058,-2.7596,-4.346;4.3711,-2.7387,-3.4407;3.9728,-3.7674,-2.9292;4.0531,-1.3876,-2.9107;3.2658,-1.2546,-1.7649;2.8963,-2.1507,-1.2771;2.9522,0.0084,-1.2432;2.1812,0.0122,-0.1236;1.8257,1.2679,0.4429;1.2122,1.0361,1.3145;1.2437,1.8751,-0.262;2.7131,1.8299,0.7605;3.4359,1.1514,-1.8836;3.2251,2.1528,-1.5267;4.2289,1.0433,-3.039;4.7284,2.2736,-3.7001;5.3476,2.1158,-4.6024;4.4856,3.3947,-3.2931;4.5248,-0.2255,-3.5333;5.2867,-0.3408,-4.6441)|\",4.247697206305\r\n\"[H]OC(=O)[C@@]1(C2=C([H])C([H])=C([H])C([H])=C2[H])C([H])=C1C([H])([H])OC([H])([H])[H] |(3.698,0.3804,-0.974;2.8451,0.1928,-1.4209;2.9831,-0.7478,-2.3791;2.0118,-1.1939,-2.9471;4.4129,-1.1723,-2.7174;4.6914,-2.6588,-2.7595;5.9482,-3.1191,-3.1849;6.6991,-2.4053,-3.5123;6.2522,-4.4797,-3.1983;7.2322,-4.8075,-3.536;5.3033,-5.4149,-2.7837;5.5365,-6.4762,-2.7937;4.0501,-4.971,-2.3609;3.2976,-5.6877,-2.0422;3.7444,-3.6097,-2.3492;2.7566,-3.2887,-2.0409;5.1808,-0.2377,-3.6342;5.3173,0.0457,-4.6678;5.5326,-0.1932,-2.39;6.3373,0.2992,-1.252;6.9444,1.1701,-1.5433;7.0198,-0.4903,-0.8969;5.4149,0.6474,-0.2227;6.0333,1.1086,0.9681;5.2316,1.3361,1.6732;6.6239,2.0177,0.7839;6.6884,0.3373,1.3983)|\",5.5728916582400005\r\n\"[H]OC([H])([H])[C@]1(C2=C([H])C([H])=C([H])C([H])=C2[H])C([H])=C1C([H])([H])OC([H])([H])[H] 
|(5.671,2.3707,0.0941;5.3468,3.2095,0.4717;4.0635,2.9525,1.0025;3.7117,3.8967,1.4361;3.3474,2.6815,0.2093;4.0574,1.8476,2.082;2.6772,1.3567,2.4542;1.849,0.7495,1.4972;2.2139,0.6173,0.4813;0.5726,0.2988,1.835;-0.0501,-0.1749,1.0801;0.0967,0.4516,3.1389;-0.8981,0.1026,3.403;0.9092,1.054,4.1009;0.5481,1.1784,5.1188;2.1859,1.5014,3.758;2.8161,1.9665,4.5125;5.1814,1.783,3.0859;5.625,2.2176,3.971;5.251,0.9154,2.124;5.9482,-0.1298,1.3458;5.4097,-1.0905,1.4127;6.9691,-0.2898,1.7277;5.984,0.3088,-0.0127;6.6776,-0.5878,-0.8623;6.6351,-0.1689,-1.8701;6.2063,-1.5824,-0.8646;7.7304,-0.6977,-0.5608)|\",6.19875351439\r\n\"[H]OC([H])([H])[C@@]1(C2=C([H])C([H])=C([H])C([H])=C2[H])C(=C([H])[H])[C@@]1([H])C([H])([H])C([H])=C([H])[H] |(6.521,2.2315,0.7068;6.0288,2.1578,1.5411;6.4161,0.9317,2.149;7.4884,0.9422,2.4007;5.8618,0.8735,3.0886;6.1317,-0.2987,1.2805;6.8886,-0.2894,-0.0254;6.3831,0.4126,-1.1312;5.4066,0.885,-1.0583;7.107,0.4816,-2.3238;6.6984,1.0247,-3.172;8.3461,-0.152,-2.4286;8.9078,-0.1033,-3.3576;8.8566,-0.8561,-1.3357;9.8168,-1.3597,-1.4129;8.1338,-0.9216,-0.1441;8.5303,-1.477,0.7018;5.8562,-1.5736,1.9706;6.3008,-2.5321,2.7683;5.6483,-3.3302,3.116;7.335,-2.5666,3.1045;4.7175,-0.9437,1.2853;4.4478,-1.378,0.321;3.516,-0.3632,2.0316;2.7833,-1.1658,2.1886;3.8302,-0.0331,3.0304;2.8763,0.7821,1.289;3.5071,1.6616,1.162;1.6403,0.7654,0.7881;1.2268,1.6169,0.2538;0.9887,-0.1,0.8987)|\",6.293993362065001\r\n\"[H]C1=NC(S/C([H])=C(\\[H])C(=O)OC([H])([H])C([H])([H])[H])=NC([H])=C1[H] 
|(8.0899,4.1445,4.9418;7.2917,3.8469,4.2641;7.4533,2.6755,3.6439;6.4653,2.3185,2.8138;6.77,0.7397,2.0323;5.3529,0.5326,1.0218;4.625,1.3386,1.0573;5.1576,-0.5443,0.2442;5.8605,-1.3693,0.1838;3.9324,-0.6304,-0.5752;3.0474,0.2043,-0.6215;3.919,-1.789,-1.2822;2.7742,-2.0059,-2.1377;2.7056,-3.0932,-2.2273;1.8814,-1.625,-1.6358;2.9645,-1.3466,-3.4962;2.1246,-1.5988,-4.154;3.8889,-1.6928,-3.97;3.005,-0.2588,-3.3918;5.3521,2.9981,2.5429;5.2148,4.1692,3.1764;4.3071,4.7282,2.957;6.1704,4.6536,4.0654;6.0501,5.6033,4.5743)|\",4.66403139757\r\n\"[H]C1=C([H])C([H])=C(C([H])(C([H])([H])[H])C([H])([H])[H])C(S/C([H])=C(\\[H])C(=O)OC([H])([H])[H])=C1[H] |(4.4323,0.9647,-5.1941;4.0307,0.629,-4.2422;3.741,-0.7185,-4.0271;3.9126,-1.4483,-4.8139;3.2308,-1.1328,-2.7993;3.0142,-2.1868,-2.6473;2.9917,-0.2342,-1.7467;2.4525,-0.7626,-0.4215;2.3907,0.08,0.273;1.0301,-1.3362,-0.5775;0.6511,-1.6796,0.3921;1.0142,-2.1909,-1.2642;0.3344,-0.5836,-0.9638;3.4045,-1.7992,0.2056;3.0279,-2.1135,1.1859;4.4085,-1.3836,0.3438;3.4954,-2.697,-0.417;3.272,1.1279,-1.9899;3.0649,2.4048,-0.7316;1.321,2.6134,-0.7065;0.7628,2.1113,-1.493;0.6596,3.3608,0.1939;1.1555,3.8866,1.0038;-0.8057,3.4788,0.0939;-1.5142,2.9444,-0.7401;-1.2902,4.2746,1.0816;-2.7133,4.45,1.078;-2.931,5.101,1.9256;-3.2217,3.4884,1.192;-3.0432,4.9136,0.1439;3.7918,1.5471,-3.2234;4.0014,2.6012,-3.3761)|\",5.042269649765\r\n\"[H]C1=C([H])C([H])=C(S/C([H])=C(\\[H])C2([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C2([H])[H])S1 
|(-4.2081,-6.1225,5.5611;-3.4797,-5.3816,5.2581;-2.1369,-5.5558,5.0482;-1.6336,-6.5096,5.1649;-1.4757,-4.3505,4.6734;-0.4131,-4.2735,4.4714;-2.3224,-3.271,4.5932;-1.9274,-1.5919,4.1986;-1.8425,-1.6502,2.4148;-2.5803,-2.2947,1.9402;-0.9928,-0.9009,1.7057;-0.2696,-0.2821,2.236;-0.9851,-0.8486,0.1956;-1.7434,-1.5602,-0.1582;0.3844,-1.2788,-0.3915;0.683,-2.2425,0.0393;0.2567,-1.4417,-1.4717;1.4875,-0.2291,-0.1769;1.7159,-0.1409,0.8947;2.4142,-0.5636,-0.6606;1.07,1.1452,-0.7233;0.9722,1.0819,-1.8175;1.8523,1.8884,-0.5228;-0.2688,1.6054,-0.1261;-0.5752,2.5591,-0.5745;-0.1421,1.7991,0.9485;-1.3702,0.5551,-0.3417;-1.5653,0.4654,-1.4205;-2.3082,0.8814,0.1238;-3.9626,-3.7417,4.9918)|\",5.115740389400001\r\n\"[H]C1=NC(S/C([H])=C(\\[H])C2([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C2([H])[H])=C([H])C([H])=C1[H] |(-2.886,-5.91,3.6993;-2.7329,-5.1257,4.4386;-2.2798,-3.9581,3.9593;-2.076,-2.9667,4.8247;-1.4754,-1.3829,4.2452;-1.3399,-1.6335,2.495;-1.8203,-2.5443,2.1483;-0.7173,-0.7749,1.6813;-0.2347,0.109,2.0981;-0.6446,-0.9662,0.1839;-1.1822,-1.8949,-0.0496;0.8154,-1.1286,-0.314;1.3216,-1.9017,0.2769;0.7835,-1.4932,-1.3515;1.6172,0.1831,-0.2805;1.7705,0.5025,0.7601;2.6181,0.013,-0.6978;0.9009,1.2999,-1.0556;0.8714,1.0353,-2.1233;1.4658,2.2382,-0.9818;-0.5356,1.5002,-0.5484;-1.0475,2.2638,-1.148;-0.5077,1.8865,0.4805;-1.3351,0.1874,-0.5898;-1.4467,-0.1241,-1.639;-2.3473,0.3427,-0.1971;-2.3159,-3.0902,6.2048;-2.1391,-2.2518,6.8718;-2.7821,-4.3069,6.683;-2.9764,-4.4368,7.7443;-2.9982,-5.3571,5.7843;-3.3607,-6.3241,6.1179)|\",4.710290752155001\r\n\"[H]C1=NC(S/C([H])=C(\\[H])C2([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C2([H])[H])=NC([H])=C1[H] 
|(-2.9173,-5.87,3.7313;-2.7933,-5.009,4.3863;-2.3825,-3.879,3.799;-2.2314,-2.8237,4.6054;-1.7035,-1.2578,3.95;-1.5128,-1.594,2.2173;-2.0262,-2.487,1.8732;-0.8206,-0.7906,1.4031;-0.3079,0.0776,1.8168;-0.7138,-1.0164,-0.0869;-1.2616,-1.9407,-0.3146;0.7554,-1.2113,-0.5439;1.2351,-1.9812,0.0726;0.7425,-1.5939,-1.5753;1.5766,0.0887,-0.5142;1.7072,0.4262,0.5238;2.5853,-0.105,-0.9009;0.8985,1.2014,-1.3286;0.8921,0.9173,-2.3916;1.4761,2.1321,-1.257;-0.5471,1.4335,-0.8624;-1.0319,2.1934,-1.4885;-0.5394,1.838,0.1598;-1.3654,0.1324,-0.901;-1.4521,-0.1985,-1.9466;-2.3855,0.3099,-0.5393;-2.4512,-2.7838,5.9292;-2.8566,-3.9251,6.4865;-3.0339,-3.8974,7.5606;-3.0487,-5.0966,5.7514;-3.3777,-6.0192,6.2165)|\",4.468109425210001\r\n\"[H]C(=O)C1=C([H])C([H])=C(OC([H])([H])[H])C(C([H])=O)=C1F |(4.5736,5.2714,-3.7407;4.6059,4.2058,-4.0334;5.0505,3.8651,-5.1157;4.0814,3.2487,-3.0359;4.0806,1.8735,-3.3095;4.4763,1.5544,-4.269;3.5961,0.9485,-2.3976;3.6107,-0.1061,-2.642;3.0925,1.3933,-1.1638;2.6038,0.5611,-0.2177;2.5759,-0.8428,-0.4655;2.1382,-1.2873,0.4289;3.5869,-1.2392,-0.6149;1.9526,-1.0795,-1.3355;3.0719,2.777,-0.8387;2.5367,3.2329,0.4697;2.175,2.4148,1.1184;2.4809,4.3905,0.8383;3.5741,3.6647,-1.7999;3.5738,4.9759,-1.5368)|\",4.72933872169\r\n\"[H]O[C@@]1([H])C([H])([H])C([H])=C([H])C([H])([H])[C@]1([H])OC(=O)C([H])(Cl)Cl |(0.1711,0.6407,-2.7172;0.5023,1.4437,-2.282;-0.0057,1.4396,-0.9509;0.1108,2.4649,-0.5866;-1.4814,1.0225,-0.9001;-1.9107,1.2922,0.0749;-2.0311,1.6023,-1.6523;-1.6457,-0.4601,-1.1398;-2.6312,-0.8119,-1.4397;-0.6489,-1.3394,-0.9853;-0.8274,-2.3965,-1.174;0.75,-0.955,-0.5683;1.4447,-1.0919,-1.4103;1.1209,-1.6216,0.219;0.8592,0.4985,-0.0967;1.9003,0.8256,-0.1;0.3529,0.6311,1.2698;1.1957,0.3138,2.264;2.3413,-0.0392,2.1477;0.4437,0.4515,3.599;-0.3814,1.1535,3.5077;-0.2869,-1.1507,3.9976;1.5258,1.0319,4.8895)|\",5.880380309305001\r\n\"[H]OC1=C([H])\\C2=C(C(/[H])=C/1[H])\\N([H])[C@]1([H])C(=O)N=C([H])C([H])=C21 
|(4.1787,0.324,-0.2048;3.594,-0.3331,0.2026;2.2902,0.0136,-0.0362;1.2977,-0.8156,0.477;1.5763,-1.695,1.0484;-0.0421,-0.483,0.2424;-0.3901,0.6695,-0.5149;0.6143,1.4994,-1.021;0.3692,2.3831,-1.6021;1.9427,1.1659,-0.7695;2.7334,1.8061,-1.157;-1.7687,0.7908,-0.6732;-2.2184,1.6916,-0.5468;-2.39,-0.3192,0.0263;-2.8188,-1.027,-0.7105;-3.5602,-0.0229,0.9519;-4.1593,1.0342,0.9131;-3.9326,-1.0891,1.7895;-3.0136,-1.9702,2.0889;-3.3257,-2.7526,2.7847;-1.6265,-1.9708,1.6718;-0.9147,-2.6102,2.184;-1.273,-1.056,0.7295)|\",2.9932523555\r\n\"[H]C1=C([H])C(C([H])([H])O[C@@]([H])(C([H])([H])C([H])([H])[H])[C@@]2([H])OC2([H])[H])=C([H])C([H])=C1OC([H])([H])[H] |(7.752,-0.0643,-5.652;7.8401,0.6866,-4.8748;6.8743,0.746,-3.8674;6.0563,0.0323,-3.8706;6.9487,1.6893,-2.8414;5.875,1.78,-1.7827;5.0222,2.3803,-2.1449;6.2606,2.2918,-0.8914;5.4173,0.4662,-1.4657;4.2187,0.4004,-0.7009;3.4596,1.0716,-1.1409;3.7427,-1.0558,-0.7876;3.7088,-1.3172,-1.8516;4.5105,-1.6986,-0.3375;2.3812,-1.322,-0.1375;2.0757,-2.3615,-0.2981;1.6013,-0.6793,-0.5642;2.4027,-1.1477,0.9435;4.4629,0.8472,0.7277;5.2415,0.2852,1.2492;4.5113,2.2649,0.9648;3.4305,1.5205,1.531;3.4434,1.4245,2.617;2.4501,1.718,1.0962;8.0281,2.5843,-2.8434;8.1157,3.3215,-2.048;8.9931,2.5465,-3.8426;9.8302,3.2379,-3.8449;8.9061,1.5943,-4.8676;9.9074,1.634,-5.7976;9.8777,0.6927,-6.8567;10.7562,0.9032,-7.4697;8.9735,0.802,-7.4706;9.935,-0.3391,-6.4843)|\",5.831399816215\r\n\"[H]OC([H])([H])[C@@]1(C2=C([H])C([H])=C([H])C([H])=C2[H])C(=C([H])[H])[C@@]1([H])C([H])([H])C([H])([H])[H] 
|(4.7665,3.353,-1.1411;3.9911,2.9343,-1.5498;2.878,3.2038,-0.7067;2.6817,4.2867,-0.6548;2.0152,2.7387,-1.1888;3.0456,2.6691,0.7196;4.215,3.2841,1.4496;5.5141,2.787,1.2573;5.6591,1.9059,0.6372;6.6098,3.3932,1.8759;7.6076,2.9907,1.7211;6.4231,4.5059,2.6974;7.2742,4.9756,3.1832;5.1351,5.0072,2.8982;4.9809,5.8688,3.5428;4.0422,4.4023,2.2769;3.0405,4.79,2.4422;1.8227,2.3047,1.4621;0.6799,2.7064,1.9963;-0.0271,1.9991,2.425;0.4022,3.7577,2.0363;2.6871,1.1905,1.0464;3.3178,0.7709,1.834;2.2692,0.1637,0.0017;1.7023,-0.6271,0.5122;1.5709,0.6181,-0.711;3.4585,-0.4465,-0.7509;3.1231,-1.2105,-1.4617;4.1614,-0.9243,-0.0568;3.997,0.3285,-1.3055)|\",6.326647024125\r\n\"[H]OC([H])([H])[C@@]1(C2=C([H])C([H])=C([H])C([H])=C2[H])C(=C([H])[H])[C@@]1([H])C([H])([H])[H] |(5.4679,-2.6434,0.1399;5.3417,-2.0904,0.9279;3.9593,-1.7674,0.9945;3.3426,-2.6786,1.0467;3.8213,-1.2278,1.9342;3.4745,-0.9145,-0.1832;3.5945,-1.6102,-1.5165;4.8381,-1.6838,-2.1654;5.6941,-1.1748,-1.7297;4.9756,-2.3782,-3.369;5.9446,-2.4207,-3.8599;3.8708,-3.0081,-3.9444;3.9754,-3.5454,-4.8832;2.6285,-2.9368,-3.3107;1.7618,-3.418,-3.7567;2.4935,-2.2453,-2.1061;1.5232,-2.184,-1.6206;2.3643,0.0262,0.0645;1.0939,0.1925,0.3985;0.6733,1.1831,0.5582;0.4165,-0.6502,0.5211;3.6908,0.6239,-0.1553;3.8522,1.0416,-1.1512;4.4579,1.3501,0.9367;4.1781,2.4102,0.9535;5.5384,1.2854,0.7665;4.2469,0.9408,1.9286)|\",6.299435639075\r\n\"[H]OC([H])([H])[C@]1(C2=C(C([H])([H])[H])C([H])=C([H])C([H])=C2[H])/C(=C(/[H])C([H])([H])[H])C1([H])[H] 
|(1.0775,-3.2552,-2.3473;1.3098,-2.4966,-2.9081;2.727,-2.3747,-2.8708;3.2004,-3.2552,-3.3334;2.9694,-1.5015,-3.4834;3.2517,-2.17,-1.4471;3.0068,-3.3481,-0.5299;3.8519,-4.4837,-0.5313;5.1004,-4.5613,-1.3824;5.5324,-5.5661,-1.3415;5.8681,-3.8572,-1.0394;4.9061,-4.3249,-2.4336;3.5277,-5.5614,0.3022;4.178,-6.4335,0.3062;2.399,-5.5443,1.1226;2.1759,-6.3985,1.7564;1.5674,-4.4274,1.1225;0.6866,-4.3939,1.7582;1.8765,-3.3427,0.3;1.243,-2.461,0.3109;3.1964,-0.7902,-0.925;2.4607,0.259,-0.5854;2.9758,1.1815,-0.312;0.9563,0.2925,-0.5523;0.5788,1.1599,-1.1094;0.5766,0.3907,0.4745;0.5333,-0.6077,-1.0067;4.5281,-1.3184,-1.2434;5.1896,-1.611,-0.4288;5.053,-0.9471,-2.1245)|\",6.223243760935\r\n\"[H]OC([H])([H])[C@]1(C2=C([H])C([H])=C(F)C([H])=C2[H])/C(=C(/[H])C([H])([H])[H])C1([H])[H] |(4.5109,-4.1298,-1.1078;3.604,-4.4757,-1.1307;2.7278,-3.3551,-1.0285;2.7479,-2.7642,-1.9566;1.7238,-3.7784,-0.9256;3.0329,-2.4288,0.148;2.7675,-2.9522,1.54;2.8377,-4.3268,1.8241;3.0893,-5.0208,1.0298;2.5979,-4.8104,3.1104;2.6475,-5.8716,3.3327;2.2981,-3.9094,4.1238;2.0686,-4.3724,5.3728;2.2242,-2.5432,3.8844;1.9795,-1.8672,4.6974;2.4536,-2.0758,2.591;2.3804,-1.0106,2.3933;2.9122,-0.9813,-0.129;2.1075,0.0507,-0.3425;2.5563,1.0203,-0.5644;0.6052,-0.0033,-0.2975;0.1716,0.2711,-1.2686;0.2069,0.7097,0.4365;0.248,-1.0027,-0.0319;4.2776,-1.5011,-0.0613;4.8823,-1.2936,0.8203;4.8558,-1.6078,-0.9816)|\",6.049090896615\r\n\"[H]OC([H])([H])[C@]1(C2=C([H])C([H])=C(OC([H])([H])[H])C([H])=C2[H])/C(=C(/[H])C([H])([H])[H])C1([H])[H] 
|(4.7103,-3.8572,-1.7874;3.8168,-4.2284,-1.8693;2.9016,-3.1503,-1.6823;2.9156,-2.4795,-2.5548;1.911,-3.613,-1.6335;3.1606,-2.3233,-0.4231;2.8729,-2.9708,0.9106;2.4591,-2.213,2.0113;2.3217,-1.1413,1.8988;2.2069,-2.795,3.2568;1.8843,-2.1661,4.079;2.3586,-4.1755,3.4188;2.1354,-4.8551,4.5841;1.7168,-4.1163,5.7186;1.6007,-4.8433,6.5247;0.7558,-3.6139,5.5435;2.4639,-3.3665,6.0126;2.7653,-4.9519,2.325;2.8785,-6.023,2.463;3.0234,-4.3585,1.0961;3.3478,-4.9731,0.2633;3.018,-0.8604,-0.5772;2.2007,0.1739,-0.7201;2.6367,1.1663,-0.8463;0.6987,0.0926,-0.7186;0.2837,0.4446,-1.6728;0.2689,0.731,0.0648;0.3534,-0.9322,-0.5523;4.3924,-1.3605,-0.5265;4.9769,-1.2186,0.3812;4.9875,-1.3758,-1.442)|\",5.7388811070450005\r\n\"[H]OC([H])([H])[C@]1(C2=C([H])C([H])=C([H])C([H])=C2[H])/C(=C(/[H])C([H])([H])C([H])=C([H])[H])C1([H])[H] |(3.8511,-5.5785,-0.1258;4.0233,-5.0579,-0.9275;5.3813,-4.6396,-0.868;6.0597,-5.5084,-0.8631;5.5608,-4.0743,-1.7872;5.6663,-3.756,0.3479;5.5125,-4.4758,1.6662;6.5543,-5.2804,2.1532;7.4878,-5.334,1.5978;6.4089,-5.9996,3.3386;7.2292,-6.6127,3.7028;5.2137,-5.9294,4.0591;5.1004,-6.4872,4.9848;4.1697,-5.1347,3.5858;3.2385,-5.069,4.1425;4.3183,-4.4144,2.3974;3.5096,-3.7825,2.0425;5.3571,-2.3202,0.2281;4.4404,-1.3741,0.0829;4.7646,-0.3401,-0.0422;2.9464,-1.6151,0.0415;2.7654,-2.6967,0.0847;2.4641,-1.1541,0.9158;2.3219,-1.0566,-1.2147;2.6797,-1.4928,-2.1479;1.4031,-0.091,-1.2502;0.9901,0.2725,-2.1875;1.023,0.3709,-0.3407;6.7801,-2.6909,0.2251;7.3821,-2.5042,1.1135;7.3399,-2.632,-0.709)|\",6.18242668336\r\n\"[H]OC([H])([H])[C@]1(C2=C([H])C([H])=C([H])C([H])=C2[H])/C(=C(/[H])C([H])([H])C([H])([H])[H])C1([H])[H] 
|(7.783,1.1347,-2.2701;7.8611,0.2609,-1.8543;6.5606,-0.0985,-1.3942;5.9091,-0.3596,-2.2417;6.7027,-1.0024,-0.7935;5.8708,0.9829,-0.5638;6.4279,1.2833,0.8082;7.7786,1.0474,1.1145;8.4355,0.6493,0.3488;8.2811,1.3331,2.3849;9.3292,1.139,2.5998;7.4535,1.8692,3.372;7.8487,2.093,4.3594;6.1114,2.1129,3.0772;5.4513,2.5246,3.8367;5.6046,1.8172,1.8124;4.5535,1.9927,1.6014;4.4261,1.1639,-0.8257;3.1965,0.6926,-0.6705;2.369,1.2334,-1.135;2.837,-0.5543,0.0958;3.7356,-0.9663,0.5691;2.1467,-0.289,0.9104;2.1684,-1.6207,-0.7886;1.8857,-2.4989,-0.1974;2.8449,-1.9494,-1.5853;1.26,-1.2298,-1.2627;5.3062,2.1812,-1.3998;5.3554,3.1646,-0.9344;5.4821,2.1901,-2.4775)|\",6.315762470105\r\n\"[H]OC([H])([H])[C@]1(C2=C([H])C([H])=C([H])C([H])=C2[H])/C(=C(/[H])C([H])([H])[H])C1([H])[H] |(4.8616,-4.1945,0.0896;3.9909,-4.6232,0.067;3.0251,-3.5944,-0.1381;3.0723,-3.2231,-1.1731;2.0518,-4.0764,-0.0026;3.1622,-2.4072,0.8136;2.8269,-2.6277,2.2702;2.9406,-3.8991,2.8566;3.2796,-4.734,2.2531;2.6345,-4.0919,4.2047;2.7271,-5.086,4.6351;2.2204,-3.0221,4.9987;1.986,-3.1747,6.0489;2.1096,-1.7532,4.4283;1.7827,-0.9098,5.0316;2.4034,-1.5608,3.0787;2.2939,-0.5724,2.6416;2.9551,-1.0747,0.2053;2.0944,-0.1939,-0.2858;2.4846,0.7399,-0.694;0.602,-0.3754,-0.3229;0.2285,-0.3612,-1.3557;0.0924,0.4406,0.2065;0.3023,-1.3212,0.1383;4.3454,-1.4374,0.4759;4.8577,-0.9829,1.3226;5.0041,-1.6932,-0.3567)|\",6.315762470105\r\n\"[H]OC(=O)[C@]1([H])[C@]([H])(C(=O)OC([H])([H])C([H])([H])[H])[C@]2([H])C([H])([H])[C@@]1([H])[C@]([H])(C([H])([H])[H])[C@@]2([H])C([H])([H])[H] 
|(4.3679,4.3084,-1.5679;4.9799,3.5983,-1.2976;4.8256,3.4664,0.0467;4.1218,4.2074,0.6937;5.7108,2.3559,0.5864;6.7257,2.6355,0.2839;5.4116,0.9086,0.0287;6.2698,0.517,-0.5217;4.1941,0.7885,-0.8676;3.1172,1.3151,-0.6691;4.4443,-0.0462,-1.9003;3.3415,-0.3191,-2.7995;2.7654,0.5989,-2.9379;3.8242,-0.5911,-3.7415;2.4664,-1.4465,-2.2738;1.6864,-1.6859,-3.0058;1.98,-1.1519,-1.3396;3.0596,-2.3497,-2.0967;5.1847,0.1011,1.3498;4.6684,-0.8497,1.1824;4.4596,1.1416,2.2284;3.51,1.4781,1.8063;4.2942,0.7982,3.2554;5.5875,2.1866,2.1197;5.4163,3.1354,2.6319;6.792,1.3758,2.6704;6.6441,1.3063,3.7576;8.1905,1.9536,2.4404;8.9501,1.3275,2.924;8.2791,2.963,2.8599;8.4518,2.0095,1.3769;6.5612,-0.0524,2.0571;7.3388,-0.2667,1.3108;6.5887,-1.1746,3.0999;7.5403,-1.178,3.6455;6.47,-2.1591,2.6314;5.7862,-1.0552,3.8376)|\",7.208295899745001\r\n\"[H]O[C@@]([H])(C([H])([H])[H])[C@]([H])(OC(=O)C([H])(Cl)Cl)C([H])([H])[H] |(2.047,1.5613,-0.509;1.9734,0.996,-1.2963;3.2255,0.3371,-1.4471;3.0931,-0.3292,-2.3083;4.3499,1.332,-1.7466;5.2861,0.8256,-2.0054;4.5406,1.9793,-0.8833;4.0549,1.9604,-2.5918;3.4817,-0.5683,-0.2321;2.6047,-1.2049,-0.0886;3.5341,0.3462,0.9168;3.1754,-0.13,2.1072;2.8545,-1.2664,2.367;3.1974,0.9542,3.1916;2.8885,0.5037,4.1309;4.8562,1.6044,3.4219;2.0137,2.2572,2.8085;4.75,-1.4078,-0.2824;4.7511,-2.0227,-1.1896;4.7966,-2.0769,0.5809;5.6447,-0.7785,-0.2892)|\",6.18786896037\r\n\"[H]O[C@@]1([H])C([H])([H])C([H])=C([H])C([H])([H])[C@]1([H])OC(=O)C([H])(C([H])([H])[H])C([H])([H])[H] 
|(5.6095,2.1314,-4.811;5.4457,1.5099,-4.0825;6.06,2.0623,-2.9196;6.1349,1.2363,-2.2056;7.4542,2.6259,-3.224;8.0013,2.7836,-2.2844;8.0182,1.874,-3.7908;7.3696,3.9246,-3.9914;8.26,4.2469,-4.5288;6.2676,4.6838,-4.0223;6.2683,5.6105,-4.5944;4.9864,4.3278,-3.3078;4.2091,4.0663,-4.0415;4.587,5.191,-2.7634;5.1459,3.1516,-2.3367;4.1677,2.7355,-2.0877;5.7868,3.5834,-1.1048;4.9967,4.1512,-0.163;3.8033,4.3253,-0.2984;5.7916,4.5566,1.0705;6.8126,4.1789,0.9532;5.8293,6.0937,1.1587;6.3919,6.4076,2.0447;4.8138,6.4965,1.2321;6.3104,6.5341,0.278;5.1516,3.9378,2.3236;5.7012,4.2435,3.2206;5.1578,2.843,2.2788;4.1126,4.2675,2.4213)|\",7.178363376189999\r\n\"[H]/N=C(\\C1=C([H])C([H])=C(F)C(OC([H])([H])[H])=C1[H])N([H])O[H] |(7.506,-4.8059,0.3055;7.8381,-3.8973,0.623;6.8744,-3.0689,0.8246;5.4125,-3.3304,0.7513;4.9028,-4.5686,1.1711;5.5683,-5.3261,1.5724;3.536,-4.8086,1.0942;3.1043,-5.7524,1.4116;2.68,-3.8285,0.6047;1.3586,-4.123,0.5345;3.1558,-2.5748,0.1886;2.4256,-1.5486,-0.3191;0.9978,-1.5385,-0.2382;0.7047,-0.5464,-0.5861;0.6538,-1.6826,0.7909;0.5514,-2.3005,-0.8827;4.5372,-2.3516,0.2709;4.916,-1.3961,-0.0758;7.2208,-1.7518,1.1041;6.7119,-1.3374,1.881;8.5998,-1.5585,1.3311;8.9561,-2.465,1.1566)|\",5.053154203785001\r\n\"[H]O[C@@]1([H])C2=C([H])C([H])=C([H])C(OC([H])([H])[H])=C2C(=O)C([H])([H])[C@@]1([H])O[H] 
|(4.8917,-5.3627,0.1326;4.0515,-5.59,-0.294;2.9781,-5.0543,0.5049;2.9387,-5.5917,1.4633;3.1839,-3.572,0.7595;3.9938,-3.2027,1.8368;4.3838,-3.9716,2.5006;4.3074,-1.8633,2.0632;4.9457,-1.5839,2.897;3.8013,-0.8838,1.2146;4.0491,0.1644,1.3507;2.959,-1.2269,0.152;2.5595,-0.2173,-0.6783;1.3139,0.4107,-0.3564;1.1999,1.2287,-1.0705;1.3351,0.8173,0.6638;0.4856,-0.2923,-0.4718;2.6265,-2.5836,-0.0878;1.6997,-2.9922,-1.1952;0.9706,-2.2028,-1.7749;1.6803,-4.4708,-1.5492;2.5691,-4.6973,-2.1533;0.7918,-4.672,-2.1518;1.7123,-5.361,-0.2972;0.8384,-5.1473,0.3292;1.642,-6.7386,-0.6118;2.5264,-6.9782,-0.9399)|\",4.819136292355\r\n\"[H]C1=C([H])C2=C(C(OC([H])([H])[H])=C1[H])/C([H])=C(/[H])[C@]1([H])OC(C([H])([H])[H])(C([H])([H])[H])O[C@]21[H] |(5.5284,-3.9078,-1.7328;5.628,-3.0251,-1.1072;4.8435,-1.9004,-1.3521;4.1222,-1.8986,-2.1648;4.9791,-0.7687,-0.5478;5.9029,-0.7463,0.5118;6.6898,-1.8963,0.7487;7.5696,-1.8093,1.7915;8.3785,-2.9342,2.0931;8.9862,-2.6431,2.9518;9.0379,-3.1939,1.2544;7.7707,-3.8091,2.3589;6.55,-3.03,-0.0586;7.1555,-3.9116,0.1167;6.0427,0.4556,1.3391;6.6815,0.3864,2.2131;5.4292,1.6077,1.0313;5.5498,2.4934,1.6502;4.5824,1.7642,-0.2038;5.1265,2.3474,-0.9593;3.3654,2.4728,0.0567;2.3086,1.5189,0.2582;1.8621,1.5171,1.7172;1.4671,2.4999,1.9934;2.7135,1.2805,2.3607;1.081,0.7665,1.8755;1.1794,1.8585,-0.7149;0.817,2.876,-0.5365;0.3457,1.1601,-0.5904;1.5405,1.797,-1.746;2.8595,0.2337,-0.0386;4.081,0.4265,-0.7678;3.8586,0.5341,-1.8398)|\",4.737502137205\r\n\"[H]O[C@]1([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[C@]1([H])OC(=O)C([H])(Cl)Cl 
|(2.7552,0.9073,0.3855;2.6122,0.8392,-0.5742;1.2592,0.4764,-0.7999;1.2319,0.21,-1.8631;0.3425,1.6921,-0.5752;0.5989,2.1419,0.3947;0.6334,2.4295,-1.3314;-1.1785,1.4495,-0.6278;-1.664,2.384,-0.9355;-1.4247,0.7171,-1.4112;-1.7881,1.019,0.7172;-2.883,1.0432,0.638;-1.5215,1.773,1.472;-1.3744,-0.3683,1.2287;-1.8073,-1.142,0.577;-1.827,-0.5194,2.2165;0.1409,-0.6203,1.3522;0.2883,-1.554,1.9073;0.6213,0.1676,1.9402;0.8123,-0.7963,-0.0195;0.1043,-1.3335,-0.6573;1.9228,-1.757,0.0155;2.9078,-1.611,0.9008;3.0534,-0.7206,1.7059;3.8417,-2.8287,0.7774;3.7931,-3.2493,-0.2238;5.5331,-2.3875,1.1133;3.24,-4.1065,1.9049)|\",6.068138866150001\r\n\"[H]O[C@]1([H])C([H])([H])C([H])([H])C([H])([H])[C@]1([H])OC(=O)C([H])(C([H])([H])[H])C([H])([H])[H] |(4.7533,-5.7855,-1.4303;5.2647,-5.3856,-2.1532;6.2739,-4.6161,-1.5341;6.707,-3.9945,-2.3278;7.3912,-5.4138,-0.8358;6.9278,-6.1968,-0.2221;8.0579,-5.9061,-1.55;8.0992,-4.3696,0.0543;8.9561,-3.9346,-0.4718;8.4904,-4.815,0.9739;7.0324,-3.2656,0.345;7.3714,-2.2871,-0.0117;6.8134,-3.149,1.4092;5.7678,-3.6749,-0.4247;5.1817,-2.8304,-0.7899;4.891,-4.5011,0.3972;4.034,-3.8428,1.22;4.0109,-2.6365,1.3291;3.1023,-4.8105,1.9377;3.677,-5.7212,2.147;2.5958,-4.2011,3.2489;1.9368,-4.9077,3.7651;2.0377,-3.2805,3.0541;3.4236,-3.9515,3.9204;1.9399,-5.1837,0.9932;1.2791,-5.911,1.477;2.305,-5.6237,0.0594;1.3454,-4.2975,0.7441)|\",7.284487777885\r\n\"[H]O[C@]1([H])C([H])([H])C([H])([H])C([H])([H])[C@]1([H])OC(=O)C([H])(Cl)Cl 
|(2.173,-0.1947,-0.5743;2.24,0.2646,0.2797;0.9713,0.1853,0.887;1.0191,0.8432,1.764;0.5065,-1.2166,1.3237;0.6684,-1.9159,0.4937;1.0687,-1.5882,2.1851;-1.0012,-1.0373,1.5997;-1.1689,-0.7849,2.6525;-1.5668,-1.9519,1.4007;-1.4614,0.146,0.6886;-1.9206,0.9468,1.2777;-2.2003,-0.1555,-0.0588;-0.193,0.6783,0.0076;-0.195,1.7532,-0.1759;0.0157,0.0136,-1.2835;-0.5393,0.568,-2.3601;-1.2303,1.5592,-2.3878;-0.2156,-0.2104,-3.6413;-0.6324,0.3327,-4.485;1.56,-0.3319,-3.9025;-0.9994,-1.8291,-3.6034)|\",6.00283154203\r\n\"[H]O[C@@]1([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[C@@]1([H])OC(=O)C([H])([H])Br |(-2.0368,0.9458,0.0731;-2.3076,0.0584,-0.2178;-1.2199,-0.4877,-0.9457;-1.5205,-1.5163,-1.1849;-0.9473,0.2669,-2.2548;-1.8594,0.2511,-2.8617;-0.7389,1.3198,-2.0184;0.2429,-0.3342,-3.021;-0.01,-1.3549,-3.3443;0.431,0.2431,-3.9342;1.5048,-0.3764,-2.1459;1.8222,0.6489,-1.9144;2.3329,-0.8476,-2.6889;1.2486,-1.1368,-0.8349;2.1292,-1.1172,-0.1823;1.0467,-2.1949,-1.0547;0.0393,-0.5933,-0.0706;-0.1698,-1.1925,0.8175;0.3199,0.7709,0.3818;0.6544,0.9535,1.671;0.85,0.0778,2.482;0.8039,2.4294,1.9921;1.5172,2.9077,1.3203;1.0947,2.5502,3.0323;-0.9001,3.4045,1.7486)|\",5.738881107045\r\n\"[H]C1=C(C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])[C@]2([H])C(=NN=C2OC([H])([H])C([H])([H])[H])OC1=O 
|(3.904,-3.0207,-2.1672;4.5359,-2.1738,-2.4136;5.2746,-2.1513,-3.5386;5.2916,-3.2655,-4.5489;4.5136,-3.9925,-4.2872;5.0333,-2.8488,-5.5309;6.6537,-3.9879,-4.6751;6.5711,-4.7041,-5.5033;7.4224,-3.266,-4.9803;7.0946,-4.728,-3.4066;7.1779,-4.0201,-2.5706;6.3123,-5.4435,-3.1163;8.4245,-5.4676,-3.5838;8.7168,-5.9857,-2.664;9.2328,-4.7743,-3.8471;8.3587,-6.2168,-4.3822;6.1852,-0.9533,-3.7193;7.1978,-1.2214,-3.3752;5.6545,0.2197,-2.95;5.5972,1.3148,-3.6104;6.0969,1.0438,-4.9362;6.3452,-0.2184,-5.031;6.7879,-0.8541,-6.1176;7.0134,-0.0414,-7.3052;6.9771,-0.7641,-8.1234;6.1833,0.6631,-7.4022;8.3512,0.6779,-7.2496;8.5199,1.2054,-8.1954;9.1709,-0.0333,-7.1033;8.3622,1.4127,-6.4408;5.1785,0.0853,-1.6865;4.561,-1.1289,-1.3669;4.089,-1.2533,-0.2678)|\",4.713011890660001\r\n\"[H]OC(=O)C1=NN=C2OC(=O)/C([H])=C(/C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])[C@@]21[H] |(6.5998,0.3346,8.3537;6.0328,0.567,7.594;6.4303,-0.2173,6.5795;7.3166,-1.0474,6.6796;5.6899,0.0299,5.3208;4.9594,1.086,5.1495;4.4501,1.127,3.8272;4.9582,0.1186,3.2028;4.619,-0.191,1.9425;4.5878,-1.5596,1.5912;4.2612,-1.8376,0.4709;4.9463,-2.5204,2.6464;4.7421,-3.5528,2.3824;5.5495,-2.205,3.8091;6.0137,-3.2538,4.7806;5.8153,-2.9279,5.8067;5.4374,-4.1704,4.6071;7.522,-3.5695,4.6463;7.7246,-3.9114,3.6221;8.1002,-2.653,4.8075;7.981,-4.6322,5.6522;7.7764,-4.2707,6.6686;7.3825,-5.5447,5.5192;9.469,-4.9707,5.5179;9.7694,-5.7307,6.2478;10.0926,-4.0842,5.6838;9.7005,-5.3588,4.5184;5.8746,-0.7343,4.0277;6.9226,-0.5728,3.72)|\",4.914376140030001\r\n\"[H]/C1=C(\\C([H])([H])C([H])([H])[H])[C@]2([H])C(=NN=C2C([H])([H])[H])OC1=O 
|(3.124,-3.0106,0.4948;3.0945,-2.1224,1.1179;3.582,-2.1255,2.3721;4.2644,-3.3262,2.9757;4.0468,-3.3774,4.0495;3.8545,-4.237,2.5246;5.7913,-3.301,2.7541;6.2594,-4.16,3.2463;6.248,-2.3894,3.1529;6.0252,-3.3481,1.686;3.3693,-0.8615,3.1839;2.4272,-0.9901,3.7468;3.2616,0.3191,2.2677;3.9497,1.3381,2.6262;4.6019,0.9822,3.8583;4.3604,-0.2501,4.1603;4.8664,-0.858,5.4304;4.0368,-1.1911,6.0672;5.446,-0.1093,5.9756;5.5043,-1.7301,5.247;2.5569,0.2721,1.1131;2.4276,-0.9701,0.4793;1.804,-1.0192,-0.5474)|\",5.02866395724\r\n\"[H]OC1=C([H])C(=O)C([H])=C([H])C1C1=NC2=C([H])C([H])=C(C([H])[H])C([H])=C2N1[H] |(9.9854,-0.0891,0.0627;9.0544,-0.3659,0.0431;9.0311,-1.717,-0.0922;10.1813,-2.4505,-0.1767;11.1506,-1.9578,-0.1363;10.1949,-3.8983,-0.3228;11.2392,-4.5584,-0.4028;8.8691,-4.5339,-0.3717;8.846,-5.6137,-0.4805;7.7335,-3.8017,-0.2871;6.7826,-4.3267,-0.3313;7.7268,-2.3619,-0.1431;6.5165,-1.6712,-0.062;6.3479,-0.31,0.0697;5.0476,-0.093,0.106;4.3527,1.1557,0.233;4.9246,2.0741,0.3134;2.9955,1.1386,0.2475;2.4385,2.0667,0.342;2.2166,-0.097,0.1401;0.857,-0.038,0.163;0.3332,0.9077,0.2596;0.2468,-0.933,0.0862;2.9252,-1.3588,0.0112;2.3574,-2.281,-0.0692;4.2846,-1.3258,-0.0022;5.2613,-2.2903,-0.1067;5.0949,-3.2799,-0.1919)|\",1.9891522471550005\r\n\"[H]OC(C#CC1C([H])O[C@@]2([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[C@]2([H])C1=O)(C([H])([H])[H])C([H])([H])[H] 
|(2.3675,-1.1118,-1.9104;2.5723,-1.3337,-0.9868;2.8229,-0.0888,-0.3105;3.1443,-0.4266,1.0826;3.4189,-0.6554,2.2459;3.8514,-1.0747,3.524;4.4486,-0.2044,4.4091;4.8461,-0.565,5.3612;3.9506,1.0795,4.4523;2.6689,0.865,5.167;2.9355,0.6547,6.2149;1.8323,2.1355,5.1048;2.4281,2.9844,5.458;1.5635,2.3351,4.0597;0.5689,1.956,5.9719;0.8707,1.8889,7.0268;-0.0629,2.848,5.8864;-0.2336,0.6991,5.5954;-0.6784,0.8371,4.5994;-1.0699,0.5705,6.2935;0.637,-0.5715,5.5805;0.0543,-1.436,5.2442;0.9915,-0.8107,6.5912;1.8378,-0.3666,4.649;1.4749,-0.1138,3.6477;2.7947,-1.605,4.5246;2.7985,-2.5551,5.2654;1.556,0.7909,-0.3605;1.6926,1.7224,0.1979;0.7068,0.2445,0.0593;1.3249,1.048,-1.4025;4.0216,0.6295,-0.962;4.2447,1.5711,-0.4507;3.792,0.8551,-2.0115;4.9069,-0.0109,-0.9278)|\",3.37965402321\r\n\"[H]C1O[C@@]2([H])C([H])([H])C([H])([H])[C@@]([H])(F)C([H])([H])[C@]2([H])C(=O)C1N([H])[H] |(0.7588,-0.0267,0.8884;-0.0178,0.0421,0.1284;0.433,0.51,-1.1198;0.7954,1.8937,-0.8252;1.6819,1.8614,-0.1686;1.1543,2.6364,-2.1062;1.9253,2.0831,-2.653;0.2698,2.6947,-2.7513;1.653,4.0536,-1.7558;1.8211,4.6318,-2.6711;2.6179,3.9917,-1.2345;0.6754,4.8173,-0.8615;1.0943,5.7916,-0.5805;-0.483,5.0761,-1.6197;0.2495,4.0369,0.3844;-0.5028,4.6101,0.9355;1.1076,3.9122,1.0563;-0.3133,2.6749,-0.0321;-1.147,2.8513,-0.7158;-0.8047,1.7899,1.1784;-0.5789,2.0409,2.3426;-1.2649,0.5314,0.4733;-2.2811,0.5916,-0.43;-2.1208,0.2623,-1.3777;-3.0136,1.2761,-0.3091)|\",3.823199599525\r\n\"[H]C1=C([H])C([H])=C2C(=O)C(/C([H])=C(\\[H])C#N)=C(/C([H])([H])[H])OC2=C1[H] 
|(7.4775,3.4436,-0.395;6.8809,2.5394,-0.3152;7.513,1.2856,-0.249;8.5968,1.225,-0.276;6.7538,0.1299,-0.1484;7.2089,-0.8537,-0.096;5.3506,0.2025,-0.1116;4.5202,-1.0145,-0.0041;5.0327,-2.133,0.0554;3.061,-0.7783,0.0261;2.1069,-1.8753,0.1314;1.0576,-1.5946,0.1464;2.39,-3.1961,0.2099;3.4134,-3.5513,0.2006;1.3478,-4.1632,0.3092;0.5091,-4.9681,0.3898;2.5826,0.509,-0.0449;1.138,0.9247,-0.0257;0.6419,0.6002,0.8951;0.5884,0.4943,-0.8697;1.0737,2.0125,-0.0895;3.3827,1.5946,-0.1452;4.7486,1.4608,-0.1784;5.496,2.6368,-0.2807;4.9842,3.5923,-0.3299)|\",4.449061455675\r\n\"[H]C(=O)/C([H])=C(\\[H])[C@]1([H])C([H])([H])C([H])([H])[C@]2([H])OC([H])=C([H])C(=O)[C@@]2([H])C1([H])[H] |(1.4633,-0.6334,6.6154;2.5216,-0.4139,6.8879;3.0589,-0.9856,7.8177;3.1746,0.5986,6.0449;4.2103,0.8353,6.2824;2.5195,1.1983,5.0381;1.4821,0.9045,4.8652;3.0975,2.2511,4.1317;4.1412,2.4055,4.4329;2.3584,3.6097,4.2835;2.291,3.8838,5.3421;2.9714,4.3821,3.7985;0.9554,3.6143,3.6516;0.5086,4.6127,3.7204;0.2831,2.9299,4.1847;1.0322,3.1844,2.1926;1.6117,3.9225,1.6144;-0.3074,3.2055,1.6387;-0.4291,2.6474,0.421;-1.3796,2.9072,-0.0382;0.4928,1.8614,-0.1794;0.3256,1.4952,-1.186;1.6805,1.4089,0.5464;2.561,0.7172,0.0544;1.6889,1.8136,2.0279;1.0551,1.0749,2.5454;3.0945,1.8127,2.6439;3.5454,0.8241,2.5221;3.7271,2.5095,2.0772)|\",5.025942818735\r\n\"[H]C1=C([H])C2=C(C([H])=C1[H])N([H])/C(=C(/[H])C1=C([H])C([H])=C(F)C([H])=C1[H])C2=O 
|(2.6146,2.2851,-0.065;1.907,1.4626,-0.0269;0.5362,1.7143,-0.0181;0.1412,2.7255,-0.0563;-0.3408,0.6314,0.0312;0.1429,-0.6888,0.081;1.5122,-0.9531,0.0666;1.8945,-1.9696,0.1011;2.3796,0.1402,0.0105;3.4513,-0.0408,-0.0013;-0.9214,-1.5937,0.1688;-0.8254,-2.5408,-0.1717;-2.1368,-0.9037,0.0498;-3.3995,-1.3777,-0.0348;-4.1459,-0.5992,-0.1796;-3.8812,-2.753,0.0064;-3.1472,-3.8193,0.5694;-2.1948,-3.6243,1.0517;-3.6519,-5.1165,0.5831;-3.0971,-5.9372,1.0259;-4.9046,-5.3567,0.0285;-5.3936,-6.6128,0.0345;-5.6713,-4.3339,-0.5204;-6.6499,-4.558,-0.932;-5.1577,-3.0416,-0.5204;-5.7486,-2.2338,-0.9435;-1.8134,0.5705,0.0069;-2.6254,1.4836,-0.0401)|\",3.371490607695\r\n\"[H]OC1=C([H])C([H])=C([H])C([H])=C1C(=O)/C([H])=C(/N1C([H])([H])C([H])([H])C([H])([H])C1([H])[H])C([H])([H])[H] |(4.9353,-1.9986,-2.6433;4.1916,-1.678,-2.1092;3.0272,-2.0133,-2.7559;3.0871,-2.6708,-3.9918;4.061,-2.893,-4.4273;1.9235,-3.0438,-4.6574;1.9926,-3.5489,-5.6173;0.6809,-2.772,-4.0826;-0.2357,-3.0636,-4.5869;0.6292,-2.128,-2.851;-0.3212,-1.9228,-2.37;1.7823,-1.7168,-2.1621;1.5396,-1.027,-0.8275;0.4596,-1.2704,-0.2678;2.5429,-0.0932,-0.3494;3.4115,0.0166,-0.98;2.4916,0.6433,0.8163;3.4937,1.5176,1.1289;3.564,2.3097,2.3639;2.8589,3.1548,2.333;3.3129,1.7033,3.239;5.0153,2.8059,2.3932;5.1266,3.7307,2.967;5.6603,2.0456,2.8497;5.3566,2.9674,0.9041;6.4318,2.9623,0.7034;4.9512,3.9133,0.5254;4.6333,1.7841,0.2439;5.2766,0.8949,0.174;4.2864,2.0165,-0.7698;1.3694,0.5483,1.82;1.725,0.0961,2.7553;0.5728,-0.0711,1.4137;0.9787,1.543,2.066)|\",4.28307200687\r\n\"[H]OC1=C(C(=O)/C([H])=C(\\[H])N(C([H])([H])[H])C([H])([H])[H])C([H])=C([H])C(C([H])([H])[H])=C1[H] 
|(1.6383,-3.2331,-0.2879;2.5555,-2.9563,-0.4394;2.6372,-1.6048,-0.2157;3.8744,-0.9381,-0.3422;5.2116,-1.5307,-0.7332;6.2205,-0.8301,-0.5753;5.2909,-2.8725,-1.3045;4.3878,-3.4455,-1.4479;6.5227,-3.3587,-1.6272;7.3812,-2.7279,-1.4094;6.8173,-4.5466,-2.2177;5.762,-5.4993,-2.5149;6.1507,-6.2706,-3.1855;4.9316,-4.9905,-3.0136;5.3745,-5.9848,-1.6065;8.1829,-5.0445,-2.2243;8.4071,-5.5245,-3.1842;8.3608,-5.7783,-1.4232;8.8747,-4.2105,-2.0847;3.8774,0.439,-0.0617;4.8344,0.9432,-0.1423;2.7344,1.1349,0.307;2.7898,2.2012,0.5111;1.5078,0.4669,0.4202;0.2424,1.2051,0.7862;0.4593,2.0868,1.398;-0.4518,0.5668,1.3435;-0.2857,1.5539,-0.1116;1.4846,-0.9039,0.1599;0.546,-1.4515,0.2533)|\",4.326610222949999\r\n\"[H]OC1=C([H])C([H])=C([H])C([H])=C1C(=O)/C([H])=C(\\[H])C1=C([H])C([H])=C([H])C(C([H])([H])[H])=C1[H] |(3.2399,-6.3709,-3.0903;3.6638,-5.8892,-2.3628;4.753,-6.6053,-1.943;5.0839,-7.8106,-2.5759;4.4776,-8.1541,-3.4131;6.1677,-8.566,-2.1409;6.4097,-9.4966,-2.6473;6.9316,-8.129,-1.0556;7.7762,-8.7145,-0.7054;6.5958,-6.9365,-0.4273;7.1611,-6.5767,0.4258;5.5194,-6.1382,-0.8535;5.2885,-4.8693,-0.0724;5.7913,-4.7666,1.0464;4.5245,-3.7491,-0.6708;4.167,-3.8608,-1.6857;4.2939,-2.6324,0.05;4.6799,-2.6262,1.0683;3.5801,-1.4271,-0.3711;3.4045,-0.3881,0.5624;3.8005,-0.5057,1.5679;2.7303,0.7761,0.2078;2.5978,1.5705,0.9374;2.2232,0.9253,-1.0839;1.6986,1.838,-1.3567;2.3845,-0.0887,-2.039;1.8448,0.0692,-3.4428;1.312,1.0177,-3.5619;1.1486,-0.7393,-3.6974;2.6514,0.045,-4.1861;3.0618,-1.2521,-1.6671;3.1914,-2.0413,-2.4034)|\",4.155178497135\r\n\"[H]O/C(=N/NC([H])C1=C([H])C([H])=C([H])O1)C(=O)N([H])C([H])([H])C([H])(C([H])([H])[H])C([H])([H])[H] 
|(5.3303,4.0669,-3.1638;4.8764,3.7029,-3.9641;4.5318,2.4659,-3.6019;3.8964,1.744,-4.4623;3.5557,0.4673,-4.0344;2.9371,-0.2008,-4.9505;2.744,0.2486,-5.9271;2.4752,-1.5455,-4.7365;2.5011,-2.4299,-3.6834;2.9253,-2.2407,-2.708;1.8603,-3.619,-4.133;1.6965,-4.5286,-3.5724;1.4868,-3.3828,-5.426;0.9783,-3.974,-6.1724;1.8492,-2.1345,-5.8094;4.9963,2.1894,-2.164;5.5637,3.1271,-1.5858;4.7746,0.9763,-1.6398;4.2707,0.3282,-2.2461;5.166,0.6328,-0.2776;5.4217,-0.4342,-0.2649;6.0713,1.1997,-0.0433;4.08,0.9251,0.7773;3.8511,1.9973,0.7091;2.7939,0.1339,0.5059;2.0216,0.3755,1.2448;2.9784,-0.9479,0.5634;2.3847,0.357,-0.4859;4.631,0.6357,2.1801;3.8888,0.8766,2.9493;5.5317,1.2253,2.3876;4.8908,-0.4254,2.292)|\",3.991910186835\r\n\"[H]OC(=O)C([H])([H])[C@@]1([H])C([H])N=C(N2C([H])([H])C([H])([H])C([H])([H])C2([H])[H])NC1=O |(1.9363,5.5867,0.763;2.0175,6.5313,0.4512;1.9279,6.586,-0.8801;2.0133,7.6269,-1.4926;1.715,5.2464,-1.603;1.6028,5.4937,-2.6619;2.6196,4.6352,-1.496;0.4904,4.4237,-1.1183;-0.3006,5.1316,-0.8122;-0.1041,3.6104,-2.2343;-0.1326,4.0616,-3.2297;-0.592,2.4337,-2.1129;-0.5207,1.8758,-0.813;-1.2098,0.7388,-0.6786;-1.9685,0.0648,-1.7502;-2.9029,0.607,-1.9427;-1.3933,0.053,-2.6768;-2.2371,-1.3268,-1.1598;-3.134,-1.7871,-1.5832;-1.3889,-1.9904,-1.3658;-2.3443,-1.0583,0.3512;-2.1866,-1.9525,0.9601;-3.3359,-0.6589,0.5957;-1.2705,0.0096,0.6012;-0.2873,-0.4265,0.816;-1.5022,0.7011,1.4147;0.1663,2.3323,0.221;0.7817,3.5408,0.1093;1.5471,3.9582,0.9852)|\",4.337494776970001\r\n\"[H]OC1=NC([H])([H])N([H])[C@@]2([H])C([H])([H])N(C([H])([H])C3=C([H])C([H])=C([H])C([H])=C3[H])C([H])([H])[C@]12[H] 
|(4.245,-2.3138,-4.1624;4.229,-3.2803,-4.2363;4.2213,-3.8151,-2.9805;4.2295,-5.0787,-2.8742;4.1989,-5.7129,-1.5719;3.2303,-6.2439,-1.512;4.9835,-6.479,-1.5502;4.4054,-4.8217,-0.402;4.099,-5.3174,0.4347;3.6392,-3.6064,-0.6056;2.5814,-3.8201,-0.8628;3.6426,-2.5165,0.4691;4.5699,-2.5468,1.0583;2.7967,-2.6259,1.1618;3.5549,-1.249,-0.3054;2.5826,-0.2778,0.1718;1.5391,-0.601,-0.0194;2.6917,-0.2172,1.263;2.7882,1.1062,-0.4198;1.7022,1.8512,-0.8908;0.7046,1.4177,-0.8641;1.8841,3.142,-1.3932;1.0283,3.7063,-1.7548;3.1615,3.7001,-1.4382;3.3071,4.7021,-1.8331;4.2543,2.9604,-0.9775;5.2533,3.3879,-1.0117;4.0685,1.6746,-0.4718;4.9158,1.0921,-0.1204;3.4928,-1.55,-1.7467;2.456,-1.6693,-2.1153;3.9578,-0.7387,-2.3192;4.253,-2.8771,-1.8009;5.3057,-2.674,-1.5599)|\",5.71983313751\r\n\"[H]C1=NC(=O)C([H])=C(C([H])([H])[H])[C@]1([H])Cl |(5.222,-2.0689,0.0421;4.6971,-1.1113,0.0009;5.3954,-0.0513,-0.1219;4.7087,1.2088,-0.1657;5.3536,2.2309,-0.2948;3.2342,1.2304,-0.0605;2.7806,2.2159,-0.1153;2.4868,0.1217,0.074;0.9877,0.1352,0.1491;0.6001,1.1523,0.0482;0.6398,-0.2798,1.102;0.5501,-0.4833,-0.6458;3.1925,-1.2009,0.1024;2.8133,-1.8644,-0.6835;2.8229,-2.1291,1.6574)|\",4.454503732685001\r\n\"[H]C1=NC(=S)[C@@]([H])(N(=O)=O)C([H])=C1Br |(-0.4606,2.2323,-0.725;-0.2908,1.2158,-0.3724;-1.3023,0.4139,-0.3357;-1.1084,-0.8843,0.1139;-2.3338,-1.7391,0.7857;0.292,-1.477,-0.0222;0.3887,-2.3493,0.6208;0.4056,-2.091,-1.4703;0.0692,-3.2584,-1.576;0.7875,-1.3504,-2.3634;1.3736,-0.4643,0.1768;2.3791,-0.807,0.3973;1.0679,0.8322,0.023;2.3522,2.2227,0.2138)|\",2.91706047736\r\n\"[H]C1=NC(=O)[C@@]([H])(F)C([H])=C1N(=O)=O |(-0.44,2.3199,0.6562;-0.2904,1.2868,0.3484;-1.3158,0.5138,0.2616;-1.1059,-0.8096,-0.2223;-1.9668,-1.4126,-0.8076;0.2702,-1.4263,0.0904;0.2649,-1.6852,1.1653;0.4752,-2.5755,-0.6311;1.3739,-0.4384,-0.1345;2.3713,-0.7871,-0.3812;1.0845,0.8576,0.041;2.1313,1.8796,-0.0903;1.7731,3.0516,0.0191;3.2802,1.499,-0.2945)|\",3.92932400122\r\n\"[H]OB(O[H])[C@@]1([H])C([H])=NC(=O)C([H])=C1[H] 
|(0.2383,-0.7922,2.3688;1.1645,-0.6579,2.6156;1.975,-0.3412,1.5752;3.289,-0.134,1.8666;3.8327,0.0649,1.0939;1.3987,-0.1824,0.0524;2.2461,-0.3214,-0.6355;0.375,-1.2448,-0.2014;0.7566,-2.2599,-0.3537;-0.9003,-1.1031,-0.211;-1.4453,0.2139,-0.047;-2.6428,0.3525,0.1191;-0.5184,1.3656,-0.1081;-0.9765,2.3485,-0.1557;0.8118,1.1893,-0.0733;1.4907,2.0398,-0.0888)|\",4.938866386575001\r\n\"[H]C1=C([H])C([H])=C(C([H])([H])N([H])C([H])([H])/C([H])=C(\\[H])C([H])([H])[H])O1 |(8.1982,6.2415,0.9921;8.2373,5.4414,0.2686;8.7233,5.338,-0.9971;9.2088,6.1183,-1.5665;8.4566,3.9916,-1.4157;8.697,3.5431,-2.3698;7.8326,3.3725,-0.374;7.289,2.0022,-0.1512;7.7667,1.5603,0.7462;7.583,1.3761,-1.0024;5.8251,2.0042,-0.0609;5.5433,2.6802,0.648;5.2785,0.692,0.2956;5.5777,-0.0072,-0.5007;5.696,0.2869,1.2384;3.7794,0.7478,0.3831;3.2704,1.1052,-0.5128;3.0691,0.411,1.4629;3.6006,0.0667,2.3527;1.5715,0.4556,1.5652;1.158,-0.5387,1.7832;1.115,0.8165,0.6375;1.2484,1.1136,2.3833;7.6932,4.2547,0.6684)|\",6.26950311552\r\n\"[H]/N=C(/O[H])N([H])/N=C(\\[H])C(=O)/C([H])=C(\\[H])N1C([H])([H])C([H])([H])C([H])([H])C([H])([H])C1([H])[H] |(0.1,-7.4104,2.4958;-0.2822,-8.2796,2.8681;-1.4757,-8.1038,3.2631;-2.2119,-9.1094,3.7999;-1.6256,-9.8859,3.7863;-2.2512,-6.9502,3.2487;-3.208,-7.0002,3.5923;-1.7475,-5.8073,2.7498;-2.4991,-4.766,2.7328;-3.5304,-4.7718,3.1055;-2.0362,-3.4508,2.2004;-2.852,-2.5204,2.2244;-0.6784,-3.355,1.69;-0.0535,-4.2381,1.7281;-0.2701,-2.1606,1.1739;-1.0115,-1.3636,1.1643;0.9427,-1.801,0.6852;2.056,-2.7417,0.5663;1.8735,-3.5824,1.2373;2.0918,-3.1403,-0.4612;3.3895,-2.0658,0.9134;3.3957,-1.8278,1.9852;4.2052,-2.7765,0.7341;3.5967,-0.7797,0.1015;3.7096,-1.0316,-0.9634;4.5238,-0.2806,0.4073;2.3969,0.1613,0.2775;2.4997,1.0486,-0.3594;2.3477,0.5099,1.3173;1.0888,-0.5564,-0.0718;1.0721,-0.7806,-1.1523;0.2235,0.0768,0.1429)|\",4.03816954142\r\n\"[H]/N=C(/O[H])N([H])/N=C(\\[H])C(=O)/C([H])=C(\\[H])N1C([H])([H])C([H])([H])OC([H])([H])C1([H])[H] 
|(1.5522,2.6164,3.538;2.2179,3.1482,4.0986;3.3904,3.0257,3.6289;4.461,3.6374,4.1943;4.101,4.1238,4.9565;3.8297,2.3022,2.525;4.8195,2.3241,2.2884;2.9513,1.6042,1.786;3.3891,0.9519,0.7693;4.4467,0.9411,0.4803;2.4926,0.1449,-0.1078;3.0181,-0.4467,-1.058;1.0713,0.1131,0.2073;0.7263,0.6698,1.069;0.2459,-0.6041,-0.6041;0.7073,-1.0962,-1.4584;-1.0899,-0.8213,-0.4828;-1.8601,-1.3422,-1.613;-2.2626,-0.511,-2.2133;-1.1963,-1.9344,-2.2506;-3.0206,-2.1965,-1.1048;-2.6305,-3.1053,-0.618;-3.6702,-2.4911,-1.934;-3.827,-1.4603,-0.1958;-3.0707,-1.0603,0.938;-3.7561,-0.518,1.5955;-2.6951,-1.9473,1.4741;-1.8934,-0.165,0.5456;-1.2695,0.0198,1.4232;-2.2647,0.8038,0.1761)|\",4.07626548049\r\n\"[H]/N=C(/O[H])N([H])/N=C(\\[H])C(=O)/C([H])=C(\\[H])N1C([H])([H])C([H])([H])C([H])([H])C1([H])[H] |(2.8608,5.1147,-4.3525;2.8562,5.9876,-4.8801;1.9025,6.7317,-4.495;1.677,7.9532,-5.0414;2.3722,8.0608,-5.7137;0.9428,6.5034,-3.516;0.254,7.2289,-3.3281;0.9573,5.3566,-2.8134;0.0632,5.1867,-1.9069;-0.706,5.9361,-1.6846;-0.0127,3.9503,-1.0743;-0.9251,3.8936,-0.2397;0.9766,2.9102,-1.2953;1.7338,3.0637,-2.0544;0.9109,1.7744,-0.5409;0.1119,1.7073,0.1951;1.7294,0.7057,-0.6025;1.5953,-0.4767,0.2566;1.9724,-0.2696,1.2696;0.5446,-0.7729,0.3433;2.4637,-1.5254,-0.4506;2.833,-2.2935,0.2349;1.8833,-2.0232,-1.2363;3.5885,-0.6787,-1.0738;4.0943,-1.1792,-1.9041;4.3447,-0.4428,-0.316;2.87,0.6067,-1.5169;2.5148,0.5456,-2.5558;3.5056,1.4972,-1.4382)|\",4.01095815637\r\n\"[H]/N=C(/O[H])N([H])/N=C(C(\\[H])=N\\N([H])/C(=N\\[H])O[H])/C([H])=C(\\[H])C#N 
|(-3.0873,0.8513,-0.1933;-3.2624,0.5473,-1.1514;-2.1715,0.3506,-1.7647;-2.12,-0.057,-3.0552;-3.0464,-0.1453,-3.3399;-0.8568,0.4917,-1.3095;-0.0763,0.297,-1.9373;-0.6325,0.8789,-0.061;0.5974,1.0127,0.3731;1.8245,0.7776,-0.3855;2.7667,0.9366,0.1519;1.8327,0.4013,-1.6237;3.0168,0.212,-2.2271;3.8889,0.3532,-1.7218;3.0845,-0.1884,-3.5605;2.1592,-0.4267,-4.3934;1.2307,-0.2957,-3.996;4.3938,-0.3022,-3.8909;4.4007,-0.5815,-4.8233;0.773,1.4404,1.7637;1.8004,1.5575,2.1002;-0.222,1.6945,2.6388;-1.2612,1.5883,2.3417;0.0335,2.1082,3.9759;0.2267,2.4488,5.0733)|\",3.4313556548049995\r\n\"[H]/N=C(/S[H])N([H])/N=C(\\[H])C(=O)/C([H])=C(\\[H])C#N |(-4.4069,0.1626,-1.4478;-5.1786,-0.0213,-0.801;-4.7487,-0.1844,0.3779;-5.8469,-0.5232,1.7555;-6.9505,-0.4567,0.9877;-3.4118,-0.138,0.8162;-3.1914,-0.2655,1.8033;-2.4341,0.0911,-0.0595;-1.2167,0.1384,0.3636;-0.9395,0.0011,1.414;-0.0977,0.3887,-0.5674;1.0487,0.4338,-0.1341;-0.4098,0.5759,-2.0127;-1.4435,0.5239,-2.3347;0.5921,0.8028,-2.8795;1.6154,0.847,-2.512;0.386,0.9917,-4.2769;0.238,1.1482,-5.4209)|\",4.076265480490001\r\n\"[H]OC1=N[C@]([H])(Cl)N([H])[C@@]([H])(O[H])C1([H])[H] |(-3.2115,-0.0602,0.1396;-2.6513,-0.8515,0.1691;-1.3503,-0.4905,0.0774;-0.4882,-1.4255,0.1245;0.8813,-1.0772,0.0119;1.4366,-1.8462,-0.5201;1.6469,-1.2822,1.7948;1.1719,0.2062,-0.5147;2.1692,0.4054,-0.4818;0.4151,1.3059,0.0754;0.6328,1.4231,1.145;0.7977,2.5225,-0.513;0.7066,2.4126,-1.4757;-1.071,0.9845,-0.1294;-1.3555,1.2441,-1.1598;-1.6819,1.5986,0.5433)|\",6.291272223560001\r\n\"[H]C1C([H])=C2N=C(C([H])([H])[H])NC(=O)C2=C([H])[C@]1([H])F 
|(7.2427,3.1999,0.1223;6.6654,2.2812,0.0649;5.3237,2.2878,0.0706;4.7476,3.2056,0.1352;4.5634,1.0403,0.0265;3.2633,1.0827,0.0224;2.5878,-0.1544,0.0149;1.0963,-0.0072,-0.0408;0.7498,0.5697,0.8251;0.6146,-0.9854,-0.055;0.8121,0.5667,-0.931;3.0981,-1.3488,0.0576;4.4934,-1.4866,0.1021;5.0253,-2.5819,0.1734;5.3005,-0.2249,0.0428;6.6471,-0.2374,0.0512;7.1815,-1.1824,0.1129;7.4546,1.0142,-0.0882;7.905,1.0078,-1.0976;8.5225,0.9989,0.8125)|\",3.376932884705\r\n\"[H]C(NO)C1=C([H])C2=C(O1)C([H])[C@]([H])(Br)/C([H])=N\\2 |(-4.2383,1.069,-0.1469;-3.7144,0.1164,-0.1148;-4.4305,-1.0809,-0.135;-5.6555,-0.9308,-0.1901;-2.356,0.0175,-0.0519;-1.5053,-1.1459,0.008;-1.8653,-2.1638,0.0161;-0.2235,-0.6884,0.0617;-0.2746,0.7612,0.0282;-1.5841,1.1612,-0.0326;0.8166,1.5426,0.0763;0.78,2.6255,0.0691;2.1102,0.8343,0.1053;2.8247,1.245,0.8199;3.1718,1.1951,-1.6515;2.037,-0.6639,0.1998;2.9908,-1.1826,0.2885;0.9701,-1.3862,0.1662)|\",2.217727881575\r\n\"[H]C1=NC([H])=C([H])C(=O)/C1=N\\C(=O)C([H])([H])Cl |(1.3274,-0.3631,-2.2118;0.9112,-0.202,-1.2172;1.2398,0.8578,-0.565;0.6864,1.0478,0.7145;1.0093,1.9684,1.1932;-0.1633,0.2054,1.3477;-0.5442,0.414,2.342;-0.5785,-1.032,0.6965;-1.2993,-1.8859,1.1967;-0.0174,-1.2283,-0.705;-0.3014,-2.2352,-1.4367;-1.2551,-3.2178,-1.1208;-2.3963,-3.1273,-1.5182;-0.7728,-4.5088,-0.4667;-0.8971,-5.3247,-1.1817;-1.4056,-4.6905,0.4022;0.9453,-4.5193,0.0836)|\",3.0857710646700003\r\n\"[H]C1=NC([H])=C([H])C(=O)/C1=N/C(=O)C([H])([H])[H] |(0.5015,-1.6396,0.1238;1.1356,-2.5261,0.1391;0.5667,-3.6763,0.0529;1.373,-4.8291,0.0913;0.8098,-5.7556,0.0197;2.7188,-4.8409,0.2044;3.2847,-5.7668,0.2234;3.4698,-3.5839,0.2936;4.6854,-3.5238,0.3769;2.6002,-2.3275,0.2684;3.1715,-1.1861,0.3507;2.4936,0.047,0.4202;1.8595,0.3709,1.4056;2.7714,0.9516,-0.7604;3.8294,1.237,-0.7609;2.1505,1.8456,-0.6788;2.5737,0.4382,-1.708)|\",3.099376757194999\r\n\"[H]C1=C([H])C(=O)C(=C=O)C(Cl)=N1 
|(-0.4095,2.3806,0.0002;-0.2667,1.3036,-0.0003;-1.3325,0.4627,-0.0005;-2.3499,0.8381,-0.0004;-1.1592,-0.9853,0.0001;-2.0591,-1.8169,0.0011;0.2959,-1.3674,-0.0003;0.5918,-2.6805,-0.0003;0.7936,-3.8149,-0.0005;1.3112,-0.3428,-0.0005;2.9925,-0.8715,-0.0007;1.0677,0.9146,-0.0008)|\",4.26130289883\r\n\"[H]C1=NC([H])=C([H])C(=O)[C@@]1([H])C#C[Si](C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] |(4.6258,-4.4057,-1.3217;4.2776,-4.333,-2.3535;4.0489,-5.4213,-2.9891;3.5616,-5.3192,-4.3042;3.4436,-6.2789,-4.8013;3.2071,-4.1725,-4.9257;2.7607,-4.1761,-5.9152;3.3153,-2.881,-4.2395;2.854,-1.8421,-4.6726;4.1424,-2.9313,-2.9264;5.1654,-2.6783,-3.2743;3.7343,-1.9511,-1.9271;3.3707,-1.1558,-1.081;2.8008,0.0861,0.1614;0.9446,-0.1474,0.4324;0.5577,0.5999,1.136;0.7231,-1.1401,0.84;0.3933,-0.0419,-0.5085;3.1598,1.8149,-0.5126;2.8317,2.583,0.1985;2.6364,1.9867,-1.4596;4.2315,1.9574,-0.6905;3.7451,-0.202,1.7752;3.4228,0.5124,2.5426;4.824,-0.0747,1.6321;3.5741,-1.212,2.1645)|\",4.38103299305\r\n\"[H]OC(=O)C1=C([H])[C@]([H])(OC([H])([H])[H])C(=O)N=C1[H] |(-0.3094,1.5331,-3.0575;-0.0772,2.0996,-3.8118;0.9661,2.9056,-3.4953;1.2918,3.81,-4.2263;1.6972,2.6065,-2.2229;1.7787,1.3903,-1.6528;1.2843,0.5012,-2.0391;2.6639,1.1829,-0.4537;3.6808,0.9582,-0.8454;2.1681,0.106,0.2914;3.104,-0.4729,1.199;2.6126,-1.3527,1.6199;3.3651,0.2232,2.0005;4.0159,-0.7904,0.67;2.8035,2.5039,0.3286;2.7622,2.5637,1.5347;3.0296,3.6696,-0.4556;2.4933,3.7003,-1.6252;2.6179,4.5998,-2.2287)|\",3.84224756906\r\n\"[H]C1=C([H])[C@@]([H])(OC([H])([H])[H])C(OC([H])([H])[H])=N/C1=C(\\[H])NO 
|(6.1617,-2.2671,2.1286;5.1991,-2.03,1.6913;5.0156,-0.9618,0.9038;5.83,-0.2878,0.6547;3.6829,-0.6382,0.2949;3.3247,0.3352,0.6753;3.852,-0.5575,-1.117;3.0118,0.3773,-1.7828;3.3587,0.4118,-2.8184;3.1049,1.3812,-1.3408;1.9594,0.0739,-1.7605;2.6522,-1.6926,0.6801;1.4544,-1.4389,0.1451;0.403,-2.3896,0.4007;-0.469,-1.9965,-0.122;0.2144,-2.4676,1.474;0.6796,-3.3738,0.0155;2.8331,-2.7267,1.4234;4.0925,-2.9422,1.9751;4.2436,-4.044,2.7758;3.4189,-4.7261,2.9735;5.4847,-4.3023,3.3558;5.5076,-5.3211,4.0565)|\",2.7510710285550006\r\n\"[H]C1=NC([H])=C(N([H])[H])C(=O)[C@@]1([H])OC([H])([H])[H] |(2.5343,-1.4982,1.8848;2.6847,-2.0209,0.9397;2.7718,-3.3003,0.9345;2.9365,-3.9649,-0.2842;3.0436,-5.0441,-0.1966;2.8953,-3.3758,-1.5161;2.9838,-4.0213,-2.731;2.6837,-4.9861,-2.77;2.6625,-3.4523,-3.5065;2.6534,-1.9136,-1.5943;2.3714,-1.3649,-2.6511;2.8659,-1.155,-0.2818;3.961,-0.9396,-0.3022;2.1321,0.0309,-0.1472;2.644,1.1469,-0.8704;2.0429,2.0076,-0.5673;2.5615,0.9984,-1.9505;3.6963,1.336,-0.6052)|\",3.68442153577\r\n\"[H]C1=C([H])C(F)=C([C@@]2([H])N([H])N([H])[C@]([H])(N([H])[H])C2([H])[H])C([H])=C1[H] |(-2.0492,-1.7069,0.2092;-1.3824,-0.8545,0.1163;-0.1562,-0.8609,0.7798;0.1618,-1.6977,1.3929;0.6842,0.2379,0.6465;1.8707,0.2096,1.2992;0.3579,1.3611,-0.1175;1.3166,2.5225,-0.2439;2.1974,2.3094,0.372;1.79,2.7126,-1.6363;0.9888,2.6037,-2.2591;2.1975,4.0935,-1.7512;3.2155,4.1166,-1.7736;1.7645,4.859,-0.5548;1.2734,5.7885,-0.8677;2.9377,5.2534,0.2258;3.2837,4.4666,0.7759;2.6948,5.9918,0.8842;0.7619,3.9102,0.1399;0.7103,4.0833,1.2197;-0.2461,4.0417,-0.2721;-0.884,1.3373,-0.7695;-1.1835,2.1922,-1.3712;-1.7466,0.2467,-0.6611;-2.7005,0.2584,-1.1801)|\",5.287172115215\r\n\"[H]C1=C([H])C([H])=C([C@@]2([H])N([H])N([H])[C@]([H])(N([H])[H])C2([H])[H])C(C([H])([H])[H])=C1[H] 
|(5.1893,-2.1625,-0.0953;4.5979,-1.2542,-0.1775;5.0915,-0.1507,-0.8673;6.0733,-0.1837,-1.3321;4.3158,1.0065,-0.9596;4.7104,1.8567,-1.507;3.0467,1.0855,-0.3741;2.2129,2.3523,-0.4495;1.1787,2.065,-0.696;2.1492,3.0488,0.8618;3.1034,3.0971,1.2227;1.7584,4.4128,0.588;0.7863,4.5233,0.8709;1.8293,4.6766,-0.8717;2.3614,5.6184,-1.0514;0.4795,4.859,-1.4083;0.0201,3.958,-1.5434;0.5178,5.3122,-2.3199;2.634,3.473,-1.4145;2.4133,3.2709,-2.4679;3.7074,3.6763,-1.3223;2.541,-0.0385,0.3221;1.169,-0.0249,0.958;1.0212,-0.9119,1.5815;0.3754,-0.025,0.1975;1.0268,0.8679,1.5742;3.3326,-1.1891,0.4075;2.946,-2.0521,0.9451)|\",5.676294921429999\r\n\"[H]C1=C([H])C([H])=C([C@@]([H])(N([H])[H])C2([H])[C@@]([H])(C([H])([H])[H])N([H])N([H])[C@]2([H])C([H])([H])[H])C([H])=C1[H] |(5.6353,-6.6134,2.0136;5.1893,-5.721,1.583;5.5812,-5.2772,0.3198;6.336,-5.8226,-0.2408;5.0066,-4.128,-0.2265;5.3188,-3.7888,-1.2127;4.0335,-3.3988,0.4717;3.3586,-2.1953,-0.1935;3.9732,-1.9094,-1.0558;2.0344,-2.5527,-0.752;1.4267,-2.8639,0.0073;2.1464,-3.3666,-1.357;3.2232,-0.9642,0.7243;2.5794,-1.2422,1.5735;2.5944,0.3123,0.1082;3.078,0.5142,-0.8601;1.0808,0.3263,-0.078;0.7813,-0.4063,-0.8285;0.751,1.3243,-0.3872;0.5707,0.0798,0.8643;2.997,1.3821,1.0402;2.3536,1.3624,1.834;4.2901,1.032,1.5706;4.9618,1.5821,1.0383;4.5604,-0.4179,1.318;4.7563,-0.9114,2.2781;5.8095,-0.5733,0.4431;6.1255,-1.6166,0.359;6.64,-0.0165,0.8947;5.6508,-0.1741,-0.5667;3.6478,-3.8605,1.7396;2.893,-3.3216,2.3074;4.2194,-5.0075,2.2907;3.9087,-5.3428,3.2769)|\",5.461324979535\r\n\"[H]OC1=N[C@]([H])(C2=C([H])C([H])=C(C([H])([H])[H])C([H])=C2[H])N([H])[C@@]([H])(C([H])([H])[H])[C@@]1([H])C([H])([H])C([H])([H])[H] 
|(3.897,-2.0234,-0.2445;3.7381,-2.318,-1.1538;2.9945,-1.3686,-1.8001;2.713,-1.6058,-3.0138;1.9457,-0.655,-3.7849;1.0863,-1.2017,-4.1976;2.7615,-0.1316,-4.9728;4.1585,-0.1323,-4.9949;4.7026,-0.5955,-4.1782;4.8546,0.4209,-6.0726;5.9426,0.4037,-6.0687;4.1826,0.9835,-7.1624;4.9351,1.591,-8.3238;6.0029,1.3547,-8.2753;4.8393,2.6848,-8.3356;4.5536,1.2256,-9.2847;2.7799,0.967,-7.1446;2.2283,1.3798,-7.987;2.084,0.4175,-6.0714;0.9949,0.404,-6.0927;1.3748,0.4436,-2.9656;1.047,1.1797,-3.589;2.3407,0.9921,-2.0121;3.3092,1.2235,-2.4946;1.8085,2.2906,-1.4074;2.4521,2.6379,-0.5909;0.7916,2.1661,-1.0231;1.7861,3.0796,-2.1688;2.6134,-0.1359,-0.9886;3.49,0.1526,-0.3853;1.4247,-0.4009,-0.0314;1.2249,0.5254,0.5201;0.5388,-0.604,-0.6417;1.6332,-1.5317,0.9841;0.7766,-1.5943,1.664;2.5218,-1.357,1.6081;1.7407,-2.5082,0.5012)|\",6.049090896615\r\n\"[H]OC1=N[C@]([H])(C2=C([H])C([H])=C(C([H])([H])[H])C([H])=C2[H])N([H])[C@@]([H])(O[H])[C@@]1([H])C([H])([H])C([H])([H])[H] |(3.2761,-1.8423,1.2868;4.0357,-1.4672,1.7573;3.9838,-0.1048,1.6628;4.908,0.5382,2.2457;4.9261,1.98,2.2265;5.8974,2.2794,1.8098;4.848,2.544,3.6486;5.3667,3.8203,3.904;5.8752,4.3633,3.1085;5.2564,4.4008,5.1659;5.6749,5.39,5.3394;4.6275,3.7231,6.2192;4.4854,4.3618,7.5814;4.4025,3.6074,8.3708;3.586,4.9901,7.6356;5.342,5.0034,7.8153;4.1262,2.4413,5.9636;3.6519,1.884,6.7688;4.2347,1.8573,4.7007;3.8702,0.8481,4.538;3.895,2.594,1.3388;3.7504,3.5593,1.6336;2.6052,1.9224,1.3692;2.1458,1.8851,2.3699;1.686,2.6566,0.5886;2.161,2.8975,-0.2257;2.8241,0.4762,0.8692;1.9142,-0.096,1.1064;3.0553,0.407,-0.6601;2.1881,0.8789,-1.1374;3.933,1.0146,-0.9072;3.2222,-1.0058,-1.2323;3.2949,-0.969,-2.3245;2.3583,-1.6411,-0.991;4.1258,-1.4979,-0.8585)|\",6.160657575320001\r\n\"[H]C([H])([H])C1N=C(C([H])([H])C([H])([H])C([H])([H])[H])NC(=O)[C@@]1([H])C([H])([H])C([H])([H])[H] 
|(7.9568,2.732,-0.3445;7.1561,2.7779,-1.0935;6.4414,3.5555,-0.8175;7.6296,3.0474,-2.0462;6.4583,1.4517,-1.2018;5.1831,1.4081,-1.0219;4.5512,0.1425,-1.1286;3.0516,0.2292,-1.118;2.7453,0.9876,-1.8515;2.6442,-0.7395,-1.421;2.5047,0.6325,0.2688;2.8184,-0.1198,1.004;2.9648,1.5815,0.5679;0.979,0.761,0.275;0.6119,1.0344,1.2706;0.4989,-0.1827,-0.0103;0.6426,1.5321,-0.4288;5.1181,-1.0208,-1.1789;6.5225,-1.0664,-1.1983;7.1236,-2.1091,-1.0237;7.281,0.2243,-1.5132;8.1747,0.23,-0.8765;7.7976,0.1962,-2.988;8.4884,1.0362,-3.1298;8.3896,-0.7208,-3.0819;6.7063,0.2293,-4.0621;7.1596,0.1973,-5.0586;6.1013,1.1416,-4.0031;6.0303,-0.628,-3.9774)|\",4.247697206305\r\n\"[H]OC1=NC(C([H])([H])C([H])([H])C([H])([H])[H])=NC(=O)[C@@]1([H])C([H])([H])C([H])([H])C([H])([H])[H] |(4.773,-1.2517,0.9791;5.3802,-1.0267,0.2545;5.5031,0.309,0.1696;6.3082,0.77,-0.7171;6.419,2.1687,-0.7844;7.1839,2.65,-1.9879;7.2881,3.7362,-1.9158;8.1863,2.2032,-1.9383;6.5326,2.2437,-3.3274;7.2224,2.5211,-4.1343;6.4363,1.1523,-3.3555;5.1707,2.9008,-3.5717;4.7624,2.6051,-4.5446;5.2476,3.9944,-3.5566;4.4402,2.6134,-2.8058;5.9729,3.0412,0.0704;5.2592,2.5728,1.1776;5.061,3.2623,2.1602;4.6619,1.1601,1.084;4.6604,0.7376,2.0974;3.1868,1.2111,0.5679;3.1691,1.7558,-0.3851;2.86,0.1863,0.3419;2.2118,1.8531,1.5622;2.2148,1.2712,2.4944;2.5655,2.8542,1.829;0.7869,1.928,1.0038;0.1046,2.376,1.7341;0.3995,0.9328,0.7523;0.7479,2.5395,0.0944)|\",4.590560657935001\r\n\"[H]OC1=NC(C([H])([H])C([H])([H])C([H])([H])[H])=NC(=O)[C@@]1([H])C([H])([H])C([H])([H])[H] 
|(8.0435,0.1409,2.5564;7.0943,0.2062,2.7514;6.4001,-0.4955,1.8425;5.1209,-0.4896,1.9569;4.4063,-1.2326,1.0072;2.9436,-1.3619,1.3253;2.5135,-2.1315,0.6781;2.8447,-1.6807,2.3715;2.1928,-0.0249,1.1412;2.6747,0.7394,1.7619;2.2982,0.2981,0.0975;0.7094,-0.1394,1.5024;0.1956,0.8177,1.3585;0.5776,-0.4348,2.5505;0.2039,-0.8868,0.8792;4.8534,-1.7471,-0.1012;6.2113,-1.6096,-0.3873;6.6613,-1.7949,-1.5019;7.1565,-1.2777,0.7909;7.3752,-2.2549,1.2577;8.4898,-0.6632,0.3156;9.2186,-0.6607,1.1415;8.8964,-1.3463,-0.4347;8.358,0.7382,-0.2925;9.3366,1.1073,-0.6165;7.9476,1.4627,0.4211;7.7031,0.7123,-1.1678)|\",4.70212733664\r\n\"[H]OC1=NC(C([H])([H])C([H])([H])C([H])([H])[H])=NC(=O)C1=C([H])[H] |(8.3278,-1.3631,3.2196;7.3699,-1.2751,3.3462;6.7662,-1.0576,2.1649;5.4781,-1.0552,2.1656;4.8379,-0.8546,0.9359;3.3404,-0.7974,1.0492;3.0843,-0.0836,1.8437;2.9345,-0.432,0.1019;2.7302,-2.1711,1.4028;2.9982,-2.8904,0.6181;3.1835,-2.5309,2.3338;1.2074,-2.1063,1.5497;0.7944,-3.0923,1.7905;0.7318,-1.7623,0.6234;0.9153,-1.417,2.3512;5.3769,-0.763,-0.2459;6.7664,-0.8096,-0.3477;7.3408,-0.8023,-1.4231;7.5607,-0.8357,0.945;8.8835,-0.6124,0.9122;9.3637,-0.4519,-0.0486;9.5187,-0.5587,1.7941)|\",3.74428658288\r\n\"[H]OC1=NC(C([H])([H])C([H])([H])[H])=NC(=O)[C@@]1([H])C([H])([H])C([H])([H])C([H])([H])[H] |(4.5083,-3.8153,-0.041;5.0609,-3.7417,-0.837;5.295,-2.4453,-1.1028;6.0507,-2.1774,-2.105;6.282,-0.8149,-2.3566;6.9535,-0.5561,-3.678;7.1889,0.509,-3.7404;7.8958,-1.12,-3.6876;6.0915,-1.0052,-4.8712;6.6312,-0.8414,-5.8096;5.1562,-0.4359,-4.9165;5.8464,-2.0686,-4.7949;6.0068,0.1923,-1.582;5.3658,-0.0695,-0.3672;5.3374,0.7408,0.5395;4.6231,-1.4088,-0.2411;4.6636,-1.7088,0.8142;3.1238,-1.2556,-0.6567;3.085,-0.8308,-1.668;2.6773,-2.2576,-0.7272;2.3045,-0.3948,0.3124;2.3234,-0.859,1.3083;2.7803,0.5846,0.4262;0.8544,-0.2269,-0.1531;0.2821,0.3742,0.5615;0.3492,-1.1956,-0.2544;0.8054,0.2769,-1.1258)|\",4.60416635046\r\n\"[H]OC1=NC(C([H])([H])C([H])([H])[H])=NC(=O)C1=C([H])[H] 
|(4.5589,-2.3402,4.6119;4.4265,-1.5514,4.0628;3.8801,-1.8957,2.8837;3.514,-0.9315,2.1121;2.9204,-1.2787,0.8919;2.5989,-0.0891,0.0293;3.5148,0.5067,-0.0787;2.2981,-0.4513,-0.9567;1.4984,0.7939,0.647;1.3087,1.6619,0.0072;1.798,1.1514,1.6365;0.5616,0.2353,0.7482;2.596,-2.4644,0.4635;2.9096,-3.5664,1.2583;2.5788,-4.7005,0.9574;3.7206,-3.3126,2.5154;4.2723,-4.3487,3.1657;4.0939,-5.3539,2.7954;4.9141,-4.2529,4.0392)|\",3.74972885989\r\n\"[H]OC(=N/C1([H])C([H])([H])C1([H])[H])/C([H])=C(\\[H])C1=C([H])C([H])=C([C@]2([H])C([H])([H])[C@@]2([H])C([H])([H])[H])O1 |(10.1497,2.1868,0.192;10.4316,2.0732,-0.7291;9.4779,1.3179,-1.3747;9.8855,0.6633,-2.3904;9.002,-0.061,-3.2287;7.9286,-0.0191,-3.0358;9.53,-1.3649,-3.7979;10.5274,-1.6552,-3.484;8.8276,-2.1861,-3.9145;9.397,-0.1645,-4.6911;8.6018,-0.1464,-5.4315;10.3047,0.3571,-4.9768;8.1175,1.3828,-0.8213;7.4446,0.5675,-1.0645;7.6674,2.3954,-0.0434;8.3125,3.2473,0.1644;6.3596,2.4983,0.5342;5.7684,3.4631,1.3143;6.2344,4.3824,1.6427;4.4448,3.0191,1.5936;3.6916,3.5267,2.1797;4.2982,1.8068,0.97;3.1789,0.8649,0.8774;2.3066,1.1895,1.4391;3.4168,-0.6376,0.8026;4.4491,-0.9747,0.8021;2.7309,-1.2745,1.3559;2.8997,0.0891,-0.4047;3.6294,0.2535,-1.1952;1.4741,-0.0973,-0.8771;1.387,-0.9788,-1.524;1.125,0.7729,-1.4456;0.7926,-0.2394,-0.0298;5.4562,1.4788,0.3236)|\",3.86945895411\r\n\"[H]OC(=N/C1([H])C([H])([H])C1([H])[H])/C([H])=C(\\[H])C1=C(C([H])([H])[H])C([H])=C(C([H])([H])[H])C([H])=C1[H] 
|(7.2991,-2.342,-2.0011;8.1123,-1.815,-1.9572;7.7718,-0.4947,-2.1385;8.7032,0.2657,-2.5614;8.5438,1.6686,-2.6802;7.6107,2.1302,-2.3525;9.2412,2.33,-3.8545;9.7742,1.6683,-4.5297;8.7411,3.1684,-4.3321;9.792,2.5068,-2.4684;9.6767,3.4681,-1.9751;10.6936,1.9611,-2.2095;6.3835,-0.1306,-1.8001;5.9991,0.7751,-2.259;5.6045,-0.839,-0.9545;6.0583,-1.6836,-0.4391;4.2062,-0.5564,-0.6136;3.6484,-1.0386,0.5983;4.4656,-1.8447,1.5835;3.8685,-2.1009,2.4634;5.349,-1.2944,1.9287;4.8254,-2.7859,1.1472;2.3154,-0.7437,0.8916;1.8951,-1.113,1.8255;1.4998,0.0065,0.0344;0.0684,0.3143,0.404;-0.4889,0.7106,-0.4506;0.0171,1.0617,1.2067;-0.455,-0.5792,0.764;2.0583,0.4633,-1.1643;1.4483,1.0327,-1.8615;3.3831,0.1832,-1.4801;3.7829,0.521,-2.432)|\",4.1878321591950005\r\n\"[H]C1=C2OC([H])([H])OC2=C([H])C(N(=O)=O)=C1/C([H])=N/OC([H])([H])C([H])([H])[H] |(6.6997,3.1061,-2.6238;7.0468,2.108,-2.8577;8.3464,1.8591,-3.227;9.3746,2.7481,-3.3451;10.5158,1.9915,-3.7787;11.3095,2.0705,-3.028;10.8522,2.3676,-4.7508;10.1141,0.6212,-3.9089;8.7906,0.5782,-3.5677;7.9483,-0.5037,-3.5346;8.265,-1.5036,-3.7966;6.6191,-0.2639,-3.1261;5.7753,-1.4629,-3.0555;6.1326,-2.447,-3.7063;4.7716,-1.4425,-2.3382;6.1369,1.0216,-2.7918;4.7386,1.3172,-2.4444;3.9681,0.5697,-2.5974;4.4476,2.4751,-1.9685;3.0845,2.5952,-1.7537;2.7927,3.8769,-1.1767;3.3328,3.9767,-0.2269;3.1424,4.6661,-1.8541;1.2895,3.9416,-0.9729;1.0149,4.9053,-0.5307;0.9532,3.1446,-0.3023;0.7621,3.8355,-1.9263)|\",3.703469505305\r\n\"[H]C1=NC(=O)[C@@]([H])(C(=O)N2C([H])([H])C([H])([H])C([H])([H])[C@@]([H])(C([H])([H])[H])C2([H])[H])C([H])=C1[H] 
|(6.3421,4.1407,-2.5713;5.642,3.8603,-1.7802;4.7027,3.0377,-2.1144;3.7711,2.6787,-1.1214;2.682,2.2351,-1.4357;4.1553,2.841,0.3659;3.2358,3.1355,0.8826;4.6803,1.4988,0.9693;5.892,1.3514,1.099;3.7679,0.5554,1.3438;4.2612,-0.703,1.9205;5.3473,-0.6368,1.9724;3.8688,-0.7863,2.9455;3.797,-1.904,1.089;4.2898,-1.8701,0.1087;4.1227,-2.828,1.5832;2.2728,-1.8951,0.9075;1.7864,-2.0702,1.8802;1.9598,-2.7165,0.2505;1.78,-0.5529,0.3397;2.2018,-0.4234,-0.6661;0.2526,-0.4877,0.2388;-0.1328,-1.309,-0.3759;-0.073,0.4536,-0.2184;-0.2146,-0.5669,1.2292;2.3104,0.6136,1.1951;1.8594,0.565,2.1995;2.0181,1.5506,0.7274;5.199,3.8978,0.5869;5.4073,4.1975,1.6105;5.8706,4.4169,-0.4507;6.6425,5.1695,-0.3241)|\",4.166063051155\r\n\"[H]/N=C(/O[H])N([H])/N=C(\\[H])C1=C(Cl)N=C2/C([H])=C([H])\\C([H])=C(\\[H])N21 |(-0.7514,4.7409,-0.193;-0.2557,5.6312,-0.2043;1.0021,5.446,-0.2087;1.874,6.4871,-0.2244;1.3141,7.283,-0.2318;1.7282,4.267,-0.2063;2.7429,4.3244,-0.1695;1.0953,3.0701,-0.1544;1.8401,2.0151,-0.1084;2.9332,2.0843,-0.1069;1.3245,0.6787,-0.0567;2.0152,-0.5379,0.0003;3.7435,-0.6888,0.0204;1.223,-1.6141,0.0379;-0.0214,-1.1163,0.0079;-1.26,-1.7845,0.0273;-1.2532,-2.8674,0.0736;-2.4234,-1.0489,-0.0118;-3.3855,-1.5508,0.0028;-2.3724,0.3683,-0.0713;-3.2852,0.9527,-0.1026;-1.1709,1.0246,-0.0899;-1.0462,2.0964,-0.1336;-0.0114,0.2876,-0.0499)|\",4.078986618995001\r\n\"[H]OC1=NC(=O)N(/N=C(\\[H])C2=C([H])C(O[H])=C([H])C([H])=C2[H])C1([H])[H] 
|(-5.5089,-6.7072,-0.7784;-4.6476,-7.1542,-0.7436;-3.6658,-6.2539,-0.6248;-2.4356,-6.5983,-0.5565;-1.6727,-5.3976,-0.4368;-0.4666,-5.3325,-0.3443;-2.5608,-4.3037,-0.4448;-2.3472,-2.9594,-0.339;-1.1474,-2.5007,-0.2232;-0.2765,-3.1518,-0.2062;-0.9227,-1.0549,-0.108;0.3973,-0.6042,0.0151;1.2262,-1.3059,0.0245;0.6699,0.7622,0.1276;1.9822,1.1293,0.2438;2.0346,2.0953,0.3092;-0.3778,1.6868,0.1183;-0.1673,2.7513,0.2059;-1.6957,1.2338,-0.0044;-2.5063,1.9577,-0.0107;-1.9769,-0.1217,-0.117;-2.9971,-0.4763,-0.212;-3.924,-4.7652,-0.5631;-4.5359,-4.4814,0.3044;-4.4138,-4.3872,-1.4712)|\",4.4980419487650005\r\n\"[H]OC1=NC(=O)N(/N=C(\\[H])C2=C([H])N=C([H])C([H])=C2[H])C1([H])[H] |(-5.829,-6.3447,-1.3247;-5.015,-6.8612,-1.2077;-3.9843,-6.049,-0.9546;-2.8012,-6.497,-0.7599;-1.9624,-5.3693,-0.5205;-0.7729,-5.405,-0.2925;-2.7522,-4.2032,-0.5939;-2.4377,-2.8873,-0.4345;-1.2238,-2.5351,-0.1772;-0.4165,-3.2579,-0.0788;-0.8961,-1.1169,-0.0066;-1.8563,-0.0904,-0.0945;-2.8927,-0.3479,-0.2985;-1.5814,1.2051,0.0581;-0.3068,1.542,0.3109;-0.1093,2.6058,0.4309;0.7275,0.6131,0.4204;1.7421,0.9406,0.6266;0.4263,-0.7366,0.2595;1.2038,-1.4933,0.3374;-4.1271,-4.5452,-0.8773;-4.8073,-4.2287,-0.0747;-4.478,-4.1081,-1.8222)|\",4.59328179644\r\n\"[H]OC(NOC([H])([H])[H])[C@@]1([H])N([H])N([H])[C@]([H])(C2=C([H])C([H])=C([H])C([H])=C2[H])C1([H])[H] 
|(5.7994,1.2352,1.6996;4.9457,1.0187,2.1527;4.2064,0.4736,1.1678;2.9605,0.2707,1.3891;2.3601,-0.3118,0.2358;0.9696,-0.4404,0.4889;0.5491,-0.8996,-0.4102;0.5029,0.5376,0.6598;0.7819,-1.083,1.3579;4.9613,0.1646,-0.1279;4.4602,0.6888,-0.947;6.359,0.6467,0.0229;6.6437,1.1165,-0.8336;7.2316,-0.5329,0.1298;7.8123,-0.3775,0.9523;6.3637,-1.7095,0.3638;6.0866,-1.7964,1.4272;7.0206,-3.0077,-0.0654;7.9131,-3.0572,-1.1433;8.1814,-2.1332,-1.6457;8.4652,-4.2726,-1.5485;9.1603,-4.2956,-2.3841;8.1326,-5.4548,-0.884;8.5658,-6.4003,-1.1994;7.2468,-5.4134,0.1936;6.9866,-6.3264,0.7228;6.6996,-4.1964,0.6012;6.0155,-4.1696,1.4473;5.0961,-1.3404,-0.4227;4.2185,-1.9092,-0.1167;5.2631,-1.499,-1.4937)|\",5.657246951895\r\n\"[H]OC(=O)C([H])([H])C([H])([H])OC([H])([H])C([H])([H])OC([H])([H])C([H])([H])C([H])([H])C(=O)O[H] |(10.4378,-4.7942,-1.1757;10.0509,-5.6572,-1.3992;8.692,-5.5721,-1.4092;8.0277,-6.5468,-1.6582;8.1218,-4.197,-1.0983;8.4987,-3.8521,-0.1259;8.4842,-3.4755,-1.8434;6.5984,-4.1965,-1.0853;6.2259,-4.9068,-0.3321;6.2124,-4.5265,-2.0614;6.1759,-2.8775,-0.7912;4.7668,-2.7572,-0.7246;4.2989,-3.0749,-1.6703;4.3545,-3.3859,0.0802;4.4386,-1.2958,-0.4543;4.8411,-0.6685,-1.2659;4.9115,-0.9725,0.4859;3.0297,-1.1876,-0.3784;2.5786,0.1343,-0.1378;3.099,0.5643,0.7319;2.7957,0.7756,-1.0082;1.0736,0.0798,0.1073;0.6823,1.1052,0.1449;0.6003,-0.4143,-0.7489;0.7002,-0.6918,1.3942;1.2007,-1.662,1.3842;-0.3834,-0.8656,1.4239;1.1505,0.0521,2.6428;2.1896,-0.1453,3.2217;0.3191,1.0485,3.0697;-0.4802,1.0733,2.5188)|\",7.227343869280001\r\n\"[H]C#CC([H])([H])OC1=C([H])C([H])=C(C2([H])C([H])([H])C2([H])[H])C([H])=C1[H] 
|(12.5587,1.8294,-1.4407;11.4924,1.8254,-1.4202;10.2868,1.8179,-1.3919;8.8275,1.854,-1.363;8.4428,2.2812,-2.3015;8.4841,2.5029,-0.543;8.3329,0.5296,-1.1858;6.973,0.3609,-1.1469;6.5325,-0.9592,-0.9937;7.2744,-1.7472,-0.9093;5.1717,-1.2412,-0.944;4.8554,-2.2709,-0.8055;4.2089,-0.2261,-1.0524;2.734,-0.4887,-1.0113;2.1994,0.1349,-0.2954;2.1314,-1.8539,-1.2351;1.2776,-2.1369,-0.6257;2.793,-2.6813,-1.4747;1.9522,-0.7844,-2.2808;2.5189,-0.8832,-3.2026;0.9735,-0.3271,-2.4011;4.6707,1.0871,-1.1982;3.9491,1.897,-1.2807;6.0327,1.3929,-1.2448;6.3386,2.4267,-1.3598)|\",5.714390860500001\r\n\"[H]C1=C([H])N2/C([H])=C([H])\\C([H])=C/2C(N2C([H])([H])C([H])([H])[C@]([H])(N([H])[H])C2([H])[H])=N1 |(2.2761,-1.1145,-1.8062;1.5724,-0.5827,-1.1714;0.4981,-1.2396,-0.6638;0.2652,-2.2828,-0.833;-0.3656,-0.5236,0.1525;-1.4777,-0.9517,0.8209;-1.8147,-1.9741,0.729;-1.9799,0.1147,1.5515;-2.8562,0.0832,2.185;-1.1538,1.2307,1.3238;-1.2783,2.2076,1.7622;-0.1411,0.8352,0.4408;1.015,1.4336,-0.1846;1.3293,2.7521,0.0047;2.5473,3.3097,-0.6042;3.4207,2.7175,-0.3176;2.4831,3.2676,-1.7003;2.5727,4.7489,-0.0823;3.1158,5.429,-0.7468;3.0509,4.7901,0.9063;1.0782,5.1069,0.0588;0.6748,5.3179,-0.94;0.6999,6.2245,0.9173;1.0327,7.0998,0.5169;1.1582,6.1323,1.8243;0.4605,3.7869,0.5586;-0.5737,3.6727,0.2221;0.4586,3.7712,1.6603;1.8438,0.7372,-0.954)|\",4.677637090095001\r\n\"[H]C1=C([H])N2/N=C([H])\\C([H])=C/2C(N2C([H])([H])C([H])([H])[C@]([H])(N([H])[H])C2([H])[H])=N1 
|(0.8988,-1.0413,-2.5756;0.6954,-0.5295,-1.6393;0.223,-1.2344,-0.5751;0.0215,-2.2954,-0.5388;-0.006,-0.5258,0.585;-0.4379,-1.0758,1.7319;-0.4853,-0.0455,2.5923;-0.8075,-0.226,3.6096;-0.0894,1.1644,2.0151;-0.0361,2.1271,2.4957;0.2216,0.8415,0.686;0.7084,1.493,-0.5093;0.9712,2.8323,-0.5381;1.557,3.4494,-1.7386;2.4283,2.8787,-2.0708;0.8353,3.4374,-2.5671;1.8838,4.873,-1.276;1.8884,5.5955,-2.0984;2.8729,4.8989,-0.7983;0.8035,5.1687,-0.2132;-0.14,5.3945,-0.7267;1.0367,6.2402,0.7481;1.0218,7.1429,0.277;1.9675,6.1436,1.1546;0.6506,3.81,0.4972;-0.3622,3.6724,0.8863;1.3505,3.7499,1.3458;0.9373,0.812,-1.6261)|\",4.6803582286\r\n\"[H]C1=C([H])N2C(=N1)C(N1C([H])([H])C([H])([H])[C@]([H])(N([H])[H])C1([H])[H])=NC([H])=C2[H] |(2.4909,5.1794,1.7102;2.8616,4.1997,1.4377;4.1755,3.7708,1.4467;5.1033,4.2586,1.7044;4.1378,2.4659,1.0219;2.8,2.1648,0.7761;2.0164,3.2162,1.0269;2.5036,0.8287,0.3129;1.2335,0.4367,0.0388;0.9658,-0.9181,-0.4696;1.6154,-1.1447,-1.3206;1.1906,-1.6611,0.3068;-0.5246,-0.8768,-0.8335;-1.0042,-1.8554,-0.7273;-0.6647,-0.5344,-1.8652;-1.1035,0.1876,0.1096;-1.1876,-0.2515,1.1209;-2.3595,0.7526,-0.3849;-3.0776,0.0292,-0.3929;-2.6924,1.4671,0.262;0.0118,1.243,0.1396;0.0185,1.8532,1.0432;-0.0981,1.9169,-0.7211;3.4743,-0.068,0.1416;4.7583,0.2974,0.4048;5.5103,-0.4714,0.2477;5.1434,1.5291,0.8402;6.1592,1.8346,1.0523)|\",4.75110782973\r\n\"[H]C1=NC([H])=C2C(=C1[H])/N=C(C(=O)N([H])C([H])([H])C([H])([H])OC([H])([H])[H])\\C([H])=C/2[H] 
|(11.8764,-2.1964,0.5415;11.0344,-1.5179,0.4255;11.012,-0.465,1.291;10.01,0.3829,1.1886;10.002,1.2218,1.886;8.9596,0.2666,0.238;8.9978,-0.8401,-0.6613;10.0815,-1.7452,-0.5412;10.1372,-2.597,-1.2111;8.0385,-1.039,-1.6081;7.0488,-0.1653,-1.6771;5.9792,-0.3835,-2.7309;5.0422,0.4047,-2.8507;6.1645,-1.4908,-3.4865;6.9935,-2.0451,-3.306;5.2834,-1.8689,-4.5732;4.736,-2.7855,-4.3154;4.5575,-1.0623,-4.7028;6.0625,-2.1018,-5.8597;6.5854,-1.1791,-6.1607;5.3708,-2.3814,-6.6724;6.9934,-3.1419,-5.6145;7.8301,-3.4088,-6.7209;8.5097,-4.2122,-6.4253;8.4218,-2.5246,-7.0063;7.2501,-3.7358,-7.5986;6.9216,0.9665,-0.8295;6.0701,1.6212,-0.9713;7.8813,1.1792,0.1296;7.8268,2.0315,0.8029)|\",4.574233826905\r\n\"[H]C([H])([H])OC(=O)C([H])([H])[C@@]1([H])C(=O)OC1([H])[H] |(-2.0828,5.3379,1.6107;-1.5805,4.8467,0.7774;-2.295,4.2791,0.1759;-1.0822,5.5818,0.1404;-0.614,3.9644,1.3778;0.1137,3.2285,0.5172;0.0023,3.2867,-0.6898;1.0709,2.3097,1.2542;0.4755,1.5591,1.7918;1.6028,2.8778,2.0252;2.0525,1.6468,0.2961;2.667,2.3988,-0.2083;2.8987,0.5037,0.8714;3.6815,0.3482,1.7618;2.4369,-0.4298,-0.0242;1.5397,0.5524,-0.6549;1.7778,0.6738,-1.7124;0.4967,0.2548,-0.5256)|\",7.357958517520001\r\n\"[H]SC([H])([H])C([H])([H])C1=NN(C([H])([H])[H])C(=O)C1([H])[H] |(6.5623,-4.1983,1.8864;7.2531,-3.1127,2.2988;5.9026,-1.8898,2.0378;5.0486,-2.1472,2.6711;6.3165,-0.9485,2.4138;5.4669,-1.75,0.5678;5.0651,-2.7032,0.2082;6.3487,-1.5121,-0.0411;4.4097,-0.7036,0.3796;3.203,-1.0224,0.0478;2.4539,0.1462,-0.0441;1.0589,0.0854,-0.4156;0.6893,1.1124,-0.4317;0.4883,-0.5005,0.3127;0.9409,-0.3648,-1.4069;3.1706,1.2946,0.2303;2.7448,2.4354,0.2135;4.5801,0.7843,0.5473;5.3129,1.2163,-0.1451;4.8792,1.0844,1.5585)|\",5.39873879392\r\n\"[H]C1=C(N([H])C2=C([H])C(OC([H])([H])[H])=C(Cl)C([H])=C2[H])N([H])N=N1 
|(7.3884,-0.7283,4.5081;7.4085,-0.7264,3.4281;6.7045,-1.5202,2.5407;5.8393,-2.5834,2.7277;5.9758,-3.1108,3.5778;4.6445,-2.8336,2.0318;4.079,-1.882,1.1675;4.5429,-0.91,1.0643;2.8889,-2.1605,0.4845;2.2802,-1.294,-0.3598;2.8602,-0.0136,-0.5683;2.1991,0.4976,-1.2692;3.8634,-0.0931,-1.0069;2.9157,0.5602,0.3653;2.2592,-3.4046,0.6802;0.7719,-3.7745,-0.1665;2.8168,-4.3367,1.5473;2.3179,-5.2896,1.6882;4.0059,-4.0654,2.2195;4.4414,-4.8121,2.8777;7.1263,-1.0772,1.3228;6.8717,-1.419,0.4068;8.0223,-0.0604,1.4456;8.1972,0.1321,2.7209)|\",5.069481034815\r\n\"[H]C1=C(N([H])C2=C([H])C(F)=C([H])C([H])=C2[H])N([H])N=N1 |(4.9224,-0.4165,-0.4966;4.4503,0.3732,0.0692;3.1422,0.47,0.509;2.0787,-0.4131,0.4453;2.3262,-1.3908,0.4023;0.7258,-0.101,0.2581;-0.2228,-1.1156,0.4672;0.0728,-2.1115,0.7825;-1.5666,-0.8303,0.2801;-2.4613,-1.8181,0.486;-2.0252,0.4284,-0.0939;-3.087,0.6047,-0.2225;-1.0717,1.4248,-0.2994;-1.393,2.4151,-0.6095;0.2913,1.1766,-0.1352;1.0113,1.9586,-0.349;3.118,1.6661,1.1604;2.3478,2.105,1.6449;4.3304,2.2831,1.1116;5.1306,1.4863,0.4651)|\",5.2790086997\r\n\"[H]C1=C(N([H])C2=C([H])C([H])=C(Cl)C(C#N)=C2[H])N([H])N=N1 |(-3.1946,-3.5085,1.2835;-3.3767,-2.9672,0.3666;-2.4838,-2.2815,-0.4324;-1.1031,-2.0994,-0.3898;-0.5649,-2.9142,-0.1291;-0.465,-0.8635,-0.2128;-1.1766,0.3416,-0.0941;-2.2613,0.338,-0.1016;-0.5007,1.5492,0.0552;-1.0605,2.4734,0.1499;0.8908,1.5864,0.0985;1.712,3.1149,0.2899;1.6182,0.387,-0.0101;3.0505,0.3672,0.0305;4.2114,0.3093,0.0575;0.934,-0.8259,-0.1698;1.5096,-1.7418,-0.265;-3.2694,-1.8069,-1.4427;-2.9951,-1.2722,-2.2561;-4.563,-2.1594,-1.272;-4.6208,-2.8758,-0.1829)|\",4.503484225775\r\n\"[H]C1=C(N([H])C2=C([H])C(Cl)=C([H])C([H])=C2C([H])([H])[H])N([H])N=N1 
|(0.1131,2.8598,0.2909;1.0237,3.0996,-0.2385;2.0358,2.2571,-0.6607;2.1679,0.8773,-0.6318;1.295,0.3749,-0.5598;3.325,0.1535,-0.2916;4.4648,0.7925,0.2165;4.4644,1.8583,0.4115;5.5984,0.0383,0.5093;7.021,0.86,1.138;5.626,-1.3399,0.3268;6.5181,-1.9113,0.5561;4.4747,-1.9602,-0.1613;4.4798,-3.0374,-0.308;3.3188,-1.2481,-0.4829;2.0974,-1.9486,-1.0258;2.2938,-3.0142,-1.1718;1.7849,-1.5286,-1.9908;1.2359,-1.8744,-0.345;2.9174,3.1027,-1.263;3.79,2.8837,-1.7231;2.4869,4.3922,-1.2052;1.3361,4.3771,-0.5983)|\",5.246355037640001\r\n\"[H]C1=C(N([H])C2=C([H])C(C([H])([H])[H])=C([H])C(Cl)=C2[H])N([H])N=N1 |(7.2661,-3.7503,-0.6755;7.3621,-3.0276,0.1216;6.4236,-2.1354,0.6083;5.0861,-1.9475,0.305;4.6084,-2.7565,-0.0644;4.3872,-0.7344,0.2431;2.9892,-0.7748,0.1197;2.4875,-1.7397,0.0847;2.234,0.3967,0.05;0.7275,0.3304,-0.0519;0.3176,1.24,-0.5014;0.2706,0.2194,0.94;0.4048,-0.5235,-0.6569;2.8854,1.6362,0.1125;2.3223,2.562,0.068;4.2712,1.6651,0.2319;5.0926,3.2204,0.2896;5.0412,0.5064,0.291;6.1213,0.5754,0.3293;7.0948,-1.4817,1.597;6.7573,-0.7639,2.2229;8.3759,-1.9293,1.7069;8.521,-2.872,0.8222)|\",5.175605436510001\r\n\"[H]C([H])=C([H])C1=C([H])C(Cl)=C([H])C([H])=C1N([H])C1=C([H])N=NN1[H] |(-0.149,1.2836,-2.1487;0.7034,1.8119,-1.7322;1.0251,2.7094,-2.255;1.3242,1.3693,-0.6319;0.9805,0.4428,-0.1711;2.4525,2.0359,0.0444;2.5247,3.4342,0.1178;1.7347,4.0304,-0.3249;3.5683,4.0583,0.793;3.6154,5.8135,0.8854;4.5655,3.3066,1.4111;5.3791,3.7991,1.9328;4.5173,1.9169,1.3366;5.3183,1.3349,1.7801;3.4722,1.2709,0.6621;3.4114,-0.1343,0.5587;3.0737,-0.494,-0.3233;4.1745,-1.028,1.293;4.8329,-2.1948,0.9492;4.9565,-2.647,-0.024;5.3515,-2.7557,2.0789;5.0738,-2.0088,3.1071;4.3427,-0.9585,2.6424;3.9951,-0.2624,3.2871)|\",4.45722487119\r\n\"[H]C1=C(N([H])C2=C(C([H])([H])[H])C(Cl)=C([H])C([H])=C2[H])N([H])N=N1 
|(-0.0928,-2.5746,-1.4854;0.6109,-2.9214,-0.7428;1.5668,-2.2062,-0.045;1.8668,-0.8534,0.0063;1.1116,-0.2481,-0.2819;3.159,-0.2913,-0.0284;3.2778,1.0967,0.234;2.0418,1.9063,0.5368;1.3992,2.0168,-0.3496;2.293,2.9117,0.8741;1.4428,1.4225,1.3178;4.569,1.6374,0.1996;4.8194,3.3605,0.4955;5.7071,0.8707,-0.0548;6.6846,1.3384,-0.0599;5.5536,-0.4893,-0.3034;6.4259,-1.0999,-0.5182;4.2893,-1.0726,-0.3019;4.178,-2.1246,-0.5404;2.1521,-3.1506,0.7424;2.8897,-3.0353,1.4232;1.6055,-4.3791,0.5351;0.6658,-4.2268,-0.352)|\",5.28989325372\r\n\"[H]C1=C(N([H])C2=C([H])C(C([H])([H])[H])=C(Cl)C([H])=C2[H])N([H])N=N1 |(6.0523,4.994,-1.1971;5.3586,4.7919,-0.3941;4.9544,3.5744,0.1242;5.3525,2.2766,-0.1428;6.2854,2.1778,-0.5157;4.5421,1.131,-0.1819;3.1411,1.1983,-0.173;2.6472,2.1648,-0.1754;2.3456,0.0464,-0.2045;0.8422,0.1598,-0.1992;0.5291,1.2071,-0.1651;0.4081,-0.3577,0.6641;0.4064,-0.2998,-1.0936;2.9974,-1.1937,-0.2576;2.0595,-2.6851,-0.2986;4.3884,-1.2815,-0.2833;4.8651,-2.2549,-0.3288;5.1605,-0.127,-0.2387;6.2451,-0.2007,-0.2408;4.0785,3.9273,1.106;3.5722,3.33,1.7443;3.9333,5.2792,1.1806;4.7224,5.7906,0.2815)|\",5.107576973885\r\n\"[H]C1=C(N([H])C2=C([H])C([H])=C(F)C([H])=C2C([H])([H])[H])N([H])N=N1 |(1.5957,-4.8441,-0.2558;2.282,-4.5007,0.5046;2.6541,-3.2071,0.8259;2.2226,-1.9741,0.3673;1.3508,-2.0062,-0.1425;3.048,-0.8542,0.1089;4.4443,-0.9604,0.0516;4.919,-1.9297,0.1615;5.2367,0.1649,-0.1743;6.318,0.0921,-0.2207;4.6148,1.3897,-0.3637;5.3668,2.4893,-0.5875;3.231,1.5151,-0.331;2.7854,2.493,-0.4853;2.4262,0.4006,-0.086;0.9242,0.5325,-0.0283;0.62,1.5806,-0.0924;0.5235,0.1161,0.9042;0.4293,0.0038,-0.8566;3.5335,-3.3797,1.8505;4.0258,-2.6775,2.3844;3.7086,-4.7003,2.1398;2.9378,-5.3652,1.3308)|\",5.102134696875\r\n\"[H]C1=C(N([H])C2=C(C([H])([H])N([H])[H])C([H])=C(Cl)C([H])=C2[H])N([H])N=N1 
|(3.2467,3.8993,-1.336;2.4375,3.9674,-0.6239;1.672,2.9516,-0.076;1.7565,1.5781,-0.1878;2.6266,1.1936,-0.5683;0.6886,0.675,-0.2553;0.9923,-0.7083,-0.1844;2.419,-1.1768,0.0094;2.4356,-2.2766,0.0384;2.7874,-0.8251,0.9829;3.3222,-0.6048,-1.011;3.0805,-0.9657,-1.9327;4.284,-0.8825,-0.8233;-0.0438,-1.6381,-0.2611;0.1832,-2.6989,-0.2102;-1.3689,-1.2219,-0.3849;-2.6543,-2.4207,-0.4616;-1.6758,0.1332,-0.4529;-2.7054,0.4555,-0.5669;-0.6494,1.0748,-0.3971;-0.8931,2.1269,-0.5005;0.8133,3.6188,0.745;0.1107,3.2415,1.3644;1.0259,4.9655,0.7035;2.015,5.1608,-0.1183)|\",4.83818426189\r\n\"[H]C1=C(N([H])C2=C([H])C([H])=C(Cl)C([H])=C2F)N([H])N=N1 |(0.6942,4.7194,-0.845;0.4036,4.3594,0.131;0.4707,3.0773,0.6427;0.9833,1.8958,0.1314;1.7675,1.9794,-0.4993;0.3779,0.6389,0.1756;-0.8923,0.3741,0.7054;-1.4992,1.188,1.0869;-1.4047,-0.9249,0.7219;-2.3891,-1.1128,1.1358;-0.6563,-1.9756,0.1986;-1.298,-3.6092,0.2206;0.606,-1.7437,-0.3532;1.2045,-2.5428,-0.7743;1.092,-0.4505,-0.3483;2.3103,-0.1903,-0.8856;-0.031,3.2148,1.9027;-0.1151,2.5176,2.6293;-0.3987,4.4992,2.1521;-0.1193,5.1851,1.0815)|\",5.05043306528\r\n\"[H]C1=NC(Cl)=C([H])C([H])=C1N([H])C1=C([H])N=NN1[H] |(1.3953,-1.6566,-0.6082;0.871,-0.7474,-0.3132;1.6346,0.3186,-0.0683;1.0351,1.4397,0.2882;2.0652,2.835,0.6062;-0.3482,1.5734,0.4258;-0.7879,2.5176,0.7257;-1.1415,0.4609,0.1705;-2.2201,0.5238,0.2733;-0.5285,-0.7413,-0.2118;-1.2532,-1.9028,-0.5261;-0.703,-2.753,-0.5368;-2.5988,-2.0678,-0.1725;-3.264,-2.4901,0.9611;-2.8581,-2.7993,1.9135;-4.6044,-2.4715,0.7219;-4.8176,-2.0528,-0.5007;-3.6145,-1.8104,-1.0478;-3.5437,-1.4916,-2.0056)|\",5.145672912955\r\n\"[H]OC([H])([H])C1=C(N([H])C2=C([H])N=NN2[H])C([H])=C([H])C(Cl)=C1[H] 
|(4.0679,-1.6641,-1.5045;3.154,-1.3431,-1.5613;2.4443,-1.832,-0.4085;2.3692,-2.9267,-0.4412;2.9899,-1.5554,0.5078;1.0636,-1.2284,-0.3726;0.896,0.1767,-0.372;2.0479,0.9731,-0.4711;2.7879,0.5446,-1.0211;2.0993,2.3503,-0.3466;2.7828,3.3203,-1.0585;3.3905,3.2117,-1.9449;2.5797,4.5299,-0.463;1.8021,4.3856,0.5701;1.5126,3.0576,0.659;0.9435,2.7182,1.4214;-0.3964,0.7128,-0.2759;-0.5441,1.7867,-0.3157;-1.508,-0.1214,-0.168;-2.5036,0.3039,-0.0985;-1.3349,-1.5019,-0.1751;-2.7302,-2.5645,-0.0493;-0.0584,-2.0512,-0.288;0.0634,-3.1299,-0.299)|\",4.884443616475\r\n\"[H]C1=C(N([H])C2=C(C([H])([H])[H])C([H])=C(Cl)C([H])=C2C([H])([H])[H])N([H])N=N1 |(1.8557,-4.289,-0.9564;2.0522,-3.7236,-0.0574;2.7265,-2.5236,0.0921;3.3501,-1.6881,-0.8284;3.8339,-2.1924,-1.5597;3.9404,-0.4475,-0.4537;5.3418,-0.2968,-0.5157;6.2271,-1.443,-0.9478;7.2828,-1.1836,-0.8313;6.0771,-1.7049,-2.0048;6.0355,-2.3474,-0.3583;5.9065,0.9342,-0.173;6.9824,1.069,-0.2136;5.0899,1.9834,0.2388;5.8176,3.5195,0.6876;3.7073,1.8394,0.2911;3.0875,2.6789,0.5888;3.1091,0.6276,-0.0721;1.604,0.5055,-0.0876;1.1381,1.4949,-0.0889;1.2185,-0.0401,0.7824;1.2696,-0.0404,-0.9755;2.6591,-2.2878,1.428;3.0518,-1.5174,1.9508;1.9751,-3.2781,2.0768;1.6259,-4.1407,1.1692)|\",5.113019250895\r\n\"[H]C1=C(N([H])C2=C([H])C(F)=C(Cl)C([H])=C2[H])N([H])N=N1 |(-0.7283,-4.5859,-0.3925;-0.1402,-4.1524,0.4031;-0.0598,-2.8339,0.81;-0.7383,-1.6912,0.4125;-1.6709,-1.8446,0.0564;-0.1758,-0.4374,0.1414;1.2098,-0.2218,0.1476;1.9186,-1.0235,0.3186;1.7036,1.0502,-0.1143;3.0319,1.2319,-0.1026;0.8616,2.1247,-0.3972;1.5282,3.7062,-0.7169;-0.5159,1.9015,-0.4155;-1.1831,2.7283,-0.6349;-1.0328,0.64,-0.1439;-2.1088,0.4885,-0.1446;0.8281,-2.8776,1.8433;1.1468,-2.1232,2.4356;1.2832,-4.1405,2.0524;0.6806,-4.9058,1.1887)|\",5.1184615279050005\r\n\"[H]C1=C(N([H])C2=C([H])C([H])=C(F)C(Cl)=C2[H])N([H])N=N1 
|(-3.193,-3.848,0.9574;-3.3913,-3.0764,0.2279;-2.5172,-2.1717,-0.3452;-1.1435,-2.005,-0.2514;-0.6281,-2.8389,-0.0078;-0.4761,-0.7848,-0.0342;-1.155,0.4093,0.2531;-2.2345,0.4173,0.3535;-0.4403,1.5909,0.4416;-0.9504,2.5211,0.6697;0.9456,1.5921,0.3628;1.6246,2.7353,0.5518;1.6315,0.4072,0.0907;3.3742,0.418,-0.0042;0.9251,-0.7733,-0.1171;1.4695,-1.6828,-0.3511;-3.3159,-1.4507,-1.1815;-3.0586,-0.7054,-1.8137;-4.6071,-1.8707,-1.1197;-4.6396,-2.8631,-0.2779)|\",4.993289156675\r\n\"[H]C1=C(N([H])C2=C(OC([H])([H])[H])C([H])=C(Cl)C([H])=C2[H])N([H])N=N1 |(2.4005,-5.0298,-1.6656;3.3825,-4.6795,-1.9476;3.857,-3.381,-2.0208;3.2639,-2.1746,-1.7048;2.6004,-2.1778,-0.9434;3.5791,-0.9136,-2.2119;3.1167,0.2114,-1.4821;2.412,-0.0974,-0.3502;1.8625,0.9665,0.4168;1.3414,0.4955,1.2516;1.1507,1.5562,-0.1742;2.6499,1.6249,0.8038;3.3791,1.5025,-1.9269;3.0331,2.366,-1.3734;4.1101,1.6846,-3.1063;4.4473,3.3228,-3.6508;4.5607,0.6011,-3.8459;5.1086,0.7547,-4.7687;4.2845,-0.6948,-3.3979;4.5947,-1.5399,-4.0034;5.1455,-3.541,-2.4319;5.8563,-2.8374,-2.5713;5.4509,-4.8582,-2.6112;4.383,-5.5337,-2.3038)|\",4.88172247797\r\n\"[H]C1=C(N([H])C2=C([H])C(Cl)=C([H])C([H])=C2[H])N([H])N=N1 |(5.1477,-0.998,0.7272;4.5236,-1.5475,0.0377;3.2807,-1.215,-0.4695;2.5011,-0.0773,-0.3383;3.0092,0.7741,-0.1472;1.1109,-0.0165,-0.171;0.3336,-1.165,0.0465;0.7906,-2.1428,0.1389;-1.0465,-1.0318,0.1874;-2.0044,-2.4827,0.4536;-1.6841,0.205,0.1357;-2.7596,0.2764,0.2472;-0.8943,1.34,-0.0626;-1.3656,2.3179,-0.1038;0.4832,1.2401,-0.2218;1.0803,2.132,-0.3938;2.9781,-2.2703,-1.2764;2.1636,-2.4083,-1.8584;3.9646,-3.2069,-1.2588;4.8991,-2.7536,-0.4749)|\",5.18648999053\r\n\"[H]C1=C(N([H])C2=C([H])C([H])=C(Cl)C(C([H])([H])C([H])([H])[H])=C2[H])N([H])N=N1 
|(4.3682,-2.2241,-6.4353;4.8698,-1.6821,-5.6468;4.8771,-1.9297,-4.2852;4.3397,-2.9547,-3.5263;4.1863,-3.8221,-4.0198;3.7423,-2.8463,-2.2596;3.4618,-4.0228,-1.5473;3.7162,-4.9906,-1.9721;2.8723,-3.9519,-0.2913;2.6574,-4.8597,0.2624;2.566,-2.712,0.2711;1.8384,-2.6903,1.8786;2.8303,-1.5162,-0.4116;2.4669,-0.1544,0.1411;2.66,-0.1308,1.2188;3.1209,0.5974,-0.316;0.9978,0.2254,-0.1213;0.7804,1.2188,0.2867;0.3159,-0.491,0.3474;0.7839,0.2452,-1.1957;3.4173,-1.6115,-1.6811;3.5937,-0.6945,-2.2353;5.6665,-0.9355,-3.7907;5.9496,-0.7709,-2.835;6.1088,-0.1127,-4.7812;5.6346,-0.5819,-5.8978)|\",5.07764445033\r\n\"[H]C1=C(N([H])C2=C(C([H])([H])[H])C([H])=C(Cl)C([H])=C2[H])N([H])N=N1 |(1.0326,-3.4441,-1.5795;1.8008,-3.4976,-0.8219;2.4591,-2.4696,-0.1717;2.2983,-1.0932,-0.2017;1.396,-0.787,-0.5366;3.3346,-0.1391,-0.2585;2.9989,1.2236,-0.0865;1.5667,1.6431,0.1388;1.4997,2.717,0.3318;1.1245,1.1164,0.9939;0.9309,1.4352,-0.7349;4.0197,2.1745,-0.1247;3.777,3.2243,0.007;5.3471,1.7936,-0.3165;6.6069,3.0209,-0.3402;5.6799,0.4544,-0.4895;6.711,0.1615,-0.6552;4.6708,-0.5069,-0.468;4.9281,-1.5465,-0.6416;3.3117,-3.1232,0.6653;3.9603,-2.7343,1.3352;3.201,-4.4734,0.5328;2.2761,-4.6881,-0.3566)|\",5.156557466974999\r\n\"[H]C1=C(N([H])C2=C([H])C([H])=C(Cl)C([H])=C2C([H])([H])OC([H])([H])[H])N([H])N=N1 |(2.5666,-3.3566,-1.0205;3.426,-3.2156,-0.3816;4.147,-2.0609,-0.1327;3.9572,-0.7493,-0.5281;3.0329,-0.5144,-0.8777;4.9558,0.1534,-0.9238;6.2863,-0.2304,-1.1454;6.5753,-1.2717,-1.0498;7.2443,0.7066,-1.5295;8.2701,0.3977,-1.7009;6.8719,2.0343,-1.7146;8.0704,3.2237,-2.2059;5.5493,2.4288,-1.5173;5.2697,3.4667,-1.6693;4.5863,1.5067,-1.1103;3.1805,1.9647,-0.815;3.0731,3.0332,-1.0513;2.9563,1.8372,0.2586;2.2439,1.2019,-1.58;0.9009,1.5852,-1.3305;0.2664,0.9498,-1.9525;0.7325,2.6377,-1.602;0.6287,1.4472,-0.2728;5.11,-2.4756,0.7367;5.8265,-1.9292,1.1934;5.001,-3.8076,1.0035;3.9739,-4.2398,0.3324)|\",4.900770447505\r\n\"[H]C1=C(N([H])C2=C([H])C([H])=C(Br)C([H])=C2[H])N([H])N=N1 
|(-2.9979,-4.2797,0.8459;-3.3287,-3.3836,0.3416;-2.5786,-2.3276,-0.1436;-1.212,-2.1166,-0.2146;-0.6451,-2.9519,-0.2392;-0.5316,-0.9099,0.0038;-1.1599,0.2289,0.5324;-2.2026,0.1924,0.83;-0.4408,1.4091,0.7188;-0.9369,2.2812,1.1311;0.9121,1.4572,0.3918;1.9008,3.0811,0.6499;1.5559,0.3305,-0.12;2.6103,0.3675,-0.3713;0.8346,-0.842,-0.3176;1.3338,-1.7139,-0.7338;-3.5165,-1.4943,-0.6746;-3.3826,-0.6151,-1.1542;-4.7724,-1.9934,-0.5136;-4.6458,-3.1408,0.0864)|\",5.05043306528\r\n\"[H]C1=C(N([H])C2=C([H])C([H])=C(Cl)C([H])=C2[H])N([H])N=N1 |(-3.3309,-3.832,0.928;-3.4887,-3.0404,0.2102;-2.5783,-2.1363,-0.3063;-1.2083,-2.0062,-0.1518;-0.7199,-2.8554,0.093;-0.4882,-0.8109,0.0018;0.9151,-0.8635,-0.0416;1.4129,-1.8174,-0.1979;1.6763,0.291,0.1051;2.7593,0.2389,0.0717;1.0391,1.5193,0.2829;1.9973,2.9838,0.4504;-0.351,1.5906,0.3297;-0.8413,2.5459,0.4842;-1.1133,0.4307,0.1966;-2.1933,0.4956,0.2753;-3.3326,-1.38,-1.1519;-3.0398,-0.6241,-1.755;-4.6335,-1.7798,-1.1494;-4.7132,-2.7919,-0.3355)|\",5.066759896310001\r\n\"[H]C1=C(C2([H])C([H])([H])C([H])([H])N([H])C([H])([H])C2([H])[H])N([H])N=C1C(=O)OC([H])([H])C([H])([H])[H] |(7.0558,-0.8119,0.677;6.633,0.1415,0.3983;7.254,1.375,0.3961;8.6545,1.7948,0.7394;9.1568,0.9016,1.1359;8.7051,2.8842,1.8364;8.156,3.7746,1.5021;8.2196,2.5212,2.7496;10.1522,3.2951,2.1285;10.6836,2.4337,2.5823;10.1693,4.1092,2.8621;10.7861,3.7624,0.8959;11.718,4.1169,1.0989;10.866,2.7018,-0.1088;11.3974,3.0887,-0.9857;11.4279,1.8158,0.2512;9.4527,2.2615,-0.5028;9.4986,1.4511,-1.2398;8.9391,3.1078,-0.9769;6.2934,2.2517,-0.0369;6.3872,3.2493,-0.1632;5.1083,1.6841,-0.3103;5.3108,0.3879,-0.0456;4.2667,-0.646,-0.2079;4.4656,-1.8239,0.0276;3.0918,-0.1416,-0.6355;2.0194,-1.0907,-0.8293;2.0706,-1.8511,-0.0462;1.1098,-0.4983,-0.7013;2.0821,-1.7213,-2.2129;1.2143,-2.3736,-2.3655;2.9882,-2.3244,-2.3193;2.0754,-0.9506,-2.9905)|\",5.771534769105\r\n\"[H]C1=C(C2([H])C([H])([H])C([H])([H])N([H])C([H])([H])C2([H])[H])N([H])N=C1C(=O)OC([H])([H])[H] 
|(2.0569,1.7815,4.3158;2.0445,2.1225,3.2916;1.9363,3.4201,2.8319;1.798,4.7318,3.5502;1.8208,4.4996,4.6239;0.4504,5.4318,3.2548;0.3575,5.621,2.1773;-0.3813,4.78,3.5465;0.3599,6.7711,3.9935;0.3255,6.575,5.0846;-0.5709,7.2849,3.7277;1.4891,7.6193,3.6096;1.39,8.5392,4.0334;2.7701,7.0283,3.9986;3.5726,7.7272,3.7371;2.8406,6.8424,5.0898;2.9681,5.701,3.2595;3.9181,5.2402,3.5542;3.0224,5.9075,2.1825;1.9686,3.3139,1.4658;1.9149,4.064,0.7917;2.0885,2.0568,1.0111;2.1353,1.3205,2.1277;2.2648,-0.1513,2.1217;2.3209,-0.8184,3.1379;2.3126,-0.6714,0.8787;2.4356,-2.0995,0.8241;2.4703,-2.3473,-0.2373;1.5775,-2.58,1.3025;3.3499,-2.4287,1.3258)|\",5.752486799570001\r\n\"[H]C1=NC([H])=C(C([H])([H])C([H])=C([H])[H])C(C#CC2=C([H])C([H])=C(C([H])([H])[H])C([H])=C2[H])=C1[H] |(8.3835,0.1864,-0.2305;7.3076,0.3097,-0.3418;6.8455,1.5657,-0.2709;5.5257,1.7304,-0.4077;5.1679,2.7582,-0.3469;4.5995,0.7028,-0.6126;3.1239,1.0149,-0.7631;2.5411,0.2101,-0.2953;2.8901,1.9397,-0.223;2.704,1.1575,-2.2089;2.8787,0.284,-2.837;2.1569,2.2525,-2.736;1.8708,2.3016,-3.7832;1.9727,3.1446,-2.1403;5.1056,-0.6159,-0.6853;4.2504,-1.7357,-0.8925;3.5205,-2.6921,-1.0766;2.6798,-3.8236,-1.279;3.2102,-5.1268,-1.2359;4.2702,-5.2622,-1.0448;2.3858,-6.2309,-1.4313;2.8151,-7.2291,-1.3912;1.0146,-6.0814,-1.6756;0.1217,-7.2792,-1.8964;0.662,-8.2165,-1.7314;-0.2728,-7.298,-2.9204;-0.7411,-7.2641,-1.2199;0.4899,-4.7799,-1.7177;-0.5725,-4.6379,-1.903;1.3002,-3.6681,-1.5244;0.8764,-2.669,-1.5582;6.4944,-0.799,-0.5439;6.9188,-1.7967,-0.5915)|\",4.359263885010001\r\n\"[H]OC(=O)[C@@]([H])(N([H])C([H])([H])[C@]12C([H])C1C([H])=C([H])C2=O)C([H])([H])O[H] 
|(1.1662,1.3846,-2.8685;0.6344,1.2789,-3.6929;-0.5894,0.8753,-3.3249;-1.4815,0.6682,-4.1148;-0.7955,0.7464,-1.8024;-1.3538,-0.1863,-1.6273;0.4829,0.7815,-1.094;0.317,1.1568,-0.1609;1.1512,-0.5219,-0.9859;0.5618,-1.2559,-0.4125;1.2712,-0.923,-2.0003;2.5287,-0.3819,-0.3568;3.031,-1.3764,0.6669;3.5647,-2.3079,0.8116;2.7891,-0.1713,1.1164;3.4831,1.0954,1.3808;3.7668,1.4648,2.3603;3.8577,1.5967,0.1838;4.517,2.444,0.0313;3.3981,0.7388,-0.972;3.5545,0.9978,-2.1492;-1.6768,1.9125,-1.3462;-2.6321,1.8564,-1.8823;-1.186,2.8639,-1.6036;-1.8429,1.7876,0.0643;-2.3185,2.5698,0.3817)|\",3.823199599525\r\n\"[H]C1=C([H])C(/N=C(\\[H])N(C([H])([H])[H])C([H])([H])[H])=C(C#N)C(Cl)=C1[H] |(7.1166,-0.5139,4.994;6.7013,0.2796,4.3787;6.1815,-0.0232,3.1244;6.162,-1.0515,2.7769;5.6345,0.9839,2.3081;5.07,0.7529,1.0642;5.612,-0.0922,0.2546;6.5695,-0.5877,0.4591;5.0706,-0.4302,-0.9391;5.746,-1.3364,-1.8477;5.9921,-0.8332,-2.7925;5.1111,-2.2023,-2.0773;6.6735,-1.6971,-1.3956;3.8092,0.1439,-1.3853;3.9445,0.6406,-2.3539;3.4779,0.8732,-0.6467;3.0488,-0.6398,-1.4974;5.5995,2.3158,2.8105;5.0365,3.3482,1.9964;4.5806,4.1878,1.3326;6.1313,2.5912,4.0826;6.1021,4.2282,4.7039;6.6852,1.5862,4.8698;7.0818,1.8244,5.8499)|\",4.696685059629999\r\n\"[H]O[C@@]1([H])C2=C(OC([H])([H])C1([H])[H])C([H])=C([H])C([H])=C2OC([H])([H])[H] |(5.6324,-1.2948,-0.1887;6.1373,-1.2309,-1.0162;5.2273,-0.6978,-1.9871;4.4065,-1.4126,-2.1428;4.6336,0.6286,-1.5482;4.993,1.8365,-2.1594;5.9381,1.9047,-3.1412;6.8087,0.767,-3.2414;7.3911,0.9321,-4.1508;7.4867,0.757,-2.3775;6.0077,-0.5252,-3.2898;6.6713,-1.3858,-3.424;5.3134,-0.4901,-4.1382;4.3722,3.0436,-1.8013;4.6681,3.9551,-2.3096;3.4138,3.0403,-0.7993;2.9348,3.9726,-0.5124;3.0638,1.86,-0.1299;2.332,1.887,0.6685;3.6853,0.6701,-0.5071;3.4491,-0.547,0.0893;2.4591,-0.6195,1.1051;1.4774,-0.3068,0.728;2.7272,-0.0018,1.9717;2.4147,-1.6676,1.407)|\",5.787861600135\r\n\"[H]C1=C(N(=O)=O)C(=O)N(C([H])([H])[H])C([H])=C1F 
|(6.2921,-0.0832,0.2262;5.2114,-0.035,0.156;4.4829,-1.197,0.0901;5.2332,-2.4555,0.0853;4.6709,-3.4643,-0.3232;6.4046,-2.4072,0.4779;3.0226,-1.2222,0.0249;2.2711,-2.1828,0.0137;2.4565,0.0945,0.0002;0.9918,0.1337,-0.0681;0.6661,1.1738,-0.1085;0.5694,-0.362,0.809;0.6526,-0.405,-0.9555;3.1813,1.2411,0.0455;2.6431,2.1813,0.0168;4.5464,1.197,0.1265;5.2479,2.3455,0.1736)|\",3.86945895411\r\n\"[H]C1=C(F)C(N(=O)=O)=C(C([H])([H])[H])C(N(=O)=O)=C1F |(6.3297,-0.1015,-0.0544;5.2467,-0.0975,-0.0431;4.5266,-1.2813,-0.0492;5.1948,-2.4329,-0.0605;3.1284,-1.2822,0.0051;2.4459,-2.5849,-0.0022;1.5719,-2.7662,0.8431;2.8,-3.3954,-0.8522;2.3904,-0.0881,0.0556;0.8797,-0.0834,0.0768;0.4816,0.8189,-0.3885;0.5149,-0.1118,1.108;0.4756,-0.9546,-0.4401;3.1373,1.1013,0.04;2.4655,2.4092,0.0718;1.5951,2.5732,0.9245;2.8245,3.2407,-0.7555;4.5356,1.0914,-0.0145;5.2123,2.2379,0.0093)|\",5.015058264715\r\n\"[H]C1=C(Cl)N=C(N(=O)=O)C(OC([H])([H])[H])=C1[H] |(3.0398,4.651,1.0599;3.2288,3.6085,1.2869;4.1631,3.2498,2.2575;5.0412,4.4991,3.1196;4.4357,1.9922,2.5774;3.7804,1.0509,1.9347;4.1108,-0.3464,2.3196;3.4414,-0.8391,3.2188;5.0227,-0.8832,1.7037;2.8162,1.258,0.9379;2.2497,0.1629,0.3899;1.2652,0.3465,-0.6249;0.9557,-0.657,-0.917;1.6851,0.8683,-1.4934;0.4,0.9003,-0.2405;2.5455,2.5932,0.6177;1.8138,2.8463,-0.1409)|\",4.906212724515\r\n\"[H]OB(O[H])C1=C([H])N=C(C(=O)OC([H])([H])[H])N=C1[H] |(-1.2951,4.9356,-2.9277;-0.7695,5.6304,-3.3452;0.5631,5.5385,-3.0661;1.3386,6.5691,-3.5126;2.2839,6.4168,-3.3836;1.1722,4.3135,-2.2728;2.264,4.4335,-1.4003;2.7352,5.4022,-1.2273;2.7896,3.4118,-0.7201;2.2094,2.2211,-0.9125;2.7607,1.0222,-0.1615;2.2844,-0.0877,-0.2297;3.8323,1.3483,0.5794;4.4072,0.2598,1.3202;5.2501,0.6904,1.8608;4.7446,-0.5291,0.6425;3.6752,-0.1598,2.0155;1.1673,1.9626,-1.7119;0.6684,3.007,-2.3746;-0.1721,2.7845,-3.0337)|\",4.917097278535\r\n\"[H]OB(O[H])C1=C([H])N=C(N2C([H])([H])C([H])([H])C([H])(F)C([H])([H])C2([H])[H])N=C1[H] 
|(4.4251,-5.3616,-2.7895;3.9453,-5.7708,-3.5213;2.6322,-5.383,-3.5688;1.8617,-6.0508,-4.4839;0.9773,-5.6717,-4.5678;2.0527,-4.2558,-2.6454;0.696,-4.1637,-2.2896;-0.0192,-4.9095,-2.6438;0.1762,-3.2221,-1.5115;1.0397,-2.289,-1.0427;0.5323,-1.3046,-0.2435;1.3636,-0.236,0.3076;1.3586,-0.3181,1.4059;2.3856,-0.3854,-0.0356;0.8255,1.1414,-0.1127;0.9294,1.2632,-1.1978;1.4085,1.935,0.3691;-0.6476,1.2752,0.256;-0.7582,1.2914,1.3505;-1.138,2.4926,-0.2223;-1.4706,0.1323,-0.3276;-2.5128,0.2118,0.0034;-1.4608,0.2138,-1.4214;-0.887,-1.2254,0.0961;-1.4043,-2.0488,-0.3923;-0.996,-1.3541,1.1846;2.3682,-2.2593,-1.3073;2.8326,-3.2281,-2.0875;3.9035,-3.1735,-2.2956)|\",4.938866386575\r\n\"[H]OB(O[H])C1=C([H])N=C(N2C([H])([H])C2([H])[H])N=C1[H] |(-4.7181,-4.9388,0.9231;-5.5318,-4.6669,0.4793;-5.4555,-3.3908,-0.0143;-6.6304,-2.8914,-0.5123;-6.5104,-2.0407,-0.9537;-4.1236,-2.5634,-0.0096;-4.0853,-1.1578,0.0165;-5.0131,-0.5823,0.0497;-2.9754,-0.4262,0.0231;-1.8218,-1.1321,-0.0034;-0.6791,-0.4213,-0.001;0.7306,-0.4444,-0.0145;1.2427,-0.7652,0.8938;1.2261,-0.7288,-0.944;-0.0783,0.8541,0.019;-0.1367,1.4579,-0.8878;-0.1194,1.4214,0.9499;-1.6989,-2.4789,-0.0323;-2.846,-3.1508,-0.0318;-2.74,-4.2373,-0.067)|\",5.10485583538\r\n\"[H]OB(O[H])C1=C([H])N=C(N2C([H])([H])C([H])([H])[C@@]([H])(F)C2([H])[H])N=C1[H] |(5.9967,3.9848,1.3581;6.7961,3.4643,1.2067;6.5187,2.1552,0.9122;7.6115,1.331,0.8501;7.3894,0.4493,0.5242;5.0588,1.6383,0.6646;4.6581,0.3097,0.891;5.3719,-0.423,1.2742;3.4304,-0.158,0.6934;2.5291,0.7422,0.2293;1.2697,0.2935,0.019;0.1792,1.1323,-0.4893;0.1508,2.0876,0.0391;0.3188,1.3435,-1.5572;-1.0581,0.2559,-0.2445;-1.8791,0.4601,-0.9361;-1.4258,0.3887,0.7799;-0.5207,-1.1636,-0.4109;-1.1487,-1.9414,0.0345;-0.4149,-1.4391,-1.7794;0.8826,-1.1076,0.1918;1.5656,-1.7836,-0.3302;0.878,-1.389,1.2532;2.7734,2.0491,-0.0356;4.0171,2.4566,0.1923;4.2036,3.5087,-0.0348)|\",4.98512574116\r\n\"[H]C1=C([H])/C2=N/C(=O)/C([H])=C(/C(=O)N(OC([H])([H])[H])C([H])([H])[H])[C@@]2([H])C([H])=C1[H] 
|(4.9751,-1.7286,-0.7391;6.0298,-1.5687,-0.9488;6.6552,-0.4729,-0.4549;6.1493,0.244,0.1835;8.0822,-0.2735,-0.6454;8.6729,0.6181,0.09;10.0738,0.7271,0.0384;10.6521,1.6636,0.5676;10.8492,-0.3415,-0.6472;11.9202,-0.3221,-0.4882;10.2624,-1.2206,-1.476;11.0125,-2.1168,-2.4212;10.6365,-2.2084,-3.5897;12.0692,-2.8682,-1.9651;12.6205,-2.5945,-0.7055;12.2011,-3.575,0.2524;11.1189,-3.528,0.415;12.7269,-3.3124,1.1732;12.4885,-4.585,-0.0626;13.0004,-3.5225,-2.8701;12.4933,-3.6608,-3.8243;13.2913,-4.494,-2.4599;13.8968,-2.907,-3.0103;8.7643,-1.1177,-1.7145;8.6681,-0.524,-2.6481;8.0347,-2.3972,-2.0517;8.5625,-3.1154,-2.6688;6.7436,-2.5723,-1.7199;6.2122,-3.4668,-2.0326)|\",3.986467909825\r\n\"[H]OC(=N/C([H])([H])C1([H])C([H])([H])C1([H])[H])/C([H])=C(\\[H])C1=C([H])C([H])=NC([H])=C1[H] |(-5.6711,-0.9567,-2.0792;-5.0747,-0.9478,-1.3144;-3.7876,-1.1047,-1.7643;-2.8617,-0.7057,-0.9909;-1.4611,-0.9134,-1.3425;-1.2365,-1.9859,-1.4551;-1.2231,-0.4419,-2.3118;-0.5553,-0.3313,-0.2786;-0.7037,0.7342,-0.1144;0.8527,-0.8506,-0.121;1.1723,-1.6685,-0.7634;1.643,-0.1421,0.1118;-0.1889,-1.1457,0.9337;-0.1152,-0.64,1.8923;-0.5758,-2.1599,0.9981;-3.6296,-1.7581,-3.081;-2.6978,-1.5547,-3.601;-4.5335,-2.6042,-3.6128;-5.4239,-2.8439,-3.0316;-4.4393,-3.3,-4.9007;-3.4291,-3.0645,-5.8492;-2.6501,-2.3296,-5.6719;-3.4355,-3.7816,-7.0415;-2.6589,-3.6071,-7.7842;-4.3567,-4.7025,-7.3616;-5.3201,-4.9243,-6.4602;-6.0641,-5.6727,-6.7282;-5.4068,-4.2597,-5.2379;-6.2165,-4.4896,-4.5502)|\",4.435455763149999\r\n\"[H]OC(=N/C([H])([H])C([H])([H])C([H])([H])[H])/C([H])=C([H])/C1=C([H])/C([H])=C(/[H])C2=C1C([H])=C([H])C([H])=C2[H] 
|(2.6018,1.6229,-1.2207;3.3312,1.5453,-0.5861;4.5094,1.4954,-1.2906;5.5088,0.9888,-0.6897;6.8232,0.9885,-1.3195;6.8209,0.37,-2.2341;7.1205,2.0025,-1.6321;7.8714,0.4333,-0.3511;7.8621,1.0464,0.5587;7.5617,-0.5734,-0.0436;9.2782,0.3971,-0.9532;10.0053,-0.0032,-0.2382;9.6163,1.4003,-1.2425;9.3145,-0.234,-1.8505;4.4821,2.0689,-2.6518;5.26,1.7281,-3.3292;3.6035,3.0087,-3.0573;2.9026,3.4019,-2.3225;3.5166,3.5739,-4.4104;3.8201,2.7894,-5.5153;4.1082,1.7532,-5.3645;3.7201,3.2886,-6.8301;3.9616,2.6403,-7.6676;3.2974,4.5797,-7.0485;3.2042,4.9681,-8.0597;2.9694,5.4279,-5.9584;3.0847,4.9323,-4.6161;2.7861,5.8225,-3.5486;2.9057,5.491,-2.5226;2.3689,7.1135,-3.7897;2.1461,7.7725,-2.955;2.2369,7.591,-5.1142;1.9071,8.6109,-5.2914;2.5364,6.7643,-6.1719;2.4503,7.124,-7.1945)|\",4.008237017865\r\n\"[H]O/C(=N/[C@]([H])(C([H])([H])[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])[C@@]1([H])N=NC(=O)C([H])=C1[H] |(8.1874,5.0296,-4.7045;7.3383,4.6096,-4.4615;7.5237,3.9475,-3.2936;6.6942,3.0555,-2.9631;6.8154,2.234,-1.7651;7.5347,2.6404,-1.0318;7.3141,0.843,-2.1896;7.3796,0.1751,-1.3232;8.3055,0.9031,-2.6531;6.6283,0.4045,-2.9219;5.4413,2.1369,-1.0772;4.7309,1.7232,-1.8052;5.5164,1.412,-0.2547;4.9029,3.4674,-0.5393;5.6021,3.8679,0.2134;4.8695,4.1942,-1.3601;3.5144,3.3328,0.1026;2.8056,2.9605,-0.6516;3.5593,2.5608,0.8843;2.9624,4.6291,0.7183;2.0414,4.3905,1.2668;3.6727,5.0035,1.4702;2.661,5.7376,-0.2973;2.2219,6.6131,0.1946;1.9498,5.3918,-1.0578;3.5636,6.0736,-0.8199;8.7553,4.4317,-2.4792;9.3159,3.5336,-2.1763;9.6686,5.1376,-3.4231;10.3897,6.0864,-3.067;10.3734,6.5574,-1.6636;11.3229,7.2059,-1.2865;9.1758,6.2498,-0.881;8.9813,6.8523,0.0006;8.3954,5.2294,-1.2632;7.5077,4.9459,-0.7051)|\",3.561970303045\r\n\"[H]C1=NC2=C(C3N=C(C([H])([H])C([H])([H])[H])NC(=O)[C@]3([H])S2)C([H])=C1[H] 
|(4.8108,-5.3659,7.9523;4.5834,-4.7774,7.0655;5.1704,-3.5751,7.0056;4.8944,-2.8429,5.9301;4.0335,-3.2514,4.8858;3.9359,-2.2727,3.8266;3.4153,-2.4633,2.6589;3.6027,-1.4195,1.7304;2.7379,-1.5264,0.5046;3.0331,-0.7388,-0.1927;2.9434,-2.497,0.0337;1.2379,-1.4396,0.838;0.6418,-1.5629,-0.0719;0.9869,-0.4672,1.2762;0.9492,-2.2222,1.5463;4.4681,-0.4522,1.8155;5.2465,-0.3369,2.9785;6.295,0.2666,3.017;4.5954,-0.9645,4.2112;3.7958,-0.2665,4.5078;5.6489,-1.2555,5.6872;3.4443,-4.5169,4.965;2.7903,-4.8602,4.1689;3.7245,-5.2936,6.0832;3.292,-6.2809,6.2064)|\",3.55380688753\r\n\"[H]OC(=O)C1([H])C([H])([H])C([H])([H])N([C@@]2([H])N=NC(=S)C([H])=C2[H])C([H])([H])C1([H])[H] |(2.491,-2.9316,6.1784;3.2989,-2.4472,6.4161;3.3035,-1.2207,5.8206;4.217,-0.4609,6.0219;2.1187,-0.933,4.8998;1.196,-1.1757,5.4524;2.1667,-1.8263,3.638;3.1221,-1.6675,3.1237;2.1108,-2.8905,3.9007;1.0161,-1.4867,2.685;0.0522,-1.7656,3.145;1.1178,-2.075,1.7651;1.0537,-0.0648,2.3273;0.1622,0.288,1.2431;0.2747,-0.4847,0.4636;-1.2754,0.1541,1.6702;-2.1735,0.8764,1.19;-1.8614,1.8785,0.1832;-3.0908,2.3899,-0.7733;-0.4996,2.3731,0.1234;-0.3181,3.3227,-0.3695;0.4849,1.6192,0.6413;1.5305,1.9138,0.6051;0.926,0.8017,3.5109;0.9525,1.8457,3.1791;-0.0426,0.647,4.0183;2.0744,0.5433,4.4877;1.9596,1.1796,5.3717;3.0284,0.811,4.0217)|\",2.67760028892\r\n\"[H]OC(=O)[C@]1([H])C(=O)C2=C([H])C([H])=C([H])C([H])=C2/N=C\\1SC([H])([H])[H] 
|(2.7746,-5.0824,1.8328;3.0482,-4.4446,2.5418;3.4202,-3.2939,1.9787;3.9396,-2.3985,2.5921;3.0472,-3.1834,0.4555;1.9479,-3.1352,0.4609;3.3979,-4.4812,-0.2803;2.9625,-5.5515,0.1588;4.2523,-4.3851,-1.4555;4.6529,-5.5439,-2.1427;4.2979,-6.5046,-1.7835;5.4878,-5.4464,-3.2462;5.8018,-6.3408,-3.7757;5.9262,-4.1827,-3.6727;6.5802,-4.1032,-4.537;5.5317,-3.0284,-3.0054;5.8586,-2.0462,-3.3305;4.6882,-3.1084,-1.8861;4.3068,-1.9116,-1.2773;3.5646,-1.9262,-0.2255;2.9922,-0.4066,0.4598;3.5948,0.7803,-0.7868;3.261,1.7633,-0.4454;3.1743,0.5583,-1.7693;4.6838,0.7518,-0.8419)|\",3.714354059325\r\n\"[H]/C1=C(\\C([H])(C([H])([H])[H])C([H])([H])[H])N2C(=S)N=N/C2=N/C1=O |(3.7735,-0.3771,1.8395;3.5324,0.6785,1.8708;2.7335,1.2059,0.9106;2.1929,0.3109,-0.1935;2.6253,-0.6626,0.0645;0.6646,0.1092,-0.1475;0.3878,-0.6699,-0.8664;0.1117,1.0132,-0.4061;0.3477,-0.2231,0.847;2.7389,0.6466,-1.5962;2.4491,-0.1508,-2.2893;3.8331,0.6988,-1.584;2.3487,1.587,-1.9877;2.4736,2.5995,1.03;1.7441,3.5516,0.3272;0.7982,3.5265,-1.0004;1.9432,4.8154,1.0333;2.6873,4.6563,2.0314;3.0718,3.2814,2.1115;3.8251,2.8261,3.03;4.1208,1.4319,2.9821;4.8291,0.9269,3.834)|\",2.67760028892\r\n\"[H]SC1=NC([H])([H])N([H])[C@]2([H])[C@]1([H])N([H])C([H])([H])[C@@]2([H])C1=C([H])C([H])=C([H])C([H])=C1[H] 
|(5.3936,1.0373,-4.2438;5.8475,-0.2078,-4.5151;4.9069,-0.9595,-3.1932;4.7062,-2.2151,-3.2105;3.9024,-2.8666,-2.1902;3.078,-3.3527,-2.7414;4.5064,-3.6712,-1.7521;3.3912,-2.0134,-1.084;2.5549,-2.4482,-0.6953;3.0904,-0.6876,-1.5889;2.4044,-0.7059,-2.4589;4.417,-0.089,-2.0471;5.1387,-0.2361,-1.2283;4.1311,1.3342,-2.2708;4.9185,1.8957,-1.9614;2.9252,1.6628,-1.4528;3.1035,2.534,-0.8139;2.0894,1.9144,-2.1163;2.5832,0.3904,-0.6199;3.2282,0.3621,0.268;1.1423,0.2653,-0.173;0.0815,0.2726,-1.0932;0.2862,0.3787,-2.1558;-1.2393,0.1441,-0.6645;-2.0449,0.1528,-1.3941;-1.5278,0.0052,0.695;-2.5572,-0.0941,1.0284;-0.4842,-0.0055,1.6205;-0.6955,-0.1142,2.6811;0.8367,0.1229,1.1872;1.6451,0.113,1.915)|\",5.708948583490001\r\n\"[H]C1=C([H])/C2=N/C(=S)N(C([H])([H])C([H])([H])C([H])([H])[H])C(=O)[C@@]2([H])C([H])=C1C([H])([H])[H] |(-0.0681,-1.7143,1.2753;0.6329,-2.396,0.7979;1.7065,-2.8396,1.4995;1.8816,-2.555,2.5319;2.6235,-3.7826,0.9077;3.5419,-4.3456,1.6331;4.335,-5.3433,1.1016;5.7671,-5.7336,1.8494;3.8641,-6.0364,-0.0323;4.4012,-7.3645,-0.4065;4.7526,-7.8374,0.5115;3.5596,-7.9344,-0.8067;5.5288,-7.2876,-1.4406;5.1551,-6.7787,-2.3371;6.3446,-6.6831,-1.0299;6.0416,-8.6831,-1.8102;6.8448,-8.6196,-2.5521;5.2428,-9.3025,-2.2363;6.4406,-9.2059,-0.9327;2.8965,-5.5047,-0.8799;2.4634,-6.1091,-1.8448;2.4931,-4.0634,-0.5677;3.2885,-3.4608,-1.054;1.1965,-3.6482,-1.198;1.0095,-4.0149,-2.2022;0.3313,-2.8309,-0.5656;-0.9538,-2.3578,-1.1967;-1.8236,-2.6707,-0.6048;-1.071,-2.7559,-2.2083;-0.9852,-1.2621,-1.2561)|\",2.96059869344\r\n\"[H]C([H])([H])C1NN(C([H])([H])C([H])([H])[H])[C@]2([H])C(=O)N(C([H])([H])C([H])([H])[H])C(=S)N=C12 
|(6.152,8.9957,2.6418;6.2463,8.0241,3.1409;7.3062,7.7931,3.2701;5.778,8.1256,4.1264;5.5808,6.9532,2.3412;6.2307,5.9331,1.8233;5.4174,5.187,1.0374;5.7774,3.7888,0.794;5.6695,3.2014,1.7202;5.0445,3.4206,0.0713;7.1909,3.654,0.236;7.4149,2.5966,0.0607;7.9295,4.0545,0.9347;7.2827,4.1918,-0.7126;4.0351,5.5918,1.2398;3.5242,4.8703,1.9037;3.144,5.7714,0.0218;3.2578,5.107,-0.9921;2.1204,6.6931,0.2348;1.0712,6.7501,-0.8115;1.0322,5.7554,-1.2573;0.1292,6.9557,-0.3043;1.3695,7.8035,-1.8768;0.5694,7.8031,-2.6253;2.3136,7.5861,-2.386;1.4222,8.8045,-1.4379;2.095,7.5957,1.3345;0.7336,8.4514,1.7456;3.2733,7.8065,2.0297;4.1927,6.9027,1.9454)|\",3.0776076491550004\r\n\"[H]C1C([H])/C2=N/C(=S)C3=NN=C(C([H])([H])[H])N3[C@]2([H])C([H])=C1[H] |(4.176,-6.4421,0.0602;4.0576,-5.3879,0.3144;3.4287,-4.9959,1.4816;3.4562,-3.9212,1.6673;2.0662,-5.5065,1.3353;1.5479,-6.6545,1.5456;0.3059,-7.0159,1;-0.321,-8.5041,1.314;-0.3842,-6.0226,0.176;-1.5881,-6.0894,-0.364;-1.8405,-4.8923,-0.9665;-0.7934,-4.1041,-0.7876;-0.6984,-2.6958,-1.269;-1.6959,-2.3782,-1.5796;-0.0295,-2.5937,-2.1326;-0.3431,-2.0163,-0.4852;0.1675,-4.7785,-0.0811;1.4713,-4.3549,0.4097;1.3295,-3.4776,1.0561;2.4665,-4.0112,-0.7122;2.0584,-3.4468,-1.5483;3.7543,-4.4572,-0.7653;4.361,-4.3008,-1.6558)|\",3.015021463540001\r\n\"[H]/N=C(/O[H])C([H])([H])/C(=N\\[H])N([H])OC(=O)C1=C([H])C([H])=C(C([H])([H])[H])C([H])=C1[H] 
|(5.51,-4.799,0.4065;6.0044,-4.4294,1.2192;7.1518,-4.0094,0.8741;7.9657,-3.5071,1.8337;8.7624,-3.124,1.4262;7.7569,-4.068,-0.5353;8.4086,-4.9492,-0.5894;6.954,-4.1917,-1.2652;8.595,-2.8462,-0.8617;9.8617,-2.7194,-0.6991;10.2896,-3.6207,-0.4733;7.9135,-1.7729,-1.4278;8.406,-0.8774,-1.385;6.6112,-1.5946,-0.8643;6.2888,-0.2701,-0.6972;7.0525,0.6179,-1.0199;4.9422,-0.0993,-0.112;4.3754,1.1833,-0.157;4.9338,1.9885,-0.623;3.1153,1.4069,0.3838;2.6805,2.4024,0.3353;2.3962,0.3676,0.9948;1.0431,0.6263,1.6116;0.465,-0.2973,1.7147;0.4576,1.3316,1.0119;1.1463,1.0623,2.6141;2.9795,-0.9062,1.0427;2.4433,-1.7219,1.5211;4.2373,-1.1477,0.497;4.677,-2.1367,0.5718)|\",5.390575378405\r\n\"[H]OC1=NC(=O)N(C([H])([H])[H])[C@]2([H])N([H])[C@@]([H])(C([H])([H])[H])N([H])C([H])([H])[C@@]12[H] |(1.183,-0.5027,-0.5092;1.6548,-1.1166,0.0746;1.0274,-1.1942,1.2694;1.5281,-1.9786,2.1403;0.9203,-2.1049,3.4138;1.2321,-3.0255,4.1484;-0.0193,-1.1438,3.8106;-0.2587,-1.0323,5.2515;-1.2186,-0.5476,5.4282;0.5265,-0.4419,5.7462;-0.2548,-2.0373,5.6696;-0.2062,0.0446,2.9893;0.627,0.7525,3.1428;-1.4313,0.7484,3.3562;-2.2065,0.0886,3.2665;-1.6942,1.9175,2.4893;-2.7307,2.2081,2.6882;-0.7999,3.1024,2.8693;-1.0965,3.9811,2.29;0.2602,2.9157,2.6705;-0.9074,3.3181,3.9359;-1.6187,1.6497,1.0378;-2.4454,1.1218,0.7608;-0.4303,0.8911,0.6417;-0.5478,0.6339,-0.4197;0.4489,1.544,0.7162;-0.2185,-0.3727,1.5068;-1.0824,-1.0425,1.3654)|\",5.564728242725\r\n\"[H]OC1=N[C@]([H])(C2=C([H])C(OC([H])([H])[H])=C(OC([H])([H])C([H])([H])[H])C([H])=C2[H])N([H])O1 
|(11.1604,-2.5937,2.0916;11.3205,-3.4506,1.66;10.8923,-3.3254,0.4041;10.4651,-2.2753,-0.175;9.9941,-2.7609,-1.4787;10.3839,-2.1226,-2.2826;8.474,-2.812,-1.5537;7.7157,-1.7805,-0.982;8.1995,-0.9702,-0.4469;6.3298,-1.7665,-1.0791;5.6397,-0.6972,-0.5561;4.8368,-0.9818,0.59;4.3998,-0.0286,0.8975;5.4513,-1.3793,1.4095;4.0376,-1.6917,0.3549;5.6612,-2.8069,-1.765;4.3003,-2.7087,-1.8298;3.578,-3.7031,-2.5548;3.9348,-3.7361,-3.594;3.7512,-4.6928,-2.108;2.106,-3.3329,-2.4985;1.7514,-3.3114,-1.4631;1.5122,-4.0672,-3.0535;1.9405,-2.3453,-2.94;6.4155,-3.8413,-2.324;5.9268,-4.6547,-2.8481;7.8107,-3.8395,-2.2209;8.3829,-4.6518,-2.6557;10.6132,-4.1154,-1.6514;11.5502,-3.9705,-2.0392;10.9287,-4.5003,-0.2772)|\",5.864053478274999\r\n\"[H]/N=C(/O[H])N([H])/N=C(/C([H])([H])[H])C([H])([H])C1=C([H])C([H])=C(C([H])([H])[H])O1 |(-0.9683,0.7745,-2.9843;-0.7652,0.5197,-3.9514;0.1107,-0.3992,-3.9777;0.5472,-0.9324,-5.1499;0.0482,-0.457,-5.8368;0.8053,-0.998,-2.9278;1.2035,-1.911,-3.1348;0.3123,-0.7634,-1.6675;0.889,-1.315,-0.6624;0.2997,-1.0649,0.6985;1.0318,-0.5872,1.3638;0.0059,-2.0111,1.1727;-0.5806,-0.4235,0.6185;2.1341,-2.1975,-0.7403;2.6444,-2.1679,0.2276;2.8247,-1.7731,-1.4832;1.8566,-3.63,-1.0694;1.8699,-4.7929,-0.362;2.1368,-4.9025,0.6802;1.4739,-5.8322,-1.2706;1.3775,-6.887,-1.0529;1.24,-5.2413,-2.4749;0.809,-5.7384,-3.8095;0.6783,-6.8234,-3.7751;1.55,-5.5067,-4.5845;-0.1421,-5.2884,-4.1199;1.4732,-3.8894,-2.3653)|\",5.58921848927\r\n\"[H]C1=C2/C([H])=C(OC([H])([H])C([H])([H])[H])\\C([H])=C(\\[H])[C@]2([H])C(C([H])([H])[H])=NC1=O 
|(3.7617,-1.6704,2.9167;4.2447,-2.5875,2.5915;5.3955,-2.5693,1.8734;6.0367,-1.3542,1.4298;5.6478,-0.4137,1.803;7.0229,-1.3986,0.4899;7.6698,-0.3363,-0.041;7.327,0.9769,0.4198;6.2647,1.1655,0.2137;7.4747,1.0295,1.507;8.2165,1.9675,-0.3085;7.9806,2.9872,0.0136;8.0644,1.902,-1.3903;9.2715,1.7712,-0.0939;7.4604,-2.6665,-0.088;8.1803,-2.6146,-0.8991;6.9988,-3.8363,0.3765;7.3598,-4.7692,-0.044;6.0667,-3.8943,1.5623;6.7349,-4.103,2.4254;5.0825,-5.0673,1.6187;5.4789,-6.3551,0.9436;5.5078,-6.2316,-0.1468;4.7553,-7.1307,1.1983;6.4805,-6.6778,1.257;3.9796,-5.0397,2.2685;3.5661,-3.8371,2.9345;2.633,-3.8903,3.7203)|\",3.790545937465\r\n\"[H]C1=C([H])C(OC([H])([H])C([H])([H])[H])=C(/C([H])=C2/C(=O)ON=C2C([H])([H])[H])C([H])=C1[H] |(7.3894,0.2896,0.3986;6.6486,-0.4809,0.2032;5.2949,-0.1755,0.3544;5.0022,0.8219,0.6595;4.328,-1.1567,0.1103;2.991,-0.9539,0.2221;2.5015,0.3064,0.6965;2.9305,0.5187,1.6852;2.817,1.1024,0.0085;0.9883,0.2091,0.7621;0.5687,1.1542,1.1225;0.6814,-0.5903,1.4433;0.5724,-0.0016,-0.2279;4.718,-2.4601,-0.3055;3.6886,-3.4732,-0.483;2.8169,-3.3696,0.163;3.6185,-4.5346,-1.3258;2.4833,-5.4874,-1.2103;1.5673,-5.5341,-0.4249;2.6425,-6.4089,-2.2223;3.7833,-6.0689,-3.0027;4.3508,-5.0203,-2.4883;5.5463,-4.4576,-3.1959;5.459,-3.3727,-3.3133;5.6232,-4.9208,-4.1829;6.4754,-4.6586,-2.6521;6.0906,-2.7333,-0.4294;6.3972,-3.7438,-0.6734;7.0534,-1.7597,-0.1813;8.1082,-2.0003,-0.2701)|\",3.790545937465001\r\n\"[H]/N=C1/N([H])N=C(C(NNC([H])C2=C([H])C([H])=C([H])N2C([H])([H])[H])O[H])N1[H] 
|(3.116,4.3782,-5.2551;2.7756,5.3164,-5.0504;2.6388,5.5002,-3.7921;2.1911,6.6835,-3.2081;1.918,7.5094,-3.7161;2.1221,6.6434,-1.8466;2.5284,5.4391,-1.5384;2.5989,5.0039,-0.1395;2.9675,3.8616,0.3327;3.3469,2.9291,-0.627;3.7104,1.8108,-0.083;3.6809,1.7303,1.0069;4.1475,0.6769,-0.8482;4.2695,0.4865,-2.2263;4.0329,1.2265,-2.9776;4.7449,-0.8202,-2.4444;4.9498,-1.2907,-3.396;4.9057,-1.4067,-1.1988;5.2484,-2.3951,-0.926;4.5455,-0.5078,-0.2352;4.5809,-0.7686,1.1957;4.9505,-1.7834,1.3541;3.5816,-0.6894,1.636;5.2508,-0.0698,1.7067;2.2203,5.9441,0.7522;1.9796,6.746,0.2493;2.8517,4.7095,-2.6554;3.1765,3.7527,-2.5777)|\",3.589181688095\r\n\"[H]ON([H])/C(=C(/NO)[C@@]1([H])C(C([H])([H])[H])(C([H])([H])[H])[C@]1([H])C([H])([H])C#N)C([H])([H])[H] |(4.1433,3.4632,2.9649;4.9875,3.3281,2.5;4.6853,2.3686,1.5178;4.8772,1.4318,1.8576;4.9291,2.6941,0.2211;5.1122,1.6988,-0.7302;5.0482,2.1441,-2.0506;5.0952,1.2593,-2.9211;5.2186,0.2371,-0.4391;4.2714,-0.3008,-0.5214;6.4049,-0.5955,-0.9037;7.6148,0.0942,-1.5151;8.5175,-0.5102,-1.3594;7.7884,1.0798,-1.0716;7.4704,0.2338,-2.5902;6.1013,-1.9398,-1.5427;6.9577,-2.6216,-1.465;5.8822,-1.7891,-2.6055;5.231,-2.4369,-1.101;6.1854,-0.3611,0.578;6.8944,0.3315,1.0305;5.7381,-1.4548,1.5459;5.0589,-2.165,1.0629;6.5986,-2.0317,1.9082;5.0454,-0.8653,2.6988;4.5044,-0.3143,3.5674;4.9091,4.1599,-0.1086;4.9497,4.2925,-1.1874;5.765,4.6579,0.3598;4.0059,4.6301,0.2946)|\",3.4231922392900005\r\n\"[H]OC([H])([H])[C@@]12C([H])([H])N([H])C([H])([H])[C@@]1([H])C([H])([H])C([H])([H])C([H])([H])C2([H])[H] 
|(1.5289,-0.7893,-1.963;0.6508,-0.8163,-2.4024;-0.2566,-1.3237,-1.4412;-1.2536,-1.0392,-1.7886;-0.2325,-2.4296,-1.4238;-0.0399,-0.8538,0.0253;1.3749,-1.2563,0.4789;1.6557,-2.2746,0.1825;1.4597,-1.1882,1.5776;2.2139,-0.2696,-0.2413;3.1448,-0.1834,0.1552;1.4854,1.0318,-0.292;1.9588,1.7952,0.3403;1.474,1.4039,-1.3221;0.0739,0.6822,0.2142;0.0953,0.8317,1.3064;-1.1579,1.428,-0.2923;-1.0442,2.5121,-0.1602;-1.2941,1.2537,-1.3671;-2.3799,0.9335,0.5141;-2.2755,1.2954,1.5477;-3.3024,1.3808,0.1234;-2.5239,-0.6036,0.5362;-2.8842,-0.9459,-0.4417;-3.3034,-0.8849,1.2557;-1.2141,-1.3501,0.8888;-1.3648,-2.4322,0.7731;-0.9702,-1.1807,1.9473)|\",8.12531957593\r\n\"[H]OC1=NN=C2C([H])([H])C([H])([H])N([H])C([H])([H])[C@@]12[H] |(5.924,-1.9641,0.3886;5.9534,-1.0056,0.2505;4.7047,-0.5522,0.005;4.4782,0.696,-0.2143;3.0663,0.8469,-0.4489;2.4878,-0.299,-0.3167;1.0711,-0.6176,-0.6543;0.5361,-0.9918,0.2308;0.5528,0.2789,-1.0092;1.079,-1.733,-1.7454;1.3994,-1.2924,-2.6975;0.0677,-2.126,-1.8915;1.9724,-2.8555,-1.4539;1.6158,-3.3911,-0.6636;3.3481,-2.4482,-1.2062;3.9529,-3.3384,-0.9979;3.7403,-1.9903,-2.1217;3.4664,-1.4069,-0.0402;3.2831,-1.9163,0.9163)|\",5.483094087575\r\n\"[H]OC1=N[C@]2([H])N(C([H])([H])[H])C(=O)N=C(O[H])[C@]2([H])N1C([H])([H])[H] |(4.3836,-0.2125,0.3499;4.0507,0.3223,-0.3922;2.958,0.9529,0.0669;2.4727,0.826,1.2443;1.317,1.7404,1.2912;1.5936,2.6119,1.8988;0.1462,1.1502,1.9042;0.2809,0.8555,3.329;0.6361,1.7552,3.8428;0.9924,0.0408,3.5082;-0.6946,0.568,3.7187;-0.7312,0.3529,1.1777;-1.4304,-0.4907,1.7084;-0.8846,0.6269,-0.2054;-0.1325,1.4672,-0.7874;-0.4162,1.7503,-2.0839;0.1708,2.4505,-2.409;1.0648,2.1937,-0.183;0.8964,3.2769,-0.2566;2.3477,1.8318,-0.8072;2.6238,1.875,-2.2321;3.6848,1.6704,-2.3824;2.4233,2.8841,-2.6155;2.044,1.1452,-2.8136)|\",5.298056669235001\r\n\"[H]C1=C([H])C(/C([H])=N/N2C(=O)N=NC2=O)=C([H])C([H])=C1Cl 
|(1.1167,-2.4702,-0.1067;0.565,-1.5372,-0.0678;1.2286,-0.3183,-0.1049;2.3109,-0.2856,-0.1732;0.5078,0.8889,-0.0541;1.169,2.1933,-0.0899;0.5337,3.0768,-0.0472;2.4531,2.2599,-0.1682;3.0414,3.5108,-0.1995;4.4335,3.6059,-0.2924;5.2514,2.7383,-0.3557;4.738,5.07,-0.2993;3.68,5.7208,-0.2216;2.5082,4.7899,-0.1516;1.3652,5.1544,-0.0714;-0.8921,0.8446,0.0345;-1.4591,1.7713,0.0746;-1.5681,-0.3724,0.0726;-2.6498,-0.4061,0.1414;-0.8312,-1.5549,0.0209;-1.6686,-3.094,0.068)|\",2.8898490923100004\r\n\"[H]C1/C([H])=C2/N=C([H])NC(=O)/C2=C(\\[H])[C@]1([H])OC([H])([H])C([H])([H])OC([H])([H])[H] |(7.8618,-4.6633,3.483;8.6466,-3.9116,3.4934;9.7406,-4.0772,4.2544;9.8841,-4.9503,4.8837;10.7899,-3.0652,4.2968;11.8473,-3.2629,5.037;12.7874,-2.2329,5.0419;13.6633,-2.4579,5.651;12.7705,-1.0805,4.4493;11.6531,-0.7724,3.6507;11.5561,0.2997,3.0782;10.5993,-1.832,3.5329;9.494,-1.6519,2.7822;9.3488,-0.7123,2.255;8.4553,-2.7152,2.5957;8.5681,-3.0671,1.548;7.1686,-2.1183,2.7386;6.1244,-2.7588,2.0076;6.3761,-2.7933,0.9362;5.9791,-3.7933,2.3587;4.8223,-1.9888,2.1964;4.655,-1.8164,3.2718;4.0068,-2.6207,1.8237;4.751,-0.7844,1.4657;5.349,0.3385,2.0949;5.0896,1.2094,1.4871;6.4399,0.249,2.1566;4.9529,0.4834,3.112)|\",3.4422402088250004\r\n\"[H]C1=C([H])C([H])=C(O[C@]2([H])C([H])([H])C([H])([H])[C@]3([H])[C@]([H])(N([H])[H])N([H])N([H])[C@@]3([H])C2([H])[H])C([H])=C1[H] 
|(-2.1181,-2.2386,0.7246;-1.171,-1.7774,0.4607;-1.1412,-0.5964,-0.2886;-2.0691,-0.1336,-0.6145;0.0704,-0.0035,-0.6251;0.1092,0.9109,-1.2093;1.2807,-0.5853,-0.2165;2.4086,0.0739,-0.6225;3.7162,-0.3978,-0.2416;3.7351,-1.492,-0.324;4.0552,0.0246,1.2021;3.2742,-0.3362,1.8802;4.983,-0.4923,1.4874;4.2478,1.5472,1.3706;4.583,1.7663,2.3914;3.2935,2.0667,1.2349;5.2645,2.0309,0.3398;6.2078,1.486,0.5019;5.6456,3.5125,0.1899;6.3963,3.8264,0.9243;4.5027,4.4,0.34;3.7691,4.1419,-0.319;4.7903,5.3412,0.0743;6.2921,3.5494,-1.1539;7.2862,3.3604,-1.022;5.8029,2.3857,-1.9237;5.3444,2.7786,-2.7433;4.801,1.7029,-1.0815;3.7927,2.1255,-1.2272;4.691,0.192,-1.2783;4.3391,-0.0645,-2.2842;5.6807,-0.2633,-1.1482;1.2584,-1.7664,0.5369;2.1759,-2.2361,0.8716;0.0314,-2.3514,0.8662;0.0272,-3.2675,1.4515)|\",5.581055073755\r\n\"[H]OC1=NC([H])=NC(=O)/C1=C(/C([H])([H])[H])C([H])([H])C(=O)OC([H])([H])[H] |(1.0864,0.8195,-2.0743;1.3312,1.7512,-2.1941;2.5865,1.944,-1.7477;3.1554,3.0513,-2.1023;4.487,3.1734,-1.758;4.895,4.1706,-1.9213;5.3084,2.259,-1.341;4.7759,0.9904,-1.0906;5.4978,0.0146,-0.9382;3.2743,0.9143,-0.9529;2.6692,0.0524,-0.0846;1.2004,0.0584,0.2541;1.0765,-0.0295,1.3399;0.7302,-0.8357,-0.1807;0.6616,0.9487,-0.0724;3.4524,-1.0207,0.6346;2.9487,-1.2957,1.57;4.4729,-0.7131,0.8667;3.529,-2.2797,-0.2268;2.8025,-2.5184,-1.1683;4.4781,-3.108,0.2299;4.6424,-4.333,-0.5091;5.4423,-4.8717,-0.0014;4.9208,-4.1144,-1.5427;3.7167,-4.9141,-0.4993)|\",3.986467909825\r\n\"[H]OC([H])([H])[C@@]1([H])O[C@]([H])(O[H])[C@]([H])(N([H])[C@@]([H])(O[H])C([H])([H])[H])[C@]([H])(O[H])[C@]1([H])O[H] 
|(1.2667,-6.5065,-0.5812;1.7244,-7.1467,-0.0077;2.6316,-6.3857,0.7649;2.1358,-5.9409,1.6435;3.3984,-7.0729,1.1407;3.2866,-5.3015,-0.104;3.8153,-5.8206,-0.9132;2.2741,-4.5495,-0.794;1.5465,-3.5667,-0.0125;0.9988,-4.0688,0.7941;0.6177,-2.9749,-0.8417;1.1138,-2.2649,-1.3003;2.5584,-2.5461,0.5464;2.0225,-1.8297,1.1809;3.0559,-1.8172,-0.6311;3.4948,-2.4779,-1.2712;3.8996,-0.6581,-0.4616;3.4187,0.0173,0.2534;5.1893,-0.9477,0.1304;5.5106,-1.7759,-0.2784;4.1048,0.0435,-1.8013;3.1507,0.3849,-2.2141;4.5683,-0.6398,-2.5245;4.7693,0.9022,-1.673;3.5621,-3.3113,1.4457;2.9453,-3.8527,2.1772;4.4148,-2.5059,2.2162;4.8353,-1.853,1.6125;4.2925,-4.4017,0.6402;4.8832,-5.0076,1.341;5.189,-3.7632,-0.2959;5.7884,-4.4386,-0.6487)|\",7.611024398485\r\n\"[H]OC1=N[C@]2([H])N(C(=O)N=C(O[H])C2([H])[H])C(=O)C([H])([H])C1([H])[H] |(-2.1969,4.6093,-6.3023;-1.7287,4.5267,-5.4556;-1.9951,5.6008,-4.6662;-1.6168,5.687,-3.4517;-1.6758,4.4561,-2.7089;-1.3465,3.5705,-3.2693;-3.135,4.3214,-2.4072;-3.543,3.4162,-1.3899;-4.7098,3.1421,-1.221;-2.5334,2.8079,-0.6119;-1.3723,3.337,-0.561;-0.48,2.7796,0.2821;0.3647,3.2535,0.222;-0.9217,4.5214,-1.3861;-1.1604,5.4628,-0.8758;0.1594,4.498,-1.571;-4.0682,5.3778,-2.782;-4.7656,5.8991,-1.9458;-4.3117,5.7105,-4.2809;-4.519,4.7667,-4.8008;-5.2206,6.3163,-4.2987;-3.1628,6.4701,-5.0266;-3.0339,7.4658,-4.5994;-3.3802,6.5564,-6.0981)|\",5.491257503089999\r\n\"[H]OC1=N[C@]2([H])N(C(=O)N=C(O[H])C2([H])[H])C([H])([H])C([H])([H])C1([H])[H] 
|(-2.1379,4.515,-6.2303;-1.6732,4.4968,-5.3781;-2.0026,5.5987,-4.6452;-1.6565,5.7466,-3.4247;-1.6943,4.5191,-2.6675;-1.2865,3.6509,-3.2037;-3.1583,4.2827,-2.4621;-3.5523,3.397,-1.4616;-4.7238,3.184,-1.1922;-2.5282,2.7307,-0.7376;-1.4025,3.3145,-0.5983;-0.492,2.7251,0.2075;0.3215,3.2535,0.2136;-1.0079,4.5942,-1.309;-1.3478,5.478,-0.7545;0.0791,4.6688,-1.4368;-4.1632,5.2891,-2.9006;-5.105,4.8945,-2.5174;-3.9682,6.2453,-2.3996;-4.3447,5.5529,-4.4176;-4.4615,4.5949,-4.9389;-5.2861,6.1019,-4.5442;-3.1969,6.3817,-5.0982;-3.161,7.3894,-4.6791;-3.3396,6.4458,-6.1844)|\",5.464046118040001\r\n\"[H]OC1=N[C@@]2([H])N(C(=O)N=C(O[H])C2([H])[H])C([H])([H])C1([H])[H] |(-0.6628,7.4864,-4.4621;-1.0562,6.6423,-4.1918;-0.1785,5.975,-3.394;-0.5272,4.8217,-2.9959;0.322,4.0761,-2.0955;0.6679,3.1844,-2.6483;1.4731,4.8136,-1.5754;2.1794,4.3844,-0.4508;3.2496,4.8826,-0.1452;1.614,3.3554,0.3349;0.4001,3.0135,0.1407;-0.1061,2.0566,0.9466;-1.0253,1.8732,0.6956;-0.5156,3.6102,-0.9003;-1.2739,2.8978,-1.243;-1.0395,4.4758,-0.474;2.1378,5.687,-2.5289;2.9646,6.1738,-2.013;2.5538,5.1024,-3.3649;1.1142,6.6915,-3.0611;1.5037,7.1961,-3.956;0.9105,7.4669,-2.3108)|\",5.83956323173\r\n\"[H]O/C1=N/C(=O)NC2C(=O)NN[C@@]21[H] |(1.5307,-1.6526,-1.6532;0.5899,-1.7917,-1.4359;0.199,-0.8749,-0.5497;-1.024,-0.7832,-0.194;-1.3698,0.2301,0.7307;-2.2694,0.1216,1.5218;-0.6355,1.4913,0.6431;0.6001,1.3298,0.3905;1.6078,2.3738,-0.0098;1.656,3.551,0.1853;2.622,1.6318,-0.839;2.3955,0.406,-0.8395;1.2467,0.0295,0.0611;1.6906,-0.4769,0.9307)|\",3.132030419255\r\n\"[H]OC1=NC(=S)NC2=NC(C([H])([H])C#N)=NC21 
|(4.7282,-3.103,0.4776;5.6584,-2.8209,0.3912;5.6897,-1.4984,0.2296;6.8305,-0.8933,0.1126;6.8665,0.4841,-0.0538;8.337,1.2341,-0.2063;5.7266,1.331,-0.1063;4.6007,0.7169,0.0127;3.3048,1.2515,-0.0005;2.555,0.1994,0.1439;1.0543,0.2477,0.1643;0.7334,1.2933,0.149;0.6681,-0.2368,-0.7438;0.4871,-0.438,1.3312;0.0326,-0.9743,2.2536;3.2153,-1.0794,0.262;4.4626,-0.7336,0.184)|\",1.7496920587149989\r\n\"[H]OC1=NC(=O)N/C2=N/C(C(=O)OC([H])([H])C([H])([H])[H])=N\\C12 |(6.4427,-3.6076,-0.5906;7.0665,-3.5975,-1.3413;7.2236,-2.3383,-1.7494;8.0058,-2.0645,-2.7328;8.1658,-0.7238,-3.1595;8.9043,-0.4699,-4.0814;7.4774,0.4141,-2.5425;6.712,0.1024,-1.5708;5.9142,0.9507,-0.769;5.3523,0.1265,0.0559;4.4063,0.5086,1.1597;4.0184,-0.2967,1.9775;4.0936,1.8036,1.1037;3.1995,2.3035,2.1371;3.468,1.8325,3.0858;3.4268,3.3702,2.1807;1.7437,2.0507,1.7787;1.0968,2.5106,2.5341;1.5291,0.9788,1.7519;1.501,2.4881,0.8052;5.6786,-1.2888,-0.0804;6.4998,-1.2693,-1.0706)|\",2.9823678014800006\r\n\"[H]OC(=O)C([H])([H])C([H])([H])C([H])([H])C1=NC2C(=N1)NC(=O)N=C2O[H] |(0.0182,3.4097,1.9307;0.361,2.7833,2.6079;-0.1837,1.554,2.4775;0.1495,0.647,3.203;-1.1835,1.3693,1.3343;-1.8601,0.5628,1.629;-1.7826,2.2749,1.1913;-0.4905,0.9625,0.0169;0.0163,0.0027,0.1637;-1.2536,0.8086,-0.7549;0.5807,1.962,-0.5074;1.0159,1.5714,-1.431;1.3768,2.0565,0.2389;0.0275,3.3167,-0.7738;-0.2505,3.7326,-2.1472;-0.7227,4.9194,-1.9832;-0.7755,5.2896,-0.5545;-0.268,4.1853,0.1516;-1.2117,6.4012,-0.1051;-1.6657,7.3358,-1.1326;-2.0767,8.4059,-0.7516;-1.6451,7.0464,-2.5215;-1.2029,5.9138,-2.9377;-1.1807,5.6354,-4.242;-0.8157,4.7394,-4.367)|\",2.0789498178200008\r\n\"[H]OC1=NC(=O)NC2=NC(C([H])([H])C#N)=NC21 
|(4.7002,-2.9882,0.3731;5.637,-2.7302,0.2832;5.7059,-1.4136,0.0847;6.8459,-0.8347,-0.0494;6.9125,0.5654,-0.2617;7.9867,1.0991,-0.4051;5.7355,1.4333,-0.3219;4.6227,0.8268,-0.1816;3.3155,1.3624,-0.1987;2.5686,0.3213,-0.0307;1.0688,0.358,-0.0006;0.7384,1.4,-0.0382;0.6826,-0.1515,-0.8951;0.5193,-0.3078,1.1861;0.0826,-0.8288,2.1256;3.2379,-0.9686,0.1131;4.4767,-0.6294,0.0232)|\",3.0558385411149995\r\n\"[H]NC1=NC(S[H])N=C(O[H])/C1=C(/[H])C1=C([H])C([H])=C([H])C([H])=C1[H] |(4.8256,1.126,-0.236;5.5298,0.3992,-0.0797;5.0058,-0.765,-0.2129;5.8402,-1.881,-0.114;5.3898,-2.9947,-0.5852;6.35,-4.4786,-0.461;7.3579,-3.9057,0.2287;4.2052,-3.2337,-1.2769;3.3243,-2.2943,-1.2132;2.2098,-2.4689,-1.942;1.6106,-1.7129,-1.8192;3.5415,-1.0567,-0.4338;2.5928,-0.2452,0.0954;2.9732,0.6404,0.6051;1.125,-0.352,0.1258;0.371,0.8375,0.0695;0.8878,1.7899,-0.0189;-1.0197,0.8057,0.1122;-1.5822,1.7331,0.0537;-1.6881,-0.4145,0.2415;-2.7732,-0.4401,0.2847;-0.9552,-1.5992,0.3345;-1.4667,-2.5487,0.4629;0.4375,-1.573,0.2792;0.9933,-2.4982,0.3897)|\",3.5755759955700004\r\n\"[H]C1=C([H])C([C@]2([H])N=NC3=NC([H])([H])C([H])([H])C([H])([H])N32)=C([H])C([H])=C1C([H])([H])[H] |(2.4417,-1.678,-2.4272;2.939,-1.1487,-1.6176;4.3262,-1.2068,-1.513;4.9014,-1.7781,-2.2367;4.9875,-0.5358,-0.4775;6.4978,-0.5697,-0.3736;6.8071,-0.1082,0.5792;7.1428,0.2726,-1.4516;7.9952,-0.3832,-2.0845;8.0826,-1.7339,-1.5643;8.9348,-2.5842,-1.9792;8.8976,-3.8739,-1.2812;9.1761,-4.6551,-1.9976;9.6886,-3.8677,-0.5158;7.5417,-4.221,-0.6364;7.6332,-5.1179,-0.0142;6.8135,-4.4434,-1.4258;7.0072,-3.0536,0.2086;7.5819,-2.9503,1.1426;5.9583,-3.2147,0.4818;7.112,-1.8499,-0.5947;4.2302,0.1915,0.4445;4.7288,0.7164,1.2564;2.8412,0.2486,0.3343;2.2693,0.8153,1.0654;2.1714,-0.4179,-0.6991;0.6694,-0.3322,-0.8361;0.2567,-1.2329,-1.3026;0.3779,0.521,-1.463;0.1841,-0.2007,0.1367)|\",4.34293705398\r\n\"[H]C1=NC([H])=C([C@@]2([H])/N=N\\C3=NC([H])([H])C([H])([H])C([H])([H])N32)C([H])=C1[H] 
|(0.9302,1.7413,2.4296;0.4349,1.0077,1.7959;-0.7757,0.604,2.1998;-1.4066,-0.2995,1.4421;-2.3886,-0.6153,1.7887;-0.8812,-0.8318,0.2588;-1.6277,-1.8798,-0.5407;-1.3067,-1.8289,-1.595;-1.2543,-3.269,-0.0713;-2.2724,-3.9291,0.2155;-3.4672,-3.141,-0.0059;-4.6434,-3.6011,0.151;-5.696,-2.6284,-0.1554;-6.6036,-3.1797,-0.4219;-5.9302,-2.0619,0.7596;-5.3148,-1.6599,-1.2904;-6.1092,-0.9247,-1.4561;-5.2019,-2.2358,-2.217;-3.9968,-0.9308,-0.9831;-4.16,-0.097,-0.2875;-3.5667,-0.5082,-1.9002;-3.0563,-1.8837,-0.4018;0.3831,-0.3951,-0.1446;0.8325,-0.784,-1.0556;1.0565,0.5399,0.6367;2.0398,0.9035,0.3541)|\",4.378311854545\r\n\"[H]C1=NC([H])=C([H])C([C@]2([H])N([H])N([H])C3=NC([H])([H])C([H])([H])C([H])([H])N32)=C1[H] |(-1.4643,-5.4936,-2.2263;-1.3556,-4.41,-2.2364;-1.1449,-3.8512,-3.4321;-1.0077,-2.5173,-3.4581;-0.8382,-2.0753,-4.4385;-1.069,-1.7048,-2.328;-0.9424,-0.6316,-2.4285;-1.2958,-2.296,-1.0785;-1.3777,-1.4801,0.2067;-1.6314,-2.1622,1.0272;-2.4037,-0.4158,0.1464;-2.8112,-0.3313,1.0764;-1.6935,0.8184,-0.064;-1.9388,1.2457,-0.9516;-0.3068,0.5789,0.0693;0.5853,1.4543,-0.1878;1.9673,1.003,-0.0379;2.5917,1.5729,-0.736;2.325,1.2588,0.9721;2.1568,-0.5032,-0.2862;3.186,-0.805,-0.0619;1.9754,-0.7176,-1.3468;1.1805,-1.3228,0.567;1.5175,-1.363,1.6119;1.1206,-2.3544,0.2034;-0.1565,-0.7295,0.5113;-1.4386,-3.6846,-1.0454;-1.6191,-4.2024,-0.1061)|\",4.8517899544150005\r\n\"[H]C1=C(C(=O)C([H])([H])[H])C(=O)C2=C([H])C([H])=C([H])[C@@]([H])(OC([H])([H])C([H])([H])[H])C2=N1 
|(7.183,1.4849,-3.9746;6.3,1.7759,-3.412;5.3694,0.8063,-3.1372;5.6505,-0.5819,-3.6462;6.6791,-0.7986,-4.2731;4.663,-1.6958,-3.3761;4.5249,-1.8406,-2.2996;5.0426,-2.6097,-3.838;3.6729,-1.4465,-3.7701;4.1731,1.1817,-2.373;3.2524,0.4188,-2.0684;4.1389,2.6206,-1.97;3.13,3.0808,-1.166;2.3666,2.3655,-0.8684;3.0793,4.4339,-0.6736;2.284,4.7137,0.0108;4.0324,5.3205,-1.0285;4.0473,6.3349,-0.6404;5.1113,4.9917,-2.021;4.7891,5.4857,-2.964;6.3193,5.5687,-1.5762;7.1887,6.0655,-2.6055;7.5062,5.2402,-3.2496;6.6369,6.7949,-3.2221;8.3732,6.7253,-1.9227;9.06,7.1332,-2.6724;8.9188,5.9971,-1.3145;8.0438,7.5415,-1.2712;5.2115,3.5054,-2.3711;6.2477,3.1066,-3.0567)|\",2.9469930009150005\r\n\"[H]C1=NC2=C(C(=O)[C@@]1([H])C(=O)C([H])([H])[H])C([H])=C([H])C(Cl)=C2C([H])([H])[H] |(2.1312,-4.3108,0.1064;2.7983,-3.4472,0.0768;2.2509,-2.2939,0.0529;3.0584,-1.1384,0.0369;4.4707,-1.2,-0.0158;5.1624,-2.5027,-0.0544;6.3755,-2.6074,-0.1797;4.2738,-3.747,0.0816;4.5175,-4.3965,-0.7723;4.6579,-4.5415,1.3687;3.9288,-4.521,2.3365;5.9522,-5.316,1.3096;6.7507,-4.6863,0.9049;5.834,-6.1671,0.6256;6.2061,-5.6864,2.3047;5.2298,-0.0256,-0.0473;6.3108,-0.1049,-0.093;4.6002,1.2093,-0.0159;5.1707,2.1312,-0.035;3.2046,1.2571,0.0403;2.4501,2.844,0.0792;2.3957,0.1108,0.0654;0.8938,0.2052,0.1231;0.508,0.8002,-0.7123;0.5718,0.7057,1.0439;0.4523,-0.7898,0.0898)|\",4.527974472319999\r\n\"[H]C1=C([H])C(F)=C2/N=C(/[H])[C@]([H])(C(=O)C([H])([H])[H])C(=O)C2=C1[H] 
|(8.7147,-0.7792,-1.4398;7.8677,-0.1001,-1.4277;8.086,1.2793,-1.4937;9.086,1.696,-1.5569;7.0012,2.1472,-1.4788;7.2294,3.466,-1.5466;5.6796,1.6837,-1.395;4.6466,2.6345,-1.3876;3.4415,2.2235,-1.2806;2.6593,2.9843,-1.2586;2.9791,0.7925,-1.1687;2.2907,0.5698,-1.998;2.1676,0.5904,0.1492;2.2951,1.3722,1.0663;1.2421,-0.6014,0.186;0.4191,-0.4461,-0.5244;0.8349,-0.7215,1.1917;1.7764,-1.501,-0.1357;4.1064,-0.2486,-1.2594;3.8587,-1.4462,-1.2719;5.4828,0.2849,-1.3293;6.5705,-0.5971,-1.3488;6.3726,-1.6626,-1.301)|\",4.421850070625\r\n\"[H]C1=C([H])C2=C(/N=C(/[H])[C@]([H])(C(=O)C([H])([H])[H])C2=O)C(Cl)=C1[H] |(3.913,-6.3735,-0.0801;4.2662,-5.3521,-0.1842;3.3757,-4.3254,-0.4762;2.3151,-4.5108,-0.6101;3.842,-3.0121,-0.6024;5.2158,-2.7047,-0.4483;5.7423,-1.4114,-0.5831;4.947,-0.4357,-0.7992;5.3949,0.556,-0.8842;3.4493,-0.5041,-0.9379;3.1483,-0.0634,-1.8996;2.7646,0.3474,0.1793;3.3696,0.62,1.1926;1.3482,0.7885,-0.0995;1.3539,1.5465,-0.8943;0.9107,1.2177,0.804;0.7552,-0.0565,-0.4634;2.8832,-1.9308,-0.9183;1.7,-2.1412,-1.1444;6.095,-3.765,-0.1577;7.8047,-3.467,0.0445;5.6254,-5.0712,-0.0245;6.3303,-5.8641,0.2025)|\",4.383754131555\r\n\"[H]N1C([H])([H])C([H])([H])C(=O)C([H])([H])[C@]1([H])OC([H])([H])[H] |(1.4187,1.6919,2.5049;1.9181,2.1324,1.7369;0.9912,2.8402,0.8461;0.3226,2.1568,0.2998;0.366,3.5003,1.4564;1.798,3.6709,-0.1648;2.3275,4.4646,0.3815;1.1534,4.1381,-0.9146;2.8502,2.8273,-0.8726;3.0593,2.9034,-2.0666;3.6577,1.9117,0.0418;4.3659,2.5351,0.6032;4.2176,1.1985,-0.5666;2.7657,1.1792,1.0528;3.3788,0.6758,1.816;2.0724,0.1804,0.2927;1.357,-0.7509,1.0783;0.9442,-1.4957,0.3935;0.5251,-0.2865,1.6302;2.0146,-1.2584,1.8027)|\",6.13888846728\r\n\"[H]C1=NC2=C(C(=S)[C@@]1([H])C(=O)OC([H])([H])C([H])([H])[H])C([H])=C([H])C([H])=C2[H] 
|(4.7058,1.3176,-0.7058;5.3986,1.0106,0.0793;5.7313,1.8752,0.958;6.6092,1.4816,1.9877;7.2686,0.222,2.0256;7.0324,-0.7415,0.9536;7.9308,-2.0975,0.6809;5.8544,-0.4343,0.0451;6.1078,-0.6977,-0.9871;4.6262,-1.3,0.4108;4.4387,-1.8327,1.4772;3.7821,-1.3336,-0.6382;2.5387,-2.0635,-0.4469;2.2441,-2.3486,-1.4593;2.7518,-2.9615,0.1373;1.4846,-1.1972,0.2234;0.5394,-1.7486,0.2841;1.3105,-0.2801,-0.3487;1.79,-0.9303,1.2389;8.1526,-0.0447,3.0879;8.6526,-1.0073,3.1102;8.3691,0.8918,4.0897;9.0444,0.6612,4.9085;7.7176,2.1311,4.0427;7.8902,2.8674,4.8227;6.8509,2.4236,2.994;6.3413,3.3793,2.9272)|\",3.24359709796\r\n\"[H]C1=C([H])C(=S)C2=C([H])[C@]([H])(F)C([H])=C([H])C2=N1 |(3.1918,-1.4222,-0.0912;2.3218,-0.7707,-0.0528;2.4818,0.5843,-0.0018;3.4735,1.023,0.0011;1.3477,1.4657,0.0202;1.528,3.1254,0.0291;0.0355,0.7707,0.0028;-1.1477,1.4294,0.0614;-1.1579,2.5132,0.1504;-2.4719,0.7508,-0.0591;-2.8812,1.0091,-1.0539;-3.3658,1.2888,0.8737;-2.4212,-0.7396,0.0609;-3.375,-1.2555,0.1233;-1.2546,-1.4007,0.029;-1.2028,-2.4843,0.0695;0.0278,-0.7053,-0.0292;1.112,-1.434,-0.0539)|\",2.413649853934999\r\n\"[H]C1=C([H])C(=S)/C2=C([H])/C([H])=C(/[H])[C@]([H])(F)C2=N1 |(3.7463,-2.6814,-4.5593;3.3298,-2.8163,-3.5641;2.6468,-3.9537,-3.2475;2.5176,-4.7434,-3.9794;2.0976,-4.1485,-1.9329;1.3185,-5.5728,-1.5137;2.3022,-3.0114,-1.0182;1.7399,-2.9505,0.2358;1.1388,-3.7963,0.5603;1.8645,-1.8061,1.1024;1.3326,-1.8158,2.0493;2.5896,-0.7315,0.7306;2.666,0.1577,1.3501;3.3835,-0.7279,-0.5378;4.45,-0.819,-0.2653;3.2423,0.4996,-1.1733;3.0751,-1.8712,-1.4974;3.5662,-1.7694,-2.6957)|\",2.1551416959600003\r\n\"[H]C1=C([H])C(=S)/C2=C([H])/C([H])=C(/[H])[C@]([H])(Cl)C2=N1 
|(3.2431,5.4158,1.38;2.9043,4.3996,1.5648;2.3306,4.0644,2.7566;2.2151,4.8031,3.5421;1.8725,2.724,3.0094;1.2102,2.2919,4.4864;2.0389,1.8048,1.8714;1.4817,0.547,1.8316;0.9122,0.2158,2.6962;1.5912,-0.3213,0.6905;1.0761,-1.2769,0.7167;2.3179,0.0347,-0.3922;2.4237,-0.624,-1.2486;3.0624,1.3232,-0.4246;2.9897,1.8357,-1.3848;4.8729,0.8696,-0.3258;2.7409,2.2895,0.6926;3.1256,3.5216,0.524)|\",2.07350754081\r\n\"[H]C1C(=S)C2=C(N=C1[H])C(C([H])[H])=C([H])C(C([H])([H])[H])=C2[H] |(7.2488,-3.3922,0.0855;6.6629,-2.4797,0.0844;5.2356,-2.5877,0.0831;4.4697,-4.0876,0.0795;4.5331,-1.2942,0.0842;5.321,-0.0703,0.0838;6.6353,-0.0514,0.083;7.2802,-1.2565,0.0836;8.3665,-1.1931,0.0834;4.5959,1.2235,0.0848;5.2707,2.4017,0.0853;4.7415,3.3501,0.0861;6.3547,2.4151,0.0847;3.1498,1.2057,0.0855;2.634,2.1632,0.0862;2.437,0.0411,0.0855;0.9292,0.013,0.0862;0.5101,1.0234,0.0764;0.5447,-0.5058,0.9729;0.5443,-0.5232,-0.7899;3.1565,-1.1987,0.0857;2.5953,-2.1312,0.087)|\",1.896633537985\r\n\"[H]C1=C([H])C(=S)[C@]2([H])C(=N1)C(OC([H])([H])[H])=C([H])C([H])=C2OC([H])([H])[H] |(1.3364,0.4494,-6.2146;1.8277,-0.0303,-5.3705;1.6262,-1.3645,-5.1362;0.9692,-1.9402,-5.7796;2.1767,-2.0098,-3.987;1.6506,-3.5024,-3.4761;3.1726,-1.1676,-3.1866;2.6942,-1.0491,-2.1948;3.3091,0.2887,-3.6474;2.6007,0.8141,-4.6116;4.2621,1.0892,-2.9094;4.3258,2.4479,-3.0175;3.1123,3.1908,-2.8492;2.4328,3.0321,-3.6886;3.4169,4.2384,-2.8006;2.615,2.9158,-1.9096;5.2802,0.4501,-2.249;6.041,1.0727,-1.7871;5.4614,-0.9728,-2.2532;6.3928,-1.3748,-1.8726;4.5113,-1.7848,-2.7961;4.5968,-3.1067,-2.9589;5.7272,-3.7902,-2.4232;5.5553,-4.8468,-2.6285;5.8043,-3.626,-1.3421;6.6511,-3.4594,-2.9129)|\",2.6721580119099992\r\n\"[H]C1=C([H])C(=S)C2=C([H])[C@]([H])(Br)C([H])=C([H])C2=N1 
|(3.2022,-1.4854,0.0417;2.3358,-0.8289,0.0104;2.5032,0.526,-0.0471;3.4972,0.9592,-0.0665;1.3744,1.4125,-0.0953;1.5662,3.0693,-0.1956;0.0602,0.7272,-0.0545;-1.121,1.4011,-0.0648;-1.1081,2.4868,-0.1049;-2.4335,0.7273,-0.019;-3.1254,1.1345,-0.7603;-3.3564,1.3845,1.7118;-2.408,-0.7523,-0.0048;-3.3616,-1.2706,0.0017;-1.2431,-1.4248,0.0073;-1.2082,-2.5096,0.0197;0.0427,-0.7435,-0.0027;1.1234,-1.4821,0.0266)|\",2.217727881575001\r\n\"[H]C1=C2C(=S)/C([H])=C([H])\\N=C/2[C@]([H])(F)C([H])=C1F |(0.4584,-2.0177,2.3482;1.0939,-1.4916,1.6409;0.5319,-0.6399,0.7244;-0.9204,-0.3964,0.639;-2.0581,-1.1214,1.6288;-1.3024,0.5457,-0.3797;-2.3564,0.7712,-0.4989;-0.3678,1.1437,-1.1712;-0.6638,1.8575,-1.9356;0.9932,0.9171,-1.0917;1.4089,0.0641,-0.2052;2.9137,-0.1904,-0.19;3.4159,0.7778,-0.0569;3.3066,-0.6649,-1.4464;3.3884,-1.1552,0.8504;4.4474,-1.3844,0.8985;2.5106,-1.7288,1.6893;2.9205,-2.5908,2.6368)|\",2.1388148649300005\r\n\"[H]C1C(F)=C([H])[C@@]([H])(F)C2=C1/C1=C(C(=O)N(C([H])([H])[H])N1)\\C([H])=N/2 |(3.3041,-1.0698,-3.9164;4.3636,-1.062,-3.6802;5.2977,-1.3779,-4.7265;4.7477,-1.674,-5.9197;6.6312,-1.3755,-4.5663;7.2991,-1.5967,-5.3912;7.2305,-1.0654,-3.2309;7.8336,-1.9172,-2.8856;8.1185,0.0076,-3.3674;6.2314,-0.7341,-2.1235;4.798,-0.736,-2.4208;3.9603,-0.3714,-1.3028;4.5809,-0.0674,-0.0439;3.4637,0.2504,0.8736;3.4405,0.5719,2.052;2.3441,0.093,0.0392;0.9696,0.2915,0.4425;0.9849,0.5636,1.4994;0.392,-0.628,0.3045;0.5094,1.096,-0.1399;2.6464,-0.2728,-1.2461;5.9306,-0.1181,0.0935;6.4299,0.1037,1.0329;6.7583,-0.4595,-0.9613)|\",2.01908477071\r\n\"[H]OC(NN([H])[H])[C@]1([H])C(=O)C2=C(N=C1[H])/C(F)=C([H])\\C(F)=C/2[H] 
|(2.2804,2.3132,2.3394;3.1336,1.85,2.4873;3.4712,1.2427,1.3364;4.716,1.0694,1.1046;4.9563,0.385,-0.1616;5.5654,1.0304,-0.6708;5.588,-0.3815,0.0828;2.3272,0.7538,0.4304;2.6313,1.0262,-0.5967;0.9836,1.4259,0.6556;0.888,2.5362,1.1774;-0.1945,0.677,0.1787;-0.0458,-0.7062,-0.0751;1.1475,-1.4167,0.1108;2.2147,-0.7615,0.3702;3.1366,-1.3222,0.5033;-1.191,-1.4121,-0.4818;-1.0983,-2.7234,-0.7249;-2.4222,-0.7923,-0.6416;-3.2877,-1.3603,-0.9624;-2.5194,0.572,-0.3686;-3.7132,1.1698,-0.5204;-1.4321,1.3191,0.053;-1.5274,2.376,0.2732)|\",3.55924916454\r\n\"[H]NC1=NC(N2C([H])([H])C([H])([H])C([H])([H])C2([H])[H])N=C(O[H])C1=S |(2.1041,6.1621,-1.2708;2.2598,5.1915,-0.9777;1.1403,4.5441,-1.0219;1.1172,3.2165,-0.6683;-0.0229,2.5708,-0.7154;-0.0479,1.2688,-0.3572;-1.2334,0.3982,-0.3808;-1.8878,0.6236,0.4721;-1.8124,0.5571,-1.292;-0.6278,-1.0081,-0.2664;-1.3281,-1.7241,0.1735;-0.35,-1.3768,-1.2612;0.6323,-0.7836,0.5872;1.3733,-1.5811,0.4821;0.3618,-0.7172,1.648;1.1631,0.571,0.0935;1.8616,0.4649,-0.7469;1.6742,1.1586,0.861;-1.2764,3.0593,-1.1046;-1.3364,4.298,-1.4513;-2.5139,4.7906,-1.8299;-2.3745,5.7386,-2.0516;-0.1484,5.2005,-1.4493;-0.3202,6.7786,-1.8939)|\",2.0027579396800004\r\n\"[H]SC1N[C@]([H])(N([H])[H])N([H])[C@@]([H])(N2C([H])([H])C([H])([H])SC([H])([H])C2([H])[H])N1[H] 
|(-1.193,7.0998,0.0346;-1.2735,6.2892,1.1073;-1.1745,4.7801,0.1365;-1.3444,4.7942,-1.1313;-1.1317,3.4889,-1.7385;-0.0555,3.2645,-1.7439;-1.5984,3.5406,-3.1034;-2.5033,4.0059,-3.1352;-1.7011,2.5978,-3.4726;-1.7789,2.3882,-0.9673;-2.785,2.5479,-1.0303;-1.4086,2.3364,0.456;-2.3359,2.1073,0.9993;-0.464,1.3339,0.8995;-1.0064,-0.0054,1.121;-0.3801,-0.502,1.8743;-2.0116,0.1001,1.5511;-1.0798,-0.9012,-0.1239;-1.7331,-0.4573,-0.881;-1.4675,-1.8905,0.1408;0.5863,-1.1841,-0.8439;1.0778,0.5828,-0.9768;2.1355,0.5803,-1.2601;0.5065,1.0485,-1.7839;0.8921,1.3415,0.3474;1.1977,2.3857,0.2213;1.5578,0.895,1.0982;-0.9034,3.6546,0.88;-0.7401,3.749,1.874)|\",5.651804674885001\r\n\"[H]OC1=N[C@]([H])(N2C([H])([H])C([H])([H])SC([H])([H])C2([H])[H])N([H])[C@@]([H])(N([H])[H])C1([H])[H] |(-4.9773,3.8259,2.3706;-4.6729,2.9523,2.0807;-3.5553,3.1035,1.3147;-2.9995,2.0495,0.8843;-1.8449,2.1341,0.0085;-2.1809,1.7555,-0.9672;-0.7745,1.2098,0.411;-1.1293,-0.193,0.1569;-1.8932,-0.5572,0.8625;-1.5607,-0.2497,-0.8509;0.0983,-1.1023,0.2039;0.8278,-0.7888,-0.5509;-0.1902,-2.1377,-0.0023;0.9011,-1.1151,1.8543;1.0324,0.7056,2.0321;1.3732,0.8884,3.0561;1.794,1.0838,1.3408;-0.2987,1.4165,1.7838;-0.1374,2.4907,1.9271;-1.0418,1.0881,2.5288;-1.327,3.4943,-0.2137;-0.336,3.4293,-0.4325;-1.5964,4.545,0.768;-1.0671,4.4028,1.7262;-1.2087,5.864,0.29;-1.5116,5.9517,-0.6812;-0.1932,5.9483,0.2809;-3.0988,4.5214,1.0537;-3.6389,4.9261,0.1866;-3.3131,5.1755,1.9073)|\",6.247734007479999\r\n\"[H]SC1=NC([H])([H])C([H])([H])[C@@]([H])(C2=C([H])SC([H])=C2[H])N1[H] 
|(-1.8811,-3.01,0.1715;-0.5878,-3.0757,-0.1966;-0.3431,-1.2899,-0.0668;-1.3409,-0.5276,0.1542;-1.0818,0.9092,0.2616;-1.9361,1.4388,-0.1766;-1.0674,1.1849,1.326;0.2144,1.3607,-0.4218;0.4518,2.3989,-0.1677;0.088,1.3034,-1.5098;1.3984,0.4495,-0.0366;2.2266,0.6635,-0.7228;1.9084,0.6596,1.3782;3.0607,1.3467,1.6623;3.7482,1.8045,0.9626;3.3683,1.441,3.366;1.9325,0.5304,3.7134;1.6704,0.2988,4.7371;1.2648,0.1859,2.5727;0.3514,-0.3983,2.5734;0.9751,-0.9328,-0.29;1.6712,-1.6457,-0.1112)|\",5.575612796744999\r\n\"[H]C1=N[C@@]2([H])C(NON2[H])N([H])C(C2=C([H])C([H])=C([H])C([H])=C2[H])=C1[H] |(4.028,-0.788,-1.7141;4.9302,-1.3364,-1.425;6.0336,-0.8336,-1.8551;7.2582,-1.4322,-1.363;7.3312,-1.2691,-0.2789;7.381,-2.9013,-1.7235;8.2074,-3.1312,-2.6755;8.8151,-1.9062,-3.0131;8.4539,-0.8984,-2.016;9.225,-0.9812,-1.3463;6.639,-3.8764,-1.0913;6.9082,-4.8341,-1.2804;5.3973,-3.6709,-0.5085;4.8912,-4.8436,0.2528;3.5319,-5.1945,0.215;2.849,-4.618,-0.4011;3.0631,-6.2927,0.9328;2.0098,-6.5549,0.8857;3.9435,-7.0633,1.6955;3.5767,-7.9214,2.2517;5.2977,-6.728,1.7358;5.9886,-7.3175,2.332;5.7696,-5.6293,1.0189;6.8204,-5.3595,1.077;4.6678,-2.5179,-0.6141;3.7179,-2.5139,-0.089)|\",4.0272849874\r\n\"[H]OC1=NC(N(C([H])([H])C([H])=C([H])[H])C([H])([H])C([H])=C([H])[H])=N[C@]2([H])N=NC([H])=C12 |(9.8534,0.3474,5.1918;9.6398,-0.2074,4.4238;8.3083,-0.1683,4.2136;7.8154,-0.927,3.3021;6.4225,-0.8548,3.0651;6.0395,-1.4662,1.9013;6.9315,-2.2993,1.0875;7.8307,-2.4934,1.6737;6.4302,-3.2576,0.9;7.2959,-1.6385,-0.2179;7.8042,-0.6783,-0.1295;7.0414,-2.1538,-1.4211;7.3393,-1.65,-2.3367;6.5345,-3.1103,-1.5357;4.6278,-1.4252,1.5016;4.2099,-0.4855,1.8724;4.5948,-1.4158,0.4069;3.8444,-2.594,2.0436;3.7969,-2.659,3.1295;3.2392,-3.508,1.2844;2.6775,-4.3324,1.716;3.2686,-3.458,0.1972;5.5156,-0.2991,3.8223;6.0093,0.3582,4.9849;5.8243,-0.2797,5.8797;5.3309,1.6089,5.4571;6.2049,2.4635,5.7633;7.5244,1.9976,5.4285;8.3593,2.6765,5.5537;7.4518,0.7402,4.9567)|\",3.2027800203850005\r\n\"[H]OC(=O)C1=NC(=S)N=C([H])[C@@]1([H])N([H])[H] 
|(0.5958,-4.4009,-0.0491;0.2012,-3.5535,0.234;0.8507,-2.5895,-0.4352;1.7569,-2.7777,-1.2153;0.3057,-1.2069,-0.1745;-0.9614,-1.022,-0.2357;-1.4283,0.3111,-0.1857;-2.8715,0.6856,-0.8689;-0.6588,1.312,0.4358;0.609,1.13,0.4936;1.212,1.9044,0.976;1.3362,-0.1122,0.0341;1.8349,0.0664,-0.9295;2.3547,-0.5692,0.9829;2.0462,-0.4381,1.9439;3.2306,-0.0692,0.8607)|\",2.9796466629750005\r\n\"[H]O/C(=N/[C@]1([H])[C@@]([H])(N([H])[H])N([H])N([H])[C@]1([H])N([H])C([H])([H])[H])C1=C([H])C([H])=C([H])C([H])=C1[H] |(2.2027,-2.3976,0.5815;2.3092,-1.5575,1.0555;2.7338,-0.6076,0.1623;2.6915,0.5971,0.5586;3.2631,1.6908,-0.1818;3.6494,1.4058,-1.1742;4.4516,2.3401,0.586;4.1881,2.4,1.6503;5.7341,1.6705,0.4797;5.83,1.2191,-0.4291;5.8097,0.9407,1.1826;4.522,3.7029,0.0395;5.0967,3.6723,-0.8041;3.1896,4.0763,-0.3616;2.8243,4.7122,0.3456;2.2856,2.8917,-0.3546;1.7668,2.8256,-1.3203;1.2621,3.0374,0.6714;1.5999,2.6194,1.537;-0.0333,2.4424,0.345;-0.7303,2.6381,1.1669;-0.0073,1.354,0.173;-0.4364,2.9243,-0.5533;3.2098,-1.1673,-1.1442;2.7047,-0.6967,-2.3656;1.9553,0.0893,-2.368;3.1452,-1.2475,-3.5687;2.7418,-0.8801,-4.5081;4.0968,-2.2698,-3.5673;4.4409,-2.6942,-4.5063;4.6024,-2.7472,-2.3565;5.3438,-3.5413,-2.3495;4.1563,-2.2053,-1.1514;4.5541,-2.5743,-0.2092)|\",4.264024037335\r\n\"[H]C1=NC([H])=C([H])C(=O)/C1=N/C(=O)C1([H])C([H])([H])C([H])([H])N([H])C([H])([H])C1([H])[H] 
|(-2.6587,0.9099,-2.495;-3.5918,1.1949,-2.0095;-4.6947,0.9127,-2.6086;-5.9052,1.2924,-2.0007;-6.7853,1.012,-2.5733;-6.0208,1.9409,-0.8215;-6.9874,2.2035,-0.4034;-4.8265,2.3035,-0.0506;-4.8624,2.8685,1.0303;-3.5067,1.9013,-0.7077;-2.4122,2.1909,-0.1116;-1.1249,1.9322,-0.623;-0.6818,2.5536,-1.5708;-0.3439,0.9049,0.1862;-0.6025,1.0531,1.2439;1.1737,1.0766,-0.0019;1.4104,1.0266,-1.0692;1.488,2.0634,0.3564;1.9341,-0.025,0.7434;1.7898,0.1089,1.8349;3.0077,0.0673,0.5448;1.4869,-1.3382,0.2758;2.0453,-2.0701,0.709;0.0709,-1.5627,0.5542;-0.1973,-2.5731,0.2246;-0.1736,-1.4919,1.6338;-0.7713,-0.5317,-0.2066;-1.8347,-0.6945,0.0084;-0.617,-0.6773,-1.2832)|\",2.37283277636\r\n\"[H]C1=NNC(=O)[C@@]2([H])C1N(C([H])([H])[H])N=C2SC([H])([H])C([H])([H])[H] |(10.167,-2.4259,2.5234;9.3937,-2.1843,3.243;9.6445,-2.5701,4.5287;8.8103,-2.5073,5.5188;7.5092,-1.9836,5.3435;6.6608,-2.0355,6.2069;7.2576,-1.2289,4.013;7.568,-0.2071,4.2976;8.1436,-1.6918,2.9175;7.4253,-1.6429,1.7848;7.8323,-2.0752,0.4615;8.0822,-3.1428,0.4648;8.7019,-1.5024,0.1253;6.9964,-1.8974,-0.2144;6.0694,-1.3352,2.0132;5.9381,-1.1361,3.2931;4.442,-0.7241,4.0834;3.3218,-0.5763,2.6266;3.7393,0.1745,1.9503;3.3181,-1.5358,2.103;1.9238,-0.1914,3.107;1.2524,-0.1078,2.2457;1.9272,0.7725,3.6266;1.5096,-0.9449,3.7849)|\",3.1537995272950003\r\n\"[H]OC([H])([H])C1=C([H])N(C2=C([H])C([H])=C(Cl)C([H])=C2[H])C(=O)C([H])=C1[H] 
|(-1.6042,-2.5694,1.955;-2.0828,-2.7903,1.1399;-1.1696,-2.6381,0.0565;-0.3628,-3.3877,0.0956;-1.7659,-2.8543,-0.8373;-0.5782,-1.2528,-0.0211;0.7703,-1.0492,-0.0387;1.4774,-1.8704,0.0124;1.3364,0.2041,-0.1023;2.7657,0.3116,-0.1941;3.4748,1.167,0.656;2.9425,1.7845,1.3676;4.8633,1.2317,0.5766;5.4187,1.8921,1.2335;5.5391,0.4392,-0.352;7.2916,0.518,-0.4449;4.8443,-0.4097,-1.2104;5.379,-1.0102,-1.9381;3.4532,-0.4635,-1.1324;2.9013,-1.0988,-1.8187;0.5431,1.401,-0.1198;1.0711,2.5065,-0.149;-0.8876,1.1563,-0.1027;-1.5067,2.0462,-0.1279;-1.4194,-0.0954,-0.0543;-2.4971,-0.2349,-0.0227)|\",4.49532081026\r\n\"[H]/N=C(/S[H])N([H])/N=C(\\C1=NC([H])=C(N(C([H])([H])[H])C([H])([H])[H])C([H])=C1[H])C([H])([H])[H] |(0.3097,0.473,-1.3049;0.3867,1.4122,-0.9011;1.1418,1.3576,0.12;1.5016,2.8369,1.0861;0.7292,3.6223,0.3119;1.7923,0.2544,0.6585;2.3831,0.2938,1.4939;1.6305,-0.9358,0.0454;2.2062,-2.0114,0.4915;3.0868,-2.1326,1.6726;3.3624,-1.0306,2.4077;4.1498,-1.1091,3.4747;4.3189,-0.172,3.9955;4.7467,-2.3084,3.9337;5.5341,-2.3462,5.0664;5.9174,-1.1027,5.7177;6.5,-1.3341,6.611;5.0338,-0.5376,6.0366;6.5242,-0.4532,5.0682;6.2633,-3.562,5.3888;6.8015,-3.4165,6.3269;6.9936,-3.8375,4.612;5.5779,-4.4068,5.5282;4.4746,-3.4567,3.1576;4.9035,-4.4168,3.4196;3.6522,-3.3664,2.0465;3.4538,-4.2598,1.466;1.913,-3.2558,-0.3171;1.4113,-4.0256,0.2838;2.8281,-3.7041,-0.7254;1.2597,-2.9884,-1.1494)|\",4.076265480490001\r\n\"[H]/N=C(/S[H])N([H])/N=C(/C1=NC([H])=C(OC([H])([H])C([H])([H])[H])C([H])=C1[H])C([H])([H])[H] 
|(9.2644,5.3219,-2.5279;10.2515,5.477,-2.7523;10.8567,4.3614,-2.7872;12.6184,4.2685,-3.1544;12.7432,5.6044,-3.2586;10.3316,3.0906,-2.5707;10.9366,2.2769,-2.6087;9.0137,2.9614,-2.2863;8.5368,1.7768,-2.0805;7.0887,1.6832,-1.7705;6.5858,0.448,-1.5751;5.2946,0.3147,-1.2957;4.9128,-0.6932,-1.1437;4.4074,1.3986,-1.1879;3.1177,1.0947,-0.8926;2.1693,2.1601,-0.7807;2.4843,2.8532,0.0121;2.1307,2.7209,-1.7253;0.8214,1.5405,-0.459;0.0619,2.3236,-0.3636;0.8677,0.9842,0.4823;0.5141,0.8525,-1.2526;4.9225,2.6851,-1.3893;4.292,3.5646,-1.322;6.2746,2.8222,-1.6828;6.7066,3.8026,-1.845;9.3627,0.5152,-2.1382;8.7315,-0.35,-1.9392;10.1697,0.5366,-1.3925;9.8244,0.3897,-3.1275)|\",4.400080962584999\r\n\"[H]/N=C(/S[H])N([H])/N=C(\\C1=NC([H])=C(OC([H])([H])[H])C([H])=C1[H])C([H])([H])[H] |(2.7067,-1.6195,-2.3965;1.9033,-1.0293,-2.6356;1.8238,-0.079,-1.7967;0.5191,1.1583,-1.9159;-0.0208,0.5969,-3.0141;2.6478,0.1882,-0.7069;2.5135,0.982,-0.0748;3.6785,-0.6429,-0.4706;4.4879,-0.4561,0.5288;4.4351,0.6324,1.5315;3.46,1.5617,1.4423;3.3878,2.5547,2.3332;2.5747,3.2602,2.1917;4.2962,2.6906,3.3892;4.2747,3.6745,4.3231;3.2538,4.6637,4.2384;3.4279,5.3412,5.0752;2.2557,4.2174,4.3318;3.3171,5.2227,3.2964;5.3154,1.7351,3.499;6.0313,1.8204,4.3105;5.3857,0.709,2.574;6.172,-0.0315,2.6563;5.5878,-1.4877,0.643;5.542,-2.0338,1.5943;6.5851,-1.0338,0.574;5.4794,-2.2058,-0.1713)|\",4.16334191265\r\n\"[H]/N=C(/S[H])N([H])/N=C(\\C1=NC([H])=C(F)C([H])=C1[H])C([H])([H])[H] 
|(4.1113,2.472,-1.1108;5.1323,2.4032,-1.0519;5.4439,1.3016,-0.5031;7.1636,0.8419,-0.2377;7.6314,1.9768,-0.7903;4.5941,0.3006,-0.0313;4.9268,-0.5636,0.4042;3.2716,0.4901,-0.1517;2.4044,-0.386,0.2592;2.6902,-1.6889,0.9006;3.9808,-2.0466,1.103;4.2643,-3.2156,1.6774;5.3115,-3.4679,1.8245;3.2674,-4.0958,2.0823;3.6046,-5.2629,2.6554;1.9314,-3.7667,1.8939;1.1505,-4.4511,2.2094;1.6421,-2.5467,1.2953;0.6094,-2.2625,1.1357;0.9613,0.0081,0.0363;0.4025,0.072,0.979;0.4352,-0.7071,-0.6094;0.9356,0.9872,-0.4448)|\",4.1442939431150005\r\n\"[H]/N=C(/S[H])N([H])/N=C(\\C1=NC([H])=C(N([H])[H])C([H])=C1[H])C([H])([H])[H] |(3.2286,0.0518,-3.1752;4.0923,0.5524,-3.4082;4.3782,1.3428,-2.4551;5.8431,2.3904,-2.5259;6.2006,1.9369,-3.742;3.6895,1.548,-1.2646;3.9808,2.2185,-0.5481;2.5675,0.8326,-1.0536;1.8634,0.9727,0.0288;2.1347,1.8831,1.1623;3.2256,2.6837,1.1176;3.4907,3.5093,2.1239;4.3835,4.1259,2.0205;2.6932,3.619,3.2777;2.9875,4.5409,4.2712;2.6096,4.3441,5.1888;3.9465,4.862,4.3096;1.5597,2.7924,3.3349;0.9002,2.832,4.1989;1.2832,1.9318,2.2843;0.4071,1.2963,2.3369;0.6392,0.087,0.0952;-0.2865,0.6719,0.1736;0.6712,-0.5926,0.957;0.5906,-0.5142,-0.8143)|\",4.1442939431150005\r\n\"[H]/N=C(/S[H])N([H])N([H])/C(=C1/N=C([H])C(=O)C([H])=C1[H])C([H])([H])[H] |(3.7135,-2.0523,-2.2042;3.4827,-1.8476,-3.1808;3.1547,-0.6322,-3.3303;2.6898,0.0218,-4.9382;3.0027,-1.1211,-5.5768;3.0384,0.3657,-2.3437;3.1417,1.3267,-2.6584;3.6025,0.1088,-1.1092;4.6141,0.123,-1.0553;2.8498,0.001,0.0315;3.4536,-0.094,1.2845;2.5953,-0.2781,2.3605;3.0881,-0.3895,3.5523;2.396,-0.5347,4.3828;4.5266,-0.3365,3.8987;4.9232,-0.4528,5.0608;5.4042,-0.1308,2.7481;6.4726,-0.067,2.9318;4.8782,-0.0148,1.5018;5.5563,0.1554,0.6664;1.3632,-0.0294,-0.1528;0.8885,-0.1955,0.8118;1.0795,-0.8204,-0.8558;1.0082,0.9159,-0.5793)|\",3.66809470474\r\n\"[H]/N=C(/S[H])N([H])/N=C(\\C1=NC([H])=C([H])C([H])=C1F)C([H])([H])[H] 
|(3.4692,-2.0375,-2.826;4.4802,-2.2069,-2.8385;5.0414,-1.4285,-2.0076;6.8247,-1.4494,-1.7623;7.0115,-2.4454,-2.6486;4.4488,-0.472,-1.1818;4.9707,0.1088,-0.5207;3.119,-0.3241,-1.2499;2.4563,0.5283,-0.5231;3.0421,1.4553,0.4755;4.3821,1.4244,0.6757;4.9653,2.2314,1.5627;6.0446,2.1363,1.6549;4.2592,3.1489,2.337;4.7689,3.7873,3.0509;2.8793,3.2155,2.1631;2.254,3.9028,2.7238;2.2851,2.3731,1.2371;0.9485,2.466,1.093;0.9636,0.5116,-0.7797;0.3927,0.2684,0.1226;0.762,-0.2475,-1.5375;0.5963,1.4777,-1.1417)|\",3.981025632815\r\n\"[H]/N=C(/S[H])N([H])/N=C(/C1=C(OC([H])([H])C([H])([H])[H])C([H])=C([H])C([H])=N1)C([H])([H])[H] |(1.6239,-2.9209,1.5323;1.2859,-3.8833,1.6349;2.2859,-4.6539,1.7662;2.0861,-6.431,1.9785;0.7422,-6.3553,1.9891;3.6355,-4.3027,1.7572;4.3463,-5.0004,1.9542;3.9479,-2.9941,1.6287;5.1932,-2.6451,1.6615;5.5373,-1.2059,1.5196;4.5945,-0.1535,1.2982;3.2789,-0.4583,1.2005;2.3339,0.5896,0.9591;2.3847,1.3291,1.7705;2.5798,1.0999,0.0172;0.9563,-0.0448,0.8893;0.2022,0.7221,0.6817;0.7031,-0.5303,1.8368;0.9155,-0.7948,0.0937;5.08,1.1549,1.1873;4.3977,1.9788,1.018;6.4466,1.4015,1.2924;6.8365,2.4116,1.2091;7.2952,0.3214,1.5053;8.3712,0.4586,1.5923;6.8488,-0.9298,1.6127;6.3165,-3.6423,1.8397;7.2785,-3.1345,1.8171;6.2967,-4.3972,1.0416;6.227,-4.1685,2.8006)|\",4.356542746505\r\n\"[H]/N=C(/S[H])N([H])/N=C(\\C1=NC([H])=C([H])C([H])=C1OC([H])([H])[H])C([H])([H])[H] 
|(2.7562,-3.6656,0.3719;3.4248,-4.0924,-0.2775;4.0313,-3.1708,-0.9069;5.2883,-3.5544,-2.1389;5.1574,-4.8836,-1.9713;3.8715,-1.792,-0.7915;4.4032,-1.1054,-1.3336;2.9662,-1.3378,0.0879;2.7278,-0.0727,0.2908;3.403,1.0592,-0.4017;4.3394,0.7415,-1.3211;5.0045,1.6788,-1.9994;5.7387,1.3224,-2.7179;4.7802,3.0346,-1.8069;5.3335,3.7792,-2.3703;3.8234,3.4147,-0.8678;3.6279,4.4663,-0.6959;3.1224,2.4403,-0.1533;2.18,2.7497,0.7738;1.8841,4.1155,1.0345;1.1073,4.1051,1.8003;1.5054,4.6232,0.1385;2.7629,4.6514,1.4149;1.6567,0.1529,1.3398;0.7976,0.7022,0.9427;2.0308,0.7228,2.1959;1.3222,-0.8272,1.6857)|\",3.95109310926\r\n\"[H]/N=C(/S[H])N([H])/N=C(/C1=C(C([H])([H])[H])C([H])=C([H])C([H])=N1)C([H])([H])[H] |(0.4865,-2.8922,-2.1568;0.4055,-3.8513,-2.5052;1.3989,-4.5353,-2.1097;1.5643,-6.2762,-2.5421;0.3679,-6.3139,-3.1572;2.4846,-4.1147,-1.3431;3.1574,-4.7963,-1.0072;2.5196,-2.8342,-0.9067;3.5033,-2.4662,-0.1487;3.5429,-1.0554,0.3285;2.5916,-0.054,-0.0174;1.3997,-0.2575,-0.9189;0.855,0.6853,-1.0354;0.7121,-1.0078,-0.515;1.6965,-0.6138,-1.9102;2.7936,1.2168,0.5321;2.0875,2.0075,0.29;3.8702,1.4815,1.3725;4.0272,2.4675,1.7992;4.7431,0.4328,1.6465;5.6055,0.5754,2.2958;4.5847,-0.7889,1.1422;4.6069,-3.4119,0.2674;5.3354,-2.8895,0.8835;5.1182,-3.8247,-0.6132;4.2054,-4.2552,0.8473)|\",4.547022441855001\r\n\"[H]/N=C(/S[H])N([H])/N=C(\\C1=NC([H])=C([H])C([H])=C1N([H])[H])C([H])([H])[H] 
|(4.0684,-0.5562,2.8762;5.0485,-0.7528,2.6497;5.1315,-1.0783,1.4256;6.7195,-1.4707,0.6701;7.4186,-1.0705,1.7488;4.1002,-1.2419,0.4985;4.2981,-1.1745,-0.4997;2.8505,-0.8865,0.9022;1.8576,-0.9764,0.0791;1.9596,-1.4582,-1.3348;2.9763,-0.9624,-2.0642;3.0931,-1.2873,-3.3531;3.9343,-0.8523,-3.8874;2.1898,-2.1372,-3.9924;2.2996,-2.3688,-5.0475;1.1705,-2.7076,-3.2424;0.4731,-3.4073,-3.6982;1.0474,-2.4011,-1.8781;0.0483,-2.9928,-1.1076;-0.3518,-3.8304,-1.5134;0.2832,-3.1336,-0.1327;0.542,-0.4217,0.5785;-0.3166,-1.0442,0.3131;0.3653,0.5645,0.1276;0.5878,-0.2956,1.6631)|\",4.416407793615001\r\n\"[H]/N=C(/S[H])N([H])N([H])/C(=C1/N=C([H])C([H])=C([H])C1=O)C([H])([H])[H] |(3.3834,0.2307,-2.07;2.7511,-0.1757,-2.7654;1.9201,-0.9687,-2.2279;0.6893,-1.8377,-3.2123;0.9712,-1.1344,-4.3249;1.8223,-1.3588,-0.8836;0.9164,-1.671,-0.5512;2.5941,-0.6999,0.0596;3.484,-1.1219,0.3991;2.2147,0.4275,0.6853;3.0548,0.9522,1.6983;2.6283,2.1068,2.3064;3.3608,2.6262,3.2559;2.9958,3.5438,3.7147;4.5914,2.0483,3.7039;5.1462,2.5364,4.5018;5.051,0.9006,3.1281;5.9781,0.4234,3.4312;4.3012,0.2703,2.0604;4.7078,-0.7846,1.5006;0.9198,1.0778,0.3017;0.8147,2.011,0.8518;0.0698,0.4266,0.5418;0.8922,1.2706,-0.7764)|\",3.643604458195\r\n\"[H]/N=C(/S[H])N([H])/N=C(\\[H])C1=C([H])C([H])=C(OC([H])([H])C([H])([H])[H])C([H])=N1 |(10.5777,1.3754,1.5116;11.581,1.1867,1.4311;11.7627,-0.0303,1.1189;13.4196,-0.7023,0.905;14.0133,0.4389,1.3004;10.8,-1.0077,0.8798;11.0746,-1.9779,0.7395;9.4894,-0.6914,0.9947;8.6284,-1.634,0.8462;8.9168,-2.6718,0.6409;7.1874,-1.3951,0.9393;6.6377,-0.128,1.1728;7.2932,0.728,1.292;5.2562,0.0168,1.247;4.8218,0.9938,1.4264;4.4564,-1.1211,1.0844;3.1008,-1.1525,1.1293;2.3885,0.0711,1.3386;2.691,0.5154,2.2972;2.6382,0.7842,0.5403;0.9064,-0.257,1.3332;0.3201,0.656,1.4811;0.6642,-0.9604,2.1356;0.6153,-0.7065,0.3789;5.0983,-2.352,0.8526;4.4882,-3.2439,0.7232;6.4157,-2.4919,0.7809)|\",4.356542746505\r\n\"[H]/N=C(/S[H])N([H])/N=C(\\[H])C1=NC([H])=C(OC([H])([H])[H])C([H])=C1[H] 
|(3.5936,7.426,1.8145;4.123,8.1846,2.2539;5.1018,7.7026,2.9023;6.2541,8.7674,3.7872;5.7365,9.8919,3.2589;5.4436,6.3626,3.0727;6.3416,6.1129,3.4821;4.6866,5.4055,2.4875;5.0853,4.1869,2.5767;6.0095,3.9028,3.0937;4.3321,3.0767,1.9911;4.8869,1.8648,2.1529;4.2619,0.7977,1.6461;4.7528,-0.1583,1.8057;3.0475,0.8837,0.9503;2.3717,-0.1664,0.4165;2.9199,-1.4699,0.5731;2.2185,-2.1479,0.0846;3.012,-1.7409,1.6325;3.9024,-1.5517,0.0911;2.4696,2.1526,0.7839;1.5283,2.2371,0.2496;3.1145,3.2575,1.3069;2.6994,4.2538,1.2009)|\",4.38103299305\r\n\"[H]/N=C(/S[H])N([H])/N=C(\\[H])C1=NC([H])=C([H])C([H])=C1OC(=O)C([H])([H])[H] |(1.8903,2.3246,-0.1955;1.3508,3.0061,-0.7378;1.9262,3.2424,-1.8439;1.3085,4.4846,-2.9886;0.5797,5.1164,-2.0493;3.0466,2.605,-2.3874;3.6521,3.1292,-3.0187;3.5295,1.5056,-1.7633;4.7218,1.1271,-2.0552;5.3662,1.6905,-2.7399;5.3634,-0.0552,-1.4724;6.6972,-0.0906,-1.6669;7.3991,-1.1095,-1.1721;8.4715,-1.0841,-1.3551;6.8234,-2.1662,-0.4645;7.4351,-2.9768,-0.0811;5.4436,-2.1565,-0.2808;4.9281,-2.9581,0.2388;4.703,-1.0952,-0.7878;3.3204,-1.1635,-0.7069;2.6714,-0.4486,0.2772;3.2476,0.1874,1.1219;1.1818,-0.5862,0.1103;0.6854,-0.3071,1.0407;0.91,-1.6034,-0.1828;0.8545,0.0942,-0.6839)|\",4.50076308727\r\n\"[H]OC([H])([H])C1=C([H])N(C2=C([H])C([H])=C(C([H])([H])[H])C([H])=C2[H])C(=O)C([H])=C1[H] 
|(9.9194,2.8348,-0.0179;10.1949,2.8785,-0.9476;9.0569,2.5507,-1.7418;8.2808,3.3306,-1.6816;9.4322,2.5506,-2.7715;8.4648,1.2101,-1.3863;7.1433,1.0688,-1.0764;6.4584,1.9102,-1.0799;6.5721,-0.1434,-0.7692;5.1843,-0.1736,-0.3906;4.7202,0.7119,0.5873;5.4184,1.3763,1.0883;3.3712,0.7252,0.9338;3.0255,1.4193,1.6963;2.46,-0.1501,0.3288;0.9987,-0.1319,0.712;0.4407,-0.9185,0.1949;0.531,0.8285,0.4606;0.8658,-0.2812,1.7904;2.9482,-1.0401,-0.6364;2.2628,-1.7342,-1.1164;4.2934,-1.0593,-1.0002;4.6527,-1.7598,-1.7432;7.3259,-1.3648,-0.7686;6.7908,-2.4383,-0.5172;8.7339,-1.1843,-1.0748;9.3288,-2.0909,-1.0557;9.2745,0.0308,-1.3642;10.336,0.1252,-1.5799)|\",4.530695610825\r\n\"[H]C1=C([H])C(C([H])([H])OC([H])([H])[H])=C([H])N(C2=C([H])C([H])=C(F)C([H])=C2[H])C1=O |(7.1028,3.5118,0.3416;6.7289,2.5004,0.4542;5.4006,2.229,0.5876;4.6802,3.0456,0.5871;4.9348,0.8883,0.7347;3.4661,0.6,0.9238;2.8813,1.0879,0.1225;3.1115,1.0361,1.8761;3.2433,-0.7936,0.9148;1.8869,-1.1331,1.1147;1.8223,-2.2236,1.0907;1.2442,-0.7179,0.3223;1.5168,-0.7728,2.0875;5.8648,-0.108,0.7211;5.5835,-1.145,0.8447;7.2113,0.1379,0.5708;8.1069,-0.9833,0.4836;9.2599,-1.0389,1.2752;9.5009,-0.2151,1.9341;10.1006,-2.1462,1.201;10.9999,-2.2118,1.804;9.7758,-3.1863,0.3363;10.5906,-4.2591,0.2676;8.6402,-3.15,-0.4623;8.4254,-3.9745,-1.1336;7.8077,-2.0335,-0.3899;6.9291,-1.9737,-1.0248;7.7424,1.4625,0.4488;8.9498,1.6563,0.3456)|\",4.465388286705\r\n\"[H]OC1=C([H])C([H])=C(N2C(=O)C([H])=C([H])C(C([H])([H])OC([H])([H])[H])=C2[H])C([H])=C1[H] 
|(11.9938,-4.1691,-1.0702;11.3038,-4.3155,-1.7358;10.3951,-3.2975,-1.6637;10.4936,-2.2593,-0.7301;11.3205,-2.2413,-0.0223;9.5382,-1.247,-0.6982;9.6249,-0.4404,0.0182;8.4737,-1.2645,-1.6038;7.4535,-0.251,-1.5759;7.8351,1.1288,-1.5859;9.0154,1.4639,-1.5403;6.7123,2.044,-1.6631;6.9748,3.0955,-1.6969;5.4178,1.6198,-1.6966;4.6116,2.3497,-1.753;5.0997,0.2298,-1.6528;3.6652,-0.2377,-1.6488;3.1199,0.211,-2.4997;3.1512,0.1097,-0.7332;3.6232,-1.6463,-1.726;2.3087,-2.1614,-1.6935;2.3897,-3.2491,-1.7585;1.7074,-1.7955,-2.5409;1.7894,-1.8955,-0.759;6.1376,-0.6524,-1.5914;5.969,-1.7196,-1.5395;8.3868,-2.2955,-2.5453;7.5779,-2.2942,-3.27;9.3339,-3.3138,-2.5735;9.273,-4.1161,-3.3017)|\",4.424571209130001\r\n\"[H]OC([H])([H])C([H])([H])[C@]1(C2=C([H])C(F)=C([H])C([H])=C2[H])C([H])([H])N([H])C([H])([H])C([H])([H])C1([H])[H] |(-0.9653,-2.2987,-4.6329;-1.5518,-2.4593,-3.8784;-0.7897,-2.2746,-2.6868;0.0318,-3.0075,-2.6336;-0.3382,-1.2726,-2.6553;-1.7537,-2.457,-1.5171;-2.2301,-3.4371,-1.6353;-2.5582,-1.7204,-1.6326;-1.1378,-2.3397,-0.0936;-0.4193,-0.9859,0.0439;0.9581,-0.883,0.2834;1.5952,-1.753,0.3834;1.5527,0.3696,0.4;2.8838,0.4252,0.6304;0.8371,1.5521,0.2885;1.3432,2.5063,0.389;-0.5345,1.4548,0.0456;-1.127,2.3608,-0.0492;-1.1506,0.2108,-0.0751;-2.2184,0.1738,-0.2676;-0.1822,-3.5394,0.195;0.3017,-3.3703,1.1664;0.6185,-3.5914,-0.551;-0.8173,-4.8534,0.2659;-1.1705,-5.1242,-0.6499;-1.9017,-4.8974,1.2501;-2.3649,-5.8902,1.2118;-1.4423,-4.7985,2.2448;-2.9497,-3.7901,1.0643;-3.5386,-3.9875,0.1586;-3.6607,-3.7958,1.9017;-2.2756,-2.4108,0.9736;-1.8384,-2.1626,1.9509;-3.0295,-1.6414,0.7699)|\",5.62187215133\r\n\"[H]OC(=N/C([H])([H])C1=C([H])C([H])=C([H])C([H])=C1[H])/C([H])=C(\\[H])C(=O)OC([H])([H])[H] 
|(-2.3272,4.5899,3.691;-1.4159,4.2588,3.6739;-1.1504,3.7596,2.4265;0.0753,3.6282,2.1211;0.5125,3.0274,0.8783;0.7866,1.984,1.1017;-0.2639,2.9701,0.0998;1.7323,3.7348,0.3041;2.0112,3.6269,-1.0634;1.3333,3.0713,-1.7089;3.1468,4.2263,-1.6094;3.3468,4.1348,-2.6739;4.0173,4.9486,-0.7913;4.9,5.4206,-1.2147;3.7418,5.0652,0.5723;4.4113,5.6306,1.2157;2.6073,4.4621,1.1181;2.3813,4.5579,2.1748;-2.326,3.3948,1.6062;-2.2014,3.3515,0.5283;-3.535,3.0891,2.1029;-3.7455,3.0579,3.1685;-4.6486,2.7283,1.1905;-4.5784,2.6833,-0.0213;-5.77,2.4553,1.8928;-6.9138,2.0916,1.1021;-7.1548,2.8853,0.3901;-7.7282,1.9493,1.8124;-6.7177,1.1681,0.5507)|\",4.11436141956\r\n\"[H]O[C@@]([H])(C(=C([H])[H])C([H])=C([H])[H])C1=C([H])C([H])=C([H])C([H])=C1Br |(4.2641,0.4615,-0.2546;4.7627,0.4774,-1.0863;3.8097,0.341,-2.1444;3.4492,-0.6952,-2.1935;2.6082,1.2522,-1.8872;1.3691,0.7418,-1.8293;0.4996,1.3757,-1.6814;1.1851,-0.321,-1.9551;2.9011,2.6848,-1.6769;3.7511,3.0827,-2.2288;2.2271,3.4991,-0.8563;2.4845,4.5507,-0.7681;1.4028,3.1454,-0.2415;4.5328,0.6657,-3.4462;5.6841,1.4694,-3.42;6.0439,1.8046,-2.4542;6.3654,1.8125,-4.5856;7.254,2.4348,-4.527;5.9097,1.3508,-5.82;6.4323,1.6089,-6.7367;4.7779,0.5394,-5.8776;4.4171,0.1603,-6.8274;4.1081,0.2065,-4.7003;2.5759,-0.9546,-4.885)|\",5.513026611130001\r\n\"[H]C([H])([H])C(NO)(C([H])([H])[H])C([H])([H])C([H])([H])C#N |(0.4196,0.7138,0.2192;0.8106,-0.2202,-0.1983;0.5147,-1.0451,0.456;0.3304,-0.3912,-1.1659;2.3262,-0.1276,-0.3567;2.9505,-1.4147,-0.8696;2.1604,-2.291,-1.1353;2.7291,0.9166,-1.4171;2.4424,1.9186,-1.0806;2.2253,0.7165,-2.3685;3.81,0.904,-1.5898;3.0841,0.1739,0.9574;4.1588,0.1831,0.7451;2.8145,1.182,1.2922;2.8118,-0.8341,2.0966;3.0557,-1.8534,1.7734;1.7524,-0.834,2.3765;3.5999,-0.5245,3.2926;4.2288,-0.2614,4.2321)|\",3.757892275405\r\n\"[H]C([H])=C(C([H])([H])[H])[C@@]1([H])C([H])([H])C(=O)C(C([H])([H])[H])(C([H])([H])[H])[C@]([H])(C([H])=C([H])[H])C1([H])[H] 
|(2.4084,-1.7775,5.0756;2.9738,-2.0302,4.1817;2.863,-3.047,3.8208;3.7555,-1.12,3.5879;3.8816,0.2765,4.1559;3.314,0.3796,5.0853;3.5184,1.0413,3.4574;4.9312,0.522,4.3691;4.5923,-1.3978,2.3433;5.6143,-1.0626,2.5781;4.6746,-2.8842,1.9387;4.915,-3.5318,2.7859;5.4898,-2.9879,1.2082;3.4063,-3.4153,1.2791;2.9231,-4.48,1.622;2.7958,-2.5839,0.1308;3.6693,-2.8345,-1.1264;3.2213,-2.3521,-2.0013;3.7268,-3.9086,-1.3348;4.6922,-2.4582,-1.0217;1.3691,-3.0809,-0.1495;0.9229,-2.5133,-0.9716;0.732,-2.9651,0.734;1.3742,-4.142,-0.4113;2.7711,-1.0651,0.5483;2.0306,-0.9845,1.356;2.3288,-0.1532,-0.5702;3.0213,-0.0477,-1.4068;1.1928,0.546,-0.5875;0.9398,1.2069,-1.4125;0.4707,0.4855,0.2244;4.1239,-0.5809,1.1137;4.8982,-0.6419,0.3369;4.0393,0.4807,1.3693)|\",5.951129910435\r\n\"[H]O[C@@]([H])(C(=C([H])[H])C([H])=C([H])[H])C1=C([H])C([H])=C([H])C([H])=C1C([H])([H])[H] |(5.4285,-0.3896,1.2565;5.6746,-0.5313,0.3274;4.6968,-1.409,-0.2308;4.7162,-2.3656,0.31;3.2727,-0.8588,-0.1166;3.0342,0.3978,0.2843;2.0246,0.795,0.3271;3.8451,1.0689,0.548;2.2003,-1.8069,-0.4813;2.4401,-2.5035,-1.2854;1.0011,-1.8883,0.1062;0.2558,-2.605,-0.2271;0.7262,-1.2475,0.9402;5.0967,-1.6603,-1.6812;5.1004,-0.5729,-2.5632;4.7987,0.4017,-2.1914;5.4807,-0.7245,-3.8943;5.479,0.1328,-4.5621;5.8574,-1.9841,-4.36;6.1532,-2.1219,-5.3968;5.8563,-3.0709,-3.4876;6.1553,-4.0511,-3.852;5.4856,-2.9341,-2.142;5.5313,-4.1488,-1.2379;5.8794,-5.0257,-1.7921;4.5482,-4.3963,-0.8187;6.2148,-4.001,-0.3928)|\",5.736159968539999\r\n\"[H]/C1=N/[C@]2([H])C(N=C(Cl)NC2=O)N1C([H])(C([H])([H])[H])C([H])([H])[H] 
|(3.5482,-2.9878,-0.5068;3.3482,-1.9271,-0.6252;2.6199,-1.4196,-1.5461;2.7012,0.0184,-1.3306;3.2761,0.4609,-2.1635;3.4727,0.2091,-0.0535;3.5749,1.3228,0.6145;2.7326,2.2861,0.0937;3.0635,3.8844,0.739;1.7299,2.1754,-0.7103;1.4432,0.9084,-1.2602;0.3637,0.6009,-1.6976;3.9208,-1.0232,0.2991;4.6803,-1.4002,1.5127;4.8727,-2.4728,1.3915;6.0219,-0.6613,1.5617;6.5992,-1.0106,2.4238;5.8675,0.4164,1.6617;6.6084,-0.8509,0.6568;3.8345,-1.1867,2.7746;4.3869,-1.543,3.6504;2.8917,-1.7406,2.7165;3.6104,-0.1257,2.9141)|\",4.435455763149999\r\n\"[H]C1=N[C@@]2([H])C(N=C(Cl)NC2=O)N1C([H])([H])C([H])([H])[H] |(1.5378,2.2999,-0.9771;2.6105,2.1336,-1.0005;3.4754,2.996,-1.383;4.7654,2.3487,-1.1746;5.2816,2.8674,-0.3475;4.4611,0.9386,-0.7578;5.3048,-0.0495,-0.7031;6.5222,0.3215,-1.2412;7.7899,-0.8422,-0.9006;6.8326,1.3235,-1.9928;5.8444,2.2892,-2.2778;5.894,3.0339,-3.2233;3.1173,0.8835,-0.5799;2.3437,-0.3207,-0.2635;2.9618,-0.92,0.4098;1.4551,-0.0011,0.2903;1.9647,-1.1255,-1.5082;1.3884,-2.0092,-1.2146;2.8617,-1.4608,-2.0367;1.3533,-0.5328,-2.197)|\",4.449061455675\r\n\"[H]C([H])=C1O[C@@]([H])(C([H])([H])[H])[C@]2(C1([H])[H])C([H])([H])C(=C([H])[H])O[C@]2([H])C([H])([H])[H] |(0.9622,-1.1251,1.4547;1.4314,-0.1667,1.2665;1.0653,0.7013,1.8034;2.4261,-0.0667,0.3834;3.0551,1.1278,0.1258;4.1431,0.9445,-0.8081;3.7626,1.1884,-1.8132;5.2277,1.9426,-0.4298;6.078,1.8886,-1.1155;5.5825,1.7684,0.5902;4.8174,2.9563,-0.4734;4.4624,-0.5705,-0.762;3.0516,-1.1412,-0.4678;2.489,-1.2537,-1.4055;3.0719,-2.1179,0.0233;5.4599,-1.0395,0.3191;6.3383,-0.3833,0.3607;5.0265,-1.076,1.3223;5.8531,-2.4042,-0.1966;6.3193,-3.4535,0.482;6.4634,-3.3912,1.5541;6.572,-4.381,-0.0195;5.645,-2.4523,-1.5571;5.0851,-1.2063,-2.0293;4.304,-1.475,-2.7499;6.1733,-0.4146,-2.7525;5.7672,0.4937,-3.212;6.5974,-1.0361,-3.5474;6.9886,-0.134,-2.0778)|\",6.642299090705\r\n\"[H]C1=C([H])C([H])=C([H])[C@@]2([H])C1=NC(=S)C(C1([H])OC([H])([H])C([H])([H])O1)=C2[H] 
|(5.5299,1.6351,-0.3844;4.5206,1.266,-0.5349;3.4296,2.0564,-0.3719;3.5571,3.103,-0.108;2.0765,1.5353,-0.4921;1.2426,2.2005,-0.2865;1.8596,0.256,-0.8448;0.8513,-0.1391,-0.947;2.9987,-0.6596,-1.2068;2.9952,-0.68,-2.3198;4.3753,-0.144,-0.8258;5.4175,-0.9174,-0.7623;5.2794,-2.29,-0.9168;6.5908,-3.1672,-1.4207;3.9466,-2.8786,-0.6286;3.7358,-4.3252,-0.2175;2.695,-4.423,0.1415;3.8937,-5.2358,-1.2985;4.8542,-6.2261,-0.9184;4.4914,-7.2064,-1.244;5.818,-6.0061,-1.3882;4.9208,-6.0794,0.6027;5.9065,-6.2834,1.0276;4.1683,-6.7055,1.1065;4.635,-4.696,0.7996;2.8651,-2.0937,-0.7733;1.8632,-2.4929,-0.6248)|\",2.7211385050000003\r\n\"[H]C#C[C@@](O[Si](C([H])([H])C([H])([H])[H])(C([H])([H])C([H])([H])[H])C([H])([H])C([H])([H])[H])(C([H])([H])[H])C([H])([H])C([H])([H])C([H])=O |(5.1521,2.3998,4.6785;4.7681,1.7166,3.9547;4.3504,0.9395,3.1288;3.7938,-0.0245,2.1549;4.7651,-0.4568,1.2018;6.4453,-0.6906,1.2949;6.8118,-1.7182,-0.2532;6.5818,-1.0945,-1.1277;7.8962,-1.8982,-0.2955;6.0533,-3.0526,-0.3626;6.311,-3.586,-1.2857;4.9705,-2.8885,-0.3627;6.2844,-3.7212,0.4757;7.3972,0.9497,1.2222;7.059,1.577,2.0565;8.4529,0.7212,1.4317;7.2985,1.7433,-0.0935;7.8521,2.6881,-0.0304;6.2594,1.9917,-0.3378;7.7089,1.1809,-0.94;6.9009,-1.6425,2.8762;6.5434,-1.076,3.746;6.3328,-2.5836,2.872;8.4013,-1.9516,3.0388;8.5905,-2.5496,3.9389;8.9975,-1.0362,3.1294;8.7956,-2.5173,2.1861;3.2456,-1.251,2.9138;2.4378,-0.958,3.5922;4.0353,-1.7247,3.5026;2.8606,-1.978,2.1918;2.6426,0.6351,1.3533;2.249,-0.1364,0.6795;1.8406,0.9053,2.0508;3.0689,1.8682,0.5418;3.9106,1.5745,-0.1005;3.3916,2.6858,1.1926;1.9475,2.3582,-0.343;1.5633,1.6011,-1.0671;1.466,3.469,-0.304)|\",6.20963806841\r\n\"[H]O[C@@]1([H])[C@@]([H])(O[H])[C@@]([H])(O[H])C([H])([H])N2C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[C@]21[H] 
|(-1.9899,2.8748,0.2037;-1.8386,2.4838,1.0816;-1.6816,1.09,0.8641;-2.0564,0.5829,1.7611;-2.5718,0.6318,-0.3054;-3.62,0.7822,-0.0287;-2.4109,1.4881,-1.4624;-1.5152,1.353,-1.8136;-2.344,-0.8321,-0.7037;-2.7194,-1.4734,0.1018;-3.0722,-1.1508,-1.8777;-3.1243,-0.3254,-2.3909;-0.8382,-1.1193,-0.8555;-0.4514,-0.6104,-1.763;-0.7038,-2.1906,-1.0347;-0.1632,-0.7457,0.3741;1.0028,-1.5099,0.7947;1.1694,-1.2822,1.8576;0.7403,-2.5753,0.7445;2.3207,-1.2841,0.0331;3.0908,-1.9132,0.5022;2.2157,-1.6476,-0.9996;2.7964,0.1764,-0.0041;2.8838,0.5719,1.019;3.8108,0.1969,-0.423;1.8983,1.0928,-0.8554;1.6117,0.5385,-1.7602;2.4872,1.9489,-1.2099;0.6406,1.6602,-0.1698;0.0203,2.1257,-0.9467;0.9255,2.4961,0.4798;-0.1887,0.6739,0.7156;0.2402,0.7392,1.7226)|\",6.495357611435\r\n\"[H]O/C(=N/C(=C(/OC([H])([H])[H])C(=O)C([H])([H])[H])C([H])([H])[H])C1=C([H])C([H])=C([H])C([H])=C1[H] |(4.7997,0.7968,1.1556;4.6403,0.6811,0.205;3.4986,-0.0461,0.0379;3.3108,-0.5279,-1.123;2.1386,-1.1089,-1.5995;2.1018,-2.3706,-2.1262;0.9303,-2.7581,-2.7481;0.2197,-3.8334,-2.134;-0.6801,-3.9825,-2.7364;-0.0764,-3.5724,-1.1069;0.8149,-4.7494,-2.1308;3.2254,-3.3455,-2.2168;3.0394,-4.4233,-2.7738;4.5935,-3.0254,-1.639;4.5578,-2.9615,-0.5445;4.9712,-2.0619,-1.9941;5.2693,-3.8341,-1.9269;0.9465,-0.1845,-1.6905;1.2214,0.7034,-2.273;0.6438,0.1624,-0.6955;0.107,-0.6798,-2.1759;2.6426,-0.1719,1.2561;2.0025,-1.3819,1.5643;2.116,-2.2318,0.9003;1.2273,-1.4904,2.7175;0.7439,-2.4343,2.9526;1.0779,-0.3955,3.572;0.4711,-0.4837,4.4687;1.707,0.8136,3.27;1.5847,1.672,3.9244;2.491,0.9242,2.1224;2.9594,1.8746,1.8787)|\",4.08715003451\r\n\"[H]OC(=O)[C@@]1([H])ON(C([H])([H])[H])[C@]([H])(C2=C([H])C([H])=C([H])C([H])=C2[H])C1([H])[H] 
|(4.9547,-1.4255,0.6839;5.848,-1.0548,0.5254;5.7621,-0.3482,-0.6134;6.6786,0.2972,-1.0616;4.4105,-0.4857,-1.339;4.5448,-1.1968,-2.1605;3.419,-1.0499,-0.4769;2.9147,0.147,0.2307;1.7939,-0.3051,1.0412;1.3676,0.5771,1.5287;2.1724,-0.978,1.8157;1.0091,-0.8123,0.4652;2.615,1.0919,-0.883;2.6564,2.0891,-0.4324;1.269,0.9354,-1.5749;0.9282,-0.217,-2.3023;1.6367,-1.0362,-2.376;-0.3192,-0.3275,-2.9161;-0.5667,-1.2284,-3.4713;-1.2464,0.7133,-2.8227;-2.2149,0.6274,-3.3079;-0.9197,1.8632,-2.1044;-1.6319,2.6804,-2.0266;0.3263,1.9671,-1.4829;0.5738,2.8679,-0.9249;3.8405,0.87,-1.81;4.5931,1.6523,-1.6947;3.5385,0.843,-2.8586)|\",5.796025015650001\r\n\"[H]OC(=O)C1=C([H])C(/C([H])=C(\\[H])C2=C([H])C([H])=C([H])C([H])=C2[H])=NO1 |(10.083,0.9372,-1.2406;9.1923,0.5601,-1.1133;8.3729,1.6128,-0.8696;8.7433,2.7632,-0.8306;6.9857,1.1544,-0.6638;6.3984,-0.0749,-0.679;6.8712,-1.0283,-0.8586;5.023,0.1956,-0.405;3.9404,-0.7779,-0.3059;4.2384,-1.8112,-0.4631;2.6581,-0.4566,-0.0433;2.426,0.5974,0.1002;1.5195,-1.3696,0.0716;1.6279,-2.7682,-0.0593;2.5954,-3.2223,-0.2526;0.5078,-3.5838,0.0585;0.6136,-4.6604,-0.0451;-0.7502,-3.0263,0.3104;-1.6227,-3.6673,0.4017;-0.8762,-1.6432,0.4437;-1.8482,-1.1991,0.64;0.2466,-0.8265,0.3259;0.1445,0.251,0.4302;4.8279,1.4988,-0.242;6.0612,2.1036,-0.4039)|\",4.127967112085\r\n\"[H]C1=C([H])C([H])=C([C@]2([H])OC([H])([H])C3=C([H])C(=O)C([H])([H])[C@]32[H])S1 
|(-1.9751,1.3328,-1.0327;-1.1632,0.7957,-0.5608;-0.753,0.8535,0.7417;-1.2345,1.4671,1.4954;0.3853,0.0288,0.9945;0.8634,-0.0595,1.965;0.8219,-0.6473,-0.1145;1.9259,-1.6497,-0.1994;2.5806,-1.5185,0.6788;2.6768,-1.4333,-1.4161;3.3731,-2.6363,-1.7851;3.2169,-2.8059,-2.8582;4.4526,-2.5274,-1.61;2.7731,-3.7021,-0.9204;3.1373,-4.884,-0.4047;4.0031,-5.4879,-0.6529;2.1941,-5.2484,0.6893;2.1787,-6.278,1.3321;1.2675,-4.0262,0.9168;1.5822,-3.5361,1.8478;0.2295,-4.3411,1.0494;1.5166,-3.1362,-0.3043;0.6857,-3.2013,-1.0207;-0.1623,-0.2618,-1.506)|\",4.884443616475\r\n\"[H]C#CC([H])([H])O[C@@]([H])(C([H])=C([H])[H])C1=C([H])C([H])=C([H])S1 |(-0.0955,-1.4453,1.3409;0.9584,-1.5809,1.2511;2.151,-1.7309,1.1503;3.5891,-1.9503,1.016;3.8469,-2.1026,-0.0457;3.8806,-2.8643,1.5552;4.2894,-0.823,1.532;5.6971,-0.9019,1.3608;5.9241,-1.1644,0.3105;6.2924,0.4596,1.6466;7.3786,0.4918,1.5785;5.5949,1.5487,1.9611;6.0931,2.494,2.1574;4.5126,1.5272,2.0305;6.3097,-1.9628,2.2599;6.0382,-2.1965,3.5825;5.2998,-1.6249,4.1342;6.8191,-3.253,4.1412;6.739,-3.5811,5.1721;7.6834,-3.8089,3.2389;8.39,-4.6142,3.3876;7.5455,-3.0535,1.6824)|\",5.7688136306\r\n\"[H]C1=C([H])C([H])=C([H])C([C@]2([H])OC([H])([H])[C@@]3([H])C(=C([H])C(=O)C3([H])[H])C2([H])[H])=N1 |(-0.2294,-2.2168,1.8322;-0.4115,-1.2084,1.4642;-1.7018,-0.6815,1.4544;-2.5405,-1.2722,1.8101;-1.8759,0.6175,0.9763;-2.863,1.0717,0.9525;-0.7632,1.326,0.5302;-0.8497,2.3403,0.1557;0.493,0.7068,0.5724;1.7509,1.4755,0.1549;2.0872,2.0472,1.0287;1.4689,2.4839,-0.8192;1.1553,1.9889,-2.1218;0.2496,1.364,-2.0935;0.9422,2.8748,-2.7272;2.3215,1.1751,-2.7066;3.168,1.8637,-2.8389;2.6946,0.1052,-1.6995;2.6874,-1.1306,-2.2294;2.8958,-2.0601,-1.711;2.3256,-1.078,-3.6611;2.261,-2.0079,-4.4436;2.0114,0.3955,-3.9967;2.6026,0.7208,-4.8582;0.9562,0.4685,-4.2899;2.9183,0.5577,-0.2938;3.8358,1.1606,-0.2538;3.0157,-0.2796,0.3994;0.6655,-0.5388,1.0357)|\",5.18104771352\r\n\"[H]C#CC([H])([H])[C@]([H])(OC([H])([H])C([H])=C([H])[H])C1=C([H])C([H])=C([H])C([H])=N1 
|(0.1073,0.3381,1.1772;1.1035,0.1914,0.827;2.2305,0.0143,0.4328;3.5944,-0.1988,-0.0457;4.3085,0.3403,0.5893;3.7096,0.2068,-1.0595;4.031,-1.682,-0.0647;3.9529,-2.0922,0.9518;5.389,-1.6529,-0.4924;6.099,-2.8648,-0.2384;6.0793,-3.1009,0.837;5.6107,-3.7004,-0.7668;7.5084,-2.7022,-0.7224;7.6114,-2.3402,-1.745;8.5892,-2.995,-0.0004;9.592,-2.9006,-0.4079;8.5131,-3.3481,1.026;3.1616,-2.5457,-0.9688;3.2959,-2.4874,-2.362;4.0482,-1.842,-2.8047;2.4632,-3.2787,-3.1492;2.5427,-3.2556,-4.2329;1.5277,-4.101,-2.5206;0.8584,-4.7366,-3.0928;1.4787,-4.0968,-1.1269;0.7698,-4.7326,-0.5985;2.2763,-3.3456,-0.3575)|\",6.0490908966150005\r\n\"[H]OC([H])([H])C([H])([H])N1C2N=C(Cl)NC(=O)[C@@]2([H])N=C1[H] |(-0.2785,1.9367,-1.836;0.1531,1.5258,-1.0722;0.0158,0.1145,-1.1734;-1.0095,-0.2118,-0.94;0.271,-0.2511,-2.1781;0.9624,-0.5045,-0.1518;0.8446,-1.5906,-0.1368;0.7384,-0.109,0.844;2.3651,-0.2309,-0.4546;3.2112,-1.0781,-1.0896;2.9372,-2.2598,-1.5611;3.9381,-2.6861,-2.4116;3.7906,-4.3863,-2.8237;4.8859,-2.0247,-2.9855;5.0425,-0.6573,-2.6727;5.6153,0.1261,-3.3865;4.4953,-0.3224,-1.2684;5.2596,-0.747,-0.5936;4.2209,1.0457,-0.8472;3.0311,1.0182,-0.3763;2.5042,1.862,0.0522)|\",4.4626671482\r\n\"[H]O[C@]1([H])C([H])([H])[C@]2(C([H])(C([H])([H])[H])C([H])([H])[H])O[C@@]1(C([H])([H])[H])[C@@]1([H])/C(=C\\2[H])[C@]([H])(C([H])([H])[H])C([H])([H])C1([H])[H] 
|(2.0791,-1.847,-4.341;1.5913,-2.627,-4.0324;1.8491,-2.7749,-2.6425;1.2061,-3.6056,-2.3451;1.5249,-1.5027,-1.8175;0.5282,-1.5395,-1.3685;1.5646,-0.6276,-2.4788;2.6788,-1.456,-0.7897;3.0277,-0.0727,-0.1936;3.8271,-0.2594,0.5376;3.5804,0.9057,-1.2421;3.9269,1.8226,-0.7513;2.8114,1.1979,-1.9681;4.4178,0.4668,-1.7888;1.8391,0.5582,0.5511;2.1409,1.51,1.0028;1.4575,-0.0779,1.3565;1.0079,0.7689,-0.133;3.7932,-1.8827,-1.5997;3.3481,-3.0862,-2.2706;4.252,-3.3205,-3.4711;3.9096,-4.1807,-4.0539;5.2787,-3.5039,-3.1373;4.2673,-2.4469,-4.1303;3.4827,-4.1923,-1.2022;4.5606,-4.2711,-0.9858;2.7675,-3.7791,0.0725;2.4006,-2.5128,0.2794;1.8133,-2.2257,1.1495;2.4386,-5.0181,0.8997;1.3472,-5.0731,1.0224;3.0721,-5.0303,2.2981;2.834,-5.9605,2.8285;2.7095,-4.1941,2.9064;4.1635,-4.9477,2.2326;2.9025,-6.1958,-0.0052;3.9088,-6.5159,0.2976;2.2491,-7.0704,0.0806;2.9588,-5.6222,-1.4383;1.9532,-5.6043,-1.8786;3.5979,-6.2111,-2.1048)|\",6.582434043595001\r\n\"[H]C#C[C@@]([H])(C([H])([H])[H])C([H])([H])C([H])([H])/C([H])=C(/C([H])=O)C([H])([H])[H] |(4.5773,-2.2956,-2.948;4.2436,-1.7628,-2.0867;3.8649,-1.1603,-1.1099;3.4066,-0.4296,0.0786;3.6611,0.6323,-0.0608;1.8747,-0.53,0.2321;1.3658,-0.1663,-0.6656;1.5409,0.0692,1.0866;1.5695,-1.5686,0.3989;4.1257,-0.921,1.3588;3.7161,-0.3696,2.2147;3.8796,-1.9794,1.517;5.656,-0.7442,1.3181;5.9023,0.3155,1.1911;6.0361,-1.2634,0.4258;6.323,-1.3195,2.5302;6.0836,-2.3683,2.7241;7.1643,-0.7238,3.3993;7.6695,-1.5592,4.5104;7.2983,-2.6093,4.5127;8.4313,-1.1668,5.3753;7.6596,0.6964,3.3613;7.1804,1.2888,2.5787;8.7433,0.7185,3.1959;7.4873,1.1828,4.3275)|\",5.404181070930001\r\n\"[H]O/C(=N/C(C(=O)OC([H])([H])[H])=C(/[H])C1=C([H])C([H])=C([H])C([H])=C1F)C([H])([H])[H] 
|(4.026,-2.1768,-1.3856;3.3847,-1.7683,-1.9871;2.657,-0.843,-1.3055;1.7592,-0.2434,-1.9762;1.0004,0.8136,-1.4817;1.7743,2.0336,-1.0759;2.9853,2.0578,-0.9499;0.9951,3.1177,-0.8611;1.6946,4.3032,-0.4567;2.2248,4.1368,0.4853;0.926,5.0668,-0.3343;2.4168,4.6026,-1.2209;-0.3593,0.8347,-1.4556;-0.8114,1.7498,-1.0964;-1.2994,-0.2092,-1.8417;-0.9592,-1.4313,-2.4627;0.0854,-1.6308,-2.6672;-1.9365,-2.353,-2.8287;-1.6419,-3.281,-3.3106;-3.2868,-2.0927,-2.5804;-4.0484,-2.8153,-2.8595;-3.6606,-0.8958,-1.9682;-4.6976,-0.6539,-1.7595;-2.6726,0.0113,-1.6189;-3.0581,1.1695,-1.031;2.9962,-0.7035,0.1599;2.2099,-0.1656,0.6914;3.9301,-0.1453,0.2799;3.111,-1.6933,0.6195)|\",4.01640043338\r\n\"[H]O/C(=N/C(C1=C([H])C([H])=C([H])S1)=C(/[H])C#N)C([H])([H])[H] |(5.6317,0.4971,-0.102;5.2507,-0.3555,0.1598;4.4193,-0.7958,-0.8151;3.8351,-1.903,-0.5853;2.8758,-2.484,-1.3951;3.1982,-3.8482,-1.8155;4.2611,-4.6107,-1.3818;4.9667,-4.2435,-0.6466;4.3077,-5.9033,-1.9728;5.0669,-6.644,-1.7472;3.2843,-6.1173,-2.8568;3.082,-7.0003,-3.4484;2.2424,-4.7393,-2.9849;1.6715,-1.8977,-1.6853;0.904,-2.4551,-2.2127;1.3368,-0.584,-1.2788;1.0732,0.5087,-0.9635;4.3303,0.0755,-2.0441;3.8274,-0.4452,-2.8601;5.3351,0.3658,-2.3743;3.755,0.981,-1.8191)|\",4.217764682750001\r\n\"[H]OC([H])([H])C([H])([H])N([H])[C@]1([H])C([H])([H])SC2=C(C([H])=C([H])C([H])=C2[H])C1([H])[H] 
|(-3.9057,-3.0291,-1.6848;-4.3277,-3.8119,-1.2841;-4.7153,-3.3775,0.0041;-5.6665,-2.8135,-0.0295;-4.8882,-4.2684,0.6176;-3.6288,-2.5032,0.6317;-2.7162,-3.0996,0.725;-3.9214,-2.1803,1.6467;-3.3608,-1.379,-0.2777;-4.1809,-0.7716,-0.2683;-2.178,-0.5614,0.0422;-2.1304,-0.3259,1.1198;-2.286,0.7549,-0.7335;-3.2194,1.2766,-0.4925;-2.2788,0.5548,-1.8099;-0.9665,1.9441,-0.296;0.459,0.8779,-0.145;0.3918,-0.5267,-0.1495;1.5856,-1.2423,0.0244;1.541,-2.3296,0.0237;2.8113,-0.605,0.1945;3.7167,-1.1904,0.327;2.8617,0.7906,0.1984;3.808,1.3079,0.3308;1.6927,1.5247,0.0292;1.7294,2.6111,0.0285;-0.8964,-1.2986,-0.3702;-0.8255,-2.25,0.1679;-1.0009,-1.5611,-1.4333)|\",5.526632303655\r\n\"[H]O[C@]1([H])C([H])([H])[C@@]([H])(OC([H])([H])[H])O[C@]([H])(C([H])([H])[H])[C@]1([H])N([H])C([H])([H])[H] |(0.3055,-0.5389,2.0323;0.6697,-1.2864,2.5283;2.023,-1.5009,2.1127;2.23,-2.5297,2.432;3.0222,-0.5705,2.8178;4.034,-0.9836,2.7274;2.7737,-0.5076,3.8822;3.0545,0.8203,2.1997;2.101,1.3532,2.3721;4.0975,1.5312,2.7852;4.1581,2.8992,2.405;4.958,3.3501,2.9971;3.2117,3.4165,2.6277;4.3818,3.0114,1.3385;3.2993,0.7398,0.7967;2.3101,0.0234,0.0457;2.7447,-0.0352,-0.9588;0.9911,0.7963,-0.0865;0.2846,0.2183,-0.6868;1.1772,1.7589,-0.5741;0.5146,1.0181,0.8764;2.1988,-1.4361,0.5819;3.1874,-1.8889,0.3867;1.1354,-2.1768,-0.0945;0.654,-2.7717,0.5736;1.5607,-2.9371,-1.2622;0.7078,-3.4962,-1.6609;2.382,-3.6518,-1.0664;1.9013,-2.2555,-2.0504)|\",7.5647650439\r\n\"[H]C(=O)C1=C([H])[C@]2([H])C([H])([H])C([H])([H])[C@]([H])(OC(=O)C([H])([H])[H])O[C@@]2([H])C([H])([H])C1([H])[H] 
|(3.5832,-2.7781,1.5713;3.4427,-2.6137,2.6637;3.937,-3.372,3.4779;2.6311,-1.4322,3.0139;2.1369,-0.6714,2.0188;2.3367,-0.9526,0.9831;1.297,0.5524,2.2601;0.2366,0.2521,2.1983;1.5057,1.6995,1.256;2.5562,2.0176,1.2711;1.2849,1.3689,0.2335;0.5896,2.87,1.6447;-0.4576,2.5856,1.4779;0.7807,3.7606,1.0386;0.7235,3.2406,3.1243;-0.0469,3.9501,3.4278;2.0219,3.8661,3.3606;2.0938,5.2097,3.1996;1.1725,5.8921,2.8057;3.4616,5.7238,3.5834;3.519,6.7929,3.3769;4.2392,5.1908,3.0273;3.6397,5.5426,4.6488;0.6256,2.1462,3.9839;1.5343,1.0746,3.6839;2.5623,1.4608,3.748;1.3548,-0.0353,4.7103;1.4388,0.375,5.722;0.3422,-0.4476,4.6127;2.4078,-1.1343,4.4801;2.1237,-2.0599,4.9935;3.3691,-0.8452,4.927)|\",5.32254691578\r\n\"[H]OC([H])([H])[C@@]([H])(C1=C2C(=NC([H])=C1[H])C([H])=C([H])C([H])=C2[H])C([H])([H])C#N |(1.6034,-5.3249,0.8895;0.9012,-4.722,0.6039;1.4462,-3.4086,0.5252;1.6564,-2.9998,1.5248;2.3845,-3.4085,-0.047;0.3893,-2.5361,-0.1622;-0.5346,-2.7132,0.3971;0.7086,-1.0509,-0.1345;-0.3325,-0.0698,-0.0206;0.0514,1.3133,-0.0149;1.3483,1.7295,-0.1174;2.2711,0.7999,-0.2315;3.3008,1.1451,-0.32;2.0051,-0.5906,-0.2461;2.8387,-1.2755,-0.3573;-0.9468,2.3166,0.0997;-0.6128,3.3493,0.1032;-2.2769,1.9796,0.1968;-3.0339,2.7541,0.2828;-2.6631,0.6189,0.1856;-3.7151,0.3575,0.2605;-1.7176,-0.3779,0.0815;-2.0461,-1.4122,0.0768;0.1005,-3.0481,-1.6059;-0.2258,-4.0915,-1.5431;-0.7113,-2.466,-2.0558;1.2716,-2.9827,-2.4842;2.2181,-2.9398,-3.1554)|\",4.83818426189\r\n\"[H]/C(C(=O)C([H])([H])[H])=C(/[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C(=O)OC([H])([H])C([H])([H])[H] 
|(3.1692,4.2208,2.2951;3.637,3.7212,3.1425;2.8827,3.6807,4.4269;3.3316,3.1449,5.4299;1.5167,4.3456,4.4256;1.609,5.4089,4.1687;1.0552,4.2467,5.4098;0.8681,3.889,3.6669;4.8508,3.1638,3.0258;5.2569,2.6744,3.9127;5.6993,3.1467,1.7903;5.2011,3.6946,0.98;6.6373,3.6804,2.0057;6.0569,1.7204,1.3161;6.5502,1.1817,2.1379;6.8,1.7992,0.5109;4.846,0.9206,0.8219;4.3586,1.455,-0.0029;4.0941,0.8481,1.6164;5.2237,-0.4855,0.353;5.6695,-1.0734,1.1657;5.991,-0.4466,-0.4333;4.0437,-1.2627,-0.2004;2.9327,-0.8132,-0.3862;4.4011,-2.5377,-0.4799;3.3733,-3.3942,-1.034;2.7598,-2.8081,-1.7229;3.9269,-4.1513,-1.595;2.5236,-4.0225,0.0604;1.8052,-4.7227,-0.3815;1.9647,-3.2545,0.6024;3.1486,-4.5739,0.7706)|\",5.276287561195001\r\n\"[H]C1=C([H])C([H])=C([C@@]2([H])OC2(C(=O)C([H])([H])[H])C(=O)C([H])([H])[H])C([H])=C1[H] |(7.558,-2.8797,4.4576;6.6761,-2.4427,3.9976;6.0946,-3.0518,2.8825;6.5238,-3.961,2.4709;4.9616,-2.4936,2.2938;4.5081,-2.9679,1.4269;4.4056,-1.315,2.8088;3.1659,-0.763,2.1838;2.3053,-1.4342,2.1514;2.8056,0.5854,2.4361;3.1599,0.2582,1.0804;1.9394,0.1838,0.1451;0.8889,-0.2116,0.6094;2.0763,0.5982,-1.3009;2.8245,-0.0186,-1.8074;2.4276,1.6343,-1.378;1.1027,0.5,-1.7842;4.4755,0.8012,0.5603;4.9054,0.3888,-0.5028;5.1887,1.8473,1.3831;4.5009,2.6465,1.6806;6.0214,2.252,0.8041;5.5711,1.3993,2.3072;4.9862,-0.7109,3.9299;4.54,0.1922,4.3351;6.1188,-1.2749,4.5208;6.5633,-0.8037,5.3932)|\",5.4966997801\r\n\"[H]O[C@@]1([H])O[C@@]2([H])O[C@@]3([H])C([H])([H])C([H])([H])C([H])([H])[C@@]([H])(O[H])[C@@]13C([H])([H])C2([H])[H] 
|(0.0766,1.9128,-1.7933;-0.0813,1.0277,-2.1609;0.7616,0.1504,-1.4375;1.0607,-0.6219,-2.1553;1.9264,0.8628,-1.0351;2.1011,0.8669,0.382;2.9139,1.5676,0.5781;0.955,1.3953,1.0151;-0.1796,0.5081,0.9012;-0.2823,-0.0392,1.8545;-1.448,1.3378,0.7117;-1.5903,1.9702,1.5964;-1.3266,2.0087,-0.1448;-2.6563,0.4195,0.4843;-2.8341,-0.1881,1.3842;-3.5613,1.0191,0.3341;-2.4261,-0.499,-0.7246;-2.3343,0.1132,-1.6304;-3.276,-1.1768,-0.8677;-1.1585,-1.3509,-0.5677;-1.3257,-2.0539,0.2597;-0.9433,-2.189,-1.7043;-1.0451,-1.6308,-2.4925;0.1131,-0.5372,-0.2064;1.2276,-1.4555,0.36;0.8201,-2.0532,1.1833;1.5553,-2.1585,-0.4122;2.3952,-0.5591,0.8507;3.357,-0.8819,0.4401;2.4824,-0.5646,1.9426)|\",8.704922077495\r\n\"[H]C(=O)C1=C([H])[C@]2([H])C([H])([H])C([H])([H])C(=O)O[C@@]2([H])C([H])([H])C1([H])[H] |(-2.0963,-1.04,0.1315;-2.133,-0.1842,0.8424;-2.7754,-0.2537,1.8734;-1.3551,1.0018,0.4294;-0.6844,0.9581,-0.7367;-0.7414,0.0595,-1.353;0.1414,2.1016,-1.26;-0.4729,2.6595,-1.9881;1.4439,1.6932,-1.9586;2.0853,1.1632,-1.2423;1.2538,0.9994,-2.7863;2.1462,2.956,-2.4789;1.7061,3.257,-3.4395;3.208,2.7855,-2.6788;2.0565,4.2092,-1.6045;2.727,5.1869,-1.8267;1.1279,4.2581,-0.608;0.4925,3.0576,-0.1199;1.2086,2.5547,0.5478;-0.7403,3.4506,0.6834;-0.4701,4.1979,1.4362;-1.4671,3.9197,0.0082;-1.3484,2.2029,1.348;-2.3715,2.4017,1.6858;-0.7927,1.943,2.2599)|\",5.298056669235\r\n\"[H]OC(=O)C([H])([H])N([H])[C@]1([H])C([H])([H])SC2=C(C([H])=C([H])C([H])=C2[H])C1([H])[H] 
|(-3.8688,-2.197,-1.4051;-4.3674,-3.0436,-1.2883;-4.3002,-3.3055,0.0275;-4.8226,-4.2614,0.546;-3.4592,-2.2855,0.8147;-2.5012,-2.7686,1.0324;-3.9486,-2.1061,1.7802;-3.2258,-1.0688,0.0225;-4.0081,-0.4327,0.1716;-1.967,-0.3415,0.2927;-1.7966,-0.2337,1.3759;-2.0852,1.056,-0.32;-2.9562,1.5877,0.0805;-2.2063,0.9792,-1.4057;-0.6603,2.1275,0.0914;0.7085,0.9817,0.0086;0.5635,-0.4085,-0.1431;1.7229,-1.1972,-0.1425;1.6176,-2.2743,-0.2551;2.9917,-0.6422,-0.0031;3.8697,-1.2815,-0.0062;3.1209,0.7401,0.1457;4.1022,1.193,0.2568;1.9864,1.5448,0.1512;2.0852,2.6211,0.2661;-0.7815,-1.0836,-0.3378;-0.7193,-2.1011,0.0643;-0.9975,-1.1956,-1.4108)|\",5.537516857675\r\n\"[H]C1=C([H])C2=C(C([H])=C1[H])C([H])([H])[C@]([H])(N([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])C([H])([H])S2 |(13.4981,2.6366,1.186;12.4512,2.3652,1.0795;11.6733,2.1478,2.2171;12.101,2.2637,3.209;10.3279,1.7943,2.0803;9.7427,1.652,0.8083;10.5313,1.9099,-0.3173;10.0879,1.8183,-1.3062;11.8768,2.2588,-0.1884;12.4743,2.4457,-1.0766;8.3233,1.1494,0.7067;8.3087,0.1129,1.071;8.0113,1.1173,-0.3414;7.2668,1.9301,1.5371;6.4697,1.207,1.7941;6.717,3.0607,0.7751;6.2822,3.7122,1.429;5.7039,2.6839,-0.214;6.1837,2.0681,-0.9855;4.9052,2.0568,0.2296;5.0747,3.9121,-0.8785;4.2909,3.5624,-1.5646;4.5589,4.5145,-0.114;6.0696,4.799,-1.6387;6.8567,5.1294,-0.9517;6.5732,4.1944,-2.4066;5.4021,6.0098,-2.2984;6.1346,6.6341,-2.8226;4.8993,6.6421,-1.5558;4.6464,5.6982,-3.0304;7.8516,2.4854,2.8529;7.0995,2.4973,3.6476;8.1955,3.5113,2.6853;9.2873,1.5392,3.5105)|\",5.65452581339\r\n\"[H]C1=C([H])C([H])=C2C(=C1[H])SC([H])([H])[C@@]([H])(N([H])C([H])([H])[H])C2([H])[H] 
|(3.8745,-1.0027,-7.3872;3.5899,-1.0075,-6.3384;3.4883,-2.2105,-5.6367;3.6964,-3.1563,-6.129;3.1236,-2.1871,-4.2936;3.0482,-3.1216,-3.7409;2.8492,-0.9899,-3.6159;2.9548,0.2144,-4.3359;3.323,0.194,-5.6909;3.3979,1.1319,-6.2354;2.6966,1.832,-3.6223;1.9823,1.4091,-1.9882;2.1012,2.3168,-1.3894;0.9083,1.2111,-2.0929;2.6807,0.206,-1.3291;3.7588,0.4118,-1.2976;2.2562,-0.0489,0.0476;1.2508,-0.2136,0.0738;2.6263,0.9658,1.0282;2.2914,0.641,2.0188;2.2175,1.9761,0.8492;3.7187,1.0514,1.0597;2.4385,-1.0617,-2.1559;2.9609,-1.8919,-1.6686;1.365,-1.3064,-2.0951)|\",5.45860384103\r\n\"[H]NC1=NC(C([H])([H])[H])=C([C@@]2([H])O[C@]([H])(C([H])([H])O[H])[C@]([H])(O[H])C2([H])[H])C([H])N1[H] |(7.9423,1.7455,-2.4692;7.2258,1.0138,-2.4814;6.1831,1.3803,-1.8298;5.0864,0.531,-1.7257;3.9908,0.9099,-1.1299;2.8855,-0.1162,-1.0488;1.9772,0.2243,-1.561;2.6096,-0.3161,-0.0054;3.2264,-1.0433,-1.5118;3.8095,2.2227,-0.5324;2.5088,2.664,0.1085;2.1524,1.876,0.7837;2.7408,3.8505,0.9049;1.9109,4.944,0.4678;2.5395,5.8426,0.5065;0.7572,5.1173,1.4579;0.0729,4.257,1.3903;0.1788,6.0145,1.2189;1.2619,5.2819,2.7716;1.9093,4.5661,2.8918;1.4811,4.5731,-0.9653;2.2705,4.871,-1.6708;0.2262,5.103,-1.3666;0.344,6.0439,-1.5692;1.3984,3.0462,-0.8979;1.5297,2.5788,-1.876;0.4082,2.7714,-0.5181;4.9044,3.0292,-0.55;4.9136,4.0019,-0.0725;6.0528,2.6269,-1.1594;6.8585,3.2381,-1.1431)|\",4.421850070625\r\n\"[H]O[C@]([H])(C([H])([H])[H])[C@@]([H])(OC([H])([H])C1=C([H])C([H])=C([H])C([H])=C1[H])C([H])=C([H])[H] 
|(-0.4411,2.3885,-1.9914;0.1535,2.1309,-2.7169;1.4001,1.8301,-2.0998;1.8359,2.7478,-1.6699;2.3329,1.2825,-3.1696;3.3184,1.0541,-2.7483;2.4578,2.0253,-3.9631;1.924,0.3695,-3.6128;1.1776,0.8648,-0.9165;2.1289,0.7193,-0.3784;0.2549,1.574,-0.0732;0.0534,1.0072,1.2167;-0.4062,0.0122,1.1399;1.0344,0.8698,1.7047;-0.816,1.9284,2.0397;-0.6182,3.3149,2.0098;0.1401,3.7276,1.3514;-1.3974,4.1575,2.8016;-1.2365,5.2319,2.7661;-2.3803,3.6252,3.6399;-2.9868,4.2829,4.2569;-2.5835,2.2454,3.6753;-3.3508,1.8225,4.3184;-1.8085,1.4043,2.8744;-1.979,0.3301,2.8962;0.6172,-0.4685,-1.3363;-0.2867,-0.4165,-1.9407;1.1571,-1.646,-1.0198;0.7219,-2.5853,-1.351;2.0633,-1.7176,-0.4206)|\",6.4763096419\r\n\"[H]OC([H])([H])[C@@]1([H])O[C@]([H])(C2=C([H])N=C(C([H])([H])[H])N=C2C([H])([H])[H])C([H])([H])[C@@]1([H])O[H] |(0.6839,3.0381,3.4404;0.3962,3.9545,3.5902;0.5255,4.6007,2.3371;-0.2122,4.2246,1.6127;0.3277,5.6656,2.5132;1.9271,4.4093,1.7648;2.6568,4.8186,2.4779;2.1656,2.9893,1.6626;2.5883,2.6445,0.3325;1.7062,2.3486,-0.2521;3.5752,1.5025,0.3833;4.3115,1.2255,1.5342;4.1544,1.8179,2.4317;5.2144,0.2408,1.6153;5.3809,-0.4988,0.5107;6.3911,-1.614,0.5595;6.7998,-1.7097,1.5671;7.2097,-1.4179,-0.1431;5.9306,-2.5604,0.2563;4.7188,-0.3309,-0.6456;3.8212,0.6601,-0.7187;3.103,0.8142,-2.0368;2.0142,0.7634,-1.9115;3.4191,0.0164,-2.7114;3.3269,1.7796,-2.5067;3.1406,3.9671,-0.2368;4.1579,4.1351,0.1333;3.1468,4.0065,-1.3287;2.1837,4.9985,0.3599;2.633,5.999,0.426;1.0274,5.0226,-0.4761;0.4132,5.6903,-0.1343)|\",5.7143908605\r\n\"[H]NC1=C([C@@]2([H])O[C@]([H])(C([H])([H])O[H])[C@]([H])(O[H])C2([H])[H])C([H])N([H])/C(=N\\[H])N1[H] 
|(-2.7132,-1.3987,0.5267;-2.934,-1.6882,1.479;-1.9101,-1.5697,2.2486;-0.5538,-1.1039,1.9258;-0.2202,-0.7063,0.5111;-1.008,-0.0445,0.121;1.0323,-0.0015,0.4831;1.7554,-0.3025,-0.7304;2.6739,-0.8359,-0.4463;2.1444,1.0241,-1.3763;1.2359,1.5456,-1.7107;2.7843,0.8467,-2.2497;2.8963,1.8233,-0.4793;2.3471,1.895,0.3195;0.8465,-1.2354,-1.5622;1.441,-1.9763,-2.1148;-0.0372,-0.5423,-2.4423;0.4837,-0.1609,-3.1654;-0.0161,-1.8739,-0.4756;0.5317,-2.6825,0.0198;-0.9478,-2.2747,-0.8815;0.3626,-1.0453,2.9149;1.369,-0.6881,2.733;0.0825,-1.4215,4.211;0.7939,-1.3268,4.9197;-1.1718,-1.8848,4.6215;-1.5167,-2.2462,5.8016;-0.7425,-2.1635,6.4609;-2.0863,-1.932,3.5871;-3.0117,-2.2567,3.8399)|\",5.281729838205001\r\n\"[H]OC1=C(/C(=C(/[H])Cl)C([H])([H])N([H])C([H])([H])[H])C([H])=C([H])C([H])=C1C([H])([H])[H] |(1.5546,-2.3073,0.7659;2.498,-2.4422,0.591;3.077,-1.2347,0.2864;4.4553,-1.243,0.0073;5.2437,-2.5019,0.1041;5.8472,-3.0955,-0.9301;6.4483,-3.9872,-0.8089;5.7509,-2.5539,-2.5945;5.4091,-3.0922,1.4992;4.474,-2.9254,2.0611;6.1859,-2.5193,2.0283;5.8214,-4.4918,1.4831;5.0859,-5.0342,1.0322;6.0675,-5.0247,2.8184;6.2951,-6.0932,2.7467;5.2241,-4.8969,3.5227;6.9412,-4.5261,3.2545;5.0715,-0.0308,-0.3273;6.1352,-0.0316,-0.5455;4.3442,1.1563,-0.3789;4.8371,2.0886,-0.6381;2.9795,1.1388,-0.0944;2.4048,2.0613,-0.1327;2.3226,-0.0492,0.2394;0.8406,-0.0759,0.536;0.4123,0.9272,0.4583;0.6244,-0.4375,1.5522;0.287,-0.7153,-0.1671)|\",5.646362397875\r\n\"[H]OC(=N/C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])/C([H])=C(\\[H])C(=O)OC([H])(C([H])([H])[H])C([H])([H])[H] 
|(4.8404,-1.3072,-2.1505;5.8,-1.3102,-2.2928;6.426,-1.2658,-1.0683;7.6431,-1.6176,-1.0578;8.5643,-1.5904,0.0906;8.6339,-0.2068,0.7719;9.4371,-0.1972,1.5176;7.7049,0.0618,1.2832;8.8516,0.5701,0.0306;8.2,-2.6962,1.1046;8.9764,-2.7728,1.8742;8.1256,-3.664,0.5974;7.2497,-2.506,1.6136;9.9519,-1.9093,-0.5032;10.721,-1.9271,0.278;10.2278,-1.1556,-1.248;9.936,-2.883,-1.0031;5.5773,-0.7674,0.0434;5.8107,-1.0951,1.0504;4.5487,0.0814,-0.113;4.2694,0.4938,-1.0785;3.7648,0.5377,1.0634;3.9724,0.1969,2.2126;2.7831,1.3787,0.6757;1.9124,1.9156,1.7179;1.7901,1.1327,2.4718;2.5744,3.1365,2.3485;1.9201,3.5622,3.1175;2.7642,3.9064,1.5922;3.5215,2.8604,2.8199;0.5861,2.2224,1.037;-0.1314,2.6044,1.7711;0.1651,1.3217,0.5794;0.7151,2.9792,0.2554)|\",4.468109425210001\r\n\"[H]/N=C(/N([H])C([H])([H])C([H])([H])C([H])([H])/C([H])=C(\\[H])[C@@]([H])(C(=O)O[H])N([H])[H])C([H])([H])[H] |(1.6912,4.0299,-0.5623;2.1042,4.2698,0.34;2.7359,3.2617,0.8234;3.2575,3.3773,2.1042;3.1505,4.3249,2.4517;4.4256,2.6634,2.5936;4.4008,2.699,3.6901;4.3322,1.6058,2.3233;5.7758,3.211,2.0956;5.7942,3.1866,0.9974;5.8636,4.2676,2.3832;6.9742,2.4211,2.6559;6.9622,2.4534,3.7533;6.861,1.3631,2.3731;8.298,2.9197,2.1462;8.4336,2.8885,1.0626;9.3018,3.371,2.9078;9.1838,3.3884,3.9937;10.6195,3.8868,2.3827;10.6423,3.7249,1.2939;10.7406,5.4266,2.5622;11.7757,5.9723,2.8575;9.6186,6.1384,2.3097;8.8837,5.5122,2.1662;11.7476,3.1652,2.9651;11.6882,3.2308,3.9821;12.5967,3.6787,2.7252;2.9127,1.9148,0.1477;2.3537,1.8924,-0.7907;2.5509,1.1011,0.7869;3.9652,1.7102,-0.083)|\",5.662689228905\r\n\"[H]OC(=O)[C@@]1(O[H])C([H])([H])[C@@]([H])(O[H])[C@@]([H])(O[H])C(=O)[C@@]1([H])C([H])([H])C([H])([H])C([H])([H])[H] 
|(6.7415,-0.6472,3.3527;5.9923,-0.0578,3.6437;5.4074,0.5584,2.6269;4.6209,1.4713,2.7984;5.6693,0.0711,1.1694;5.1815,1.0839,0.3186;4.6599,1.6731,0.904;7.1614,-0.1301,0.8021;7.2027,-0.0869,-0.2924;7.7469,0.722,1.1647;7.8316,-1.4275,1.2498;8.8043,-1.5233,0.7502;8.0539,-1.3995,2.6669;8.2012,-2.3375,2.9091;6.9872,-2.6817,0.9425;6.975,-2.8553,-0.1451;7.557,-3.7792,1.6246;6.8006,-4.2027,2.0857;5.5471,-2.467,1.4202;5.0696,-3.2729,2.2025;4.8358,-1.2455,0.8849;4.8904,-1.3186,-0.2128;3.3631,-1.1551,1.3078;3.2894,-1.2904,2.3936;3.0093,-0.1426,1.0862;2.4563,-2.1724,0.6003;2.539,-2.0339,-0.4876;2.8046,-3.1895,0.816;0.9893,-2.0368,1.0203;0.3584,-2.7648,0.4983;0.6025,-1.0356,0.7945;0.8702,-2.2034,2.0975)|\",5.956572187445\r\n\"[H]C(=O)C([H])([H])OC([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])/C([H])=C(\\[H])C([H])([H])C([H])([H])[H] |(7.2833,-5.2859,-7.1649;7.771,-6.2759,-7.0314;7.8893,-7.0682,-7.9372;8.2685,-6.5426,-5.6206;9.347,-6.7759,-5.6633;7.7614,-7.4438,-5.2322;8.0024,-5.4074,-4.8351;8.4308,-5.538,-3.4859;7.9454,-6.4147,-3.023;9.5196,-5.712,-3.4556;8.0723,-4.2659,-2.7279;8.4977,-4.3442,-1.7176;8.5731,-3.4187,-3.2149;6.5636,-4.0026,-2.6425;6.1503,-3.9636,-3.658;6.0762,-4.8538,-2.1434;6.2196,-2.7096,-1.8936;6.644,-2.7486,-0.8793;6.6981,-1.8549,-2.3912;4.7039,-2.4495,-1.7993;4.278,-2.3913,-2.81;4.2317,-3.3176,-1.3126;4.3661,-1.1991,-1.0334;4.7061,-1.1698,0.005;3.704,-0.1465,-1.5217;3.3674,-0.1738,-2.5613;3.3777,1.1106,-0.7612;2.2851,1.2326,-0.709;3.7291,1.0166,0.2744;3.9821,2.3711,-1.4031;3.7026,3.2715,-0.844;3.6319,2.4966,-2.4349;5.0759,2.3112,-1.4289)|\",5.643641259370001\r\n\"[H]OC1=N[C@]([H])(N([H])C([H])([H])C([H])([H])C([H])(C([H])([H])[H])C([H])([H])[H])C([H])([H])C(=O)N1C([H])([H])[H] 
|(8.4211,-2.9513,-2.6352;9.0733,-2.3439,-2.2411;8.6261,-2.1313,-0.9824;7.5825,-2.7321,-0.5592;7.1099,-2.344,0.7691;6.5448,-3.206,1.1639;6.2719,-1.1459,0.6577;6.0674,-0.8044,1.5973;5.0086,-1.3619,-0.0554;5.2564,-1.7717,-1.0384;4.3739,-2.1188,0.4447;4.2143,-0.0592,-0.1973;3.935,0.2961,0.807;3.267,-0.2907,-0.7066;4.9294,1.0841,-0.9445;5.8611,1.2973,-0.4044;5.3003,0.6983,-2.3839;5.7715,1.5413,-2.9034;4.4086,0.4128,-2.9587;6.0038,-0.1404,-2.4124;4.0616,2.3513,-0.929;4.5745,3.1895,-1.4154;3.8173,2.658,0.0954;3.115,2.188,-1.4615;8.287,-2.0439,1.7022;8.8356,-2.9757,1.8956;7.9699,-1.6429,2.669;9.2579,-1.0579,1.0799;9.8835,-0.229,1.7146;9.4392,-1.2319,-0.2961;10.4365,-0.4235,-1.0042;10.9161,0.2085,-0.2584;9.9551,0.1986,-1.7637;11.1762,-1.0661,-1.4872)|\",5.964735602959999\r\n\"[H]OC1=N[C@]([H])(N([H])C([H])([H])C2([H])C([H])([H])C2([H])[H])C([H])([H])C(=O)N1C([H])([H])[H] |(2.5824,2.758,-0.6194;1.8886,2.2424,-0.1704;2.4571,1.0374,0.0748;3.6428,0.7788,-0.3092;4.2011,-0.5176,0.1092;4.5503,-0.4046,1.1476;5.3706,-0.9115,-0.6554;6.1139,-0.2439,-0.4546;5.2033,-0.9623,-2.1132;4.5288,-1.792,-2.3623;4.7423,-0.043,-2.5136;6.5412,-1.1913,-2.7796;7.038,-2.1088,-2.4697;7.4453,-0.015,-3.079;7.0812,0.9813,-2.8374;8.5175,-0.1385,-2.9532;6.7627,-0.7541,-4.2048;7.3659,-1.3868,-4.8501;5.945,-0.2528,-4.7172;3.1175,-1.6019,0.0941;3.4783,-2.538,0.5243;2.8061,-1.8088,-0.939;1.8751,-1.1743,0.8541;1.1463,-1.9474,1.4485;1.5916,0.1961,0.7791;0.3496,0.7097,1.3646;-0.0992,-0.1108,1.9224;-0.3366,1.0476,0.5831;0.5646,1.547,2.0326)|\",5.983783572495\r\n\"[H]OC1=NON([H])[C@@]1([H])[C@]([H])(C(=O)OC([H])([H])C([H])([H])[H])N([H])[H] 
|(3.8656,-0.3346,-1.8212;3.5935,0.492,-2.2857;4.4176,1.4588,-1.8703;4.1259,2.7029,-1.963;5.215,3.443,-1.4453;6.3692,2.5475,-1.2793;6.8428,2.642,-2.1821;5.7769,1.2082,-1.2215;6.3786,0.5003,-1.8045;5.6885,0.728,0.2567;6.6998,0.7938,0.6707;5.2657,-0.7392,0.3326;4.587,-1.3088,-0.5165;5.7156,-1.3341,1.4314;5.3244,-2.7178,1.6613;5.3479,-2.8169,2.7483;4.3021,-2.855,1.3032;6.2899,-3.6782,0.9862;6.0229,-4.7089,1.2451;7.3172,-3.4967,1.3181;6.2441,-3.5769,-0.1019;4.8221,1.5181,1.1285;3.9002,1.6264,0.7054;5.2105,2.4573,1.1988)|\",5.352479439335\r\n\"[H]OC(=O)[C@@]([H])(N([H])[H])C([H])([H])C([H])([H])[C@@]1([H])C(O[H])=NON1[H] |(5.5245,-0.2201,-1.1188;5.2295,0.4089,-1.8292;3.91,0.4938,-1.6742;3.2272,1.2567,-2.3318;3.3433,-0.4883,-0.6224;3.1208,-1.4042,-1.1875;4.4437,-0.8318,0.2997;4.4974,-0.1571,1.0621;4.2952,-1.7453,0.7213;2.0344,0.0246,-0.0068;1.2989,0.0769,-0.8154;1.6543,-0.7226,0.6983;2.1594,1.3993,0.6828;2.9859,1.9743,0.2445;2.3821,1.2841,1.7498;0.9001,2.2838,0.5583;1.134,3.2746,0.9668;0.4096,2.3995,-0.8811;1.0884,3.0454,-1.8497;1.861,2.4963,-2.1165;-0.7063,1.8068,-1.0812;-1.1419,1.2445,0.1189;-0.2812,1.7265,1.2293;-0.8385,2.5096,1.5796)|\",5.7035063064800005\r\n\"[H]/C(=C(/[H])C([H])([H])C([H])([H])[H])C([H])([H])/C([H])=C(\\[H])C([H])([H])[C@@]1([H])OC(C([H])([H])[H])(C([H])([H])[H])OC1([H])[H] 
|(5.6705,0.8657,0.6585;5.6935,0.1881,1.5142;5.5379,-1.1228,1.3103;5.5726,-1.7975,2.1696;5.3005,-1.7725,-0.0264;5.3207,-1.0079,-0.8136;6.1259,-2.4666,-0.2458;3.9748,-2.5497,-0.0893;3.8481,-3.0333,-1.0647;3.121,-1.8834,0.0756;3.9382,-3.3333,0.6775;5.948,0.84,2.8536;5.8853,0.0757,3.6416;5.1645,1.5817,3.0647;7.2964,1.5198,2.9024;8.1585,0.8605,2.7821;7.4938,2.8351,3.0292;6.6364,3.5007,3.1296;8.8434,3.5016,3.0104;8.9359,4.1176,2.1043;9.6393,2.7467,2.9581;9.129,4.4205,4.2023;10.1347,4.8555,4.0829;8.1552,5.4681,4.2478;8.0309,5.9128,5.6063;8.7045,7.2703,5.7871;8.679,7.5702,6.8397;8.1935,8.0328,5.1908;9.7492,7.2139,5.4672;6.5469,5.9265,5.966;6.409,6.2723,6.9953;6.1302,4.9192,5.8726;5.9978,6.5937,5.2936;8.7358,4.9577,6.4112;9.0001,3.8178,5.6008;9.9172,3.3444,5.9636;8.1756,3.0919,5.6446)|\",6.62325112117\r\n\"[H]C(=O)[C@@]1([H])OC(=O)C([H])([H])C([H])([H])C([H])([H])/C([H])=C(/[H])C1([H])[H] |(-3.3712,1.9931,1.2787;-3.184,1.2943,0.43;-3.9374,1.2168,-0.5057;-1.8964,0.493,0.6115;-2.0284,-0.0785,1.5439;-1.7942,-0.3793,-0.5023;-0.7981,-1.2993,-0.6693;-0.719,-1.859,-1.7351;0.1437,-1.6518,0.4789;-0.1205,-1.1705,1.4251;0.0174,-2.7308,0.6229;1.6203,-1.379,0.1191;2.2556,-1.9645,0.7947;1.796,-1.7561,-0.8946;2.0418,0.1046,0.2255;1.9186,0.4278,1.2677;3.1159,0.1702,0.0138;1.2973,1.03,-0.7083;1.7523,1.1875,-1.6853;0.1258,1.6323,-0.4659;-0.32,2.241,-1.2506;-0.683,1.474,0.7926;-1.0741,2.4475,1.1192;-0.0531,1.1106,1.6081)|\",5.986504711\r\n\"[H]OC(=O)[C@@]([H])(N([H])[H])[C@@]1([H])C(O[H])=NON1[H] |(-1.0008,1.0574,-0.5679;-0.5344,1.2467,-1.4301;0.7577,0.9634,-1.3058;1.5794,1.2391,-2.1529;1.1678,0.1631,-0.0438;0.8074,-0.8624,-0.2192;2.6333,0.2191,0.0881;3.003,0.491,-0.8266;3.0213,-0.6884,0.3325;0.5389,0.666,1.288;0.8073,-0.0586,2.0716;1.0421,2.0444,1.6957;2.343,2.2945,1.8171;2.7923,1.5861,1.2693;0.1122,2.8896,1.9365;-1.1239,2.2908,1.7274;-0.9069,0.8953,1.204;-1.4126,0.3394,1.8919)|\",5.90487055585\r\n\"[H]OC(=O)[C@@]([H])(N([H])[H])C([H])([H])[C@@]1([H])C(O[H])=NON1[H] 
|(0.2422,-1.0612,1.4563;-0.5655,-1.2432,0.9473;-0.264,-1.8,-0.2486;-1.1351,-2.0978,-1.0276;1.2325,-2.0007,-0.5556;1.6954,-2.5184,0.2967;1.4118,-2.8514,-1.7421;0.6986,-2.5795,-2.4237;1.1849,-3.8171,-1.5057;1.9365,-0.6368,-0.7327;1.7835,-0.0163,0.1583;1.486,-0.0932,-1.5722;3.4656,-0.7636,-0.937;3.8464,-1.5779,-0.3068;3.8721,-0.9552,-2.3969;3.6987,-2.0964,-3.0658;2.949,-2.5778,-2.6091;4.3771,0.1011,-2.9167;4.4165,1.1005,-1.915;4.1689,0.4794,-0.6095;5.1167,0.2247,-0.3153)|\",5.401459932425001\r\n\"[H]O[C@@]1([H])[C@@]([H])(O[H])C([H])=C([P@@](=O)(O[H])OC([H])([H])[H])C([H])([H])[C@]1([H])O[H] |(5.4422,1.5802,2.9644;6.3616,1.3975,3.2212;7.1332,1.2331,2.0239;8.165,1.1217,2.3661;6.6939,-0.0536,1.3018;7.4684,-0.3399,0.578;6.5837,-1.1296,2.2205;6.2926,-0.7158,3.0546;5.3872,0.1594,0.5628;4.8582,-0.747,0.2795;4.8838,1.3672,0.2546;3.2765,1.5263,-0.5484;2.3182,2.4313,0.1392;3.7149,1.974,-2.0465;2.9678,2.3943,-2.5071;2.7336,0.0203,-0.7976;1.7785,-0.545,0.1217;1.3518,-1.4166,-0.3776;0.9951,0.1793,0.3561;2.2759,-0.8614,1.045;5.5722,2.6702,0.6337;4.9889,3.1761,1.4154;5.577,3.3632,-0.2167;7.0249,2.4501,1.088;7.3795,3.3382,1.62;7.9096,2.2867,-0.0196;7.46,1.7419,-0.6853),wU:11.11|\",6.372906378710001\r\n\"[H]O[C@@]([H])([C@@]([H])(O[H])[C@@]([H])(N([H])[H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])[C@]([H])(O[H])C([H])([H])N([H])[H] 
|(2.837,-0.5719,-5.6144;2.79,-0.2061,-4.7143;3.0885,-1.2986,-3.8386;4.1689,-1.5018,-3.8845;2.7509,-0.7789,-2.4231;3.4859,0.0202,-2.2264;1.4467,-0.2055,-2.4383;1.4549,0.388,-3.2102;2.9181,-1.8318,-1.2976;2.2908,-2.7023,-1.5268;4.3076,-2.3264,-1.2715;4.9611,-1.5456,-1.2221;4.5285,-2.8346,-2.1244;2.4862,-1.3559,0.1372;3.0712,0.0308,0.4755;2.8262,0.2971,1.5104;2.6607,0.8114,-0.1734;4.1659,0.0487,0.3911;2.9907,-2.3818,1.1748;2.6048,-2.1208,2.1679;4.0811,-2.4199,1.2203;2.6395,-3.3923,0.933;0.9469,-1.3005,0.2527;0.6625,-1.0273,1.2764;0.5053,-2.2837,0.0428;0.5069,-0.5776,-0.4345;2.3902,-2.5888,-4.3138;2.808,-3.4249,-3.7283;2.8233,-2.7057,-5.6839;2.4184,-3.5024,-6.0621;0.8524,-2.6839,-4.2294;0.5355,-2.5719,-3.189;0.5822,-3.7099,-4.528;0.0701,-1.7608,-5.046;0.1992,-0.819,-4.6855;0.4429,-1.7528,-5.9948)|\",7.366121933035\r\n\"[H]O[C@@]([H])([C@@]([H])(O[H])[C@@]([H])(N([H])[H])C1([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C1([H])[H])[C@]([H])(O[H])C([H])([H])N([H])[H] |(-2.8093,1.8186,3.0878;-2.2384,1.0891,2.7423;-3.0959,0.2139,2.0276;-3.2999,0.5946,1.0133;-2.3544,-1.1419,1.953;-2.362,-1.5358,2.9769;-3.1053,-2.1254,1.204;-3.0681,-1.8694,0.2677;-0.8598,-1.1204,1.5235;-0.4192,-0.245,2.0068;-0.1562,-2.3073,2.0487;-0.6902,-3.1392,1.7924;-0.1535,-2.2783,3.0675;-0.5813,-1.0736,-0.0095;-0.9072,-2.049,-0.4057;0.9366,-0.9571,-0.2925;1.4699,-1.7053,0.2992;1.1053,-1.1949,-1.3532;1.4914,0.4488,-0.0074;1.4376,0.6578,1.0699;2.5569,0.4824,-0.27;0.7231,1.5353,-0.7754;0.9116,1.4155,-1.8532;1.0945,2.5319,-0.5036;-0.7897,1.4428,-0.5193;-1.322,2.1688,-1.1483;-1.0102,1.7077,0.5229;-1.3078,0.0258,-0.8201;-1.1345,-0.1891,-1.8851;-2.3964,-0.0147,-0.6887;-4.4705,0.0719,2.7315;-4.2767,-0.3049,3.755;-5.3474,-0.79,2.0274;-4.8108,-1.5552,1.7408;-5.2169,1.4088,2.8345;-6.2342,1.201,3.1976;-5.3139,1.8284,1.8266;-4.4756,2.3797,3.664;-4.5518,2.1311,4.6495;-4.878,3.3094,3.5662)|\",7.458640642204999\r\n\"[H]OC([H])([H])[C@@]([H])(O[H])[C@@]([H])(O[H])[C@@]([H])(O[H])[C@@]([H])(N([H])[H])C(C([H])([H])[H])(C(
[H])([H])[H])C([H])([H])[H] |(-0.876,-3.3239,-3.8975;-0.5743,-3.1945,-2.9866;0.6943,-2.5236,-3.0206;1.3788,-3.0193,-3.7204;0.5876,-1.4762,-3.3216;1.2414,-2.6393,-1.5971;0.6024,-2.0386,-0.931;1.1914,-4.0134,-1.1983;0.361,-4.3545,-1.5784;2.7137,-2.2382,-1.4034;2.9229,-2.3279,-0.3279;3.5451,-3.1535,-2.1204;3.13,-4.0246,-1.9793;3.1395,-0.8356,-1.8857;4.179,-0.71,-1.5366;3.1224,-0.8257,-3.3095;3.5389,-1.6714,-3.5591;2.31,0.3263,-1.281;1.2506,0.1698,-1.5232;2.4011,0.2999,0.1914;3.3755,0.2459,0.4856;1.9382,-0.5232,0.5693;2.6678,1.7602,-1.8181;1.956,2.8098,-0.9381;2.1073,3.8105,-1.3614;2.3236,2.8015,0.0899;0.8755,2.6228,-0.899;2.1602,1.9471,-3.2646;2.3471,2.9778,-3.5916;1.0768,1.7784,-3.3219;2.6472,1.2664,-3.9631;4.188,2.0147,-1.778;4.4038,3.0495,-2.0698;4.7253,1.3573,-2.4698;4.603,1.8732,-0.7715)|\",7.684495138120001\r\n\"[H]OC([H])([H])[C@@]([H])(O[H])[C@@]([H])(O[H])[C@@]([H])(O[H])[C@@]([H])(N([H])[H])C1([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C1([H])[H] |(-1.3655,0.1978,6.0507;-0.7847,-0.148,5.3578;-1.6042,-0.6039,4.2678;-2.4197,-1.238,4.6392;-2.0231,0.2373,3.711;-0.6718,-1.4292,3.3806;0.0399,-0.7503,2.8954;0.0205,-2.3718,4.2174;0.193,-1.8855,5.0443;-1.3501,-2.318,2.3248;-0.5345,-2.8464,1.8088;-2.1802,-3.2773,2.9951;-1.6504,-3.5519,3.7665;-2.2684,-1.6988,1.2595;-2.5699,-2.5464,0.6262;-3.4401,-1.1528,1.8709;-3.7327,-1.8388,2.4964;-1.6927,-0.6122,0.3293;-2.4673,-0.4804,-0.4453;-1.5055,0.6236,1.1075;-1.3168,1.4004,0.4778;-2.397,0.8328,1.5539;-0.3675,-1.027,-0.3631;0.3619,-1.2133,0.4359;0.2198,0.0938,-1.255;0.3376,1.0231,-0.6855;1.2384,-0.2072,-1.5391;-0.5922,0.3349,-2.5389;-1.5929,0.7151,-2.2879;-0.1106,1.1159,-3.1413;-0.7285,-0.9575,-3.3589;0.2649,-1.2477,-3.7325;-1.3559,-0.7857,-4.2431;-1.2995,-2.1063,-2.513;-1.3042,-3.0369,-3.0947;-2.3496,-1.8933,-2.2693;-0.4888,-2.3109,-1.2203;0.529,-2.6237,-1.4951;-0.9099,-3.1395,-0.6395)|\",7.902186218520001\r\n\"[H]OC([H])([H])[C@@]([H])(O[H])[C@@]([H])(O[H])[C@@]([H])(O[H])[C@@]([H])(N([H])[H])C([H])([H])C([H])([H])C([H])(
[H])[H] |(7.0554,4.9521,2.7637;6.093,4.865,2.8681;5.8352,3.4741,2.7906;6.1737,2.9455,3.6953;4.7514,3.3579,2.7091;6.5149,2.8606,1.5674;6.0953,3.3183,0.6646;7.9141,3.2169,1.6995;8.3108,3.2327,0.8151;6.4357,1.3248,1.4699;6.8891,1.0485,0.497;7.1703,0.7323,2.5299;7.9715,1.2793,2.6209;5.0061,0.7662,1.5131;4.5391,1.0985,2.4471;4.2161,1.369,0.4789;4.5677,1.0616,-0.3728;4.8961,-0.7802,1.489;5.4775,-1.1535,2.3394;3.5022,-1.215,1.6715;2.9141,-0.6462,1.0605;3.2014,-1.0024,2.622;5.4619,-1.4209,0.2076;4.8417,-1.1265,-0.6553;6.4728,-1.0363,0.0153;5.515,-2.9527,0.2741;6.1717,-3.2468,1.1052;4.5156,-3.3244,0.5227;6.0157,-3.5939,-1.0235;6.0488,-4.6862,-0.939;7.0264,-3.2507,-1.2792;5.3603,-3.3478,-1.8686)|\",7.366121933035\r\n\"[H]O[C@@]([H])(C([H])([H])C([H])=C([H])[H])[C@]([H])(O[H])C([H])([H])OC(=O)C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] |(4.5385,-0.4118,4.5088;5.3907,-0.0246,4.2389;5.7318,-0.6896,3.0294;6.4893,-0.0633,2.5422;6.3319,-2.085,3.3131;5.5439,-2.6901,3.7827;6.6003,-2.5835,2.3713;7.5391,-2.015,4.207;7.3933,-1.475,5.1411;8.7292,-2.5479,3.9276;9.5676,-2.4665,4.6145;8.913,-3.0911,3.0019;4.4765,-0.7335,2.1359;4.1921,0.3121,1.9311;3.4802,-1.3804,2.915;2.7336,-1.5801,2.3164;4.7229,-1.4232,0.789;4.5553,-2.5007,0.8777;5.7411,-1.2404,0.4381;3.9113,-0.9047,-0.3002;2.5937,-1.1728,-0.3117;2.014,-1.7348,0.6037;1.9216,-0.6966,-1.6027;0.4143,-0.9831,-1.5171;-0.0765,-0.6476,-2.4373;-0.0431,-0.4598,-0.6719;0.219,-2.0519,-1.3897;2.1686,0.8202,-1.7681;1.6819,1.1717,-2.6848;3.236,1.0469,-1.8354;1.7483,1.3849,-0.9279;2.5436,-1.4602,-2.7944;2.0631,-1.1402,-3.7258;2.3947,-2.5417,-2.6949;3.6167,-1.2656,-2.8761)|\",6.2803876695400005\r\n\"[H]C1C2C3=C([H])C([H])=C([H])C([H])=C3[C@@]([H])(C([H])([H])[H])[C@@]2([H])OC1=O 
|(0.6448,0.4175,1.0088;0.0266,-0.0925,1.7355;-0.7575,-1.1723,1.5977;-1.0242,-2.2936,0.7067;-1.0095,-2.4066,-0.6853;-0.7934,-1.5438,-1.309;-1.298,-3.6459,-1.2578;-1.3025,-3.7529,-2.3388;-1.5891,-4.752,-0.4501;-1.8153,-5.7084,-0.9137;-1.5926,-4.6429,0.9449;-1.8126,-5.5114,1.5607;-1.3072,-3.4124,1.528;-1.1836,-3.0663,3.0142;-1.9881,-3.5249,3.6004;0.1716,-3.4891,3.6048;0.2801,-4.5782,3.5719;0.2529,-3.1601,4.6451;1.0029,-3.0495,3.0435;-1.3486,-1.5226,2.9384;-2.4202,-1.2794,2.9635;-0.6909,-0.6839,3.8792;0.0818,0.2471,3.1809;0.6939,1.1151,3.7464)|\",4.87628020096\r\n\"[H]OC([H])([H])C1=C([H])[C@]([H])(O[H])[C@]([H])(O[H])[C@@]([H])(OC(=O)C([H])([H])[H])C1=O |(5.2693,-2.5953,-1.7474;6.1956,-2.4058,-1.9772;6.2514,-1.0062,-2.1832;7.2677,-0.774,-2.5168;5.5622,-0.6961,-2.9861;5.9308,-0.2007,-0.9375;6.6425,0.8676,-0.5422;7.5342,1.1723,-1.0884;6.2342,1.7674,0.5858;5.5724,2.5458,0.1849;7.4311,2.3543,1.1257;7.1804,3.1702,1.5871;5.505,0.9966,1.7092;4.9901,1.7169,2.3677;6.4569,0.2435,2.4399;7.2775,0.7703,2.4141;4.4629,-0.0199,1.2061;4.5193,-0.8734,1.8868;3.0992,0.4255,1.3608;2.6417,1.3419,0.4805;3.3389,1.8555,-0.371;1.1733,1.61,0.6893;0.8813,2.4993,0.1298;0.9479,1.7325,1.7522;0.5997,0.7494,0.327;4.7041,-0.5826,-0.2087;3.9009,-1.3855,-0.6693)|\",4.922539555545001\r\n\"[H]O/N=C1/C([H])=C(C#CC2=C([H])C([H])=C([H])C(C([H])([H])[H])=N2)C([H])([H])C([H])([H])C1([H])[H] 
|(5.8338,-10.1051,-0.1398;6.5258,-9.4296,-0.0655;5.7809,-8.2413,-0.1303;6.5425,-7.1958,-0.0802;5.853,-5.9176,-0.1077;4.7699,-5.9464,-0.175;6.5077,-4.7294,-0.0233;5.7946,-3.5051,0.001;5.213,-2.4358,0.0229;4.5346,-1.1784,0.0494;5.2725,0.0154,0.1626;6.3547,-0.0209,0.2312;4.5792,1.2192,0.1845;5.1144,2.161,0.2725;3.1881,1.2006,0.0928;2.6168,2.124,0.1067;2.5294,-0.0326,-0.0178;1.0264,-0.1192,-0.127;0.5543,0.864,-0.0415;0.6254,-0.7722,0.6556;0.7401,-0.5606,-1.0886;3.1884,-1.1989,-0.0383;8.0263,-4.6752,0.0623;8.388,-3.7774,-0.4513;8.3218,-4.5625,1.1166;8.6673,-5.9327,-0.538;9.7489,-5.9272,-0.3602;8.5226,-5.9192,-1.6262;8.0496,-7.2173,0.0331;8.3163,-7.3265,1.095;8.4403,-8.1049,-0.4719)|\",3.82592073803\r\n\"[H]OC([H])([H])[C@@]1([H])OC(C([H])([H])[H])(C([H])([H])[H])O[C@]1([H])[C@@]([H])(OC([H])([H])OC([H])([H])[H])C([H])=C([H])[H] |(3.5222,4.2739,-5.6032;3.589,3.3504,-5.8965;2.7643,2.6017,-5.0203;1.6969,2.833,-5.175;2.9225,1.545,-5.2522;3.1251,2.8561,-3.5648;4.1696,2.5617,-3.3926;2.9582,4.2587,-3.3148;2.7011,4.4393,-1.9211;1.575,5.4568,-1.7732;1.8698,6.4155,-2.2118;1.3401,5.6122,-0.7158;0.679,5.0968,-2.2865;3.9758,4.8436,-1.1822;4.3194,5.8238,-1.5273;4.7659,4.1093,-1.3651;3.7908,4.8935,-0.1044;2.2738,3.1579,-1.4249;2.2049,2.234,-2.5104;1.1742,2.1925,-2.8989;2.5831,0.8263,-2.0033;1.8743,0.5628,-1.2056;2.4437,-0.0951,-3.0899;1.1955,-0.757,-3.1465;0.3632,-0.0453,-3.0834;1.1938,-1.2592,-4.1239;0.9849,-1.6628,-2.1046;1.842,-2.7987,-2.144;1.5046,-3.4718,-1.3527;2.8875,-2.5226,-1.9661;1.7664,-3.314,-3.1133;3.9739,0.7563,-1.4294;4.1451,1.4379,-0.5987;4.9456,-0.0493,-1.8566;5.924,-0.0494,-1.3844;4.7939,-0.7296,-2.6887)|\",7.085844667020001\r\n\"[H]/N=C(/O[H])C1([H])C([H])=NC(=O)N=C1[H] 
|(0.9413,-1.3659,2.76;1.8917,-1.0032,2.6863;2.0934,-0.4455,1.5679;3.2993,0.1533,1.3695;3.4024,0.4192,0.4432;1.1053,-0.2814,0.4104;1.6551,-0.4027,-0.5405;0.4793,1.1069,0.4193;1.0765,1.9318,0.816;-0.7089,1.3478,0.0264;-1.4345,0.2454,-0.5254;-2.2394,0.4143,-1.4059;-1.1979,-1.0625,0.0081;-0.0134,-1.3038,0.4132;0.215,-2.3054,0.786)|\",4.677637090095001\r\n\"[H]/N=C(O[H])\\C(C([H])=O)=C([H])\\N=C(\\O[H])N([H])C1=C([H])C([H])=C([H])C([H])=C1[H] |(-0.9482,-4.4488,2.2327;-1.3112,-3.4942,2.1882;-0.4557,-2.7328,1.6332;-0.7586,-1.4039,1.4844;-1.6546,-1.32,1.8596;0.8846,-3.0738,1.1056;1.3895,-4.4491,1.1987;2.3957,-4.5911,0.7646;0.7957,-5.3981,1.6972;1.6596,-2.103,0.5257;1.2284,-1.1006,0.465;2.8881,-2.3248,-0.0275;3.704,-1.34,-0.2687;4.7803,-1.5799,-1.0336;4.6933,-2.5125,-1.3062;3.6127,-0.0712,0.2097;2.9172,0.0465,0.9343;4.4222,1.0713,-0.0314;5.1376,1.2707,-1.2189;5.1182,0.523,-2.0004;5.8768,2.4416,-1.3838;6.4317,2.5871,-2.3064;5.9014,3.423,-0.3921;6.4775,4.3323,-0.5343;5.1752,3.2253,0.7833;5.1812,3.9792,1.5653;4.4426,2.0554,0.9667;3.8877,1.8999,1.8893)|\",4.106198004045\r\n\"[H]C#CC1=C([H])/C(=N/O[H])C([H])([H])C([H])([H])C1([H])[H] |(0.0157,0.0206,-0.0249;1.0816,0.0083,-0.0057;2.2924,-0.0298,0.0225;3.7157,-0.0148,0.0553;4.4307,-1.1606,-0.0705;3.9288,-2.1117,-0.219;5.8822,-1.1924,0.0213;6.4223,-2.3623,-0.0885;7.8192,-2.3022,0.0465;8.0732,-3.2296,-0.0803;6.6294,0.0982,0.2673;6.7889,0.2038,1.351;7.6234,0.0355,-0.1838;5.8452,1.3043,-0.2689;6.3574,2.2346,0.0018;5.8265,1.2599,-1.3657;4.4016,1.328,0.2504;3.8242,2.1079,-0.2587;4.3833,1.586,1.3204)|\",4.49532081026\r\n\"[H]O[C@@](C([H])([H])[H])(C([H])([H])[C@]([H])(C([H])([H])[H])C([H])([H])C([H])([H])[H])[C@@]1([H])OC(C([H])([H])[H])(C([H])([H])[H])OC1([H])[H] 
|(6.5883,1.5718,-1.8604;6.1874,2.446,-1.9935;4.9703,2.2491,-2.7337;5.3057,1.5312,-4.0507;4.4246,1.3991,-4.6883;5.7062,0.5317,-3.8378;6.0538,2.1029,-4.6025;3.9435,1.4071,-1.9298;3.0791,1.2358,-2.5874;4.3907,0.4121,-1.7765;3.4252,1.9248,-0.566;3.122,2.9764,-0.6797;4.496,1.8596,0.5355;4.1281,2.2786,1.4776;5.3919,2.4101,0.2423;4.7842,0.8164,0.7284;2.1562,1.1306,-0.1815;1.4437,1.1848,-1.0168;2.4206,0.0672,-0.0784;1.4487,1.6041,1.0937;0.5166,1.0488,1.2476;1.1933,2.6696,1.0334;2.0659,1.4606,1.9866;4.4493,3.6765,-2.9791;4.4756,4.2136,-2.0201;5.272,4.3555,-3.9285;4.495,5.3907,-4.5358;4.8434,6.7514,-3.9358;4.1784,7.5235,-4.336;5.8783,7.0177,-4.1721;4.7268,6.723,-2.8485;4.7169,5.3281,-6.0445;4.1305,6.1037,-6.5471;4.4089,4.35,-6.4253;5.7754,5.4776,-6.2804;3.1215,5.1051,-4.2201;3.0628,3.8222,-3.6134;2.2439,3.8213,-2.8881;2.8731,3.0389,-4.3625)|\",8.307635855765\r\n\"[H]O[C@@]([H])([C@@]([H])(O[H])[C@@]([H])(N([H])[H])C([H])([H])C([H])([H])C([H])([H])[H])[C@]([H])(O[H])C([H])([H])N([H])[H] |(7.4271,0.2019,1.0327;7.0824,-0.3687,0.3196;6.246,0.5111,-0.4429;6.8795,1.1764,-1.0493;5.4815,-0.4164,-1.4014;6.265,-0.884,-2.0216;4.7966,-1.4347,-0.6735;5.4377,-1.7263,-0.0013;4.4719,0.2111,-2.3773;4.2109,-0.6114,-3.0701;3.2917,0.6876,-1.6403;2.5733,0.9734,-2.3032;2.91,-0.1088,-1.132;5.089,1.3401,-3.2182;5.3036,2.2016,-2.5728;6.0532,0.9919,-3.6162;4.2094,1.7958,-4.3913;3.9562,0.9249,-5.0127;3.258,2.1951,-4.0132;4.8818,2.8651,-5.2595;4.2317,3.1727,-6.0861;5.8186,2.4915,-5.6906;5.122,3.7593,-4.6717;5.4802,1.4213,0.5321;4.915,2.1689,-0.0413;6.5128,2.0531,1.3009;6.0564,2.288,2.1386;4.5259,0.7164,1.5121;5.0467,-0.1324,1.9653;3.6461,0.3386,0.9838;4.2149,1.6737,2.594;3.4992,2.3294,2.2831;3.8337,1.1893,3.4031)|\",8.220559423605\r\n\"[H]C(=O)C([H])([H])C([H])([H])C([H])([H])[C@@]([H])(OC([H])([H])C1=C([H])C([H])=C(OC([H])([H])[H])C([H])=C1[H])C([H])([H])[H] 
|(1.0375,0.1254,3.1421;1.3446,1.1338,3.5077;0.5399,2.0371,3.5696;2.8033,1.2392,3.8935;3.0032,2.2651,4.2223;2.9602,0.5704,4.7537;3.743,0.8083,2.75;3.4847,-0.2034,2.416;4.7676,0.7542,3.1409;3.7019,1.7547,1.5439;3.9916,2.7678,1.854;2.6761,1.8277,1.1572;4.6133,1.3141,0.3946;5.6441,1.21,0.7775;4.1469,0.0279,-0.0229;5.1001,-0.7631,-0.7224;5.4406,-0.2349,-1.6281;5.9916,-0.9309,-0.0956;4.4715,-2.0823,-1.0966;3.2348,-2.1194,-1.7616;2.7165,-1.1881,-1.971;2.6594,-3.3249,-2.1347;1.702,-3.3599,-2.6453;3.3152,-4.5356,-1.8586;2.6693,-5.6658,-2.2727;3.2781,-6.9192,-2.0109;2.6004,-7.6725,-2.4169;3.4054,-7.09,-0.9334;4.2546,-7.0062,-2.5062;4.5463,-4.5175,-1.1964;5.0708,-5.4369,-0.9626;5.1061,-3.2918,-0.8191;6.0602,-3.2907,-0.2963;4.6057,2.32,-0.7606;4.9796,3.2945,-0.4261;3.5868,2.4527,-1.1409;5.2401,1.9897,-1.5897)|\",5.38241196289\r\n\"[H]C([H])([H])O/N=C1\\[C@@]2([H])C([H])([H])[C@@]3([H])C([H])([H])[C@]1([H])C([H])([H])[C@@](F)(C2([H])[H])C3([H])[H] |(0.97,-0.5941,1.1462;1.3206,-0.4667,0.1187;1.0258,-1.3345,-0.4839;0.8783,0.4386,-0.3144;2.7347,-0.358,0.2102;3.2541,-0.1869,-1.0896;4.5294,-0.077,-1.093;5.4789,-0.1096,0.0883;4.9219,-0.2337,1.0181;6.4646,-1.2874,-0.1236;7.1728,-1.3239,0.7151;5.9154,-2.2371,-0.1247;7.222,-1.1048,-1.4564;7.9249,-1.9353,-1.5974;6.2106,-1.0759,-2.6227;5.6545,-2.0207,-2.6684;6.7357,-0.9613,-3.5805;5.2252,0.1033,-2.426;4.4776,0.1233,-3.2248;6.019,1.4334,-2.3886;5.3413,2.2853,-2.2573;6.5612,1.5826,-3.3308;7.0165,1.3841,-1.2258;7.7341,2.5907,-1.1956;6.271,1.2232,0.1041;6.993,1.2245,0.9304;5.5965,2.0747,0.2534;8.0024,0.2285,-1.4231;8.5567,0.3804,-2.358;8.7342,0.233,-0.6053)|\",6.557943797049999\r\n\"[H]C1([H])OO[C@@]2(O1)[C@@]1([H])C([H])([H])[C@]3([H])C([H])([H])[C@](F)(C1([H])[H])C([H])([H])[C@@]2([H])C3([H])[H] 
|(-3.5518,-0.5529,0.8226;-3.0371,-0.5682,-0.1483;-3.6138,-1.1124,-0.9025;-2.8436,0.7417,-0.6384;-1.7126,1.135,0.2013;-0.8454,-0.0021,0.0763;-1.7439,-1.1181,-0.0152;0.0447,0.0681,-1.1812;-0.608,0.1286,-2.0581;0.9505,1.3149,-1.0899;1.5764,1.3763,-1.9895;0.3345,2.2206,-1.0601;1.834,1.2255,0.1719;2.4858,2.1059,0.23;2.699,-0.0538,0.105;3.3397,-0.1403,0.9919;3.3559,-0.0338,-0.7736;1.7799,-1.2762,0.0237;2.5678,-2.4365,-0.0316;0.9136,-1.2076,-1.2392;1.56,-1.1874,-2.1254;0.2858,-2.1022,-1.3039;0.8867,-1.3521,1.2673;0.2564,-2.2461,1.2155;1.5124,-1.4345,2.1642;0.0182,-0.0769,1.3466;-0.6518,-0.1262,2.2117;0.929,1.1698,1.4214;0.3221,2.0798,1.493;1.539,1.1154,2.3327)|\",7.20013248423\r\n\"[H]OC([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])/C([H])=C(\\[H])C(=O)C([H])([H])[H] |(2.8797,-7.1143,-12.1489;2.459,-7.7676,-11.5686;1.6969,-7.0502,-10.6033;0.8978,-6.462,-11.0857;1.2072,-7.8172,-9.9929;2.5538,-6.142,-9.7194;3.3442,-6.7506,-9.2607;3.0614,-5.4003,-10.3561;1.746,-5.4157,-8.6363;0.9483,-4.8235,-9.1089;1.2382,-6.1581,-8.0031;2.5997,-4.4981,-7.7512;3.3959,-5.0905,-7.2769;3.1091,-3.7554,-8.3827;1.7922,-3.7729,-6.6668;1.0006,-3.1752,-7.1425;1.2759,-4.5159,-6.0407;2.6465,-2.8639,-5.7743;3.1656,-2.1217,-6.3965;3.4356,-3.4622,-5.2945;1.8328,-2.1442,-4.6917;1.3142,-2.886,-4.0696;1.0474,-1.5371,-5.1644;2.6799,-1.2426,-3.7664;3.4557,-1.84,-3.2701;2.0229,-0.8541,-2.9733;3.2996,-0.0775,-4.4766;2.6249,0.5911,-5.014;4.6066,0.2206,-4.5146;5.3356,-0.4049,-4.0007;5.1121,1.4154,-5.2464;4.3681,2.1723,-5.8532;6.6132,1.6469,-5.2001;6.9495,1.7709,-4.1625;6.8692,2.5381,-5.7759;7.1489,0.7793,-5.6066)|\",5.2844509767100005\r\n\"[H]OC1=N/C(=N\\C([H])([H])C([H])([H])C([H])(C([H])([H])[H])C([H])([H])[H])[C@]([H])(N([H])[H])C(=O)N1C([H])([H])[H] 
|(9.446,0.7391,-0.8739;9.7465,1.654,-1.026;8.6444,2.3289,-1.4031;7.5197,1.7153,-1.5119;6.3861,2.4677,-1.87;5.1788,2.1308,-1.6418;4.8694,0.874,-0.9739;4.3131,0.2641,-1.6994;5.7646,0.3086,-0.688;3.9558,1.1141,0.2398;3.6287,0.1352,0.6213;3.0562,1.6368,-0.1107;4.5817,1.9216,1.3935;4.922,2.8811,0.9778;3.5244,2.2216,2.4663;3.9441,2.8206,3.2835;3.135,1.2921,2.903;2.6747,2.7741,2.0481;5.7959,1.2146,2.0152;6.2113,1.8051,2.8412;6.5962,1.0566,1.2846;5.5119,0.2344,2.4218;6.6445,3.7546,-2.6362;5.79,4.4205,-2.5095;6.7839,3.4689,-4.0781;7.5591,2.8266,-4.2401;7.0045,4.3354,-4.5684;7.8877,4.4934,-2.1364;8.0471,5.6923,-2.2753;8.9044,3.6721,-1.6162;10.209,4.2553,-1.2785;10.1434,5.3184,-1.5049;11,3.7902,-1.8718;10.4276,4.112,-0.2176)|\",5.16472088249\r\n\"[H]OC1=N/C(=N/C([H])([H])C2([H])C([H])([H])C2([H])[H])[C@]([H])(N([H])[H])C(=O)N1C([H])([H])[H] |(3.4438,3.3371,0.5951;2.6045,2.8404,0.6084;2.9327,1.6012,0.1928;4.1434,1.3358,-0.1265;4.496,0.046,-0.5462;5.7396,-0.1924,-0.6954;6.2326,-1.4898,-1.1243;5.5033,-2.3101,-1.0352;6.4676,-1.4137,-2.1956;7.4912,-1.8435,-0.3549;8.2649,-1.0808,-0.4123;7.9519,-3.2781,-0.2888;7.339,-4.0356,-0.7721;9.0185,-3.481,-0.3348;7.3986,-2.6026,0.9447;8.0845,-2.3441,1.7469;6.4117,-2.9042,1.2894;3.3637,-0.951,-0.8236;3.5895,-1.8928,-0.3086;3.2991,-1.2618,-2.2607;3.1227,-0.4095,-2.7922;2.5038,-1.8806,-2.4198;1.9794,-0.5443,-0.2977;1.0416,-1.323,-0.3529;1.8297,0.7554,0.1777;0.5133,1.2176,0.6366;-0.1655,0.3708,0.5515;0.1625,2.0451,0.0152;0.5721,1.5528,1.6745)|\",5.1619997439850005\r\n\"[H]OC1=N/C(=N\\C([H])([H])C([H])([H])C([H])(C([H])([H])[H])C([H])([H])[H])[C@]([H])(NO)C(=O)N1C([H])([H])[H] 
|(9.5239,0.4656,1.4945;9.9104,-0.1088,2.1812;8.916,-0.9263,2.5636;7.7563,-0.8299,2.0168;6.7457,-1.6947,2.445;5.6517,-1.9285,1.8346;5.37,-1.2814,0.5597;6.102,-0.5005,0.3115;5.4443,-2.0581,-0.2136;3.9475,-0.7037,0.5765;3.8777,0.0057,1.4122;3.2386,-1.5126,0.8;3.5321,0.0113,-0.7239;4.2868,0.7845,-0.9383;3.4774,-0.9396,-1.93;3.162,-0.4056,-2.8343;2.7575,-1.7502,-1.7554;4.4499,-1.3975,-2.142;2.181,0.7171,-0.5335;1.8831,1.2575,-1.4399;2.2194,1.4383,0.2915;1.3898,-0.0085,-0.3037;6.945,-2.3475,3.8016;6.3012,-3.2135,3.9451;6.5465,-1.2588,4.8417;5.6149,-1.5965,5.5163;8.3887,-2.6931,4.1086;8.7078,-3.5969,4.855;9.3284,-1.8325,3.5273;10.7465,-1.9571,3.8912;10.8118,-2.7472,4.6372;11.1121,-1.0144,4.3035;11.3434,-2.2221,3.0152)|\",3.583739411085\r\n\"[H]OC1=N/C(=N/C([H])([H])C2([H])C([H])([H])C2([H])[H])[C@]([H])(NO)C(=O)N1C([H])([H])[H] |(3.2836,3.1738,-0.3255;2.5918,2.5455,-0.0458;3.1745,1.3361,-0.0941;4.3993,1.2291,-0.4599;4.9927,-0.0329,-0.5166;6.2591,-0.1051,-0.6287;6.995,-1.3487,-0.75;6.4009,-2.2568,-0.5592;7.3351,-1.4195,-1.794;8.2032,-1.3374,0.1676;8.8493,-0.474,0.0236;8.8716,-2.6362,0.5411;8.4416,-3.5617,0.1651;9.9554,-2.6538,0.6161;8.0888,-1.8623,1.5765;8.6329,-1.3519,2.3662;7.1299,-2.2678,1.8926;4.0192,-1.2192,-0.5338;4.4659,-2.1587,-0.2126;3.6017,-1.3737,-2.0173;4.174,-2.2828,-2.5559;2.7393,-0.9949,0.2503;2.1047,-1.9172,0.7255;2.3126,0.3288,0.3192;1.0053,0.643,0.9109;0.5551,-0.304,1.203;0.3752,1.153,0.1791;1.1294,1.2841,1.7867)|\",3.600066242115\r\n\"[H]C1=C(C([H])([H])C([H])([H])C([H])([H])C([H])([H])[H])[C@]2([H])C(=NN=C2C([H])([H])OC([H])([H])[H])OC1=O 
|(3.8446,-3.615,0.0008;4.25,-3.2175,0.9235;5.2463,-3.8303,1.5893;5.9273,-5.0984,1.1504;6.9838,-4.8553,0.9544;5.9553,-5.7793,2.0131;5.3373,-5.8093,-0.0722;4.2819,-6.0495,0.1159;5.3527,-5.1349,-0.9391;6.0975,-7.0956,-0.4224;7.1528,-6.853,-0.6117;6.09,-7.7683,0.4468;5.5142,-7.8227,-1.6378;6.0766,-8.7357,-1.8619;5.5409,-7.1874,-2.5313;4.4695,-8.1085,-1.4654;5.6379,-3.2224,2.9225;5.0836,-3.7624,3.7103;5.3042,-1.7636,2.9479;6.2317,-1.0067,3.4088;7.3183,-1.8735,3.7801;7.0513,-3.0899,3.4459;8.0072,-4.209,3.7276;8.7628,-3.8515,4.4419;8.5379,-4.5084,2.8057;7.2724,-5.3082,4.2424;8.1045,-6.4004,4.5915;7.4515,-7.1903,4.9697;8.8246,-6.1219,5.3753;8.6627,-6.7784,3.7207;4.1455,-1.2858,2.4431;3.5544,-2.002,1.3908;2.5313,-1.5746,0.9266)|\",5.191932267540001\r\n\"[H]C1=NSC2=C1/C([H])=C(C1([H])N([H])N([H])N([H])N1[H])\\C([H])=C/2[H] |(-1.7357,1.8616,1.2399;-1.4739,0.8957,0.813;-2.3832,-0.0407,0.7998;-1.7507,-1.4533,0.089;-0.1759,-0.7555,-0.2077;-0.1825,0.5791,0.2595;0.9736,1.3683,0.1325;0.9747,2.3961,0.488;2.1139,0.8249,-0.4481;3.369,1.6707,-0.5421;3.1181,2.6651,-0.9256;4.4303,1.104,-1.4252;4.8339,1.9183,-1.9017;5.4106,0.6331,-0.477;6.3093,0.682,-0.966;5.3972,1.7929,0.4894;5.8634,1.4574,1.3375;3.9943,1.8167,0.803;3.77,0.9396,1.2901;2.1004,-0.5092,-0.9123;3.0011,-0.9007,-1.3726;0.9692,-1.3057,-0.7986;0.9772,-2.3278,-1.1653)|\",4.74566555272\r\n\"[H]C1=NC2=C(S1)/C([H])=C([C@@]1([H])N([H])N([H])C([H])([H])C1([H])[H])\\C([H])=C/2[H] 
|(3.7193,0.1112,0.4897;2.6515,0.0091,0.3284;1.8994,0.959,-0.117;0.5831,0.5282,-0.2149;0.3486,-0.8109,0.177;1.8631,-1.533,0.6868;-0.9301,-1.3719,0.1305;-1.1017,-2.3989,0.4399;-1.9964,-0.5926,-0.319;-3.3974,-1.1696,-0.3466;-3.3263,-2.2406,-0.1233;-4.2638,-0.5506,0.71;-3.8955,0.3713,0.9413;-5.5728,-0.337,0.1642;-6.0836,-1.2073,0.3129;-5.3909,-0.123,-1.2815;-6.3106,-0.3677,-1.8227;-5.1809,0.9417,-1.4474;-4.1791,-0.9869,-1.6892;-3.5681,-0.5206,-2.4674;-4.5036,-1.9617,-2.0694;-1.7604,0.747,-0.7069;-2.5927,1.3523,-1.0573;-0.4936,1.3089,-0.6614;-0.3157,2.3361,-0.9637)|\",5.20281682156\r\n\"[H]C(C1=NC2=C(N1)C([H])=C([H])C([H])=C2[H])/C([H])=C1\\N([H])ON([H])N1[H] |(3.8728,-0.6754,-1.1001;3.3881,0.2554,-0.8126;2.0462,0.1487,-0.4897;1.3929,-1.0752,-0.5526;0.1567,-0.7574,-0.1818;0.0847,0.6797,0.1034;1.2771,1.2332,-0.0942;-1.146,1.2668,0.5222;-1.2025,2.3302,0.7343;-2.2335,0.4422,0.6431;-3.1884,0.8533,0.9604;-2.1629,-0.9711,0.3627;-3.0675,-1.562,0.4809;-1.0028,-1.5766,-0.042;-0.9501,-2.6405,-0.2528;4.1162,1.4627,-0.7794;3.5913,2.3608,-0.4709;5.4405,1.569,-1.1056;6.3557,0.5498,-1.3827;5.9982,-0.2246,-1.9385;7.3739,1.2087,-2.2104;7.4955,2.4556,-1.5405;7.9658,2.2144,-0.6603;6.1609,2.7747,-1.1598;5.7575,3.4636,-1.7905)|\",2.7456287515449995\r\n\"[H]OC1=C2O[C@@]3([H])C([H])([H])N(C([H])([H])[H])C([H])([H])C([H])([H])[C@]3(C([H])([H])C([H])([H])[H])C2=C([H])C([H])=C1[H] 
|(2.6759,5.7145,-3.7957;2.2331,5.3567,-3.0105;2.9974,4.3399,-2.5093;2.5534,3.6671,-1.3697;1.3874,3.9578,-0.6933;1.6159,3.3461,0.599;2.3044,4.0313,1.1173;0.4376,3.1237,1.5336;-0.2401,2.348,1.1573;-0.1501,4.0374,1.6686;0.9813,2.7069,2.842;1.4556,3.8145,3.6648;1.7015,3.4369,4.6635;0.6515,4.5503,3.7772;2.3447,4.3479,3.2817;1.9192,1.5754,2.7552;2.3694,1.443,3.745;1.331,0.6696,2.5643;3.0462,1.6818,1.683;3.5838,0.7253,1.6349;3.7776,2.445,1.9772;2.4235,2.0575,0.322;1.613,0.8395,-0.2307;2.3525,0.051,-0.4299;0.9645,0.4417,0.5571;0.7644,1.0454,-1.4925;0.2835,0.0993,-1.7673;1.3707,1.3685,-2.3436;-0.0205,1.7924,-1.3431;3.2862,2.6157,-0.8035;4.4891,2.2054,-1.3703;5.0577,1.3834,-0.9435;4.9541,2.8809,-2.5038;5.898,2.5954,-2.9596;4.2157,3.9271,-3.0659;4.5926,4.4433,-3.9478)|\",5.657246951895001\r\n\"[H]O[C@@]1([H])C([H])=C([H])[C@@]2(OC(C([H])([H])[H])(C([H])([H])[H])OC2=O)C([H])([H])[C@]1([H])O[H] |(5.011,-0.6737,-5.1978;5.8355,-0.1822,-5.05;5.8532,0.2256,-3.6875;6.8537,0.6671,-3.5529;4.8133,1.2895,-3.4118;4.6332,1.9827,-4.2315;4.134,1.4045,-2.2683;3.3835,2.1827,-2.1465;4.3258,0.4655,-1.097;3.1487,-0.3476,-0.9505;2.5457,-0.1425,0.3313;1.1307,0.3897,0.1513;0.6615,0.5739,1.1225;0.5306,-0.3402,-0.4002;1.1534,1.3247,-0.4154;2.616,-1.4252,1.1507;2.171,-1.2752,2.1389;3.6571,-1.736,1.2759;2.0727,-2.2219,0.6338;3.3372,0.8704,0.9931;4.3919,1.237,0.2272;5.2236,2.0352,0.5686;5.5862,-0.401,-1.2737;6.4627,0.217,-1.0349;5.5624,-1.2367,-0.568;5.7079,-0.9441,-2.7032;4.7875,-1.4903,-2.948;6.745,-1.9037,-2.824;7.5945,-1.4353,-2.7809)|\",6.47086736489\r\n\"[H]/N=C(/O[H])N([H])/N=C(\\C1=C(C([H])([H])[H])ON=C1C(=O)O[H])C([H])([H])[H] 
|(3.8284,3.3526,-0.0662;4.4781,3.3797,-0.8523;4.8061,2.2021,-1.191;5.6519,1.9627,-2.2265;5.8739,2.8431,-2.5764;4.4523,0.9812,-0.62;4.6306,0.144,-1.1698;3.4706,0.9834,0.3212;2.999,-0.1328,0.7565;3.5089,-1.4663,0.3381;4.7591,-1.9769,0.5835;5.9714,-1.435,1.2561;5.7571,-0.4489,1.6728;6.8023,-1.338,0.548;6.2955,-2.1026,2.0618;4.8333,-3.2406,0.1056;3.6304,-3.6014,-0.4753;2.8483,-2.5519,-0.3326;1.4648,-2.559,-0.8768;0.7141,-1.6075,-0.8113;1.1359,-3.7279,-1.455;0.2222,-3.6201,-1.7806;1.9112,-0.0722,1.7954;2.2215,-0.5813,2.7172;1.0053,-0.5652,1.4288;1.6818,0.9714,2.0232)|\",4.34837933099\r\n\"[H]O/C(=N/C1C([H])O[C@@]2([H])C([H])([H])[C@]([H])(O[H])C([H])([H])[C@@]([H])(O[H])[C@]2([H])C1=O)C([H])([H])[H] |(-0.6749,2.8671,2.1095;0.2401,3.0915,1.8776;0.8263,2.0241,1.275;2.069,2.1491,1.0105;2.7947,1.2122,0.3275;2.5321,0.8929,-0.9933;3.1642,0.1688,-1.5139;2.122,1.9624,-1.7794;3.3821,2.7079,-1.95;3.999,2.0971,-2.6296;3.124,4.0611,-2.5967;2.5619,3.9353,-3.527;2.5259,4.6787,-1.9152;4.4697,4.7597,-2.9143;4.2508,5.7837,-3.234;5.0996,4.1635,-4.0529;5.5728,3.369,-3.7633;5.4105,4.813,-1.6964;5.007,5.5069,-0.9479;6.3853,5.2028,-2.0088;5.592,3.4505,-1.0125;6.0656,2.738,-1.7129;6.4278,3.6285,0.1147;6.5716,2.7383,0.4807;4.2078,2.9015,-0.6276;3.6994,3.6366,0.002;4.2774,1.5465,0.1631;5.2907,0.8984,0.318;-0.0643,0.8251,1.0587;-0.5943,0.5756,1.9875;0.5159,-0.0391,0.733;-0.8155,1.0385,0.2887)|\",3.54292233351\r\n\"[H]O/C(=N/C1C([H])O[C@@]2([H])[C@]([H])(C([H])([H])[H])[C@]([H])(O[H])C([H])([H])C([H])([H])[C@]2([H])C1=O)C([H])([H])[H] 
|(5.0809,-4.9889,2.9066;5.7244,-4.5695,2.3144;5.0908,-3.6256,1.5639;5.7968,-3.0651,0.6619;5.3369,-2.0449,-0.1298;4.994,-0.8073,0.3842;4.668,-0.0136,-0.2926;5.7768,-0.3645,1.4398;7.0234,0.0169,0.7409;6.7929,0.9219,0.1612;8.1055,0.3496,1.7754;8.3844,-0.5886,2.2757;7.6288,1.3521,2.8332;8.4322,1.556,3.5512;6.769,0.9563,3.3795;7.3425,2.3008,2.3705;9.3612,0.8539,1.0257;10.1634,0.9876,1.7703;9.0208,2.1219,0.4537;9.8006,2.452,-0.0189;9.8393,-0.1303,-0.0544;10.6942,0.3089,-0.589;10.2192,-1.0356,0.4397;8.7254,-0.5084,-1.0448;9.0879,-1.2481,-1.7665;8.4136,0.3697,-1.6206;7.5337,-1.0757,-0.2702;7.8542,-1.9438,0.3131;6.3332,-1.5296,-1.1821;6.206,-1.2463,-2.3488;3.6288,-3.4182,1.8774;3.1115,-4.3855,1.9341;3.1521,-2.8005,1.1151;3.5085,-2.915,2.8445)|\",3.5646914415499995\r\n\"[H]O/C(=N/C1C([H])O[C@@]2([H])C([H])([H])[C@]([H])(O[H])C([H])([H])C([H])([H])[C@]2([H])C1=O)C([H])([H])[H] |(-0.4415,3.0872,2.4703;0.4668,3.2676,2.1818;0.929,2.2026,1.4694;2.1523,2.2617,1.1122;2.7541,1.315,0.3239;2.3586,1.0765,-0.9801;2.8915,0.3336,-1.5785;1.9549,2.1975,-1.6939;3.2517,2.8575,-1.9488;3.7813,2.2391,-2.6866;3.0066,4.2394,-2.5427;2.3722,4.1578,-3.4311;2.4872,4.8671,-1.8088;4.3439,4.8894,-2.9336;4.148,5.9276,-3.2475;4.8408,4.135,-4.0453;5.7116,4.4929,-4.2787;5.3467,4.9012,-1.7666;6.3105,5.293,-2.1226;4.9894,5.6106,-1.0071;5.5366,3.5137,-1.1262;6.212,3.5796,-0.2663;6.0005,2.8232,-1.8392;4.1771,2.9674,-0.6819;3.7072,3.6682,0.0143;4.2412,1.5622,0.0238;5.2008,0.8295,0.0249;-0.0614,1.0842,1.252;-0.5733,0.8413,2.1929;0.4386,0.1911,0.8744;-0.8233,1.378,0.52)|\",3.57013371856\r\n\"[H]C1=C([H])C([H])=C(C(=O)OC(=O)[C@@]([H])(N([H])[H])[C@]([H])(C([H])([H])[H])C([H])([H])C([H])([H])[H])C([H])=C1[H] 
|(2.2162,-5.4479,5.3388;2.4615,-5.1054,4.3371;2.6139,-6.0318,3.3012;2.4896,-7.0931,3.4964;2.9261,-5.5941,2.0178;3.0498,-6.2948,1.1988;3.0899,-4.2239,1.7634;3.4251,-3.82,0.3753;3.5851,-4.5695,-0.5539;3.4705,-2.4325,0.2327;4.2602,-1.8517,-0.7588;5.291,-2.3331,-1.1423;3.6536,-0.5329,-1.2377;3.387,0.0416,-0.3413;4.5957,0.2563,-2.0237;5.0574,-0.3573,-2.6942;5.3427,0.5931,-1.4184;2.33,-0.8031,-2.0169;1.6994,-1.4092,-1.3519;2.5826,-1.6102,-3.3009;1.639,-1.8639,-3.7933;3.1036,-2.5515,-3.0919;3.183,-1.0357,-4.0164;1.5991,0.5285,-2.2793;1.52,1.0728,-1.3273;2.2217,1.15,-2.9327;0.1949,0.372,-2.8735;-0.3022,1.3458,-2.9487;-0.4353,-0.2739,-2.2489;0.218,-0.0589,-3.8802;2.9368,-3.2969,2.8044;3.0645,-2.2392,2.6045;2.6236,-3.7407,4.0878;2.5056,-3.0217,4.8935)|\",4.982404602655\r\n\"[H]C1=C([H])C([H])=C(C(=O)OC(=O)[C@@]([H])(N([H])[H])C([H])(C([H])([H])[H])C([H])([H])[H])C([H])=C1[H] |(0.4052,-5.5658,-5.0133;0.5836,-5.1823,-4.0122;1.2763,-3.9821,-3.8395;1.6359,-3.4314,-4.7041;1.508,-3.4864,-2.5577;2.0407,-2.5527,-2.4189;1.0451,-4.1966,-1.4407;1.263,-3.7301,-0.0486;0.8971,-4.299,0.9468;1.8972,-2.4837,-0.0096;2.7014,-2.1237,1.068;3.2683,-2.9217,1.7652;2.7995,-0.6019,1.2114;1.883,-0.1703,0.7956;2.893,-0.202,2.6163;3.5919,-0.7858,3.0748;2.0143,-0.4142,3.0877;3.9979,-0.0153,0.4079;3.9348,1.0629,0.6022;5.36,-0.5057,0.9205;6.1675,-0.0174,0.3629;5.4752,-1.5886,0.797;5.504,-0.2651,1.9787;3.8616,-0.2419,-1.1045;4.638,0.315,-1.6409;2.8878,0.0952,-1.479;3.9764,-1.3002,-1.3672;0.348,-5.4015,-1.6184;-0.0003,-5.9358,-0.7407;0.1194,-5.8917,-2.9005;-0.4183,-6.8259,-3.0354)|\",4.971520048635\r\n\"[H]OC1=NN=C2C1=C(F)C(F)=C(F)[C@]2([H])F 
|(0.9352,2.7493,-0.5249;-0.0171,2.5628,-0.6276;-0.2202,1.3311,-0.1499;0.7119,0.5692,0.3513;0.1191,-0.6625,0.7282;-1.1523,-0.615,0.4513;-1.5052,0.6569,-0.1423;-2.7565,0.8662,-0.6086;-3.1508,1.98,-1.2014;-3.7684,-0.1742,-0.4768;-4.9752,0.088,-0.9874;-3.4814,-1.3563,0.1039;-4.4177,-2.2975,0.2017;-2.1592,-1.6861,0.7476;-1.8158,-2.6712,0.4118;-2.3944,-1.79,2.1242)|\",3.47761500939\r\n\"[H]O[C@@]1([H])C2(NC([H])[C@@]1([H])O[H])C([H])([H])C2([H])[H] |(-0.0367,1.2027,-2.0916;0.3181,1.8664,-1.4778;0.3488,1.2689,-0.1718;0.1116,2.0674,0.5343;-0.5637,0.0478,-0.0675;0.2309,-1.1523,-0.2729;1.4647,-0.8225,-0.2288;2.2537,-1.5616,-0.3664;1.7568,0.6442,0.0678;2.046,0.7407,1.1245;2.7728,1.2282,-0.7112;2.2982,1.6431,-1.458;-1.9859,0.0508,-0.5844;-2.3754,0.9868,-0.9765;-2.3162,-0.8437,-1.1038;-1.725,-0.0465,0.8948;-1.875,-1.0079,1.3763;-1.9411,0.8182,1.5165)|\",6.372906378710001\r\n\"[H]C1=C([H])C([H])=C(C(=O)O[C@@]2([H])C([H])([H])[C@@]3([H])N([H])[C@]([H])(C([H])([H])C3([H])[H])C2([H])[H])S1 |(-1.694,1.246,-1.9492;-1.1058,0.4299,-1.5499;-1.5244,-0.5773,-0.7163;-2.5409,-0.6681,-0.3493;-0.478,-1.482,-0.3915;-0.5887,-2.3474,0.2507;0.7185,-1.1471,-0.9838;2.0268,-1.8111,-0.9062;3.0249,-1.4143,-1.4809;1.9803,-2.9136,-0.1226;3.2184,-3.6637,0.0299;3.7699,-3.5573,-0.9071;4.0561,-3.0904,1.1793;4.4858,-2.1288,0.8752;3.3994,-2.8973,2.039;5.1595,-4.092,1.5995;5.7595,-3.6603,2.4082;4.5843,-5.3811,2.0287;3.8416,-5.2445,2.7157;4.0528,-5.9354,0.7694;3.7634,-6.9825,0.9109;5.2837,-5.7906,-0.1581;5.9121,-6.6825,-0.0852;4.9946,-5.6764,-1.2081;6.0328,-4.5443,0.4032;6.153,-3.7491,-0.339;7.0301,-4.8229,0.7544;2.8347,-5.1269,0.2626;2.0341,-5.1743,1.0137;2.4314,-5.5599,-0.6618;0.5681,0.2947,-1.9536)|\",4.623214319995\r\n\"[H]C1=C([H])N([H])C(C(=O)O[C@@]2([H])C([H])([H])[C@@]3([H])N([H])[C@]([H])(C([H])([H])C3([H])[H])C2([H])[H])=C1[H] 
|(-2.1506,0.2669,0.7244;-1.1619,0.0986,0.3199;-0.4833,0.9498,-0.5414;-0.7727,1.9,-0.967;0.7252,0.3893,-0.821;1.4573,0.7541,-1.4142;0.8476,-0.8128,-0.1591;2.0811,-1.5722,-0.3245;3.0112,-1.1882,-1.0203;2.0687,-2.7299,0.3727;3.258,-3.5627,0.2826;3.6908,-3.3958,-0.7063;4.2747,-3.1556,1.3563;4.7243,-2.1919,1.0883;3.7468,-3.0153,2.3098;5.3522,-4.2506,1.5316;6.0771,-3.9331,2.2893;4.7501,-5.542,1.9139;4.1129,-5.4341,2.7039;4.0264,-5.9393,0.6922;3.6878,-6.978,0.7741;5.1379,-5.7711,-0.3733;5.7062,-6.7004,-0.4679;4.7278,-5.5344,-1.3608;6.0363,-4.6317,0.1962;6.1206,-3.7773,-0.4833;7.0477,-5.0005,0.3889;2.8094,-5.0191,0.4286;2.1116,-5.1005,1.273;2.2622,-5.3326,-0.4698;-0.323,-1.0136,0.5616;-0.5317,-1.8705,1.186)|\",5.36064285485\r\n\"[H]C1=C([H])N([H])C(C(=O)O[C@@]2([H])C([H])([H])[C@@]3([H])N(C([H])([H])[H])[C@]([H])(C([H])([H])C3([H])[H])C2([H])[H])=C1[H] |(6.5511,-7.3082,-0.7834;6.8124,-6.2675,-0.6495;8.0894,-5.7332,-0.7525;9.0354,-6.2068,-0.973;8.0057,-4.3947,-0.5191;8.7539,-3.7157,-0.5182;6.6972,-4.0457,-0.2643;6.3935,-2.6467,0.01;7.2448,-1.7678,0.0196;5.0797,-2.4465,0.2518;4.6615,-1.0867,0.5407;5.489,-0.6027,1.0656;4.3681,-0.3267,-0.7606;5.3112,-0.088,-1.2647;3.7764,-0.969,-1.4222;3.5591,0.9514,-0.4682;3.3627,1.4817,-1.4057;2.2841,0.5318,0.1495;1.2442,1.5554,0.103;0.9938,1.765,-0.9427;1.5089,2.5143,0.5851;0.3417,1.173,0.5921;2.7175,0.1882,1.5195;1.8459,0.1073,2.1771;3.6822,1.3391,1.9465;3.1315,2.1277,2.4691;4.4624,0.99,2.6313;4.2538,1.8579,0.5962;5.3455,1.7913,0.5436;3.9911,2.9087,0.437;3.4301,-1.1756,1.446;2.7314,-1.9109,1.0328;3.7264,-1.5162,2.4459;5.9334,-5.2035,-0.3415;4.8647,-5.2556,-0.1909)|\",5.21914365259\r\n\"[H]C1=C([H])C([H])=C(C(=O)O[C@@]2([H])C([H])([H])[C@@]3([H])N([H])[C@]([H])(C([H])([H])C3([H])[H])C2([H])[H])O1 
|(-2.2898,-2.3195,1.8454;-1.4865,-1.8071,1.3373;-1.3685,-0.5636,0.7837;-2.1306,0.2023,0.7522;-0.0438,-0.492,0.263;0.4287,0.3331,-0.2498;0.5461,-1.6972,0.5384;1.8994,-2.1738,0.2332;2.7178,-1.4888,-0.3535;2.1083,-3.4313,0.6726;3.4215,-4.0053,0.4222;3.775,-3.5961,-0.5273;4.3998,-3.6148,1.5378;4.6641,-2.5554,1.4389;3.9008,-3.7387,2.5089;5.6571,-4.5166,1.4988;6.3507,-4.2128,2.2905;5.3044,-5.9406,1.6522;4.6952,-6.0868,2.458;4.6044,-6.2415,0.3897;4.4527,-7.3219,0.2898;5.6196,-5.6965,-0.6449;6.3358,-6.4789,-0.9115;5.13,-5.3715,-1.5689;6.3333,-4.5309,0.1056;6.2353,-3.5692,-0.4082;7.4012,-4.7402,0.2121;3.2371,-5.5206,0.3153;2.6049,-5.865,1.145;2.7083,-5.769,-0.6138;-0.337,-2.5104,1.1992)|\",4.922539555545\r\n\"[H]OC(=O)C([H])([H])/C(=N\\[C@]([H])(C(=O)O[H])C([H])([H])C(=O)O[H])O[H] |(-1.9739,1.019,1.1404;-2.717,1.2463,0.5386;-2.3898,2.2507,-0.2698;-3.0691,2.5402,-1.2389;-1.1099,3.0366,0.0284;-1.3889,4.0968,0.0051;-0.6904,2.7981,1.0011;-0.1021,2.8005,-1.0916;1.1199,2.4434,-0.9927;1.8095,2.2262,0.2629;1.3451,2.709,1.1319;3.2347,2.8305,0.152;4.0134,2.7947,1.0765;3.5215,3.362,-1.0388;2.723,3.2033,-1.6021;1.9817,0.7276,0.5802;2.7589,0.6304,1.3503;2.3413,0.2097,-0.3183;0.7258,0.0508,1.0982;-0.2169,0.6439,1.5878;0.6954,-1.2911,1.0359;1.5076,-1.6355,0.6263;-0.5853,3.0269,-2.3218;-1.5667,3.097,-2.2669)|\",6.138888467280001\r\n\"[H]C1C([H])=C([H])N=C2N[C@]3([H])C(=NC(=O)C4=C([H])C([H])=C([H])C([H])=C43)N12 
|(-4.7139,1.3701,1.3526;-4.1402,2.2678,1.5572;-4.538,3.3233,2.314;-5.514,3.346,2.7818;-3.6139,4.4186,2.4766;-3.9288,5.2753,3.0729;-2.4134,4.4871,1.9722;-1.9875,3.4206,1.2154;-0.8283,3.2692,0.6741;-0.9011,2.0225,-0.0859;-0.9758,2.3069,-1.1522;-2.2032,1.3506,0.2875;-2.5878,0.1432,0.12;-1.5897,-0.7439,-0.3727;-1.9006,-1.8228,-0.8398;-0.1498,-0.3366,-0.189;0.8429,-1.3236,-0.2106;0.5468,-2.3522,-0.3893;2.175,-0.977,0.0015;2.9431,-1.7451,-0.0076;2.5236,0.3583,0.2294;3.5621,0.6252,0.4061;1.5455,1.3545,0.2297;1.8032,2.3953,0.399;0.2126,1.006,0.0223;-2.8995,2.3281,0.986)|\",3.719796336335\r\n\"[H]C1=NC([H])=C([H])N2C1=NC(C1=C([H])C([H])=C([H])C([H])=C1[H])=C2N=C=O |(3.622,-2.7939,-1.6307;2.8076,-2.2819,-1.123;2.8974,-0.9813,-0.943;1.8635,-0.3566,-0.2985;1.9607,0.7157,-0.1602;0.7493,-0.9976,0.1656;-0.0766,-0.523,0.6799;0.6711,-2.3554,-0.0359;1.7014,-3.048,-0.6848;1.43,-4.3465,-0.7708;0.211,-4.5216,-0.1855;-0.4028,-5.8532,-0.0838;0.4296,-6.9845,-0.0262;1.5045,-6.8427,-0.0653;-0.1177,-8.26,0.0776;0.5391,-9.1244,0.1261;-1.504,-8.4316,0.1217;-1.929,-9.4281,0.2053;-2.3399,-7.317,0.0467;-3.4192,-7.4405,0.06;-1.795,-6.0376,-0.0597;-2.4619,-5.1879,-0.168;-0.2778,-3.2924,0.2936;-1.3841,-2.8911,0.999;-2.3368,-3.329,1.6049;-3.3096,-3.6118,2.202)|\",4.337494776970001\r\n\"[H]S/C(=N/C([H])([H])C([H])=C([H])[H])N([H])/N=C([H])/C([H])=C(\\[H])C1=C([H])C([H])=C([H])O1 
|(8.3788,3.5148,-0.8104;7.6766,4.1769,-1.7482;6.066,3.6859,-1.0689;5.9813,3.2767,0.124;4.745,2.7594,0.7121;4.0273,3.5706,0.8825;4.2513,2.0601,0.0186;5.0517,2.056,2.0084;5.8658,1.3323,1.9657;4.3919,2.2515,3.1491;4.6304,1.6968,4.0527;3.5834,2.9765,3.2241;5.0953,3.8515,-2.0562;5.4076,4.0029,-3.0152;3.7769,3.9688,-1.7671;2.977,4.1948,-2.7573;3.3553,4.291,-3.7853;1.5604,4.3306,-2.5445;1.2187,4.2267,-1.5173;0.686,4.5734,-3.5507;1.0573,4.6713,-4.5695;-0.7367,4.7202,-3.4111;-1.6257,4.679,-2.3604;-1.3781,4.5069,-1.3222;-2.9221,4.9047,-2.907;-3.8613,4.9407,-2.3728;-2.7448,5.0693,-4.2503;-3.418,5.2614,-5.0718;-1.4306,4.9622,-4.5751)|\",3.62999876567\r\n\"[H]OC(=O)[C@@]([H])(N([H])[H])C([H])([H])/N=C(/O[H])[C@@]([H])(N([H])[H])C([H])([H])N([H])[H] |(-4.3142,0.0316,2.2012;-3.5647,0.5897,2.4823;-2.5986,0.4218,1.5457;-2.7592,-0.2789,0.5697;-1.3618,1.2477,1.8652;-1.6269,2.2965,1.6242;-0.9703,1.0505,3.2609;-0.0967,1.5609,3.3923;-1.6667,1.46,3.8794;-0.1932,0.8517,0.9547;-0.5293,0.896,-0.0905;0.0563,-0.1958,1.1736;0.9268,1.7273,1.256;1.8953,1.863,0.4464;2.9188,2.6802,0.781;3.485,2.716,-0.0255;2.1145,1.1492,-0.8949;1.1393,0.9562,-1.3642;2.9862,1.9931,-1.7296;3.4052,1.4334,-2.4699;2.448,2.7266,-2.1875;2.8024,-0.2196,-0.6303;2.1543,-0.7971,0.0367;3.7369,-0.0383,-0.0872;3.1084,-1.0411,-1.7943;2.2955,-1.173,-2.3924;3.8624,-0.6588,-2.3587)|\",6.438213702830001\r\n\"[H]O/N=C([H])/C([H])=C(\\[H])C1=C([H])C([H])=C(C(F)(F)F)C([H])=C1[H] 
|(4.7433,-1.664,0.9789;4.3108,-0.8636,0.6412;2.9874,-1.2814,0.4722;2.2605,-0.3191,0.0213;2.6964,0.6639,-0.1756;0.8538,-0.5432,-0.2243;0.4904,-1.543,;0.0353,0.4195,-0.7016;0.4683,1.3994,-0.9032;-1.3948,0.3066,-0.9874;-2.1319,-0.881,-0.7963;-1.6394,-1.7718,-0.4196;-3.4879,-0.9325,-1.0827;-4.0403,-1.8548,-0.9292;-4.1506,0.2021,-1.5698;-5.621,0.1116,-1.868;-6.1248,1.274,-2.334;-5.8821,-0.8414,-2.7938;-6.3311,-0.2223,-0.7643;-3.4412,1.3864,-1.7659;-3.9506,2.2668,-2.1417;-2.0799,1.433,-1.4766;-1.5331,2.3596,-1.6316)|\",4.119803696569999\r\n\"[H]C1=C([H])C(OC([H])([H])[H])=C(/C([H])=C(\\[H])C(=O)N(OC([H])([H])[H])C([H])([H])[H])C([H])=C1[H] |(11.0451,-2.3086,-1.039;10.0214,-2.0253,-0.8094;9.756,-1.3202,0.3646;10.5725,-1.0685,1.0312;8.4436,-0.9464,0.6754;8.1055,-0.263,1.8046;9.1302,0.119,2.7084;8.6293,0.6572,3.5147;9.8628,0.7812,2.229;9.6488,-0.7555,3.1223;7.3721,-1.2792,-0.1961;6.0118,-0.8706,0.1518;5.9106,-0.2114,1.0086;4.8731,-1.2305,-0.4721;4.859,-1.8926,-1.3285;3.5826,-0.6936,0.0178;3.4838,0.1148,0.9379;2.4277,-1.2034,-0.5763;2.5756,-1.8459,-1.8259;2.2731,-3.2366,-1.6957;2.984,-3.7376,-1.0279;2.3575,-3.6457,-2.7061;1.2531,-3.391,-1.3226;1.1643,-0.4934,-0.4702;1.1773,0.0553,0.471;0.335,-1.2076,-0.4704;1.0364,0.2069,-1.3055;7.6825,-1.9852,-1.3702;6.8795,-2.2323,-2.058;8.9856,-2.3601,-1.6823;9.1927,-2.9019,-2.6003)|\",4.33205249996\r\n\"[H]/N=C(/N([H])O[H])C([H])([H])C1([H])C([H])([H])C([H])([H])N(C([H])([H])[H])C([H])([H])C1([H])[H] 
|(7.0317,-2.4541,0.4051;7.2095,-2.3973,1.406;6.882,-1.2521,1.8899;7.1854,-1.0263,3.2243;6.4718,-0.532,3.7537;7.5285,-2.1925,3.9469;7.5777,-2.8592,3.2183;6.2268,-0.0831,1.185;6.6211,0.8521,1.6032;6.5211,-0.1056,0.1284;4.6833,-0.0803,1.2873;4.4098,-0.1139,2.3549;4.0325,-1.2918,0.5998;4.3413,-1.3164,-0.4548;4.3696,-2.2266,1.0619;2.5044,-1.208,0.6592;2.17,-1.3079,1.7143;2.0645,-2.0476,0.1079;2.0213,0.0357,0.0624;0.569,0.0749,-0.0065;0.2035,-0.7817,-0.5836;0.2475,0.99,-0.5161;0.0811,0.0491,0.9884;2.5641,1.2027,0.7536;2.1686,2.1043,0.2706;2.2338,1.2364,1.8144;4.0949,1.2125,0.6987;4.4761,2.0887,1.2395;4.4111,1.3122,-0.3484)|\",6.353858409175\r\n\"[H]OC(=O)[C@@]1([H])[C@]2([H])C([H])([H])[C@]3([H])[C@@]1([H])C(O[H])=N[C@]3([H])C2([H])[H] |(-0.9529,0.6942,3.1917;-1.9137,0.7836,3.0764;-2.1937,1.1684,1.8127;-3.3499,1.3124,1.4741;-1.0151,1.4888,0.9053;-0.8393,2.5643,1.0615;0.3488,0.7652,1.1119;0.8294,0.9858,2.074;1.107,1.2901,-0.1268;2.0956,0.8326,-0.2438;1.2221,2.3803,-0.1394;0.077,0.7593,-1.131;0.2587,0.9437,-2.1922;-1.3053,1.2579,-0.6207;-1.702,2.1547,-1.1031;-2.1607,0.0156,-0.8815;-3.4963,0.0893,-1.0141;-3.802,0.798,-0.4141;-1.5173,-1.0823,-0.9765;-0.1017,-0.7481,-0.7553;0.5421,-1.4289,-1.3184;0.2447,-0.7452,0.7751;-0.5155,-1.2847,1.3487;1.2119,-1.2232,0.9679)|\",6.272224254025001\r\n\"[H]OC1=N[C@]([H])(C2([H])C([H])([H])C2([H])[H])N([H])[C@@]([H])(C([H])([H])C(=O)OC([H])([H])[H])C1([H])[H] 
|(-0.5326,3.6931,-2.0483;-0.8939,2.7933,-2.0611;-0.6244,2.1944,-0.8621;-0.9925,0.9893,-0.7246;-0.7637,0.3163,0.542;0.0249,-0.4379,0.3449;-2.0093,-0.4418,0.9621;-2.8555,0.2101,1.1658;-1.8748,-1.6824,1.8174;-0.8737,-2.0244,2.0737;-2.6129,-1.8578,2.5951;-2.3023,-1.7919,0.3728;-3.3369,-2.0412,0.1556;-1.5907,-2.1959,-0.3426;-0.3893,1.2352,1.6264;-0.1563,0.6998,2.4598;0.6687,2.18,1.2778;0.8564,2.8101,2.1538;2.0092,1.5159,0.8776;1.9267,1.0336,-0.1059;2.261,0.7161,1.5843;3.177,2.4802,0.8218;3.0962,3.6904,0.7573;4.3534,1.8176,0.8471;5.5302,2.6407,0.7649;6.3721,1.9487,0.792;5.5737,3.3333,1.6095;5.5335,3.2145,-0.1656;0.0857,3.0588,0.16;0.8756,3.6541,-0.3116;-0.6392,3.7598,0.5928)|\",6.22596489944\r\n\"[H]ON([H])/C([H])=C1\\C([H])=NN=C1C1=C([H])C([H])=C(Cl)C([H])=C1[H] |(-2.5339,6.5835,0.4795;-1.8508,6.2437,1.0843;-0.8311,5.7894,0.2269;-0.0593,6.4496,0.2052;-0.5552,4.4629,0.2883;-1.3938,3.8704,0.6418;0.6188,3.878,-0.0964;1.8939,4.4765,-0.4262;2.1363,5.5301,-0.5271;2.8152,3.5639,-0.572;2.2274,2.2993,-0.3762;0.9513,2.455,-0.1073;0.0768,1.3011,0.1516;0.6104,0.1607,0.7778;1.6574,0.166,1.0613;-0.1805,-0.9549,1.0309;0.2365,-1.8296,1.5187;-1.5249,-0.9392,0.6542;-2.5323,-2.3427,0.9793;-2.078,0.1688,0.0163;-3.1175,0.1562,-0.2938;-1.2733,1.2795,-0.2354;-1.6948,2.1178,-0.7818)|\",3.888506923645001\r\n\"[H]OC1=N[C@]([H])(N([H])[H])N([H])[C@@]([H])(N2C([H])([H])C([H])([H])C([H])([H])[C@@]([H])(N([H])[H])C2([H])[H])C1([H])[H] 
|(-5.6736,-1.8039,-0.9526;-5.8739,-1.7178,-0.0082;-4.7383,-1.3648,0.6544;-4.7991,-1.2218,1.9142;-3.5345,-0.8255,2.5299;-2.8345,-1.674,2.5174;-3.8065,-0.4858,3.9089;-4.6656,0.0584,3.9566;-3.0411,0.0707,4.2834;-2.8651,0.2831,1.8105;-3.4531,1.105,1.9422;-2.664,0.0536,0.3641;-3.0545,0.9335,-0.1583;-1.2639,-0.0486,-0.0532;-0.5953,1.2595,-0.0531;-1.2223,1.9643,-0.6134;-0.4939,1.6585,0.974;0.7844,1.1631,-0.7095;0.6543,0.8901,-1.7649;1.2724,2.1455,-0.6855;1.6499,0.1091,-0.0067;1.8822,0.4473,1.0155;2.6101,-0.018,-0.525;0.9091,-1.2363,0.0821;0.7579,-1.6192,-0.9361;1.6056,-2.2871,0.8354;1.8269,-1.9414,1.7702;2.5035,-2.4829,0.3938;-0.4784,-1.03,0.7048;-0.3614,-0.7169,1.7602;-0.9889,-1.997,0.7133;-3.4721,-1.1735,-0.1506;-2.8678,-2.0824,-0.0486;-3.6702,-1.0493,-1.2211)|\",6.364742963195\r\n\"[H]OC1=N[C@]([H])(N([H])[H])N([H])[C@@]([H])(N2C([H])([H])C([H])([H])C([H])(N([H])[H])C([H])([H])C2([H])[H])C1([H])[H] |(-0.1824,-5.527,-0.9069;0.605,-5.3229,-1.4347;1.2737,-4.2813,-0.8543;2.3135,-3.8616,-1.4412;3.0699,-2.7789,-0.8306;2.8682,-1.8866,-1.4466;4.5074,-2.9978,-0.874;4.7778,-3.1311,-1.8479;4.6973,-3.8832,-0.4026;2.7335,-2.5845,0.5769;3.3159,-1.8581,0.979;1.3258,-2.4431,0.8741;1.2401,-2.3322,1.9715;0.6526,-1.2984,0.2034;1.4097,-0.0457,0.3298;1.4808,0.2753,1.3914;2.43,-0.2027,-0.0259;0.7611,1.0711,-0.4958;0.7868,0.7851,-1.5566;1.3512,1.9917,-0.3901;-0.6976,1.3221,-0.0777;-0.6944,1.6969,0.9572;-1.4218,2.3154,-0.8791;-1.4141,2.0249,-1.8579;-0.9232,3.2051,-0.8521;-1.4517,-0.0102,-0.0922;-2.4614,0.1302,0.3099;-1.5564,-0.3558,-1.1304;-0.7107,-1.085,0.7094;-1.2756,-2.0192,0.6503;-0.691,-0.7969,1.7819;0.6926,-3.7857,0.4553;-0.3968,-3.7193,0.3679;0.9054,-4.5304,1.2334)|\",6.73753893838\r\n\"[H]OC1=N[C@]([H])(N([H])[H])N([H])[C@@]([H])(N2C([H])([H])C([H])([H])N([H])C([H])([H])C2([H])[H])C1([H])[H] 
|(-0.1538,-5.6782,-0.6613;0.647,-5.5507,-1.1927;1.3159,-4.4497,-0.7341;2.3782,-4.123,-1.3396;3.1324,-2.9762,-0.8543;2.9555,-2.1692,-1.5846;4.5678,-3.2125,-0.8314;4.8595,-3.4753,-1.7723;4.7341,-4.0311,-0.2444;2.7631,-2.5994,0.5075;3.3442,-1.8317,0.8271;1.3491,-2.3999,0.7351;1.2278,-2.1386,1.8032;0.7287,-1.3452,-0.1085;1.4886,-0.0866,-0.0654;1.5208,0.3324,0.9599;2.516,-0.2729,-0.3868;0.8693,0.9498,-1.0018;0.9557,0.5753,-2.0392;1.4382,1.884,-0.9312;-0.513,1.2025,-0.5996;-0.9292,1.9065,-1.2055;-1.2899,-0.033,-0.6614;-2.3162,0.173,-0.3353;-1.3352,-0.4657,-1.6786;-0.6639,-1.0688,0.2724;-1.2552,-1.9866,0.2116;-0.7288,-0.6978,1.3144;0.7073,-3.778,0.4816;-0.3789,-3.7142,0.3597;0.8925,-4.4128,1.3578)|\",6.745702353895\r\n\"[H]C1=N[C@]2([H])C(=C1[H])C(=O)C(F)=C([H])N2[H] |(0.3173,2.4286,1.3942;1.1292,1.8487,0.9587;2.2554,2.4104,0.6594;3.0395,1.3594,0.0182;3.0542,1.6381,-1.0553;2.2584,0.0766,0.1432;1.0615,0.4066,0.6708;0.2269,-0.2533,0.8732;2.8796,-1.2316,-0.1807;2.2533,-2.2688,-0.3668;4.3388,-1.1299,-0.1453;5.044,-2.2587,-0.3649;5.011,0.0053,0.1938;6.0933,-0.0201,0.2717;4.4026,1.2005,0.4939;4.9809,2.0307,0.4527)|\",3.67353698175\r\n\"[H]OC([H])([H])C([H])([H])C([H])([H])[C@]1([H])C(=O)C([H])=C([H])N=C1[H] |(3.271,1.8676,-1.0954;2.4165,1.4122,-1.0663;2.5275,0.324,-0.1509;3.281,-0.4035,-0.4924;2.8369,0.6774,0.8468;1.1675,-0.3559,-0.0555;0.8855,-0.7044,-1.0587;1.2658,-1.2429,0.582;0.082,0.571,0.5075;0.0528,1.4942,-0.0814;0.3338,0.8448,1.5373;-1.3238,-0.0659,0.5219;-2.0108,0.6886,0.9529;-1.4396,-1.2642,1.4725;-0.8361,-1.2978,2.5372;-2.3351,-2.334,1.0241;-2.5242,-3.156,1.7077;-2.8344,-2.3266,-0.2328;-3.4658,-3.14,-0.5831;-2.5883,-1.3439,-1.2077;-1.8783,-0.3337,-0.8529;-1.6796,0.4243,-1.6159)|\",4.5170899183\r\n\"[H]OC1=NC([H])([H])[C@]([H])(C(=O)OC([H])([H])[H])C([H])([H])[C@@]1([H])C1=C([H])C([H])=NC([H])=C1[H] 
|(3.8507,-1.3039,6.9715;4.0902,-1.9909,6.3299;3.8193,-1.5404,5.0688;3.9727,-2.3644,4.1201;3.7065,-1.9671,2.7486;4.6739,-1.8323,2.2422;3.2103,-2.8023,2.2433;2.8544,-0.6829,2.5867;1.819,-0.9285,2.8553;2.8596,-0.2304,1.1397;3.447,0.7379,0.7054;2.1394,-1.0755,0.368;2.1163,-0.7632,-1.0355;1.5043,-1.5384,-1.4973;3.1285,-0.7741,-1.4483;1.6768,0.224,-1.2007;3.3965,0.403,3.5158;2.8277,1.3322,3.4153;4.4287,0.632,3.2287;3.3685,-0.0845,4.9847;4.1062,0.5102,5.5426;2.014,0.1365,5.6542;1.0279,-0.8542,5.722;1.2083,-1.8502,5.3282;-0.1976,-0.5534,6.3206;-0.9727,-1.3156,6.3824;-0.5024,0.6387,6.8455;0.4456,1.5829,6.7814;0.1894,2.5482,7.2151;1.7008,1.3842,6.2086;2.4253,2.195,6.193)|\",5.727996553024999\r\n\"[H]C1=C([H])C(=C2C([H])([H])C([H])([H])N([H])C([H])([H])C2([H])[H])C([H])=C(C#N)C1=O |(-0.8368,2.3776,0.0454;-0.4366,1.3693,0.0894;0.8769,1.1394,0.3034;1.5348,1.9902,0.4372;1.4425,-0.2069,0.3662;2.7827,-0.4415,0.5807;3.4154,-1.8117,0.6413;3.8287,-1.9557,1.6495;2.7129,-2.6247,0.4562;4.5884,-1.9173,-0.3583;4.1807,-1.8718,-1.3863;5.0759,-2.8905,-0.2381;5.5583,-0.8643,-0.0775;6.3747,-0.97,-0.6757;4.9787,0.4636,-0.2463;5.7487,1.2156,-0.0433;4.5949,0.6455,-1.2685;3.8165,0.6455,0.7549;3.4075,1.6508,0.6517;4.2373,0.5614,1.7668;0.4931,-1.2942,0.1959;0.8445,-2.3174,0.2441;-0.8351,-1.086,-0.0207;-1.7227,-2.1972,-0.182;-2.4225,-3.1175,-0.3101;-1.4101,0.2858,-0.0984;-2.6033,0.4873,-0.3002)|\",3.7578922754050006\r\n\"[H]C1=C([H])C(C#N)=C([H])C(=C2C([H])([H])C([H])([H])N([H])C([H])([H])C2([H])[H])C1=O 
|(-2.3729,0.6933,-0.1164;-1.3318,0.3873,-0.0994;-0.3321,1.2698,0.1118;-0.5377,2.3229,0.281;1.0494,0.8255,0.1093;2.0952,1.7804,0.3166;2.9368,2.5663,0.4882;1.351,-0.4933,-0.0821;2.3962,-0.7724,-0.0731;0.3415,-1.5135,-0.2855;0.7012,-2.8447,-0.3962;2.1295,-3.3414,-0.3484;2.3574,-3.7816,-1.3293;2.866,-2.5604,-0.1615;2.3008,-4.4581,0.7046;2.1553,-4.0191,1.7108;3.3271,-4.8376,0.6562;1.3751,-5.5436,0.4094;1.5295,-6.3192,1.0502;-0.0109,-5.0927,0.4882;-0.6731,-5.9389,0.277;-0.2808,-4.7032,1.4886;-0.2726,-3.986,-0.555;-1.3013,-3.6417,-0.5045;-0.1093,-4.4229,-1.5517;-1.095,-1.0403,-0.3585;-2.0564,-1.7641,-0.6292)|\",3.3742117462000003\r\n\"[H]C1=C([H])C(Cl)=C([H])C(=C2C([H])([H])C([H])([H])N([H])C([H])([H])C2([H])[H])C1=O |(-2.4146,0.8166,-0.0795;-1.3695,0.526,-0.0419;-0.3893,1.4217,0.2124;-0.6116,2.4686,0.396;0.9846,0.9872,0.2349;2.2166,2.2052,0.5443;1.3307,-0.3048,0.0304;2.3807,-0.5615,0.0597;0.3389,-1.3427,-0.2238;0.722,-2.6613,-0.3621;2.1569,-3.1371,-0.2788;2.4336,-3.5359,-1.265;2.8702,-2.3509,-0.0322;2.3087,-4.2881,0.7384;2.116,-3.8895,1.7536;3.3428,-4.6494,0.717;1.4151,-5.3794,0.3704;1.5581,-6.1712,0.9939;0.0184,-4.9541,0.4124;-0.6192,-5.8043,0.1476;-0.2937,-4.6064,1.4162;-0.224,-3.8147,-0.5979;-1.2601,-3.4906,-0.5714;-0.0183,-4.2123,-1.6034;-1.1026,-0.893,-0.3193;-2.0508,-1.6252,-0.6188)|\",3.36876946919\r\n\"[H]C1=C([H])C(C([H])([H])[H])=C([H])C(=C2C([H])([H])C([H])([H])N([H])C([H])([H])C2([H])[H])C1=O 
|(4.8584,-2.4753,-0.2218;4.3789,-1.5022,-0.178;3.043,-1.3627,-0.0237;2.4111,-2.2449,0.0687;2.4119,-0.0582,0.0141;0.9137,0.0203,0.1624;0.5775,-0.4691,1.0863;0.403,-0.487,-0.667;0.5658,1.0576,0.1859;3.1935,1.0482,-0.0837;2.6995,2.0124,-0.0539;4.646,1.0158,-0.2261;5.384,2.1799,-0.2525;4.7825,3.5659,-0.1429;4.9748,4.0918,-1.0889;3.7048,3.5669,0.0192;5.4555,4.382,0.9806;5.2007,3.9173,1.9535;5.0434,5.3975,0.9827;6.8917,4.453,0.7373;7.3337,5.0409,1.4412;7.4957,3.1222,0.7466;8.5731,3.2223,0.5754;7.3614,2.5982,1.7127;6.8885,2.2518,-0.3714;7.334,1.2613,-0.3726;7.125,2.7337,-1.3323;5.2798,-0.3539,-0.3342;6.4791,-0.5637,-0.5493)|\",3.4313556548050004\r\n\"[H]C1=C([H])C(=C2C([H])([H])C([H])([H])N([H])C([H])([H])C2([H])[H])C(=O)C(F)=C1[H] |(1.7562,1.7539,0.4583;0.9887,1.0125,0.2583;1.3284,-0.285,0.0627;2.3768,-0.5496,0.1095;0.3462,-1.331,-0.2066;0.7258,-2.6496,-0.3516;2.1595,-3.1262,-0.2514;2.4511,-3.519,-1.2358;2.8677,-2.3409,0.0113;2.2952,-4.2836,0.7609;2.0857,-3.891,1.7753;3.3296,-4.6448,0.7544;1.4084,-5.3731,0.3721;1.5416,-6.169,0.9928;0.0106,-4.9494,0.3928;-0.6211,-5.7982,0.1099;-0.3198,-4.6097,1.3933;-0.2145,-3.8024,-0.613;-1.2518,-3.4816,-0.6017;0.0105,-4.1922,-1.6174;-1.0973,-0.8965,-0.3146;-2.052,-1.6196,-0.6038;-1.3516,0.5335,-0.052;-2.6407,0.899,-0.1136;-0.3862,1.4394,0.2078;-0.6528,2.4794,0.371)|\",3.33611580713\r\n\"[H]C1=C([H])C(F)=C([H])C(=C2C([H])([H])C([H])([H])N([H])C([H])([H])C2([H])[H])C1=O 
|(-2.4076,0.8308,-0.0963;-1.3628,0.542,-0.0476;-0.3814,1.4342,0.2153;-0.5856,2.4855,0.3984;0.982,0.987,0.2498;1.9198,1.9282,0.501;1.3403,-0.2978,0.0504;2.3949,-0.534,0.0979;0.3493,-1.3344,-0.2146;0.7253,-2.6537,-0.3536;2.1601,-3.1296,-0.2684;2.4423,-3.5186,-1.2572;2.8695,-2.3426,-0.0123;2.3114,-4.2888,0.7385;2.1166,-3.8994,1.7571;3.3456,-4.6501,0.7162;1.4186,-5.3776,0.3604;1.5603,-6.1734,0.9792;0.0215,-4.9522,0.4054;-0.6159,-5.8009,0.1352;-0.2895,-4.6121,1.4121;-0.2216,-3.8054,-0.5957;-1.2578,-3.4815,-0.5643;-0.0204,-4.196,-1.605;-1.0913,-0.8794,-0.3217;-2.0392,-1.6088,-0.6302)|\",3.35244263816\r\n\"[H]OC1=NC([H])([H])[C@]([H])(Cl)C([H])([H])[C@@]1([H])C1=C([H])N=C([H])C([H])=C1[H] |(-1.7195,-1.3913,-2.2903;-1.7279,-2.3123,-2.598;-1.5092,-3.1364,-1.5356;-1.4064,-4.3746,-1.7817;-1.2101,-5.3311,-0.71;-1.857,-6.1951,-0.8959;-0.1766,-5.7074,-0.7735;-1.4221,-4.8228,0.7205;-0.9708,-5.5119,1.4375;-3.2097,-4.8253,1.1586;-0.8544,-3.4183,0.8777;0.2319,-3.4869,0.725;-1.0163,-3.0317,1.888;-1.4578,-2.4532,-0.164;-2.5004,-2.2588,0.1233;-0.7262,-1.1217,-0.1859;-1.314,0.0266,0.3626;-2.3178,-0.0316,0.7843;-0.7269,1.2272,0.4182;0.5053,1.3308,-0.0915;0.9654,2.3159,-0.0378;1.1897,0.2598,-0.6717;2.1868,0.4,-1.078;0.5642,-0.9825,-0.7183;1.0659,-1.8354,-1.1699)|\",5.52391116515\r\n\"[H]C1=NC(=S)C(N([H])[H])=C([H])[C@@]1([H])C(F)(F)F |(0.6826,-2.2946,0.1863;0.3238,-1.2657,0.1285;-0.9258,-1.0938,-0.0955;-1.4603,0.203,-0.1691;-3.0995,0.3685,-0.3073;-0.5572,1.3782,-0.0975;-1.1125,2.616,-0.3193;-0.6052,3.4177,0.0297;-2.1205,2.6602,-0.2195;0.7616,1.1858,0.1674;1.4346,2.0333,0.264;1.3494,-0.1776,0.2836;2.098,-0.3417,-0.5131;2.1518,-0.3658,1.5779;3.111,0.5768,1.6794;1.3726,-0.2854,2.6702;2.761,-1.5724,1.5937)|\",3.2272702669300006\r\n\"[H]C1=NC(=S)[C@@]([H])(N(=O)=O)C([H])=C1C(F)(F)F 
|(-0.5393,2.282,0.2985;-0.3453,1.2099,0.3213;-1.3497,0.4299,0.5469;-1.1319,-0.939,0.578;-2.1,-1.9326,1.4488;0.0584,-1.4841,-0.2106;0.3018,-2.4853,0.1386;-0.4143,-1.7251,-1.6902;-0.2428,-0.815,-2.4878;-0.9423,-2.8023,-1.906;1.2257,-0.5507,-0.2099;2.206,-0.9336,-0.4744;1.0231,0.7432,0.0782;2.1304,1.761,0.094;1.8468,2.7775,-0.7468;2.2718,2.29,1.3279;3.3132,1.2332,-0.2684)|\",2.88984909231\r\n\"[H]OC(=O)[C@@]1([H])C([H])([H])C(O[H])=NC([H])([H])[C@]1([H])C1=C([H])C([H])=C([H])C([H])=C1[H] |(5.6049,0.1963,-0.2992;5.6059,-0.2286,-1.1766;4.4839,-0.9685,-1.3753;4.2909,-1.4927,-2.4449;3.487,-1.0906,-0.2144;2.8565,-1.939,-0.4882;4.1415,-1.3658,1.1516;4.9443,-2.1096,1.0591;3.3891,-1.7914,1.8281;4.688,-0.0974,1.781;5.5653,-0.2698,2.8084;5.7235,-1.2176,2.9414;4.4117,1.0995,1.4531;3.4338,1.3717,0.3996;2.7731,2.1655,0.7658;3.9784,1.8122,-0.4458;2.5877,0.1813,-0.1003;2.2423,0.4228,-1.1131;1.3493,-0.1149,0.7411;0.2764,-0.7989,0.1483;0.3401,-1.0854,-0.8997;-0.8731,-1.1051,0.8756;-1.6937,-1.6284,0.3919;-0.9724,-0.7315,2.2181;-1.8676,-0.9666,2.7871;0.0846,-0.0488,2.8197;0.0177,0.2517,3.862;1.2355,0.258,2.0882;2.0442,0.7956,2.575)|\",6.397396625255001\r\n\"[H]OC(=O)[C@@]1([H])C([H])([H])N=C(O[H])[C@]([H])(C2=C([H])C([H])=C([H])C([H])=C2[H])C1([H])[H] |(6.0376,2.181,0.0612;6.2426,2.8638,-0.5986;5.5251,2.6541,-1.7405;5.6596,3.402,-2.6753;4.5981,1.4432,-1.7461;4.0465,1.5148,-2.6882;5.393,0.1154,-1.7328;6.1391,0.127,-0.9221;5.9689,0.0253,-2.6611;4.5931,-1.0876,-1.5868;3.4032,-1.0172,-1.1574;2.713,-2.1889,-1.0504;1.7843,-1.9896,-0.8504;2.6334,0.2245,-0.7173;2.243,0.0079,0.2873;1.4302,0.5079,-1.6167;1.4413,0.2174,-2.9883;2.3094,-0.2622,-3.4327;0.3389,0.5228,-3.7888;0.3648,0.288,-4.8492;-0.7908,1.1219,-3.2311;-1.6487,1.3573,-3.8547;-0.8154,1.411,-1.8653;-1.6931,1.8717,-1.4201;0.2853,1.103,-1.0666;0.2581,1.3292,-0.0024;3.595,1.4306,-0.5818;3.0204,2.3612,-0.5367;4.1297,1.3342,0.375)|\",6.111677082229999\r\n\"[H]OC(=O)C1=C([H])C(=O)N=C([H])[C@@]1([H])C1=C([H])C([H])=C(F)C([H])=C1[H] 
|(-0.243,5.84,-0.2579;-0.1048,5.6253,-1.1957;0.6354,4.4994,-1.3138;0.8065,3.9793,-2.3912;1.2594,3.951,-0.0612;1.6391,4.7233,0.9706;1.4795,5.7986,0.9991;2.3544,4.1667,2.148;2.6028,4.8621,3.1134;2.7615,2.7994,2.1233;2.4041,2.0775,1.128;2.7215,1.0308,1.1373;1.5591,2.4754,-0.0596;2.134,2.2516,-0.9694;0.2882,1.616,-0.1142;-0.5294,1.4852,1.0168;-0.26,1.984,1.9443;-1.689,0.7131,0.9749;-2.3286,0.5987,1.8436;-2.0211,0.0743,-0.2148;-3.1396,-0.6744,-0.2633;-1.234,0.1864,-1.3551;-1.5307,-0.3239,-2.2652;-0.0767,0.9618,-1.2972;0.5366,1.0755,-2.1858)|\",4.22320695976\r\n\"[H]OC(=O)C1=C([H])C(=O)N=C([H])[C@@]1([H])C1=C([H])C([H])=C([H])C(F)=C1[H] |(-0.6975,-6.3258,0.093;-0.4724,-6.0435,-0.8094;-0.7572,-4.7315,-0.9744;-0.4212,-4.1438,-1.9753;-1.5361,-4.0535,0.1176;-2.4402,-4.6905,0.8802;-2.641,-5.7562,0.8005;-3.2792,-3.9702,1.873;-4.0162,-4.5779,2.6246;-3.2167,-2.5458,1.9233;-2.359,-1.9492,1.1836;-2.3225,-0.858,1.2495;-1.3537,-2.5634,0.2368;-1.522,-2.1281,-0.7583;0.0706,-2.1764,0.6601;0.4954,-2.3802,1.9801;-0.1862,-2.8063,2.7111;1.7908,-2.0305,2.363;2.1119,-2.1883,3.3885;2.6784,-1.4763,1.4402;3.69,-1.1951,1.7127;2.2379,-1.2853,0.1357;3.0832,-0.7523,-0.7675;0.9531,-1.6242,-0.2732;0.6641,-1.475,-1.3078)|\",4.302119976405001\r\n\"[H]OC(=O)[C@@]1([H])C([H])([H])C(O[H])=NC([H])([H])[C@]1([H])C1=C(F)C([H])=C([H])C([H])=C1[H] 
|(2.9925,0.3862,-0.2881;3.6501,0.9727,-0.7016;3.6418,2.1879,-0.0897;4.4175,3.0471,-0.4271;2.5856,2.3379,1.0063;2.6841,1.4609,1.6568;2.8085,3.6029,1.8463;3.8821,3.82,1.8941;2.4623,3.4291,2.8741;2.0788,4.8227,1.3183;2.3643,5.9873,1.9703;3.0882,5.833,2.5967;1.2093,4.8755,0.3998;0.8881,3.6638,-0.3372;-0.1679,3.7119,-0.6277;1.4595,3.6801,-1.2759;1.1543,2.3252,0.3775;1.1356,1.5482,-0.3955;0.116,1.9298,1.413;0.045,0.591,1.8078;0.9326,-0.2869,1.248;-0.866,0.1048,2.7314;-0.8632,-0.9504,2.9839;-1.7662,1.0051,3.3062;-2.4916,0.6495,4.0321;-1.7297,2.3523,2.9453;-2.4281,3.0539,3.3917;-0.798,2.8083,2.0087;-0.7743,3.8595,1.7378)|\",6.125282774755\r\n\"[H]OC(=O)C1=C([H])[C@]([H])(C2=C([H])C([H])=C(F)C([H])=C2[H])C(=O)N=C1[H] |(1.4751,2.9972,-4.019;1.7195,3.9024,-4.2705;1.6692,4.729,-3.1954;1.764,5.9245,-3.3374;1.5075,4.0998,-1.8451;1.8061,2.8212,-1.5459;2.1724,2.1179,-2.2935;1.6535,2.301,-0.1496;2.5978,1.8252,0.1443;0.562,1.2316,-0.039;0.7389,0.1601,0.8466;1.6527,0.0902,1.4291;-0.2495,-0.8093,1.0022;-0.1238,-1.6438,1.6839;-1.4209,-0.6944,0.2615;-2.3782,-1.6316,0.401;-1.632,0.3593,-0.6195;-2.5627,0.418,-1.1739;-0.6346,1.3237,-0.7615;-0.7986,2.1552,-1.4411;1.4305,3.4184,0.8969;1.6739,3.2084,2.0644;0.9266,4.6761,0.4814;1.0103,4.9774,-0.7675;0.6671,5.9661,-1.0732)|\",3.76605569092\r\n\"[H]OC(=O)C1=C([H])[C@]([H])(C2=C([H])C(F)=C([H])C([H])=C2[H])C(=O)N=C1[H] 
|(-2.6935,-3.9317,-3.8408;-2.0789,-4.636,-4.1048;-1.3678,-5.0674,-3.0326;-0.6878,-6.0624,-3.1041;-1.4564,-4.2442,-1.7839;-1.6771,-2.915,-1.7653;-1.8166,-2.3401,-2.6806;-1.673,-2.1565,-0.4733;-2.6061,-1.5807,-0.416;-0.5214,-1.1518,-0.3602;-0.6923,-0.0301,0.4607;-1.6167,0.1371,1.0033;0.3559,0.8696,0.5968;0.1833,1.9498,1.3838;1.5734,0.699,-0.0542;2.3623,1.4314,0.0797;1.7356,-0.424,-0.8634;2.6787,-0.5833,-1.3779;0.6996,-1.3486,-1.0162;0.8485,-2.2212,-1.6447;-1.7173,-3.0866,0.7618;-2.1891,-2.6927,1.8055;-1.179,-4.3916,0.6564;-1.09,-4.9087,-0.5183;-0.6874,-5.9184,-0.6054)|\",3.855853261585001\r\n\"[H]OC(=O)C1=C([H])[C@]([H])(C2=C([H])C([H])=C([H])C([H])=C2F)C(=O)N=C1[H] |(0.4658,3.58,-3.8385;1.1832,4.1014,-4.2357;1.9563,4.648,-3.2629;2.7703,5.5003,-3.5257;1.759,4.1262,-1.8744;1.3803,2.8663,-1.5864;1.1678,2.133,-2.3623;1.3275,2.3959,-0.1543;2.329,1.9844,0.0676;0.3164,1.3008,0.1002;0.6573,0.0919,0.714;1.6886,-0.0757,1.0149;-0.3038,-0.8925,0.95;-0.0167,-1.823,1.4302;-1.63,-0.6772,0.5728;-2.3822,-1.4392,0.7547;-1.9979,0.5213,-0.0417;-3.0195,0.7238,-0.3454;-1.0198,1.4795,-0.2601;-1.365,2.6444,-0.8639;1.1685,3.6009,0.8024;0.4869,3.5542,1.7988;1.9127,4.7671,0.4866;2.1537,5,-0.7561;2.6956,5.9106,-1.0147)|\",4.114361419560002\r\n\"[H]C1=NC(=O)[C@@]([H])(C2=C([H])C([H])=C(F)C([H])=C2[H])C([H])=C1[H] |(-0.0149,2.5101,-0.5395;0.1306,1.4547,-0.2943;-0.9385,0.7383,-0.1889;-0.7755,-0.6229,0.1636;-1.7067,-1.2555,0.6161;0.5952,-1.3086,-0.0521;0.7135,-1.9616,0.8228;0.5499,-2.2201,-1.2833;1.076,-1.8242,-2.5186;1.552,-0.8533,-2.62;0.9967,-2.6613,-3.6326;1.4044,-2.367,-4.5941;0.3764,-3.8962,-3.4965;0.2986,-4.7129,-4.5675;-0.1674,-4.3155,-2.2869;-0.6489,-5.2857,-2.2221;-0.0768,-3.4707,-1.1834;-0.5136,-3.7753,-0.2371;1.7387,-0.3366,-0.0697;2.7491,-0.7395,-0.0367;1.5086,0.9836,-0.1494;2.3134,1.7123,-0.1722)|\",3.989189048330001\r\n\"[H]C1=NC(=O)[C@@]([H])(C2=C(F)C([H])=C([H])C([H])=C2[H])C([H])=C1[H] 
|(5.0905,4.1555,0.4364;4.1962,3.5346,0.3349;3.8429,3.2309,-0.8697;2.7239,2.3733,-1.0281;2.581,1.741,-2.0502;1.6749,2.3324,0.1096;1.0385,3.2176,-0.0728;0.7809,1.1139,0.0348;1.3126,-0.1594,0.2466;2.6341,-0.2592,0.5206;0.5481,-1.3157,0.196;1.0248,-2.2752,0.3668;-0.8161,-1.2033,-0.0794;-1.4298,-2.0983,-0.1247;-1.3842,0.0523,-0.2979;-2.4443,0.1425,-0.5146;-0.587,1.1973,-0.2402;-1.0307,2.1742,-0.4166;2.3072,2.5473,1.4641;1.7362,2.2566,2.3427;3.5111,3.1313,1.561;3.9786,3.3412,2.5182)|\",4.296677699395001\r\n\"[H]C1=C([H])[C@]([H])(Cl)C(=O)N=C1C(=O)OC([H])([H])C([H])([H])[H] |(6.6828,1.1293,1.3314;6.6538,0.0899,1.0281;7.6116,-0.7921,1.3526;8.4849,-0.5086,1.9333;7.5129,-2.2012,0.869;7.7944,-2.9339,1.6261;8.7287,-2.4451,-0.4942;6.1086,-2.5652,0.3649;5.682,-3.6918,0.4847;5.318,-1.5501,-0.2204;5.5666,-0.3355,0.1369;4.6935,0.7845,-0.407;4.8004,1.9267,-0.0021;3.8428,0.3663,-1.341;2.9612,1.3672,-1.9169;3.5159,2.3025,-2.0246;2.7225,0.967,-2.9043;1.7128,1.5515,-1.0686;1.029,2.2467,-1.5689;1.9656,1.9664,-0.0888;1.1941,0.598,-0.9288)|\",3.78782479896\r\n\"[H]C1=NC(=O)[C@@]([H])(Cl)C([H])=C1C(=O)OC([H])([H])C([H])([H])[H] |(4.9815,-1.6486,0.333;5.8878,-1.3921,-0.2141;6.5791,-2.3652,-0.6995;7.7751,-2.0389,-1.3757;8.685,-2.832,-1.4646;7.907,-0.635,-1.9912;8.9696,-0.4033,-2.0753;7.2885,-0.6615,-3.7239;7.147,0.4095,-1.2488;7.3413,1.459,-1.4499;6.2002,0.0406,-0.3643;5.4378,1.1,0.3664;5.6503,2.2887,0.2444;4.4973,0.5661,1.169;3.6894,1.4965,1.9397;3.492,2.3764,1.3235;2.7572,0.9549,2.1141;4.3763,1.8709,3.2434;3.7168,2.5184,3.8323;5.3047,2.4152,3.0494;4.6015,0.9794,3.8372)|\",3.945650832250001\r\n\"[H]C1=NC(=O)[C@@]([H])(C(=O)OC([H])([H])C([H])([H])[H])C([H])=C1Cl 
|(8.7644,2.1079,-1.6925;8.0703,1.6391,-0.9925;7.081,2.3418,-0.5657;6.2064,1.7436,0.3741;5.5449,2.4232,1.1241;6.1053,0.2001,0.3925;5.8006,-0.0659,1.409;4.963,-0.233,-0.5538;5.1285,-0.6187,-1.6885;3.7827,-0.0978,0.0652;2.5928,-0.369,-0.727;2.7926,-1.2284,-1.3713;1.8394,-0.6368,0.0167;2.1811,0.8532,-1.5313;1.2362,0.6515,-2.0486;2.937,1.0966,-2.2833;2.0392,1.7189,-0.877;7.3916,-0.4721,0.0089;7.5178,-1.5299,0.2177;8.3286,0.2396,-0.6288;9.8534,-0.4434,-1.1455)|\",4.42457120913\r\n\"[H]C1=C([H])[C@]([H])(Cl)C(=O)N=C1C(=O)OC([H])([H])[H] |(2.2653,-2.0334,3.4711;1.8066,-1.2081,4.0018;1.2455,-1.341,5.2134;1.2057,-2.2962,5.7295;0.6035,-0.1604,5.864;0.8193,-0.0901,6.9308;-1.2257,-0.3543,5.7665;1.0052,1.1737,5.2176;1.0812,2.1834,5.881;1.305,1.1911,3.8362;1.7112,0.0846,3.3115;2.1098,0.0697,1.8456;2.5845,-0.9176,1.3187;1.8857,1.2316,1.2345;2.2476,1.2753,-0.1584;1.6841,0.5268,-0.7213;3.3172,1.0854,-0.2787;1.9933,2.2815,-0.4894)|\",3.785103660455001\r\n\"[H]C1=C([H])[C@@]([H])(Br)C(C(=O)OC([H])([H])[H])=NC1=O |(1.0648,-3.2296,-6.134;1.3476,-2.789,-5.1837;1.164,-1.4901,-4.9091;0.7174,-0.8083,-5.6274;1.5451,-0.9411,-3.5865;2.1104,-0.0097,-3.6328;-0.1397,-0.3589,-2.6222;2.2507,-1.9339,-2.7012;2.9065,-1.334,-1.4705;3.2599,-0.1721,-1.4406;3.058,-2.2149,-0.4846;3.7083,-1.7107,0.6986;3.1437,-0.8706,1.1107;4.7234,-1.3823,0.462;3.7249,-2.5479,1.3952;2.4183,-3.1775,-2.9481;1.9105,-3.7166,-4.1774;1.968,-4.9154,-4.3652)|\",3.945650832250001\r\n\"[H]C1=NC([H])=C(C(=O)OC([H])([H])[H])C(=O)[C@@]1([H])N(=O)=O 
|(-0.2344,-5.5259,0.3534;0.4254,-4.7831,0.8009;1.6149,-4.6598,0.3433;2.4203,-3.6472,0.8824;3.4557,-3.6651,0.5553;2.0057,-2.6346,1.6908;3.0171,-1.5922,2.057;4.1336,-1.5545,1.5784;2.5701,-0.7509,2.9977;3.5046,0.2612,3.415;2.9812,0.8373,4.1775;3.778,0.8969,2.5692;4.4077,-0.1982,3.8249;0.5785,-2.5784,2.0806;-0.0324,-1.6139,2.4768;-0.1008,-3.972,1.9558;0.1133,-4.4947,2.8989;-1.5909,-3.8169,1.8888;-2.0667,-3.6583,0.7669;-2.2045,-3.835,2.9457)|\",4.2096012672350005\r\n\"[H]C1=NC(=S)C(=C([H])[H])C([H])=C1Br |(5.9962,0.0427,0.0495;4.9062,0.0409,0.0509;4.3141,1.1949,0.053;2.922,1.2704,0.0547;2.2285,2.7776,0.0613;2.135,-0.0022,0.0508;0.7796,-0.0341,0.0472;0.2463,-0.9806,0.0448;0.1983,0.8815,0.0466;2.871,-1.2575,0.0504;2.3131,-2.1888,0.0501;4.2201,-1.2353,0.0505;5.2705,-2.8287,0.0498)|\",2.4952840090850006\r\n\"[H]C1=NC([H])=C(/C([H])=C(\\C#N)C(=O)C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C([H])=C1[H] |(-1.237,-5.7754,4.9288;-0.5469,-5.0092,4.5801;-0.2633,-4.0325,5.4525;0.5793,-3.0855,5.0453;0.7988,-2.2994,5.7678;1.189,-3.0373,3.7713;2.0804,-1.9165,3.5231;2.1887,-1.2282,4.3602;2.817,-1.5447,2.4378;2.8158,-2.3087,1.233;2.8017,-2.9459,0.2567;3.6404,-0.2654,2.5822;3.5911,0.3203,3.6508;4.5032,0.2787,1.4263;5.5944,-0.7517,1.0457;6.2221,-0.9961,1.9106;5.1846,-1.6802,0.6422;6.243,-0.3158,0.2771;5.1861,1.5735,1.9034;5.8009,1.9806,1.093;5.8295,1.3909,2.7689;4.4504,2.329,2.1939;3.6103,0.6058,0.2045;4.2312,1.054,-0.5798;2.8341,1.3328,0.4697;3.1287,-0.2767,-0.2222;0.8701,-4.0805,2.8774;1.2969,-4.1157,1.8829;-0.008,-5.0736,3.2925;-0.2737,-5.8905,2.6286)|\",4.21776468275\r\n\"[H]/N=C(/S[H])N1C2=C([H])\\C([H])=C(C([H])([H])[H])/C([H])=C\\2C([H])([H])C1([H])[H] 
|(6.8617,-1.8399,-0.9007;7.7389,-1.6319,-0.4202;7.7351,-0.4442,0.053;9.2557,0.1825,0.8064;9.9658,-0.8999,0.4364;6.695,0.4677,0.0972;5.3188,0.1603,0.0087;4.6714,-1.0689,0.1295;5.2174,-1.9842,0.3256;3.2738,-1.0915,0.0409;2.7637,-2.0469,0.1373;2.5164,0.0673,-0.1556;1.0078,0.015,-0.2315;0.6495,-1.0141,-0.3344;0.627,0.5885,-1.0851;0.5437,0.4359,0.6705;3.1964,1.2953,-0.2585;2.6313,2.2145,-0.4018;4.578,1.3396,-0.1719;5.5241,2.5169,-0.2645;5.6271,2.8539,-1.3048;5.2032,3.3796,0.3274;6.8565,1.9312,0.2582;7.0047,2.1776,1.3184;7.7256,2.2818,-0.3015)|\",5.1184615279050005\r\n\"[H]OC(=O)C1=C([H])S/C(=N\\C2=C([H])C([H])=C([H])C([H])=C2F)N1[H] |(7.2141,-3.3299,-1.8902;7.6638,-3.1217,-1.0551;6.8296,-2.4923,-0.1973;7.2262,-2.0814,0.8695;5.4123,-2.3288,-0.6088;4.7504,-2.7913,-1.695;5.1353,-3.3962,-2.5048;3.0612,-2.3186,-1.6897;3.2805,-1.4761,-0.0764;2.4204,-0.9056,0.6638;1.085,-0.7314,0.2908;0.6787,-0.1208,-0.9081;1.439,0.2217,-1.6041;-0.673,0.0829,-1.1909;-0.9585,0.5608,-2.1236;-1.6498,-0.3133,-0.2774;-2.7025,-0.1574,-0.4936;-1.2703,-0.9101,0.9278;-2.0007,-1.2296,1.6642;0.0761,-1.1023,1.1965;0.4362,-1.6806,2.3592;4.6134,-1.5938,0.2501;4.9735,-1.2433,1.1296)|\",3.91299717019\r\n\"[H]SC1=NN([H])C(C2C([H])=C([H])C(=O)C(OC([H])([H])[H])=C2[H])N1C([H])([H])[H] |(4.0214,-3.218,0.3216;2.7558,-2.7624,0.4054;3.2376,-1.0902,0.1167;4.4362,-0.699,-0.1955;4.3118,0.6699,-0.2923;5.0626,1.1982,-0.709;3.0395,1.1225,-0.0956;2.6122,2.4555,-0.1619;1.2497,2.8465,-0.3702;0.4779,2.0913,-0.4812;0.9101,4.16,-0.5177;-0.118,4.4513,-0.7129;1.8802,5.2484,-0.4914;1.5568,6.4337,-0.6815;3.2723,4.8225,-0.238;4.2996,5.7251,-0.2122;4.1297,6.9056,0.5837;5.0726,7.4516,0.4985;3.9648,6.6389,1.6372;3.2962,7.5065,0.2205;3.5951,3.5023,-0.0965;4.6364,3.2689,0.1143;2.3371,-0.0401,0.2042;0.986,-0.1202,0.7447;0.2388,-0.1826,-0.052;0.7964,0.768,1.3507;0.9107,-1.0063,1.3791)|\",3.46673045537\r\n\"[H]/N=C(/SC([H])([H])[H])N([H])/N=C([H])/C1=C([H])/C([H])=C2/OC([H])([H])O/C2=C\\1[H] 
|(2.707,2.2765,-0.6252;2.5399,1.3705,-0.1741;2.7048,1.4584,1.0895;2.6414,-0.0782,2.0088;1.9627,0.4467,3.6259;2.6444,1.1086,4.1631;0.9928,0.9314,3.4994;1.8296,-0.4752,4.1983;2.8228,2.667,1.8106;2.4053,3.4643,1.3458;3.8012,3.0907,2.6835;4.7734,2.3168,3.0197;4.8911,1.308,2.6179;5.7938,2.7652,3.97;6.8674,1.9088,4.2552;6.9098,0.9367,3.7715;7.8908,2.2692,5.1475;8.7198,1.6053,5.3659;7.791,3.5142,5.7351;8.6381,4.0996,6.6398;8.1029,5.4045,6.8998;8.8116,6.1647,6.5479;7.9176,5.5127,7.9741;6.8694,5.5317,6.1864;6.7247,4.374,5.4591;5.7131,4.0371,4.5925;4.8836,4.702,4.3841)|\",4.244976067800001\r\n\"[H]/N=C1/SC([H])=C(C2=C(O[H])C([H])=C(F)C([H])=C2[H])N1[H] |(-0.1644,6.0185,1.0816;0.7532,6.0618,0.6312;1.1213,4.9196,0.201;2.6766,4.6856,-0.6639;2.4112,2.949,-0.7547;3.1303,2.3068,-1.2353;1.2284,2.5725,-0.2171;0.6188,1.2389,-0.1397;1.3874,0.0498,-0.1594;2.7415,0.1594,-0.2846;3.1368,-0.7256,-0.2429;0.7762,-1.2006,-0.0509;1.3626,-2.1152,-0.0594;-0.6057,-1.2751,0.0692;-1.1771,-2.4907,0.1718;-1.4036,-0.1402,0.0783;-2.4809,-0.2311,0.1566;-0.7764,1.0989,-0.0289;-1.3957,1.9899,-0.0589;0.4949,3.6744,0.2681;-0.2243,3.5389,0.9662)|\",4.155178497135\r\n\"[H]C1=C(N/C([H])=C(\\C#N)C(=O)C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])N([H])OC1C([H])([H])[H] 
|(3.0558,1.5503,0.8681;3.3133,0.6617,0.3149;4.2242,0.5971,-0.8017;4.9203,1.4677,-1.4625;5.2684,2.7022,-1.0504;5.2418,3.4749,-1.8178;5.7737,3.1125,0.1699;5.9653,2.1508,1.2003;6.1024,1.3256,2.0148;6.1555,4.5599,0.2948;5.9507,5.3057,-0.6549;6.7928,5.1123,1.5915;7.0598,6.6156,1.3982;7.5122,7.0274,2.3078;6.1351,7.1622,1.1918;7.7395,6.7947,0.5602;5.823,4.9257,2.7826;6.2622,5.3767,3.6805;5.6234,3.8749,3.0061;4.8673,5.4285,2.5918;8.1357,4.3966,1.8708;8.6058,4.8425,2.7555;8.8269,4.52,1.0288;8.0151,3.328,2.0639;4.1582,-0.724,-1.2568;5.0199,-1.2103,-1.4873;3.4353,-1.4944,-0.3176;2.8966,-0.6025,0.5583;1.9746,-1.1872,1.5676;1.6475,-0.416,2.2676;2.4753,-1.9845,2.1276;1.0948,-1.6225,1.0799)|\",4.198716713214999\r\n\"[H]C1=C([H])C([H])=C2C(=C1[H])/N=C(/C1=C([H])C([H])=C([H])N(C([H])([H])[H])C1=O)N2C([H])([H])[H] |(6.7907,7.4503,1.5492;6.7577,6.4434,1.1421;7.7717,6.0299,0.2529;8.5638,6.7257,-0.0099;7.7776,4.753,-0.3024;8.5555,4.4464,-0.9956;6.7328,3.9003,0.0655;5.7093,4.2989,0.9582;5.7209,5.5905,1.503;4.9364,5.9018,2.1863;4.8149,3.2654,1.1465;5.266,2.2736,0.3978;4.5888,0.9675,0.3557;4.0295,0.4785,1.5151;4.1552,1.0541,2.4268;3.2967,-0.7318,1.5306;2.8684,-1.1225,2.4457;3.1427,-1.4091,0.3552;2.5985,-2.3454,0.2832;3.6651,-0.9425,-0.8159;3.4257,-1.6935,-2.0527;2.3561,-1.7129,-2.2849;3.7931,-2.7185,-1.9447;3.9618,-1.1862,-2.8524;4.3927,0.275,-0.9143;4.7879,0.6687,-2.0133;6.4356,2.5921,-0.2753;7.2934,1.7346,-1.0781;8.3341,2.0209,-0.8998;7.0564,1.8162,-2.1409;7.1617,0.6954,-0.7755)|\",3.940208555240001\r\n\"[H]C1=C(C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])ON=C1C(=O)C([H])([H])C#N 
|(1.6274,-2.6634,-0.3642;2.2997,-2.2404,0.3646;2.6956,-0.9449,0.5194;2.3938,0.3493,-0.1927;1.7558,1.3417,0.8081;1.5462,2.292,0.3046;2.4252,1.5402,1.6506;0.8123,0.9508,1.2051;1.417,0.0742,-1.3504;1.1905,1.0098,-1.8724;0.4717,-0.3447,-0.9875;1.8453,-0.622,-2.08;3.7108,0.9426,-0.7467;3.5053,1.893,-1.2519;4.1752,0.2646,-1.471;4.4307,1.1296,0.0559;3.5567,-0.8736,1.5742;3.7404,-2.1273,2.1223;2.9908,-2.9353,1.3961;2.9243,-4.3837,1.7095;2.138,-5.1159,1.1454;3.8937,-4.9178,2.7848;3.9537,-4.2127,3.621;3.4991,-5.8748,3.139;5.236,-5.1144,2.2342;6.2923,-5.2749,1.7811)|\",5.303498946245\r\n\"[H]C1=C([H])C(S(=O)(=O)N([H])[H])=C([H])C(C2=NNN(C([H])([H])[H])N2)=C1[H] |(9.3806,0.0955,-0.1551;8.5327,-0.5829,-0.1399;8.7436,-1.9576,-0.2435;9.7418,-2.3676,-0.3518;7.6367,-2.8086,-0.2255;7.8934,-4.5808,-0.3492;6.6335,-5.183,-0.7952;9.1614,-4.7997,-1.0518;8.1448,-5.0542,1.2636;7.3848,-5.6637,1.5595;9.0421,-5.5309,1.3317;6.3377,-2.3207,-0.1221;5.4934,-3.0006,-0.1364;6.1355,-0.9378,-0.0179;4.7784,-0.3931,0.0898;4.4988,0.936,0.1833;3.1965,1.039,0.2588;2.7256,-0.2064,0.2099;1.3068,-0.5115,0.2674;1.101,-1.1379,1.1385;1.0083,-1.0388,-0.6416;0.7755,0.4365,0.3477;3.6678,-1.136,0.1057;7.2404,-0.0735,-0.0246;7.0721,0.9954,0.0542)|\",5.44227701\r\n\"[H]/N=C(\\N=C1C2=C(C([H])([H])[H])/C([H])=C(C([H])([H])[H])\\N=C2/N(C([H])([H])[H])N/1[H])S[H] 
|(0.6463,1.9235,-2.0356;-0.1456,1.897,-1.3846;0.2204,2.3802,-0.2546;1.4262,2.9366,0.1095;2.5731,2.3423,0.0778;3.0777,0.9702,-0.0678;2.5322,-0.3286,-0.0239;1.0827,-0.6353,0.2256;0.9417,-1.7075,0.3881;0.4537,-0.3168,-0.6136;0.7099,-0.0984,1.1047;3.45,-1.3736,-0.1843;3.0909,-2.3978,-0.1594;4.8265,-1.1399,-0.3379;5.799,-2.2794,-0.4979;5.3051,-3.2515,-0.4172;6.5844,-2.217,0.2634;6.295,-2.2196,-1.4737;5.3578,0.0936,-0.3337;4.4801,1.0849,-0.2181;4.8701,2.4127,-0.1983;6.1208,2.8048,0.4439;6.8814,2.0906,0.1269;6.0331,2.7952,1.5382;6.4048,3.8067,0.1098;3.7193,3.1372,0.2;3.6508,4.0597,-0.2184;-0.9898,2.5338,1.0693;-2.0142,2.0328,0.3488)|\",4.111640281055\r\n\"[H]OC([H])([H])C1=C(OC([H])([H])[H])C([H])=C(F)C([H])=N1 |(3.061,5.3645,-1.3972;2.4585,5.4087,-0.6287;2.2495,4.074,-0.2402;2.5695,3.922,0.8033;1.1753,3.8298,-0.2675;2.9985,3.0994,-1.1313;2.9283,1.7021,-0.9275;2.1507,1.2937,0.108;2.0492,-0.1019,0.3659;3.029,-0.5353,0.603;1.6102,-0.6335,-0.4878;1.391,-0.1976,1.2305;3.6412,0.861,-1.7797;3.6396,-0.2179,-1.685;4.3874,1.4601,-2.7933;5.088,0.6771,-3.6359;4.416,2.8356,-2.9441;4.9967,3.3012,-3.7349;3.7228,3.6274,-2.1118)|\",5.493978641595\r\n\"[H]OC([H])([H])C1=C(F)C([H])=C(OC([H])([H])[H])C([H])=N1 |(5.3131,-1.4911,5.4274;5.7473,-0.8429,6.0163;5.5748,0.4051,5.393;5.0287,1.0936,6.0582;6.5526,0.8748,5.1996;4.8183,0.2787,4.0828;4.524,1.3773,3.2678;4.9287,2.6018,3.6622;3.8357,1.2282,2.0826;3.606,2.0785,1.4502;3.4369,-0.0677,1.7167;2.7622,-0.1796,0.5434;2.3349,-1.475,0.1377;1.8235,-1.3335,-0.8155;1.6384,-1.9122,0.8643;3.1874,-2.1515,-0.0023;3.7585,-1.127,2.5731;3.4763,-2.1497,2.346;4.4311,-0.9407,3.7187)|\",5.444998148505\r\n\"[H]C1=C(C(=O)N([H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])SC(C([H])([H])C([H])([H])C([H])([H])[H])=C1[H] 
|(7.0111,-2.8836,0.8054;6.1133,-2.5157,0.3231;6.1365,-1.3972,-0.4706;7.36,-0.591,-0.7403;8.4622,-1.0127,-0.3889;7.1725,0.5993,-1.3861;6.2221,0.9237,-1.5037;8.2401,1.574,-1.7173;7.5497,2.7372,-2.4446;8.2884,3.4913,-2.7344;6.8084,3.2276,-1.8007;7.0461,2.39,-3.3547;8.9164,2.0759,-0.4274;9.7022,2.8021,-0.6669;9.3636,1.239,0.1131;8.1859,2.5656,0.2271;9.2741,0.9145,-2.649;10.0551,1.6363,-2.9152;8.7952,0.5686,-3.572;9.7387,0.0581,-2.1574;4.5347,-1.0684,-1.1047;3.8475,-2.4417,-0.2677;2.3921,-2.792,-0.3999;2.1384,-3.4839,0.4137;1.7724,-1.8978,-0.2456;2.0155,-3.4353,-1.7512;2.27,-2.7431,-2.5643;2.6345,-4.3286,-1.9032;0.5311,-3.8039,-1.8296;0.2858,-4.2507,-2.7993;-0.1052,-2.92,-1.6993;0.2586,-4.527,-1.0511;4.8231,-3.1027,0.4366;4.6133,-3.9882,1.0285)|\",5.156557466975\r\n\"[H]O/C(=N\\[C@]([H])(/C(=N\\N([H])[H])O[H])C([H])([H])[H])OC([H])([H])C1=C([H])C([H])=C([H])C([H])=C1[H] |(4.2948,0.222,0.0491;4.7026,-0.2584,0.8021;3.762,-0.7114,1.6638;2.5037,-0.5047,1.699;1.866,0.4068,0.738;0.7874,0.2682,0.8777;2.1266,-0.0091,-0.6964;1.591,-0.9629,-1.3446;0.5802,-1.6867,-0.6898;0.4438,-2.5394,-1.2234;0.8685,-1.944,0.2566;3.147,0.6929,-1.3154;3.2577,0.2857,-2.1959;2.1813,1.8837,1.0245;1.8894,2.1115,2.0537;1.6238,2.5396,0.3475;3.2462,2.1089,0.9171;4.2809,-1.5237,2.6066;5.712,-1.6565,2.7265;5.8201,-2.5564,3.3397;6.1592,-1.847,1.7479;6.3638,-0.4673,3.3957;7.5205,0.1012,2.8546;7.9258,-0.2927,1.9256;8.1509,1.1718,3.4912;9.0497,1.604,3.0596;7.6196,1.6922,4.6716;8.1057,2.5289,5.1662;6.4579,1.1356,5.2137;6.0386,1.5392,6.1315;5.836,0.0593,4.5822;4.9296,-0.3707,4.9996)|\",5.63819898236\r\n\"[H]OC1=NC2=C(O1)C(C([H])([H])C([H])([H])O[H])=C([H])C([H])=C2[H] 
|(4.0105,-0.8221,-0.0416;3.6049,0.0615,-0.0041;2.2886,-0.127,-0.0108;1.6437,-1.247,-0.0592;0.3027,-0.8392,-0.0397;0.2522,0.5601,0.0224;1.565,1.0186,0.0394;-0.9079,1.3213,0.0638;-0.9048,2.8308,0.0769;-1.7585,3.1937,0.6604;0.0085,3.2031,0.5559;-0.9985,3.4265,-1.3422;-0.1116,3.132,-1.9257;-1.0047,4.5192,-1.2775;-2.1982,3.0709,-2.0101;-2.1972,2.1046,-2.1066;-2.0904,0.5594,0.0285;-3.0422,1.0822,0.0726;-2.0776,-0.8436,-0.0372;-3.0245,-1.3759,-0.0555;-0.8853,-1.5696,-0.0726;-0.8744,-2.6534,-0.1207)|\",5.78514046163\r\n\"[H]C1=C([H])C([H])=C(OC([H])([H])C(=O)C([H])([H])N([H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])C([H])=C1[H] |(8.8277,-0.0701,6.0271;8.5355,-0.348,5.0188;8.9738,0.4031,3.9228;9.612,1.2693,4.0757;8.6023,0.0495,2.6302;8.9368,0.6175,1.7677;7.7817,-1.0669,2.4202;7.4729,-1.3295,1.1058;6.7443,-2.5074,0.8021;6.883,-2.6612,-0.2751;7.1404,-3.3881,1.3205;5.2355,-2.4119,1.0542;4.593,-3.4249,1.2591;4.597,-1.035,0.995;5.0493,-0.4768,0.1626;4.9422,-0.5147,1.9097;3.1595,-1.1392,0.858;2.8673,-2.039,1.2374;2.3237,-0.0546,1.4119;2.4504,0.0841,2.9465;1.8098,0.8902,3.3235;3.4768,0.314,3.2548;2.1479,-0.8454,3.4448;0.8711,-0.4153,1.0562;0.1756,0.3471,1.4244;0.5841,-1.3733,1.5088;0.7547,-0.5063,-0.0284;2.7034,1.2698,0.7288;2.0257,2.0716,1.0427;2.64,1.1683,-0.36;3.7226,1.5817,0.9854;7.3336,-1.8236,3.5085;6.6903,-2.6857,3.3704;7.7186,-1.4549,4.8022;7.3672,-2.0463,5.6434)|\",5.0177794032200005\r\n\"[H]OC(=O)C1=C(N(=O)=O)C([H])=C2OC([H])([H])OC2=C1C([H])([H])[H] 
|(3.2499,3.4206,-0.6357;2.3626,3.4242,-0.2171;2.2612,2.3534,0.5783;1.4343,2.2981,1.4582;3.1243,1.1299,0.2716;4.5336,1.0802,0.2692;5.3723,2.2796,0.3706;6.4695,2.1704,0.9056;4.9661,3.3416,-0.129;5.2718,-0.1157,0.1637;6.3525,-0.1041,0.1963;4.5394,-1.264,0.0051;4.9892,-2.5427,-0.1538;3.8202,-3.3619,-0.3013;3.7919,-4.1054,0.5017;3.8319,-3.842,-1.2858;2.6715,-2.5074,-0.2003;3.1463,-1.2383,-0.0227;2.3941,-0.0851,0.1272;0.887,-0.1771,0.1287;0.4327,0.6231,-0.4619;0.4944,-0.0837,1.1452;0.5718,-1.1382,-0.2845)|\",3.82047846102\r\n\"[H]OC1=NC([H])([H])[C@@]2([H])C([H])([H])N(C(=O)OC(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H])[C@]1([H])C2([H])[H] |(6.6314,-1.2314,2.3867;7.0439,-2.0338,2.0098;7.1398,-1.8286,0.6727;7.0925,-2.8151,-0.1262;7.1577,-2.5443,-1.5635;8.0691,-3.0163,-1.9589;6.3176,-3.0748,-2.0323;7.1423,-1.0544,-1.9873;7.4027,-0.9721,-3.0477;5.7943,-0.3523,-1.6866;4.9327,-1.0174,-1.8009;5.6383,0.5207,-2.3311;5.9508,0.0735,-0.2779;4.9433,0.2594,0.6224;5.121,0.2038,1.8404;3.78,0.526,-0.0006;2.5435,0.7944,0.761;2.7313,2.0373,1.6372;1.7772,2.3011,2.107;3.0561,2.8871,1.0268;3.4714,1.8598,2.4185;2.152,-0.4435,1.5727;1.1782,-0.2774,2.047;2.8881,-0.6561,2.3496;2.0651,-1.317,0.9172;1.5263,1.0651,-0.3502;0.5475,1.2917,0.0849;1.4211,0.1933,-1.0043;1.8412,1.9181,-0.9596;7.2619,-0.3662,0.2223;7.6012,0.2806,1.0331;8.108,-0.3082,-1.0505;8.2738,0.7301,-1.3577;9.0748,-0.8079,-0.939)|\",6.854547894095001\r\n\"[H]C1=NC(C(=O)OC([H])([H])C([H])([H])[H])=C(C2([H])OC([H])([H])C([H])([H])O2)C([H])=C1[H] 
|(5.332,-2.6477,-1.0578;5.7164,-1.6306,-1.1109;4.9795,-0.6936,-0.5082;5.4079,0.5742,-0.558;4.4867,1.5981,0.0591;4.3028,2.7034,-0.4142;3.8898,1.1357,1.1667;2.9382,2.0215,1.8044;2.9327,1.698,2.848;3.3129,3.0461,1.7417;1.5592,1.895,1.1741;0.8415,2.5147,1.7242;1.2142,0.8566,1.2031;1.5777,2.233,0.1341;6.604,0.9684,-1.1916;7.0854,2.4199,-1.1731;6.3574,3.0761,-1.6533;7.2558,2.8889,0.1582;8.628,2.6743,0.4765;8.9004,3.3595,1.2824;8.8048,1.6384,0.8027;9.3136,2.9682,-0.8571;10.2439,2.4112,-1.0071;9.5101,4.0412,-0.9819;8.3504,2.533,-1.8207;7.3547,-0.0325,-1.8104;8.2751,0.2394,-2.3144;6.9082,-1.352,-1.7773;7.4722,-2.1473,-2.2559)|\",5.6164298743200005\r\n\"[H]N1C(Cl)=NC(C(=O)OC([H])([H])[H])=C(Cl)/C1=N/C([H])([H])[H] |(1.129,2.3593,-0.1592;2.1339,2.259,-0.0986;2.9141,3.3743,-0.0979;2.0223,4.8789,-0.2036;4.1882,3.3852,-0.0279;4.8281,2.1532,0.0675;6.3282,2.2389,0.1883;7.0101,1.5066,0.8699;6.8085,3.2497,-0.5549;8.2322,3.4366,-0.4784;8.7554,2.5296,-0.7919;8.5292,3.6837,0.5443;8.4514,4.2637,-1.1534;4.145,0.9691,0.0716;4.9051,-0.5789,0.148;2.6773,0.951,-0.0301;1.9871,-0.1217,-0.0493;0.5434,-0.027,-0.1498;0.12,-1.034,-0.1776;0.2135,0.4927,-1.0661;0.0932,0.4939,0.7126)|\",4.277629729859999\r\n\"[H]C(=O)/C1=C(\\[H])C2=C(C([H])=C(Br)C([H])=C2[H])N1C([H])([H])[H] |(3.5043,3.1774,0.0235;2.7156,2.3958,0.0098;1.531,2.6975,0.0291;3.2556,1.0461,-0.0318;4.5989,0.7087,-0.0514;5.4196,1.4138,-0.0394;4.6806,-0.7078,-0.0882;3.3365,-1.1927,-0.0896;3.0496,-2.568,-0.1211;2.0363,-2.9516,-0.1218;4.1332,-3.4281,-0.1514;3.8036,-5.3204,-0.1941;5.4744,-2.9849,-0.1521;6.28,-3.7097,-0.1772;5.7468,-1.629,-0.1207;6.7753,-1.2791,-0.1216;2.4835,-0.1142,-0.0548;1.026,-0.1719,-0.0508;0.6166,0.3581,-0.9133;0.6229,0.2867,0.8549;0.7206,-1.2178,-0.0921)|\",4.13068825059\r\n\"[H]OC1=NC([H])=C(C([H])=O)C([H])=C1C([H])([H])[H] 
|(1.5943,-2.3255,-0.0627;2.5586,-2.424,-0.0881;3.1418,-1.2017,-0.0815;4.4727,-1.2392,-0.1128;5.124,-0.078,-0.1077;6.2102,-0.1216,-0.1337;4.4853,1.169,-0.0715;5.2571,2.424,-0.0672;4.6351,3.3477,-0.0341;6.4717,2.4945,-0.0962;3.0834,1.1763,-0.0396;2.5493,2.1251,-0.0112;2.3693,-0.014,-0.044;0.8614,-0.0603,-0.0101;0.442,0.9496,0.0062;0.4846,-0.5792,0.8827;0.444,-0.5671,-0.8918)|\",5.069481034815\r\n\"[H]C1=C2N=C(C([H])([H])C([H])([H])[H])N(C([H])([H])[H])C2=C([H])C([H])=C1C(=O)OC([H])([H])[H] |(2.7937,4.7378,1.7899;3.2017,4.0077,2.4785;3.2273,2.6517,2.1493;2.7863,2.0058,1.004;3.0417,0.734,1.2023;2.7626,-0.3389,0.1925;2.3485,-1.2303,0.681;1.9831,0.0529,-0.4665;3.9974,-0.7233,-0.6449;3.7303,-1.4739,-1.3969;4.7995,-1.1401,-0.0253;4.395,0.1547,-1.1632;3.6402,0.4873,2.4358;4.0487,-0.7815,3.0096;5.1228,-0.7774,3.2269;3.8401,-1.5882,2.3068;3.5032,-0.9779,3.9397;3.7645,1.7126,3.0618;4.2811,2.0844,4.3069;4.6913,1.3587,5.0031;4.2488,3.4368,4.6188;4.6349,3.7941,5.5672;3.7181,4.3926,3.7206;3.7354,5.8129,4.1597;4.169,6.2039,5.2277;3.2089,6.6522,3.234;3.1979,8.0372,3.6033;2.7452,8.5616,2.7608;4.2151,8.3982,3.7786;2.6088,8.1899,4.5119)|\",5.276287561195\r\n\"[H]O/C(=N/C1=C(N([H])C([H])([H])[H])C([H])=C([H])C(C(=O)OC([H])([H])[H])=C1[H])C([H])([H])C([H])([H])[H] 
|(4.4291,-2.4193,-1.5395;3.8004,-1.7492,-1.2314;2.5296,-2.1754,-1.48;1.6071,-1.3674,-1.1389;0.2337,-1.6329,-1.2471;-0.5681,-0.5538,-1.7384;0.0773,0.5815,-2.161;1.0398,0.6245,-1.8443;-0.6102,1.8339,-2.3943;0.1319,2.6017,-2.6266;-1.2005,2.1697,-1.5269;-1.2874,1.7538,-3.2535;-1.9648,-0.7121,-1.7896;-2.5833,0.0955,-2.167;-2.5647,-1.8902,-1.3576;-3.6429,-1.995,-1.404;-1.7888,-2.9389,-0.8484;-2.366,-4.2037,-0.3503;-1.7223,-5.1425,0.087;-3.7238,-4.2274,-0.4291;-4.3385,-5.4329,0.0391;-5.4114,-5.2863,-0.0946;-4.1046,-5.6039,1.0937;-3.9953,-6.2948,-0.5405;-0.3914,-2.7882,-0.7928;0.1923,-3.5895,-0.3527;2.4036,-3.5338,-2.1478;2.5727,-4.318,-1.3965;1.371,-3.6504,-2.4844;3.3616,-3.742,-3.3346;3.1425,-4.6967,-3.8225;4.4177,-3.7819,-3.036;3.2526,-2.9478,-4.0803)|\",4.525253333815\r\n\"[H]C1=NC(Cl)=C(C([H])([H])[H])C([H])=C1C([H])(F)F |(6.2322,-0.1171,-0.2101;5.1517,-0.0401,-0.1283;4.4698,-1.1905,-0.1784;3.1569,-1.1336,-0.0672;2.3131,-2.683,-0.1447;2.3988,0.04,0.1064;0.9002,0.0357,0.2419;0.5207,1.0542,0.3638;0.5823,-0.5573,1.1068;0.4234,-0.4087,-0.6388;3.139,1.2222,0.1506;2.626,2.1689,0.2913;4.5305,1.1973,0.0317;5.3322,2.472,0.0325;5.4786,2.896,-0.9681;4.7012,3.4234,0.7887;6.5726,2.2509,0.5656)|\",5.978341295485\r\n\"[H]/N=C(/O[H])C1=C([H])C(Cl)=NC([H])=C1N(=O)=O |(-2.7313,0.2721,-2.3194;-3.4264,-0.1939,-1.7351;-2.9128,-0.6121,-0.6512;-3.704,-1.1757,0.2797;-3.1724,-1.8183,0.7886;-1.4595,-0.3753,-0.284;-1.0226,0.9529,-0.3274;-1.7238,1.7471,-0.5541;0.3191,1.2445,-0.0709;0.8355,2.9123,-0.1094;1.2484,0.3383,0.1974;0.8557,-0.9336,0.2288;1.6129,-1.6821,0.4387;-0.4683,-1.33,0.0186;-0.7377,-2.7645,0.1399;0.1806,-3.5373,-0.1011;-1.8649,-3.1202,0.5088)|\",4.34293705398\r\n\"[H]C1=C2/N=C(/C([H])([H])[H])C(=O)N(C([H])([H])[H])C2=C([H])C([H])=C1C(=O)OC([H])([H])[H] 
|(4.5217,3.0285,-0.0693;5.0688,2.0937,-0.0952;4.336,0.9033,0.0065;2.9585,0.9898,0.135;2.269,-0.101,0.2313;0.7773,-0.0428,0.3727;0.2909,-0.5881,-0.4443;0.4487,0.998,0.3723;0.4588,-0.5337,1.2997;2.8681,-1.4634,0.2108;2.1896,-2.4817,0.3013;4.2603,-1.5021,0.08;4.9022,-2.8166,0.0515;5.6061,-2.9152,0.8839;5.4373,-2.9588,-0.8927;4.1141,-3.5613,0.1448;5.0153,-0.3419,-0.0225;6.4176,-0.3613,-0.1526;6.9547,-1.3018,-0.1764;7.1194,0.8287,-0.2509;8.1996,0.824,-0.3512;6.455,2.0689,-0.2235;7.2776,3.3031,-0.3343;8.4886,3.3126,-0.4472;6.5291,4.4287,-0.2959;7.2646,5.6575,-0.3976;6.5163,6.4494,-0.3541;7.8148,5.6989,-1.3413;7.9726,5.7509,0.4303)|\",4.443619178664999\r\n\"[H]OC(=O)[C@]([H])(/N=C(/O[H])C([H])([H])[H])C([H])([H])C1=C([H])C([H])=C([H])C(O[H])=C1[H] |(1.7868,-1.1069,2.5869;1.267,-0.924,3.408;1.1917,0.4115,3.4927;0.6146,0.9873,4.3856;1.9385,1.1699,2.3706;1.2399,1.9328,2.0022;2.3482,0.2097,1.3671;2.344,0.4405,0.1179;2.7374,-0.5956,-0.6738;2.7254,-0.3157,-1.602;1.9552,1.6943,-0.6292;1.5942,2.4795,0.0333;1.1729,1.4643,-1.3638;2.8244,2.0905,-1.1671;3.1584,1.8922,3.0141;2.7834,2.4098,3.9029;3.8665,1.125,3.3469;3.8387,2.8697,2.0843;3.3596,4.1851,1.9718;2.5211,4.5064,2.5846;3.9639,5.0796,1.0913;3.5944,6.0987,1.0133;5.0509,4.6837,0.3099;5.5253,5.3867,-0.3725;5.5351,3.3766,0.4229;6.5989,2.9181,-0.3064;6.9555,3.6477,-0.8367;4.9292,2.4741,1.3045;5.3293,1.4672,1.3767)|\",5.845005508740001\r\n\"[H]C1=C([H])C(N(=O)=O)=C(N(=O)=O)C([H])=C1C(=O)OC([H])([H])C([H])([H])[H] 
|(4.4988,1.8887,-0.0742;5.5116,1.5042,-0.0732;6.5938,2.3826,-0.0893;6.4498,3.4567,-0.0968;7.8904,1.8808,-0.0808;8.9916,2.8562,0.0311;8.8797,3.8915,-0.6183;9.8989,2.5816,0.8074;8.1108,0.4992,-0.0719;9.4609,-0.0821,-0.2017;9.6848,-1.1073,0.433;10.2313,0.4751,-0.9755;7.037,-0.3776,-0.0448;7.2075,-1.4477,-0.0377;5.7299,0.1228,-0.0443;4.61,-0.8732,-0.0232;4.7871,-2.0741,-0.0189;3.4057,-0.2746,-0.0142;2.2432,-1.1469,0.0146;1.4505,-0.5447,-0.4345;2.4459,-2.0141,-0.6178;1.8997,-1.5618,1.4364;0.9753,-2.1503,1.4334;1.7461,-0.6846,2.0731;2.6958,-2.1773,1.8641)|\",4.781040353285\r\n\"[H]OC([H])([H])C1=C(C(F)(F)F)C([H])=C([H])N=C1Cl |(2.4204,-2.1692,1.8596;2.9937,-1.5307,1.4065;2.4974,-1.4047,0.0812;3.2378,-0.8189,-0.4638;2.4141,-2.3776,-0.414;1.157,-0.6845,0.0448;-0.0736,-1.3618,0.102;-0.1247,-2.8741,0.156;-1.3758,-3.3289,0.3451;0.3406,-3.4399,-0.9781;0.6294,-3.3651,1.1789;-1.2754,-0.6523,0.1088;-2.2271,-1.1668,0.1522;-1.2175,0.736,0.062;-2.126,1.3336,0.0605;-0.0669,1.4143,0.0209;1.0583,0.7214,0.0211;2.5242,1.6966,-0.0142)|\",5.545680273190001\r\n\"[H]OC(OC([H])([H])C([H])([H])[H])C1C(=O)N=C(O[H])C(C#N)=C1[H] |(1.7737,-0.9367,1.643;2.3981,-0.464,1.0625;3.6011,-0.4835,1.6506;3.5179,-0.9702,2.8671;4.6109,-1.628,3.6059;5.41,-0.8982,3.7247;4.1345,-1.8376,4.5647;5.0607,-2.8999,2.9107;5.8143,-3.3898,3.5369;5.5213,-2.6905,1.9409;4.2292,-3.5985,2.7678;4.6931,-0.0075,0.9138;5.9586,0.4892,1.529;6.0748,0.6548,2.7438;7.0084,0.8095,0.6791;6.8315,0.8633,-0.6006;7.9035,1.1895,-1.3449;7.6632,1.208,-2.2888;5.565,0.5777,-1.2697;5.4758,0.7026,-2.6835;5.4925,0.8304,-3.8429;4.5247,0.1382,-0.4893;3.5805,-0.1271,-0.9517)|\",3.975583355805\r\n\"[H]C1=C(Cl)N=C(OC([H])([H])[H])C([H])=C1C([H])(F)F 
|(4.6642,3.3945,3.1347;4.1601,2.7516,2.4257;4.3413,1.375,2.4355;5.4092,0.669,3.646;3.7721,0.5251,1.5923;2.9661,1.0333,0.6616;2.3722,0.2097,-0.2199;2.6504,-1.1919,-0.1042;2.0776,-1.6618,-0.9048;2.3324,-1.5742,0.8699;3.7186,-1.3895,-0.2302;2.6931,2.4069,0.5383;2.035,2.7639,-0.2437;3.3057,3.2644,1.4393;3.0104,4.7453,1.4015;2.1694,5.023,2.0479;4.1012,5.4571,1.8169;2.7007,5.1388,0.1303)|\",5.423229040464999\r\n\"[H]C1=NC(Cl)=C(C([H])(F)F)C(Cl)=N1 |(-1.1934,2.0576,-0.0009;-0.6441,1.1199,-0.0016;-1.3639,0.0037,0.1146;-0.6774,-1.1282,0.1131;-1.6451,-2.5849,0.2714;0.7178,-1.2046,-0.0032;1.4782,-2.5123,-0.0026;0.8174,-3.3738,0.0985;2.1781,-2.6453,-1.1666;2.3691,-2.5333,1.0308;1.3408,0.0531,-0.1188;3.0712,0.1773,-0.2749;0.6823,1.2023,-0.1186)|\",6.008273819039999\r\n\"[H]C1=C([H])C(Cl)=C(C(=O)N(OC([H])([H])[H])C([H])([H])[H])C(Cl)=N1 |(8.3704,0.5975,-2.9105;7.5541,0.3286,-2.2443;7.5313,0.7928,-0.9343;8.3199,1.4259,-0.545;6.4557,0.4105,-0.1321;6.3889,0.976,1.5232;5.442,-0.4025,-0.6426;4.327,-0.9457,0.226;4.5043,-1.9145,0.9462;3.1426,-0.2491,0.1918;2.9665,0.631,-0.8897;2.9249,1.9877,-0.4316;2.1053,2.1433,0.28;2.7458,2.5815,-1.3309;3.8737,2.2803,0.0303;1.9042,-0.8362,0.6766;2.1654,-1.5518,1.4563;1.3776,-1.3481,-0.1373;1.2619,-0.0555,1.0934;5.6033,-0.8041,-1.9805;4.3882,-1.8555,-2.6927;6.605,-0.4602,-2.7658)|\",5.621872151330001\r\n\"[H]O[C@@]([H])(C1=C([H])N(C([H])([H])C2=C([H])C([H])=C([H])C([H])=C2[H])N=N1)C([H])([H])[H] 
|(2.8057,-0.5692,1.5633;2.1143,-0.9172,0.9743;2.3315,-0.3022,-0.2936;1.3995,-0.4412,-0.8534;2.5847,1.1734,-0.1154;2.0683,2.2883,-0.7375;1.3343,2.4203,-1.5178;2.6969,3.3344,-0.1388;2.5059,4.7698,-0.3369;3.327,5.2379,0.2142;2.6434,4.9988,-1.3988;1.1583,5.2683,0.1491;0.3561,6.0621,-0.6761;0.69,6.3038,-1.6829;-0.8716,6.5458,-0.2175;-1.4874,7.1598,-0.8691;-1.3086,6.2311,1.0689;-2.2658,6.6017,1.4255;-0.5117,5.4364,1.8977;-0.8471,5.1857,2.9003;0.717,4.9617,1.4429;1.3367,4.3441,2.088;3.5445,2.8969,0.8187;3.4806,1.5933,0.8262;3.4793,-0.989,-1.0466;4.4251,-0.8335,-0.5156;3.5878,-0.5869,-2.0611;3.2849,-2.0642,-1.1107)|\",6.408281179275001\r\n\"[H]O/C(=N/C([H])([H])[C@]([H])(O[H])C([H])([H])O[H])N([H])C1=NC(C([H])([H])[H])=C([H])C(C([H])([H])[H])=N1 |(0.2807,-2.5039,0.4051;1.163,-2.1,0.3258;1.1747,-1.0695,1.217;0.1215,-0.7699,1.8582;0.1177,0.1651,2.9733;0.9477,-0.0543,3.6617;0.2523,1.1936,2.6197;-1.2174,0.0271,3.7156;-1.1431,0.5053,4.6991;-1.5101,-1.3506,3.9738;-1.2605,-1.8038,3.1458;-2.3989,0.6448,2.9581;-2.4029,0.2648,1.9223;-2.299,1.7361,2.9138;-3.6154,0.3547,3.6232;-3.5148,-0.572,3.9075;2.4743,-0.5661,1.3014;3.2004,-1.2227,1.0401;2.8992,0.7564,1.2277;4.2349,0.8897,1.2147;4.7033,2.1403,1.1225;6.2001,2.2992,1.0987;6.6411,1.8519,1.9962;6.4958,3.3504,1.0445;6.6217,1.769,0.2373;3.8355,3.2359,1.0498;4.2166,4.2485,0.9753;2.4637,2.9814,1.0659;1.4409,4.0824,0.9863;0.8504,4.1177,1.9095;0.741,3.8883,0.1667;1.9075,5.0594,0.8339;1.9863,1.7277,1.1501)|\",5.16472088249\r\n\"[H]C1=C([H])C([H])=C(C2=C([H])C(C([H])([H])[H])=C(N([H])[H])C(Cl)=C2[H])O1 
|(4.8295,5.7587,0.3169;5.308,4.801,0.1809;6.5986,4.4053,0.0097;7.4699,5.0445,-0.0252;6.5653,2.9809,-0.1152;7.4122,2.3268,-0.2665;5.2504,2.6027,-0.0108;4.5623,1.3192,-0.0544;3.1634,1.2343,0.0498;2.59,2.1484,0.1645;2.4867,0.0196,0.0058;0.9814,-0.0317,0.1029;0.5608,0.9753,0.1711;0.6427,-0.5849,0.9918;0.5386,-0.5246,-0.7726;3.2154,-1.1859,-0.1447;2.556,-2.4038,-0.2517;1.6667,-2.4621,0.226;3.1337,-3.2213,-0.1024;4.6154,-1.0883,-0.2452;5.5666,-2.5693,-0.4208;5.2835,0.1262,-0.2039;6.3649,0.1348,-0.2833;4.4753,3.7208,0.1716)|\",4.498041948765\r\n\"[H]C1=NC(C(=O)OC([H])([H])C([H])([H])[H])=C(Cl)C(C#N)=C1[H] |(5.6355,-3.1526,0.0573;6.1918,-2.222,-0.0256;5.4761,-1.0978,0.0495;6.1053,0.0802,-0.0324;5.2322,1.3015,0.158;5.5254,2.2232,0.8871;4.1038,1.1982,-0.5557;3.1302,2.2634,-0.3879;3.6612,3.2142,-0.299;2.5632,2.2485,-1.3212;2.2397,2.0037,0.8161;1.4605,2.7726,0.8701;2.8199,2.0386,1.7424;1.755,1.0256,0.7369;7.4957,0.179,-0.2171;8.3037,1.7073,-0.3778;8.2424,-1.0102,-0.3103;9.6615,-0.9963,-0.5129;10.8113,-1.0358,-0.6777;7.5714,-2.2356,-0.2034;8.1245,-3.1663,-0.2658)|\",5.058596480795\r\n\"[H]C1=C([H])C(C([H])([H])N2N=NC(C(=O)C([H])([H])[H])=C2[H])=C([H])C([H])=C1Cl |(0.486,6.5075,-1.9161;1.5402,6.255,-1.8798;1.9803,4.9845,-2.2419;1.2584,4.2362,-2.5584;3.342,4.658,-2.1948;3.8143,3.2785,-2.6099;4.8991,3.1908,-2.5009;3.5616,3.0688,-3.6531;3.1884,2.2,-1.8439;2.153,1.4991,-2.393;1.7487,0.643,-1.5042;2.5071,0.7746,-0.3726;2.336,-0.045,0.8478;3.0677,0.1374,1.8098;1.2385,-1.0863,0.8352;1.3824,-1.7854,0.0037;0.264,-0.6107,0.6749;1.2411,-1.6238,1.7851;3.4332,1.7808,-0.5868;4.202,2.1956,0.046;4.2578,5.6279,-1.7787;5.3181,5.3902,-1.7357;3.8333,6.9077,-1.4173;4.5466,7.659,-1.0959;2.4746,7.2088,-1.4714;1.9261,8.8142,-1.0163)|\",5.600103043290002\r\n\"[H]C1=C([H])C(C([H])([H])N2N=NC(C(=O)C([H])([H])[H])=C2[H])=C([H])C([H])=C1F 
|(-1.6241,4.793,-1.307;-0.8571,4.9017,-0.5473;0.4667,4.5454,-0.7884;0.7467,4.1408,-1.7575;1.4419,4.6933,0.2084;2.8805,4.3086,-0.0694;3.5189,4.5506,0.7856;3.2748,4.8327,-0.9442;3.0479,2.8869,-0.3816;3.3015,2.5079,-1.6684;3.369,1.2112,-1.6857;3.159,0.7314,-0.4212;3.1722,-0.7034,-0.06;2.9674,-1.0425,1.0964;3.4502,-1.6943,-1.1692;4.4392,-1.5088,-1.6041;2.7267,-1.5748,-1.9835;3.4028,-2.7076,-0.7664;2.9472,1.8093,0.4211;2.7408,1.8636,1.4784;1.0681,5.2069,1.454;1.8144,5.3265,2.2359;-0.2539,5.5738,1.7115;-0.5584,5.9742,2.6726;-1.1949,5.4128,0.703;-2.4745,5.7618,0.9399)|\",5.60010304329\r\n\"[H]O[C@]([H])(C1=C([H])N(C([H])([H])C2=C([H])C([H])=C(OC([H])([H])[H])C([H])=C2[H])N=N1)C([H])([H])[H] |(1.4197,1.2154,-0.8365;2.1463,1.6539,-1.313;3.0606,2.0969,-0.3161;4.0387,2.1642,-0.808;3.1377,1.0852,0.8007;4.1754,0.6454,1.5932;5.2254,0.8915,1.6348;3.6049,-0.2646,2.426;4.209,-1.0358,3.515;3.4602,-1.7923,3.7657;5.0909,-1.5526,3.123;4.5773,-0.1909,4.7162;3.597,0.5213,5.4136;2.5636,0.4776,5.0781;3.9196,1.2957,6.5279;3.1347,1.8359,7.0445;5.2499,1.3604,6.9662;5.6772,2.0789,8.0425;4.7203,2.8282,8.7766;5.277,3.3223,9.5747;3.9525,2.1795,9.2182;4.235,3.5871,8.1491;6.2409,0.649,6.2757;7.2657,0.7104,6.6277;5.9026,-0.1129,5.1633;6.6828,-0.6598,4.6381;2.2849,-0.3839,2.1723;2.0049,0.4326,1.1914;2.669,3.4877,0.2054;3.4115,3.8713,0.9155;1.7,3.4399,0.7146;2.5887,4.185,-0.6345)|\",5.804188431165\r\n\"[H]O[C@]([H])(C1=C([H])N(C([H])([H])C2=C([H])C([H])=C(F)C([H])=C2[H])N=N1)C([H])([H])[H] 
|(3.9541,-1.4224,-0.5007;3.0381,-1.7087,-0.342;2.3739,-0.5975,0.2537;1.3029,-0.7686,0.0927;2.7766,0.6794,-0.4398;2.054,1.7651,-0.8832;1.0026,2.0077,-0.8596;2.9846,2.5943,-1.4261;2.8131,3.9139,-2.0316;3.7742,4.1303,-2.5065;2.0576,3.8397,-2.82;2.4285,4.9897,-1.0355;1.2513,5.7244,-1.2108;0.6035,5.5115,-2.058;0.8934,6.735,-0.3168;-0.0158,7.3135,-0.4415;1.7329,6.9929,0.7587;1.4008,7.9642,1.6327;2.9098,6.2785,0.966;3.5337,6.517,1.821;3.2533,5.2772,0.0616;4.1681,4.7093,0.2103;4.2201,2.0568,-1.3254;4.0922,0.9028,-0.729;2.6514,-0.5375,1.7622;2.0726,0.2601,2.2433;3.7147,-0.3451,1.9437;2.3859,-1.4947,2.2218)|\",6.23140717645\r\n\"[H]O[C@]([H])(C1=C([H])N(C([H])([H])C2=C([H])C([H])=C(Cl)C([H])=C2[H])N=N1)C([H])([H])[H] |(0.4522,-1.6259,0.9907;-0.0218,-1.6005,0.1415;0.9308,-1.1848,-0.8322;0.5299,-1.5089,-1.7998;2.2518,-1.871,-0.5905;3.1665,-2.4557,-1.4384;3.1976,-2.5854,-2.5095;4.1541,-2.9024,-0.6177;5.4116,-3.5666,-0.9526;5.7604,-4.0027,-0.0121;5.1974,-4.3866,-1.6449;6.4473,-2.6277,-1.5408;6.828,-1.4698,-0.8498;6.3609,-1.2348,0.103;7.7929,-0.6131,-1.3732;8.0868,0.2836,-0.8383;8.3853,-0.9221,-2.5992;9.6048,0.1542,-3.2643;8.0245,-2.0678,-3.3039;8.4917,-2.2928,-4.2566;7.0516,-2.9135,-2.7677;6.7659,-3.8073,-3.3176;3.8732,-2.6098,0.6712;2.7259,-1.9875,0.6846;1.0787,0.3437,-0.8391;1.7445,0.6766,-1.6444;1.4955,0.6875,0.1142;0.0966,0.8062,-0.9785)|\",6.0953502512\r\n\"[H]C(=O)C1=C(C([H])([H])[H])C([H])=C(OC(=O)C([H])([H])[H])C([H])=C1C([H])([H])[H] 
|(1.0155,-2.7895,-0.1884;2.1207,-2.8656,-0.1967;2.6354,-3.9726,-0.2113;2.832,-1.5704,-0.189;4.2517,-1.5085,-0.1952;5.1251,-2.7404,-0.213;6.1808,-2.4522,-0.2172;4.9231,-3.3651,-1.0881;4.9368,-3.3784,0.6556;4.8819,-0.2609,-0.1867;5.9623,-0.2047,-0.1835;4.1199,0.9039,-0.1683;4.6656,2.1811,-0.2342;5.8017,2.5296,0.4588;6.4111,1.7895,1.1878;6.1387,3.9709,0.1682;7.0586,4.2383,0.6888;5.3218,4.6207,0.4991;6.2589,4.1209,-0.9095;2.728,0.8526,-0.1675;2.1706,1.784,-0.1642;2.0672,-0.3734,-0.1775;0.5502,-0.3614,-0.1727;0.1841,0.6692,-0.1663;0.1347,-0.8626,0.7088;0.128,-0.8541,-1.0558)|\",4.968798910129999\r\n\"[H]C([H])=C(C1=C([H])C([H])=C([H])C([H])=C1[H])C1=C([H])C(N([H])[H])=C([H])C([H])=C1[H] |(0.1102,0.8057,-0.3156;0.8107,0.4341,0.4262;0.4243,0.2714,1.4276;2.0986,0.1898,0.1208;2.6269,0.4921,-1.2403;2.2105,1.6384,-1.9383;1.5408,2.3405,-1.4499;2.6647,1.8964,-3.231;2.3331,2.7926,-3.749;3.5519,1.0157,-3.8525;3.9102,1.218,-4.8583;3.9847,-0.1213,-3.167;4.6766,-0.8127,-3.6412;3.5324,-0.377,-1.8734;3.8731,-1.2642,-1.3481;3.0251,-0.3894,1.1365;2.5788,-1.3897,2.0107;1.5689,-1.7776,1.9013;3.4132,-1.9173,3.0078;2.9304,-2.8772,3.9048;3.6409,-3.477,4.3061;2.1458,-3.4184,3.5624;4.7262,-1.4303,3.1172;5.3865,-1.8304,3.8836;5.1784,-0.4443,2.2436;6.1969,-0.0763,2.3385;4.3461,0.0758,1.2537;4.709,0.8478,0.5834)|\",4.658589120559999\r\n\"[H]OC([H])([H])C1=C(OC([H])([H])[H])C([H])=C([H])C(OC([H])([H])[H])=C1N(=O)=O 
|(0.3505,-4.1238,1.1071;0.085,-3.1919,1.1973;0.4038,-2.5907,-0.0466;-0.0944,-3.1025,-0.882;0.0223,-1.5711,-0.0031;1.9066,-2.5427,-0.2913;2.6464,-1.3905,0.0542;1.9213,-0.3432,0.5438;2.6083,0.8355,0.9311;3.3328,0.6347,1.7313;1.8415,1.5171,1.3025;3.1257,1.2998,0.0808;4.0298,-1.3642,-0.1266;4.6037,-0.483,0.1342;4.7079,-2.4732,-0.6394;5.7843,-2.4251,-0.7507;4.0131,-3.6251,-1.0035;4.5806,-4.7636,-1.4759;5.9898,-4.7904,-1.6528;6.2167,-5.7822,-2.0459;6.5175,-4.6448,-0.7013;6.3173,-4.0297,-2.3729;2.6146,-3.625,-0.8166;1.8721,-4.836,-1.2058;1.1302,-5.3534,-0.3598;2.0243,-5.2622,-2.3421)|\",3.812315045505\r\n\"[H]C1=C([H])C([H])=C(C(F)(F)F)C(ON([H])[H])=N1 |(-1.9638,1.3281,-0.2269;-1.1036,0.6726,-0.109;-1.263,-0.7,0.0135;-2.2497,-1.1493,-0.0052;-0.1121,-1.4825,0.1625;-0.1818,-2.5597,0.2648;1.1367,-0.8782,0.1809;2.4,-1.6814,0.3435;2.1204,-2.9992,0.4808;3.2178,-1.5533,-0.7193;3.0934,-1.3034,1.4371;1.1766,0.5298,0.0479;2.3999,1.1098,0.0407;2.4305,2.5517,-0.0765;2.4191,2.8589,0.9008;1.5002,2.7915,-0.4362;0.0937,1.2809,-0.092)|\",5.387854239900001\r\n\"[H]SC1=NC2=C([H])C([H])=C(C(F)(F)F)C([H])=C2O1 |(4.2894,1.2199,-0.0003;4.0467,-0.1058,-0.0006;2.3051,0.1066,0.0002;1.618,1.2034,0.001;0.2926,0.7595,0.0014;-0.9145,1.4583,0.0022;-0.9317,2.5429,0.0026;-2.0922,0.7102,0.0024;-3.0504,1.2166,0.003;-2.0691,-0.6945,0.0019;-3.3505,-1.4836,0.0024;-4.4443,-0.694,0.0017;-3.4344,-2.292,-1.0808;-3.4346,-2.2904,1.0868;-0.8641,-1.4115,0.0011;-0.8398,-2.4953,0.0006;0.282,-0.6436,0.0009;1.5944,-1.0659,0.0001)|\",5.333431469800001\r\n\"[H]C(=O)C1=C(Cl)C([H])=C(OC(=O)C([H])([H])[H])C([H])=C1Cl 
|(9.9398,1.6762,2.1373;9.6657,0.8473,1.4623;10.527,0.1653,0.9462;8.1948,0.6813,1.2756;7.2883,1.536,1.9391;7.8643,2.8173,3.0049;5.9045,1.4473,1.8173;5.2599,2.1271,2.3507;5.3851,0.4516,0.9879;4.0466,0.2154,0.7414;2.9965,0.9073,1.3042;3.1107,1.8128,2.0902;1.7013,0.3466,0.7761;0.8677,0.8876,1.2242;1.6273,-0.7196,1.0134;1.6668,0.4426,-0.3141;6.2333,-0.4229,0.3057;5.8085,-1.187,-0.3337;7.6092,-0.3074,0.449;8.5679,-1.4549,-0.4543)|\",4.718454167669999\r\n\"[H]/N=C1\\C2=C([H])C([H])=C(OC([H])([H])[H])C(F)=C2N([H])N1[H] |(4.1298,5.2021,-3.5524;4.8117,4.6582,-4.0843;4.8802,3.4662,-3.6217;4.1825,2.7416,-2.5482;3.2904,3.1386,-1.5486;2.9861,4.177,-1.4522;2.8073,2.1851,-0.6657;2.1153,2.4473,0.1273;3.1965,0.8266,-0.7487;2.6499,0.0203,0.1957;2.8646,-1.3931,0.1861;2.2678,-1.7728,1.0172;3.9176,-1.6416,0.3459;2.5201,-1.844,-0.7496;4.0863,0.4439,-1.76;4.5304,-0.8342,-1.9037;4.5507,1.3999,-2.6585;5.3949,1.1933,-3.7569;6.234,0.6632,-3.5254;5.7621,2.5149,-4.194;5.8081,2.5665,-5.2071)|\",4.756550106740001\r\n\"[H]OC([H])([H])C1=NN2C(=C1C(=O)OC([H])([H])[H])C([H])([H])C([H])([H])C2([H])[H] |(2.5523,4.2855,4.1898;2.8159,4.5744,3.3;3.0174,3.4016,2.5344;2.7,3.598,1.5044;4.087,3.1381,2.477;2.2593,2.2243,3.0874;1.6929,2.3276,4.2895;1.1571,1.1042,4.5072;1.3381,0.2377,3.4953;2.0707,0.9221,2.5272;2.5597,0.4555,1.2351;3.2193,1.1184,0.4548;2.1899,-0.8316,0.9888;2.6361,-1.3604,-0.2685;3.7283,-1.3645,-0.3208;2.2457,-0.7654,-1.0986;2.2479,-2.3788,-0.312;0.6418,-1.0537,3.7938;-0.2542,-1.1596,3.169;1.2677,-1.9286,3.5989;0.2834,-0.8951,5.3062;1.0408,-1.3981,5.9153;-0.6858,-1.3344,5.5541;0.3184,0.629,5.5989;0.7645,0.8774,6.5655;-0.6715,1.0982,5.5433)|\",6.027321788575\r\n\"[H]OC([H])([H])C1=NN2C(=C1C(=O)OC([H])([H])[H])C([H])([H])C([H])([H])C([H])([H])C2([H])[H] 
|(3.3123,-6.0232,0.8991;4.135,-5.6066,0.5917;3.8311,-4.2492,0.3313;4.6706,-3.6278,0.6636;3.7285,-4.0639,-0.7506;2.5674,-3.8083,1.0201;1.8124,-4.7173,1.6219;0.7403,-4.0313,2.1041;0.7988,-2.7055,1.8392;1.9818,-2.5141,1.1157;2.5487,-1.295,0.5486;3.5962,-1.2416,-0.0716;1.7752,-0.1984,0.7811;2.2899,1.0302,0.2465;3.2667,1.2614,0.68;1.5604,1.7949,0.5167;2.3935,0.9653,-0.8399;-0.2951,-1.7773,2.2845;0.1458,-0.8403,2.635;-0.9068,-1.5095,1.4113;-1.1808,-2.4198,3.3666;-0.6537,-2.4089,4.3305;-2.0909,-1.8245,3.4975;-1.5331,-3.8679,2.9992;-2.0654,-3.8876,2.0389;-2.2009,-4.3115,3.7459;-0.2675,-4.7226,2.9019;-0.4586,-5.6886,2.4268;0.1478,-4.9178,3.8992)|\",5.972899018475\r\n\"[H]OC(=O)C1=C(C([H])([H])C([H])([H])[H])N=C(N2C([H])=C([H])C([H])=C2[H])S1 |(4.4422,-3.7357,-2.8617;3.8441,-3.9309,-2.1212;3.2944,-2.7912,-1.6119;2.6958,-2.8385,-0.5601;3.4626,-1.5554,-2.3974;3.2823,-0.2578,-1.9599;2.8604,0.174,-0.5829;3.1995,-0.5584,0.1524;3.3485,1.1327,-0.3753;1.3326,0.3328,-0.4698;1.0678,0.7187,0.5206;0.8392,-0.6342,-0.603;0.951,1.0314,-1.2222;3.5042,0.7156,-2.905;3.8468,0.2063,-4.0546;4.1131,0.9733,-5.1727;4.0415,2.3668,-5.1984;3.7671,2.9015,-4.3036;4.3641,2.7777,-6.4613;4.3992,3.8044,-6.7987;4.6441,1.6092,-7.2458;4.9297,1.5825,-8.2884;4.4847,0.5188,-6.4368;4.598,-0.5362,-6.6376;3.9237,-1.548,-4.1017)|\",4.465388286705\r\n\"[H]OC1=C([H])C([H])=C(Cl)C([H])=C1/N=C(/O[H])C([H])([H])C([H])([H])Cl 
|(-5.1783,1.376,1.0512;-4.2226,1.2475,1.1544;-3.841,0.1677,0.3942;-4.763,-0.6295,-0.284;-5.8246,-0.4003,-0.2106;-4.3462,-1.7183,-1.0501;-5.0678,-2.3352,-1.5734;-2.9842,-1.9936,-1.1311;-2.4296,-3.3589,-2.095;-2.0485,-1.2011,-0.4718;-0.9874,-1.404,-0.5622;-2.4606,-0.1179,0.3163;-1.4907,0.7179,0.8844;-1.3101,0.8342,2.1335;-0.3699,1.7396,2.5235;-0.2134,1.6585,3.4771;-2.0003,0.0868,3.2639;-2.6188,-0.7109,2.848;-1.2372,-0.3864,3.8966;-2.8642,1.0231,4.1133;-3.657,1.4611,3.5092;-2.282,1.8216,4.5779;-3.6599,0.1176,5.4701)|\",5.123903804915\r\n\"[H]OC(=O)C1=C2N(C(=O)C([H])=C1[H])C([H])([H])C([H])([H])C([H])([H])C([H])([H])C2([H])[H] |(-2.461,-3.4547,0.7416;-1.8453,-3.9298,0.1604;-0.8328,-3.099,-0.2293;0.1332,-3.5772,-0.7819;-1.0563,-1.6495,0.0426;-0.006,-0.7709,0.2966;-0.2674,0.5616,0.4832;-1.5848,1.1401,0.4218;-1.7453,2.3457,0.5859;-2.6424,0.1972,0.1534;-3.6421,0.6091,0.0811;-2.384,-1.1246,-0.0242;-3.2054,-1.7944,-0.2694;0.8225,1.5325,0.731;0.3085,2.4593,0.9792;1.3907,1.2127,1.611;1.7469,1.7468,-0.4776;1.1285,1.8212,-1.3813;2.2287,2.7248,-0.3522;2.8357,0.6797,-0.6483;3.4506,0.929,-1.5225;3.508,0.7263,0.2215;2.317,-0.7566,-0.7871;3.1739,-1.4373,-0.8519;1.757,-0.8815,-1.7219;1.432,-1.2239,0.3937;1.8612,-0.8649,1.3386;1.4418,-2.3101,0.424)|\",4.66947367458\r\n\"[H]C([H])([H])OC(=O)C1=C(C(=O)OC([H])([H])[H])N(C([H])([H])C(=O)C([H])([H])[H])N=N1 
|(-2.0456,-6.5562,-5.6204;-1.0288,-6.7307,-5.2687;-0.9871,-7.6201,-4.6342;-0.3458,-6.8631,-6.1122;-0.6801,-5.5579,-4.5165;0.5622,-5.5601,-4.002;1.3499,-6.4764,-4.1164;0.8566,-4.2874,-3.2984;1.8419,-4.0232,-2.3521;2.8534,-4.8376,-1.6365;3.9205,-4.3886,-1.2582;2.4431,-6.0909,-1.4523;3.3855,-6.9669,-0.8072;3.6074,-6.6071,0.2005;4.3101,-7.021,-1.3867;2.8944,-7.9385,-0.7732;1.7434,-2.6835,-2.1495;2.4648,-1.8268,-1.2306;3.5406,-1.9636,-1.3734;2.1989,-0.7991,-1.4924;2.1301,-2.1151,0.2425;1.4952,-3.0934,0.5698;2.6625,-1.0997,1.2307;2.1049,-0.1592,1.1327;3.7175,-0.8732,1.0358;2.545,-1.4809,2.2466;0.7476,-2.1559,-2.9003;0.2202,-3.1185,-3.597)|\",5.54023799618\r\n\"[H]C(=O)C1=C2C(=O)N([H])C([H])=C([H])N2N=C1C1=C([H])C([H])=C([H])C([H])=C1[H] |(4.9149,-3.2205,-0.9362;3.8833,-2.8535,-0.803;2.946,-3.6371,-0.8327;3.8215,-1.3979,-0.6012;4.9945,-0.6231,-0.6097;6.4174,-0.9287,-0.7883;6.8971,-2.0381,-0.9827;7.2305,0.2054,-0.7148;8.2183,0.0175,-0.8358;6.7894,1.4981,-0.5;7.5407,2.2755,-0.4698;5.4757,1.7517,-0.3386;5.0333,2.7208,-0.1668;4.6061,0.6716,-0.3974;3.2779,0.7962,-0.2522;2.7655,-0.4418,-0.3698;1.2982,-0.5714,-0.2387;0.6219,-1.7933,-0.382;1.1857,-2.6934,-0.5936;-0.7662,-1.85,-0.2483;-1.2704,-2.8058,-0.3628;-1.5026,-0.6993,0.0279;-2.5834,-0.75,0.1315;-0.8381,0.5221,0.1719;-1.3996,1.4274,0.3871;0.5442,0.5873,0.0404;1.0556,1.5363,0.1525)|\",4.138851666105\r\n\"[H]O/C(=N/N([H])C1=NC([H])([H])C([H])([H])C1([H])[H])C1=C([H])C(N([H])[H])=C([H])C([H])=C1[H] 
|(-0.366,-2.5138,2.9736;-0.374,-2.4948,2.0034;-0.543,-1.1869,1.6119;-1.0127,-1.0396,0.4302;-1.1108,0.205,-0.1204;-0.6478,1.0011,0.3122;-1.4116,0.3302,-1.4598;-1.2878,1.4553,-2.0612;-1.7307,1.2918,-3.4472;-2.5576,1.9857,-3.6515;-0.9157,1.5777,-4.1258;-2.1582,-0.1984,-3.6447;-3.2024,-0.2761,-3.963;-1.5525,-0.6891,-4.4128;-1.9387,-0.8437,-2.2566;-1.2157,-1.6663,-2.2597;-2.8537,-1.2429,-1.8049;-0.0973,-0.1549,2.5851;1.0929,-0.3746,3.2944;1.6764,-1.2698,3.0931;1.5573,0.5546,4.2375;2.7213,0.3086,4.9646;3.1754,1.1303,5.3433;3.3856,-0.3128,4.52;0.7934,1.7114,4.4745;1.1337,2.4377,5.2089;-0.3939,1.9237,3.7801;-0.9757,2.8187,3.9832;-0.85,1.0058,2.833;-1.7911,1.1687,2.3185)|\",4.160620774145\r\n\"[H]OC1=NN=C2C(=O)N(C([H])([H])C([H])([H])[H])C([H])([H])C([H])([H])N12 |(2.2278,-6.4,-3.7519;1.9614,-5.4703,-3.6409;2.632,-5.0099,-2.5739;3.4693,-5.676,-1.811;3.9066,-4.8005,-0.8364;3.3195,-3.6462,-1.0548;3.4357,-2.3883,-0.2749;3.996,-2.3343,0.8092;2.8092,-1.3009,-0.8717;2.9244,0.0119,-0.2279;3.623,-0.1098,0.6013;3.372,0.7156,-0.9434;1.5818,0.543,0.2791;1.7149,1.5311,0.734;1.1639,-0.1301,1.0343;0.8485,0.646,-0.53;2.4019,-1.3286,-2.2778;1.7297,-0.485,-2.4577;3.2734,-1.2059,-2.9399;1.6706,-2.6234,-2.6302;1.5237,-2.7084,-3.71;0.6919,-2.6649,-2.1372;2.4994,-3.7244,-2.1668)|\",5.72527541452\r\n\"[H]C1=C(C([H])(F)F)C(Cl)=C([H])C(Cl)=N1 |(1.2699,-2.0117,-0.1578;0.669,-1.1054,-0.1002;-0.7224,-1.2038,-0.0787;-1.3568,-2.5669,-0.1406;-0.6004,-3.3579,-0.1966;-2.1262,-2.7944,0.9649;-2.1652,-2.6763,-1.2363;-1.4376,0.0027,-0.0015;-3.179,0.0357,0.0354;-0.7498,1.2095,0.0483;-1.2751,2.1542,0.1076;0.6471,1.1529,0.0186;1.5396,2.6623,0.0826;1.3562,0.0414,-0.0538)|\",5.99194698801\r\n\"[H]C1=C([H])C(C([H])(F)F)=C(Cl)C(Cl)=N1 
|(-2.0492,1.2191,0.08;-1.1184,0.658,0.0585;-1.1134,-0.7314,0.1265;-2.0464,-1.2823,0.2019;0.1054,-1.4106,0.0967;0.0882,-2.9219,0.171;-0.9368,-3.302,0.2441;0.6622,-3.4658,-0.9398;0.7797,-3.3601,1.2612;1.2845,-0.6578,-0.001;2.843,-1.4242,-0.0417;1.1607,0.7436,-0.0653;2.5848,1.7519,-0.1922;0.0019,1.3785,-0.0363)|\",5.518468888139999\r\n\"[H]OC([H])([H])C([H])([H])N1C(=O)C2=C(C1=O)C([H])=C([H])C([H])=C2[H] |(-2.814,-3.0572,-1.336;-3.1628,-2.4651,-2.024;-4.0039,-1.5074,-1.4109;-4.0306,-0.6429,-2.0835;-5.0386,-1.8754,-1.3025;-3.5205,-1.0695,-0.0229;-3.6298,-1.8924,0.6923;-4.1092,-0.2187,0.3315;-2.1216,-0.6684,-0.0098;-1.6754,0.6648,-0.0614;-2.3938,1.6422,-0.1124;-0.1843,0.5873,-0.0243;0.1826,-0.7586,0.0507;-1.062,-1.5797,0.0542;-1.1821,-2.7918,0.1137;1.5131,-1.1466,0.0997;1.787,-2.1955,0.155;2.4811,-0.1335,0.0705;3.5342,-0.3974,0.1055;2.1134,1.2159,-0.0041;2.8871,1.978,-0.0258;0.7655,1.5971,-0.0528;0.4694,2.6399,-0.1108)|\",4.740223275710001\r\n\"[H]C1=C([H])C([H])=C(C([H])([H])N2C(=O)C([H])([H])C([H])([H])[C@@]2([H])C([H])([H])C#N)C([H])=C1[H] |(-2.561,0.3702,-1.6036;-1.5008,0.3557,-1.365;-0.5604,0.0977,-2.367;-0.8884,-0.0835,-3.3874;0.7997,0.0834,-2.063;1.533,-0.1066,-2.8435;1.2388,0.3281,-0.7535;2.7216,0.307,-0.4322;2.8987,0.7251,0.5627;3.2845,0.903,-1.1561;3.3057,-1.0348,-0.5142;4.1557,-1.3871,-1.5403;4.5587,-0.6374,-2.4146;4.498,-2.8664,-1.3663;4.531,-3.3618,-2.3393;5.5005,-2.946,-0.9269;3.3987,-3.3895,-0.4309;3.7305,-4.1975,0.2277;2.556,-3.7682,-1.0179;2.9342,-2.139,0.363;1.849,-2.1415,0.5139;3.6181,-2.0761,1.7594;3.4147,-3.0125,2.294;4.7053,-1.9987,1.6386;3.1535,-0.972,2.6032;2.7617,-0.1034,3.2668;0.2912,0.5844,0.244;0.6228,0.7765,1.2619;-1.0729,0.599,-0.0601;-1.798,0.8022,0.7235)|\",6.34569499366\r\n\"[H]C1=C(N([H])[H])C(N(=O)=O)=C([H])C([H])=C1N([H])C([H])([H])C([H])([H])N(C([H])([H])[H])C([H])([H])[H] 
|(5.9552,-1.3621,-0.5496;6.5049,-1.9671,0.1643;7.5026,-2.8289,-0.3373;7.7201,-2.8876,-1.6799;7.3312,-2.1663,-2.2671;8.5236,-3.4173,-1.993;8.2147,-3.6347,0.6009;9.2672,-4.5272,0.214;9.8336,-5.1966,1.0863;9.5923,-4.6072,-0.9921;7.9015,-3.5553,1.9736;8.4655,-4.1852,2.6499;6.9261,-2.7124,2.4401;6.7046,-2.6582,3.5021;6.2053,-1.8894,1.5278;5.2567,-1.0332,2.0241;4.9102,-1.218,2.9599;4.3025,-0.3096,1.2102;4.8262,0.3556,0.5119;3.6899,-0.9916,0.5976;3.4216,0.5513,2.124;2.6549,1.0625,1.5114;4.0502,1.3281,2.5738;2.8342,-0.2269,3.2166;2.3854,0.6286,4.3108;2.0165,0.0069,5.1332;1.5736,1.3203,4.0157;3.2241,1.225,4.6841;1.7596,-1.1065,2.7637;1.395,-1.7003,3.608;2.1279,-1.8006,2.0029;0.9029,-0.5496,2.3381)|\",4.019121571885\r\n\"[H]O/C(=N/N([H])C(C([H])([H])[H])(C([H])([H])[H])C([H])([H])N(=O)=O)OC(C([H])([H])[H])(C([H])([H])[H])C([H])([H])[H] |(5.8228,1.2792,-2.6118;4.9585,0.8692,-2.7933;4.5796,0.3058,-1.6158;5.3259,0.4199,-0.5892;4.924,-0.2598,0.5786;3.9068,-0.3554,0.6076;5.3898,0.4141,1.8098;5.0102,1.9048,1.8634;5.3479,2.3773,2.7913;3.9195,2.0057,1.8126;5.4452,2.4481,1.0208;4.7601,-0.3518,2.9814;5.1544,0.0006,3.9373;4.9659,-1.4236,2.8953;3.6727,-0.2077,2.987;6.9298,0.2179,1.7788;7.3348,0.6277,0.8557;7.1746,-0.839,1.8838;7.6218,0.9335,2.9009;8.0803,2.0499,2.6661;7.6687,0.3682,3.9932;3.412,-0.3541,-1.6075;2.4945,-0.4926,-2.7561;1.3483,-1.3084,-2.1516;0.5802,-1.4958,-2.9088;0.8877,-0.7691,-1.3177;1.7142,-2.2712,-1.7816;3.1769,-1.2744,-3.8827;2.4472,-1.4891,-4.6713;3.561,-2.2276,-3.5046;4.0024,-0.7105,-4.3204;1.9979,0.8845,-3.2075;1.2146,0.7612,-3.9637;2.8032,1.4813,-3.6394;1.5681,1.4305,-2.3608)|\",3.8912280621500006\r\n\"[H]C1=C(C(F)(F)F)N(C([H])([H])[H])N=C1C(=O)C([H])([H])[H] 
|(2.3135,2.7304,0.8913;2.8988,2.0236,1.4578;3.6877,2.2632,2.5608;3.9554,3.5459,3.2724;3.3048,4.5594,2.681;3.562,3.4965,4.569;5.2768,3.8489,3.2881;4.2335,1.0625,2.9381;5.1402,0.7706,4.0416;5.3239,1.6799,4.6105;6.0842,0.3872,3.6466;4.6871,0.017,4.6898;3.8384,0.0669,2.1383;3.0263,0.6345,1.2333;2.3752,-0.1525,0.1559;1.6461,0.4094,-0.6465;2.6527,-1.6405,0.105;3.7271,-1.8271,-0.0028;2.1103,-2.0773,-0.7354;2.3446,-2.12,1.0411)|\",5.4966997801\r\n\"[H]O[C@]([H])(C1=NN(C([H])([H])[H])C(C(F)(F)F)=C1[H])C([H])([H])[H] |(2.7606,-1.8702,-0.699;2.7383,-1.4249,0.1655;2.5992,-0.0348,-0.0876;1.5317,0.2437,-0.0238;3.0778,0.3107,-1.4812;3.2824,-0.6788,-2.3535;3.6716,-0.0924,-3.502;3.9201,-0.9027,-4.6848;4.7462,-0.4747,-5.2548;3.0287,-0.954,-5.3182;4.1856,-1.9039,-4.3456;3.7019,1.2635,-3.3737;4.105,2.1728,-4.4829;3.9891,3.4583,-4.1102;5.3892,1.9674,-4.8683;3.3426,1.9868,-5.5879;3.3269,1.5658,-2.0784;3.2561,2.5502,-1.6412;3.3563,0.725,1.001;3.211,1.8069,0.9053;4.4266,0.503,0.9441;2.9887,0.4129,1.9833)|\",6.370185240205\r\n\"[H]OC([H])([H])C([H])([H])C([H])([H])C1=NN(C([H])([H])[H])C(C(F)(F)F)=C1[H] |(3.371,2.0238,1.5615;3.8892,2.5681,2.1864;5.1567,2.7861,1.6026;5.0896,3.4364,0.7101;5.7368,3.3389,2.3507;5.9324,1.5106,1.2305;6.9891,1.7897,1.1162;5.8789,0.8082,2.0716;5.5146,0.8079,-0.0821;6.3113,0.116,-0.3787;5.4598,1.5629,-0.8803;4.2293,0.0216,-0.066;3.0755,0.5549,0.3531;2.1397,-0.4056,0.2227;0.7485,-0.1168,0.5395;0.2845,-1.0003,0.9803;0.1965,0.176,-0.3592;0.7364,0.7019,1.2591;2.6767,-1.5432,-0.2953;1.874,-2.7704,-0.5589;2.6352,-3.7232,-1.1222;1.3406,-3.2836,0.5772;0.8327,-2.5208,-1.3894;4.0248,-1.309,-0.4909;4.7539,-2.0067,-0.875)|\",5.918476248375\r\n\"[H]OC([H])([H])C([H])([H])C1=NN(C([H])([H])[H])C(C(F)(F)F)=C1[H] 
|(3.2712,2.9688,-0.1712;4.0429,3.5607,-0.2396;5.0834,2.7826,-0.7994;4.8627,2.4984,-1.8424;5.9737,3.4202,-0.8143;5.384,1.5143,0.0183;5.6048,1.818,1.0505;6.287,1.028,-0.3717;4.2532,0.5213,0.0082;2.9806,0.9385,0.0564;2.2181,-0.1725,0.0424;0.7683,-0.0552,-0.007;0.3155,-0.8705,0.559;0.4089,-0.0842,-1.0406;0.5015,0.8998,0.4458;2.9863,-1.2944,-0.026;2.411,-2.6682,-0.0646;3.3828,-3.5894,-0.1711;1.6937,-2.9473,1.052;1.5644,-2.8309,-1.1104;4.3074,-0.8885,-0.0454;5.1793,-1.5244,-0.0852)|\",6.2041957914\r\n\"[H]C1=C([H])C(C2=NC(C([H])([H])[H])=C(C(=O)C([H])([H])[H])O2)=C([H])C([H])=C1F |(9.474,-1.1009,-0.1227;8.7104,-0.3357,-0.0318;7.3558,-0.6527,-0.0433;7.0442,-1.6866,-0.1464;6.388,0.3582,0.0759;4.964,0.05,0.0616;3.9626,0.888,0.1695;2.8251,0.1113,0.0951;1.4586,0.7033,0.1782;0.7024,-0.0821,0.1585;1.2901,1.3874,-0.662;1.3569,1.2911,1.0976;3.1939,-1.2059,-0.0603;2.4527,-2.4616,-0.1895;1.227,-2.4665,-0.179;3.2631,-3.7372,-0.3309;3.9382,-3.6787,-1.1924;2.5778,-4.5778,-0.4525;3.8886,-3.8968,0.5556;4.5795,-1.2409,-0.0807;6.7972,1.6965,0.209;6.0418,2.4693,0.3011;8.1488,2.0215,0.221;8.4866,3.0474,0.3223;9.0849,0.9978,0.0995;10.3953,1.3055,0.1101)|\",4.364706162020001\r\n\"[H]OC(=O)C([H])([H])[C@]([H])(C(=O)OC([H])([H])C([H])=C([H])[H])C1=C([H])C([H])=C([H])C([H])=C1[H] 
|(5.8565,-3.562,-1.6609;6.5502,-3.5036,-2.3458;5.9825,-3.6497,-3.5687;6.6482,-3.5778,-4.5718;4.4725,-3.8908,-3.5726;4.2107,-4.2318,-4.5761;4.2157,-4.6883,-2.8669;3.6302,-2.6255,-3.2626;3.9334,-1.8286,-3.9488;3.8124,-2.0272,-1.8675;3.6339,-0.8643,-1.5984;4.1494,-2.9673,-0.9306;4.2226,-2.4937,0.456;4.8541,-3.2283,0.9592;4.7168,-1.5187,0.4467;2.8626,-2.4178,1.0788;2.1851,-1.6812,0.6525;2.4858,-3.1773,2.1075;1.5016,-3.0813,2.5573;3.1517,-3.9159,2.5496;2.1411,-2.8944,-3.4856;1.4703,-2.2741,-4.5455;2.0093,-1.5861,-5.1921;0.1189,-2.5323,-4.7798;-0.3863,-2.0438,-5.6086;-0.5801,-3.4108,-3.9515;-1.6326,-3.6115,-4.1319;0.0798,-4.0287,-2.8873;-0.4582,-4.7107,-2.2343;1.4311,-3.7731,-2.6551;1.9321,-4.2516,-1.818)|\",6.119840497745\r\n\"[H]/N=C(\\C1=C([H])C(Cl)=C([H])C([H])=C1[H])N([H])C#N |(-2.442,-1.897,1.2665;-2.1881,-2.5991,0.5704;-1.1719,-2.2165,-0.1009;-0.4632,-0.9099,0.0043;0.9372,-0.8435,0.0211;1.539,-1.7396,-0.0768;1.5572,0.3924,0.1884;3.3093,0.4721,0.2332;0.8147,1.5654,0.3246;1.3211,2.5169,0.4458;-0.5784,1.4913,0.298;-1.1651,2.4001,0.3934;-1.2177,0.2635,0.1439;-2.3018,0.2116,0.1076;-0.6502,-3.1618,-1.0129;-1.1488,-4.0471,-0.9948;0.0586,-2.8765,-2.1165;0.6977,-2.667,-3.069)|\",5.513026611130001\r\n"
  },
  {
    "path": "tests/test_architectures.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\n\"\"\"\nUnit tests for the different architectures of graphium/nn/architectures...\n\nThe layers are not thoroughly tested due to the difficulty of testing them\n\"\"\"\n\nimport torch\nimport unittest as ut\nfrom copy import deepcopy\nimport sys\nimport traceback\n\nfrom graphium.nn.architectures import FeedForwardNN, FeedForwardPyg, FullGraphMultiTaskNetwork\nfrom graphium.nn.base_layers import FCLayer\nfrom graphium.nn.residual_connections import (\n    ResidualConnectionConcat,\n    ResidualConnectionDenseNet,\n    ResidualConnectionNone,\n    ResidualConnectionSimple,\n    ResidualConnectionWeighted,\n)\nfrom torch_geometric.data import Data, Batch\n\nfrom graphium.utils.spaces import LAYERS_DICT\n\n\nclass test_FeedForwardNN(ut.TestCase):\n    kwargs = {\n        \"activation\": \"relu\",\n        \"last_activation\": \"none\",\n        \"normalization\": \"none\",\n        \"dropout\": 0.2,\n        \"name\": \"LNN\",\n        \"layer_type\": FCLayer,\n    }\n\n    norms = [\"none\", \"batch_norm\", \"layer_norm\"]\n\n    def test_forward_no_residual(self):\n        in_dim = 8\n        out_dim = 16\n        hidden_dims = [5, 6, 7]\n        batch = 2\n\n        lnn = FeedForwardNN(\n            in_dim=in_dim,\n            out_dim=out_dim,\n            hidden_dims=hidden_dims,\n            
residual_type=\"none\",\n            residual_skip_steps=1,\n            **self.kwargs,\n        )\n\n        self.assertEqual(len(lnn.layers), len(hidden_dims) + 1)\n        self.assertEqual(lnn.layers[0].in_dim, in_dim)\n        self.assertEqual(lnn.layers[1].in_dim, hidden_dims[0])\n        self.assertEqual(lnn.layers[2].in_dim, hidden_dims[1])\n        self.assertEqual(lnn.layers[3].in_dim, hidden_dims[2])\n\n        feat = torch.FloatTensor(batch, in_dim)\n        feat_out = lnn.forward(feat)\n\n        self.assertListEqual(list(feat_out.shape), [batch, out_dim])\n\n    def test_forward_simple_residual_1(self):\n        in_dim = 8\n        out_dim = 16\n        hidden_dims = [6, 6, 6, 6, 6]\n        batch = 2\n\n        lnn = FeedForwardNN(\n            in_dim=in_dim,\n            out_dim=out_dim,\n            hidden_dims=hidden_dims,\n            residual_type=\"simple\",\n            residual_skip_steps=1,\n            **self.kwargs,\n        )\n\n        self.assertEqual(len(lnn.layers), len(hidden_dims) + 1)\n        self.assertEqual(lnn.layers[0].in_dim, in_dim)\n        self.assertEqual(lnn.layers[1].in_dim, hidden_dims[0])\n        self.assertEqual(lnn.layers[2].in_dim, hidden_dims[1])\n        self.assertEqual(lnn.layers[3].in_dim, hidden_dims[2])\n\n        feat = torch.FloatTensor(batch, in_dim)\n        feat_out = lnn.forward(feat)\n\n        self.assertListEqual(list(feat_out.shape), [batch, out_dim])\n\n    def test_forward_norms(self):\n        in_dim = 8\n        out_dim = 16\n        hidden_dims = [6, 6, 6, 6, 6]\n        batch = 2\n\n        for normalization in self.norms:\n            this_kwargs = deepcopy(self.kwargs)\n            this_kwargs[\"normalization\"] = normalization\n            err_msg = f\"normalization = {normalization}\"\n            lnn = FeedForwardNN(\n                in_dim=in_dim,\n                out_dim=out_dim,\n                hidden_dims=hidden_dims,\n                residual_type=\"simple\",\n                
residual_skip_steps=1,\n                **this_kwargs,\n            )\n\n            self.assertEqual(len(lnn.layers), len(hidden_dims) + 1, msg=err_msg)\n            self.assertEqual(lnn.layers[0].in_dim, in_dim, msg=err_msg)\n            self.assertEqual(lnn.layers[1].in_dim, hidden_dims[0], msg=err_msg)\n            self.assertEqual(lnn.layers[2].in_dim, hidden_dims[1], msg=err_msg)\n            self.assertEqual(lnn.layers[3].in_dim, hidden_dims[2], msg=err_msg)\n\n            feat = torch.FloatTensor(batch, in_dim)\n            feat_out = lnn.forward(feat)\n\n            self.assertListEqual(list(feat_out.shape), [batch, out_dim], msg=err_msg)\n\n    def test_forward_simple_residual_2(self):\n        in_dim = 8\n        out_dim = 16\n        hidden_dims = [6, 6, 6, 6, 6]\n        batch = 2\n\n        lnn = FeedForwardNN(\n            in_dim=in_dim,\n            out_dim=out_dim,\n            hidden_dims=hidden_dims,\n            residual_type=\"simple\",\n            residual_skip_steps=2,\n            **self.kwargs,\n        )\n\n        self.assertEqual(len(lnn.layers), len(hidden_dims) + 1)\n        self.assertEqual(lnn.layers[0].in_dim, in_dim)\n        self.assertEqual(lnn.layers[1].in_dim, hidden_dims[0])\n        self.assertEqual(lnn.layers[2].in_dim, hidden_dims[1])\n        self.assertEqual(lnn.layers[3].in_dim, hidden_dims[2])\n        self.assertEqual(lnn.layers[4].in_dim, hidden_dims[3])\n        self.assertEqual(lnn.layers[5].in_dim, hidden_dims[4])\n\n        feat = torch.FloatTensor(batch, in_dim)\n        feat_out = lnn.forward(feat)\n\n        self.assertListEqual(list(feat_out.shape), [batch, out_dim])\n\n    def test_forward_concat_residual_1(self):\n        in_dim = 8\n        out_dim = 16\n        hidden_dims = [6, 6, 6, 6, 6]\n        batch = 2\n\n        lnn = FeedForwardNN(\n            in_dim=in_dim,\n            out_dim=out_dim,\n            hidden_dims=hidden_dims,\n            residual_type=\"concat\",\n            
residual_skip_steps=1,\n            **self.kwargs,\n        )\n\n        self.assertEqual(len(lnn.layers), len(hidden_dims) + 1)\n        self.assertEqual(lnn.layers[0].in_dim, in_dim)\n        self.assertEqual(lnn.layers[1].in_dim, hidden_dims[0])\n        self.assertEqual(lnn.layers[2].in_dim, 2 * hidden_dims[1])\n        self.assertEqual(lnn.layers[3].in_dim, 2 * hidden_dims[2])\n        self.assertEqual(lnn.layers[4].in_dim, 2 * hidden_dims[3])\n        self.assertEqual(lnn.layers[5].in_dim, 2 * hidden_dims[4])\n\n        feat = torch.FloatTensor(batch, in_dim)\n        feat_out = lnn.forward(feat)\n\n        self.assertListEqual(list(feat_out.shape), [batch, out_dim])\n\n    def test_forward_concat_residual_2(self):\n        in_dim = 8\n        out_dim = 16\n        hidden_dims = [6, 6, 6, 6, 6]\n        batch = 2\n\n        lnn = FeedForwardNN(\n            in_dim=in_dim,\n            out_dim=out_dim,\n            hidden_dims=hidden_dims,\n            residual_type=\"concat\",\n            residual_skip_steps=2,\n            **self.kwargs,\n        )\n\n        self.assertEqual(len(lnn.layers), len(hidden_dims) + 1)\n        self.assertEqual(lnn.layers[0].in_dim, in_dim)\n        self.assertEqual(lnn.layers[1].in_dim, 1 * hidden_dims[0])\n        self.assertEqual(lnn.layers[2].in_dim, 1 * hidden_dims[1])\n        self.assertEqual(lnn.layers[3].in_dim, 2 * hidden_dims[2])\n        self.assertEqual(lnn.layers[4].in_dim, 1 * hidden_dims[3])\n        self.assertEqual(lnn.layers[5].in_dim, 2 * hidden_dims[4])\n\n        feat = torch.FloatTensor(batch, in_dim)\n        feat_out = lnn.forward(feat)\n\n        self.assertListEqual(list(feat_out.shape), [batch, out_dim])\n\n    def test_forward_densenet_residual_1(self):\n        in_dim = 8\n        out_dim = 16\n        hidden_dims = [6, 6, 6, 6, 6]\n        batch = 2\n\n        lnn = FeedForwardNN(\n            in_dim=in_dim,\n            out_dim=out_dim,\n            hidden_dims=hidden_dims,\n            
residual_type=\"densenet\",\n            residual_skip_steps=1,\n            **self.kwargs,\n        )\n\n        self.assertEqual(len(lnn.layers), len(hidden_dims) + 1)\n        self.assertEqual(lnn.layers[0].in_dim, in_dim)\n        self.assertEqual(lnn.layers[1].in_dim, hidden_dims[0])\n        self.assertEqual(lnn.layers[2].in_dim, 2 * hidden_dims[1])\n        self.assertEqual(lnn.layers[3].in_dim, 3 * hidden_dims[2])\n        self.assertEqual(lnn.layers[4].in_dim, 4 * hidden_dims[3])\n        self.assertEqual(lnn.layers[5].in_dim, 5 * hidden_dims[4])\n\n        feat = torch.FloatTensor(batch, in_dim)\n        feat_out = lnn.forward(feat)\n\n        self.assertListEqual(list(feat_out.shape), [batch, out_dim])\n\n    def test_forward_densenet_residual_2(self):\n        in_dim = 8\n        out_dim = 16\n        hidden_dims = [6, 6, 6, 6, 6]\n        batch = 2\n\n        lnn = FeedForwardNN(\n            in_dim=in_dim,\n            out_dim=out_dim,\n            hidden_dims=hidden_dims,\n            residual_type=\"densenet\",\n            residual_skip_steps=2,\n            **self.kwargs,\n        )\n\n        self.assertEqual(len(lnn.layers), len(hidden_dims) + 1)\n        self.assertEqual(lnn.layers[0].in_dim, in_dim)\n        self.assertEqual(lnn.layers[1].in_dim, hidden_dims[0])\n        self.assertEqual(lnn.layers[2].in_dim, 1 * hidden_dims[1])\n        self.assertEqual(lnn.layers[3].in_dim, 2 * hidden_dims[2])\n        self.assertEqual(lnn.layers[4].in_dim, 1 * hidden_dims[3])\n        self.assertEqual(lnn.layers[5].in_dim, 3 * hidden_dims[4])\n\n        feat = torch.FloatTensor(batch, in_dim)\n        feat_out = lnn.forward(feat)\n\n        self.assertListEqual(list(feat_out.shape), [batch, out_dim])\n\n    def test_forward_weighted_residual_1(self):\n        in_dim = 8\n        out_dim = 16\n        hidden_dims = [6, 6, 6, 6, 6]\n        batch = 2\n\n        lnn = FeedForwardNN(\n            in_dim=in_dim,\n            out_dim=out_dim,\n            
hidden_dims=hidden_dims,\n            residual_type=\"weighted\",\n            residual_skip_steps=1,\n            **self.kwargs,\n        )\n\n        self.assertEqual(len(lnn.layers), len(hidden_dims) + 1)\n        self.assertEqual(lnn.layers[0].in_dim, in_dim)\n        self.assertEqual(lnn.layers[1].in_dim, hidden_dims[0])\n        self.assertEqual(lnn.layers[2].in_dim, hidden_dims[1])\n        self.assertEqual(lnn.layers[3].in_dim, hidden_dims[2])\n        self.assertEqual(lnn.layers[4].in_dim, hidden_dims[3])\n        self.assertEqual(lnn.layers[5].in_dim, hidden_dims[4])\n\n        self.assertEqual(len(lnn.residual_layer.residual_list), len(hidden_dims))\n\n        feat = torch.FloatTensor(batch, in_dim)\n        feat_out = lnn.forward(feat)\n\n        self.assertListEqual(list(feat_out.shape), [batch, out_dim])\n\n    def test_forward_weighted_residual_2(self):\n        in_dim = 8\n        out_dim = 16\n        hidden_dims = [6, 6, 6, 6, 6]\n        batch = 2\n\n        lnn = FeedForwardNN(\n            in_dim=in_dim,\n            out_dim=out_dim,\n            hidden_dims=hidden_dims,\n            residual_type=\"weighted\",\n            residual_skip_steps=2,\n            **self.kwargs,\n        )\n\n        self.assertEqual(len(lnn.layers), len(hidden_dims) + 1)\n        self.assertEqual(lnn.layers[0].in_dim, in_dim)\n        self.assertEqual(lnn.layers[1].in_dim, hidden_dims[0])\n        self.assertEqual(lnn.layers[2].in_dim, hidden_dims[1])\n        self.assertEqual(lnn.layers[3].in_dim, hidden_dims[2])\n        self.assertEqual(lnn.layers[4].in_dim, hidden_dims[3])\n        self.assertEqual(lnn.layers[5].in_dim, hidden_dims[4])\n\n        self.assertEqual(len(lnn.residual_layer.residual_list), (len(hidden_dims) // 2 + 1))\n\n        feat = torch.FloatTensor(batch, in_dim)\n        feat_out = lnn.forward(feat)\n\n        self.assertListEqual(list(feat_out.shape), [batch, out_dim])\n\n\nclass test_FeedForwardGraph(ut.TestCase):\n    kwargs = {\n        
\"activation\": \"relu\",\n        \"last_activation\": \"none\",\n        \"dropout\": 0.2,\n        \"name\": \"LNN\",\n    }\n\n    in_dim = 7\n    out_dim = 11\n    in_dim_edges = 13\n    hidden_dims = [6, 6, 6, 6, 6]\n\n    edge_idx1 = (torch.tensor([0, 1, 2]), torch.tensor([1, 2, 3]))\n    edge_idx2 = (torch.tensor([0, 0, 0, 1]), torch.tensor([0, 1, 2, 0]))\n    num_edges1 = len(edge_idx1[0])\n    num_nodes1 = max(edge_idx1[0].max(), edge_idx1[1].max()) + 1\n    num_edges2 = len(edge_idx2[0])\n    num_nodes2 = max(edge_idx2[0].max(), edge_idx2[1].max()) + 1\n    h1 = torch.zeros(num_nodes1, in_dim, dtype=torch.float32)\n    e1 = torch.ones(num_edges1, in_dim_edges, dtype=torch.float32)\n    h2 = torch.ones(num_nodes2, in_dim, dtype=torch.float32)\n    e2 = torch.zeros(num_edges2, in_dim_edges, dtype=torch.float32)\n\n    g1 = Data(feat=h1, edge_index=torch.stack(edge_idx1), edge_feat=e1)\n    g2 = Data(feat=h2, edge_index=torch.stack(edge_idx2), edge_feat=e2)\n    data_list = [g1, g2, deepcopy(g1), deepcopy(g2)]\n    batch_pyg = Batch.from_data_list(data_list)\n    num_nodes = batch_pyg.num_nodes\n    num_edges = batch_pyg.num_edges\n    batch_size = len(data_list)\n\n    virtual_nodes = [\"none\", \"mean\", \"sum\"]\n    norms = [\"none\", \"batch_norm\", \"layer_norm\"]\n    pna_kwargs = {\"aggregators\": [\"mean\", \"max\", \"sum\"], \"scalers\": [\"identity\", \"amplification\"]}\n\n    layers_kwargs = {\n        \"pyg:gin\": {},\n        \"pyg:gine\": {\"in_dim_edges\": in_dim_edges},\n        \"pyg:gated-gcn\": {\"in_dim_edges\": in_dim_edges, \"hidden_dims_edges\": hidden_dims},\n        \"pyg:pna-msgpass#1\": {\"layer_kwargs\": pna_kwargs, \"in_dim_edges\": 0},\n        \"pyg:pna-msgpass#2\": {\"layer_kwargs\": pna_kwargs, \"in_dim_edges\": in_dim_edges},\n    }\n\n    def test_forward_no_residual(self):\n        for residual_skip_steps in [1, 2]:\n            for virtual_node in self.virtual_nodes:\n                for normalization in self.norms:\n  
                  for layer_name, this_kwargs in self.layers_kwargs.items():\n                        err_msg = f\"virtual_node={virtual_node}, layer_name={layer_name}, residual_skip_steps={residual_skip_steps}, normalization={normalization}\"\n                        layer_type = layer_name.split(\"#\")[0]\n\n                        # PYG\n                        if layer_type.startswith(\"pyg:\"):\n                            layer_class = FeedForwardPyg\n                            bg = deepcopy(self.batch_pyg)\n\n                        gnn = layer_class(\n                            in_dim=self.in_dim,\n                            out_dim=self.out_dim,\n                            hidden_dims=self.hidden_dims,\n                            residual_type=\"none\",\n                            residual_skip_steps=residual_skip_steps,\n                            layer_type=layer_type,\n                            normalization=normalization,\n                            **this_kwargs,\n                            **self.kwargs,\n                        )\n                        # gnn.to(torch.float32)\n\n                        self.assertIsInstance(gnn.residual_layer, ResidualConnectionNone)\n                        self.assertEqual(len(gnn.layers), len(self.hidden_dims) + 1, msg=err_msg)\n                        self.assertEqual(gnn.layers[0].out_dim, self.hidden_dims[0], msg=err_msg)\n                        self.assertEqual(gnn.layers[1].out_dim, self.hidden_dims[1], msg=err_msg)\n                        self.assertEqual(gnn.layers[2].out_dim, self.hidden_dims[2], msg=err_msg)\n                        self.assertEqual(gnn.layers[3].out_dim, self.hidden_dims[3], msg=err_msg)\n                        self.assertEqual(gnn.layers[4].out_dim, self.hidden_dims[4], msg=err_msg)\n                        self.assertEqual(gnn.layers[5].out_dim, self.out_dim, msg=err_msg)\n\n                        f = gnn.layers[0].out_dim_factor\n                        
self.assertEqual(gnn.layers[0].in_dim, self.in_dim, msg=err_msg)\n                        self.assertEqual(gnn.layers[1].in_dim, f * self.hidden_dims[0], msg=err_msg)\n                        self.assertEqual(gnn.layers[2].in_dim, f * self.hidden_dims[1], msg=err_msg)\n                        self.assertEqual(gnn.layers[3].in_dim, f * self.hidden_dims[2], msg=err_msg)\n                        self.assertEqual(gnn.layers[4].in_dim, f * self.hidden_dims[3], msg=err_msg)\n                        self.assertEqual(gnn.layers[5].in_dim, f * self.hidden_dims[4], msg=err_msg)\n\n                        out_g = gnn.forward(bg)\n                        feat_out = out_g[\"feat\"]\n                        edge_feat_out = out_g[\"edge_feat\"]\n\n                        out_dim_edges = (\n                            gnn.full_dims_edges[-1] if gnn.full_dims_edges is not None else self.in_dim_edges\n                        )\n\n                        self.assertListEqual(\n                            list(feat_out.shape), [self.num_nodes, self.out_dim], msg=err_msg\n                        )\n                        self.assertListEqual(\n                            list(edge_feat_out.shape), [self.num_edges, out_dim_edges], msg=err_msg\n                        )\n\n    def test_forward_simple_residual(self):\n        for residual_skip_steps in [1, 2]:\n            for virtual_node in self.virtual_nodes:\n                for normalization in self.norms:\n                    for layer_name, this_kwargs in self.layers_kwargs.items():\n                        err_msg = f\"virtual_node={virtual_node}, layer_name={layer_name}, residual_skip_steps={residual_skip_steps}, normalization={normalization}\"\n                        layer_type = layer_name.split(\"#\")[0]\n\n                        # PYG\n                        if layer_type.startswith(\"pyg:\"):\n                            layer_class = FeedForwardPyg\n                            bg = deepcopy(self.batch_pyg)\n\n           
             gnn = layer_class(\n                            in_dim=self.in_dim,\n                            out_dim=self.out_dim,\n                            hidden_dims=self.hidden_dims,\n                            residual_type=\"simple\",\n                            residual_skip_steps=1,\n                            layer_type=layer_type,\n                            **this_kwargs,\n                            **self.kwargs,\n                        )\n                        # gnn.to(torch.float32)\n\n                        self.assertIsInstance(gnn.residual_layer, ResidualConnectionSimple)\n                        self.assertEqual(len(gnn.layers), len(self.hidden_dims) + 1, msg=err_msg)\n                        self.assertEqual(gnn.layers[0].out_dim, self.hidden_dims[0], msg=err_msg)\n                        self.assertEqual(gnn.layers[1].out_dim, self.hidden_dims[1], msg=err_msg)\n                        self.assertEqual(gnn.layers[2].out_dim, self.hidden_dims[2], msg=err_msg)\n                        self.assertEqual(gnn.layers[3].out_dim, self.hidden_dims[3], msg=err_msg)\n                        self.assertEqual(gnn.layers[4].out_dim, self.hidden_dims[4], msg=err_msg)\n                        self.assertEqual(gnn.layers[5].out_dim, self.out_dim, msg=err_msg)\n\n                        f = gnn.layers[0].out_dim_factor\n                        self.assertEqual(gnn.layers[0].in_dim, self.in_dim, msg=err_msg)\n                        self.assertEqual(gnn.layers[1].in_dim, f * self.hidden_dims[0], msg=err_msg)\n                        self.assertEqual(gnn.layers[2].in_dim, f * self.hidden_dims[1], msg=err_msg)\n                        self.assertEqual(gnn.layers[3].in_dim, f * self.hidden_dims[2], msg=err_msg)\n                        self.assertEqual(gnn.layers[4].in_dim, f * self.hidden_dims[3], msg=err_msg)\n                        self.assertEqual(gnn.layers[5].in_dim, f * self.hidden_dims[4], msg=err_msg)\n\n                        out_g = 
gnn.forward(bg)\n                        feat_out = out_g[\"feat\"]\n                        edge_feat_out = out_g[\"edge_feat\"]\n\n                        out_dim_edges = (\n                            gnn.full_dims_edges[-1] if gnn.full_dims_edges is not None else self.in_dim_edges\n                        )\n\n                        self.assertListEqual(\n                            list(feat_out.shape), [self.num_nodes, self.out_dim], msg=err_msg\n                        )\n                        self.assertListEqual(\n                            list(edge_feat_out.shape), [self.num_edges, out_dim_edges], msg=err_msg\n                        )\n\n    def test_forward_weighted_residual(self):\n        for residual_skip_steps in [1, 2]:\n            for virtual_node in self.virtual_nodes:\n                for normalization in self.norms:\n                    for layer_name, this_kwargs in self.layers_kwargs.items():\n                        err_msg = f\"virtual_node={virtual_node}, layer_name={layer_name}, residual_skip_steps={residual_skip_steps}, normalization={normalization}\"\n                        layer_type = layer_name.split(\"#\")[0]\n\n                        # PYG\n                        if layer_type.startswith(\"pyg:\"):\n                            layer_class = FeedForwardPyg\n                            bg = deepcopy(self.batch_pyg)\n\n                        gnn = layer_class(\n                            in_dim=self.in_dim,\n                            out_dim=self.out_dim,\n                            hidden_dims=self.hidden_dims,\n                            residual_type=\"weighted\",\n                            residual_skip_steps=residual_skip_steps,\n                            layer_type=layer_type,\n                            **this_kwargs,\n                            **self.kwargs,\n                        )\n                        # gnn.to(torch.float32)\n\n                        self.assertIsInstance(gnn.residual_layer, 
ResidualConnectionWeighted)\n                        self.assertEqual(len(gnn.layers), len(self.hidden_dims) + 1, msg=err_msg)\n                        self.assertEqual(gnn.layers[0].out_dim, self.hidden_dims[0], msg=err_msg)\n                        self.assertEqual(gnn.layers[1].out_dim, self.hidden_dims[1], msg=err_msg)\n                        self.assertEqual(gnn.layers[2].out_dim, self.hidden_dims[2], msg=err_msg)\n                        self.assertEqual(gnn.layers[3].out_dim, self.hidden_dims[3], msg=err_msg)\n                        self.assertEqual(gnn.layers[4].out_dim, self.hidden_dims[4], msg=err_msg)\n                        self.assertEqual(gnn.layers[5].out_dim, self.out_dim, msg=err_msg)\n\n                        f = gnn.layers[0].out_dim_factor\n                        self.assertEqual(gnn.layers[0].in_dim, self.in_dim, msg=err_msg)\n                        self.assertEqual(gnn.layers[1].in_dim, f * self.hidden_dims[0], msg=err_msg)\n                        self.assertEqual(gnn.layers[2].in_dim, f * self.hidden_dims[1], msg=err_msg)\n                        self.assertEqual(gnn.layers[3].in_dim, f * self.hidden_dims[2], msg=err_msg)\n                        self.assertEqual(gnn.layers[4].in_dim, f * self.hidden_dims[3], msg=err_msg)\n                        self.assertEqual(gnn.layers[5].in_dim, f * self.hidden_dims[4], msg=err_msg)\n\n                        out_g = gnn.forward(bg)\n                        feat_out = out_g[\"feat\"]\n                        edge_feat_out = out_g[\"edge_feat\"]\n\n                        out_dim_edges = (\n                            gnn.full_dims_edges[-1] if gnn.full_dims_edges is not None else self.in_dim_edges\n                        )\n\n                        self.assertListEqual(\n                            list(feat_out.shape), [self.num_nodes, self.out_dim], msg=err_msg\n                        )\n                        self.assertListEqual(\n                            list(edge_feat_out.shape), 
[self.num_edges, out_dim_edges], msg=err_msg\n                        )\n\n    def test_forward_concat_residual(self):\n        for residual_skip_steps in [1, 2]:\n            for virtual_node in self.virtual_nodes:\n                for normalization in self.norms:\n                    for layer_name, this_kwargs in self.layers_kwargs.items():\n                        err_msg = f\"virtual_node={virtual_node}, layer_name={layer_name}, residual_skip_steps={residual_skip_steps}, normalization={normalization}\"\n                        layer_type = layer_name.split(\"#\")[0]\n\n                        # PYG\n                        if layer_type.startswith(\"pyg:\"):\n                            layer_class = FeedForwardPyg\n                            bg = deepcopy(self.batch_pyg)\n\n                        gnn = layer_class(\n                            in_dim=self.in_dim,\n                            out_dim=self.out_dim,\n                            hidden_dims=self.hidden_dims,\n                            residual_type=\"concat\",\n                            residual_skip_steps=residual_skip_steps,\n                            layer_type=layer_type,\n                            **this_kwargs,\n                            **self.kwargs,\n                        )\n                        # gnn.to(torch.float32)\n\n                        self.assertIsInstance(gnn.residual_layer, ResidualConnectionConcat)\n                        self.assertEqual(len(gnn.layers), len(self.hidden_dims) + 1, msg=err_msg)\n                        self.assertEqual(gnn.layers[0].out_dim, self.hidden_dims[0], msg=err_msg)\n                        self.assertEqual(gnn.layers[1].out_dim, self.hidden_dims[1], msg=err_msg)\n                        self.assertEqual(gnn.layers[2].out_dim, self.hidden_dims[2], msg=err_msg)\n                        self.assertEqual(gnn.layers[3].out_dim, self.hidden_dims[3], msg=err_msg)\n                        self.assertEqual(gnn.layers[4].out_dim, 
self.hidden_dims[4], msg=err_msg)\n                        self.assertEqual(gnn.layers[5].out_dim, self.out_dim, msg=err_msg)\n\n                        f = gnn.layers[0].out_dim_factor\n                        f2 = [2 * f if ((ii % residual_skip_steps) == 0 and ii > 0) else f for ii in range(6)]\n                        self.assertEqual(gnn.layers[0].in_dim, self.in_dim, msg=err_msg)\n                        self.assertEqual(gnn.layers[1].in_dim, f2[0] * self.hidden_dims[0], msg=err_msg)\n                        self.assertEqual(gnn.layers[2].in_dim, f2[1] * self.hidden_dims[1], msg=err_msg)\n                        self.assertEqual(gnn.layers[3].in_dim, f2[2] * self.hidden_dims[2], msg=err_msg)\n                        self.assertEqual(gnn.layers[4].in_dim, f2[3] * self.hidden_dims[3], msg=err_msg)\n                        self.assertEqual(gnn.layers[5].in_dim, f2[4] * self.hidden_dims[4], msg=err_msg)\n\n                        out_g = gnn.forward(bg)\n                        feat_out = out_g[\"feat\"]\n                        edge_feat_out = out_g[\"edge_feat\"]\n\n                        out_dim_edges = (\n                            gnn.full_dims_edges[-1] if gnn.full_dims_edges is not None else self.in_dim_edges\n                        )\n\n                        self.assertListEqual(\n                            list(feat_out.shape), [self.num_nodes, self.out_dim], msg=err_msg\n                        )\n                        self.assertListEqual(\n                            list(edge_feat_out.shape), [self.num_edges, out_dim_edges], msg=err_msg\n                        )\n\n    def test_forward_densenet_residual(self):\n        for residual_skip_steps in [1, 2]:\n            for virtual_node in self.virtual_nodes:\n                for normalization in self.norms:\n                    for layer_name, this_kwargs in self.layers_kwargs.items():\n                        err_msg = f\"virtual_node={virtual_node}, layer_name={layer_name}, 
residual_skip_steps={residual_skip_steps}, normalization={normalization}\"\n                        layer_type = layer_name.split(\"#\")[0]\n\n                        # PYG\n                        if layer_type.startswith(\"pyg:\"):\n                            layer_class = FeedForwardPyg\n                            bg = deepcopy(self.batch_pyg)\n\n                        gnn = layer_class(\n                            in_dim=self.in_dim,\n                            out_dim=self.out_dim,\n                            hidden_dims=self.hidden_dims,\n                            residual_type=\"densenet\",\n                            residual_skip_steps=residual_skip_steps,\n                            layer_type=layer_type,\n                            **this_kwargs,\n                            **self.kwargs,\n                        )\n                        # gnn.to(torch.float32)\n\n                        self.assertIsInstance(gnn.residual_layer, ResidualConnectionDenseNet)\n                        self.assertEqual(len(gnn.layers), len(self.hidden_dims) + 1, msg=err_msg)\n                        self.assertEqual(gnn.layers[0].out_dim, self.hidden_dims[0], msg=err_msg)\n                        self.assertEqual(gnn.layers[1].out_dim, self.hidden_dims[1], msg=err_msg)\n                        self.assertEqual(gnn.layers[2].out_dim, self.hidden_dims[2], msg=err_msg)\n                        self.assertEqual(gnn.layers[3].out_dim, self.hidden_dims[3], msg=err_msg)\n                        self.assertEqual(gnn.layers[4].out_dim, self.hidden_dims[4], msg=err_msg)\n                        self.assertEqual(gnn.layers[5].out_dim, self.out_dim, msg=err_msg)\n\n                        f = gnn.layers[0].out_dim_factor\n                        f2 = [\n                            (\n                                ((ii // residual_skip_steps) + 1) * f\n                                if ((ii % residual_skip_steps) == 0 and ii > 0)\n                                else f\n  
                          )\n                            for ii in range(6)\n                        ]\n                        self.assertEqual(gnn.layers[0].in_dim, self.in_dim, msg=err_msg)\n                        self.assertEqual(gnn.layers[1].in_dim, f2[0] * self.hidden_dims[0], msg=err_msg)\n                        self.assertEqual(gnn.layers[2].in_dim, f2[1] * self.hidden_dims[1], msg=err_msg)\n                        self.assertEqual(gnn.layers[3].in_dim, f2[2] * self.hidden_dims[2], msg=err_msg)\n                        self.assertEqual(gnn.layers[4].in_dim, f2[3] * self.hidden_dims[3], msg=err_msg)\n                        self.assertEqual(gnn.layers[5].in_dim, f2[4] * self.hidden_dims[4], msg=err_msg)\n\n                        out_g = gnn.forward(bg)\n                        feat_out = out_g[\"feat\"]\n                        edge_feat_out = out_g[\"edge_feat\"]\n\n                        out_dim_edges = (\n                            gnn.full_dims_edges[-1] if gnn.full_dims_edges is not None else self.in_dim_edges\n                        )\n\n                        self.assertListEqual(\n                            list(feat_out.shape), [self.num_nodes, self.out_dim], msg=err_msg\n                        )\n                        self.assertListEqual(\n                            list(edge_feat_out.shape), [self.num_edges, out_dim_edges], msg=err_msg\n                        )\n\n\nif __name__ == \"__main__\":\n    ut.main()\n"
  },
  {
    "path": "tests/test_attention.py",
"content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\n\"\"\"\nUnit tests for the attention layer\n\"\"\"\n\nimport numpy as np\nimport torch\nimport unittest as ut\nfrom torch_geometric.data import Data, Batch\nfrom copy import deepcopy\nfrom graphium.ipu.to_dense_batch import to_dense_batch\n\nfrom graphium.nn.base_layers import MultiheadAttentionMup\n\n\ndef seed_everything(seed: int):\n    import random, os\n    import numpy as np\n    import torch\n\n    random.seed(seed)\n    os.environ[\"PYTHONHASHSEED\"] = str(seed)\n    np.random.seed(seed)\n    torch.manual_seed(seed)\n\n\nclass test_MultiHeadAttention(ut.TestCase):\n    seed_everything(42)\n    in_dim = 12\n    out_dim = 12\n    in_dim_edges = 10\n    out_dim_edges = 10\n    edge_idx1 = torch.stack([torch.tensor([0, 1, 2, 3, 2]), torch.tensor([1, 2, 3, 0, 0])])\n    edge_idx2 = torch.stack([torch.tensor([0, 0, 0, 1]), torch.tensor([0, 1, 2, 0])])\n    x1 = torch.randn(edge_idx1.max() + 1, in_dim, dtype=torch.float32)\n    e1 = torch.randn(edge_idx1.shape[-1], in_dim_edges, dtype=torch.float32)\n    x2 = torch.randn(edge_idx2.max() + 1, in_dim, dtype=torch.float32)\n    e2 = torch.randn(edge_idx2.shape[-1], in_dim_edges, dtype=torch.float32)\n    g1 = Data(feat=x1, edge_index=edge_idx1, edge_feat=e1)\n    g2 = Data(feat=x2, edge_index=edge_idx2, edge_feat=e2)\n    bg = Batch.from_data_list([g1, g2])\n\n    attn_kwargs = {\"embed_dim\": in_dim, \"num_heads\": 2, \"batch_first\": True}\n\n    def test_attention_class(self):\n        bg = deepcopy(self.bg)\n        seed_everything(42)\n        attention_layer = MultiheadAttentionMup(biased_attention=False, **self.attn_kwargs)\n        attention_layer.eval()\n        seed_everything(42)\n        attention_layer_bias = MultiheadAttentionMup(biased_attention=True, **self.attn_kwargs)\n        attention_layer_bias.eval()\n\n        h_dense, mask, _ = to_dense_batch(\n            bg.feat,\n            batch=bg.batch,\n            batch_size=None,\n            max_num_nodes_per_graph=None,\n            drop_nodes_last_graph=False,\n        )\n        # attn_bias [batch, num_heads, nodes, nodes]\n        nodes = h_dense.size()[1]\n        attn_bias_3d = torch.zeros(2, 2, nodes, nodes)\n        h_dense.nodepair_gaussian_bias_3d = attn_bias_3d\n        # Apply attention layer and attention layer with bias.\n        h_attn_output = attention_layer(\n            h_dense,\n            h_dense,\n            h_dense,\n            attn_bias=None,\n            attn_mask=None,\n            key_padding_mask=~mask,\n            need_weights=False,\n        )[0]\n        h_attn_output_biased = attention_layer_bias(\n            h_dense,\n            h_dense,\n            h_dense,\n            attn_bias=attn_bias_3d,\n            attn_mask=None,\n            key_padding_mask=~mask,\n            need_weights=False,\n        )[0]\n        self.assertEqual(h_attn_output.size(), h_attn_output_biased.size())\n        np.testing.assert_array_almost_equal(\n            h_attn_output.detach().numpy(), h_attn_output_biased.detach().numpy(), decimal=0\n        )\n\n\nif __name__ == \"__main__\":\n    ut.main()\n"
  },
  {
    "path": "tests/test_base_layers.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\n\"\"\"\nUnit tests for the different layers of graphium/nn/base_layers\n\"\"\"\n\nimport torch\nimport unittest as ut\nfrom torch_geometric.data import Data, Batch\nfrom copy import deepcopy\n\nfrom graphium.nn.base_layers import DropPath, TransformerEncoderLayerMup\nfrom graphium.ipu.to_dense_batch import to_dense_batch, to_sparse_batch\n\n\nclass test_Base_Layers(ut.TestCase):\n    in_dim = 21\n    out_dim = 11\n    in_dim_edges = 13\n    out_dim_edges = 17\n\n    edge_idx1 = torch.stack([torch.tensor([0, 1, 2, 3, 2]), torch.tensor([1, 2, 3, 0, 0])])\n    edge_idx2 = torch.stack([torch.tensor([0, 0, 0, 1]), torch.tensor([0, 1, 2, 0])])\n    x1 = torch.randn(edge_idx1.max() + 1, in_dim, dtype=torch.float32)\n    e1 = torch.randn(edge_idx1.shape[-1], in_dim_edges, dtype=torch.float32)\n    x2 = torch.randn(edge_idx2.max() + 1, in_dim, dtype=torch.float32)\n    e2 = torch.randn(edge_idx2.shape[-1], in_dim_edges, dtype=torch.float32)\n\n    g1 = Data(feat=x1, edge_index=edge_idx1, edge_feat=e1)\n    g2 = Data(feat=x2, edge_index=edge_idx2, edge_feat=e2)\n    bg = Batch.from_data_list([g1, g2])\n\n    batch_size = 2\n    max_num_nodes_per_graph = max(x1.shape[0], x2.shape[0])\n\n    # for drop_rate=0.5, test if the output shape is correct\n    def test_droppath_layer_0p5(self):\n        bg = deepcopy(self.bg)\n   
     feat_in = bg.feat\n        layer = DropPath(drop_rate=0.5)\n        h_out = layer.forward(feat_in, bg.batch)\n        self.assertEqual(h_out.shape, feat_in.shape)\n\n    # for drop_rate=1.0, test if the output is all zeros\n    def test_droppath_layer_1p0(self):\n        bg = deepcopy(self.bg)\n        feat_in = bg.feat\n        zero_tensor = torch.zeros(feat_in.shape)\n        layer = DropPath(drop_rate=1.0)\n        h_out = layer.forward(feat_in, bg.batch)\n        self.assertTrue(torch.allclose(zero_tensor, h_out.detach()))\n\n    # for drop_rate=0.0, test if the output matches the original output\n    def test_droppath_layer_0p0(self):\n        bg = deepcopy(self.bg)\n        feat_in = bg.feat\n        layer = DropPath(drop_rate=0.0)\n        h_out = layer.forward(feat_in, bg.batch)\n        self.assertTrue(torch.allclose(feat_in.detach(), h_out.detach()))\n\n    # test output shape is correct for TransformerEncoderLayerMup\n    def test_transformer_encoder_layer_mup(self):\n        bg = deepcopy(self.bg)\n        feat_in = bg.feat\n        layer = TransformerEncoderLayerMup(\n            biased_attention=False, d_model=self.in_dim, nhead=1, dim_feedforward=4 * self.in_dim\n        )\n\n        feat_dense, key_padding_mask, idx = to_dense_batch(\n            feat_in,\n            batch=bg.batch,\n            batch_size=self.batch_size,\n            max_num_nodes_per_graph=self.max_num_nodes_per_graph,\n            drop_nodes_last_graph=False,\n        )\n        attn_mask = None\n        key_padding_mask = ~key_padding_mask\n\n        h_out_dense = layer.forward(feat_dense)\n\n        h_out = to_sparse_batch(h_out_dense, mask_idx=idx)\n\n        self.assertEqual(h_out.shape, feat_in.shape)\n"
  },
  {
    "path": "tests/test_collate.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\n\"\"\"\nUnit tests for the collate\n\"\"\"\n\nimport unittest as ut\nfrom copy import deepcopy\nimport torch\nimport numpy as np\nfrom torch_geometric.data import Data\n\nfrom graphium.data.collate import collate_labels, graphium_collate_fn\n\n\nclass test_Collate(ut.TestCase):\n    def test_collate_labels(self):\n        # Create fake labels\n        labels_size_dict = {\n            \"graph_label1\": [1],\n            \"graph_label2\": [3],\n            \"node_label2\": [5],\n            \"edge_label3\": [5, 2],\n            \"node_label4\": [5, 1],\n        }\n        labels_dtype_dict = {\n            \"graph_label1\": torch.float32,\n            \"graph_label2\": torch.float16,\n            \"node_label2\": torch.float32,\n            \"edge_label3\": torch.float32,\n            \"node_label4\": torch.float32,\n        }\n        fake_label = {\n            \"graph_label1\": torch.FloatTensor([1]),\n            \"graph_label2\": torch.HalfTensor([1, 2, 3]),\n            \"node_label2\": torch.FloatTensor([1, 2, 3, 4, 5]),\n            \"edge_label3\": torch.FloatTensor([[1, 2], [3, 4], [5, 6], [7, 8], [9, 10]]),\n            \"node_label4\": torch.FloatTensor([[1], [2], [3], [4], [5]]),\n        }\n        fake_labels = []\n        num_labels = 10\n        for i in range(num_labels):\n            
pyg_labels = Data(x=torch.empty(5, 0), edge_index=torch.empty(2, 5))\n            for key, val in fake_label.items():\n                pyg_labels[key] = val + 17 * 2\n            fake_labels.append(pyg_labels)\n\n        # Collate labels and check for the right shapes and dtypes\n        collated_labels = collate_labels(\n            deepcopy(fake_labels), deepcopy(labels_size_dict), deepcopy(labels_dtype_dict)\n        )\n        self.assertEqual(collated_labels[\"graph_label1\"].shape, torch.Size([num_labels, 1]))  # , 1\n        self.assertEqual(collated_labels[\"graph_label2\"].shape, torch.Size([num_labels, 3]))  # , 1\n        self.assertEqual(collated_labels[\"node_label2\"].shape, torch.Size([num_labels * 5, 1]))  # , 5\n        self.assertEqual(collated_labels[\"edge_label3\"].shape, torch.Size([num_labels * 5, 2]))  # , 5, 2\n        self.assertEqual(collated_labels[\"node_label4\"].shape, torch.Size([num_labels * 5, 1]))  # , 5, 1\n\n        self.assertEqual(collated_labels[\"graph_label1\"].dtype, torch.float32)\n        self.assertEqual(collated_labels[\"graph_label2\"].dtype, torch.float16)\n\n        # Check that the values are correct\n        graph_label1_true = deepcopy(torch.stack([this_label[\"graph_label1\"] for this_label in fake_labels]))\n        graph_label2_true = deepcopy(torch.stack([this_label[\"graph_label2\"] for this_label in fake_labels]))\n        label2_true = deepcopy(torch.stack([this_label[\"node_label2\"] for this_label in fake_labels]))\n        label3_true = deepcopy(torch.stack([this_label[\"edge_label3\"] for this_label in fake_labels]))\n        label4_true = deepcopy(torch.stack([this_label[\"node_label4\"] for this_label in fake_labels]))\n\n        # NOTE: Flatten due to the way Data objects are collated (concat along first dim, instead of stacked)\n        np.testing.assert_array_equal(collated_labels[\"graph_label1\"].numpy(), graph_label1_true.numpy())\n        
np.testing.assert_array_equal(collated_labels[\"graph_label2\"].numpy(), graph_label2_true.numpy())\n        np.testing.assert_array_equal(\n            collated_labels[\"node_label2\"].numpy(), label2_true.flatten(0, 1).unsqueeze(1).numpy()\n        )\n        np.testing.assert_array_equal(\n            collated_labels[\"edge_label3\"].numpy(), label3_true.flatten(0, 1).numpy()\n        )\n        np.testing.assert_array_equal(\n            collated_labels[\"node_label4\"].numpy(), label4_true.flatten(0, 1).numpy()\n        )\n\n        # Remove some labels and check that the collation still works and puts `nan` in the right places\n        missing_labels = {\n            \"graph_label1\": [1, 3, 5, 7, 9],\n            \"graph_label2\": [0, 4, 3, 1, 7],\n            \"node_label2\": [0, 2, 4, 6, 8],\n            \"edge_label3\": [0, 1, 2, 3, 4],\n            \"node_label4\": [5, 1, 4, 9, 6],\n        }\n        for key, missing_idx in missing_labels.items():\n            for idx in missing_idx:\n                fake_labels[idx].pop(key)\n        graph_label1_true[missing_labels[\"graph_label1\"]] = float(\"nan\")\n        graph_label2_true[missing_labels[\"graph_label2\"]] = float(\"nan\")\n        label2_true[missing_labels[\"node_label2\"]] = float(\"nan\")\n        label3_true[missing_labels[\"edge_label3\"]] = float(\"nan\")\n        label4_true[missing_labels[\"node_label4\"]] = float(\"nan\")\n\n        # Collate labels and check for the right shapes\n        labels_size_dict = {\n            \"graph_label1\": [1],\n            \"graph_label2\": [3],\n            \"node_label2\": [5],\n            \"edge_label3\": [5, 2],\n            \"node_label4\": [5, 1],\n        }\n        collated_labels = collate_labels(\n            deepcopy(fake_labels), deepcopy(labels_size_dict), deepcopy(labels_dtype_dict)\n        )\n        self.assertEqual(collated_labels[\"graph_label1\"].shape, torch.Size([num_labels, 1]))  # , 1\n        
self.assertEqual(collated_labels[\"graph_label2\"].shape, torch.Size([num_labels, 3]))  # , 1\n        self.assertEqual(collated_labels[\"node_label2\"].shape, torch.Size([num_labels * 5, 1]))  # , 5\n        self.assertEqual(collated_labels[\"edge_label3\"].shape, torch.Size([num_labels * 5, 2]))  # , 5, 2\n        self.assertEqual(collated_labels[\"node_label4\"].shape, torch.Size([num_labels * 5, 1]))  # , 5, 1\n\n        # Check that the values are correct when some labels are missing\n        # NOTE: Flatten due to the way Data objects are collated (concat along first dim, instead of stacked)\n        np.testing.assert_array_equal(collated_labels[\"graph_label1\"].numpy(), graph_label1_true.numpy())\n        np.testing.assert_array_equal(collated_labels[\"graph_label2\"].numpy(), graph_label2_true.numpy())\n        np.testing.assert_array_equal(\n            collated_labels[\"node_label2\"].numpy(), label2_true.flatten(0, 1).unsqueeze(1).numpy()\n        )\n        np.testing.assert_array_equal(\n            collated_labels[\"edge_label3\"].numpy(), label3_true.flatten(0, 1).numpy()\n        )\n        np.testing.assert_array_equal(\n            collated_labels[\"node_label4\"].numpy(), label4_true.flatten(0, 1).numpy()\n        )\n        # Now test the `graphium_collate_fn` function when only labels are given\n        fake_labels2 = [{\"labels\": this_label} for this_label in fake_labels]\n        collated_labels = graphium_collate_fn(\n            deepcopy(fake_labels2), labels_size_dict=labels_size_dict, labels_dtype_dict=labels_dtype_dict\n        )[\"labels\"]\n        self.assertEqual(collated_labels[\"graph_label1\"].shape, torch.Size([num_labels, 1]))\n        self.assertEqual(collated_labels[\"graph_label2\"].shape, torch.Size([num_labels, 3]))\n        self.assertEqual(collated_labels[\"node_label2\"].shape, torch.Size([num_labels * 5, 1]))  # , 5\n        self.assertEqual(collated_labels[\"edge_label3\"].shape, torch.Size([num_labels * 5, 2]))  # , 
5, 2\n        self.assertEqual(collated_labels[\"node_label4\"].shape, torch.Size([num_labels * 5, 1]))  # , 5, 1\n\n        # Check that the values are correct when some labels are missing\n        # NOTE: Flatten due to the way Data objects are collated (concat along first dim, instead of stacked)\n        np.testing.assert_array_equal(collated_labels[\"graph_label1\"].numpy(), graph_label1_true.numpy())\n        np.testing.assert_array_equal(collated_labels[\"graph_label2\"].numpy(), graph_label2_true.numpy())\n        np.testing.assert_array_equal(\n            collated_labels[\"node_label2\"].numpy(), label2_true.flatten(0, 1).unsqueeze(1).numpy()\n        )\n        np.testing.assert_array_equal(\n            collated_labels[\"edge_label3\"].numpy(), label3_true.flatten(0, 1).numpy()\n        )\n        np.testing.assert_array_equal(\n            collated_labels[\"node_label4\"].numpy(), label4_true.flatten(0, 1).numpy()\n        )\n\n\nif __name__ == \"__main__\":\n    ut.main()\n"
  },
  {
    "path": "tests/test_data_utils.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nimport pandas as pd\nimport unittest as ut\nimport graphium\nimport tempfile\n\n\nclass TestDataUtils(ut.TestCase):\n    def test_list_datasets(\n        self,\n    ):\n        datasets = graphium.data.utils.list_graphium_datasets()\n        assert isinstance(datasets, set)\n        assert len(datasets) > 0\n\n    def test_download_datasets(self):\n        dataset_dir = tempfile.TemporaryDirectory()\n        data_path = graphium.data.utils.download_graphium_dataset(\n            \"graphium-zinc-micro\", output_path=dataset_dir.name\n        )\n\n        fpath = graphium.utils.fs.join(data_path, \"ZINC-micro.csv\")\n        df = pd.read_csv(fpath)\n        assert df.shape == (1000, 4)  # type: ignore\n        assert df.columns.tolist() == [\"SMILES\", \"SA\", \"logp\", \"score\"]  # type: ignore\n\n\nif __name__ == \"__main__\":\n    ut.main()\n"
  },
  {
    "path": "tests/test_datamodule.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nimport unittest as ut\nimport numpy as np\nimport torch\nimport pandas as pd\nimport tempfile\n\nimport graphium\nfrom graphium.utils.fs import rm, exists, get_size\nfrom graphium.data import GraphOGBDataModule, MultitaskFromSmilesDataModule\n\nTEMP_CACHE_DATA_PATH = \"tests/temp_cache_0000\"\n\n\nclass Test_DataModule(ut.TestCase):\n    def test_ogb_datamodule(self):\n        # other datasets are too large to be tested\n        dataset_names = [\"ogbg-molhiv\", \"ogbg-molpcba\", \"ogbg-moltox21\", \"ogbg-molfreesolv\"]\n        dataset_name = dataset_names[3]\n\n        # Setup the featurization\n        featurization_args = {}\n        featurization_args[\"atom_property_list_float\"] = []  # [\"weight\", \"valence\"]\n        featurization_args[\"atom_property_list_onehot\"] = [\"atomic-number\", \"degree\"]\n        # featurization_args[\"conformer_property_list\"] = [\"positions_3d\"]\n        featurization_args[\"edge_property_list\"] = [\"bond-type-onehot\"]\n        featurization_args[\"add_self_loop\"] = False\n        featurization_args[\"use_bonds_weights\"] = False\n        featurization_args[\"explicit_H\"] = False\n\n        # Config for datamodule\n        task_specific_args = {}\n        task_specific_args[\"task_1\"] = {\"task_level\": \"graph\", \"dataset_name\": dataset_name}\n        dm_args 
= {}\n        dm_args[\"processed_graph_data_path\"] = None\n        dm_args[\"featurization\"] = featurization_args\n        dm_args[\"batch_size_training\"] = 16\n        dm_args[\"batch_size_inference\"] = 16\n        dm_args[\"num_workers\"] = 0\n        dm_args[\"pin_memory\"] = True\n        dm_args[\"featurization_n_jobs\"] = 0\n        dm_args[\"featurization_progress\"] = True\n        dm_args[\"featurization_backend\"] = \"loky\"\n        dm_args[\"featurization_batch_size\"] = 50\n\n        ds = GraphOGBDataModule(task_specific_args, **dm_args)\n\n        ds.prepare_data(save_smiles_and_ids=False)\n\n        # Check the keys in the dataset\n        ds.setup(save_smiles_and_ids=False)\n        assert set(ds.train_ds[0].keys()) == {\"features\", \"labels\"}\n\n        # Delete the cache if already exist\n        if exists(TEMP_CACHE_DATA_PATH):\n            rm(TEMP_CACHE_DATA_PATH, recursive=True)\n\n        # Reset the datamodule\n        ds = GraphOGBDataModule(task_specific_args, **dm_args)\n\n        ds.prepare_data(save_smiles_and_ids=True)\n\n        # Check the keys in the dataset\n        ds.setup(save_smiles_and_ids=True)\n        assert set(ds.train_ds[0].keys()) == {\"smiles\", \"mol_ids\", \"features\", \"labels\"}\n\n        # test module\n        assert ds.num_edge_feats == 5\n        assert ds.num_node_feats == 50\n        assert len(ds) == 642\n\n        # test batch loader\n        batch = next(iter(ds.train_dataloader()))\n        assert len(batch[\"smiles\"]) == 16\n        assert len(batch[\"labels\"][\"graph_task_1\"]) == 16\n        assert len(batch[\"mol_ids\"]) == 16\n\n    def test_none_filtering(self):\n        # Create the objects to filter\n        list_of_num = [ii for ii in range(100)]\n        list_of_str = [str(ii) for ii in list_of_num]\n        tuple_of_num = tuple(list_of_num)\n        array_of_num = np.asarray(list_of_num)\n        array_of_str = np.asarray(list_of_str)\n        tensor_of_num = 
torch.as_tensor(array_of_num)\n        arrays_of_num = np.stack([list_of_num, list_of_num, list_of_num], axis=1)\n        arrays_of_str = np.stack([list_of_str, list_of_str, list_of_str], axis=1)\n        tensors_of_num = torch.as_tensor(arrays_of_num)\n        dic = {\"str\": list_of_str, \"num\": list_of_num}\n        df = pd.DataFrame(dic)\n        df_shuffled = df.sample(frac=1)\n        series_num = df[\"num\"]\n        series_num_shuffled = df_shuffled[\"num\"]\n\n        # Create different indexes to use for filtering\n        all_idx_none = [[3, 17, 88], [22, 33, 44, 55, 66, 77, 88], [], np.arange(len(list_of_num))]\n\n        # Loop all the indexes and filter the objects.\n        for ii, idx_none in enumerate(all_idx_none):\n            msg = f\"Failed for ii={ii}\"\n\n            # Create the true filtered sequences\n            filtered_num = [ii for ii in range(100) if ii not in idx_none]\n            filtered_str = [str(ii) for ii in filtered_num]\n            assert len(filtered_num) == len(list_of_num) - len(idx_none)\n            assert len(filtered_str) == len(list_of_str) - len(idx_none)\n\n            # Filter the sequences from the Datamodule function\n            (\n                list_of_num_2,\n                list_of_str_2,\n                tuple_of_num_2,\n                array_of_num_2,\n                array_of_str_2,\n                tensor_of_num_2,\n                df_2,\n                df_shuffled_2,\n                dic_2,\n                arrays_of_num_2,\n                arrays_of_str_2,\n                tensors_of_num_2,\n                series_num_2,\n                series_num_shuffled_2,\n            ) = graphium.data.MultitaskFromSmilesDataModule._filter_none_molecules(\n                idx_none,\n                list_of_num,\n                list_of_str,\n                tuple_of_num,\n                array_of_num,\n                array_of_str,\n                tensor_of_num,\n                df,\n                
df_shuffled,\n                dic,\n                arrays_of_num,\n                arrays_of_str,\n                tensors_of_num,\n                series_num,\n                series_num_shuffled,\n            )\n\n            df_shuffled_2 = df_shuffled_2.sort_values(by=\"num\", axis=0)\n            series_num_shuffled_2 = series_num_shuffled_2.sort_values(axis=0)\n\n            # Assert the filtering is done correctly\n            self.assertListEqual(list_of_num_2, filtered_num, msg=msg)\n            self.assertListEqual(list_of_str_2, filtered_str, msg=msg)\n            self.assertListEqual(list(tuple_of_num_2), filtered_num, msg=msg)\n            self.assertListEqual(array_of_num_2.tolist(), filtered_num, msg=msg)\n            self.assertListEqual(array_of_str_2.tolist(), filtered_str, msg=msg)\n            self.assertListEqual(tensor_of_num_2.tolist(), filtered_num, msg=msg)\n            for jj in range(arrays_of_num.shape[1]):\n                self.assertListEqual(arrays_of_num_2[:, jj].tolist(), filtered_num, msg=msg)\n                self.assertListEqual(arrays_of_str_2[:, jj].tolist(), filtered_str, msg=msg)\n                self.assertListEqual(tensors_of_num_2[:, jj].tolist(), filtered_num, msg=msg)\n            self.assertListEqual(dic_2[\"num\"], filtered_num, msg=msg)\n            self.assertListEqual(dic_2[\"str\"], filtered_str, msg=msg)\n            self.assertListEqual(df_2[\"num\"].tolist(), filtered_num, msg=msg)\n            self.assertListEqual(df_2[\"str\"].tolist(), filtered_str, msg=msg)\n            self.assertListEqual(series_num_2.tolist(), filtered_num, msg=msg)\n\n            # When the dataframe is shuffled, the lists are different because the filtering\n            # is done on the row indexes, not the dataframe indexes.\n            bool_to_check = (len(idx_none) == 0) or (len(idx_none) == len(df_shuffled))\n            self.assertIs(df_shuffled_2[\"num\"].tolist() == filtered_num, bool_to_check, msg=msg)\n            
self.assertIs(df_shuffled_2[\"str\"].tolist() == filtered_str, bool_to_check, msg=msg)\n            self.assertIs(series_num_shuffled_2.tolist() == filtered_num, bool_to_check, msg=msg)\n\n    def test_caching(self):\n        # other datasets are too large to be tested\n        dataset_name = \"ogbg-molfreesolv\"\n\n        # Setup the featurization\n        featurization_args = {}\n        featurization_args[\"atom_property_list_float\"] = []  # [\"weight\", \"valence\"]\n        featurization_args[\"atom_property_list_onehot\"] = [\"atomic-number\", \"degree\"]\n        featurization_args[\"edge_property_list\"] = [\"bond-type-onehot\"]\n        featurization_args[\"add_self_loop\"] = False\n        featurization_args[\"use_bonds_weights\"] = False\n        featurization_args[\"explicit_H\"] = False\n\n        # Config for datamodule\n        task_specific_args = {}\n        task_specific_args[\"task_1\"] = {\"task_level\": \"graph\", \"dataset_name\": dataset_name}\n        dm_args = {}\n        dm_args[\"featurization\"] = featurization_args\n        dm_args[\"batch_size_training\"] = 16\n        dm_args[\"batch_size_inference\"] = 16\n        dm_args[\"num_workers\"] = 0\n        dm_args[\"pin_memory\"] = True\n        dm_args[\"featurization_n_jobs\"] = 0\n        dm_args[\"featurization_progress\"] = True\n        dm_args[\"featurization_backend\"] = \"loky\"\n        dm_args[\"featurization_batch_size\"] = 50\n\n        # Delete the cache if already exist\n        if exists(TEMP_CACHE_DATA_PATH):\n            rm(TEMP_CACHE_DATA_PATH, recursive=True)\n\n        # Prepare the data. 
It should create the cache there\n        assert not exists(TEMP_CACHE_DATA_PATH)\n        ds = GraphOGBDataModule(task_specific_args, processed_graph_data_path=TEMP_CACHE_DATA_PATH, **dm_args)\n        # assert not ds.load_data_from_cache(verbose=False)\n        ds.prepare_data(save_smiles_and_ids=False)\n\n        # Check the keys in the dataset\n        ds.setup(save_smiles_and_ids=False)\n        assert set(ds.train_ds[0].keys()) == {\"features\", \"labels\"}\n\n        # ds_batch = next(iter(ds.train_dataloader()))\n        train_loader = ds.get_dataloader(ds.train_ds, shuffle=False, stage=\"train\")\n        batch = next(iter(train_loader))\n\n        # Test loading cached data\n        assert exists(TEMP_CACHE_DATA_PATH)\n\n        cached_ds_from_ram = GraphOGBDataModule(\n            task_specific_args,\n            processed_graph_data_path=TEMP_CACHE_DATA_PATH,\n            dataloading_from=\"ram\",\n            **dm_args,\n        )\n        cached_ds_from_ram.prepare_data()\n        cached_ds_from_ram.setup()\n        cached_train_loader_from_ram = cached_ds_from_ram.get_dataloader(\n            cached_ds_from_ram.train_ds, shuffle=False, stage=\"train\"\n        )\n        batch_from_ram = next(iter(cached_train_loader_from_ram))\n\n        cached_ds_from_disk = GraphOGBDataModule(\n            task_specific_args,\n            processed_graph_data_path=TEMP_CACHE_DATA_PATH,\n            dataloading_from=\"disk\",\n            **dm_args,\n        )\n        cached_ds_from_disk.prepare_data()\n        cached_ds_from_disk.setup()\n        cached_train_loader_from_disk = cached_ds_from_disk.get_dataloader(\n            cached_ds_from_disk.train_ds, shuffle=False, stage=\"train\"\n        )\n        batch_from_disk = next(iter(cached_train_loader_from_disk))\n\n        # Features are the same\n        np.testing.assert_array_almost_equal(\n            batch[\"features\"].edge_index, batch_from_ram[\"features\"].edge_index\n        )\n        
np.testing.assert_array_almost_equal(\n            batch[\"features\"].edge_index, batch_from_disk[\"features\"].edge_index\n        )\n\n        assert batch[\"features\"].num_nodes == batch_from_ram[\"features\"].num_nodes\n        assert batch[\"features\"].num_nodes == batch_from_disk[\"features\"].num_nodes\n\n        np.testing.assert_array_almost_equal(\n            batch[\"features\"].edge_weight, batch_from_ram[\"features\"].edge_weight\n        )\n        np.testing.assert_array_almost_equal(\n            batch[\"features\"].edge_weight, batch_from_disk[\"features\"].edge_weight\n        )\n\n        np.testing.assert_array_almost_equal(batch[\"features\"].feat, batch_from_ram[\"features\"].feat)\n        np.testing.assert_array_almost_equal(batch[\"features\"].feat, batch_from_disk[\"features\"].feat)\n\n        np.testing.assert_array_almost_equal(\n            batch[\"features\"].edge_feat, batch_from_ram[\"features\"].edge_feat\n        )\n        np.testing.assert_array_almost_equal(\n            batch[\"features\"].edge_feat, batch_from_disk[\"features\"].edge_feat\n        )\n\n        np.testing.assert_array_almost_equal(batch[\"features\"].batch, batch_from_ram[\"features\"].batch)\n        np.testing.assert_array_almost_equal(batch[\"features\"].batch, batch_from_disk[\"features\"].batch)\n\n        np.testing.assert_array_almost_equal(batch[\"features\"].ptr, batch_from_ram[\"features\"].ptr)\n        np.testing.assert_array_almost_equal(batch[\"features\"].ptr, batch_from_disk[\"features\"].ptr)\n\n        # Labels are the same\n        np.testing.assert_array_almost_equal(\n            batch[\"labels\"].graph_task_1, batch_from_ram[\"labels\"].graph_task_1\n        )\n        np.testing.assert_array_almost_equal(\n            batch[\"labels\"].graph_task_1, batch_from_disk[\"labels\"].graph_task_1\n        )\n\n        np.testing.assert_array_almost_equal(batch[\"labels\"].x, batch_from_ram[\"labels\"].x)\n        
np.testing.assert_array_almost_equal(batch[\"labels\"].x, batch_from_disk[\"labels\"].x)\n\n        np.testing.assert_array_almost_equal(batch[\"labels\"].edge_index, batch_from_ram[\"labels\"].edge_index)\n        np.testing.assert_array_almost_equal(batch[\"labels\"].edge_index, batch_from_disk[\"labels\"].edge_index)\n\n        np.testing.assert_array_almost_equal(batch[\"labels\"].batch, batch_from_ram[\"labels\"].batch)\n        np.testing.assert_array_almost_equal(batch[\"labels\"].batch, batch_from_disk[\"labels\"].batch)\n\n        np.testing.assert_array_almost_equal(batch[\"labels\"].ptr, batch_from_ram[\"labels\"].ptr)\n        np.testing.assert_array_almost_equal(batch[\"labels\"].ptr, batch_from_disk[\"labels\"].ptr)\n\n        # Delete the cache if already exist\n        if exists(TEMP_CACHE_DATA_PATH):\n            rm(TEMP_CACHE_DATA_PATH, recursive=True)\n\n        # Reset the datamodule\n        ds = GraphOGBDataModule(task_specific_args, processed_graph_data_path=TEMP_CACHE_DATA_PATH, **dm_args)\n\n        ds.prepare_data(save_smiles_and_ids=True)\n\n        ds.setup(save_smiles_and_ids=True)\n        assert set(ds.train_ds[0].keys()) == {\"smiles\", \"mol_ids\", \"features\", \"labels\"}\n\n        # test module\n        assert ds.num_edge_feats == 5\n        assert ds.num_node_feats == 50\n        assert len(ds) == 642\n\n        # test batch loader\n        batch = next(iter(ds.train_dataloader()))\n        assert len(batch[\"smiles\"]) == 16\n        assert len(batch[\"labels\"][\"graph_task_1\"]) == 16\n        assert len(batch[\"mol_ids\"]) == 16\n\n        # Delete the cache if already exist\n        if exists(TEMP_CACHE_DATA_PATH):\n            rm(TEMP_CACHE_DATA_PATH, recursive=True)\n\n    def test_datamodule_with_none_molecules(self):\n        # Setup the featurization\n        featurization_args = {}\n        featurization_args[\"atom_property_list_float\"] = []  # [\"weight\", \"valence\"]\n        
featurization_args[\"atom_property_list_onehot\"] = [\"atomic-number\", \"degree\"]\n        featurization_args[\"edge_property_list\"] = [\"bond-type-onehot\"]\n\n        # Config for datamodule\n        bad_csv = \"tests/data/micro_ZINC_corrupt.csv\"\n        task_specific_args = {}\n        task_kwargs = {\"df_path\": bad_csv, \"split_val\": 0.0, \"split_test\": 0.0}\n        task_specific_args[\"task_1\"] = {\n            \"task_level\": \"graph\",\n            \"label_cols\": \"SA\",\n            \"smiles_col\": \"SMILES1\",\n            **task_kwargs,\n        }\n        task_specific_args[\"task_2\"] = {\n            \"task_level\": \"graph\",\n            \"label_cols\": \"logp\",\n            \"smiles_col\": \"SMILES2\",\n            **task_kwargs,\n        }\n        task_specific_args[\"task_3\"] = {\n            \"task_level\": \"graph\",\n            \"label_cols\": \"score\",\n            \"smiles_col\": \"SMILES3\",\n            **task_kwargs,\n        }\n\n        # Read the corrupted dataset and get stats\n        df = pd.read_csv(bad_csv)\n        bad_smiles = (df[\"SMILES1\"] == \"XXX\") & (df[\"SMILES2\"] == \"XXX\") & (df[\"SMILES3\"] == \"XXX\")\n        num_bad_smiles = sum(bad_smiles)\n\n        # Test the datamodule\n        datamodule = MultitaskFromSmilesDataModule(\n            task_specific_args=task_specific_args,\n            featurization_args=featurization_args,\n            featurization_n_jobs=0,\n            featurization_batch_size=1,\n        )\n        datamodule.prepare_data()\n        datamodule.setup(save_smiles_and_ids=True)\n\n        # Check that the number of molecules is correct\n        smiles = df[\"SMILES1\"].tolist() + df[\"SMILES2\"].tolist() + df[\"SMILES3\"].tolist()\n        num_unique_smiles = len(set(smiles)) - 1  # -1 because of the XXX\n        # self.assertEqual(len(datamodule.train_ds), num_unique_smiles - num_bad_smiles)\n\n        # Change the index of the dataframe\n        index_smiles = []\n        
for ii in range(len(df)):\n            if df[\"SMILES1\"][ii] != \"XXX\":\n                smiles = df[\"SMILES1\"][ii]\n            elif df[\"SMILES2\"][ii] != \"XXX\":\n                smiles = df[\"SMILES2\"][ii]\n            elif df[\"SMILES3\"][ii] != \"XXX\":\n                smiles = df[\"SMILES3\"][ii]\n            else:\n                smiles = \"XXX\"\n            index_smiles.append(smiles)\n        df[\"idx_smiles\"] = index_smiles\n        df = df.set_index(\"idx_smiles\")\n\n        # Convert the smiles from the train_ds to a list, and check the content\n        train_smiles = [d[\"smiles\"] for d in datamodule.train_ds]\n\n        # Check that the sets of smiles are the same\n        train_smiles_flat = list(set([item for sublist in train_smiles for item in sublist]))\n        train_smiles_flat.sort()\n        index_smiles_filt = list(set([smiles for smiles in index_smiles if smiles != \"XXX\"]))\n        index_smiles_filt.sort()\n        self.assertListEqual(train_smiles_flat, index_smiles_filt)\n\n        # Check that the smiles are correct for each datapoint in the dataset\n        for smiles in train_smiles:\n            self.assertEqual(len(set(smiles)), 1)  # Check that all smiles are the same\n            this_smiles = smiles[0]\n            true_smiles = df.loc[this_smiles][[\"SMILES1\", \"SMILES2\", \"SMILES3\"]]\n            num_true_smiles = sum(true_smiles != \"XXX\")\n            self.assertEqual(len(smiles), num_true_smiles)  # Check that the number of smiles is correct\n            self.assertEqual(\n                this_smiles, true_smiles[true_smiles != \"XXX\"].values[0]\n            )  # Check that the smiles are correct\n\n        # Convert the labels from the train_ds to a dataframe\n        train_labels = [{task: val[0] for task, val in d[\"labels\"].items()} for d in datamodule.train_ds]\n        train_labels_df = pd.DataFrame(train_labels)\n        train_labels_df = train_labels_df.rename(\n            
columns={\"graph_task_1\": \"graph_SA\", \"graph_task_2\": \"graph_logp\", \"graph_task_3\": \"graph_score\"}\n        )\n        train_labels_df[\"smiles\"] = [s[0] for s in datamodule.train_ds.smiles]\n        train_labels_df = train_labels_df.set_index(\"smiles\")\n        train_labels_df = train_labels_df.sort_index()\n\n        # Check that the labels are correct\n        df2 = df.reset_index()[~bad_smiles].set_index(\"idx_smiles\").sort_index()\n        labels = train_labels_df[[\"graph_SA\", \"graph_logp\", \"graph_score\"]].values\n        nans = np.isnan(labels)\n        true_nans = df2[[\"SMILES1\", \"SMILES2\", \"SMILES3\"]].values == \"XXX\"\n        true_labels = df2[[\"SA\", \"logp\", \"score\"]].values\n        true_labels[true_nans] = np.nan\n        np.testing.assert_array_equal(nans, true_nans)  # Check that the nans are correct\n        np.testing.assert_array_almost_equal(\n            labels, true_labels, decimal=5\n        )  # Check that the label values are correct\n\n    def test_datamodule_multiple_data_files(self):\n        # Test single CSV files\n        csv_file = \"tests/data/micro_ZINC_shard_1.csv\"\n        task_kwargs = {\"df_path\": csv_file, \"split_val\": 0.0, \"split_test\": 0.0}\n        task_specific_args = {\n            \"task\": {\"task_level\": \"graph\", \"label_cols\": [\"score\"], \"smiles_col\": \"SMILES\", **task_kwargs}\n        }\n\n        ds = MultitaskFromSmilesDataModule(task_specific_args, featurization_n_jobs=0)\n        ds.prepare_data()\n        ds.setup()\n\n        self.assertEqual(len(ds.train_ds), 10)\n\n        # Test multi CSV files\n        csv_file = \"tests/data/micro_ZINC_shard_*.csv\"\n        task_kwargs = {\"df_path\": csv_file, \"split_val\": 0.0, \"split_test\": 0.0}\n        task_specific_args = {\n            \"task\": {\"task_level\": \"graph\", \"label_cols\": [\"score\"], \"smiles_col\": \"SMILES\", **task_kwargs}\n        }\n\n        ds = 
MultitaskFromSmilesDataModule(task_specific_args, featurization_n_jobs=0)\n        ds.prepare_data()\n        ds.setup()\n\n        self.assertEqual(len(ds.train_ds), 20)\n\n        # Test single Parquet files\n        parquet_file = \"tests/data/micro_ZINC_shard_1.parquet\"\n        task_kwargs = {\"df_path\": parquet_file, \"split_val\": 0.0, \"split_test\": 0.0}\n        task_specific_args = {\n            \"task\": {\"task_level\": \"graph\", \"label_cols\": [\"score\"], \"smiles_col\": \"SMILES\", **task_kwargs}\n        }\n\n        ds = MultitaskFromSmilesDataModule(task_specific_args, featurization_n_jobs=0)\n        ds.prepare_data()\n        ds.setup()\n\n        self.assertEqual(len(ds.train_ds), 10)\n\n        # Test multi Parquet files\n        parquet_file = \"tests/data/micro_ZINC_shard_*.parquet\"\n        task_kwargs = {\"df_path\": parquet_file, \"split_val\": 0.0, \"split_test\": 0.0}\n        task_specific_args = {\n            \"task\": {\"task_level\": \"graph\", \"label_cols\": [\"score\"], \"smiles_col\": \"SMILES\", **task_kwargs}\n        }\n\n        ds = MultitaskFromSmilesDataModule(task_specific_args, featurization_n_jobs=0)\n        ds.prepare_data()\n        ds.setup()\n\n        self.assertEqual(len(ds.train_ds), 20)\n\n    def test_splits_file(self):\n        # Test single CSV files\n        csv_file = \"tests/data/micro_ZINC_shard_1.csv\"\n        df = pd.read_csv(csv_file)\n\n        # Split the CSV file with 80/10/10\n        train = 0.8\n        val = 0.1\n        indices = np.arange(len(df))\n        split_train = indices[: int(len(df) * train)]\n        split_val = indices[int(len(df) * train) : int(len(df) * (train + val))]\n        split_test = indices[int(len(df) * (train + val)) :]\n\n        splits = {\"train\": split_train, \"val\": split_val, \"test\": split_test}\n\n        # Test the splitting using `splits` directly as `splits_path`\n        task_kwargs = {\n            \"df_path\": csv_file,\n            
\"splits_path\": splits,\n            \"split_val\": 0.0,\n            \"split_test\": 0.0,\n        }\n        task_specific_args = {\n            \"task\": {\n                \"task_level\": \"graph\",\n                \"label_cols\": [\"score\"],\n                \"smiles_col\": \"SMILES\",\n                **task_kwargs,\n            }\n        }\n\n        ds = MultitaskFromSmilesDataModule(task_specific_args, featurization_n_jobs=0)\n        ds.prepare_data(save_smiles_and_ids=True)\n        ds.setup(save_smiles_and_ids=True)\n\n        self.assertEqual(len(ds.train_ds), len(split_train))\n        self.assertEqual(len(ds.val_ds), len(split_val))\n        self.assertEqual(len(ds.test_ds), len(split_test))\n\n        # Create a TemporaryFile to save the splits, and test the datamodule\n        with tempfile.NamedTemporaryFile(suffix=\".pt\") as temp:\n            # Save the splits\n            torch.save(splits, temp)\n\n            # Test the datamodule\n            task_kwargs = {\n                \"df_path\": csv_file,\n                \"splits_path\": temp.name,\n                \"split_val\": 0.0,\n                \"split_test\": 0.0,\n            }\n            task_specific_args = {\n                \"task\": {\n                    \"task_level\": \"graph\",\n                    \"label_cols\": [\"score\"],\n                    \"smiles_col\": \"SMILES\",\n                    **task_kwargs,\n                }\n            }\n\n            ds2 = MultitaskFromSmilesDataModule(task_specific_args, featurization_n_jobs=0)\n            ds2.prepare_data(save_smiles_and_ids=True)\n            ds2.setup(save_smiles_and_ids=True)\n\n            self.assertEqual(len(ds2.train_ds), len(split_train))\n            self.assertEqual(len(ds2.val_ds), len(split_val))\n            self.assertEqual(len(ds2.test_ds), len(split_test))\n\n            # Check that the splits are the same\n            self.assertEqual(len(ds.train_ds.smiles), len(split_train))\n            
np.testing.assert_array_equal(ds.train_ds.smiles, ds2.train_ds.smiles)\n            np.testing.assert_array_equal(ds.val_ds.smiles, ds2.val_ds.smiles)\n            np.testing.assert_array_equal(ds.test_ds.smiles, ds2.test_ds.smiles)\n\n\nif __name__ == \"__main__\":\n    # `exit=False` so the cache cleanup below is reached (ut.main() would otherwise sys.exit first)\n    ut.main(exit=False)\n\n    # Delete the cache\n    if exists(TEMP_CACHE_DATA_PATH):\n        rm(TEMP_CACHE_DATA_PATH, recursive=True)\n"
  },
  {
    "path": "tests/test_dataset.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nimport unittest as ut\n\nfrom graphium.data import load_micro_zinc\nfrom graphium.data.dataset import SingleTaskDataset, MultitaskDataset\nfrom graphium.data.smiles_transform import smiles_to_unique_mol_ids\nfrom graphium.data.utils import get_keys\n\n\nclass Test_Multitask_Dataset(ut.TestCase):\n    # Then we can choose different rows and columns for the tests as we see fit.\n    # Remember tests are supposed to be FAST, and reading from the file system multiple times slows things down.\n\n    # Make sure that the inputs to single task datasets are always lists!\n    # Do not pass a data frame itself, but turn it into a list to satisfy the type required.\n\n    def test_multitask_dataset_case_1(self):\n        \"\"\"Case: different tasks, all with the same smiles set.\n        - Check that for each task, all smiles are received from the initial DF.\n        - Check that for each task, you have the same label values as the initial DF.\n        \"\"\"\n\n        df_micro_zinc = load_micro_zinc()  # Has about 1000 molecules\n        df = df_micro_zinc.iloc[0:4]\n        num_unique_mols = 4\n\n        # Here we take the microzinc dataset and split the labels up into 'SA', 'logp' and 'score' in order to simulate having multiple single-task datasets\n        df_micro_zinc_SA = df[[\"SMILES\", \"SA\"]]\n        
df_micro_zinc_logp = df[[\"SMILES\", \"logp\"]]\n        df_micro_zinc_score = df[[\"SMILES\", \"score\"]]\n\n        # We need to turn these dataframes into single-task datasets.\n        # We don't need to do featurization yet.\n        ds_micro_zinc_SA = SingleTaskDataset(\n            smiles=df_micro_zinc_SA.loc[:, \"SMILES\"].tolist(), labels=df_micro_zinc_SA.loc[:, \"SA\"].tolist()\n        )\n\n        ds_micro_zinc_logp = SingleTaskDataset(\n            smiles=df_micro_zinc_logp.loc[:, \"SMILES\"].tolist(),\n            labels=df_micro_zinc_logp.loc[:, \"logp\"].tolist(),\n        )\n        ds_micro_zinc_score = SingleTaskDataset(\n            smiles=df_micro_zinc_score.loc[:, \"SMILES\"].tolist(),\n            labels=df_micro_zinc_score.loc[:, \"score\"].tolist(),\n        )\n\n        # Create the multitask dataset\n        datasets_dict = {\"SA\": ds_micro_zinc_SA, \"logp\": ds_micro_zinc_logp, \"score\": ds_micro_zinc_score}\n        multitask_microzinc = MultitaskDataset(\n            datasets_dict, save_smiles_and_ids=True\n        )  # Can optionally have features\n\n        # Check: The number of unique molecules equals the number of datapoints in the multitask dataset.\n        self.assertEqual(num_unique_mols, multitask_microzinc.__len__())\n\n        # Check that for each task, you have the same label values as the initial DF.\n        for idx in range(multitask_microzinc.__len__()):\n            smiles = df[[\"SMILES\"]].iloc[idx].values[0]\n            # label = df[['SA']].iloc[idx]\n            label_SA = ds_micro_zinc_SA.labels[idx]\n            label_logp = ds_micro_zinc_logp.labels[idx]\n            label_score = ds_micro_zinc_score.labels[idx]\n\n            # Search for the mol id in the multitask dataset\n            mol_ids = smiles_to_unique_mol_ids([smiles])\n            mol_id = mol_ids[0]\n            found_idx = -1\n            for i, id in enumerate(multitask_microzinc.mol_ids):\n                if mol_id == id:\n                 
   found_idx = i\n\n            # Compare labels\n            self.assertEqual(label_SA, multitask_microzinc.labels[found_idx][\"SA\"])\n            self.assertEqual(label_logp, multitask_microzinc.labels[found_idx][\"logp\"])\n            self.assertEqual(label_score, multitask_microzinc.labels[found_idx][\"score\"])\n\n    def test_multitask_dataset_case_2(self):\n        \"\"\"Case: Different tasks, but with no intersection in the smiles (each task has a unique set of smiles)\n        - Check that the total dataset has as many smiles as all tasks together\n        - Check that, for each task, only the smiles related to that task have values, and ensure the value is what's expected from the initial DF\n        \"\"\"\n        df = load_micro_zinc()  # Has about 1000 molecules\n\n        # Choose non-overlapping smiles by choosing specific rows from the original dataframe.\n        df_rows_SA = df.iloc[0:200]  # 200 data points\n        df_rows_logp = df.iloc[200:400]  # 200 data points\n        df_rows_score = df.iloc[400:750]  # 350 data points\n        total_data_points = 750\n\n        # Here we split the data according to the task we care about.\n        df_micro_zinc_SA = df_rows_SA[[\"SMILES\", \"SA\"]]\n        df_micro_zinc_logp = df_rows_logp[[\"SMILES\", \"logp\"]]\n        df_micro_zinc_score = df_rows_score[[\"SMILES\", \"score\"]]\n\n        # We need to turn these dataframes into single-task datasets.\n        # We don't need to do featurization yet.\n        ds_micro_zinc_SA = SingleTaskDataset(\n            smiles=df_micro_zinc_SA.loc[:, \"SMILES\"].tolist(), labels=df_micro_zinc_SA.loc[:, \"SA\"].tolist()\n        )\n        ds_micro_zinc_logp = SingleTaskDataset(\n            smiles=df_micro_zinc_logp.loc[:, \"SMILES\"].tolist(),\n            labels=df_micro_zinc_logp.loc[:, \"logp\"].tolist(),\n        )\n        ds_micro_zinc_score = SingleTaskDataset(\n            smiles=df_micro_zinc_score.loc[:, \"SMILES\"].tolist(),\n            
labels=df_micro_zinc_score.loc[:, \"score\"].tolist(),\n        )\n\n        # Create the multitask dataset\n        datasets_dict = {\"SA\": ds_micro_zinc_SA, \"logp\": ds_micro_zinc_logp, \"score\": ds_micro_zinc_score}\n        multitask_microzinc = MultitaskDataset(\n            datasets_dict, save_smiles_and_ids=True\n        )  # Can optionally have features\n\n        # The total dataset has as many molecules as there are smiles in all tasks put together\n        self.assertEqual(total_data_points, multitask_microzinc.__len__())\n\n        # For each task, only the smiles related to that task have values, and the value is what's expected from the initial DF.\n        for idx in range(len(ds_micro_zinc_SA)):\n            smiles = df[[\"SMILES\"]].iloc[idx].values[0]\n\n            task = \"task\"\n            if idx in range(0, 200):\n                task = \"SA\"\n            elif idx in range(200, 400):\n                task = \"logp\"\n            elif idx in range(400, 750):\n                task = \"score\"\n\n            # Labels of that molecule\n            label_SA = df[[\"SA\"]].iloc[idx].values[0]\n            label_logp = df[[\"logp\"]].iloc[idx].values[0]\n            label_score = df[[\"score\"]].iloc[idx].values[0]\n\n            # Search for that molecule in the multitask dataset\n            mol_ids = smiles_to_unique_mol_ids([smiles])\n            mol_id = mol_ids[0]\n            found_idx = -1\n            for i, id in enumerate(multitask_microzinc.mol_ids):\n                if mol_id == id:\n                    found_idx = i\n            multitask_microzinc_labels = get_keys(multitask_microzinc.labels[found_idx])\n            if task == \"SA\":\n                self.assertEqual(label_SA, multitask_microzinc.labels[found_idx][\"SA\"])\n                self.assertFalse(\"score\" in multitask_microzinc_labels)\n                self.assertFalse(\"logp\" in multitask_microzinc_labels)\n            elif task == \"logp\":\n                
self.assertEqual(label_logp, multitask_microzinc.labels[found_idx][\"logp\"])\n                self.assertFalse(\"score\" in multitask_microzinc_labels)\n                self.assertFalse(\"SA\" in multitask_microzinc_labels)\n            elif task == \"score\":\n                self.assertEqual(label_score, multitask_microzinc.labels[found_idx][\"score\"])\n                self.assertFalse(\"SA\" in multitask_microzinc_labels)\n                self.assertFalse(\"logp\" in multitask_microzinc_labels)\n\n    def test_multitask_dataset_case_3(self):\n        \"\"\"Case: Different tasks, but with semi-intersection (some smiles unique per task, some intersect)\n        - Check that the total dataset has as many smiles as the number of unique smiles.\n        - Check that for each task, you retrieve the same smiles as expected from the initial DF\n        \"\"\"\n        df_micro_zinc = load_micro_zinc()  # Has about 1000 molecules\n        df = df_micro_zinc.iloc[0:5]\n\n        # Choose OVERLAPPING smiles by choosing specific rows from the original dataframe. 
The tasks will not necessarily have unique smiles.\n        df_rows_SA = df.iloc[0:3]\n        df_rows_logp = df.iloc[1:4]\n        df_rows_score = df.iloc[3:5]\n        total_data_points = 5\n\n        # Here we split the data according to the task we care about.\n        df_micro_zinc_SA = df_rows_SA[[\"SMILES\", \"SA\"]]\n        df_micro_zinc_logp = df_rows_logp[[\"SMILES\", \"logp\"]]\n        df_micro_zinc_score = df_rows_score[[\"SMILES\", \"score\"]]\n\n        # We need to turn these dataframes into single-task datasets.\n        # We don't need to do featurization yet.\n        ds_micro_zinc_SA = SingleTaskDataset(\n            smiles=df_micro_zinc_SA.loc[:, \"SMILES\"].tolist(), labels=df_micro_zinc_SA.loc[:, \"SA\"].tolist()\n        )\n        ds_micro_zinc_logp = SingleTaskDataset(\n            smiles=df_micro_zinc_logp.loc[:, \"SMILES\"].tolist(),\n            labels=df_micro_zinc_logp.loc[:, \"logp\"].tolist(),\n        )\n        ds_micro_zinc_score = SingleTaskDataset(\n            smiles=df_micro_zinc_score.loc[:, \"SMILES\"].tolist(),\n            labels=df_micro_zinc_score.loc[:, \"score\"].tolist(),\n        )\n\n        # Create the multitask dataset\n        datasets_dict = {\"SA\": ds_micro_zinc_SA, \"logp\": ds_micro_zinc_logp, \"score\": ds_micro_zinc_score}\n        multitask_microzinc = MultitaskDataset(\n            datasets_dict, save_smiles_and_ids=True\n        )  # Can optionally have features\n\n        # The multitask dataset has as many molecules as there are unique smiles across the single task datasets.\n        self.assertEqual(total_data_points, multitask_microzinc.__len__())\n\n\nif __name__ == \"__main__\":\n    ut.main()\n"
  },
  {
    "path": "tests/test_ensemble_layers.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\n\"\"\"\nUnit tests for the different layers of graphium/nn/ensemble_layers\n\"\"\"\n\nimport numpy as np\nimport torch\nfrom torch.nn import Linear\nimport unittest as ut\n\nfrom graphium.nn.base_layers import FCLayer, MLP, MuReadoutGraphium\nfrom graphium.nn.ensemble_layers import (\n    EnsembleLinear,\n    EnsembleFCLayer,\n    EnsembleMLP,\n    EnsembleMuReadoutGraphium,\n)\nfrom graphium.nn.architectures import FeedForwardNN, EnsembleFeedForwardNN\n\n\nclass test_Ensemble_Layers(ut.TestCase):\n    def check_ensemble_linear(\n        self,\n        in_dim: int,\n        out_dim: int,\n        num_ensemble: int,\n        batch_size: int,\n        more_batch_dim: int,\n        use_mureadout=False,\n    ):\n        msg = f\"Testing EnsembleLinear with in_dim={in_dim}, out_dim={out_dim}, num_ensemble={num_ensemble}, batch_size={batch_size}, more_batch_dim={more_batch_dim}\"\n\n        if use_mureadout:\n            # Create EnsembleMuReadoutGraphium instance\n            ensemble_linear = EnsembleMuReadoutGraphium(in_dim, out_dim, num_ensemble)\n            # Create equivalent separate Linear layers with synchronized weights and biases\n            linear_layers = [MuReadoutGraphium(in_dim, out_dim) for _ in range(num_ensemble)]\n        else:\n            # Create EnsembleLinear instance\n            
ensemble_linear = EnsembleLinear(in_dim, out_dim, num_ensemble)\n            # Create equivalent separate Linear layers with synchronized weights and biases\n            linear_layers = [Linear(in_dim, out_dim) for _ in range(num_ensemble)]\n\n        for i, linear_layer in enumerate(linear_layers):\n            linear_layer.weight.data = ensemble_linear.weight.data[i]\n            if ensemble_linear.bias is not None:\n                linear_layer.bias.data = ensemble_linear.bias.data[i].squeeze()\n\n        # Test with a sample input\n        input_tensor = torch.randn(batch_size, in_dim)\n        ensemble_output = ensemble_linear(input_tensor)\n\n        # Check for the output shape\n        self.assertEqual(ensemble_output.shape, (num_ensemble, batch_size, out_dim), msg=msg)\n\n        # Make sure that the outputs of the individual layers are the same as the ensemble output\n        for i, linear_layer in enumerate(linear_layers):\n            individual_output = linear_layer(input_tensor)\n            individual_output = individual_output.detach().numpy()\n            ensemble_output_i = ensemble_output[i].detach().numpy()\n            np.testing.assert_allclose(ensemble_output_i, individual_output, atol=1e-5, err_msg=msg)\n\n        # Test with a sample input with the extra `num_ensemble` and `more_batch_dim` dimension\n        if more_batch_dim:\n            out_shape = (more_batch_dim, num_ensemble, batch_size, out_dim)\n            input_tensor = torch.randn(more_batch_dim, num_ensemble, batch_size, in_dim)\n        else:\n            out_shape = (num_ensemble, batch_size, out_dim)\n            input_tensor = torch.randn(num_ensemble, batch_size, in_dim)\n        ensemble_output = ensemble_linear(input_tensor)\n\n        # Check for the output shape\n        self.assertEqual(ensemble_output.shape, out_shape, msg=msg)\n\n        # Make sure that the outputs of the individual layers are the same as the ensemble output\n        for i, linear_layer in 
enumerate(linear_layers):\n            if more_batch_dim:\n                individual_output = linear_layer(input_tensor[:, i])\n                ensemble_output_i = ensemble_output[:, i]\n            else:\n                individual_output = linear_layer(input_tensor[i])\n                ensemble_output_i = ensemble_output[i]\n            individual_output = individual_output.detach().numpy()\n            ensemble_output_i = ensemble_output_i.detach().numpy()\n            np.testing.assert_allclose(ensemble_output_i, individual_output, atol=1e-5, err_msg=msg)\n\n    def test_ensemble_linear(self):\n        # more_batch_dim=0\n        self.check_ensemble_linear(in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=0)\n        self.check_ensemble_linear(in_dim=11, out_dim=5, num_ensemble=3, batch_size=1, more_batch_dim=0)\n        self.check_ensemble_linear(in_dim=11, out_dim=5, num_ensemble=1, batch_size=13, more_batch_dim=0)\n\n        # more_batch_dim=1\n        self.check_ensemble_linear(in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=1)\n        self.check_ensemble_linear(in_dim=11, out_dim=5, num_ensemble=3, batch_size=1, more_batch_dim=1)\n        self.check_ensemble_linear(in_dim=11, out_dim=5, num_ensemble=1, batch_size=13, more_batch_dim=1)\n\n        # more_batch_dim=7\n        self.check_ensemble_linear(in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=7)\n        self.check_ensemble_linear(in_dim=11, out_dim=5, num_ensemble=3, batch_size=1, more_batch_dim=7)\n        self.check_ensemble_linear(in_dim=11, out_dim=5, num_ensemble=1, batch_size=13, more_batch_dim=7)\n\n    def test_ensemble_mureadout_graphium(self):\n        # Test `use_mureadout`\n        # more_batch_dim=0\n        self.check_ensemble_linear(\n            in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=0, use_mureadout=True\n        )\n        self.check_ensemble_linear(\n            in_dim=11, out_dim=5, 
num_ensemble=3, batch_size=1, more_batch_dim=0, use_mureadout=True\n        )\n        self.check_ensemble_linear(\n            in_dim=11, out_dim=5, num_ensemble=1, batch_size=13, more_batch_dim=0, use_mureadout=True\n        )\n\n        # more_batch_dim=1\n        self.check_ensemble_linear(\n            in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=1, use_mureadout=True\n        )\n        self.check_ensemble_linear(\n            in_dim=11, out_dim=5, num_ensemble=3, batch_size=1, more_batch_dim=1, use_mureadout=True\n        )\n        self.check_ensemble_linear(\n            in_dim=11, out_dim=5, num_ensemble=1, batch_size=13, more_batch_dim=1, use_mureadout=True\n        )\n\n        # more_batch_dim=7\n        self.check_ensemble_linear(\n            in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=7, use_mureadout=True\n        )\n        self.check_ensemble_linear(\n            in_dim=11, out_dim=5, num_ensemble=3, batch_size=1, more_batch_dim=7, use_mureadout=True\n        )\n        self.check_ensemble_linear(\n            in_dim=11, out_dim=5, num_ensemble=1, batch_size=13, more_batch_dim=7, use_mureadout=True\n        )\n\n    def check_ensemble_fclayer(\n        self,\n        in_dim: int,\n        out_dim: int,\n        num_ensemble: int,\n        batch_size: int,\n        more_batch_dim: int,\n        is_readout_layer=False,\n    ):\n        msg = f\"Testing EnsembleFCLayer with in_dim={in_dim}, out_dim={out_dim}, num_ensemble={num_ensemble}, batch_size={batch_size}, more_batch_dim={more_batch_dim}\"\n\n        # Create EnsembleFCLayer instance\n        ensemble_fclayer = EnsembleFCLayer(in_dim, out_dim, num_ensemble, is_readout_layer=is_readout_layer)\n\n        # Create equivalent separate FCLayer layers with synchronized weights and biases\n        fc_layers = [FCLayer(in_dim, out_dim, is_readout_layer=is_readout_layer) for _ in range(num_ensemble)]\n        for i, fc_layer in enumerate(fc_layers):\n      
      fc_layer.linear.weight.data = ensemble_fclayer.linear.weight.data[i]\n            if ensemble_fclayer.bias is not None:\n                fc_layer.linear.bias.data = ensemble_fclayer.linear.bias.data[i].squeeze()\n\n        # Test with a sample input\n        input_tensor = torch.randn(batch_size, in_dim)\n        ensemble_output = ensemble_fclayer(input_tensor)\n\n        # Check for the output shape\n        self.assertEqual(ensemble_output.shape, (num_ensemble, batch_size, out_dim), msg=msg)\n\n        # Make sure that the outputs of the individual layers are the same as the ensemble output\n        for i, fc_layer in enumerate(fc_layers):\n            individual_output = fc_layer(input_tensor)\n            individual_output = individual_output.detach().numpy()\n            ensemble_output_i = ensemble_output[i].detach().numpy()\n            np.testing.assert_allclose(ensemble_output_i, individual_output, atol=1e-5, err_msg=msg)\n\n        # Test with a sample input with the extra `num_ensemble` and `more_batch_dim` dimension\n        if more_batch_dim:\n            out_shape = (more_batch_dim, num_ensemble, batch_size, out_dim)\n            input_tensor = torch.randn(more_batch_dim, num_ensemble, batch_size, in_dim)\n        else:\n            out_shape = (num_ensemble, batch_size, out_dim)\n            input_tensor = torch.randn(num_ensemble, batch_size, in_dim)\n        ensemble_output = ensemble_fclayer(input_tensor)\n\n        # Check for the output shape\n        self.assertEqual(ensemble_output.shape, out_shape, msg=msg)\n\n        # Make sure that the outputs of the individual layers are the same as the ensemble output\n        for i, fc_layer in enumerate(fc_layers):\n            if more_batch_dim:\n                individual_output = fc_layer(input_tensor[:, i])\n                ensemble_output_i = ensemble_output[:, i]\n            else:\n                individual_output = fc_layer(input_tensor[i])\n                ensemble_output_i = 
ensemble_output[i]\n            individual_output = individual_output.detach().numpy()\n            ensemble_output_i = ensemble_output_i.detach().numpy()\n            np.testing.assert_allclose(ensemble_output_i, individual_output, atol=1e-5, err_msg=msg)\n\n    def test_ensemble_fclayer(self):\n        # more_batch_dim=0\n        self.check_ensemble_fclayer(in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=0)\n        self.check_ensemble_fclayer(in_dim=11, out_dim=5, num_ensemble=3, batch_size=1, more_batch_dim=0)\n        self.check_ensemble_fclayer(in_dim=11, out_dim=5, num_ensemble=1, batch_size=13, more_batch_dim=0)\n\n        # more_batch_dim=1\n        self.check_ensemble_fclayer(in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=1)\n        self.check_ensemble_fclayer(in_dim=11, out_dim=5, num_ensemble=3, batch_size=1, more_batch_dim=1)\n        self.check_ensemble_fclayer(in_dim=11, out_dim=5, num_ensemble=1, batch_size=13, more_batch_dim=1)\n\n        # more_batch_dim=7\n        self.check_ensemble_fclayer(in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=7)\n        self.check_ensemble_fclayer(in_dim=11, out_dim=5, num_ensemble=3, batch_size=1, more_batch_dim=7)\n        self.check_ensemble_fclayer(in_dim=11, out_dim=5, num_ensemble=1, batch_size=13, more_batch_dim=7)\n\n        # Test `is_readout_layer`\n        self.check_ensemble_fclayer(\n            in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=0, is_readout_layer=True\n        )\n        self.check_ensemble_fclayer(\n            in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=1, is_readout_layer=True\n        )\n        self.check_ensemble_fclayer(\n            in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=7, is_readout_layer=True\n        )\n\n    def check_ensemble_mlp(\n        self,\n        in_dim: int,\n        out_dim: int,\n        num_ensemble: int,\n        batch_size: 
int,\n        more_batch_dim: int,\n        last_layer_is_readout=False,\n    ):\n        msg = f\"Testing EnsembleMLP with in_dim={in_dim}, out_dim={out_dim}, num_ensemble={num_ensemble}, batch_size={batch_size}, more_batch_dim={more_batch_dim}\"\n\n        # Create EnsembleMLP instance\n        hidden_dims = [17, 17, 17]\n        ensemble_mlp = EnsembleMLP(\n            in_dim, hidden_dims, out_dim, num_ensemble, last_layer_is_readout=last_layer_is_readout\n        )\n\n        # Create equivalent separate MLP layers with synchronized weights and biases\n        mlps = [\n            MLP(in_dim, hidden_dims, out_dim, last_layer_is_readout=last_layer_is_readout)\n            for _ in range(num_ensemble)\n        ]\n        for i, mlp in enumerate(mlps):\n            for j, layer in enumerate(mlp.fully_connected):\n                layer.linear.weight.data = ensemble_mlp.fully_connected[j].linear.weight.data[i]\n                if layer.bias is not None:\n                    layer.linear.bias.data = ensemble_mlp.fully_connected[j].linear.bias.data[i].squeeze()\n\n        # Test with a sample input\n        input_tensor = torch.randn(batch_size, in_dim)\n        ensemble_output = ensemble_mlp(input_tensor)\n\n        # Check for the output shape\n        self.assertEqual(ensemble_output.shape, (num_ensemble, batch_size, out_dim), msg=msg)\n\n        # Make sure that the outputs of the individual layers are the same as the ensemble output\n        for i, mlp in enumerate(mlps):\n            individual_output = mlp(input_tensor)\n            individual_output = individual_output.detach().numpy()\n            ensemble_output_i = ensemble_output[i].detach().numpy()\n            np.testing.assert_allclose(ensemble_output_i, individual_output, atol=1e-5, err_msg=msg)\n\n        # Test with a sample input with the extra `num_ensemble` and `more_batch_dim` dimension\n        if more_batch_dim:\n            out_shape = (more_batch_dim, num_ensemble, batch_size, out_dim)\n     
       input_tensor = torch.randn(more_batch_dim, num_ensemble, batch_size, in_dim)\n        else:\n            out_shape = (num_ensemble, batch_size, out_dim)\n            input_tensor = torch.randn(num_ensemble, batch_size, in_dim)\n        ensemble_output = ensemble_mlp(input_tensor)\n\n        # Check for the output shape\n        self.assertEqual(ensemble_output.shape, out_shape, msg=msg)\n\n        # Make sure that the outputs of the individual layers are the same as the ensemble output\n        for i, mlp in enumerate(mlps):\n            if more_batch_dim:\n                individual_output = mlp(input_tensor[:, i])\n                ensemble_output_i = ensemble_output[:, i]\n            else:\n                individual_output = mlp(input_tensor[i])\n                ensemble_output_i = ensemble_output[i]\n            individual_output = individual_output.detach().numpy()\n            ensemble_output_i = ensemble_output_i.detach().numpy()\n            np.testing.assert_allclose(ensemble_output_i, individual_output, atol=1e-5, err_msg=msg)\n\n    def test_ensemble_mlp(self):\n        # more_batch_dim=0\n        self.check_ensemble_mlp(in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=0)\n        self.check_ensemble_mlp(in_dim=11, out_dim=5, num_ensemble=3, batch_size=1, more_batch_dim=0)\n        self.check_ensemble_mlp(in_dim=11, out_dim=5, num_ensemble=1, batch_size=13, more_batch_dim=0)\n\n        # more_batch_dim=1\n        self.check_ensemble_mlp(in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=1)\n        self.check_ensemble_mlp(in_dim=11, out_dim=5, num_ensemble=3, batch_size=1, more_batch_dim=1)\n        self.check_ensemble_mlp(in_dim=11, out_dim=5, num_ensemble=1, batch_size=13, more_batch_dim=1)\n\n        # more_batch_dim=7\n        self.check_ensemble_mlp(in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=7)\n        self.check_ensemble_mlp(in_dim=11, out_dim=5, num_ensemble=3, batch_size=1, 
more_batch_dim=7)\n        self.check_ensemble_mlp(in_dim=11, out_dim=5, num_ensemble=1, batch_size=13, more_batch_dim=7)\n\n        # Test `last_layer_is_readout`\n        self.check_ensemble_mlp(\n            in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=0, last_layer_is_readout=True\n        )\n        self.check_ensemble_mlp(\n            in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=1, last_layer_is_readout=True\n        )\n        self.check_ensemble_mlp(\n            in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=7, last_layer_is_readout=True\n        )\n\n    def check_ensemble_feedforwardnn(\n        self,\n        in_dim: int,\n        out_dim: int,\n        num_ensemble: int,\n        batch_size: int,\n        more_batch_dim: int,\n        last_layer_is_readout=False,\n    ):\n        msg = f\"Testing EnsembleFeedForwardNN with in_dim={in_dim}, out_dim={out_dim}, num_ensemble={num_ensemble}, batch_size={batch_size}, more_batch_dim={more_batch_dim}\"\n\n        # Create EnsembleFeedForwardNN instance\n        hidden_dims = [17, 17, 17]\n        ensemble_mlp = EnsembleFeedForwardNN(\n            in_dim,\n            out_dim,\n            hidden_dims,\n            num_ensemble,\n            reduction=None,\n            last_layer_is_readout=last_layer_is_readout,\n        )\n\n        # Create equivalent separate MLP layers with synchronized weights and biases\n        mlps = [\n            FeedForwardNN(in_dim, out_dim, hidden_dims, last_layer_is_readout=last_layer_is_readout)\n            for _ in range(num_ensemble)\n        ]\n        for i, mlp in enumerate(mlps):\n            for j, layer in enumerate(mlp.layers):\n                layer.linear.weight.data = ensemble_mlp.layers[j].linear.weight.data[i]\n                if layer.bias is not None:\n                    layer.linear.bias.data = ensemble_mlp.layers[j].linear.bias.data[i].squeeze()\n\n        # Test with a sample input\n        
input_tensor = torch.randn(batch_size, in_dim)\n        ensemble_output = ensemble_mlp(input_tensor)\n\n        # Check for the output shape\n        self.assertEqual(ensemble_output.shape, (num_ensemble, batch_size, out_dim), msg=msg)\n\n        # Make sure that the outputs of the individual layers are the same as the ensemble output\n        individual_outputs = []\n        for i, mlp in enumerate(mlps):\n            individual_outputs.append(mlp(input_tensor))\n        individual_outputs = torch.stack(individual_outputs).detach().numpy()\n        for i, mlp in enumerate(mlps):\n            ensemble_output_i = ensemble_output[i].detach().numpy()\n            np.testing.assert_allclose(\n                ensemble_output_i, individual_outputs[..., i, :, :], atol=1e-5, err_msg=msg\n            )\n\n        # Test with a sample input with the extra `num_ensemble` and `more_batch_dim` dimension\n        if more_batch_dim:\n            out_shape = (more_batch_dim, num_ensemble, batch_size, out_dim)\n            input_tensor = torch.randn(more_batch_dim, num_ensemble, batch_size, in_dim)\n        else:\n            out_shape = (num_ensemble, batch_size, out_dim)\n            input_tensor = torch.randn(num_ensemble, batch_size, in_dim)\n        ensemble_output = ensemble_mlp(input_tensor)\n\n        # Check for the output shape\n        self.assertEqual(ensemble_output.shape, out_shape, msg=msg)\n\n        # Make sure that the outputs of the individual layers are the same as the ensemble output\n        for i, mlp in enumerate(mlps):\n            if more_batch_dim:\n                individual_output = mlp(input_tensor[:, i])\n                ensemble_output_i = ensemble_output[:, i]\n            else:\n                individual_output = mlp(input_tensor[i])\n                ensemble_output_i = ensemble_output[i]\n            individual_output = individual_output.detach().numpy()\n            ensemble_output_i = ensemble_output_i.detach().numpy()\n            
np.testing.assert_allclose(ensemble_output_i, individual_output, atol=1e-5, err_msg=msg)\n\n    def check_ensemble_feedforwardnn_mean(\n        self,\n        in_dim: int,\n        out_dim: int,\n        num_ensemble: int,\n        batch_size: int,\n        more_batch_dim: int,\n        last_layer_is_readout=False,\n    ):\n        msg = f\"Testing EnsembleFeedForwardNN with in_dim={in_dim}, out_dim={out_dim}, num_ensemble={num_ensemble}, batch_size={batch_size}, more_batch_dim={more_batch_dim}\"\n\n        # Create EnsembleFeedForwardNN instance\n        hidden_dims = [17, 17, 17]\n        ensemble_mlp = EnsembleFeedForwardNN(\n            in_dim,\n            out_dim,\n            hidden_dims,\n            num_ensemble,\n            reduction=\"mean\",\n            last_layer_is_readout=last_layer_is_readout,\n        )\n\n        # Create equivalent separate MLP layers with synchronized weights and biases\n        mlps = [\n            FeedForwardNN(in_dim, out_dim, hidden_dims, last_layer_is_readout=last_layer_is_readout)\n            for _ in range(num_ensemble)\n        ]\n        for i, mlp in enumerate(mlps):\n            for j, layer in enumerate(mlp.layers):\n                layer.linear.weight.data = ensemble_mlp.layers[j].linear.weight.data[i]\n                if layer.bias is not None:\n                    layer.linear.bias.data = ensemble_mlp.layers[j].linear.bias.data[i].squeeze()\n\n        # Test with a sample input\n        input_tensor = torch.randn(batch_size, in_dim)\n        ensemble_output = ensemble_mlp(input_tensor)\n\n        # Check for the output shape\n        self.assertEqual(ensemble_output.shape, (batch_size, out_dim), msg=msg)\n\n        # Make sure that the outputs of the individual layers are the same as the ensemble output\n        individual_outputs = []\n        for i, mlp in enumerate(mlps):\n            individual_outputs.append(mlp(input_tensor))\n        individual_outputs = torch.stack(individual_outputs, dim=-3)\n        
individual_outputs = individual_outputs.mean(dim=-3).detach().numpy()\n        np.testing.assert_allclose(\n            ensemble_output.detach().numpy(), individual_outputs, atol=1e-5, err_msg=msg\n        )\n\n        # Test with a sample input with the extra `num_ensemble` and `more_batch_dim` dimension\n        if more_batch_dim:\n            out_shape = (more_batch_dim, batch_size, out_dim)\n            input_tensor = torch.randn(more_batch_dim, num_ensemble, batch_size, in_dim)\n        else:\n            out_shape = (batch_size, out_dim)\n            input_tensor = torch.randn(num_ensemble, batch_size, in_dim)\n        ensemble_output = ensemble_mlp(input_tensor).detach()\n\n        # Check for the output shape\n        self.assertEqual(ensemble_output.shape, out_shape, msg=msg)\n\n        # Make sure that the outputs of the individual layers are the same as the ensemble output\n        individual_outputs = []\n        for i, mlp in enumerate(mlps):\n            if more_batch_dim:\n                individual_outputs.append(mlp(input_tensor[:, i]))\n            else:\n                individual_outputs.append(mlp(input_tensor[i]))\n        individual_output = torch.stack(individual_outputs, dim=-3).mean(dim=-3).detach().numpy()\n        np.testing.assert_allclose(ensemble_output, individual_output, atol=1e-5, err_msg=msg)\n\n    def check_ensemble_feedforwardnn_simple(\n        self,\n        in_dim: int,\n        out_dim: int,\n        num_ensemble: int,\n        batch_size: int,\n        more_batch_dim: int,\n        last_layer_is_readout=False,\n        **kwargs,\n    ):\n        msg = f\"Testing EnsembleFeedForwardNN with in_dim={in_dim}, out_dim={out_dim}, num_ensemble={num_ensemble}, batch_size={batch_size}, more_batch_dim={more_batch_dim}\"\n\n        # Create EnsembleFeedForwardNN instance\n        hidden_dims = [17, 17, 17]\n        ensemble_mlp = EnsembleFeedForwardNN(\n            in_dim,\n            out_dim,\n            hidden_dims,\n            
num_ensemble,\n            reduction=None,\n            last_layer_is_readout=last_layer_is_readout,\n            **kwargs,\n        )\n\n        # Test with a sample input\n        input_tensor = torch.randn(batch_size, in_dim)\n        ensemble_output = ensemble_mlp(input_tensor)\n\n        # Check for the output shape\n        self.assertEqual(ensemble_output.shape, (num_ensemble, batch_size, out_dim), msg=msg)\n\n    def test_ensemble_feedforwardnn(self):\n        # more_batch_dim=0\n        self.check_ensemble_feedforwardnn(\n            in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=0\n        )\n        self.check_ensemble_feedforwardnn(\n            in_dim=11, out_dim=5, num_ensemble=3, batch_size=1, more_batch_dim=0\n        )\n        self.check_ensemble_feedforwardnn(\n            in_dim=11, out_dim=5, num_ensemble=1, batch_size=13, more_batch_dim=0\n        )\n\n        # more_batch_dim=1\n        self.check_ensemble_feedforwardnn(\n            in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=1\n        )\n        self.check_ensemble_feedforwardnn(\n            in_dim=11, out_dim=5, num_ensemble=3, batch_size=1, more_batch_dim=1\n        )\n        self.check_ensemble_feedforwardnn(\n            in_dim=11, out_dim=5, num_ensemble=1, batch_size=13, more_batch_dim=1\n        )\n\n        # more_batch_dim=7\n        self.check_ensemble_feedforwardnn(\n            in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=7\n        )\n        self.check_ensemble_feedforwardnn(\n            in_dim=11, out_dim=5, num_ensemble=3, batch_size=1, more_batch_dim=7\n        )\n        self.check_ensemble_feedforwardnn(\n            in_dim=11, out_dim=5, num_ensemble=1, batch_size=13, more_batch_dim=7\n        )\n\n        # Test `last_layer_is_readout`\n        self.check_ensemble_feedforwardnn(\n            in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=0, last_layer_is_readout=True\n        
)\n        self.check_ensemble_feedforwardnn(\n            in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=1, last_layer_is_readout=True\n        )\n        self.check_ensemble_feedforwardnn(\n            in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=7, last_layer_is_readout=True\n        )\n\n        # Test `reduction`\n        self.check_ensemble_feedforwardnn_mean(\n            in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=0, last_layer_is_readout=True\n        )\n        self.check_ensemble_feedforwardnn_mean(\n            in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=1, last_layer_is_readout=True\n        )\n        self.check_ensemble_feedforwardnn_mean(\n            in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=7, last_layer_is_readout=True\n        )\n\n        # Test `subset_in_dim`\n        self.check_ensemble_feedforwardnn_simple(\n            in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=0, subset_in_dim=0.5\n        )\n        self.check_ensemble_feedforwardnn_simple(\n            in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=1, subset_in_dim=0.5\n        )\n        self.check_ensemble_feedforwardnn_simple(\n            in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=7, subset_in_dim=0.5\n        )\n        self.check_ensemble_feedforwardnn_simple(\n            in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=0, subset_in_dim=7\n        )\n        self.check_ensemble_feedforwardnn_simple(\n            in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=1, subset_in_dim=7\n        )\n        self.check_ensemble_feedforwardnn_simple(\n            in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=7, subset_in_dim=7\n        )\n        with self.assertRaises(AssertionError):\n            self.check_ensemble_feedforwardnn_simple(\n      
          in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=0, subset_in_dim=1.5\n            )\n        with self.assertRaises(AssertionError):\n            self.check_ensemble_feedforwardnn_simple(\n                in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=1, subset_in_dim=39\n            )\n        with self.assertRaises(AssertionError):\n            self.check_ensemble_feedforwardnn_simple(\n                in_dim=11, out_dim=5, num_ensemble=3, batch_size=13, more_batch_dim=7, subset_in_dim=39\n            )\n\n\nif __name__ == \"__main__\":\n    ut.main()\n"
  },
  {
    "path": "tests/test_featurizer.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\n\"\"\"\nUnit tests for the different datasets of graphium/features/featurizer.py\n\"\"\"\n\nimport numpy as np\nimport unittest as ut\nfrom copy import deepcopy\nfrom rdkit import Chem\nimport datamol as dm\n\nfrom graphium.features.featurizer import (\n    get_mol_atomic_features_onehot,\n    get_mol_atomic_features_float,\n    get_mol_edge_features,\n    mol_to_adj_and_features,\n    mol_to_pyggraph,\n)\n\n\nclass test_featurizer(ut.TestCase):\n    smiles = [\n        \"C\",\n        \"CC\",\n        \"C1(C[N]CCC1)=O\",\n        \"CC(C)CC1=CC=C(C=C1)C(C)C(=O)O\",\n        \"OCCc1c(C)[n+](cs1)Cc2cnc(C)nc2N\",\n        \"O1C=C[C@H]([C@H]1O2)c3c2cc(OC)c4c3OC(=O)C5=C4CCC(=O)5\",\n        \"CC1(C2C1C(N(C2)C(=O)C(C(C)=B)NC(=O)C(F)(F)F)C(=O)NC(C(C3CCNC3=O)[Cl])C#N)C\",\n    ]\n\n    smiles_noble = [\"[He].[He]\", \"[He][He]\", \"[Kr][Kr]\"]\n\n    atomic_onehot_props = [\n        \"atomic-number\",\n        \"valence\",\n        \"degree\",\n        \"implicit-valence\",\n        \"hybridization\",\n        \"chirality\",\n        \"phase\",\n        \"type\",\n        \"group\",\n        \"period\",\n    ]\n\n    atomic_float_props = [\n        \"atomic-number\",\n        \"mass\",\n        \"valence\",\n        \"implicit-valence\",\n        \"hybridization\",\n        \"chirality\",\n        \"aromatic\",\n      
  \"in-ring\",\n        \"min-ring\",\n        \"max-ring\",\n        \"num-ring\",\n        \"degree\",\n        \"radical-electron\",\n        \"formal-charge\",\n        \"vdw-radius\",\n        \"covalent-radius\",\n        \"electronegativity\",\n        \"ionization\",\n        \"melting-point\",\n        \"metal\",\n        \"single-bond\",\n        \"aromatic-bond\",\n        \"double-bond\",\n        \"triple-bond\",\n        \"is-carbon\",\n        \"group\",\n        \"period\",\n    ]\n\n    edge_props = [\n        \"bond-type-onehot\",\n        \"bond-type-float\",\n        \"stereo\",\n        \"in-ring\",\n        \"conjugated\",\n        \"estimated-bond-length\",\n        \"conformer-bond-length\",\n    ]\n\n    def test_get_mol_atomic_features_onehot(self):\n        props = deepcopy(self.atomic_onehot_props)\n        bad_props = [\"bob\"]\n\n        all_smiles = self.smiles + self.smiles_noble\n\n        for s in all_smiles:\n            err_msg = f\"\\n\\tError for params:\\n\\t\\tSMILES: {s}\"\n            mol = dm.to_mol(s)\n\n            for ii in range(len(props)):\n                this_props = props[:ii]\n                err_msg2 = err_msg + f\"\\n\\t\\tprops: {this_props}\"\n                prop_dict = get_mol_atomic_features_onehot(mol, property_list=this_props)\n                self.assertListEqual(list(prop_dict.keys()), this_props, msg=err_msg)\n                for key, val in prop_dict.items():\n                    err_msg3 = err_msg2 + f\"\\n\\t\\tkey: {key}\"\n                    self.assertEqual(val.shape[0], mol.GetNumAtoms(), msg=err_msg3)\n                    self.assertGreater(val.shape[1], 1, msg=err_msg3)\n                    self.assertTrue(np.all((val == 0) | (val == 1)), msg=err_msg3)\n\n            with self.assertRaises(ValueError, msg=err_msg):\n                get_mol_atomic_features_onehot(mol, property_list=bad_props)\n\n    def test_get_mol_atomic_features_float(self):\n        props = 
deepcopy(self.atomic_float_props)\n\n        bad_props = [\"bob\"]\n\n        all_smiles = self.smiles + self.smiles_noble\n        for s in all_smiles:\n            err_msg = f\"\\n\\tError for params:\\n\\t\\tSMILES: {s}\"\n            mol = dm.to_mol(s)\n\n            for ii in range(len(props)):\n                this_props = props[:ii]\n                err_msg2 = err_msg + f\"\\n\\t\\tprops: {this_props}\"\n                prop_dict = get_mol_atomic_features_float(mol, property_list=this_props, mask_nan=None)\n                self.assertListEqual(list(prop_dict.keys()), this_props, msg=err_msg)\n                for key, val in prop_dict.items():\n                    err_msg3 = err_msg2 + f\"\\n\\t\\tkey: {key}\"\n                    self.assertListEqual(list(val.shape), [mol.GetNumAtoms()], msg=err_msg3)\n\n            with self.assertRaises(ValueError, msg=err_msg):\n                get_mol_atomic_features_float(mol, property_list=bad_props)\n\n    def test_get_mol_atomic_features_float_nan_mask(self):\n        for s in self.smiles_noble:\n            mol = dm.to_mol(s)\n\n            # Nothing happens when `mask_nan = None`, NaNs are still in the property array\n            prop_dict = get_mol_atomic_features_float(\n                mol, property_list=self.atomic_float_props, mask_nan=None\n            )\n            prop_array = np.concatenate(list(prop_dict.values()), axis=0)\n            nans = np.isnan(prop_array)\n\n            # Capture a raised error when `mask_nan = \"raise\"`\n            with self.assertRaises(ValueError):\n                prop_dict = get_mol_atomic_features_float(\n                    mol, property_list=self.atomic_float_props, mask_nan=\"raise\"\n                )\n\n            # Not sure how to capture a logged warning when `mask_nan = \"warn\"`\n            # Here, we test behaviour similar to `mask_nan = None`\n            prop_dict = get_mol_atomic_features_float(\n                mol, 
property_list=self.atomic_float_props, mask_nan=\"warn\"\n            )\n            prop_array = np.concatenate(list(prop_dict.values()), axis=0)\n            self.assertEqual(len(self.atomic_float_props), len(prop_dict))\n            self.assertTrue(any(np.isnan(prop_array)))\n\n            # NaNs are replaced by `42` when `mask_nan=42`\n            prop_dict = get_mol_atomic_features_float(mol, property_list=self.atomic_float_props, mask_nan=42)\n            prop_array = np.concatenate(list(prop_dict.values()), axis=0)\n            self.assertEqual(len(self.atomic_float_props), len(prop_dict))\n            self.assertFalse(any(np.isnan(prop_array)))\n            self.assertTrue(all(prop_array[nans] == 42))\n\n    def test_get_mol_edge_features(self):\n        props = deepcopy(self.edge_props)\n        bad_props = [\"bob\"]\n\n        all_smiles = self.smiles + self.smiles_noble\n        for s in all_smiles:\n            err_msg = f\"\\n\\tError for params:\\n\\t\\tSMILES: {s}\"\n            mol = dm.to_mol(s)\n            for ii in range(len(props)):\n                this_props = props[: ii + 1]\n                err_msg2 = err_msg + f\"\\n\\t\\tprops: {this_props}\"\n                prop_dict = get_mol_edge_features(mol, property_list=this_props)\n                self.assertListEqual(list(prop_dict.keys()), this_props, msg=err_msg)\n                for key, val in prop_dict.items():\n                    err_msg3 = err_msg2 + f\"\\n\\t\\tkey: {key}\"\n                    self.assertEqual(val.shape[0], mol.GetNumBonds(), msg=err_msg3)\n\n            if mol.GetNumBonds() > 0:\n                with self.assertRaises(ValueError, msg=err_msg):\n                    get_mol_edge_features(mol, property_list=bad_props)\n\n    def test_mol_to_adj_and_features(self):\n        np.random.seed(42)\n\n        for s in self.smiles:\n            err_msg = f\"\\n\\tError for params:\\n\\t\\tSMILES: {s}\"\n            mol = dm.to_mol(s)\n            mol_Hs = Chem.AddHs(mol)  # 
type: ignore\n            mol_No_Hs = Chem.RemoveHs(mol)  # type: ignore\n\n            for explicit_H in [True, False]:\n                this_mol = mol_Hs if explicit_H else mol_No_Hs\n                for ii in np.arange(0, 5, 0.2):\n                    num_props = int(round(ii))\n                    err_msg2 = err_msg + f\"\\n\\t\\texplicit_H: {explicit_H}\\n\\t\\tii: {ii}\"\n\n                    adj, ndata, edata, _, _ = mol_to_adj_and_features(\n                        mol=mol,\n                        atom_property_list_onehot=np.random.choice(\n                            self.atomic_onehot_props, size=num_props, replace=False\n                        ),\n                        atom_property_list_float=np.random.choice(\n                            self.atomic_float_props, size=num_props, replace=False\n                        ),\n                        edge_property_list=np.random.choice(self.edge_props, size=num_props, replace=False),\n                        add_self_loop=False,\n                        explicit_H=explicit_H,\n                        use_bonds_weights=False,\n                    )\n\n                    self.assertEqual(adj.shape[0], this_mol.GetNumAtoms(), msg=err_msg2)\n                    if num_props > 0:\n                        self.assertEqual(ndata.shape[0], this_mol.GetNumAtoms(), msg=err_msg2)\n                        if this_mol.GetNumBonds() > 0:\n                            self.assertEqual(edata.shape[0], this_mol.GetNumBonds(), msg=err_msg2)\n                            self.assertGreaterEqual(edata.shape[1], num_props, msg=err_msg2)\n                        self.assertGreaterEqual(ndata.shape[1], num_props, msg=err_msg2)\n\n    def test_mol_to_pyggraph(self):\n        np.random.seed(42)\n\n        for s in self.smiles:\n            err_msg = f\"\\n\\tError for params:\\n\\t\\tSMILES: {s}\"\n            mol = dm.to_mol(s)\n            mol_Hs = Chem.AddHs(mol)  # type: ignore\n            mol_No_Hs = Chem.RemoveHs(mol)  # 
type: ignore\n\n            graph = mol_to_pyggraph(\n                mol=mol,\n                atom_property_list_onehot=[],\n                atom_property_list_float=[\"atomic-number\"],\n                edge_property_list=[\"bond-type-float\"],\n                add_self_loop=False,\n                explicit_H=False,\n                use_bonds_weights=False,\n                on_error=\"raise\",\n            )\n\n            # Check the number of nodes and edges\n            self.assertListEqual(list(graph[\"feat\"].shape), [mol.GetNumAtoms(), 1], msg=err_msg)\n            self.assertListEqual(list(graph[\"edge_feat\"].shape), [2 * mol.GetNumBonds(), 1], msg=err_msg)\n\n            # Check the node features\n            feat = graph[\"feat\"].to_dense().numpy() * 5 + 6  # Undo the scaling\n            atom_nums = np.asarray([atom.GetAtomicNum() for atom in mol.GetAtoms()])\n            np.testing.assert_array_almost_equal(feat[:, 0], atom_nums, decimal=5, err_msg=err_msg)\n\n            # Check the edge features\n            edge_feat = graph[\"edge_feat\"].to_dense().numpy()\n            bond_types = np.asarray([bond.GetBondTypeAsDouble() for bond in mol.GetBonds()]).repeat(2)\n            np.testing.assert_array_almost_equal(edge_feat[:, 0], bond_types, decimal=5, err_msg=err_msg)\n\n            # Check the edge indices\n            if mol.GetNumBonds() > 0:\n                edge_index = graph[\"edge_index\"].to_dense().numpy()\n                true_edge_index = []\n                for bond in mol.GetBonds():\n                    true_edge_index.append([bond.GetBeginAtomIdx(), bond.GetEndAtomIdx()])\n                    true_edge_index.append([bond.GetEndAtomIdx(), bond.GetBeginAtomIdx()])\n                true_edge_index = np.asarray(true_edge_index).T\n                np.testing.assert_array_equal(edge_index, true_edge_index, err_msg=err_msg)\n\n            # Loop over many possible combinations of properties\n            for explicit_H in [True, False]:\n     
           this_mol = mol_Hs if explicit_H else mol_No_Hs\n                for ii in np.arange(0, 5, 0.2):\n                    num_props = int(round(ii))\n                    err_msg2 = err_msg + f\"\\n\\t\\texplicit_H: {explicit_H}\\n\\t\\tii: {ii}\"\n\n                    graph = mol_to_pyggraph(\n                        mol=mol,\n                        atom_property_list_onehot=np.random.choice(\n                            self.atomic_onehot_props, size=num_props, replace=False\n                        ),\n                        atom_property_list_float=np.random.choice(\n                            self.atomic_float_props, size=num_props, replace=False\n                        ),\n                        edge_property_list=np.random.choice(self.edge_props, size=num_props, replace=False),\n                        add_self_loop=False,\n                        explicit_H=explicit_H,\n                        use_bonds_weights=False,\n                        on_error=\"raise\",\n                    )\n\n                    self.assertEqual(graph.num_nodes, this_mol.GetNumAtoms(), msg=err_msg2)\n                    self.assertEqual(graph.num_edges, 2 * this_mol.GetNumBonds(), msg=err_msg2)\n                    if num_props > 0:\n                        ndata = graph[\"feat\"]\n                        edata = graph[\"edge_feat\"]\n                        self.assertEqual(ndata.shape[0], this_mol.GetNumAtoms(), msg=err_msg2)\n                        self.assertEqual(edata.shape[0], 2 * this_mol.GetNumBonds(), msg=err_msg2)\n                        self.assertGreaterEqual(ndata.shape[1], num_props, msg=err_msg2)\n                        self.assertGreaterEqual(edata.shape[1], num_props, msg=err_msg2)\n\n\nif __name__ == \"__main__\":\n    ut.main()\n"
  },
  {
    "path": "tests/test_finetuning.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nimport os\nimport unittest as ut\nfrom copy import deepcopy\nfrom os.path import abspath, dirname\n\nimport torch\nfrom lightning.pytorch.callbacks import Callback\nfrom omegaconf import OmegaConf\n\nimport graphium\nfrom graphium.config._loader import (\n    load_accelerator,\n    load_architecture,\n    load_datamodule,\n    load_metrics,\n    load_predictor,\n    load_trainer,\n    save_params_to_wandb,\n)\nfrom graphium.finetuning import GraphFinetuning, modify_cfg_for_finetuning\nfrom graphium.trainer import PredictorModule\n\nMAIN_DIR = dirname(dirname(abspath(graphium.__file__)))\nCONFIG_FILE = \"graphium/config/dummy_finetuning.yaml\"\n\nos.chdir(MAIN_DIR)\n\n\nclass Test_Finetuning(ut.TestCase):\n    def test_finetuning_from_task_head(self):\n        # Skip test if PyTDC package not installed\n        try:\n            import tdc\n        except ImportError:\n            self.skipTest(\"PyTDC needs to be installed to run this test. 
Use `pip install PyTDC`.\")\n\n        ##################################################\n        ### Test modification of config for finetuning ###\n        ##################################################\n\n        cfg = graphium.load_config(name=\"dummy_finetuning_from_task_head\")\n        cfg = OmegaConf.to_container(cfg, resolve=True)\n\n        cfg = modify_cfg_for_finetuning(cfg)\n\n        # Initialize the accelerator\n        cfg, accelerator_type = load_accelerator(cfg)\n\n        # Load and initialize the dataset\n        datamodule = load_datamodule(cfg, accelerator_type)\n        datamodule.task_specific_args[\"lipophilicity_astrazeneca\"].sample_size = 100\n\n        # Initialize the network\n        model_class, model_kwargs = load_architecture(\n            cfg,\n            in_dims=datamodule.in_dims,\n        )\n\n        datamodule.prepare_data()\n\n        metrics = load_metrics(cfg)\n\n        predictor = load_predictor(\n            cfg,\n            model_class,\n            model_kwargs,\n            metrics,\n            datamodule.get_task_levels(),\n            accelerator_type,\n            datamodule.featurization,\n            datamodule.task_norms,\n        )\n\n        # Create module map\n        module_map = deepcopy(predictor.model.pretrained_model.net._module_map)\n\n        cfg_finetune = cfg[\"finetuning\"]\n        finetuning_module = \"\".join([cfg_finetune[\"finetuning_module\"], \"-\", cfg_finetune[\"task\"]])\n        finetuning_module_from_pretrained = \"\".join(\n            [cfg_finetune[\"finetuning_module\"], \"-\", cfg_finetune[\"sub_module_from_pretrained\"]]\n        )\n\n        # Test for correctly modified shapes and number of layers in finetuning module\n        self.assertEqual(\n            len(module_map[finetuning_module]),\n            3,\n        )\n        self.assertEqual(module_map[finetuning_module][-1].linear.weight.size(0), 8)\n        
self.assertEqual(predictor.model.finetuning_head.net.in_dim, 8)\n        self.assertEqual(len(predictor.model.finetuning_head.net.layers), 2)\n        self.assertEqual(predictor.model.finetuning_head.net.out_dim, 1)\n\n        ################################################\n        ### Test overwriting with pretrained weights ###\n        ################################################\n\n        # Load pretrained & replace in predictor\n        pretrained_model = PredictorModule.load_pretrained_model(\n            cfg[\"finetuning\"][\"pretrained_model\"], device=\"cpu\"\n        ).model\n\n        pretrained_model.create_module_map()\n        module_map_from_pretrained = deepcopy(pretrained_model._module_map)\n\n        # Finetuning module has only been partially overwritten\n        loaded_layers = module_map_from_pretrained[finetuning_module_from_pretrained]\n        overwritten_layers = module_map[finetuning_module]\n\n        for idx, (loaded, overwritten) in enumerate(zip(loaded_layers, overwritten_layers)):\n            if idx < 1:\n                assert torch.equal(loaded.linear.weight, overwritten.linear.weight)\n                assert torch.equal(loaded.linear.bias, overwritten.linear.bias)\n            else:\n                assert not torch.equal(loaded.linear.weight, overwritten.linear.weight)\n                assert not torch.equal(loaded.linear.bias, overwritten.linear.bias)\n\n            if idx + 1 == min(len(loaded_layers), len(overwritten_layers)):\n                break\n\n        for module_name in module_map.keys():\n            if module_name == finetuning_module:\n                break\n\n            loaded_module, overwritten_module = (\n                module_map_from_pretrained[module_name],\n                module_map[module_name],\n            )\n            for loaded_params, overwritten_params in zip(\n                loaded_module.parameters(), overwritten_module.parameters()\n            ):\n                assert 
torch.equal(loaded_params.data, overwritten_params.data)\n\n        #################################################\n        ### Test correct (un)freezing during training ###\n        #################################################\n\n        # Define test callback that checks for correct (un)freezing\n        class TestCallback(Callback):\n            def __init__(self, cfg):\n                super().__init__()\n\n                self.cfg_finetune = cfg[\"finetuning\"]\n\n            def on_train_epoch_start(self, trainer, pl_module):\n                module_map = pl_module.model.pretrained_model.net._module_map\n\n                finetuning_module = \"\".join(\n                    [self.cfg_finetune[\"finetuning_module\"], \"-\", self.cfg_finetune[\"task\"]]\n                )\n                training_depth = self.cfg_finetune[\"added_depth\"] + self.cfg_finetune.pop(\n                    \"unfreeze_pretrained_depth\", 0\n                )\n\n                frozen_parameters, unfrozen_parameters = [], []\n\n                if trainer.current_epoch == 0:\n                    frozen = True\n\n                    for module_name, module in module_map.items():\n                        if module_name == finetuning_module:\n                            # After the finetuning module, all parameters are unfrozen\n                            frozen = False\n\n                            frozen_parameters.extend(\n                                [\n                                    parameter.requires_grad\n                                    for parameter in module[:-training_depth].parameters()\n                                ]\n                            )\n                            unfrozen_parameters.extend(\n                                [\n                                    parameter.requires_grad\n                                    for parameter in module[-training_depth:].parameters()\n                                ]\n                            
)\n                            continue\n\n                        if frozen:\n                            frozen_parameters.extend(\n                                [parameter.requires_grad for parameter in module.parameters()]\n                            )\n                        else:\n                            unfrozen_parameters.extend(\n                                [parameter.requires_grad for parameter in module.parameters()]\n                            )\n\n                    # Finetuning head is always unfrozen\n                    unfrozen_parameters.extend(\n                        [\n                            parameter.requires_grad\n                            for parameter in pl_module.model.finetuning_head.parameters()\n                        ]\n                    )\n\n                    assert True not in frozen_parameters\n                    assert False not in unfrozen_parameters\n\n                if trainer.current_epoch == 2:\n                    # All parameters are unfrozen starting from epoch_unfreeze_all\n                    unfrozen_parameters = [\n                        parameter.requires_grad for parameter in pl_module.model.parameters()\n                    ]\n\n                    assert False not in unfrozen_parameters\n\n        trainer = load_trainer(cfg, accelerator_type)\n\n        finetuning_training_kwargs = cfg[\"finetuning\"][\"training_kwargs\"]\n        trainer.callbacks.append(GraphFinetuning(**finetuning_training_kwargs))\n\n        # Add test callback to trainer\n        trainer.callbacks.append(TestCallback(cfg))\n\n        predictor.set_max_nodes_edges_per_graph(datamodule, stages=[\"train\", \"val\"])\n\n        # Run the model training\n        trainer.fit(model=predictor, datamodule=datamodule)\n\n    def test_finetuning_from_gnn(self):\n        # Skip test if PyTDC package not installed\n        try:\n            import tdc\n        except ImportError:\n            self.skipTest(\"PyTDC needs to be 
installed to run this test. Use `pip install PyTDC`.\")\n\n        ##################################################\n        ### Test modification of config for finetuning ###\n        ##################################################\n\n        cfg = graphium.load_config(name=\"dummy_finetuning_from_gnn\")\n        cfg = OmegaConf.to_container(cfg, resolve=True)\n\n        cfg = modify_cfg_for_finetuning(cfg)\n\n        # Initialize the accelerator\n        cfg, accelerator_type = load_accelerator(cfg)\n\n        # Load and initialize the dataset\n        datamodule = load_datamodule(cfg, accelerator_type)\n        datamodule.task_specific_args[\"lipophilicity_astrazeneca\"].sample_size = 100\n\n        # Initialize the network\n        model_class, model_kwargs = load_architecture(\n            cfg,\n            in_dims=datamodule.in_dims,\n        )\n\n        datamodule.prepare_data()\n\n        metrics = load_metrics(cfg)\n\n        predictor = load_predictor(\n            cfg,\n            model_class,\n            model_kwargs,\n            metrics,\n            datamodule.get_task_levels(),\n            accelerator_type,\n            datamodule.featurization,\n            datamodule.task_norms,\n        )\n\n        # Create module map\n        module_map = deepcopy(predictor.model.pretrained_model.net._module_map)\n\n        cfg_finetune = cfg[\"finetuning\"]\n        finetuning_module = cfg_finetune[\"finetuning_module\"]\n\n        # Test for correctly modified shapes and number of layers in finetuning module\n        self.assertEqual(\n            len(module_map[finetuning_module]),\n            5,\n        )\n        self.assertEqual(module_map[finetuning_module][-1].model.lin.weight.size(0), 96)\n        self.assertEqual(len(module_map[\"graph_output_nn-graph\"]), 2)\n\n        assert predictor.model.pretrained_model.net.task_heads.graph_output_nn[\n            \"graph\"\n        ].graph_output_nn_kwargs[\"graph\"][\"pooling\"] == [\"mean\"]\n\n    
    ################################################\n        ### Test overwriting with pretrained weights ###\n        ################################################\n\n        # Load pretrained & replace in predictor\n        pretrained_model = PredictorModule.load_pretrained_model(\n            cfg[\"finetuning\"][\"pretrained_model\"], device=\"cpu\"\n        ).model\n\n        pretrained_model.create_module_map()\n        module_map_from_pretrained = deepcopy(pretrained_model._module_map)\n\n        # Finetuning module has only been partially overwritten\n        loaded_layers = module_map_from_pretrained[finetuning_module]\n        overwritten_layers = module_map[finetuning_module]\n\n        for idx, (loaded, overwritten) in enumerate(zip(loaded_layers, overwritten_layers)):\n            if idx < 2:\n                assert torch.equal(loaded.model.lin.weight, overwritten.model.lin.weight)\n            else:\n                assert not torch.equal(loaded.model.lin.weight, overwritten.model.lin.weight)\n\n            if idx + 1 == min(len(loaded_layers), len(overwritten_layers)):\n                break\n\n        for module_name in module_map.keys():\n            if module_name == finetuning_module:\n                break\n\n            loaded_module, overwritten_module = (\n                module_map_from_pretrained[module_name],\n                module_map[module_name],\n            )\n            for loaded_params, overwritten_params in zip(\n                loaded_module.parameters(), overwritten_module.parameters()\n            ):\n                assert torch.equal(loaded_params.data, overwritten_params.data)\n\n        #################################################\n        ### Test correct (un)freezing during training ###\n        #################################################\n\n        # Define test callback that checks for correct (un)freezing\n        class TestCallback(Callback):\n            def __init__(self, cfg):\n                
super().__init__()\n\n                self.cfg_finetune = cfg[\"finetuning\"]\n\n            def on_train_epoch_start(self, trainer, pl_module):\n                module_map = pl_module.model.pretrained_model.net._module_map\n\n                training_depth = self.cfg_finetune[\"added_depth\"] + self.cfg_finetune.pop(\n                    \"unfreeze_pretrained_depth\", 0\n                )\n\n                frozen_parameters, unfrozen_parameters = [], []\n\n                if trainer.current_epoch == 0:\n                    frozen = True\n\n                    for module_name, module in module_map.items():\n                        if module_name == finetuning_module:\n                            # After the finetuning module, all parameters are unfrozen\n                            frozen = False\n\n                            frozen_parameters.extend(\n                                [\n                                    parameter.requires_grad\n                                    for parameter in module[:-training_depth].parameters()\n                                ]\n                            )\n                            unfrozen_parameters.extend(\n                                [\n                                    parameter.requires_grad\n                                    for parameter in module[-training_depth:].parameters()\n                                ]\n                            )\n                            continue\n\n                        if frozen:\n                            frozen_parameters.extend(\n                                [parameter.requires_grad for parameter in module.parameters()]\n                            )\n                        else:\n                            unfrozen_parameters.extend(\n                                [parameter.requires_grad for parameter in module.parameters()]\n                            )\n\n                    assert not True in frozen_parameters\n                    assert not 
False in unfrozen_parameters\n\n                if trainer.current_epoch == 1:\n                    # All parameter are unfrozen starting from epoch_unfreeze_all\n                    unfrozen_parameters = [\n                        parameter.requires_grad for parameter in pl_module.model.parameters()\n                    ]\n\n                    assert not False in unfrozen_parameters\n\n        trainer = load_trainer(cfg, accelerator_type)\n\n        finetuning_training_kwargs = cfg[\"finetuning\"][\"training_kwargs\"]\n        trainer.callbacks.append(GraphFinetuning(**finetuning_training_kwargs))\n\n        # Add test callback to trainer\n        trainer.callbacks.append(TestCallback(cfg))\n\n        predictor.set_max_nodes_edges_per_graph(datamodule, stages=[\"train\", \"val\"])\n\n        # Run the model training\n        trainer.fit(model=predictor, datamodule=datamodule)\n\n\nif __name__ == \"__main__\":\n    ut.main()\n"
  },
  {
    "path": "tests/test_ipu_dataloader.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\n# General imports\nimport yaml\nimport unittest as ut\nimport numpy as np\nfrom copy import deepcopy\nfrom warnings import warn\nfrom unittest.mock import patch\nfrom lightning import Trainer, LightningModule\nfrom functools import partial\nimport pytest\nfrom typing import Any, Callable, Dict, Iterable, List, Optional, Tuple, Union\n\nimport torch\nfrom torch.utils.data.dataloader import default_collate\nfrom lightning_graphcore import IPUStrategy\n\n\ndef random_packing(num_nodes, batch_size):\n    ipu_batch_size = int(len(num_nodes) / batch_size)\n    indices = np.arange(len(num_nodes))\n    np.random.shuffle(indices)\n    indices = np.reshape(indices, (ipu_batch_size, batch_size)).tolist()\n    return indices\n\n\ndef global_batch_collator(batch_size, batches):\n    packs = []\n    for pack_idx in range(0, len(batches), batch_size):\n        packs.append(default_collate(batches[pack_idx : pack_idx + batch_size]))\n    global_batch = default_collate(packs)\n    global_batch = (global_batch[0], tuple(global_batch[1]))\n    return global_batch\n\n\n@pytest.mark.ipu\nclass test_DataLoading(ut.TestCase):\n    class TestSimpleLightning(LightningModule):\n        # Create a basic Ligthning for testing the batch sizes\n        def __init__(self, batch_size, node_feat_size, edge_feat_size, num_batch) -> None:\n     
       super().__init__()\n            self.batch_size = batch_size\n            self.node_feat_size = node_feat_size\n            self.edge_feat_size = edge_feat_size\n            self.layer = torch.nn.Linear(node_feat_size, 1)\n            self.loss_fn = torch.nn.L1Loss()\n            self.num_batch = num_batch\n\n        def validation_step(self, batch, batch_idx):\n            self.assert_shapes(batch, batch_idx, \"val\")\n            loss = self.forward(batch)\n            return loss\n\n        def training_step(self, batch, batch_idx):\n            self.assert_shapes(batch, batch_idx, \"train\")\n            loss = self.forward(batch)\n            return loss\n\n        def forward(self, batch):\n            out = self.layer(batch[1][0]).squeeze(-1)\n            loss = self.loss_fn(out, batch[0])\n            return loss\n\n        def assert_shapes(self, batch, batch_idx, step):\n            # Test the shape of the labels\n            this_shape = list(batch[0].shape)\n            true_shape = [1, self.batch_size]\n            assert (\n                this_shape == true_shape\n            ), f\"Shape of the labels is `{this_shape}` but should be {true_shape}\"\n\n            # Test the shape of the first feature\n            this_shape = list(batch[1][0].shape)\n            true_shape = [1, self.batch_size, self.node_feat_size]\n            assert (\n                this_shape == true_shape\n            ), f\"Shape of the feature 0 is `{this_shape}` but should be {true_shape}\"\n\n            # Test the shape of the second feature\n            this_shape = list(batch[1][1].shape)\n            true_shape = [1, self.batch_size, self.edge_feat_size]\n            assert (\n                this_shape == true_shape\n            ), f\"Shape of the feature 0 is `{this_shape}` but should be {true_shape}\"\n\n        def configure_optimizers(self):\n            return torch.optim.Adam(self.parameters(), lr=1e-3)\n\n    class TestDataset(torch.utils.data.Dataset):\n  
      # Create a simple dataset for testing the Lightning integration\n        def __init__(self, labels, node_features, edge_features):\n            self.labels = labels\n            self.node_features = node_features\n            self.edge_features = edge_features\n\n        def __len__(self):\n            return len(self.labels)\n\n        def __getitem__(self, idx):\n            # [label, [feat1, feat2]]\n            return [self.labels[idx], [self.node_features[idx], self.edge_features[idx]]]\n\n    # @pytest.mark.skip\n    def test_poptorch_simple_deviceiterations_gradient_accumulation(self):\n        \"\"\"\n        Test a simple version of the device-iterations and gradient accumulation\n        to make sure that the dataloader and models handle them correcly.\n        \"\"\"\n\n        with patch(\"poptorch.ipuHardwareIsAvailable\", return_value=True):\n            with patch(\"lightning_graphcore.accelerator._IPU_AVAILABLE\", new=True):\n                import poptorch\n\n                assert poptorch.ipuHardwareIsAvailable()\n                from lightning_graphcore.accelerator import _IPU_AVAILABLE\n\n                assert _IPU_AVAILABLE is True\n\n                # Initialize constants\n                gradient_accumulation = 2\n                device_iterations = 3\n                batch_size = 5\n                num_replicate = 7\n                node_feat_size = 11\n                edge_feat_size = 13\n\n                # Initialize the batch info and poptorch options\n                opts = poptorch.Options()\n                opts.useIpuModel(True)\n                opts.deviceIterations(device_iterations)\n                training_opts = deepcopy(opts)\n                training_opts.Training.gradientAccumulation(gradient_accumulation)\n                inference_opts = deepcopy(opts)\n\n                # Initialize the dataset\n                num_batch = device_iterations * gradient_accumulation * num_replicate\n                data_size = 
num_batch * batch_size\n                dataset = self.TestDataset(\n                    labels=np.random.rand(data_size).astype(np.float32),\n                    node_features=[\n                        np.random.rand(node_feat_size).astype(np.float32) for ii in range(data_size)\n                    ],\n                    edge_features=[\n                        np.random.rand(edge_feat_size).astype(np.float32) for ii in range(data_size)\n                    ],\n                )\n\n                # Initialize the dataloader\n                train_dataloader = poptorch.DataLoader(\n                    options=training_opts,\n                    dataset=deepcopy(dataset),\n                    batch_size=batch_size,\n                    collate_fn=partial(global_batch_collator, batch_size),\n                )\n\n                val_dataloader = poptorch.DataLoader(\n                    options=inference_opts,\n                    dataset=deepcopy(dataset),\n                    batch_size=batch_size,\n                    collate_fn=partial(global_batch_collator, batch_size),\n                )\n\n                # Build the model, and run it on \"IPU\"\n                model = self.TestSimpleLightning(batch_size, node_feat_size, edge_feat_size, num_batch)\n\n                strategy = IPUStrategy(\n                    training_opts=training_opts, inference_opts=inference_opts, autoreport=True\n                )\n                trainer = Trainer(\n                    logger=True,\n                    enable_checkpointing=False,\n                    max_epochs=2,\n                    strategy=strategy,\n                    num_sanity_val_steps=0,\n                    accelerator=\"ipu\",\n                    devices=1,\n                )\n                trainer.fit(model=model, train_dataloaders=train_dataloader, val_dataloaders=val_dataloader)\n\n    @pytest.mark.skip\n    def test_poptorch_graphium_deviceiterations_gradient_accumulation_full(self):\n        
\"\"\"\n        Test the device-iterations and gradient accumulation in a way\n        that is very similar to the Graphium code\n        to make sure that the dataloader and models handle them correcly.\n        \"\"\"\n        with patch(\"poptorch.ipuHardwareIsAvailable\", return_value=True):\n            with patch(\"lightning_graphcore.accelerator._IPU_AVAILABLE\", new=True):\n                try:\n                    import poptorch\n                except Exception as e:\n                    warn(f\"Skipping this test because poptorch is not available.\\n{e}\")\n                    return\n\n                from lightning_graphcore import IPUStrategy\n                import lightning_graphcore\n\n                # Current library imports\n                from graphium.config._loader import (\n                    load_datamodule,\n                    load_metrics,\n                    load_architecture,\n                    load_accelerator,\n                    load_predictor,\n                    load_trainer,\n                )\n                from graphium.utils.safe_run import SafeRun\n\n                # Simplified testing config - reflecting the toymix requirements\n                CONFIG_FILE = \"tests/config_test_ipu_dataloader_multitask.yaml\"\n                with open(CONFIG_FILE, \"r\") as f:\n                    cfg = yaml.safe_load(f)\n\n                cfg, accelerator = load_accelerator(cfg)\n\n                # Load the datamodule, and prepare the data\n                datamodule = load_datamodule(cfg, accelerator_type=accelerator)\n                datamodule.prepare_data()\n                metrics = load_metrics(cfg)\n                model_class, model_kwargs = load_architecture(cfg, in_dims=datamodule.in_dims)\n                # datamodule.setup()\n                predictor = load_predictor(\n                    cfg,\n                    model_class,\n                    model_kwargs,\n                    metrics,\n                    
datamodule.get_task_levels(),\n                    accelerator,\n                    datamodule.featurization,\n                    datamodule.task_norms,\n                )\n                assert poptorch.ipuHardwareIsAvailable()\n                trainer = load_trainer(cfg, \"test\", accelerator, \"date_time_suffix\")\n                # Run the model training\n                with SafeRun(\n                    name=\"TRAINING\", raise_error=cfg[\"constants\"][\"raise_train_error\"], verbose=True\n                ):\n                    trainer.fit(model=predictor, datamodule=datamodule)\n\n\nif __name__ == \"__main__\":\n    ut.main()\n"
  },
  {
    "path": "tests/test_ipu_losses.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nimport unittest as ut\nimport torch\nfrom torch.nn import BCELoss, MSELoss, L1Loss, BCEWithLogitsLoss\nfrom copy import deepcopy\nimport pytest\n\nfrom graphium.ipu.ipu_losses import BCELossIPU, MSELossIPU, L1LossIPU, BCEWithLogitsLossIPU, HybridCELossIPU\nfrom graphium.trainer.losses import HybridCELoss\n\n\n@pytest.mark.ipu\nclass test_Losses(ut.TestCase):\n    torch.manual_seed(42)\n    preds = torch.rand((100, 10), dtype=torch.float32)\n    target = torch.rand((100, 10), dtype=torch.float32)\n\n    th = 0.7\n    nan_th = 0.2\n    preds_greater = preds > th\n    target_greater = (target > th).to(torch.float32)\n    target_greater_nan = deepcopy(target_greater)\n    is_nan = target < nan_th\n    target_greater_nan[target < nan_th] = torch.nan\n    target_nan = deepcopy(target)\n    target_nan[target < nan_th] = torch.nan\n\n    def test_bce(self):\n        preds = deepcopy(self.preds)\n        target = deepcopy(self.target_greater)\n        target_nan = deepcopy(self.target_greater_nan)\n\n        # Regular loss\n        loss_true = BCELoss()(preds, target)\n        loss_ipu = BCELossIPU()(preds, target)\n        self.assertFalse(loss_true.isnan(), \"Regular BCELoss is NaN\")\n        self.assertAlmostEqual(\n            loss_true.item(), loss_ipu.item(), places=6, msg=\"Regular BCELoss is different\"\n      
  )\n\n        # Weighted loss\n        weight = torch.rand(preds.shape[1], dtype=torch.float32)\n        loss_true = BCELoss(weight=weight)(preds, target)\n        loss_ipu = BCELossIPU(weight=weight)(preds, target)\n        self.assertFalse(loss_true.isnan(), \"Regular BCELoss is NaN\")\n        self.assertAlmostEqual(loss_true.item(), loss_ipu.item(), msg=\"Weighted BCELoss is different\")\n\n        # Regular loss with NaNs in target\n        not_nan = ~target_nan.isnan()\n        loss_true = BCELoss()(preds[not_nan], target[not_nan])\n        loss_ipu = BCELossIPU()(preds, target_nan)\n        self.assertFalse(loss_true.isnan(), \"Regular BCELoss with target_nan is NaN\")\n        self.assertFalse(loss_ipu.isnan(), \"Regular BCELossIPU with target_nan is NaN\")\n        self.assertAlmostEqual(\n            loss_true.item(), loss_ipu.item(), places=6, msg=\"Regular BCELoss with NaN is different\"\n        )\n\n        # Weighted loss with NaNs in target\n        not_nan = ~target_nan.isnan()\n        weight = torch.rand(preds.shape, dtype=torch.float32)\n        loss_true = BCELoss(weight=weight[not_nan])(preds[not_nan], target_nan[not_nan])\n        loss_ipu = BCELossIPU(weight=weight)(preds, target_nan)\n        self.assertFalse(loss_true.isnan(), \"Weighted BCELoss with target_nan is NaN\")\n        self.assertFalse(loss_ipu.isnan(), \"Weighted BCELossIPU with target_nan is NaN\")\n        self.assertAlmostEqual(\n            loss_true.item(), loss_ipu.item(), places=6, msg=\"Weighted BCELoss with NaN is different\"\n        )\n\n    def test_mse(self):\n        preds = deepcopy(self.preds)\n        target = deepcopy(self.target)\n        target_nan = deepcopy(self.target_nan)\n\n        # Regular loss\n        loss_true = MSELoss()(preds, target)\n        loss_ipu = MSELossIPU()(preds, target)\n        self.assertFalse(loss_true.isnan(), \"Regular MSELoss is NaN\")\n        self.assertAlmostEqual(\n            loss_true.item(), loss_ipu.item(), places=6, 
msg=\"Regular MSELoss is different\"\n        )\n\n        # Regular loss with NaNs in target\n        not_nan = ~target_nan.isnan()\n        loss_true = MSELoss()(preds[not_nan], target[not_nan])\n        loss_ipu = MSELossIPU()(preds, target_nan)\n        self.assertFalse(loss_true.isnan(), \"Regular MSELoss with target_nan is NaN\")\n        self.assertFalse(loss_ipu.isnan(), \"Regular MSELossIPU with target_nan is NaN\")\n        self.assertAlmostEqual(\n            loss_true.item(), loss_ipu.item(), places=6, msg=\"Regular MSELoss with NaN is different\"\n        )\n\n    def test_l1(self):\n        preds = deepcopy(self.preds)\n        target = deepcopy(self.target)\n        target_nan = deepcopy(self.target_nan)\n\n        # Regular loss\n        loss_true = L1Loss()(preds, target)\n        loss_ipu = L1LossIPU()(preds, target)\n        self.assertFalse(loss_true.isnan(), \"Regular MAELoss is NaN\")\n        self.assertAlmostEqual(\n            loss_true.item(), loss_ipu.item(), places=6, msg=\"Regular MAELoss is different\"\n        )\n\n        # Regular loss with NaNs in target\n        not_nan = ~target_nan.isnan()\n        loss_true = L1Loss()(preds[not_nan], target[not_nan])\n        loss_ipu = L1LossIPU()(preds, target_nan)\n        self.assertFalse(loss_true.isnan(), \"Regular MAELoss with target_nan is NaN\")\n        self.assertFalse(loss_ipu.isnan(), \"Regular MAELossIPU with target_nan is NaN\")\n        self.assertAlmostEqual(\n            loss_true.item(), loss_ipu.item(), places=6, msg=\"Regular MAELoss with NaN is different\"\n        )\n\n    def test_bce_logits(self):\n        preds = deepcopy(self.preds)\n        target = deepcopy(self.target_greater)\n        target_nan = deepcopy(self.target_greater_nan)\n\n        # Regular loss\n        loss_true = BCEWithLogitsLoss()(preds, target)\n        loss_ipu = BCEWithLogitsLossIPU()(preds, target)\n        self.assertFalse(loss_true.isnan(), \"Regular BCEWithLogitsLoss is NaN\")\n        
self.assertAlmostEqual(\n            loss_true.item(), loss_ipu.item(), places=6, msg=\"Regular BCEWithLogitsLoss is different\"\n        )\n\n        # Weighted loss\n        weight = torch.rand(preds.shape[1], dtype=torch.float32)\n        loss_true = BCEWithLogitsLoss(weight=weight)(preds, target)\n        loss_ipu = BCEWithLogitsLossIPU(weight=weight)(preds, target)\n        self.assertFalse(loss_true.isnan(), \"Regular BCEWithLogitsLoss is NaN\")\n        self.assertAlmostEqual(\n            loss_true.item(), loss_ipu.item(), msg=\"Weighted BCEWithLogitsLoss is different\"\n        )\n\n        # Regular loss with NaNs in target\n        not_nan = ~target_nan.isnan()\n        loss_true = BCEWithLogitsLoss()(preds[not_nan], target[not_nan])\n        loss_ipu = BCEWithLogitsLossIPU()(preds, target_nan)\n        self.assertFalse(loss_true.isnan(), \"Regular test_bce_logits with target_nan is NaN\")\n        self.assertFalse(loss_ipu.isnan(), \"Regular test_bce_logits with target_nan is NaN\")\n        self.assertAlmostEqual(\n            loss_true.item(), loss_ipu.item(), places=6, msg=\"Regular BCELoss with NaN is different\"\n        )\n\n        # Weighted loss with NaNs in target\n        not_nan = ~target_nan.isnan()\n        weight = torch.rand(preds.shape, dtype=torch.float32)\n        loss_true = BCEWithLogitsLoss(weight=weight[not_nan])(preds[not_nan], target_nan[not_nan])\n        loss_ipu = BCEWithLogitsLossIPU(weight=weight)(preds, target_nan)\n        self.assertFalse(loss_true.isnan(), \"Weighted test_bce_logits with target_nan is NaN\")\n        self.assertFalse(loss_ipu.isnan(), \"Weighted test_bce_logits with target_nan is NaN\")\n        self.assertAlmostEqual(\n            loss_true.item(),\n            loss_ipu.item(),\n            places=6,\n            msg=\"Weighted BCEWithLogitsLoss with NaN is different\",\n        )\n"
  },
  {
    "path": "tests/test_ipu_metrics.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nimport unittest as ut\nimport torch\nfrom torchmetrics.functional import (\n    auroc,\n    average_precision,\n    precision,\n    accuracy,\n    recall,\n    pearson_corrcoef,\n    spearman_corrcoef,\n    r2_score,\n    f1_score,\n    fbeta_score,\n    mean_squared_error,\n    mean_absolute_error,\n)\nfrom copy import deepcopy\nimport pytest\n\nfrom graphium.ipu.ipu_metrics import (\n    auroc_ipu,\n    average_precision_ipu,\n    precision_ipu,\n    accuracy_ipu,\n    recall_ipu,\n    pearson_ipu,\n    spearman_ipu,\n    r2_score_ipu,\n    f1_score_ipu,\n    fbeta_score_ipu,\n    mean_squared_error_ipu,\n    mean_absolute_error_ipu,\n)\n\n\n@pytest.mark.ipu\nclass test_Metrics(ut.TestCase):\n    torch.manual_seed(42)\n    preds = torch.rand((100, 10), dtype=torch.float32)\n    target = torch.rand((100, 10), dtype=torch.float32)\n\n    th = 0.7\n    nan_th = 0.2\n    preds_greater = preds > th\n    target_greater = (target > th).to(torch.float32)\n    target_greater_nan = deepcopy(target_greater)\n    is_nan = target < nan_th\n    target_greater_nan[target < nan_th] = torch.nan\n    target_nan = deepcopy(target)\n    target_nan[target < nan_th] = torch.nan\n\n    def test_auroc(self):\n        preds = deepcopy(self.preds)[:, 0]\n        target = deepcopy(self.target)[:, 0]\n        target_nan = 
deepcopy(self.target_nan)[:, 0]\n\n        target[target < 0.5] = 0\n        target[target >= 0.5] = 1\n\n        target_nan[target_nan < 0.5] = 0\n        target_nan[target_nan >= 0.5] = 1\n\n        # Regular loss\n        score_true = auroc(preds, target.to(int))\n        score_ipu = auroc_ipu(preds, target)\n        self.assertFalse(score_true.isnan(), \"Regular AUROC score is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Regular AUROC score is different\"\n        )\n\n        # Weighted loss (As in BCE)\n        sample_weights = torch.rand(preds.shape[0], dtype=torch.float32)\n        score_true = auroc(preds, target.to(int), sample_weights=sample_weights)\n        score_ipu = auroc_ipu(preds, target, sample_weights=sample_weights)\n        self.assertFalse(score_true.isnan(), \"Regular AUROC score is NaN\")\n        self.assertAlmostEqual(score_true.item(), score_ipu.item(), msg=\"Weighted AUROC score is different\")\n\n        # Regular loss with NaNs in target\n        not_nan = ~target_nan.isnan()\n        score_true = auroc(preds[not_nan], target[not_nan].to(int))\n        score_ipu = auroc_ipu(preds, target_nan)\n        self.assertFalse(score_true.isnan(), \"Regular AUROC score with target_nan is NaN\")\n        self.assertFalse(score_ipu.isnan(), \"Regular AUROCIPU score with target_nan is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Regular AUROC score with NaN is different\"\n        )\n\n        # Weighted loss with NaNs in target (As in BCE)\n        not_nan = ~target_nan.isnan()\n        sample_weights = torch.rand(preds.shape, dtype=torch.float32)\n        loss_true = auroc(preds[not_nan], target_nan[not_nan].to(int), sample_weights=sample_weights[not_nan])\n        loss_ipu = auroc_ipu(preds, target_nan, sample_weights=sample_weights)\n        self.assertFalse(loss_true.isnan(), \"Weighted AUROC score with target_nan is 
NaN\")\n        self.assertFalse(loss_ipu.isnan(), \"Weighted AUROC IPU score with target_nan is NaN\")\n        self.assertAlmostEqual(\n            # AssertionError: 0.6603766679763794 != 0.6234951615333557 within 2 places\n            loss_true.item(),\n            loss_ipu.item(),\n            places=6,\n            msg=\"Weighted AUROC with NaN is different\",\n        )\n\n    def test_average_precision(self):  # TODO: Make work with multi-class\n        preds = deepcopy(self.preds)[:, 0]\n        target = deepcopy(self.target)[:, 0]\n        target_nan = deepcopy(self.target_nan)[:, 0]\n\n        target[target < 0.5] = 0\n        target[target >= 0.5] = 1\n\n        target_nan[target_nan < 0.5] = 0\n        target_nan[target_nan >= 0.5] = 1\n\n        # Regular loss\n        score_true = average_precision(preds, target.to(int), task=\"binary\")\n        score_ipu = average_precision_ipu(preds, target.to(int), task=\"binary\")\n        self.assertFalse(score_true.isnan(), \"Regular Average Precision is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Regular Average Precision is different\"\n        )\n\n        # Regular average precision with NaNs in target\n        not_nan = ~target_nan.isnan()\n        score_true = average_precision(preds[not_nan], target[not_nan].to(int), task=\"binary\")\n        score_ipu = average_precision_ipu(preds, target_nan, task=\"binary\")\n        self.assertFalse(score_true.isnan(), \"Regular Average Precision with target_nan is NaN\")\n        self.assertFalse(score_ipu.isnan(), \"Regular Average Precision IPU score with target_nan is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(),\n            score_ipu.item(),\n            places=6,\n            msg=\"Regular Average Precision with NaN is different\",\n        )\n\n    def test_precision(self):\n        preds = deepcopy(self.preds)[:, :4]\n        target = deepcopy(self.target)[:, 0]\n        
t = deepcopy(target)\n\n        target[t < 0.4] = 0\n        target[(t >= 0.4) & (t < 0.6)] = 1\n        target[(t >= 0.6) & (t < 0.8)] = 2\n        target[(t >= 0.8)] = 3\n\n        target_nan = deepcopy(target)\n        target_nan[self.is_nan[:, 0]] = float(\"nan\")\n        target_nan_bin = deepcopy(target_nan)\n        target_nan_bin[target_nan > 0] = 1\n\n        # Micro precision binary\n        score_true = precision(preds[:, 0], target.to(int) > 0, average=\"micro\")\n        score_ipu = precision_ipu(preds[:, 0], target > 0, average=\"micro\")\n        self.assertFalse(score_true.isnan(), \"Micro Precision binary is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Micro Precision binary is different\"\n        )\n\n        # Micro precision binary with NaNs in target\n        not_nan = ~target_nan.isnan()\n        score_true = precision(preds[:, 0][not_nan], target_nan_bin[not_nan].to(int), average=\"micro\")\n        score_ipu = precision_ipu(preds[:, 0], target_nan_bin, average=\"micro\")\n        self.assertFalse(score_true.isnan(), \"Micro Precision binary with target_nan is NaN\")\n        self.assertFalse(score_ipu.isnan(), \"Micro Precision binary IPU score with target_nan is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Micro Precision with NaN is different\"\n        )\n\n        # Micro precision\n        score_true = precision(preds, target.to(int), average=\"micro\")\n        score_ipu = precision_ipu(preds, target, average=\"micro\")\n        self.assertFalse(score_true.isnan(), \"Micro Precision is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Micro Precision is different\"\n        )\n\n        # Micro precision with NaNs in target\n        not_nan = ~target_nan.isnan()\n        score_true = precision(preds[not_nan], target[not_nan].to(int), average=\"micro\")\n       
 score_ipu = precision_ipu(preds, target_nan, average=\"micro\")\n        self.assertFalse(score_true.isnan(), \"Micro Precision with target_nan is NaN\")\n        self.assertFalse(score_ipu.isnan(), \"Micro Precision IPU score with target_nan is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Micro Precision with NaN is different\"\n        )\n\n        # Macro precision\n        score_true = precision(preds, target.to(int), average=\"macro\", num_classes=4)\n        score_ipu = precision_ipu(preds, target, average=\"macro\", num_classes=4)\n        self.assertFalse(score_true.isnan(), \"Macro Precision is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Macro Precision is different\"\n        )\n\n        # Macro precision multiclass with NaNs in target\n        not_nan = ~target_nan.isnan()\n        score_true = precision(preds[not_nan], target[not_nan].to(int), average=\"macro\", num_classes=4)\n        score_ipu = precision_ipu(preds, target_nan, average=\"macro\", num_classes=4)\n        self.assertFalse(score_true.isnan(), \"Macro Precision multiclass with target_nan is NaN\")\n        self.assertFalse(score_ipu.isnan(), \"Macro Precision multiclass IPU score with target_nan is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(),\n            score_ipu.item(),\n            places=6,\n            msg=\"Macro Precision multiclass with NaN is different\",\n        )\n\n        # Macro precision multiclass with NaNs in target\n        not_nan = ~target_nan.isnan()\n        score_true = precision(preds[not_nan], target[not_nan].to(int), average=\"macro\", num_classes=4)\n        score_ipu = precision_ipu(preds, target_nan, average=\"macro\", num_classes=4)\n        self.assertFalse(score_true.isnan(), \"Macro Precision multiclass with target_nan is NaN\")\n        self.assertFalse(score_ipu.isnan(), \"Macro Precision multiclass IPU 
score with target_nan is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(),\n            score_ipu.item(),\n            places=6,\n            msg=\"Macro Precision multiclass with NaN is different\",\n        )\n\n        # Weighted precision multiclass\n        score_true = precision(preds, target.to(int), average=\"weighted\", num_classes=4)\n        score_ipu = precision_ipu(preds, target, average=\"weighted\", num_classes=4)\n        self.assertFalse(score_true.isnan(), \"Weighted Precision multiclass is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Weighted Precision multiclass is different\"\n        )\n\n        # Weighted precision multiclass with NaNs in target\n        not_nan = ~target_nan.isnan()\n        score_true = precision(preds[not_nan], target[not_nan].to(int), average=\"weighted\", num_classes=4)\n        score_ipu = precision_ipu(preds, target_nan, average=\"weighted\", num_classes=4)\n        self.assertFalse(score_true.isnan(), \"Weighted Precision multiclass with target_nan is NaN\")\n        self.assertFalse(score_ipu.isnan(), \"Weighted Precision multiclass IPU score with target_nan is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(),\n            score_ipu.item(),\n            places=6,\n            msg=\"Regular Average Precision multiclass with NaN is different\",\n        )\n\n    def test_accuracy(self):\n        preds = deepcopy(self.preds)[:, :4]\n        target = deepcopy(self.target)[:, 0]\n        t = deepcopy(target)\n\n        target[t < 0.4] = 0\n        target[(t >= 0.4) & (t < 0.6)] = 1\n        target[(t >= 0.6) & (t < 0.8)] = 2\n        target[(t >= 0.8)] = 3\n\n        target_nan = deepcopy(target)\n        target_nan[self.is_nan[:, 0]] = float(\"nan\")\n        target_nan_bin = deepcopy(target_nan)\n        target_nan_bin[target_nan > 0] = 1\n\n        # Micro accuracy binary\n        score_true = accuracy(preds[:, 0], 
target.to(int) > 0, average=\"micro\")\n        score_ipu = accuracy_ipu(preds[:, 0], target > 0, average=\"micro\")\n        self.assertFalse(score_true.isnan(), \"Micro Accuracy binary is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Micro Accuracy binary is different\"\n        )\n\n        # Micro accuracy binary with NaNs in target\n        not_nan = ~target_nan.isnan()\n        score_true = accuracy(preds[:, 0][not_nan], target_nan_bin[not_nan].to(int), average=\"micro\")\n        score_ipu = accuracy_ipu(preds[:, 0], target_nan_bin, average=\"micro\")\n        self.assertFalse(score_true.isnan(), \"Micro Accuracy binary with target_nan is NaN\")\n        self.assertFalse(score_ipu.isnan(), \"Micro Accuracy binary IPU score with target_nan is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Micro Accuracy binary with NaN is different\"\n        )\n\n        # Micro accuracy\n        score_true = accuracy(preds, target.to(int), average=\"micro\")\n        score_ipu = accuracy_ipu(preds, target, average=\"micro\")\n        self.assertFalse(score_true.isnan(), \"Micro Accuracy is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Micro Accuracy is different\"\n        )\n\n        # Micro accuracy with NaNs in target\n        not_nan = ~target_nan.isnan()\n        score_true = accuracy(preds[not_nan], target[not_nan].to(int), average=\"micro\")\n        score_ipu = accuracy_ipu(preds, target_nan, average=\"micro\")\n        self.assertFalse(score_true.isnan(), \"Micro Accuracy with target_nan is NaN\")\n        self.assertFalse(score_ipu.isnan(), \"Micro Accuracy IPU score with target_nan is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Micro Accuracy with NaN is different\"\n        )\n\n        # Macro accuracy\n        score_true = 
accuracy(preds, target.to(int), average=\"macro\", num_classes=4)\n        score_ipu = accuracy_ipu(preds, target, average=\"macro\", num_classes=4)\n        self.assertFalse(score_true.isnan(), \"Macro Accuracy is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Macro Accuracy is different\"\n        )\n\n        # Macro accuracy with NaNs in target\n        not_nan = ~target_nan.isnan()\n        score_true = accuracy(preds[not_nan], target[not_nan].to(int), average=\"macro\", num_classes=4)\n        score_ipu = accuracy_ipu(preds, target_nan, average=\"macro\", num_classes=4)\n        self.assertFalse(score_true.isnan(), \"Macro Accuracy with target_nan is NaN\")\n        self.assertFalse(score_ipu.isnan(), \"Macro Accuracy IPU score with target_nan is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Macro Accuracy with NaN is different\"\n        )\n\n        # Weighted accuracy\n        score_true = accuracy(preds, target.to(int), average=\"weighted\", num_classes=4)\n        score_ipu = accuracy_ipu(preds, target, average=\"weighted\", num_classes=4)\n        self.assertFalse(score_true.isnan(), \"Weighted Accuracy is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Weighted Accuracy is different\"\n        )\n\n        # Weighted accuracy with NaNs in target\n        not_nan = ~target_nan.isnan()\n        score_true = accuracy(preds[not_nan], target[not_nan].to(int), average=\"weighted\", num_classes=4)\n        score_ipu = accuracy_ipu(preds, target_nan, average=\"weighted\", num_classes=4)\n        self.assertFalse(score_true.isnan(), \"Weighted Accuracy with target_nan is NaN\")\n        self.assertFalse(score_ipu.isnan(), \"Weighted Accuracy IPU score with target_nan is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Weighted 
Accuracy with NaN is different\"\n        )\n\n    def test_recall(self):\n        preds = deepcopy(self.preds)[:, :4]\n        target = deepcopy(self.target)[:, 0]\n        t = deepcopy(target)\n\n        target[t < 0.4] = 0\n        target[(t >= 0.4) & (t < 0.6)] = 1\n        target[(t >= 0.6) & (t < 0.8)] = 2\n        target[(t >= 0.8)] = 3\n\n        target_nan = deepcopy(target)\n        target_nan[self.is_nan[:, 0]] = float(\"nan\")\n        target_nan_bin = deepcopy(target_nan)\n        target_nan_bin[target_nan > 0] = 1\n\n        # Micro recall binary\n        score_true = recall(preds[:, 0], target.to(int) > 0, average=\"micro\")\n        score_ipu = recall_ipu(preds[:, 0], target > 0, average=\"micro\")\n        self.assertFalse(score_true.isnan(), \"Micro Recall binary is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Micro Recall binary is different\"\n        )\n\n        # Micro recall binary with NaNs in target\n        not_nan = ~target_nan.isnan()\n        score_true = recall(preds[:, 0][not_nan], target_nan_bin[not_nan].to(int), average=\"micro\")\n        score_ipu = recall_ipu(preds[:, 0], target_nan_bin, average=\"micro\")\n        self.assertFalse(score_true.isnan(), \"Micro Recall binary with target_nan is NaN\")\n        self.assertFalse(score_ipu.isnan(), \"Micro Recall binary IPU score with target_nan is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Micro Recall binary with NaN is different\"\n        )\n\n        # Micro recall\n        score_true = recall(preds, target.to(int), average=\"micro\")\n        score_ipu = recall_ipu(preds, target, average=\"micro\")\n        self.assertFalse(score_true.isnan(), \"Micro Recall is NaN\")\n        self.assertAlmostEqual(score_true.item(), score_ipu.item(), places=6, msg=\"Micro Recall is different\")\n\n        # Micro recall with NaNs in target\n        not_nan = 
~target_nan.isnan()\n        score_true = recall(preds[not_nan], target[not_nan].to(int), average=\"micro\")\n        score_ipu = recall_ipu(preds, target_nan, average=\"micro\")\n        self.assertFalse(score_true.isnan(), \"Micro Recall with target_nan is NaN\")\n        self.assertFalse(score_ipu.isnan(), \"Micro Recall IPU score with target_nan is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Micro Recall with NaN is different\"\n        )\n\n        # Macro recall multiclass\n        score_true = recall(preds, target.to(int), average=\"macro\", num_classes=4)\n        score_ipu = recall_ipu(preds, target, average=\"macro\", num_classes=4)\n        self.assertFalse(score_true.isnan(), \"Macro Recall is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Macro Recall multiclass is different\"\n        )\n\n        # Macro recall multiclass with NaNs in target\n        not_nan = ~target_nan.isnan()\n        score_true = recall(preds[not_nan], target[not_nan].to(int), average=\"macro\", num_classes=4)\n        score_ipu = recall_ipu(preds, target_nan, average=\"macro\", num_classes=4)\n        self.assertFalse(score_true.isnan(), \"Macro Recall multiclass with target_nan is NaN\")\n        self.assertFalse(score_ipu.isnan(), \"Macro Recall multiclass IPU score with target_nan is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Macro Recall multiclass with NaN is different\"\n        )\n\n        # Weighted recall multiclass\n        score_true = recall(preds, target.to(int), average=\"weighted\", num_classes=4)\n        score_ipu = recall_ipu(preds, target, average=\"weighted\", num_classes=4)\n        self.assertFalse(score_true.isnan(), \"Weighted Recall is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Weighted Recall is different\"\n  
      )\n\n        # Weighted recall multiclass with NaNs in target\n        not_nan = ~target_nan.isnan()\n        score_true = recall(preds[not_nan], target[not_nan].to(int), average=\"weighted\", num_classes=4)\n        score_ipu = recall_ipu(preds, target_nan, average=\"weighted\", num_classes=4)\n        self.assertFalse(score_true.isnan(), \"Weighted Recall multiclass with target_nan is NaN\")\n        self.assertFalse(score_ipu.isnan(), \"Weighted Recall multiclass IPU score with target_nan is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(),\n            score_ipu.item(),\n            places=6,\n            msg=\"Weighted Recall multiclass with NaN is different\",\n        )\n\n    def test_pearsonr(self):\n        preds = deepcopy(self.preds)[:, 0]\n        target = deepcopy(self.target)[:, 0] + preds\n        target_nan = deepcopy(target)\n        target_nan[self.is_nan[:, 0]] = float(\"nan\")\n\n        # Regular loss\n        score_true = pearson_corrcoef(preds, target)\n        score_ipu = pearson_ipu(preds, target)\n        self.assertFalse(score_true.isnan(), \"Pearson is NaN\")\n        self.assertAlmostEqual(score_true.item(), score_ipu.item(), places=4, msg=\"Pearson is different\")\n\n        # Regular loss with NaNs in target\n        not_nan = ~target_nan.isnan()\n        score_true = pearson_corrcoef(preds[not_nan], target[not_nan])\n        score_ipu = pearson_ipu(preds, target_nan)\n        self.assertFalse(score_true.isnan(), \"Regular PearsonR with target_nan is NaN\")\n        self.assertFalse(score_ipu.isnan(), \"IPU PearsonR score with target_nan is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=4, msg=\"Pearson with NaN is different\"\n        )\n\n    def test_spearmanr(self):\n        preds = deepcopy(self.preds)[:, 0]\n        target = deepcopy(self.target)[:, 0] + preds\n        target_nan = deepcopy(target)\n        target_nan[self.is_nan[:, 0]] = 
float(\"nan\")\n\n        # Regular loss\n        score_true = spearman_corrcoef(preds, target)\n        score_ipu = spearman_ipu(preds, target)\n        self.assertFalse(score_true.isnan(), \"Spearman is NaN\")\n        self.assertAlmostEqual(score_true.item(), score_ipu.item(), places=4, msg=\"Spearman is different\")\n\n        # Regular loss with NaNs in target\n        not_nan = ~target_nan.isnan()\n        score_true = spearman_corrcoef(preds[not_nan], target[not_nan])\n        score_ipu = spearman_ipu(preds, target_nan)\n        self.assertFalse(score_true.isnan(), \"Regular Spearman with target_nan is NaN\")\n        self.assertFalse(score_ipu.isnan(), \"IPU Spearman score with target_nan is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=4, msg=\"Spearman with NaN is different\"\n        )\n\n    def test_r2_score(self):\n        preds = deepcopy(self.preds)\n        target = deepcopy(self.target) + preds\n        target_nan = deepcopy(target)\n        target_nan[self.is_nan] = float(\"nan\")\n\n        # Regular loss\n        score_true = r2_score(preds, target)\n        score_ipu = r2_score_ipu(preds, target)\n        self.assertFalse(score_true.isnan(), \"r2_score is NaN\")\n        self.assertAlmostEqual(score_true.item(), score_ipu.item(), places=4, msg=\"r2_score is different\")\n\n        # Regular loss with NaNs in target\n        not_nan = ~target_nan.isnan()\n        score_ipu = r2_score_ipu(preds, target_nan, multioutput=\"raw_values\")\n        for ii in range(preds.shape[1]):\n            score_true = r2_score(\n                preds[:, ii][not_nan[:, ii]], target_nan[:, ii][not_nan[:, ii]], multioutput=\"raw_values\"\n            )\n            self.assertFalse(score_true.isnan().any(), f\"{ii}: r2_score with target_nan is NaN\")\n            self.assertFalse(score_ipu[ii].isnan().any(), f\"{ii}: IPU r2_score with target_nan is NaN\")\n            self.assertAlmostEqual(\n                
score_true.item(), score_ipu[ii].item(), places=4, msg=f\"{ii}: r2_score with NaN is different\"\n            )\n\n    def test_fbeta_score(self):\n        preds = deepcopy(self.preds)[:, :4]\n        target = deepcopy(self.target)[:, 0]\n        t = deepcopy(target)\n\n        target[t < 0.4] = 0\n        target[(t >= 0.4) & (t < 0.6)] = 1\n        target[(t >= 0.6) & (t < 0.8)] = 2\n        target[(t >= 0.8)] = 3\n\n        target_nan = deepcopy(target)\n        target_nan[self.is_nan[:, 0]] = float(\"nan\")\n        target_nan_bin = deepcopy(target_nan)\n        target_nan_bin[target_nan > 0] = 1\n\n        # Micro fbeta_score binary\n        score_true = fbeta_score(preds[:, 0], target.to(int) > 0, average=\"micro\", beta=0.5)\n        score_ipu = fbeta_score_ipu(preds[:, 0], target > 0, average=\"micro\", beta=0.5)\n        self.assertFalse(score_true.isnan(), \"Micro FBETA_score binary is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Micro FBETA_score binary is different\"\n        )\n\n        # Micro fbeta_score binary with NaNs in target\n        not_nan = ~target_nan.isnan()\n        score_true = fbeta_score(\n            preds[:, 0][not_nan], target_nan_bin[not_nan].to(int), average=\"micro\", beta=0.5\n        )\n        score_ipu = fbeta_score_ipu(preds[:, 0], target_nan_bin, average=\"micro\", beta=0.5)\n        self.assertFalse(score_true.isnan(), \"Micro FBETA_score binary with target_nan is NaN\")\n        self.assertFalse(score_ipu.isnan(), \"Micro FBETA_score binary IPU score with target_nan is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(),\n            score_ipu.item(),\n            places=6,\n            msg=\"Micro FBETA_score binary with NaN is different\",\n        )\n\n        # Micro fbeta_score\n        score_true = fbeta_score(preds, target.to(int), average=\"micro\", beta=0.5)\n        score_ipu = fbeta_score_ipu(preds, target, average=\"micro\", 
beta=0.5)\n        self.assertFalse(score_true.isnan(), \"Micro FBETA_score is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Micro FBETA_score is different\"\n        )\n\n        # Micro fbeta_score with NaNs in target\n        not_nan = ~target_nan.isnan()\n        score_true = fbeta_score(preds[not_nan], target[not_nan].to(int), average=\"micro\", beta=0.5)\n        score_ipu = fbeta_score_ipu(preds, target_nan, average=\"micro\", beta=0.5)\n        self.assertFalse(score_true.isnan(), \"Micro FBETA_score with target_nan is NaN\")\n        self.assertFalse(score_ipu.isnan(), \"Micro FBETA_score IPU score with target_nan is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Micro FBETA_score with NaN is different\"\n        )\n\n        # Macro fbeta_score multiclass\n        score_true = fbeta_score(preds, target.to(int), average=\"macro\", num_classes=4, beta=0.5)\n        score_ipu = fbeta_score_ipu(preds, target, average=\"macro\", num_classes=4, beta=0.5)\n        self.assertFalse(score_true.isnan(), \"Macro FBETA_score is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Macro FBETA_score multiclass is different\"\n        )\n\n        # Macro fbeta_score multiclass with NaNs in target\n        not_nan = ~target_nan.isnan()\n        score_true = fbeta_score(\n            preds[not_nan], target[not_nan].to(int), average=\"macro\", num_classes=4, beta=0.5\n        )\n        score_ipu = fbeta_score_ipu(preds, target_nan, average=\"macro\", num_classes=4, beta=0.5)\n        self.assertFalse(score_true.isnan(), \"Macro FBETA_score multiclass with target_nan is NaN\")\n        self.assertFalse(score_ipu.isnan(), \"Macro FBETA_score multiclass IPU score with target_nan is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(),\n            score_ipu.item(),\n            
places=6,\n            msg=\"Macro FBETA_score multiclass with NaN is different\",\n        )\n\n        # Weighted fbeta_score multiclass\n        score_true = fbeta_score(preds, target.to(int), average=\"weighted\", num_classes=4, beta=0.5)\n        score_ipu = fbeta_score_ipu(preds, target, average=\"weighted\", num_classes=4, beta=0.5)\n        self.assertFalse(score_true.isnan(), \"Weighted FBETA_score is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Weighted FBETA_score is different\"\n        )\n\n        # Weighted fbeta_score multiclass with NaNs in target\n        not_nan = ~target_nan.isnan()\n        score_true = fbeta_score(\n            preds[not_nan], target[not_nan].to(int), average=\"weighted\", num_classes=4, beta=0.5\n        )\n        score_ipu = fbeta_score_ipu(preds, target_nan, average=\"weighted\", num_classes=4, beta=0.5)\n        self.assertFalse(score_true.isnan(), \"Weighted FBETA_score multiclass with target_nan is NaN\")\n        self.assertFalse(\n            score_ipu.isnan(), \"Weighted FBETA_score multiclass IPU score with target_nan is NaN\"\n        )\n        self.assertAlmostEqual(\n            score_true.item(),\n            score_ipu.item(),\n            places=6,\n            msg=\"Weighted FBETA_score multiclass with NaN is different\",\n        )\n\n    def test_f1_score(self):\n        preds = deepcopy(self.preds)[:, :4]\n        target = deepcopy(self.target)[:, 0]\n        t = deepcopy(target)\n\n        target[t < 0.4] = 0\n        target[(t >= 0.4) & (t < 0.6)] = 1\n        target[(t >= 0.6) & (t < 0.8)] = 2\n        target[(t >= 0.8)] = 3\n\n        target_nan = deepcopy(target)\n        target_nan[self.is_nan[:, 0]] = float(\"nan\")\n        target_nan_bin = deepcopy(target_nan)\n        target_nan_bin[target_nan > 0] = 1\n\n        # Micro f1_score binary\n        score_true = f1_score(preds[:, 0], target.to(int) > 0, average=\"micro\")\n        score_ipu = 
f1_score_ipu(preds[:, 0], target > 0, average=\"micro\")\n        self.assertFalse(score_true.isnan(), \"Micro F1_score binary is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Micro F1_score binary is different\"\n        )\n\n        # Micro f1_score binary with NaNs in target\n        not_nan = ~target_nan.isnan()\n        score_true = f1_score(preds[:, 0][not_nan], target_nan_bin[not_nan].to(int), average=\"micro\")\n        score_ipu = f1_score_ipu(preds[:, 0], target_nan_bin, average=\"micro\")\n        self.assertFalse(score_true.isnan(), \"Micro F1_score binary with target_nan is NaN\")\n        self.assertFalse(score_ipu.isnan(), \"Micro F1_score binary IPU score with target_nan is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Micro F1_score binary with NaN is different\"\n        )\n\n        # Micro f1_score\n        score_true = f1_score(preds, target.to(int), average=\"micro\")\n        score_ipu = f1_score_ipu(preds, target, average=\"micro\")\n        self.assertFalse(score_true.isnan(), \"Micro F1_score is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Micro F1_score is different\"\n        )\n\n        # Micro f1_score with NaNs in target\n        not_nan = ~target_nan.isnan()\n        score_true = f1_score(preds[not_nan], target[not_nan].to(int), average=\"micro\")\n        score_ipu = f1_score_ipu(preds, target_nan, average=\"micro\")\n        self.assertFalse(score_true.isnan(), \"Micro F1_score with target_nan is NaN\")\n        self.assertFalse(score_ipu.isnan(), \"Micro F1_score IPU score with target_nan is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Micro F1_score with NaN is different\"\n        )\n\n        # Macro f1_score multiclass\n        score_true = f1_score(preds, target.to(int), 
average=\"macro\", num_classes=4)\n        score_ipu = f1_score_ipu(preds, target, average=\"macro\", num_classes=4)\n        self.assertFalse(score_true.isnan(), \"Macro F1_score is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Macro F1_score multiclass is different\"\n        )\n\n        # Macro f1_score multiclass with NaNs in target\n        not_nan = ~target_nan.isnan()\n        score_true = f1_score(preds[not_nan], target[not_nan].to(int), average=\"macro\", num_classes=4)\n        score_ipu = f1_score_ipu(preds, target_nan, average=\"macro\", num_classes=4)\n        self.assertFalse(score_true.isnan(), \"Macro F1_score multiclass with target_nan is NaN\")\n        self.assertFalse(score_ipu.isnan(), \"Macro F1_score multiclass IPU score with target_nan is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(),\n            score_ipu.item(),\n            places=6,\n            msg=\"Macro F1_score multiclass with NaN is different\",\n        )\n\n        # Weighted f1_score multiclass\n        score_true = f1_score(preds, target.to(int), average=\"weighted\", num_classes=4)\n        score_ipu = f1_score_ipu(preds, target, average=\"weighted\", num_classes=4)\n        self.assertFalse(score_true.isnan(), \"Weighted F1_score is NaN\")\n        self.assertAlmostEqual(\n            score_true.item(), score_ipu.item(), places=6, msg=\"Weighted F1_score is different\"\n        )\n\n        # Weighted f1_score multiclass with NaNs in target\n        not_nan = ~target_nan.isnan()\n        score_true = f1_score(preds[not_nan], target[not_nan].to(int), average=\"weighted\", num_classes=4)\n        score_ipu = f1_score_ipu(preds, target_nan, average=\"weighted\", num_classes=4)\n        self.assertFalse(score_true.isnan(), \"Weighted F1_score multiclass with target_nan is NaN\")\n        self.assertFalse(score_ipu.isnan(), \"Weighted F1_score multiclass IPU score with target_nan is NaN\")\n      
  self.assertAlmostEqual(\n            score_true.item(),\n            score_ipu.item(),\n            places=6,\n            msg=\"Weighted F1_score multiclass with NaN is different\",\n        )\n\n    def test_mse(self):\n        preds = deepcopy(self.preds)\n        target = deepcopy(self.target)\n        target_nan = deepcopy(self.target_nan)\n        squared = True\n\n        # Regular loss\n        loss_true = mean_squared_error(preds, target, squared)\n        loss_ipu = mean_squared_error_ipu(preds=preds, target=target, squared=squared)\n        self.assertFalse(loss_true.isnan(), \"Regular Mean Squared Error is NaN\")\n        self.assertAlmostEqual(\n            loss_true.item(), loss_ipu.item(), places=6, msg=\"Regular Mean Squared Error is different\"\n        )\n\n        # Regular loss with NaNs in target\n        not_nan = ~target_nan.isnan()\n        loss_true = mean_squared_error(preds[not_nan], target[not_nan], squared)\n        loss_ipu = mean_squared_error_ipu(preds=preds, target=target_nan, squared=squared)\n        self.assertFalse(loss_true.isnan(), \"Regular Mean Squared Error with target_nan is NaN\")\n        self.assertFalse(loss_ipu.isnan(), \"Regular Mean Squared Error IPU with target_nan is NaN\")\n        self.assertAlmostEqual(\n            loss_true.item(),\n            loss_ipu.item(),\n            places=6,\n            msg=\"Regular Mean Squared Error with NaN is different\",\n        )\n\n        squared = False\n\n        # Regular loss\n        loss_true = mean_squared_error(preds, target, squared)\n        loss_ipu = mean_squared_error_ipu(preds=preds, target=target, squared=squared)\n        self.assertFalse(loss_true.isnan(), \"Regular Mean Squared Error is NaN\")\n        self.assertAlmostEqual(\n            loss_true.item(), loss_ipu.item(), places=6, msg=\"Regular Mean Squared Error is different\"\n        )\n\n        # Regular loss with NaNs in target\n        not_nan = ~target_nan.isnan()\n        loss_true = 
mean_squared_error(preds[not_nan], target[not_nan], squared)\n        loss_ipu = mean_squared_error_ipu(preds=preds, target=target_nan, squared=squared)\n        self.assertFalse(loss_true.isnan(), \"Regular Mean Squared Error with target_nan is NaN\")\n        self.assertFalse(loss_ipu.isnan(), \"Regular Mean Squared Error IPU with target_nan is NaN\")\n        self.assertAlmostEqual(\n            loss_true.item(),\n            loss_ipu.item(),\n            places=6,\n            msg=\"Regular Mean Squared Error with NaN is different\",\n        )\n\n    def test_mae(self):\n        preds = deepcopy(self.preds)\n        target = deepcopy(self.target)\n        target_nan = deepcopy(self.target_nan)\n\n        # Regular loss\n        loss_true = mean_absolute_error(preds, target)\n        loss_ipu = mean_absolute_error_ipu(preds=preds, target=target)\n        self.assertFalse(loss_true.isnan(), \"Regular Mean Absolute Error is NaN\")\n        self.assertAlmostEqual(\n            loss_true.item(), loss_ipu.item(), places=6, msg=\"Regular Mean Absolute Error is different\"\n        )\n\n        # Regular loss with NaNs in target\n        not_nan = ~target_nan.isnan()\n        loss_true = mean_absolute_error(preds[not_nan], target[not_nan])\n        loss_ipu = mean_absolute_error_ipu(preds=preds, target=target_nan)\n        self.assertFalse(loss_true.isnan(), \"Regular Mean Absolute Error with target_nan is NaN\")\n        self.assertFalse(loss_ipu.isnan(), \"Regular Mean Absolute Error IPU with target_nan is NaN\")\n        self.assertAlmostEqual(\n            loss_true.item(),\n            loss_ipu.item(),\n            places=6,\n            msg=\"Regular Mean Absolute Error with NaN is different\",\n        )\n\n\nif __name__ == \"__main__\":\n    ut.main()\n"
  },
  {
    "path": "tests/test_ipu_options.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nimport pytest\nfrom graphium.config._loader import _get_ipu_opts, load_ipu_options\nfrom graphium.ipu.ipu_utils import ipu_options_list_to_file\n\nimport tempfile\nfrom typing import Optional, List\nimport os\n\nCONFIG_EXTRACT = {\n    \"trainer\": {\"trainer\": {\"accumulate_grad_batches\": 10}},\n    \"accelerator\": {\n        \"type\": \"ipu\",\n        \"config_override\": {\n            \"datamodule\": {\n                \"args\": {\n                    \"ipu_dataloader_training_opts\": {\n                        \"mode\": \"async\",\n                        \"max_num_nodes_per_graph\": 44,\n                        \"max_num_edges_per_graph\": 80,\n                    },\n                    \"ipu_dataloader_inference_opts\": {\n                        \"mode\": \"async\",\n                        \"max_num_nodes_per_graph\": 44,\n                        \"max_num_edges_per_graph\": 80,\n                    },\n                    \"batch_size_training\": 50,\n                    \"batch_size_inference\": 50,\n                }\n            },\n            \"predictor\": {\"optim_kwargs\": {\"loss_scaling\": 1024}},\n            \"trainer\": {\"trainer\": {\"precision\": 16, \"accumulate_grad_batches\": 4}},\n        },\n        \"ipu_config\": [\n            \"deviceIterations(5)\",\n            
\"replicationFactor(16)\",\n            \"TensorLocations.numIOTiles(128)\",\n            '_Popart.set(\"defaultBufferingDepth\", 128)',\n            \"Precision.enableStochasticRounding(True)\",\n        ],\n        \"ipu_inference_config\": [\n            \"deviceIterations(1)\",\n            \"replicationFactor(4)\",\n            \"TensorLocations.numIOTiles(32)\",\n            '_Popart.set(\"defaultBufferingDepth\", 16)',\n            \"Precision.enableStochasticRounding(True)\",\n        ],\n    },\n}\n\n\n@pytest.mark.ipu\ndef test_ipu_options():\n    try:\n        import poptorch\n\n        ipu_opts, ipu_inference_opts = _get_ipu_opts(CONFIG_EXTRACT)\n\n        # Define the expected IPU options for comparison\n        expected_ipu_opts = [\n            \"deviceIterations(5)\",\n            \"replicationFactor(16)\",\n            \"TensorLocations.numIOTiles(128)\",\n            '_Popart.set(\"defaultBufferingDepth\", 128)',\n            \"Precision.enableStochasticRounding(True)\",\n        ]\n        expected_ipu_inference_opts = [\n            \"deviceIterations(1)\",\n            \"replicationFactor(4)\",\n            \"TensorLocations.numIOTiles(32)\",\n            '_Popart.set(\"defaultBufferingDepth\", 16)',\n            \"Precision.enableStochasticRounding(True)\",\n        ]\n\n        # Test the _get_ipu_opts method\n        ipu_opts, ipu_inference_opts = _get_ipu_opts(CONFIG_EXTRACT)\n        assert ipu_opts == expected_ipu_opts, f\"Expected {expected_ipu_opts}, but got {ipu_opts}\"\n        assert (\n            ipu_inference_opts == expected_ipu_inference_opts\n        ), f\"Expected {expected_ipu_inference_opts}, but got {ipu_inference_opts}\"\n\n        # Test the load_ipu_options method\n        ipu_training_opts, ipu_inference_opts = load_ipu_options(\n            ipu_opts=ipu_opts,\n            seed=42,\n            model_name=\"test_model\",\n            
gradient_accumulation=CONFIG_EXTRACT[\"trainer\"][\"trainer\"].get(\"accumulate_grad_batches\", None),\n            ipu_inference_opts=ipu_inference_opts,\n        )\n\n        # Ensure that the options objects are not None\n        assert ipu_training_opts is not None, \"Expected ipu_training_opts not to be None\"\n        assert ipu_inference_opts is not None, \"Expected ipu_inference_opts not to be None\"\n\n        # Test the properties of the options objects\n        assert (\n            ipu_training_opts.replication_factor == 16\n        ), \"Expected replication_factor of ipu_training_opts to be 16\"\n        assert (\n            ipu_inference_opts.replication_factor == 4\n        ), \"Expected replication_factor of ipu_inference_opts to be 4\"\n        assert ipu_training_opts._popart, \"Expected _popart of ipu_training_opts to be True\"\n        assert ipu_inference_opts._popart, \"Expected _popart of ipu_inference_opts to be True\"\n\n    except ImportError:\n        pytest.skip(\"Skipping this test because poptorch is not available\")\n\n\n@pytest.mark.ipu\ndef test_ipu_options_list_to_file():\n    # Define a list of IPU options\n    ipu_options = [\n        \"deviceIterations(5)\",\n        \"replicationFactor(16)\",\n        \"TensorLocations.numIOTiles(128)\",\n        '_Popart.set(\"defaultBufferingDepth\", 128)',\n        \"Precision.enableStochasticRounding(True)\",\n    ]\n\n    # Call the function with the list of IPU options\n    tmp_file = ipu_options_list_to_file(ipu_options)\n\n    # Check that the function returns a temporary file object\n    assert isinstance(tmp_file, tempfile._TemporaryFileWrapper)\n\n    # Check that the temporary file exists\n    assert os.path.exists(tmp_file.name)\n\n    # Check the contents of the temporary file\n    with open(tmp_file.name, \"r\") as f:\n        contents = f.read().splitlines()\n    assert contents == ipu_options\n\n    # Check the behavior when the input is None\n    tmp_file = 
ipu_options_list_to_file(None)\n    assert tmp_file is None\n"
  },
  {
    "path": "tests/test_ipu_poptorch.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nimport pytest\n\n\n@pytest.mark.ipu\ndef test_poptorch():\n    # Run this test only if poptorch is available\n    # Primarily to test the install and SDK is correctly activated\n    try:\n        import poptorch\n\n        opts = poptorch.Options()\n\n    except ImportError:\n        raise ImportError\n    assert True\n"
  },
  {
    "path": "tests/test_ipu_to_dense_batch.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nimport pytest\nimport torch\nfrom torch_geometric.data import Data, Batch\nfrom graphium.ipu.to_dense_batch import to_dense_batch\nfrom warnings import warn\n\n\n# General imports\nimport yaml\nimport unittest as ut\nimport numpy as np\nfrom copy import deepcopy\nfrom warnings import warn\nfrom lightning import Trainer, LightningModule\nfrom lightning_graphcore import IPUStrategy\nfrom functools import partial\n\nimport torch\nfrom torch.utils.data.dataloader import default_collate\n\n# Current library imports\nfrom graphium.config._loader import load_datamodule, load_metrics, load_architecture, load_accelerator\n\n\n@pytest.mark.ipu\nclass TestIPUBatch:\n    @pytest.fixture(autouse=True)\n    def setup_class(self):\n        self.in_dim = 12\n        self.out_dim = 12\n        self.in_dim_edges = 10\n        self.out_dim_edges = 10\n        self.edge_idx1 = torch.stack(\n            [torch.tensor([0, 1, 2, 3, 2], dtype=torch.int), torch.tensor([1, 2, 3, 0, 0], dtype=torch.int)]\n        )\n        self.edge_idx2 = torch.stack(\n            [torch.tensor([0, 0, 0, 1], dtype=torch.int), torch.tensor([0, 1, 2, 0], dtype=torch.int)]\n        )\n        self.x1 = torch.randn(self.edge_idx1.max().item() + 1, self.in_dim, dtype=torch.float32)\n        self.e1 = torch.randn(self.edge_idx1.shape[-1], self.in_dim_edges, 
dtype=torch.float32)\n        self.x2 = torch.randn(self.edge_idx2.max().item() + 1, self.in_dim, dtype=torch.float32)\n        self.e2 = torch.randn(self.edge_idx2.shape[-1], self.in_dim_edges, dtype=torch.float32)\n        self.g1 = Data(feat=self.x1, edge_index=self.edge_idx1, edge_feat=self.e1)\n        self.g2 = Data(feat=self.x2, edge_index=self.edge_idx2, edge_feat=self.e2)\n        self.bg = Batch.from_data_list([self.g1, self.g2])\n        self.attn_kwargs = {\"embed_dim\": self.in_dim, \"num_heads\": 2, \"batch_first\": True}\n\n    # @pytest.mark.skip\n    @pytest.mark.parametrize(\"max_num_nodes_per_graph, batch_size\", [(10, 5), (20, 10), (30, 15)])\n    def test_ipu_to_dense_batch(self, max_num_nodes_per_graph, batch_size):\n        # Run this test only if poptorch is available\n        try:\n            import poptorch\n\n            opts = poptorch.Options()\n            opts.useIpuModel(True)\n\n            class MyModel(torch.nn.Module):\n                def __init__(self):\n                    super(MyModel, self).__init__()\n\n                def forward(self, x, batch):\n                    return to_dense_batch(\n                        x,\n                        batch=batch,\n                        batch_size=batch_size,\n                        max_num_nodes_per_graph=max_num_nodes_per_graph,\n                        drop_nodes_last_graph=False,\n                    )\n\n            model = MyModel()\n            model = model.eval()\n            poptorch_model_inf = poptorch.inferenceModel(model, options=opts)\n            # for data in train_dataloader:\n            out, mask, idx = poptorch_model_inf(self.bg.feat, self.bg.batch)\n            # Check the output sizes\n            assert out.size() == torch.Size([batch_size, max_num_nodes_per_graph, 12])\n            # Check the mask for true / false values\n            assert mask.size() == torch.Size([batch_size, max_num_nodes_per_graph])\n            assert torch.sum(mask) == 7\n       
     assert (mask[0][:4] == True).all()\n            assert (mask[0][4:] == False).all()\n            assert (mask[1][:3] == True).all()\n            assert (mask[1][3:] == False).all()\n            assert (mask[2:] == False).all()\n\n            # Check the idx are all the true values in the mask\n            assert (mask.flatten()[idx] == True).all()\n            poptorch_model_inf.detachFromDevice()\n        except ImportError:\n            pytest.skip(\"Skipping this test because poptorch is not available\")\n\n    def test_ipu_to_dense_batch_no_batch_no_max_nodes(self):\n        h_dense, mask = to_dense_batch(\n            self.bg.feat,\n            batch=None,\n            batch_size=None,\n            max_num_nodes_per_graph=None,\n            drop_nodes_last_graph=False,\n        )\n        # Add assertions to check the output as needed\n        assert torch.allclose(h_dense, self.bg.feat.unsqueeze(0), atol=1e-5), \"Tensors are not equal\"\n        assert mask.size(1) == h_dense.size(1)\n        assert mask.all().item(), \"Not all values in the tensor are True\"\n\n    def test_ipu_to_dense_batch_no_batch(self):\n        max_nodes_per_graph = 10\n        h_dense, mask, id = to_dense_batch(\n            self.bg.feat,\n            batch=None,\n            batch_size=None,\n            max_num_nodes_per_graph=max_nodes_per_graph,\n            drop_nodes_last_graph=False,\n        )\n        assert mask.size() == (1, max_nodes_per_graph)\n        assert torch.sum(mask) == 7\n        assert torch.equal(id, torch.arange(7))\n        assert h_dense.size() == (1, max_nodes_per_graph, self.bg.feat.size(-1))\n\n    def test_ipu_to_dense_batch_drop_last(self):\n        out, mask, idx = to_dense_batch(\n            self.bg.feat,\n            batch=None,\n            batch_size=None,\n            max_num_nodes_per_graph=3,\n            drop_nodes_last_graph=True,\n        )\n        # Add assertions to check the output as needed\n        assert mask.size(1) == 
out.size(1)\n        # Check the mask and output have been clipped\n        assert mask.size() == torch.Size([1, 3])\n        assert mask.all().item(), \"Not all values in the tensor are True\"\n"
  },
  {
    "path": "tests/test_loaders.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nfrom graphium.config._loader import merge_dicts\nfrom copy import deepcopy\nimport unittest as ut\n\n\nclass TestLoader(ut.TestCase):\n    def test_merge_dicts(self):\n        dict_a = {\"a\": {\"b\": {\"c\": 1, \"d\": 2}, \"e\": 3}, \"f\": 4}\n\n        dict_b = {\"a\": {\"b\": {\"g\": 6}}, \"h\": 7}\n\n        dict_c = {\n            \"a\": {\n                \"b\": {\n                    \"c\": 1,\n                },\n            }\n        }\n\n        # Check that the new keys are added correctly\n        merge_dicts(dict_a, dict_b)\n        self.assertEqual(dict_a[\"a\"][\"b\"][\"g\"], dict_b[\"a\"][\"b\"][\"g\"])\n        self.assertEqual(dict_a[\"h\"], dict_b[\"h\"])\n\n        # Check that no error is thrown if a key exists, but the value is identical\n        merge_dicts(dict_a, dict_c)\n        self.assertEqual(dict_a[\"a\"][\"b\"][\"c\"], dict_c[\"a\"][\"b\"][\"c\"])\n\n        # Check that an error is thrown if a key exists, but the value is different\n        dict_d = deepcopy(dict_c)\n        dict_d[\"a\"][\"b\"][\"c\"] = 2\n        with self.assertRaises(ValueError):\n            merge_dicts(dict_a, dict_d)\n\n\n# Main\nif __name__ == \"__main__\":\n    ut.main()\n"
  },
  {
    "path": "tests/test_losses.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\n\"\"\"\nUnit tests for the metrics and wrappers of graphium/trainer/metrics/...\n\"\"\"\n\nimport torch\nimport unittest as ut\nfrom torch.nn import functional as F\n\nfrom graphium.trainer.losses import HybridCELoss\nfrom graphium.trainer.predictor_options import EvalOptions\n\n\ndef _parse(loss_fun):\n    eval_options = EvalOptions(loss_fun=loss_fun, metrics_on_progress_bar=None)\n    return eval_options.parse_loss_fun(loss_fun)\n\n\nclass test_HybridCELoss(ut.TestCase):\n    input = torch.Tensor([[0.1, 0.1, 0.3, 0.5, 0.0], [0.1, 0.0, 0.7, 0.2, 0.0]])\n    target = torch.Tensor([3, 0]).long()\n    brackets = torch.Tensor([0, 1, 2, 3, 4])\n    regression_input = torch.Tensor([2.0537, 2.0017])  # inner product of input and brackets\n    regression_target = torch.Tensor([3, 0]).float()\n\n    def test_pure_ce_loss(self):\n        loss = HybridCELoss(n_brackets=len(self.brackets), alpha=1.0, reduction=\"none\")\n        assert torch.allclose(\n            loss(self.input, self.target),\n            F.cross_entropy(self.input, self.target, reduction=\"none\"),\n        )\n        assert loss(self.input, self.target).shape == (2,)\n\n    def test_pure_mae_loss(self):\n        loss = HybridCELoss(\n            n_brackets=len(self.brackets),\n            alpha=0.0,\n            regression_loss=\"mae\",\n            
reduction=\"none\",\n        )\n        assert torch.allclose(\n            loss(self.input, self.target),\n            F.l1_loss(self.regression_input, self.regression_target, reduction=\"none\"),\n            rtol=1e-04,\n            atol=1e-07,\n        )\n        assert loss(self.input, self.target).shape == (2,)\n\n    def test_pure_mse_loss(self):\n        loss = HybridCELoss(\n            n_brackets=len(self.brackets),\n            alpha=0.0,\n            regression_loss=\"mse\",\n            reduction=\"none\",\n        )\n\n        assert torch.allclose(\n            loss(self.input, self.target),\n            F.mse_loss(self.regression_input, self.regression_target, reduction=\"none\"),\n            rtol=1e-04,\n            atol=1e-07,\n        )\n        assert loss(self.input, self.target).shape == (2,)\n\n    def test_hybrid_loss(self):\n        loss = HybridCELoss(n_brackets=len(self.brackets), alpha=0.5, regression_loss=\"mse\")\n\n        ce_loss = F.cross_entropy(self.input, self.target)\n        mse_loss = F.mse_loss(self.regression_input, self.regression_target)\n\n        assert torch.allclose(\n            loss(self.input, self.target), 0.5 * ce_loss + 0.5 * mse_loss, rtol=1e-04, atol=1e-07\n        )\n        assert loss(self.input, self.target).shape == torch.Size([])\n\n    def test_loss_parser(self):\n        # HybridCE cannot be parsed from a string because it requires specifying n_brackets\n        loss_fun = \"hybrid_ce\"\n        self.assertRaises(TypeError, _parse, loss_fun=loss_fun)\n\n        # HybridCE requires n_brackets to be specified\n        loss_fun = {\"name\": \"hybrid_ce\"}\n        self.assertRaises(TypeError, _parse, loss_fun=loss_fun)\n\n        loss_fun = {\"name\": \"hybrid_ce\", \"n_brackets\": 3}\n        parsed = _parse(loss_fun)\n\n        assert isinstance(parsed, HybridCELoss)\n        assert len(parsed.brackets) == 3\n        assert parsed.regression_loss == F.mse_loss\n\n        loss_fun = {\"name\": 
\"hybrid_ce\", \"n_brackets\": 5, \"regression_loss\": \"mae\"}\n        parsed = _parse(loss_fun)\n\n        assert isinstance(parsed, HybridCELoss)\n        assert len(parsed.brackets) == 5\n        assert parsed.regression_loss == F.l1_loss\n\n\nclass test_BCELoss(ut.TestCase):\n    def test_loss_parser(self):\n        loss_fun = \"bce\"\n        parsed = _parse(loss_fun)\n\n        assert isinstance(parsed, torch.nn.BCELoss)\n\n        loss_fun = {\"name\": \"bce\"}\n        parsed = _parse(loss_fun)\n\n        assert isinstance(parsed, torch.nn.BCELoss)\n        assert parsed.reduction != \"sum\"\n\n        loss_fun = {\"name\": \"bce\", \"reduction\": \"sum\"}\n        parsed = _parse(loss_fun)\n\n        assert isinstance(parsed, torch.nn.BCELoss)\n        assert parsed.reduction == \"sum\"\n\n\nif __name__ == \"__main__\":\n    ut.main()\n"
  },
  {
    "path": "tests/test_metrics.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\n\"\"\"\nUnit tests for the metrics and wrappers of graphium/trainer/metrics/...\n\"\"\"\n\nimport torch\nimport unittest as ut\nimport tempfile\nimport os\nimport operator as op\n\nfrom graphium.trainer.metrics import (\n    MetricWrapper,\n    Thresholder,\n)\n\nfrom torchmetrics.functional import mean_squared_error\n\n\nclass test_Metrics(ut.TestCase):\n    def test_thresholder(self):\n        torch.manual_seed(42)\n        preds = torch.rand(100, dtype=torch.float32)\n        target = torch.rand(100, dtype=torch.float32)\n\n        th = 0.7\n        preds_greater = preds > th\n        target_greater = target > th\n\n        # Test thresholder greater\n        for th_on_preds in [True, False]:\n            for th_on_target in [True, False]:\n                thresholder = Thresholder(\n                    threshold=th, operator=\"greater\", th_on_target=th_on_target, th_on_preds=th_on_preds\n                )\n                preds2, target2 = thresholder(preds, target)\n                if th_on_preds:\n                    self.assertListEqual(preds2.tolist(), preds_greater.tolist())\n                else:\n                    self.assertListEqual(preds2.tolist(), preds.tolist())\n                if th_on_target:\n                    self.assertListEqual(target2.tolist(), target_greater.tolist())\n            
    else:\n                    self.assertListEqual(target2.tolist(), target.tolist())\n\n        # Test thresholder lower\n        for th_on_preds in [True, False]:\n            for th_on_target in [True, False]:\n                thresholder = Thresholder(\n                    threshold=th, operator=\"lower\", th_on_target=th_on_target, th_on_preds=th_on_preds\n                )\n                preds2, target2 = thresholder(preds, target)\n                if th_on_preds:\n                    self.assertListEqual(preds2.tolist(), (~preds_greater).tolist())\n                else:\n                    self.assertListEqual(preds2.tolist(), preds.tolist())\n                if th_on_target:\n                    self.assertListEqual(target2.tolist(), (~target_greater).tolist())\n                else:\n                    self.assertListEqual(target2.tolist(), target.tolist())\n\n\nclass test_MetricWrapper(ut.TestCase):\n    def test_target_nan_mask(self):\n        torch.random.manual_seed(42)\n\n        for sz in [(100,), (100, 1), (100, 10)]:\n            err_msg = f\"Error for `sz = {sz}`\"\n\n            # Generate prediction and target matrices\n            target = torch.rand(sz, dtype=torch.float32)\n            preds = (0.5 * target) + (0.5 * torch.rand(sz, dtype=torch.float32))\n            is_nan = torch.rand(sz) > 0.3\n            target = (target > 0.5).to(torch.float32)\n            target[is_nan] = float(\"nan\")\n\n            # Compute score with different ways of ignoring NaNs\n            metric = MetricWrapper(metric=\"mse\", target_nan_mask=None)\n            score1 = metric(preds, target)\n            self.assertTrue(torch.isnan(score1), msg=err_msg)\n\n            # Replace NaNs by 0\n            metric = MetricWrapper(metric=\"mse\", target_nan_mask=0)\n            score2 = metric(preds, target)\n\n            this_target = target.clone()\n            this_target[is_nan] = 0\n            this_preds = preds.clone()\n            
self.assertAlmostEqual(score2, mean_squared_error(this_preds, this_target), msg=err_msg)\n\n            # Replace NaNs by 1.5\n            metric = MetricWrapper(metric=\"mse\", target_nan_mask=1.5)\n            score3 = metric(preds, target)\n\n            this_target = target.clone()\n            this_target[is_nan] = 1.5\n            this_preds = preds.clone()\n            self.assertAlmostEqual(score3, mean_squared_error(this_preds, this_target), msg=err_msg)\n\n            # Flatten matrix and ignore NaNs\n            metric = MetricWrapper(metric=\"mse\", target_nan_mask=\"ignore\", multitask_handling=\"flatten\")\n            score4 = metric(preds, target)\n\n            this_target = target.clone()[~is_nan]\n            this_preds = preds.clone()[~is_nan]\n            self.assertAlmostEqual(score4, mean_squared_error(this_preds, this_target), msg=err_msg)\n\n            # Ignore NaNs in each column and average the score\n            metric = MetricWrapper(\n                metric=\"mse\", target_nan_mask=\"ignore\", multitask_handling=\"mean-per-label\"\n            )\n            score5 = metric(preds, target)\n\n            this_target = target.clone()\n            this_preds = preds.clone()\n            this_is_nan = is_nan.clone()\n            if len(sz) == 1:\n                this_target = target.unsqueeze(-1)\n                this_preds = preds.unsqueeze(-1)\n                this_is_nan = is_nan.unsqueeze(-1)\n\n            this_target = [this_target[:, ii][~this_is_nan[:, ii]] for ii in range(this_target.shape[1])]\n            this_preds = [this_preds[:, ii][~this_is_nan[:, ii]] for ii in range(this_preds.shape[1])]\n            mse = []\n            for ii in range(len(this_preds)):\n                mse.append(mean_squared_error(this_preds[ii], this_target[ii]))\n            mse = torch.mean(torch.stack(mse))\n            self.assertAlmostEqual(score5.tolist(), mse.tolist(), msg=err_msg)\n\n    def test_pickling(self):\n        pickle_file = 
os.path.join(tempfile.gettempdir(), \"test_metric_pickled.pkl\")\n        metrics = [\"mae\", \"mse\", mean_squared_error]\n        target_nan_masks = [None, 2, \"ignore\"]\n        multitask_handlings = [None, \"flatten\", \"mean-per-label\"]\n        squeeze_targets = [True, False]\n        target_to_ints = [True, False]\n        other_kwargs = [{}, {\"squared\": False}]\n        thresholds = [\n            None,\n            {\"threshold\": 0.2, \"operator\": \"greater\"},\n            {\"threshold\": 0.3, \"operator\": op.lt},\n            {\"threshold\": 0.4, \"operator\": \"lower\"},\n            {\"threshold\": 0.5, \"operator\": \"lower\", \"th_on_preds\": False, \"th_on_target\": True},\n            {\"threshold\": 0.6, \"operator\": \"lower\"},\n        ]\n\n        for metric in metrics:\n            for target_nan_mask in target_nan_masks:\n                for kwargs in other_kwargs:\n                    for threshold_kwargs in thresholds:\n                        for multitask_handling in multitask_handlings:\n                            for squeeze_target in squeeze_targets:\n                                for target_to_int in target_to_ints:\n                                    err_msg = f\"{metric} - {target_nan_mask} - {kwargs} - {threshold_kwargs}\"\n\n                                    if (multitask_handling is None) and (target_nan_mask == \"ignore\"):\n                                        # Raise with incompatible options\n                                        with self.assertRaises(ValueError):\n                                            MetricWrapper(\n                                                metric=metric,\n                                                threshold_kwargs=threshold_kwargs,\n                                                target_nan_mask=target_nan_mask,\n                                                multitask_handling=multitask_handling,\n                                                
squeeze_target=squeeze_target,\n                                                target_to_int=target_to_int,\n                                                **kwargs,\n                                            )\n\n                                    else:\n                                        metric_wrapper = MetricWrapper(\n                                            metric=metric,\n                                            threshold_kwargs=threshold_kwargs,\n                                            target_nan_mask=target_nan_mask,\n                                            multitask_handling=multitask_handling,\n                                            squeeze_target=squeeze_target,\n                                            target_to_int=target_to_int,\n                                            **kwargs,\n                                        )\n\n                                        # Check that the metric can be saved and re-loaded without error\n                                        torch.save(metric_wrapper, pickle_file)\n                                        metric_wrapper2 = torch.load(pickle_file)\n                                        self.assertTrue(metric_wrapper == metric_wrapper2, msg=err_msg)\n\n                                        # Check that the metric only contains primitive types\n                                        state = metric_wrapper.__getstate__()\n                                        if state[\"threshold_kwargs\"] is not None:\n                                            self.assertIsInstance(\n                                                state[\"threshold_kwargs\"], dict, msg=err_msg\n                                            )\n                                        if isinstance(metric, str):\n                                            self.assertIsInstance(state[\"metric\"], str, msg=err_msg)\n\n    def test_classifigression_target_squeezing(self):\n        preds = torch.Tensor([[0.1, 
0.1, 0.3, 0.5, 0.0, 0.1, 0.0, 0.7, 0.2, 0.0]])\n        target = torch.Tensor([3, 0])\n        expected_scores = [0.5, 0.75]\n        n_brackets = 5\n        metrics = [\"accuracy\", \"averageprecision\"]\n        other_kwargs = [\n            {\"task\": \"multiclass\", \"num_classes\": n_brackets, \"top_k\": 1},\n            {\"task\": \"multiclass\", \"num_classes\": n_brackets},\n        ]\n\n        for metric, kwargs, expected_score in zip(metrics, other_kwargs, expected_scores):\n            metric_wrapper = MetricWrapper(\n                metric=metric,\n                multitask_handling=\"mean-per-label\",\n                squeeze_target=True,\n                target_to_int=True,\n                **kwargs,\n            )\n            score = metric_wrapper(preds, target)\n\n            assert score == expected_score\n\n\nif __name__ == \"__main__\":\n    ut.main()\n"
  },
  {
    "path": "tests/test_mtl_architecture.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\n\"\"\"\nUnit tests for the different architectures of graphium/nn/architectures...\n\nThe layers are not thoroughly tested due to the difficulty of testing them\n\"\"\"\n\nimport torch\nimport unittest as ut\nfrom torch_geometric.data import Data, Batch\nfrom copy import deepcopy\nfrom itertools import product\nimport sys\nimport traceback\n\nimport graphium\nfrom graphium.config._loader import load_architecture\nfrom graphium.nn.architectures import TaskHeads, FullGraphMultiTaskNetwork, GraphOutputNN\nfrom graphium.nn.base_layers import FCLayer\n\nfrom graphium.utils.spaces import LAYERS_DICT\n\nkwargs = {\n    \"activation\": \"relu\",\n    \"last_activation\": \"none\",\n    \"normalization\": \"none\",\n    \"dropout\": 0.2,\n    \"name\": \"LNN\",\n    \"layer_type\": FCLayer,\n    \"residual_type\": \"none\",\n    \"residual_skip_steps\": 1,\n}\n# task kwargs\ntask_1_kwargs = {\n    \"out_dim\": 5,\n    \"task_level\": \"node\",\n    \"hidden_dims\": [5, 6, 7],\n}\ntask_2_kwargs = {\n    \"out_dim\": 3,\n    \"task_level\": \"edge\",\n    \"hidden_dims\": [8, 9, 10],\n}\ntask_3_kwargs = {\n    \"out_dim\": 4,\n    \"task_level\": \"graph\",\n    \"hidden_dims\": [2, 2, 2],\n}\ntask_4_kwargs = {\n    \"out_dim\": 4,\n    \"task_level\": \"nodepair\",\n    \"hidden_dims\": [2, 2, 2],\n}\n\n# level-wise 
kwargs\nnode_level_kwargs = {\n    \"out_dim\": 8,\n    \"hidden_dims\": [8, 9, 10],\n    \"activation\": \"relu\",\n    \"last_activation\": \"none\",\n    \"normalization\": \"none\",\n    \"dropout\": 0.2,\n    \"name\": \"LNN\",\n    \"layer_type\": FCLayer,\n    \"residual_type\": \"none\",\n    \"residual_skip_steps\": 1,\n}\ngraph_level_kwargs = {\n    \"pooling\": [\"sum\", \"max\"],\n    \"out_dim\": 8,\n    \"hidden_dims\": [8, 9, 10],\n    \"activation\": \"relu\",\n    \"last_activation\": \"none\",\n    \"normalization\": \"none\",\n    \"dropout\": 0.2,\n    \"name\": \"LNN\",\n    \"layer_type\": FCLayer,\n    \"residual_type\": \"none\",\n    \"residual_skip_steps\": 1,\n}\nedge_level_kwargs = {\n    \"out_dim\": 8,\n    \"hidden_dims\": [8, 9, 10],\n    \"activation\": \"relu\",\n    \"last_activation\": \"none\",\n    \"normalization\": \"none\",\n    \"dropout\": 0.2,\n    \"name\": \"LNN\",\n    \"layer_type\": FCLayer,\n    \"residual_type\": \"none\",\n    \"residual_skip_steps\": 1,\n}\n\nnodepair_level_kwargs = {\n    \"out_dim\": 8,\n    \"hidden_dims\": [8, 9, 10],\n    \"activation\": \"relu\",\n    \"last_activation\": \"none\",\n    \"normalization\": \"none\",\n    \"dropout\": 0.2,\n    \"name\": \"LNN\",\n    \"layer_type\": FCLayer,\n    \"residual_type\": \"none\",\n    \"residual_skip_steps\": 1,\n}\n\ntask_1_params = {}\ntask_1_params.update(task_1_kwargs)\ntask_1_params.update(kwargs)\ntask_2_params = {}\ntask_2_params.update(task_2_kwargs)\ntask_2_params.update(kwargs)\ntask_3_params = {}\ntask_3_params.update(task_3_kwargs)\ntask_3_params.update(kwargs)\ntask_4_params = {}\ntask_4_params.update(task_4_kwargs)\ntask_4_params.update(kwargs)\n\n\ndef toy_test_data(in_dim=7, in_dim_edges=3):\n    edge_idx1 = torch.stack([torch.tensor([0, 1, 2, 3, 2]), torch.tensor([1, 2, 3, 0, 0])])\n    edge_idx2 = torch.stack([torch.tensor([0, 0, 0, 1]), torch.tensor([0, 1, 2, 0])])\n    x1 = torch.randn(edge_idx1.max() + 1, in_dim, 
dtype=torch.float32)\n    e1 = torch.randn(edge_idx1.shape[-1], in_dim_edges, dtype=torch.float32)\n    x2 = torch.randn(edge_idx2.max() + 1, in_dim, dtype=torch.float32)\n    e2 = torch.randn(edge_idx2.shape[-1], in_dim_edges, dtype=torch.float32)\n    # edge_idx1, e1 = add_self_loops(edge_idx1, e1)\n    # edge_idx2, e2 = add_self_loops(edge_idx2, e2)\n    g1 = Data(feat=x1, edge_index=edge_idx1, edge_feat=e1)\n    g2 = Data(feat=x2, edge_index=edge_idx2, edge_feat=e2)\n    bg = Batch.from_data_list([g1, g2])\n    num_nodes_per_graph = bg.batch.bincount()\n    batch_node = bg[\"feat\"].size()[0]\n    batch_edge = bg[\"edge_feat\"].size()[0]\n    batch_graph = 2\n    batch_nodepair = ((num_nodes_per_graph.max().item() ** 2) - num_nodes_per_graph.max().item()) // 2\n    return bg, batch_node, batch_edge, batch_graph, batch_nodepair\n\n\nclass test_GraphOutputNN(ut.TestCase):\n    def generate_test_data(self):\n        g_1 = torch.tensor([[2, 3, 4], [0, 1, 0], [0, 1, 1], [1, 0, 0]])\n        g_2 = torch.tensor([[3, 4, 0], [0, 1, 1], [1, 0, 0]])\n        g_3 = torch.tensor([[3, 4, 0], [0, 1, 1], [1, 0, 0], [2, 2, 2], [4, 4, 4]])\n\n        batch = torch.tensor([0, 0, 0, 0, 1, 1, 1, 2, 2, 2, 2, 2])\n        x = torch.concat([g_1, g_2, g_3]).float()\n\n        nan = -1\n        expected_result = [\n            # graph 1\n            [\n                [2.0, 4.0, 4.0, 2.0, 2.0, 4.0],\n                [2.0, 4.0, 5.0, 2.0, 2.0, 3.0],\n                [3.0, 3.0, 4.0, 1.0, 3.0, 4.0],\n                [nan, nan, nan, nan, nan, nan],\n                [0.0, 2.0, 1.0, 0.0, 0.0, 1.0],\n                [1.0, 1.0, 0.0, 1.0, 1.0, 0.0],\n                [nan, nan, nan, nan, nan, nan],\n                [1.0, 1.0, 1.0, 1.0, 1.0, 1.0],\n                [nan, nan, nan, nan, nan, nan],\n                [nan, nan, nan, nan, nan, nan],\n            ],\n            # graph 2\n            [\n                [3.0, 5.0, 1.0, 3.0, 3.0, 1.0],\n                [4.0, 4.0, 0.0, 2.0, 4.0, 0.0],\n     
           [nan, nan, nan, nan, nan, nan],\n                [nan, nan, nan, nan, nan, nan],\n                [1.0, 1.0, 1.0, 1.0, 1.0, 1.0],\n                [nan, nan, nan, nan, nan, nan],\n                [nan, nan, nan, nan, nan, nan],\n                [nan, nan, nan, nan, nan, nan],\n                [nan, nan, nan, nan, nan, nan],\n                [nan, nan, nan, nan, nan, nan],\n            ],\n            # graph 3\n            [\n                [3.0, 5.0, 1.0, 3.0, 3.0, 1.0],\n                [4.0, 4.0, 0.0, 2.0, 4.0, 0.0],\n                [5.0, 6.0, 2.0, 1.0, 2.0, 2.0],\n                [7.0, 8.0, 4.0, 1.0, 0.0, 4.0],\n                [1.0, 1.0, 1.0, 1.0, 1.0, 1.0],\n                [2.0, 3.0, 3.0, 2.0, 1.0, 1.0],\n                [4.0, 5.0, 5.0, 4.0, 3.0, 3.0],\n                [3.0, 2.0, 2.0, 1.0, 2.0, 2.0],\n                [5.0, 4.0, 4.0, 3.0, 4.0, 4.0],\n                [6.0, 6.0, 6.0, 2.0, 2.0, 2.0],\n            ],\n        ]\n\n        return x, batch, expected_result\n\n    def test_nodepair_max_num_nodes_not_set(self):\n        in_dim = 3\n        in_dim_edges = 8\n\n        graph_output_nn_kwargs = {\n            \"node\": node_level_kwargs,\n            \"edge\": edge_level_kwargs,\n            \"graph\": graph_level_kwargs,\n            \"nodepair\": nodepair_level_kwargs,\n        }\n\n        graph_output_nn = GraphOutputNN(\n            in_dim=in_dim,\n            in_dim_edges=in_dim_edges,\n            task_level=\"nodepair\",\n            graph_output_nn_kwargs=graph_output_nn_kwargs,\n        )\n\n        x, batch, expected_result = self.generate_test_data()\n\n        out = graph_output_nn.compute_nodepairs(node_feats=x, batch=batch)\n        out = torch.nan_to_num(out, nan=-1)\n        self.assertListEqual(expected_result, out.tolist())\n\n    def test_nodepair_with_max_num_nodes(self):\n        in_dim = 3\n        in_dim_edges = 8\n\n        graph_output_nn_kwargs = {\n            \"node\": node_level_kwargs,\n            \"edge\": 
edge_level_kwargs,\n            \"graph\": graph_level_kwargs,\n            \"nodepair\": nodepair_level_kwargs,\n        }\n\n        graph_output_nn = GraphOutputNN(\n            in_dim=in_dim,\n            in_dim_edges=in_dim_edges,\n            task_level=\"nodepair\",\n            graph_output_nn_kwargs=graph_output_nn_kwargs,\n        )\n\n        max_num_nodes = 5  # if we change this value, we also have to change the expected result.\n        x, batch, expected_result = self.generate_test_data()\n\n        out = graph_output_nn.compute_nodepairs(node_feats=x, batch=batch, max_num_nodes=max_num_nodes)\n        out = torch.nan_to_num(out, nan=-1)\n        self.assertListEqual(expected_result, out.tolist())\n\n\nclass test_TaskHeads(ut.TestCase):\n    def test_task_heads_forward(self):\n        in_dim = 8  # Dimension of the incoming data\n        in_dim_edges = 8\n\n        task_heads_params = {\n            \"task_1\": task_1_params,\n            \"task_2\": task_2_params,\n            \"task_3\": task_3_params,\n            \"task_4\": task_4_params,\n        }\n        graph_output_nn_kwargs = {\n            \"node\": node_level_kwargs,\n            \"edge\": edge_level_kwargs,\n            \"graph\": graph_level_kwargs,\n            \"nodepair\": nodepair_level_kwargs,\n        }\n        # Create the \"multitask\" network. 
Really it's just an input going to various FFNNs since there's nothing shared.\n        multi_head_nn = TaskHeads(\n            in_dim=in_dim,\n            in_dim_edges=in_dim_edges,\n            task_heads_kwargs=task_heads_params,\n            graph_output_nn_kwargs=graph_output_nn_kwargs,\n        )\n\n        # Test the sizes of the MLPs for each head\n        # Head for task_1\n        task_1_head = multi_head_nn.task_heads[\"task_1\"]\n\n        # Check the dimensions\n        self.assertEqual(len(task_1_head.layers), len(task_1_kwargs[\"hidden_dims\"]) + 1)\n        self.assertEqual(task_1_head.layers[0].in_dim, in_dim)\n        self.assertEqual(task_1_head.layers[1].in_dim, task_1_kwargs[\"hidden_dims\"][0])\n        self.assertEqual(task_1_head.layers[2].in_dim, task_1_kwargs[\"hidden_dims\"][1])\n        self.assertEqual(task_1_head.layers[3].in_dim, task_1_kwargs[\"hidden_dims\"][2])\n\n        # Head for task_2\n        task_2_head = multi_head_nn.task_heads[\"task_2\"]\n\n        # Check the dimensions\n        self.assertEqual(len(task_2_head.layers), len(task_2_kwargs[\"hidden_dims\"]) + 1)\n        self.assertEqual(task_2_head.layers[0].in_dim, in_dim)\n        self.assertEqual(task_2_head.layers[1].in_dim, task_2_kwargs[\"hidden_dims\"][0])\n        self.assertEqual(task_2_head.layers[2].in_dim, task_2_kwargs[\"hidden_dims\"][1])\n        self.assertEqual(task_2_head.layers[3].in_dim, task_2_kwargs[\"hidden_dims\"][2])\n\n        # Head for task_3\n        task_3_head = multi_head_nn.task_heads[\"task_3\"]\n\n        # Check the dimensions\n        self.assertEqual(len(task_3_head.layers), len(task_3_kwargs[\"hidden_dims\"]) + 1)\n        self.assertEqual(task_3_head.layers[0].in_dim, in_dim)\n        self.assertEqual(task_3_head.layers[1].in_dim, task_3_kwargs[\"hidden_dims\"][0])\n        self.assertEqual(task_3_head.layers[2].in_dim, task_3_kwargs[\"hidden_dims\"][1])\n        self.assertEqual(task_3_head.layers[3].in_dim, 
task_3_kwargs[\"hidden_dims\"][2])\n\n        # Head for task_4\n        task_4_head = multi_head_nn.task_heads[\"task_4\"]\n\n        # Check the dimensions\n        self.assertEqual(len(task_4_head.layers), len(task_4_kwargs[\"hidden_dims\"]) + 1)\n        self.assertEqual(task_4_head.layers[0].in_dim, in_dim)\n        self.assertEqual(task_4_head.layers[1].in_dim, task_4_kwargs[\"hidden_dims\"][0])\n        self.assertEqual(task_4_head.layers[2].in_dim, task_4_kwargs[\"hidden_dims\"][1])\n        self.assertEqual(task_4_head.layers[3].in_dim, task_4_kwargs[\"hidden_dims\"][2])\n\n        # Check the output: It's a per-task prediction!\n        bg, batch_node, batch_edge, batch_graph, batch_nodepair = toy_test_data(\n            in_dim=in_dim, in_dim_edges=in_dim_edges\n        )\n        feat_out = multi_head_nn.forward(bg)\n\n        self.assertListEqual(\n            list(feat_out[\"task_1\"].shape), [batch_node, task_1_kwargs[\"out_dim\"]]\n        )  # node level task\n        self.assertListEqual(\n            list(feat_out[\"task_2\"].shape), [batch_edge, task_2_kwargs[\"out_dim\"]]\n        )  # edge level task\n        self.assertListEqual(\n            list(feat_out[\"task_3\"].shape), [batch_graph, task_3_kwargs[\"out_dim\"]]\n        )  # graph level task\n        self.assertListEqual(\n            list(feat_out[\"task_4\"].shape), [batch_graph, batch_nodepair, task_4_kwargs[\"out_dim\"]]\n        )  # nodepair level task\n\n    def test_task_heads_non_supported_level(self):\n        in_dim = 8  # Dimension of the incoming data\n        in_dim_edges = 8\n\n        fake_task = deepcopy(task_1_params)\n        fake_task[\"task_level\"] = \"certainly_not_supported_level\"\n        task_heads_params = {\n            \"task_1\": fake_task,\n            \"task_2\": task_2_params,\n            \"task_3\": task_3_params,\n            \"task_4\": task_4_params,\n        }\n        graph_output_nn_kwargs = {\n            \"certainly_not_supported_level\": 
node_level_kwargs,\n            \"edge\": edge_level_kwargs,\n            \"graph\": graph_level_kwargs,\n            \"nodepair\": nodepair_level_kwargs,\n        }\n\n        with self.assertRaises(ValueError):\n            TaskHeads(\n                in_dim=in_dim,\n                in_dim_edges=in_dim_edges,\n                task_heads_kwargs=task_heads_params,\n                graph_output_nn_kwargs=graph_output_nn_kwargs,\n            )\n\n\nclass test_Multitask_NN(ut.TestCase):\n    pyg_kwargs = {\n        \"activation\": \"relu\",\n        \"last_activation\": \"none\",\n        \"normalization\": \"none\",\n        \"dropout\": 0.2,\n        \"name\": \"LNN\",\n    }\n\n    in_dim = 7\n    fullgraph_out_dim = 11\n    in_dim_edges = 3\n    hidden_dims = [6, 6, 6, 6, 6]\n\n    virtual_nodes = [\"none\", \"mean\", \"sum\"]\n    norms = [\"none\", None, \"batch_norm\", \"layer_norm\"]\n    pna_kwargs = {\"aggregators\": [\"mean\", \"max\", \"sum\"], \"scalers\": [\"identity\", \"amplification\"]}\n\n    gnn_layers_kwargs = {\n        \"pyg:gin\": {},\n        \"pyg:gine\": {\"in_dim_edges\": in_dim_edges},\n        \"pyg:gated-gcn\": {\"in_dim_edges\": in_dim_edges, \"hidden_dims_edges\": hidden_dims},\n        \"pyg:pna-msgpass#1\": {\"layer_kwargs\": pna_kwargs, \"in_dim_edges\": 0},\n        \"pyg:pna-msgpass#2\": {\"layer_kwargs\": pna_kwargs, \"in_dim_edges\": in_dim_edges},\n    }\n\n    def test_full_graph_multitask_forward(self):\n        # Params for the network\n        temp_dim_1 = 5\n        temp_dim_2 = 7\n        temp_dim_edges = 21\n\n        default_graph_output_nn_kwargs = {\n            \"node\": dict(out_dim=10, hidden_dims=[3, 3, 3, 3]),\n            \"edge\": dict(out_dim=11, hidden_dims=[4, 4, 4, 4]),\n            \"graph\": dict(out_dim=12, hidden_dims=[5, 5, 5, 5]),\n            \"nodepair\": dict(out_dim=13, hidden_dims=[6, 6, 6, 6]),\n        }\n\n        # TODO: Test all combinations?\n        all_task_heads_kwargs = dict(\n           
 task_node=task_1_params,\n            task_edge=task_2_params,\n            task_graph=task_3_params,\n            task_nodepair=task_4_params,\n        )\n        only_node_and_graph_task_heads_kwargs = dict(task_node=task_1_params, task_graph=task_3_params)\n\n        # TODO: Configure properly in pytest best-practice\n        options = product(\n            [[\"sum\"], [\"mean\", \"max\"]],\n            [1, 2],\n            self.virtual_nodes,\n            self.norms,\n            self.gnn_layers_kwargs.items(),\n            [None, all_task_heads_kwargs, only_node_and_graph_task_heads_kwargs],\n            [None, dict(in_dim=self.in_dim, out_dim=temp_dim_1, hidden_dims=[4, 4, 4, 4, 4])],\n            [None, dict(in_dim=self.in_dim_edges, out_dim=temp_dim_edges, hidden_dims=[4, 4])],\n        )\n        for (\n            pooling,\n            residual_skip_steps,\n            virtual_node,\n            normalization,\n            (layer_name, this_kwargs),\n            task_heads_kwargs,\n            pre_nn_kwargs,\n            pre_nn_edges_kwargs,\n        ) in options:\n            err_msg = f\"pooling={pooling}, virtual_node={virtual_node}, layer_name={layer_name}, residual_skip_steps={residual_skip_steps}, normalization={normalization}, task_heads={task_heads_kwargs}, pre_nn_kwargs={pre_nn_kwargs}, graph_output_nn_kwargs={default_graph_output_nn_kwargs}, pre_nn_edges_kwargs={pre_nn_edges_kwargs}\"\n            layer_type = layer_name.split(\"#\")[0]\n\n            # TODO: graph_output_nn is currently non-optional, should it be optional?\n            if task_heads_kwargs is not None:\n                continue\n\n            # TODO: Allow to pass single graph_output_nn_kwargs to apply to all levels?\n\n            bg, num_nodes, num_edges, num_graphs, num_nodepairs = toy_test_data(\n                in_dim=self.in_dim, in_dim_edges=self.in_dim_edges\n            )\n\n            expected_dim_edges = self.in_dim_edges\n            this_kwargs2 = 
deepcopy(this_kwargs)\n            if pre_nn_edges_kwargs is not None:\n                this_kwargs2[\"in_dim_edges\"] = expected_dim_edges = temp_dim_edges\n\n            gnn_kwargs = dict(\n                in_dim=self.in_dim if pre_nn_kwargs is None else temp_dim_1,\n                out_dim=temp_dim_2,\n                hidden_dims=self.hidden_dims,\n                residual_type=\"densenet\",\n                residual_skip_steps=residual_skip_steps,\n                layer_type=layer_type,\n                **this_kwargs2,\n                **self.pyg_kwargs,\n            )\n\n            graph_output_nn_kwargs = deepcopy(default_graph_output_nn_kwargs)\n            graph_output_nn_kwargs[\"graph\"][\"pooling\"] = pooling\n\n            expectFailure = False\n\n            if (not LAYERS_DICT[layer_type].layer_supports_edges) and (pre_nn_edges_kwargs is not None):\n                expectFailure = True\n\n            if (not LAYERS_DICT[layer_type].layer_supports_edges) and (\n                task_heads_kwargs is not None and \"task_edge\" in task_heads_kwargs\n            ):\n                expectFailure = True\n\n            if (\n                \"in_dim_edges\" in this_kwargs2\n                and this_kwargs2[\"in_dim_edges\"] < 1\n                and (task_heads_kwargs is not None and \"task_edge\" in task_heads_kwargs)\n            ):\n                expectFailure = True\n\n            if task_heads_kwargs is not None and \"task_graph\" in task_heads_kwargs:\n                expectFailure = True\n\n            if expectFailure:\n                with self.assertRaises(ValueError):\n                    multitask_graph_nn = FullGraphMultiTaskNetwork(\n                        gnn_kwargs=gnn_kwargs,\n                        pre_nn_kwargs=pre_nn_kwargs,\n                        pre_nn_edges_kwargs=pre_nn_edges_kwargs,\n                        task_heads_kwargs=task_heads_kwargs,\n                        graph_output_nn_kwargs=graph_output_nn_kwargs,\n              
      )\n                continue\n\n            try:\n                multitask_graph_nn = FullGraphMultiTaskNetwork(\n                    gnn_kwargs=gnn_kwargs,\n                    pre_nn_kwargs=pre_nn_kwargs,\n                    pre_nn_edges_kwargs=pre_nn_edges_kwargs,\n                    task_heads_kwargs=task_heads_kwargs,\n                    graph_output_nn_kwargs=graph_output_nn_kwargs,\n                )\n\n                batch_out = multitask_graph_nn.forward(bg)\n            except Exception as e:\n                exc_type, exc_value, exc_traceback = sys.exc_info()\n                msg = err_msg + \"\\n\" + str(traceback.format_exception(exc_type, exc_value, exc_traceback))\n                self.fail(msg)\n\n            if task_heads_kwargs is not None:\n                if \"task_node\" in task_heads_kwargs:\n                    self.assertListEqual(\n                        list(batch_out[\"task_node\"].shape),\n                        [num_nodes, task_1_kwargs[\"out_dim\"]],\n                        msg=err_msg,\n                    )\n                if \"task_edge\" in task_heads_kwargs:\n                    self.assertListEqual(\n                        list(batch_out[\"task_edge\"].shape),\n                        [num_edges, task_2_kwargs[\"out_dim\"]],\n                        msg=err_msg,\n                    )\n                if \"task_graph\" in task_heads_kwargs:\n                    self.assertListEqual(\n                        list(batch_out[\"task_graph\"].shape),\n                        [num_graphs, task_3_kwargs[\"out_dim\"]],\n                        msg=err_msg,\n                    )\n                if \"task_nodepair\" in task_heads_kwargs:\n                    self.assertListEqual(\n                        list(batch_out[\"task_nodepair\"].shape),\n                        [num_nodepairs, task_4_kwargs[\"out_dim\"]],\n                        msg=err_msg,\n                    )\n            else:\n                feat_out = 
batch_out[\"feat\"]\n                edge_feat_out = batch_out[\"edge_feat\"]\n\n                out_dim_edges = (\n                    multitask_graph_nn.gnn.full_dims_edges[-1]\n                    if multitask_graph_nn.gnn.full_dims_edges is not None\n                    else expected_dim_edges\n                )\n\n                if not LAYERS_DICT[layer_type].layer_supports_edges or this_kwargs2[\"in_dim_edges\"] == 0:\n                    self.assertEqual(multitask_graph_nn.out_dim_edges, 0)\n                else:\n                    self.assertEqual(multitask_graph_nn.out_dim_edges, out_dim_edges)\n                self.assertListEqual(list(feat_out.shape), [num_nodes, temp_dim_2], msg=err_msg)\n                self.assertListEqual(list(edge_feat_out.shape), [num_edges, out_dim_edges], msg=err_msg)\n\n    def test_full_graph_multi_task_from_config(self):\n        cfg = graphium.load_config(name=\"zinc_default_multitask_pyg\")\n\n        # Initialize the network\n        in_dims = {\"feat\": self.in_dim, \"edge_feat\": self.in_dim_edges}\n        model_class, model_kwargs = load_architecture(cfg, in_dims=in_dims)\n\n        multitask_full_graph_nn = model_class(**model_kwargs)\n\n        # Test\n        bg, num_nodes, num_edges, num_graphs, num_nodepairs = toy_test_data(\n            in_dim=self.in_dim, in_dim_edges=self.in_dim_edges\n        )\n        batch_out = multitask_full_graph_nn.forward(bg)\n\n        self.assertListEqual(\n            list(batch_out[\"task_1\"].shape),\n            [num_nodes, task_1_kwargs[\"out_dim\"]],\n        )\n        self.assertListEqual(\n            list(batch_out[\"task_2\"].shape),\n            [num_edges, task_2_kwargs[\"out_dim\"]],\n        )\n        self.assertListEqual(\n            list(batch_out[\"task_3\"].shape),\n            [num_graphs, task_3_kwargs[\"out_dim\"]],\n        )\n        self.assertListEqual(\n            list(batch_out[\"task_4\"].shape),\n            [num_graphs, num_nodepairs, 
task_4_kwargs[\"out_dim\"]],\n        )\n\n    def test_full_graph_multi_task_set_max_num_nodes(self):\n        cfg = graphium.load_config(name=\"zinc_default_multitask_pyg\")\n\n        # Initialize the network\n        in_dims = {\"feat\": self.in_dim, \"edge_feat\": self.in_dim_edges}\n        model_class, model_kwargs = load_architecture(cfg, in_dims=in_dims)\n\n        multitask_full_graph_nn: FullGraphMultiTaskNetwork = model_class(**model_kwargs)\n        multitask_full_graph_nn.set_max_num_nodes_edges_per_graph(10, 4)\n\n        # Probing\n        self.assertEqual(\n            multitask_full_graph_nn.task_heads.graph_output_nn[\"nodepair\"].max_num_nodes_per_graph, 10\n        )\n        self.assertEqual(multitask_full_graph_nn.gnn.layers[2].max_num_edges_per_graph, 4)\n\n\nif __name__ == \"__main__\":\n    ut.main()\n"
  },
  {
    "path": "tests/test_multitask_datamodule.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nimport shutil\nimport tempfile\nimport unittest as ut\n\nimport pytest\nfrom omegaconf import OmegaConf\nimport pandas as pd\nimport numpy as np\nimport graphium\n\n\nclass Test_Multitask_DataModule(ut.TestCase):\n    def setUp(self):\n        # Create a temporary directory\n        self.tmp_test_dir = tempfile.mkdtemp()\n\n    def tearDown(self):\n        # Remove the directory after the test\n        shutil.rmtree(self.tmp_test_dir)\n\n    def test_multitask_fromsmiles_dm(\n        self,\n    ):  # TODO: I think we can remove this as it tests tiny_zinc which only contain graph level labels\n        \"\"\"Cover similar testing as for the original data module.\"\"\"\n        df = graphium.data.load_tiny_zinc()  # 100 molecules\n\n        # Here we take the microzinc dataset and split the labels up into 'SA', 'logp' and 'score' in order to simulate having multiple single-task datasets\n        df_micro_zinc_SA = df[[\"SMILES\", \"SA\"]]\n        df_micro_zinc_logp = df[[\"SMILES\", \"logp\"]]\n        df_micro_zinc_score = df[[\"SMILES\", \"score\"]]\n\n        # Setup the featurization. 
This will be the same across all tasks.\n        featurization_args = {}\n        featurization_args[\"atom_property_list_float\"] = []  # [\"weight\", \"valence\"]\n        featurization_args[\"atom_property_list_onehot\"] = [\"atomic-number\", \"degree\"]\n        featurization_args[\"edge_property_list\"] = [\"in-ring\", \"bond-type-onehot\"]\n        featurization_args[\"add_self_loop\"] = False\n        featurization_args[\"use_bonds_weights\"] = False\n        featurization_args[\"explicit_H\"] = False\n\n        # Config for multitask datamodule.\n\n        # Per-task arguments.\n        dm_task_args_SA = {}\n        dm_task_args_SA[\"df\"] = df_micro_zinc_SA\n        dm_task_args_SA[\"task_level\"] = \"graph\"\n        dm_task_args_SA[\"smiles_col\"] = \"SMILES\"\n        dm_task_args_SA[\"label_cols\"] = [\"SA\"]\n        dm_task_args_SA[\"split_val\"] = 0.2\n        dm_task_args_SA[\"split_test\"] = 0.2\n        dm_task_args_SA[\"seed\"] = 19\n        dm_task_args_SA[\"splits_path\"] = None  # This may not always be provided\n        dm_task_args_SA[\"sample_size\"] = None  # This may not always be provided\n        dm_task_args_SA[\"idx_col\"] = None  # This may not always be provided\n        dm_task_args_SA[\"weights_col\"] = None  # This may not always be provided\n        dm_task_args_SA[\"weights_type\"] = None  # This may not always be provided\n\n        dm_task_args_logp = {}\n        dm_task_args_logp[\"df\"] = df_micro_zinc_logp\n        dm_task_args_logp[\"task_level\"] = \"graph\"\n        dm_task_args_logp[\"smiles_col\"] = \"SMILES\"\n        dm_task_args_logp[\"label_cols\"] = [\"logp\"]\n        dm_task_args_logp[\"split_val\"] = 0.2\n        dm_task_args_logp[\"split_test\"] = 0.2\n        dm_task_args_logp[\"seed\"] = 19\n        dm_task_args_logp[\"splits_path\"] = None  # This may not always be provided\n        dm_task_args_logp[\"sample_size\"] = None  # This may not always be provided\n        dm_task_args_logp[\"idx_col\"] = None  
# This may not always be provided\n        dm_task_args_logp[\"weights_col\"] = None  # This may not always be provided\n        dm_task_args_logp[\"weights_type\"] = None  # This may not always be provided\n\n        dm_task_args_score = {}\n        dm_task_args_score[\"df\"] = df_micro_zinc_score\n        dm_task_args_score[\"task_level\"] = \"graph\"\n        dm_task_args_score[\"smiles_col\"] = \"SMILES\"\n        dm_task_args_score[\"label_cols\"] = [\"score\"]\n        dm_task_args_score[\"split_val\"] = 0.2\n        dm_task_args_score[\"split_test\"] = 0.2\n        dm_task_args_score[\"seed\"] = 19\n        dm_task_args_score[\"splits_path\"] = None  # This may not always be provided\n        dm_task_args_score[\"sample_size\"] = None  # This may not always be provided\n        dm_task_args_score[\"idx_col\"] = None  # This may not always be provided\n        dm_task_args_score[\"weights_col\"] = None  # This may not always be provided\n        dm_task_args_score[\"weights_type\"] = None  # This may not always be provided\n\n        dm_task_kwargs = {}\n        dm_task_kwargs[\"SA\"] = dm_task_args_SA\n        dm_task_kwargs[\"logp\"] = dm_task_args_logp\n        dm_task_kwargs[\"score\"] = dm_task_args_score\n\n        dm_args = {}\n\n        # Task-specific arguments for the datamodule\n        dm_args[\"task_specific_args\"] = dm_task_kwargs\n\n        # Task-independent arguments\n        dm_args[\"featurization\"] = featurization_args\n        dm_args[\"featurization_n_jobs\"] = 16\n        dm_args[\"featurization_progress\"] = True\n        dm_args[\"featurization_backend\"] = \"loky\"\n        dm_args[\"num_workers\"] = 0\n        dm_args[\"pin_memory\"] = True\n        dm_args[\"processed_graph_data_path\"] = None\n        dm_args[\"batch_size_training\"] = 16\n        dm_args[\"batch_size_inference\"] = 16\n\n        # Create the data module\n        dm = graphium.data.MultitaskFromSmilesDataModule(**dm_args)\n\n        # self.assertEqual(50, 
dm.num_node_feats)    # Not implemented error\n        # self.assertEqual(6, dm.num_edge_feats)\n\n        dm.prepare_data()\n        dm.setup()\n\n        # self.assertEqual(len(dm), 100)                      # Should this have a fixed value for when it's initialized? MTL dataset only gets created after.\n        self.assertEqual(len(dm.train_ds), 60)  # type: ignore\n        self.assertEqual(len(dm.val_ds), 20)  # type: ignore\n        self.assertEqual(len(dm.test_ds), 20)  # type: ignore\n        # assert dm.num_node_feats == 50\n        # assert dm.num_edge_feats == 6\n\n        for dl in [dm.train_dataloader(), dm.val_dataloader(), dm.test_dataloader()]:\n            it = iter(dl)\n            batch = next(it)\n\n            assert set(batch.keys()) == {\"labels\", \"features\"}\n\n            # assert batch[\"labels\"].shape == (16, 1)            # Single-task case\n            assert batch[\"labels\"][\"graph_SA\"].shape == (16, 1)\n            assert batch[\"labels\"][\"graph_logp\"].shape == (16, 1)\n            assert batch[\"labels\"][\"graph_score\"].shape == (16, 1)\n\n    # @pytest.mark.skip\n    def test_multitask_fromsmiles_from_config(self):\n        config = graphium.load_config(name=\"zinc_default_multitask_pyg\")\n\n        df = graphium.data.load_tiny_zinc()  # 100 molecules\n\n        # Here we take the microzinc dataset and split the labels up into 'SA', 'logp' and 'score' in order to simulate having multiple single-task datasets\n        df_micro_zinc_SA = df[[\"SMILES\", \"SA\"]]\n        df_micro_zinc_logp = df[[\"SMILES\", \"logp\"]]\n        df_micro_zinc_score = df[[\"SMILES\", \"score\"]]\n\n        # dm_args = dict(config.datamodule.args)\n        dm_args = OmegaConf.to_container(config.datamodule.args, resolve=True)\n        # dm_args[\"task_specific_args\"][\"SA\"][\"df\"] = df\n        dm_args[\"task_specific_args\"][\"SA\"][\"df\"] = df_micro_zinc_SA\n        dm_args[\"task_specific_args\"][\"logp\"][\"df\"] = 
df_micro_zinc_logp\n        dm_args[\"task_specific_args\"][\"score\"][\"df\"] = df_micro_zinc_score\n\n        dm_args[\"task_specific_args\"][\"SA\"][\"smiles_col\"] = \"SMILES\"\n        dm_args[\"task_specific_args\"][\"logp\"][\"smiles_col\"] = \"SMILES\"\n        dm_args[\"task_specific_args\"][\"score\"][\"smiles_col\"] = \"SMILES\"\n\n        dm_args[\"task_specific_args\"][\"SA\"][\"label_cols\"] = [\"SA\"]\n        dm_args[\"task_specific_args\"][\"logp\"][\"label_cols\"] = [\"logp\"]\n        dm_args[\"task_specific_args\"][\"score\"][\"label_cols\"] = [\"score\"]\n\n        dm_args[\"task_specific_args\"][\"SA\"][\"df_path\"] = None\n        dm_args[\"task_specific_args\"][\"logp\"][\"df_path\"] = None\n        dm_args[\"task_specific_args\"][\"score\"][\"df_path\"] = None\n\n        dm = graphium.data.MultitaskFromSmilesDataModule(**dm_args)\n\n        # assert dm.num_node_feats == 50\n        # assert dm.num_edge_feats == 6\n\n        dm.prepare_data()\n        dm.setup()\n\n        # self.assertEqual(len(dm), 100)                      # Should this have a fixed value for when it's initialized? 
MTL dataset only gets created after.\n        self.assertEqual(len(dm.train_ds), 60)  # type: ignore\n        self.assertEqual(len(dm.val_ds), 20)  # type: ignore\n        self.assertEqual(len(dm.test_ds), 20)  # type: ignore\n        # assert dm.num_node_feats == 50\n        # assert dm.num_edge_feats == 6\n\n        for dl in [dm.train_dataloader(), dm.val_dataloader(), dm.test_dataloader()]:\n            it = iter(dl)\n            batch = next(it)\n\n            assert set(batch.keys()) == {\"labels\", \"features\"}\n\n            # assert batch[\"labels\"].shape == (16, 1)            # Single-task case\n            assert batch[\"labels\"][\"graph_SA\"].shape == (16, 1)\n            assert batch[\"labels\"][\"graph_logp\"].shape == (16, 1)\n            assert batch[\"labels\"][\"graph_score\"].shape == (16, 1)\n\n    def test_multitask_fromsmiles_from_config_csv(self):\n        config = graphium.load_config(name=\"zinc_default_multitask_pyg\")\n\n        dm_args = OmegaConf.to_container(config.datamodule.args, resolve=True)\n        dm = graphium.data.MultitaskFromSmilesDataModule(**dm_args)\n\n        dm.prepare_data()\n        dm.setup()\n\n        # self.assertEqual(len(dm), 100)                      # Should this have a fixed value for when it's initialized? 
MTL dataset only gets created after.\n        self.assertEqual(len(dm.train_ds), 60)  # type: ignore\n        self.assertEqual(len(dm.val_ds), 20)  # type: ignore\n        self.assertEqual(len(dm.test_ds), 20)  # type: ignore\n        # assert dm.num_node_feats == 50\n        # assert dm.num_edge_feats == 6\n\n        for dl in [dm.train_dataloader(), dm.val_dataloader(), dm.test_dataloader()]:\n            it = iter(dl)\n            batch = next(it)\n\n            assert set(batch.keys()) == {\"labels\", \"features\"}\n\n            # assert batch[\"labels\"].shape == (16, 1)            # Single-task case\n            assert batch[\"labels\"][\"graph_SA\"].shape == (16, 1)\n            assert batch[\"labels\"][\"graph_logp\"].shape == (16, 1)\n            assert batch[\"labels\"][\"graph_score\"].shape == (16, 1)\n\n    def test_multitask_fromsmiles_from_config_parquet(self):\n        config = graphium.load_config(name=\"fake_multilevel_multitask_pyg\")\n\n        dm_args = OmegaConf.to_container(config.datamodule.args, resolve=True)\n        dm = graphium.data.MultitaskFromSmilesDataModule(**dm_args)\n\n        dm.prepare_data()\n        dm.setup()\n\n        self.assertEqual(len(dm.train_ds), 1004)  # type: ignore\n\n        dl = dm.train_dataloader()\n        it = iter(dl)\n        batch = next(it)\n\n        assert set(batch.keys()) == {\"labels\", \"features\"}\n\n        # assert batch[\"labels\"].shape == (16, 1)            # Single-task case\n        assert batch[\"labels\"][\"graph_SA\"].shape == (16, 1)\n        assert batch[\"labels\"][\"node_logp\"].shape == (\n            batch[\"features\"].feat.size(0),\n            2,\n        )  # test node level\n        assert batch[\"labels\"][\"edge_score\"].shape == (\n            batch[\"features\"].edge_feat.size(0),\n            2,\n        )  # test edge level\n\n    def test_multitask_with_missing_fromsmiles_from_config_parquet(self):\n        config = 
graphium.load_config(name=\"fake_and_missing_multilevel_multitask_pyg\")\n\n        dm_args = OmegaConf.to_container(config.datamodule.args, resolve=True)\n        dm = graphium.data.MultitaskFromSmilesDataModule(**dm_args)\n\n        dm.prepare_data()\n        dm.setup()\n\n        self.assertEqual(len(dm.train_ds), 1004)  # type: ignore\n\n        dl = dm.train_dataloader()\n        it = iter(dl)\n        batch = next(it)\n\n        assert set(batch.keys()) == {\"labels\", \"features\"}\n\n        # assert batch[\"labels\"].shape == (16, 1)            # Single-task case\n        assert batch[\"labels\"][\"graph_SA\"].shape == (16, 1)\n        assert batch[\"labels\"][\"node_logp\"].shape == (\n            batch[\"features\"].feat.size(0),\n            2,\n        )  # test node level\n        assert batch[\"labels\"][\"edge_score\"].shape == (\n            batch[\"features\"].edge_feat.size(0),\n            2,\n        )  # test edge level\n\n    def test_extract_graph_level_singletask(self):\n        df = pd.read_parquet(\"tests/converted_fake_multilevel_data.parquet\")\n        num_graphs = len(df)\n        label_cols = [\"graph_label\"]\n        output = graphium.data.datamodule.extract_labels(df, \"graph\", label_cols)\n\n        assert isinstance(output, np.ndarray)\n        assert len(output.shape) == 2\n        assert output.shape[0] == num_graphs\n        assert output.shape[1] == 1\n\n    def test_extract_graph_level_multitask(self):\n        df = pd.read_parquet(\"tests/converted_fake_multilevel_data.parquet\")\n        num_graphs = len(df)\n        label_cols = [\"graph_label\", \"graph_label\"]\n        output = graphium.data.datamodule.extract_labels(df, \"graph\", label_cols)\n\n        assert isinstance(output, np.ndarray)\n        assert len(output.shape) == 2\n        assert output.shape[0] == num_graphs\n        assert output.shape[1] == len(label_cols)\n\n    def test_extract_graph_level_multitask_missing_cols(self):\n        df = 
pd.read_parquet(\"tests/converted_fake_multilevel_data.parquet\")\n        num_graphs = len(df)\n        label_cols = [\"graph_label\", \"graph_label\"]\n\n        drop_index = [2, 5, 21, 237, 192, 23, 127, 11]\n        for replace in [1, 2]:\n            for missing_col in label_cols[:replace]:\n                df.loc[drop_index, missing_col] = None\n\n            output = graphium.data.datamodule.extract_labels(df, \"graph\", label_cols)\n\n            assert isinstance(output, np.ndarray)\n            assert len(output.shape) == 2\n            assert output.shape[0] == num_graphs\n            assert output.shape[1] == len(label_cols)\n\n    def test_non_graph_level_extract_labels(self):\n        df = pd.read_parquet(\"tests/converted_fake_multilevel_data.parquet\")\n\n        for level in [\"node\", \"edge\", \"nodepair\"]:\n            label_cols = [f\"{level}_label_{suffix}\" for suffix in [\"list\", \"np\"]]\n            output = graphium.data.datamodule.extract_labels(df, level, label_cols)\n\n            assert isinstance(output, list)\n            assert len(output[0].shape) == 2\n            assert output[0].shape[1] == len(label_cols)\n\n    def test_non_graph_level_extract_labels_missing_cols(self):\n        df = pd.read_parquet(\"tests/converted_fake_multilevel_data.parquet\")\n\n        for level in [\"node\", \"edge\", \"nodepair\"]:\n            label_cols = [f\"{level}_label_{suffix}\" for suffix in [\"list\", \"np\"]]\n            drop_index = [2, 5, 21, 237, 192, 23, 127, 11]\n            for replace in [1, 2]:\n                for missing_col in label_cols[:replace]:\n                    df.loc[drop_index, missing_col] = None\n\n                output = graphium.data.datamodule.extract_labels(df, level, label_cols)\n\n                for idx in drop_index:\n                    assert len(output[idx].shape) == 2\n                    assert output[idx].shape[1] == len(label_cols)\n\n                    # Check that number of labels is adjusted 
correctly\n                    if replace == 1:\n                        non_missing_col = label_cols[1]\n                        assert output[idx].shape[0] == len(df[non_missing_col][idx])\n\n    def test_tdc_admet_benchmark_data_module(self):\n        \"\"\"\n        Verifies that the ADMET-specific subclass of the MultiTaskDataModule works.\n        Checks if all main endpoints can be run and if the split is correct.\n        \"\"\"\n\n        try:\n            from tdc.benchmark_group import admet_group\n            from tdc.utils import retrieve_benchmark_names\n        except ImportError:\n            self.skipTest(\"PyTDC needs to be installed to run this test. Use `pip install PyTDC`.\")\n            raise\n\n        # Make sure we can initialize the module and run the main endpoints\n        data_module = graphium.data.ADMETBenchmarkDataModule()\n        data_module.prepare_data()\n        data_module.setup()\n\n        for dl in [\n            data_module.train_dataloader(),\n            data_module.val_dataloader(),\n            data_module.test_dataloader(),\n        ]:\n            batch = next(iter(dl))\n            assert set(batch.keys()) == {\"labels\", \"features\"}\n\n        # # Validate the split\n        group = admet_group(path=self.tmp_test_dir)\n        benchmark_names = retrieve_benchmark_names(\"admet_group\")\n\n        # For each of the endpoints...\n        for name in benchmark_names:\n            # Get the split from the benchmark group (ground truth)\n            benchmark = group.get(name)\n            train, val = group.get_train_valid_split(0, name)\n            test = benchmark[\"test\"]\n\n            # Get the split from the data module\n            params = data_module._get_task_specific_arguments(name, 0, self.tmp_test_dir)\n            split = pd.read_csv(params.splits_path)\n            data = params.df\n\n            # Check that the split is the same\n            for ground_truth, label in [(train, \"train\"), (val, 
\"val\"), (test, \"test\")]:\n                y_true = ground_truth[\"Y\"].values\n                y_module = data.loc[split[label].dropna(), \"Y\"].values\n\n                assert len(y_true) == len(y_module)\n                assert np.allclose(np.sort(y_true), np.sort(y_module))\n\n\nif __name__ == \"__main__\":\n    ut.main()\n"
  },
  {
    "path": "tests/test_mup.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\n\"\"\"\nUnit tests for the implementation of mup\n\"\"\"\n\nimport unittest as ut\nfrom copy import deepcopy\nimport torch.nn as nn\nimport torch\nimport yaml\n\nfrom torch_geometric.data import Batch, Data\n\nfrom graphium.nn.architectures import FeedForwardNN, FeedForwardPyg, FullGraphMultiTaskNetwork\n\n\ndef get_pyg_graphs(in_dim, in_dim_edges):\n    edge_idx1 = torch.stack([torch.tensor([0, 1, 2, 3, 2]), torch.tensor([1, 2, 3, 0, 0])])\n    edge_idx2 = torch.stack([torch.tensor([0, 0, 0, 1]), torch.tensor([0, 1, 2, 0])])\n    x1 = torch.randn(edge_idx1.max() + 1, in_dim, dtype=torch.float32)\n    e1 = torch.randn(edge_idx1.shape[-1], in_dim_edges, dtype=torch.float32)\n    x2 = torch.randn(edge_idx2.max() + 1, in_dim, dtype=torch.float32)\n    e2 = torch.randn(edge_idx2.shape[-1], in_dim_edges, dtype=torch.float32)\n    g1 = Data(feat=x1, edge_index=edge_idx1, edge_feat=e1)\n    g2 = Data(feat=x2, edge_index=edge_idx2, edge_feat=e2)\n    bg = Batch.from_data_list([g1, g2])\n\n    return bg\n\n\nclass test_mup(ut.TestCase):\n    kwargs = dict(\n        in_dim=12,\n        out_dim=60,\n        hidden_dims=8 * [84],\n        depth=None,\n        activation=\"LeakyReLU\",\n        last_activation=\"LeakyReLU\",\n        dropout=0.1,\n        last_dropout=0.2,\n        normalization=\"batch_norm\",\n        
first_normalization=\"batch_norm\",\n        last_normalization=\"batch_norm\",\n        residual_type=\"simple\",\n        residual_skip_steps=2,\n        name=\"testing\",\n        layer_type=\"fc\",\n        layer_kwargs=None,\n    )\n\n    def test_feedforwardnn_mup(self):\n        kwargs = deepcopy(self.kwargs)\n        model = FeedForwardNN(**kwargs, last_layer_is_readout=False)\n        model_lastreadout = FeedForwardNN(**kwargs, last_layer_is_readout=True)\n        base_1 = model.make_mup_base_kwargs(divide_factor=1)\n        base_1_lastreadout = model_lastreadout.make_mup_base_kwargs(divide_factor=1)\n\n        base_2 = model.make_mup_base_kwargs(divide_factor=2)\n        base_2_lastreadout = model_lastreadout.make_mup_base_kwargs(divide_factor=2)\n        kwargs_2 = deepcopy(base_1)\n        kwargs_2.update(dict(out_dim=30, hidden_dims=8 * [42]))\n        kwargs_2_lastreadout = deepcopy(base_1_lastreadout)\n        kwargs_2_lastreadout.update(dict(hidden_dims=8 * [42]))\n\n        # Check the kwargs matching\n        for key in kwargs_2.keys():\n            if isinstance(kwargs_2[key], nn.Module):\n                # Can't match the random weights\n                self.assertEqual(str(kwargs_2[key]), str(base_2[key]), msg=key)\n            else:\n                self.assertEqual(kwargs_2[key], base_2[key], msg=key)\n\n        # Check the kwargs matching\n        for key in kwargs_2_lastreadout.keys():\n            if isinstance(kwargs_2_lastreadout[key], nn.Module):\n                # Can't match the random weights\n                self.assertEqual(str(kwargs_2_lastreadout[key]), str(base_2_lastreadout[key]), msg=key)\n            else:\n                self.assertEqual(kwargs_2_lastreadout[key], base_2_lastreadout[key], msg=key)\n\n        # Test that the models with divide_factor=1 can be built and run a forward pass\n        in_features = torch.randn((10, kwargs[\"in_dim\"]))\n        model_1 = FeedForwardNN(**base_1)\n        
model_1.forward(deepcopy(in_features))\n        model_1_lastreadout = FeedForwardNN(**base_1_lastreadout)\n        model_1_lastreadout.forward(deepcopy(in_features))\n\n        # Test that the models with divide_factor=2 can be built and run a forward pass\n        model_2 = FeedForwardNN(**base_2)\n        model_2.forward(deepcopy(in_features))\n        model_2_lastreadout = FeedForwardNN(**base_2_lastreadout)\n        model_2_lastreadout.forward(deepcopy(in_features))\n\n    def test_feedforwardgraph_mup(self):\n        kwargs = deepcopy(self.kwargs)\n        in_dim_edges = kwargs[\"in_dim\"]\n        kwargs.update(dict(layer_type=\"pyg:gine\", in_dim_edges=in_dim_edges))\n        model = FeedForwardPyg(**kwargs, last_layer_is_readout=False)\n        model_lastreadout = FeedForwardPyg(**kwargs, last_layer_is_readout=True)\n        base_1 = model.make_mup_base_kwargs(divide_factor=1)\n        base_1_lastreadout = model_lastreadout.make_mup_base_kwargs(divide_factor=1)\n\n        base_2 = model.make_mup_base_kwargs(divide_factor=2)\n        base_2_lastreadout = model_lastreadout.make_mup_base_kwargs(divide_factor=2)\n        kwargs_2 = deepcopy(base_1)\n        kwargs_2.update(dict(out_dim=30, hidden_dims=8 * [42]))\n        kwargs_2_lastreadout = deepcopy(base_1_lastreadout)\n        kwargs_2_lastreadout.update(dict(hidden_dims=8 * [42]))\n\n        # Check the kwargs matching\n        for key in kwargs_2.keys():\n            if isinstance(kwargs_2[key], nn.Module):\n                # Can't match the random weights\n                self.assertEqual(str(kwargs_2[key]), str(base_2[key]), msg=key)\n            else:\n                self.assertEqual(kwargs_2[key], base_2[key], msg=key)\n\n        # Check the kwargs matching\n        for key in kwargs_2_lastreadout.keys():\n            if isinstance(kwargs_2_lastreadout[key], nn.Module):\n                # Can't match the random weights\n                self.assertEqual(str(kwargs_2_lastreadout[key]), 
str(base_2_lastreadout[key]), msg=key)\n            else:\n                self.assertEqual(kwargs_2_lastreadout[key], base_2_lastreadout[key], msg=key)\n\n        # Test that the models with divide_factor=1 can be built and run a forward pass\n        in_features = get_pyg_graphs(in_dim=kwargs[\"in_dim\"], in_dim_edges=in_dim_edges)\n        model_1 = FeedForwardPyg(**base_1)\n        model_1.forward(deepcopy(in_features))\n        model_1_lastreadout = FeedForwardPyg(**base_1_lastreadout)\n        model_1_lastreadout.forward(deepcopy(in_features))\n\n        # Test that the models with divide_factor=2 can be built and run a forward pass\n        model_2 = FeedForwardPyg(**base_2)\n        model_2.forward(deepcopy(in_features))\n        model_2_lastreadout = FeedForwardPyg(**base_2_lastreadout)\n        model_2_lastreadout.forward(deepcopy(in_features))\n\n    def test_fullgraphmultitasknetwork(self):\n        # Load the configuration file for the model\n        CONFIG_FILE = \"tests/config_test_ipu_dataloader.yaml\"\n        with open(CONFIG_FILE, \"r\") as f:\n            cfg = yaml.safe_load(f)\n\n        # Make fake graphs\n        in_dim = 12\n        in_dim_edges = 12\n        pe_indims = {\n            \"laplacian_eigvec\": 5,\n            \"laplacian_eigval\": 5,\n            \"rw_return_probs\": 2,\n            \"edge_rw_transition_probs\": 2,\n            \"nodepair_rw_return_probs\": 1,\n            \"electrostatic\": 6,\n            \"edge_commute\": 1,\n            \"nodepair_graphormer\": 1,\n            \"positions_3d\": 3,\n        }\n        in_features = get_pyg_graphs(in_dim=in_dim, in_dim_edges=in_dim_edges)\n        for key, dim in pe_indims.items():\n            if key.startswith(\"edge_\"):\n                in_features[key] = torch.randn(in_features.num_edges, dim)\n            elif key.startswith(\"nodepair_\"):\n                in_features[key] = torch.randn(in_features.num_nodes, in_features.num_nodes, dim)\n            else:\n                
in_features[key] = torch.randn(in_features.num_nodes, dim)\n\n        # Load the model\n        kwargs = {}\n        for key, val in cfg[\"architecture\"].items():\n            if key in [\"model_type\", \"mup_base_path\"]:\n                continue\n            kwargs[key + \"_kwargs\"] = val\n        kwargs[\"pre_nn_kwargs\"][\"in_dim\"] = in_dim + kwargs[\"pe_encoders_kwargs\"][\"out_dim\"]\n        kwargs[\"pre_nn_edges_kwargs\"][\"in_dim\"] = in_dim_edges + kwargs[\"pe_encoders_kwargs\"][\"edge_out_dim\"]\n        kwargs[\"pe_encoders_kwargs\"][\"in_dims\"] = pe_indims\n\n        model = FullGraphMultiTaskNetwork(**kwargs, last_layer_is_readout=True)\n\n        kw_1 = model.make_mup_base_kwargs(divide_factor=1)\n        kw_2 = model.make_mup_base_kwargs(divide_factor=2)\n\n        # Check the parameter sizes\n        for key, elem in kw_1.items():\n            if not isinstance(elem, dict):\n                continue\n            for subkey, subelem in elem.items():\n                if \"dim\" in subkey:\n                    match = f\"{key}:{subkey}\"\n                    if match in [\"pre_nn_kwargs:in_dim\"]:\n                        self.assertNotEqual(subelem, kw_2[key][subkey], msg=match)\n                    elif match in [\n                        \"pre_nn_kwargs:out_dim\",\n                        \"pre_nn_edges_kwargs:out_dim\",\n                        \"gnn_kwargs:in_dim\",\n                        \"graph_output_nn_kwargs:in_dim\",\n                        \"gnn_kwargs:out_dim\",\n                        \"gnn_kwargs:in_dim_edges\",\n                        \"pe_encoders_kwargs:out_dim\",\n                        \"graph_output_nn_kwargs:out_dim\",\n                    ]:\n                        # Divide by 2\n                        self.assertEqual(round(subelem / 2), kw_2[key][subkey], msg=match)\n                    elif match in [\n                        \"pre_nn_kwargs:hidden_dims\",\n                        
\"pre_nn_edges_kwargs:hidden_dims\",\n                        \"gnn_kwargs:hidden_dims\",\n                        \"graph_output_nn_kwargs:hidden_dims\",\n                        \"gnn_kwargs:hidden_dims_edges\",\n                    ]:\n                        # Arrays divide by 2\n                        new_list = [round(e / 2) for e in subelem]\n                        self.assertListEqual(new_list, kw_2[key][subkey], msg=match)\n                elif subkey in [\"homo\", \"alpha\", \"cv\"]:\n                    for subsubkey, subsubelem in subelem.items():\n                        match = f\"{key}:{subsubkey}\"\n                        if match in [\"task_heads_kwargs:out_dim\"]:\n                            # No divide\n                            self.assertEqual(subsubelem, kw_2[key][subkey][subsubkey], msg=match)\n                        elif match in [\"task_heads_kwargs:in_dim\"]:\n                            # Divide by 2\n                            self.assertEqual(round(subsubelem / 2), kw_2[key][subkey][subsubkey], msg=match)\n                        elif match in [\"task_heads_kwargs:hidden_dims\"]:\n                            # Divide each element of the list by 2\n                            new_list = [round(e / 2) for e in subsubelem]\n                            self.assertListEqual(new_list, kw_2[key][subkey][subsubkey], msg=match)\n\n        # Test that the models with divide_factor=1 can be built and run a forward pass\n        
kw_1[\"last_layer_is_readout\"] = False\n        model_1 = FullGraphMultiTaskNetwork(**kw_1)\n        model_1.forward(deepcopy(in_features))\n        kw_1[\"last_layer_is_readout\"] = True\n        model_1 = FullGraphMultiTaskNetwork(**kw_1)\n        model_1.forward(deepcopy(in_features))\n\n        # Test that the models with divide_factor=2 can be built and run a forward pass\n        kw_2[\"last_layer_is_readout\"] = False\n        model_2 = FullGraphMultiTaskNetwork(**kw_2)\n        model_2.forward(deepcopy(in_features))\n        kw_2[\"last_layer_is_readout\"] = True\n        model_2 = FullGraphMultiTaskNetwork(**kw_2)\n        model_2.forward(deepcopy(in_features))\n\n\nif __name__ == \"__main__\":\n    ut.main()\n"
  },
  {
    "path": "tests/test_packing.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\n# General imports\nimport unittest as ut\nimport numpy as np\n\nimport torch\nfrom torch_geometric.data import Data, Batch\n\n# Current library imports\nfrom graphium.utils.packing import (\n    smart_packing,\n    get_pack_sizes,\n    fast_packing,\n    hybrid_packing,\n    node_to_pack_indices_mask,\n)\n\n\ndef random_packing(num_nodes, batch_size):\n    ipu_batch_size = int(len(num_nodes) / batch_size)\n    indices = np.arange(len(num_nodes))\n    np.random.shuffle(indices)\n    indices = np.reshape(indices, (ipu_batch_size, batch_size)).tolist()\n    return indices\n\n\nclass test_Packing(ut.TestCase):\n    def test_smart_packing(self):\n        np.random.seed(42)\n\n        batch_sizes = [2, 4, 8, 16, 32, 64]\n        ipu_batch_sizes = [2, 3, 4, 8, 16, 32, 64]\n\n        for batch_size in batch_sizes:\n            for ipu_batch_size in ipu_batch_sizes:\n                err_msg = f\"bz={batch_size}, ipu_bz={ipu_batch_size}\"\n\n                # Generate random batch size\n                global_batch = batch_size * ipu_batch_size\n                num_nodes = np.abs(np.random.gamma(2, 20, size=global_batch)).astype(int)\n\n                # Use the smart packing\n                packed_indices = smart_packing(num_nodes=num_nodes, batch_size=batch_size)\n                pack_num_nodes = 
get_pack_sizes(packed_indices, num_nodes)\n\n                # Use the random packing\n                rand_packed_indices = random_packing(num_nodes=num_nodes, batch_size=batch_size)\n                rand_pack_num_nodes = get_pack_sizes(rand_packed_indices, num_nodes)\n\n                # Assert that the smart packing is better than the random packing\n                self.assertLessEqual(max(pack_num_nodes), max(rand_pack_num_nodes), msg=err_msg)\n                self.assertGreaterEqual(min(pack_num_nodes), min(rand_pack_num_nodes), msg=err_msg)\n\n                # Assert that the total number of nodes is right\n                self.assertEqual(sum(pack_num_nodes), sum(num_nodes), msg=err_msg)\n                self.assertEqual(sum(rand_pack_num_nodes), sum(num_nodes), msg=err_msg)\n\n                # Assert that all indices are there\n                self.assertListEqual(\n                    np.sort(np.asarray(packed_indices).flatten()).tolist(), np.arange(len(num_nodes)).tolist()\n                )\n                self.assertListEqual(\n                    np.sort(np.asarray(rand_packed_indices).flatten()).tolist(),\n                    np.arange(len(num_nodes)).tolist(),\n                )\n\n    def test_fast_packing(self):\n        np.random.seed(42)\n\n        # Start at 4 for fast_packing to get better statistical significance\n        batch_sizes = [4, 8, 16, 32, 64]\n        ipu_batch_sizes = [4, 8, 16, 32, 64]\n\n        for batch_size in batch_sizes:\n            for ipu_batch_size in ipu_batch_sizes:\n                err_msg = f\"bz={batch_size}, ipu_bz={ipu_batch_size}\"\n\n                # Generate random batch size\n                global_batch = batch_size * ipu_batch_size\n                num_nodes = np.abs(np.random.gamma(2, 20, size=global_batch)).astype(int)\n\n                # Use the fast packing\n                packed_indices = fast_packing(num_nodes=num_nodes, batch_size=batch_size)\n                pack_num_nodes = 
get_pack_sizes(packed_indices, num_nodes)\n\n                # Use the random packing\n                rand_packed_indices = random_packing(num_nodes=num_nodes, batch_size=batch_size)\n                rand_pack_num_nodes = get_pack_sizes(rand_packed_indices, num_nodes)\n\n                # Assert that the fast packing is better than the random packing\n                self.assertLessEqual(max(pack_num_nodes), max(rand_pack_num_nodes), msg=err_msg)\n                self.assertGreaterEqual(min(pack_num_nodes), min(rand_pack_num_nodes), msg=err_msg)\n\n                # Assert that the total number of nodes is right\n                self.assertEqual(sum(pack_num_nodes), sum(num_nodes), msg=err_msg)\n                self.assertEqual(sum(rand_pack_num_nodes), sum(num_nodes), msg=err_msg)\n\n                # Assert that all indices are there\n                self.assertListEqual(\n                    np.sort(np.asarray(packed_indices).flatten()).tolist(), np.arange(len(num_nodes)).tolist()\n                )\n                self.assertListEqual(\n                    np.sort(np.asarray(rand_packed_indices).flatten()).tolist(),\n                    np.arange(len(num_nodes)).tolist(),\n                )\n\n    def test_hybrid_packing(self):\n        np.random.seed(42)\n\n        batch_sizes = [2, 4, 8, 16, 32, 64]\n        ipu_batch_sizes = [2, 3, 4, 8, 16, 32, 64]\n\n        for batch_size in batch_sizes:\n            for ipu_batch_size in ipu_batch_sizes:\n                err_msg = f\"bz={batch_size}, ipu_bz={ipu_batch_size}\"\n\n                # Generate random batch size\n                global_batch = batch_size * ipu_batch_size\n                num_nodes = np.abs(np.random.gamma(2, 20, size=global_batch)).astype(int)\n\n                # Use the hybrid packing\n                packed_indices = hybrid_packing(num_nodes=num_nodes, batch_size=batch_size)\n                pack_num_nodes = get_pack_sizes(packed_indices, num_nodes)\n\n                # Use the random 
packing\n                rand_packed_indices = random_packing(num_nodes=num_nodes, batch_size=batch_size)\n                rand_pack_num_nodes = get_pack_sizes(rand_packed_indices, num_nodes)\n\n                # Assert that the hybrid packing is better than the random packing\n                self.assertLessEqual(max(pack_num_nodes), max(rand_pack_num_nodes), msg=err_msg)\n                self.assertGreaterEqual(min(pack_num_nodes), min(rand_pack_num_nodes), msg=err_msg)\n\n                # Assert that the total number of nodes is right\n                self.assertEqual(sum(pack_num_nodes), sum(num_nodes), msg=err_msg)\n                self.assertEqual(sum(rand_pack_num_nodes), sum(num_nodes), msg=err_msg)\n\n                # Assert that all indices are there\n                self.assertListEqual(\n                    np.sort(np.asarray(packed_indices).flatten()).tolist(), np.arange(len(num_nodes)).tolist()\n                )\n                self.assertListEqual(\n                    np.sort(np.asarray(rand_packed_indices).flatten()).tolist(),\n                    np.arange(len(num_nodes)).tolist(),\n                )\n\n    def test_node_to_pack_indices_mask(self):\n        # Dimensions for the dummy batch\n        in_dim = 7\n        in_dim_edges = 11\n        max_num_nodes_per_graph = 20\n        batch_size_per_pack = 5\n\n        torch.manual_seed(42)\n\n        # Create a dummy batch of graphs\n        batch, all_num_nodes = [], []\n        for ii in range(100):\n            num_nodes = torch.randint(1, max_num_nodes_per_graph, (1,)).item()\n            all_num_nodes.append(num_nodes)\n            num_edges = abs(round(2.2 * num_nodes) + torch.randint(-2, 2, (1,)).item()) + 1\n            x = torch.randn(num_nodes, in_dim, dtype=torch.float32)\n            edge_idx = torch.randint(0, num_nodes, (2, num_edges))\n            e = torch.randn(edge_idx.shape[-1], in_dim_edges, dtype=torch.float32)\n            g = Data(h=x, edge_index=edge_idx, edge_attr=e)\n            
batch.append(g)\n        batch = Batch.from_data_list(batch)\n\n        # Get the packing\n        packed_graph_idx = fast_packing(all_num_nodes, batch_size_per_pack)\n        pack_sizes = get_pack_sizes(packed_graph_idx, all_num_nodes)\n        max_pack_size = max(pack_sizes)\n        num_packs = len(pack_sizes)\n\n        # Get the node to pack indices and the mask\n        pack_from_node_idx, pack_attn_mask = node_to_pack_indices_mask(packed_graph_idx, all_num_nodes)\n\n        # Assert that the nodes to pack indices are correct\n        h = torch.arange(batch.num_nodes, dtype=torch.float32)\n        packed_shape = [num_packs, max_pack_size]\n        h_packed = torch.zeros(packed_shape)\n        h_packed[pack_from_node_idx[:, 0], pack_from_node_idx[:, 1]] = h\n        h_packed_unique = torch.sort(torch.unique(h_packed))[0]\n        np.testing.assert_array_equal(h_packed_unique, torch.arange(batch.num_nodes))\n        self.assertEqual(h_packed.sum(), h.sum())\n\n        # Test again with additional h dimension\n        h = batch.h\n        packed_shape = [num_packs, max_pack_size] + list(h.shape[1:])\n        h_packed = torch.zeros(packed_shape)\n        h_packed[pack_from_node_idx[:, 0], pack_from_node_idx[:, 1]] = h\n        h_packed_unique = torch.sort(torch.unique(h_packed))[0]\n        h_packed_unique = h_packed_unique[h_packed_unique != 0]\n        np.testing.assert_array_almost_equal(h_packed_unique, torch.unique(h))\n        self.assertAlmostEqual(h_packed.sum().item(), h.sum().item(), places=3)\n\n        # Assert that the mask is correct by counting the number of False values (the sum of squared number of nodes per pack)\n        num_false = (~pack_attn_mask).sum([1, 2])\n        num_expected = torch.as_tensor(\n            [sum([all_num_nodes[graph_idx] ** 2 for graph_idx in pack]) for pack in packed_graph_idx]\n        )\n        np.testing.assert_array_equal(num_false, num_expected)\n\n        # Assert that the mask is correct by counting the number 
of elements in each row and column\n        num_expected = []\n        for pack in packed_graph_idx:\n            pack_num_expected = []\n            for graph_idx in pack:\n                num_nodes = all_num_nodes[graph_idx]\n                for ii in range(num_nodes):\n                    pack_num_expected.append(num_nodes)\n            pack_num_expected.extend([0] * (max_pack_size - len(pack_num_expected)))\n            num_expected.append(pack_num_expected)\n        num_expected = torch.as_tensor(num_expected)\n        num_false_row = (~pack_attn_mask).sum([2])\n        num_false_col = (~pack_attn_mask).sum([1])\n        np.testing.assert_array_equal(num_false_row, num_expected)\n        np.testing.assert_array_equal(num_false_col, num_expected)\n\n\nif __name__ == \"__main__\":\n    ut.main()\n"
  },
  {
    "path": "tests/test_pe_nodepair.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\n\"\"\"\nUnit tests for the positional encodings in graphium/features/*\n\"\"\"\n\nimport numpy as np\nimport networkx as nx\nimport unittest as ut\n\nfrom graphium.features.electrostatic import compute_electrostatic_interactions\nfrom graphium.features.commute import compute_commute_distances\nfrom graphium.features.graphormer import compute_graphormer_distances\n\n\nclass test_positional_encodings(ut.TestCase):\n    # Test graphs\n    adj_dict = {}\n    max_dict = {}\n\n    # 6-ring\n    adj = np.asarray(\n        [\n            [0, 1, 0, 0, 0, 1],\n            [1, 0, 1, 0, 0, 0],\n            [0, 1, 0, 1, 0, 0],\n            [0, 0, 1, 0, 1, 0],\n            [0, 0, 0, 1, 0, 1],\n            [1, 0, 0, 0, 1, 0],\n        ]\n    )\n    adj_dict[\"6-ring\"] = adj\n    max_dict[\"6-ring\"] = 3\n\n    # 5-path\n    G = nx.path_graph(5)\n    adj = nx.to_numpy_array(G)\n    adj_dict[\"5-path\"] = adj\n    max_dict[\"5-path\"] = 4\n\n    # 4-clique\n    adj = 1 - np.eye(4)\n    adj_dict[\"4-clique\"] = adj\n    max_dict[\"4-clique\"] = 1\n\n    # 4-barbell\n    H = nx.barbell_graph(4, 0)\n    adj = nx.to_numpy_array(H)\n    adj_dict[\"4-barbell\"] = adj\n    max_dict[\"4-barbell\"] = 3\n\n    def test_dimensions(self):\n        for _, adj in self.adj_dict.items():\n            pe, _, _ = 
compute_electrostatic_interactions(adj, cache={})\n            self.assertEqual(pe.shape, adj.shape)\n\n            pe, _, _ = compute_graphormer_distances(adj, adj.shape[0], cache={})\n            self.assertEqual(pe.shape, adj.shape)\n\n            pe, _, _ = compute_commute_distances(adj, adj.shape[0], cache={})\n            self.assertEqual(pe.shape, adj.shape)\n\n    def test_symmetry(self):\n        for _, adj in self.adj_dict.items():\n            pe, _, _ = compute_graphormer_distances(adj, adj.shape[0], cache={})\n            np.testing.assert_array_almost_equal(pe, pe.T)\n\n            pe, _, _ = compute_commute_distances(adj, adj.shape[0], cache={})\n            np.testing.assert_array_almost_equal(pe, pe.T)\n\n    def test_max_dist(self):\n        for key, adj in self.adj_dict.items():\n            pe, _, _ = compute_graphormer_distances(adj, adj.shape[0], cache={})\n            np.testing.assert_array_almost_equal(pe.max(), self.max_dict[key])\n\n\nif __name__ == \"__main__\":\n    ut.main()\n"
  },
  {
    "path": "tests/test_pe_rw.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\n\"\"\"\nUnit tests for the positional encodings in graphium/features/*\n\"\"\"\n\nimport numpy as np\nimport networkx as nx\nimport unittest as ut\n\nfrom graphium.features.rw import compute_rwse\n\n\nclass test_pe_rw(ut.TestCase):\n    def test_caching_and_outputs(self):\n        # 4-barbell\n        G = nx.barbell_graph(4, 0)\n        adj = nx.to_numpy_array(G)\n        num_nodes = adj.shape[0]\n        cache = {}\n\n        ksteps1 = [4, 6]\n        ksteps2 = [2]\n        ksteps3 = [6, 7]\n\n        pe1, _, cache = compute_rwse(\n            adj.astype(np.float32), ksteps1, num_nodes, cache, pos_type=\"rw_transition_probs\"\n        )\n\n        pe2, _, cache = compute_rwse(\n            adj.astype(np.float32), ksteps2, num_nodes, cache, pos_type=\"rw_return_probs\"\n        )\n\n        pe3, _, cache = compute_rwse(\n            adj.astype(np.float32), ksteps3, num_nodes, cache, pos_type=\"rw_return_probs\"\n        )\n\n        self.assertTrue(all([k in cache[\"ksteps\"] for k in ksteps1 + ksteps2 + ksteps3]))\n        self.assertEqual(pe1.shape, (num_nodes, num_nodes, len(ksteps1)))\n        self.assertEqual(pe2.shape, (num_nodes, len(ksteps2)))\n        self.assertEqual(pe3.shape, (num_nodes, len(ksteps3)))\n\n        for i in range(len(ksteps1)):\n            
np.testing.assert_array_almost_equal(pe1[..., i].sum(1), np.ones(num_nodes))\n\n        self.assertGreaterEqual(pe1.min(), 0.0)\n        self.assertLessEqual(pe1.max(), 1.0)\n\n        self.assertGreaterEqual(pe2.min(), 0.0)\n        self.assertLessEqual(pe2.max(), 1.0)\n\n        self.assertGreaterEqual(pe3.min(), 0.0)\n        self.assertLessEqual(pe3.max(), 1.0)\n\n\nif __name__ == \"__main__\":\n    ut.main()\n"
  },
  {
    "path": "tests/test_pe_spectral.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\n\"\"\"\nUnit tests for the positional encodings in graphium/features/*\n\"\"\"\n\nimport numpy as np\nimport networkx as nx\nimport unittest as ut\n\nfrom graphium.features.spectral import compute_laplacian_pe\n\n\nclass test_pe_spectral(ut.TestCase):\n    # 2 disconnected 3-cliques\n    adj1 = np.zeros((6, 6))\n    adj_3clq = 1 - np.eye(3)\n    adj1[:3, :3] = adj_3clq\n    adj1[3:, 3:] = adj_3clq\n\n    # 6-clique\n    adj2 = 1 - np.eye(6)\n\n    def test_for_connected_vs_disconnected_graph(self):\n        num_pos = 3\n\n        # test if pe works identically on connected vs disconnected graphs\n        eigvals_pe1, _, _, cache = compute_laplacian_pe(self.adj1, num_pos, cache={})\n        eigvals_pe1 = np.real(eigvals_pe1).astype(np.float32)\n        _, eigvecs_pe1, _, _ = compute_laplacian_pe(self.adj1, num_pos, cache=cache)\n\n        # We expect 4 objects to be cached when running the function for the first time\n        self.assertEqual(len(cache.keys()), 4)\n\n        eigvals_pe2, _, _, _ = compute_laplacian_pe(self.adj2, num_pos, cache={})\n        eigvals_pe2 = np.real(eigvals_pe2).astype(np.float32)\n        _, eigvecs_pe2, _, _ = compute_laplacian_pe(self.adj2, num_pos, cache={})\n\n        np.testing.assert_array_almost_equal(2 * eigvals_pe1, eigvals_pe2)\n        
self.assertListEqual(list(eigvals_pe2.shape), [self.adj2.shape[0], num_pos])\n        self.assertListEqual(list(eigvecs_pe2.shape), [self.adj2.shape[0], num_pos])\n\n\nif __name__ == \"__main__\":\n    ut.main()\n"
  },
  {
    "path": "tests/test_pos_transfer_funcs.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\n\"\"\"\nUnit tests for the positional encodings in graphium/features/*\n\"\"\"\n\nimport numpy as np\nimport networkx as nx\nimport unittest as ut\n\nfrom graphium.features.spectral import compute_laplacian_pe\nfrom graphium.features.transfer_pos_level import (\n    node_to_edge,\n    node_to_nodepair,\n    edge_to_nodepair,\n    nodepair_to_node,\n    nodepair_to_edge,\n    graph_to_node,\n)\n\n\nclass test_pos_transfer_funcs(ut.TestCase):\n    # 4-barbell\n    G = nx.barbell_graph(4, 0)\n    adj = nx.to_numpy_array(G)\n    num_nodes, num_feat = 8, 5\n    node_pe = np.random.rand(num_nodes, num_feat)\n\n    def test_different_pathways_from_node_to_edge(self):\n        edge_pe1, _ = node_to_edge(self.node_pe, self.adj, {})\n        nodepair_pe1 = node_to_nodepair(self.node_pe, self.num_nodes)\n        edge_pe2, _ = nodepair_to_edge(nodepair_pe1, self.adj, {})\n        nodepair_pe2, _ = edge_to_nodepair(edge_pe1, self.adj, self.num_nodes, {})\n        edge_pe3, _ = nodepair_to_edge(nodepair_pe2, self.adj, {})\n        np.testing.assert_array_almost_equal(edge_pe1, edge_pe2)\n        np.testing.assert_array_almost_equal(edge_pe1, edge_pe3)\n\n\nif __name__ == \"__main__\":\n    ut.main()\n"
  },
  {
    "path": "tests/test_positional_encoders.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\n\"\"\"\nUnit tests for the different datasets of graphium/features/featurizer.py\n\"\"\"\n\nimport numpy as np\nimport unittest as ut\nfrom copy import deepcopy\nfrom rdkit import Chem\nimport datamol as dm\nimport torch\nfrom scipy.sparse import coo_matrix\n\nfrom graphium.features.featurizer import GraphDict\nfrom graphium.features.positional_encoding import graph_positional_encoder\nfrom graphium.nn.encoders import laplace_pos_encoder, mlp_encoder, signnet_pos_encoder\n\n# TODO: Test the MLP_encoder and signnet_pos_encoder\n\n\nclass test_positional_encoder(ut.TestCase):\n    smiles = [\n        \"C\",\n        \"CC\",\n        \"CC.CCCC\",\n        \"CC(C)CC1=CC=C(C=C1)C(C)C(=O)O\",\n        \"OCCc1c(C)[n+](cs1)Cc2cnc(C)nc2N\",\n        \"O1C=C[C@H]([C@H]1O2)c3c2cc(OC)c4c3OC(=O)C5=C4CCC(=O)5\",\n    ]\n    mols = [dm.to_mol(s) for s in smiles]\n    adjs = [Chem.rdmolops.GetAdjacencyMatrix(mol) for mol in mols]\n\n    def test_laplacian_eigvec_eigval(self):\n        for ii, adj in enumerate(deepcopy(self.adjs)):\n            for num_pos in [1, 2, 4]:  # Can't test too much eigs because of multiplicities\n                for disconnected_comp in [True, False]:\n                    err_msg = f\"adj_id={ii}, num_pos={num_pos}, disconnected_comp={disconnected_comp}\"\n\n                    # returns a 
dictionary of computed pe\n                    pos_kwargs = {\n                        \"pos_type\": \"laplacian_eigvec\",\n                        \"num_pos\": num_pos,\n                        \"disconnected_comp\": disconnected_comp,\n                        \"pos_level\": \"node\",\n                    }\n                    num_nodes = adj.shape[0]\n                    eigvecs, cache = graph_positional_encoder(adj, num_nodes, pos_kwargs=pos_kwargs)\n                    pos_kwargs[\"pos_type\"] = \"laplacian_eigval\"\n                    eigvals, cache = graph_positional_encoder(adj, num_nodes, pos_kwargs=pos_kwargs)\n\n                    self.assertEqual(list(eigvecs.shape), [adj.shape[0], num_pos], msg=err_msg)\n                    self.assertEqual(list(eigvals.shape), [adj.shape[0], num_pos], msg=err_msg)\n\n                    # Compute eigvals and eigvecs\n                    lap = np.diag(np.sum(adj, axis=1)) - adj\n                    true_eigvals, true_eigvecs = np.linalg.eig(lap)\n                    sort_idx = np.argsort(true_eigvals)\n                    true_eigvals, true_eigvecs = true_eigvals[sort_idx], true_eigvecs[:, sort_idx]\n                    true_eigvecs = true_eigvecs / (np.sum(true_eigvecs**2, axis=0, keepdims=True) + 1e-8)\n\n                    true_num_pos = min(num_pos, len(true_eigvals))\n                    true_eigvals, true_eigvecs = true_eigvals[:true_num_pos], true_eigvecs[:, :true_num_pos]\n\n                    if not (\".\" in self.smiles[ii]):\n                        np.testing.assert_array_almost_equal(\n                            np.abs(true_eigvecs),\n                            np.abs(eigvecs[:, :true_num_pos]),\n                            decimal=6,\n                            err_msg=err_msg,\n                        )\n                        self.assertAlmostEqual(np.sum(true_eigvecs[:, 1:]), 0, places=6, msg=err_msg)\n                        np.testing.assert_array_almost_equal(\n                            
true_eigvals, eigvals[0, :true_num_pos], decimal=6, err_msg=err_msg\n                        )\n\n    # didn't actually check the exact computation result because the code was adapted\n    def test_rwse(self):\n        for ii, adj in enumerate(deepcopy(self.adjs)):\n            for ksteps in [1, 2, 4]:\n                err_msg = f\"adj_id={ii}, ksteps={ksteps}\"\n\n                num_nodes = adj.shape[0]\n                pos_kwargs = {\"pos_type\": \"rw_return_probs\", \"ksteps\": ksteps, \"pos_level\": \"node\"}\n                rwse_embed, cache = graph_positional_encoder(adj, num_nodes, pos_kwargs=pos_kwargs)\n                self.assertEqual(list(rwse_embed.shape), [num_nodes, ksteps], msg=err_msg)\n\n    # TODO: work in progress\n\n    \"\"\"\n    continue debugging here, see how to adapt the laplace_pos_encoder\n    code running now, question is where to add the laplace_pos_encoder\n    \"\"\"\n\n    def test_laplacian_eigvec_with_encoder(self):\n        for ii, adj in enumerate(deepcopy(self.adjs)):\n            for num_pos in [2, 4, 8]:  # Can't test too much eigs because of multiplicities\n                for disconnected_comp in [True, False]:\n                    for model_type in [\"Transformer\", \"DeepSet\", \"MLP\"]:\n                        err_msg = f\"adj_id={ii}, num_pos={num_pos}, disconnected_comp={disconnected_comp}\"\n\n                        # returns a dictionary of computed pe\n                        pos_kwargs = {\n                            \"pos_type\": \"laplacian_eigvec\",\n                            \"num_pos\": num_pos,\n                            \"disconnected_comp\": disconnected_comp,\n                            \"pos_level\": \"node\",\n                        }\n                        num_nodes = adj.shape[0]\n                        eigvecs, cache = graph_positional_encoder(adj, num_nodes, pos_kwargs=pos_kwargs)\n                        pos_kwargs[\"pos_type\"] = \"laplacian_eigval\"\n                        eigvals, 
cache = graph_positional_encoder(adj, num_nodes, pos_kwargs=pos_kwargs)\n\n                        input_keys = [\"laplacian_eigvec\", \"laplacian_eigval\"]\n                        in_dim = num_pos\n                        hidden_dim = 64\n                        out_dim = 64\n                        num_layers = 1\n\n                        eigvecs = torch.from_numpy(eigvecs)\n                        eigvals = torch.from_numpy(eigvals)\n\n                        g = GraphDict(\n                            {\n                                \"adj\": coo_matrix(adj),\n                                \"data\": {\"laplacian_eigval\": eigvals, \"laplacian_eigvec\": eigvecs},\n                            }\n                        )\n                        batch = g.make_pyg_graph()\n\n                        encoder = laplace_pos_encoder.LapPENodeEncoder(\n                            input_keys=input_keys,\n                            output_keys=[\"node\"],\n                            in_dim=in_dim,  # Size of Laplace PE embedding\n                            hidden_dim=hidden_dim,\n                            out_dim=out_dim,\n                            model_type=model_type,  # 'Transformer' or 'DeepSet'\n                            num_layers=num_layers,\n                            num_layers_post=2,  # Num. layers to apply after pooling\n                            dropout=0.1,\n                            first_normalization=None,\n                        )\n\n                        hidden_embed = encoder(batch, key_prefix=None)\n                        assert \"node\" in hidden_embed.keys()\n                        self.assertEqual(list(hidden_embed[\"node\"].shape), [num_nodes, out_dim], msg=err_msg)\n\n\nif __name__ == \"__main__\":\n    ut.main()\n"
  },
  {
    "path": "tests/test_positional_encodings.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\n\"\"\"\nUnit tests for the positional encodings in graphium/features/*\n\"\"\"\n\nimport numpy as np\nimport networkx as nx\nimport unittest as ut\n\n# from graphium.features.spectral import compute_laplacian_positional_eigvecs # TODO: add tests\n# from graphium.features.rw import compute_rwse # TODO: add tests\nfrom graphium.features.electrostatic import compute_electrostatic_interactions\nfrom graphium.features.commute import compute_commute_distances\nfrom graphium.features.graphormer import compute_graphormer_distances\n\n\nclass test_positional_encodings(ut.TestCase):\n    # Test graphs\n    adj_dict = {}\n    max_dict = {}\n\n    # 6-ring\n    adj = np.asarray(\n        [\n            [0, 1, 0, 0, 0, 1],\n            [1, 0, 1, 0, 0, 0],\n            [0, 1, 0, 1, 0, 0],\n            [0, 0, 1, 0, 1, 0],\n            [0, 0, 0, 1, 0, 1],\n            [1, 0, 0, 0, 1, 0],\n        ]\n    )\n    adj_dict[\"6-ring\"] = adj\n    max_dict[\"6-ring\"] = 3\n\n    # 5-path\n    G = nx.path_graph(5)\n    adj = nx.to_numpy_array(G)\n    adj_dict[\"5-path\"] = adj\n    max_dict[\"5-path\"] = 4\n\n    # 4-clique\n    adj = 1 - np.eye(4)\n    adj_dict[\"4-clique\"] = adj\n    max_dict[\"4-clique\"] = 1\n\n    # 4-barbell\n    H = nx.barbell_graph(4, 0)\n    adj = nx.to_numpy_array(H)\n    adj_dict[\"4-barbell\"] = adj\n   
 max_dict[\"4-barbell\"] = 3\n\n    def test_dimensions(self):\n        for _, adj in self.adj_dict.items():\n            pe, _, _ = compute_electrostatic_interactions(adj, cache={})\n            self.assertEqual(pe.shape, adj.shape)\n\n            pe, _, _ = compute_graphormer_distances(adj, adj.shape[0], cache={})\n            self.assertEqual(pe.shape, adj.shape)\n\n            pe, _, _ = compute_commute_distances(adj, adj.shape[0], cache={})\n            self.assertEqual(pe.shape, adj.shape)\n\n    def test_symmetry(self):\n        for _, adj in self.adj_dict.items():\n            pe, _, _ = compute_graphormer_distances(adj, adj.shape[0], cache={})\n            np.testing.assert_array_almost_equal(pe, pe.T)\n\n            pe, _, _ = compute_commute_distances(adj, adj.shape[0], cache={})\n            np.testing.assert_array_almost_equal(pe, pe.T)\n\n    def test_max_dist(self):\n        for key, adj in self.adj_dict.items():\n            pe, _, _ = compute_graphormer_distances(adj, adj.shape[0], cache={})\n            np.testing.assert_array_almost_equal(pe.max(), self.max_dict[key])\n\n\nif __name__ == \"__main__\":\n    ut.main()\n"
  },
  {
    "path": "tests/test_predictor.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\n\"\"\"\nUnit tests for the file graphium/trainer/predictor.py\n\"\"\"\n\nimport torch\nfrom torch.nn import BCELoss, MSELoss\nimport unittest as ut\n\nfrom graphium.trainer.predictor_options import EvalOptions\n\n\nclass test_Predictor(ut.TestCase):\n    def test_parse_loss_fun(self):\n        losses = [\"bce\", \"mse\", \"mae\", \"l1\", BCELoss(), MSELoss()]\n        preds = torch.rand(10, 5)\n        target = (torch.rand(10, 5) > 0.5).to(preds.dtype)\n        for this_loss in losses:\n            loss_fun = EvalOptions.parse_loss_fun(this_loss)\n            loss = loss_fun(preds, target)\n\n\nif __name__ == \"__main__\":\n    ut.main()\n"
  },
  {
    "path": "tests/test_pyg_layers.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\n\"\"\"\nUnit tests for the different layers of graphium/nn/pyg_layers/...\n\nThe layers are not thoroughly tested due to the difficulty of testing them\n\"\"\"\n\nimport numpy as np\nimport torch\nimport unittest as ut\nfrom torch_geometric.data import Data, Batch\nfrom copy import deepcopy\nimport pytest\n\nfrom graphium.nn.pyg_layers import (\n    GINConvPyg,\n    GINEConvPyg,\n    MPNNPlusPyg,\n    GatedGCNPyg,\n    PNAMessagePassingPyg,\n    GPSLayerPyg,\n    VirtualNodePyg,\n    DimeNetPyg,\n)\n\nfrom graphium.nn.pyg_layers.utils import (\n    PreprocessPositions,\n    GaussianLayer,\n)\n\n\nclass test_Pyg_Layers(ut.TestCase):\n    in_dim = 21\n    out_dim = 11\n    in_dim_edges = 13\n    out_dim_edges = 17\n\n    edge_idx1 = torch.stack([torch.tensor([0, 1, 2, 3, 2]), torch.tensor([1, 2, 3, 0, 0])])\n    edge_idx2 = torch.stack([torch.tensor([0, 0, 0, 1]), torch.tensor([0, 1, 2, 0])])\n    x1 = torch.randn(edge_idx1.max() + 1, in_dim, dtype=torch.float32)\n    e1 = torch.randn(edge_idx1.shape[-1], in_dim_edges, dtype=torch.float32)\n    x2 = torch.randn(edge_idx2.max() + 1, in_dim, dtype=torch.float32)\n    e2 = torch.randn(edge_idx2.shape[-1], in_dim_edges, dtype=torch.float32)\n    # edge_idx1, e1 = add_self_loops(edge_idx1, e1)\n    # edge_idx2, e2 = add_self_loops(edge_idx2, e2)\n    g1 = 
Data(feat=x1, edge_index=edge_idx1, edge_feat=e1)\n    g2 = Data(feat=x2, edge_index=edge_idx2, edge_feat=e2)\n    bg = Batch.from_data_list([g1, g2])\n\n    kwargs = {\n        \"activation\": \"relu\",\n        \"dropout\": 0.1,\n        \"normalization\": \"batch_norm\",\n        \"droppath_rate\": 0.1,\n        \"layer_idx\": 1,\n        \"layer_depth\": 10,\n    }\n\n    def test_gpslayer(self):\n        bg = deepcopy(self.bg)\n        feat_in = bg.feat\n        kwargs = deepcopy(self.kwargs)\n        kwargs.pop(\"droppath_rate\")\n        kwargs[\"droppath_rate_attn\"] = 0.2\n        kwargs[\"droppath_rate_ffn\"] = 0.3\n        kwargs[\"mpnn_kwargs\"] = {\"droppath_rate_ffn\": 0.4}\n\n        layer = GPSLayerPyg(in_dim=self.in_dim, out_dim=self.out_dim, **kwargs)\n\n        # Check the re-implementation of abstract methods\n        self.assertTrue(layer.layer_supports_edges)\n        self.assertTrue(layer.layer_inputs_edges)\n        self.assertFalse(layer.layer_outputs_edges)\n        self.assertEqual(layer.out_dim_factor, 1)\n\n        # Apply layer. 
Should crash due to different dim for nodes vs edges\n        with self.assertRaises(ValueError):\n            bg = layer.forward(bg)\n\n        # Create new edge attributes with same dim and check that it works\n        bg.edge_feat = torch.zeros((bg.edge_feat.shape[0], self.in_dim), dtype=torch.float32)\n        bg = layer.forward(bg)\n        self.assertEqual(bg.feat.shape[0], feat_in.shape[0])\n        self.assertEqual(bg.feat.shape[1], self.out_dim * layer.out_dim_factor)\n\n    def test_ginlayer(self):\n        bg = deepcopy(self.bg)\n        feat_in = bg.feat\n        layer = GINConvPyg(in_dim=self.in_dim, out_dim=self.out_dim, **self.kwargs)\n\n        # Check the re-implementation of abstract methods\n        self.assertFalse(layer.layer_supports_edges)\n        self.assertFalse(layer.layer_inputs_edges)\n        self.assertFalse(layer.layer_outputs_edges)\n        self.assertEqual(layer.out_dim_factor, 1)\n\n        # Apply layer\n        bg = layer.forward(bg)\n        self.assertEqual(bg.feat.shape[0], feat_in.shape[0])\n        self.assertEqual(bg.feat.shape[1], self.out_dim * layer.out_dim_factor)\n\n    def test_ginelayer(self):\n        bg = deepcopy(self.bg)\n        feat_in = bg.feat\n        layer = GINEConvPyg(in_dim=self.in_dim, out_dim=self.out_dim, **self.kwargs)\n\n        # Check the re-implementation of abstract methods\n        self.assertTrue(layer.layer_supports_edges)\n        self.assertTrue(layer.layer_inputs_edges)\n        self.assertFalse(layer.layer_outputs_edges)\n        self.assertEqual(layer.out_dim_factor, 1)\n\n        # Apply layer. 
Should crash due to different dim for nodes vs edges\n        with self.assertRaises(ValueError):\n            bg = layer.forward(bg)\n\n        # Create new edge attributes with same dim and check that it works\n        bg.edge_feat = torch.zeros((bg.edge_feat.shape[0], self.in_dim), dtype=torch.float32)\n        bg = layer.forward(bg)\n        self.assertEqual(bg.feat.shape[0], feat_in.shape[0])\n        self.assertEqual(bg.feat.shape[1], self.out_dim * layer.out_dim_factor)\n\n    def test_mpnnlayer(self):\n        bg = deepcopy(self.bg)\n        feat_in = bg.feat\n        # need in_dim = out_dim for skip connection\n        # mpnn layer accept different dimension for node and edge features\n        layer = MPNNPlusPyg(\n            in_dim=self.in_dim,\n            out_dim=self.in_dim,\n            use_edges=True,\n            in_dim_edges=self.in_dim_edges,\n            out_dim_edges=self.in_dim_edges,\n            **self.kwargs,\n        )\n\n        # Check the re-implementation of abstract methods\n        self.assertTrue(layer.layer_supports_edges)\n        self.assertTrue(layer.layer_inputs_edges)\n        self.assertTrue(layer.layer_outputs_edges)\n        self.assertEqual(layer.out_dim_factor, 1)\n\n        # Create new edge attributes with same dim and check that it works\n        bg.edge_feat = torch.zeros((bg.edge_feat.shape[0], self.in_dim_edges), dtype=torch.float32)\n        bg = layer.forward(bg)\n        self.assertEqual(bg.feat.shape[0], feat_in.shape[0])\n        self.assertEqual(bg.feat.shape[1], self.in_dim * layer.out_dim_factor)\n\n    def test_gatedgcnlayer(self):\n        bg = deepcopy(self.bg)\n        feat_in = bg.feat\n        e_in = bg.edge_feat\n        layer = GatedGCNPyg(\n            in_dim=self.in_dim,\n            out_dim=self.out_dim,\n            in_dim_edges=self.in_dim_edges,\n            out_dim_edges=self.out_dim_edges,\n            **self.kwargs,\n        )\n\n        # Check the re-implementation of abstract methods\n    
    self.assertTrue(layer.layer_supports_edges)\n        self.assertTrue(layer.layer_inputs_edges)\n        self.assertTrue(layer.layer_outputs_edges)\n        self.assertIsInstance(layer.layer_supports_edges, bool)\n        self.assertIsInstance(layer.layer_inputs_edges, bool)\n        self.assertIsInstance(layer.layer_outputs_edges, bool)\n        self.assertEqual(layer.out_dim_factor, 1)\n\n        # Apply layer with edges\n        bg2 = layer.forward(bg)\n        self.assertEqual(bg2.feat.shape[0], feat_in.shape[0])\n        self.assertEqual(bg2.feat.shape[1], self.out_dim * layer.out_dim_factor)\n\n    def test_pnamessagepassinglayer(self):\n        bg = deepcopy(self.bg)\n        feat_in = bg.feat\n        aggregators = [\"mean\", \"max\", \"min\", \"std\", \"sum\"]\n        scalers = [\"identity\", \"amplification\", \"attenuation\"]\n\n        layer = PNAMessagePassingPyg(\n            in_dim=self.in_dim, out_dim=self.out_dim, aggregators=aggregators, scalers=scalers, **self.kwargs\n        )\n\n        # Check the re-implementation of abstract methods\n        self.assertTrue(layer.layer_supports_edges)\n        self.assertIsInstance(layer.layer_supports_edges, bool)\n        # self.assertFalse(layer.layer_inputs_edges)\n        self.assertFalse(layer.layer_outputs_edges)\n        self.assertEqual(layer.out_dim_factor, 1)\n\n        # Apply layer\n        bg2 = layer.forward(bg)\n        self.assertEqual(bg2.feat.shape[0], feat_in.shape[0])\n        self.assertEqual(bg2.feat.shape[1], self.out_dim * layer.out_dim_factor)\n\n        # Now try with edges\n        bg = deepcopy(self.bg)\n        layer = PNAMessagePassingPyg(\n            in_dim=self.in_dim,\n            out_dim=self.out_dim,\n            aggregators=aggregators,\n            scalers=scalers,\n            in_dim_edges=self.in_dim_edges,\n            **self.kwargs,\n        )\n\n        # Check the re-implementation of abstract methods\n        self.assertTrue(layer.layer_supports_edges)\n      
  # self.assertTrue(layer.layer_inputs_edges)\n        self.assertIsInstance(layer.layer_supports_edges, bool)\n        # self.assertIsInstance(layer.layer_inputs_edges, bool)\n        self.assertFalse(layer.layer_outputs_edges)\n        self.assertEqual(layer.out_dim_factor, 1)\n\n        bg2 = layer.forward(bg)\n        self.assertEqual(bg2.feat.shape[0], feat_in.shape[0])\n        self.assertEqual(bg2.feat.shape[1], self.out_dim * layer.out_dim_factor)\n        self.assertTrue((bg2.edge_feat == self.bg.edge_feat).all)\n\n    @pytest.mark.skip_ipu\n    def test_dimenetlayer(self):\n        from graphium.nn.encoders.bessel_pos_encoder import BesselSphericalPosEncoder\n\n        bg = deepcopy(self.bg)\n        # dummy input pos\n        bg.pos = torch.randn((bg.feat.shape[0], 3), dtype=torch.float32)\n\n        # position 3d encoder\n        pos_enc = BesselSphericalPosEncoder(\n            input_keys=[\"pos\"],\n            output_keys=[\n                \"node_feat\",\n                \"edge_feat\",\n                \"edge_rbf\",\n                \"triplet_sbf\",\n                \"radius_edge_index\",\n            ],  # The keys to return\n            in_dim=3,\n            out_dim=self.in_dim,\n            out_dim_edges=self.in_dim_edges,\n            num_output_layers=2,\n            num_layers=2,\n            num_spherical=4,\n            num_radial=32,\n        )\n\n        enc_output = pos_enc(bg, None)  # forward requires: pos, edge_index, batch\n\n        bg.feat = bg.feat + enc_output[\"node_feat\"]  # [num_nodes, in_dim]\n        bg.edge_feat = enc_output[\"edge_feat\"]  # [num_edges, out_dim_edges]\n        bg.radius_edge_index = enc_output[\"radius_edge_index\"]  # [2, num_edges]\n        bg.edge_rbf = enc_output[\"edge_rbf\"]  # [num_edges, num_radial]\n        bg.triplet_sbf = enc_output[\"triplet_sbf\"]  # [num_triplets, num_spherical * num_radial]\n\n        kwargs = deepcopy(self.kwargs)\n        kwargs[\"num_bilinear\"] = 32\n        
kwargs[\"num_spherical\"] = 4\n        kwargs[\"num_radial\"] = 32\n\n        layer = DimeNetPyg(\n            in_dim=self.in_dim,\n            out_dim=self.out_dim,\n            in_dim_edges=self.in_dim_edges,\n            out_dim_edges=self.out_dim_edges,\n            **kwargs,\n        )\n\n        # Check the re-implementation of abstract methods\n        self.assertTrue(layer.layer_supports_edges)\n        self.assertTrue(layer.layer_inputs_edges)\n        self.assertTrue(layer.layer_outputs_edges)\n        self.assertEqual(layer.out_dim_factor, 1)\n\n        # Apply layer with encoded feats as input\n        bg2 = layer.forward(bg)\n        # output sanity check\n        self.assertEqual(bg2.edge_feat.shape, bg.edge_feat.shape)\n        self.assertEqual(bg2.edge_feat.shape[1], self.out_dim_edges)\n        self.assertEqual(bg2.feat.shape[1], self.out_dim)\n        # not change rbf/sbf embedding\n        self.assertTrue((bg2.edge_rbf == bg.edge_rbf).all)\n        self.assertTrue((bg2.triplet_sbf == bg.triplet_sbf).all)\n\n    def test_preprocess3Dfeaturelayer(self):\n        bg = deepcopy(self.bg)\n        num_heads = 2\n        num_kernel = 2\n        in_dim = 3\n        bg.positions_3d = torch.zeros(bg.feat.size()[0], in_dim)\n        layer = PreprocessPositions(\n            num_heads=num_heads,\n            embed_dim=self.out_dim,\n            num_kernel=num_kernel,\n            in_dim=in_dim,\n            first_normalization=\"layer_norm\",\n        )\n        # bias: [batch, num_heads, nodes, nodes]\n        # node_feature: [total_nodes, embed_dim]\n        bias, node_feature = layer.forward(\n            bg, max_num_nodes_per_graph=4, on_ipu=False, positions_3d_key=\"positions_3d\"\n        )\n        self.assertEqual(bias.size(), torch.Size([2, num_heads, 4, 4]))\n        self.assertFalse(np.isnan(bias.detach().numpy()).any())\n        self.assertEqual(node_feature.size(), torch.Size([7, self.out_dim]))\n\n    def test_gaussianlayer(self):\n        
num_kernels = 3\n        input = torch.zeros(2, 4, 4)\n        layer = GaussianLayer(num_kernels=num_kernels)\n        # tensor_with_kernel: [batch, nodes, nodes, num_kernel]\n        tensor_with_kernel = layer.forward(input)\n        self.assertEqual(tensor_with_kernel.size(), torch.Size([2, 4, 4, 3]))\n\n    def test_pooling_virtual_node(self):\n        bg = deepcopy(self.bg)\n        feat_in = bg.feat\n        e_in = bg.edge_feat\n        vn_h = 0\n\n        vn_types = [\"sum\", \"mean\"]\n        for vn_type, use_edges, expected_v_node in zip(vn_types, [False, True], [(2, 21), (2, 34)]):\n            with self.subTest(\n                vn_type=vn_type,\n                use_edges=use_edges,\n                expected_v_node=expected_v_node,\n            ):\n                vn_h = 0.0\n                print(vn_h, self.out_dim, feat_in.size())\n                layer = VirtualNodePyg(\n                    in_dim=self.in_dim,\n                    out_dim=self.in_dim,\n                    in_dim_edges=self.in_dim_edges,\n                    out_dim_edges=self.in_dim_edges,\n                    vn_type=vn_type,\n                    use_edges=use_edges,\n                    **self.kwargs,\n                )\n\n                feat_out, vn_out, e_out = layer.forward(bg, feat_in, vn_h, edge_feat=e_in)\n                assert vn_out.shape == expected_v_node\n                assert feat_out.shape == (7, 21)\n                assert e_out.shape == (9, 13)\n                if use_edges is False:\n                    # i.e. that the node features have been updated\n                    assert torch.equal(feat_out, feat_in) == False\n                    # And that the edge features have not\n                    assert torch.equal(e_out, e_in) == True\n                else:\n                    assert torch.equal(feat_out, feat_in) == False\n                    assert torch.equal(e_out, e_in) == False\n\n\nif __name__ == \"__main__\":\n    ut.main()\n"
  },
  {
    "path": "tests/test_residual_connections.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals.\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\n\"\"\"\nUnit tests for the file residual_connections.py\n\"\"\"\n\nimport numpy as np\nimport torch\nimport unittest as ut\n\nfrom graphium.nn.residual_connections import (\n    ResidualConnectionConcat,\n    ResidualConnectionDenseNet,\n    ResidualConnectionNone,\n    ResidualConnectionSimple,\n    ResidualConnectionWeighted,\n    ResidualConnectionRandom,\n)\n\n\nclass test_ResidualConnectionNone(ut.TestCase):\n    def test_get_true_out_dims_none(self):\n        full_dims = [4, 6, 8, 10, 12]\n        in_dims, out_dims = full_dims[:-1], full_dims[1:]\n        rc = ResidualConnectionNone(skip_steps=1)\n        true_out_dims = rc.get_true_out_dims(out_dims)\n        expected_out_dims = out_dims[:-1]\n\n        self.assertListEqual(expected_out_dims, true_out_dims)\n\n    def test_forward_none(self):\n        rc = ResidualConnectionNone(skip_steps=1)\n        num_loops = 10\n        shape = (3, 11)\n        h_original = [torch.rand(shape) for _ in range(num_loops)]\n\n        h_prev = None\n        for ii in range(num_loops):\n            feat, h_prev = rc.forward(h_original[ii], h_prev, step_idx=ii)\n            np.testing.assert_array_equal(feat.numpy(), h_original[ii].numpy(), err_msg=f\"ii={ii}\")\n            self.assertIsNone(h_prev)\n\n\nclass test_ResidualConnectionSimple(ut.TestCase):\n    def test_get_true_out_dims_simple(self):\n        
full_dims = [4, 6, 8, 10, 12]\n        in_dims, out_dims = full_dims[:-1], full_dims[1:]\n        rc = ResidualConnectionSimple(skip_steps=1)\n        true_out_dims = rc.get_true_out_dims(out_dims)\n        expected_out_dims = out_dims[:-1]\n\n        self.assertListEqual(expected_out_dims, true_out_dims)\n\n    def test_forward_simple(self):\n        for skip_steps in [1, 2, 3]:\n            rc = ResidualConnectionSimple(skip_steps=skip_steps)\n            num_loops = 10\n            shape = (3, 11)\n            h_original = [torch.ones(shape) * (ii + 1) for ii in range(num_loops)]\n\n            h_prev = None\n            for ii in range(num_loops):\n                feat, h_prev = rc.forward(h_original[ii], h_prev, step_idx=ii)\n\n                if ((ii % skip_steps) == 0) and (ii > 0):\n                    h_expected = (\n                        torch.sum(torch.stack(h_original[0 : ii + 1 : skip_steps], dim=0), dim=0)\n                    ).numpy()\n                    h_expected_prev = h_expected\n                else:\n                    h_expected = h_original[ii].numpy()\n                if ii == 0:\n                    h_expected_prev = h_expected\n\n                np.testing.assert_array_equal(\n                    feat.numpy(), h_expected, err_msg=f\"Error at: skip_steps={skip_steps}, ii={ii}\"\n                )\n                np.testing.assert_array_equal(\n                    h_prev.numpy(), h_expected_prev, err_msg=f\"Error at: skip_steps={skip_steps}, ii={ii}\"\n                )\n\n\nclass test_ResidualConnectionRandom(ut.TestCase):\n    def test_get_true_out_dims_random(self):\n        full_dims = [4, 6, 8, 10, 12]\n        in_dims, out_dims = full_dims[:-1], full_dims[1:]\n        rc = ResidualConnectionRandom(skip_steps=1, out_dims=out_dims)\n        true_out_dims = rc.get_true_out_dims(out_dims)\n        expected_out_dims = out_dims[:-1]\n\n        self.assertListEqual(expected_out_dims, true_out_dims)\n\n        rc = 
ResidualConnectionRandom(skip_steps=1, num_layers=len(out_dims))\n        true_out_dims = rc.get_true_out_dims(out_dims)\n        expected_out_dims = out_dims[:-1]\n\n        self.assertListEqual(expected_out_dims, true_out_dims)\n\n        with self.assertRaises(ValueError):\n            rc = ResidualConnectionRandom(skip_steps=1, out_dims=None, num_layers=None)\n\n    def test_forward_random(self):\n        for skip_steps in [1, 2, 3]:\n            num_loops = 10\n\n            if skip_steps > 1:\n                with self.assertRaises(ValueError):\n                    rc = ResidualConnectionRandom(skip_steps=skip_steps, num_layers=num_loops)\n                continue\n\n            rc = ResidualConnectionRandom(skip_steps=skip_steps, num_layers=num_loops + 1)\n            shape = (3, 11)\n            h_original = [torch.ones(shape) * (ii + 1) for ii in range(num_loops)]\n\n            h_prev = None\n            # Not really testing the expected values due to randomness, just testing if it runs\n            for ii in range(num_loops):\n                print(ii)\n                feat, h_prev = rc.forward(h_original[ii], h_prev, step_idx=ii)\n\n\nclass test_ResidualConnectionWeighted(ut.TestCase):\n    def test_get_true_out_dims_weighted(self):\n        full_dims = [4, 6, 8, 10, 12]\n        in_dims, out_dims = full_dims[:-1], full_dims[1:]\n        rc = ResidualConnectionWeighted(skip_steps=1, out_dims=full_dims[1:])\n        true_out_dims = rc.get_true_out_dims(out_dims)\n        expected_out_dims = out_dims[:-1]\n\n        self.assertListEqual(expected_out_dims, true_out_dims)\n\n    def test_forward_weighted(self):\n        for skip_steps in [1, 2, 3]:\n            num_loops = 10\n            shape = (3, 11)\n            full_dims = [shape[1]] * (num_loops + 1)\n            rc = ResidualConnectionWeighted(\n                skip_steps=skip_steps,\n                out_dims=full_dims[1:],\n                activation=\"none\",\n                
normalization=\"none\",\n                bias=False,\n            )\n\n            h_original = [torch.ones(shape) * (ii + 1) for ii in range(num_loops)]\n            h_forward = []\n\n            h_prev = None\n            step_counter = 0\n            for ii in range(num_loops):\n                h_prev_backup = h_prev\n                feat, h_prev = rc.forward(h_original[ii], h_prev, step_idx=ii)\n\n                if ((ii % skip_steps) == 0) and (ii > 0):\n                    h_forward.append(rc.residual_list[step_counter].forward(h_prev_backup))\n                    h_expected = (h_forward[-1] + h_original[ii]).detach().numpy()\n                    h_expected_prev = h_expected\n                    step_counter += 1\n                else:\n                    h_expected = h_original[ii].detach().numpy()\n                if ii == 0:\n                    h_expected_prev = h_expected\n\n                np.testing.assert_array_equal(\n                    feat.detach().numpy(), h_expected, err_msg=f\"Error at: skip_steps={skip_steps}, ii={ii}\"\n                )\n                np.testing.assert_array_equal(\n                    h_prev.detach().numpy(),\n                    h_expected_prev,\n                    err_msg=f\"Error at: skip_steps={skip_steps}, ii={ii}\",\n                )\n\n\nclass test_ResidualConnectionConcat(ut.TestCase):\n    def test_get_true_out_dims_concat(self):\n        full_dims = [4, 6, 8, 10, 12, 14, 16, 18, 20]\n        in_dims, out_dims = full_dims[:-1], full_dims[1:]\n\n        # skip_steps=1\n        rc = ResidualConnectionConcat(skip_steps=1)\n        true_out_dims = rc.get_true_out_dims(out_dims)\n        expected_out_dims = [6, 14, 18, 22, 26, 30, 34]\n        self.assertListEqual(expected_out_dims, true_out_dims)\n\n        # skip_steps=2\n        rc = ResidualConnectionConcat(skip_steps=2)\n        true_out_dims = rc.get_true_out_dims(out_dims)\n        expected_out_dims = [6, 8, 16, 12, 24, 16, 32]\n        
self.assertListEqual(expected_out_dims, true_out_dims)\n\n        # skip_steps=3\n        rc = ResidualConnectionConcat(skip_steps=3)\n        true_out_dims = rc.get_true_out_dims(out_dims)\n        expected_out_dims = [6, 8, 10, 18, 14, 16, 30]\n        self.assertListEqual(expected_out_dims, true_out_dims)\n\n    def test_forward_concat(self):\n        for skip_steps in [1, 2, 3]:\n            rc = ResidualConnectionConcat(skip_steps=skip_steps)\n            num_loops = 10\n            shape = (3, 11)\n            h_original = [torch.ones(shape) * (ii + 1) for ii in range(num_loops)]\n\n            h_prev = None\n            for ii in range(num_loops):\n                feat, h_prev = rc.forward(h_original[ii], h_prev, step_idx=ii)\n\n                if ((ii % skip_steps) == 0) and (ii > 0):\n                    h_expected = (\n                        torch.cat(h_original[ii - skip_steps : ii + 1 : skip_steps][::-1], dim=-1)\n                    ).numpy()\n                    h_expected_prev = h_original[ii].numpy()\n                else:\n                    h_expected = h_original[ii].numpy()\n                if ii == 0:\n                    h_expected_prev = h_expected\n\n                np.testing.assert_array_equal(\n                    feat.numpy(), h_expected, err_msg=f\"Error at: skip_steps={skip_steps}, ii={ii}\"\n                )\n                np.testing.assert_array_equal(\n                    h_prev.numpy(), h_expected_prev, err_msg=f\"Error at: skip_steps={skip_steps}, ii={ii}\"\n                )\n\n\nclass test_ResidualConnectionDenseNet(ut.TestCase):\n    def test_get_true_out_dims_densenet(self):\n        full_dims = [4, 6, 8, 10, 12, 14, 16, 18, 20]\n        in_dims, out_dims = full_dims[:-1], full_dims[1:]\n\n        # skip_steps=1\n        rc = ResidualConnectionDenseNet(skip_steps=1)\n        true_out_dims = rc.get_true_out_dims(out_dims)\n        expected_out_dims = np.cumsum(out_dims).tolist()[:-1]\n        
self.assertListEqual(expected_out_dims, true_out_dims)\n\n        # skip_steps=2\n        rc = ResidualConnectionDenseNet(skip_steps=2)\n        true_out_dims = rc.get_true_out_dims(out_dims)\n        expected_out_dims = [6, 8, 10 + 6, 12, 14 + 10 + 6, 16, 18 + 14 + 10 + 6]\n        self.assertListEqual(expected_out_dims, true_out_dims)\n\n        # skip_steps=3\n        rc = ResidualConnectionDenseNet(skip_steps=3)\n        true_out_dims = rc.get_true_out_dims(out_dims)\n        expected_out_dims = [6, 8, 10, 12 + 6, 14, 16, 18 + 12 + 6]\n        self.assertListEqual(expected_out_dims, true_out_dims)\n\n    def test_forward_densenet(self):\n        for skip_steps in [1, 2, 3]:\n            rc = ResidualConnectionDenseNet(skip_steps=skip_steps)\n            num_loops = 10\n            shape = (3, 11)\n            h_original = [torch.ones(shape) * (ii + 1) for ii in range(num_loops)]\n\n            h_prev = None\n            for ii in range(num_loops):\n                feat, h_prev = rc.forward(h_original[ii], h_prev, step_idx=ii)\n\n                if ((ii % skip_steps) == 0) and (ii > 0):\n                    h_expected = (torch.cat(h_original[0 : ii + 1 : skip_steps][::-1], dim=-1)).numpy()\n                    h_expected_prev = h_expected\n                else:\n                    h_expected = h_original[ii].numpy()\n                if ii == 0:\n                    h_expected_prev = h_expected\n\n                np.testing.assert_array_equal(\n                    feat.numpy(), h_expected, err_msg=f\"Error at: skip_steps={skip_steps}, ii={ii}\"\n                )\n                np.testing.assert_array_equal(\n                    h_prev.numpy(), h_expected_prev, err_msg=f\"Error at: skip_steps={skip_steps}, ii={ii}\"\n                )\n\n\nif __name__ == \"__main__\":\n    ut.main()\n"
  },
  {
    "path": "tests/test_training.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\nimport pytest\nfrom graphium.cli.train_finetune_test import cli\nimport sys\nimport subprocess\nimport os\nfrom unittest.mock import patch\n\n\nclass TestCLITraining:\n    @classmethod\n    def setup_class(cls):\n        print(\"Setting up the test class...\")\n\n        # Equivalent of the bash commands to download the data files\n        toymix_dir = \"expts/data/neurips2023/small-dataset/\"\n        subprocess.run([\"mkdir\", \"-p\", toymix_dir])\n\n        base_url = \"https://storage.valencelabs.com/graphium/datasets/neurips_2023/Small-dataset/\"\n        files = [\n            \"ZINC12k.csv.gz\",\n            \"Tox21-7k-12-labels.csv.gz\",\n            \"qm9.csv.gz\",\n            \"qm9_random_splits.pt\",\n            \"Tox21_random_splits.pt\",\n            \"ZINC12k_random_splits.pt\",\n        ]\n\n        for file in files:\n            file_path = f\"{toymix_dir}{file}\"\n            if not os.path.exists(file_path):\n                print(f\"Downloading {file}...\")\n                subprocess.run([\"wget\", \"-P\", toymix_dir, f\"{base_url}{file}\"])\n            else:\n                print(f\"{file} already exists. 
Skipping...\")\n\n        print(\"Data has been successfully downloaded.\")\n\n    def call_cli_with_overrides(self, acc_type: str, acc_prec: str, load_type: str) -> None:\n        overrides = [\n            f\"accelerator={acc_type}\",\n            \"tasks=toymix\",\n            \"training=toymix\",\n            # Reducing number of parameters in the toymix architecture\n            \"architecture=toymix\",\n            \"architecture.pe_encoders.encoders.la_pos.hidden_dim=16\",\n            \"architecture.pe_encoders.encoders.la_pos.num_layers=1\",\n            \"architecture.pe_encoders.encoders.rw_pos.hidden_dim=16\",\n            \"architecture.pe_encoders.encoders.rw_pos.num_layers=1\",\n            \"architecture.pre_nn.hidden_dims=32\",\n            \"architecture.pre_nn.depth=1\",\n            \"architecture.pre_nn.out_dim=16\",\n            \"architecture.gnn.in_dim=16\",\n            \"architecture.gnn.out_dim=16\",\n            \"architecture.gnn.depth=2\",\n            \"architecture.task_heads.qm9.depth=1\",\n            \"architecture.task_heads.tox21.depth=1\",\n            \"architecture.task_heads.zinc.depth=1\",\n            # Set the number of epochs\n            \"constants.max_epochs=2\",\n            \"+datamodule.args.task_specific_args.qm9.sample_size=1000\",\n            \"+datamodule.args.task_specific_args.tox21.sample_size=1000\",\n            \"+datamodule.args.task_specific_args.zinc.sample_size=1000\",\n            \"trainer.trainer.check_val_every_n_epoch=1\",\n            f\"trainer.trainer.precision={acc_prec}\",\n            f\"datamodule.args.dataloading_from={load_type}\",\n        ]\n        if acc_type == \"ipu\":\n            overrides.append(\"accelerator.ipu_config=['useIpuModel(True)']\")\n            overrides.append(\"accelerator.ipu_inference_config=['useIpuModel(True)']\")\n        # Backup the original sys.argv\n        original_argv = sys.argv.copy()\n\n        # Replace sys.argv with the desired overrides\n        
hydra_overrides = [\"script_name\"] + overrides\n        sys.argv = hydra_overrides\n        # Call the function\n        cli()\n\n        # Restore the original sys.argv\n        sys.argv = original_argv\n\n    @pytest.mark.parametrize(\"load_type\", [\"RAM\", \"disk\"])\n    def test_cpu_cli_training(self, load_type):\n        self.call_cli_with_overrides(\"cpu\", \"32\", load_type)\n\n    @pytest.mark.ipu\n    @pytest.mark.skip\n    @pytest.mark.parametrize(\"load_type\", [\"RAM\", \"disk\"])\n    def test_ipu_cli_training(self, load_type):\n        with patch(\"poptorch.ipuHardwareIsAvailable\", return_value=True):\n            with patch(\"lightning_graphcore.accelerator._IPU_AVAILABLE\", new=True):\n                import poptorch\n\n                assert poptorch.ipuHardwareIsAvailable()\n                from lightning_graphcore.accelerator import _IPU_AVAILABLE\n\n                assert _IPU_AVAILABLE is True\n                self.call_cli_with_overrides(\"ipu\", \"16-true\", load_type)\n"
  },
  {
    "path": "tests/test_utils.py",
    "content": "\"\"\"\n--------------------------------------------------------------------------------\nCopyright (c) 2023 Valence Labs, Recursion Pharmaceuticals and Graphcore Limited.\n\nUse of this software is subject to the terms and conditions outlined in the LICENSE file.\nUnauthorized modification, distribution, or use is prohibited. Provided 'as is' without\nwarranties of any kind.\n\nValence Labs, Recursion Pharmaceuticals and Graphcore Limited are not liable for any damages arising from its use.\nRefer to the LICENSE file for the full terms and conditions.\n--------------------------------------------------------------------------------\n\"\"\"\n\n\"\"\"\nUnit tests for the metrics and wrappers of graphium/utils/...\n\"\"\"\n\nimport torch\nimport numpy as np\nimport scipy as sp\nimport unittest as ut\nimport gzip\n\nfrom graphium.utils.read_file import file_opener\nfrom graphium.utils.tensor import (\n    nan_mad,\n    nan_mean,\n    nan_std,\n    nan_var,\n    nan_median,\n    dict_tensor_fp16_to_fp32,\n    tensor_fp16_to_fp32,\n)\nfrom graphium.utils.safe_run import SafeRun\n\n\nclass test_nan_statistics(ut.TestCase):\n    torch.manual_seed(42)\n\n    dims = [\n        None,\n        (0),\n        (1),\n        (2),\n        (-1),\n        (-2),\n        (-3),\n        (0, 1),\n        (0, 2),\n    ]\n    # Create tensor\n    sz = (10, 6, 7)\n    tensor = torch.randn(sz, dtype=torch.float32) ** 2 + 3\n    is_nan = torch.rand(sz) > 0.4\n    tensor[is_nan] = float(\"nan\")\n\n    def test_nan_mean(self):\n        for keepdim in [False, True]:\n            for dim in self.dims:\n                err_msg = f\"Error for :\\n dim = {dim}\\n keepdim = {keepdim}\"\n\n                tensor = self.tensor.clone()\n\n                # Prepare the arguments for numpy vs torch\n                if dim is not None:\n                    torch_kwargs = {\"dim\": dim, \"keepdim\": keepdim}\n                    numpy_kwargs = {\"axis\": dim, \"keepdims\": keepdim}\n     
           else:\n                    torch_kwargs = {}\n                    numpy_kwargs = {}\n\n                # Compare the nan-mean\n                torch_mean = nan_mean(tensor, **torch_kwargs)\n                numpy_mean = np.nanmean(tensor.numpy(), **numpy_kwargs)\n                np.testing.assert_almost_equal(torch_mean.numpy(), numpy_mean, decimal=6, err_msg=err_msg)\n\n    def test_nan_std_var(self):\n        for unbiased in [True, False]:\n            for keepdim in [False, True]:\n                for dim in self.dims:\n                    err_msg = f\"Error for :\\n\\tdim = {dim}\\n\\tkeepdim = {keepdim}\\n\\tunbiased = {unbiased}\"\n\n                    tensor = self.tensor.clone()\n\n                    # Prepare the arguments for numpy vs torch\n                    if dim is not None:\n                        torch_kwargs = {\"dim\": dim, \"keepdim\": keepdim, \"unbiased\": unbiased}\n                        numpy_kwargs = {\"axis\": dim, \"keepdims\": keepdim, \"ddof\": float(unbiased)}\n                    else:\n                        torch_kwargs = {\"unbiased\": unbiased}\n                        numpy_kwargs = {\"ddof\": float(unbiased)}\n\n                    # Compare the std\n                    torch_std = nan_std(tensor, **torch_kwargs)\n                    numpy_std = np.nanstd(tensor.numpy(), **numpy_kwargs)\n                    np.testing.assert_almost_equal(torch_std.numpy(), numpy_std, decimal=6, err_msg=err_msg)\n\n                    # Compare the variance\n                    torch_var = nan_var(tensor, **torch_kwargs)\n                    numpy_var = np.nanvar(tensor.numpy(), **numpy_kwargs)\n                    np.testing.assert_almost_equal(torch_var.numpy(), numpy_var, decimal=6, err_msg=err_msg)\n\n    def test_nan_median(self):\n        for keepdim in [False, True]:\n            # Cannot test\n            for dim in self.dims:  # in [d for d in self.dims if not isinstance(d, Tuple)]:\n                err_msg = f\"Error 
for :\\n dim = {dim}\\n keepdim = {keepdim}\"\n\n                tensor = torch.randn(\n                    (7, 9, 11, 13)\n                )  # Need odd number of values to properly compare torch to numpy\n\n                # Prepare the arguments for numpy vs torch\n                if dim is not None:\n                    torch_kwargs = {\"dim\": dim, \"keepdim\": keepdim}\n                    numpy_kwargs = {\"axis\": dim, \"keepdims\": keepdim}\n                else:\n                    torch_kwargs = {}\n                    numpy_kwargs = {}\n\n                # Compare the nan-median\n                torch_med = nan_median(tensor, **torch_kwargs)\n                numpy_med = np.nanmedian(tensor.numpy(), **numpy_kwargs)\n                torch_sum = torch.nansum(tensor, **torch_kwargs)\n                np.testing.assert_almost_equal(torch_med.numpy(), numpy_med, decimal=4, err_msg=err_msg)\n                self.assertListEqual(list(torch_med.shape), list(torch_sum.shape))\n\n    def test_nan_mad(self):\n        for normal in [False, True]:\n            # Cannot test\n            for dim in self.dims:  # in [d for d in self.dims if not isinstance(d, Tuple)]:\n                err_msg = f\"Error for :\\n dim = {dim}\\n normal = {normal}\"\n\n                tensor = torch.randn(\n                    (7, 9, 11, 13)\n                )  # Need odd number of values to properly compare torch to numpy\n\n                # Prepare the arguments for numpy vs torch\n                if dim is not None:\n                    torch_kwargs = {\"dim\": dim, \"keepdim\": False, \"normal\": normal}\n                    numpy_kwargs = {\"axis\": dim, \"nan_policy\": \"omit\", \"scale\": 1 / 1.4826 if normal else 1.0}\n                else:\n                    torch_kwargs = {\"normal\": normal}\n                    numpy_kwargs = {\"axis\": dim, \"nan_policy\": \"omit\", \"scale\": 1 / 1.4826 if normal else 1.0}\n\n                # Compare the nan-median\n                
torch_mad = nan_mad(tensor, **torch_kwargs)\n                numpy_mad = sp.stats.median_abs_deviation(tensor.numpy(), **numpy_kwargs)\n                np.testing.assert_almost_equal(torch_mad.numpy(), numpy_mad, decimal=4, err_msg=err_msg)\n\n\nclass test_SafeRun(ut.TestCase):\n    def test_safe_run(self):\n        # Error is caught\n        with SafeRun(name=\"bob\", raise_error=False, verbose=0):\n            raise ValueError(\"This is an error\")\n\n        # Error is caught\n        with SafeRun(name=\"bob\", raise_error=False, verbose=0):\n            2 + None\n\n        # Error is not caught\n        with self.assertRaises(ValueError):\n            with SafeRun(name=\"bob\", raise_error=True, verbose=0):\n                raise ValueError(\"This is an error\")\n\n        # Error is not caught\n        with self.assertRaises(TypeError):\n            with SafeRun(name=\"bob\", raise_error=True, verbose=0):\n                2 + None\n\n        # No error. Runs correctly\n        with SafeRun(name=\"bob\", raise_error=True, verbose=0):\n            print(\"This is not an error\")\n\n        # No error. 
Runs correctly\n        with SafeRun(name=\"bob\", raise_error=False, verbose=0):\n            print(\"This is not an error\")\n\n\nclass TestTensorFp16ToFp32(ut.TestCase):\n    def test_tensor_fp16_to_fp32(self):\n        # Create a tensor\n        tensor = torch.randn(10, 10).half()\n\n        # Convert the tensor to fp32\n        tensor_fp32 = tensor_fp16_to_fp32(tensor)\n        self.assertTrue(tensor_fp32.dtype == torch.float32)\n\n        # Don't convert the tensor to fp32\n        tensor = torch.randn(10, 10).int()\n        tensor_fp32 = tensor_fp16_to_fp32(tensor)\n        self.assertFalse(tensor_fp32.dtype == torch.float32)\n\n        # Don't convert the tensor to fp32\n        tensor = torch.randn(10, 10).double()\n        tensor_fp32 = tensor_fp16_to_fp32(tensor)\n        self.assertFalse(tensor_fp32.dtype == torch.float32)\n\n    def test_dict_tensor_fp16_to_fp32(self):\n        # Create a dictionary of tensors\n        tensor_dict = {\n            \"a\": torch.randn(10, 10).half(),\n            \"b\": torch.randn(10, 10).half(),\n            \"c\": torch.randn(10, 10).double(),\n            \"d\": torch.randn(10, 10).half(),\n            \"e\": torch.randn(10, 10).float(),\n            \"f\": torch.randn(10, 10).half(),\n            \"g\": torch.randn(10, 10).int(),\n            \"h\": {\n                \"h1\": torch.randn(10, 10).double(),\n                \"h2\": torch.randn(10, 10).half(),\n                \"h3\": torch.randn(10, 10).float(),\n                \"h4\": torch.randn(10, 10).half(),\n                \"h5\": torch.randn(10, 10).int(),\n            },\n        }\n\n        # Convert the dictionary to fp32\n        tensor_dict_fp32 = dict_tensor_fp16_to_fp32(tensor_dict)\n\n        # Check that the dictionary is correctly converted\n        for key, tensor in tensor_dict_fp32.items():\n            if key in [\"a\", \"b\", \"d\", \"e\", \"f\"]:\n                self.assertEqual(tensor.dtype, torch.float32)\n            elif key in 
[\"h\"]:\n                for key2, tensor2 in tensor.items():\n                    if key2 in [\"h2\", \"h3\", \"h4\"]:\n                        self.assertEqual(tensor2.dtype, torch.float32)\n                    else:\n                        self.assertNotEqual(tensor2.dtype, torch.float32)\n            else:\n                self.assertNotEqual(tensor.dtype, torch.float32)\n\n\nif __name__ == \"__main__\":\n    ut.main()\n"
  }
]